PHP ODBC execute is trying to open a file

For some reason this code is causing odbc_execute() to attempt to open a file...
$file = fopen('somefile.csv', 'r');
fgetcsv($file); // Skip the header line
$data = [];
while (($line = fgetcsv($file)) !== false) {
    $data[] = $line;
}
fclose($file);
try {
    $conn = odbc_connect("Teradata", "User", "Pass");
    odbc_autocommit($conn, false);
    odbc_exec($conn, 'DELETE FROM table');
    foreach ($data as &$test) {
        $stmt = odbc_prepare($conn, 'INSERT INTO table (experiment_code, experiment_name, variant_code, variant_name, version_number, report_start_date, report_end_date, status, transaction_date, experiment_test_id, test_manager, product_manager, pod, created_date, last_updated_date) VALUES (?,?,?,?,?,?,?,?,?,?,?,?,?,?,?)');
        odbc_execute($stmt, $test);
    }
    odbc_commit($conn);
    $result = odbc_exec($conn, 'SELECT * FROM table');
    odbc_result_all($result);
} catch (Exception $e) {
    odbc_rollback($conn);
    echo $e->getMessage();
}
Here is a snippet of the CSV file...
H1225,Some random text,H1225:001.000,Control,3,02/06/2014,03/31/2014,Completed,,HMVT-1225,Some random name,Some random name,Checkout,03/31/2014 16:54,02/06/2014 16:38
H1225,Some random text,H1225:001.000,Control,3,02/06/2014,03/31/2014,Completed,,HMVT-1225,Some random name,Some random name,Checkout,03/31/2014 16:54,02/06/2014 16:38
And here is the type of error I am getting...
Warning: odbc_execute(): Can't open file Control in C:\wamp\www\HEXinput\assets\php\dumpCSV.php on line 19
I get multiple versions of the same error, just with a different file name each time. The file name seems to be coming from column 3 (0-based). Another odd thing is that some lines actually do insert correctly.
The final error I get is...
Fatal error: Maximum execution time of 120 seconds exceeded in C:\wamp\www\HEXinput\assets\php\dumpCSV.php on line 27
I am using Teradata's ODBC drivers, version 15, on Windows 7 64-bit.
What could be causing this?

Turns out that some of the fields in the CSV file had single quotes in them, which broke the query. Per the PHP manual, any parameter passed to odbc_execute() that both starts and ends with a single quote is treated as the name of a file to read and send to the database server, which is why the warning names a file.
A simple but annoying oversight.
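If you hit the same thing, a minimal guard might look like the sketch below. It leans on the documented odbc_execute() behavior above and simply pads any value that both starts and ends with a single quote so it is sent as a literal string rather than interpreted as a file name:
foreach ($data as $test) {
    $params = array_map(function ($value) {
        // Hypothetical workaround: a leading space stops the
        // quote-filename rule from firing for this value.
        if (is_string($value) && strlen($value) > 1
                && $value[0] === "'" && substr($value, -1) === "'") {
            return ' ' . $value;
        }
        return $value;
    }, $test);
    odbc_execute($stmt, $params); // $stmt prepared as in the question
}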

Related

PHP sqlsrv BULK INSERT incomplete

I am trying to insert a big file (a few million rows) via SQL Server's BULK INSERT functionality. My SQL query looks like:
BULK INSERT MY_TABLE
FROM '\\myserver\open\myfile.csv'
WITH (
    FIRSTROW = 2,
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    BATCHSIZE = 50000,
    ERRORFILE = '\\myserver\open\myfileerror.log'
);
When I trigger it from SQL Server Management Studio, it always imports completely.
When I run it from my PHP code, it sometimes stops in the middle without any error message.
I tried both sqlsrv_query and sqlsrv_prepare/sqlsrv_execute, with the same result.
$sql = "..."; // the BULK INSERT query shown above
$statement = sqlsrv_query($connection, $sql);
if ($statement === false) {
    $error = sqlsrv_errors();
    $error['sql'] = $sql;
    throw new Exception(json_encode($error));
}
Would it be possible to get the MSSQL messages from the $statement, the same ones I get in Management Studio, e.g. (50000 row(s) affected)?
As a workaround I have increased the BATCHSIZE to 1000000, but that is not a real solution.
Background information:
- PHP 7.1.9
- sqlsrv version: 4.3.0+9904
- sqlsrv.ClientBufferMaxKBSize: 10240
- Windows 2012 R2 Server
The issue was the statement buffer: each batch produces its own result, and execution only continues once those results are consumed. When I read them with sqlsrv_next_result, processing continues.
$statement = sqlsrv_query($connection, $sql);
if ($statement === false) {
    $error = sqlsrv_errors();
    $error['sql'] = $sql;
    throw new Exception(json_encode($error));
} else {
    while ($next_result = sqlsrv_next_result($statement)) {
        #echo date("Y-m-d H:i:s", time()) . " Reading buffer...\n";
    }
}
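To get the per-batch logs asked about above, a possible extension of the same idea is to read the affected-row count while draining the buffered results. This is a sketch under the assumption that each BATCHSIZE batch emits its own result; sqlsrv_rows_affected() returns -1 when a result carries no count:
$statement = sqlsrv_query($connection, $sql);
if ($statement === false) {
    throw new Exception(json_encode(sqlsrv_errors()));
}
do {
    $rows = sqlsrv_rows_affected($statement); // count for the current result
    if ($rows !== false && $rows > 0) {
        echo date("Y-m-d H:i:s") . " " . $rows . " row(s) affected\n";
    }
} while (sqlsrv_next_result($statement) === true);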

SQL import CSV file with PHP

I'm trying to import a pretty big CSV file into my database (locally).
The file is 230MB, about 8.8 million lines.
The problem isn't opening the CSV or not knowing how to import it:
the file opens, about 500,000 lines import, and then it quits and throws no error or timeout or anything; I just get to see my webpage.
this is the code:
try {
    $conn = new PDO("mysql:host=$servername;dbname=adresses_database", $username, $password);
    // set the PDO error mode to exception
    $conn->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
    echo "Connected successfully";
    $row = 1;
    if (($handle = fopen("bagadres.csv", "c+")) !== FALSE) {
        while (($data = fgetcsv($handle, '', ";")) !== FALSE) {
            if (!isset($write_position)) { // move the line to previous position, except the first line
                $write_position = 0;
                $num = count($data); // $num is 15
                $row++; // i don't need this?
                $stmt = $conn->prepare("INSERT INTO adresses (openbareruimte, huisnummer, huisletter, huisnummertoevoeging, postcode, woonplaats, gemeente, provincie, object_id, object_type, nevenadres, x, y, lon, lat) VALUES (:openbareruimte, :huisnummer, :huisletter, :huisnummertoevoeging, :postcode, :woonplaats, :gemeente, :provincie, :object_id, :object_type, :nevenadres, :x, :y, :lon, :lat)");
                $stmt->bindParam(':openbareruimte', $data[0]);
                $stmt->bindParam(':huisnummer', $data[1]);
                $stmt->bindParam(':huisletter', $data[2]);
                $stmt->bindParam(':huisnummertoevoeging', $data[3]);
                $stmt->bindParam(':postcode', $data[4]);
                $stmt->bindParam(':woonplaats', $data[5]);
                $stmt->bindParam(':gemeente', $data[6]);
                $stmt->bindParam(':provincie', $data[7]);
                $stmt->bindParam(':object_id', $data[8]);
                $stmt->bindParam(':object_type', $data[9]);
                $stmt->bindParam(':nevenadres', $data[10]);
                $stmt->bindParam(':x', $data[11]);
                $stmt->bindParam(':y', $data[12]);
                $stmt->bindParam(':lon', $data[13]);
                $stmt->bindParam(':lat', $data[14]);
                $stmt->execute();
            } else {
                $read_position = ftell($handle); // get actual line
                fseek($handle, $write_position); // move to previous position
                fputs($handle, $line); // put actual line in previous position
                fseek($handle, $read_position); // return to actual position
                $write_position += strlen($line); // set write position to the next loop
            }
            fflush($handle); // write any pending change to file
            ftruncate($handle, $write_position); // drop the repeated last line
            flock($handle, LOCK_UN);
        }
        fclose($handle);
    }
} catch (PDOException $e) {
    echo "Connection failed: " . $e->getMessage();
}
$conn = null;
I came this far looking for help on Stack Overflow and in the PHP manual, and I also searched to see whether it was a MySQL error, but I cannot figure this out.
(For any suggestions about MySQL settings: I'm using Linux Mint 18.)
I would strongly recommend that you use MySQL's LOAD DATA INFILE, which is probably the fastest and most efficient way to get CSV data into a MySQL table. The command for your setup would look something like this:
LOAD DATA INFILE 'bagadres.csv'
INTO TABLE adresses
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS
If your fields are not enclosed by quotes, or are enclosed by something other than quotes, then remove or modify the ENCLOSED BY clause. Also, IGNORE 1 ROWS will skip the first row, which makes sense assuming the first line of your file is a header row (i.e. column labels, not actual data).
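If you want to run this from PHP rather than from the mysql client, a minimal sketch might look like the following. It assumes local_infile is enabled on the server and uses PDO's MYSQL_ATTR_LOCAL_INFILE option on the client side:
$conn = new PDO(
    "mysql:host=$servername;dbname=adresses_database",
    $username,
    $password,
    [PDO::MYSQL_ATTR_LOCAL_INFILE => true] // required for LOCAL INFILE
);
$conn->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$sql = "LOAD DATA LOCAL INFILE 'bagadres.csv'
    INTO TABLE adresses
    FIELDS TERMINATED BY ','
    ENCLOSED BY '\"'
    LINES TERMINATED BY '\\n'
    IGNORE 1 ROWS";
$imported = $conn->exec($sql); // exec() returns the number of affected rows
echo "Imported $imported rows";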

MySQL too many connections error

I have a .txt file with a list of 60,000 English words. I wanted to insert them into my database, so I simply did as shown here.
$file = new SplFileObject('list.txt');
foreach ($file as $line => $word) {
    $p = new PDO('mysql:host=localhost; dbname=test_dictionary', 'root', 'test');
    $p->query("INSERT INTO words (english) VALUES('$word') ");
}
Now, after I run this script, I get the following error:
Fatal error: Uncaught exception 'PDOException' with message 'SQLSTATE[HY000] [1040] Too many connections' in /var/www/skillz/test/curl/index.php:17 Stack trace: #0 /var/www/test/index.php(17): PDO->__construct('mysql:host=loca...', 'root', 'test') #1 {main} thrown in /var/www/test/index.php on line 4
That line 4 is where the new PDO('mysql:...') call is located. I searched for this error, found an answer that seemed like a solution, and edited MySQL's config accordingly:
$ vi /etc/my.cnf
max_connections=250
But I still get the same error. I have MySQL 5.5.38 running with PHP-FPM and NGINX on CentOS 6.5.
Don't open a new connection for every word; you only need one connection open for the lifetime of your inserts. I'm not sure about the true lifetime of a PDO object. I know unused objects get cleaned up, but garbage collection might not get to them for a couple of minutes, and with 60,000 words you will hit your database's connection limit far faster than it can clean them up.
$file = new SplFileObject('list.txt');
$p = new PDO('mysql:host=localhost; dbname=test_dictionary', 'root', 'test');
foreach ($file as $line => $word) {
    $p->query("INSERT INTO words (english) VALUES('$word') ");
}
You should declare the SQL connection outside the foreach, because otherwise it will open 60,000 connections.
$file = new SplFileObject('list.txt');
$p = new PDO('mysql:host=localhost; dbname=test_dictionary', 'root', 'test');
foreach ($file as $line => $word) {
    $p->query("INSERT INTO words (english) VALUES('$word') ");
}
You only need to create the SQL connection once, and you can then reuse it whenever you want.
If you put it inside the foreach, it will open a connection for every word; that's why you got that message.
Solution 1
Use a batch insert statement:
INSERT INTO words (col1, col2) VALUES ('val1', 'val2'), ('val3', 'val4'), ...('val3n', 'val4n');
This doesn't help if you also want to check whether individual rows failed. So below is another solution.
Solution 2
Create a persistent database connection. This will use the same connection in all iterations of the loop.
$file = new SplFileObject('list.txt');
$p = new PDO('mysql:host=localhost; dbname=test_dictionary', 'root', 'test', array(
    PDO::ATTR_PERSISTENT => true)); // persistent database connection
foreach ($file as $line => $word) {
    $p->query("INSERT INTO words (english) VALUES('$word') ");
}
$p = null; // destroy connection
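Whichever variant you pick, note that interpolating $word directly into the SQL will break on words containing quotes. A minimal sketch along the same lines that prepares the statement once and binds each word, which also sidesteps the quoting problem:
$file = new SplFileObject('list.txt');
$p = new PDO('mysql:host=localhost;dbname=test_dictionary', 'root', 'test');
$stmt = $p->prepare('INSERT INTO words (english) VALUES (:word)'); // prepared once
foreach ($file as $line => $word) {
    $stmt->execute([':word' => trim($word)]); // binding handles escaping
}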

fputcsv formatting wrong when result is from PDO

I am trying to read data from a DB via PDO and output the results to a CSV file. I can output to the screen correctly, but the formatting in the CSV is wild: double names and no '\n'.
<?php
require_once('auth.php');
$conn = new PDO("mysql:host=localhost;dbname=$dbname", $username, $pw);
if (($handle = fopen("nameList2.txt", "r")) !== FALSE) {
    while (($data = fgetcsv($handle, 1000, " ")) !== FALSE) {
        $firstname = $data[0];
        $lastname = $data[1];
        $stmt = $conn->prepare("SELECT * FROM list WHERE FName = :firstname AND LName = :lastname");
        $stmt->bindParam(':firstname', $firstname);
        $stmt->bindParam(':lastname', $lastname);
        $stmt->execute();
        $result = $stmt->fetchAll();
        //var_dump($firstname);
        //var_dump($lastname);
        //var_dump($result);
        $fp = fopen('file.csv', 'w');
        foreach ($result as $chunk) {
            echo $chunk[4]." ".$chunk[6]." ".$chunk[7]." ".$chunk[10]." ".$chunk[11]." ".$chunk[12]." ".$chunk[13]." ".$chunk[18]." ".$chunk[19]." ".$chunk[20]."<br />";
            fputcsv($fp, $chunk);
        }
        fclose($fp);
    }
    fclose($handle);
    //fclose($fp);
}
?>
You are feeding fputcsv bad data, so it's giving you bad output. Specifically, fetchAll retrieves each row as an array with both numeric and string keys, so each value appears twice.
Fix this by setting the fetch mode appropriately, for example
$result = $stmt->fetchAll(PDO::FETCH_NUM);
It's unclear what the problem with the line endings is -- you don't say, and I can't tell from the screenshot. What is certain is that fputcsv writes a single line feed as the line termination character. While the vast majority of programs will correctly detect and handle these Unix-style line endings, there are some others (e.g. Notepad) that won't.
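Putting that together, a minimal sketch of the corrected export loop; note it also opens file.csv once, before the while loop, since reopening it in 'w' mode truncates it on every name:
$fp = fopen('file.csv', 'w'); // open once, not per name
while (($data = fgetcsv($handle, 1000, " ")) !== FALSE) {
    // ...prepare, bind and execute the SELECT as in the question...
    $result = $stmt->fetchAll(PDO::FETCH_NUM); // numeric keys only, no duplicates
    foreach ($result as $chunk) {
        fputcsv($fp, $chunk); // one "\n"-terminated CSV line per row
    }
}
fclose($fp);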
Your problem with double names is because you aren't using fetchAll() correctly:
you get the names twice in $result.
Use this:
$result = $stmt->fetchAll(PDO::FETCH_ASSOC);
To fix the problem with \n, try
ini_set('auto_detect_line_endings', true);

Error - Bad Row Offset

When getting all values from a DB table, I get this error on the last result value.
Warning: mssql_result() [function.mssql-result]: Bad row offset (32) in C:\ms4w\Apache\htdocs\mapserver\data\.... on line 38
Line 38:
$str = "MyMap_".mb_convert_encoding(mssql_result($result_set, $row, 0),"UTF-8","SJIS")."_".mb_convert_encoding(mssql_result($result_set, $row, 1),"UTF-8","SJIS");
And my settings ($sql, $con):
$con = mssql_connect ("myServer", "myUsername", "myPassword");
$sql = "SELECT * FROM m_group_layer WHERE group_id=\"".$_SESSION["group_id"]."\" ORDER BY display_order";
$rs_group_layer = mssql_query ($sql, $con);
$group_layer_row = mssql_num_rows($rs_group_layer);
/* EDIT:
Function:
function getLayer($result_set, $row) {
    $str = "MyMap_".mb_convert_encoding(mssql_result($result_set, $row, 0),"UTF-8","SJIS")."_".mb_convert_encoding(mssql_result($result_set, $row, 1),"UTF-8","SJIS");
    return "var ".$str.";\n\n";
}
Loop:
for ($i = 0; $i <= $group_layer_row; $i++) {
    echo getLayer($rs_group_layer, $i);
}
*/
Honestly, I am not exactly sure what this error is, so I could use some suggestions first about what typically causes it; if other code may be responsible, I will post it as needed.
Thanks for your help.
Check your main loop: the index variable must start at 0, so if 3 rows are returned, the last index must be 2.
EDIT:
Change to for($i=0; $i<$group_layer_row; $i++){, without the =; it's a common off-by-one mistake.
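Put another way, mssql_num_rows() returns the row count, so valid offsets run from 0 to the count minus one:
$group_layer_row = mssql_num_rows($rs_group_layer); // e.g. 32 rows -> offsets 0..31
for ($i = 0; $i < $group_layer_row; $i++) { // strict <, not <=
    echo getLayer($rs_group_layer, $i);
}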
