PHP MySQL query stops after 1273459 loops

I am using the script:
$file = fopen('part1.csv', 'r');
mysql_connect('localhost', '~~~', '~~~') or die(mysql_error());
mysql_select_db('Stubby') or die(mysql_error());
while (($buffer = fgets($file, 4096)) !== false) {
    //echo $buffer;
    $q = mysql_query('INSERT INTO allCombos (combo) VALUES (\'' . $buffer . '\')') or die(mysql_error());
}
fclose($file);
to load the very long contents of a CSV into a database. The CSV has around 3.5M lines. The queries stop at 1273459 lines. Why?

PHP's default script execution time limit is generally 30 seconds; you're probably hitting that limit. You can manually override it:
set_time_limit(0); //sets the time limit to infinity
set_time_limit(600); //sets the time limit to 10 minutes
Another possibility is that your script has run out of memory. You can raise it by doing something like:
ini_set('memory_limit', '32M'); //raises limit to 32 megabytes
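For example, applied to the loop from the question, a minimal sketch; trimming the trailing newline and escaping the value are additions that the original code did not have:
<?php
// Raise the limits before starting the long-running import loop.
set_time_limit(0);               // no execution time limit
ini_set('memory_limit', '128M'); // raise the memory limit if needed

$file = fopen('part1.csv', 'r');
mysql_connect('localhost', '~~~', '~~~') or die(mysql_error());
mysql_select_db('Stubby') or die(mysql_error());

while (($buffer = fgets($file, 4096)) !== false) {
    $value = mysql_real_escape_string(trim($buffer)); // drop the trailing newline, escape quotes
    mysql_query("INSERT INTO allCombos (combo) VALUES ('$value')") or die(mysql_error());
}
fclose($file);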

Does it just stop, or is any error shown? On the other hand, you seem to be simply copying a file into the database, so why not use the LOAD DATA INFILE command? Then you don't need a loop, and maybe not even a PHP application; that command can load a CSV file into the table, and it is the fastest way to do it.
In your case you can execute the following command to import the file:
LOAD DATA INFILE 'part1.csv' INTO TABLE allCombos
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES;
(just copied from the MySQL LOAD DATA documentation page with your parameters filled in). By the way, are there any other fields, or does the CSV file contain everything that should be in that table?
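If you'd rather trigger it from PHP than from the mysql client, here is a minimal sketch using mysqli; it assumes local_infile is enabled on both the client and the server, and it uses the single combo column from the question:
<?php
// Sketch: run LOAD DATA LOCAL INFILE through mysqli.
// Assumes local_infile is allowed on the client and the server.
$db = mysqli_init();
$db->options(MYSQLI_OPT_LOCAL_INFILE, true);
$db->real_connect('localhost', '~~~', '~~~', 'Stubby');

$sql = "LOAD DATA LOCAL INFILE 'part1.csv'
        INTO TABLE allCombos
        LINES TERMINATED BY '\\n'
        (combo)";

if (!$db->query($sql)) {
    die('LOAD DATA failed: ' . $db->error);
}
echo $db->affected_rows . " rows loaded\n";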

I'm guessing you're running out of memory or exceeding the processing time allowed. Why not chunk it into smaller pieces and do it in groups of 500,000 or something? Or adjust the timeout.
http://php.net/manual/en/function.set-time-limit.php
Increasing the memory limit would have to be done through the php.ini file.
http://php.net/manual/en/ini.core.php
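For example, the relevant php.ini directives would look something like this (the values are only illustrative):
; php.ini - raise the limits for a long-running import (example values)
max_execution_time = 600
memory_limit = 256M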

Build one big INSERT and hit the database once:
$q = 'INSERT INTO allCombos (combo) VALUES ';
while (($buffer = fgets($file, 4096)) !== false) {
    //echo $buffer;
    $q .= ' (\'' . $buffer . '\'), ';
}
$q = substr($q, 0, -2); // remove the trailing comma and space
mysql_query($q);
Playing with time limits will also help, but this approach makes more effective use of resources.
The most effective option is LOAD DATA INFILE, from Maxym's answer.

Related

limitation with csv bulk insert in mysql

I have a CSV file upload feature. It works when the CSV has up to roughly 30k rows, but whenever the CSV file has more than 30k rows the bulk insert does not work. Below is my code for reading the CSV and inserting into the table.
$csvfile = fopen($file, 'r');
$i = 0;
$data4 = "";
while (!feof($csvfile))
{
    $csv_data[] = fgets($csvfile, 1024);
    $csv_array = explode(";", $csv_data[$i]);
    $data4 .= "('".$csv_array[2]."', '".$csv_array[4]."', '".$csv_array[0]."','".$csv_array[1]."'),";
    $i++;
}
fclose($csvfile);
$data4 = substr($data4, 0, -1);
$sql = "INSERT INTO csv_table(`column1`,`column2`,`column3`,`column4`) VALUES $data4";
mysqli_query($mysqliConn, $sql);
I only have an issue when there are more than 30k records. Please suggest what I should change here.
Thanks in advance.
Pro tip: "Not working," of course, can mean anything from "my server caught fire and my data center burned to the ground," to "all my values were changed to 42," to "the operation had no effect." Understand your errors. Check the errors that come back from operations like mysqli_query().
That being said...
You're slurping up your entire CSV file's contents and jamming it into a single text string. It's likely that method falls over when the csv file is too long.
There's a limit on the length of a MySQL query. It's large, but not infinite, and it's set by the max_allowed_packet parameter in the server configuration. Read this: https://dev.mysql.com/doc/refman/5.7/en/packet-too-large.html
php can run out of memory as well.
How to fix? Process your CSV file not all at once, but in chunks of fifty rows or so. Once you've read fifty rows, do an insert.
Pro tip 2: Sanitize your input data. What will happen to your table if somebody puts a row like this in an uploaded CSV file?
"bwahahahaha!'; -- DROP TABLE csv_table;", "something", "somethingelse"
You may be OK. But do you want to run a risk like this?
Be aware that the public net is crawling with cybercriminals, and somebody will detect and exploit this kind of vulnerability in days if you leave it running.
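Putting both tips together, here is a minimal sketch of a chunked, parameterized import. It assumes the same mysqli connection ($mysqliConn), table, semicolon delimiter, and column mapping as the question, and treats all four values as strings; the chunk size of 50 is arbitrary.
<?php
// Sketch: insert CSV rows in chunks of 50 with a prepared statement,
// so the query never grows unbounded and the values are parameterized.
$stmt = $mysqliConn->prepare(
    "INSERT INTO csv_table (`column1`, `column2`, `column3`, `column4`) VALUES (?, ?, ?, ?)"
);

$csvfile = fopen($file, 'r');
$count = 0;
$mysqliConn->begin_transaction();

while (($row = fgetcsv($csvfile, 1024, ';')) !== false) {
    $stmt->bind_param('ssss', $row[2], $row[4], $row[0], $row[1]);
    if (!$stmt->execute()) {
        die('Insert failed: ' . $stmt->error);
    }
    if (++$count % 50 === 0) {       // commit every 50 rows
        $mysqliConn->commit();
        $mysqliConn->begin_transaction();
    }
}
$mysqliConn->commit();
fclose($csvfile);
This is only a sketch, not a drop-in replacement; adjust the chunk size and add handling for short or malformed rows as needed.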

Importing CSV with odd rows into MySQL

I'm faced with a problematic CSV file that I have to import to MySQL.
Either through the use of PHP and then insert commands, or straight through MySQL's load data infile.
I have attached a partial screenshot of how the data within the file looks:
The values I need to insert are below "ACC1000" so I have to start at line 5 and make my way through the file of about 5500 lines.
It's not possible to skip to each next line because for some Accounts there are multiple payments as shown below.
I have been trying to get to the next row by scanning the rows for the occurrence of "ACC"
if (strpos($data[$c], 'ACC') !== FALSE) {
    echo "Yep ";
} else {
    echo "Nope ";
}
I know it's crude, but I really don't know where to start.
If you have a (foreign key) constraint defined in your target table such that records with a blank value in the type column will be rejected, you could use MySQL's LOAD DATA INFILE to read the first column into a user variable (which is carried forward into subsequent records) and apply its IGNORE keyword to skip those "records" that fail the FK constraint:
LOAD DATA INFILE '/path/to/file.csv'
IGNORE
INTO TABLE my_table
CHARACTER SET utf8
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 4 LINES
(@a, type, date, terms, due_date, class, aging, balance)
SET account_no = @account_no := IF(@a='', @account_no, @a)
There are several approaches you could take.
1) You could go with @Jorge Campos' suggestion and read the file line by line, using PHP code to skip the lines you don't need and insert the ones you want into MySQL. A potential disadvantage of this approach with a very large file is that you will either have to run a bunch of little queries or build up a larger one, and it could take some time to run.
2) You could process the file and remove any rows/columns that you don't need, leaving the file in a format that can be inserted directly into mysql via command line or whatever.
Based on which approach you decide to take, either myself or the community can provide code samples if you need them.
This snippet should get you going in the right direction:
$file = '/path/to/something.csv';
if ( ! $fh = fopen($file, 'r') ) { die('bad file'); }
if ( ! $headers = fgetcsv($fh) ) { die('bad data'); }
while ($line = fgetcsv($fh)) {
    echo var_export($line, true) . "\n";
    if ( preg_match('/^ACC/', $line[0]) ) { echo "record begin\n"; }
}
fclose($fh);
http://php.net/manual/en/function.fgetcsv.php

Retrieve Data From Txt File And Save It in Database

I have data in a txt file that I want to save in a database.
mytxtfile
xxxxxxxxx
xxxxxxxxx
xxxxxxxxx
I want to save each line in the database...
I want to make a PHP file which takes the first line from the txt file and saves it in the DB, then the second line, and so on.
myphpcode
$myFile = "data.txt";
$fh = fopen($myFile, 'r');
$theData = fgets($fh);
$con=mysqli_connect("example.com","peter","abc123","my_db");
// Check connection
if (mysqli_connect_errno())
{
echo "Failed to connect to MySQL: " . mysqli_connect_error();
}
mysqli_query($con,"INSERT INTO Persons (ID)
VALUES (''.$theData.'')");
mysqli_close($con);
?>
This code is only saving the first line.
I'll answer your question first. Your code is fine in its present condition, but the problem is that only one line is being read. If you want to read the complete file content, you'll have to loop through each line. Change your code to include a while loop, like so:
$fh = fopen('file.txt', 'r');
while (!feof($fh)) {
    $theData = fgets($fh);
    echo $theData; //save it to DB
}
feof() checks for end-of-file on a file pointer. If EOF (end of file) is reached, it'll exit the loop.
Alternative solution:
Using LOAD DATA INFILE:
Example:
LOAD DATA INFILE 'data.txt' INTO TABLE Persons
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES;
The advantage of using LOAD DATA INFILE is that it's considerably faster than other methods, but the speed may vary depending on other factors.
You should try something like:
while (!feof($fh)) {
    $line = fgets($fh);
    mysqli_query($con, "INSERT INTO Persons (ID)
        VALUES ('" . $line . "')");
}
The loop goes through each line of the file and then inserts it into the database.
It is not the most efficient way, especially if you have a very large number of lines, but it does the trick. If you do have very many lines, then you should try to do everything in a single INSERT.
it's as simple as:
$ip = fopen('data.txt', "r");
$line = trim(fgets($ip));
while (!feof($ip)) {
    // do something with the line in $line
    //....
    // read the next line
    $line = trim(fgets($ip));
}
Two possibilities for this question:
The first: just parse the file using SplFileObject: http://php.net/manual/fr/class.splfileobject.php
In my mind, using the SPL objects is always a good way of coding in PHP.
The second consists of using the LOAD DATA INFILE functionality of MySQL: http://dev.mysql.com/doc/refman/5.6/en/load-data.html
This is a good way if you don't need to work with the data from the file but just want to store it.
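A minimal sketch of the SplFileObject route, reusing the table and connection details from the question; the flags and the prepared statement are my additions:
<?php
// Sketch: iterate the text file with SplFileObject and insert each line via mysqli.
$file = new SplFileObject('data.txt');
$file->setFlags(SplFileObject::DROP_NEW_LINE | SplFileObject::SKIP_EMPTY);

$con  = mysqli_connect("example.com", "peter", "abc123", "my_db");
$stmt = mysqli_prepare($con, "INSERT INTO Persons (ID) VALUES (?)");

foreach ($file as $line) {
    mysqli_stmt_bind_param($stmt, 's', $line);
    mysqli_stmt_execute($stmt);
}
mysqli_close($con);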

csv data import into mysql database using php

Hi, I need to import a CSV file of 15000 lines.
I'm using the fgetcsv function and parsing each and every line,
but I get a timeout error every time.
The process is too slow and the data is only partially imported.
Is there any way to make the data import faster and more efficient?
if (isset($_POST['submit']))
{
    $fname = $_FILES['sel_file']['name'];
    $var = 'Invalid File';
    $chk_ext = explode(".", $fname);
    if (strtolower($chk_ext[1]) == "csv")
    {
        $filename = $_FILES['sel_file']['tmp_name'];
        $handle = fopen($filename, "r");
        $res = mysql_query("SELECT * FROM vpireport");
        $rows = mysql_num_rows($res);
        if ($rows >= 0)
        {
            mysql_query("DELETE FROM vpireport") or die(mysql_error());
            for ($i = 1; ($data = fgetcsv($handle, 10000, ",")) !== FALSE; $i++)
            {
                if ($i == 1)
                    continue;
                $sql = "INSERT into vpireport
                        (item_code,
                        company_id,
                        purchase,
                        purchase_value)
                        values
                        (" . $data[0] . ",
                        " . $data[1] . ",
                        " . $data[2] . ",
                        " . $data[3] . ")";
                //echo "$sql";
                mysql_query($sql) or die(mysql_error());
            }
        }
        fclose($handle);
?>
        <script language="javascript">
        alert("Successfully Imported!");
        </script>
<?
    }
}
The problem is that every time it gets stuck partway through the import process and displays the following errors:
Error 1 :
Fatal Error: Maximum time limit of 30 seconds exceeded at line 175.
Error 2 :
You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'S',0,0)' at line 1
I'm not able to figure out this error...
The file is only partially imported every time, only around 200-300 lines out of 10000 lines.
If you are doing a MySQL insert for each line, you can instead build a batch INSERT string for every 500 lines of CSV and then execute it all at once. It'll be faster.
Another solution is to read the file with an offset (a sketch follows after these steps):
1. Read the first 500 lines,
2. Insert them into the database,
3. Redirect to csvimporter.php?offset=500,
4. Return to step 1 and read the 500 lines starting with offset 500 this time.
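A rough sketch of that approach, assuming the uploaded file has already been saved to a known path on the server so each request can reopen it; the path, the chunk size, and the actual insert are placeholders:
<?php
// Sketch: import 500 rows per request, then redirect to continue at the next offset.
$offset = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;
$chunk  = 500;

$handle = fopen('/path/to/uploaded.csv', 'r');
// Skip the rows already handled by previous requests.
for ($i = 0; $i < $offset && fgetcsv($handle, 10000, ",") !== FALSE; $i++) {
}

$rows = 0;
while ($rows < $chunk && ($data = fgetcsv($handle, 10000, ",")) !== FALSE) {
    // ... insert $data into vpireport here, as in the question ...
    $rows++;
}
fclose($handle);

if ($rows === $chunk) {
    // More rows may remain: hand off to the next request.
    header('Location: csvimporter.php?offset=' . ($offset + $chunk));
    exit;
}
echo "Import finished";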
Another solution would be setting the timeout limit to 0 with:
set_time_limit(0);
Set this at the top of the page:
set_time_limit ( 0 )
It will make the page run endlessly. That is not recommended, but if you have no other option it can't be helped!
You can consult the documentation here.
To make it faster, you need to check the various SQL statements you are sending and see if you have the proper indexes created.
If you are calling user-defined functions and these functions refer to global variables, then you can minimize the time taken even more by passing those variables to the functions and changing the code so that the functions refer to the passed variables. Referring to global variables is slower than using local variables.
You can make use of LOAD DATA INFILE, which is a MySQL utility; this is much faster than fgetcsv.
More information is available at:
http://dev.mysql.com/doc/refman/5.1/en/load-data.html
Simply use this at the beginning of your PHP import page:
ini_set('max_execution_time',0);
PROBLEM:
There is a huge performance impact in the way you INSERT data into your table. For every one of your records you send an INSERT request to the server; 15000 INSERT requests is huge!
SOLUTION:
You should group your data the way mysqldump does. In your case you just need three INSERT statements, not 15000, as below:
before the loop write:
$q = "INSERT into vpireport(item_code,company_id,purchase,purchase_value)values";
And inside the loop concatenate the records to the query as below:
$q .= "($data[0],$data[1],$data[2],$data[3]),";
Inside the loop, check whether the counter has reached 5000, 10000, or 15000; if so, insert the accumulated data into the vpireport table and then reset $q to INSERT INTO... again.
run the query and enjoy!!!
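A minimal sketch of that pattern, flushing every 5000 rows. Quoting the values is my addition (the question's Error 2 comes from unquoted strings), and escaping the data would still be needed on top of this:
<?php
// Sketch: accumulate rows and flush a multi-row INSERT every 5000 records.
$base  = "INSERT into vpireport(item_code,company_id,purchase,purchase_value)values";
$q     = $base;
$count = 0;

for ($i = 1; ($data = fgetcsv($handle, 10000, ",")) !== FALSE; $i++) {
    if ($i == 1)
        continue;                                  // skip the header row
    $q .= "('$data[0]','$data[1]','$data[2]','$data[3]'),";
    if (++$count % 5000 === 0) {
        mysql_query(rtrim($q, ",")) or die(mysql_error());
        $q = $base;                                // start a new batch
    }
}
if ($q !== $base) {                                // flush the remaining rows
    mysql_query(rtrim($q, ",")) or die(mysql_error());
}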
If this is a one-time exercise, PHPMyAdmin supports Import via CSV.
import-a-csv-file-to-mysql-via-phpmyadmin
He also notes the use of MySQL's LOAD DATA LOCAL INFILE. This is a very fast way to import data into a database table. load-data MySQL Docs link
EDIT:
Here is some pseudo-code:
// perform the file upload
$absolute_file_location = upload_file();
// connect to your MySQL database as you would normally
your_mysql_connection();
// execute the query
$query = "LOAD DATA LOCAL INFILE '" . $absolute_file_location .
"' INTO TABLE `table_name`
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(column1, column2, column3, etc)";
$result = mysql_query($query);
Obviously, you need to ensure good SQL practices to prevent injection, etc.

break down the large number of rows in CSV and process them as a batch wise php

I am about to edit and upload a CSV with more than 50000 records (shopping cart products), and it should update a number of tables in the system. I'm using Zend Framework with my shopping cart.
I'm planning to break them down (50000 CSV records) in memory before processing them batch-wise using PHP/MySQL.
Can anyone please give me advice on this?
What I'm up to now is:
public function getDataFromPath($path = null) {
    if ($path == null) {
        $path = $this->_path;
    }
    ini_set("auto_detect_line_endings", 1);
    $fp = fopen($path, "r");
    while (($line = fgetcsv($fp, 5000, ",")) !== FALSE) {
        $line = self::trimArray($line);
        $this->data[] = $line;
    }
    fclose($fp);
    return $this->data;
}
regards
roshan
No RDBMS out there should have problems with 50,000 rows. That's nothing. There's no need to process them batch-wise.
Just use the LOAD DATA INFILE command and you will be fine.
For an example see here: LOAD DATA INFILE easily convert YYYYMMDD to YYYY-MM-DD?
UPDATE (because of the comment on Ion Wood's answer): To create a CSV file you can use the SELECT ... INTO OUTFILE command.
SELECT a,b,a+b INTO OUTFILE '/tmp/result.txt'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM test_table;
For more info, see the manual.
This is a job for...
DATABASE MAN!!!
Load your CSV directly into a table using the LOAD DATA INFILE business and do whatever magic you need to after that.
