Hi, I need to import a CSV file of 15,000 lines.
I'm using the fgetcsv function and parsing each and every line,
but I get a timeout error every time.
The process is too slow and the data is only partially imported.
Is there any way to make the import faster and more efficient?
if(isset($_POST['submit']))
{
    $fname = $_FILES['sel_file']['name'];
    $var = 'Invalid File';
    $chk_ext = explode(".", $fname);
    if(strtolower($chk_ext[1]) == "csv")
    {
        $filename = $_FILES['sel_file']['tmp_name'];
        $handle = fopen($filename, "r");
        $res = mysql_query("SELECT * FROM vpireport");
        $rows = mysql_num_rows($res);
        if($rows >= 0)
        {
            mysql_query("DELETE FROM vpireport") or die(mysql_error());
            for($i = 1; ($data = fgetcsv($handle, 10000, ",")) !== FALSE; $i++)
            {
                if($i == 1)
                    continue;
                $sql = "INSERT into vpireport
                        (item_code,
                        company_id,
                        purchase,
                        purchase_value)
                        values
                        (".$data[0].",
                        ".$data[1].",
                        ".$data[2].",
                        ".$data[3].")";
                //echo "$sql";
                mysql_query($sql) or die(mysql_error());
            }
        }
        fclose($handle);
?>
<script language="javascript">
alert("Successfully Imported!");
</script>
<?
    }
The problem is that every time it gets stuck partway through the import process and displays the following errors:
Error 1 :
Fatal Error: Maximum time limit of 30 seconds exceeded at line 175.
Error 2 :
You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'S',0,0)' at line 1
This error I am not able to track down...
The file is only partially imported every time, only around 200-300 lines out of 10,000.
If you are doing a MySQL INSERT for each line, you can instead build one batch INSERT statement for every 500 lines of CSV and execute it at once. It will be faster.
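A rough, untested sketch of that idea, using the same mysql_* calls as the code above (I have also quoted and escaped each value, since unquoted string fields are what trigger your second error):
// Sketch only: buffer rows and flush one multi-row INSERT every 500 CSV lines.
$values = array();
for ($i = 1; ($data = fgetcsv($handle, 10000, ",")) !== FALSE; $i++) {
    if ($i == 1) continue; // skip the header row

    // Quote and escape every field so string values such as 'S' cannot break the SQL.
    $values[] = "('" . mysql_real_escape_string($data[0]) . "','"
                     . mysql_real_escape_string($data[1]) . "','"
                     . mysql_real_escape_string($data[2]) . "','"
                     . mysql_real_escape_string($data[3]) . "')";

    if (count($values) == 500) {
        mysql_query("INSERT INTO vpireport (item_code, company_id, purchase, purchase_value) VALUES "
            . implode(",", $values)) or die(mysql_error());
        $values = array();
    }
}
// Flush the rows left over after the loop.
if (!empty($values)) {
    mysql_query("INSERT INTO vpireport (item_code, company_id, purchase, purchase_value) VALUES "
        . implode(",", $values)) or die(mysql_error());
}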
Another solution is to read the file with an offset (see the sketch after these steps):
1) Read the first 500 lines,
2) Insert them into the database,
3) Redirect to csvimporter.php?offset=500,
4) Return to step 1 and read the 500 lines starting at offset 500 this time.
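A rough sketch of that flow; csvimporter.php is a hypothetical script name and the stored upload path is an assumption, so wire it into your own upload handling:
// csvimporter.php -- sketch of the offset approach, not a drop-in solution
$batchSize = 500;
$offset = isset($_GET['offset']) ? (int)$_GET['offset'] : 0;

$handle = fopen('/path/to/uploaded.csv', 'r'); // the file saved by your upload step

// Skip the rows already imported by earlier requests.
for ($i = 0; $i < $offset && fgetcsv($handle, 10000, ',') !== FALSE; $i++);

// Import the next batch.
$processed = 0;
while ($processed < $batchSize && ($data = fgetcsv($handle, 10000, ',')) !== FALSE) {
    // ... escape the values and INSERT $data here ...
    $processed++;
}
fclose($handle);

if ($processed == $batchSize) {
    // There may be more rows: hand the rest off to a fresh request.
    header('Location: csvimporter.php?offset=' . ($offset + $batchSize));
    exit;
}
echo "Import finished";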
Another solution would be setting the timeout limit to 0 with:
set_time_limit(0);
Set this at the top of the page:
set_time_limit(0);
It will let the page run endlessly. That is not recommended, but if you have no other option, it can't be helped!
You can consult the documentation here.
To make it faster, you need to check the various SQL statements you are sending and see whether you have the proper indexes created.
If you are calling user-defined functions that refer to global variables, you can reduce the time taken even more by passing those variables to the function and changing the code so that the function uses the passed-in values. Referring to global variables is slower than using local variables.
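For illustration, a hypothetical helper written both ways; the per-call difference is small, but it adds up inside a 15,000-iteration import loop:
// Slower: the function looks up a global on every call.
function formatRowGlobal(array $row) {
    global $delimiter;
    return implode($delimiter, $row);
}

// Faster: pass the value in, so the function only touches local variables.
function formatRowLocal(array $row, $delimiter) {
    return implode($delimiter, $row);
}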
You can make use of LOAD DATA INFILE, which is a MySQL facility; it is much faster than looping over fgetcsv.
More information is available at
http://dev.mysql.com/doc/refman/5.1/en/load-data.html
Simply use this at the beginning of your PHP import page:
ini_set('max_execution_time',0);
PROBLEM:
There is a huge performance impact in the way you INSERT data into your table. For every one of your records you send an INSERT request to the server; 15,000 INSERT requests is huge!
SOLUTION:
You should group your data the way mysqldump does. In your case you need just three INSERT statements, not 15,000, as below:
Before the loop, write:
$q = "INSERT into vpireport(item_code,company_id,purchase,purchase_value)values";
And inside the loop, concatenate the records onto the query as below (quoting and escaping each value so string fields cannot break the SQL):
$q .= "('".mysql_real_escape_string($data[0])."','".mysql_real_escape_string($data[1])."','".mysql_real_escape_string($data[2])."','".mysql_real_escape_string($data[3])."'),";
Inside the loop, check whether the counter has reached 5000 (or 10000, or 15000); when it has, strip the trailing comma, insert the accumulated data into the vpireport table, and then set $q back to INSERT INTO ... again, as sketched below.
run the query and enjoy!!!
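A minimal sketch of that check, assuming a $counter variable that is incremented once per CSV line inside the same loop:
// Inside the loop, after appending the current row to $q:
if ($counter % 5000 == 0) {
    mysql_query(rtrim($q, ",")) or die(mysql_error()); // drop the trailing comma and send the batch
    $q = "INSERT into vpireport(item_code,company_id,purchase,purchase_value)values";
}

// After the loop, flush whatever did not fill a complete batch:
if ($counter % 5000 != 0) {
    mysql_query(rtrim($q, ",")) or die(mysql_error());
}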
If this is a one-time exercise, PHPMyAdmin supports Import via CSV.
import-a-csv-file-to-mysql-via-phpmyadmin
He also notes the use of MySQL's LOAD DATA LOCAL INFILE. This is a very fast way to import data into a database table. load-data MySQL Docs link
EDIT:
Here is some pseudo-code:
// perform the file upload
$absolute_file_location = upload_file();
// connect to your MySQL database as you would normally
your_mysql_connection();
// execute the query
$query = "LOAD DATA LOCAL INFILE '" . $absolute_file_location .
"' INTO TABLE `table_name`
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(column1, column2, column3, etc)";
$result = mysql_query($query);
Obviously, you need to ensure good SQL practices to prevent injection, etc.
Related
I have a CSV file upload feature. It works when the CSV has up to about 30k rows, but whenever the file has more than 30k rows the bulk insert stops working. Below is my code for reading the CSV and inserting into the table.
$csvfile = fopen($file, 'r');
$i = 0;
$data4 = "";
while (!feof($csvfile)) {
    $csv_data[] = fgets($csvfile, 1024);
    $csv_array = explode(";", $csv_data[$i]);
    $data4 .= "('".$csv_array[2]."', '".$csv_array[4]."', '".$csv_array[0]."','".$csv_array[1]."'),";
    $i++;
}
fclose($csvfile);
$data4 = substr($data4, 0, -1);
$sql = "INSERT INTO csv_table(`column1`,`column2`,`column3`,`column4`) VALUES $data4";
mysqli_query($mysqliConn, $sql);
I only have this issue when there are more than 30k records. Please suggest what I should change here.
Thanks in advance.
Pro tip: "Not working," of course, can mean anything from "my server caught fire and my data center burned to the ground," to "all my values were changed to 42," to "the operation had no effect." Understand your errors. Check the errors that come back from operations like mysqli_query().
That being said...
You're slurping up your entire CSV file's contents and jamming it into a single text string. It's likely that method falls over when the csv file is too long.
There's a limit on the length of a MySQL query. It's large, but not infinite, and it's governed by the server's max_allowed_packet setting. Read this: https://dev.mysql.com/doc/refman/5.7/en/packet-too-large.html
PHP can run out of memory as well.
How to fix? Process your CSV file not all at once, but in chunks of fifty rows or so. Once you've read fifty rows, do an insert.
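A rough sketch of that against the code in the question, switching to fgetcsv with a ';' delimiter (which also copes with quoted fields) and escaping every value through your existing $mysqliConn connection:
$chunkSize = 50;
$values = array();

if (($csvfile = fopen($file, 'r')) !== false) {
    while (($csv_array = fgetcsv($csvfile, 1024, ';')) !== false) {
        // Escape each field before it is glued into the SQL string.
        $values[] = "('" . mysqli_real_escape_string($mysqliConn, $csv_array[2]) . "','"
                         . mysqli_real_escape_string($mysqliConn, $csv_array[4]) . "','"
                         . mysqli_real_escape_string($mysqliConn, $csv_array[0]) . "','"
                         . mysqli_real_escape_string($mysqliConn, $csv_array[1]) . "')";

        if (count($values) == $chunkSize) {
            // Send one multi-row INSERT per chunk instead of one giant query.
            mysqli_query($mysqliConn, "INSERT INTO csv_table(`column1`,`column2`,`column3`,`column4`) VALUES " . implode(',', $values))
                or die(mysqli_error($mysqliConn));
            $values = array();
        }
    }
    fclose($csvfile);

    // Flush the final, partial chunk.
    if ($values) {
        mysqli_query($mysqliConn, "INSERT INTO csv_table(`column1`,`column2`,`column3`,`column4`) VALUES " . implode(',', $values))
            or die(mysqli_error($mysqliConn));
    }
}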
Pro tip 2: Sanitize your input data. What will happen to your table if somebody puts a row like this in an uploaded CSV file?
"bwahahahaha!'; -- DROP TABLE csv_table;", "something", "somethingelse"
You may be OK. But do you want to run a risk like this?
Be aware that the public net is crawling with cybercriminals, and somebody will detect and exploit this kind of vulnerability in days if you leave it running.
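One hedged way to close that hole, sticking with your existing mysqli connection and assuming $csvfile is the open handle from your code, is a prepared statement; the driver then keeps the data separate from the SQL:
$stmt = mysqli_prepare($mysqliConn,
    "INSERT INTO csv_table(`column1`,`column2`,`column3`,`column4`) VALUES (?, ?, ?, ?)");
mysqli_stmt_bind_param($stmt, 'ssss', $col1, $col2, $col3, $col4);

while (($csv_array = fgetcsv($csvfile, 1024, ';')) !== false) {
    // Same column order as in the question's code; bound values are sent as data, never as SQL.
    $col1 = $csv_array[2];
    $col2 = $csv_array[4];
    $col3 = $csv_array[0];
    $col4 = $csv_array[1];
    mysqli_stmt_execute($stmt) or die(mysqli_stmt_error($stmt));
}
mysqli_stmt_close($stmt);
Row-by-row execution is slower than the chunked insert sketched above, so if speed matters you can additionally wrap the loop in a transaction.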
I am trying to import a CSV file into my SQL database. This is what I have:
if ($_FILES['csvFile']['size'] > 0)
{
    $file = $_FILES['csvFile']['tmp_name'];
    $handle = fopen($file, "r");
    do {
        if ($data[0])
        {
            $insert_query = "REPLACE INTO `teacherNames` SET
                `schoolName` = '".addslashes($schoolname)."',
                `teacherName` = '".addslashes($data[0])."'
                ;";
            $result = mysql_query($insert_query);
            echo $insert_query; // SEE RESULTING QUERY BELOW
            echo $data[0]." added\n<br />";
        }
    }
    while ($data = fgetcsv($handle, 1000, ",", "'"));
The CSV file has 3 records and it looks correct. The procedure works to an extent but for some reason it is not reading the CSV file correctly and the resulting query is like this:
REPLACE INTO `teacherNames` SET `schoolName` = 'Brooks', `teacherName` = 'RMG JMC PMC';
whereas I would expect to get 3 separate queries, one for each record. It does not seem to be reading the CSV file as 3 separate records, but as 1. Can anyone see why?
UPDATE:
The CSV contents are:
RMG
JMC
PMC
The answer from Julio Martins is better if you have the file on the same computer as the MySQL server.
But if you need to read the file from inside PHP, there is a note from PHP.net at http://php.net/manual/en/function.fgetcsv.php:
Note: If PHP is not properly recognizing the line endings when reading
files either on or created by a Macintosh computer, enabling the
auto_detect_line_endings run-time configuration option may help
resolve the problem.
What are the line endings in your file? Since all lines are being read as one, that could be your case, I guess.
To turn auto_detect_line_endings on, use ini_set("auto_detect_line_endings", true); as Pistachio notes at http://php.net/manual/en/filesystem.configuration.php#107333
Use while instead of do-while, so that $data is assigned before it is first checked:
while ($data = fgetcsv($handle,1000,",","'")) {
//...
}
Try LOAD DATA:
LOAD DATA INFILE '{$filepath}'
INTO TABLE `{$table}`
FIELDS TERMINATED BY ','
It is cleaner.
I'm faced with a problematic CSV file that I have to import into MySQL, either through the use of PHP and INSERT commands, or straight through MySQL's LOAD DATA INFILE.
I have attached a partial screenshot of how the data within the file looks:
The values I need to insert are below "ACC1000" so I have to start at line 5 and make my way through the file of about 5500 lines.
It's not possible to skip to each next line because for some Accounts there are multiple payments as shown below.
I have been trying to get to the next row by scanning the rows for the occurrence of "ACC"
if (strpos($data[$c], 'ACC') !== FALSE) {
    echo "Yep ";
} else {
    echo "Nope ";
}
I know it's crude, but I really don't know where to start.
If you have a (foreign key) constraint defined in your target table such that records with a blank value in the type column will be rejected, you could use MySQL's LOAD DATA INFILE to read the first column into a user variable (which is carried forward into subsequent records) and apply its IGNORE keyword to skip those "records" that fail the FK constraint:
LOAD DATA INFILE '/path/to/file.csv'
IGNORE
INTO TABLE my_table
CHARACTER SET utf8
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 4 LINES
(@a, type, date, terms, due_date, class, aging, balance)
SET account_no = @account_no := IF(@a = '', @account_no, @a)
There are several approaches you could take.
1) You could go with @Jorge Campos' suggestion and read the file line by line, using PHP code to skip the lines you don't need and insert the ones you want into MySQL. A potential disadvantage of this approach with a very large file is that you will either have to run a bunch of little queries or build up a larger one, and it could take some time to run.
2) You could process the file and remove any rows/columns that you don't need, leaving the file in a format that can be inserted directly into MySQL via the command line or whatever.
Based on which approach you decide to take, either myself or the community can provide code samples if you need them.
This snippet should get you going in the right direction:
$file = '/path/to/something.csv';
if( ! $fh = fopen($file, 'r') ) { die('bad file'); }
if( ! $headers = fgetcsv($fh) ) { die('bad data'); }
while($line = fgetcsv($fh)) {
    echo var_export($line, true) . "\n";
    if( preg_match('/^ACC/', $line[0]) ) { echo "record begin\n"; }
}
fclose($fh);
http://php.net/manual/en/function.fgetcsv.php
I have a large database table, approximately 5 GB, and I want to get a current snapshot of it using "SELECT * FROM MyTableName". I am using PDO in PHP to interact with the database, so I prepare the query and then execute it:
// Execute the prepared query
$result->execute();
$resultCollection = $result->fetchAll(PDO::FETCH_ASSOC);
This is not an efficient way, as a lot of memory is used to store the result in an associative array holding roughly 5 GB of data.
My final goal is to collect the data returned by the SELECT query into a CSV file and put that CSV file at an FTP location from which the client can get it.
Another option I thought of was:
SELECT * INTO OUTFILE "c:/mydata.csv"
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY "\n"
FROM my_table;
But I am not sure whether this would work, as I have a cron job that initiates the complete process and there is no CSV file to begin with. So basically, for this approach, the PHP script will have to:
Create a CSV file.
Do a SELECT query on the database.
Store the SELECT query result in the CSV file.
What would be the best or most efficient way to do this kind of task?
Any suggestions!
You can use the PHP function fputcsv (see the PHP manual) to write single lines of CSV into a file. In order not to run into the memory problem, instead of fetching the whole result set at once, just select it and then iterate over the result:
$fp = fopen('file.csv', 'w');
$result->execute();
while ($row = $result->fetch(PDO::FETCH_ASSOC)) {
// and here you can simply export every row to a file:
fputcsv($fp, $row);
}
fclose($fp);
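If the client also expects a header row, a small extension of the same loop (assuming the column names returned by the query are acceptable as headers):
$fp = fopen('file.csv', 'w');
$result->execute();

$headerWritten = false;
while ($row = $result->fetch(PDO::FETCH_ASSOC)) {
    if (!$headerWritten) {
        // Write the column names once, taken from the keys of the first row.
        fputcsv($fp, array_keys($row));
        $headerWritten = true;
    }
    fputcsv($fp, $row);
}
fclose($fp);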
I have thousands of records parsed from a huge XML file to be inserted into a database table using PHP and MySQL. My problem is that it takes too long to insert all the data into the table. Is there a way to split my data into smaller groups so that the insertion is done group by group? How can I set up a script that will process the data 100 rows at a time, for example? Here's my code:
foreach ($itemList as $key => $item) {
    $download_records = new DownloadRecords();
    // check first if the content exists
    if (!$download_records->selectRecordsFromCondition("WHERE Guid=".$guid)) {
        /* do an insert here */
    } else {
        /* do an update */
    }
}
*note: $itemList is around 62,000 and still growing.
Using a for loop?
But the quickest option to load data into MySQL is to use the LOAD DATA INFILE command; you can create the file to load via PHP and then feed it to MySQL via a different process (or as a final step in the original process).
If you cannot use a file, use the following syntax:
insert into table(col1, col2) VALUES (val1,val2), (val3,val4), (val5, val6)
so you reduce the total number of statements to run.
EDIT: Given your snippet, it seems you can benefit from the INSERT ... ON DUPLICATE KEY UPDATE syntax of MySQL, letting the database do the work and reducing the amount of queries. This assumes your table has a primary key or unique index.
To hit the DB every 100 rows you can do something like this (PLEASE REVIEW IT AND ADAPT IT TO YOUR ENVIRONMENT):
$insertOrUpdateStatement1 = "INSERT INTO table (col1, col2) VALUES ";
$insertOrUpdateStatement2 = "ON DUPLICATE KEY UPDATE ";
$counter = 0;
$queries = array();

foreach ($itemList as $key => $item) {
    $val1 = escape($item->col1); // escape is a function that makes the input
                                 // safe from SQL injection; it depends on how
                                 // you are accessing the DB
    $val2 = escape($item->col2);

    $queries[] = $insertOrUpdateStatement1.
        "('$val1','$val2') ".$insertOrUpdateStatement2.
        "col1 = '$val1', col2 = '$val2'";
    $counter++;

    if ($counter % 100 == 0) {
        executeQueries($queries);
        $queries = array();
        $counter = 0;
    }
}

// Flush the queries left over once the loop ends.
if ($queries) {
    executeQueries($queries);
}
And executeQueries would grab the array and send a single multiple query:
function executeQueries($queries) {
    $data = "";
    foreach ($queries as $query) {
        $data .= $query.";\n";
    }
    executeQuery($data);
}
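executeQuery is left open above; if you happen to be on mysqli, one possible version (an assumption, since the answer does not say how the DB is accessed) sends the semicolon-separated batch in a single round trip with mysqli_multi_query:
function executeQuery($data) {
    global $mysqli; // assumes an existing mysqli connection

    if (!mysqli_multi_query($mysqli, $data)) {
        die(mysqli_error($mysqli));
    }
    // Drain every result so the connection is ready for the next batch.
    do {
        if ($result = mysqli_store_result($mysqli)) {
            mysqli_free_result($result);
        }
    } while (mysqli_more_results($mysqli) && mysqli_next_result($mysqli));
}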
Yes, just do what you'd expect to do.
You should not try to do bulk insertion from a web application if you think you might hit a timeout, etc. Instead, drop the file somewhere and have a daemon or cron job pick it up and run a batch job (if running from cron, be sure that only one instance runs at once).
As said before, you should put it in a temp directory with a cron job to process the files, in order to avoid timeouts (or the user losing their network connection).
Use the web only for uploads.
If you really want to import to the DB on a web request, you can either do a bulk insert or at least use a transaction, which should be faster.
Then limit the inserts to batches of 100 (committing your transaction whenever a counter hits count % 100 == 0) and repeat until all your rows are inserted, as in the sketch below.
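A hedged sketch of that batching with PDO and prepared statements; $pdo, my_table, col1/col2 and $rows are placeholders for your own connection, table, columns and parsed data:
$stmt = $pdo->prepare("INSERT INTO my_table (col1, col2) VALUES (?, ?)
                       ON DUPLICATE KEY UPDATE col1 = VALUES(col1), col2 = VALUES(col2)");

$pdo->beginTransaction();
$count = 0;
foreach ($rows as $row) {
    $stmt->execute(array($row['col1'], $row['col2']));
    if (++$count % 100 == 0) {
        // Commit every 100 rows and open a fresh transaction for the next batch.
        $pdo->commit();
        $pdo->beginTransaction();
    }
}
$pdo->commit(); // commit the final, possibly partial, batch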