Fastest way to import a CSV file into MySQL - PHP

This method imports a CSV file into MySQL, but with thousands of rows in the CSV file it takes a long time to load the data, which is annoying.
$deleterecords = "TRUNCATE TABLE discount"; //empty the table of its current records
mysql_query($deleterecords);
//readfile($name);
//Import uploaded file to Database
$handle = fopen($name, "r");
$i = 0;
while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
    if ($i > 0) {
        $import = "INSERT into discount(id,title,expired_date,amount,block) values('".$data[0]."','".$data[1]."','".$data[2]."','".$data[3]."','".$data[4]."')";
        //imports data serially to the allocated columns.
        mysql_query($import) or die(mysql_error()); //query
    }
    $i = 1;
}
fclose($handle);
//closing the handle
// print "Import done ";
?>
Can anyone suggest a faster method for uploading the data?

Instead of writing a script to pull in information from a CSV file, you can point MySQL directly at it and load the information using the following SQL syntax.
To import an Excel file into MySQL, first export it as a CSV file. Remove the CSV headers from the generated CSV file along with empty data that Excel may have put at the end of the CSV file.
You can then import it into a MySQL table by running:
load data local infile 'uniq.csv' into table tblUniq fields terminated by ','
enclosed by '"'
lines terminated by '\n'
(uniqName, uniqCity, uniqComments)
as described in: Import CSV file directly into MySQL

Use the LOAD DATA INFILE statement.
https://dev.mysql.com/doc/refman/5.1/en/load-data.html
Load the data into a temporary table and use that for inserting into your main table with one statement, roughly as sketched below.
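A rough sketch of that approach, reusing the question's discount table (the file name, the ENCLOSED BY clause, and the IGNORE 1 LINES header skip are assumptions, not from the original post):
-- Stage the CSV in a scratch table with the same shape as the target
CREATE TEMPORARY TABLE discount_staging LIKE discount;

LOAD DATA LOCAL INFILE 'discount.csv'
INTO TABLE discount_staging
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(id, title, expired_date, amount, block);

-- Move everything into the real table in a single statement
INSERT INTO discount (id, title, expired_date, amount, block)
SELECT id, title, expired_date, amount, block
FROM discount_staging;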

You can insert data this way; it is the default way to insert rows into a table, one INSERT per row.
$deleterecords = "TRUNCATE TABLE discount"; //empty the table of its current records
mysql_query($deleterecords);
//readfile($name);
//Import uploaded file to Database
$handle = fopen($name, "r");
$i = 0;
$ins = "INSERT into discount(id,title,expired_date,amount,block) values ";
while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
    if ($i > 0) {
        //build and run one INSERT per CSV row
        $import = $ins . "('".$data[0]."','".$data[1]."','".$data[2]."','".$data[3]."','".$data[4]."')";
        mysql_query($import) or die(mysql_error()); //query
    }
    $i = 1;
}
fclose($handle);
//closing the handle
// print "Import done ";
?>

Instead of issuing multiple INSERTs, build one big query and execute a single INSERT.
<?php
$deleterecords = "TRUNCATE TABLE discount"; //empty the table of its current records
mysql_query($deleterecords);
//readfile($name);
//Import uploaded file to Database
$handle = fopen($name, "r");
$i = 0;
$what_to_insert = array();
while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
    if ($i > 0) {
        array_push($what_to_insert, "('".$data[0]."','".$data[1]."','".$data[2]."','".$data[3]."','".$data[4]."')");
    }
    $i = 1;
}
fclose($handle);
if (count($what_to_insert) > 0) {
    $import = "INSERT into discount(id,title,expired_date,amount,block) values " . implode(",", $what_to_insert);
    mysql_query($import) or die(mysql_error()); //query
}
?>
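One caveat worth adding: with a very large file, a single statement can exceed MySQL's max_allowed_packet. A sketch that flushes the batch every 500 rows, still using this thread's old mysql_* API (the header-skipping fgetcsv call is an assumption about the file layout):
<?php
$handle = fopen($name, "r");
fgetcsv($handle, 1000, ","); // skip the header row
$batch = array();
while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
    // escape each field before building the VALUES tuple
    $batch[] = "('" . implode("','", array_map('mysql_real_escape_string', $data)) . "')";
    if (count($batch) >= 500) { // flush a full batch
        mysql_query("INSERT INTO discount (id,title,expired_date,amount,block) VALUES " . implode(",", $batch)) or die(mysql_error());
        $batch = array();
    }
}
if (count($batch) > 0) { // flush the remainder
    mysql_query("INSERT INTO discount (id,title,expired_date,amount,block) VALUES " . implode(",", $batch)) or die(mysql_error());
}
fclose($handle);
?>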

If phpMyAdmin is available, you can use its CSV import feature.

Related

Avoid inserting duplicate data in csv with PHP

I store data in a CSV file like this:
$out = fopen('wpo_stock.csv', 'a');
fputcsv($out, array($order_id,$product_id,$product_name,$product_quantity, $regular_price));
fclose($out);
That stores my data in the CSV perfectly.
The problem is that every time the page is refreshed, it inserts duplicate data into the CSV file.
How can I avoid inserting duplicate data, keyed by order_id? (order_id is a unique value in my project.)
Code updated:
$handle = fopen('wpo_stock.csv', 'r');
while (($data = fgetcsv($handle, ",")) !== FALSE) {
    if ($data[0] != $order_id) {
        $out = fopen('wpo_stock.csv', 'a');
        fputcsv($out, array($order_id, $product_id, $product_name, $product_quantity, $regular_price));
    }
    break;
}
fclose($handle);
I assume you are trying to overwrite your data every time the page refreshes. In that case, you should open your file like this: $out = fopen('wpo_stock.csv', 'w');.
But if you are trying to append new data, then you need to read all the data from your file and compare it with the new row first. A sketch of that append path (assuming order_id is the first CSV column, as in your fputcsv call):
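<?php
// Collect the order IDs already written to the file (column 0)
$existing = array();
if (($in = fopen('wpo_stock.csv', 'r')) !== FALSE) {
    while (($row = fgetcsv($in)) !== FALSE) {
        $existing[$row[0]] = true;
    }
    fclose($in);
}
// Append only if this order_id has not been seen before
if (!isset($existing[$order_id])) {
    $out = fopen('wpo_stock.csv', 'a');
    fputcsv($out, array($order_id, $product_id, $product_name, $product_quantity, $regular_price));
    fclose($out);
}
?>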

Loading a CSV file into a MySQL database using PHP

I am struggling to load information from a CSV file using PHP and want to save the contents of the CSV (mobile phone numbers) into a MySQL database table. The file contents look like this:
CSV file contents (one record per line, with no commas):
44762126064
447508751
4474669756
44771466603
444584871
445574805
447455471039
44777451345
447460345819
44793342963
44734838672
44752845528
4474537291
44779645078
I am trying to upload the CSV file using a form and submit. The code reads the CSV file and tries to write the contents into the MySQL table. The problem is that the code returns all the CSV records in one array element, like this:
Array(
[0] => 44762126064 447508751 4474669756 44771466603 444584871 445574805 447455471039 44777451345 447460345819 44793342963 44734838672 44752845528 4474537291 44779645078
);
and inserts the whole array as one record rather than one mobile number per row in the MySQL table.
The code:
if (isset($_POST['submit'])) {
    if (is_uploaded_file($_FILES['filename']['tmp_name'])) {
        echo "<h1>" . "File " . $_FILES['filename']['name'] . " uploaded successfully." . "</h1>";
        echo "<h2>Displaying contents:</h2>";
        # readfile($_FILES['filename']['tmp_name']);
    }
    //Import uploaded file to Database
    $handle = fopen($_FILES['filename']['tmp_name'], "r");
    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
        $import = "INSERT into mobilecsv(phoneMobile,status) values('$data[0]',0)";
        mysql_query($import) or die(mysql_error());
    }
    fclose($handle);
    print "Import done";
}
I have tried explode(), array_slice(), etc., but none of them helped me split the array contents so they can be saved as individual records/phone numbers in the MySQL database.
How can I split the contents of the first array element (as in my case) and save them as separate individual records, as they appear in my CSV? I am a new programmer and would appreciate your help in this regard. Thanks.
The problem is that fgetcsv isn't detecting the line endings in your file: there are no commas for it to split on, and bare \r line endings go unrecognized by default, so the whole file comes back as one record. Add this line at the top of your script, after the PHP opening tag:
ini_set("auto_detect_line_endings", true);
Then change your import code to:
//Import uploaded file to Database
    $handle = fopen($_FILES['filename']['tmp_name'], "r");
    while (($data = fgetcsv($handle)) !== FALSE) {
        $phoneMobile = $data[0];
        $import = "INSERT into mobilecsv(phoneMobile,status) values('$phoneMobile',0)";
        mysql_query($import) or die(mysql_error());
    }
    fclose($handle);
    print "Import done";
}

Importing a .csv file into MySQL in PHP

I have a .csv file but I am unable to import it into the database. I have parsed the .csv file using the code below. Can you please help me insert the data into MySQL?
My code is:
$fp = fopen('test.csv', 'r') or die("can't open file");
print "<table>\n";
while ($csv_line = fgetcsv($fp, 1024)) {
    print '<tr>';
    for ($i = 0, $j = count($csv_line); $i < $j; $i++) {
        print '<td>' . $csv_line[$i] . '</td>';
        $data[] = $csv_line[$i];
    }
    print "</tr>\n";
}
print "</table>\n";
fclose($fp) or die("can't close file");
In MySQL you can import data from a CSV file into a table very easily:
LOAD DATA LOCAL INFILE 'import.csv' INTO TABLE from_csv FIELDS TERMINATED BY ','  LINES TERMINATED BY '\n'  (FIELD1, FIELD2, FIELD3);
http://dev.mysql.com/doc/refman/5.0/en/load-data.html
Example:
http://www.techtrunch.com/linux/import-csv-file-mysql-table
The code you posted just reads the CSV file and renders it in the browser as an HTML table.
You mentioned importing into a MySQL table, but no MySQL details are listed. Create a MySQL table and map the fields you are importing to columns in that table. The easiest way for me is to use phpMyAdmin to generate the LOAD DATA SQL, although the LOAD DATA LOCAL INFILE mentioned in another answer will work as well if you understand enough to adapt it.
When I learned how to do it I used this tutorial.
http://vegdave.wordpress.com/2007/05/19/import-a-csv-file-to-mysql-via-phpmyadmin/
You can use this code. I hope this will be helpful.
//after uploading csv file from an upload file field
$handle = fopen($_FILES['csvfile']['tmp_name'], "r");
$header = fgetcsv($handle); // first row holds the column names
while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
    $import = "INSERT into csvtest($header[0], $header[1], $header[2], $header[3], $header[4], $header[5], $header[6]) values ('$data[0]', '$data[1]', '$data[2]', '$data[3]', '$data[4]', '$data[5]', '$data[6]')";
    mysql_query($import) or die(mysql_error());
}
fclose($handle);
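One caution I'd add: this interpolates raw CSV values straight into the SQL, so a stray quote in the data will break the query. A sketch of the same loop body with each value escaped first, using the same mysql_* API:
// Escape every field before it goes into the query string
$values = array_map('mysql_real_escape_string', $data);
$import = "INSERT into csvtest(" . implode(", ", $header) . ") values ('" . implode("', '", $values) . "')";
mysql_query($import) or die(mysql_error());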

How to loop through a CSV and run MySQL updates quicker?

I have a CSV file with 26,000 rows that I'm looping through, updating records (sometimes several at a time) in a table with 250,000+ records. At the moment it's taking ages! Is there a faster alternative (in code, MySQL, etc.)?
$row = 1;
if (($handle = fopen("zip-codes-database-DELUXE-BUSINESS2.csv", "r")) !== FALSE) {
    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
        if ($row > 1) {
            # area code is column 20, name is column 21
            $insert = "UPDATE " . TBLPREFIX . "cities SET data = '" . escape(serialize($data)) . "' WHERE area_code = " . $data[20] . " AND title = '" . trim(strtoupper($data[21])) . "'";
            mysql_query($insert) or die(mysql_error());
        }
        $row++;
    }
    fclose($handle);
}
Based on nothing concrete, I might try the following; a rough SQL sketch comes after the list.
get the CSV into a table via the command line or LOAD DATA INFILE
update the records from a temp table using a joined statement that matches the old and new rows
move the temp table back onto the original (delete/rename)
Seems like it would be faster, if a bit kludgy.
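A sketch of those steps. The staging columns are assumptions (the question only tells us about area_code, a name field, and cities.data), and since the original builds cities.data with PHP's serialize(), that value would have to be pre-built into the CSV or computed some other way:
-- 1. Stage the CSV (placeholder columns; the real file has many more)
CREATE TEMPORARY TABLE csv_import (
    area_code VARCHAR(8),
    name      VARCHAR(128),
    payload   TEXT  -- value destined for cities.data, pre-built in the CSV
);

LOAD DATA LOCAL INFILE 'zip-codes-database-DELUXE-BUSINESS2.csv'
INTO TABLE csv_import
FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'
IGNORE 1 LINES;

-- 2. One joined UPDATE instead of 26,000 single-row UPDATEs
UPDATE cities c
JOIN csv_import i
  ON c.area_code = i.area_code
 AND c.title = TRIM(UPPER(i.name))
SET c.data = i.payload;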

How to handle large CSV files to insert into MySQL

I have a CSV file (saved as .txt) that I am currently parsing, but the file is about 350 MB uncompressed; zipped, it shows in the archive as 23 MB. My system completely freezes when I try to parse the 350 MB file. I store the lines in an array like this (the first row contains the headings):
$fh = fopen($inputFile, 'r');
$contents = fread($fh, filesize($inputFile)); // reads the entire file into memory
fclose($fh);
//$contents = str_replace('"','',$contents);
$fileLines = explode("\n", $contents); // split the contents into individual lines
Then I go through each line in a loop and insert it into MySQL. Since the file is about 350 MB, is there a way to parse it straight from the .zip file (e.g. zip_filename.txt), or would that make no difference at all?
The file is too large to insert directly into MySQL through the import method.
Use the built-in function fgetcsv, which reads one row at a time instead of loading the whole file:
<?php
$row = 1;
if (($handle = fopen($inputFile, "r")) !== FALSE) {
    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
        $num = count($data);
        echo "<p> $num fields in line $row: <br /></p>\n";
        $row++;
        for ($c = 0; $c < $num; $c++) {
            echo $data[$c] . "<br />\n";
        }
    }
    fclose($handle);
}
?>
Also use multi-row inserts if possible. Instead of running multiple queries:
insert into table (col1, col2) values("row1-col1", "row1-col2");
insert into table (col1, col2) values("row2-col1", "row2-col2");
Building one query like this is much quicker:
insert into table (col1, col2)
values ("row1-col1", "row1-col2"),
("row2-col1", "row2-col2");
By the way, you can also load a file directly into MySQL:
load data local infile 'file.csv' into table table_name fields terminated by ','
enclosed by '"'
lines terminated by '\n'
(col1, col2)
Consider using LOAD DATA INFILE; it allows you to insert the contents of a CSV file directly. A sketch for running it from PHP follows.
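If you want to trigger that from PHP, here is a sketch with mysqli. LOCAL INFILE must be enabled on both the client and the server, and the host, credentials, file, and table names are placeholders:
<?php
$db = mysqli_init();
mysqli_options($db, MYSQLI_OPT_LOCAL_INFILE, true); // permit LOCAL INFILE on this connection
mysqli_real_connect($db, 'localhost', 'user', 'pass', 'mydb');

$sql = "LOAD DATA LOCAL INFILE 'file.csv' INTO TABLE table_name
        FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
        LINES TERMINATED BY '\\n'
        (col1, col2)";
mysqli_query($db, $sql) or die(mysqli_error($db));
mysqli_close($db);
?>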
