I am new to PHP.
I am trying to load a large 14 MB .csv file into a MySQL table.
But it is never fully uploaded into the DB, probably due to the large file (~400,000 rows); the page eventually fails with "ERROR: page took too long to respond".
Is there a faster way to do it?
My DB is on Amazon RDS, PHP on EC2.
My current code is:
<?php
header('Access-Control-Allow-Origin: *');
require "../config.php";
//$user_id = $_REQUEST['user_id'];
// Create connection
$conn = new mysqli($servername, $username, $password, $dbname);
// Check connection
if ($conn->connect_error) {
die("Connection failed: " . $conn->connect_error);
}
// path where your CSV file is located
define('CSV_PATH','./');
$csv_file = CSV_PATH . "data_unique.csv";
if (($handle = fopen($csv_file, "r")) !== FALSE) {
    fgetcsv($handle); // skip the header row
    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
        // escape each value so quotes in the data can't break the query
        $col = array_map([$conn, 'real_escape_string'], $data);
        // SQL query to insert data into the database -- one INSERT per row,
        // which is what makes this import slow for ~400,000 rows
        $query = "INSERT INTO uniqueid_master (autoid, package_id, unique_id, user_id, issued, book_code)
                  VALUES ('{$col[0]}','{$col[1]}','{$col[2]}','{$col[3]}','{$col[4]}','{$col[5]}')";
        $result = $conn->query($query);
    }
    fclose($handle);
}
echo "File data successfully imported to database!!";
$conn->close();
?>
I think you should try the MySQL LOAD DATA statement. This will be very fast, since you don't have to read everything into PHP.
mysqli_query($dblink, '
    LOAD DATA LOCAL INFILE "' . $file . '"
    INTO TABLE transactions
    FIELDS TERMINATED BY ","
    OPTIONALLY ENCLOSED BY "\'"
    LINES TERMINATED BY "\n"
');
It could also be written like this:
$sql = "LOAD DATA LOCAL INFILE '/path/to/file.csv'
REPLACE INTO TABLE table_name FIELDS TERMINATED BY ','
ENCLOSED BY '\"' LINES TERMINATED BY '\r\n' IGNORE 1 LINES";
$result = $mysqli->query($sql);
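One caveat worth noting (my addition, not from the original answer): LOAD DATA LOCAL only works when the local_infile option is enabled on both the server and the client connection; otherwise the statement fails, often silently depending on your error reporting.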
Alternatively, refer to this question for another method.
The following settings relax some of PHP's limits:
ini_set('memory_limit', '-1');    // no memory limit
ini_set('max_execution_time', 0); // no execution time limit
This might solve your problem. But it is still possible that you reach a memory error. In that case, divide the CSV file into multiple chunks and handle them one after another (see the sketch after the link below). This is a good way to handle it:
How to extract data from csv file in PHP
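For illustration, here is a minimal sketch of that chunking idea, assuming the same uniqueid_master table and connection variables from the question; the CHUNK_SIZE constant and the offset query parameter are my own invention.

<?php
// Hypothetical chunked import: process CHUNK_SIZE rows per request,
// then redirect to ourselves with the next offset.
const CHUNK_SIZE = 5000;

$conn   = new mysqli($servername, $username, $password, $dbname);
$offset = isset($_GET['offset']) ? (int)$_GET['offset'] : 0;

$handle = fopen('data_unique.csv', 'r');
fgetcsv($handle);                     // skip the header row
for ($i = 0; $i < $offset; $i++) {
    fgetcsv($handle);                 // skip rows already imported
}

$done = 0;
while ($done < CHUNK_SIZE && ($data = fgetcsv($handle)) !== false) {
    $col = array_map([$conn, 'real_escape_string'], $data);
    $conn->query("INSERT INTO uniqueid_master (autoid, package_id, unique_id, user_id, issued, book_code)
                  VALUES ('{$col[0]}','{$col[1]}','{$col[2]}','{$col[3]}','{$col[4]}','{$col[5]}')");
    $done++;
}
fclose($handle);

if ($done === CHUNK_SIZE) {
    // more rows remain: hand off to the next chunk before we hit the time limit
    header('Location: ?offset=' . ($offset + CHUNK_SIZE));
} else {
    echo "Import finished";
}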
I have a database that I am loading a large CSV file into. Until recently the script below worked without any issues; I have not made any changes to the script itself, but it has suddenly stopped working.
An example of the CSV file is
"Offer.ID","Offer.PhoneCost","Offer.TotalCost","Offer.MonthlyCost","Offer.FreeGift","Offer.FreeGiftImage","Offer.FreeGiftCategory","Offer.FullFreeGift","Offer.OfferCashback","Offer.AutoCashback","Offer.Clearance","Offer.OfferMins","Offer.OfferTxts","Offer.OfferRental","Offer.OfferLength","Offer.OfferText","Offer.Link"
"8676820","0.00","165.00","13.75","","","","15.00 Guaranteed Cashback","15.00","1","0","0","0","15.00","0","£15.00 Automatic Cashback","http://www.urmob.co.uk/t/a/psav89/URMOB-xmakex-xmodelx-xtariffx/track.php%253fid=8676820"
and the PHP script I'm using is...
<?php
require_once 'dbconnect.php';
$dbh = db_connect();
$log = fopen("database-log.txt", "w");
$time = date("Y-m-d H:i:s");
//start the log
$logstart = "------------------------------------------------\n" . $time . " log start\n------------------------------------------------\n\n";
fwrite($log, $logstart);
//clean the temp table
$query = "TRUNCATE TEMPDATA";
$trunc = mysqli_query($dbh, $query);
if ($trunc) {
fwrite($log, "TEMPDATA cleared\n");
} else {
fwrite($log, "TEMPDATA failed\n");
}
//load affiliate window csv
$query = "load data local infile '/home/data.csv' into table rim6jtvnox6vmwxk.TEMPDATA FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\n' IGNORE 1 ROWS";
$update = mysqli_query($dbh, $query);
fwrite($log, var_export($update, true) . "\n"); // log the raw query result (true/false)
if ($update) {
fwrite($log, "TEMPDATA Table updated with CSV\n");
} else {
fwrite($log, "Failed CSV update - " . mysqli_error($dbh) . "\n");
}
mysqli_close($dbh);
$endtime = date("Y-m-d H:i:s");
$logend = "\n\n------------------------------------------------\n" . $endtime . " log end\n------------------------------------------------\n";
//end the log
fwrite($log, $logend);
fclose($log);
?>
I know that the connection to the database is working without any issues, and I can see the database being cleared. But the LOAD DATA query doesn't seem to execute, and it doesn't give any error from mysqli_error either.
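One thing worth checking (my suggestion, not from the original post): mysqli can silently refuse LOCAL INFILE when it is disabled on the client side. A minimal diagnostic sketch, assuming placeholder credentials and the shortened table name, that forces mysqli to throw on any error and explicitly enables LOCAL INFILE:

<?php
// Diagnostic sketch: make mysqli throw on any error, and make sure the
// client side allows LOCAL INFILE (servers/clients often disable it).
mysqli_report(MYSQLI_REPORT_ERROR | MYSQLI_REPORT_STRICT);

$dbh = mysqli_init();
mysqli_options($dbh, MYSQLI_OPT_LOCAL_INFILE, true); // allow LOAD DATA LOCAL
mysqli_real_connect($dbh, $host, $user, $pass, $dbname);

// This will now throw a mysqli_sql_exception with the real cause
mysqli_query($dbh, "LOAD DATA LOCAL INFILE '/home/data.csv'
    INTO TABLE TEMPDATA
    FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
    LINES TERMINATED BY '\n' IGNORE 1 ROWS");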
I have a WordPress plugin that exports form entries to a txt file, so I need to write a PHP script to add them to a SQL database, because I want the submissions added to a database on a different domain (otherwise I'd just get the plugin to do it for me). I'm fine with how to connect to the database; it's just how I code it to interpret the data, as the column names are always next to the field, as shown below.
{"Entry_ID":"235","Name":"matt","Email":"matt#gmail.com","Date":"03/10/2017"}{"Entry_ID":"236","Name":"matt","Email":"matt#btinternet.com","Date":"10/10/2017"}
Is there a way to get it to ignore the column name and only interpret the data within the quotes after the colon?
Once these have been added to the SQL database, I would then need the lines removed from the txt file.
So far I have this, but it isn't working...
$file= fopen('http://mpcreations.staging.wpengine.com/wp-content/themes/red-seal-resources/test.txt', 'r');
while (($data = fgetcsv($file)) !== FALSE) {
$object = json_encode($data[0]);
$servername = "";
$username = "";
$password = "";
$dbname = "";
// Create connection
$conn = mysqli_connect($servername, $username, $password, $dbname);
// Check connection
if (!$conn) {
die("Connection failed: " . mysqli_connect_error());
}
$query = "INSERT INTO 'wp_forms' LINES TERMINATED BY '\n';
if (mysqli_multi_query($conn, $query)) {
echo "New records created successfully";
} else {
echo "Error: " . $query . "<br>" . mysqli_error($conn);
}
mysqli_close($conn);
}
Any help would be greatly appreciated.
Thank you
Each line in the txt file has JSON data? Process the txt file, parse each line, and INSERT it into the database table.
$file= fopen('file.txt', 'r');
while (($data = fgetcsv($file)) !== FALSE) {
$object = json_decode($data[0], true); // decode the JSON line, don't re-encode it
// Prepare INSERT query here...
}
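For a slightly fuller picture, here is a hedged sketch assuming the format shown in the question (several JSON objects sitting back to back on a line): split the concatenated objects, decode each, and insert with a prepared statement. The wp_forms table and its columns are assumptions based on the question's attempt.

<?php
// Sketch: parse concatenated JSON objects like {...}{...} and insert each.
$conn = mysqli_connect($servername, $username, $password, $dbname);

$raw = file_get_contents('test.txt');
// Turn "}{"-joined objects into a valid JSON array: [{...},{...}]
$entries = json_decode('[' . str_replace('}{', '},{', $raw) . ']', true);

// Column names follow the question's JSON keys; the table layout is assumed
$stmt = mysqli_prepare($conn,
    "INSERT INTO wp_forms (entry_id, name, email, date) VALUES (?, ?, ?, ?)");

foreach ($entries as $e) {
    mysqli_stmt_bind_param($stmt, 'ssss',
        $e['Entry_ID'], $e['Name'], $e['Email'], $e['Date']);
    mysqli_stmt_execute($stmt);
}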
I'm trying to insert values extracted from a CSV file into a MySQL table. It runs, but the table is not populated. I've tried to debug for the last XXXX but just can't see my error. Echo-ing out the values gives me the correct SQL, but when it comes to the INSERT, no dice.
Thanks very much for your help.
<?php
$host = 'localhost';
$user = 'fulltime_admin';
$pass = 'secret';
$database = 'fulltime_db';
$db = mysql_connect($host, $user, $pass);
mysql_select_db($database, $db); // select the database (mysql_query() cannot do this)
//////////////////////////////// EDIT ////////////////////////////////////
$redirect_num = 500; // Select how many rows to insert each time before refresh.
// More rows = faster insertion. However it cannot be too high, otherwise it will time out.
$filename = "ps4_emails.csv"; // The file we are going to get the data from...
$table = "`ps4_emails`";
////////////////////////////// END EDIT //////////////////////////////////
$file = file($filename);
$lines = count($file);
// Have we just redirected?
$nextline = isset($_GET['nextline']) ? (int)$_GET['nextline'] : 0;
$query = "INSERT INTO ".$table." (email) VALUES ('".$final_line[0]."')";
for ($line=$nextline; $line<=$lines; $line++){
$final_line = explode(",", $file[$line]);
if ($line!=$lines){
mysql_query($query,$db);
}
if ($line % $redirect_num){
// something needs to go here
} else {
$nextline = $line+1;
exit ('<meta http-equiv="refresh" content="0;url=texttomysqlemails.php?nextline='.$nextline.'" />');
}
echo ( $line==$lines ) ? "Done" : "";
}
?>
Put your query inside the loop in order to use it with the variable $final_line.
Try this:
$final_line = explode(",", $file[$line]);
if ($line!=$lines){
$query = "INSERT INTO ".$table." (email) VALUES ('".$final_line[0]."')";
mysql_query($query,$db);
}
Don't use mysql_*. It's deprecated and was removed in PHP 7. Use mysqli_* or PDO.
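For example, a minimal mysqli sketch of the same per-line insert with a prepared statement, reusing the credentials and the ps4_emails table from the question:

<?php
// Sketch: the same per-line insert with mysqli and a prepared statement.
$db = new mysqli('localhost', 'fulltime_admin', 'secret', 'fulltime_db');

$stmt = $db->prepare("INSERT INTO ps4_emails (email) VALUES (?)");
$stmt->bind_param('s', $email); // bind once; $email is read on each execute()

$handle = fopen('ps4_emails.csv', 'r');
while (($data = fgetcsv($handle)) !== false) {
    $email = $data[0]; // first CSV column holds the address
    $stmt->execute();
}
fclose($handle);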
This seems like a perfect script to run from the command line with the PHP CLI, and therefore you can forget about all the refresh complexity.
If the file is huge, as your comment suggests, loading the whole file into memory may also bring you up against the PHP memory limits, so it might be better to read a line at a time rather than the whole file, using fgetcsv(), which is intended for reading CSV files.
<?php
$host = 'localhost';
$user = 'fulltime_admin';
$pass = 'secret';
$database = 'fulltime_db';
$db = mysql_connect($host, $user, $pass);
mysql_select_db($database, $db); // select the database (mysql_query() cannot do this)
$filename = "ps4_emails.csv";
$table = "";
$handle = fopen('ps4_emails.csv', 'r');
if ( ! $handle ) {
echo 'File does not exist in this location';
exit;
}
while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
$query = "INSERT INTO `ps4_emails` (email) VALUES( '{$data[0]}')";
mysql_query($query,$db);
}
?>
You now just run this script from the command line/terminal like:
>php script.php
And it can run for minutes/hours/days with no likelihood of blowing any limits.
I have to mention this or someone will nag me for not saying it:
Please don't use the mysql_ database extension, it is deprecated (gone forever in PHP 7).
Especially if you are just learning PHP, spend your energies learning the PDO or mysqli_ database extensions,
and here is some help to decide which to use.
When you need to upload the real file, it would also be a good idea to add a restart mechanism, so you can restart the process from wherever a problem happened, whether someone shut the database down for a backup or some other unforeseen hiccup occurred.
<?php
$host = 'localhost';
$user = 'fulltime_admin';
$pass = 'secret';
$database = 'fulltime_db';
$restart_from = 0;
$db = mysql_connect($host, $user, $pass);
mysql_select_db($database, $db); // select the database (mysql_query() cannot do this)
$filename = "ps4_emails.csv";
$table = "";
$handle = fopen('ps4_emails.csv', 'r');
if ( ! $handle ) {
echo 'File does not exist in this location';
exit;
}
// is it a restart?
if ( file_exists('restart.txt') ) {
// its a restart
$restart_from = file_get_contents('restart.txt');
// read up the file to the last good row inserted
for ( $i=0; $i<=$restart_from; $i++ ) {
$data = fgets($handle, 1000); // fgets(), not fget(): just consume the line
}
}
$upd_cnt = $restart_from;
while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
$query = "INSERT INTO `ps4_emails` (email) VALUES( '{$data[0]}')";
mysql_query($query,$db);
$upd_cnt++;
file_put_contents('restart.txt', $upd_cnt);
}
?>
The above restart code is not tested, but I have used something very like it in the past very successfully. So you will have to check that I have not made any silly mistakes, but it should give you an idea of how to do a restart from the last row successfully updated before a crash.
You can use LOAD DATA INFILE to insert from a file.
Refer to http://dev.mysql.com/doc/refman/5.7/en/load-data.html
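A minimal sketch for this particular table, assuming the single-column CSV from the question and that local_infile is enabled on both client and server:

<?php
// Sketch: import ps4_emails.csv in one statement instead of row-by-row.
$db = new mysqli('localhost', 'fulltime_admin', 'secret', 'fulltime_db');

$db->query('LOAD DATA LOCAL INFILE "ps4_emails.csv"
            INTO TABLE ps4_emails
            FIELDS TERMINATED BY ","
            LINES TERMINATED BY "\n"
            (email)');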
This is my code to generate a CSV file. When I click the PHP button to generate the CSV file, it is filled with the contents based on the category column from the database. But my problem is that the contents get populated twice in the CSV file, as shown below. Please help me work out where I have to modify the code so that the content is populated only once, as in the expected output. Thanks in advance.
createcsv.php
<?php
$servername = "localhost";
$username = "user";
$password = "";
$dbname = "stats";
define("DB_SERVER", "localhost");
define("DB_NAME", "stats");
define("DB_USER", "user");
define("DB_PASSWORD", '');
$dbconn = #mysql_connect(DB_SERVER, DB_USER, DB_PASSWORD);
$conn = #mysql_select_db(DB_NAME,$dbconn);
// Create connection
//$conn = new mysqli($servername, $username, $password, $dbname);
// Check connection
if (!$conn) {
    die("DB connection failed: " . mysql_error());
}
// Query DB to fetch hit count for each category and in turn create corresponding .csv file
function createCSVFile($type) {
$msql = "SELECT TRIM(TRAILING '.000000' from UNIX_TIMESTAMP(hitdate)*1000) as unixdate,count from h_stats where category='".$type."' order by unixdate asc";
$query = mysql_query($msql);
$type = str_replace(' ', '', $type);
$tmp_file = "data/tmp_".$type.".csv";
$fp = fopen("$tmp_file", "w");
// Write the query contents to temp file
while($row = mysql_fetch_array($query))
{
fputcsv($fp, $row);
}
fclose($fp);
// Modify the contents of the file as per the high chart input data format
$fp = fopen("$tmp_file", 'r+');
rewind($fp);
$file = "data/".$type.".csv";
$final = fopen("$file", 'w');
while($line = fgets($fp)){
$line = trim($line); // trim() returns the trimmed string; it does not modify in place
$line = '['.$line.'],';
fputs($final,$line);
}
// Append var $type and remove the trailing ,
$final = file_get_contents($file);
$content = 'var '.$type .'= [' . rtrim($final, ","). ']';
file_put_contents("$file",$content);
}
// Query DB to fetch success/failure count for Hits and in turn create corresponding .csv file
function createHitOutcomeCSVFile($type,$category) {
$sql = "SELECT TRIM(TRAILING '.000000' from UNIX_TIMESTAMP(hitdate)*1000) as unixdate,".$type." from h_stats where category='".$category."' order by unixdate asc";
$query = mysql_query($sql);
$tmp_file = "data/tmp_".$type."_".$category.".csv";
$fp = fopen("$tmp_file", "w");
// Write the query contents to temp file
while($row = mysql_fetch_array($query)){
fputcsv($fp, $row);
}
fclose($fp);
// Modify the contents of the file as per the high chart input data format
$fp = fopen("$tmp_file", 'r+');
rewind($fp);
$category = str_replace(' ', '', $category);
$file = "data/".$type."_".$category.".csv";
$final = fopen("$file", 'w');
while($line = fgets($fp)){
$line = trim($line); // trim() returns the trimmed string; it does not modify in place
$line = '['.$line.'],';
fputs($final,$line);
}
// Append var $type and remove the trailing ,
$final = file_get_contents($file);
$content = 'var '.$type.'_'.$category.'= [' . rtrim($final, ","). ']';
file_put_contents("$file",$content);
}
// Invoke function to create the Hits.csv file
createCSVFile('Hits');
// Invoke function to get Three Hits csv file
createHitOutcomeCSVFile('TCount','Hits');
// Invoke function to get O2 Hits csv file
createHitOutcomeCSVFile('BCount','Login');
echo "Generated successfully";
?>
Incorrect CSV file, with the data populated twice:
var Login_Hits= [[1427826600000,1427826600000,8763,8763
]]
Expected CSV file as per the Highcharts format:
var Login_Hits= [[1427826600000,8763
]]
Try to debug it; that will be easier than spotting a typo by reading the code.
It looks like the tmp file is already corrupted.
Try displaying the $row variable and the $query; the problem may come from there.
In the while loop I used mysql_fetch_assoc instead of mysql_fetch_array in both functions:
while($row = mysql_fetch_assoc($query))
{
fputcsv($fp, $row);
}
mysql_fetch_array returns each column twice by default (once under a numeric index and once under an associative index), which is why fputcsv wrote every value twice. With mysql_fetch_assoc the content is no longer repeated in the CSV file. This works, try it!
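Equivalently (an untested alternative), you could keep mysql_fetch_array but request numeric keys only, since its default MYSQL_BOTH mode is what duplicates every column:

while ($row = mysql_fetch_array($query, MYSQL_NUM)) {
    fputcsv($fp, $row); // one value per column, no associative duplicates
}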
Hope someone can help me with what I think will be something minor (I'm still learning...). I'm trying to write the entire contents of a server-based CSV file to a SQL database; here is the code I presently have. The commented-out line writes perfectly and generates a new record, but the $ar[] values generate no entries in the table named order, even though the CSV file is about 100 lines long. I just get:
Error: INSERT INTO order (Picker,Order_Number,Timestamp,System)values ('','','','')
$file = "Pal.ORD.csv";
$tbl = "order";
$f_pointer=fopen("$file","r"); // file pointer
while(! feof($f_pointer)){
$ar=fgetcsv($f_pointer);
//$sql="INSERT INTO `order` (Picker,Order_Number,Timestamp,System)values ('Me','9999','23-01-2015','ORD')";
$sql="INSERT INTO `order` (Picker,Order_Number,Timestamp,System)values ('$ar[0]','$ar[1]','$ar[2]','$ar[3]')";
echo $sql;
echo "<br>";
}
if ($connect->query($sql) === TRUE) {
echo "New records created successfully";
} else {
echo "Error: " . $sql . "<br>" . $conn->error;
}
What I think may be going on is that your file probably has an empty line/carriage return as the last line, and the script is using that to insert the data as blank entries.
I can't be 100% sure about this, since you have not provided a sample of your CSV file; however, that is what my tests revealed.
Based on the following CSV test model (sidenote: blank lines will be skipped):
a1,a2,a3,a4
b1,b2,b3,b4
c1,c2,c3,c4
Use the following and replace with your own credentials.
This will create a new entry/row for each line found in a given file, based on the model I have provided above.
<?php
$DB_HOST = 'xxx';
$DB_USER = 'xxx';
$DB_PASS = 'xxx';
$DB_NAME = 'xxx';
$db = new mysqli($DB_HOST, $DB_USER, $DB_PASS, $DB_NAME);
if($db->connect_errno > 0) {
die('Connection failed [' . $db->connect_error . ']');
}
$file = "Pal.ORD.csv";
$delimiter = ',';
if (($handle = fopen($file, "r")) !== FALSE) {
    while (($data = fgetcsv($handle, 1000, $delimiter)) !== FALSE) {
        if ($data === array(null)) {
            continue; // fgetcsv() returns array(null) for a blank line; skip it
        }
        foreach ($data as $i => $content) {
            $data[$i] = $db->real_escape_string($content);
        }
        // echo $data[$i].""; // test only, not required
        $db->query("INSERT INTO `order`
                    (Picker, Order_Number, Timestamp, System)
                    VALUES ('" . implode("','", $data) . "');");
    }
    fclose($handle);
}
if ($db->error === '') {
    echo "Success";
} else {
    echo "Error: " . $db->error;
}
At a quick glance it seems like this:
$f_pointer=fopen("$file","r"); // file pointer
Should be this:
$f_pointer=fopen($file,"r"); // file pointer
You might not be reading anything from the file. You can try outputting the file contents to see if that part is working, since you've confirmed that you can insert into the DB.
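For instance, a minimal check along those lines, assuming the same Pal.ORD.csv path:

<?php
// Quick check: dump the raw file and each parsed CSV row.
var_dump(file_get_contents("Pal.ORD.csv"));

$fp = fopen("Pal.ORD.csv", "r");
while (($row = fgetcsv($fp)) !== false) {
    var_dump($row); // inspect what fgetcsv() actually returns per line
}
fclose($fp);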