I have the following issue that I hope y'all can shed some light on. I am trying to import a CSV that looks normal (commas as separators), yet when I attempt to import it there is always a line break after roughly the 40th column.
This is my code for the import:
<?php
// include mysql database configuration file
require_once "layouts/config.php";
if (isset($_POST['submit']))
{
// Open uploaded CSV file with read-only mode
$csvFile = fopen(__DIR__. '/uploads/upload.csv', 'r');
//fgetcsv($csvFile);
$values = array();
while (($getData = fgetcsv($csvFile, 300, ",")) !== FALSE)
{
$values[]= "('".$getData[0]."' , '".$getData[1]."' ,'".$getData[2]."' ,'".$getData[3]."' ,'".$getData[4]."' ,
'".$getData[5]."' , '".$getData[6]."' ,'".$getData[7]."' ,'".$getData[8]."' ,'".$getData[9]."' ,
'".$getData[10]."', '".$getData[11]."','".$getData[12]."','".$getData[13]."','".$getData[14]."',
'".$getData[15]."', '".$getData[16]."','".$getData[17]."','".$getData[18]."','".$getData[19]."',
'".$getData[20]."', '".$getData[21]."','".$getData[22]."','".$getData[23]."','".$getData[24]."',
'".$getData[25]."', '".$getData[26]."','".$getData[27]."','".$getData[28]."','".$getData[29]."',
'".$getData[30]."', '".$getData[31]."','".$getData[32]."','".$getData[33]."','".$getData[34]."',
'".$getData[35]."', '".$getData[36]."','".$getData[37]."','".$getData[38]."','".$getData[39]."',
'".$getData[40]."', '".$getData[41]."','".$getData[42]."','".$getData[43]."','".$getData[44]."',
'".$getData[45]."', '".$getData[46]."','".$getData[47]."','".$getData[48]."','".$getData[49]."',
'".$getData[50]."', '".$getData[51]."','".$getData[52]."','".$getData[53]."','".$getData[54]."',
'".$getData[55]."', '".$getData[56]."','".$getData[57]."')";
}
$sql = "INSERT INTO machine (PresetNumber, Dates, Starts, Stops, ProductCode, ProductName, Proper, TargetWeight1, Totals, Mean, StandardD, Maxs, Mins,
ProductRange, Under, Over, OverScale, ReCheckCnt, TotalNegErr, Preset, TargetWeight2, BagLength, BagWidth, StartIPS, StopIPS, TotalTime, PlannedTime, ITPSDownTime,
ITPSOff, LowProductTime, DownStreamDownTime, OperatingTime,FullProductTime,GoodBagMakingTime, NetOperatingTime, FacultyBagMakingTime, PlannedShutdown, Speed,
ITPSOEE, SystemAvalability, WeigherEfficiency, BagMakerEfficiency, WeightLossRatio, ProductWaste, InputProduct, OutputProduct, AverageWeight,
StandardDeviation, CombinationCount, FilledBags, GoodBags, GoodProductBags, FilmWaste, InputFilm, OutputFilm, FilmWasteUpstream, FilmWasteMiddle, FilmWasteDownStream)
VALUES " . implode(',',$values);
//fclose($csvFile);
if ($link->query($sql) === TRUE)
{
echo "File(s) uploaded successfully!";
}
else {
echo "Error: " . $sql . "<br>" . $link->error;
}
}
My CSV File is
5,2022/05/17,16:22,11:21,AJ,,112965,39.0 g,4485.5 kg,39.707 g,0.495 g,42.0 g,39.0 g,3.0 g,1035,2218,451,58,0 g,5,39.0g,1.92inch,1.27inch,2022.5.19. 09:52,2022.5.19. 11:01,68min34sec,68min34sec,7min25sec,4min30sec,2min55sec,0min0sec,61min10sec,0min0sec,0min0sec,0min0sec,61min9sec,0,0bpm,0.0%,89.1%,0%,0.0%,0.0%,0.2%,226.3kg,225.8kg,0.000g,0.000g,5811,0,0,0,0.0%,0.00feet,0.00feet,0.0,0.0,0.0
Yet for some reason, when I import this CSV the line always breaks after the 40th column. If I import a CSV with some dummy data (a sequential count from 1 to 58), that file imports fine. I'm pretty new to this, so I am not sure what I am missing.
Thanks Much
I store data into a CSV file with this:
$out = fopen('wpo_stock.csv', 'a');
fputcsv($out, array($order_id,$product_id,$product_name,$product_quantity, $regular_price));
fclose($out);
That stores my data in the CSV perfectly.
The problem is: every time the page is refreshed, it keeps inserting duplicate data into the CSV file.
How can I avoid inserting duplicate data, keyed by order_id (order_id is a unique value in my project)?
Code Updated :
$handle = fopen('wpo_stock.csv', 'r');
while (($data = fgetcsv($handle, ",")) !== FALSE) {
    if ($data[0] != $order_id) {
        $out = fopen('wpo_stock.csv', 'a');
        fputcsv($out, array($order_id, $product_id, $product_name, $product_quantity, $regular_price));
    }
    break;
}
fclose($handle);
I assume you are trying to overwrite your data every time the page refreshes. In that case, you should open your file like this: $out = fopen('wpo_stock.csv', 'w');.
But if you are trying to append new data, then you need to read all the existing data from your file and compare it with the new row.
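A minimal sketch of that second approach, assuming order_id is stored in the first CSV column: collect every order_id already in the file, and only append when the new one is absent.

```php
<?php
// Append a row to the CSV only if its order_id (first column) is new.
function appendIfNewOrder($file, array $row)
{
    $seen = array();
    if (($in = @fopen($file, 'r')) !== false) {
        while (($data = fgetcsv($in, 1000, ',')) !== false) {
            $seen[$data[0]] = true; // remember every stored order_id
        }
        fclose($in);
    }
    if (isset($seen[$row[0]])) {
        return false; // duplicate order_id: skip the write
    }
    $out = fopen($file, 'a');
    fputcsv($out, $row);
    fclose($out);
    return true;
}
```

With this guard a page refresh re-runs the check and simply skips rows whose order_id is already present, so the file never accumulates duplicates.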
This method uploads a CSV file to MySQL, but with thousands of rows in the CSV file it takes a long time to upload the data, which is annoying.
$deleterecords = "TRUNCATE TABLE discount"; //empty the table of its current records
mysql_query($deleterecords);
//readfile($name);
//Import uploaded file to Database
$handle = fopen($name, "r");
$i=0;
while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
if($i>0){
$import="INSERT into discount(id,title,expired_date,amount,block)values('".$data[0]."','".$data[1]."','".$data[2]."','".$data[3]."','".$data[4]."')";
//imports data serially to the allocated columns.
mysql_query($import) or die(mysql_error());//query
}
$i=1;
}
fclose($handle);
//closing the handle
// print "Import done ";
?>
Can anyone suggest a faster method for uploading the data?
Instead of writing a script to pull in information from a CSV file, you can link MySQL directly to it and upload the information using the following SQL syntax.
To import an Excel file into MySQL, first export it as a CSV file. Remove the CSV headers from the generated file, along with any empty rows that Excel may have put at the end of it.
You can then import it into a MySQL table by running:
load data local infile 'uniq.csv' into table tblUniq fields terminated by ','
enclosed by '"'
lines terminated by '\n'
(uniqName, uniqCity, uniqComments)
as read on: Import CSV file directly into MySQL
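As a side note, if you would rather keep the header row in the CSV, MySQL can skip it for you with the IGNORE 1 LINES clause instead of you deleting it by hand (same table and columns as in the statement above):

```sql
load data local infile 'uniq.csv' into table tblUniq fields terminated by ','
enclosed by '"'
lines terminated by '\n'
ignore 1 lines
(uniqName, uniqCity, uniqComments)
```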
Use the LOAD DATA INFILE statement.
https://dev.mysql.com/doc/refman/5.1/en/load-data.html
Load the data in a temporary table and use that for inserting to your main table with one statement.
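A sketch of that staging pattern, reusing the discount table from the question (the staging table name and the CSV file name are made up): load the raw CSV into a scratch table, then move everything into the real table with one INSERT ... SELECT, which is also the place to cast, filter, or deduplicate.

```sql
-- Scratch table with the same shape as the target
CREATE TEMPORARY TABLE discount_staging LIKE discount;

LOAD DATA LOCAL INFILE 'discount.csv'
INTO TABLE discount_staging
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(id, title, expired_date, amount, block);

-- One statement moves it all into the main table
INSERT INTO discount (id, title, expired_date, amount, block)
SELECT id, title, expired_date, amount, block FROM discount_staging;
```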
You can insert data this way; building one multi-row INSERT is the standard way to batch rows into a table.
$deleterecords = "TRUNCATE TABLE discount"; //empty the table of its current records
mysql_query($deleterecords);
//Import uploaded file to Database
$handle = fopen($name, "r");
$i = 0;
$import = "INSERT into discount(id,title,expired_date,amount,block) values ";
while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
    if ($i > 0) { //skip the header row
        $import .= "('".$data[0]."','".$data[1]."','".$data[2]."','".$data[3]."','".$data[4]."'),";
    }
    $i = 1;
}
fclose($handle);
//run one multi-row insert after the whole file has been read
$import = rtrim($import, ',');
mysql_query($import) or die(mysql_error());
?>
Instead of running multiple inserts, build one big query and execute a single insert.
<?php
$deleterecords = "TRUNCATE TABLE discount"; //empty the table of its current records
mysql_query($deleterecords);
//readfile($name);
//Import uploaded file to Database
$handle = fopen($name, "r");
$i=0;
$what_to_insert = array();
while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
if($i>0){
array_push($what_to_insert, "('".$data[0]."','".$data[1]."','".$data[2]."','".$data[3]."','".$data[4]."')");
}
$i=1;
}
fclose($handle);
if (count($what_to_insert)>0){
$import="INSERT into discount(id,title,expired_date,amount,block) values " . implode(",", $what_to_insert);
mysql_query($import) or die(mysql_error());//query
}
?>
If phpMyAdmin is available you can use the CSV import feature.
I have a .csv file but I am unable to import it into the database. I have parsed the .csv file using the code below. Can you please help me with how to insert the data into MySQL?
My code is:
$fp = fopen('test.csv','r') or die("can't open file");
print "<table>\n";
while($csv_line = fgetcsv($fp,1024))
{
print '<tr>';
for ($i = 0, $j = count($csv_line); $i < $j; $i++) {
print '<td>'.$csv_line[$i].'</td>';
$data[] = $csv_line[$i];
}
print "</tr>\n";
}
print "</table>\n";
fclose($fp) or die("can't close file");
In MySQL we can import data from CSV file to a table very easily:
LOAD DATA LOCAL INFILE 'import.csv' INTO TABLE from_csv FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' (FIELD1, FIELD2, FIELD3);
http://dev.mysql.com/doc/refman/5.0/en/load-data.html
Example:
http://www.techtrunch.com/linux/import-csv-file-mysql-table
The code seems to take the CSV file and output it to the browser as an HTML table.
You mentioned importing into a MySQL table, but no MySQL information is listed. Create a MySQL table and try to map the fields you are importing to the database fields in that table. The easiest way for me is to use phpMyAdmin to generate the LOAD DATA SQL, although the LOAD DATA LOCAL INFILE mentioned in another answer will work as well if you understand enough to adapt it.
When I learned how to do it, I used this tutorial.
http://vegdave.wordpress.com/2007/05/19/import-a-csv-file-to-mysql-via-phpmyadmin/
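If you would rather stay in PHP than use LOAD DATA, here is a minimal sketch using mysqli prepared statements. The helper below collects the parsed rows; the table name test, its columns, and the connection credentials in the commented-out part are all hypothetical placeholders.

```php
<?php
// Collect parsed CSV rows into an array (fine for small files; for
// huge files you would stream and insert row by row instead).
function csvRows($file)
{
    $rows = array();
    $fp = fopen($file, 'r') or die("can't open file");
    while (($line = fgetcsv($fp, 1024)) !== false) {
        $rows[] = $line;
    }
    fclose($fp);
    return $rows;
}

/*
// Hypothetical connection and schema -- adjust to your own:
$mysqli = new mysqli('localhost', 'user', 'password', 'mydb');
$stmt = $mysqli->prepare('INSERT INTO test (col1, col2, col3) VALUES (?, ?, ?)');
foreach (csvRows('test.csv') as $row) {
    $stmt->bind_param('sss', $row[0], $row[1], $row[2]);
    $stmt->execute();
}
$stmt->close();
*/
```

Prepared statements quote the values for you, which also removes the SQL-injection risk that string-built INSERT queries carry.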
You can use this code. I hope this will be helpful.
//after uploading a csv file from an upload file field
$handle = fopen($_FILES['csvfile']['tmp_name'], "r");
$header = fgetcsv($handle); // the first row holds the column names
while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
    $import = "INSERT into csvtest($header[0], $header[1], $header[2], $header[3], $header[4], $header[5], $header[6]) values ('$data[0]', '$data[1]', '$data[2]', '$data[3]', '$data[4]', '$data[5]', '$data[6]')";
    mysql_query($import) or die(mysql_error());
}
fclose($handle);
I have a CSV file (saved as .txt) that I am currently parsing, but the file is about 350 MB uncompressed. Zipped, it shows in the zip file as 23 MB. My system completely freezes when I try to parse the 350 MB file. I store the lines in an array like this (the first row contains the headings).
$fh = fopen($inputFile, 'r');
$contents = fread($fh, filesize($inputFile)); // reads the whole file into memory
fclose($fh);
//$contents = str_replace('"','',$contents);
$fileLines = explode("\n", $contents); // split the contents into lines
Then I loop through each line and insert it into MySQL. Since the file is about 350 MB, would there be a way to parse it straight from the .zip file (something like zip_filename.txt), or would that even make a difference at all?
The file is too large to insert directly into MySQL through the import method.
Use the built-in function fgetcsv:
<?php
$row = 1;
if (($handle = fopen($inputFile, "r")) !== FALSE) {
while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
$num = count($data);
echo "<p> $num fields in line $row: <br /></p>\n";
$row++;
for ($c=0; $c < $num; $c++) {
echo $data[$c] . "<br />\n";
}
}
fclose($handle);
}
?>
Also use a multi-row insert if possible. Instead of running multiple queries:
insert into table (col1, col2) values("row1-col1", "row1-col2");
insert into table (col1, col2) values("row2-col1", "row2-col2");
Building one query like this is much quicker:
insert into table (col1, col2)
values ("row1-col1", "row1-col2"),
("row2-col1", "row2-col2");
By the way, you can also load a file directly into mysql:
load data local infile 'file.csv' into table table_name fields terminated by ','
enclosed by '"'
lines terminated by '\n'
(col1, col2)
Consider using LOAD DATA INFILE; that will allow you to insert the contents of a CSV file directly.
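On the zip part of the question: PHP's zip:// stream wrapper (available when the zip extension is loaded) can open an entry inside an archive directly, so you can stream-parse it with fgetcsv without extracting the 350 MB file or reading it whole into memory. A minimal sketch; the archive path and entry name are placeholders:

```php
<?php
// Stream a CSV entry straight out of a zip archive. $archive should be
// a path to the .zip file (absolute is safest) and $entry the file
// name inside it; both are placeholders here.
function readCsvFromZip($archive, $entry)
{
    $handle = @fopen("zip://$archive#$entry", 'r');
    if ($handle === false) {
        return false;
    }
    $header = fgetcsv($handle, 1000, ','); // first row holds the headings
    $rows = array();
    while (($data = fgetcsv($handle, 1000, ',')) !== false) {
        $rows[] = $data; // in real use, batch these into multi-row INSERTs
    }
    fclose($handle);
    return $rows;
}
```

Note that collecting every row into one array still costs memory; for a file this size you would insert each batch as you go instead of returning them all.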