Avoid inserting duplicate data in csv with PHP

I store data into a CSV file like this:
$out = fopen('wpo_stock.csv', 'a');
fputcsv($out, array($order_id,$product_id,$product_name,$product_quantity, $regular_price));
fclose($out);
That stores my data in the CSV perfectly.
The problem is: every time the page is refreshed, it inserts duplicate rows into the CSV file.
How can I avoid inserting duplicate data, keyed by order_id? (order_id is a unique value in my project.)
Code updated:
$handle = fopen('wpo_stock.csv', 'r');
$exists = false;
// note: fgetcsv's second argument is a maximum line length, not the delimiter
while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
    if ($data[0] == $order_id) {
        $exists = true;
        break;
    }
}
fclose($handle);
if (!$exists) {
    $out = fopen('wpo_stock.csv', 'a');
    fputcsv($out, array($order_id, $product_id, $product_name, $product_quantity, $regular_price));
    fclose($out);
}

I assume you are trying to overwrite your data every time the page refreshes. In that case, you should open your file like this: $out = fopen('wpo_stock.csv', 'w');.
But if you are trying to append new data, then you need to read the existing rows from your file and compare them with the new one.
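A minimal sketch of that read-then-compare approach, assuming order_id is always the first CSV column (the function name and file handling here are illustrative, not from the original post):

```php
<?php
// Append a row to the CSV only if no existing row already has the same
// order_id (assumed to be the first column). Returns true if the row
// was written, false if it was a duplicate.
function appendIfNew(string $file, array $row): bool
{
    $seen = array();
    if (($in = @fopen($file, 'r')) !== false) {
        while (($data = fgetcsv($in, 1000, ",")) !== false) {
            $seen[] = $data[0];
        }
        fclose($in);
    }
    if (in_array((string) $row[0], $seen, true)) {
        return false; // duplicate order_id: skip the write
    }
    $out = fopen($file, 'a');
    fputcsv($out, $row);
    fclose($out);
    return true;
}
```

Calling this on every page load keeps the file duplicate-free, though for large files a database (or at least caching the seen IDs) would scale better.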

Related

Php Upload CSV and Get Column data

What I am trying to do is upload a CSV file with PHP. The first line holds the column names and the rows below hold the data. The column names can change depending on what the end user uploads, so the columns I need can move around (A1 or B1 etc.). Say the column I need is B1: I need to get all the data in column B. I'm not sure how to go about it. So far this is what I have. Any ideas?
$data = array();
$handle = fopen($_FILES['fileToUpload']['tmp_name'], 'r') or die('cannot open the file');
// loop on fgetcsv's return value; a feof() loop adds a bogus FALSE entry at the end
while (($row = fgetcsv($handle)) !== FALSE) {
    $data[] = $row;
}
var_dump($data);
fclose($handle);
UPDATE:
I am importing this file from .CSV into PHP.
I need to search for the column headers that start with "SKU" and then "COST".
Once those are found, I want the whole columns: B and E. But those column letters can change, depending on how the file is exported by the end user. I do not need specific rows, just the columns.
Once the file is uploaded to the server, use something like the following code to parse it and use it as an array:
Code:
$filename = "upload/sample.csv";
if (($handle = fopen($filename, 'r')) !== FALSE) {
    while (($row = fgetcsv($handle, 1000, ",")) !== FALSE) {
        print_r($row);
    }
    fclose($handle);
}
That's one way of doing it; the PHP manual page for fgetcsv covers more options.
If you want the value of a specific column for each row, you need to loop through the results and pick it out. It looks like you are getting an array of arrays, so (edited to get the column based on the header name):
$header = $data[0];
unset($data[0]); // remove the header row so its values don't show up in the results
$sku_index = '';
$cost_index = '';
// find the index of the desired columns by header name
for ($i = 0; $i < count($header); $i++) {
    if ($header[$i] == 'SKU') {
        $sku_index = $i;
    }
    if ($header[$i] == 'COST') {
        $cost_index = $i;
    }
}
// loop through each row and grab the values of the desired columns
foreach ($data as $row) {
    echo $row[$sku_index];
    echo $row[$cost_index];
}
That should get what you want.
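A more compact variant of the same idea, assuming the fgetcsv results sit in an array of rows with the header row first (the function name is illustrative): array_search locates the header position and array_column pulls out the whole column, wherever the end user's export put it.

```php
<?php
// Pull an entire column out of parsed CSV rows by its header name.
// $rows is an array of arrays as produced by repeated fgetcsv calls,
// with the header row first.
function columnByHeader(array $rows, string $name): array
{
    $header = array_shift($rows);          // first row holds the column names
    $index = array_search($name, $header); // position may differ per upload
    if ($index === false) {
        return array();                    // header not present in this file
    }
    return array_column($rows, $index);
}
```
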

PHP mysql INSERT causing blank first row

I can't seem to figure this out, but my code is inserting one blank row (the first row): blank car name, blank car brand, and only "0.00" in car price. This code uploads a CSV file, reads its data, and inserts it into the database. The first row is the column headers, and I assumed the first call of $GetHeaders = fgetcsv($file); would consume them.
$file = fopen($_FILES['fileupload']['tmp_name'],"r");
$GetHeaders = fgetcsv($file);
$CarName = array_search('Car Name', $GetHeaders);
$CarBrand = array_search('Car Brand', $GetHeaders);
$CarPrice = array_search('Car Price', $GetHeaders);
$theQue = "";
while(! feof($file))
{
$GetHeaders = fgetcsv($file);
$theQue .= "INSERT INTO cardb (CarName, CarBrand, Carprice) VALUES ('$GetHeaders[$CarName]', '$GetHeaders[$CarBrand]', '$GetHeaders[$CarPrice]')";
}
fclose($file);
if (mysqli_multi_query($connection, $theQue))
{
echo "Success";
}
Posting and closing this with these findings so others who hit this issue with fgetcsv get the hint.
This is weird, but after a couple more tests I found that fgetcsv reads each row from the CSV file, but there is an additional row being read which is NULL.
It's as if there's an invisible last row. It only shows up as the first row in the database, probably because of sorting, but the last row fgetcsv returned is NULL. I thought feof() should have treated that as the end of the file?
What I did to detect the bug is something I should have done in the first place: var_dump the results, which displayed all the car names, and the last car was named "NULL".
Thank you for the help guys, each of you gave me ideas which led me to finding this.
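The robust pattern is to loop on fgetcsv's return value rather than on feof(): with a trailing newline in the file, feof() reports false one iteration too early, fgetcsv runs once more, and you get the phantom NULL row described above. A small self-contained demonstration (file contents illustrative):

```php
<?php
// A CSV ending in a newline, read with the fgetcsv-return-value pattern:
// no phantom row appears.
$file = tempnam(sys_get_temp_dir(), 'csv');
file_put_contents($file, "Car Name,Car Brand,Car Price\nCivic,Honda,25000\n");

$h = fopen($file, 'r');
$rows = array();
while (($row = fgetcsv($h, 1000, ",")) !== false) {
    $rows[] = $row;
}
fclose($h);
unlink($file);
// $rows holds exactly two rows: the header and the one data row.
```
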
A couple of things could be causing this. First, try deleting the header row in your CSV file. Next, add a check that the data row is not blank before writing it to the database.
<?php
//open the csv file for reading
$jhandle = fopen($file_path, 'r');
$max_line_length = 1000;
while (($jdata = fgetcsv($jhandle, $max_line_length, ",")) !== FALSE) {
    $car_name = $jdata[0];
    $car_brand = $jdata[1];
    $car_price = $jdata[2];
    //skip rows where every field is empty
    if (($car_name != '') || ($car_brand != '') || ($car_price > 0)) {
        //write to your database here.
    }
}
fclose($jhandle);

Fastest way to import csv file into MYSQL

This method uploads a CSV file to MySQL, but with thousands of rows in the CSV it takes a long time to upload the data, which is annoying.
$deleterecords = "TRUNCATE TABLE discount"; //empty the table of its current records
mysql_query($deleterecords);
//Import uploaded file to Database
$handle = fopen($name, "r");
$i = 0;
while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
    if ($i > 0) {
        $import = "INSERT into discount(id,title,expired_date,amount,block) values('".$data[0]."','".$data[1]."','".$data[2]."','".$data[3]."','".$data[4]."')";
        //imports data serially to the allocated columns.
        mysql_query($import) or die(mysql_error()); //query
    }
    $i = 1; //skip the header row on the first iteration
}
fclose($handle); //closing the handle
Can anyone suggest a faster method for uploading the data?
Instead of writing a script to pull the information in from a CSV file, you can link MySQL directly to it and upload the information using the following SQL syntax.
To import an Excel file into MySQL, first export it as a CSV file. Remove the CSV headers from the generated file, along with any empty data that Excel may have put at the end of it.
You can then import it into a MySQL table by running:
load data local infile 'uniq.csv' into table tblUniq fields terminated by ','
enclosed by '"'
lines terminated by '\n'
(uniqName, uniqCity, uniqComments)
as read on: Import CSV file directly into MySQL
Use the LOAD DATA INFILE statement.
https://dev.mysql.com/doc/refman/5.1/en/load-data.html
Load the data in a temporary table and use that for inserting to your main table with one statement.
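A sketch of that temporary-table route (table and column names mirror the discount example above; the file name is illustrative):

```sql
-- Stage the CSV in a temporary table, then insert in a single statement.
CREATE TEMPORARY TABLE discount_staging LIKE discount;

LOAD DATA LOCAL INFILE 'discount.csv'
INTO TABLE discount_staging
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(id, title, expired_date, amount, block);

INSERT INTO discount (id, title, expired_date, amount, block)
SELECT id, title, expired_date, amount, block
FROM discount_staging;
```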
You can insert the data this way, accumulating every row into one multi-row INSERT and running it once after the loop:
$deleterecords = "TRUNCATE TABLE discount"; //empty the table of its current records
mysql_query($deleterecords);
//Import uploaded file to Database
$handle = fopen($name, "r");
$i = 0;
$import = "INSERT into discount(id,title,expired_date,amount,block) values ";
while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
    if ($i > 0) {
        $import .= "('".$data[0]."','".$data[1]."','".$data[2]."','".$data[3]."','".$data[4]."'),";
    }
    $i = 1; //skip the header row
}
fclose($handle);
$import = rtrim($import, ','); //drop the trailing comma
mysql_query($import) or die(mysql_error());
Instead of issuing multiple inserts, build one big query and execute a single INSERT.
<?php
$deleterecords = "TRUNCATE TABLE discount"; //empty the table of its current records
mysql_query($deleterecords);
//Import uploaded file to Database
$handle = fopen($name, "r");
$i = 0;
$what_to_insert = array();
while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
    if ($i > 0) {
        array_push($what_to_insert, "('".$data[0]."','".$data[1]."','".$data[2]."','".$data[3]."','".$data[4]."')");
    }
    $i = 1;
}
fclose($handle);
if (count($what_to_insert) > 0) {
    $import = "INSERT into discount(id,title,expired_date,amount,block) values " . implode(",", $what_to_insert);
    mysql_query($import) or die(mysql_error()); //query
}
?>
If phpMyAdmin is available you can use the CSV import feature.

Importing .csv file into mysql in php

I have a .csv file but I am unable to import it into the database. I have parsed my .csv file using the code below. Can you please help me with how to insert the data into MySQL?
My code is:
$fp = fopen('test.csv', 'r') or die("can't open file");
print "<table>\n";
while ($csv_line = fgetcsv($fp, 1024)) {
    print '<tr>';
    for ($i = 0, $j = count($csv_line); $i < $j; $i++) {
        print '<td>'.$csv_line[$i].'</td>';
        $data[] = $csv_line[$i];
    }
    print "</tr>\n";
}
print "</table>\n";
fclose($fp) or die("can't close file");
In MySQL we can import data from a CSV file into a table very easily:
LOAD DATA LOCAL INFILE 'import.csv' INTO TABLE from_csv FIELDS TERMINATED BY ','  LINES TERMINATED BY '\n'  (FIELD1, FIELD2, FIELD3);
http://dev.mysql.com/doc/refman/5.0/en/load-data.html
Example:
http://www.techtrunch.com/linux/import-csv-file-mysql-table
The code you posted takes the CSV file and outputs it to the browser as an HTML table.
You mentioned importing into a MySQL table, but there is no MySQL information in your code. Create a MySQL table and map the fields you are importing to the database fields in that table. The easiest way for me is to use phpMyAdmin to generate the LOAD DATA SQL, although the LOAD DATA LOCAL INFILE mentioned in another answer will work as well if you understand enough to adapt it.
When I learned how to do it I used this tutorial.
http://vegdave.wordpress.com/2007/05/19/import-a-csv-file-to-mysql-via-phpmyadmin/
You can use this code. I hope this will be helpful.
//after uploading the csv file from an upload form field
$handle = fopen($_FILES['csvfile']['tmp_name'], "r");
$header = fgetcsv($handle);
while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
    $import = "INSERT into csvtest($header[0], $header[1], $header[2], $header[3], $header[4], $header[5], $header[6]) values ('$data[0]', '$data[1]', '$data[2]', '$data[3]', '$data[4]', '$data[5]', '$data[6]')";
    mysql_query($import) or die(mysql_error());
}
fclose($handle);
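Note that interpolating CSV fields straight into the SQL string, as above, breaks on values containing quotes and is open to SQL injection. A safer sketch builds a parameterized statement instead (the function and table names are illustrative; the resulting SQL and values can be fed to PDO::prepare()/execute() or the mysqli equivalents):

```php
<?php
// Build a parameterized INSERT for one CSV row.
// NB: column names cannot be bound as parameters, so $header should be
// validated against a whitelist of known column names before use.
function buildInsert(string $table, array $header, array $row): array
{
    $cols = implode(', ', $header);
    $placeholders = implode(', ', array_fill(0, count($row), '?'));
    $sql = "INSERT INTO $table ($cols) VALUES ($placeholders)";
    return array($sql, $row); // SQL with ? placeholders, plus values to bind
}
```
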

How to loop through CSV mysql update quicker?

I have a CSV file with 26,000 rows which I'm looping through, updating records (sometimes several per row) in a table with 250,000+ records. At the moment it takes ages! I was wondering if there is a faster way to do this (in code, in MySQL, etc.).
$row = 1;
if (($handle = fopen("zip-codes-database-DELUXE-BUSINESS2.csv", "r")) !== FALSE) {
    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
        if ($row > 1) {
            # the area code is in $data[20], the city name in $data[21]
            $update = "UPDATE ".TBLPREFIX."cities SET data = '".escape(serialize($data))."' WHERE area_code = ".$data[20]." AND title = '".trim(strtoupper($data[21]))."'";
            mysql_query($update) or die(mysql_error());
        }
        $row++;
    }
    fclose($handle);
}
Based on nothing, I might try:
get the CSV into a table via the command line or LOAD DATA INFILE
update the records into a temp table using an INSERT ... SELECT where you join the old and new
move the temp table back onto the original (delete/rename)
Seems like it would be faster... if a bit kludgy.
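The join step above can also be done as one in-place statement, skipping the table swap (the staging-table and column names are illustrative, matching the question's cities table):

```sql
-- Assuming the 26,000-row CSV has been loaded into a staging table
-- zip_staging (via LOAD DATA INFILE or the command line) with the join
-- keys indexed, one join update replaces the 26,000 individual queries:
UPDATE cities AS c
JOIN zip_staging AS s
  ON s.area_code = c.area_code
 AND s.title = c.title
SET c.data = s.data;
```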
