PHP MySQL INSERT causing blank first row

I can't seem to figure this out, but my code is inserting one blank row (the first row). The blank row has a blank car name, a blank car brand, and only "0.00" in the car price. This code uploads a CSV file, reads the data from it, and inserts it into the database. The first row of the file is the column headers, and I was assuming that the first call to $GetHeaders = fgetcsv($file); would consume the headers.
$file = fopen($_FILES['fileupload']['tmp_name'], "r");
$GetHeaders = fgetcsv($file);
$CarName = array_search('Car Name', $GetHeaders);
$CarBrand = array_search('Car Brand', $GetHeaders);
$CarPrice = array_search('Car Price', $GetHeaders);
$theQue = "";
while (!feof($file))
{
    $GetHeaders = fgetcsv($file);
    $theQue .= "INSERT INTO cardb (CarName, CarBrand, Carprice) VALUES ('$GetHeaders[$CarName]', '$GetHeaders[$CarBrand]', '$GetHeaders[$CarPrice]');";
}
fclose($file);
if (mysqli_multi_query($connection, $theQue))
{
    echo "Success";
}

Posting and closing this with these findings so others who run into this issue with fgetcsv get the hint.
This is weird, but after a couple more tests, I found that fgetcsv reads each row from the CSV file, plus one additional row that is NULL.
It's as if there's an invisible row at the end of the file. It only shows up as the first row in the database, probably because of sorting, but the last row fgetcsv returns is NULL. I thought feof() should have treated that as the end of the file?
What I did to detect the bug is something I should have done in the first place: var_dump the parsed rows, which displayed all the car names, and the last car was named "NULL".
Thank you for the help guys, each of you gave me ideas which led me to finding this prick lol
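For reference, here is a minimal sketch of the same loop rewritten to test fgetcsv()'s return value instead of feof(), which sidesteps the phantom row entirely: fgetcsv() returns false once nothing is left to read, and a completely blank line comes back as an array containing a single null.
$file = fopen($_FILES['fileupload']['tmp_name'], "r");
$headers = fgetcsv($file); // consume the header row once
while (($row = fgetcsv($file)) !== false) {
    if ($row === array(null)) {
        continue; // skip blank lines
    }
    // ... build the INSERT from $row here ...
}
fclose($file);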

A couple of things could be causing this. First, try deleting the header row in your CSV file. Next, put in a check that the data row is not blank or null before writing it to the database.
<?php
// open the csv file for reading
$jhandle = fopen($file_path, 'r');
$row_limit = 1000; // maximum line length passed to fgetcsv()
while (($jdata = fgetcsv($jhandle, $row_limit, ",")) !== FALSE) {
    $car_name = $jdata[0];
    $car_brand = $jdata[1];
    $car_price = $jdata[2];
    // write the row only if at least one field is non-blank
    if (($car_name != '') || ($car_brand != '') || ($car_price > 0)) {
        // write to your database here.
    }
// close your while statement
}
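For the database write itself, a prepared statement keeps the row values out of the SQL string. A minimal sketch, assuming a mysqli connection in $connection and the cardb table from the question:
$stmt = mysqli_prepare($connection, "INSERT INTO cardb (CarName, CarBrand, Carprice) VALUES (?, ?, ?)"); // prepare once, before the loop
// then, in place of the "write to your database here" comment:
mysqli_stmt_bind_param($stmt, 'ssd', $car_name, $car_brand, $car_price);
mysqli_stmt_execute($stmt);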

Related

PHP How to handle/parse CSV files that have missing columns

I have many CSV files generated by a third party, over which I have no say or control.
Each day I must import this CSV data into MySQL.
Some tables have the correct number of columns matching the header.
Others do not.
Even when I used a prepared statement, it still did not import.
I tried to create a repair-CSV function to add extra columns to each row if its column count was less than the header's column count.
As part of this project I am using the Composer package league/csv.
https://csv.thephpleague.com/
Here is my function code:
public function repaircsv(string $filepath) {
    // make sure incoming file exists
    if (!file_exists($filepath)) {
        // return nothing
        return;
    }
    // setup variables
    $tempfile = pathinfo($filepath, PATHINFO_DIRNAME).'temp.csv';
    $counter = 0;
    $colcount = 0;
    $myline = '';
    // check if temp file exists; if it does, delete it
    if (file_exists($tempfile)) {
        // delete the temp file
        unlink($tempfile);
    }
    // C:\Users\admin\vendor\league\csv
    require('C:\Users\admin\vendor\league\csv\autoload.php');
    // step one: get header column count
    $csv = Reader::createFromPath($filepath);
    // set the header offset
    $csv->setHeaderOffset(0);
    // returns the CSV header record
    $header = $csv->getHeader();
    // get the header column count
    $header_count = count($header);
    // check if greater than zero and not null
    if ($header_count < 1 || empty($header_count)) {
        // return nothing
        return $header_count;
    }
    // loop thru csv file
    // now read file line by line, skipping line 1
    $file = fopen($filepath, 'r');
    $temp = fopen($tempfile, 'w');
    // loop thru each line
    while (($line = fgetcsv($file)) !== FALSE) {
        // if first row, just straight append
        if ($counter = 0) {
            // append line to temp file
            fputcsv($temp, $line);
        }
        // for all other rows, compare column count to header column count
        if ($counter > 0) {
            // get column count for normal rows
            $colcount = count($line);
            // compare to header column count
            $coldif = $header_count - $colcount;
            // loop til difference is zero
            while ($colcount != $header_count) {
                // add extra comma to line
                $line .= ',';
                // get new column count
                $colcount = count($line);
            }
            // append to temp file
            fputcsv($temp, $line);
            // show each line
            $myline .= 'Line: ['.$line.']<br/><br/>';
        }
        // increment counter
        $counter++;
    }
    // check file size of temp file
    $fs = filesize($tempfile);
    // if below 200, ignore and do not copy
    if ($fs > 200) {
        // copy temp to original filename
        copy($tempfile, $filepath);
    }
    return $myline;
}
The logic is to copy the original CSV file to a new temp CSV file, adding extra commas to rows of data that have missing columns.
Thank you for any help.
Edit: The various CSVs contain private data, so I cannot share them.
But let's say, for example, I download multiple CSVs for different data daily.
Each CSV has a header row and data.
If the number of columns in each row isn't exactly the same as the number of columns in the header, it errors out.
If there are any special characters, it errors out.
There are thousands of rows of data.
The code above is my first attempt to fix rows that have missing columns.
Here is an example:
FirstName,LastName,Email
Steve,Jobs
,Johnson,sj#johns.com
Just a very small example.
I have no control over how the CSVs are created, but I do control the download and import process.
I then use the CSV data to update MySQL tables.
I have tried LOAD DATA INFILE but that errors out too.
So I need to fix the CSV files after they are downloaded.
Any ideas?
Do not mix array and string; instead of
$line .= ',';
do
$line[] = '';
Also fix:
$myline .= 'Line: ['.implode(',', $line).']<br/><br/>';
Suggestion: you can replace your inner while loop with:
$line = array_pad($line, $header_count, ''); // append missing items
$line = array_slice($line, 0, $header_count); // remove any excess items
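Putting those fixes together, the row-repair loop might look like this (a sketch reusing the question's variable names):
while (($line = fgetcsv($file)) !== FALSE) {
    if ($counter === 0) {
        fputcsv($temp, $line); // copy the header row unchanged
    } else {
        $line = array_pad($line, $header_count, ''); // append missing items
        $line = array_slice($line, 0, $header_count); // remove any excess items
        fputcsv($temp, $line);
        $myline .= 'Line: ['.implode(',', $line).']<br/><br/>';
    }
    $counter++;
}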

Avoid inserting duplicate data in CSV with PHP

I store data into a CSV file like this:
$out = fopen('wpo_stock.csv', 'a');
fputcsv($out, array($order_id, $product_id, $product_name, $product_quantity, $regular_price));
fclose($out);
That stores my data in the CSV perfectly.
The problem is: every time the page is refreshed, it inserts duplicate data into the CSV file.
How can I avoid inserting duplicate data, keyed by order_id (order_id is a unique value in my project)?
Code Updated:
$handle = fopen('wpo_stock.csv', 'r');
while (($data = fgetcsv($handle, ",")) !== FALSE) {
    if ($data[0] != $order_id) {
        $out = fopen('wpo_stock.csv', 'a');
        fputcsv($out, array($order_id, $product_id, $product_name, $product_quantity, $regular_price));
    }
    break;
}
fclose($handle);
I assume you are trying to overwrite your data every time the page refreshes. In that case, you should open your file like this: $out = fopen('wpo_stock.csv', 'w');.
But if you are trying to append new data, then you need to read all the data from your file and compare it with the new row.
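A minimal sketch of that append-only approach, reusing the question's variable names: read the existing order IDs first, then append the row only if its order_id is not already present.
$existing = array();
if (($in = fopen('wpo_stock.csv', 'r')) !== FALSE) {
    while (($data = fgetcsv($in)) !== FALSE) {
        $existing[] = $data[0]; // first column holds the order_id
    }
    fclose($in);
}
if (!in_array($order_id, $existing)) {
    $out = fopen('wpo_stock.csv', 'a');
    fputcsv($out, array($order_id, $product_id, $product_name, $product_quantity, $regular_price));
    fclose($out);
}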

PHP Upload CSV and Get Column Data

What I am trying to do is upload a CSV file with PHP. The first line contains the column names and below that is the data (of course). Each column name can change depending on what the end user uploads, so the main column names we need can change position (A1 or B1, etc.). So let's say the column I need is B1, and I need to get all the data in B. Not sure how to go about it. So far this is what I have. Any ideas?
ini_set("allow_url_fopen", 1);
$handle = fopen($_FILES['fileToUpload']['tmp_name'], 'r') or die('cannot open the file');
while (!feof($handle)) {
    $data[] = fgetcsv($handle);
}
var_dump($data);
fclose($handle);
UPDATE:
I am importing this file from .CSV into PHP.
I need to search for the column headers that start with "SKU" and then "COST".
Once those are found, I want the whole columns… B, E. But those column letters can change, depending on how the file is exported by the end user. I do not need the rows, just the columns.
Once the file is uploaded to the server, use something like the following code to parse it and actually use it as an array:
Code:
$filename = "upload/sample.csv";
if (($handle = fopen($filename, 'r')) !== FALSE) {
    while (($row = fgetcsv($handle, 1000, ",")) !== FALSE) {
        print_r($row);
    }
}
That's one way of doing it; you can also read more about fgetcsv in the PHP manual.
If you want the value of a specific column for each row, then you need to loop through the results and pick it out. It looks like you are getting an array of arrays, so... (EDITED to get the column based on the header name):
$header = $data[0];
unset($data[0]); // delete the header row so those values don't show in results
$sku_index = '';
$cost_index = '';
// get the index of the desired columns by name
for ($i = 0; $i < count($header); $i++) {
    if ($header[$i] == 'SKU') {
        $sku_index = $i;
    }
    if ($header[$i] == 'COST') {
        $cost_index = $i;
    }
}
// loop through each row and grab the values for the desired columns
foreach ($data as $row) {
    echo $row[$sku_index];
    echo $row[$cost_index];
}
That should get what you want.
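A more compact way to find the two indexes is array_search(), assuming the header names match exactly:
$sku_index = array_search('SKU', $header);
$cost_index = array_search('COST', $header);
if ($sku_index === false || $cost_index === false) {
    die('SKU or COST column not found');
}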

Want to insert data into an Oracle table from the 2nd row of a text file in PHP

I want to insert data into an Oracle database from a text file. The first row of the text file contains the header. I want to skip the first line. Below is my code.
for ($i = 1; ($data = fgetcsv($handle, 10000, ",")) !== FALSE; $i++) {
    // The query uses placeholders for data
    $sql_insert = oci_parse($conn, 'insert into auto_debit_data_input (input_id, req_by, company_name) values (auto_debit_input_id_seq.nextval, :req_by, :company_name)');
    oci_bind_by_name($sql_insert, ':req_by', $data[0]);
    oci_bind_by_name($sql_insert, ':company_name', $data[1]);
    $result = oci_execute($sql_insert);
    if (!$result) {
        $errmsg = "No Data inserted. Please check all field";
        //exit;
    }
}
Below is my file data:
REQ_BY,Name
Mr X, Bangladesh
Mr Y, India
My code inserts starting from the header of the file, but I want to insert from the 2nd line. Please help me fix this.
I have also tried using the line below, but no luck:
for ($i = 2; ($data = fgetcsv($handle, 10000, ",")) !== FALSE; $i++)
Just add the following lines at the start of your loop (if you are starting from $i = 1):
if ($i == 1) {
    continue;
}
Not tested, but I hope it will work.
Logic:
Skip the first iteration of the loop... :)
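An alternative that avoids the counter entirely is to consume the header with one extra fgetcsv() call before the loop. A minimal sketch:
fgetcsv($handle, 10000, ","); // read and discard the header row
while (($data = fgetcsv($handle, 10000, ",")) !== FALSE) {
    // bind and execute the INSERT as before
}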

How to loop through a CSV and update MySQL quicker?

I have a CSV file with 26,000 rows which I'm looping through, updating records (sometimes multiple) in a table with 250,000+ records. At the moment, it's taking ages! I was wondering if there is a quicker way to do this (in code or MySQL, etc.).
$row = 1;
if (($handle = fopen("zip-codes-database-DELUXE-BUSINESS2.csv", "r")) !== FALSE) {
    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
        if ($row > 1) {
            # GET THE AREA CODE FROM data 20
            # name is: 21
            $insert = "UPDATE ".TBLPREFIX."cities SET data = '".escape(serialize($data))."' WHERE area_code = ".$data[20]." AND title = '".trim(strtoupper($data[21]))."'";
            mysql_query($insert) or die(mysql_error());
        }
        $row++;
    }
    fclose($handle);
}
Based on nothing, I might try:
1. get the CSV into a table via the command line or LOAD DATA INFILE
2. update the records into a temp table using an INSERT ... SELECT where you join the old and new
3. move the temp table back onto the original (delete/rename)
Seems like it would be faster... if a bit kludgy. A rough sketch follows.
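A rough sketch of that approach, assuming a mysqli connection in $connection and a hypothetical staging table cities_staging whose columns mirror the CSV (the serialize($data) step from the question would still need separate handling, so this only shows the general shape):
// 1. bulk-load the CSV into the staging table in one statement
mysqli_query($connection, "
    LOAD DATA LOCAL INFILE 'zip-codes-database-DELUXE-BUSINESS2.csv'
    INTO TABLE cities_staging
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
    IGNORE 1 LINES");
// 2. one joined UPDATE instead of 26,000 round trips
mysqli_query($connection, "
    UPDATE ".TBLPREFIX."cities c
    JOIN cities_staging s ON c.area_code = s.area_code
                         AND c.title = UPPER(TRIM(s.title))
    SET c.data = s.data");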
