LOAD DATA INFILE executes but zero rows affected - PHP

I added two additional columns to my CSV file containing some Excel formulas, because I need to download the MySQL data as an Excel file in which those formulas will work.
But after adding these two additional columns, the data is no longer inserted into the table.
My code is:
$a = file('/Applications/XAMPP/xamppfiles/htdocs/upload/data.csv'); // get array of lines
$new = '';
$i = 1;
//$formula1 = "=IF(A$i>=40,2,IF(A$i=20,1))";
foreach ($a as $line) {
    $line = trim($line); // remove end of line
    $line .= "|=IF(A$i>=40,2,IF(A$i=20,1))"."|=IF(COUNTIF(\$B:\$B,B".$i.")=1,1,C".$i."/COUNTIF(\$B:\$B,B".$i."))"; // append new columns
    $new .= $line.PHP_EOL; // append end of line
    $i++;
}
$outputfile = file_put_contents('/Applications/XAMPP/xamppfiles/htdocs/upload/data.csv', $new); // overwrite the same file with new data
$termit = "|";
$encls = "'\"'";
$lyn = "'\r\n'";
$cvsfile = "'/Applications/XAMPP/xamppfiles/htdocs/upload/data.csv'";
$dalete = mysql_query("DELETE FROM upload_excel_file");
$data_upload = mysql_query("LOAD DATA INFILE $cvsfile INTO TABLE $cvsfile FIELDS TERMINATED BY '|' ENCLOSED BY $encls LINES TERMINATED BY $lyn IGNORE 1 LINES;");
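Worth noting: the statement above puts the file path where the table name belongs (INTO TABLE $cvsfile). A minimal sketch of building the corrected statement, assuming the target table is upload_excel_file as in the preceding DELETE query:

```php
<?php
// Corrected LOAD DATA statement: a table name, not the file path, follows INTO TABLE.
// Assumes the target table is upload_excel_file, matching the DELETE query above.
$csvFile = '/Applications/XAMPP/xamppfiles/htdocs/upload/data.csv';
$table   = 'upload_excel_file';

$sql = "LOAD DATA INFILE '$csvFile' INTO TABLE $table"
     . " FIELDS TERMINATED BY '|' ENCLOSED BY '\"'"
     . " LINES TERMINATED BY '\\r\\n' IGNORE 1 LINES";

// Run it with mysqli (the old mysql_* API was removed in PHP 7), e.g.:
// $ok = mysqli_query($db, $sql);
echo $sql;
```

Also check mysql_error() after the query; LOAD DATA silently loads zero rows only when the statement itself parses, so an error message usually points straight at the problem.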

Related

Incrementing a number inside a block string using fwrite

I'm creating a file by retrieving data from a MySQL database.
$fo = fopen($newName, 'w') or die("can't open file");
After querying the DB, let's start writing:
<?php fwrite($fo, $row['entetequestionmonochoix']. PHP_EOL);?>
The correspondence for 'entetequestionmonochoix' is the following:
\begin{question}{01}\scoring{b=1,e=0,m=0,V=0}
I'm facing a small obstacle when generating the file.
The while loop allows me to read and then insert, but inside the correspondence I need to increment {01} after each loop:
\begin{question} **{01}** \scoring{b=1,e=0,m=0,V=0}
Use str_replace() to replace {01} with a string containing an incrementing variable.
$i = 1;
while ($row = $results->fetch_assoc()) {
    $line = str_replace('{01}', sprintf('{%02d}', $i++), $row['entetequestionmonochoix']);
    fwrite($fo, $line . PHP_EOL);
}
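A quick standalone check of the replacement (the sample line and counter value are hardcoded for illustration):

```php
<?php
// Replace the fixed {01} placeholder with a zero-padded counter value.
$template = '\begin{question}{01}\scoring{b=1,e=0,m=0,V=0}';
$i = 3; // stands in for the loop counter
$line = str_replace('{01}', sprintf('{%02d}', $i), $template);
echo $line; // {01} becomes {03}
```

sprintf('{%02d}', $i) keeps the two-digit, zero-padded format of the original placeholder, so {01}, {02}, ... {10}, {11} all line up.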

PHP: How to handle/parse CSV files that have missing columns

I have many CSV files generated by a third party, over which I have no say or control.
So each day I must import this CSV data into MySQL.
Some tables have the correct number of columns matching the header.
Others do not.
Even when I used a prepared statement, it still did not import.
I tried to create a repair-CSV function that adds extra columns to each row if its column count is less than the header's column count.
As part of this project I am using the Composer package league/csv:
https://csv.thephpleague.com/
Here is my function code:
public function repaircsv(string $filepath) {
    // make sure incoming file exists
    if (!file_exists($filepath)) {
        // return nothing
        return;
    }
    // setup variables
    $tempfile = pathinfo($filepath, PATHINFO_DIRNAME).'temp.csv';
    $counter = 0;
    $colcount = 0;
    $myline = '';
    // check if temp file exists; if it does, delete it
    if (file_exists($tempfile)) {
        // delete the temp file
        unlink($tempfile);
    }
    // C:\Users\admin\vendor\league\csv
    require('C:\Users\admin\vendor\league\csv\autoload.php');
    // step one: get header column count
    $csv = Reader::createFromPath($filepath);
    // set the header offset
    $csv->setHeaderOffset(0);
    // returns the CSV header record
    $header = $csv->getHeader();
    // get the header column count
    $header_count = count($header);
    // check if greater than zero and not null
    if ($header_count < 1 || empty($header_count)) {
        // return nothing
        return $header_count;
    }
    // loop thru csv file
    // now read the file line by line, skipping line 1
    $file = fopen($filepath, 'r');
    $temp = fopen($tempfile, 'w');
    // loop thru each line
    while (($line = fgetcsv($file)) !== FALSE) {
        // if first row, just straight append
        if ($counter = 0) {
            // append line to temp file
            fputcsv($temp, $line);
        }
        // for all other rows, compare column count to header column count
        if ($counter > 0) {
            // get column count for normal rows
            $colcount = count($line);
            // compare to header column count
            $coldif = $header_count - $colcount;
            // loop til difference is zero
            while ($colcount != $header_count) {
                // add extra comma to line
                $line .= ',';
                // get new column count
                $colcount = count($line);
            }
            // append to temp file
            fputcsv($temp, $line);
            // show each line
            $myline .= 'Line: ['.$line.']<br/><br/>';
        }
        // increment counter
        $counter++;
    }
    // check file size of temp file
    $fs = filesize($tempfile);
    // if below 200, ignore and do not copy
    if ($fs > 200) {
        // copy temp to original filename
        copy($tempfile, $filepath);
    }
    return $myline;
}
The logic is to copy the original CSV file to a new temp CSV file, adding extra commas to rows that have missing columns.
Thank you for any help.
Edit: The various CSVs contain private data, so I cannot share them.
But say, for example, that I download multiple CSVs of different data daily.
Each CSV has a header row and data rows.
If the number of columns in a row isn't exactly the same as in the header, the import errors out.
If there are any special characters, it errors out.
There are thousands of rows of data.
The code above is my first attempt at fixing rows that have missing columns.
Here is an example:
FirstName, LastName, Email
Steve,Jobs
,Johnson,sj#johns.com
Just a very small example.
I have no control over how the CSVs are created, but I do control the download and import process.
I then use the CSV data to update MySQL tables.
I have tried LOAD DATA INFILE, but that errors out too.
So I need to fix the CSV files after they are downloaded.
Any ideas?
Do not mix array and string: instead of
$line .= ',';
do
$line[] = '';
Also fix:
$myline .= 'Line: ['.implode(',', $line).']<br/><br/>';
As a suggestion, you can replace your inner while loop with:
$line = array_pad($line, $header_count, ''); // append missing items
$line = array_slice($line, 0, $header_count); // remove any excess items
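Putting the two calls together, a minimal standalone sketch using the example rows from the question (header and rows hardcoded for illustration):

```php
<?php
// Normalize each row to exactly the header's column count:
// array_pad fills missing trailing columns, array_slice trims excess ones.
$header = ['FirstName', 'LastName', 'Email'];
$rows = [
    ['Steve', 'Jobs'],                      // missing Email column
    ['', 'Johnson', 'sj#johns.com', 'x'],   // one excess column
];

$fixed = [];
foreach ($rows as $row) {
    $row = array_pad($row, count($header), '');  // append missing items
    $row = array_slice($row, 0, count($header)); // remove excess items
    $fixed[] = $row;
}

print_r($fixed);
```

Each $row stays an array throughout, so fputcsv($temp, $row) then writes a line with the right number of commas.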

Importing CSV to a MySQL table and adding a date to each row using PHP: execution time

I wrote PHP code that does all the work of finding CSV files in a given directory and importing each CSV file into the right table. The problem is that one of the CSV files contains 1M rows! Yes, 1M rows :/ So it takes more than 15 minutes to import. This is the issue. How can I improve the execution time?
$csv = new SplFileObject($file, 'r');
$csv->setFlags(SplFileObject::READ_CSV);
// get column names
$tableColumns = $db->getColumns('daily_transaction');
print_r($tableColumns);
// get lines from the csv file, skipping the first one
foreach (new LimitIterator($csv, 1) as $line) {
    $i = 0;
    $data = array();
    foreach ($tableColumns as $Columns) {
        // print($line[$i]."<br>");
        $data[$Columns] = $line[$i];
        $i++;
    }
    $data['file_name'] = $infoNAME;
    $data['file_date'] = $file_date;
    $insert_id = $db->arrayToInsert('daily_transaction', $data);
}
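Per-row INSERTs are usually what makes a 1M-row import slow. One common remedy is batching many rows into a single multi-row INSERT statement. A sketch that just builds the SQL string (the column names are illustrative, and real code should escape values through the driver, e.g. mysqli_real_escape_string, or use a prepared statement):

```php
<?php
// Build one multi-row INSERT for a whole batch instead of one INSERT per row.
$columns = ['id', 'amount', 'file_name']; // illustrative column names
$batch = [
    [1, '9.99', 'a.csv'],
    [2, '4.50', 'a.csv'],
];

$values = [];
foreach ($batch as $row) {
    $quoted = array_map(function ($v) {
        return "'".addslashes((string)$v)."'"; // placeholder escaping for the sketch
    }, $row);
    $values[] = '('.implode(',', $quoted).')';
}
$sql = 'INSERT INTO daily_transaction ('.implode(',', $columns).') VALUES '
     . implode(',', $values);

echo $sql;
```

Flushing batches of, say, 1000 rows inside a transaction typically reduces the runtime dramatically; switching to LOAD DATA LOCAL INFILE (adding the file_name/file_date columns to the CSV beforehand) is usually faster still.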

Export to CSV not same as table PHP

I am trying to download table data in CSV format. The data in one of the fields contains a comma (",").
E.g.: Doe, John
When I download the CSV file, the data after the comma is shifted into the next column. But I want the entire value, including the comma, in the same column.
The Code I used as follows:
<?php
include('dbconfig.php');
// headers telling the browser this is a download
header('Content-Type: text/csv');
header('Content-Disposition: attachment;filename=download.csv');
// select the table to export
$sql = "SELECT * FROM tablename";
$select_table = mysqli_query($db, $sql);
$rows = mysqli_fetch_assoc($select_table);
if ($rows)
{
    getcsv(array_keys($rows));
}
while ($rows)
{
    getcsv($rows);
    $rows = mysqli_fetch_assoc($select_table);
}
// output one row of field values as a CSV line
function getcsv($no_of_field_names)
{
    $separate = '';
    // do the action for all field names
    foreach ($no_of_field_names as $field_name)
    {
        if (preg_match('/\\r|\\n|,|"/', $field_name))
        {
            $field_name = '' . str_replace('<em>', '', $field_name) . '';
        }
        echo $separate . $field_name;
        // separate with the comma
        $separate = ',';
    }
    // make a new row and line
    echo "\r\n";
}
?>
Can someone help me get past this issue?
Thanks
Make sure you escape the comma. Typically, values that contain sensitive characters (such as "," and "\n") are surrounded by double quotes.
So your output should be:
"Doe, John",52,New York
You can either write your own escape function or use PHP's fputcsv(). It writes to a file handle, which is a bit inconvenient, but you can open a handle to stdout (note that fopen() requires a mode argument):
$handle = fopen('php://stdout', 'w');
fputcsv($handle, array('Doe, John', 52, 'New York'));
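A self-contained check of fputcsv's quoting, writing to an in-memory stream instead of stdout:

```php
<?php
// fputcsv encloses fields that contain the delimiter in double quotes.
$handle = fopen('php://temp', 'r+');
fputcsv($handle, ['Doe, John', 52, 'New York']);
rewind($handle);
$line = stream_get_contents($handle);
fclose($handle);
echo $line;
```

In a download script, replace php://temp with php://stdout (or php://output) so the escaped line goes straight to the browser.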

Deleting an SQL dump file

I create a csv file with the SQL query:
$sql = "SELECT 'id','username','name','email',
'phone_number','city',
'batch','course_type' UNION ALL
SELECT id, username, name, email,
phone_number, city, batch, course_type
FROM users INTO OUTFILE '".$file_path."' ".
"FIELDS TERMINATED BY ','
ENCLOSED BY '\"'
LINES TERMINATED BY '\n'";
After triggering a browser force-download of this file, I need to delete it from the /tmp directory. But since the file is owned by mysql, the PHP code cannot delete it. How do I change the permissions on this file in my code so that it can be deleted?
One way to do this right (this is a rough example of how it could be done) is to return the data from the database as an array and iterate through it:
// tell the browser that this is a download
header("Content-type: text/csv");
header("Content-Disposition: attachment; filename=list.csv");

// column separator
$separator = ';';
// field delimiter
$field_delimiter = '"';
// line end
$line_end = "\r\n";

foreach ($result as $row):
    $i = 0;
    foreach ($row as $item):
        // do not add separator before the first item
        if ($i > 0)
        {
            echo $separator;
        }
        echo $field_delimiter.$item.$field_delimiter;
        $i++;
    endforeach;
    // line end
    echo $line_end;
endforeach;
Do not output anything else, just the CSV content.
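One caveat with the manual loop above: a value that itself contains the separator or the delimiter is written unescaped. fputcsv() handles that escaping for you; a sketch of the same emission with a ';' separator and a hardcoded sample row, buffered in an in-memory stream:

```php
<?php
// Emit CSV with fputcsv, which quotes values containing the separator.
$result = [
    ['1', 'jdoe', 'Doe; John'], // last value contains the ';' separator
];
$out = fopen('php://temp', 'r+');
foreach ($result as $row) {
    fputcsv($out, $row, ';');
}
rewind($out);
$csv = stream_get_contents($out);
fclose($out);
echo $csv;
```

In the download script, open php://output instead of php://temp and the lines go straight to the browser after the headers.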
