PHP Export to CSV [duplicate]

Possible Duplicate:
PHP Export Excel to specific Path?
I'm not really familiar with exporting to Excel or CSV from PHP, but I'm using PHP and MySQL for a local point of sale.
The code below actually works, but not the way it should: all records are placed in a single row of the CSV file. How can I fix that? Also, how do I stop overwriting the same file? When I click the button to export the CSV, it should check whether a CSV file already exists and, if it does, create a new one.
Thank you.
require_once('connect_db.php');

$items_array = array();
$result = mysql_query("SELECT * FROM sold_items");
while ($row = mysql_fetch_array($result)) {
    $items_array[] = $row['item_no'];
    $items_array[] = $row['qty'];
}

$f = fopen('C:/mycsv.csv', 'w');
fputcsv($f, $items_array);
fclose($f);

fputcsv writes a single row/record per call, and includes the row/record terminator in its output. You will need to call fputcsv once for each line of the report.
dbf's sequential file-naming solution works well in many cases. Personally, I've found appending a timestamp helpful, as it requires less I/O when a collection of files already exists. Additionally, it makes it possible to tell when a report is from without having to open each file, even in cases where the report was modified/copied/touched.
Minor detail: I adjusted the query to select just the columns you're using.
<?php
require_once('connect_db.php');

$result = mysql_query("SELECT item_no, qty FROM sold_items");

$timestamp = date('Ymd-His');
$f = fopen("C:/mycsv-{$timestamp}.csv", 'w');

// Headers
fputcsv($f, array('Item No', 'Qty'));

while ($row = mysql_fetch_row($result)) {
    fputcsv($f, $row);
}

fclose($f);

First of all, append each record as its own row:
$items_array[] = array($row['item_no'], $row['qty']);
Second, use a variable to store the file name:
$filename = $name = "mycsv";
$index = 1;
while (file_exists("C:/{$filename}.csv")) {
    $filename = $name . $index;
    $index++;
}
Now you can save it ;)
$f = fopen("C:/{$filename}.csv", 'w');
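Putting both fixes together, a minimal sketch (reusing connect_db.php and the sold_items table from the question; the mysql_* calls mirror the original code):

<?php
require_once('connect_db.php');

// Find a file name that is not taken yet: mycsv.csv, mycsv1.csv, mycsv2.csv, ...
$name = 'mycsv';
$filename = $name;
$index = 1;
while (file_exists("C:/{$filename}.csv")) {
    $filename = $name . $index;
    $index++;
}

$f = fopen("C:/{$filename}.csv", 'w');
fputcsv($f, array('Item No', 'Qty')); // header row

$result = mysql_query("SELECT item_no, qty FROM sold_items");
while ($row = mysql_fetch_row($result)) {
    fputcsv($f, $row); // one fputcsv() call per record = one CSV row per record
}

fclose($f);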

Related

Timeout while parsing CSV file

I have a .csv file that is about 5 MB (~45,000 rows). What I need to do is run through each row of the file and check whether the ID on each line is already in a table in my database. If it is, I can delete that row from the file.
I did a good amount of research on the most memory-efficient way to do this, so I've been using a method of writing the lines that don't need to be deleted to a temporary file and then renaming that file to the original. Code below:
$file = fopen($filename, 'r');
$temp = fopen($tempFilename, 'w');

while (($row = fgetcsv($file)) != FALSE) {
    // id is the 7th value in the row
    $id = $row[6];

    // check table to see if id exists
    $sql = "SELECT id FROM table WHERE id = $id";
    $result = mysqli_query($conn, $sql);

    // if id is in the database, skip to next row
    if (mysqli_num_rows($result) > 0) {
        continue;
    }

    // else write line to temp file
    fputcsv($temp, $row);
}

fclose($file);
fclose($temp);

// overwrite original file
rename($tempFilename, $filename);
Problem is, I'm running into a timeout while executing this bit of code. Anything I can do to make the code more efficient?
You fire a database query per line, i.e. 45,000 queries... that takes too much time.
Better to run one query before the loop and read the existing ids into a lookup array, then only check that array inside the loop.
Pseudo code:
$st = query('SELECT id FROM table');
while ($row = $st->fetch()) {
    $lookup[ $row['id'] ] = $row['id'];
}

// now read CSV
while ($row = fgetcsv($h)) {
    $id = $row[6];
    if (isset($lookup[$id])) {
        // exist...
        continue;
    }
    // write the non-existing id to different file...
}
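A concrete version of that sketch using mysqli (assuming the $conn, $filename and $tempFilename variables from the question; `table` is quoted because it is a reserved word in MySQL):

$lookup = array();
$result = mysqli_query($conn, 'SELECT id FROM `table`');
while ($row = mysqli_fetch_assoc($result)) {
    $lookup[$row['id']] = true; // array keys give O(1) membership tests
}

$file = fopen($filename, 'r');
$temp = fopen($tempFilename, 'w');
while (($row = fgetcsv($file)) !== false) {
    if (isset($lookup[$row[6]])) {
        continue; // id already in the database, drop the line
    }
    fputcsv($temp, $row);
}
fclose($file);
fclose($temp);
rename($tempFilename, $filename);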
Edit:
Assume memory isn't sufficient to hold one million integers from the database. How can it still be done efficiently?
Collect the ids from the CSV into an array. Run a single query to find all of those ids in the database and collect the results (there can be at most as many as in the CSV). Now array_diff() the ids from the file against the ids from the database - the ids that remain exist in the CSV but not in the database.
Pseudo code:
$ids_csv = [];
while ($row = fgetcsv($h)) {
    $ids_csv[] = intval($row[6]);
}

$sql = sprintf('SELECT id FROM table WHERE id IN (%s)', implode(',', $ids_csv));

$ids_db = [];
$st = query($sql);
while ($row = $st->fetch()) {
    $ids_db[] = $row['id'];
}

$missing_in_db = array_diff($ids_csv, $ids_db);
I would use LOAD DATA INFILE: https://dev.mysql.com/doc/refman/8.0/en/load-data.html
Your database user needs to have FILE privileges on the database to use it.
Read the csv file into a separate table. Then you can run one query to delete the ids that already exist (DELETE ... JOIN ...) and export the rows that were left intact.
Another option is to use your loop to insert your csv file into a separate table, and then proceed with step 2.
Update: I use LOAD DATA INFILE with csv files of up to 2 million rows (at the moment) and do some bulk data manipulation with big queries; it's blazingly fast and I would recommend this route for files containing more than 100k lines.
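A rough sketch of that route (the staging table name and file paths are illustrative, not from the question):

// 1. Load the CSV into a staging table (requires the FILE privilege and a
//    path the MySQL server can read).
mysqli_query($conn, "
    LOAD DATA INFILE '/tmp/input.csv'
    INTO TABLE staging
    FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
    LINES TERMINATED BY '\\n'
    IGNORE 1 LINES
");

// 2. Delete staged rows whose id already exists in the main table.
mysqli_query($conn, "
    DELETE staging FROM staging
    JOIN `table` ON `table`.id = staging.id
");

// 3. Export the rows that were left intact.
mysqli_query($conn, "
    SELECT * FROM staging
    INTO OUTFILE '/tmp/output.csv'
    FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
    LINES TERMINATED BY '\\n'
");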

csv export with php stops when huge amounts of data shall be exported

I am using a php script to perform a csv export.
When exporting small amounts of data it works fine, say 10k to 100k records.
However, when more data is exported, the export stops at some point and the csv file is incomplete.
For example, when exporting 500k records, it will only export around 300k...
Interestingly, the export does not always stop at the same point - sometimes the exported file is 23 MB, sometimes 26 MB, sometimes 24 MB, and so on...
I guess the problem is to be found somewhere in the php.ini, like a memory or cache setting that is too low.
However, I am not an expert at configuring PHP - any ideas?
Here is the code that I use to perform the csv export:
mysqli_select_db($conn, "$settings_incident_database");

$sql = "SELECT *
        FROM $settings_incident_database.incidents
        $where
        ORDER BY $settings_incident_database.incidents.Id DESC";
$result = mysqli_query($conn, $sql);

header('Content-Type: text/csv');
header('Content-Disposition: attachment;filename=export.csv');

$row = mysqli_fetch_assoc($result);
if ($row) {
    echocsv(array_keys($row));
}
while ($row) {
    echocsv($row);
    $row = mysqli_fetch_assoc($result);
}

function echocsv($fields)
{
    $separator = '';
    foreach ($fields as $field) {
        if (preg_match('/\\r|\\n|,|"/', $field)) {
            $field = '"' . str_replace('"', '""', $field) . '"';
        }
        echo $separator . $field;
        $separator = ';';
    }
    echo "\r\n";
}
I assume that you're using too much memory. You could try exporting the content to a temporary file via MySQL, like this:
$tmp_csv_file = '/tmp/test.csv';

mysqli_select_db($conn, "$settings_incident_database");

// Get a temporary CSV file via MySQL
$sql = "SELECT *
        FROM $settings_incident_database.incidents
        $where
        ORDER BY $settings_incident_database.incidents.Id DESC
        INTO OUTFILE '" . $tmp_csv_file . "'
        FIELDS TERMINATED BY ','
        ENCLOSED BY '\"'
        LINES TERMINATED BY '\\n'";
$result = mysqli_query($conn, $sql);

$csv_content = file_get_contents($tmp_csv_file); // Get the CSV content
unlink($tmp_csv_file);                           // Delete the temporary file

header('Content-Type: text/csv');
header('Content-Disposition: attachment;filename=export.csv');
echo $csv_content;
NOTE: Keep in mind that the temporary file name must be unique if you expect multiple users.
NOTE2: If you use readfile() as Mark suggests it may be more efficient, but you'll still need to delete the temporary file after the output.
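A minimal readfile() variant of the same answer (my sketch, not Mark's exact code):

header('Content-Type: text/csv');
header('Content-Disposition: attachment;filename=export.csv');
readfile($tmp_csv_file); // streams the file to the client without loading it all into memory
unlink($tmp_csv_file);   // still delete the temporary file afterwards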
I have found the problem in the php.ini:
max_execution_time = 30 -> changed to 300
memory_limit = 128M -> changed to 512M
Changing these settings solved the problem for me.
I will also consider PHP's built-in fputcsv() function to make the code somewhat less bad, but for a quick fix this is enough.
Thx.
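As an aside (my addition, not part of the original answer): both limits can usually also be raised per script at runtime, which avoids loosening php.ini globally:

// Raise the limits for this script only; values mirror the php.ini changes above.
ini_set('memory_limit', '512M');
set_time_limit(300); // seconds, the runtime equivalent of max_execution_time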
Changing your code to use PHP's built-in fputcsv() function is as simple as changing
if ($row) {
    echocsv(array_keys($row));
}
while ($row) {
    echocsv($row);
    $row = mysqli_fetch_assoc($result);
}
to
$csv = fopen('php://output', 'w');

if ($row) {
    fputcsv($csv, array_keys($row));
}
while ($row) {
    fputcsv($csv, $row);
    $row = mysqli_fetch_assoc($result);
}
and then you can delete your echocsv() function completely.

PHP get string contain url from database

I'm trying to get a string from the database which is a url, "http://www.google.com/",
but the data I get back is changed to http:\/\/www.google.com\/
while ($row = mysql_fetch_array($result)) {
    // temporary array to create single category
    $tmp = array();
    $tmp["id"] = $row["id"];
    $tmp["name"] = $row["name"];
    $tmp["url"] = $row["url"];
    array_push($response["database"], $tmp);
}
How can I get the url without it being changed?
In your example the two pieces of data are identical, did StackOverflow reformat your data? Your code looks fine to me, perhaps it is a problem with how the data is inserted into the database rather than how it is retrieved. Have you looked at the data in an SQL browser like phpMyAdmin or SQLyog Community Edition to confirm the data is stored as you expect?
The stripslashes() built-in function would seem to do the trick.
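A minimal sketch of that:

$escaped = 'http:\/\/www.google.com\/';
echo stripslashes($escaped); // http://www.google.com/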
I solved this problem.
Actually I want to create a .json file. When the array is echoed into the .php page, the url changes because json_encode() escapes the forward slashes by default. In my case I create a separate data.json file after getting the data from the database:
$response = array();
$response['table'] = array();

foreach ($db->query('SELECT * FROM table') as $row) {
    $tmp = array();
    $tmp['id'] = $row['id'];
    $tmp['name'] = $row['name'];
    $tmp['url'] = $row['url'];
    array_push($response['table'], $tmp);
}

// here the data is echoed into the php page, so the url gets escaped
echo json_encode($response);

// here create the .json data and write it to data.json
$fp = fopen('data.json', 'w');
fwrite($fp, json_encode($response));
fclose($fp);
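Worth noting (my addition, not from the original answer): since PHP 5.4, json_encode() accepts a flag that stops it escaping forward slashes in the first place, which addresses the original symptom directly:

echo json_encode($response, JSON_UNESCAPED_SLASHES);
// "url":"http://www.google.com/" instead of "url":"http:\/\/www.google.com\/"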

Exporting data from database to csv file using php

I am able to export the database to csv, but my code somehow writes the data to my csv file twice, i.e. the same columns twice, side by side. This is my code. I think my problem is with the implode statement. Any help would be appreciated.
<?php
$db = new SQLite3('I:\preethi\webbs.db');

$headers = array(
    'Id', 'CompanyId', 'DateTime', 'Serial', 'DeviceId', 'AgentAId',
    'GpsAddress', 'Targa', 'CommonRoadDescription', 'RoadCivicNumber',
    'VehicleBrandDescription', 'VehicleModelDescription',
    'VerbaliVehicleTypeDescription', 'CommonColorVehicleDescription',
    'VerbaliRuleOneCode', 'VerbaliRuleOneDescription', 'VerbaliRuleOnePoints',
    'VerbaliClosedNoteDescription', 'Points',
    'VerbaliMissedNotificationDescription', 'MissedNotificationNote',
    'StatementNote'
);

$results = $db->query('SELECT ' . implode(',', $headers) . ' FROM VerbaliData');

$fp = fopen('explores.csv', 'w');
fputcsv($fp, $headers);
while ($row = $results->fetchArray()) {
    fputcsv($fp, $row);
}
fclose($fp);
?>
Just try with:
while ($row = $results->fetchArray(SQLITE3_NUM)) {
or
while ($row = $results->fetchArray(SQLITE3_ASSOC)) {
More details: http://php.net/manual/en/sqlite3result.fetcharray.php
You have a slight problem in your code: by default fetchArray() returns both result sets, one associative and one numbered, which is why each column appears twice. Use fetchArray(SQLITE3_NUM) or fetchArray(SQLITE3_ASSOC).

PHP freezes when adding to array in while loop

I have a 260k-line csv file that has two columns. I have read in the csv file using fgetcsv and have a while loop which reads every line in the file. In the loop I am trying to add the values from the second column to an array.
When the line that adds to the array is in place, my PHP freezes and doesn't finish. I have done some debugging and the values are getting added to the array, so I know that the array append and the while loop work, but I do not know why it freezes.
If I remove that line, the while loop gets through all 260k lines and then processes the rest of the file.
Here is my code:
$amountRecords = 0;
$totalValue = 0;
$valueArray = array();

// reads in csv file
$handle = fopen('Task1-DataForMeanMedianMode.csv', 'r');

// to skip the header names/values
fgetcsv($handle);

// creates array containing variables from csv file
while (($row = fgetcsv($handle, "\r")) != FALSE) {
    /*
    echo "ROW CONTAINS: ";
    var_dump($row[1]);
    echo "<br />";
    */
    $valueArray[] = $row[1];
    /*
    echo "VALUEARRAY NOW CONTAINS: ";
    var_dump($valueArray);
    echo "<br />";
    */
    $totalValue = $totalValue + $row[1];
    $amountRecords++;
}
And sample of csv file:
ID,Value
1,243.00
2,243.00
3,243.00
4,243.00
5,123.11
6,243.00
7,180.00
8,55.00
9,243.00
10,55.00
With an out-of-memory error, there are two general approaches. As usual with these choices, you can pick easy-but-wrong or hard-but-right. The easy-but-wrong solution is to increase your memory limit to an appropriate level:
ini_set('memory_limit', '64M');
The better (although harder) solution is to re-engineer your algorithm so it doesn't need as much memory. This is clearly the more sustainable and robust approach. To do it properly, you will need to evaluate what you actually need to do with the array you are building. For instance, I have written similar scripts that imported rows into a database. Instead of building a huge array and then inserting, I did it in batches: I built an array of 50-100 rows, inserted those, and cleared the array (freeing the memory for re-use).
Pseudo-code:
for (each row in file) {
    $rows_cache[] = $row[1];
    if (count($rows_cache) >= 50) {
        insert_these($rows_cache);
        $rows_cache = array();
    }
}
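A concrete PHP version of that batching idea (insert_these() and the batch_values table are hypothetical names for illustration):

$rows_cache = array();
while (($row = fgetcsv($handle)) !== false) {
    $rows_cache[] = floatval($row[1]);
    if (count($rows_cache) >= 50) {
        insert_these($conn, $rows_cache);
        $rows_cache = array();
    }
}
if ($rows_cache) {
    insert_these($conn, $rows_cache); // flush the final partial batch
}

// One multi-row INSERT per batch instead of one INSERT per row.
function insert_these($conn, $values)
{
    // the values are already floats, so interpolating them is safe here
    $sql = 'INSERT INTO batch_values (value) VALUES ('
         . implode('),(', $values) . ')';
    mysqli_query($conn, $sql);
}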
Your first row is a string, so maybe try adding a numeric check:
while (($row = fgetcsv($handle, "\r")) != FALSE) {
    if (is_numeric($row[1])) {
        $valueArray[] = $row[1];
        $totalValue = $totalValue + $row[1];
        $amountRecords++;
    }
}
Why not drop the line:
$totalValue = $totalValue + $row[1];
from inside your loop, and instead use:
$totalValue = array_sum($valueArray);
after completing your loop?
Not really the problem, but
while (($row = fgetcsv($handle, "\r")) != FALSE)
can be rewritten as
while ($row = fgetcsv(...))
instead. There's no need for the explicit false check - if fgetcsv returns false, the while loop terminates anyway. Plus this version is more legible and not as risky: if you forget the () around the fgetcsv portion, you'll be doing the equivalent of $row = (fgetcsv() != false) and simply getting a boolean value.
