I have recently migrated my CSV export code from core PHP to CodeIgniter. The code works well, but it is very slow when exporting a large amount of data.
Here is my old code:
function exportCSV($qry, $con, $title)
{
    $result = mysql_query($qry, $con) or die(mysql_error($con));
    header('Content-Type: text/csv; charset=UTF-8');
    header("Cache-Control: no-store, no-cache");
    header("Content-Disposition: attachment;filename=" . $title . "-" . date('mdY') . ".csv");
    //echo "\xEF\xBB\xBF";
    $row = mysql_fetch_assoc($result);
    if ($row) {
        echocsv(array_keys($row));
    }
    while ($row) {
        echocsv($row);
        $row = mysql_fetch_assoc($result);
    }
}

function echocsv($fields)
{
    $separator = '';
    foreach ($fields as $field) {
        if (preg_match('/\\r|\\n|,|"/', $field)) {
            $field = '"' . str_replace('"', '""', $field) . '"';
        }
        echo $separator . $field;
        $separator = ',';
    }
    echo "\r\n";
}
And here is my CodeIgniter code, which is very slow: exporting 77,000 rows to CSV took about 15 minutes, excluding the download time.
public function exportCSV()
{
    set_time_limit(0);
    $delimiter = ",";
    $newline = "\r\n";
    $curr_date_time = date("l jS \of F Y h:i:s A");
    $this->products_model->set_venture($this->selected_venture['abbrev']);
    if ($get_data = $this->input->get())
    {
        $data = $this->products_model->export_model($get_data);
        $download = $this->dbutil->csv_from_result($data, $delimiter, $newline);
        force_download('export - ' . $curr_date_time . '.csv', $download);
    }
    else
    {
        show_404('page', FALSE);
    }
}

public function export_model($params = NULL)
{
    if ($params != NULL)
    {
        if ($params['name_filter'] != '')
        {
            $this->crawler_db->like('name', $params['name_filter']);
        }
        if ($params['comp_filter'] != '')
        {
            $this->crawler_db->where('fk_competitor_website', $params['comp_filter']);
        }
    }
    return $this->crawler_db->get('pcrawler_' . $this->venture . '.products_final');
}
Hi, check whether the Database Utility class works for you; it makes it really simple to generate good CSV files. Here is the code:
$this->load->dbutil();
$query = $this->db->query("SELECT * FROM mytable");
echo $this->dbutil->csv_from_result($query);
Please read the documentation here: CodeIgniter CSV export with DB util.
I would look at debugging the function, to see where it's actually slow.
A simple way is to use the Benchmarking class to see whether it's the query that's slow, or the call to csv_from_result().
Is it still slow when you don't pass any params to export_model()? If it's only slow when you run a like or where against the database, then maybe you need to add some indexes?
https://ellislab.com/codeigniter/user-guide/libraries/benchmark.html
Once you find the bottleneck, you can go from there.
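For example, a rough sketch of how the benchmark marks might wrap the two suspects in exportCSV() (mark() and elapsed_time() are the Benchmark class methods; the mark names are arbitrary):

$this->benchmark->mark('query_start');
$data = $this->products_model->export_model($get_data);
$this->benchmark->mark('query_end');

$this->benchmark->mark('csv_start');
$download = $this->dbutil->csv_from_result($data, $delimiter, $newline);
$this->benchmark->mark('csv_end');

// elapsed_time() returns the seconds elapsed between two marked points
log_message('debug', 'query: ' . $this->benchmark->elapsed_time('query_start', 'query_end'));
log_message('debug', 'csv:   ' . $this->benchmark->elapsed_time('csv_start', 'csv_end'));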
CodeIgniter's csv_from_result() is really slow for me, so I made a different one to suit my needs. It can be reused for other CSV exports too:
public function array_to_csv($array)
{
    if (count($array) == 0) {
        return null;
    }
    ob_start();
    $df = fopen("php://output", 'w');
    fputcsv($df, array_keys(reset($array)));
    foreach ($array as $row) {
        fputcsv($df, $row);
    }
    fclose($df);
    return ob_get_clean();
}
usage:
$this->array_to_csv($array);
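For context, a rough sketch of how it might slot into the controller from the question, assuming the model result is first turned into a plain array with result_array() (a standard CodeIgniter query result method):

$rows = $this->products_model->export_model($get_data)->result_array();
$csv  = $this->array_to_csv($rows);
force_download('export - ' . $curr_date_time . '.csv', $csv);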
Related
I'm using PHP to create an Excel document with data I'm pulling from a MySQL database.
While cycling through the rows at the end and printing my data out to each cell, I want to replace each cell that has the string/int equivalent of '-1' with '0'.
Here's my code so far:
header('Content-Type: text/csv');
header('Content-Disposition: attachment;filename=exported-data.csv');
$sql = "SELECT * FROM db_table";
$sth = $db->prepare($sql);
$sth->execute();
$filename = date('d.m.Y').'.csv';
$data = fopen($filename, 'w');
while ($row = $sth->fetch(PDO::FETCH_ASSOC)) {
    $csv = implode(',', $row) . "\n";
    fwrite($data, $csv);
    print_r($csv);
}
echo "\r\n";
This works efficiently for getting the data from the database and outputting it to an Excel spreadsheet, but how would I check each record pulled and replace it with a 0 if the record is -1?
So visually, if my database looked like this:
id mon tues wed
85 -1 -1 75
36 -1 12 -1
The excel spreadsheet would pop out like so:
85 0 0 75
36 0 12 0
I agree with u_mulder about replacing the -1 results with 0. I would build on that with some error checking and cache busting, and I would use PHP's fputcsv() function to stream the file directly to the output. This reduces memory usage, improves performance, and ensures that field values containing double quotes or commas are correctly escaped.
<?php
header('Content-Type: text/csv');
header('Content-Disposition: attachment;filename=exported-data.csv');
header("Pragma: no-cache");
header("Expires: 0");

$sql = "SELECT * FROM db_table";
$sth = $db->prepare($sql);
$sth->execute();

$file = fopen('php://output', 'w');
if (false === $file) {
    throw new Exception("Failed to create output stream resource.");
}

while ($row = $sth->fetch(PDO::FETCH_ASSOC)) {
    foreach ($row as &$field) {
        if ("-1" == strval($field)) {
            $field = "0";
        }
    }
    if (false === fputcsv($file, $row)) {
        throw new Exception("Failed to write row to CSV file.");
    }
}
fclose($file);
I suggest you do this:
while ($row = $sth->fetch(PDO::FETCH_ASSOC)) {
    // check every key of $row
    // use & to pass $value by reference
    foreach ($row as &$value) {
        // use strval to get string representation of a $value
        if (strval($value) == '-1') {
            $value = '0';
        }
    }
    $csv = implode(',', $row) . "\n";
    fwrite($data, $csv);
    print_r($csv);
}
Doing this at the SQL level will be much faster than analyzing each row individually with PHP, especially as your database gets larger. Use CASE:
header('Content-Type: text/csv');
header('Content-Disposition: attachment;filename=exported-data.csv');
$sql = "
    SELECT
        id
        ,CASE WHEN monday = -1 THEN 0 ELSE monday END AS monday
        ,CASE WHEN tuesday = -1 THEN 0 ELSE tuesday END AS tuesday
        ,CASE WHEN wednesday = -1 THEN 0 ELSE wednesday END AS wednesday
    FROM db_table";
$sth = $db->prepare($sql);
$sth->execute();
$filename = date('d.m.Y').'.csv';
$data = fopen($filename, 'w');
while ($row = $sth->fetch(PDO::FETCH_ASSOC)) {
    $csv = implode(',', $row) . "\n";
    fwrite($data, $csv);
    print_r($csv);
}
echo "\r\n";
Filter $row to replace negative values by 0:
$row = array_map(function ($val) { return $val > 0 ? $val : 0; }, $row);
Also, consider using fputcsv() to generate valid CSV files instead of trying to do it yourself. Simply separating values with , may work when you are dealing only with numbers, but it can create problems with strings.
$output = fopen('php://output', 'w');
while ($row = $sth->fetch(PDO::FETCH_ASSOC)) {
    $row = array_map(function ($val) { return $val > 0 ? $val : 0; }, $row);
    fputcsv($output, $row);
}
Another problem that you may face when generating CSV files to be opened with Excel is localization. Excel expects the CSV files it opens to be formatted according to the local system's localization settings. A record such as 1,2,3,1.23 may work in English, with each number placed in its own column and 1.23 interpreted correctly with its decimals, but it will not work in many other languages, especially when , is used as the decimal separator instead of ..
With that in mind, you may want to implement localization for your CSV as well, and your best guess is to use the client's browser language settings:
$locale = isset($_COOKIE['locale']) ? $_COOKIE['locale'] : $_SERVER['HTTP_ACCEPT_LANGUAGE'];
$locale = preg_split('/[,;]/', $locale);
setlocale(LC_ALL, $locale[0]);
$locale = localeconv();

$output = fopen('php://output', 'w');
while ($row = $sth->fetch(PDO::FETCH_ASSOC)) {
    $row = array_map(function ($val) { return $val > 0 ? $val : 0; }, $row);
    fputcsv($output, $row, $locale['decimal_point'] == ',' ? ';' : ',');
}
Now, if you don't want to have this complication, you can instead generate real Excel .XLSX files using a library such as PHPExcel.
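For what it's worth, a rough sketch of the PHPExcel route might look like this (getActiveSheet(), fromArray() and PHPExcel_IOFactory are part of that library; the header row here is just an example based on the columns above):

require_once 'PHPExcel.php';

$excel = new PHPExcel();
$sheet = $excel->getActiveSheet();

// collect a header row plus the data rows, replacing negatives with 0 as before
$rows = array(array('id', 'mon', 'tues', 'wed'));
while ($row = $sth->fetch(PDO::FETCH_ASSOC)) {
    $rows[] = array_map(function ($val) { return $val > 0 ? $val : 0; }, $row);
}
$sheet->fromArray($rows, null, 'A1');

header('Content-Type: application/vnd.openxmlformats-officedocument.spreadsheetml.sheet');
header('Content-Disposition: attachment;filename="exported-data.xlsx"');

$writer = PHPExcel_IOFactory::createWriter($excel, 'Excel2007');
$writer->save('php://output');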
I have a document called subjects.txt in the following format:
DateCreated,Subject,Link
18.10.2015,"Math",http: //address.html
17.10.2015,"English",http: //address.html
18.10.2015,"English",http: //address.html
19.10.2015,"Science",http: //address.html
17.10.2015,"Math",http: //address.html
The file contains URLs of sites created based on a school subject. There can be more than one site for a subject.
The goal is to use PHP to open, read, and display the contents of the file in the following format:
Math
Link 1
Link 2
English
Link 1
Link 2
Science (because there's only one link, the name of the subject is the link)
So far I've been able to open and read the file:
$file = "./subjects.txt";
$subjects = file_get_contents($file);
I'm having trouble figuring out how to write out the file in the specified format.
I've tried using explode to separate the elements by "," but I don't know where to go from there.
Your input file looks to be in Comma-separated values (CSV) format. PHP has a built-in fgetcsv function designed to make reading CSV data from a file easy.
<?php
$file = './subjects.txt';

$fh = fopen($file, 'r');
if ($fh === false) {
    die("Can not read {$file}");
}

$data = array();
while (($row = fgetcsv($fh, 1000, ',')) !== false) {
    if ($row[0] === 'DateCreated') {
        // Ignore the column header row
        continue;
    }
    list($date, $subject, $link) = $row;
    if (!isset($data[$subject])) {
        $data[$subject] = array();
    }
    $data[$subject][] = $link;
}
fclose($fh);

foreach ($data as $subject => $links) {
    // TODO: output each subject here
}
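One way that TODO could be filled in, following the format asked for (when a subject has only one link, the subject itself becomes the link):

foreach ($data as $subject => $links) {
    if (count($links) === 1) {
        echo '<a href="' . htmlspecialchars($links[0]) . '">' . htmlspecialchars($subject) . "</a>\n";
    } else {
        echo htmlspecialchars($subject) . "\n";
        foreach ($links as $i => $link) {
            echo '<a href="' . htmlspecialchars($link) . '">Link ' . ($i + 1) . "</a>\n";
        }
    }
}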
Here is another version:
<?php
$file = "./subjects.txt";
$h = fopen($file, "r");
if ($h !== false) {
    $subjects = [];
    $data = [];
    while (!feof($h)) {
        if ($line = trim(fgets($h))) {
            $line = explode(",", $line);
            if (!in_array("DateCreated", $line)) {
                array_push($subjects, $line);
            }
        }
    }
    fclose($h);
    foreach ($subjects as $subject) {
        if (!isset($data[$subject[1]])) {
            $data[$subject[1]] = [];
        }
        $data[$subject[1]][] = $subject[2];
    }
    foreach ($data as $subject => $links) {
        if (count($links) == 1) {
            echo "<p>$subject</p>\n";
        } else {
            $i = 1;
            echo "<p>$subject</p>\n";
            echo "<ul>\n";
            foreach ($links as $link) {
                echo "<li>link$i</li>\n";
                $i++;
            }
            echo "</ul>\n";
        }
    }
}
?>
The problem with using file_get_contents() is that it retrieves the entire file contents into $subjects as a single string.
You have to use a different approach, for example fgets():
$fp = fopen("./subjects.txt", "r");
if ($fp){
while (($line = fgets($fp)) !== false){
// So here you can treat each line individually.
// You can use explode (";", $line) for example if the line is not empty
}
}
fclose($fp);
Using fgets() will allow you to parse each of the file's lines individually.
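A minimal sketch of what could go inside that loop, grouping links by subject the same way the fgetcsv() answer above does (the header row is skipped by checking for "DateCreated"):

$data = array();
$fp = fopen("./subjects.txt", "r");
if ($fp) {
    while (($line = fgets($fp)) !== false) {
        $line = trim($line);
        if ($line === '' || strpos($line, 'DateCreated') === 0) {
            continue; // skip blank lines and the header row
        }
        list($date, $subject, $link) = explode(",", $line, 3);
        $data[trim($subject, '"')][] = $link; // strip the quotes around the subject
    }
    fclose($fp);
}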
As stated, doing this with a database would be much easier, probably three lines of code. Here's one approach you could use, though.
$data = '18.10.2015,"Math",http: //address.html
17.10.2015,"English",http: //address1.html
18.10.2015,"English",http: //address2.html
19.10.2015,"Science",http: //address3.html
17.10.2015,"Math",http: //address4.html';
preg_match_all('~^(.*?),"(.*?)",(.*?)$~m', $data, $fields);
array_multisort($fields[2], SORT_STRING, $fields[1], $fields[3]);
$lastcat = '';
foreach ($fields[2] as $key => $cat) {
    if ($cat != $lastcat) {
        echo $cat . "\n";
    }
    $lastcat = $cat;
    echo $fields[3][$key] . "\n";
}
Output:
English
http: //address1.html
http: //address2.html
Math
http: //address4.html
http: //address.html
Science
http: //address3.html
The array_multisort is how the categories are grouped.
Here's a regex101 demo of what that regex is doing. https://regex101.com/r/wN3nB2/1
Update for the single-record check (I only ran one test on it):
$data = '18.10.2015,"Math",http: //address.html
17.10.2015,"English",http: //address1.html
18.10.2015,"English",http: //address2.html
19.10.2015,"Science",http: //address3.html
17.10.2015,"Math",http: //address4.html';
preg_match_all('~^(.*?),"(.*?)",(.*?)$~m', $data, $fields);
array_multisort($fields[2], SORT_STRING, $fields[1], $fields[3]);
$lastcat = '';
foreach ($fields[2] as $key => $cat) {
    if ((empty($fields[2][($key + 1)]) && $cat != $lastcat) || ($cat != $lastcat && !empty($fields[2][($key + 1)]) && $fields[2][($key + 1)] != $cat)) {
        // single record
        echo $cat . $fields[3][$key] . "\n";
    } else {
        if ($cat != $lastcat) {
            echo $cat . "\n";
        }
        $lastcat = $cat;
        echo $fields[3][$key] . "\n";
    }
}
Having checked a variety of questions but not being able to find quite what I need, I am at a bit of a loss.
I am trying to choose the columns from MySQL that I want exported to CSV by parsing the column names and adding the valid column names to a $colnames array, then adding those values as headers to the CSV, and then only displaying the relevant data from the database through a while loop.
I have looked at the following in particular having been guided there from other questions: How to get all the key in multi-dimensional array in php
Here is the code:
function query_to_csv($query, $filename, $attachment = false, $headers = true, $specs_off = false) {
    if ($attachment) {
        // send response headers to the browser
        header('Content-Type: text/csv; charset=UTF-8');
        header('Content-Disposition: attachment;filename=' . $filename);
        $fp = fopen('php://output', 'w');
    } else {
        $fp = fopen($filename, 'w');
    }

    $result = mysql_query($query) or die(mysql_error());

    if ($headers) {
        // output header row (if at least one row exists)
        $row = mysql_fetch_assoc($result);
        if ($row) {
            // PRODUCTS TABLE SPECIFIC - get rid of specs_ and free_ columns so we have a nicer data set for the user
            if ($specs_off) {
                $columnames = array_keys($row);
                $colnames = array();
                //$colnames = array_keys($row);
                foreach ($columnames as $key => $value) {
                    if ((substr_count($value, "spec_") < 1) && (substr_count($value, "free_") < 1)) {
                        array_push($colnames, $value);
                    }
                }
            } else {
                $colnames = array_keys($row);
            }
            // add in additional columns if exporting client data
            if ($table == 'clients') { array_push($colnames, "products", "last_order_date"); }
            // write the colnames to the csv file
            fputcsv($fp, $colnames);
            // reset pointer back to beginning
            mysql_data_seek($result, 0);
        }
    } // done with the headers etc, now let's get on with the data

    // clear out and create the $row_data array
    $row_data = array();
    // run through the row array adding values to row_data as we go
    while ($row = mysql_fetch_assoc($result)) {
        // create array_keys_multi from https://stackoverflow.com/questions/11234852/how-to-get-all-the-key-in-multi-dimensional-array-in-php/11234924#11234924
        function array_keys_multi(array $array) {
            $keys = array();
            foreach ($array as $key => $value) {
                $keys[] = $key;
                if (is_array($array[$key])) {
                    $keys = array_merge($keys, array_keys_multi($array[$key]));
                }
            }
            return $keys;
        }
        // run the function on the $row array
        array_keys_multi($row);
        // now use the $keys array
        foreach ($keys as $key => $value) {
            // check if the value is in the colnames array and if so push the data on to the $row_data array ready for export to CSV
            if (in_array($value, $colnames)) {
                array_push($row_data, $row[$value]);
            }
        }
        // now we are ready to write the CSV
        fputcsv($fp, $row_data);
    }
    fclose($fp);
    exit;
} // end of query_to_csv

// Write the sql statement
$sql = "SELECT * FROM " . $table . " ";
if (isset($where_1_col)) { $sql .= " WHERE `" . $where_1_col . "` = '" . $where_1_val . "'"; }
if (isset($where_2_col)) { $sql .= " AND `" . $where_2_col . "` = '" . $where_2_val . "'"; }
if (isset($where_3_col)) { $sql .= " AND `" . $where_3_col . "` = '" . $where_3_val . "'"; }
if (isset($where_4_col)) { $sql .= " AND `" . $where_4_col . "` = '" . $where_4_val . "'"; }
if (isset($order_by_col)) { $sql .= " ORDER BY `" . $order_by_col . "` " . strtoupper($order_by_dir) . " "; }

// output as an attachment
query_to_csv($sql, $table . "_export.csv", true, true, true);
All I am getting is a huge export of the chosen column names repeated as many times as there are values from the initial query. I don't know how to get the values in.
Any suggestions on where I am going wrong or how I can undertake this more neatly are welcomed.
It seems that you just append the new row data to $row_data but never clear that array.
array_push($row_data, $row[$value]);
What I did to fix it:
Move
// clear out and create the $row_data array
$row_data = array();
into the while loop.
Change

// clear out and create the $row_data array
$row_data = array();
while ($row = mysql_fetch_assoc($result)) {
    ...
}

To

while ($row = mysql_fetch_assoc($result)) {
    // clear out and create the $row_data array
    $row_data = array();
    ...
}
Note:
You are using $table everywhere but never define it, for example here: if ($table == 'clients').
If it is a global variable, you need to add global $table or pass it to your function as a parameter.
Edit:
As mentioned in my comment on your question, you could just use array_keys() to get the keys.
php.net/manual/de/function.array-keys.php
And then change
array_keys_multi($row);
to
$keys = array_keys($row);
After that you can remove array_keys_multi().
Furthermore, you could move that part in front of your while loop, because you only need to calculate the column names once, not in every iteration.
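Putting those suggestions together, the data part of query_to_csv() could be reduced to something like this (a sketch based on the code in the question, not a tested drop-in replacement):

// $colnames has already been built above, so it can be used directly as the key list
while ($row = mysql_fetch_assoc($result)) {
    $row_data = array(); // reset for every row
    foreach ($colnames as $col) {
        $row_data[] = isset($row[$col]) ? $row[$col] : '';
    }
    fputcsv($fp, $row_data);
}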
Here's my PHP code to create a .CSV file based on some SQL data. It's working as intended; the only issue is that it simply creates a .CSV file on the server and doesn't prompt the user to download it.
<?php
require_once "../assets/repository/mysql.php";

$query = "SELECT * FROM datashep_AMS.COMPLETE_APPLICATIONS";
$results = mysql_query($query);
$first = true;

$out = fopen('export.csv', 'w');
while ($row = mysql_fetch_assoc($results)) {
    if ($first) {
        $titles = array();
        foreach ($row as $key => $val) {
            $titles[] = $key;
        }
        fputcsv($out, $titles);
        $first = false;
    }
    fputcsv($out, $row);
}
fclose($out);
?>
So my question is, how do I make the user download the file immediately upon it being generated?
And, once they've downloaded it (or declined), how should I handle deleting the .CSV file from my server?
There is no need to store anything on the server (and thus no need to delete anything). Just write the results back to the browser and set the headers accordingly:
<?php
require_once "../assets/repository/mysql.php";

$query = "SELECT * FROM datashep_AMS.COMPLETE_APPLICATIONS";
$results = mysql_query($query);
$first = true;

header('Content-Type: text/csv');
header('Content-Disposition: attachment;filename="export.csv"');
header('Cache-Control: max-age=0');

$out = fopen('php://output', 'w');
while ($row = mysql_fetch_assoc($results)) {
    if ($first) {
        $titles = array();
        foreach ($row as $key => $val) {
            $titles[] = $key;
        }
        fputcsv($out, $titles);
        $first = false;
    }
    fputcsv($out, $row);
}
fclose($out);
?>
I'm trying to export an array of arrays to Excel. I have it set up with a header variable and a data variable that basically builds a giant string to be output in the export. However, only the header variable is coming through. Let me show some code:
This is setting the parameters:
str_replace(" ", "_", $getData['name']);
$filename = $getData['name']."_summary.xls";
header("Content-Type: application/x-msdownload");
header("Content-Disposition: attachment; filename=\"$filename\"");
header("Pragma: no-cache");
header("Expires: 0");
Which goes to a function to get the information:
foreach ($tempRS as $key => $value)
{
    foreach ($value as $iKey => $iValue)
    {
        if ($count == 6)
        {
            $iValue = str_replace('"', '""', $iValue);
            $iValue = '"' . $iValue . '"' . "\n";
            $data .= trim($iValue);
            $count = 0;
        }
        else
        {
            $iValue = str_replace('"', '""', $iValue);
            $iValue = '"' . $iValue . '"' . "\t";
            $data .= trim($iValue);
            $count++;
        }
    }
}
$header = "ROW HEADER 1\tROW HEADER 2\tROW HEADER 3\tROW HEADER 4\tROW HEADER 5\tROW HEADER 6\n";
print "$header\n$data";
I can't seem to figure out why I'm losing the $data variable on the export.
Why not just use fputcsv() to generate that CSV data for you? Or better yet, instead of making a .csv masquerade as an Excel file, you could use PHPExcel to output a native .xls/.xlsx and actually use formatting and formulae in the generated spreadsheet.
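For reference, a rough sketch of the fputcsv() route, assuming the rows come from the $tempRS array used in the question (the delimiter is set to a tab to keep the Excel-style output):

$out = fopen('php://output', 'w');
fputcsv($out, array('ROW HEADER 1', 'ROW HEADER 2', 'ROW HEADER 3', 'ROW HEADER 4', 'ROW HEADER 5', 'ROW HEADER 6'), "\t");
foreach ($tempRS as $value) {
    fputcsv($out, (array)$value, "\t"); // quoting and escaping handled by fputcsv
}
fclose($out);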
First of all, use echo instead of print; print also returns a value (always 1), which adds a tiny bit of overhead.
Secondly, don't put the variables within quotes, use
echo $header ."\n".$data;
To get to your question: do the foreach loops actually loop? Have you checked whether $data contains any data?
A better solution might be this:
$header = '';
echo $header;
foreach() loop here {
//echo the data here instead of putting it in a variable
}
Or maybe better, use http://nl.php.net/fputcsv
I tested your code and it seems that the trim() call is trimming your \n and \t.
If you remove it, your code should be fine.
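In other words, the two append lines in the question's loop would become something like:

$data .= $iValue; // without trim(), the trailing "\t" / "\n" survive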
Here's my code to export an array of products to Excel.
Only one little problem with this code: Excel doesn't want to open it because of a format problem, but if you click "Yes" when it prompts with an error message, you'll be OK.
No problem with OpenOffice, though.
Working on a fix...
$filename = "products_exports " . date("Y-m-d H_i_s") . ".xls";
/**
* Tell the browser to download an excel file.
*/
header("Content-Type: application/x-msdownload; charset=utf-8");
header("Content-Disposition: attachment; filename=\"$filename\"");
header("Pragma: no-cache");
header("Expires: 0");
/**
* The header of the table
* #var string
*/
$header = "";
/**
* All the data of the table in a formatted string
* #var string
*/
$data = "";
//We fill in the header
foreach ($productArray as $i => $product) {
$count = 0;
foreach ($product as $key => $value) {
if($count == count((array)new Product("")) - 1){
$header.= $key;
}
else{
$header.= $key . "\t";
$count++;
}
}
break; /*One loop is enough to get the header !*/
}
//And then the data by the product and all the categories
/*Where $productArray = This can be your custom array of object. eg.
array[0] = new YouCustomObject(your params...);
*/
foreach ($productArray as $i => $product) {
$count = 0;
foreach ($product as $key => $value) {
if($count == count((array)new Product("")) - 1){
$value = str_replace('"', '""', $value);
$value = '"' . $value . '"' . "\n";
$data .= $value;
$count = 0;
}
else{
$value = str_replace('"', '""', $value);
$value = '"' . $value . '"' . "\t";
$data .= $value;
$count++;
}
}
}
echo $header ."\n". $data;