Export CSV from MySQL - PHP

I'm having a bit of trouble exporting a CSV file created from one of my MySQL tables using PHP.
The code I'm using prints the correct data, but I can't see how to offer this data as a downloadable CSV file via a link to the created file. I thought the browser was supposed to offer the file for download automatically, but it doesn't. (Could it be because the code below is called via Ajax?)
Any help greatly appreciated - code below, S.
include('../config/config.php'); // db connection settings
$query = "SELECT * FROM isregistered";
$export = mysql_query($query) or die("Sql error : " . mysql_error());
$fields = mysql_num_fields($export);
$header = '';
for ($i = 0; $i < $fields; $i++) {
    $header .= mysql_field_name($export, $i) . "\t";
}
$data = '';
while ($row = mysql_fetch_row($export)) {
    $line = '';
    foreach ($row as $value) {
        if (!isset($value) || $value == "") {
            $value = "\t";
        } else {
            $value = str_replace('"', '""', $value);
            $value = '"' . $value . '"' . "\t";
        }
        $line .= $value;
    }
    $data .= trim($line) . "\n";
}
$data = str_replace("\r", "", $data);
if ($data == "") {
    $data = "\n(0) Records Found!\n";
}
//header("Content-type: application/octet-stream"); //have tried all of these at sometime
//header("Content-type: text/x-csv");
header("Content-type: text/csv");
//header("Content-type: application/csv");
header("Content-Disposition: attachment; filename=export.csv");
//header("Content-Disposition: attachment; filename=export.xls");
header("Pragma: no-cache");
header("Expires: 0");
echo 'Download Exported Data'; //want my link to go in here...
print "$header\n$data";

In essence, you can't output the CSV file and a link to it in one go. (You need to introduce the concept of a page "mode" and activate the download mode via a ...pagename.php?mode=download or similar. You could then use PHP's switch statement to switch on $_GET['mode'] in your script.)
That said, the text/csv content type header you were using is correct, although you may also want to output the Content-Length and Content-Disposition headers. After you've output the file data, also be sure to stop any additional script processing via PHP's exit function.
Additionally, it would probably be a lot less hassle (and will certainly be faster and more memory efficient) to use MySQL's SELECT ... INTO OUTFILE facility (if you have the permissions) rather than gathering the data in PHP.
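For reference, a minimal sketch of the INTO OUTFILE route, assuming the question's isregistered table and a MySQL account that has the FILE privilege (note the file is written on the database server, not the web server):
$sql = "SELECT * INTO OUTFILE '/tmp/export.csv'
        FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
        LINES TERMINATED BY '\\n'
        FROM isregistered";
mysql_query($sql) or die("Sql error : " . mysql_error());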

You can't have text and a download on the same page. You need a link to the download area, which could just be a GET parameter leading to a function that does all the processing, sends the headers, and echoes the content of the CSV.
For example, you could have <a href="?action=download">Click here to download CSV</a>; then in your code have if ($_GET['action'] === 'download'), get the data from the database, format it, send the headers, and echo the data. And then die(), because that part of the script can accomplish no more.
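A minimal sketch of that pattern in a single script (the action parameter name and the placeholder query reuse the question's setup; they are illustrations, not the asker's exact code):
<?php
if (isset($_GET['action']) && $_GET['action'] === 'download') {
    include('../config/config.php'); // db connection settings, as in the question
    $export = mysql_query("SELECT * FROM isregistered") or die(mysql_error());
    header('Content-Type: text/csv');
    header('Content-Disposition: attachment; filename=export.csv');
    $fp = fopen('php://output', 'w');
    while ($row = mysql_fetch_row($export)) {
        fputcsv($fp, $row);
    }
    fclose($fp);
    die(); // nothing else may be sent after the CSV body
}
?>
<a href="?action=download">Download Exported Data</a>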

You should not put the link in the same file that generates the CSV, as the link would end up inside the CSV itself!
Do something like:
<a href="export.php">Download CSV</a>
(with the href pointing at whatever script outputs the CSV) and it should work.

Three things to consider:
You're sending headers indicating that the user is going to download a CSV file, but then you create a link to download it? That isn't correct: you should be linking to this page, and then outputting only the CSV data itself after the headers.
MySQL has the ability to generate CSV output, and you should definitely take advantage of this instead of trying to do it yourself. You can use SELECT ... INTO OUTFILE to do this.
If you must create the CSV using PHP, please use fputcsv to do so. This will handle all the complications of CSV, such as escaping and proper formatting. Since fputcsv writes to a file, you could either write it to a temporary file and then output it after you send your headers, or use the following trick to output it directly:
Do this after sending headers:
$fp = fopen('php://output', 'w'); // write straight to the response body
while ($row = mysql_fetch_row($export)) {
    fputcsv($fp, $row);
}
fclose($fp);

I think MySQL => CSV is a common problem that comes up on every PHP forum.
I have tried to solve this issue in a generic way and have implemented a free export lib for PHP which is very similar to the Google App Inventor philosophy: drag & drop, and hide the coding stuff.
Use the lib and create your export via point & click.
Common demos: http://www.freegroup.de/software/phpBlocks/demo.html
Link to editor: http://www.freegroup.de/test/editor/editor.php?xml=demo_sql.xml
Worth a look.
Greetings,
Andreas

Related

Exporting CSV with UTF8 to Excel using PHP

My client asked me to build an export system that exports the whole SQL database into a CSV file that works in Excel. I found PHPExcel and it's great, but I thought I could do things much more easily and quickly with my own functions.
After struggling with Excel encoding, I finally succeeded in exporting a CSV that works in Excel using the following code:
<?php
$data = showData("price"); // Function that loads the whole SQL database into an array.

function array2csv(array &$array)
{
    if (count($array) == 0) {
        return null;
    }
    ob_start();
    $df = fopen("php://output", 'w');
    fputcsv($df, array_keys(reset($array)));
    foreach ($array as $row) {
        fputcsv($df, $row);
    }
    fclose($df);
    return ob_get_clean();
}

function download_send_headers($filename) {
    // disable caching
    $now = gmdate("D, d M Y H:i:s");
    header("Cache-Control: max-age=0, no-cache, must-revalidate, proxy-revalidate");
    header("Last-Modified: {$now} GMT");
    header('HTTP/1.1 200 OK');
    header('Date: ' . date('D M j G:i:s T Y'));
    header('Content-Type: application/vnd.ms-excel');
    header('Content-Disposition: attachment;filename=' . $filename);
}

download_send_headers("export.csv");
$final = array2csv($data);
// chr(255) . chr(254) is the UTF-16LE byte-order mark that Excel looks for
print chr(255) . chr(254) . mb_convert_encoding($final, 'UTF-16LE', 'UTF-8');
die();
?>
The problem is that when I open the file in Excel, there are no columns: each row is one big column containing the row's whole data, separated by commas.
I figured I would need to replace those commas with something Excel can read as a "new column", but I still need the file to keep working as plain CSV.
I've searched SO and Google with no luck finding a solution that keeps the CSV intact and still splits the data into columns in Excel. If there is no way to do both, the more important thing for my client is that the Excel version works as it should (each row split into columns).
This is a picture of how it looks as CSV (using Numbers on Mac).
And this is a picture of how it looks in Excel 2007 on Windows.
It's all in the fputcsv() function:
http://php.net/manual/en/function.fputcsv.php
Try to choose the correct $delimiter, $enclosure and $escape_char.
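If you stick with the UTF-16LE output from the question, one workaround worth trying: Excel generally treats UTF-16 text files as tab-delimited, so passing "\t" as fputcsv's delimiter often makes the columns split correctly while the file stays a valid (tab-separated) CSV variant. A sketch of the change inside the question's array2csv():
// inside array2csv(): write tab-separated fields instead of commas
fputcsv($df, array_keys(reset($array)), "\t"); // third argument = delimiter
foreach ($array as $row) {
    fputcsv($df, $row, "\t");
}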
The default character used as the field separator in Excel is set by the locale settings. That means importing CSV files can be language dependent. Export to CSV from Excel to see what it does on your system, and check that your client's system has the same settings.
It's better to export to XML instead of CSV, because that will circumvent this problem:
http://en.wikipedia.org/wiki/Microsoft_Office_XML_formats
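For illustration, a minimal SpreadsheetML (Excel 2003 XML) sketch of that suggestion; the filename and placeholder rows are assumptions, not from the original answer:
<?php
header('Content-Type: application/vnd.ms-excel');
header('Content-Disposition: attachment; filename=export.xml');
$rows = array(array('id', 'name'), array('1', 'Example')); // placeholder data
echo '<?xml version="1.0"?>' . "\n";
echo '<Workbook xmlns="urn:schemas-microsoft-com:office:spreadsheet"' . "\n";
echo '          xmlns:ss="urn:schemas-microsoft-com:office:spreadsheet">' . "\n";
echo '<Worksheet ss:Name="Sheet1"><Table>' . "\n";
foreach ($rows as $row) {
    echo '<Row>';
    foreach ($row as $cell) {
        echo '<Cell><Data ss:Type="String">' . htmlspecialchars($cell) . '</Data></Cell>';
    }
    echo "</Row>\n";
}
echo '</Table></Worksheet></Workbook>';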

Write to .csv file with PHP (Commas in Data Error)

I am working on a PHP script that runs a query and then writes the data to a .csv file. The problem I am having is that some of the data coming back from the server has commas in it, which causes the .csv file to put data in the wrong place. Below is an example of the code.
$sql = "Select *
From table;"
$data = mysqli_query($link, $sql);
$row= ("Column One, Column Two, Column Three\n");
while ($result = $data->fetch_assoc()) {
$row .= ("$result[columnOne], $result[columnTwo], $result[columnThree]\n");
}
$fd = fopen("./filePath.csv", "w") or die ("Error Message");
fwrite($fd, $row);
fclose($fd);
Column three is where the data contains commas which causes for it to write to different cells in the .csv file. Is there any solution to make the $result[columnThree] data stay in one cell even though it contains commas in it?
You can wrap the values in double quotes:
$row .= '"' . $result['columnOne'] . '","' . $result['columnTwo'] . '","' . $result['columnThree'] . '"' . "\n";
Instead of concatenating a string, I like to use arrays as much as possible:
$rawCsv = array();
while ($result = $data->fetch_assoc()) {
    if (count($rawCsv) === 0) {
        $rawCsv[] = '"' . implode('","', array_keys($result)) . '"'; // header row
    }
    $rawCsv[] = '"' . implode('","', $result) . '"';
}
$csvString = implode("\n", $rawCsv);
Both of these approaches will have a hard time with a different character in your data, though: the double quote. With that in mind, an even better alternative is to use fopen and fputcsv to create your CSV data, so you don't have to think about it.
If you plan to immediately offer the CSV data for download, you don't need a file at all; just dump it into the output buffer:
ob_start();
$file_handle = fopen("php://output", 'w');
... if you do want to hang on to a file, then use fopen on the desired output file and skip the call to ob_start
Next, assemble your data:
fputcsv($file_handle, array(
    'Your',
    'headings',
    'here'
));
while ($result = $data->fetch_assoc()) {
    fputcsv($file_handle, array(
        $result['Your'],
        $result['data'],
        $result['here']
    ));
}
fclose($file_handle);
... If you're using a file, then you're all set! If you are using the output buffer (no file used), you can grab the CSV data and send it to the browser directly:
$csv = ob_get_clean();
echo $csv; // should send headers first!
Be careful with output buffering, though; some frameworks/applications make use of it internally. If you're running into problems with it, try using a file. If the file works, then your framework is probably already doing something with the output buffer.
Documentation
RFC 4180 Common Format and MIME Type for Comma-Separated Values (CSV) Files - https://www.rfc-editor.org/rfc/rfc4180
implode - http://php.net/function.implode
fopen - http://php.net/manual/en/function.fopen.php
fclose - http://php.net/manual/en/function.fclose.php
fputcsv - http://php.net/manual/en/function.fputcsv.php
ob_start - http://php.net/manual/en/function.ob-start.php
ob_get_clean - http://php.net/manual/en/function.ob-get-clean.php

mysqli export to xls issue

I'm having trouble getting this to work and I can't figure out what the issue is.
I download an .xls file, but it doesn't open.
I had a mysql script like this that worked, and I tried to convert it to mysqli; probably something went wrong...
Thanks in advance
$sqlExp = "SELECT * FROM table";
$countQryExp = mysqli_query($link, $sqlExp );
$filename = "sampledata.xls"; // File Name
header("Content-Disposition: attachment; filename=\"$filename\"");
header("Content-Type: application/vnd.ms-excel");
$flag = false;
while ($row = mysqli_fetch_array($countQryExp, MYSQLI_ASSOC)) {
    if (!$flag) {
        // display field/column names as first row
        echo implode("\t", array_keys($row)) . "\r\n";
        $flag = true;
    }
    echo implode("\t", array_values($row)) . "\r\n";
}
There's a lot more to generating an Excel file than giving it a content type of application/vnd.ms-excel. Excel is a very particular format, whereas you're generating a TSV file (tab-separated values), and in a pretty breakable manner: what happens if someone puts a \t in one of your site's fields, or a newline?
If you want to generate real Excel files, you'll want one of the various libraries for doing so. If CSV/TSV is fine, just export a .csv/.tsv file with proper headers.
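If CSV is acceptable, a minimal sketch of that route using the question's $link and fputcsv (the filename and the exit call are assumptions, not the answerer's code):
$result = mysqli_query($link, "SELECT * FROM table");
header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename="sampledata.csv"');
$out = fopen('php://output', 'w'); // stream straight to the browser
$flag = false;
while ($row = mysqli_fetch_assoc($result)) {
    if (!$flag) {
        fputcsv($out, array_keys($row)); // column names as the first row
        $flag = true;
    }
    fputcsv($out, $row); // handles embedded tabs, commas and quotes for you
}
fclose($out);
exit;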

Output 1,000s of records to text file

So I was hoping to get by with a simple solution that reads records from a database and saves them to a text file the user downloads. I have been doing this on the fly, and for under 20,000 records this works great. Over 20,000 records I'm loading too much data into memory and PHP hits a fatal error.
My thought was to grab everything in chunks: grab XX rows, echo them to the file, and loop to get the next XX rows until I'm done.
I am just echoing the results right now, not building the file and then sending it for download, which I'm guessing I'll have to do.
The issue at this point, succinctly: with up to 20,000 rows the file builds and downloads perfectly; with more than that, I get an empty file.
The code:
header('Content-Type: text/plain');
header('Content-Disposition: attachment; filename="export.' . $file_type . '"');
header('Expires: 0');
header('Cache-Control: must-revalidate');
// I do other things to check for records before, hence the do-while loop
$this->items = $model->getItems();
do {
    foreach ($this->items as $k => $item) {
        $i = 0;
        $tables = count($this->data['column']);
        foreach ($this->data['column'] as $table => $fields) {
            $columns = count($fields);
            $j = 0;
            foreach ($fields as $field => $junk) {
                if ($quote_output) {
                    echo '"' . ucwords(str_replace(array('"'), array('\"'), $item->$field)) . '"';
                } else {
                    echo $item->$field;
                }
                $j++;
                if ($j < $columns) {
                    echo $delim;
                }
            }
            $i++;
            if ($i < $tables) {
                echo $delim;
            }
        }
        echo "\n";
    }
} while ($this->items = $this->_model->getItems());
Very large tables won't work that way.
You have to output the data as you read it from the database. If you need it sorted, use the database's ORDER BY for that purpose.
So, more or less:
// assuming you use a var such as $query to handle the DB
while (!$query->eof()) {
    $fields = $query->read_next();
    echo $fields; // with your formatting, maybe call a function...
}
The empty result is normal: if memory is exhausted before any echo happens, nothing was sent to the browser.
Note also that PHP has a time limit (a watchdog) that you may need to tweak. The default is defined in your php.ini. You may set it to zero if you expect the tables to grow very much.
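A concrete sketch of that streaming idea, assuming mysqli and an unbuffered query ($link, the query, and the use of the question's $delim are placeholders; the pseudocode above names no specific driver):
// MYSQLI_USE_RESULT streams rows instead of buffering the whole result set in PHP
$result = mysqli_query($link, "SELECT * FROM items ORDER BY id", MYSQLI_USE_RESULT);
while ($row = mysqli_fetch_row($result)) {
    echo implode($delim, $row) . "\n"; // or apply your quoting/formatting here
    flush(); // push the bytes to the browser as we go
}
mysqli_free_result($result);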
You could change your str_replace to addslashes(). This will probably free some memory.
Then I suggest you save to a file, using PHP's file functions to do so: fopen() or file_put_contents().
I hope that helps!
Actually, this might be a simple fix. If PHP is running out of memory, it's probably because the output buffer is filling before the file is sent. If so, simply flush() at regular intervals.
This will flush after each line:
do {
    foreach (...) {
        // assemble your output line here
    }
    echo "\n";
    flush();
} while ($this->items = $this->_model->getItems());
Flushing after each line might prove too slow, in which case add a counter and flush after every hundred, or whatever works best.
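For instance, a counter version of that suggestion might look like this (100 is the example interval from the sentence above, not a measured optimum):
$count = 0;
do {
    foreach ($this->items as $item) {
        // assemble and echo your output line here
        echo "\n";
        if (++$count % 100 === 0) {
            flush(); // push buffered output every 100 rows
        }
    }
} while ($this->items = $this->_model->getItems());
flush(); // send whatever remains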

CSV file generation error

I'm working on a project for a client: a WordPress plugin that creates and maintains a database of organization members. I'll note that this plugin creates a new table within the WordPress database (instead of dealing with the data as custom_post_type meta data). I've made a lot of modifications to much of the plugin, but I'm having an issue with a feature (that I've left unchanged).
One half of this feature does a CSV import and insert, and that works great. The other half is a feature to download the contents of this table as a CSV. This part works fine on my local system, but fails when running from the server. I've pored over each portion of this script and everything seems to make sense. I'm, frankly, at a loss as to why it's failing.
The PHP file that contains the logic is simply linked to. The file:
<?php
// initiate wordpress
include('../../../wp-blog-header.php');
// phpinfo();
function fputcsv4($fh, $arr) {
    $csv = "";
    while (list($key, $val) = each($arr)) {
        $val = str_replace('"', '""', $val);
        $csv .= '"' . $val . '",';
    }
    $csv = substr($csv, 0, -1);
    $csv .= "\n";
    if (!@fwrite($fh, $csv))
        return FALSE;
}
//get member info and column data
$table_name = $wpdb->prefix . "member_db";
$year = date ('Y');
$members = $wpdb->get_results("SELECT * FROM ".$table_name, ARRAY_A);
$columns = $wpdb->get_results("SHOW COLUMNS FROM ".$table_name, ARRAY_A);
// echo 'SQL: '.$sql.', RESULT: '.$result.'<br>';
//output headers
header("Content-type: application/octet-stream");
header("Content-Disposition: attachment; filename=\"members.csv\"");
//open output stream
$output = fopen("php://output",'w');
//output column headings
$data[0] = "ID";
$i = 1;
foreach ($columns as $column) {
    //DIAG: echo '<pre>'; print_r($column); echo '</pre>';
    $field_name = '';
    $words = explode("_", $column['Field']);
    foreach ($words as $word) $field_name .= $word . ' ';
    if ($column['Field'] != 'id' && $column['Field'] != 'date_updated') {
        $data[$i] = ucwords($field_name);
        $i++;
    }
}
$data[$i] = "Date Updated";
fputcsv4($output, $data);
//output data
foreach ($members as $member) {
    // echo '<pre>'; print_r($member); echo '</pre>';
    $data[0] = $member['id'];
    $i = 1;
    foreach ($columns as $column) {
        //DIAG: echo '<pre>'; print_r($column); echo '</pre>';
        if ($column['Field'] != 'id' && $column['Field'] != 'date_updated') {
            $data[$i] = $member[$column['Field']];
            $i++;
        }
    }
    $data[$i] = $member['date_updated'];
    //echo '<pre>'; print_r($data); echo '</pre>';
    fputcsv4($output, $data);
}
fclose($output);
?>
So, obviously: a routine wherein a query is run, $output is opened with fopen, each row is formatted as comma-delimited and fwrite'd, and finally the file is fclose'd, at which point it gets pushed to the local system.
The error that I'm getting (from the server) is
Error 6 (net::ERR_FILE_NOT_FOUND): The file or directory could not be found.
But it clearly is getting found; it's just failing. If I enable phpinfo() (PHP Version 5.2.17) at the top of the file, I definitely get a response - notably "Cannot modify header information" (I'm pretty sure because phpinfo() has already generated output). All the expected data does get printed to the bottom of the page (after all the phpinfo diagnostics), however, so that much at least is working correctly.
I am guessing there is something preventing the fopen, fwrite, or fclose functions from working properly (a server setting?), but I don't have enough experience with this to identify exactly what the problem is.
I'll note again that this works exactly as expected in my test environment (localhost/XAMPP, netbeans).
Any thoughts would be most appreciated.
update
Ok - spent some more time with this today. I've tried each of the suggested fixes, including @Rudu's writeCSVLine fix and @Fernando Costa's file_put_contents() recommendation. The fact is, they all work locally. Whether I just echo or use the fopen/fwrite/fclose routine doesn't matter; it works great.
What does seem to be a problem is the inclusion of wp-blog-header.php at the start of the file followed by the additional header() calls. (The path is definitely correct on the server, by the way.)
If I comment out the include, I get a CSV file downloaded with some errors planted in it (because $wpdb doesn't exist). And if I comment out the headers, I get all my data printed to the page.
So... any ideas what could be going on here?
Some obvious conflict between the WordPress environment and the proper creation of a file.
Learning a lot, but no closer to an answer... Thinking I may need to just avoid the WordPress stuff and do a manual SQL query.
Ok, so I'm wondering why you've taken this approach. There's nothing wrong with php://output, but all it does is let you write to the output buffer the same way as print and echo... if you're having trouble with it, just use print or echo :) Any optimization you might have gained from using fwrite on the stream is lost anyway by string-building the $csv variable and then writing that in one go (not that optimizations are particularly necessary). With all that in mind, my solution (in keeping with your original design) would be this:
function escapeCSVcell($val) {
    return str_replace('"', '""', $val);
    // What about new lines in values? Perhaps not relevant to your
    // data, but they'll mess up your output ;)
}

function writeCSVLine($arr) {
    $first = true;
    foreach ($arr as $v) {
        if (!$first) { echo ","; }
        $first = false;
        echo "\"" . escapeCSVcell($v) . "\"";
    }
    echo "\n"; // May want to use \r\n depending on consuming script
}
Now use writeCSVLine in place of fputcsv4.
Ran into this same issue. Stumbled upon this thread, which does the same thing but hooks into the 'plugins_loaded' action and exports the CSV then: https://wordpress.stackexchange.com/questions/3480/how-can-i-force-a-file-download-in-the-wordpress-backend
Exporting the CSV early eliminates the risk that output has already been sent (and the headers locked in) before you get to them.
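A minimal sketch of that hook-based approach (the query parameter, capability check, and function prefix are assumptions for illustration, not from the linked thread; the table name reuses the question's):
// In the plugin's main file: runs before WordPress sends any output.
add_action('plugins_loaded', 'myplugin_maybe_export_csv');

function myplugin_maybe_export_csv() {
    if (!isset($_GET['myplugin_export']) || !current_user_can('manage_options')) {
        return;
    }
    global $wpdb;
    $rows = $wpdb->get_results("SELECT * FROM {$wpdb->prefix}member_db", ARRAY_A);
    header('Content-Type: text/csv');
    header('Content-Disposition: attachment; filename="members.csv"');
    $out = fopen('php://output', 'w');
    if ($rows) {
        fputcsv($out, array_keys($rows[0])); // header row from column names
        foreach ($rows as $row) {
            fputcsv($out, $row);
        }
    }
    fclose($out);
    exit; // stop WordPress from rendering anything else
}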
