Write to PHP output buffer and then download CSV from buffer - php

I need to write a CSV file to the PHP output buffer and then send it to the client's browser once it's done writing. (I originally wrote it to a file on the server and served that, which worked, but it turns out I won't have write access on the production servers.)
I have the following PHP script:
$basic_info = fopen("php://output", 'w');
$basic_header = array(HEADER_ITEMS_IN_HERE);
@fputcsv($basic_info, $basic_header);
while($user_row = $get_users_stmt->fetch(PDO::FETCH_ASSOC)) {
    @fputcsv($basic_info, $user_row);
}
@fclose($basic_info);
header('Content-Description: File Transfer');
header('Content-Type: application/csv');
header('Content-Disposition: attachment; filename=test.csv');
header('Expires: 0');
header('Cache-Control: must-revalidate');
header('Pragma: public');
header('Content-Length: ' . filesize("php://output"));
ob_clean();
flush();
readfile("php://output");
I'm not sure what to do. The CSV file downloads, but it's empty. I assume it has something to do with the ordering of my ob_clean() and flush() calls, but I'm not sure of the best way to order them.
Any help is appreciated.

You're doing a little too much. Create the script with the sole purpose of outputting the CSV. Just print it out directly to the screen. Don't worry about headers or buffers or php://output or anything like that yet.
Once you've confirmed that you're printing the data out to the screen appropriately, just add these headers at the beginning:
<?php
header("Content-disposition: attachment; filename=test.csv");
header("Content-Type: text/csv");
?>
... confirm that that downloads the file appropriately. Then you can add the other headers if you like. The two headers above are the ones I've used myself, without any extra cruft, to get this working; the others are mostly for efficiency and cache control, some of which may already be handled by your server, and may or may not matter for your particular application.
If you want, use output buffering with ob_start() and ob_get_clean() to get the output contents into a string which you can then use to fill out Content-Length.
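If you go that route, a minimal sketch (my own, reusing the question's $get_users_stmt statement and placeholder header names) could look like this:
ob_start();
$out = fopen('php://output', 'w');
fputcsv($out, array('Header1', 'Header2')); // placeholder header row
while ($user_row = $get_users_stmt->fetch(PDO::FETCH_ASSOC)) {
    fputcsv($out, $user_row);
}
fclose($out);
$csv = ob_get_clean(); // grab the buffered CSV as a string
header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename=test.csv');
header('Content-Length: ' . strlen($csv));
echo $csv;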

As mentioned in my comments on Edson's answer, I expected a "headers already sent" warning at the last line of his code:
header('Content-Length: '.$streamSize);
since output is written before this header is sent, yet his example works fine.
Some investigation leads me to the following conclusions:
When you use an output buffer (whether a user one or the default PHP one), you may send HTTP headers and content in any order you want. You know that any protocol requires headers to be sent before the body (thus the term "header"), but when you use an output buffer layer, PHP takes care of this for you. Any PHP function playing with output headers (header(), setcookie(), session_start()) will in fact use the internal sapi_header_op() function, which just fills in the headers buffer. When you then write output, using say printf(), it writes into the output buffer (assuming one exists). When the output buffer is to be sent, PHP sends the headers first, and then the body. PHP takes care of everything for you. If you don't like this behavior, you have no choice but to disable any output buffer layer.
and
The default size of the PHP buffer under most configurations is 4096 bytes (4 KB), which means PHP buffers can hold data up to 4 KB. Once this limit is exceeded or PHP code execution finishes, the buffered content is automatically sent to whatever backend PHP is using (CGI, mod_php, FastCGI). Output buffering is always off in PHP-CLI.
Edson's code works because the output buffer never gets flushed automatically: its contents stay under the buffer size (and the script obviously isn't terminated before the last header is sent).
As soon as the data in the output buffer exceeds the buffer size, the warning will be raised; in his example, that happens when the data returned by
$get_users_stmt->fetch(PDO::FETCH_ASSOC)
is too large.
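You can see this limit in action with a standalone sketch of my own (assuming the default output_buffering = 4096 setting):
// The first header() call is fine: the 4000 bytes are still sitting in PHP's
// output buffer, so no headers have actually gone out yet.
echo str_repeat('x', 4000);
header('X-Demo: still-ok');
// Pushing the total output past 4096 bytes forces PHP to flush the buffer
// (headers included), so the next header() call raises
// "Warning: Cannot modify header information - headers already sent".
echo str_repeat('x', 1000);
header('X-Demo: too-late');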
To prevent this, you should manage the output buffering yourself with ob_start() and ob_end_flush(), like below:
// Turn on output buffering
ob_start();

// Open a handle to the output stream
$basic_info = fopen("php://output", 'w');

// Define and write the header row to the CSV output
$basic_header = array('Header1', 'Header2');
fputcsv($basic_info, $basic_header);

$count = 0; // auxiliary variable used to write the CSV header in a different way

// Fetch the data rows and write them to the CSV output
while ($user_row = $get_users_stmt->fetch(PDO::FETCH_ASSOC)) {
    if ($count == 0) {
        // Alternative: write the selected columns' names as the CSV header
        // (drop the explicit $basic_header row above if you use this instead)
        fputcsv($basic_info, array_keys($user_row));
    }
    // Write the data row
    fputcsv($basic_info, $user_row);
    $count++;
}

// Get the size of the output after the last data is written
$streamSize = ob_get_length();

// Close the file pointer
fclose($basic_info);

// Send the raw HTTP headers
header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename=test.csv');
header('Expires: 0');
header('Cache-Control: no-cache');
header('Content-Length: ' . $streamSize);

// Flush (send) the output buffer and turn off output buffering
ob_end_flush();
You're still bound to other limits, though.
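For long exports, the limits I'd expect to hit first (my assumption, not part of the original answer) are max_execution_time and memory_limit, which you can relax for this one script:
set_time_limit(0); // no execution time limit for this request
ini_set('memory_limit', '256M'); // raise the memory ceiling if the result set is large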

I made some tweaks to your code.
Moved the headers before any output, as suggested by the PHP docs:
Remember that header() must be called before any actual output is
sent, either by normal HTML tags, blank lines in a file, or from PHP
Removed some headers that didn't make much of a change;
Added another option for writing the CSV header, using the selected columns' names;
Now Content-Length works;
There is no need to echo $basic_info: the data is already in the output buffer, and the headers tell the browser to treat that output as a file download;
Removed @ (PHP's error control operator) as it may cause overhead; I don't have a link to show you right now, but you can find discussions of it if you search. You should think twice before silencing errors: most of the time they should be fixed rather than silenced.
header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename=test.csv');
header('Expires: 0');
header('Cache-Control: no-cache');

$basic_info = fopen("php://output", 'w');

$basic_header = array(HEADER_ITEMS_IN_HERE);
fputcsv($basic_info, $basic_header);

$count = 0; // auxiliary variable used to write the CSV header in a different way

while ($user_row = $get_users_stmt->fetch(PDO::FETCH_ASSOC)) {
    // Write the selected columns' names as the CSV header (alternative to $basic_header above)
    if ($count == 0) {
        fputcsv($basic_info, array_keys($user_row));
    }

    // Write the data row
    fputcsv($basic_info, $user_row);
    $count++;
}

// Get the size of the output after the last data is written
$streamSize = ob_get_length();

fclose($basic_info);

header('Content-Length: ' . $streamSize);

Related

PHP Output MS SQL to a downloadable CSV or Excel file

I'm trying to output data returned by an MS SQL query to an Excel or CSV file with PHP.
I've used the script in this answer and can output the file OK. Without the header lines (at the bottom of my code) it saves into my server's folder structure rather than being offered as a download to the browser.
If I add the header lines, it outputs a CSV file but writes the page's HTML to it rather than the extract from the database! Am I missing a setting somewhere? I tried running the code on a page with no HTML in it (PHP and SQL code only), but it still happens.
// Give the file a suitable name:
$FileName= $PartNumber.".csv";
$fp = fopen($FileName, 'w');
// Connect to MS SQL server; the actual database is chosen in the form
// ConnSQL defined in inc/dbconn/config.php
ConnSQL($idDatabase);
// the query is a biggie; here it is:
require 'inc_sql.php';
// run it through the SQL server
$rstBOM = sqlsrv_query($GLOBALS['ConnSQL'], $sqlBOM);
while ($export = sqlsrv_fetch_array($rstBOM, SQLSRV_FETCH_ASSOC)) {
    if (!isset($headings)) {
        $headings = array_keys($export);
        fputcsv($fp, $headings, ',', '"');
    }
    fputcsv($fp, $export, ',', '"');
}
// force download csv - exports HTML to CSV!
header("Content-type: application/force-download");
header('Content-Disposition: inline; filename="'.$FileName.'"');
header("Content-Transfer-Encoding: Binary");
header("Content-length: ". filesize($FileName));
header('Content-Type: application/excel');
header('Content-Disposition: attachment; filename="'.$FileName.'"');
fclose($fp);
Any ideas where I'm going wrong please?
You need to send the CSV file to the browser, which you can do simply by adding
readfile($FileName);
at the end of your code, after the fclose($fp); call.
Otherwise the browser receives the file headers, but no content is ever sent by your PHP code.
You could also generate the CSV on the fly and just echo $csvFileContents; instead (see the sketch below). This avoids creating and writing files on the server, which could otherwise become a security concern.
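A rough sketch of that on-the-fly approach, reusing the question's $PartNumber and $rstBOM variables (untested; you would drop the fopen($FileName, 'w') block entirely):
header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename="' . $PartNumber . '.csv"');
// Write straight to the response body instead of to a file on disk
$fp = fopen('php://output', 'w');
while ($export = sqlsrv_fetch_array($rstBOM, SQLSRV_FETCH_ASSOC)) {
    if (!isset($headings)) {
        $headings = array_keys($export);
        fputcsv($fp, $headings, ',', '"');
    }
    fputcsv($fp, $export, ',', '"');
}
fclose($fp);
exit;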
Good luck!

Yii code executed twice when rendering pdf

On my page, people can choose to either view a pdf-file (on screen) or to download it. (to view it later on when they're offline)
When users choose to download, the code is executed once. I am keeping track of this with a counter and it increments by 1 for each download. So, this option is working fine and can be seen in the if-block below.
When users choose to view the file, the pdf file is displayed - so that's OK - but the counter increments by 2 for each view. This code is run from the else-block below.
I also checked the "Yii trace" and it is really going through all of it twice, but only when viewing the file...
if ($mode == Library::DOWNLOAD_FILE) {
    // DOWNLOAD
    Yii::app()->getRequest()->sendFile($fileName, @file_get_contents($rgFiles[0]));
    Yii::app()->end();
}
else {
    // VIEW
    // Set up PDF headers
    header('Content-type: application/pdf');
    header('Content-Disposition: inline; filename="' . $rgFiles[0] . '"');
    header('Content-Transfer-Encoding: binary');
    header('Content-Length: ' . filesize($rgFiles[0]));
    header('Accept-Ranges: bytes');
    // Render the file
    readfile($rgFiles[0]);
    Yii::app()->end();
}
}
I tried a few other options, just to see how it would cause this to run twice:
When removing the "PDF headers" from the code above, the counter is
incremented by 1, but I obviously only get garbage on the screen...
If I get rid off the readfile command, the counter is also incremented by 1,
but the browser won't render the pdf (because it is not getting the data without this line)...
So, it's only when going through the else-block that all of it (Yii request) is executed twice...
Thanks in advance for any suggestions...
I think that is because with the sendFile() method you actually open the file just once, while in the else branch you really open it twice.
In the if branch you open the file once with file_get_contents() and pass its contents as a string to the sendFile() method, which then counts the size of that string, outputs the headers, etc.: http://www.yiiframework.com/doc/api/1.1/CHttpRequest#sendFile-detail
In the else branch you touch the file first with filesize() and then again with readfile().
I think you could solve this problem by rewriting the else branch to work like the sendFile() method:
Basically, read the file into a string with file_get_contents(), then count the length of that string with mb_strlen(). After you output the headers, just echo the contents of the file without reopening it.
You could even copy-paste the body of the sendFile() method into the else branch and just change "attachment" to "inline" in the line below. (Or replace this whole if/else with the sendFile() method and switch the attachment/inline disposition depending on whether the user chose to download or view; more elegant still would be to override the method and add a parameter for viewing vs. downloading.)
header("Content-Disposition: attachment; filename=\"$fileName\"");
So I think something like this would be a solution:
// Open the file just once
$contents = file_get_contents($rgFiles[0]);

if ($mode == Library::DOWNLOAD_FILE) {
    // DOWNLOAD: pass the contents of the file to the sendFile method
    Yii::app()->getRequest()->sendFile($fileName, $contents);
} else {
    // VIEW
    // Calculate the length of the file.
    // Note: the sendFile() method uses some more magic to calculate the length
    // if $_SERVER['HTTP_RANGE'] exists; check that out if this does not work.
    $fileSize = (function_exists('mb_strlen') ? mb_strlen($contents, '8bit') : strlen($contents));

    // Set up PDF headers
    header('Content-type: application/pdf');
    header('Content-Disposition: inline; filename="' . $rgFiles[0] . '"');
    header('Content-Transfer-Encoding: binary');
    header('Content-Length: ' . $fileSize);
    header('Accept-Ranges: bytes');

    // Output the file
    echo $contents;
}

Yii::app()->end();
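And if you prefer the "override/extend sendFile()" route mentioned above, here is a rough, untested sketch (the class and method names are my own invention, and you would still need to register the class as Yii's 'request' component):
class InlineAwareHttpRequest extends CHttpRequest
{
    // Like sendFile(), but with an $inline flag to choose viewing vs. downloading
    public function sendFileWithDisposition($fileName, $content, $mimeType = 'application/octet-stream', $inline = false)
    {
        $length = function_exists('mb_strlen') ? mb_strlen($content, '8bit') : strlen($content);
        $disposition = $inline ? 'inline' : 'attachment';

        header('Content-Type: ' . $mimeType);
        header('Content-Length: ' . $length);
        header('Content-Disposition: ' . $disposition . '; filename="' . $fileName . '"');
        header('Content-Transfer-Encoding: binary');

        echo $content;
        Yii::app()->end();
    }
}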
I hope this solves your problem, and my explanations are understandable.

Download large CSV file to browser while it is being generated

I have a script that generates a large CSV file using fputcsv and sends it to the browser. It works, but the browser doesn't show the file download prompt (or start downloading the file) until the whole CSV file has been generated serverside, which takes a long time.
Instead, I'd like the download to begin while the remainder of the file has still being generated. I know this is possible because it's how the 'Export database' option in PHPMyAdmin works - the download starts as soon as you click the 'export' button even if your database is huge.
How can I tweak my existing code, below, to let the download begin immediately?
$csv = 'title.csv';
header("Content-Type: text/csv;charset=utf-8");
header("Content-Disposition: attachment;filename=\"$csv\"");
header("Pragma: no-cache");
header("Expires: 0");

$fp = fopen('php://output', 'w');
fputcsv($fp, array_keys($array), ';', '"');
foreach ($array as $fields) {
    fputcsv($fp, $fields, ';', '"');
}
fclose($fp);
exit();
Empirically, it seems that when receiving responses featuring a Content-Disposition: attachment header, different browsers will show the file download dialog at the following moments:
Firefox shows the dialog as soon as it receives the headers
Internet Explorer shows the dialog once it has received the headers plus 255 bytes of the response body.
Chromium shows the dialog once it has received the headers plus 1023 bytes of the response body.
Our objectives, then, are as follows:
Flush the first kilobyte of the response body to the browser as soon as possible, so that Chrome users see the file download dialog at the earliest possible moment.
Thereafter, regularly send more content to the browser.
Standing in the way of these objectives are, potentially, multiple levels of buffering, which you can try to fight in different ways.
PHP's output_buffer
If you have output_buffering set to a value other than Off, PHP will automatically create an output buffer which stores all output your script tries to send to the response body. You can prevent this by ensuring that output_buffering is set to Off in your php.ini file, or in a webserver config file like apache.conf or nginx.conf. Alternatively, you can turn off the output buffer, if one exists, at the start of your script using ob_end_flush() or ob_end_clean():
if (ob_get_level()) {
ob_end_clean();
}
Buffering done by your webserver
Once your output gets past the PHP output buffer, it may be buffered by your webserver. You can try to get around this by calling flush() regularly (e.g. every 100 lines), although the PHP manual is hesitant about providing any guarantees, listing some particular cases where this may fail:
flush
...
Flushes the write buffers of PHP and whatever backend PHP is using (CGI, a web server, etc). This attempts to push current output all the way to the browser with a few caveats.
flush() may not be able to override the buffering scheme of your web server ...
Several servers, especially on Win32, will still buffer the output from your script until it terminates before transmitting the results to the browser.
Server modules for Apache like mod_gzip may do buffering of their own that will cause flush() to not result in data being sent immediately to the client.
You can alternatively have PHP call flush() automatically every time you try to echo any output, by calling ob_implicit_flush at the start of your script - though beware that if you have gzip enabled via a mechanism that respects flush() calls, such as Apache's mod_deflate module, this regular flushing will cripple its compression attempts and probably result in your 'compressed' output being larger than if it were uncompressed. Explicitly calling flush() every n lines of output, for some modest but non-tiny n, is thus perhaps a better practice.
Putting it all together, then, you should probably tweak your script to look something like this:
<?php
if (ob_get_level()) {
    ob_end_clean();
}

$csv = 'title.csv';
header("Content-Type: text/csv;charset=utf-8");
header("Content-Disposition: attachment;filename=\"$csv\"");
header("Pragma: no-cache");
header("Expires: 0");
flush(); // Get the headers out immediately to show the download dialog in Firefox

$array = get_your_csv_data(); // This needs to be fast, of course

$fp = fopen('php://output', 'w');
fputcsv($fp, array_keys($array), ';', '"');
foreach ($array as $i => $fields) {
    fputcsv($fp, $fields, ';', '"');
    if ($i % 100 == 0) {
        flush(); // Attempt to flush output to the browser every 100 lines.
                 // You may want to tweak this number based upon the size of your CSV rows.
    }
}
fclose($fp);
?>
If this doesn't work, then I don't think there's anything more you can do from your PHP code to try to resolve the problem - you need to figure out what's causing your web server to buffer your output and try to solve that using your server's configuration files.
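One addition of my own, not from the original answer: if PHP runs behind nginx, you can usually tell nginx not to buffer this particular response by sending its X-Accel-Buffering header before any output, which saves you from changing the server-wide proxy_buffering/fastcgi_buffering settings:
header('X-Accel-Buffering: no'); // ask nginx not to buffer this response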
I have not tested this, but try flushing the output after every n data rows:
flush();
Try Mark Amery's answer, but with extra emphasis on this line:
$array = get_your_csv_data(); // This needs to be fast, of course
If you're fetching a huge number of records, fetch them in chunks (1000 records at a time, for example); a rough sketch follows the list below.
So:
Fetch 1000 records
Output them
Repeat
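A rough sketch of that loop (my own illustration; $pdo, the users table, and the id column stand in for whatever your actual query uses):
$fp = fopen('php://output', 'w');
$headerWritten = false;
$offset = 0;
$chunkSize = 1000;
do {
    // %d keeps the interpolated values numeric, so the query stays injection-safe
    $stmt = $pdo->query(sprintf(
        'SELECT * FROM users ORDER BY id LIMIT %d OFFSET %d',
        $chunkSize, $offset
    ));

    $rowCount = 0;
    while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
        if (!$headerWritten) {
            fputcsv($fp, array_keys($row)); // column names as the CSV header
            $headerWritten = true;
        }
        fputcsv($fp, $row);
        $rowCount++;
    }

    flush(); // push this chunk out to the browser
    $offset += $chunkSize;
} while ($rowCount === $chunkSize);
fclose($fp);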
I think you are looking for the octet-stream header.
$csv = 'title.csv';
header('Content-Type: application/octet-stream');
header("Content-Disposition: attachment;filename=\"$csv\"");
header('Content-Transfer-Encoding: binary');
header('Cache-Control: must-revalidate');
header('Expires: 0');

$fp = fopen('php://output', 'w');
fputcsv($fp, array_keys($array), ';', '"');
foreach ($array as $fields) {
    fputcsv($fp, $fields, ';', '"');
}
fclose($fp);
exit();

Download headers somehow inserting 18 lines of whitespace

I am trying to serve up a dynamically generated csv file. For some reason when I get the file, there are 18 empty rows preceding the data. I don't have any space between the headers I define and the csv data I'm sending. If I write the data to a file on the server, it does not get these empty rows. However, if I write the file and then try to serve it to the user, the empty lines come back. So I'm wondering if perhaps I've messed up the headers, or if perhaps there is another issue I'm not thinking of:
function generate_csv($source_type, $include_unpublished = FALSE) {
    // retrieve data from DB
    ....

    // start up headers
    $csv_name = "$source_type-$data_set-csv_" . date('Y-m-d') . '.csv';
    header('Content-Type: text/x-comma-separated-values');
    header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
    header('Cache-Control: private', false); // required for certain browsers
    header('Content-Disposition: attachment; filename="' . $csv_name . '"');

    // send csv data
    print $csv_data;
} // end function
Disclaimer: I asked this question at https://drupal.stackexchange.com/questions/27649/extra-empty-rows-when-serving-csv-file, but it doesn't seem to be Drupal-specific and there weren't many ideas coming up over there.
Maybe these lines are "hanging" in an output buffer that was started some time earlier. That way you can set headers without the good old "headers already sent" error, but the buffered content will still be sent to the browser when the buffer is flushed.
Try
ob_clean();
print $csv_data;
http://php.net/ob-clean
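A sketch of that suggestion applied to the question's function (the ob_get_level() guard is my own addition, for the case where no buffer is active):
// Discard anything already sitting in the output buffer (stray whitespace from
// included files, theme output, etc.) before printing the CSV data.
if (ob_get_level()) {
    ob_clean();
}
print $csv_data;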
It must be a problem with the files you are including. Any whitespace after the PHP closing tag ?> (beyond the single newline PHP swallows) is sent to the browser.
The best solution is to get rid of the closing tag in every PHP file.
Another option is to remove just the unnecessary newlines from those files, or to buffer the output and discard it before serving the file.
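To illustrate with a made-up include file (helper.php and load_csv_data() are hypothetical):
<?php
// helper.php - if this file ended with "?>" followed by blank lines, those
// blank lines would be echoed into the response ahead of your CSV data.
// Omitting the closing tag, as done here, makes that impossible.
function load_csv_data() {
    return "col1,col2\n1,2\n";
}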

PHP readfile() adding extra bytes to downloaded file

I am trying to troubleshoot an issue I am having with downloading a "zip" file from a PHP script. It seems that when I download the file using the following code, the downloaded file has an extra 0A09 (a newline and a tab) prepended to the beginning of the file, causing WinZip to throw a corruption error.
<?php
$pagePermissions = 7;
require_once ('util/check.php');
require_once ('util/file_manager.php');

$file_manager = new FileManager();

if ($_SERVER['REQUEST_METHOD'] == "GET") {
    if (isset($_GET['q']) && $_GET['q'] == 'logout') {
        //require_once ('util/users.php');
        //$userdata = new Userdata();
        $userdata -> kill_session();
        header("Location: download.php");
        exit ;
    }
    if (isset($_GET['q']) && $_GET['q'] == 'fetch') {
        if (isset($_GET['name'])) {
            @apache_setenv('no-gzip', 1);
            header("Content-length: " . filesize('upload/' . $_GET['name']));
            header('Content-type: application/zip');
            //header("Content-Disposition: attachment; filename=\"{$_GET['name']}\" ");
            header("Content-Disposition: attachment; filename={$_GET['name']}");
            header('Content-Transfer-Encoding: binary');
            readfile('upload/' . $_GET['name']);
            exit();
        }
    }
}
?>
Any help would be greatly appreciated. The file downloads fine through a direct link; the two extra bytes at the beginning of the file appear only when downloading through this code.
Thanks in advance
Remove the last ?> and check that your opening tag is on the very first line, at the very first character of your scripts. PHP files do not have to end with a closing tag. The reason your downloaded files contain one (or more) \r\n sequences is that PHP will directly echo (output) anything outside of <?php ?>. Usually, if your script does not echo HTML, you should omit the closing PHP tag; it is not mandatory and, IMO, causes more trouble than anything else.
** Edit **
If you read the PHP manual for readfile, there is a useful example that is pretty much the code you have in your question, minus two lines of code:
@apache_setenv('no-gzip', 1);
header("Content-length: " . filesize('upload/' . $_GET['name']));
header('Content-type: application/zip');
//header("Content-Disposition: attachment; filename=\"{$_GET['name']}\" ");
header("Content-Disposition: attachment; filename={$_GET['name']}");
header('Content-Transfer-Encoding: binary');
// add these two lines
ob_clean(); // discard any data in the output buffer (if possible)
flush(); // flush headers (if possible)
readfile('upload/' . $_GET['name']);
exit();
If you still have a problem after that, then the problem might not be with your PHP code.
Sorry for the late reply.
I don't know if I'm right until you vote on this.
Edit your code as follows:
ob_start("");
// instead of ob_start(); without a callback
and
ob_end_clean(); // at the end. Note: important, use this instead of ob_end_flush()
i.e.:
ob_start("");
// ... headers and output ...
ob_end_clean();
I ran into a similar issue today related to readfile(). It turns out my php.ini file had output compression enabled, and that was messing up the Flash module trying to retrieve the file. (I guess it couldn't handle it.)
All I had to do was turn off compression in the php.ini file:
zlib.output_compression = off
Or alternatively, in your script:
<?php ini_set('zlib.output_compression', 'Off'); ?>
Just want to share this in case someone else was having trouble receiving files from a readfile() output.
I had a similar problem in Joomla (2.5), using readfile to pass back Excel (.xls) files to the user.
I also noticed that text and XML files had some extra bytes inserted at the beginning, but I nearly ignored it because XML and text readers still tended to open the files.
I decided to try Yanick's suggestions (rather than playing with server compression options), simply flushing the buffer before readfile:
ob_clean(); // discard any data in the output buffer (if possible)
flush(); // flush headers (if possible)
Hey presto, it worked. I'm suggesting this as an alternative answer to highlight the root cause, to show it can fix a Joomla issue, and because I had a mixture of binary and text responses.
Just to add (apologies if this is obvious!) - the mime type setting worked fine:
$document = JFactory::getDocument();
$document->setMimeEncoding($mimetype);
I did not even need to set 'Content-Transfer-Encoding: binary' when the mime type was set to application/octet-stream.
I had the same problem, so I used these headers and that solved it for me.
$filename = ABSPATH.'/wp-content/summary/user_content/'.trim($file);
header('Pragma: public');
header('Expires: 0');
header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
header('Content-Description: File Transfer');
header('Content-Type: text/text');
header('Content-Disposition: attachment; filename="'.$file.'"');
header('Content-Transfer-Encoding: binary');
header('Cache-Control: max-age=0');
readfile($filename);
exit;
I had a similar issue with blank space at the start of an image file.
I suspect my issue was caused by blank space before the opening <?php tag.
What worked for me was:
@ob_start(''); // the @ suppresses a warning
// header entries
ob_end_clean();
ob_clean();
readfile($file);
