I am building a very large table in a single string, $output, with data from a MySQL database. At the end of the PHP script, I send two headers:
header("Content-type: application/msexcel");
header("Content-Disposition: attachment; filename=action.xls");
The problem occurs when the table has more than about 55,000 rows. When this happens, the file sent is 0 bytes and the user opens an empty file. I am sending the file after the headers with:
echo $output;
When the table has fewer rows, the file is sent correctly. How can I send the file in a way that the size of the string $output doesn't matter?
header("Content-type: application/msexcel");
header("Content-Disposition: attachment; filename=action.xls");
Move these lines to the top of the page or script, before any HTML table output, and it will start working.
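In other words, the header() calls must run before the script produces any output at all (even whitespace before the opening <?php tag counts as output). A minimal sketch of the working order:
<?php
// Headers first, before any echo/print or literal HTML:
header("Content-type: application/msexcel");
header("Content-Disposition: attachment; filename=action.xls");
// Only now emit the table:
echo $output;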
The solution found in another question worked for this. The file is too large to send in one go, so it has to be sent to the user part by part. Instead of using the readfile() function, use the customized readfile_chunked() function:
define('CHUNK_SIZE', 1024*1024); // size (in bytes) of each chunk

// Read a file and output its content chunk by chunk
function readfile_chunked($filename, $retbytes = TRUE) {
    $buffer = '';
    $cnt = 0;
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        $buffer = fread($handle, CHUNK_SIZE);
        echo $buffer;
        ob_flush();
        flush();
        if ($retbytes) {
            $cnt += strlen($buffer);
        }
    }
    $status = fclose($handle);
    if ($retbytes && $status) {
        return $cnt; // return the number of bytes delivered, like readfile() does
    }
    return $status;
}
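Since this question builds the table in a string rather than reading it from disk, one way to apply readfile_chunked() here (a sketch; the temp-file handling is illustrative, not part of the original answer) is to write $output to a temporary file first and stream that:
$tmp = tempnam(sys_get_temp_dir(), 'xls'); // illustrative temporary file
file_put_contents($tmp, $output);
header("Content-type: application/msexcel");
header("Content-Disposition: attachment; filename=action.xls");
readfile_chunked($tmp);
unlink($tmp); // clean up the temporary file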
I want to serve huge files from a folder above public_html.
Currently I do:
<?php
// Authenticate
if ($_GET['key'] !== "MY-API-KEY") {
    header('HTTP/1.0 403 Forbidden');
    echo "You are not authorized.";
    return;
}

define('CHUNK_SIZE', 1024*1024);
$PATH_ROOT_AUTOPILOT_ACTIVITY_STREAMS = "../../../data/csv/";

// Read a file and output its content chunk by chunk
function readfile_chunked($filename, $retbytes = TRUE) {
    $buffer = '';
    $cnt = 0;
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        $buffer = fread($handle, CHUNK_SIZE);
        echo $buffer;
        ob_flush();
        flush();
        if ($retbytes) {
            $cnt += strlen($buffer);
        }
    }
    $status = fclose($handle);
    if ($retbytes && $status) {
        return $cnt; // return the number of bytes delivered, like readfile() does
    }
    return $status;
}

// Get the file parameter
$file = basename(urldecode($_GET['file']));
$fileDir = $PATH_ROOT_AUTOPILOT_ACTIVITY_STREAMS;
$filePath = $fileDir . $file;

if (file_exists($filePath)) {
    // Get the file's MIME type to send the correct Content-Type header
    $finfo = finfo_open(FILEINFO_MIME_TYPE);
    $mime_type = finfo_file($finfo, $filePath);

    // Send the headers
    header("Content-Disposition: attachment; filename=$file.csv;");
    header("Content-Type: $mime_type");
    header('Content-Length: ' . filesize($filePath));

    // Stream the file
    readfile_chunked($filePath);
    exit;
}
?>
This currently fails for some reason I don't understand. curl outputs:
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  457M    0  457M    0     0  1228k      0 --:--:--  0:06:21 --:--:-- 1762k
curl: (92) HTTP/2 stream 1 was not closed cleanly: INTERNAL_ERROR (err 2)
Is there a better way to serve big files programmatically, via PHP?
Currently about 2/3 of the file is served. The response is incomplete because something crashes partway through. There is nothing in the logs.
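For what it's worth, a transfer that dies after several minutes is often the script hitting max_execution_time, and a stray output buffer can make the body disagree with the declared Content-Length. A sketch of guards to add before streaming (these causes are assumptions on my part, not confirmed by the question):
set_time_limit(0); // don't let max_execution_time kill a multi-minute stream
while (ob_get_level() > 0) {
    ob_end_clean(); // drop buffers so output matches the Content-Length header
}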
I need to log total downloads of a specific file. The download function works fine, but I can't determine whether the user canceled (by clicking "Cancel" in the browser dialog) or whether the connection was aborted later.
I understand it's not simple to know when a file download has finished, so I'm trying to get at it in two ways. Neither works:
Get the total bytes sent, to compare later with the total file size: this way the $bytes_sent var is always set to the total file size, no matter whether the user clicks the cancel button in the download dialog or cancels the download later.
Trigger the connection_aborted() function: I have not found a way to make this function fire and set my session var...
(I'm not sure whether the fact that I'm working with sessions is relevant.)
I appreciate your help :)
<?php
if (is_file($filepath)) {
    $handle = fopen($filepath, "r");
    header("Content-Type: $mime_type");
    header("Content-Length: " . filesize($filepath) . ";");
    header("Content-disposition: attachment; filename=" . $name);
    while (!feof($handle)) {
        ignore_user_abort(true);
        set_time_limit(0);
        $data = fread($handle, filesize($filepath));
        print $data;
        $_SESSION['download'] = 'Successful download';
        // Always set to the total file length, even when a large download
        // is canceled before it finishes:
        $bytes_sent = ftell($handle);
        flush();
        ob_flush();
        // Can't trigger connection aborted, in any case:
        if (connection_aborted()) {
            $_SESSION['download'] = 'Canceled download';
        }
    }
}
PHP Version 5.3.29
You need to read the file in small chunks rather than all at once.
$chunk_size = 1000;
ignore_user_abort(true); // keep the script running after a disconnect so we can record it
$canceled = false;
$bytes_sent = 0;
while ($chunk = fread($handle, $chunk_size)) {
    print $chunk;
    ob_flush();
    flush(); // actually push the chunk to the client so the connection status is updated
    $bytes_sent += strlen($chunk);
    if (connection_aborted()) {
        $canceled = true;
        break;
    }
}
$_SESSION['download'] = $canceled ? "Download canceled" : "Download successful";
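Note that connection_aborted() is only updated when PHP actually attempts to send output to the client, which is why flushing each chunk matters; if something like zlib output compression buffers the stream, the abort may never be detected. A guard worth adding (compression being enabled is an assumption, not something stated in the question):
@ini_set('zlib.output_compression', 'Off'); // buffered/compressed output can hide client aborts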
I'm currently having the problem that my download always stops at almost exactly one gigabyte. I have created a PHP download script which checks the user's permissions before downloading and prevents the URL from simply being copied. For this I use fread() to pass the data through PHP. However, I have no idea why the download always stops after one gigabyte. The file is about twice that size.
Could someone take a look at this?
// ... Permission check, get download URL, etc. ...
try
{
    define('CHUNK_SIZE', 8 * 1024);

    function readfile_chunked($filename, $retbytes = true)
    {
        $buffer = '';
        $cnt = 0;
        $handle = fopen($filename, 'rb');
        if ($handle === false)
        {
            return false;
        }
        while (!feof($handle))
        {
            $buffer = fread($handle, CHUNK_SIZE);
            echo $buffer;
            if (ob_get_level() > 0)
            {
                ob_flush();
                flush();
            }
            if ($retbytes)
            {
                $cnt += strlen($buffer);
            }
        }
        $status = fclose($handle);
        if ($retbytes && $status)
        {
            return $cnt;
        }
        return $status;
    }

    $mimetype = 'mime/type';
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename=' . $download->url);
    readfile_chunked($download->url);
}
catch (Exception $e)
{
    echo "Error while reading the file";
    exit;
}
Update:
I have now tested the script on another server. Here the download sometimes terminated after 300 MB and sometimes after 1.3 GB. I then defined the following settings:
ini_set('display_errors', 0);
ini_set('error_reporting', E_ALL);
ini_set("log_errors", 1);
ini_set("error_log", "error.log");
ini_set('memory_limit', '-1');
ini_set('max_execution_time', 7200);
On the other server, the script has now run through several times without any problems. The download was complete and the file was error-free. However, I copied these settings to the original system and the download still stops. I downloaded the file three times in a row. The file size varies slightly:
1.057.983 KB
1.056.192 KB
1.056.776 KB
I'm running out of ideas. What else could it be? Both servers run Apache.
There are also no noticeable entries in the log.
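Since the truncation point varies between runs, one guess (an assumption, not a confirmed diagnosis) is that something on the output path, such as an output buffer or on-the-fly compression like mod_deflate, is holding the stream and failing partway. A sketch of things to try before sending the file:
@ini_set('zlib.output_compression', 'Off'); // don't compress the stream in PHP
if (function_exists('apache_setenv')) {
    @apache_setenv('no-gzip', '1'); // ask mod_deflate to skip this request
}
while (ob_get_level() > 0) {
    ob_end_clean(); // remove buffers that hold (part of) the file in memory
}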
I'm having some trouble with this one. I have found some helpful scripts on the web and have been modifying them for my needs. However, I can't seem to download a file: the server responds with the contents of the file, but no download is triggered. I am using Polymer 1.0+ on the client side and PHP on the server side. The client-side code to download a file is as follows:
<!-- THIS IS THE HTML SIDE -->
<iron-ajax
    id="ajaxDownloadItem"
    url="../../../dropFilesBackend/index.php/main/DownloadItem"
    method="GET"
    handle-as="document"
    last-response="{{downloadResponse}}"
    on-response="ajaxDownloadItemResponse">
</iron-ajax>

// THIS IS THE JAVASCRIPT THAT CALLS THE "iron-ajax" ELEMENT
downloadItem: function(e) {
    this.$.ajaxDownloadItem.params = {
        "FILENAME": this.selectedItem.FILENAME,
        "PATH": this.folder
    };
    this.$.ajaxDownloadItem.generateRequest();
},
The server-side code is as follows (the URL is different because I do some URL modification to get to the correct script):
function actionDownloadItem() {
    valRequestMethodGet();
    $username = $_SESSION['USERNAME'];
    if (validateLoggedIn($username)) {
        $itemName = arrayGet($_GET, "FILENAME");
        $path = arrayGet($_GET, "PATH");
        $username = $_SESSION['USERNAME'];
        $downloadItem = CoreFilePath();
        $downloadItem .= "/" . $_SESSION['USERNAME'] . "" . $path . "" . $itemName;
        DownloadFile($downloadItem);
    }
    else {
        echo "Not Logged In.";
    }
}

function DownloadFile($filePath) {
    //ignore_user_abort(true);
    set_time_limit(0); // disable the time limit for this script
    //touch($filePath);
    //chmod($filePath, 0775);
    if ($fd = fopen($filePath, "r")) {
        $fsize = filesize($filePath); // this returns 12
        $path_parts = pathinfo($filePath); // basename = textfile.txt
        $ext = strtolower($path_parts["extension"]); // this returns txt
        $header = headerMimeType($ext); // this returns text/plain
        header('Content-disposition: attachment; filename="' . $path_parts["basename"] . '"'); // use 'attachment' to force a file download
        header("Content-type: $header");
        header("Content-length: $fsize");
        header("Cache-control: private"); // use this to open files directly
        while (!feof($fd)) {
            $buffer = fread($fd, 2048);
            echo $buffer;
        }
    }
    fclose($fd);
}
Any help on this one would be greatly appreciated.
First, you will need a file handle:
$pathToSave = '/home/something/something.txt';
$writeHandle = fopen($pathToSave, 'wb');
Then, while you are reading the download, write to the file instead of echoing:
fwrite($writeHandle, fread($fd, 2048));
Finally, after the writing is finished, close the handle:
fclose($writeHandle);
I have neglected error checking; you should implement your own.
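Put together, the pieces look roughly like this (a sketch; $fd is the source handle opened in the question's DownloadFile(), and $pathToSave is wherever the copy should go):
$pathToSave = '/home/something/something.txt';
$writeHandle = fopen($pathToSave, 'wb');
while (!feof($fd)) {
    fwrite($writeHandle, fread($fd, 2048));
}
fclose($writeHandle);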
I am using a variation of the familiar readfile_chunked() in an attempt to download larger files:
function readfile_chunked($filename)
{
    $chunk_size = 1 * (1024 * 1024); // how many bytes per chunk
    $buffer = '';
    $handle = fopen($filename, 'rb');
    if ($handle === false)
    {
        return false;
    }
    while (!feof($handle))
    {
        $buffer = fread($handle, $chunk_size);
        print $buffer;
        ob_flush();
        flush();
        sleep(1);
    }
    $status = fclose($handle);
    return $status;
}
Smaller files work fine, but this larger file is missing the last 2830 bytes.
I found out the cause of this: in the php.ini file, make sure you set implicit_flush to On. I still have the explicit flush code after each chunk is output, however.
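For reference, the same behavior can also be turned on at runtime rather than in php.ini (a sketch):
ob_implicit_flush(true); // flush automatically after every output call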