I'm currently having the problem that my download always stops at almost exactly one gigabyte. I wrote a PHP download script that checks the user's permissions before the download and prevents the direct URL from being copied; to do this, I pass the data through PHP with fread(). I have no idea why the download keeps stopping at that point; the file is about twice that size.
Could someone take a look at this?
// ... Permission check, get download url, etc. ...
try
{
    define('CHUNK_SIZE', 8 * 1024);

    function readfile_chunked($filename, $retbytes = true)
    {
        $buffer = '';
        $cnt = 0;
        $handle = fopen($filename, 'rb');
        if ($handle === false)
        {
            return false;
        }
        while (!feof($handle))
        {
            $buffer = fread($handle, CHUNK_SIZE);
            echo $buffer;
            if (ob_get_level() > 0)
            {
                ob_flush();
                flush();
            }
            if ($retbytes)
            {
                $cnt += strlen($buffer);
            }
        }
        $status = fclose($handle);
        if ($retbytes && $status)
        {
            return $cnt;
        }
        return $status;
    }

    $mimetype = 'mime/type';
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename=' . $download->url);
    readfile_chunked($download->url);
}
catch (Exception $e)
{
    echo "Error while reading the file";
    exit;
}
Update:
I have now tested the script on another server. There, the download sometimes stopped after 300 MB and sometimes after 1.3 GB. I then applied the following settings:
ini_set('display_errors', 0);
ini_set('error_reporting', E_ALL);
ini_set('log_errors', 1);
ini_set('error_log', 'error.log');
ini_set('memory_limit', '-1');
ini_set('max_execution_time', 7200);
On the other server, the script has now run through several times without any problems. The download was complete and the file was error-free. However, I copied these settings to the original system and the download still stops. I downloaded the file three times in a row. The file size varies slightly:
1,057,983 KB
1,056,192 KB
1,056,776 KB
I'm running out of ideas. What else could it be? Both are Apache web servers, and there are no conspicuous entries in the logs either.
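Since the truncation point varies between servers and runs, it is worth ruling out an intermediate compression or buffering layer (zlib.output_compression, Apache's mod_deflate) and the script being killed mid-transfer by a time limit. Here is a minimal sketch of a defensive setup before calling readfile_chunked() from the script above; it assumes Apache with mod_php, and $path stands in for the local file path:

// Sketch: defensive setup before streaming a large file.
set_time_limit(0);                          // rule out max_execution_time kills
ini_set('zlib.output_compression', 'Off');  // PHP-level compression can truncate streams
if (function_exists('apache_setenv')) {
    apache_setenv('no-gzip', '1');          // ask mod_deflate to skip this response
}
while (ob_get_level() > 0) {
    ob_end_clean();                         // drop any active output buffers
}
header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($path)); // lets the client detect truncation
readfile_chunked($path);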
Related
I want to serve huge files from a folder above the public_html.
Currently I do:
<?php
// Authenticate
if ($_GET['key'] !== "MY-API-KEY") {
    header('HTTP/1.0 403 Forbidden');
    echo "You are not authorized.";
    return;
}

define('CHUNK_SIZE', 1024*1024);
$PATH_ROOT_AUTOPILOT_ACTIVITY_STREAMS = "../../../data/csv/";

// Read a file and display its content chunk by chunk
function readfile_chunked($filename, $retbytes = TRUE) {
    $buffer = '';
    $cnt = 0;
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        $buffer = fread($handle, CHUNK_SIZE);
        echo $buffer;
        ob_flush();
        flush();
        if ($retbytes) {
            $cnt += strlen($buffer);
        }
    }
    $status = fclose($handle);
    if ($retbytes && $status) {
        return $cnt; // return num. bytes delivered like readfile() does.
    }
    return $status;
}

// Get the file parameter
$file = basename(urldecode($_GET['file']));
$fileDir = $PATH_ROOT_AUTOPILOT_ACTIVITY_STREAMS;
$filePath = $fileDir . $file;

if (file_exists($filePath)) {
    // Get the file's mime type to send the correct content type header
    $finfo = finfo_open(FILEINFO_MIME_TYPE);
    $mime_type = finfo_file($finfo, $filePath);

    // Send the headers
    header("Content-Disposition: attachment; filename=$file.csv;");
    header("Content-Type: $mime_type");
    header('Content-Length: ' . filesize($filePath));

    // Stream the file
    readfile_chunked($filePath);
    exit;
}
?>
This currently fails for some reason I don't understand. curl outputs:
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  457M    0  457M    0     0  1228k      0 --:--:--  0:06:21 --:--:-- 1762k
curl: (92) HTTP/2 stream 1 was not closed cleanly: INTERNAL_ERROR (err 2)
Is there a better way to serve big files programmatically via PHP?
Currently about 2/3 of the file is served. The response is incomplete because something crashes, and there are no logs.
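If installing a server module is an option, a more robust pattern is to let the web server do the transfer itself and use PHP only for authentication: Apache's mod_xsendfile honors an X-Sendfile header, and nginx has the equivalent X-Accel-Redirect. A sketch assuming Apache with mod_xsendfile installed and enabled:

<?php
// Sketch: authenticate in PHP, let Apache's mod_xsendfile stream the file.
if ($_GET['key'] !== "MY-API-KEY") {
    header('HTTP/1.0 403 Forbidden');
    exit;
}
$file = basename(urldecode($_GET['file']));
$filePath = realpath("../../../data/csv/" . $file); // same directory as above

header("Content-Disposition: attachment; filename=\"$file\"");
header('Content-Type: text/csv');
header('X-Sendfile: ' . $filePath); // Apache replaces the response body with the file
exit;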
I am using the code below to stream a movie on my localhost. Movies smaller than 2 GB work fine, but a movie larger than 2 GB doesn't play.
Please help me: what should I do to read larger files?
This is the code I am using for streaming the movie:
<?php
// Determine file path according to extension
if (!isset($_GET['ext']) || $_GET['ext'] == 'mp4') {
    $path = 'movies/' . $_GET['movie'];
}

$finfo = new finfo(FILEINFO_MIME_TYPE);
$mime = $finfo->file($path);
header('Content-Type: ' . $mime);

$size = filesize($path);
if (isset($_SERVER['HTTP_RANGE'])) {
    list($specifier, $value) = explode('=', $_SERVER['HTTP_RANGE']);
    if ($specifier != 'bytes') {
        header('HTTP/1.1 400 Bad Request');
        return;
    }

    list($from, $to) = explode('-', $value);
    if (!$to) {
        $to = $size - 1;
    }

    $fp = fopen($path, 'rb');
    if (!$fp) {
        header('HTTP/1.1 500 Internal Server Error');
        return;
    }

    header('HTTP/1.1 206 Partial Content');
    header('Accept-Ranges: bytes');
    header('Content-Length: ' . ($to - $from));
    header("Content-Range: bytes {$from}-{$to}/{$size}");

    fseek($fp, $from);
    while (true) {
        if (ftell($fp) >= $to) {
            break;
        }
        echo fread($fp, 8192);
        // Flush the buffer
        ob_flush();
        flush();
    }
} else {
    header('Content-Length: ' . $size);
    readfile($path);
}
I have hosted the files on localhost using XAMPP as the server.
This could have different reasons.
fopen() does not by itself read the whole file into RAM, but filesize(), fseek() and ftell() work with PHP integers. On a 32-bit PHP build these are signed 32-bit values, so they overflow for files larger than 2 GB, which matches the threshold you are seeing. So first make sure you are running a 64-bit PHP build; note also that a 32-bit Windows system can only address about 4 GB of RAM.
If memory really is the bottleneck (for example because output buffering collects everything you echo), you can try to increase PHP's memory_limit. Open the xampp/php/php.ini file and change memory_limit to whatever you need, or set the ini value directly in your PHP file like this:
ini_set('memory_limit', '16M');
Hope this helps. For a more specific answer, it would be helpful to see any error message.
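A quick way to check which kind of build you are running, since a 32-bit build caps PHP integers at 2 GB (2^31 - 1), is this small sketch:

<?php
// Sketch: detect a 32-bit PHP build, where filesize()/fseek() break for files > 2 GB.
if (PHP_INT_SIZE < 8) {
    echo "32-bit PHP: PHP_INT_MAX = " . PHP_INT_MAX . "\n"; // 2147483647
} else {
    echo "64-bit PHP: integer size is not the problem\n";
}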
I need to log total downloads of a specific file. The download function works fine, but I can't tell whether the user canceled (clicking "Cancel" in the browser dialog) or whether the connection was aborted later.
I understand it's not simple to know when a file download has finished, so I'm trying to get this in two ways. Neither works:
Get the total bytes sent, to compare later with the total file size: this way the $bytes_sent var is always set to the total file size, no matter whether the user clicks the cancel button of the download dialog or cancels the download later.
Trigger the connection_aborted() function: I have not found a way to make this function fire and set my session var...
(I'm not sure whether the fact that I'm working with sessions is relevant.)
I appreciate your help :)
<?php
if (is_file($filepath)) {
    $handle = fopen($filepath, "r");
    header("Content-Type: $mime_type");
    header("Content-Length: " . filesize($filepath) . ";");
    header("Content-disposition: attachment; filename=" . $name);
    while (!feof($handle)) {
        ignore_user_abort(true);
        set_time_limit(0);
        $data = fread($handle, filesize($filepath));
        print $data;
        $_SESSION['download'] = 'Successful download';
        // Always set to the total file length, even when a large download is canceled before it finishes:
        $bytes_sent = ftell($handle);
        flush();
        ob_flush();
        // connection_aborted() never triggers, in any case:
        if (connection_aborted()) {
            $_SESSION['download'] = 'Canceled download';
        }
    }
}
PHP Version 5.3.29
You need to read the file in small chunks, rather than reading it all in at once.
$chunk_size = 1000;
ignore_user_abort(true); // must pass true: with no argument the setting is not changed
$canceled = false;
$bytes_sent = 0;
while ($chunk = fread($handle, $chunk_size)) {
    print $chunk;
    ob_flush();
    flush(); // actually push the chunk to the client, so connection_aborted() is meaningful
    $bytes_sent += strlen($chunk);
    if (connection_aborted()) {
        $canceled = true;
        break;
    }
}
$_SESSION['download'] = $canceled ? "Download canceled" : "Download successful";
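To also distinguish a complete transfer from a silent truncation, the byte counter can be compared with the file size once the loop ends (a small sketch, reusing $filepath from the question):

$complete = !$canceled && $bytes_sent === filesize($filepath);
$_SESSION['download'] = $complete ? 'Successful download' : 'Canceled download';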
I am creating a very large table in a single string $output with the data from a MySQL database. At the end of the PHP script, I send two headers:
header("Content-type: application/msexcel");
header("Content-Disposition: attachment; filename=action.xls");
The problem is when the large table has more than about 55,000 rows. When this happens, the file sent is 0 bytes and the user opens an empty file. I am sending the file after the headers like this:
echo $output;
When the table doesn't have too many rows, the sent file works. How can I send the file so that the size of the string $output doesn't matter?
header("Content-type: application/msexcel");
header("Content-Disposition: attachment; filename=action.xls");
Move these lines to the top of the page or script, before the HTML table is output, and it will start working.
The solution found in another question worked for this. The file is too large to send at once, so the solution is to send it to the user part by part. Instead of the readfile() function, use the customized readfile_chunked() function:
define('CHUNK_SIZE', 1024*1024); // Size (in bytes) of each chunk

// Read a file and display its content chunk by chunk
function readfile_chunked($filename, $retbytes = TRUE) {
    $buffer = '';
    $cnt = 0;
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        $buffer = fread($handle, CHUNK_SIZE);
        echo $buffer;
        ob_flush();
        flush();
        if ($retbytes) {
            $cnt += strlen($buffer);
        }
    }
    $status = fclose($handle);
    if ($retbytes && $status) {
        return $cnt; // return num. bytes delivered like readfile() does.
    }
    return $status;
}
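Note that readfile_chunked() streams from a file on disk. If the data only exists as the in-memory string $output from the question, the analogous approach is to echo it in slices; echo_chunked() below is a hypothetical helper, not from the original answer:

// Hypothetical helper: emit a large in-memory string in chunks.
function echo_chunked($output, $chunkSize = 1024 * 1024) {
    $len = strlen($output);
    for ($offset = 0; $offset < $len; $offset += $chunkSize) {
        echo substr($output, $offset, $chunkSize);
        if (ob_get_level() > 0) {
            ob_flush(); // flush PHP's output buffer if one is active
        }
        flush(); // flush the SAPI / web server buffer
    }
}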
I am using a variation of the familiar readfile_chunked() in an attempt to download larger files:
function readfile_chunked($filename)
{
    $chunk_size = 1 * (1024 * 1024); // how many bytes per chunk
    $buffer = '';
    $handle = fopen($filename, 'rb');
    if ($handle === false)
    {
        return false;
    }
    while (!feof($handle))
    {
        $buffer = fread($handle, $chunk_size);
        print $buffer;
        ob_flush();
        flush();
        sleep(1);
    }
    $status = fclose($handle);
    return $status;
}
Smaller files work fine, but this larger file is missing the last 2830 bytes.
I found out the issue. In the php.ini file, make sure implicit_flush is set to On. I still have the explicit flush code after each chunk is output, however.
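For reference, the directive can be set in php.ini or switched on at runtime; a small sketch:

// php.ini:
//   implicit_flush = On

// or at runtime:
ini_set('implicit_flush', '1');
ob_implicit_flush(true); // flush the output buffer automatically after every output call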