I need to log the total downloads of a specific file. The download function works fine, but I can't tell whether the user canceled (by clicking "Cancel" in the browser's download dialog) or whether the connection was aborted later.
I understand it's not simple to know when a file download has finished, so I'm trying to detect it in two ways. Neither works:
Get the total bytes sent and later compare it with the total file size: the $bytes_sent var is always set to the total file size, no matter whether the user clicks the cancel button in the download dialog or cancels the download later.
Trigger the connection_aborted() function: I haven't found a way to make this function fire and set my session var...
(I'm not sure whether the fact that I'm working with sessions is relevant.)
I appreciate your help :)
<?php
if (is_file($filepath)) {
    $handle = fopen($filepath, "r");
    header("Content-Type: $mime_type");
    header("Content-Length: " . filesize($filepath) . ";");
    header("Content-Disposition: attachment; filename=" . $name);
    while (!feof($handle)) {
        ignore_user_abort(true);
        set_time_limit(0);
        $data = fread($handle, filesize($filepath));
        print $data;
        $_SESSION['download'] = 'Successful download';
        // Always set to the total file length, even when a large file download is canceled before it finishes:
        $bytes_sent = ftell($handle);
        flush();
        ob_flush();
        // Can't trigger connection_aborted() in any case:
        if (connection_aborted()) {
            $_SESSION['download'] = 'Canceled download';
        }
    }
}
PHP Version 5.3.29
You need to read the file in small chunks, rather than reading it all in at once.
$chunk_size = 1000;
ignore_user_abort(true);
$canceled = false;
$bytes_sent = 0;
while ($chunk = fread($handle, $chunk_size)) {
    print $chunk;
    ob_flush();
    flush();
    $bytes_sent += strlen($chunk);
    if (connection_aborted()) {
        $canceled = true;
        break;
    }
}
$_SESSION['download'] = $canceled ? "Download canceled" : "Download successful";
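For context, a fuller sketch of how this might be wired together with the headers from the question (the 8 KB chunk size and the final comparison against filesize() are my own additions, not part of the original code):

<?php
// Sketch only: chunked download with abort detection.
// Assumes $filepath, $mime_type, $name and an active session, as in the question.
if (is_file($filepath)) {
    ignore_user_abort(true);
    set_time_limit(0);

    header("Content-Type: $mime_type");
    header("Content-Length: " . filesize($filepath));
    header("Content-Disposition: attachment; filename=" . $name);

    $handle = fopen($filepath, "rb");
    $bytes_sent = 0;
    $canceled = false;

    while (!feof($handle)) {
        print fread($handle, 8192);   // small chunks so connection_aborted() gets a chance to fire
        $bytes_sent = ftell($handle); // bytes read so far; close to what was sent when flushing per chunk
        @ob_flush();
        flush();
        if (connection_aborted()) {
            $canceled = true;
            break;
        }
    }
    fclose($handle);

    // Either check the flag or compare $bytes_sent against filesize($filepath).
    $_SESSION['download'] = $canceled ? 'Canceled download' : 'Successful download';
}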
I'm currently having a problem where my download always stops at almost exactly one gigabyte. I have created a PHP download script that checks the user's permissions before downloading and prevents the direct URL from being copied. For this I use fread() to pass the data through PHP. However, I have no idea why the download always stops at exactly one gigabyte. The file is about twice as large.
Could someone take a look at this?
// ... Permission check, get download url, etc. ...
try
{
    define('CHUNK_SIZE', 8 * 1024);

    function readfile_chunked($filename, $retbytes = true)
    {
        $buffer = '';
        $cnt = 0;
        $handle = fopen($filename, 'rb');
        if ($handle === false)
        {
            return false;
        }
        while (!feof($handle))
        {
            $buffer = fread($handle, CHUNK_SIZE);
            echo $buffer;
            if (ob_get_level() > 0)
            {
                ob_flush();
                flush();
            }
            if ($retbytes)
            {
                $cnt += strlen($buffer);
            }
        }
        $status = fclose($handle);
        if ($retbytes && $status)
        {
            return $cnt;
        }
        return $status;
    }

    $mimetype = 'mime/type';
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename=' . $download->url);
    readfile_chunked($download->url);
}
catch (Exception $e)
{
    echo "Error while reading the file";
    exit;
}
Update:
I have now tested the script on another server. Here the download sometimes terminated after 300 MB and sometimes after 1.3 GB. I then defined the following settings:
ini_set('display_errors', 0);
ini_set('error_reporting', E_ALL);
ini_set("log_errors", 1);
ini_set("error_log", "error.log");
ini_set('memory_limit', '-1');
ini_set('max_execution_time', 7200);
On the other server, the script has now run through several times without any problems. The download was complete and the file was error-free. However, I copied these settings to the original system and the download still stops. I downloaded the file three times in a row. The file size varies slightly:
1.057.983 KB
1.056.192 KB
1.056.776 KB
I'm running out of ideas. What else could that be? They are both Apache web servers.
There are also no noticeable entries in the log.
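One way to narrow this down (just a suggestion, with a placeholder log path) is to capture how many bytes readfile_chunked() actually delivered and whether the client aborted, instead of calling it bare:

// Diagnostic sketch, not part of the original script:
$cnt = readfile_chunked($download->url);
error_log(
    sprintf(
        "download ended: bytes=%s aborted=%d max_execution_time=%s\n",
        var_export($cnt, true),
        connection_aborted(),
        ini_get('max_execution_time')
    ),
    3,
    __DIR__ . '/download-debug.log' // placeholder log file
);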
I am building a very large table in a single string, $output, with data from a MySQL database. At the end of the PHP script, I send two headers:
header("Content-type: application/msexcel");
header("Content-Disposition: attachment; filename=action.xls");
The problem occurs when the table has more than about 55000 rows. When this happens, the file sent is 0 bytes and the user opens an empty file. After the headers, I send the file with:
echo $output;
When the table doesn't have too many rows, the file is sent correctly. How can I send the file so that the size of the string $output doesn't matter?
header("Content-type: application/msexcel");
header("Content-Disposition: attachment; filename=action.xls");
Move these two lines to the top of the page or script, before the HTML table, and it will start working.
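In other words, the headers must be sent before any HTML or other output reaches the browser, roughly like this (the loop building $output is just a placeholder for the real query code):

<?php
// Send the headers first, before any output at all.
header("Content-type: application/msexcel");
header("Content-Disposition: attachment; filename=action.xls");

// Placeholder: build the table from the already-fetched rows.
$output = "<table>";
foreach ($rows as $row) {
    $output .= "<tr><td>" . htmlspecialchars($row['value']) . "</td></tr>";
}
$output .= "</table>";

echo $output;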
The solution found in another question worked for this. The file is too large to send in one piece, so the solution is to send it to the user part by part. Instead of the readfile() function, use the customized readfile_chunked() function:
define('CHUNK_SIZE', 1024*1024); // Size (in bytes) of each chunk

// Read a file and display its content chunk by chunk
function readfile_chunked($filename, $retbytes = TRUE) {
    $buffer = '';
    $cnt = 0;
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        $buffer = fread($handle, CHUNK_SIZE);
        echo $buffer;
        ob_flush();
        flush();
        if ($retbytes) {
            $cnt += strlen($buffer);
        }
    }
    $status = fclose($handle);
    if ($retbytes && $status) {
        return $cnt; // return num. bytes delivered like readfile() does.
    }
    return $status;
}
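Since readfile_chunked() streams a file from disk, the generated $output string would first have to be written to a temporary file; that temp-file step is my own addition, not part of the original answer:

header("Content-type: application/msexcel");
header("Content-Disposition: attachment; filename=action.xls");

// Write the generated table to a temporary file, then stream it in chunks.
$tmp = tempnam(sys_get_temp_dir(), 'xls');
file_put_contents($tmp, $output);
readfile_chunked($tmp);
unlink($tmp);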
I'm trying to serve an mp4 video from a PHP script. It works without problems in all browsers, but when I try it with Safari the video plays only with sound (without image).
The problem happens only with Safari and IIS; I tried Safari with Apache and it works fine.
If I view the video directly from the server (without the PHP script), it works perfectly.
The code of the PHP script:
public function video() {
    $params = Manager::getParams();
    if (array_key_exists(2, $params)) {
        $filename = $params[2];
        $path = VIDEOS_FOLDER . $filename;
        $mime = 'video/mp4';
        $this->get_file($path, $mime);
    } else {
        echo "Incorrect params";
    }
}

private function get_file($path, $mime) {
    // Clears the cache and prevents unwanted output
    ob_clean();
    session_write_close();
    #ini_set('error_reporting', E_ALL & ~ E_NOTICE);
    //#apache_setenv('no-gzip', 1);
    #ini_set('zlib.output_compression', 'Off');

    $size = filesize($path); // The size of the file

    // Send the content type header
    header('Content-type: ' . $mime);

    // Check if it's an HTTP range request
    if (isset($_SERVER['HTTP_RANGE'])) {
        // Parse the range header to get the byte offsets
        $ranges = array_map(
            'intval', // Parse the parts into integers
            explode('-', // The range separator
                substr($_SERVER['HTTP_RANGE'], 6) // Skip the `bytes=` part of the header
            )
        );

        // If the last range param is empty, it means the EOF (End of File)
        if (!$ranges[1]) {
            $ranges[1] = $size - 1;
        }

        // Send the appropriate headers
        header('HTTP/1.1 206 Partial Content');
        header('Accept-Ranges: bytes');
        header('Content-Length: ' . ($ranges[1] - $ranges[0])); // The size of the range

        // Send the range we offered
        header(
            sprintf(
                'Content-Range: bytes %d-%d/%d', // The header format
                $ranges[0], // The start range
                $ranges[1], // The end range
                $size // Total size of the file
            )
        );

        // It's time to output the file
        $f = fopen($path, 'rb'); // Open the file in binary mode
        $chunkSize = 8192; // The size of each chunk to output

        // Seek to the requested start range
        fseek($f, $ranges[0]);

        // Start outputting the data
        while (true) {
            // Check if we have output all the data requested
            if (ftell($f) >= $ranges[1]) {
                break;
            }
            // Output the data
            echo fread($f, $chunkSize);
            // Flush the buffer immediately
            #ob_flush();
            flush();
        }
    } else {
        // It's not a range request, output the file anyway
        header('Content-Length: ' . $size);
        // Read the file
        #readfile($path);
        // and flush the buffer
        #ob_flush();
        flush();
    }
}
My problem is that when I play the video in Safari with the IIS server, it only plays the sound.
I added the following to the web.config:
<staticContent>
<mimeMap fileExtension=".mp4" mimeType="video/mp4" />
</staticContent>
Does anyone know how to fix it?
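For reference, here is how the parsing in get_file() treats a typical Safari probe request (this snippet is only an illustration, not part of the question):

// Safari typically starts playback with a small range probe like "Range: bytes=0-1".
$_SERVER['HTTP_RANGE'] = 'bytes=0-1';
$ranges = array_map('intval', explode('-', substr($_SERVER['HTTP_RANGE'], 6)));
var_dump($ranges); // [0, 1]: start byte 0, end byte 1
// The script above would then answer with "HTTP/1.1 206 Partial Content"
// and "Content-Range: bytes 0-1/<filesize>".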
I'm having some trouble with this one. I have found some helpful scripts on the web and have been modifying them for my needs. However, I can't seem to download a file: the server responds with the contents of the file, but the browser doesn't download it. I am using Polymer 1.0+ for my client side and PHP for my server side. The client-side code to download a file is as follows:
<!-- THIS IS THE HTML SIDE -->
<iron-ajax
    id="ajaxDownloadItem"
    url="../../../dropFilesBackend/index.php/main/DownloadItem"
    method="GET"
    handle-as="document"
    last-response="{{downloadResponse}}"
    on-response="ajaxDownloadItemResponse">
</iron-ajax>

// THIS IS THE JAVASCRIPT THAT WILL CALL THE "iron-ajax" ELEMENT
downloadItem: function(e) {
    this.$.ajaxDownloadItem.params = {"FILENAME": this.selectedItem.FILENAME,
                                      "PATH": this.folder};
    this.$.ajaxDownloadItem.generateRequest();
},
The server side code is as follows (the url is different because I do some url modification to get to the correct script):
function actionDownloadItem() {
    valRequestMethodGet();
    $username = $_SESSION['USERNAME'];
    if (validateLoggedIn($username)) {
        $itemName = arrayGet($_GET, "FILENAME");
        $path = arrayGet($_GET, "PATH");
        $username = $_SESSION['USERNAME'];
        $downloadItem = CoreFilePath();
        $downloadItem .= "/" . $_SESSION['USERNAME'] . "" . $path . "" . $itemName;
        DownloadFile($downloadItem);
    }
    else {
        echo "Not Logged In.";
    }
}

function DownloadFile($filePath) {
    //ignore_user_abort(true);
    set_time_limit(0); // disable the time limit for this script
    //touch($filePath);
    //chmod($filePath, 0775);
    if ($fd = fopen($filePath, "r")) {
        $fsize = filesize($filePath);                // this returns 12
        $path_parts = pathinfo($filePath);           // basename = textfile.txt
        $ext = strtolower($path_parts["extension"]); // this returns txt
        $header = headerMimeType($ext);              // this returns text/plain
        header('Content-disposition: attachment; filename="' . $path_parts["basename"] . '"'); // use 'attachment' to force a file download
        header("Content-type: $header");
        header("Content-length: $fsize");
        header("Cache-control: private"); // use this to open files directly
        while (!feof($fd)) {
            $buffer = fread($fd, 2048);
            echo $buffer;
        }
        fclose($fd);
    }
}
Any help on this one would be greatly appreciated.
First you will need a file handle for the destination:
$pathToSave = '/home/something/something.txt';
$writeHandle = fopen($pathToSave, 'wb');
Then, while you are reading the download, write to the file instead of echoing
fwrite($writeHandle, fread($fd, 2048));
Finally, after the writing has finished, close the handle:
fclose($writeHandle);
I've omitted error checking; you should implement your own.
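Putting the three pieces together, a minimal sketch might look like this (the destination path is a placeholder, and note that this saves the file on the server rather than sending it to the browser):

// Sketch: copy the opened file to another location on the server.
$pathToSave = '/home/something/something.txt'; // placeholder destination

if ($fd = fopen($filePath, "r")) {
    $writeHandle = fopen($pathToSave, 'wb');
    while (!feof($fd)) {
        fwrite($writeHandle, fread($fd, 2048));
    }
    fclose($writeHandle);
    fclose($fd);
}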
I'm developing a quick rapidshare-like site where the user can download files. First, I created a quick test setting headers and using readfile(), but then I found in the comments section that there's a way to limit the download speed, which is great. Here's the code:
$local_file = 'file.zip';
$download_file = 'name.zip';

// set the download rate limit (=> 20.5 KB/s)
$download_rate = 20.5;

if (file_exists($local_file) && is_file($local_file))
{
    header('Cache-control: private');
    header('Content-Type: application/octet-stream');
    header('Content-Length: ' . filesize($local_file));
    header('Content-Disposition: filename=' . $download_file);
    flush();

    $file = fopen($local_file, "r");
    while (!feof($file))
    {
        // send the current file part to the browser
        print fread($file, round($download_rate * 1024));
        // flush the content to the browser
        flush();
        // sleep one second
        sleep(1);
    }
    fclose($file);
}
else {
    die('Error: The file ' . $local_file . ' does not exist!');
}
But now my question is: how do I limit the number of simultaneous downloads? How can I check whether there's still an open connection from a given user's IP?
Thanks.
Does the user have a login? If not, just use sessions, or even better, track their IP address.
Here's a sessions example:
$_SESSION['file_downloading'] = true;

$file = fopen($local_file, "r");
while (!feof($file))
{
    // send the current file part to the browser
    print fread($file, round($download_rate * 1024));
    // flush the content to the browser
    flush();
    // sleep one second
    sleep(1);
}

$_SESSION['file_downloading'] = null;
fclose($file);
Then, above all of this code:
if (!empty($_SESSION['file_downloading'])) {
    // perform a redirect or reduce their download rate or something.
}
The next option is via IP address.
// http://wiki.jumba.com.au/wiki/PHP_Get_user_IP_Address
function VisitorIP()
{
    if (isset($_SERVER['HTTP_X_FORWARDED_FOR']))
        $TheIp = $_SERVER['HTTP_X_FORWARDED_FOR'];
    else
        $TheIp = $_SERVER['REMOTE_ADDR'];
    return trim($TheIp);
}
Get the visitor's IP address and store it in the database along with a datetime stamp. Then simply remove that IP address when the file has finished downloading. Are you using a database system?
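A rough sketch of that idea, assuming PDO and a hypothetical active_downloads table (both assumptions on my part, not from the answer):

// Sketch: limit concurrent downloads per IP using a hypothetical `active_downloads` table.
$ip = VisitorIP();
$max_per_ip = 2; // assumed limit

$stmt = $pdo->prepare("SELECT COUNT(*) FROM active_downloads WHERE ip = ?");
$stmt->execute(array($ip));
if ($stmt->fetchColumn() >= $max_per_ip) {
    die('Too many simultaneous downloads from your IP.');
}

$pdo->prepare("INSERT INTO active_downloads (ip, started_at) VALUES (?, NOW())")
    ->execute(array($ip));

// ... the rate-limited download loop from above ...

$pdo->prepare("DELETE FROM active_downloads WHERE ip = ?")->execute(array($ip));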