How can I make a file download slower? - PHP

This is the final download page of my website, where the general public can download government documents. On the server, my code reads the to-be-downloaded file and sends it to the client browser in a loop:
$fp = fopen($file, "rb");
while (!feof($fp)) {
    echo fread($fp, 65536);
    flush(); // this is essential for large downloads
}
fclose($fp);
exit;
I want to send the file very slowly. Can I use sleep() (or something like it) within this loop, and for how long at most, without causing the user's browser to time out?
The goal is that the user gets sufficient time to read the ads displayed on the page while he/she waits for the file download to finish.
Also, I'm not proficient with the PHP environment.
(Please forgive me the morality/immorality of this.)

Try this approach: http://bytes.com/topic/php/answers/341922-using-php-limit-download-speed
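In case that link goes stale: the general idea there, sleeping between fixed-size chunks, looks roughly like the sketch below. The chunk size and the 100 KB/s target rate are illustrative, not taken from the linked thread.

// Minimal throttling sketch: send fixed-size chunks and sleep between
// them so the average rate stays near $rateBytesPerSec.
function send_file_throttled($file, $rateBytesPerSec = 100 * 1024) {
    set_time_limit(0); // don't let PHP kill the slow transfer
    $chunk = 8192;     // 8 KB per write
    $fp = fopen($file, 'rb');
    while (!feof($fp)) {
        echo fread($fp, $chunk);
        flush();
        // sleep so that $chunk bytes per iteration matches the target rate
        usleep((int)($chunk / $rateBytesPerSec * 1000000));
    }
    fclose($fp);
}

Browsers generally won't time out as long as data keeps trickling in, so the per-chunk sleep (80 ms here) matters more than the total transfer time.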

You can use bandwidth sharing if you're willing to do this at the Apache level.
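The answer above doesn't name a module, but as one concrete example, stock Apache 2.4 ships mod_ratelimit; a config block like this (the path and the 100 KiB/s limit are illustrative) caps the transfer rate for a downloads directory:

<IfModule mod_ratelimit.c>
    <Location "/downloads">
        SetOutputFilter RATE_LIMIT
        SetEnv rate-limit 100
    </Location>
</IfModule>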

Related

Download abuse with php Content-Disposition: attachment and readfile

I'm having a download-abuse issue with PHP's Content-Disposition: attachment and readfile. It seems that my problem is with readfile: although this script works, readfile reads the entire contents of the MP4 whether or not the client closes their browser, which opens the door to abuse by scripts that initiate the download and immediately cancel it. Something, somewhere, is running a script which clicks this link hundreds of times per second, running my PHP script and immediately cancelling the download, but my server prepares that entire file to be sent each time.
Here's the script I'm running, when the user/abuser clicks a download link:
<?php
// get MP4 address
// NB: passing user input straight to readfile() lets a client read
// arbitrary files; validate/whitelist this in real code
$MP4Address = $_GET["MP4Address"];
// Force a download with a generic binary content type
header('Content-Type: application/octet-stream');
$filename = basename($MP4Address);
// Name the file for the browser's save dialog
header('Content-Disposition: attachment; filename="'.$filename.'"');
// Stream the source file to the client
readfile($MP4Address);
?>
I suspect that readfile is the culprit here, but without it, the client will receive an empty file. There must be a more modern, proper way of doing this, but I'm not sure what it could be.
Unless you've called ignore_user_abort(true), PHP should get the signal that the connection has been aborted and cease execution. But it's possible that once you're inside the readfile() call, PHP is not able to watch for that signal since it's busy doing low-level IO.
I would say you've got two options, the first being to simply write some code to detect and block the person that's abusing your service (see the sketch after the code below). Your downloads are already backed by a PHP script, so adding in a bit of checking and filtering should be relatively simple.
The other would be to replace the readfile() call with a bit of [admittedly less efficient] code that should give PHP some breathing space to look for user aborts:
function read_file($filename, $chunksize = 4096) {
    if (!$fh = fopen($filename, 'rb')) {
        throw new \Exception('Failed to open file');
    }
    // Echo the file in small chunks; each flushed write gives PHP a
    // chance to notice that the client has gone away.
    while ($chunk = fread($fh, $chunksize)) {
        echo $chunk;
        flush();
    }
    fclose($fh);
}
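For the first option, here is a minimal per-IP throttle sketch, assuming the APCu extension is available; the key prefix and the limits are illustrative:

// Block an IP that starts more than $maxStarts downloads per minute.
function too_many_downloads($maxStarts = 10) {
    $key = 'dl_' . $_SERVER['REMOTE_ADDR'];
    $count = apcu_fetch($key);
    if ($count === false) {
        apcu_store($key, 1, 60); // first hit; counter expires after 60 s
        return false;
    }
    apcu_inc($key);
    return $count >= $maxStarts;
}

if (too_many_downloads()) {
    http_response_code(429); // Too Many Requests
    exit;
}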

How to track currently downloaded files from server?

I'm using shared hosting (HostGator).
I have a site with video content, like YouTube, written in PHP.
It is implemented via direct links to MP4 files and the HTML video tag.
I want to limit connections for file downloads (plays) to around 350,
because if I have more than ~350 connections HostGator blocks my site.
Is there any way to do that?
Any other suggestions how to deal with this situation will also be helpful.
You could use a PHP script which handles the actual file download. If the script is executed, increment your download counter, and if the file is sent completely to the client, close the connection.
To detect if the file has been sent completely, send the file in small chunks and check after each transmitted chunk if the connection is still open.
To do this:
send the correct MIME types and HTTP headers
use ignore_user_abort to keep the script running if the client closes the connection
send the file in small chunks and check after each chunk if the connection is still alive; ob_flush and flush are used to keep the output buffer empty, and connection_status or connection_aborted to test if the connection is still open
after the whole file is submitted, decrement your connection counter
In addition to this, you might also implement HTTP_RANGE, to resume incomplete downloads. This should be especially important for video downloads, where you may want to seek somewhere in the middle of the stream.
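A minimal single-range sketch of that, reusing the $f and $size variables from the PHP file shown further below (illustrative only; a production implementation should validate the header more strictly):

// Honour a simple "Range: bytes=N-" request header (sketch only).
header('Accept-Ranges: bytes');
$start = 0;
if (isset($_SERVER['HTTP_RANGE']) &&
    preg_match('/bytes=(\d+)-/', $_SERVER['HTTP_RANGE'], $m)) {
    $start = (int)$m[1];
    header('HTTP/1.1 206 Partial Content');
    header("Content-Range: bytes $start-" . ($size - 1) . "/$size");
    header('Content-Length: ' . ($size - $start));
    fseek($f, $start);
}

When a range is requested, the Content-Length here replaces the plain Content-Length header in the script below.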
Below is a little .htaccess that rewrites all requests to the PHP file.
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteRule ^ yourFile.php [L]
</IfModule>
Below is the PHP file:
// code to increment the counter..
// increment_counter(); ...
// Use the request path (URI) to choose what file to send.
$filename = 'video.mp4';
$size = filesize($filename);
$f = fopen($filename, 'rb');
if (!$f) {
    // error...
}
ignore_user_abort(true);
set_time_limit(0);
header("Content-Length: $size");
header("Content-Type: video/mp4");
while (!feof($f)) {
    echo fread($f, 8192);
    ob_flush();
    flush();
    if (connection_status() != 0) {
        // download aborted... stop sending
        break;
    }
}
fclose($f);
// download completed or aborted - decrement the counter exactly once
// decrement_counter(); ...
This script is pretty simple but should give you an idea. You might add more logic (as said above, HTTP_RANGE) or send other headers, but this should give you a good starting point.
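As for increment_counter() / decrement_counter(), a minimal file-based sketch that works on shared hosting, where you usually can't install extensions (the file name is illustrative):

// Keep the connection count in a small file, guarded by flock() so
// concurrent requests don't race. Returns the new count.
function change_counter($delta, $file = 'connections.cnt') {
    $fh = fopen($file, 'c+');
    flock($fh, LOCK_EX);
    $count = max(0, (int)stream_get_contents($fh) + $delta);
    ftruncate($fh, 0);
    rewind($fh);
    fwrite($fh, (string)$count);
    flock($fh, LOCK_UN);
    fclose($fh);
    return $count;
}

// Refuse new downloads above the ~350 limit:
if (change_counter(+1) > 350) {
    change_counter(-1);
    http_response_code(503); // too busy right now
    exit;
}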
References:
Below the links to the documentation of the functions that could be less known.
connection_status
ignore_user_abort

PHP readfile on a file which is increasing in size

Is it possible to use PHP readfile function on a remote file whose size is unknown and is increasing in size? Here is the scenario:
I'm developing a script which downloads a video from a third-party website and simultaneously transcodes the video into MP3 format. This MP3 is then transferred to the user via readfile.
The query used for the above process is like this:
wget -q -O- "VideoURLHere" | ffmpeg -i - "Output.mp3" > /dev/null 2>&1 &
So the file is fetched and encoded at the same time.
Now when the above process is in progress, I begin sending the output MP3 to the user via readfile. The problem is that the encoding process takes some time, and therefore, depending on the user's download speed, readfile reaches an assumed EOF before the whole file is encoded, resulting in the user receiving partial content/incomplete files.
My first attempt to fix this was to apply a speed limit on the user's download, but this is not foolproof, as the encoding time and speed vary with load, and this still led to partial downloads.
So is there a way to implement this system in such a way that I can serve the downloads simultaneously along with the encoding and also guarantee sending the complete file to the end user?
Any help is appreciated.
EDIT:
In response to Peter, I'm actually using fread (read: readfile_chunked):
<?php
function readfile_chunked($filename, $retbytes = true) {
    $chunksize = 1 * (1024 * 1024); // how many bytes per chunk
    $cnt = 0;
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        //usleep(120000); //Used to impose an artificial speed limit
        $buffer = fread($handle, $chunksize);
        echo $buffer;
        ob_flush();
        flush();
        if ($retbytes) {
            $cnt += strlen($buffer);
        }
    }
    $status = fclose($handle);
    if ($retbytes && $status) {
        return $cnt; // return num. bytes delivered like readfile() does.
    }
    return $status;
}
readfile_chunked($linkToMp3);
?>
This still does not guarantee complete downloads, as depending on the user's download speed and the encoding speed, EOF may be reached prematurely.
Also, in response to theJeztah's comment, I'm trying to achieve this without making the user wait... so that's not an option.
Since you are dealing with streams, you probably should use stream handling functions :). passthru comes to mind, although this will only work if the download | transcode command is started in your script.
If it is started externally, take a look at stream_get_contents.
Libevent as mentioned by Evert seems like the general solution where you have to use a file as a buffer. However in your case, you could do it all inline in your script without using a file as a buffer:
<?php
header("Content-Type: audio/mpeg");
passthru("wget -q -O- http://localhost/test.avi | ffmpeg -i - -f mp3 -");
?>
I don't think there's any way of being notified about there being new data, short of something like inotify.
I suggest that if you hit EOF, you start polling the modification time of the file (using clearstatcache() between calls) every 200 ms or so. When you find the file size has increased, you can reopen the file, seek to the last position and continue.
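A sketch of that poll-and-reopen loop, with one illustrative assumption that is not from the original post: the encoder drops a sentinel file "$filename.done" when ffmpeg has finished writing.

function stream_growing_file($filename, $chunksize = 65536) {
    $pos = 0;
    while (true) {
        clearstatcache(); // don't trust cached filesize/file_exists results
        $size = @filesize($filename);
        if ($size !== false && $size > $pos) {
            // New data arrived: reopen, seek to where we left off, send it.
            $fh = fopen($filename, 'rb');
            fseek($fh, $pos);
            while (!feof($fh)) {
                echo fread($fh, $chunksize);
                flush();
            }
            $pos = ftell($fh);
            fclose($fh);
        } elseif (file_exists($filename . '.done')) {
            break; // encoder is finished and nothing new remains
        } else {
            usleep(200000); // poll every 200 ms, as suggested above
        }
    }
}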
I can highly recommend using libevent for applications like this.
It works perfectly for cases like this.
The PHP documentation is a bit sparse for this, but you should be able to find more solid examples around the web.

Keep track of download progress in PHP

All,
I have built a form; if the user fills in the form, my code can determine what (large) files need to be downloaded by that user. I display each file with a download button, and I want to show the status of the download of that file next to the button (download in progress, cancelled/aborted, or completed). Because the files are large (200 MB+), I use a well-known function to read the file to be downloaded in chunks.
My idea was to log the reading of each (100 kbyte) chunk in a database, so I can keep track of the progress of the download.
My code is below:
function readfile_chunked($filename, $retbytes = TRUE)
{
    // Read the personID from the cookie
    $PersonID = $_COOKIE['PERSONID'];
    // Setup connection to database for logging download progress per file
    include "config.php";
    $SqlConnectionInfo = array("UID" => $SqlServerUser, "PWD" => $SqlServerPass, "Database" => $SqlServerDatabase);
    $SqlConnection = sqlsrv_connect($SqlServer, $SqlConnectionInfo);
    $ChunckSize = 100 * 1024; // 100 kbyte buffer
    $buffer = '';
    $cnt = 0;
    $handle = fopen($filename, 'rb');
    if ($handle === false)
    {
        return false;
    }
    $BytesSent = 0;
    while (!feof($handle))
    {
        $buffer = fread($handle, $ChunckSize);
        $BytesSent = $BytesSent + $ChunckSize;
        // update database
        $Query = "UPDATE [SoftwareDownload].[dbo].[DownloadProgress] SET LastTimeStamp = '" . time() . "' WHERE PersonID='$PersonID' AND FileName='$filename'";
        $QueryResult = sqlsrv_query($SqlConnection, $Query);
        echo $buffer;
        ob_flush();
        flush();
        if ($retbytes)
        {
            $cnt += strlen($buffer);
        }
    }
    $status = fclose($handle);
    if ($retbytes && $status)
    {
        return $cnt; // return num. bytes delivered like readfile() does.
    }
    return $status;
}
The weird thing is that my download runs smoothly at a constant rate, but my database gets updated in spikes: sometimes I see a couple of dozen database updates (so a couple of dozen 100-kilobyte blocks) within a second; the next moment I see no database updates (so no 100 k blocks being fed by PHP) for up to a minute (this timeout depends on the bandwidth available for the download). The fact that no progress is written for a minute makes my code believe that the download was aborted.
In the meantime I see the memory usage of IIS increasing, so I get the idea that IIS buffers the chunks of data that PHP delivers.
My webserver is running Windows 2008 R2 Datacenter (64-bit), IIS 7.5 and PHP 5.3.8 in FastCGI mode. Is there anything I can do (in my code, in my PHP config or in the IIS config) to prevent IIS from caching the data that PHP delivers, or is there any way to see in PHP if the data generated was actually delivered to the downloading client?
Thanks in advance!
As no useful answers came up, I've created a really dodgy workaround by using Apache strictly for the download of the file(s). The whole form, CSS, etc. is still delivered through IIS; just the download links are served by Apache (a reverse proxy sits in between, so users don't have to fiddle with port numbers; the proxy takes care of this).
Apache doesn't cache PHP output, so the timestamps I get in my database are reliable.
I've asked a similar question once, and got very good results with the answers there.
Basically, you use an APC cache to cache a variable, then retrieve it on a different page with an Ajax call from the client.
Another possibility is to do the same with a database. Write the progress to a database and have the client test for that (let's say every second) using an Ajax call.
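A minimal sketch of the APC variant (the key name is illustrative; with the newer APCu extension the calls are apcu_store/apcu_fetch):

// Writer side, inside the download loop of readfile_chunked():
// publish the byte count under a per-user key, with a 300 s TTL so
// stale entries expire on their own.
apc_store('dlprogress_' . $PersonID, $cnt, 300);

// Reader side (progress.php), polled by the client via Ajax:
header('Content-Type: application/json');
$bytes = apc_fetch('dlprogress_' . $_COOKIE['PERSONID']);
echo json_encode(array('bytes' => ($bytes === false) ? 0 : $bytes));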

Download Status with PHP and JavaScript

I'm currently looking into a way of showing the file download status on a page.
I know this isn't needed, since the user usually has a download status in the browser, but I would like to keep the user on the page he is downloading from for as long as the download lasts. To do that, the download status should match the status the file actually has (not a fake progress bar). Maybe it will also display the speed the user is downloading at, and estimate the time it will take, depending on the current download rate.
Can this be done using PHP and JavaScript? Or does it really require Flash or Java?
Shouldn't there be information somewhere on the server about who is downloading what, at what speed, and how much?
Thank you for your help in advance.
Not really possible cross-browser, but have a look into http://markmail.org/message/kmrpk7w3h56tidxs#query:jquery%20ajax%20download%20progress+page:1+mid:kmrpk7w3h56tidxs+state:results for a pretty close effort. IE (as usual) is the main culprit for not playing ball.
You can do it with two separate PHP files; the first file handles the downloading process.
Like this:
$strtTime = time();
$download_rate = 120; // download rate in KB/s
$totalDw = 0;
$fp = fopen($real, "r");
flush(); // Flush headers
while (!feof($fp)) {
    $downloaded = round($download_rate * 1024);
    echo fread($fp, $downloaded);
    ob_flush();
    flush();
    if (connection_aborted()) {
        // unlink("yourtempFile.txt");
        exit;
    }
    $totalDw += $downloaded;
    // file_put_contents("yourtempFile.txt", "downloaded: $totalDw ; StartTime:$strtTime");
    sleep(1);
}
fclose($fp);
// unlink("yourtempFile.txt");
The second file would be used for reading yourtempFile.txt via Ajax continuously. Sessions and cookies can't be used here, because output has already started.
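A minimal sketch of that second file, assuming the commented-out file_put_contents line above is enabled and the same illustrative temp-file name:

<?php
// progress.php: polled via Ajax; parses the status line written by the
// download script: "downloaded: N ; StartTime:T"
header('Content-Type: application/json');
$raw = @file_get_contents('yourtempFile.txt');
if ($raw !== false &&
    preg_match('/downloaded:\s*(\d+)\s*;\s*StartTime:(\d+)/', $raw, $m)) {
    $bytes   = (int)$m[1];
    $elapsed = max(1, time() - (int)$m[2]);
    echo json_encode(array(
        'bytes'       => $bytes,
        'bytesPerSec' => (int)round($bytes / $elapsed),
    ));
} else {
    echo json_encode(array('status' => 'not started'));
}
?>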
