Download Status with PHP and JavaScript

I'm currently looking into a way of showing the file download status on a page.
I know this isn't needed, since the user usually has a download status in the browser, but I would like to keep the user on the page he is downloading from for as long as the download lasts. To do that, the download status should match the status the file actually has (not a fake progress bar). Ideally it would also display the speed the user is downloading at and estimate the remaining time, based on the current download rate.
Can this be done using PHP and JavaScript? Or does it really require Flash or Java?
Shouldn't there be information somewhere on the server about who is downloading what, at what speed, and how much?
Thank you for your help in advance.

Not really possible cross-browser, but have a look into http://markmail.org/message/kmrpk7w3h56tidxs#query:jquery%20ajax%20download%20progress+page:1+mid:kmrpk7w3h56tidxs+state:results for a pretty close effort. IE (as usual) is the main culprit for not playing ball.

You can do it with two separate PHP files: the first file handles the download process, like this:
$strtTime = time();
$download_rate = 120; // download rate in KB/s
$totalDw = 0;
$fp = fopen($real, "r"); // $real holds the path of the file being served
flush(); // flush headers
while (!feof($fp)) {
    $downloaded = round($download_rate * 1024);
    echo fread($fp, $downloaded);
    ob_flush();
    flush();
    if (connection_aborted()) {
        // unlink("yourtempFile.txt");
        exit;
    }
    $totalDw += $downloaded;
    // file_put_contents("yourtempFile.txt", "downloaded: $totalDw ; StartTime:$strtTime");
    sleep(1);
}
fclose($fp);
// unlink("yourtempFile.txt");
The second file would read yourtempFile.txt and be polled continuously by Ajax. Sessions and cookies can't be used for the progress data, because output has already started by the time the download script is running.
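A rough sketch of that second file, assuming the "downloaded: ... ; StartTime:..." format from the commented-out line above (in practice you'd want one temp file per download, not a shared name):
<?php
// progress.php - polled by Ajax while the download runs.
$raw = @file_get_contents("yourtempFile.txt");
if ($raw === false) {
    echo json_encode(array("status" => "unknown"));
    exit;
}
preg_match('/downloaded: (\d+) ; StartTime:\s*(\d+)/', $raw, $m);
$bytes   = (int)$m[1];
$elapsed = max(1, time() - (int)$m[2]);
echo json_encode(array(
    "bytes"   => $bytes,
    "elapsed" => $elapsed,
    "rate"    => round($bytes / $elapsed) // average bytes/second so far
));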

Related

Download abuse with php Content-Disposition: attachment and readfile

I'm having a download abuse issue with PHP's Content-Disposition: attachment and readfile. It seems that my problem is with readfile: although this script works, readfile reads the entire contents of the mp4 whether or not the client closes their browser, which sets up the possibility of abuse by scripts that initiate the download and immediately cancel it. Something, somewhere, is running a script which clicks this link hundreds of times per second, running my PHP script and immediately cancelling its download, but my server is preparing that entire file to be offloaded each time.
Here's the script I'm running, when the user/abuser clicks a download link:
<?php
// get MP4 address
$MP4Address = $_GET["MP4Address"];
// We'll be outputting a raw binary download
header('Content-Type: application/octet-stream');
$filename = basename($MP4Address);
// Name the file
header('Content-Disposition: attachment; filename="' . $filename . '"');
// Send the source file
readfile($MP4Address);
?>
I suspect that readfile is the culprit here, but without it, the client will receive an empty file. There must be a more modern, proper way of doing this, but I'm not sure what it could be.
Unless you've called ignore_user_abort(true), PHP should get the signal that the connection has been aborted and cease execution. But it's possible that once you're inside the readfile() call, PHP is not able to watch for that signal, since it's busy doing low-level IO.
I would say you've got two options. The first is to simply write some code to detect and block the person that's abusing your service; your downloads are already backed by a PHP script, so adding a bit of checking and filtering should be relatively simple (a rough sketch follows after the code below).
The other would be to replace the readfile() call with a bit of [admittedly less efficient] code that should give PHP some breathing space to look for user aborts.
function read_file($filename, $chunksize = 4096) {
    if( ! $fh = fopen($filename, 'rb') ) {
        throw new \Exception('Failed to open file');
    }
    // Reading in small chunks gives PHP a chance to notice
    // an aborted connection between fread() calls.
    while($chunk = fread($fh, $chunksize)) {
        echo $chunk;
    }
    fclose($fh);
}
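For the blocking option, a minimal sketch of per-IP throttling; the temp-file counter, the 429 response, and the limit of 10 starts per minute are all arbitrary assumptions (a real setup would more likely use a database or memcached):
<?php
// Hypothetical guard: allow at most $limit download starts per IP per minute.
// One counter file per IP per minute; stale files should be cleaned up by cron.
function too_many_downloads($ip, $limit = 10) {
    $counter = sys_get_temp_dir() . '/dl_' . md5($ip) . '_' . date('YmdHi');
    $count = (int)@file_get_contents($counter);
    file_put_contents($counter, $count + 1);
    return $count >= $limit;
}

if (too_many_downloads($_SERVER['REMOTE_ADDR'])) {
    header('HTTP/1.1 429 Too Many Requests');
    exit('Too many download attempts; slow down.');
}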

PHP readfile on a file which is increasing in size

Is it possible to use PHP readfile function on a remote file whose size is unknown and is increasing in size? Here is the scenario:
I'm developing a script which downloads a video from a third-party website and simultaneously transcodes it into MP3 format. This MP3 is then transferred to the user via readfile.
The query used for the above process is like this:
wget -q -O- "VideoURLHere" | ffmpeg -i - "Output.mp3" > /dev/null 2>&1 &
So the file is fetched and encoded at the same time.
Now when the above process is in progress, I begin sending the output MP3 to the user via readfile. The problem is that the encoding process takes some time, and therefore, depending on the user's download speed, readfile reaches an assumed EOF before the whole file is encoded, resulting in the user receiving partial content/incomplete files.
My first attempt to fix this was to apply a speed limit to the user's download, but this is not foolproof, as encoding time and speed vary with load, and this still led to partial downloads.
So is there a way to implement this system in such a way that I can serve the downloads simultaneously along with the encoding and also guarantee sending the complete file to the end user?
Any help is appreciated.
EDIT:
In response to Peter, I'm actually using fread (see readfile_chunked below):
<?php
function readfile_chunked($filename, $retbytes = true) {
    $chunksize = 1 * (1024 * 1024); // how many bytes per chunk
    $buffer = '';
    $cnt = 0;
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        //usleep(120000); // used to impose an artificial speed limit
        $buffer = fread($handle, $chunksize);
        echo $buffer;
        ob_flush();
        flush();
        if ($retbytes) {
            $cnt += strlen($buffer);
        }
    }
    $status = fclose($handle);
    if ($retbytes && $status) {
        return $cnt; // return num. bytes delivered like readfile() does.
    }
    return $status;
}
readfile_chunked($linkToMp3);
?>
This still does not guarantee complete downloads, as depending on the user's download speed and the encoding speed, EOF may be reached prematurely.
Also, in response to theJeztah's comment: I'm trying to achieve this without making the user wait, so that's not an option.
Since you are dealing with streams, you probably should use stream handling functions :). passthru comes to mind, although this will only work if the download | transcode command is started in your script.
If it is started externally, take a look at stream_get_contents.
Libevent, as mentioned by Evert, seems like the general solution where you have to use a file as a buffer. However, in your case you could do it all inline in your script without using a file as a buffer:
<?php
header("Content-Type: audio/mpeg");
passthru("wget -q -O- http://localhost/test.avi | ffmpeg -i - -f mp3 -");
?>
I don't think there's any way of being notified about new data, short of something like inotify.
I suggest that if you hit EOF, you start polling the modification time of the file (using clearstatcache() between calls) every 200 ms or so. When you find the file size has increased, you can reopen the file, seek to the last position and continue.
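A rough sketch of that loop; the is_encoding_done() check is hypothetical (you'd need some external signal, e.g. a marker file written when ffmpeg exits), and a give-up timeout is omitted for brevity:
<?php
// Tail-style reader: on EOF, poll the file size until it grows or encoding ends.
function stream_growing_file($filename) {
    $handle = fopen($filename, 'rb');
    $pos = 0;
    while (true) {
        $chunk = fread($handle, 8192);
        if ($chunk !== false && strlen($chunk) > 0) {
            $pos += strlen($chunk);
            echo $chunk;
            flush();
            continue;
        }
        clearstatcache(); // so filesize() isn't answered from PHP's stat cache
        if (filesize($filename) > $pos) {
            // more data arrived: reopen and seek past what we already sent
            fclose($handle);
            $handle = fopen($filename, 'rb');
            fseek($handle, $pos);
        } elseif (is_encoding_done($filename)) { // hypothetical completion check
            break;
        } else {
            usleep(200000); // poll every 200 ms, as suggested above
        }
    }
    fclose($handle);
}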
I can highly recommend using libevent for applications like this; it's a perfect fit for this kind of case.
The PHP documentation is a bit sparse on this, but you should be able to find more solid examples around the web.

Keep track of download progress in PHP

All,
I have built a form; if the user fills in the form, my code can determine what (large) files need to be downloaded by that user. I display each file with a download button, and I want to show the status of the download of that file next to the button (download in progress, cancelled/aborted, or completed). Because the files are large (200 MB+), I use a well-known function to read the file to be downloaded in chunks.
My idea was to log the reading of each (100 kbyte) chunk in a database, so I can keep track of the progress of the download.
My code is below:
function readfile_chunked($filename, $retbytes = TRUE)
{
    // Read the PersonID from the cookie
    $PersonID = $_COOKIE['PERSONID'];
    // Set up the database connection for logging download progress per file
    include "config.php";
    $SqlConnectionInfo = array("UID" => $SqlServerUser, "PWD" => $SqlServerPass, "Database" => $SqlServerDatabase);
    $SqlConnection = sqlsrv_connect($SqlServer, $SqlConnectionInfo);
    $ChunkSize = 100 * 1024; // 100 kbyte buffer
    $buffer = '';
    $cnt = 0;
    $handle = fopen($filename, 'rb');
    if ($handle === false)
    {
        return false;
    }
    $BytesSent = 0;
    while (!feof($handle))
    {
        $buffer = fread($handle, $ChunkSize);
        $BytesSent = $BytesSent + $ChunkSize;
        // update database
        $Query = "UPDATE [SoftwareDownload].[dbo].[DownloadProgress] SET LastTimeStamp = '" . time() . "' WHERE PersonID='$PersonID' AND FileName='$filename'";
        $QueryResult = sqlsrv_query($SqlConnection, $Query);
        echo $buffer;
        ob_flush();
        flush();
        if ($retbytes)
        {
            $cnt += strlen($buffer);
        }
    }
    $status = fclose($handle);
    if ($retbytes && $status)
    {
        return $cnt; // return num. bytes delivered like readfile() does.
    }
    return $status;
}
The weird thing is that my download runs smoothly at a constant rate, but my database gets updated in spikes: sometimes I see a couple dozen database updates (so a couple dozen 100-kilobyte blocks) within a second; the next moment I see no database updates (so no 100 k blocks being fed by PHP) for up to a minute (this timeout depends on the bandwidth available for the download). The fact that no progress is written for a minute makes my code believe the download was aborted.
In the meantime I see the memory usage of IIS increasing, so I get the idea that IIS buffers the chunks of data that PHP delivers.
My web server is running Windows 2008 R2 Datacenter (64-bit), IIS 7.5 and PHP 5.3.8 in FastCGI mode. Is there anything I can do (in my code, my PHP config or my IIS config) to prevent IIS from caching the data that PHP delivers, or is there any way to see from PHP whether the data generated was actually delivered to the downloading client?
Thanks in advance!
As no useful answers came up, I've created a rather dodgy workaround by using Apache strictly for the download of the file(s). The whole form, CSS etc. is still delivered through IIS; just the download links are served by Apache (a reverse proxy sits in between, so users don't have to fiddle with port numbers; the proxy takes care of this).
Apache doesn't cache PHP output, so the timestamps I get in my database are reliable.
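If a second web server isn't an option: the buffering described above is usually attributed to the responseBufferLimit setting of the IIS FastCGI handler, and setting it to 0 is the commonly suggested way to make PHP responses unbuffered, e.g. appcmd set config /section:handlers "/[name='PHP_via_FastCGI'].responseBufferLimit:0" (the handler name PHP_via_FastCGI is an assumption; it depends on how PHP was registered).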
I've asked a similar question once and got very good results from the answers there.
Basically, you use the APC cache to store a variable from the download script, then retrieve it in a different page with an Ajax call from the client.
Another possibility is to do the same with a database: write the progress to a database and have the client test for it (say, every second) using an Ajax call.
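A minimal sketch of the APC variant, assuming the APC extension is available; the key scheme dl_<PersonID>_<md5 of file> is an arbitrary choice for this example:
<?php
// In the download loop above, after each chunk is sent:
apc_store('dl_' . $PersonID . '_' . md5($filename), $BytesSent, 3600);

// progress.php - polled by the client via Ajax every second or so.
// Note: with multiple FastCGI processes each has its own APC cache,
// so the database approach may be more reliable in that setup.
$key   = 'dl_' . $_COOKIE['PERSONID'] . '_' . md5($_GET['file']);
$bytes = apc_fetch($key);
echo json_encode(array('bytes' => ($bytes === false ? 0 : $bytes)));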

file upload and move to another server

I have an application which has one input=file.
Now I need to upload the file to my server, and then move it to another server. How can I avoid a timeout?
Also, any good suggestions for an Ajax uploader? Thanks.
Flash Uploader: Undoubtedly, SWFUpload or Uploadify (the latter is based on the former).
File Transfer: Use PHP cURL to do an HTTP POST form transfer (http://www.php.net/manual/en/function.curl-setopt.php - see the 2nd example).
Before doing the transfer do the following:
set_time_limit(-1); // PHP won't timeout
ignore_user_abort(true); // PHP won't quit if the user aborts
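Putting those two lines together with the manual's POST example, a minimal sketch of the transfer might look like this (the remote URL, the form field name file, and $uploaded_path are assumptions; the @-prefix upload syntax is the PHP 5.3-era form used on the linked manual page):
<?php
set_time_limit(-1);      // PHP won't timeout
ignore_user_abort(true); // PHP won't quit if the user aborts

// Hypothetical receiving script on the other server; adjust to taste.
$ch = curl_init('http://other-server.example/receive.php');
curl_setopt($ch, CURLOPT_POST, true);
// "@" tells old-style cURL to upload the named file as this form field
curl_setopt($ch, CURLOPT_POSTFIELDS, array('file' => '@' . $uploaded_path));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$result = curl_exec($ch);
curl_close($ch);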
Edit: I don't see a valid reason why you would need a CRON job unless the file in question changes at some point (which is the real definition of syncing). On the other hand, if all you want is to copy the file to a remote server, there's no reason you can't do it with plain PHP.
Also, one thing you should be aware of is file size. If the file is anything less than 20 MB, you're safe.
Edit 2: By the way, with the right conditions (output buffering off, and implicit output on), you can show the user the current remote transfer progress. I've done it; it ain't hard, really. You just need a hidden iframe which sends progress requests to update the parent window.
It works kind of like AJAX, but uses an iframe in place of XHR (since XHR returns in bulk, not in blocks, unlike an iframe).
If you're interested, I can help you out with this, just ask.
Edit 3: Dynamic remote upload example/explanation:
To make things short, I'll assume that the file has already been uploaded to your server by the user, but not to the target remote server. I'll also assume the user lands on handle.php after uploading the file.
handle.php would look like:
<?php
// This current script is only cosmetic - though you might want to
// handle the user upload here (as I did)
$name = 'userfile'; // name of uploaded file (input box) YOU MUST CHANGE THIS
$new_name = time().'.'.pathinfo($_FILES[$name]['name'], PATHINFO_EXTENSION); // the (temporary) filename
move_uploaded_file($_FILES[$name]['tmp_name'], 'uploads/'.$new_name);
$url = 'remote.php?file='.$new_name; ?>
<iframe src="<?php echo $url; ?>" width="1" height="1" frameborder="0" scrolling="no"></iframe>
<div id="progress">0%</div>
<script type="text/javascript">
function progress(percent){
    document.getElementById('progress').innerHTML = percent + '%';
}
</script>
Doesn't look difficult so far, no?
The next part is a little more complex. The file remote.php would look like:
set_time_limit(0); // PHP won't timeout
// if you want the user to be able to cancel the upload, simply comment out the following line
ignore_user_abort(true); // PHP won't quit if the user aborts
// to make this system work, we need to tweak output buffering
while(ob_get_level()) ob_end_clean(); // remove all buffers
ob_implicit_flush(true); // ensures everything we output is sent to browser directly

function progress($percent){
    // since we're in an iframe, we need "parent" to be able to call the js
    // function "progress" which we defined in the other file.
    echo '<script type="text/javascript">parent.progress('.$percent.');</script>';
}
function curlPostFile($url, $file = null, $onprogress = null){
    $ch = curl_init(); // was missing in the original snippet
    curl_setopt($ch, CURLOPT_URL, $url);
    if(substr($url, 0, 8) == 'https://'){
        curl_setopt($ch, CURLOPT_HTTPAUTH, CURLAUTH_ANY);
        curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
    }
    if($onprogress){
        curl_setopt($ch, CURLOPT_NOPROGRESS, false);
        curl_setopt($ch, CURLOPT_PROGRESSFUNCTION, $onprogress);
    }
    curl_setopt($ch, CURLOPT_HEADER, false);
    curl_setopt($ch, CURLOPT_USERAGENT, K2FUAGENT); // K2FUAGENT: user-agent constant from the original author's framework
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_MAXREDIRS, 50);
    if($file){
        $fh = fopen($file, 'rb'); // mode was missing in the original snippet
        // not in the original snippet: without CURLOPT_UPLOAD, cURL ignores
        // CURLOPT_INFILE (this streams the file as a PUT-style upload)
        curl_setopt($ch, CURLOPT_UPLOAD, true);
        curl_setopt($ch, CURLOPT_INFILE, $fh);
        curl_setopt($ch, CURLOPT_INFILESIZE, filesize($file));
    }
    $data = curl_exec($ch);
    curl_close($ch);
    if($file){
        fclose($fh);
    }
    return $data;
}
$file = 'uploads/'.basename($_REQUEST['file']);
// note: as of PHP 5.5 the progress callback receives the cURL handle as its first argument
function onprogress($download_size, $downloaded, $upload_size, $uploaded){
    if($upload_size > 0){ // guard against division by zero before the size is known
        progress($uploaded / $upload_size * 100); // call our progress function
    }
}
curlPostFile('http://someremoteserver.com/handle-uploads.php', $file, 'onprogress');
progress(100); // finished!
Use e.g. scp or rsync to transfer the file to another server. Do that with a cron job every couple of minutes, not from your PHP script - that will prevent any timeouts occurring if the server-to-server transfer takes too long.

How can I make file download slower?

This is the final download page of my website, where the general public can download government documents. From the server, my code reads the to-be-downloaded file and sends it to the client browser in a loop.
$fp = fopen($file, "rb");
while (!feof($fp))
{
    echo fread($fp, 65536);
    flush(); // this is essential for large downloads
}
fclose($fp);
exit;
I want to send the file very slowly - can I use the sleep function (or something like it) within this loop, and up to how long before causing the user's browser to time out?
That way the user gets sufficient time to read the ads displayed on the page while he/she waits for the file download to finish.
Also, I'm not proficient with the PHP environment.
(Please forgive me for the morality/immorality of this.)
Try this approach: http://bytes.com/topic/php/answers/341922-using-php-limit-download-speed
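The linked approach boils down to pausing between fixed-size chunks. A minimal sketch under those assumptions (64 KB per second here; both numbers are arbitrary, and as long as some data keeps flowing each second the browser will not time out):
<?php
set_time_limit(0); // the transfer will take far longer than the default execution limit

$fp = fopen($file, "rb");
while (!feof($fp))
{
    echo fread($fp, 65536); // 64 KB chunk
    flush();                // push the chunk to the client
    sleep(1);               // one-second pause caps the rate at ~64 KB/s
}
fclose($fp);
exit;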
You can use bandwidth sharing if you're willing to do this at the Apache level.
