Download abuse with PHP Content-Disposition: attachment and readfile

I'm having a download abuse issue with PHP's Content-Disposition: attachment and readfile. The problem seems to be with readfile: the script works, but whether or not the client closes their browser, readfile reads the entire contents of the mp4, which opens the door to abuse by scripts that initiate the download and immediately cancel it. Something, somewhere, is running a script that clicks this link hundreds of times per second, running my PHP script and immediately cancelling the download, yet my server prepares the entire file for delivery each time.
Here's the script I'm running, when the user/abuser clicks a download link:
<?php
// get MP4 address
$MP4Address = $_GET["MP4Address"];
// We'll be outputting a raw binary stream
header( 'Content-Type: application/octet-stream' );
$filename = basename($MP4Address);
// Name file
header('Content-Disposition: attachment; filename="'.$filename.'"');
// Source file
readfile($MP4Address);
?>
I suspect that readfile is the culprit here, but without it, the client will receive an empty file. There must be a more modern, proper way of doing this, but I'm not sure what it could be.

Unless you've called ignore_user_abort(true), PHP should get the signal that the connection has been aborted and cease execution. But it's possible that once you're inside the readfile() call, PHP is not able to watch for that signal since it's busy doing low-level IO.
I would say you've got two options. The first is to simply write some code to detect and block the person that's abusing your service; your downloads are already backed by a PHP script, so adding in a bit of checking and filtering should be relatively simple.
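As a minimal sketch of that first option, assuming the APCu extension is available (the key name, the 60-second window and the 20-request limit are all illustrative, not from the question), something like this could run at the top of the download script before any headers are sent:

// Hypothetical per-IP throttle using APCu; all names and limits are illustrative.
$ip  = $_SERVER['REMOTE_ADDR'];
$key = 'dl_count_' . $ip;

// Create the counter with a 60-second lifetime if it doesn't exist yet.
apcu_add($key, 0, 60);
$count = apcu_inc($key);

// Refuse the request if this IP has started too many downloads this minute.
if ($count !== false && $count > 20) {
    http_response_code(429);
    exit('Too many download requests.');
}
// ...then continue into the Content-Disposition / readfile logic as before.

On shared hosts without APCu, the same idea works with a small database table or a per-IP counter file.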
The other would be to replace the readfile() call with a bit of [admittedly less efficient] code that should give PHP some breathing space to look for user aborts.
function read_file($filename, $chunksize = 4096) {
    if (!$fh = fopen($filename, 'rb')) {
        throw new \Exception('Failed to open file');
    }
    while ($chunk = fread($fh, $chunksize)) {
        echo $chunk;
        // Push the chunk out to the client so PHP notices an aborted
        // connection on the next write instead of buffering the whole file.
        flush();
    }
    fclose($fh);
}
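To wire this into the script from the question, the readfile() call is simply swapped for the function above (variable names taken from the question's code):

// Same headers as before, then stream the file in chunks.
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $filename . '"');
read_file($MP4Address);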

Related

Is readfile the best solution for downloading external files?

I need to fetch a remote file and give it to the user without saving it to my server's disk (to hide the original URL). I've found a lot of posts about downloading external files with various functions like file_get_contents or readfile. I'm currently using this one:
function startDownload($url){
    if($this->url_exists($url))
    {
        // get the filename from the url
        $name = $this->getFileName($url);
        // flush and close the current output buffer
        ob_end_flush();
        // clear anything left in the buffer
        ob_clean();
        // set headers
        header('Content-Type: application/octet-stream');
        header("Content-Transfer-Encoding: Binary");
        header("Content-disposition: attachment; filename=\"" . $name . "\"");
        // send the file to the client
        readfile($url);
        // the exit command is important
        exit;
    }
    else JFactory::getApplication()->enqueueMessage(JText::_('URL_NOT_FOUND'), 'error');
}
And that's working, but there is a problem! For a file of around 200 MB it takes ~10 seconds for the download to start in the client's browser. I think it's because readfile first downloads the whole file into my server's buffer and only then gives it to the user. Is that right?
And is it possible to make it faster? For example, could the download start before the fetch has finished, or is that not technically possible?
In fact, I don't know whether this method is optimised or not. Any technical advice would be appreciated.
Note:
I know that this function should be changed for big files; that's not my concern right now.
I'm considering buying an external server in the same datacenter to make this download faster.
The goal is for the [file server] to be separate from the [online shop].
I tested the curl method mentioned by @LawrenceCherone. It worked nicely, but when I moved it into my project the result was the same as with readfile (a white screen for a few seconds).
So I suspected the readfile() function. I separated my previous code out into a single PHP file and the result was amazing! The download started immediately.
So my guess wasn't right and the problem was not related to the readfile function.
After a little searching I found a minor modification. I added the line below:
while (ob_get_level()) ob_end_clean();
before the:
readfile($url);
And now the download starts before the whole file has been fetched by my server.
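For context, here is a sketch of how the end of startDownload() looks with that line in place (pieced together from the snippets above, not the poster's complete code):

// set headers
header('Content-Type: application/octet-stream');
header("Content-Transfer-Encoding: Binary");
header("Content-disposition: attachment; filename=\"" . $name . "\"");
// the added line: close every active output buffer so readfile()
// streams the remote file to the client instead of buffering it first
while (ob_get_level()) ob_end_clean();
// send the file to the client
readfile($url);
exit;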

How to track currently downloaded files from server?

I'm using shared hosting (HostGator).
I have a site with video content, like YouTube, written in PHP.
It's implemented via direct links to mp4 files and the HTML video tag.
I want to limit connections for file downloads (plays) to around 350,
because if I have more than ~350 connections HostGator blocks my site.
Is there any way to do that?
Any other suggestions on how to deal with this situation would also be helpful.
You could use a PHP script which handles the actual file download. When the script is executed, increment your download counter, and once the file has been sent completely to the client, close the connection.
To detect whether the file has been sent completely, send the file in small chunks and check after each transmitted chunk whether the connection is still open.
To do this:
send the correct mime types and HTTP headers
use ignore_user_abort to keep the script running if the client closes the connection
send the file in small chunks and check after each chunk if the connection is still alive. ob_flush and flush are used to keep the output buffer empty. connection_status or connection_aborted can be used to test if the connection is still open.
after the whole file has been submitted, decrement your connection counter
In addition to this, you might also implement HTTP range requests (HTTP_RANGE) to resume incomplete downloads. This is especially important for video downloads, if you want to be able to seek somewhere in the middle of the stream.
Below is a little .htaccess file that rewrites all requests to the PHP file.
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteRule ^ yourFile.php [L]
</IfModule>
Below is the PHP file.
// code to increment the counter..
// increment_counter(); ...

// Use the request path (URI) to choose what file to send.
$filename = 'video.mp4';
$size = filesize($filename);

$f = fopen($filename, 'rb');
if (!$f) {
    // error...
}

ignore_user_abort(true);
set_time_limit(0);

header("Content-Length: $size");
header("Content-Type: video/mp4");

while (!feof($f)) {
    echo fread($f, 8192);
    ob_flush();
    flush();

    if (connection_status() != 0) {
        // download aborted... decrement the counter
        // decrement_counter(); ...
        break;
    }
}
fclose($f);

// download completed - decrement counter
// decrement_counter(); ...
This script is pretty simple, but it should give you an idea. You might add more logic (as mentioned above, HTTP range support) or send other headers, but this should give you a good starting point.
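The increment_counter()/decrement_counter() placeholders above are left to you. As one possible interpretation, here is a rough sketch using a flock()-protected counter file; the file path and the 350 limit are assumptions based on the question, not part of the original answer:

// Hypothetical counter helpers backed by a small file; names are illustrative.
define('COUNTER_FILE', __DIR__ . '/connection_count.txt');
define('MAX_CONNECTIONS', 350);

function change_counter($delta) {
    $fp = fopen(COUNTER_FILE, 'c+');
    if (!$fp) {
        return false;
    }
    flock($fp, LOCK_EX);                    // serialize access across requests
    $count = (int) stream_get_contents($fp);
    $count = max(0, $count + $delta);
    ftruncate($fp, 0);
    rewind($fp);
    fwrite($fp, (string) $count);
    flock($fp, LOCK_UN);
    fclose($fp);
    return $count;
}

// Before sending the file: refuse new downloads once the limit is reached.
if (change_counter(+1) > MAX_CONNECTIONS) {
    change_counter(-1);
    http_response_code(503);
    exit('Too many concurrent downloads, please try again later.');
}

// ...send the file as in the script above...

// When the loop finishes, or the client aborts:
change_counter(-1);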
References:
Below are links to the documentation for the functions that may be less well known.
connection_status
ignore_user_abort

How to allow downloading a file of 1GB

I want to allow the user to download a file of up to 1 GB in size, but with my current code only a file of 113 MB can be downloaded...
header('Content-type: application/zip');
//open/save dialog box
header('Content-Disposition: attachment; filename="check.zip"');
//read from server and write to buffer
readfile('check.zip');
Can anyone tell me how to download a larger file?
I'm going to guess from what you've said that you're getting an "out of memory" error.
In that case, perhaps this note from the documentation might be of interest:
Note:
readfile() will not present any memory issues, even when sending large files, on its own. If you encounter an out of memory error ensure that output buffering is off with ob_get_level().
So, check ob_get_level() and call ob_end_flush() if necessary to stop output buffering.
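A minimal sketch of that suggestion applied to the code in the question (the while loop is one common way to make sure every nested buffer is closed; the exact form is mine, not the manual's):

// Close any active output buffers so readfile() streams directly to the client.
while (ob_get_level() > 0) {
    ob_end_flush();
}

header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="check.zip"');
readfile('check.zip');
exit;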
Alternatively, you could do something like this:
$f = fopen("check.zip","rb");
while(!feof($f)) {
echo fgets($f);
flush();
}
Another option is this:
header("Location: check.zip");
This will redirect the browser to the check.zip file. Since it's a download, the existing page won't be affected. You can even output the rest of a page to say something like "Your download will begin momentarily" to the user.
Either read and echo the file a chunk at a time, or use something like mod_sendfile to make it Not Your Problem.
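If the Apache module in question is mod_xsendfile (an assumption; it has to be installed and enabled with XSendFile On), the hand-off looks roughly like this:

// Hypothetical path outside the web root; Apache streams the file itself,
// so PHP's memory and execution-time limits no longer apply to the transfer.
$path = '/absolute/path/to/check.zip';
header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="check.zip"');
header('X-Sendfile: ' . $path);
exit;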
Increase your file write buffer chunk size up to the maximum the file needs. That will decrease the resource utilisation and your download will work fine.
Edit:
Use HTML5 web workers to download large files. Web workers run in the background, so you are able to download large files.

How can I make file download slower?

This is the final download page of my website, where the general public is able to download government documents. On the server, my code reads the to-be-downloaded file and sends it to the client browser in a loop.
$fp = fopen($file, "rb");
while (!feof($fp))
{
    echo fread($fp, 65536);
    flush(); // this is essential for large downloads
}
fclose($fp);
exit;
I want to send the file very slowly; that is, can I use the sleep function (or anything like that) within this loop, and what is the maximum delay I can use without causing the user's browser to time out?
That way the user gets sufficient time to read the ads displayed on the page while he/she waits for the file download to finish.
Also, I'm not proficient with the PHP environment.
(Please forgive me for the morality/immorality of this.)
Try this approach: http://bytes.com/topic/php/answers/341922-using-php-limit-download-speed
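The gist of that approach, as a rough sketch (the 100 KB chunk and one-second pause are arbitrary example values, not taken from the linked post):

// Illustrative throttled download loop: roughly 100 KB per second.
set_time_limit(0);                  // don't let PHP's own timeout kill a long transfer

$fp = fopen($file, 'rb');
while (!feof($fp)) {
    echo fread($fp, 100 * 1024);    // send one 100 KB chunk
    flush();                        // push it out; periodic data generally keeps the browser from timing out
    if (connection_aborted()) {
        break;                      // stop if the client has gone away
    }
    usleep(1000000);                // pause one second before the next chunk
}
fclose($fp);
exit;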
You can use bandwidth sharing if you're willing to do this at the Apache level.

Tracking file complete download

I have a file hosting site, and users earn a reward for downloads. So I wanted to know: is there a way I can track whether a visitor downloaded the whole file, so that there are no fake partial downloads made just for the rewards?
Thank You.
If you could monitor the HTTP response codes returned by your web server and tie them back to the sessions that generated them, you would be in business.
A response code of 206 shows that the system has delivered some of the information but not all of it. When the final chunk of the file goes out, it should not have a response code of 206.
If you can tie this to user sessions by putting the session code inside the URL, then you could give points based on a simple log aggregation.
I implemented a similar solution on a file hosting website.
What you want to do is use the register_shutdown_function callback, which allows you to detect the end of execution of a PHP script regardless of the outcome.
Then you want to have your file in a non-web-accessible location on your server(s), and have your downloads go through PHP: the idea being that you want to be able to track how many bytes have been passed to the client.
Here's a basic way of implementing it (e.g.):
<?php
register_shutdown_function('shutdown');

$file_name    = 'file.ext';
$path_to_file = '/path/to/file';
$stat = @stat($path_to_file);

// Set headers
header('Content-Type: application/octet-stream');
header('Content-Length: ' . $stat['size']);
header('Connection: close');
header('Content-disposition: attachment; filename=' . $file_name);

// Get a pointer to the file and stream it to the client
$fp = fopen($path_to_file, 'rb');
fpassthru($fp);

function shutdown() {
    $status = connection_status();
    // A connection status of 0 indicates no premature end of script
    if ($status == 0) {
        // Do whatever to increment the counter for the file.
    }
}
?>
There are obviously ways to improve this, so if you need more details or other behaviour, please let me know!
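The snippet above counts a download as complete whenever the script ends without a broken connection; if you literally want the byte count mentioned earlier, one possible variation (my own sketch, not part of the original answer) replaces fpassthru() with a counting loop:

<?php
// Variation: track bytes actually sent, then compare in the shutdown handler.
$path_to_file = '/path/to/file';          // same assumed location as above
$size         = filesize($path_to_file);
$bytes_sent   = 0;

register_shutdown_function(function () use (&$bytes_sent, $size) {
    // Treat the download as complete only if every byte went out cleanly.
    if ($bytes_sent >= $size && connection_status() == 0) {
        // credit the user / increment the completed-download counter
    }
});

header('Content-Type: application/octet-stream');
header('Content-Length: ' . $size);
header('Content-disposition: attachment; filename=file.ext');

$fp = fopen($path_to_file, 'rb');
while (!feof($fp)) {
    $chunk = fread($fp, 8192);
    echo $chunk;
    flush();
    $bytes_sent += strlen($chunk);
    if (connection_aborted()) {
        break;                            // client went away; $bytes_sent stays short
    }
}
fclose($fp);
?>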
