Limit Download Speed PHP Remote Files - php

I am looking to limit the download speed of a file. I found the thread below on here, which works wonders for locally stored files; however, my files are stored on an external server and I'm not entirely sure how to make this work with that server.
Reference:
Limit download speed using PHP
Code:
<?php
set_time_limit(0);

// file that should be sent to the client (a remote URL in this case)
$local_file = 'https://remoteserver.com/example.mp4';
// filename that the user gets as default
$download_file = 'https://remoteserver.com/example.mp4';
// set the download rate limit (=> 1024 KB/s)
$download_rate = 1024;

if (file_exists($local_file) && is_file($local_file)) {
    // send headers
    header('Cache-control: private');
    header('Content-Type: application/octet-stream');
    header('Content-Length: '.filesize($local_file));
    header('Content-Disposition: filename='.$download_file);
    // flush content
    flush();
    // open file stream
    $file = fopen($local_file, "r");
    while (!feof($file)) {
        // send the current file part to the browser
        print fread($file, round($download_rate * 1024));
        // flush the content to the browser
        flush();
        // sleep one second
        sleep(1);
    }
    // close file stream
    fclose($file);
} else {
    die('Error: The file '.$local_file.' does not exist!');
}

// note: $dl is never set in this snippet (leftover from the referenced answer)
if ($dl) {
} else {
    header('HTTP/1.0 503 Service Unavailable');
    die('Abort, you reached your download limit for this file.');
}
?>
Both servers share the same base domain, just different subdomains. I could limit the download speed via the httpd configuration on the remote server; however, I want different speeds for different user permissions, and a server-wide setting would just result in one overall limit.
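For reference, here is a minimal sketch of how the same throttling loop could be pointed at the remote URL instead of a local path. It assumes allow_url_fopen is enabled and that the remote server returns a Content-Length header; the URL, filename and rate below are placeholders, not my real setup:

<?php
// Sketch (not the original code): throttle a download proxied from a remote URL.
set_time_limit(0);

$remote_url    = 'https://remoteserver.com/example.mp4'; // placeholder
$download_name = 'example.mp4';
$rate_kb       = 1024; // KB per second

// file_exists()/filesize() do not work on http(s) URLs, so ask the remote
// server for the status and size instead.
$headers = get_headers($remote_url, 1);
if ($headers === false || strpos($headers[0], '200') === false) {
    die('Error: remote file is not available.');
}

header('Cache-Control: private');
header('Content-Type: application/octet-stream');
if (isset($headers['Content-Length'])) {
    $length = is_array($headers['Content-Length'])
        ? end($headers['Content-Length'])   // last value if there were redirects
        : $headers['Content-Length'];
    header('Content-Length: ' . $length);
}
header('Content-Disposition: attachment; filename="' . $download_name . '"');

$fp = fopen($remote_url, 'rb');
while (!feof($fp)) {
    echo fread($fp, $rate_kb * 1024); // send one "rate" chunk
    flush();
    sleep(1);                         // then wait a second
}
fclose($fp);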
My Solution:
I have used the httpd / apache2 config to limit download speeds for specific URL prefixes, for example https://remoteserver.com/slow/... and https://remoteserver.com/fast/..., using the below.
<Location "/slow">
SetOutputFilter RATE_LIMIT
SetEnv rate-limit 5120
SetEnv rate-initial-burst 5120
</Location>
<Location "/fast">
SetOutputFilter RATE_LIMIT
SetEnv rate-limit 204800
SetEnv rate-initial-burst 204800
</Location>
I then used an .htaccess rewrite to handle that extra part of the URL for me, as my file structure already existed and couldn't be split into subfolders.
RewriteEngine On
RewriteRule ^/?(?:slow|fast)(/downloads/.+)$ $1 [L,NC]
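With the rate limits tied to the /slow and /fast prefixes, the PHP side only has to send each user to the prefix matching their permissions. A minimal sketch, assuming a hypothetical $user['premium'] flag; the host and path are placeholders:

<?php
// Sketch: pick the URL prefix that carries the appropriate mod_ratelimit setting.
$file   = '/downloads/example.mp4';                     // placeholder path
$prefix = !empty($user['premium']) ? '/fast' : '/slow'; // hypothetical permission flag

header('Location: https://remoteserver.com' . $prefix . $file);
exit;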

Related

Streaming a remote file in PHP is saving it to my server. Why?

I just want to stream the file; I don't want to save it to my RAM or HDD. I am talking about streaming an HD video for each viewer individually.
Here is my code:
// ...
ob_start();
header('...'); // I am sending some headers here
ob_flush(); flush(); ob_clean();

$handle = fopen('http://example.com/bigMovie.mp4', 'rb');
ob_flush(); flush(); ob_clean();

while (!feof($handle)) // do this until the end of the file
{
    echo fread($handle, 102400); // reading 100 kb from file
    ob_flush(); flush(); ob_clean(); // sending it to the user, cleaning ram
}

ob_end_flush(); // finished sending the file
fclose($handle); // closing the remote connection to example.com
// ...
When I try to send bigMovie.mp4 to a viewer, my server's HDD gets full.
What should I do? Is anything wrong with my code?
Thanks.
This should work:
# Send headers
header('...');
header('...');
header('...');
...
# Send file
readfile('http://example.com/bigMovie.mp4');
readfile() reads a file (even remotely over HTTP) and immediately outputs its contents.
Btw, your usage of the ob_* functions makes no sense at all. You need to understand how output control works in PHP before using it. The manual is a good starting point: http://php.net/manual/en/ref.outcontrol.php
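For example, a common alternative to flushing and cleaning buffers inside the loop is simply to close any active output buffers before streaming starts; a minimal sketch, reusing the URL from the question and assuming allow_url_fopen is enabled:

<?php
// Sketch: drop all active PHP output buffers, then stream unbuffered.
while (ob_get_level()) {
    ob_end_clean();
}

header('Content-Type: video/mp4'); // plus whatever other headers you need
readfile('http://example.com/bigMovie.mp4');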
Assuming these are your own files that you are streaming to your users, you can place the files on a CDN. Here are a couple of the big players, depending on your hosting preferences.
You can use Amazon Web Services' CloudFront for streaming.
http://aws.amazon.com/cloudfront/streaming/
RackSpace CDN streaming
http://www.rackspace.com/knowledge_center/frequently-asked-question/getting-started-with-cloud-files-streaming
Azure CDN
http://blog.thoughtstuff.co.uk/2014/01/streaming-mp4-video-files-in-azure-storage-containers-blob-storage/

PHP How do I host a large file properly?

I'm currently using the following PHP function to allow a user to select a file and then download it. This happens over FTP. However, if the user chooses a large file, the server locks up for any other requests while the download is occurring. Is there any way I can serve the file while having PHP continue to respond to requests?
I need PHP to verify that the user is permitted to download the file with their credentials so I can't just host it as an asset. The file is located on an FTP server.
function download($file) {
    $fileStream = "";
    if ($this->get($file)) {
        // Send header to browser to receive a file
        header("Content-disposition: attachment; filename=\"$file\"");
        header("Content-type: application/octet-stream");
        header("Pragma: ");
        header("Cache-Control: no-cache");
        header("Expires: 0");
        // Note: readfile() already outputs the file and returns the number of
        // bytes read, so the loop below never appends anything useful.
        $data = readfile($this->downloadDir . $file);
        $i = 0;
        while ($data[$i] != "")
        {
            $fileStream .= $data[$i];
            $i++;
        }
        unlink($this->downloadDir . $file);
        echo $fileStream;
        exit;
    } else {
        return false;
    }
}
PHP is not the best tool for this kind of work, but it can delegate the job to the web server you are using. And as the file is in the same place as your application, this can work.
All major web servers that usually run PHP applications (Apache, lighttpd and nginx) have support for X-Sendfile.
To use it, you have to first enable the functionality in your web server (check the links above for each of the web servers), then in your script add a new header:
Apache:
header("X-Sendfile: $location_of_file_to_download");
Lighttpd:
header("X-LIGHTTPD-send-file: $location_of_file_to_download");
nginx:
header("X-Accel-Redirect: $location_of_file_to_download");
The web server will catch this header from your application and will replace the body of your PHP response with the file. And while it serves this file to the user, PHP gets unblocked and is ready to serve a new user.
(The other headers will be kept, so you can retain the content-type and content-disposition headers)
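Putting it together for Apache with the mod_xsendfile module installed and enabled, the download script shrinks to a permission check plus headers. A minimal sketch; user_may_download(), the filename and the path are hypothetical placeholders, not part of your application:

<?php
// Sketch: X-Sendfile download with an application-level permission check.
$file = 'bigfile.zip'; // placeholder

if (!user_may_download($file)) {            // hypothetical auth check
    header('HTTP/1.1 403 Forbidden');
    exit('You are not allowed to download this file.');
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $file . '"');
header('X-Sendfile: /path/to/private/' . $file);
exit; // Apache takes over and streams the file; PHP is done with this request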
Since PHP is single-threaded, you would have to make a structure for each request. Then, instead of just processing one request at a time, you would loop through the structures and slowly process all of them concurrently (e.g., send a few hundred KB to one, then move on to the next, and so on).
Honestly, PHP doesn't sound like the right language to do this job. Why not use a purpose-built FTP server like vsftpd or something of that nature?

PHP on IIS 7.5 - Large file download blocks connection until download is complete

I want to allow users to download large video files. These files are outside of the public folder because of security reasons.
I'm using a combination of fopen(), feof(), and fread() to download the file in chunks.
The download works fine. The video is downloaded and also works just fine. The problem is during the download. Any user who's downloading the file can't continue browsing the site until the file is downloaded. The browser is trying to establish a connection, but it hangs while the file is downloading. When the download is done, the connection is immediately established. Other users can browse the site just fine during the download, so it's not like the whole server hangs or whatever.
I'm working with PHP (CakePHP) installed on an IIS server.
A snippet of code:
$name = "filename.mp4";
$folder = "private/folder/";
$handle = fopen($folder.$name, "rb");

if (!$handle)
{
    echo "File not found";
}
else
{
    header("Content-length: ".filesize($folder.$name));
    header("Content-Type: video/mp4");
    header("Content-Disposition: attachment; filename='filename.mp4'");
    header("Content-Transfer-Encoding: binary");

    session_write_close(); // this is the solution

    while (!feof($handle))
    {
        $buffer = fread($handle, 1*(1024*1024));
        echo $buffer;
        ob_flush();
        flush();
    }
}
I finally solved the problem. As suggested above, the problem was indeed related to sessions. Even though session.auto_start was off, CakePHP itself was still keeping the session open. By inserting session_write_close() right before the while loop, the problem was solved.
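The general pattern applies to any framework that keeps the session open: release the session lock before starting a long-running response, otherwise every other request from the same session waits on it. A minimal sketch outside CakePHP; $path is a placeholder for the private file:

<?php
// Sketch: release the session lock before streaming a large file.
session_start();
// ... check $_SESSION here to verify the user may download the file ...

// Other requests from this user are no longer blocked after this call.
session_write_close();

$path = 'private/folder/filename.mp4'; // placeholder
header('Content-Type: video/mp4');
header('Content-Length: ' . filesize($path));
readfile($path);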

create a folder in local computer from online system in php

I want to save a file from the online system into a specific folder, e.g. C:/myFolder/. If there is no myFolder in C:/, the system should automatically detect that and create the folder on the C drive, then save the file into that folder. When I try the system locally, it can create the folder and the file on the C drive.
But when I upload my files to the server, the folder is created on the server, not on the local computer. Can anyone help me solve this? How can I create a folder on the local computer and save a file into that folder from the online system?
Below is the code that works when the system runs locally:
$directory = 'C:/sales/'.$filename.'.txt';
$path_name = 'C:/sales/';

if (!is_dir($path_name)) {
    mkdir($path_name);
}

if (mysql_num_rows($query))
{
    $fp = fopen($directory, 'w');
    if ($fp)
    {
        for ($i = 0; $i < mysql_num_rows($query); $i++)
        {
            $f = mysql_fetch_array($query);
            $orderFee_q = mysql_query("select * from sales_order where status in ('waiting', 'waiting1', 'waiting2', 'approved') and outstanding = 'N' order by so_no desc");
            $get_orderFee = mysql_fetch_array($orderFee_q);
            $line = $f["id"]."\t".$f["so_no"];
            if (trim($line) != '') { fputs($fp, $line."\r\n"); }
        }
    }
    fclose($fp);
}

$download_name = basename($filename);

if (file_exists($filename))
{
    header('Content-Description: File Transfer');
    header('Content-Type: application/force-download');
    header('Content-Transfer-Encoding: Binary');
    header("Content-Disposition: attachment; filename=".$download_name);
    header('X-SendFile: '.$filename);
    header('Pragma: no-cache');
    header("Expires: 0");
    readfile($filename);
}
Simply put, you can't do that in PHP or any other server-side language.
Edit:
The reason is simple: server-side applications and scripts only have access to the local resources of the machine they run on. So when you run your application on your local computer, everything works as you expect. But because of how HTTP works, and for safety reasons, you cannot access a user's local files.
You could create a desktop application that interacts with your server, which would allow you to do something like this.
You cannot tell the user where to save the file you are sending, even with JavaScript. Server-side languages can manipulate files on the server, not on the client's machine, by default. In any case, it is not possible to access a client's files without the client's permission. Possibly this could be done with an ActiveX component, but the client would have to agree to and accept this action; otherwise your computer would by now have more viruses than files. So the browser provides a protected environment for safe browsing. You have already set the name of the file you are sending, so that's the most you can do server-side.
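In other words, the most the server can do is suggest a download filename via Content-Disposition; the browser and the user decide where it ends up. A minimal sketch, with placeholder paths and filename:

<?php
// Sketch: suggest a filename for the download; the user chooses the folder.
$server_path   = 'C:/sales/report.txt'; // file on the web server (placeholder)
$download_name = 'report.txt';          // name suggested to the browser

header('Content-Type: text/plain');
header('Content-Disposition: attachment; filename="' . $download_name . '"');
header('Content-Length: ' . filesize($server_path));
readfile($server_path);
exit;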

PHP readfile() and large files

When using readfile() -- using PHP on Apache -- is the file immediately read into Apache's output buffer and the PHP script execution completed, or does the PHP script execution wait until the client finishes downloading the file (or the server times out, whichever happens first)?
The longer back-story:
I have a website with lots of large mp3 files (sermons for a local church). Not all files in the audio archive are allowed to be downloaded, so the /sermon/{filename}.mp3 path is rewritten to really execute /sermon.php?filename={filename} and if the file is allowed to be downloaded then the content type is set to "audio/mpeg" and the file streamed out using readfile(). I've been getting complaints (almost exclusively from iPhone users who are streaming the downloads over 3G) that the files don't fully download, or that they cut off after about 10 or 15 minutes. When I switched from streaming out the file with a readfile() to simply redirecting to the file -- header("Location: $file_url"); -- all of the complaints went away (I even checked with a few users who could reliably reproduce the problem on demand previously).
This leads me to suspect that when using readfile() the PHP script engine is in use until the file is fully downloaded but I cannot find any references which confirm or deny this theory. I'll admit I'm more at home in the ASP.NET world and the dotNet equivalent of readfile() pushes the whole file to the IIS output buffer immediately so the ASP.NET execution pipeline can complete independently of the delivery of the file to the end client... is there an equivalent to this behavior with PHP+Apache?
You may still have PHP output buffering active while performing the readfile(). Check that with:
if (ob_get_level()) ob_end_clean();
or
while (ob_get_level()) ob_end_clean();
This way the only remaining output buffer should be Apache's output buffer; see SendBufferSize for Apache tweaks.
EDIT
You can also have a look at mod_xsendfile (an SO post on such usage: PHP + apache + x-sendfile), so that you simply tell the web server you have done the security check and that it can now deliver the file.
A few things you can do (I am not listing all the headers that you need to send; they are probably the same ones that you currently have in your script):
set_time_limit(0); // as already mentioned
readfile($filename);
exit(0);
or
passthru('/bin/cat '.$filename);
exit(0);
or
//When you enable mod_xsendfile in Apache
header("X-Sendfile: $filename");
or
//mainly to use for remote files
$handle = fopen($filename, "rb");
echo stream_get_contents($handle);
fclose($handle);
or
$handle = fopen($filename, "rb");
while (!feof($handle)) {
    // I would suggest doing some checking here
    // to see if the user is still downloading or if they closed the connection
    echo fread($handle, 8192);
}
fclose($handle);
The script will be running until the user finishes downloading the file. The simplest, most efficient, and most reliable solution is to redirect the user:
header("Location: /real/path/to/file");
exit;
But this may reveal the location of the files. It's a good idea anyway to password-protect the files that not everyone may download with an .htaccess file, but perhaps you use a database to determine access and this is not an option.
Another possible solution is setting the maximum execution time of PHP to 0, which disables the limit:
set_time_limit(0);
Your host may disallow this, though. Also PHP reads the file into the memory first, then goes through Apache's output buffer, and finally makes it to the network. Making users download the file directly is much more efficient, and does not have PHP's limitations like the maximum execution time.
Edit: The reason you get this complaint a lot from iPhone users is probably that they have a slower connection (e.g. 3G).
Downloading files through PHP isn't very efficient; using a redirect is the way to go. If you don't want to expose the location of the file, or the file isn't in a public location, then look into internal redirects. Here is a post that talks about it a bit: Can I tell Apache to do an internal redirect from PHP?
Try using stream_copy_to_stream() instead. I find it has fewer problems than readfile().
set_time_limit(0);
$stdout = fopen('php://output', 'w');
$bfname = basename($fname);
header("Content-type: application/octet-stream");
header("Content-Disposition: attachment; filename=\"$bfname\"");
$filein = fopen($fname, 'r');
stream_copy_to_stream($filein, $stdout);
fclose($filein);
fclose($stdout);
Under Apache, there is a nice, elegant solution not involving PHP at all:
Just place an .htaccess config file into the folder containing the files to be offered for download, with the following contents:
<Files *.*>
    ForceType application/octet-stream
</Files>
This tells Apache to offer all files in this folder (and all its subfolders) for download, instead of displaying them directly in the browser.
See the example on the readfile() manual page:
http://php.net/manual/en/function.readfile.php
<?php
$file = 'monkey.gif';

if (file_exists($file)) {
    header('Content-Description: File Transfer');
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename='.basename($file));
    header('Content-Transfer-Encoding: binary');
    header('Expires: 0');
    header('Cache-Control: must-revalidate');
    header('Pragma: public');
    header('Content-Length: ' . filesize($file));
    ob_clean();
    flush();
    readfile($file);
    exit;
}
?>
