I am currently developing a PHP application in which my server downloads a file and the user can download the same file almost simultaneously. I have already thought about the problem of the user downloading faster than the server, but it is not an issue at this point.
To do so, I used PHP's header() and readfile() functions. Here is my code:
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="'.$data['name'].'";');
header('Content-Transfer-Encoding: binary');
header('Content-Length: '.$data['size']);
readfile($remoteFile);
I have to use the Content-Length header to report the file's full size, not just the amount that has been downloaded when the user clicks the link. However, after a few seconds or minutes the download stops and I have to restart it...
If you can think of a solution, even one that doesn't use the header() function, please tell me.
Thank you in advance...
In my experience this is directly related to the maximum runtime settings that are enforced on you when you run with safe_mode on.
If you have the option, try setting set_time_limit(0) and see if that makes it work.
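For example, a minimal sketch applied to the question's own code ($data and $remoteFile come from the question):
set_time_limit(0); // 0 removes the limit; note this has no effect when safe_mode is on
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $data['name'] . '"');
header('Content-Length: ' . $data['size']);
readfile($remoteFile);
exit;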
If you have your own server, you should look into the mod_xsendfile module for Apache, since it is built specifically for sending large files to the user.
Oh, and it's stupidly easy to use:
header("X-Sendfile: $path_to_somefile");
header("Content-Type: application/octet-stream");
header("Content-Disposition: attachment; filename=\"$somefile\"");
exit;
I recently found a YouTube video about accessing a PDF in a web browser. It was interesting because, as I suspected, I could access a PDF almost anywhere on the system (given permissions) and pass it along via PHP to a web browser.
I do not want this to become a security discussion, so please abstain from security comments!
https://www.youtube.com/watch?v=z4a6QJSGL28
The code looks something like this:
$file="/home/kodi/Pictures/scans/test.pdf";
$filename="test.pdf";
header('Content-type: application/pdf');
header('Content-disposition: inline; filename="'.$filename.'"');
header('Content-Transfer-Encoding: binary');
header('Accept-Ranges: bytes');
readfile($file);
So I tested it and it works. Then I tried this:
$file="/home/kodi/Pictures/scans/test.jpg";
$filename="test.jpg";
header('Content-type: application/jpg');
header('Content-disposition: inline; filename="'.$filename.'"');
header('Content-Transfer-Encoding: binary');
header('Accept-Ranges: bytes');
readfile($file);
Oddly enough, this did NOT display in a web browser, but several seconds later my preferred app for downloaded JPGs opened and there was the file. It had downloaded instead of opening in the browser.
I have been working on a LAN app that can access users' folders on the server through PAM authentication, and I have been using base64 encoding. However, I felt that if I could avoid the overhead of base64 images (30%), the conversions, and literally converting each image in a list into a display link, the system would load faster. This would definitely be a better way, but everyone says that this kind of thing does not work! The proof that there is hope that it will work is the file I downloaded from the web server that was in a user folder.
Any ideas how to make it work with JPGs?
Your content type is incorrect: it should be image/jpeg instead of application/jpg.
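For example, a minimal sketch of the question's JPG snippet with the corrected type:
$file = "/home/kodi/Pictures/scans/test.jpg";
$filename = "test.jpg";
header('Content-Type: image/jpeg'); // image/jpeg is the registered MIME type for JPEG
header('Content-Disposition: inline; filename="'.$filename.'"');
readfile($file);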
My company stores all PDF files privately in Amazon S3.
When the user requests one, our system pulls it from Amazon S3 and then serves it to the user with the following code:
header("Cache-Control: public");
header("Pragma: public");
header("Expires: 0");
header("Content-Description: File Transfer");
header('Content-Disposition: attachment; filename="'.$fileName.'"');
header('Content-Length: ' . strlen($res->body));
header("Content-type: application/pdf");
header("Content-Transfer-Encoding: binary");
header('Connection: close');
echo $res->body;
$res is the response returned from Amazon, with the file content in $res->body.
I see random slow download speeds when users try to download the PDF files, especially when the PDF is large (~5 MB) compared to the rest, which are only 800 KB-1.5 MB.
Solutions tried:
1) Removing the Content-Length header doesn't help.
2) Setting EnableSendfile off in httpd.conf doesn't help either.
I also checked the server to make sure its workload wasn't causing this.
Speed tests of both the server and the user's workstation look good too.
Does anyone have any idea what is causing this slowness?
It might be that multiple people from your company are using the internet connection at the same time.
You should do a full scan of the network, but even then you're not alone on the internet, and Amazon's "virtual" server environments are shared as well.
From my understanding, the issue is that it simply takes time to fetch the file from S3 in order to return it to the user.
Do yourself a favor:
create a signed URL for a short period of time
redirect the user to that URL (a sketch follows below)
Don't worry: creating signed URLs doesn't expose any private information that compromises your security (if you do it correctly).
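A minimal sketch with the AWS SDK for PHP v3 (the bucket, key, region, and expiry here are all placeholders):
use Aws\S3\S3Client;

$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1', // placeholder region
]);

// Build a GetObject request and pre-sign it for a short window.
$cmd = $s3->getCommand('GetObject', [
    'Bucket' => 'my-private-bucket', // placeholder bucket
    'Key'    => 'docs/report.pdf',   // placeholder key
]);
$request = $s3->createPresignedRequest($cmd, '+10 minutes');

// Send the user straight to S3 instead of proxying the bytes through PHP.
header('Location: ' . (string) $request->getUri());
exit;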
When using readfile() -- using PHP on Apache -- is the file immediately read into Apache's output buffer and the PHP script execution completed, or does the PHP script execution wait until the client finishes downloading the file (or the server times out, whichever happens first)?
The longer back-story:
I have a website with lots of large mp3 files (sermons for a local church). Not all files in the audio archive are allowed to be downloaded, so the /sermon/{filename}.mp3 path is rewritten to really execute /sermon.php?filename={filename} and if the file is allowed to be downloaded then the content type is set to "audio/mpeg" and the file streamed out using readfile(). I've been getting complaints (almost exclusively from iPhone users who are streaming the downloads over 3G) that the files don't fully download, or that they cut off after about 10 or 15 minutes. When I switched from streaming out the file with a readfile() to simply redirecting to the file -- header("Location: $file_url"); -- all of the complaints went away (I even checked with a few users who could reliably reproduce the problem on demand previously).
This leads me to suspect that when using readfile() the PHP script engine is in use until the file is fully downloaded, but I cannot find any references which confirm or deny this theory. I'll admit I'm more at home in the ASP.NET world, and the .NET equivalent of readfile() pushes the whole file to the IIS output buffer immediately so the ASP.NET execution pipeline can complete independently of the delivery of the file to the end client... is there an equivalent to this behavior with PHP + Apache?
You may still have PHP output buffering active while performing the readfile(). Check that with:
if (ob_get_level()) ob_end_clean();
or
while (ob_get_level()) ob_end_clean();
This way the only remaining output buffer should be Apache's output buffer; see SendBufferSize for Apache tweaks.
EDIT
You can also have a look at mod_xsendfile (an SO post on such usage: PHP + Apache + X-Sendfile), so that you simply tell the web server you have done the security check and that it can now deliver the file.
A few things you can do (I am omitting the headers you need to send, which are probably the same ones you currently have in your script):
set_time_limit(0); // as already mentioned
readfile($filename);
exit(0);
or
passthru('/bin/cat ' . escapeshellarg($filename)); // escape the path to avoid shell injection
exit(0);
or
//When you enable mod_xsendfile in Apache
header("X-Sendfile: $filename");
or
//mainly useful for remote files
$handle = fopen($filename, "rb");
echo stream_get_contents($handle);
fclose($handle);
or
$handle = fopen($filename, "rb");
while (!feof($handle)) {
    // Stop if the user closed the connection mid-download
    if (connection_aborted()) {
        break;
    }
    echo fread($handle, 8192);
    flush(); // push the chunk out so aborts can be detected
}
fclose($handle);
The script will keep running until the user finishes downloading the file. The simplest, most efficient, and surely working solution is to redirect the user:
header("Location: /real/path/to/file");
exit;
But this may reveal the location of the files. It's a good idea to password-protect files that may not be downloaded by everyone with an .htaccess file anyway, but perhaps you use a database to determine access and this is not an option.
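For reference, a minimal .htaccess sketch for HTTP Basic protection (the realm name and the .htpasswd path are placeholders):
AuthType Basic
AuthName "Protected downloads"
AuthUserFile /path/to/.htpasswd
Require valid-user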
Another possible solution is setting the maximum execution time of PHP to 0, which disables the limit:
set_time_limit(0);
Your host may disallow this, though. Also, PHP reads the file into memory first, then it goes through Apache's output buffer, and finally makes it to the network. Making users download the file directly is much more efficient, and it does not have PHP's limitations like the maximum execution time.
Edit: The reason you get this complaint a lot from iPhone users is probably that they have a slower connection (e.g. 3G).
Downloading files through PHP isn't very efficient; using a redirect is the way to go. If you don't want to expose the location of the file, or the file isn't in a public location, then look into internal redirects. Here is a post that talks about it a bit: Can I tell Apache to do an internal redirect from PHP?
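One way to do an internal redirect is PHP's virtual() function, which performs an Apache sub-request; a minimal sketch (the path is a placeholder, user_may_download() is a hypothetical access check, and this only works when PHP runs as an Apache module, not under FastCGI/FPM):
// Apache serves the file via a sub-request; the client never sees the real path.
if (user_may_download()) { // hypothetical access check
    virtual('/protected/sermon.mp3'); // placeholder path
}
exit;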
Try using stream_copy_to_stream() instead. I find it has fewer problems than readfile().
set_time_limit(0);
$stdout = fopen('php://output', 'w');
$bfname = basename($fname);
header("Content-type: application/octet-stream");
header("Content-Disposition: attachment; filename=\"$bfname\"");
$filein = fopen($fname, 'rb'); // binary-safe open
stream_copy_to_stream($filein, $stdout);
fclose($filein);
fclose($stdout);
Under Apache, there is a nice, elegant solution that doesn't involve PHP at all:
Just place an .htaccess config file into the folder containing the files to be offered for download with the following contents:
<Files *.*>
ForceType application/octet-stream
</Files>
This tells Apache to offer all files in this folder (and all its subfolders) for download, instead of displaying them directly in the browser.
See the example from the readfile() manual page:
http://php.net/manual/en/function.readfile.php
<?php
$file = 'monkey.gif';
if (file_exists($file)) {
    header('Content-Description: File Transfer');
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename='.basename($file));
    header('Content-Transfer-Encoding: binary');
    header('Expires: 0');
    header('Cache-Control: must-revalidate');
    header('Pragma: public');
    header('Content-Length: ' . filesize($file));
    ob_clean();
    flush();
    readfile($file);
    exit;
}
?>
I have a bad problem on my site where my file downloads are getting truncated when I use PHP to set the Content-Disposition to "attachment" and then send the file using readfile(). I know this is a somewhat notorious problem, since there is plenty of discussion about it on PHP's readfile() manual page. I have tried each of the solutions posted there with no luck (sending chunked output, disabling gzip compression in both Apache and PHP). Here is the gist of that code, without all the workarounds, just as you'd find it on the PHP manual:
$file = '/path/to/ugly.file.name.mp4';
$filename = 'ANiceFileName.mp4';
header('Content-Description: File Transfer');
header('Content-Type: video/mp4');
header('Content-Disposition: attachment; filename="'.$filename.'"');
header('Content-Transfer-Encoding: binary');
header('Expires: 0');
header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
header('Pragma: public');
header('Content-Length: ' . filesize($file));
ob_clean();
flush();
readfile($file);
Most importantly, notice that I'd like to be able to send a dynamic file name to the browser, to avoid the more machine-friendly codes I use to store the files on disk. And yes, I have done plenty of security checks before this bit of code executes -- I already know readfile() can be a security flaw. :-)
The only reliable way I have found to send a complete file is to simply trash PHP and instead use Apache's mod_headers to force the file as a download, like so:
<FilesMatch "\.mp4$">
ForceType application/octet-stream
Header set Content-Disposition attachment
</FilesMatch>
This works very well, but there's a big problem -- I can no longer generate dynamic file names upon request, which is somewhat important to my content. I already know about specifying ;filename=file_name.mp4, but remember - I need to specify a dynamic name rather than just file_name.mp4.
It would be really nice if I could somehow notify Apache (using PHP) of the dynamic file name when sending a file. Is that possible? Or am I going to be forced to rename all my files on disk to user-friendly names? The PHP solutions just aren't working at all. My files are sometimes up to 15 MB, and those are often truncated to less than 2 MB when transferring. Ouch!
Help me Stack Overflow Kenobi! You're my only hope.
Try mod_xsendfile as described here. This basically allows PHP to step out of the middle of the HTTP conversation at the point where PHP sets the X-Sendfile header -- but, as you can see on that blog, it still lets you set your own Content-Disposition header.
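A minimal sketch of that approach, assuming mod_xsendfile is installed and enabled for the vhost (the names reuse the question's example):
$file = '/path/to/ugly.file.name.mp4';
$filename = 'ANiceFileName.mp4';

// Do your security checks first; Apache then takes over the actual transfer.
header('Content-Type: video/mp4');
header('Content-Disposition: attachment; filename="'.$filename.'"');
header('X-Sendfile: ' . $file);
exit;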
Is it getting truncated because the PHP script times out? See set_time_limit().
Many users of my site have reported problems downloading a large file (80 MB). I am forcing the download using headers. I can provide additional PHP settings if necessary. I am using the CakePHP framework, but this code is all regular PHP. I am running PHP 5.2 with Apache on a dedicated virtual server from Media Temple, on CentOS Linux. Do you see any problems with the following code?
set_time_limit(1500);
header("Content-Type: application/octet-stream");
header("Content-Disposition: attachment; filename=\"" . basename($file_path) . "\"");
header("Content-Length: ".$content_length);
header("Content-Transfer-Encoding: binary");
header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
header('Cache-Control: private', false);
header('Pragma: public');
header('Expires: 0');
//Change this part
$handle = fopen($file_path, 'rb');
while (!feof($handle))
{
echo fread($handle, 4096);
ob_flush();
flush();
}
fclose($handle);
exit;
Basically, the problem being reported is that the download starts and then stops in the middle. I thought it might be a problem with the time limit, so I added the set_time_limit() call. I was using PHP's readfile() function before, but that did not work smoothly either.
The problem with PHP-initiated HTTP transfers is that they seldom support partial requests:
GET /yourfile HTTP/1.1
Range: bytes=31489531-79837582
Whenever a browser encounters a transmission problem, it will try to resume the download. Your PHP script does not accommodate that (it's not trivial, so nobody does).
So really, avoid that. Redirect users to a static file and let your web server handle it. If you need to handle authorization, use tricks like symlinks or RewriteRules that check for session cookies, or even a static permission file (./allowed/178.224.2.55-file-1). Any required extra HTTP headers can be injected likewise, or with a .meta file.
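If you do have to keep PHP in the middle, here is a rough single-range sketch of what resuming involves ($file is assumed to be an already-validated path; multi-range and If-Range requests are ignored):
$size  = filesize($file);
$start = 0;
$end   = $size - 1;

// Parse a simple "Range: bytes=X-Y" header if the client sent one.
if (isset($_SERVER['HTTP_RANGE']) &&
    preg_match('/bytes=(\d*)-(\d*)/', $_SERVER['HTTP_RANGE'], $m)) {
    if ($m[1] !== '') {
        $start = (int) $m[1];
        if ($m[2] !== '') {
            $end = min((int) $m[2], $size - 1);
        }
    } elseif ($m[2] !== '') {
        $start = max($size - (int) $m[2], 0); // suffix range: last N bytes
    }
    header('HTTP/1.1 206 Partial Content');
    header("Content-Range: bytes $start-$end/$size");
}

header('Accept-Ranges: bytes');
header('Content-Length: ' . ($end - $start + 1));

// Stream only the requested slice of the file.
$fp = fopen($file, 'rb');
fseek($fp, $start);
$left = $end - $start + 1;
while ($left > 0 && !feof($fp)) {
    $chunk = fread($fp, min(8192, $left));
    if ($chunk === false) break;
    echo $chunk;
    $left -= strlen($chunk);
}
fclose($fp);
exit;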
I don't see any trouble, but for S&G's, try placing the set_time_limit() call inside the while loop. This ensures they don't hit a hard limit, and (as long as the client keeps taking the data) the time limit gets extended.
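For example, the question's loop with the limit reset on each pass (a sketch; 30 seconds per chunk is an arbitrary choice):
$handle = fopen($file_path, 'rb');
while (!feof($handle)) {
    set_time_limit(30); // restart the execution clock for every chunk
    echo fread($handle, 4096);
    ob_flush();
    flush();
}
fclose($handle);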