I am using this simple file download script:
if (file_exists($file)) {
    header('Content-Description: File Transfer');
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename='.basename($file));
    header('Content-Transfer-Encoding: binary');
    header('Expires: 0');
    header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
    header('Pragma: public');
    header('Content-Length: ' . filesize($file));
    ob_clean();
    flush();
    readfile($file);
    exit;
}
It works on my local server for files up to 200 MB.
When I try this code on my website, it downloads only 173 KB of the 200 MB file.
I checked everything and wrote some custom code (using ob functions and fread instead of readfile), but I still can't download big files.
Thank you for your answers.
I am using Apache 2.2 and PHP 5.3.
All PHP settings that deal with big files are OK (execution time, memory limit, ...).
One issue I have with your code is that you have no control over the output stream; you're letting PHP handle it without knowing exactly what is going on in the background.
What you should do is set up an output system that you can control and replicate across servers.
For example:
if (file_exists($file))
{
    if (FALSE !== ($handler = fopen($file, 'rb')))
    {
        header('Content-Description: File Transfer');
        header('Content-Type: application/octet-stream');
        header('Content-Disposition: attachment; filename='.basename($file));
        header('Content-Transfer-Encoding: chunked'); //changed to chunked
        header('Expires: 0');
        header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
        header('Pragma: public');
        //header('Content-Length: ' . filesize($file)); //Removed: the length is not known up front
        //Send the content in chunks; fread() returns '' (not false) at EOF, so loop on feof()
        while (!feof($handler))
        {
            echo fread($handler, 4096);
            flush();
        }
        fclose($handler);
    }
    exit;
}
echo "<h1>Content error</h1><p>The file does not exist!</p>";
echo "<h1>Content error</h1><p>The file does not exist!</p>";
This is only basic but give it a go!
Also read my reply here: file_get_contents => PHP Fatal error: Allowed memory exhausted
It seems readfile can have issues with long files. As @Khez asked, it could be that the script is running for too long. A quick Google search turned up a couple of examples of chunking the file:
http://teddy.fr/blog/how-serve-big-files-through-php
http://www.php.net/manual/en/function.readfile.php#99406
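For reference, the chunked approach those links describe boils down to something like this (a minimal sketch; the readfile_chunked name and the 1 MB chunk size are just illustrative):
<?php
// Stream a file in fixed-size pieces instead of handing it to readfile() in one go.
function readfile_chunked($filename, $chunkSize = 1048576) // 1 MB per chunk
{
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        echo fread($handle, $chunkSize);
        // Push each chunk to the client so memory use stays flat
        if (ob_get_level() > 0) {
            ob_flush();
        }
        flush();
    }
    return fclose($handle);
}
?>
The surrounding headers can stay as they are; you would simply call readfile_chunked($file) where readfile($file) was.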
One solution for certain scenarios is to let the PHP script decide intelligently which file to serve and from where, but instead of sending the file directly from PHP, return a redirect to the client; the client then follows a direct link that is handled by the web server alone.
This can be done in at least two ways: either the PHP script copies the file into a "download zone", which is regularly cleaned of "old" files by some other background/service script, or you expose the real, permanent location to the clients.
There are of course drawbacks, as with every solution. With the first, depending on the client requesting the file (curl, wget, a GUI browser), it may not follow the redirect you issue; with the second, the files are very exposed to the outside world and can be read at any time without any (access) control from the PHP script.
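A rough sketch of the first variant, the "download zone" copy plus redirect (the paths, the user_may_download() check, and the cleanup-by-cron idea are all illustrative assumptions):
<?php
// Decide in PHP whether the client may have the file, but let the web server send it.
$file = '/var/files/private/report.pdf';            // protected original, outside the docroot
$zone = $_SERVER['DOCUMENT_ROOT'] . '/downloads';   // publicly served "download zone"

if (user_may_download($file)) {                      // hypothetical access check
    // Copy under an unguessable name; a background script later deletes old copies.
    $token = md5(uniqid(mt_rand(), true));
    $copy  = $zone . '/' . $token . '-' . basename($file);
    copy($file, $copy);
    // Redirect: from here on, Apache serves the file directly (ranges, resume, etc. included).
    header('Location: /downloads/' . rawurlencode(basename($copy)));
    exit;
}
header('HTTP/1.1 403 Forbidden');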
Have you made sure your script can run long enough and has enough memory?
Do you really need output buffering?
The real solution is to avoid using a PHP script just to send a file to the client; it's overkill, and your web server is better suited for the task.
Presumably you have a reason for sending the files through PHP. Perhaps users must authenticate first? If that is the case, then you should use X-Accel-Redirect (if you're using nginx) or X-Sendfile (previously X-LIGHTTPD-send-file) on lighttpd.
If you're using Apache, I've found a few references to mod_xsendfile, but I've never used it personally, and I doubt it's installed for you if you have managed hosting.
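For illustration, the X-Sendfile idea boils down to something like this (the auth check and the file path are placeholders, and mod_xsendfile's XSendFilePath directive must allow the directory; on nginx you would send X-Accel-Redirect to an internal location instead):
<?php
if (!user_is_authenticated()) {   // hypothetical check: this is why PHP is involved at all
    header('HTTP/1.1 403 Forbidden');
    exit;
}
$file = '/var/files/private/video.mp4';
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($file) . '"');
// PHP stops here; the web server module streams the file (with range support).
header('X-Sendfile: ' . $file);
exit;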
If these solutions are untenable, I apologise, but I really need more information about the actual problem: why are you sending these files through PHP in the first place?
Related
I'm using this script:
http://www.webvamp.co.uk/blog/coding/creating-one-time-download-links/
to allow users to download files (one time). Everything works fine with small files. Now I'm trying to do the same with a larger file, 1.2 GB. Instead of forcing the user to download the file, the script just shows the relative path to the file! Is there any way to modify the script, or is this a fault of the server configuration?
Thanks for help!
Looking at the code, I think it fails on large files due to a memory limitation: the script reads the whole file into memory via file_get_contents() before sending it. I suspect files larger than 1 GB will cause memory problems.
Try replacing the following lines in the download.php script:
//get the file content
$strFile = file_get_contents($strDownload);
//set the headers to force a download
header("Content-type: application/force-download");
header("Content-Disposition: attachment;
filename=\"".str_replace(" ", "_", $arrCheck['file'])."\"");
//echo the file to the user
echo $strFile;
to:
header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename='.basename($strDownload));
header('Content-Transfer-Encoding: binary');
header('Expires: 0');
header('Cache-Control: must-revalidate');
header('Pragma: public');
header('Content-Length: ' . filesize($strDownload));
ob_clean();
flush();
readfile($strDownload);
This may help.
Note from the manual: readfile() will not present any memory issues, even when sending large files, on its own.
For some reason, our web server is not responding while it's serving large files.
We use the Windows platform because we need to remotely call Win32 applications in order to generate the file that is to be served. The file is served through PHP's fpassthru() function, using this code:
if (file_exists($file)) {
    $handle = @fopen($file, "rb");
    header('Content-Description: File Transfer');
    header('Content-Type: video/mp4');
    if ($stream == 0) {
        header('Content-Disposition: attachment; filename='.basename($filename.".mp4"));
    }
    header('Content-Transfer-Encoding: binary');
    header('Expires: 0');
    header('Cache-Control: must-revalidate');
    header('Pragma: public');
    header('Content-Length: ' . filesize($file));
    ob_end_clean();
    fpassthru($handle);
    exit;
}
These files are often over 1 GB in size and take a while to transfer, but during this time the web server will not serve any other pages. My Firefox just says it's "connecting", nothing more. Note that somebody else is transferring this file, not me, so it's a different IP and a different session.
Any clue where to look? Obviously, it's intolerable to have to wait 5 minutes for a website.
Thanks in advance!
This is commonly caused by not closing the session before you begin sending the file data. The session file can only be held open by one PHP process at a time, so the download effectively blocks other PHP requests at session_start().
The solution is to call session_write_close() to commit the session data to disk and close the file handle before you start outputting the file data.
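A minimal sketch of that order of operations, assuming $file has already been set and validated elsewhere:
<?php
session_start();
// ... check that this session is allowed to download the file ...
session_write_close();   // commit and unlock the session file before the long transfer

header('Content-Type: video/mp4');
header('Content-Length: ' . filesize($file));
ob_end_clean();
fpassthru(fopen($file, 'rb'));
exit;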
I have fried my brain all day on this, researching SO until my eyes are bleary. I need to know: how do I access files placed outside the site root?
Background: Apache 2.0 dedicated server running Linux.
Code: PHP and MySQL
Reason: I want the files to be secured against someone simply typing the file path and filename into a browser.
This can't be that difficult...but my splitting head says otherwise. Any help would be absolutely appreciated.
Have a look at the answers to this question, which seem to be doing more or less the same thing.
A quick summary: readfile() or file_get_contents() is what you're after. This example comes from the readfile() manual page:
<?php
$file = 'monkey.gif';
if (file_exists($file)) {
    header('Content-Description: File Transfer');
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename='.basename($file));
    header('Content-Transfer-Encoding: binary');
    header('Expires: 0');
    header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
    header('Pragma: public');
    header('Content-Length: ' . filesize($file));
    ob_clean();
    flush();
    readfile($file);
    exit;
}
?>
I don't recommend allowing the $file variable to be set using user input! Think about where the filenames are coming from before arbitrarily returning files in the response.
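For example, a minimal sketch of validating a user-supplied name before serving a file from outside the document root (the /var/www_private/files directory and the file query parameter are assumptions, not part of the original question):
<?php
$storageDir = '/var/www_private/files';     // lives outside the site root
$requested  = basename($_GET['file']);      // basename() strips any ../ path tricks
$path       = $storageDir . '/' . $requested;

// Only serve the request if it really resolves to a file inside $storageDir.
if (is_file($path) && dirname(realpath($path)) === realpath($storageDir)) {
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . $requested . '"');
    header('Content-Length: ' . filesize($path));
    readfile($path);
    exit;
}
header('HTTP/1.1 404 Not Found');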
Are you trying to access files outside of the site root? Then you can look at this link on Stack Overflow.
And this is the official documentation from Apache.
Otherwise, you don't need any special handling to prevent others from accessing files outside the site root.
I have an mp3 on my server (urls are just examples):
http://www.my-server.com/myaudio.mp3
I have a PHP script on the server at:
http://www.my-server.com/testmp3.php
Which contains the following code (which I got here):
<?php
$file = "myaudio.mp3";
if (file_exists($file)) {
    header('Content-Description: File Transfer');
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename='.basename($file));
    header('Content-Transfer-Encoding: binary');
    header('Expires: 0');
    header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
    header('Pragma: public');
    header('Content-Length: ' . filesize($file));
    ob_clean();
    flush();
    readfile($file);
    exit;
}
?>
Is this all I have to do to mimic the behaviour so that both requests behave the same way and return the exact same response? Or is there anything I'm missing?
I'm using some streaming code on iOS (not relevant here), and both requests stream the audio fine, but I can't seek properly with the PHP request, whereas I can when hitting the mp3 directly.
So, without getting into details about the app itself, I wanted to eliminate this one variable first: is there anything I need to do to make sure that, from another app's perspective, these two requests return the exact same data?
Thanks for any input you can give me here.
Update
It turns out my question really should have read: "How do you support seeking in an mp3 when returning it from a PHP script?"
To support seeking, you will usually have to support range requests.
From the RFC: http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.35
See also: Resumable downloads when using PHP to send the file?
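To make the idea concrete, a heavily simplified sketch of honouring a single "Range: bytes=start-end" header from PHP might look like the following; multi-part and suffix ranges, If-Range and ETag validation are all left out, and the file name is a placeholder:
<?php
$file = 'myaudio.mp3';
$size = filesize($file);
$start = 0;
$end   = $size - 1;

header('Accept-Ranges: bytes');   // tell the client that seeking is possible at all

if (isset($_SERVER['HTTP_RANGE']) &&
    preg_match('/bytes=(\d+)-(\d*)/', $_SERVER['HTTP_RANGE'], $m)) {
    $start = (int)$m[1];
    if ($m[2] !== '') {
        $end = (int)$m[2];
    }
    header('HTTP/1.1 206 Partial Content');
    header("Content-Range: bytes $start-$end/$size");
}

header('Content-Type: audio/mpeg');
header('Content-Length: ' . ($end - $start + 1));

$fp = fopen($file, 'rb');
fseek($fp, $start);
$remaining = $end - $start + 1;
while ($remaining > 0 && !feof($fp)) {
    $chunk = fread($fp, min(8192, $remaining));
    echo $chunk;
    $remaining -= strlen($chunk);
}
fclose($fp);
exit;
The 206 status plus the Content-Range header are what let a player seek; without them it always has to start from byte zero, which matches the difference you are describing between the PHP URL and the direct mp3 URL.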
It's probably better to handle this with an .htaccess modification rather than with PHP code.
Here's a link on .htaccess to get you started.
If you have a whole directory of .mp3 files that you want to be offered as downloads instead of playing in the browser, you'd simply modify the .htaccess file in that folder to include:
AddType application/octet-stream .mp3
Many users of my site have reported problems downloading a large file (80 MB). I am forcing the download using headers. I can provide additional PHP settings if necessary. I am using the CakePHP framework, but this code is all plain PHP. I am running PHP 5.2 with Apache on a dedicated virtual server from Media Temple, on CentOS Linux. Do you see any problems with the following code?
set_time_limit(1500);
header("Content-Type: application/octet-stream");
header("Content-Disposition: attachment; filename=\"" . basename($file_path) . "\"");
header("Content-Length: ".$content_length);
header("Content-Transfer-Encoding: binary");
header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
header('Cache-Control: private', false);
header('Pragma: public');
header('Expires: 0');
//Change this part
$handle = fopen($file_path, 'rb');
while (!feof($handle))
{
    echo fread($handle, 4096);
    ob_flush();
    flush();
}
fclose($handle);
exit;
Basically, the problem being reported is that the download starts and then stops in the middle. I thought it might be a problem with the time limit, so I added the set_time_limit() call. I was using PHP's readfile() function before, but that did not work smoothly either.
The problem with PHP-initiated HTTP transfers is that they seldom support partial requests:
GET /yourfile HTTP/1.1
Range: bytes=31489531-79837582
Whenever a browser encounters a transmission problem, it will try to resume the download. Your PHP script does not accommodate that (it's not trivial, so nobody does).
So really, avoid that. Redirect users to a static file and let your web server handle it. If you need to handle authorization, use tricks like symlinks or rewrite rules that check for session cookies, or even a static permission file (./allowed/178.224.2.55-file-1). Any required extra HTTP headers can be injected likewise, or with a .meta file.
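As a rough illustration of the symlink trick (the directory layout, the naming scheme and the cleanup-by-cron idea are assumptions; Apache also needs Options FollowSymLinks for the allowed directory):
<?php
// PHP only decides *whether* the client may download; Apache does the actual transfer.
$realFile = '/data/files/file-1.bin';                       // stored outside the docroot
$allowDir = $_SERVER['DOCUMENT_ROOT'] . '/allowed';
$link     = $allowDir . '/' . $_SERVER['REMOTE_ADDR'] . '-file-1.bin';

if (!file_exists($link)) {
    symlink($realFile, $link);   // a background job removes stale links later
}
header('Location: /allowed/' . rawurlencode(basename($link)));
exit;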
I don't see any obvious trouble, but just to be safe, try placing the set_time_limit() call inside the while loop. This ensures the script doesn't hit a hard limit and, as long as the client keeps accepting data, the time limit keeps getting extended.
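Concretely, that suggestion amounts to something like this (the 30-second value is arbitrary; each call to set_time_limit() restarts the timer):
while (!feof($handle))
{
    set_time_limit(30);        // reset the execution timer for every chunk sent
    echo fread($handle, 4096);
    ob_flush();
    flush();
}
fclose($handle);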