Downloading files with PHP - Only downloading one at a time! - php

I have a PHP file that serves up a file, but the problem is that no matter what browser is being used, if you click on 2 links that go to 2 separate files, the second download doesn't start until the first one is complete! Any ideas?
Download Code
header('Content-Type: application/octet-stream');
header('Content-Description: File Transfer');
header('Content-Disposition: attachment; filename="'.basename($filename).'"');
header('Content-Transfer-Encoding: binary');
header('Expires: 0');
header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
header('Pragma: public');
header('Content-Length: ' . filesize($fullpath));
readfile($fullpath);
Example Links
Link 1: download.php?downloadfile=1
Link 2: download.php?downloadfile=2

There could be different reasons for this.
You are using sessions. Therefore only one script at a time is allowed to modify the session, so download B can only start after download A has finished. Did you try two concurrent downloads with download A in browser A and download B in browser B? Check the description of session_write_close (see the sketch after these points).
Some other HTTP issue where your browser won't open multiple connections to the server but reuses a single connection, which of course has to wait until the first request finishes.
Some OS/web server setting that only allows a very limited number of concurrent open connections, either in total or per host.
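If sessions are the cause, the fix is to commit and release the session before you start streaming. A minimal sketch of download.php, assuming the session is only needed for the access check and $fullpath has already been resolved and validated from $_GET['downloadfile']:
<?php
session_start();

// ... authenticate the user and resolve $fullpath from $_GET['downloadfile'] ...

// Write the session data and release the lock so a second download
// (or any other request in the same session) is not blocked while
// this file is being streamed.
session_write_close();

header('Content-Type: application/octet-stream');
header('Content-Description: File Transfer');
header('Content-Disposition: attachment; filename="'.basename($fullpath).'"');
header('Content-Length: ' . filesize($fullpath));
readfile($fullpath);
exit;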

Related

Check permission and log download with PHP, but use original link to download the file?

I'm using the following code to check whether a user has permission to download a file and to log that they downloaded it. It works well, but the files are hosted on Dropbox, and I assume they are being downloaded through my server? I have recently had CPU spikes and complete server stalls, so I'm looking into either optimizing the code below if possible, or making it so that the access check and download count happen and the user is then redirected to the Dropbox link. Any suggestions?
<?php
/* PHP code here (not included in this snippet) to check for access and log download */
session_write_close();
header("Cache-control: private");
header("Content-Type: application/octet-stream");
header("Content-Disposition: attachment; filename=\"file.mov\"");
header('Content-Transfer-Encoding: binary');
// Disable caching
header('Cache-Control: no-cache, no-store, must-revalidate'); // HTTP 1.1.
header('Pragma: no-cache'); // HTTP 1.0.
header('Expires: 0'); // Proxies.
readfile('https://dropboxlink...');
exit;
?>
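If the goal is to keep the access check in PHP but take the actual transfer off your own server, one possibility (only a sketch, and a trade-off: anyone who learns the URL can fetch the file without going through your check) is to redirect to the direct Dropbox download link after logging the download:
<?php
/* PHP code here to check for access and log the download, as before */
session_write_close();

// The share link in its direct-download form (Dropbox links usually
// accept a ?dl=1 parameter to force a download instead of the preview page).
$dropboxUrl = 'https://dropboxlink...';

header('Location: ' . $dropboxUrl, true, 302);
exit;
?>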

One Time Download Links and large files

I'm using this script:
http://www.webvamp.co.uk/blog/coding/creating-one-time-download-links/
to allow users to download files (one time). Everything works fine with small files. Now I'm trying to do the same with a larger file (1.2 GB). Instead of forcing the user to download the file, the script shows the relative path to the file! Is there any way to modify the script, or is it a fault of the server configuration?
Thanks for the help!
Looking at the code, I think it fails on large files because of memory limitations: the script reads the whole file into memory via file_get_contents() before sending it. I suspect files over 1 GB will cause memory problems.
Try replacing the following lines in the download.php script:
//get the file content
$strFile = file_get_contents($strDownload);
//set the headers to force a download
header("Content-type: application/force-download");
header("Content-Disposition: attachment;
filename=\"".str_replace(" ", "_", $arrCheck['file'])."\"");
//echo the file to the user
echo $strFile;
to:
header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename='.basename($strDownload));
header('Content-Transfer-Encoding: binary');
header('Expires: 0');
header('Cache-Control: must-revalidate');
header('Pragma: public');
header('Content-Length: ' . filesize($strDownload));
ob_clean();
flush();
readfile($strDownload);
This may help.
Note from the manual: readfile() will not present any memory issues, even when sending large files, on its own.

Webpages not served during large file transfers (IIS7.5 + PHP)

For some reason, our webserver is not responding while it's serving large files.
We use the Windows platform because we need to remotely call Win32 applications in order to generate the file that is to be served. The file is served through PHP's fpassthru() function, using this code:
if (file_exists($file)) {
    $handle = @fopen($file, "rb");
    header('Content-Description: File Transfer');
    header('Content-Type: video/mp4');
    if ($stream == 0) {
        header('Content-Disposition: attachment; filename=' . basename($filename . ".mp4"));
    }
    header('Content-Transfer-Encoding: binary');
    header('Expires: 0');
    header('Cache-Control: must-revalidate');
    header('Pragma: public');
    header('Content-Length: ' . filesize($file));
    ob_end_clean();
    fpassthru($handle);
    exit;
}
These files are often over 1 GB in size and take a while to transfer, but during this time the web server will not serve any pages. My Firefox indicates it's 'connecting' but nothing else. Note that somebody else is transferring this file, not me, so it's a different IP and a different session.
Any clue where to look? Obviously, it's intolerable to have to wait 5 minutes for a website.
Thanks in advance!
This is commonly caused when you do not close the session before you begin sending the file data. The session file can only be opened by one PHP process at a time, so the download effectively blocks every other PHP request that uses the same session at session_start().
The solution is to call session_write_close() to commit the session data to disk and close the file handle before you start outputting the file data.
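Applied to the code above, that means calling session_write_close() before any of the file data is sent, roughly like this (a sketch, not the poster's exact script):
if (file_exists($file)) {
    // Commit the session data and release its lock so other requests
    // that share this session are served while the file is streaming.
    session_write_close();

    $handle = @fopen($file, "rb");
    // ... same headers as in the original code ...
    ob_end_clean();
    fpassthru($handle);
    exit;
}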

How to securely store/serve files on a webserver via PHP that are only accessible via a secure login area?

I am currently creating a PHP website that allows administrators to upload a variety of documents (pdf, doc, docx, xls) which can then be downloaded at a later date. These can only be accessed by administrators after they have logged in. Up until now I have been storing the files above the web root and then serving them via a PHP script, hence preventing direct access to the files. This does work, but it never seems like an ideal way to do it, as it relies on setting the correct headers via PHP for the file download, which does not always give the correct results in all browsers. I can't really see any other way of doing it that would also stop the files from being publicly accessible if someone knew where they were located.
What process would you usually use to store and serve files on a web server that should not be publicly accessible?
Sample PHP:
<?php
session_start();
// Serve the file only to logged-in administrators
if (TRUE !== $_SESSION['logged_in']) {
    exit('Access denied');
}
$file = '/full/path/to/useruploads/secret.pdf';
header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename=' . basename($file));
header('Content-Transfer-Encoding: binary');
header('Expires: 0');
header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
header('Pragma: public');
header('Content-Length: ' . filesize($file));
ob_clean();
flush();
readfile($file);
exit;
?>

php, file download

I am using the following simple file download script:
if (file_exists($file)) {
    header('Content-Description: File Transfer');
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename='.basename($file));
    header('Content-Transfer-Encoding: binary');
    header('Expires: 0');
    header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
    header('Pragma: public');
    header('Content-Length: ' . filesize($file));
    ob_clean();
    flush();
    readfile($file);
    exit;
}
It works on my local server for files up to 200 MB.
When I try this code on my website, it downloads 173 KB instead of the 200 MB file.
I checked everything and wrote some custom code (using ob functions and fread instead of readfile), but I still can't download big files.
Thank you for your answers.
I am using Apache 2.2 and PHP 5.3.
All PHP settings for dealing with big files are OK (execution times, memory limits, ...).
One issue with the code above is that you have no control over the output stream; you're letting PHP handle it without knowing exactly what is going on in the background.
What you should do is set up an output system that you can control and replicate across servers.
For example:
if (file_exists($file))
{
    if (FALSE !== ($handler = fopen($file, 'r')))
    {
        header('Content-Description: File Transfer');
        header('Content-Type: application/octet-stream');
        header('Content-Disposition: attachment; filename='.basename($file));
        header('Content-Transfer-Encoding: chunked'); //changed to chunked
        header('Expires: 0');
        header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
        header('Pragma: public');
        //header('Content-Length: ' . filesize($file)); //Remove
        //Send the content in chunks
        while (false !== ($chunk = fread($handler, 4096)))
        {
            echo $chunk;
        }
    }
    exit;
}
echo "<h1>Content error</h1><p>The file does not exist!</p>";
This is only basic but give it a go!
Also read my reply here: file_get_contents => PHP Fatal error: Allowed memory exhausted
It seems readfile() can have issues with large files. As @Khez asked, it could be that the script is running for too long. A quick Googling turned up a couple of examples of chunking the file:
http://teddy.fr/blog/how-serve-big-files-through-php
http://www.php.net/manual/en/function.readfile.php#99406
One solution for certain scenarios is to use the PHP script to decide intelligently which file to download and from where, but instead of sending the file directly from PHP, return a redirect to the client containing a direct link that is then handled by the web server alone.
This could be done in at least two ways: either the PHP script copies the file into a "download zone", which might, for example, be cleaned of "old" files regularly by some other background/service script, or you expose the real permanent location to the clients.
There are of course drawbacks, as with every solution. With the first, depending on the client requesting the file (curl, wget, a GUI browser), the redirect you issue may not be followed; with the second, the files are very exposed to the outside world and can be read at any time without the (access) control of the PHP script.
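A rough sketch of the first variant, assuming the web server serves a downloads/ directory next to the script and some cron job purges old copies (all names here are illustrative):
<?php
// Illustrative paths only.
$source  = '/data/protected/report.pdf';    // file outside the web root
$zoneDir = __DIR__ . '/downloads/';         // publicly served directory

// Copy under a hard-to-guess name so the URL cannot easily be guessed.
$token    = md5(uniqid(mt_rand(), true));
$publicFn = $token . '-' . basename($source);
copy($source, $zoneDir . $publicFn);

// Redirect; from here on the web server handles the transfer by itself.
header('Location: /downloads/' . rawurlencode($publicFn), true, 302);
exit;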
Have you made sure your script can run long enough and has enough memory?
Do you really need output buffering?
The real solution is to avoid using a PHP script for just sending a file to the client, it's overkill and your webserver is better suited for the task.
Presumably you have a reason for sending the files through PHP, perhaps users must authenticate first? If that is the case then you should use X-Accel-Redirect (if you're using nginx) or X-Sendfile (previously X-LIGHTTPD-send-file) on lighttpd.
If you're using Apache I've found a few references to mod_xsendfile but I've never used it personally, and I doubt it's installed for you if you have managed hosting.
If these solutions are untenable I apologise, but I really need more information on the actual problem: Why are you sending these files through PHP in the first place?
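For what it's worth, the PHP side of such a setup is tiny. The sketch below assumes Apache with mod_xsendfile installed and enabled for the path in question (XSendFile On plus a matching XSendFilePath directive); on nginx the equivalent would be an X-Accel-Redirect header pointing at an internal location:
<?php
session_start();
if (empty($_SESSION['logged_in'])) {
    header('HTTP/1.1 403 Forbidden');
    exit;
}
session_write_close();

$file = '/full/path/to/useruploads/secret.pdf';    // stored outside the web root

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($file) . '"');
// mod_xsendfile intercepts this header and streams the file itself,
// so PHP never touches the file contents.
header('X-Sendfile: ' . $file);
exit;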
