Download large files with PHP

I'm using jQuery with PHP.
I've written a simple download function with PHP:
function downloadFile($sFile) {
    // Describe the download, then stream the file to the client
    header('Content-Type: ' . mime_content_type($sFile));
    header('Content-Description: File Transfer');
    header('Content-Length: ' . filesize($sFile));
    header('Content-Disposition: attachment; filename="' . basename($sFile) . '"');
    readfile($sFile);
}
I can download a file through this script, but if it's a large file (like 1 GB), readfile() takes its time before the download starts, so I have to wait a minute or so until the download really begins.
Any idea how to optimize my script so the download starts immediately?

You could configure Apache to set the proper headers in your .htaccess file. Then, you could link directly to the file instead of the PHP page. This will also reduce server load.
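For example, a sketch of such an .htaccess (the matched extensions are placeholders, and the Header directive requires mod_headers to be enabled):
<FilesMatch "\.(zip|mp3|pdf)$">
    ForceType application/octet-stream
    Header set Content-Disposition attachment
</FilesMatch>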
Of course, if the PHP script performs functions other than just setting headers (such as authentication), then this is not an option. You will have to pass the file through PHP in chunks, as N.B. mentions in his comment; see the sketch below.
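A minimal sketch of that chunked approach (the function name and the 8 KB chunk size are my own choices, not from the answer):
<?php
// Stream the file in 8 KB chunks so output reaches the client
// as it is read, instead of buffering the whole file first.
function downloadFileChunked($sFile) {
    header('Content-Type: ' . mime_content_type($sFile));
    header('Content-Description: File Transfer');
    header('Content-Length: ' . filesize($sFile));
    header('Content-Disposition: attachment; filename="' . basename($sFile) . '"');

    // Drop any PHP output buffers so echoed chunks go straight out
    while (ob_get_level()) {
        ob_end_clean();
    }

    $handle = fopen($sFile, 'rb');
    while (!feof($handle)) {
        echo fread($handle, 8192);
        flush();
    }
    fclose($handle);
}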


Start downloading a generated file and then interrupt the download

Does anyone know how PHP could generate a random file and start downloading it, then interrupt the download after a few seconds? I need to test some upload/download scripts and need some really large files generated, but the downloads should be interrupted halfway through. I was going to integrate the test utility with my debug script.
This can start the download of a file, but how can I interrupt the download after a random amount of time?
header('Content-Type: application/csv');
header("Content-length: " . filesize($NewFile));
header('Content-Disposition: attachment; filename="' . $FileName . '"');
echo $content;
exit();
You may use this bash script; save it as interrupter.sh and run it in the background with ./interrupter.sh &
#!/bin/sh
while :
do
    /etc/init.d/networking stop
    sleep 3
    /etc/init.d/networking start
    sleep 10
done
It will interrupt the download for 3 seconds at a time. You can customize the timing by changing the arguments to sleep.
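For the "generate a random file" half of the question, a sketch (the size and filename are placeholders; random_bytes() requires PHP 7+):
<?php
// Stream $size bytes of random data as a download, so there is
// something large to interrupt. Nothing is written to disk.
$size = 512 * 1024 * 1024; // 512 MB

header('Content-Type: application/octet-stream');
header('Content-Length: ' . $size);
header('Content-Disposition: attachment; filename="testfile.bin"');

$sent = 0;
while ($sent < $size) {
    $chunk = min(8192, $size - $sent);
    echo random_bytes($chunk);
    flush();
    $sent += $chunk;
}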

Processing a local file from a server

I have a PHP script that is currently working locally that I'd like to put on a server.
Currently, the user chooses a .txt file, the PHP script works on it, and it outputs a new file based on what it read from the input.
The problem is that I can only select files in the folder with the script, and not elsewhere.
I use an <input type="file"> element to get the file name, but it only gives the name of the file, not its absolute path.
From what I've read, I think I need to upload the file to the server, process it with the script, and then give it back to the user.
I'm not sure this is the correct method, though.
Also, while I have found plenty of information on uploading files to a server, I don't know how to put the new file created by the script in the folder where the original file is located.
You cannot read or write files directly on the client's machine. The client needs to upload the file by selecting it in the browser; the server receives the data, processes it, and returns data. The returned data can be presented as a file download by setting the appropriate HTTP headers. The client then has to acknowledge the download and save the file somewhere of their choosing.
Your server has no business knowing anything about files or folders on the client's machine. It can only communicate with the client over HTTP and send and receive data.
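A minimal sketch of the upload side (the field name userfile and the processing step are placeholders, not from the answer):
<?php
// Receive the uploaded file, process it, and return the result as
// a download. Assumes a form with <input type="file" name="userfile">.
if (isset($_FILES['userfile']) && $_FILES['userfile']['error'] === UPLOAD_ERR_OK) {
    $input = file_get_contents($_FILES['userfile']['tmp_name']);

    $output = strtoupper($input); // placeholder for the real processing

    header('Content-Description: File Transfer');
    header('Content-Type: text/plain');
    header('Content-Disposition: attachment; filename="result.txt"');
    header('Content-Length: ' . strlen($output));
    echo $output;
    exit;
}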
You will have to give the file back to the client as a downloadable file. You can "write" it to the user by setting some headers. Take a look:
<?php
$file = 'random_text_file.txt';
if (file_exists($file)) {
    header('Content-Description: File Transfer');
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename=' . basename($file));
    header('Content-Transfer-Encoding: binary');
    header('Expires: 0');
    header('Cache-Control: must-revalidate');
    header('Pragma: public');
    header('Content-Length: ' . filesize($file));
    ob_clean();
    flush();
    readfile($file);
    exit;
}
?>
That will prompt a download of the file to the user.

PHP readfile() and large files

When using readfile() -- using PHP on Apache -- is the file immediately read into Apache's output buffer and the PHP script execution completed, or does the PHP script execution wait until the client finishes downloading the file (or the server times out, whichever happens first)?
The longer back-story:
I have a website with lots of large mp3 files (sermons for a local church). Not all files in the audio archive are allowed to be downloaded, so the /sermon/{filename}.mp3 path is rewritten to really execute /sermon.php?filename={filename} and if the file is allowed to be downloaded then the content type is set to "audio/mpeg" and the file streamed out using readfile(). I've been getting complaints (almost exclusively from iPhone users who are streaming the downloads over 3G) that the files don't fully download, or that they cut off after about 10 or 15 minutes. When I switched from streaming out the file with a readfile() to simply redirecting to the file -- header("Location: $file_url"); -- all of the complaints went away (I even checked with a few users who could reliably reproduce the problem on demand previously).
This leads me to suspect that when using readfile() the PHP script engine is in use until the file is fully downloaded, but I cannot find any references which confirm or deny this theory. I'll admit I'm more at home in the ASP.NET world, where the .NET equivalent of readfile() pushes the whole file to the IIS output buffer immediately so the ASP.NET execution pipeline can complete independently of the delivery of the file to the end client. Is there an equivalent to this behavior with PHP + Apache?
You may still have PHP output buffering active while performing the readfile(). Check that with:
if (ob_get_level()) ob_end_clean();
or
while (ob_get_level()) ob_end_clean();
This way, the only remaining output buffer should be Apache's output buffer; see SendBufferSize for Apache tweaks.
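For reference, SendBufferSize goes in the main Apache configuration; a sketch, with an arbitrary example value:
# httpd.conf: TCP send buffer size in bytes. A larger buffer lets
# Apache accept more of the response from PHP at once.
SendBufferSize 1048576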
EDIT
You can also have a look at mod_xsendfile (see this SO post on such usage: PHP + apache + x-sendfile), so that you simply tell the web server that you have done the security check and that it can now deliver the file.
A few things you can do (I am omitting the headers you need to send; they are probably the same ones you currently have in your script):
set_time_limit(0); // as already mentioned
readfile($filename);
exit(0);
or
passthru('/bin/cat ' . escapeshellarg($filename)); // escape to be shell-safe
exit(0);
or
//When you enable mod_xsendfile in Apache
header("X-Sendfile: $filename");
or
// mainly useful for remote files
$handle = fopen($filename, "rb");
echo stream_get_contents($handle);
fclose($handle);
or
$handle = fopen($filename, "rb");
while (!feof($handle)) {
    // Stop early if the client has closed the connection
    if (connection_aborted()) {
        break;
    }
    echo fread($handle, 8192);
    flush();
}
fclose($handle);
The script will keep running until the user finishes downloading the file. The simplest, most efficient, and most reliable solution is to redirect the user:
header("Location: /real/path/to/file");
exit;
But this may reveal the location of the files. It's a good idea to password-protect files that may not be downloaded by everyone with an .htaccess file anyway, but perhaps you use a database to determine access, in which case that is not an option.
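For the .htaccess route, a sketch of basic authentication (the paths and realm name are placeholders):
# .htaccess in the protected folder; needs a matching .htpasswd file
AuthType Basic
AuthName "Restricted downloads"
AuthUserFile /var/www/.htpasswd
Require valid-user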
Another possible solution is setting the maximum execution time of PHP to 0, which disables the limit:
set_time_limit(0);
Your host may disallow this, though. Also, PHP reads the file into memory first, then it goes through Apache's output buffer, and only then does it reach the network. Letting users download the file directly is much more efficient and avoids PHP's limitations, such as the maximum execution time.
Edit: The reason you get this complaint a lot from iPhone users is probably that they have a slower connection (e.g. 3G).
Downloading files through PHP isn't very efficient; using a redirect is the way to go. If you don't want to expose the location of the file, or the file isn't in a public location, then look into internal redirects. Here is a post that talks about it a bit: Can I tell Apache to do an internal redirect from PHP?
Try using stream_copy_to_stream() instead. I find it has fewer problems than readfile().
set_time_limit(0);
$stdout = fopen('php://output', 'wb');
$bfname = basename($fname);
header("Content-type: application/octet-stream");
header("Content-Disposition: attachment; filename=\"$bfname\"");
$filein = fopen($fname, 'rb'); // binary-safe read
// Copy straight from the input file to the output stream
stream_copy_to_stream($filein, $stdout);
fclose($filein);
fclose($stdout);
Under Apache, there is a nice, elegant solution that doesn't involve PHP at all:
Just place an .htaccess config file into the folder containing the files to be offered for download, with the following contents:
<Files *.*>
    ForceType application/octet-stream
</Files>
This tells Apache to offer all files in this folder (and all its subfolders) for download, instead of displaying them directly in the browser.
See the readfile() example from the PHP manual: http://php.net/manual/en/function.readfile.php
<?php
$file = 'monkey.gif';
if (file_exists($file)) {
    header('Content-Description: File Transfer');
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename=' . basename($file));
    header('Content-Transfer-Encoding: binary');
    header('Expires: 0');
    header('Cache-Control: must-revalidate');
    header('Pragma: public');
    header('Content-Length: ' . filesize($file));
    ob_clean();
    flush();
    readfile($file);
    exit;
}
?>

Using JavaScript and PHP to download image files and return a zip file

I'm in the middle of developing a Safari extension for imageboard-type websites and one of the bigger features I'm hoping to implement is the ability to download all of the images (the posted ones, not the global page-level images) that had been posted.
There are similar questions here already, but mine differs a bit in that the images in question are hosted on an entirely different server. I've been brainstorming a bit and figured that gathering all of the image URLs in a JS array then sending it to my server to be turned into a zip file (forcing the download, not just a link to the file) would be the best way to go. I also want the zip to be deleted after the user downloads it.
I've already finished the majority of the extension features but this one is stumping me. Any help would be greatly appreciated.
How would I go about doing this?
You want an extension to contact your server for downloads? That's a terrible idea! Make the zip file locally; this isn't regular JavaScript, it's an extension, so you have full access.
Anyway, assuming you want to do this anyway: you get a list of URLs, send them to your server, your server downloads them, zips them, and sends the archive to the user. (The "your server downloads them" part should worry you!)
So what problem, exactly, are you having?
You can use PHP's ZipArchive class to make a ZIP, then stream it to the browser.
<?php
// Create a temp zip file; tempnam() already creates the file on
// disk, so open it with the OVERWRITE flag
$zip = new ZipArchive;
$temp = tempnam(sys_get_temp_dir(), 'zip');
$zip->open($temp, ZipArchive::OVERWRITE);

// Add files: one fetched from a remote URL, one from the local disk
$zip->addFromString('file.jpg', file_get_contents('http://path/to/file.jpg'));
$zip->addFile('/this/is/my/file.txt');

// Write the temp file
$zip->close();

// Stream the file to the browser, then delete it
header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename=myFile.zip');
header('Content-Transfer-Encoding: binary');
header('Expires: 0');
header('Cache-Control: must-revalidate');
header('Pragma: public');
header('Content-Length: ' . filesize($temp));
readfile($temp);
unlink($temp);
exit;

PHP Headers and downloads

I am currently developing a PHP application in which my server downloads a file and the user can download it almost simultaneously. I have already thought about the problem of the user downloading faster than the server, but it's not an issue at the moment.
To do so, I used PHP's header() and readfile() functions. Here is my code:
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="'.$data['name'].'";');
header('Content-Transfer-Encoding: binary');
header('Content-Length: '.$data['size']);
readfile($remoteFile);
I have to use the Content-Length header to set the proper size of the file, not the size that has already been downloaded by the server when the user clicks the link. However, after a few seconds or minutes, the download stops and I have to restart it...
If you can think of a solution, even one that doesn't use the header() function, please tell me.
Thank you in advance...
I have experienced that this is directly related to maximum runtime settings, which are enforced upon you if you run with safe_mode on.
If you have the option, try setting set_time_limit(0) and see if that makes it work.
If you have your own server, you should look into the mod_xsendfile module for Apache, since it is built specifically for sending large files to the user.
Oh, and it's stupidly easy to use:
header("X-Sendfile: $path_to_somefile");
header("Content-Type: application/octet-stream");
header("Content-Disposition: attachment; filename=\"$somefile\"");
exit;
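For completeness, the module also has to be enabled on the server side; a minimal sketch of the Apache configuration (the path is a placeholder):
# Requires mod_xsendfile to be installed and loaded
XSendFile On
# Restrict which directory trees may be served this way
XSendFilePath /var/www/protected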
