I'm currently using the following PHP function to allow a user to select a file and then download it. This happens over FTP. However, if the user chooses a large file, then while the download is occurring it locks up the server for any other requests. Is there any way I can serve the file while having PHP continue to respond to other requests?
I need PHP to verify that the user is permitted to download the file with their credentials so I can't just host it as an asset. The file is located on an FTP server.
function download($file) {
    if ($this->get($file)) {
        // Send headers so the browser treats the response as a file download
        header("Content-Disposition: attachment; filename=\"$file\"");
        header("Content-Type: application/octet-stream");
        header("Pragma: ");
        header("Cache-Control: no-cache");
        header("Expires: 0");

        // Read the whole file into memory, delete the local copy, then send it
        $fileStream = file_get_contents($this->downloadDir . $file);
        unlink($this->downloadDir . $file);

        echo $fileStream;
        exit;
    } else {
        return false;
    }
}
PHP is not the best tool for this kind of work, but it can delegate the job to the web server you are using. And since the file ends up on the same machine as your application, this can work.
All major web servers that usually run PHP applications (Apache, lighttpd and nginx) have support for X-Sendfile.
To use it, you first have to enable the functionality in your web server (check each server's documentation), then add a new header in your script:
Apache:
header("X-Sendfile: $location_of_file_to_download");
Lighttpd:
header("X-LIGHTTPD-send-file: $location_of_file_to_download");
nginx:
header("X-Accel-Redirect: $location_of_file_to_download");
The web server will catch this header from your application and will replace the body of your PHP response with the file. While it serves the file to the user, PHP is unblocked and ready to serve the next request.
(The other headers will be kept, so you can retain the Content-Type and Content-Disposition headers.)
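For illustration, here is a minimal sketch of how the download() method from the question could hand the transfer off to the web server. It assumes Apache with mod_xsendfile enabled and reuses the question's get()/downloadDir members, so treat it as a sketch rather than a drop-in replacement:
function download($file) {
    // Permission / retrieval check stays in PHP
    if (!$this->get($file)) {
        return false;
    }

    $path = $this->downloadDir . $file;

    header("Content-Disposition: attachment; filename=\"" . basename($file) . "\"");
    header("Content-Type: application/octet-stream");

    // Apache (mod_xsendfile) replaces the response body with the file
    // and frees this PHP worker for the next request.
    header("X-Sendfile: " . $path);
    exit;
}
One difference from the original code: the unlink() call has to be handled separately (for example by a cleanup job), because PHP exits before the web server has finished sending the file.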
Since PHP is single-threaded, you would have to build a structure for each request. Then, instead of processing just one request to completion, you would loop through the structures and process all of them a little at a time (send a few hundred KB to one, then move on to the next, and so on).
Honestly, PHP doesn't sound like the right tool for this job. Why not use a purpose-built FTP server like vsftpd or something of that nature?
Possible Duplicate: php How do I start an external program running - Having trouble with system and exec
How to open an exe with PHP?
I have had this idea and have tried hard to make it work for several years, but always failed. Can anyone tell me a method that works?
<?php
if (isset($_POST['file_path'])) {
    /* -------
    Use "notepad++.exe" to open the "test.php" file,
    or run a .bat file that calls "notepad++.exe" to open "test.php".
    How do I set up php.ini, Firefox, or anything else to make this work?
    This is only for convenient web page development on my PC, not for web servers.
    ------- */
}
?>
<form action="test.php" method="post">
    <input type="text" name="file_path" value="test.php"/>
    <button type="submit">open with notepad++</button>
</form>
This produces a simple page with a text box and an "open with notepad++" button.
To launch a program on the computer which runs the webserver:
<?php
exec('"C:\Program Files (x86)\Notepad++\notepad++.exe" "C:\foo.php"');
The above will work on Vista/Windows 7 IF the web server does not run as a Windows service. For example, if you run Apache and it starts automatically when your computer boots, you probably installed it as a service. You can check whether Apache shows up in the Windows Services list.
If the web server runs as a service, you'll need to look into enabling the "allow desktop interaction" option for the service. But otherwise:
An easy test using PHP's built-in web server (PHP 5.4+). The key thing here is that you start the server manually from a shell, so it runs as your user instead of as a service.
<?php
// C:\my\htdocs\script.php
exec('"C:\Program Files (x86)\Notepad++\notepad++.exe" "C:\foo.php"');
Start the web server from a command window:
C:\path\to\php.exe -S localhost:8000 -t C:\my\htdocs
Then, in your browser, open:
http://localhost:8000/script.php
I assume you want the client device to open Notepad++, not the remote server. If that is the case, the best you can do is serve up the file with the proper content-type header and hope the client has Notepad++ set up as the default application to open such a file.
Something like this should do it:
header('Content-type: text/plain');
header('Content-Disposition: attachment; filename="' . $file_name . '"'); // forces file download
header('Content-length: ' . filesize($file_path));
header('Cache-Control: must-revalidate, post-check=0, pre-check=0'); // make sure file is re-validated each time it is requested
$fh = fopen($file_path, 'r');
while (!feof($fh)) {
    $buffer = fread($fh, 2048);
    echo $buffer;
}
fclose($fh);
Where $file_name is the name of the file (not the full path) and $file_path is the full path to the file.
The approach that finally worked for me when I tested it.
Thanks Charles; refer to "php How do I start an external program running - Having trouble with system and exec".
Start->Run, type "services.msc" to bring up Services control (other ways to get there, this is easiest IMO)
Locate your Apache service (mine was called "wampapache" using WampServer 2.0)
Open the service properties (double-click or right click->properties)
Flip to the Log On tab and ensure the checkbox titled "Allow service to interact with desktop" is checked
Flip back to the General tab, stop the service, start the service
Then, in PHP:
pclose(popen("start /B \"d:\\green soft\\notepad++5.8.4\\notepad++.exe\" \"d:\\PHPnow-1.5.6\\htdocs\\laji\\a.php\"", "r"));
Thank you all, what a great help. I finally made my idea come true. Happy new year!
I've never had a reason to do this, but have you tried passthru()?
http://php.net/manual/en/function.passthru.php
EDIT:
Sorry, the OP was really unclear at first glance.
What I would do is read the file into a string (or whatever), then force the browser to treat that as a download. PHP is server-side, so you can't just tell the browser to do things on the client machine.
$someText = 'some text here';
header('Content-type: text/plain');
header('Content-Disposition: attachment; filename="text.txt"');
echo $someText;
When using readfile() -- using PHP on Apache -- is the file immediately read into Apache's output buffer and the PHP script execution completed, or does the PHP script execution wait until the client finishes downloading the file (or the server times out, whichever happens first)?
The longer back-story:
I have a website with lots of large mp3 files (sermons for a local church). Not all files in the audio archive are allowed to be downloaded, so the /sermon/{filename}.mp3 path is rewritten to really execute /sermon.php?filename={filename} and if the file is allowed to be downloaded then the content type is set to "audio/mpeg" and the file streamed out using readfile(). I've been getting complaints (almost exclusively from iPhone users who are streaming the downloads over 3G) that the files don't fully download, or that they cut off after about 10 or 15 minutes. When I switched from streaming out the file with a readfile() to simply redirecting to the file -- header("Location: $file_url"); -- all of the complaints went away (I even checked with a few users who could reliably reproduce the problem on demand previously).
This leads me to suspect that when using readfile() the PHP script engine stays in use until the file is fully downloaded, but I cannot find any references that confirm or deny this theory. I'll admit I'm more at home in the ASP.NET world, and the .NET equivalent of readfile() pushes the whole file to the IIS output buffer immediately, so the ASP.NET execution pipeline can complete independently of the delivery of the file to the end client... is there an equivalent behavior with PHP + Apache?
You may still have PHP output buffering active while performing the readfile(). Check that with:
if (ob_get_level()) ob_end_clean();
or
while (ob_get_level()) ob_end_clean();
This way the only remaining output buffer should be Apache's output buffer; see SendBufferSize for Apache tweaks.
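Putting those pieces together, a minimal sketch of the whole pattern (the path and content type are placeholders for the sermon files described in the question):
<?php
$file_path = '/path/to/sermons/example.mp3'; // placeholder

// Drop all PHP output buffers so readfile() writes straight through to Apache
while (ob_get_level()) {
    ob_end_clean();
}

header('Content-Type: audio/mpeg');
header('Content-Length: ' . filesize($file_path));

readfile($file_path);
exit;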
EDIT
You can also have a look at mod_xsendfile (see this SO post on such usage: PHP + apache + x-sendfile), so that you simply tell the web server that you have done the security check and that it can now deliver the file.
A few things you can do (I am not repeating all of the headers you need to send; they are probably the same ones you currently have in your script):
set_time_limit(0); // as already mentioned
readfile($filename);
exit(0);
or
passthru('/bin/cat '.$filename);
exit(0);
or
//When you enable mod_xsendfile in Apache
header("X-Sendfile: $filename");
or
// mainly useful for remote files
$handle = fopen($filename, "rb");
echo stream_get_contents($handle);
fclose($handle);
or
$handle = fopen($filename, "rb");
while (!feof($handle)) {
    // I would suggest checking whether the user is still downloading
    // or has closed the connection (e.g. with connection_aborted())
    echo fread($handle, 8192);
}
fclose($handle);
The script will be running until the user finishes downloading the file. The simplest, most efficient and most reliable solution is to redirect the user:
header("Location: /real/path/to/file");
exit;
But this may reveal the location of the files. It's a good idea to password-protect files that not everyone may download with an .htaccess file anyway, but perhaps you use a database to determine access, in which case this is not an option.
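For illustration, a minimal .htaccess sketch of that kind of password protection; the paths are placeholders, and this only helps when your access rules are simple enough to express with Basic auth rather than a database:
# Hypothetical .htaccess in the protected downloads folder
# (the .htpasswd file is created beforehand with the htpasswd utility)
AuthType Basic
AuthName "Protected downloads"
AuthUserFile /var/www/.htpasswd
Require valid-user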
Another possible solution is setting the maximum execution time of PHP to 0, which disables the limit:
set_time_limit(0);
Your host may disallow this, though. Also, PHP reads the file into memory first, then pushes it through Apache's output buffer, and finally out onto the network. Letting users download the file directly is much more efficient and does not run into PHP's limitations, such as the maximum execution time.
Edit: The reason you get this complaint a lot from iPhone users is probably that they have a slower connection (e.g. 3G).
Downloading files through PHP isn't very efficient; using a redirect is the way to go. If you don't want to expose the location of the file, or the file isn't in a public location, then look into internal redirects. Here is a post that talks about it a bit: Can I tell Apache to do an internal redirect from PHP?
Try using stream_copy_to_stream() instead. I find it has fewer problems than readfile().
set_time_limit(0);
$stdout = fopen('php://output', 'w');
$bfname = basename($fname);
header("Content-type: application/octet-stream");
header("Content-Disposition: attachment; filename=\"$bfname\"");
$filein = fopen($fname, 'r');
stream_copy_to_stream($filein, $stdout);
fclose($filein);
fclose($stdout);
Under Apache, there is a nice, elegant solution not involving PHP at all:
Just place an .htaccess config file into the folder containing the files to be offered for download with the following contents:
<Files *.*>
ForceType application/octet-stream
</Files>
This tells Apache to offer all files in this folder (and all its subfolders) for download, instead of displaying them directly in the browser.
See the readfile() manual page:
http://php.net/manual/en/function.readfile.php
<?php
$file = 'monkey.gif';
if (file_exists($file)) {
    header('Content-Description: File Transfer');
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename=' . basename($file));
    header('Content-Transfer-Encoding: binary');
    header('Expires: 0');
    header('Cache-Control: must-revalidate');
    header('Pragma: public');
    header('Content-Length: ' . filesize($file));
    ob_clean();
    flush();
    readfile($file);
    exit;
}
?>
I have a file
/file.zip
A user comes to
/download.php
I want the user's browser to start downloading the file. How do I do that? Does readfile() open the file on the server? That seems unnecessary. Is there a way to return the file without opening it on the server?
I think you want this:
$attachment_location = $_SERVER["DOCUMENT_ROOT"] . "/file.zip";
if (file_exists($attachment_location)) {
    header($_SERVER["SERVER_PROTOCOL"] . " 200 OK");
    header("Cache-Control: public"); // needed for Internet Explorer
    header("Content-Type: application/zip");
    header("Content-Transfer-Encoding: Binary");
    header("Content-Length: " . filesize($attachment_location));
    header("Content-Disposition: attachment; filename=file.zip");
    readfile($attachment_location);
    die();
} else {
    die("Error: File not found.");
}
readfile() will do the job OK and pass the stream straight back through the web server. It's not the best solution because, for the whole time the file is being sent, PHP is still running. For better results you'll need something like X-Sendfile, which is supported on most web servers (if you install the correct modules).
In general (if you care about heavy load), it's best to put a proxying web server in front of your main application server. This frees up your application server (for instance Apache) more quickly, and proxy servers (Varnish, Squid) tend to be much better at transferring bytes to clients with high latency or clients that are generally slow.
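As a rough sketch of that setup (shown with nginx only because its configuration is short; the same idea applies to Varnish or Squid, and the addresses are placeholders):
# Hypothetical front-end proxy; Apache + PHP listens on 127.0.0.1:8080
server {
    listen 80;
    location / {
        proxy_pass http://127.0.0.1:8080;
        # nginx buffers the upstream response and drip-feeds it to slow clients,
        # so the Apache/PHP worker is released quickly
        proxy_buffering on;
    }
}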
If the file is publicly accessible, just do a simple redirect to the URL of your file.
If the file is public, then you can just serve it as a static file directly from the web server (e.g. Apache) and make download.php redirect to the static URL. Otherwise, you have to use readfile() to send the file to the browser after authenticating the user (remember the Content-Disposition header).
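A minimal sketch of that second case; user_may_download() and the file path are placeholders for whatever authentication scheme you actually use:
<?php
// download.php - hypothetical permission gate in front of readfile()
session_start();
if (!isset($_SESSION['user_id']) || !user_may_download($_SESSION['user_id'], 'file.zip')) {
    header('HTTP/1.1 403 Forbidden');
    exit('Forbidden');
}

$path = __DIR__ . '/../private/file.zip'; // stored outside the document root

header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="file.zip"');
header('Content-Length: ' . filesize($path));
readfile($path);
exit;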
So I am trying to serve large files via a PHP script. They are not in a web-accessible directory, so this is the best way I can figure out to provide access to them.
The only way I could think of off the bat to serve such a file is by loading it into memory (fopen, fread, etc.), setting the header data to the proper MIME type, and then just echoing the entire contents of the file.
The problem with this is, I have to load these ~700MB files into memory all at once, and keep the entire thing there till the download is finished. It would be nice if I could stream in the parts that I need as they are downloading.
Any ideas?
You don't need to read the whole thing; just enter a loop reading it in, say, 32 KB chunks and sending it as output. Better yet, use fpassthru(), which does much the same thing for you...
$name = 'mybigfile.zip';
$fp = fopen($name, 'rb');
// send the right headers
header("Content-Type: application/zip");
header("Content-Length: " . filesize($name));
// dump the file and stop the script
fpassthru($fp);
exit;
Even fewer lines if you use readfile(), which doesn't need the fopen() call:
$name = 'mybigfile.zip';
// send the right headers
header("Content-Type: application/zip");
header("Content-Length: " . filesize($name));
// dump the file and stop the script
readfile($name);
exit;
If you want to get even cuter, you can support the Content-Range header which lets clients request a particular byte range of your file. This is particularly useful for serving PDF files to Adobe Acrobat, which just requests the chunks of the file it needs to render the current page. It's a bit involved, but see this for an example.
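A rough sketch of what minimal single-range support can look like; it handles only a simple bytes=start-end range and skips edge cases such as multiple ranges or out-of-bounds values:
<?php
$name = 'mybigfile.zip';
$size = filesize($name);
$fp   = fopen($name, 'rb');

$start = 0;
$end   = $size - 1;

// honour a single "Range: bytes=start-end" request header if present
if (isset($_SERVER['HTTP_RANGE']) &&
    preg_match('/bytes=(\d+)-(\d*)/', $_SERVER['HTTP_RANGE'], $m)) {
    $start = (int) $m[1];
    if ($m[2] !== '') {
        $end = (int) $m[2];
    }
    header('HTTP/1.1 206 Partial Content');
    header("Content-Range: bytes $start-$end/$size");
}

header('Content-Type: application/zip');
header('Accept-Ranges: bytes');
header('Content-Length: ' . ($end - $start + 1));

// send just the requested slice in 8 KB chunks
fseek($fp, $start);
$remaining = $end - $start + 1;
while ($remaining > 0 && !feof($fp)) {
    $chunk = fread($fp, min(8192, $remaining));
    echo $chunk;
    $remaining -= strlen($chunk);
}
fclose($fp);
exit;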
The best way to send big files with PHP is the X-Sendfile header. It allows the web server to serve files much faster through zero-copy mechanisms like sendfile(2). It is supported by lighttpd, and by Apache with a plugin.
Example:
$file = "/absolute/path/to/file"; // can be protected by .htaccess
header('X-Sendfile: '.$file);
header('Content-type: application/octet-stream');
header('Content-Disposition: attachment; filename="'.basename($file).'"');
// other headers ...
exit;
The server reads the X-Sendfile header and sends out the file.
While fpassthru() has been my first choice in the past, the PHP manual actually recommends* using readfile() instead, if you are just dumping the file as-is to the client.
* "If you just want to dump the contents of a file to the output buffer, without first modifying it or seeking to a particular offset, you may want to use the readfile(), which saves you the fopen() call." —PHP manual
If your files are not accessible by the web server because the path is not in your web serving directory (htdocs), then you can make a symbolic link (symlink) to that folder in your web serving directory to avoid passing all traffic through PHP.
You can do something like this
ln -s /home/files/big_files_folder /home/www/htdocs
Using PHP to serve static files is a lot slower; if you have high traffic, memory consumption will be very large, and it may not handle a large number of requests.
Have a look at fpassthru(). In more recent versions of PHP this should serve the files without keeping them in memory, as this comment states.
Strange; neither fpassthru() nor readfile() did it for me, I always got a memory error.
I resorted to using passthru() without the 'f':
$name = 'mybigfile.zip';
// send the right headers
header("Content-Type: application/zip");
header("Content-Length: " . filesize($name));
// dump the file and stop the script
passthru('/bin/cat ' . $name);
exit;
This executes the Unix cat command and sends its output to the browser.
Comment for slim: the reason you don't just put a symlink somewhere in web space is SECURITY.
One of the benefits of fpassthru() is that this function can work not only with files but with any valid handle, a socket for example.
And readfile() should be a little faster, because it can use the OS caching mechanism where possible (as file_get_contents() does).
One more tip: fpassthru() holds the handle open until the client has received the content (which may take quite a long time on a slow connection), so you must use some locking mechanism if parallel writes to the file are possible.
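For example, a shared lock around fpassthru() (just a sketch; whatever process writes the file would need to take an exclusive LOCK_EX lock on the same path for this to help):
$fp = fopen($filename, 'rb');

// Readers take a shared lock; a writer holding LOCK_EX blocks until readers finish
if (flock($fp, LOCK_SH)) {
    fpassthru($fp);
    flock($fp, LOCK_UN);
}
fclose($fp);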
The other answers are all good. But is there any reason you can't make a web-accessible directory containing symbolic links to the actual files? It may take some extra server configuration, but it ought to work.
If you want to do it right, PHP alone can't do it. You would want to serve the file using Nginx's X-Accel-Redirect (recommended) or Apache's X-Sendfile, which are built exactly for this purpose.
I will include in this answer some text found in this article.
Why not serve the files with PHP:
- Done naively, the file is read into memory and then served. If the files are large, this could cause your server to run out of memory.
- Caching headers are often not set correctly. This causes web browsers to re-download the file multiple times even if it hasn't changed.
- Support for HEAD requests and range requests is typically not automatically supported.
- If the files are large, serving such files ties up a worker process or thread. This can lead to starvation if there are limited workers available. Increasing the number of workers can cause your server to run out of memory.
NGINX handles all of these things properly. So let's handle permission checks in the application and let NGINX serve the actual file. This is where internal redirects come in. The idea is simple: you can configure a location entry as usual when serving regular files.
Add this to your nginx server block:
location /protected_files/ {
    internal;
    alias /var/www/my_folder_with_protected_files/;
}
In your project, require the HTTP Foundation package:
composer require symfony/http-foundation
Serve the files in PHP using Nginx:
use Symfony\Component\HttpFoundation\BinaryFileResponse;
$real_path = '/var/www/my_folder_with_protected_files/foo.pdf';
$x_accel_redirect_path = '/protected_files/foo.pdf';
BinaryFileResponse::trustXSendfileTypeHeader();
$response = new BinaryFileResponse( $real_path );
$response->headers->set( 'X-Accel-Redirect', $x_accel_redirect_path );
$response->sendHeaders();
exit;
This should be the basics you need to get started.
Here's a more complete example serving an Inline PDF:
use Symfony\Component\HttpFoundation\BinaryFileResponse;
use Symfony\Component\HttpFoundation\File\File;
use Symfony\Component\HttpFoundation\ResponseHeaderBag;
$real_path = '/var/www/my_folder_with_protected_files/foo.pdf';
$x_accel_redirect_path = '/protected_files/foo.pdf';
$file = new File( $real_path );
BinaryFileResponse::trustXSendfileTypeHeader();
$response = new BinaryFileResponse( $real_path );
$response->setImmutable( true );
$response->setPublic();
$response->setAutoEtag();
$response->setAutoLastModified();
$response->headers->set( 'Content-Type', 'application/pdf' );
$response->headers->set( 'Content-Length', $file->getSize() );
$response->headers->set( 'X-Sendfile-Type', 'X-Accel-Redirect' );
$response->headers->set( 'X-Accel-Redirect', $x_accel_redirect_path );
$response->headers->set( 'X-Accel-Expires', 60 * 60 * 24 * 90 ); // 90 days
$response->headers->set( 'X-Accel-Limit-Rate', 10485760 ); // 10 MB/s, in bytes per second
$response->headers->set( 'X-Accel-Buffering', 'yes' );
$response->setContentDisposition( ResponseHeaderBag::DISPOSITION_INLINE, basename( $real_path ) ); // view in browser. Change to DISPOSITION_ATTACHMENT to download
$response->sendHeaders();
exit;