Send files using Content-Disposition: attachment in parallel - php

I have a PHP page that sends a file to the browser depending on the request data it receives. getfile.php?get=something sends file A, getfile.php?get=somethingelse sends file B and so on and so forth.
This is done like so:
header('Content-Disposition: attachment; filename='. urlencode($filename));
readfile($fileURL);
It works, except it can only send one file at a time. Any other files requested are sent sequentially: one starts as soon as another finishes.
How can I get this to send files in parallel if the user requests another file while one is downloading?
Edit: I have tried downloading two files at the same time by requesting them directly by their file paths, and it works, so neither Apache nor the browser seems to have a problem. It seems PHP is the issue. Note that I do call session_start() at the beginning of the page.

This may be down to their browser settings, your server settings, PHP, or all three. Most browsers will only open two simultaneous HTTP connections to the same server, queuing the rest. Many web servers will likewise queue connections if there are more than two from the same browser. And if you're using sessions, PHP may be serializing requests that share a session (allowing just one active request at a time) to minimize race conditions. (I don't know whether PHP does this; some platforms do.)
Two of these (server and PHP) you're in control of; not much you can do about the browser.
Somewhat OT, but you could always allow them to select multiple files and then send them back a dynamically-created zip (or other container format).
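For what it's worth, a rough sketch of that zip idea using PHP's ZipArchive class ($files, the array of selected file paths, is an assumption):
// Build a temporary zip from the user's selection, then stream it out
$tmp = tempnam(sys_get_temp_dir(), 'dl');
$zip = new ZipArchive();
$zip->open($tmp, ZipArchive::OVERWRITE);
foreach ($files as $path) { // $files: hypothetical array of selected file paths
    $zip->addFile($path, basename($path));
}
$zip->close();

header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename=files.zip');
header('Content-Length: ' . filesize($tmp));
readfile($tmp);
unlink($tmp);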

Adding session_write_close() right after I'm finished with the session and before starting the download seems to have solved the issue.
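For reference, a minimal sketch of what that looks like ($filename and $fileURL are from the question above; the $_SESSION check is hypothetical):
session_start();
// ... read anything you need from $_SESSION here ...
$allowed = isset($_SESSION['user_id']); // hypothetical access check

// Release the session lock so parallel requests aren't queued behind this one
session_write_close();

if ($allowed) {
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename=' . urlencode($filename));
    readfile($fileURL);
}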

Related

Non-blocking Ajax request to PHP

I'm using PHP to download a (big) file from a remote server and this download is triggered by clicking on a download button on a web page.
So when I click the download button on the web page, an Ajax request (made with Angular's $http) goes to a PHP function, which triggers the download using cURL.
In the meantime I'd like to make other requests to my PHP site with Ajax, but every other Ajax request shows the status Pending for as long as the download is in progress.
So basically the download is blocking every other request to PHP. Is there any way I can avoid this blockage?
This is most likely due to the session file being locked. This is a very common oversight in PHP-based web apps. Essentially, when you call session_start() to access the $_SESSION array, PHP opens the session file in the temp directory in read/write mode and locks it to avoid potential concurrency issues. If a second request (another Ajax call, or any HTTP request, such as from a new browser window) also calls session_start(), it will wait until the session file is unlocked before moving forward.
The fix is to release the session file once you know you are no longer going to write to it. Since your use case is a huge file download, it is unlikely that you will need to push anything into the $_SESSION array while the data is being output. You release the session from write mode by calling session_write_close().
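In your case that means releasing the lock before the cURL transfer starts. A rough sketch (the remote URL is a placeholder):
session_start();
// ... any $_SESSION reads/writes happen here ...
session_write_close(); // other Ajax requests can now acquire the session

header('Content-Type: application/octet-stream');
$ch = curl_init('https://remote.example.com/big-file.bin'); // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, false); // stream straight to the client
curl_exec($ch);
curl_close($ch);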
I had no idea this was the case until I found out a popular web-app I frequently use was guilty of this. A great blog post on this common bottleneck is:
http://konrness.com/php5/how-to-prevent-blocking-php-requests/

Download multiple files simultaneously with PHP - Forking, Sockets

I'm using the following code to manage downloads from my site (the files are behind a captcha): http://www.richnetapps.com/php-download-script-with-resume-option/
Trouble is, when a file is being downloaded, it locks the rest of the site, and it's not possible to download another file simultaneously. ('Locks' as in: trying to go to, say, the homepage while a download is in progress results in a long wait, and the homepage appears only when the download finishes or is cancelled. This is a problem because some of the files are several hundred MB.)
I'd like two things to happen: 1- To be able to browse the site while a file is being downloaded, and 2- to be able to download another file (or two, or three, or ten...) simultaneously.
My gut feeling is I need to fork the process, create a new one, or open another socket. But I'm way out of my depth, and even if this was the right approach, I don't know how to do it. Any ideas guys?
Many thanks in advance....
EDIT----
I found it! I added session_write_close() right before setting the headers in the download script. Apparently this behaviour is due to PHP's session handling - further info here: php simultaneous file downloads from the same browser and same php script (I searched and searched before asking, but obviously missed this post).
Many thanks....
A Content Delivery Network (CDN) will both offload the downloads from your server, freeing it to process homepage (or other) page requests, and allow many, many simultaneous downloads. It should be cheaper in bandwidth and probably faster for most users as well.
The key will be configuring the CDN to protect the files behind your captcha, instead of leaving them freely available as in most CDN setups.

PHP readfile/fgets/fread causes multiple requests to server

I'm trying to stream MP4 files through Apache / Nginx using a PHP proxy for authentication. I've implemented byte ranges to stream for iOS as outlined here: http://mobiforge.com/developing/story/content-delivery-mobile-devices. This works perfectly fine in Chrome and Safari, but... the really odd thing is that if I monitor the server requests to the PHP page, three of them occur per page load in a browser. Here's a screenshot of Chrome's inspector (going directly to the PHP proxy page):
As you can see, the first one gets canceled, the second remains pending, and the third works. Again, the file plays in the browser. I've tried alternate methods of reading the file (readfile, fgets, fread, etc) with the same results. What is causing these three requests and how can I get a single working request?
The first request is for the first range of bytes, preloading the file. The browser cancels the request once it has downloaded the specified amount.
The second one I'm not sure about...
The third one happens when you actually start playing the media file: it requests and downloads the full thing.
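For context, a bare-bones sketch of the byte-range handling such a proxy does (the file path is a placeholder, and this handles only a single range, not multipart ranges):
$path = '/path/to/video.mp4'; // placeholder path
$size = filesize($path);
$start = 0;
$end = $size - 1;

// Honor a single "Range: bytes=start-end" header if the client sent one
if (isset($_SERVER['HTTP_RANGE']) &&
    preg_match('/bytes=(\d+)-(\d*)/', $_SERVER['HTTP_RANGE'], $m)) {
    $start = (int)$m[1];
    if ($m[2] !== '') {
        $end = (int)$m[2];
    }
    header('HTTP/1.1 206 Partial Content');
    header("Content-Range: bytes $start-$end/$size");
}
header('Content-Type: video/mp4');
header('Accept-Ranges: bytes');
header('Content-Length: ' . ($end - $start + 1));

$fp = fopen($path, 'rb');
fseek($fp, $start);
$remaining = $end - $start + 1;
while ($remaining > 0 && !feof($fp)) {
    $chunk = fread($fp, min(8192, $remaining));
    echo $chunk;
    $remaining -= strlen($chunk);
}
fclose($fp);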
Not sure whether this answers your question, but serving large binary files with PHP isn't the right thing to do.
It's better to let PHP handle authentication only and pass the file reference to the web server to serve, freeing up resources.
See also: Caching HTTP responses when they are dynamically created by PHP
It describes in more detail what I would recommend to do.
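Since the question mentions Nginx: there the hand-off header is X-Accel-Redirect. A hedged sketch (the /protected/ internal location and the auth check are assumptions; the location must be marked "internal" in the Nginx config):
if (!user_is_authenticated()) { // hypothetical auth check
    header('HTTP/1.1 403 Forbidden');
    exit;
}
header('Content-Type: video/mp4');
header('X-Accel-Redirect: /protected/video.mp4'); // Nginx serves the bytes, not PHP
exit;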

PHP passthrough slow

For various reasons, I need to play the intermediary between an HTTP Request and a file on disk. My approach has been to populate headers and then perform a readfile('/path/to/file.jpg');
Now, everything works fine, except that it returns even a medium-sized image very slowly.
Can anyone provide me with a more efficient way of streaming the file to the client once the headers have been sent?
Note: it's a linux box in a shared hosting environment if it matters
Several web servers allow an external script to tell them to do exactly this. X-Sendfile on Apache (with mod_xsendfile) is one.
In a nutshell, all you send is headers. The special X-Sendfile header instructs the web server to send the named file as the body of the response.
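A minimal sketch, assuming mod_xsendfile is installed and enabled (XSendFile On) for the vhost:
// PHP only decides *whether* to send; Apache does the actual sending
header('Content-Type: image/jpeg');
header('Content-Disposition: attachment; filename="file.jpg"');
header('X-Sendfile: /path/to/file.jpg'); // path from the question; Apache streams it
exit;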
You could start with implementing conditional GET request support.
Send out a "Last-Modified" header with the file and reply with "304 Not Modified" whenever the client requests the file with "If-Modified-Since" and you see that the file has not been modified. Some sensible freshness information (via "Cache-Control" / "Expires" headers) is also advisable, to prevent repeated requests for an unchanged resource in the first place.
This way at least the perceived performance can be improved, even if you should find that you can do nothing about the actual performance.
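A minimal sketch of the conditional GET idea, reusing the image from the question (the cache lifetime is an arbitrary choice):
$path = '/path/to/file.jpg';
$mtime = filemtime($path);

// Reply 304 if the client's cached copy is still current
if (isset($_SERVER['HTTP_IF_MODIFIED_SINCE']) &&
    strtotime($_SERVER['HTTP_IF_MODIFIED_SINCE']) >= $mtime) {
    header('HTTP/1.1 304 Not Modified');
    exit;
}

header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $mtime) . ' GMT');
header('Cache-Control: public, max-age=86400'); // one day; tune as needed
header('Content-Type: image/jpeg');
readfile($path);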
This should actually be fairly fast. We have done this with large images without a problem. Are you doing anything else before outputting the image that might be slowing the process down, such as calculating some metadata on the image?
Edit:
You may need to flush the output and use fread, i.e.:
$fp = fopen($strFile, "rb");
// Stream the file in 16 KB chunks, pushing each chunk out to the client
while (!feof($fp)) {
    print(fread($fp, 1024 * 16));
    ob_flush(); // flush PHP's output buffer first...
    flush();    // ...then flush the system buffer to the browser
}
fclose($fp);
Basically, you want to build a server... That's not trivial.
There is a very promising project for a PHP-based server: Nanoweb.
It's free and fully extensible.

PHP/Apache blocking on each request?

OK, this may be a dumb question, but here goes. I noticed something the other day when I was playing around with different HTML-to-PDF converters in PHP. One I tried (dompdf) took forever to run on my HTML. Eventually it ran out of memory and died, but while it was still running, none of my other PHP scripts were responsive at all. It was almost as if that one request was blocking the entire web server.
Now I'm assuming either that can't be right or I should be setting something somewhere to control that behaviour. Can someone please clue me in?
Did you have open sessions for each of the scripts? :) They might reuse the same session, and that blocks until the session is freed by the previous request... so they basically wait for each other to complete (in your case, for the long-running PDF generator). This only applies if you use the same browser.
Tip: not sure why you want HTML to PDF, but you might take a look at FOP (http://xmlgraphics.apache.org/fop/) to generate PDFs. I'm using it and it works great... and fast :) It does have its quirks, though.
It could be that all the scripts you tried are running in the same application pool. (At least, that's what it's called in IIS.)
However, another explanation is that some browsers will queue requests over a single connection. This has caused me some confusion in the past. If your web browser is waiting for a response from yourdomain.com/script1.php and you open another window or tab to yourdomain.com/script2.php, that second request won't be sent until the first receives a reply, making it seem like your entire web server is hanging. An easy way to test whether this is what's going on is to try the two requests in two separate browsers.
It sounds like the server is simply being overwhelmed and under too much load to complete the requests. Converting an HTML file to a PDF is a pretty complex process, as the PHP script has to effectively provide the same functionality as a web browser and render the HTML using PDF drawing functions.
I would suggest you either split the HTML into separate, smaller files or run the script as a scheduled task directly through PHP independent of the server.
