Slower downloads through Apache than with PHP readfile - php

I've set up a download script in PHP on my server, which checks some details before letting the user download files via Apache (X-Sendfile). The files are outside the document root.
The code for downloading via Apache with the X-Sendfile module is:
header("X-Sendfile: $fullpath");
header("Content-Type: application/octet-stream");
header("Content-Disposition: attachment; filename=\"$link_file\"");
When using Apache with X-Sendfile, I get a download speed of 500 kB/s on my client. I also tested Apache without X-Sendfile, serving the same file from within the document root - same result!
So a few seconds later I tested downloading the same file, with the same client, the same infrastructure on both sides, and the same internet connection, via PHP with readfile:
header("Pragma: no-cache");
header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
header("Content-Type: application/octet-stream");
header("Content-Length: ".(string)(filesize($fullpath)));
header("Content-Disposition: attachment; filename=\"$link_file\"");
readfile($fullpath);
This time the download speed was 9,500 kB/s!
I repeated this test with both options multiple times, and the result was the same on every try. The only difference besides the download speed was a waiting time of a few seconds (depending on the size of the file) when using the PHP readfile method. When repeating the readfile method immediately afterwards, the waiting time didn't appear again - most likely because the file was cached in memory after the first run.
The server uses a professional HP RAID system with an average local read speed of 800 MB/s, so disk speed can't be the reason. I also didn't find any compression or bandwidth settings in Apache's httpd.conf.
Can anyone explain why there is such a great difference in download speed, and how this can be changed?
Thank you in advance.
Server: Windows Server 2008 R2 Apache/2.2.21 (Win32) PHP/5.4.20
Client: Windows 7 Ultimate x64 Google Chrome 30.0.1599.101 LAN >100 Mbit/s

SOLUTION:
In httpd.conf, enable the line "EnableSendfile off".
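For reference, the relevant httpd.conf fragment looks like this (EnableSendfile is a real Apache directive; the EnableMMAP line is an additional setting that is sometimes suggested for the same symptom, not something confirmed in this thread):

```apacheconf
# Tell Apache not to use the kernel sendfile support when
# delivering static file contents; Windows builds are known to
# transfer slowly via sendfile on some network stacks.
EnableSendfile off

# If that alone doesn't help, disabling memory-mapped delivery
# is sometimes suggested as well:
# EnableMMAP off
```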

Related

Large .zip files download incomplete

I am a severe novice, but I did quite a lot of research before posting, so I hope you can help.
I'm trying to serve a large .zip file, about 6.4 GB, hosted on an Apache server. The browser shows the download as complete, but in fact only about 500 MB have been downloaded. This seems like a very common problem, and I have found a lot of other posts and information on the web, but the problem has persisted for me.
Large Zip file offered for download using php
IE download incomplete even though it claims success
Large zip downloads failing only in IE
I have been testing with Chrome 39.0.2171.71, but I get the same problem with Firefox and IE. I think my file is much larger than what others have been posting about, so perhaps their solutions helped their situations without fixing the root problem. I have a second .zip file that is about 400 MB, and I use the same HTTP headers with success.
The most useful article I have found is this: http://perishablepress.com/http-headers-file-downloads/ and I have copied much of the php shown below from that source, as it appears other posters on this website have done.
I have also tried using X-SendFile, but I don't think my webhost has the appropriate Apache module installed. I've spent all day working on this and have run out of ideas! I did get it to work with a download manager; I don't know if that was just by chance, but I don't want to require my clients to download and install a separate program just to get the .zip file.
<?php
// HTTP Headers for ZIP File Downloads
// set example variables
$filename = "huge.zip";
$filepath = "****";
// http headers for zip downloads
// header("X-Sendfile: $filepath$filename");
header("Pragma: public");
header("Expires: 0");
header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
header("Cache-Control: public");
header("Content-Description: File Transfer");
header("Content-type: application/octet-stream");
header("Content-Disposition: attachment; filename=\"".$filename."\"");
header("Content-Transfer-Encoding: binary");
header("Content-Length: ".filesize($filepath.$filename));
set_time_limit(0);
ob_end_flush();
readfile($filepath.$filename);
?>
Here are the request and response headers when I run the PHP above:
Remote Address:76.162.142.242
Request URL:****/GetSW.php
Request Method:GET
Status Code:200 OK
Request Headers
Accept:text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8
Accept-Encoding:gzip, deflate, sdch
Accept-Language:en-US,en;q=0.8
Authorization:Basic ******
Connection:keep-alive
Cookie:_ga=GA1.2.1176828605.1417985823
DNT:1
Host:www.teamursa.org
Referer:http://www.teamursa.org/****.html
User-Agent:Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/39.0.2171.71 Safari/537.36
Response Headers
Cache-Control:public
Connection:Keep-Alive
Content-Description:File Transfer
Content-Disposition:attachment; filename="huge.zip"
Content-Length:6720560824
Content-Transfer-Encoding:binary
Content-Type:application/octet-stream
Date:Sun, 07 Dec 2014 22:16:57 GMT
Expires:0
Keep-Alive:timeout=3, max=120
Pragma:public
Server:Apache
X-Powered-By:PHP/5.2.17
You might need to increase the memory limit on the server. Try the below just after the opening PHP tag
ini_set('memory_limit','16M');
Then keep increasing from 16M as needed.
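If buffering is indeed the culprit, an alternative that keeps memory usage constant regardless of file size is to stream the file in fixed-size chunks with an explicit flush() after each one, instead of a single readfile() call. A sketch (the function name `stream_file_in_chunks` is made up for illustration):

```php
<?php
// Stream a file to the client in fixed-size chunks so PHP never
// holds more than one chunk in memory at a time.
function stream_file_in_chunks(string $path, int $chunkSize = 8192): int
{
    $fh = fopen($path, 'rb');
    if ($fh === false) {
        return -1; // could not open the file
    }
    $bytesSent = 0;
    while (!feof($fh)) {
        $chunk = fread($fh, $chunkSize);
        if ($chunk === false) {
            break;
        }
        echo $chunk;
        flush();                    // push the chunk out to the client
        $bytesSent += strlen($chunk);
    }
    fclose($fh);
    return $bytesSent;
}
```

Call it after sending the headers, in place of readfile(). Note that readfile() itself does not load the whole file into memory, so this mainly helps when an output-buffering layer is accumulating the response.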
I recommend you try using a web browser to download the file directly, i.e. go to the address where the file is stored, e.g. www.example.com/downloads/download.zip. Hope this helps.

Large file download taking more time

I have used the following code to download an approximately 920 MB file:
set_time_limit(0);
header("Pragma: public");
header("Expires: 0");
header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
header("Cache-Control: public");
header("Content-Description: File Transfer");
header("X-Sendfile: $zipname"); // For Large Files
header("Content-type: application/zip");
header("Content-Disposition: attachment; filename=\"".$zipname."\"");
header("Content-Transfer-Encoding: binary");
header("Content-Length: ".filesize($directory_location . '/' . $zipname));
ob_end_flush();
readfile($directory_location . '/' . $zipname);
Before writing this code I did some studying via the following links: Using X-Sendfile with Apache/PHP, Streaming a large file using PHP, and Limit download speed using PHP, but they were not much help to me, because the download still takes a long time even with just a 2 MB file. It's not showing any transfer rate or anything else. I want the download to start serving the file at around 60 Kbps, for all files (large or small).
UPDATE: One more thing I noticed: it's not showing any download progress, just executing, and after some time it displays the pop-up to choose the location; after hitting the save button, the file is saved directly to the computer without any download progress window :(
Please guide me in the right direction.
Based on above comments there are two solutions:
1) Just download the file directly. You don't appear to be doing any validation, so if not, just point the user at the file to download and let Apache handle it.
2) If you do need validation / pre-processing, then check mod_xsendfile - adding the header isn't enough; you actually need to add the module to Apache. If you're on Linux, compile it from source (https://tn123.org/mod_xsendfile/). If you're not on Linux, the question "mod_xsendfile for Win x64?" has a response from the author saying he can provide binaries - but that's from 2010. There's a bit of advice around the web, although it's been a while since I looked at it, so I can't really help much more.
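If you do control the server, a minimal httpd.conf fragment to actually enable the module might look like this (XSendFile and XSendFilePath are the module's real directives; the module path and download directory below are examples):

```apacheconf
# Load the third-party module (path and filename depend on your build)
LoadModule xsendfile_module modules/mod_xsendfile.so

# Turn on X-Sendfile header processing and whitelist the directory
# that PHP is allowed to hand off for download
XSendFile on
XSendFilePath /var/www/downloads
```

Without the LoadModule line, the X-Sendfile header sent from PHP is silently passed through to the client instead of triggering the file transfer.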

Forced file downloads via PHP headers opens exe files in browser on localhost XAMPP server

I am trying to force exe files to download via a PHP script on my development server (XAMPP).
Here is what I have for headers:
$file=$dl_path.$dl_filename;
header("Content-type: application/force-download");
header("Content-Transfer-Encoding: Binary");
header("Content-length: ".filesize($file));
header("Content-disposition: attachment; filename=\"".basename($file)."\"");
readfile($file);
This works fine when I publish to the live web server, but on my local machine, which I use for development, it loads the exe files in the same browser window - showing millions of lines of gibberish and taking 30 seconds to stop the page and click "back".
I think I had this working correctly on my local server at one point, but after a few hours of googling I can't find what the problem is. Anyone out there have an idea?
Thanks
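One thing worth trying (a common fix for this symptom, though not confirmed in this thread) is the standard application/octet-stream type instead of the non-standard application/force-download, which different server setups handle inconsistently. A sketch, with a hypothetical helper that just builds the header lines:

```php
<?php
// Hypothetical helper: build forced-download headers using the
// standard application/octet-stream MIME type rather than the
// non-standard application/force-download.
function exe_download_headers(string $file): array
{
    return [
        'Content-Type: application/octet-stream',
        'Content-Length: ' . filesize($file),
        'Content-Disposition: attachment; filename="' . basename($file) . '"',
    ];
}

// Usage: emit the headers, then stream the file:
// foreach (exe_download_headers($file) as $h) { header($h); }
// readfile($file);
```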

PHP Headers and downloads

I am currently developing a PHP application in which my server downloads a file and the user can do the same almost simultaneously. I have already thought about the problem of the user downloading faster than the server, but that's not an issue at the moment.
To do so, I used PHP's header and readfile functions. Here is my code:
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="'.$data['name'].'";');
header('Content-Transfer-Encoding: binary');
header('Content-Length: '.$data['size']);
readfile($remoteFile);
I must use the Content-Length header to set the proper size of the file, not the size that has been downloaded so far when the user clicks the link. However, after some seconds or minutes the download stops and I need to restart it...
If you can think of a solution, even one that doesn't use the header() function, please tell me.
Thank you in advance...
I have experienced that this is directly related to the maximum runtime settings that are enforced upon you if you run with safe_mode on.
If you have the option, try set_time_limit(0) and see if that makes it work.
If you have your own server, you should look into the mod_xsendfile module for Apache, since it is built specifically to send large files to the user.
Oh, and it's stupidly easy to use:
header("X-Sendfile: $path_to_somefile");
header("Content-Type: application/octet-stream");
header("Content-Disposition: attachment; filename=\"$somefile\"");
exit;

Apache2 with PDF and PHP - "This file does not start with '%PDF-'"

I have been trying to find the reason for this error for weeks now - and I have come up a blank. The system uses PHP to generate dynamic .pdf files.
I have three servers: Dev (Win7 with Apache2), Test (Ubuntu 10.04 with nginx), and Live (Ubuntu 10.10 with nginx). All are running PHP 5 and the system I have developed - same code, equivalent config.
I have many browsers I have tested with: DevIE (Win7, IE8), DevFF (Win7, Firefox 3.5), DevSaf (Win7, Safari), LaptopFF (WinXP, Firefox 3.5), LaptopIE (WinXP, IE8), Test (Ubuntu, FF3.5), and users (mostly IE8 on Win7 and WinXP).
When I generate a PDF from Test it works correctly in all browsers (except Users which I can't test).
When I generate a PDF from Dev it fails from DevIE, DevFF and DevSaf, but calling for it from Test works.
Apache2 always fails from the same machine.
From the laptop, using FF succeeds, and using IE8 fails (see below).
The users are reporting intermittent problems: it fails, then they repeat the request and it succeeds.
When it fails....
The log shows the generated PDF being sent, with the right sort of size (500 KB to 1.8 MB) and a 200 OK result. This is sometimes followed about 10 seconds later by a repeat of the same URL - but this generates the log-on screen (again a 200 OK reply), only 2 KB in size. The implication is that it was requested without the cookie.
Adobe Reader tries to display the log-on page, with the inevitable "This file does not start with "%PDF-" error message.
Except when I try with the laptop and IE8 - then it fails, with "view source" showing a 4-line HTML file with an empty body!
The system had been working for over a year, and only started failing with a change of production server about 2 months ago. The test version was not changed at that time, but started to fail as well.
I have tried all sorts of headers, but nothing I have tried makes any difference. The current set of headers is:
header('Content-Disposition: inline; filename="'.$this->pdfFilename().'"');
header('Content-type: application/pdf');
header("Pragma: public");
$when = date('r',time()+20); // expire in 20 seconds
header("Expires: $when");
I've tried replacing inline with attachment. Adding and removing all sorts of no-cache headers. All to no avail.
The PDF is requested in a new window, by JavaScript - and is followed 8 seconds later by a refresh. I have tested without the new window, and without the refresh - no change.
I have had a few (small) PDFs served by the Dev server, so I have raised every limit I can think of. Now it always fails.
So I have a Windows Apache2.2 server that fails when browsed from the same machine and succeeds when browsed from other machines in Firefox.
There is no proxy or cache mechanism involved other than that in the browsers.
Has anyone any ideas about what might be going wrong? As I said, I have been testing and eliminating things for nearly 4 weeks now, on and off, and I have not yet even identified the failing component.
This is really tough to troubleshoot - for starters (please excuse my bluntness), this is a prime example of what a pipeline should not look like:
Three different operating systems.
Probably at least two different versions of PHP.
Two different webservers.
But anyway, a few general hints on debugging PHP:
make sure to enable error_log and log_errors in php.ini (and set display_errors = Off)
use the most verbose error_reporting
set access_log and error_log in nginx.
crank up the log level in nginx (I'm guessing you use php-cgi or php-fpm, so you should be able to see what status the backend emits when the download attempt fails).
Furthermore:
You haven't shared how the PDF is generated - are you sure all libraries used here are the same or at least somewhat the same across all systems?
In any case, just to be sure, I would save the PDF on the server before it is offered for download. This allows you to troubleshoot the actual file, to see if the PDF generation actually worked.
Since you're saving the PDF, I'd see about putting it in a public folder, so you can see if you can just redirect to it after it's generated. And only if this works, then I'd work on a force-download kind of thing.
I would replicate the production environment in all stages. ;-) You need your dev server to be exactly like the production environment. For your own workstation, I'd recommend a VM (e.g. through Virtualbox with Ubuntu 10.10).
Let me know if this gets you somewhere and reply with updates. :-)
Update:
I'd investigate these two headers:
header("Cache-Control: no-cache, must-revalidate"); // HTTP/1.1
header("Expires: Sat, 26 Jul 1997 05:00:00 GMT"); // Date in the past
These definitely help with cache busting.
These are the headers, which finally worked in a similar situation in one of my apps:
header("Pragma: public");
header("Expires: 0");
header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
header("Cache-Control: private",false);
header( "Content-Type: application/pdf" );
header("Content-Disposition: inline; filename=\"YourPDF_" . time() . ".pdf\";");
header("Content-Transfer-Encoding: binary");
header("Content-Length: ". strlen( $pdfData ) );
I added the time() call to make the filename change each time, so that it is likely to bypass any proxies.
From time to time, but seldom, the problem reappears; then we ask our clients to download the file using the browser context menu.
PS: The app uses ezPDF found here: http://www.ros.co.nz/pdf/
