PHP file download with mod-xsendfile does not download entire file

I am having a problem serving downloads from our website. Large files just won't download in full; the download stops somewhere in between... For example, this file (about 172 MB) won't download completely (other files are affected as well).
I switched from an entirely PHP-based download script, the one included in the Kohana framework:
return download::force($filePath);
to a mod_xsendfile solution. I had read about the possible problems with PHP-based download scripts and large files, and came across mod_xsendfile as the right solution... Well, apparently not: I am getting the same result with both techniques. My current download implementation uses mod_xsendfile headers like this:
header("X-Sendfile: $filePath");
header("Content-type: application/octet-stream");
header('Content-Disposition: attachment; filename="' . basename($filePath) . '"');
What am I doing wrong?
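For reference, the X-Sendfile header only takes effect if mod_xsendfile is loaded and allowed to serve the directory on the Apache side; a minimal sketch (the path is an example, not the site's real configuration):
# Apache configuration (illustrative)
XSendFile On
XSendFilePath /var/www/downloads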
UPDATE:
I used an HTTP sniffer to check the response headers, and this is the result, in case it helps in solving the problem.
Status: HTTP/1.1 200 OK
Server: Apache
Set-Cookie: dewesoftsession=63ms5j67kc231pr4bpm8cmg1f7; expires=Sat, 30-Mar-2013 11:36:59 GMT; path=/
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Pragma: no-cache
Set-Cookie: dewesoftsession=63ms5j67kc231pr4bpm8cmg1f7; expires=Sat, 30-Mar-2013 11:36:59 GMT; path=/
Content-Disposition: attachment; filename="DEWESoft_FULL_7_0_5.exe"
Last-Modified: Mon, 24 Sep 2012 12:50:12 GMT
ETag: "25814de-ac291e9-4ca7207c7fcd9"
Content-Type: application/octet-stream
Content-Length: 180523497
Date: Sat, 30 Mar 2013 09:37:01 GMT
X-Varnish: 294312007
Age: 2
Via: 1.1 varnish
Connection: close
X-Varnish-Cache: MISS

After a couple of days we managed to find what caused the problem. Varnish has a start-up parameter called send_timeout, which is set to 600 seconds by default. With large file downloads you might run into this timeout, which will cause the download to be interrupted.
So increasing Varnish's send_timeout parameter will help you solve this kind of issue.
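For example (the 7200-second value is arbitrary and the Debian-style file location is an assumption; adjust both to your environment):
# add to the varnishd start-up options, e.g. in /etc/default/varnish on Debian/Ubuntu
-p send_timeout=7200

# or change it at runtime without a restart (the change is not persisted)
varnishadm param.set send_timeout 7200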

Related

JSON Cache Header on Browser doesn't work

I make a request to a PHP file and get back these headers:
Access-Control-Allow-Origin: *
Cache-Control: max-age=360000, must-revalidate
Connection: keep-alive
Content-Type: application/json
Date: Thu, 19 Jul 2018 07:08:20 GMT
Expires: Mon, 26 Jul 2040 05:00:00 GMT
Pragma: no-cache
Server: nginx
Transfer-Encoding: chunked
I'm using these headers in the PHP file:
header('Cache-Control: max-age=360000, must-revalidate');
header('Expires: Mon, 26 Jul 2040 05:00:00 GMT');
header('Content-type: application/json');
header("Access-Control-Allow-Origin: *");
But every time I refresh the page, it's not cached... it always asks the server for the response.
Any ideas? I want it to be cached until the expiry date.
I assume that Pragma: no-cache might be the problem; remove that header.
From the documentation:
The Pragma: no-cache header field is an HTTP/1.0 header intended for use in requests. It is a means for the browser to tell the server and any intermediate caches that it wants a fresh version of the resource, not for the server to tell the browser not to cache the resource. Some user agents do pay attention to this header in responses, but the HTTP/1.1 RFC specifically warns against relying on this behaviour.
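A minimal sketch of the idea, assuming the no-cache headers are being emitted by PHP itself (for example by the session module) rather than added later by nginx:
<?php
// Drop the default no-cache header and advertise a cacheable JSON response
header_remove('Pragma');
header('Cache-Control: max-age=360000, must-revalidate', true);   // true = replace any earlier value
header('Expires: Mon, 26 Jul 2040 05:00:00 GMT');
header('Content-Type: application/json');
header('Access-Control-Allow-Origin: *');
echo json_encode(['status' => 'ok']);   // example payload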

Trouble downloading Excel files all of a sudden

I'm all of a sudden having a hard time downloading Excel spreadsheets from our webserver.
We use Apache 2.2.22 and PHP 5.4.45-0+deb7u7.
The files are corrupt after download.
I've verified the spreadsheet files are OK. (I'm able to open them without a problem if I don't download them via the browser and then try to look at them.)
Here's the response header:
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Connection: Keep-Alive
Content-Encoding: gzip
Content-Length: 22
Content-Type: text/html
Date: Mon, 29 Jan 2018 17:56:52 GMT
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Keep-Alive: timeout=3, max=100
Pragma: no-cache
Server: Apache/2.2.22 (Debian)
Vary: Accept-Encoding
X-Powered-By: PHP/5.4.45-0+deb7u7
I'm just totally stumped here.
I've also been trying to add a mime-type like this:
case "xlsx":
header("Content-type: application/vnd.ms-excel-xml");
header("Content-Disposition: attachment; filename=\"".$path_parts["basename"]."\""); // use 'attachment' to force a download
break;
Nothing seems to be working. (but it used to work flawlessly.)
Any help or ideas come much appreciated. Thank you.
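For comparison, a typical force-download block in PHP looks roughly like this (a sketch only; the $file path and the xlsx MIME type are illustrative, not taken from the question):
<?php
$file = '/path/to/spreadsheet.xlsx';   // illustrative path

// Discard any open output buffers so nothing (whitespace, a gzip handler's
// output) gets mixed into the binary stream
while (ob_get_level()) {
    ob_end_clean();
}

header('Content-Type: application/vnd.openxmlformats-officedocument.spreadsheetml.sheet');
header('Content-Disposition: attachment; filename="' . basename($file) . '"');
header('Content-Length: ' . filesize($file));
readfile($file);
exit;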

How to enforce browser caching?

I have a PHP script that serves an image, and I expect the browser to cache the image even though it comes from the API.
I have read the threads How to browser cache image from php and How to force a web browser to cache Images, which suggest modifying the Expires header. I've tried it, but it seems I've had no luck.
I've manually set:
header('Content-Type : image/jpeg');
header('Content-Size : 2759' );
header('Cache-Control : public, max-age=2592000' );
header('Pragma : public' );
header('Expires: ' . date('r', time() + 30*24*60*60 ));
The response headers from my output are:
Server: nginx/0.8.54
Date: Tue, 06 Dec 2011 11:20:17 GMT
Content-Type: image/jpeg
Transfer-Encoding: chunked
Connection: keep-alive
X-Powered-By: PHP/5.3.7-ZS5.5.0 ZendServer/5.0
Set-Cookie: ZDEDebuggerPresent=php,phtml,php3; path=/
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0, public, max-age=2592000
Pragma: no-cache
Expires: Wed, 05 Dec 2012 18:20:17 +0700
Content-Size: 2759
As you can see, the headers are modified, but the Cache-Control header still has the no-store and no-cache values.
Is this an nginx configuration thing, or can it be resolved from PHP, and how?
Thank you in advance.
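Note that the header() calls in the question have a space before the colon (e.g. 'Cache-Control :'), which may keep PHP from replacing the earlier Cache-Control value. The sketch below uses the standard form and, assuming the no-cache values come from PHP's session cache limiter (they could also be injected elsewhere in the stack), disables that as well:
<?php
// If the script uses sessions, keep the session module from emitting its
// default no-cache headers (must be called before session_start())
session_cache_limiter('');

header('Content-Type: image/jpeg');                       // no space before the colon
header('Cache-Control: public, max-age=2592000', true);   // true = replace, don't append
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + 30 * 24 * 60 * 60) . ' GMT');
readfile('/path/to/image.jpg');                           // illustrative path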

Disable Gzip compression for single php file with IIS

I have a configuration that doesn't seem too common on the Internets (PHP with IIS), and so far I have not been able to find a solution for my problem because of this.
Basically, when I'm sending a manual 404 header from my PHP page:
header('HTTP/1.0 404 Not Found');
The problem is that I then always get encoding errors, which I've determined has something to do with gzip being enabled.
When I curl with --compressed I get:
HTTP/1.1 404 Not Found
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Pragma: no-cache
Content-Length: 3560
Content-Type: text/html
Content-Encoding: gzip
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Vary: Accept-Encoding
Server: Microsoft-IIS/6.0
Set-Cookie: PHPSESSID=b0ueqhtc3o4p7m2luqss170fr3; path=/
Date: Mon, 11 Jul 2011 16:15:40 GMT
curl: (61) Error while processing content unencoding: invalid code lengths set
Is it possible to disable compression just for this one page? Or is there some other solution for this that I'm missing? I don't want to disable compression for the entire site.
This is simple; just use ini_set() like so:
<?php
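// Turn zlib output compression off before any output is sent,
// so the 404 response body is served uncompressed.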
if (ini_get('zlib.output_compression')) {
    ini_set('zlib.output_compression', 'Off');
}
header('HTTP/1.0 404 Not Found');
?>
Simple as that.

PHP PDF-Generation - IE7/Acrobat8: "Website cannot be displayed"

I've got some trouble displaying PDFs in IE7 (they were generated by R&OS' ezpdf).
IE7 with Acrobat Reader 8.1.2 says "The page cannot be displayed".
Other browsers (like FF3/Acrobat 8.1.2 or IE6/Acrobat 7) have no problem with the file.
The following headers are returned by the server:
Date: Thu, 08 Jan 2009 10:52:40 GMT
Server: Apache/2.2.8 (Win32) mod_ssl/2.2.8 OpenSSL/0.9.8g PHP/5.2.5 DAV/2
X-Powered-By: PHP/5.2.5
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Pragma: no-cache
Content-Length: 4750
Keep-Alive: timeout=5, max=100
Connection: Keep-Alive
Content-Type: application/pdf
Does anybody know how to fix this problem?
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Pragma: no-cache
...so IE won't store the file in the Temporary Internet Files folder. However, the mechanism used to directly 'Open' a file from the browser in IE often requires it to be opened from inside Temporary Internet Files. Directly opening a file from a browser is generally unreliable, especially in IE; 'Save as' works better.
Consider replacing the cache-busting headers with an alternative method, such as adding a '?randomstring' parameter to the URL. Also consider adding a "Content-Disposition: attachment; filename=..." header, which will stop a plug-in from trying and failing to display the file in the browser UI.
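A rough PHP sketch of that suggestion (the ezOutput() call and the filename are assumptions about how the PDF is produced, not the poster's actual code):
<?php
$pdfData = $pdf->ezOutput();   // assumes an R&OS ezpdf (Cezpdf) object named $pdf
header('Content-Type: application/pdf');
header('Content-Disposition: attachment; filename="report.pdf"');   // illustrative filename
header('Content-Length: ' . strlen($pdfData));
header('Cache-Control: private, max-age=60');   // cacheable, so IE can write it to Temporary Internet Files
echo $pdfData;
exit;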
I think I've solved the problem.
The problem is not on the server-side but on the client-side.
The generated PDF is being displayed in a popup window (JavaScript window.open) and IE7 chokes on it.
When I open an HTML file in the popup which then redirects to the PDF, it works.
