I'm suddenly having trouble downloading Excel spreadsheets from our web server.
We use Apache 2.2.22 and PHP 5.4.45-0+deb7u7.
The files are corrupt after download.
I've verified the spreadsheet files themselves are fine. (I can open them without problems when I access them directly rather than downloading them through the browser first.)
Here are the response headers:
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Connection: Keep-Alive
Content-Encoding: gzip
Content-Length: 22
Content-Type: text/html
Date: Mon, 29 Jan 2018 17:56:52 GMT
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Keep-Alive: timeout=3, max=100
Pragma: no-cache
Server: Apache/2.2.22 (Debian)
Vary: Accept-Encoding
X-Powered-By: PHP/5.4.45-0+deb7u7
I'm just totally stumped here.
I've also been trying to add a MIME type like this:
case "xlsx":
    header("Content-type: application/vnd.ms-excel-xml");
    header("Content-Disposition: attachment; filename=\"" . $path_parts["basename"] . "\""); // use 'attachment' to force a download
    break;
Nothing seems to be working (though it used to work flawlessly).
Any help or ideas would be much appreciated. Thank you.
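For reference, a minimal sketch of a forced-download handler along these lines; the $filePath variable, the output-buffer cleanup, and the official .xlsx MIME type are assumptions added here, not taken from the post:
<?php
// Sketch only: force download of an existing .xlsx file.
// $filePath is assumed to point at a validated file on disk.
$path_parts = pathinfo($filePath);

// Avoid length mismatches if zlib output compression is enabled.
if (ini_get('zlib.output_compression')) {
    ini_set('zlib.output_compression', 'Off');
}

switch (strtolower($path_parts['extension'])) {
    case "xlsx":
        // Official MIME type for .xlsx workbooks
        header("Content-Type: application/vnd.openxmlformats-officedocument.spreadsheetml.sheet");
        header("Content-Disposition: attachment; filename=\"" . $path_parts["basename"] . "\""); // 'attachment' forces a download
        break;
}

// Discard anything already buffered so stray HTML or notices can't end up
// inside the downloaded file (the tiny text/html response above suggests the
// script is sending something other than the spreadsheet).
while (ob_get_level() > 0) {
    ob_end_clean();
}

header("Content-Length: " . filesize($filePath));
readfile($filePath);
exit;
?>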
Related
I make a request to a PHP file and get back these headers:
Access-Control-Allow-Origin: *
Cache-Control: max-age=360000, must-revalidate
Connection: keep-alive
Content-Type: application/json
Date: Thu, 19 Jul 2018 07:08:20 GMT
Expires: Mon, 26 Jul 2040 05:00:00 GMT
Pragma: no-cache
Server: nginx
Transfer-Encoding: chunked
I'm setting these headers in the PHP file:
header('Cache-Control: max-age=360000, must-revalidate');
header('Expires: Mon, 26 Jul 2040 05:00:00 GMT');
header('Content-type: application/json');
header("Access-Control-Allow-Origin: *");
But every time I refresh the page, the response isn't cached; the browser always asks the server again.
Any ideas? I want the response cached until the expiry date.
I assume Pragma: no-cache might be the problem; try removing that header.
From the documentation:
The Pragma: no-cache header field is an HTTP/1.0 header intended for
use in requests. It is a means for the browser to tell the server and
any intermediate caches that it wants a fresh version of the resource,
not for the server to tell the browser not to cache the resource. Some
user agents do pay attention to this header in responses, but the
HTTP/1.1 RFC specifically warns against relying on this behaviour.
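If the Pragma and Cache-Control values are being injected by PHP's session cache limiter (an assumption; the snippet above doesn't show a session_start()), the default can be disabled before the session is opened. A minimal sketch:
<?php
// Sketch: stop PHP from emitting its default "nocache" headers
// (Pragma: no-cache, Expires: ..., Cache-Control: no-store, ...) for
// a session-backed JSON endpoint.
session_cache_limiter('');   // must be called before session_start()
session_start();

header('Cache-Control: max-age=360000, must-revalidate');
header('Expires: Mon, 26 Jul 2040 05:00:00 GMT');
header('Content-Type: application/json');
header('Access-Control-Allow-Origin: *');

// Belt and braces: drop any Pragma header that has already been queued.
header_remove('Pragma');

echo json_encode($data);   // $data is a placeholder for the real payload
?>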
I'm developing a site using Zend Framework 2 where the client wanted to be able to export some data to an Excel file. For this I went with PHPExcel, and I got it working on my local computer, where I run Apache2.
<?php
// Create a writer for Excel
$objWriter = new PHPExcel_Writer_Excel2007($objPHPExcel);

// Get the data from php://output
// https://stackoverflow.com/questions/16551375/phpexcel-in-zend2-controller
ob_flush();
ob_start();
$objWriter->save('php://output');
$excelOutput = ob_get_clean();

// Get a new response object
$response = new \Zend\Http\Response();

// Set headers for the response object
$response->getHeaders()
    ->addHeaderLine('Last-Modified: ' . gmdate("D, d M Y H:i:s") . ' GMT')
    ->addHeaderLine('Cache-Control: no-store, no-cache, must-revalidate')
    ->addHeaderLine('Cache-Control: post-check=0, pre-check=0')
    ->addHeaderLine('Pragma: no-cache')
    ->addHeaderLine('Content-Type', 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet')
    ->addHeaderLine('Content-Disposition: attachment;filename="report.xlsx"');

// Set the content for the response object
$response->setContent($excelOutput);

return $response;
?>
The problem is that when I uploaded the project to my production server, which runs Nginx with a fairly standard PHP-FPM setup, the Excel file isn't sent with the correct headers. The Content-Type seems to be overwritten with text/html, which makes the browser display garbled characters instead of offering the file for download.
I followed this (PHPExcel in Zend2 Controller) question for getting the content into a variable.
I can't figure out why this is happening. Are headers handled differently by PHP on Apache2 than by Nginx with PHP-FPM?
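One thing worth ruling out is stray output (a BOM, whitespace, or a PHP notice) being flushed to the client before the Response object's headers are applied, since the ob_flush() call above pushes anything already buffered straight out. A more defensive capture might look like this (a sketch, not a confirmed fix for the Nginx case):
// Discard everything already buffered instead of flushing it,
// then capture only the spreadsheet bytes.
while (ob_get_level() > 0) {
    ob_end_clean();
}
ob_start();
$objWriter->save('php://output');
$excelOutput = ob_get_clean();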
Update: Returned headers
Apache on local computer:
HTTP/1.1 200 OK
Date: Sun, 11 May 2014 11:24:09 GMT
Server: Apache/2.4.4 (Win32) OpenSSL/0.9.8y PHP/5.4.19
X-Powered-By: PHP/5.4.19
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Cache-Control: post-check=0, pre-check=0
Pragma: no-cache
Last-Modified: Sun, 11 May 2014 11:24:09 GMT
Content-Disposition: attachment;filename="report.xlsx"
Content-Length: 6551
Keep-Alive: timeout=5, max=100
Connection: Keep-Alive
Content-Type: application/vnd.openxmlformats-officedocument.spreadsheetml.sheet
Nginx on the production server:
HTTP/1.1 200 OK
Server: nginx/1.1.19
Date: Sun, 11 May 2014 11:14:18 GMT
Content-Type: text/html
Transfer-Encoding: chunked
Connection: keep-alive
Vary: Accept-Encoding
X-Powered-By: PHP/5.3.10-1ubuntu3.11
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Pragma: no-cache
Content-Encoding: gzip
I am having problems serving downloads from our website. Large files just won't download in full; the download stops somewhere in between. For example, this file (roughly 172 MB) never downloads completely, and other files are affected as well.
I switched from an entirely PHP-based download script, the one included in the Kohana framework:
return download::force($filePath);
to a mod_xsendfile solution. I had read about the problems PHP-based download scripts can have with large files, and mod_xsendfile came up as the recommended fix... Apparently not: I am getting the same result with both techniques. My current download implementation sets the mod_xsendfile headers like this:
header("X-Sendfile: $filePath");
header("Content-type: application/octet-stream");
header('Content-Disposition: attachment; filename="' . basename($filePath) . '"');
What am I doing wrong?
UPDATE:
I used an HTTP sniffer to check the response headers; this is the result, in case it helps:
Status: HTTP/1.1 200 OK
Server: Apache
Set-Cookie: dewesoftsession=63ms5j67kc231pr4bpm8cmg1f7; expires=Sat, 30-Mar-2013 11:36:59 GMT; path=/
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Pragma: no-cache
Set-Cookie: dewesoftsession=63ms5j67kc231pr4bpm8cmg1f7; expires=Sat, 30-Mar-2013 11:36:59 GMT; path=/
Content-Disposition: attachment; filename="DEWESoft_FULL_7_0_5.exe"
Last-Modified: Mon, 24 Sep 2012 12:50:12 GMT
ETag: "25814de-ac291e9-4ca7207c7fcd9"
Content-Type: application/octet-stream
Content-Length: 180523497
Date: Sat, 30 Mar 2013 09:37:01 GMT
X-Varnish: 294312007
Age: 2
Via: 1.1 varnish
Connection: close
X-Varnish-Cache: MISS
After a couple of days we managed to find what caused the problem. Varnish has a start-up parameter called send_timeout, which defaults to 600 seconds. With large file downloads you can hit this timeout, which causes the download to be interrupted.
So increasing Varnish's send_timeout parameter will solve this kind of issue.
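For reference, the parameter can be raised either at daemon start-up or at runtime through the management CLI; the 7200-second value below is just an example:
# start-up option (the ... stands for the existing varnishd options / DAEMON_OPTS)
varnishd ... -p send_timeout=7200

# or at runtime, without restarting Varnish
varnishadm param.set send_timeout 7200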
I have a configuration that doesn't seem too common on the Internets (PHP with IIS), and so far I have not been able to find a solution for my problem because of this.
Basically, when I send a manual 404 header from my PHP page:
header('HTTP/1.0 404 Not Found');
The problem is that I then always get encoding errors, which I've determined have something to do with gzip being enabled.
When I curl with --compressed I get:
HTTP/1.1 404 Not Found
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Pragma: no-cache
Content-Length: 3560
Content-Type: text/html
Content-Encoding: gzip
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Vary: Accept-Encoding
Server: Microsoft-IIS/6.0
Set-Cookie: PHPSESSID=b0ueqhtc3o4p7m2luqss170fr3; path=/
Date: Mon, 11 Jul 2011 16:15:40 GMT
curl: (61) Error while processing content unencoding: invalid code lengths set
Is it possible to disable compression just for this one page? Or is there some other solution for this that I'm missing? I don't want to disable compression for the entire site.
This is simple; just use ini_set() like so:
<?php
// Turn off PHP's zlib output compression for this request only,
// before any output has been sent.
if (ini_get('zlib.output_compression')) {
    ini_set('zlib.output_compression', 'Off');
}

header('HTTP/1.0 404 Not Found');
?>
Simple as that.
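Note that this only covers compression done by PHP's zlib filter. If the page is instead being compressed by an ob_gzhandler output buffer, that buffer has to be discarded as well; a sketch, assuming nothing useful has been buffered yet:
<?php
// Drop an active ob_gzhandler buffer so the 404 page goes out uncompressed.
if (in_array('ob_gzhandler', ob_list_handlers(), true)) {
    ob_end_clean();
}

header('HTTP/1.0 404 Not Found');
?>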
I'm having some trouble displaying PDFs (generated by R&OS ezpdf) in IE7.
IE7 with Acrobat Reader 8.1.2 says "The page cannot be displayed".
Other browsers (like FF3/Acrobat 8.1.2 or IE6/Acrobat 7) have no problem with the file.
The following headers are returned by the server:
Date: Thu, 08 Jan 2009 10:52:40 GMT
Server: Apache/2.2.8 (Win32) mod_ssl/2.2.8 OpenSSL/0.9.8g PHP/5.2.5 DAV/2
X-Powered-By: PHP/5.2.5
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Pragma: no-cache
Content-Length: 4750
Keep-Alive: timeout=5, max=100
Connection: Keep-Alive
Content-Type: application/pdf
Does anybody know how to fix this problem?
You're sending:
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Pragma: no-cache
...so IE won't store the file in the Temporary Internet Files folder. However, the mechanism IE uses to directly 'Open' a file from the browser often requires it to be opened from inside Temporary Internet Files. Directly opening a file from a browser is generally unreliable, especially in IE; 'Save as' works better.
Consider replacing the cache-busting headers with an alternative method, such as adding a '?randomstring' parameter to the URL. Also consider adding a "Content-Disposition: attachment; filename=..." header, which will stop the plug-in from trying and failing to display the file inside the browser UI.
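A minimal sketch of that suggestion in PHP, assuming the no-cache headers come from PHP's default session cache limiter; the filename and the $pdfData variable are examples, not from the original setup:
<?php
// Let the session start without emitting the no-store/no-cache headers,
// then force a download instead of inline display.
session_cache_limiter('public');
session_start();

header('Content-Type: application/pdf');
header('Content-Disposition: attachment; filename="report.pdf"');
header('Content-Length: ' . strlen($pdfData));   // $pdfData holds the generated PDF bytes
echo $pdfData;
?>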
I think I've solved the problem.
It isn't on the server side but on the client side.
The generated PDF is being displayed in a popup window (JavaScript window.open), and IE7 chokes on that.
When I instead open an HTML file in the popup that redirects to the PDF, it works.
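A rough sketch of that workaround (the file name and PDF URL are examples): the popup opens a tiny HTML page, which immediately redirects to the PDF so IE7 performs the navigation itself.
<?php
// popup.php (example name): opened by window.open(); it just forwards
// the popup window to the generated PDF.
$pdfUrl = '/reports/generated.pdf';   // example path to the ezpdf output
?>
<html>
  <head>
    <meta http-equiv="refresh" content="0;url=<?php echo htmlspecialchars($pdfUrl); ?>">
  </head>
  <body>Loading PDF...</body>
</html>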