This script is for my HTTP file server. My files live outside the web root, so I am using a PHP script to read them and send them to the client. The problem is that when I click on a file to download it, the browser (Firefox and Chrome) never asks me if I want to save the file. In Chrome's dev tools, under Network, I can see that download.php executed successfully; I can even see the number of bytes transferred for the request. But the downloaded file never appears on the client machine. Why isn't the browser prompting for and saving the download?
Web server is Nginx.
<?php
$path = "/trunk/";
$file = $_POST['filename'];
$file_ext  = pathinfo($file, PATHINFO_EXTENSION);
$file_name = pathinfo($file, PATHINFO_BASENAME);
$file_full = $path . $file_name;

// Detect the MIME type of the file on disk.
$finfo = new finfo(FILEINFO_MIME_TYPE);
$ctype = $finfo->file($file_full);

header("Content-Disposition: attachment; filename=\"" . $file_name . "\"");
header("Content-Type: " . $ctype);
header("Content-Length: " . filesize($file_full));
error_log(filesize($file_full));
readfile($file_full);
?>
Request headers:
POST /php/download.php HTTP/1.1
Host: www.example.com
Connection: keep-alive
Content-Length: 146
Origin: https://www.example.com
User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.4 (KHTML, like Gecko) Chrome/22.0.1229.79 Safari/537.4
Content-Type: multipart/form-data; boundary=----WebKitFormBoundaryMA7u5AkYPL9qGmRY
Accept: */*
Referer: https://www.example.com/index.php
Accept-Encoding: gzip,deflate,sdch
Accept-Language: en-US,en;q=0.8
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3
Response headers:
HTTP/1.1 200 OK
Server: nginx
Date: Sat, 01 Dec 2012 20:22:28 GMT
Content-Type: application/zip
Content-Length: 75090
Connection: keep-alive
Keep-Alive: timeout=300
X-Powered-By: PHP/5.4.6--pl0-gentoo
Content-Disposition: attachment; filename="try.zip"
Expires: Sat, 01 Dec 2012 20:22:27 GMT
Cache-Control: no-cache
Use Content-Type: application/octet-stream to force a download prompt across all browsers. If you know you are sending binary content (like your zip), add Content-Transfer-Encoding: binary as well.
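A minimal sketch of what that could look like in your script, reusing the $file_name and $file_full variables from above:

header("Content-Type: application/octet-stream");
header("Content-Transfer-Encoding: binary");
header("Content-Disposition: attachment; filename=\"" . $file_name . "\"");
header("Content-Length: " . filesize($file_full));
readfile($file_full);
exit; // stop so nothing else is appended to the file data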
Related
When I browse to a page with Firefox and click a download link, the following headers are shown when I inspect the request in the network inspector:
Connection: keep-alive
Content-Disposition: attachment; filename="example_file.mp3"
Content-Length: 35181829
Content-Transfer-Encoding: binary
Content-Type: audio/mpeg
Date: Fri, 19 Aug 2016 18:19:02 GMT
Keep-Alive: timeout=60
Server: nginx
X-Powered-By: PHP/5.4.45
However, when I use cURL to visit the same address, I get this:
Connection: keep-alive
Content-Length: 1918
Content-Type: text/html; charset=UTF-8
Date: Fri, 19 Aug 2016 20:46:23 GMT
Keep-Alive: timeout=60
Server: nginx
X-Powered-By: PHP/5.4.45
How can I form a request with cURL that gives me the same response as Firefox?
In Firefox, open the Net tab in the developer tools (F12) and load the URL of the page you need.
Take note of all the Request Headers in the request sent to the server:
Example:
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Encoding: gzip, deflate
Accept-Language: nl,en-US;q=0.7,en;q=0.3
Connection: keep-alive
Cookie: _ga=GA1.2.598213448.1471644637; _gat=1
Host: mariannesdelights.be
User-Agent: Mozilla/5.0 (Windows NT 10.0; WOW64; rv:47.0) Gecko/20100101 Firefox/47.0
Put all of the headers in an array like this:
$headers = array('HeaderName: HeaderValue', 'HeaderName2: HeaderValue2');
Use the PHP function curl_setopt() to set the headers on the request:
curl_setopt($ch, CURLOPT_HTTPHEADER, $headers);
That should produce exactly the same HTTP response headers.
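Putting it together, a minimal sketch (the URL and header values here are the ones copied from the inspector output above; your Cookie value in particular will differ):

$ch = curl_init('http://mariannesdelights.be/');
$headers = array(
    'Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8',
    'Accept-Language: nl,en-US;q=0.7,en;q=0.3',
    'Cookie: _ga=GA1.2.598213448.1471644637; _gat=1',
);
curl_setopt($ch, CURLOPT_HTTPHEADER, $headers);
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows NT 10.0; WOW64; rv:47.0) Gecko/20100101 Firefox/47.0');
curl_setopt($ch, CURLOPT_ENCODING, 'gzip, deflate'); // sends Accept-Encoding and decodes the reply
curl_setopt($ch, CURLOPT_HEADER, true);              // keep response headers so you can compare them
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
curl_close($ch);
echo $response;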
I want to cache an HTML page in the browser, and I am trying it on localhost. I am sending the correct header (using PHP) in the response, but the browser still does not cache it; every time I request the same resource, it connects to the server and gets the response from there.
At the top of my HTML page I am using:
<?php
header("Cache-Control:max-age=36000");
?>
And the Response headers are
HTTP/1.1 200 OK
Date: Tue, 15 Nov 2016 14:45:37 GMT
Server: Apache/2.4.16 (Win32) OpenSSL/1.0.1p PHP/5.6.12
X-Powered-By: PHP/5.6.12
Cache-Control: max-age=36000
Accept-Ranges: none
Vary: Accept-Encoding
Content-Encoding: gzip
Content-Length: 154
Keep-Alive: timeout=3, max=100
Connection: Keep-Alive
Content-Type: text/html; charset=UTF-8
When I saw Cache-Control: max-age=36000 in the headers, I expected the browser to cache the response for 36000 seconds, so that if I reloaded the page I would get the cached response (and different response headers). But I am getting the same headers after the reload, and the response comes straight from the server again.
After the reload, the request headers are:
GET /check.php HTTP/1.1
Host: localhost
Connection: keep-alive
Cache-Control: max-age=0
Upgrade-Insecure-Requests: 1
User-Agent: Mozilla/5.0 (Windows NT 6.3; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/54.0.2840.71 Safari/537.36
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8
Accept-Encoding: gzip, deflate, sdch, br
Accept-Language: en-US,en;q=0.8
Should I send any other response header to tell the browser to cache the response?
PHP (of course) adds some magical cache-control headers by itself. It is not possible to simply overwrite those with header(); you have to use session_cache_limiter() to set different cache-control headers, or session_cache_limiter('') to disable those magical headers altogether.
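A minimal sketch, assuming the page uses sessions (session_cache_limiter() must be called before session_start()):

<?php
session_cache_limiter('');              // suppress PHP's automatic session cache headers
session_start();
header("Cache-Control: max-age=36000"); // your own cache header, no longer overridden
?>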
Here are the request and response headers:
http://www.example.com/get/pdf
GET /~get/pdf HTTP/1.1
Host: www.example.com
User-Agent: Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.6; en-US; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-us,en;q=0.5
Accept-Encoding: gzip,deflate
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
Keep-Alive: 115
Connection: keep-alive
Referer: http://www.example.com
Cookie: etc
HTTP/1.1 200 OK
Date: Thu, 29 Apr 2010 02:20:43 GMT
Server: Apache/2.2.14 (Unix) mod_ssl/2.2.14 OpenSSL/0.9.8i DAV/2 mod_auth_passthrough/2.1 mod_bwlimited/1.4 FrontPage/5.0.2.2635
X-Powered-By: Me
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Pragma: no-cache
Cache-Control: private
Content-Disposition: attachment; filename="File #1.pdf"
Content-Length: 18776
Keep-Alive: timeout=5, max=100
Connection: Keep-Alive
Content-Type: text/html; charset=utf-8
----------------------------------------------------------
Basically, the response headers are sent by DOMPDF's stream() method.
In Firefox, the save prompt offers the file as File #1.pdf. In Safari, however, the file is saved as File #1.pdf.html.
Does anyone know why Safari is appending the html extension to the filename?
I'm also using Kohana 3, serving the PDF from a controller method.
From what I can see, the content type is incorrect; I believe that if it is fixed, your problem will be solved.
I fixed it by adding die(); after streaming it:
$dompdf = new DOMPDF();
$dompdf->set_paper("a4", "portrait");
$dompdf->load_html($html);
$dompdf->render();
$dompdf->stream($invoice.".pdf");
die();
Because you're telling it that it's HTML. Fix your MIME type.
Content-Type: text/html; charset=utf-8
You can change how Kohana 3 sends headers like so...
$this->request->headers['Content-Type'] = File::mime($file);
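In this case you already know the body is a PDF, so a one-line sketch using that same headers array would be:

$this->request->headers['Content-Type'] = 'application/pdf';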
This post started as a question on Server Fault (https://serverfault.com/questions/131156/user-receiving-partial-downloads), but I determined that our PHP script was the culprit, so I'm asking an updated question here about what I believe is the actual issue.
I am using a PHP script to verify permissions and then serve up a file for users of my website to download. Most of the time this works, but recently one user has been seeing problems with larger downloads: he only gets ~80% of files that are > 100MB in size. Also, all downloads from this script fail to report a file size. Further, tests revealed that the same user COULD reliably download each of the failed files when given a direct link (at which point the file size is reported).
Here's the relevant snippet of code that we are using to serve the file:
header("Content-type:$contenttype");
$len = filesize($filename);
header("Content-Length: $len");
header("Content-Disposition: attachment; filename=".$title.".".$ext);
readfile($filename);
Note that $contenttype, $filename, $title, and $ext are all set correctly before we get here. They have been triple-checked; none of them is the problem. Also, $len does hold the correct file size.
While researching this issue, I came across this post: Content-Length header always zero
It seems that I am encountering the same issue. When I use the script, I get chunked encoding on the file and no size is set for content-length. I'm hypothesizing that something is going wrong on the large downloads, leading him to get a zero-length chunk before the end of the file.
Here's what the headers look like for a direct request:
http://www.grinderschool.com/videos/zfff5061b65ae00e8b21/KillsAids021.wmv
GET /videos/zfff5061b65ae00e8b21/KillsAids021.wmv HTTP/1.1
Host: www.grinderschool.com
User-Agent: Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3 (.NET CLR 3.5.30729)
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-us,en;q=0.5
Accept-Encoding: gzip,deflate
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
Keep-Alive: 115
Connection: keep-alive
Referer: http://www.grinderschool.com/phpBB3/viewtopic.php?f=14&p=29468
Cookie: style_cookie=printonly; phpbb3_7c544_u=2; phpbb3_7c544_k=44b832912e5f887d; phpbb3_7c544_sid=e8852df42e08cc1b2250300c2897f78f; __utma=174624884.2719561324781918700.1251850714.1270986325.1270989003.575; __utmz=174624884.1264524375.411.12.utmcsr=google|utmccn=(organic)|utmcmd=organic|utmctr=low%20stakes%20poker%20videos; phpbb3_cmviy_k=; phpbb3_cmviy_u=2; phpbb3_cmviy_sid=d8df5c0943863004ca40ef9c392d371d; __utmb=174624884.4.10.1270989003; __utmc=174624884
Pragma: no-cache
Cache-Control: no-cache
HTTP/1.1 200 OK
Date: Sun, 11 Apr 2010 12:57:41 GMT
Server: Apache/2.2.14 (Unix) mod_ssl/2.2.14 OpenSSL/0.9.8l DAV/2 mod_auth_passthrough/2.1 FrontPage/5.0.2.2635
Last-Modified: Sun, 04 Apr 2010 12:51:06 GMT
Etag: "eb42d6-7d9b843-48368aa6dc280"
Accept-Ranges: bytes
Content-Length: 131708995
Keep-Alive: timeout=10, max=30
Connection: Keep-Alive
Content-Type: video/x-ms-wmv
And here's what they look like for the request answered by my script:
http://www.grinderschool.com/download_video_test.php?t=KillsAids021&format=wmv
GET /download_video_test.php?t=KillsAids021&format=wmv HTTP/1.1
Host: www.grinderschool.com
User-Agent: Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3 (.NET CLR 3.5.30729)
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-us,en;q=0.5
Accept-Encoding: gzip,deflate
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
Keep-Alive: 115
Connection: keep-alive
Cookie: style_cookie=printonly; phpbb3_7c544_u=2; phpbb3_7c544_k=44b832912e5f887d; phpbb3_7c544_sid=e8852df42e08cc1b2250300c2897f78f; __utma=174624884.2719561324781918700.1251850714.1270986325.1270989003.575; __utmz=174624884.1264524375.411.12.utmcsr=google|utmccn=(organic)|utmcmd=organic|utmctr=low%20stakes%20poker%20videos; phpbb3_cmviy_k=; phpbb3_cmviy_u=2; phpbb3_cmviy_sid=d8df5c0943863004ca40ef9c392d371d; __utmb=174624884.4.10.1270989003; __utmc=174624884
HTTP/1.1 200 OK
Date: Sun, 11 Apr 2010 12:58:02 GMT
Server: Apache/2.2.14 (Unix) mod_ssl/2.2.14 OpenSSL/0.9.8l DAV/2 mod_auth_passthrough/2.1 FrontPage/5.0.2.2635
X-Powered-By: PHP/5.2.11
Content-Disposition: attachment; filename=KillsAids021.wmv
Vary: Accept-Encoding
Content-Encoding: gzip
Keep-Alive: timeout=10, max=30
Connection: Keep-Alive
Transfer-Encoding: chunked
Content-Type: video/x-ms-wmv
So the question is: what can I do to make downloads from this script work properly? Again, for 99% of users it works as is (though I now find it annoying that no file size is reported, and thus that no time estimate can be computed for the download).
It's your GZIP compression. When you specify a content length but turn compression on, it gums everything up. It's happened to me a few times: try turning it off in your script.
Generally you'd turn it on with:
ob_start("ob_gzhandler");
...so just comment that line out. If that's not in your code, chances are there's a setting in your php.ini or in your apache.conf/conf.d.
Hope this helps!
Content-Encoding: gzip
Hmm. Presumably PHP's zlib.output_compression is doing that. (Doesn't look like Apache's mod_deflate.)
Try turning it off and see whether that is what's forcing the chunked encoding. You don't want to compress the download of a file type like WMV, which is already highly compressed.
However, chunked encoding would only explain the lack of size report. The download should still work. Is it possible you're being hit by a timeout (eg. PHP's set_time_limit, or Apache Timeout)?
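A minimal sketch of disabling compression just for the download script, covering both the ob_gzhandler and zlib.output_compression cases the answers above suggest (variable names taken from the question's snippet):

<?php
ini_set('zlib.output_compression', 'Off'); // must run before any output is sent
while (ob_get_level() > 0) {
    ob_end_clean(); // discard any buffer, e.g. one started with ob_start("ob_gzhandler")
}
header("Content-type: $contenttype");
header("Content-Length: " . filesize($filename));
header("Content-Disposition: attachment; filename=" . $title . "." . $ext);
readfile($filename);
exit;
?>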
If it is the script, have you tried using a substitute for the readfile() function that reads and outputs a bit at a time? The reasoning is that a memory limit may be reached somewhere, causing the failure.
From http://php.net/manual/en/function.readfile.php :
function readfile_chunked($filename) {
    $chunksize = 1 * (1024 * 1024); // how many bytes per chunk (1 MB)
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        $buffer = fread($handle, $chunksize);
        print $buffer;
        // Push each chunk out to the client right away instead of letting it
        // accumulate in an output buffer.
        if (ob_get_level() > 0) {
            ob_flush();
        }
        flush();
    }
    return fclose($handle);
}
Also, try to flush the output as often as you can, as the loop above does.
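Usage is then a drop-in replacement for the readfile() call in the question's snippet (same $filename variable as above):

// Send the same Content-Type / Content-Length / Content-Disposition headers
// as before, then stream the file in 1 MB chunks:
if (readfile_chunked($filename) === false) {
    error_log("readfile_chunked: could not open $filename");
}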