Content-Length header from PHP is overwritten! - php

I'm trying to figure out why the Content-Length header set by PHP gets overwritten.
This is demo.php:
<?php
header("Content-Length: 21474836470");die;
?>
A request to fetch the headers:
curl -I http://someserver.com/demo.php
HTTP/1.1 200 OK
Date: Tue, 19 Jul 2011 13:44:11 GMT
Server: Apache/2.2.16 (Debian)
X-Powered-By: PHP/5.3.3-7+squeeze3
Content-Length: 2147483647
Cache-Control: must-revalidate
Content-Type: text/html; charset=UTF-8
See the Content-Length? It maxes out at 2147483647 bytes, i.e. 2 GB.
Now if I modify demo.php like so:
<?php
header("Dummy-header: 21474836470");die;
?>
the header is not overwritten.
HTTP/1.1 200 OK
Date: Tue, 19 Jul 2011 13:49:11 GMT
Server: Apache/2.2.16 (Debian)
X-Powered-By: PHP/5.3.3-7+squeeze3
Dummy-header: 21474836470
Cache-Control: must-revalidate
Content-Type: text/html; charset=UTF-8
Here are the modules loaded:
root@pat:/etc/apache2# ls /etc/apache2/mods-enabled/
alias.conf authz_host.load dav_fs.load expires.load php5.conf reqtimeout.load status.conf
alias.load authz_user.load dav.load headers.load php5.load rewrite.load status.load
auth_basic.load autoindex.conf dav_lock.load mime.conf proxy.conf setenvif.conf
authn_file.load autoindex.load dir.conf mime.load proxy_http.load setenvif.load
authz_default.load cgi.load dir.load negotiation.conf proxy.load ssl.conf
authz_groupfile.load dav_fs.conf env.load negotiation.load reqtimeout.conf ssl.load
Here is a phpinfo() : http://pastehtml.com/view/b0z02p8zc.html
Apache does support files over 2 GB, as I don't have any problem accessing a large file directly:
curl -I http://www.someserver.com/somehugefile.zip (5.3 Gig)
HTTP/1.1 200 OK
Date: Tue, 19 Jul 2011 14:00:25 GMT
Server: Apache/2.2.16 (Debian)
Last-Modified: Fri, 15 Jul 2011 08:50:22 GMT
ETag: "301911-1548e4b11-4a817bd63ef80"
Accept-Ranges: bytes
Content-Length: 5713578769
Cache-Control: must-revalidate
Content-Type: application/zip
Here is uname -a:
Linux pat.someserver.com 2.6.38.2-grsec-xxxx-grs-ipv6-32 #1 SMP Fri Apr 15 17:41:28 UTC 2011 i686 GNU/Linux
Hope somebody can help!
cheers

Seems like PHP casts Content-Length to int.
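A quick way to see the 32-bit ceiling, assuming a 32-bit PHP build like the one in the phpinfo() above (whether the cast happens in PHP or in the Apache SAPI isn't confirmed here):
<?php
// On a 32-bit build, PHP_INT_MAX is 2147483647 (2^31 - 1).
var_dump(PHP_INT_MAX);          // int(2147483647)
// Casting the oversized value to int clamps it to that maximum,
// which matches the Content-Length seen in the curl output above.
var_dump((int) "21474836470");  // int(2147483647) on 32-bit
?>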

Yes, it's certainly a 32-bit thing. I don't want to tweak or recompile PHP, so for the time being I will check the file size and, if it's over 2 GB, not send the header (see the sketch below).
thank you all for your input
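A minimal sketch of that workaround, assuming the PHP script itself is serving the download (the path is just an example; note that filesize() can also misreport files over 2 GB on 32-bit builds):
<?php
// Only send Content-Length when the size fits in a 32-bit signed int;
// otherwise omit the header and let the client read until the response ends.
$file = '/path/to/somehugefile.zip'; // example path
$size = filesize($file);             // may be wrong or false for > 2 GB files on 32-bit PHP

header('Content-Type: application/zip');
if ($size !== false && $size <= 2147483647) {
    header('Content-Length: ' . $size);
}
readfile($file);
?>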

Related

Amazon s3 - Call file after upload

I'm using a PHP script to upload a file to Amazon S3:
$s3->putObjectFile($url,$bucket,$filename, S3::ACL_PRIVATE,array(),array("Content-Type" => "application/json"));
Immediately after the upload, I need to get some information from the file, so I request it through CloudFront, but I see that the file only becomes available after a few minutes.
Do you know why and if there is a way to sort this out?
Is there a way to have the file immediately available?
Thank you
It was immediately available for me with this test...
$ date;aws s3 cp foo s3://BUCKET/foo;date;curl -I http://dxxx.cloudfront.net/foo;date
Thu Aug 6 04:30:01 UTC 2015
upload: ./foo to s3://BUCKET/foo
Thu Aug 6 04:30:02 UTC 2015
HTTP/1.1 200 OK
Content-Type: text/html
Content-Length: 24
Connection: keep-alive
Date: Thu, 06 Aug 2015 04:30:03 GMT
x-amz-version-id: null
Last-Modified: Thu, 06 Aug 2015 04:30:03 GMT
ETag: "5089fc210a3df99a6a04fe64c929d8e7"
Accept-Ranges: bytes
Server: AmazonS3
X-Cache: Miss from cloudfront
Via: 1.1 fed2ed9d8e7c2b37559eac3b2a2386fc.cloudfront.net (CloudFront)
X-Amz-Cf-Id: gEaAkLCqstyAdakXkBvMmSJVrfKjKTD5K9WoyZBHvVZufWuuhclNLQ==
Thu Aug 6 04:30:02 UTC 2015
Perhaps your upload is happening asynchronously and hadn't completed yet?
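A hedged sketch of how to check that, assuming the same standalone S3.php class as in the question (getObjectInfo() is assumed to exist in that library): verify the PUT's return value, then poll briefly before requesting the object through CloudFront.
<?php
// Verify the upload succeeded before requesting the object through CloudFront.
// Method names are assumed from the standalone S3.php class used in the question.
$ok = $s3->putObjectFile($url, $bucket, $filename, S3::ACL_PRIVATE, array(),
                         array("Content-Type" => "application/json"));
if (!$ok) {
    die("Upload failed");
}

// Poll briefly until S3 reports the object; for brand-new keys this is usually immediate.
for ($i = 0; $i < 10; $i++) {
    if ($s3->getObjectInfo($bucket, $filename) !== false) {
        break; // object is visible, safe to fetch via CloudFront now
    }
    sleep(1);
}
?>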

Browser not automatically decompressing gz file

I have the same website, with the same CSS file being gzipped, served on two separate servers. Viewing the site on one server, the browser properly decompresses the file and applies the styling, but on the other, the browser does not decompress it. I thought this might have something to do with the headers, but every resource I've found suggests that Content-Type and Content-Encoding are the only two headers that matter for gzip decompression, and those are the same on both servers. Is there another response header that is incorrect?
The working response headers for the .css.gz file:
HTTP/1.1 200 OK
Cache-Control: public, max-age=604800, must-revalidate
Accept-Ranges: bytes
Content-Type: text/css
Age: 353722
Date: Tue, 07 Apr 2015 21:44:23 GMT
Last-Modified: Tue, 29 Oct 2013 17:44:18 GMT
Expires: Fri, 10 Apr 2015 19:29:01 GMT
Content-Length: 33130
Connection: keep-alive
The response headers for the .css.gz file that don't seem to work:
HTTP/1.1 200 OK
Date: Wed, 08 Apr 2015 15:14:11 GMT
Content-Type: text/css
Last-Modified: Tue, 07 Apr 2015 22:42:25 GMT
Transfer-Encoding: chunked
Connection: close
Content-Encoding: gzip
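One way to rule out differences in the two servers' configs is to serve the pre-compressed file from PHP with only the two headers discussed above; a minimal sketch (the path is an example):
<?php
// Serve a pre-compressed stylesheet with an explicit type and encoding,
// so the browser knows to decompress it. The path is just an example.
$file = '/var/www/css/style.css.gz';

header('Content-Type: text/css');
header('Content-Encoding: gzip');
header('Content-Length: ' . filesize($file));
readfile($file);
?>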

Change header order in Apache

How do I change the header order from the following:
HTTP/1.1 200 OK
Date: Sat, 24 Aug 2013 05:10:06 GMT
Content-Length: 0
Server: apache
Pragma: no-cache
X-Powered-By: PHP5
to:
HTTP/1.1 200 OK
Pragma: no-cache
Content-Length: 0
Server: apache
X-Powered-By: PHP5
Date: Sat, 24 Aug 2013 05:10:06 GMT
in Apache?

PHP HTTP header: how to keep/rebuild Apache2's Last-Modified & ETag

Calling a .html file on my website directly, the headers are:
HTTP/1.1 200 OK
Date: Tue, 07 May 2013 14:53:30 GMT
Server: Apache
Last-Modified: Tue, 24 Aug 2012 21:51:42 GMT
ETag: "1431a086-1e01-78e98c5498f1c"
Accept-Ranges: bytes
Content-Length: 7681
Vary: Accept-Encoding
Content-Type: text/html
Now the request is forwarded through a PHP script (the PHP script here is only used to filter some words from the HTML via a regex before delivering it, and to add a footer to every page), and the headers look like:
HTTP/1.1 200 OK
Date: Tue, 07 May 2013 14:52:50 GMT
Server: Apache
Vary: User-Agent,Accept-Encoding
Content-Type: text/html
Question: how can I keep "Last-Modified: ..." and "ETag: ..."?
Thanks=)
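A hedged sketch of one way to rebuild those headers in the forwarding script, assuming it knows the path of the original .html file (the path and the ETag scheme here are examples, not Apache's own inode-size-mtime format), including a conditional 304 reply:
<?php
// Rebuild Last-Modified and ETag from the original file the script filters,
// and answer conditional requests with 304. Path and ETag scheme are examples.
$source = '/var/www/html/page.html';
$mtime  = filemtime($source);
$etag   = '"' . md5($mtime . filesize($source)) . '"';

header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $mtime) . ' GMT');
header('ETag: ' . $etag);

$ifModified  = isset($_SERVER['HTTP_IF_MODIFIED_SINCE'])
    ? strtotime($_SERVER['HTTP_IF_MODIFIED_SINCE']) : false;
$ifNoneMatch = isset($_SERVER['HTTP_IF_NONE_MATCH'])
    ? trim($_SERVER['HTTP_IF_NONE_MATCH']) : false;

if (($ifNoneMatch && $ifNoneMatch === $etag) ||
    (!$ifNoneMatch && $ifModified && $ifModified >= $mtime)) {
    header('HTTP/1.1 304 Not Modified');
    exit;
}

// ... filter the HTML, append the footer, and echo it as before ...
?>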

Google Chrome audit on caching

If I run an audit on my sites with Google Chrome, I get this message in the Leverage browser caching section:
The following resources are missing a cache expiration. Resources that do not specify an expiration may not be cached by browsers:
A list of all the pictures follows. I get a similar notice in Leverage proxy caching:
Consider adding a "Cache-Control: public" header to the following resources:
Apart from pictures, I also get a notice about HTML, CSS and JavaScript files:
The following resources are explicitly non-cacheable. Consider making them cacheable if possible:
It's funny because I've worked hard to cache all static content (except for pictures, where I just left Apache's default settings). Firefox does indeed store all these items in its cache.
Is there anything I should improve in my HTTP headers?
Here's the complete set of headers for some items, as loaded after clearing the browser cache. Pictures use default settings I hadn't really checked before; the rest should be cached for three hours. I can set headers with both .htaccess and PHP.
PNG
HTTP/1.1 200 OK
Date: Sat, 31 Jul 2010 12:46:14 GMT
Server: Apache
Last-Modified: Thu, 18 Mar 2010 21:40:54 GMT
Etag: "c48024-230-4821a15d6c580"
Accept-Ranges: bytes
Content-Length: 560
Keep-Alive: timeout=4
Connection: Keep-Alive
Content-Type: image/png
HTML
HTTP/1.1 200 OK
Date: Sat, 31 Jul 2010 12:46:13 GMT
Server: Apache
X-Powered-By: PHP/5.2.11
Expires: Sat, 31 Jul 2010 15:46:13 GMT
Cache-Control: max-age=10800, s-maxage=10800, must-revalidate, proxy-revalidate
Content-Encoding: gzip
Vary: Accept-Encoding
Last-Modified: Wed, 24 Mar 2010 20:30:36 GMT
Keep-Alive: timeout=4
Connection: Keep-Alive
Transfer-Encoding: chunked
Content-Type: text/html; charset=ISO-8859-15
CSS
HTTP/1.1 200 OK
Date: Sat, 31 Jul 2010 12:48:21 GMT
Server: Apache
X-Powered-By: PHP/5.2.11
Expires: Sat, 31 Jul 2010 15:48:21 GMT
Cache-Control: max-age=10800, s-maxage=10800, must-revalidate, proxy-revalidate
Content-Encoding: gzip
Vary: Accept-Encoding
Last-Modified: Thu, 18 Mar 2010 21:40:12 GMT
Keep-Alive: timeout=4
Connection: Keep-Alive
Transfer-Encoding: chunked
Content-Type: text/css
JavaScript
HTTP/1.1 200 OK
Date: Sat, 31 Jul 2010 12:48:21 GMT
Server: Apache
X-Powered-By: PHP/5.2.11
Expires: Sat, 31 Jul 2010 15:48:21 GMT
Cache-Control: max-age=10800, s-maxage=10800, must-revalidate, proxy-revalidate
Content-Encoding: gzip
Vary: Accept-Encoding
Last-Modified: Thu, 18 Mar 2010 21:40:12 GMT
Keep-Alive: timeout=4
Connection: Keep-Alive
Transfer-Encoding: chunked
Content-Type: application/x-javascript
Update
I've tested Jumby's suggestion and set my CSS's expiry to one year:
Cache-Control:max-age=31536000, s-maxage=31536000, must-revalidate, proxy-revalidate
Connection:Keep-Alive
Content-Encoding:gzip
Content-Length:4198
Content-Type:text/css
Date:Mon, 02 Aug 2010 20:48:56 GMT
Expires:Tue, 02 Aug 2011 20:48:56 GMT
Keep-Alive:timeout=5, max=99
Last-Modified:Thu, 18 Mar 2010 20:40:12 GMT
Server:Apache/2.2.14 (Win32) PHP/5.3.1
Vary:Accept-Encoding
X-Powered-By:PHP/5.3.1
However, Chrome still claims "explicitly non-cacheable".
A 3-hour expiry might not be enough "time" for the YSlow / Page Speed tools, and they might complain about it. I have seen this with static content on my sites with a 4-hour expiration and YSlow (haven't tried Google's tool).
Most of those want versioned static content with LONG expiry times (like 1 year); see here
The problem is the "must-revalidate" part of your Cache-Control directive. Get rid of that, and you should be good to go.
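For example, if those headers are set from PHP, a sketch of the same directives without must-revalidate (the one-year values are taken from the update above):
<?php
// Long-lived, proxy-cacheable headers without must-revalidate / proxy-revalidate.
$maxAge = 31536000; // one year, as in the update above
header("Cache-Control: public, max-age=$maxAge, s-maxage=$maxAge");
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + $maxAge) . ' GMT');
?>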
I just hit a similar issue: I found that the very same setup and code produce a Chrome audit warning when testing on my local server at 127.0.0.1, but not on the real server with a real DNS name.
