How to print out whether the current web page is compressed? - php

I am using a .htaccess file to compress my .php, .css and .js files. What I want to do is echo out in my footer whether GZIP is enabled or not.
For example, if it's not enabled by the server, echo out "GZIP is not enabled"; if it is, echo out "GZIP is enabled".
I've looked around the net and couldn't find a PHP script that does this. I thought I could make it work with ini_get(), but obviously without success.
Can someone give me a tip?
Thanks :)

I have trouble understanding your question. If you want to find out whether the gzip encoding of your server and PHP configuration works, you need to check it with an additional program that sends requests to the URLs in question and checks whether gzip is used. One such tool is curl on the command line:
$ curl --compressed --raw -i URL
Example:
$ curl --compressed -I https://stackoverflow.com/questions/8029168/show-is-my-gzip-enabled-on-my-webpage
HTTP/1.1 200 OK
Cache-Control: public, max-age=18
Content-Length: 8683
Content-Type: text/html; charset=utf-8
*****************************
* Content-Encoding: deflate *
*****************************
Expires: Sun, 06 Nov 2011 18:21:05 GMT
Last-Modified: Sun, 06 Nov 2011 18:20:05 GMT
Vary: *
Date: Sun, 06 Nov 2011 18:20:46 GMT
See the line I highlighted:
Content-Encoding: deflate
It signals that gzip-style compression (here deflate, a close relative of gzip) is active. See RFC 2616, HTTP/1.1, section 14.11 Content-Encoding for the full meaning of this header.
More information is available in a related question/answer: PHP Output buffering, Content Encoding Error caused by ob_gzhandler?.
If that's not your issue maybe Gzip a website with inline PHP code is helpful.
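If you want the check to run from PHP itself, one option is to have the script request its own URL and inspect the Content-Encoding response header. Here is a minimal sketch, assuming the curl extension is available; the URL is a placeholder you would replace with your own page:
<?php
// Sketch: fetch this site's own page with gzip/deflate accepted and
// report whether the server actually compressed the response.
// 'http://example.com/index.php' is a placeholder, not a real endpoint.
$ch = curl_init('http://example.com/index.php');
curl_setopt_array($ch, array(
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_HEADER         => true,           // include headers in the output
    CURLOPT_ENCODING       => 'gzip,deflate', // sends the Accept-Encoding header
));
$response   = (string) curl_exec($ch);
$headerSize = curl_getinfo($ch, CURLINFO_HEADER_SIZE);
curl_close($ch);

$headers = substr($response, 0, $headerSize);
echo preg_match('/^Content-Encoding:\s*(gzip|deflate)/mi', $headers)
    ? 'GZIP is enabled'
    : 'GZIP is not enabled';
?>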

Related

Content-Length header on HHVM FastCGI is intermittent

Maybe it's the jetlag, but I'm failing to make PHP/HHVM give me the Content-Length header when I need it.
I've deployed the full stack (MySQL, HHVM, Nginx) on a Vagrant machine and I've managed to reproduce the issue on a test script:
<?php
$file = '/usr/share/doc/iptables/html/NAT-HOWTO.html'; # random test file
header('Content-Length: ' . filesize($file));
readfile($file); # readfile() writes the file to output itself; echoing its return value would append the byte count
?>
If you examine the headers with curl:
hostname:~ jsimpson$ curl -I http://vagrant/test.php
HTTP/1.1 200 OK
Server: nginx/1.4.6 (Ubuntu)
Date: Tue, 16 Sep 2014 22:09:25 GMT
Content-Type: text/html; charset=utf-8
Content-Length: 2592
Connection: keep-alive
X-Powered-By: HHVM/3.2.0
Content-Encoding: none;
We have a Content-Length header. However, if we hit the same URL from Chrome and read the headers from the dev tools:
HTTP/1.1 200 OK
Server: nginx/1.4.6 (Ubuntu)
Date: Tue, 16 Sep 2014 22:14:41 GMT
Content-Type: text/html; charset=utf-8
Transfer-Encoding: chunked
Connection: keep-alive
X-Powered-By: HHVM/3.2.0
Content-Encoding: gzip
No Content-Length header. I've also packet-sniffed this to verify that the header isn't sent. If I switch to PHP-FPM, it sends the header.
I reproduced the issue by hitting the server with:
curl -H 'Accept-Encoding: gzip,deflate' --compressed -v http://foo/bar
HHVM enables compression by default. Disabling that gave me the header back.
Everything was awesome after adding this to /etc/hhvm/server.ini:
hhvm.server.gzip_compression_level = 0
I stumbled upon this issue/feature too, though not running HHVM; pure nginx + PHP-FPM.
The thing is, if your PHP app calculates and sets the Content-Length header field and nginx is configured to gzip the content, nginx will just ditch that information and replace it with chunked transfer encoding and gzip headers.
So do not set gzip_min_length to a very small value; the default is 20 bytes, which can do quite the opposite (the end result is larger than before gzip compression).
I set it like this:
gzip_min_length 1024;
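For context, the surrounding nginx configuration might look roughly like this; the values are illustrative, not recommendations:
gzip on;
gzip_min_length 1024;   # skip responses smaller than about 1 KiB
gzip_types text/css application/javascript application/json;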

php error compression mod_deflate gzip compression

I followed these guides: compression 1
and
compression 2
How do I compress with mod_deflate? I followed the links above and managed to compress, but the browser cannot read the compressed output.
I get an error when I open the CSS and JS files in a browser:
SyntaxError: illegal character
��{{�F�7���"ƫ
and
KÊO©¬NJLÎN/Ê/ÍKÑMÎÏÉ/²RN2H6II­�ó>å¿rŸ—Ï9?ýSúg½/úž;RSŸ‘‡™§çZV†¤ü”Êê¤Ääìô¢üÒ¼Ýäüœü"+å$ƒd“”ÔZ�
Response Headers
Connection: Keep-Alive
Content-Encoding: gzip
Content-Length: 127
Content-Type: text/css
Date: Wed, 18 Dec 2013 07:15:22 GMT
Expires: Fri, 17 Jan 2014 14:15:22 +0700
Keep-Alive: timeout=5, max=100
Last-Modified: Wed, 18 Dec 2013 13:17:18 +0700
Server: Apache/2.2.25 (Win32) PHP/5.3.26
Vary: Accept-Encoding
X-Powered-By: PHP/5.3.26
Can anybody help, please? Thanks.
It looks like you are compressing the code twice, once with PHP and once with mod_deflate.
Could this be the case?
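A quick way to check this from the PHP side is to see whether PHP itself is already compressing the output. This is a sketch, assuming the double compression comes from zlib.output_compression or an ob_gzhandler buffer:
<?php
// Sketch: detect whether PHP is already compressing the output.
// If it is, and mod_deflate is active too, the body gets gzipped twice.
$phpCompresses = ini_get('zlib.output_compression')
    || in_array('ob_gzhandler', ob_list_handlers(), true);

if ($phpCompresses) {
    // Disable one of the two layers, e.g. in .htaccess:
    //   php_flag zlib.output_compression Off
    // or remove the ob_start('ob_gzhandler') call from the script.
}
?>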

garbled xml output

I'm trying to parse an xml feed using PHP:
http://trustbox.trustpilot.com/r/travelnation.co.uk.xml
Visiting this, it looks perfectly OK, but when I try
<?php
$file = file_get_contents("http://trustbox.trustpilot.com/r/netamity.com.xml");
print_r($file);
?>
I get
‹•SÁŽÓ0=/ÿ`ŒÄmœ- 븊àèJV«••L«ŽmÙN²ý{Æi·M
...
How is it getting garbled? Unsurprisingly, simplexml won't parse it. I've tried setting UTF-8 headers, but I think the issue is in file_get_contents. Any ideas?
The content looks "weird" simply because the response body is compressed; see the HTTP header Content-Encoding: gzip below:
HTTP/1.1 200 OK
x-amz-id-2: 8wYarFnod0jtLJ3U8ZDN38102fjtG+EbwJjy0tY4YTZncrz9auEcQbzt1vyiSEhq
x-amz-request-id: A60F1E6CA5437776
Date: Sun, 24 Feb 2013 18:00:45 GMT
Content-Encoding: gzip
Last-Modified: Sun, 24 Feb 2013 05:19:11 GMT
ETag: "64eaa6f87768aeb3ae6741ba06318cb6"
Accept-Ranges: bytes
Content-Type: application/xhtml+xml
Content-Length: 52366
Server: AmazonS3
I guess what you need is to know how to read a compressed file over HTTP; you could try this question on SO.
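If you would rather keep file_get_contents(), one option is to decompress the body yourself before parsing. A sketch, assuming PHP >= 5.4 for gzdecode():
<?php
// Sketch: the feed arrives gzip-compressed, so decompress it before
// handing it to simplexml. gzdecode() requires PHP >= 5.4.
$raw = file_get_contents("http://trustbox.trustpilot.com/r/netamity.com.xml");
$xml = simplexml_load_string(gzdecode($raw));
print_r($xml);
?>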

Downloading using wget in php

I am using wget in a PHP script to download images from URLs submitted by the user. Is there some way for me to determine the size of an image before actually downloading it, and to restrict the download to 1 MB? Also, can I check that the URL points to an image only, and not an entire website, without downloading it?
I don't want to end up filling my server with malware.
Before downloading you can check the headers (you will have to fetch the headers themselves, though). I use curl, not wget. Here's an example:
$ curl --head http://img.yandex.net/i/www/logo.png
HTTP/1.1 200 OK
Server: nginx
Date: Sat, 16 Jun 2012 09:46:36 GMT
Content-Type: image/png
Content-Length: 3729
Last-Modified: Mon, 26 Apr 2010 08:00:35 GMT
Connection: keep-alive
Expires: Thu, 31 Dec 2037 23:55:55 GMT
Cache-Control: max-age=315360000
Accept-Ranges: bytes
Content-Type and Content-Length should normally tell you whether the URL points to an image and how large it is.
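The same check can be done from PHP before committing to the download. A sketch, assuming the curl extension and using the example image above:
<?php
// Sketch: issue a HEAD request and only proceed when the declared type
// looks like an image and the declared size is within the 1 MB cap.
// Headers can lie, so the actual download should be capped as well.
$ch = curl_init('http://img.yandex.net/i/www/logo.png');
curl_setopt_array($ch, array(
    CURLOPT_NOBODY         => true, // HEAD request, like curl --head
    CURLOPT_RETURNTRANSFER => true,
));
curl_exec($ch);
$type = (string) curl_getinfo($ch, CURLINFO_CONTENT_TYPE);
$size = curl_getinfo($ch, CURLINFO_CONTENT_LENGTH_DOWNLOAD); // -1 if unknown
curl_close($ch);

if (strpos($type, 'image/') === 0 && $size > 0 && $size <= 1024 * 1024) {
    // Declared type and size look fine; proceed with a size-capped download.
}
?>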

What is removing my headers in PHP/Apache2?

I'm using PHP 5.3 and Apache 2.0 to serve a script which adds a number of headers to the output:
header('HTTP/1.1 200 OK');
header("Content-Type: application/json");
header("Last-Modified: $lastmode"); // $lastmod = Tue, 01 Mar 2011 14:56:22 +0000
header("Etag: $etag"); // Etag = 5da02274fcad09a55f4d74467f66a864
Now, all the headers come through except for the Last-Modified and Etag. In my httpd.conf I have the following:
Header unset Cache-Control
Header unset Pragma
But in my response I get:
HTTP/1.1 200 OK
Date: Tue, 01 Mar 2011 16:49:10 GMT
Server: Apache/2.2.14 (EL) mod_ssl/2.2.14 OpenSSL/0.9.8e-fips-rhel5
Keep-Alive: timeout=15, max=8000
Connection: Keep-Alive
Transfer-Encoding: chunked
Content-Type: application/json
Expires: 0
Cache-Control: no-cache
My Last-Modified and Etag headers are gone and we have Cache-Control and Expires in their place.
I should also mention that I have disabled mod_expires, to no avail. I am pulling my hair out at this point, as no matter what I do the headers are simply not there. What could possibly be causing them to be removed?
Thanks,
J
UPDATE: It seems that Apache is adding the extra headers after PHP has shut down, and I suspect it is also removing the headers I set above. Registering a shutdown function in PHP and calling apache_response_headers() shows:
Pragma=
Expires=
Etag=5da02274fcad09a55f4d74467f66a864
Last-Modified=Tue, 01 Mar 2011 14:56:22 +0000
Keep-Alive=timeout=15, max=8000
Connection=Keep-Alive
Transfer-Encoding=chunked
Content-Type=application/json
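For reference, the shutdown hook that produced the dump above can be as simple as this sketch; note that apache_response_headers() only exists under the Apache SAPI:
<?php
// Sketch: log the headers PHP believes it is sending at shutdown,
// to compare against what the client (or an intercepting proxy) receives.
register_shutdown_function(function () {
    if (function_exists('apache_response_headers')) {
        error_log(print_r(apache_response_headers(), true));
    }
});
?>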
To answer my own question: it was the tool I was using to debug that was giving me grief. It was the mod_expires module causing the problem in the first place, but the reason that removing it had no effect was that the proxy I was using to debug (Charles) seems to modify the headers. Once Charles was taken out of the loop, my headers were there!
