Sorry to bother you with a question that seems very well documented, but none of the things I tried was completely satisfying.
My company switched from a hosting package to a managed server just last week, and I'm still in the process of optimising it.
Now Google's PageSpeed Insights tells me the site is not compressed, as does GTmetrix, while GidNetwork tells me compression works fine.
I have already added
<IfModule mod_deflate.c>
<FilesMatch "\.(html|php|txt|xml|js|css)$">
SetOutputFilter DEFLATE
</FilesMatch>
</IfModule>
to my .htaccess (as recommended here), which works correctly; other settings I've changed are fine as well. I have also added
zlib.output_compression = On
to my php.ini.
The entire .htaccess and php.ini can be seen at jsFiddle.
Headers sent and received in both Firefox and Chrome claim that compression is happening.
I also created an httpd.conf in my home directory, because none existed on my server yet. Should I move the file somewhere else?
What I really want to know:
So... what am I doing wrong? Is the output compressed or not? How can I make Google 'see' the compression?
Thank you very much for your help.
This should be the function you need; it automatically generates the headers you need:
http://www.php.net/manual/en/function.gzdecode.php
Anyway, check your PHP version, because it only works with PHP 5.4.0 or later.
Although my browser accepts deflate/gzip encoding (Accept-Encoding: gzip, deflate), your server does not answer using compressed data:
HTTP/1.1 200 OK
Date: Tue, 11 Mar 2014 09:41:45 GMT
Server: Apache
Connection: Keep-Alive
Keep-Alive: timeout=2, max=200
Etag: "96-100e9-4f4517d791912"
Expires: Fri, 11 Apr 2014 13:28:25 GMT
Cache-Control: max-age=2692000, public
Vary: User-Agent
If it was compressed, the server would send also
Content-Encoding: deflate
Use Firebug or the dev console to see the headers. It must be your httpd.conf.
You cannot simply create this file in your home directory and expect Apache to load it.
Have a look in /etc/apache* and /etc/httpd/* for config files.
You have already enabled gzip compression, but you haven't set it to compress some file types such as JavaScript and CSS. That is why Google's PageSpeed suggested enabling compression. To enable compression for those two types, use
AddOutputFilterByType DEFLATE text/css
AddOutputFilterByType DEFLATE application/javascript
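A fuller variant of the same directive, wrapped in an `<IfModule>` guard so the config still parses if mod_deflate is absent, might look like this (the MIME-type list below is a common starting point, not taken from your server):

```apache
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/plain text/xml
    AddOutputFilterByType DEFLATE text/css
    AddOutputFilterByType DEFLATE application/javascript application/x-javascript
</IfModule>
```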
What worked for me in the end to compress the JS/CSS files that were left uncompressed (for reasons I don't quite understand yet) is described here.
In this method, JavaScript and CSS files are forcibly gzipped by a PHP script. It also sets new Expires headers, so if you want caching for more than 5 minutes, change the number. Adding different file types is trickier, but possible, I think.
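The general idea can be sketched roughly as follows. This is only a hypothetical reconstruction, not the linked script: the `gz.php` name, the file whitelist, and the 5-minute lifetime are all assumptions.

```php
<?php
// gz.php (hypothetical name): force-gzip a whitelisted static file via PHP.
// The whitelist prevents this script from serving arbitrary files.
$allowed = array(
    'style.css' => 'text/css',
    'app.js'    => 'application/javascript',
);
$file = isset($_GET['f']) ? basename($_GET['f']) : '';
if (!isset($allowed[$file]) || !is_file(__DIR__ . '/' . $file)) {
    header('HTTP/1.1 404 Not Found');
    exit;
}
header('Content-Type: ' . $allowed[$file]);
// 300 seconds = the 5-minute lifetime mentioned above; raise it for longer caching
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + 300) . ' GMT');
// Compresses the output only if the client sent Accept-Encoding: gzip
// (assumes zlib.output_compression is off, otherwise the two would clash)
ob_start('ob_gzhandler');
readfile(__DIR__ . '/' . $file);
```

Pages would then reference e.g. `gz.php?f=style.css` instead of `style.css`.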
Related
I have a download script written in PHP which works fine, but I've been running into problems allowing users to download .zip files of between 300 and 600 MB from a hosted server. Files of around 70 MB download fine, but larger ones are either incomplete or corrupt. I've been reading that this problem could be caused by gzip being enabled on the server.
As the server is hosted, I have no way of editing the Apache config file to disable gzip. Is there another way? I've tried adding the code below
to the .htaccess file that I created in the same directory as the PHP download script, but it did not disable gzip.
#Disable GZIP in directory
RewriteRule ^(.*)$ $1 [NS,E=no-gzip:1,E=dont-vary:1]
What can I do to disable gzip?
These are the response headers sent back when I try to download:
Cache-Control: max-age=604800
Connection: keep-alive
Content-Encoding: gzip
Content-Type: text/html; charset=UTF-8
Date: Tue, 13 Feb 2018 16:12:05 GMT
Expires: Tue, 20 Feb 2018 16:12:05 GMT
Server: nginx/1.12.2
Transfer-Encoding: chunked
Is there any clever way to trick nginx into skipping gzip if the backend has already set the Content-Encoding header?
Nginx is configured to gzip the output from the php fastcgi backend.
This works great in 99% of the cases.
Except that on rare occasions PHP will send a raw gzipped file and attach a Content-Encoding: gzip header.
Unfortunately, nginx will go right ahead and try to gzip that content a second time.
This produces a doubled Content-Encoding: gzip header and a double-encoded gzipped body.
Most modern browsers (Firefox, Chrome) can handle this.
IE8 cannot, mobile Safari cannot, and old Safari 5 for Windows cannot: they show garbled gzipped content instead, because they merge the Content-Encoding headers and only decode the gzipped body once.
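The single-decode failure is easy to reproduce offline. This sketch uses Python's gzip module purely to illustrate what those browsers end up with:

```python
import gzip

body = b"<html>hello</html>" * 50
once = gzip.compress(body)    # what the PHP backend sends
twice = gzip.compress(once)   # what nginx sends after re-encoding

# A browser that decodes only once is left holding gzip bytes,
# not HTML: this is the "garbled" content described above.
assert gzip.decompress(twice) == once
assert gzip.decompress(twice) != body

# Only decoding twice recovers the original body.
assert gzip.decompress(gzip.decompress(twice)) == body
```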
Thanks for any ideas.
Somewhere in nginx.conf where it applies (there should be a fastcgi_params file somewhere) :
fastcgi_param HTTP_ACCEPT_ENCODING "";
This will disable the encoding coming from the backend.
I hope nginx will still serve encoded content to the client after this (I am not sure).
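In context, that parameter might sit in a block like the following. The location block and socket path are assumptions about a typical PHP-FPM setup, not your actual config:

```nginx
location ~ \.php$ {
    include fastcgi_params;
    # Hide the client's Accept-Encoding from the backend so PHP never
    # gzips on its own; nginx's gzip module then compresses exactly once.
    fastcgi_param HTTP_ACCEPT_ENCODING "";
    fastcgi_pass unix:/run/php-fpm.sock;  # assumed socket path
}
```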
I'm on a shared hosted linux server so I must use .htaccess. I'm busy working on compressing and caching things. I actually have two questions, but first here is how I have the cache setup in my .htaccess file.
ExpiresActive on
<FilesMatch "\.(bmp|png|ico|gff|jpg|jpeg|gif|gcf)$">
FileETag MTime Size
ExpiresDefault "now plus 60 minutes"
</FilesMatch>
Question 1: this does cache those things, with the exception of one PNG file.
Most of my files are lowercase, as I'm on Linux, but a few PNG files have slipped through with an uppercase extension.
What's strange is that all of the PNG files cache except for one called addon2.PNG. At first I thought it was because of the case, but I've checked, and I have 3 other PNG files with uppercase extensions, which Google Page Speed says are cached. So any ideas or is Google Page Speed just B.S.?
And question 2: as I'm wary after my host's mess-up with their Varnish issue, I'm adding things to the cache a little at a time and waiting to see if anything breaks. When I try to cache HTML files, the login/logout features of my site (written in PHP) stop working:
you have to log in and refresh, or log out and refresh. I'm wondering, is that because the page HTML is output via the PHP file? All my main pages are PHP and I only have a few actual HTML files. I thought caching HTML would only affect files with .htm and .html extensions using the code below, but it's as if the server is trying to cache the HTML output by the PHP files. Am I out of my mind here?
ExpiresActive on
ExpiresDefault "now plus 60 minutes"
ExpiresByType text/html "now plus 60 minutes"
<FilesMatch "\.(css|bmp|png|ico|htm|gff|html|jpg|jpeg|gif|gcf)$">
FileETag MTime Size
ExpiresDefault "now plus 60 minutes"
</FilesMatch>
Your long explanation basically boils down to this: a PNG image is not being cached, and HTML is not refreshing. It could be that the PNG image you say is having issues (addon2.PNG) is simply damaged. But the problem with your question is that you have not provided any headers to show what the output is.
If you are using Linux, just go to the terminal and use curl to get headers. For example, this is how I can get headers sent with the main Google image on their homepage:
curl -I https://www.google.com/images/srpr/logo11w.png
And the output of that command is:
HTTP/1.1 200 OK
Content-Type: image/png
Last-Modified: Wed, 09 Oct 2013 01:35:39 GMT
Date: Tue, 17 Dec 2013 23:39:21 GMT
Expires: Wed, 17 Dec 2014 23:39:21 GMT
X-Content-Type-Options: nosniff
Server: sffe
Content-Length: 14022
X-XSS-Protection: 1; mode=block
Cache-Control: public, max-age=31536000
Age: 1647616
Alternate-Protocol: 443:quic
So in your case just type in this command—remember to set it to the real path of the image—like so:
curl -I https://url/to/your/site/addon2.PNG
And compare the headers against something you know is caching as you wish it to.
Also, ExpiresDefault & Cache-Control can sometimes slightly conflict with each other. So check those two values to see what your host is setting versus what your .htaccess is setting.
And regarding this:
So any ideas or is Google Page Speed just B.S.?
All of those page benchmarking tools have varying usefulness. In general, for the specific issue you are facing, they are not B.S. but utterly useless. It's like being in a car with an oil leak while some automatic system in the car tells you to change your oil. You don't need to "change your oil"; you need to find the leak and plug it up so the oil stays in!
So in your case the best bet is to use Unix tools like curl to check what the header output is and adjust/tweak until you get it right.
EDIT: Also, please look over this pretty in-depth (and useful) caching guide from Google.
I have some problems with gzip encoding and I just can't figure them out!
I added "AddOutputFilterByType DEFLATE ... ..." to my .htaccess and used the tool from http://www.gidnetwork.com/tools/gzip-test.php to check whether my pages are encoded properly. The tool says everything is OK.
The problem is that when I check the pages with the web developer tools in Chrome or Firebug, they don't recognize the pages as gzipped: "Content-Encoding: gzip" does not appear in the headers section, and PageSpeed also says they are not encoded and recommends that I encode them.
Any ideas? Thanks.
Adi Ulici
AddOutputFilterByType is deprecated; try AddOutputFilter instead, e.g.
AddOutputFilter DEFLATE php js css html
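Wrapped in the usual guard so the configuration still parses when mod_deflate is unavailable, that could read:

```apache
<IfModule mod_deflate.c>
    # Maps the DEFLATE filter by file extension rather than by MIME type
    AddOutputFilter DEFLATE php js css html
</IfModule>
```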
I'm having some trouble prevent mod_deflate from jumping in on this scenario:
user running CodeIgniter (or any other framework that re-directs to index.php)
mod_deflate is active
zip file is served by a CodeIgniter controller (headers + readfile)
The thing is that Apache always detects the content as being PHP, and therefore something like the lines below won't work, as the server treats the ZIP file as a PHP one.
<FilesMatch "\.(xml|txt|html|php)$">
SetOutputFilter DEFLATE
</FilesMatch>
Any ideas on how I can have Apache distinguish between an HTML file and a ZIP file, both generated by the same index.php framework file?
Edit:
apache log
[Mon Jun 20 02:14:19 2011] [debug] mod_deflate.c(602): [client 192.168.0.5] Zlib: Compressed 50870209 to 50878224 : URL /index.php, referer: http://demo.dev/
Edit:
CI controller that serves the zip
header('Content-Type: application/zip');
header('Content-Transfer-Encoding: binary');
header("Content-Length: " . filesize($file_location));
header('Content-Disposition: attachment; filename="' . $file_title . '"');
readfile($file_location);
Even though all the answers should have been perfectly valid in a reasonable scenario (and were actually tested prior to asking the question), the reason why I've been unable to instruct Apache to deflate a file by MIME type remains unknown.
I was able to have it work as desired by forcing the following instructions into the script
apache_setenv('no-gzip', 1);
ini_set('zlib.output_compression', 0);
I do understand that this is a hot patch and does not address the problem's root cause, but so far that will have to suffice. As others may hit the same flag, the above code stays here for reference as a dirty fix.
You can either:
use the deprecated AddOutputFilterByType and specify only the content types you do want to filter; or
use the more powerful mod_filter. In FilterProvider you can provide a rule that excludes the filter when the zip content type (application/zip) is found in the response headers.
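With Apache 2.4's mod_filter, such a rule might look like this. This is a sketch using the 2.4 expression syntax; the older 2.2 FilterProvider syntax differs:

```apache
<IfModule mod_filter.c>
    FilterDeclare gzip CONTENT_SET
    # Compress responses unless the backend declared them as zip archives
    FilterProvider gzip DEFLATE "%{CONTENT_TYPE} !~ m#^application/zip#"
    FilterChain gzip
</IfModule>
```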
You can make use of mod_rewrite to change the mime-type of the request on the Apache level:
# Serve .zip requests as zip files
RewriteRule \.zip$ - [T=application/zip,E=no-gzip:1,L]
Place it above the rules of the framework. Note, however, that this requires making DEFLATE depend on the MIME type as well, instead of on the file extension as you do with <FilesMatch>.
It will probably work well together with
AddOutputFilterByType DEFLATE text/html
instead of the <FilesMatch> Directive.
Edit: Added the L flag which should be used in .htaccess context and additionally turned DEFLATE off via the no-gzip environment variable.
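Putting those pieces together, a minimal .htaccess sketch along these lines might be:

```apache
RewriteEngine On
# Serve .zip requests with the zip MIME type and opt them out of compression
RewriteRule \.zip$ - [T=application/zip,E=no-gzip:1,L]
# Compress by MIME type instead of by file extension
AddOutputFilterByType DEFLATE text/html
```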
Try this (since your URLs appear to end in .zip, it might work for you):
<FilesMatch "\.(xml|txt|html|php)$">
SetEnvIf Request_URI "\.zip$" no-gzip
SetOutputFilter DEFLATE
</FilesMatch>
Instead of using
<FilesMatch "\.(xml|txt|html|php)$">
SetOutputFilter DEFLATE
</FilesMatch>
Use this configuration for setting compression rules.
AddOutputFilterByType DEFLATE text/html text/plain text/css text/xml application/x-javascript application/javascript
This way, your output will be compressed only if its content type matches the directives above.
The CI controller that serves the zip is already sending the correct Content-Type header, so its output will not get compressed:
header('Content-Type: application/zip');