caching using .htaccess - one image not caching weirdness - php

I'm on a shared hosted Linux server, so I must use .htaccess. I'm busy working on compressing and caching things. I actually have two questions, but first, here is how I have the cache set up in my .htaccess file.
ExpiresActive on
<FilesMatch "\.(bmp|png|ico|gff|jpg|jpeg|gif|gcf)$">
FileETag MTime Size
ExpiresDefault "now plus 60 minutes"
</FilesMatch>
Question 1: this does cache those files, with the exception of one PNG file.
Now, most of my file names are all lowercase since I'm on Linux, but a few PNG files have slipped through with an upper-case extension.
What's strange is that all of the PNG files cache except for one called addon2.PNG. At first I thought it was because of the case, but I have checked and I have 3 other PNG files with upper-case extensions, which Google Page Speed says are cached. So any ideas, or is Google Page Speed just B.S.?
And question 2: as I'm wary after my host's mess-up with their Varnish issue, I'm adding things to the cache a little at a time and waiting to see if anything breaks. When I try to cache HTML files, the login/logout features of my site, which are written in PHP, do not work.
You have to log in and refresh, or log out and refresh. I'm wondering: is that because the page HTML is output by the PHP files? All my main pages are PHP and I only have a few actual HTML files. I thought caching HTML with the code below would only affect files with .htm and .html extensions, but it's as if the server is trying to cache the HTML output by the PHP files. Am I out of my mind here?
ExpiresActive on
ExpiresDefault "now plus 60 minutes"
ExpiresByType text/html "now plus 60 minutes"
<FilesMatch "\.(css|bmp|png|ico|htm|gff|html|jpg|jpeg|gif|gcf)$">
FileETag MTime Size
ExpiresDefault "now plus 60 minutes"
</FilesMatch>

Your long explanation basically boils down to this: a PNG image is not being cached and HTML is not refreshing. It could be that the PNG image you say is having issues (addon2.PNG) is simply damaged. But the problem with your question is that you have not provided any headers to show what the output actually is.
If you are using Linux, just go to the terminal and use curl to get headers. For example, this is how I can get headers sent with the main Google image on their homepage:
curl -I https://www.google.com/images/srpr/logo11w.png
And the output of that command is:
HTTP/1.1 200 OK
Content-Type: image/png
Last-Modified: Wed, 09 Oct 2013 01:35:39 GMT
Date: Tue, 17 Dec 2013 23:39:21 GMT
Expires: Wed, 17 Dec 2014 23:39:21 GMT
X-Content-Type-Options: nosniff
Server: sffe
Content-Length: 14022
X-XSS-Protection: 1; mode=block
Cache-Control: public, max-age=31536000
Age: 1647616
Alternate-Protocol: 443:quic
So in your case just type in this command—remember to set it to the real path of the image—like so:
curl -I https://url/to/your/site/addon2.PNG
And compare the headers against something you know is caching as you wish it to.
Also, ExpiresDefault & Cache-Control can sometimes slightly conflict with each other. So check those two values to see what your host is setting versus what your .htaccess is setting.
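For example, if mod_headers is available on your host, you can pin Cache-Control to the same lifetime as your Expires rule instead of relying on whatever the host injects. A rough sketch (the file list is just an example, and mod_headers being enabled is an assumption):
ExpiresActive on
<FilesMatch "\.(bmp|png|ico|jpg|jpeg|gif)$">
FileETag MTime Size
ExpiresDefault "now plus 60 minutes"
# Keep Cache-Control consistent with the 60-minute Expires above
Header set Cache-Control "public, max-age=3600"
</FilesMatch>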
And regarding this:
So any ideas or is Google Page Speed just B.S.?
All of those page benchmarking tools have varying usefulness. They are not B.S., but for the specific issue you are facing they are practically useless. It's like being in a car with an oil leak while some automatic system in the car tells you to change your oil. You don't need to "change your oil"; you need to find the leak and plug it up so the oil stays in!
So in your case the best bet is to use Unix tools like curl to check what the header output is and adjust/tweak until you get it right.
EDIT Also, please look over this pretty in-depth—and useful—caching guide from Google.

Related


Confusion about gzip, is it compressed or not?

Sorry to bother you with a question that seems very well documented, but none of the things I have tried has been completely satisfying.
My company switched from a hosting package to a managed server just last week, and I'm still in the process of optimising it.
Now, Google's PageSpeed Insights tells me it is not compressed, as does GTmetrix. GIDNetwork tells me compression works fine.
I have already added
<IfModule mod_deflate.c>
<FilesMatch "\.(html|php|txt|xml|js|css)$">
SetOutputFilter DEFLATE
</FilesMatch>
</IfModule>
to my .htaccess (as recommended here), which works correctly; other settings I've changed there are fine as well. I have also added
zlib.output_compression = On
to my php.ini.
The entire .htaccess and php.ini can be seen at jsFiddle.
Headers sent and received in both Firefox and Chrome claim that compression is happening.
I also created a httpd.conf in my home directory, because none existed on my server yet. Should I move the file somewhere else?
What I really want to know:
Soo... what am I doing wrong? Is it compressed? Is it not? How can I make google 'see' the compression?
Thank you very much for your help.
This should be the function you need; it should automatically generate the headers you need:
http://www.php.net/manual/en/function.gzdecode.php
Anyway, check your PHP version, because it only works with PHP 5.4.0 or later.
Although my browser accepts deflate/gzip encoding (Accept-Encoding: gzip, deflate), your server does not answer using compressed data:
HTTP/1.1 200 OK
Date: Tue, 11 Mar 2014 09:41:45 GMT
Server: Apache
Connection: Keep-Alive
Keep-Alive: timeout=2, max=200
Etag: "96-100e9-4f4517d791912"
Expires: Fri, 11 Apr 2014 13:28:25 GMT
Cache-Control: max-age=2692000, public
Vary: User-Agent
If it was compressed, the server would send also
Content-Encoding: deflate
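You can check this yourself with curl by sending an Accept-Encoding header and looking for Content-Encoding in the response headers (replace the URL with one of your own pages):
curl -s -D - -o /dev/null -H "Accept-Encoding: gzip, deflate" https://url/to/your/site/index.php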
Use Firebug or the dev console to see the headers. It must be your httpd.conf.
You cannot simply create this file in your home directory and expect Apache to load it.
Have a look in /etc/apache* and /etc/httpd/* for config files.
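For example, something along these lines (assuming GNU grep and that those directories exist on your distribution) will show where compression is already configured:
grep -ri "deflate" /etc/apache2/ /etc/httpd/ 2>/dev/null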
You have already enabled gzip compression, but you haven't set it to compress some file types such as JavaScript and CSS. That is why Google's PageSpeed keeps suggesting that you enable compression. To enable compression for those two types, use
AddOutputFilterByType DEFLATE text/css
AddOutputFilterByType DEFLATE application/javascript
What worked for me in the end, to compress the JS/CSS files that were left uncompressed for reasons I don't quite get yet, is described here.
In this method, JavaScript and CSS files are forcibly gzipped by a PHP script. It also sets new Expires headers, so if you want caching for more than 5 minutes, change the number. Adding other file types is trickier, but possible, I think.
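The idea is roughly a small PHP wrapper like the sketch below (the file name, Content-Type and 5-minute lifetime are placeholders, not the exact script from that guide):
<?php
// gzip-css.php - rough sketch: serve a CSS file gzipped, with fresh cache headers
$file = 'style.css'; // placeholder: the CSS or JS file to serve

header('Content-Type: text/css');
header('Cache-Control: public, max-age=300'); // 5 minutes; raise this for longer caching
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + 300) . ' GMT');

// ob_gzhandler compresses the output only if the browser sends Accept-Encoding: gzip
ob_start('ob_gzhandler');
readfile($file);
ob_end_flush();
?>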

php caching and .htaccess caching

There is another thread similar to this that was closed and that didn't have any useful information in it: https://stackoverflow.com/questions/11955822/php-file-caching-vs-cache-through-htaccess
Is it necessary to implement a php caching system if you are caching through .htaccess? Here is my current .htaccess caching:
<IfModule mod_headers.c>
# Cache Media Files
<FilesMatch "\.(ico|pdf|flv|jpg|jpeg|png|gif|swf|mp3|mp4)$">
Header set Cache-Control "public"
Header set Expires "Mon, 20 Apr 2015 20:00:00 GMT"
Header unset Last-Modified
</FilesMatch>
# Cache JavaScript & CSS
<FilesMatch "\.(js|css)$">
Header set Cache-Control "public"
Header set Expires "Mon, 20 Apr 2015 20:00:00 GMT"
Header unset Last-Modified
</FilesMatch>
# Disable Caching for Scripts and Other Dynamic Files
<FilesMatch "\.(pl|php|cgi|spl|scgi|fcgi)$">
Header unset Cache-Control
</FilesMatch>
</IfModule>
With this file caching, will building out a PHP caching system improve my site even more? Or would it make more sense to compress data in .htaccess and use PHP to cache? I'm just trying to understand which method of caching will improve a site more, or whether using both is recommended.
For static files, you can cache them via HTML meta tags and .htaccess.
The browsers will then cache them on users' local machines.
For dynamic content generated by PHP, you can cache widgets and objects to reduce the number of queries sent to the MySQL database.
You can try the example below: it caches $products for 600 seconds, and your PHP sends only one query to the database. If you have around 500 visitors online, your page still uses the single query from the first visitor to serve all 500.
<?php
include("php_fast_cache.php");

// Try to get the products from the cache first.
$products = phpFastCache::get("products_page");

if ($products == null) {
    // Cache miss: run your own DB query here (get_products_from_db() is just a placeholder).
    $products = get_products_from_db();
    // Store the products in the cache for 600 seconds = 10 minutes.
    phpFastCache::set("products_page", $products, 600);
}

foreach ($products as $product) {
    // Output your content here.
}
?>
If you are using WordPress, look at the cache plugins: they cache your content with PHP (files or Memcached) and cache your images, CSS and JS via .htaccess.
Using both of them together will speed up the site and save bandwidth/CPU.
You're doing client-side caching for static files only.
Caching in PHP solves a completely different problem: server-side performance issues in your application. So you should use it if your site is loading too slowly or if you're causing high server load.
There are many strategies for implementing server-side caching, and it's up to you to decide what fits your application best.
For example, you can cache SQL query results, or you can cache the HTML output of whole webpages. Do not forget about cache invalidation when your data changes.
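As a rough illustration of caching whole-page HTML output on the server (the cache path and lifetime below are placeholders):
<?php
// Sketch of server-side full-page output caching
$cacheFile = '/tmp/cache_homepage.html'; // placeholder cache location
$lifetime  = 300;                        // seconds the cached copy stays valid

if (file_exists($cacheFile) && (time() - filemtime($cacheFile)) < $lifetime) {
    readfile($cacheFile); // serve the cached copy and skip the expensive work
    exit;
}

ob_start(); // capture everything the page outputs
// ... run your queries and build the page as usual ...
echo "<html><body>Expensive page</body></html>";

file_put_contents($cacheFile, ob_get_contents()); // save for the next visitor
ob_end_flush();                                   // send the output to this visitor
?>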

HTACCESS image caching rule that checks the image modification time

I'm serving images two different ways:
Using a PHP script for profile pictures for example
By pointing to them directly, for icons and backgrounds for example
I'm in the process of handling their caching properly, and I'm totally new to this.
For the PHP script, I'm just adding a Last-Modified header to the response and delivering a 304 status code when it's called again and the file hasn't changed (using filemtime()).
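Roughly, the PHP side looks like the sketch below (simplified; the real script resolves the image path from the request instead of hard-coding it):
<?php
// Simplified sketch of the profile-picture script with Last-Modified / 304 handling
$file  = '/path/to/profile/picture.png'; // placeholder path
$mtime = filemtime($file);

// If the browser already has this version, answer 304 and send no body
if (isset($_SERVER['HTTP_IF_MODIFIED_SINCE']) &&
    strtotime($_SERVER['HTTP_IF_MODIFIED_SINCE']) >= $mtime) {
    header('HTTP/1.1 304 Not Modified');
    exit;
}

header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $mtime) . ' GMT');
header('Content-Type: image/png');
readfile($file);
?>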
For direct accesses, I'm using .htaccess, but none of the rules I've seen so far let me do the same as in my PHP script (check whether the file has changed, then serve a 304 or the file itself).
Here's the .htaccess rule I'm planning to use:
Header unset Pragma
FileETag None
Header unset ETag
# cache images/pdf docs for 10 days
<FilesMatch "\.(ico|pdf|jpg|jpeg|png|gif)$">
Header set Cache-Control "max-age=864000, public, must-revalidate"
Header unset Last-Modified
</FilesMatch>
From what I understand, the only way of updating a cached image is to rename it. Does someone know a way around that? By checking the image's last modification date, for instance?
You could use mod_expires, if available:
<FilesMatch "\.(ico|pdf|jpg|jpeg|png|gif)$">
ExpiresDefault "modification plus 10 days"
</FilesMatch>
What you are doing with PHP, Apache should already do automatically for static files. It will set the Last-Modified header and respond with 304 if it finds If-Modified-Since in the request. This is done automatically and has nothing to do with caching. It will not prevent repeated requests to your server; it will just save you bandwidth (and loading time for the user) when the file is not modified, by returning just the 304 response instead of the whole file.
To prevent those repeated requests to your server, the browser (and proxy servers) has to do some caching. You can control the caching via HTTP headers, or for HTML also via META tags. When you specify that a file is cacheable for 1 week, the browser won't try to contact your server for 1 week (although most browsers are set to revalidate cache entries on first access after startup).
So you either live with the possibility that some users will use an old cached copy for some time (depending on the expiry header), or you must change the URL, as Gerben suggested. Only then can you be 100% sure that everyone gets the new version (this is important for JavaScript, as having some of the JS files old and some new can cause very strange errors). Nowadays almost every high-performance website uses the file.ext?v=3 approach, so that they can set the expiry header to large values like 6 months.
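As a rough illustration of that approach, the version number can even be generated from the file's modification time, so the URL changes automatically whenever the file does (the path below is a placeholder):
<?php
// Append the file's mtime as a version, so the URL changes whenever the file changes
$css = 'css/style.css'; // placeholder path relative to the script
echo '<link rel="stylesheet" href="/' . $css . '?v=' . filemtime($css) . '">';
?>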
As @Gumbo pointed out, "Apache should already do that for static files".
And that's true, Apache does that, so that kind of stuff works fine:
<FilesMatch "\.(ico|pdf|jpg|jpeg|png|gif)$">
Header set Cache-Control "max-age=864000, public, must-revalidate"
</FilesMatch>
PS: Sorry @Gumbo, but I asked you to change your answer so that I could accept it; you wouldn't do it, and I had to close that question eventually.

Setup HTTP expires headers using PHP and Apache

How can I setup expires headers in PHP + Apache? I'm currently using an auto_prepend to serve resources gzipped but I'd also like to maximise the HTTP cache.
How can I set these up?
There are two ways to do this. The first is to specify the header in your PHP code. This is great if you want to programmatically adjust the expiry time. For example, a wiki could set a longer expires time for a page which is not edited very often.
header('Expires: '.gmdate('D, d M Y H:i:s \G\M\T', time() + (60 * 60))); // 1 hour
Your second choice is to create an .htaccess file or modify your httpd config. In a shared hosting environment, modifying your .htaccess file is quite common. In order to do this, you need to know if your server supports mod_expires, mod_headers or both. The easiest way is simply trial and error, but some Apache servers are configured to let you view this information via the /server-info page. If your server has both mod_expires and mod_headers, and you want to set the expiry on static resources, try putting this in your .htaccess file:
# Turn on Expires and set default to 0
ExpiresActive On
ExpiresDefault A0
# Set up caching on media files for 1 year (forever?)
<FilesMatch "\.(flv|ico|pdf|avi|mov|ppt|doc|mp3|wmv|wav)$">
ExpiresDefault A29030400
Header append Cache-Control "public"
</FilesMatch>
For other combinations and more examples see: http://www.askapache.com/htaccess/speed-up-your-site-with-caching-and-cache-control.html
This Apache module might be of help:
http://httpd.apache.org/docs/2.0/mod/mod_expires.html
Did you try something like this?
<?php
header("Expires: Sat, 26 Jul 1997 05:00:00 GMT");
?>
