Problems caching images with PHP

I have two websites: one for the visitors and one for the content. On the second one, I'm trying to cache the images, because when you go to a certain URL the same thing is always going to be there. I cannot figure out how to make my browser (Chrome) cache the images. This is what I'm getting for the response headers:
Cache-Control:public, max-age=1401497895
Connection:Keep-Alive
Content-Type:image/png
Date:Fri, 31 May 2013 00:58:15 GMT
Expires:Sat, 31 May 2014 00:58:15 +0000
Keep-Alive:timeout=5, max=100
Pragma:public
Server:Apache/2.2.22 (Ubuntu)
Transfer-Encoding:chunked
X-Powered-By:PHP/5.4.6-1ubuntu1.2
and here is my PHP setting the headers:
header('Expires:Sat, 31 May 2014 00:58:15 +0000');
header('Cache-Control:public, max-age=1401497895');
header('Pragma:public');
What am I doing wrong? I want to minimize the image load time.
Also, I'm accessing them from a CNAME, if that matters, although I'm having the same problems with the normal domain.

Why did you set max-age so high? You are trying to set it to about 44 years!
In fact, 1401497895 is exactly the Unix timestamp of your Expires date (Sat, 31 May 2014 00:58:15 GMT), so it looks like you passed an absolute timestamp where max-age expects a duration in seconds.
This should work, but I never tried it with such a high max-age.
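A minimal sketch of the corrected headers, assuming a one-year lifetime (the content type and image path are placeholders for whatever the script already does):
<?php
// max-age is a duration in seconds, not a timestamp; Expires is an
// HTTP date computed relative to now.
$ttl = 31536000; // one year, in seconds

header('Cache-Control: public, max-age=' . $ttl);
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + $ttl) . ' GMT');
header('Pragma: public');
header('Content-Type: image/png');

$imagePath = '/path/to/image.png'; // placeholder: however the script locates the image
readfile($imagePath);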

Related

Forcing CodeIgniter to work with Varnish

I have a CodeIgniter application that already uses CodeIgniter's built-in caching, and I have applied Varnish as a new caching layer. But as far as I can see from varnishstat, it's not making much of a difference.
Hitrate ratio: 1 1 1
Hitrate avg: 0.0480 0.0480 0.0480
I think that's because of the CodeIgniter cookies and HTTP headers that are being sent back.
This is the HTTP response coming back through Varnish:
Accept-Ranges:bytes
Age:0
Cache-Control:no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Connection:keep-alive
Content-Encoding:gzip
Content-Length:1755
Content-Type:text/html
Date:Mon, 29 Jul 2013 07:25:59 GMT
Expires:Thu, 19 Nov 1981 08:52:00 GMT
Pragma:no-cache
Server:Apache/2.2.22 (Ubuntu)
Set-Cookie:ci=%2BEZFRauewyNnl3CXHE0cmz6F3G3moadilt5vXu5gghKXkWm4gg0JObnF2Etq6J5rl3XVigbF0je3funibpvEi4X%2BT3rS2VmazGG3T4Vm1%2B8YcbqRgL4xuMcxybndqJy%2BU9mNmsJjOgmYEZ8oPG8BKOtMaxNiMHmgmbBydxt3SzKfFfFUOydFx%2BeeJ7P1WE7V10m6GyfnFO5DrFYNsE%2F37WNAI%2Bpux%2Fwwch5B4DH1%2F6wssPm%2BNwsLQ1%2FBd44hgUXe3mMzzcqvxNjKqs0gjuwzwPT4nibEHirfaJ7TMVGObMjdrbREnoYS2gwoN15cCeKgXmTJQI2vvTuPcdtZVCjcAX6OvTy491HdIvQIdKRhX2BNi8d7ygo%2F7n5T6%2FN%2B0IohNN9iZ%2Fh959W%2Fz4azEJPfTrluucf6cLnlp2T2zb%2Fb3XroWuPqguk4wMpsAstfLsSfA%2F6yEi4Hph%2BPFxX%2BhyBazs11LJ38FA0flWtYY%2Bk%2B6yoF13sTaENN2pWj0bKDTtres9E4y3xMPr%2FZaO78WRA9CccDzcQfbZ3bZUqoXg4HmX%2BHDHiYPLD6uFpnC28LuDrCSbgXFIlhDrC8j65sxNSKhnzlUP7Konr%2FKRfKNzrgtWHBEzuXArW%2BlgIg1MzaW3GIkRy1gr16ZUjIiv7CCx7Y2twAfKOm4t00MvrTcFoxBPN1lzoasNyRLMIvshU8heWZHy17OPEapuO6N%2BuMl9L8LqU0%2FF%2BUeUDyFVwLG39LGkIVuF93VsIYEp6w2UwtccX4OO4P2uwJEoAJMMqUE%2FztELpCv%2BkfRAiub48n%2BRxK%2FhgUHw1LWsWIPv3xngq3MI8ypWCqkWLjPuu5dc%2FdOd3BSW2MYcBwacoB5CEOPBHGq3hw1QSZfY330hkLuyQPHxkh%2FDTija%2FN2Rz6z47JorsCqHGDBK6%2BPswBWvYZeMd0VMD%2F95j%2BFibi6rBqL3hoE%2BDgcfCdly%2FYH9py%2Fe%2Fa0AUiIINTK8EPtpuKdC8dLhKo2jI5J4e1ifZuWjVd3VnL2CvX; path=/
Vary:Accept-Encoding
Via:1.1 varnish
X-Powered-By:PHP/5.3.10-1ubuntu3.5
X-Varnish:1353481467
I noticed a few different things:
No matter how hard I try, CodeIgniter won't change the Cache-Control, Age, and Expires headers until I set them manually.
CodeIgniter sent an old date (1981); I double-checked the date on my server and it's correct.
CodeIgniter keeps changing the cookies on almost every request.
In my app I have several pages that require a username/password, but I'm trying to focus first on getting the public pages cached. After that I will check the account-related pages.
I'm testing with this code:
//$this->output->set_header("Cache-Control:public, max-age=9000");
//$this->output->set_header("Vary: Accept-Encoding,Cookie,User-Agent");
$this->output->cache(2400);
$this->load->view("test");
If your backend is sending Cache-Control: max-age=XXX, you can forget about the Expires header, as it will be ignored by HTTP/1.1-compliant clients and proxies [1]. (In fact, it's common to set it in the past in order to keep old HTTP/1.0 clients from caching items.)
Be very careful when setting the Vary: User-Agent header, as it can harm your caching chances [2].
And finally, Varnish won't cache a response like that because of the Set-Cookie header. Varnish needs lazy session initialization (see [3], and the sketch after the references) to cache things (or a fairly complicated VCL).
If you fix the cookie issue and still have problems, your VCL file and Varnish version would be really appreciated.
[1] http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html
[2] https://www.varnish-cache.org/docs/3.0/tutorial/vary.html
[3] How should I implement lazy session creation in PHP?
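As a rough illustration of the lazy-session idea in [3] (plain PHP sessions here; CodeIgniter's own session library works differently), the point is to avoid emitting Set-Cookie on anonymous page views:
<?php
// Sketch: only resume a session if the visitor already has one, so
// anonymous responses carry no Set-Cookie and stay cacheable.
function resume_session_if_any()
{
    if (session_id() === '' && isset($_COOKIE[session_name()])) {
        session_start(); // resume an existing session only
    }
}

// Start a session (and send the first Set-Cookie) only at login time.
function start_session_for_login($userId)
{
    if (session_id() === '') {
        session_start();
    }
    $_SESSION['user_id'] = $userId; // 'user_id' is an example key
}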
Your cookie problem is one I'm currently wrestling with. I'm guessing from your sample that you have changed the cookie_name variable to ci rather than ci_session.
The simplest thing for you - for certain values of simple - may be to jettison CI's session handling for an alternative, or to switch to database-backed sessions rather than cookie-based ones. If there's a Set-Cookie - and there always is when you have sessions initialized/auto-loaded - the response won't get cached. If the data is all stored server-side instead, you won't have this issue.
Personally I'm not gung-ho to introduce databases for our use of sessions, so I'm going to experiment with setting the Cache-Control header to ignore cookies and using header_remove() at the top of views I know will never be user-specific (like our RSS feed).
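For what it's worth, a minimal sketch of that experiment (header_remove() needs PHP 5.3+; the max-age value is just an example):
<?php
// At the top of a view that is never user-specific (e.g. an RSS feed):
header_remove('Set-Cookie');                    // drop the session cookie
header('Cache-Control: public, max-age=3600');  // let shared caches keep it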

PHP or Apache seems to be caching files read via file_get_contents or include (unwanted behaviour)

Our web application has version numbers that get served out to the client on each request so we can detect an update to the code (i.e. rolling updates) and display a popup informing users to reload to take advantage of the latest update.
But I'm experiencing some weird behaviour after updating the version number on the server: some requests return the new version number and some return the old, so the popup keeps popping up until you have reloaded the page a few times.
Originally I suspected Apache was caching files it read off disk via file_get_contents, so instead of storing the version number in a plain text file, I now store it in a PHP file that gets included with each request, but I'm experiencing the exact same issue!
Does anyone have any ideas what might be causing Apache or PHP itself to serve out old information after I have done an update?
EDIT: I have confirmed it's not browser caching, as I can have the client generate unique URLs to the server (which it can handle via a rewrite) and I still see the same issue where some requests return the old version number and some the new; clearing the browser cache doesn't help.
EDIT 2: The response headers as requested
HTTP/1.1 200 OK
Date: Mon, 23 Jul 2012 16:50:53 GMT
Server: Apache/2.2.14 (Ubuntu)
X-Powered-By: PHP/5.3.2-1ubuntu4.7
Cache-Control: no-cache, must-revalidate
Pragma: no-cache
Expires: Sat, 26 Jul 1997 05:00:00 GMT
Vary: Accept-Encoding
Content-Encoding: gzip
Content-Length: 500
Connection: close
Content-Type: text/html
EDIT 3: Trying to reproduce this to get the response headers, I found I could only make it happen by going through our full deploy process, which involves creating versioned folders storing the code and symlinking the relevant folder into the webroot. Just changing the version number wasn't enough to cause it! So it seems to be somehow related to the symlinks I create.
I have the same problem when there is a change in the symlink. Have a look at https://bugs.php.net/bug.php?id=36555; it may be what you are looking for.
Try (as suggested in that bug report) setting realpath_cache_size to 0.
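For reference, a sketch of both workarounds. Note the realpath cache is per PHP process, so the php.ini setting is the more reliable fix:
<?php
// In php.ini (realpath_cache_size is PHP_INI_SYSTEM, so ini_set() from
// a script will not work):
//
//     realpath_cache_size = 0
//
// A per-request alternative (PHP >= 5.3) is to flush the caches
// explicitly within the affected process:
clearstatcache(true); // true also clears the realpath cache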

Caching problem using "AddHandler application/x-httpd-php"

Using .htaccess, I'm assigning the PHP handler to all my .css and .js files in order to output user-agent-based code:
AddHandler application/x-httpd-php .css .js
For example:
<?PHP if ($CurrentBrowser == 'msie') { ?>
.bind('selectstart', function(event) { ... })
<?PHP } ?>
So, in fact, my code files are dynamically created but can be considered static files. That's because, once they have been compiled for the first time, browsers can get them back from cache and reuse them until I change their content.
That's why I'm using fingerprinting/versioning and a long expiration time on them:
[INDEX.PHP]
<script type="application/javascript" src="<?PHP echo GetVersionedFile('/script.js'); ?>"></script>
<script type="application/javascript" src="/script.1316108341.js"></script>
[.HTACCESS]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule "^(.+)\.(\d+)\.(css|js)$" $1.$3 [L]
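The question doesn't show GetVersionedFile(); one plausible implementation, assuming the fingerprint is the file's mtime (which matches the numeric segment in the example URL), might look like this:
<?php
// Hypothetical helper: insert the file's mtime before the extension,
// producing URLs like /script.1316108341.js that the rewrite rule
// above maps back to /script.js.
function GetVersionedFile($path)
{
    $mtime = filemtime($_SERVER['DOCUMENT_ROOT'] . $path);
    return preg_replace('/\.(css|js)$/', '.' . $mtime . '.$1', $path);
}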
The problem is that those files, even though I send them with proper headers, are never cached by any browser (I never get a 304 status code, always 200). This is a log of my server responses:
[CHROME]
Request URL:http://127.0.0.1:8888/script.1316108341.js
Request Method:GET
Status Code:200 OK
-----
Cache-Control:max-age=31536000, public
Connection:Keep-Alive
Content-Encoding:gzip
Content-Length:6150
Content-Type:application/javascript
Date:Thu, 15 Sep 2011 21:41:25 GMT
Expires:Fri, 14 Sep 2012 21:41:25 GMT
Keep-Alive:timeout=5, max=100
Server:Apache/2.2.17 (Win32) PHP/5.3.6
Vary:Accept-Encoding
X-Powered-By:PHP/5.3.6
[MOZILLA]
Request URL:http://127.0.0.1:8888/script.1316108341.js
Request Method:GET
Status Code:200 OK
-----
Date Thu, 15 Sep 2011 21:43:26 GMT
Server Apache/2.2.17 (Win32) PHP/5.3.6
X-Powered-By PHP/5.3.6
Content-Encoding gzip
Vary Accept-Encoding
Cache-Control max-age=31536000, public
Expires Fri, 14 Sep 2012 21:43:26 GMT
Content-Type application/javascript
Content-Length 6335
Keep-Alive timeout=5, max=100
Connection Keep-Alive
-----
Last Modified Thu Sep 15 2011 23:43:26 GMT+0200 (= time I loaded the page) (???)
Last Fetched Thu Sep 15 2011 23:43:26 GMT+0200 (= time I loaded the page) (???)
Expires Fri Sep 14 2012 23:43:26 GMT+0200
Data Size 6335
Fetch Count 10
Device disk
What could be the problem? How can I force caching on these files?
Many, many thanks!
Since the requests for the CSS and JS files are being handled by PHP, your PHP code with its conditionals is executed each time.
Apache and PHP have no idea whether the content is cacheable or whether it should be regenerated, so your PHP code runs on every request.
If you send the Last-Modified header, or use your versioning/fingerprinting method, then it is your PHP script's responsibility to check the fingerprint or version and determine whether it is still valid. If so, you can send a 304 Not Modified header and terminate any further processing. You can also check the request headers for an If-Modified-Since tag and use that method.
Another approach would be to cache the response for the various browsers and dates to a file, so you can serve that file up rather than regenerating it with PHP. Then you can check the modification time of that file to determine whether you can send a 304 header, as sketched below.
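A rough sketch of that 304 check, assuming the rendered output has been cached to a file (the path and content type are placeholders):
<?php
// Serve a cached, pre-rendered copy and honour If-Modified-Since.
$cacheFile = '/path/to/cache/script.css'; // hypothetical cache path
$mtime     = filemtime($cacheFile);

if (isset($_SERVER['HTTP_IF_MODIFIED_SINCE'])
    && strtotime($_SERVER['HTTP_IF_MODIFIED_SINCE']) >= $mtime) {
    header('HTTP/1.1 304 Not Modified');
    exit; // nothing to regenerate, nothing to send
}

header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $mtime) . ' GMT');
header('Content-Type: text/css');
readfile($cacheFile);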
This SitePoint article explains several methods of using PHP to cache. Hope that helps.

Way to gzip files like CSS and JavaScript once and save them for serving to clients, instead of processing and gzipping every time a request is made

I believe this would be a more CPU-friendly method. Can it be implemented with PHP? Instead of gzipping the content for every request, I'd compress the files once and serve those instead. =)
Yes, this is quite easy to do with Apache.
Store the uncompressed and compressed files side by side. E.g.:
\-htdocs
|-index.php
|-javascript.js
\-javascript.js.gz
Enable content negotiation in Apache. Use:
Options +MultiViews
Now when "/javascript" is requested, Apache will serve the gzipped version if the client declares that it accepts it (through Accept-Encoding).
Example of two HTTP requests (some headers omitted):
Client claims to accept gzip
GET /EP/Exames/2006-2007/exame2B HTTP/1.1
Host: lebm.geleia.net
Accept-Encoding: gzip, identity
HTTP/1.1 200 OK
Date: Fri, 13 Aug 2010 16:22:59 GMT
Content-Location: exame2B.nb.gz
Vary: negotiate,accept-encoding
TCN: choice
Last-Modified: Sun, 04 Feb 2007 15:33:53 GMT
ETag: "0-c9d-428a84de03a40;48db6d490abee"
Accept-Ranges: bytes
Content-Length: 3229
Content-Type: application/mathematica
Content-Encoding: gzip
(binary gzip data)
(response continues)
Client does not claim to accept gzip
GET /EP/Exames/2006-2007/exame2B HTTP/1.1
Host: lebm.geleia.net
Accept-Encoding: identity
HTTP/1.1 200 OK
Date: Fri, 13 Aug 2010 16:23:14 GMT
Content-Location: exame2B.nb
Vary: negotiate,accept-encoding
TCN: choice
Last-Modified: Sun, 04 Feb 2007 15:33:53 GMT
ETag: "0-257f-428a84de03a40;48db6d490abee"
Accept-Ranges: bytes
Content-Length: 9599
Content-Type: application/mathematica
(************** Content-type: application/mathematica **************
CreatedBy='Mathematica 5.2'
(response continues)
See a more complete version here http://pastebin.com/TAwxpngX
Yes, this is a sensible approach to save both bandwidth and CPU. (You can enable gzip compression within Apache if so desired, but it's potentially worth doing this anyway, as you'll save the compression overhead on each request.)
In essence, use a PHP function to check if the browser supports gzip compression. (If it doesn't, you'll need to serve the JavaScript/CSS as normal.) If it does, you can simply point the JavaScript or CSS source location at a PHP script which is responsible for:
Checking to see if there's a compressed version in place. (Simply output the existing on-disk version if there is.)
Creating a compressed version of the required files.
You'll also probably want to be able to enable/disable this from a define/top-level config (for testing purposes, etc.). As a suggestion, you could store the required CSS/JavaScript file paths in a set of arrays, which could be used as a basis for creating the cache file or for including the files in the traditional manner as a fallback.
I've written a solution along these lines in the past that created a cache file based on a hash of the required filenames. As such, the cache was automatically rebuilt if a different/additional file was included. (It also rebuilt the cache after n hours, but that's only to keep things fresh in case the filenames didn't change but the content did.) A sketch of the serving side follows.
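A minimal sketch of that serving script (file names are placeholders; the .gz copy is assumed to have been written once at deploy time):
<?php
// Serve the pre-compressed copy when the client accepts gzip.
$file   = '/path/to/htdocs/javascript.js'; // hypothetical path
$gzFile = $file . '.gz';
$accept = isset($_SERVER['HTTP_ACCEPT_ENCODING'])
    ? $_SERVER['HTTP_ACCEPT_ENCODING'] : '';

header('Content-Type: application/javascript');
header('Vary: Accept-Encoding'); // caches must key on the encoding

if (strpos($accept, 'gzip') !== false && is_file($gzFile)) {
    header('Content-Encoding: gzip');
    header('Content-Length: ' . filesize($gzFile));
    readfile($gzFile);
} else {
    header('Content-Length: ' . filesize($file));
    readfile($file);
}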

Downloading invoices (PDF) does not complete

We have a Magento commerce site running on an IIS 6.0 server with PHP 5.2.11.
Whenever a user tries to use the print function to download a PDF from the admin panel, the download does not complete. I can see that the full file is downloaded to the computer, but the browser keeps saying it is downloading. This means the file gets saved with a .part extension at the end and users can't open it as a PDF. If I remove the .part extension created by Firefox, I can view the PDF correctly. So the data is sent from the server to the browser in full, but the download does not terminate.
See the response headers below from the start of the PDF download:
HTTP/1.x 200 OK
Cache-Control: must-revalidate, post-check=0, pre-check=0
Pragma: public
Content-Length: 1456781
Content-Type: application/pdf
Content-Encoding: gzip
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Last-Modified: Fri, 18 Dec 2009 10:23:37 +0000
Vary: Accept-Encoding
Server: Microsoft-IIS/6.0
X-Powered-By: ASP.NET, PHP/5.2.11
Content-Disposition: attachment; filename=invoice2009-12-18_10-23-37.pdf
Date: Fri, 18 Dec 2009 10:23:37 GMT
I guess it has something to do with not closing the connection after sending the whole file through? Please help!
Thanks.
I had the exact same problem (on Apache); I temporarily solved the issue by turning off gzip compression on the responses. My guess is that the size reported to the browser by Magento (which it gets from a strlen() call on the PDF content) does not reflect the real content size the browser receives, given that the body gets compressed later on. This results in the browser waiting for more data that is never going to arrive.
Edit: it's worth noting that in my case I was going to the site through a reverse proxy. The mismatch is sketched below.
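To illustrate the mismatch (a sketch with placeholder names, not Magento's actual code): if something downstream compresses the body after PHP has set Content-Length from the uncompressed string, the browser waits for bytes that never come.
<?php
// $pdfString stands in for Magento's generated PDF content.
$pdfString = '%PDF-1.4 ...'; // placeholder for the real PDF bytes

header('Content-Type: application/pdf');
header('Content-Disposition: attachment; filename=invoice.pdf');

// Wrong when mod_deflate / a proxy compresses the response afterwards:
// the header promises the uncompressed size but fewer bytes arrive.
// header('Content-Length: ' . strlen($pdfString));

// Safer: omit Content-Length (or disable compression for this route)
// and end the request explicitly.
echo $pdfString;
exit;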
Have you tried explicitly calling exit; after you output the PDF data? Sounds like an IIS thing.
