Browser timing out attempting to load images - php

I've got a page on a web app that has about 13 images generated by my application, which is written in the Kohana PHP framework. The images are actually graphs. They are cached so they are only generated once, but the first time a user visits the page, when all the images have to be generated, about half of them don't load in the browser. Once the page has been requested once and the images are cached, they all load successfully.
Doing some ad-hoc testing, if I load an individual image in the browser, it takes from 450-700 ms to load with an empty cache (I checked this using Google Chrome's resource tracking feature). For reference, it takes around 90-150 ms to load a cached image.
Even if the image cache is empty, I have the data and some of the application's startup tasks cached, so that after the first request, none of that data needs to be fetched.
My questions are:
Why are the images failing to load? It seems like the browser just gives up on downloading the images after a certain point, rather than waiting for them all to finish loading.
What can I do to get them to load the first time, with an empty cache?
Obviously one option is to decrease the load times, and I could figure out how to do that by profiling the app, but are there other options?
As I mentioned, the app is in the Kohana PHP framework, and it's running on Apache. As an aside, I've solved this problem for now by fetching the page as soon as the data is available (it comes from a batch process), so that the images are always cached by the time the user sees them. That feels like a kludgey solution to me, though, and I'm curious about what's actually going on.
Edit: A commenter asked to see the headers for the request:
Request
Request URL: http://domain.com/charts/chart_name/1234/
Request Method: GET
Status Code: 200 OK
Request Headers
Cache-Control: max-age=0
Referer: http://domain.com/home/chart_page
User-Agent: Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; en-US) AppleWebKit/533.4 (KHTML, like Gecko) Chrome/5.0.375.55 Safari/533.4
Response Headers
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Connection: Keep-Alive
Content-Length: 6354
Content-Type: image/png
Date: Wed, 26 May 2010 21:10:45 GMT
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Keep-Alive: timeout=15, max=94
Pragma: no-cache
Server: Apache
X-Powered-By: PHP/5.3.1
With the image cached, the only difference in the response headers is:
Content-Length: 1129
Keep-Alive: timeout=15, max=96
I am looking into the strange difference in content length, as it should be exactly the same content. I realize this is likely not optimized in terms of getting the browser to cache the image, but once the image has been generated, the entire page load (including downloading images, scripts, etc.) takes about 1-2 seconds. Without the images cached on the server, the page load takes 20-30 s and several of the images fail to load at all.

After noticing the discrepancy in file sizes, I realized that I had Kohana's profiler set up incorrectly, so it was appending profile data to the end of each image. Not a lot per request, but in aggregate it made a considerable difference. The images all load now.
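The underlying failure mode is general: any stray output appended to binary data corrupts the response. Here is a minimal defensive sketch in plain PHP (not Kohana's actual API; render_chart() is a hypothetical stand-in for the chart generator):

```php
<?php
// Hypothetical sketch: serve generated PNG bytes without letting buffered
// debug/profiler output leak into the binary stream.
function render_chart(int $id): string
{
    return "\x89PNG";          // placeholder for real PNG bytes
}

$png = render_chart(1234);

while (ob_get_level() > 0) {
    ob_end_clean();            // discard anything already buffered
}
header('Content-Type: image/png');
header('Content-Length: ' . strlen($png));
echo $png;
exit;                          // ensure nothing (profiler included) runs afterwards
```

The explicit Content-Length also lets the browser detect truncated or padded responses instead of hanging.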

Related

Forcing CodeIgniter to work with Varnish

I have a CodeIgniter application that already uses CodeIgniter's built-in caching, and I added Varnish as a new caching layer. But as far as I can see from varnishstat, it's not making much of a difference.
Hitrate ratio: 1 1 1
Hitrate avg: 0.0480 0.0480 0.0480
I think that's because of the CodeIgniter cookies and HTTP headers that are being sent back.
These are the response headers coming back through Varnish:
Accept-Ranges:bytes
Age:0
Cache-Control:no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Connection:keep-alive
Content-Encoding:gzip
Content-Length:1755
Content-Type:text/html
Date:Mon, 29 Jul 2013 07:25:59 GMT
Expires:Thu, 19 Nov 1981 08:52:00 GMT
Pragma:no-cache
Server:Apache/2.2.22 (Ubuntu)
Set-Cookie:ci=%2BEZFRauewyNnl3CXHE0cmz6F3G3moadilt5vXu5gghKXkWm4gg0JObnF2Etq6J5rl3XVigbF0je3funibpvEi4X%2BT3rS2VmazGG3T4Vm1%2B8YcbqRgL4xuMcxybndqJy%2BU9mNmsJjOgmYEZ8oPG8BKOtMaxNiMHmgmbBydxt3SzKfFfFUOydFx%2BeeJ7P1WE7V10m6GyfnFO5DrFYNsE%2F37WNAI%2Bpux%2Fwwch5B4DH1%2F6wssPm%2BNwsLQ1%2FBd44hgUXe3mMzzcqvxNjKqs0gjuwzwPT4nibEHirfaJ7TMVGObMjdrbREnoYS2gwoN15cCeKgXmTJQI2vvTuPcdtZVCjcAX6OvTy491HdIvQIdKRhX2BNi8d7ygo%2F7n5T6%2FN%2B0IohNN9iZ%2Fh959W%2Fz4azEJPfTrluucf6cLnlp2T2zb%2Fb3XroWuPqguk4wMpsAstfLsSfA%2F6yEi4Hph%2BPFxX%2BhyBazs11LJ38FA0flWtYY%2Bk%2B6yoF13sTaENN2pWj0bKDTtres9E4y3xMPr%2FZaO78WRA9CccDzcQfbZ3bZUqoXg4HmX%2BHDHiYPLD6uFpnC28LuDrCSbgXFIlhDrC8j65sxNSKhnzlUP7Konr%2FKRfKNzrgtWHBEzuXArW%2BlgIg1MzaW3GIkRy1gr16ZUjIiv7CCx7Y2twAfKOm4t00MvrTcFoxBPN1lzoasNyRLMIvshU8heWZHy17OPEapuO6N%2BuMl9L8LqU0%2FF%2BUeUDyFVwLG39LGkIVuF93VsIYEp6w2UwtccX4OO4P2uwJEoAJMMqUE%2FztELpCv%2BkfRAiub48n%2BRxK%2FhgUHw1LWsWIPv3xngq3MI8ypWCqkWLjPuu5dc%2FdOd3BSW2MYcBwacoB5CEOPBHGq3hw1QSZfY330hkLuyQPHxkh%2FDTija%2FN2Rz6z47JorsCqHGDBK6%2BPswBWvYZeMd0VMD%2F95j%2BFibi6rBqL3hoE%2BDgcfCdly%2FYH9py%2Fe%2Fa0AUiIINTK8EPtpuKdC8dLhKo2jI5J4e1ifZuWjVd3VnL2CvX; path=/
Vary:Accept-Encoding
Via:1.1 varnish
X-Powered-By:PHP/5.3.10-1ubuntu3.5
X-Varnish:1353481467
I noticed a few things:
No matter how hard I try, CodeIgniter won't change the Cache-Control, Age, or Expires headers until I set them manually.
CodeIgniter sends an old date (1981); I double-checked the date on my server and it's correct.
CodeIgniter keeps changing the cookie on almost every request.
In my app I have several pages that require a username/password, but I'm trying to focus first on getting the public pages cached; after that I'll look at the account-related pages.
I'm testing on this code:
//$this->output->set_header("Cache-Control:public, max-age=9000");
//$this->output->set_header("Vary: Accept-Encoding,Cookie,User-Agent");
$this->output->cache(2400);
$this->load->view("test");
If your backend is sending Cache-Control: max-age=XXX, you can forget about the Expires header, as it will be ignored by HTTP/1.1-compliant clients and proxies [1]. (In fact, it's common to set it in the past precisely to keep old HTTP 1.0 clients from caching items.)
Be very careful when setting a Vary: User-Agent header, as it can hurt your caching chances [2].
And finally, Varnish won't cache a response like that because of the Set-Cookie header. Varnish needs lazy session initialization (see [3]) to cache things (or a fairly complicated VCL).
If you fix the cookie issue and still have problems, the VCL file and Varnish version would be really appreciated.
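The lazy session initialization idea from [3] can be sketched in a few lines (illustrative, not CodeIgniter's API): only resume a session when the visitor already has a session cookie, so anonymous requests to public pages emit no Set-Cookie header and stay cacheable by Varnish.

```php
<?php
// Sketch: start the session only when a session cookie already exists,
// so first-time anonymous visitors get a cookie-free, cacheable response.
function lazy_session_start(): bool
{
    if (isset($_COOKIE[session_name()])) {
        session_start();   // resume the existing session
        return true;
    }
    return false;          // no cookie: stay sessionless (and cacheable)
}
```

Pages that actually need to store something (login, cart) call session_start() directly; everything else goes through the lazy wrapper.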
[1] http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html
[2] https://www.varnish-cache.org/docs/3.0/tutorial/vary.html
[3] How should I implement lazy session creation in PHP?
Your cookie problem is one I'm currently wrestling with. I'm guessing from your sample that you have changed the cookie_name variable to be ci rather than ci_session.
The simplest thing for you (for certain values of 'simple') may be to jettison CI's session handling for an alternative, or to switch to database-backed sessions rather than cookie-based ones. If there's a Set-Cookie header (and there always is when sessions are initialized/auto-loaded), the response won't get cached. If the session data is all stored server-side instead, you won't have this issue.
Personally I'm not keen to introduce a database for our sessions, so I'm going to experiment with setting the Cache-Control header to ignore cookies and calling header_remove() at the top of views I know will never be user-specific (like our RSS feed).
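The header_remove() approach described above might look like this at the top of a never-user-specific view (the max-age value is illustrative):

```php
<?php
// Sketch: drop the queued session cookie and no-cache headers for a
// public, never-user-specific view, then send cache-friendly ones.
header_remove('Set-Cookie');
header_remove('Pragma');
header_remove('Expires');
header('Cache-Control: public, max-age=2400');
```

With no Set-Cookie in the response, Varnish's default logic is free to cache it.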

PHP or Apache seems to be caching files read via file_get_contents or include (unwanted behaviour)

Our web application serves a version number to the client on each request so we can detect an update to the code (i.e. rolling updates) and display a popup telling users to reload to pick up the latest version.
But I'm seeing some weird behaviour after updating the version number on the server: some requests return the new version number and some return the old, so the popup keeps popping up until you have reloaded the page a few times.
Originally I suspected Apache was caching files read off disk via file_get_contents, so instead of storing the version number in a plain text file, I now store it in a PHP file that gets included with each request. But I'm experiencing the exact same issue!
Does anyone have any idea what might be causing Apache, or PHP itself, to serve out old information after I have done an update?
EDIT: I have confirmed it's not browser caching, as I can have the client generate unique URLs (which the server handles via a rewrite rule) and I still see some requests return the old version number and some the new; clearing the browser cache doesn't help.
EDIT 2: The response headers as requested
HTTP/1.1 200 OK
Date: Mon, 23 Jul 2012 16:50:53 GMT
Server: Apache/2.2.14 (Ubuntu)
X-Powered-By: PHP/5.3.2-1ubuntu4.7
Cache-Control: no-cache, must-revalidate
Pragma: no-cache
Expires: Sat, 26 Jul 1997 05:00:00 GMT
Vary: Accept-Encoding
Content-Encoding: gzip
Content-Length: 500
Connection: close
Content-Type: text/html
EDIT 3: Trying to reproduce this to capture the response headers, I found I could only make it happen by going through our full deploy process, which involves creating versioned folders containing the code and symlinking the relevant folder into the webroot. Just changing the version number wasn't enough to trigger it! So it seems to be related to the symlinks I create.
I have the same problem when a symlink changes. Have a look at https://bugs.php.net/bug.php?id=36555; it may be what you are looking for.
Try (as suggested in that bug report) setting realpath_cache_size to 0.
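That setting is PHP_INI_SYSTEM, so it can't be changed with ini_set() at runtime; it has to go in php.ini (the TTL line is an additional, optional tightening):

```ini
; php.ini: disable the realpath cache so symlink swaps are picked up
; immediately (at the cost of extra stat() calls per request)
realpath_cache_size = 0
realpath_cache_ttl = 0
```

As a per-request alternative, calling clearstatcache(true) in PHP also clears the realpath cache.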

Browser does not listen to headers sent by PHP for cache, what can be changed?

I have a system in place with a two-layered cache on the website. Every request checks for a cached copy on the web server; if the generated page has been cached, the server returns it instead of generating it again.
For example:
User A visits website URL http://www.example.com/home/
Server does not find cache
Server generates the page
Server writes the page to cache
Server returns the generated page to User A
User A is happy
and
User B visits website URL http://www.example.com/home/
Server finds cache
Server returns the cache instead of generating the page again
User B is happy
All of that works without problems. But I also want the browser to skip pinging the server entirely (saving the server the work of checking whether the cache exists) and use its own cache instead:
User A visits URL http://www.example.com/home/ again
Browser has that page in cache
Browser loads the page for the user from cache
I cannot get the latter working. During the original page generation, I send the following headers along with the page:
header('Cache-Control: public, max-age=10000, must-revalidate');
header('Expires: Fri, 03 Feb 2012 01:59:45 GMT');
But when I check with Firebug or Chrome Developer Tools, it does not say it is using a cache; instead, the browser asks the server for the data again. I know I must be doing something wrong, since I have the same thing set up for static files like JavaScript, and that works.
To test this I didn't just reload the page; I created links on the website, and moving between those links, the browser asked the server for the pages each time.
Am I missing something?
EDIT:
Alright, apparently the server was sending "Pragma: no-cache" automatically every time, which kept the browser from using its cache. Does anyone know why the server would do that?
If a session is enabled for that page/URL, a Pragma: no-cache header will be added to the HTTP response, which prevents the browser from using its cache.
If you use session_start(), PHP will add:
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Pragma: no-cache
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
The whole point of PHP is to provide dynamic pages, after all. To stop this:
session_cache_limiter('');
session_start();
You can then write your own headers based on the content you serve.
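Putting the pieces together, a cacheable session page might be set up like this (the max-age value is illustrative):

```php
<?php
// Sketch: suppress PHP's default session no-cache headers, then send
// our own caching headers for this page.
session_cache_limiter('');    // no Expires/Pragma/Cache-Control from PHP
session_start();

$maxAge = 10000;              // illustrative lifetime in seconds
header('Cache-Control: public, max-age=' . $maxAge);
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + $maxAge) . ' GMT');
```

The key is that session_cache_limiter('') must run before session_start(), or PHP has already queued its no-cache headers.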

Lost session data when setting http headers (in CodeIgniter)

EDIT #2
Ok, the problem is different to what I originally thought, so I'm putting this 'edit' up the top. I've also updated the question title (the old one was 'Streaming wav audio from a mysql blob').
The problem now seems to be related to CodeIgniter sessions. The above script will only run if the user is logged in. For some reason, when I manually set the response headers (either with PHP's header() or CodeIgniter's output class), I can see from my logs that everything gets reloaded and reinitialised, and that the session data is lost; the user is no longer logged in, so the script is actually outputting an error.
Once I removed any requirement on or reference to session data, the audio plays fine... but this isn't really an option unless I can manage to authenticate the user some other way. Very frustrating.
** Original Text **
I have a MySQL blob which contains audio data in WAV format. I'm trying to output it to the browser to stream in whatever audio player plugin the browser wants to use.
The audio player works fine if I point it at a physical .wav file, so that doesn't seem to be a problem. However, if I point it at my PHP script I get a 'Search for suitable plugin?' popup in Firefox (which fails to find anything), and just an empty inactive player in Chrome.
Here's the PHP, the $aud object contains information retrieved from the database:
header("Content-length: $aud->size");
header("Content-type: audio/x-wav");
echo $aud->recording;
exit();
If I add header("Content-Disposition: attachment; filename=$name"); to the above I get to download the file, which then plays fine in an external audio player, but that's not what I want.
This snippet is part of a CodeIgniter application, if that makes a difference. I have routing set up so that /audio/$id.wav grabs the appropriate data and outputs it with the code above.
Can anyone see or think of any reason the above might not be working?
EDIT
These are the headers returned by the php script:
Date: Tue, 22 Mar 2011 22:12:06 GMT
Server: Apache/2.2.16 (Ubuntu)
X-Powered-By: PHP/5.3.3-1ubuntu9.3
Set-Cookie: ci_session=<long encrypted session string>; expires=Wed, 23-Mar-2011 00:12:06 GMT; path=/
Content-Length: 12345
Keep-Alive: timeout=15, max=98
Connection: Keep-Alive
Content-Type: audio/x-wav
200 OK
And for comparison, these headers are returned when I force a download of the above audio and open that wav file directly in the browser:
Date: Tue, 22 Mar 2011 22:10:53 GMT
Server: Apache/2.2.16 (Ubuntu)
Last-Modified: Tue, 22 Mar 2011 22:08:30 GMT
Etag: "200a83-3039-49f197bfcb380"
Accept-Ranges: bytes
Content-Length: 12345
Content-Type: audio/x-wav
200 OK
Saving and then opening the file directly does work. Having the PHP script output to the browser does not.
I would start investigating this problem by comparing the HTTP content-length, content-type, and all the other headers sent by the web server when pointing to the physical .wav file, with the headers sent when trying to open the PHP script. I think the actual content body is correct, according to your post stating that if you download the file as an attachment, it can be played with an audio player application.
Try adding this header in there:
header('Content-Transfer-Encoding: binary');
See if that helps it out.
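Combining that suggestion with the headers already shown, a fuller version of the output code might look like this (illustrative; $aud comes from the database query as in the question):

```php
<?php
// Sketch: stream the WAV blob with explicit length and binary transfer
// encoding, and stop before any trailing output can corrupt the stream.
header('Content-Type: audio/x-wav');
header('Content-Length: ' . strlen($aud->recording));
header('Content-Transfer-Encoding: binary');
header('Accept-Ranges: bytes');
echo $aud->recording;
exit;
```

This mirrors the header set the browser received when the static .wav file played correctly.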

Downloading invoices (PDF) does not complete

We have a Magento commerce site running on an IIS 6.0 server with PHP 5.2.11.
Whenever a user tries to use 'print' to download a PDF from the admin panel, the download does not complete. I can see that the full file is downloaded to the computer, but the browser keeps saying it is downloading. This means the file gets saved with a .part extension and users can't open it as a PDF. If I remove the .part extension created by Firefox, I can view the PDF correctly. So the data is sent to the browser in full, but the download never terminates.
See headers below on response while starting to download the pdf
HTTP/1.x 200 OK
Cache-Control: must-revalidate, post-check=0, pre-check=0
Pragma: public
Content-Length: 1456781
Content-Type: application/pdf
Content-Encoding: gzip
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Last-Modified: Fri, 18 Dec 2009 10:23:37 +0000
Vary: Accept-Encoding
Server: Microsoft-IIS/6.0
X-Powered-By: ASP.NET, PHP/5.2.11
Content-Disposition: attachment; filename=invoice2009-12-18_10-23-37.pdf
Date: Fri, 18 Dec 2009 10:23:37 GMT
I guess it has something to do with not closing the connection after sending the whole file. Please help!
Thanks.
I had the exact same problem (on Apache); I temporarily solved it by turning off gzip compression on the responses. My guess is that the size Magento reports to the browser (which it gets from a strlen() call on the PDF content) does not reflect the real content size the browser receives, given that the content gets compressed later on. This leaves the browser waiting for more data that is never going to arrive.
Edit: worth noting that in my case I was accessing the site through a reverse proxy.
Have you tried explicitly calling exit; after you output the PDF data? Sounds like an IIS thing.
