I have a Flash app that requests XML generated by a PHP script. The data doesn't change much, and I would like Flash to cache the XML instead of loading it every time. I've been checking my access logs, and every single time I reload a page with the Flash app on it, the PHP file is accessed and the XML downloaded.
I've read that Flash doesn't control what is cached, as it just requests things through the browser, but everything else that Flash downloads (i.e. the mp3 files supplied by the XML) does get cached. So I'm not really sure what that means.
I've googled the heck out of this, but everything I find is telling me how to keep flash from caching stuff.
Here's the code I used (AS3):
var xmlLoader:URLLoader = new URLLoader();
xmlLoader.load(new URLRequest("info.php"));
It's not a huge deal but sometimes it takes 2-3 seconds to load if my host decides to respond slowly.
Edit: I got the request headers:
HEAD /beatinfo.php HTTP/1.1[CRLF]
Host: spoonhands.com[CRLF]
Connection: close[CRLF]
User-Agent: Web-sniffer/1.0.37 (+http://web-sniffer.net/)[CRLF]
Accept-Encoding: gzip[CRLF]
Accept-Charset: ISO-8859-1,UTF-8;q=0.7,*;q=0.7[CRLF]
Cache-Control: no-cache[CRLF]
Accept-Language: de,en;q=0.7,en-us;q=0.3[CRLF]
Referer: http://web-sniffer.net/[CRLF]
Try looking at the header function (http://php.net/manual/en/function.header.php).
That is the one I always use to send HTTP headers so that a page will not be cached. I think you can send headers so that it will be cached instead.
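For example, something along these lines at the top of the PHP script should let the browser reuse the XML for a while (the one-hour max-age is just an example):
<?php
// Allow the browser (and any intermediate caches) to reuse this response for 1 hour.
header('Content-Type: text/xml; charset=utf-8');
header('Cache-Control: public, max-age=3600');
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + 3600) . ' GMT');

// ... build and echo the XML as before ...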
Related
We're using a normal PHP download script (with headers etc) to serve files to users.
The issue however is that with some browsers and large downloads the download script is requested multiple times. NGINX logs show the requests with a 206 (Partial Content) status code, which is strange because we don't serve any streamable content.
Regardless, this means the download script is requested multiple times and thus the MySQL function of +1'ing the download counter for the file is run multiple times per download.
We tried using sessions, but seeing as the download is served from an external server + domain, we have no way to clear said sessions after they're set.
We're using Laravel with NGINX + MySQL, any help would be appreciated. Thanks!
Looking at the spec and the headers for a request that would ultimately result in a 206 response, there is one pair of headers that stands out as perfect for this.
The client sends a Range request header, and the 206 response answers it with a Content-Range header, which could look like the following:
Range: bytes=21010-47021
Content-Range: bytes 21010-47021/47022
What this is saying is that the client wants to grab bytes 21010-47021 out of 47022 bytes. All you should need to worry about is the first number and whether it's 0. If no Range header was sent, or it was sent and the first number is 0, you can assume the download is just beginning and you should increment the counter.
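A minimal PHP sketch of that check (incrementDownloadCounter() and $fileId are placeholders for your own Laravel/MySQL logic):
// Only count the download when this is not a follow-up range request.
$range = isset($_SERVER['HTTP_RANGE']) ? $_SERVER['HTTP_RANGE'] : null;  // e.g. "bytes=21010-47021"

if ($range === null || preg_match('/^bytes=0-/', $range) === 1) {
    incrementDownloadCounter($fileId);  // hypothetical helper doing the +1 in MySQL
}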
This must be so simple, but I can't find the answer anywhere.
I'm using Symfony2 but I suppose that's not the issue.
(1) Using this (example) my HTML output pages are cached perfectly by a browser:
Cache-Control: max-age=60, private
...but then if I output in JSON and use exactly the same cache headers, the browser won't cache it.
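For reference, the JSON side looks roughly like this (a simplified sketch, not my exact controller code):
use Symfony\Component\HttpFoundation\JsonResponse;

// Simplified: the same caching directives applied to a JSON response.
$response = new JsonResponse($data);
$response->setMaxAge(60);
$response->setPrivate();   // results in Cache-Control: max-age=60, private
return $response;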
(2) If I use this:
Cache-Control: max-age=60, public
...then the Symfony reverse-proxy kicks in and works great (for both HTML and JSON), so it is definitely the browser not caching JSON that's the problem.
I've looked at Apache settings, but all I can see are ones that will set cache headers that I'm already using (e.g. ExpiresDefault).
Background
Part of my application's responsibility is handling requests for static resources (CSS, JavaScript, images) in a controlled manner. Based on some application logic, it will return one from a selection of different files that might be served on that URL at different times and to different users. These are therefore static files, but delivered in a dynamic way.
The application is based on Symfony Components and the serving of these static-ish files is handled by the BinaryFileResponse class.
The bootstrap code calls the trustXSendfileTypeHeader method:
\Symfony\Component\HttpFoundation\BinaryFileResponse::trustXSendfileTypeHeader();
The application uses some internal logic, based on configuration and detection via apache_get_modules(), to determine availability. If XSendfile is available and the configuration says to use it, it sets the X-Sendfile-Type header:
if ($useHeader === true) {
    $request->headers->set('X-Sendfile-Type', $header);
}

$response = new BinaryFileResponse($filename);
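(The availability check itself is roughly along these lines; this is a simplified sketch of that internal logic, with a made-up $config key, not the exact code.)
// Simplified sketch: use XSendfile only when configured and mod_xsendfile is actually loaded.
$useHeader = $config['use_xsendfile'] === true
    && function_exists('apache_get_modules')
    && in_array('mod_xsendfile', apache_get_modules(), true);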
Problem
When I run this with the configuration set to never use XSendfile, or through the PHP built-in web server, which obviously does not support XSendfile, everything is perfect.
When I utilise XSendfile, it also works -- most of the time.
Every so often, typically if I press the F5 key 3-4 times in quick succession, "something" wigs out and I get a garbled response. For example, this is supposed to be a JavaScript file (copied from the "Response" tab under "Net" in Firebug):
hxYîãx��HTTP/1.1 200 OK Date: Tue, 05 Feb 2013 14:49:10 GMT Server:
Apache/2.2.22 (Ubuntu) X-Powered-By: PHP/5.4.6-1ubuntu1.1
Cache-Control: public Last-Modified: Tue, 29 Jan 2013 13:33:23 GMT
Accept-Ranges: bytes Content-Transfer-Encoding: binary ETag:
"10426f-9f6-0" Vary: Accept-Encoding Content-Encoding: gzip
Content-Length: 1011 Keep-Alive: timeout=5, max=98 Connection:
Keep-Alive Content-Type: application/javascript
������VmoÛ6þ,ÿkÀ²ãIý°~q [Üt]
XÑt¶H¤#Rv¼Àÿ}w(YSÀØ2yïå¹*¾Á>¯¥¥,è) Æ^Ât¸BaÆ\éjgäjí
Î&ð*¸Åí¸tY!³Ç$Óe"jÞ![#,n®®oï®A¨þ¸þù××Þ©¼¼ôÇêÚd¹49mv°ÔrtBÖ^;WÍÓÔg´Y¥´FéôÁR9o°35Îà^º´N=UÐèEµ¢XE¸íÒ%ª°¨Úò7¬KñT¾{;£ÈrTnß³étUè{QÀçÍn·:'üJëQÍÄËZeNjOàyÕÁ:#3wö~4Òét1ù$µeN)RD|
¶FTØJ·ß½¥¨¸õGç >9TyÜxzgl-J:) b«9ûAQ½KXÉ!yÐÓ]
óÆÎ#W¡?¢vún·7j©ÿ¢ðõÖGEÁy\ºp¤÷cKxf?ï*¼Éç0^ïîÌÇ°ñDQ¸mYJ|4t¾ñæËÛ¯Å
¨6:çøp(}þÑò|LÂ;Õ(#v¹* /[¨U|xª
æ]ÍyìjµòÛ¯p?4sI¥"v÷ôp|uQ4ò4&Ï·$eÒc¸ xo%7Ôi´2ñx;TuÙj23 áÊ%ħ¿¹lÌwÀS.&ÏØß7¸}ó
ZXzå k2'Zdùè
�¦ºû-Ù[Ó²ÿU(¯¤¥=pÃjô¾ç]]Øhhô²×ÙãÚÍ4¨[!Õ}'Òþ^Ð�ûxÿ#+ÚVÞ~áÌáy?d
aíD¹·U×ÃÚ] õ5íÃø¨o÷ÂAvUÆmÍaày`¦ä©A?mL[-}®(ÿË
d°öò¬}Ç¢³Çp1À^6%0 hTô^ts´ÞíWô
fO¶ö¢ÎNÜæ·HîUôÔ¶±ÌCµsxh.9åçi Û·_ÈÞØ_ÄãY_Ö}G<ì°ý2wÔ¿aw8/þù\ã±þ"0C
oÂh'tE¶À¤¥7I½éßRt.s?á^d|k/Æ)wRw÷cG¿<Þ
¼´°/^ø*ʤAVZ×y¿zÅΪ¥[²Õ1ò_Vµæï_YXÁÕö ��YXÁÕö ��
Note the presence of the headers in the response body, and the rest of it, which is clearly not JavaScript. There are also some spurious characters at the start, which is possibly what leads to the headers being pushed into the body. I have tried to determine whether this content is the result of gzipping, but I can't confirm that yet. (See also the update below.)
Question
Firstly, is BinaryFileResponse even the correct class to use for serving text (non-binary) files? The documentation for the class only says "BinaryFileResponse represents an HTTP response delivering a file." This isn't very detailed, but it doesn't say anything about it being exclusively for "binary" files. However, the name has its own implications; why didn't Fabien just call this class FileResponse?
Secondly, and more importantly, what could be causing this? I don't believe it is a browser issue because it is repeatable in both Firefox and Chrome. Is this a bug in the XSendfile module, or in the BinaryFileResponse class perhaps? (I am inclined to think it is not the former, because I have used it before in a more "raw" way, not via Symfony Components, with no such issues.)
Has anyone else experienced this? Any idea where I should even start looking to track this down? I've looked at the BinaryFileResponse source code, but it doesn't really do much with XSendfile; it just sets the relevant header and prevents content in the response body, from what I can see.
Update
I've just noticed a couple of things about these garbled responses:
There are no actual headers being sent at all, i.e. on the "Headers" tab in Firebug, for the garbled responses, it only lists Request headers and doesn't even show the heading for Response headers.
Even if I set some custom header on the Response in PHP, that header does not appear at all in the garbled responses (as a header or in the response body), but the custom headers appear correctly for the responses that aren't broken.
First, let me say that I don't have any experience with this Apache module, but I'll try to guide you through a general error deduction:
You should check if you can reproduce it more reliably. While a web browser might be OK for trying it out, you should go for something like curl and make the request multiple times, for example using a bash for loop:
for i in `seq 1 5`; do curl -v http://localhost/xsendfile-url; done
The fact that the Connection: Keep-Alive header is set and that there are some weird characters before the actual HTTP header leads me to believe that you won't be able to reproduce this problem with separate curl calls, because each call will open a fresh connection. So try this to check whether that gives you the weird behavior (curl reuses the connection across multiple URLs given in one invocation):
curl -v http://localhost/xsendfile-url http://localhost/xsendfile-url http://localhost/xsendfile-url
With that information, you could go to the project's GitHub issues page and report your findings. Most probably they will be able to tell you why mod_xsendfile is behaving the way it is, or confirm that you have found a bug.
I think my question seems pretty casual but bear with me as it gets interesting (at least for me :)).
Consider a PHP page whose purpose is to read a requested file from the filesystem and echo it as the response. Now the question is how to enable caching for this page? The thing to point out is that the files can be pretty huge, and enabling the cache is meant to save the client from downloading the same content again and again.
The ideal strategy would be to use the "If-None-Match" request header and the "ETag" response header in order to implement a reverse proxy cache system. Even though I know this much, I'm not sure whether this is possible or what I should return as the response in order to implement this technique!
Serving huge files, or many auxiliary files, is not exactly what PHP is made for.
Instead, look at X-accel for nginx, X-Sendfile for Lighttpd or mod_xsendfile for Apache.
The initial request gets handled by PHP, but once the download file has been determined it sets a few headers to indicate that the server should handle the file sending, after which the PHP process is freed up to serve something else.
You can then use the web server to configure the caching for you.
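A minimal sketch of that hand-off for Apache with mod_xsendfile (path and filename are just examples):
<?php
// PHP decides which file to serve, then lets Apache do the actual transfer.
$file = '/var/data/downloads/report.pdf';   // example path

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="report.pdf"');
header('X-Sendfile: ' . $file);             // mod_xsendfile takes over from here
exit;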
Static generated content
If your content is generated from PHP and particularly expensive to create, you could write the output to a local file and apply the above method again.
If you can't write to a local file or don't want to, you can use HTTP response headers to control caching:
Expires: <absolute date in the future>
Cache-Control: public, max-age=<relative time in seconds since request>
This will cause clients to cache the page contents until they expire or until the user forces a page reload (e.g. by pressing F5).
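In PHP that could look roughly like this (one hour is just an example lifetime):
// Let clients cache this response for one hour.
$lifetime = 3600;
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + $lifetime) . ' GMT');
header('Cache-Control: public, max-age=' . $lifetime);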
Dynamic generated content
For dynamic content you want the browser to ping you every time, but only send the page contents if there's something new. You can accomplish this by setting a few other response headers:
ETag: <hash of the contents>
Last-Modified: <absolute date of last contents change>
When the browser pings your script again, it will add the following request headers respectively:
If-None-Match: <hash of the contents that you sent last time>
If-Modified-Since: <absolute date of last contents change>
The ETag mostly helps to reduce network traffic rather than server work, because in some cases you first have to generate the contents anyway just to calculate their hash.
The Last-Modified is the easiest to apply if you have local file caches (files have a modification date). A simple condition makes it work:
if (!file_exists('cache.txt') ||
    !isset($_SERVER['HTTP_IF_MODIFIED_SINCE']) ||
    filemtime('cache.txt') > strtotime($_SERVER['HTTP_IF_MODIFIED_SINCE'])) {
    // update the cache file and send back its contents as usual (+ cache headers)
} else {
    header('HTTP/1.0 304 Not Modified');
}
If you can't do file caches, you can still use ETag to determine whether the contents have changed meanwhile.
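A bare-bones ETag variant could look like this (md5 is just one possible hash, and how $contents is produced is up to your script):
// Compute a validator for the current contents.
$etag = '"' . md5($contents) . '"';
header('ETag: ' . $etag);

// If the browser already has exactly this version, skip sending the body.
if (isset($_SERVER['HTTP_IF_NONE_MATCH']) &&
    trim($_SERVER['HTTP_IF_NONE_MATCH']) === $etag) {
    header('HTTP/1.0 304 Not Modified');
    exit;
}

echo $contents;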
Here's a strange one:
I've got nginx reverse proxying requests to apache 2 with mod_php.
A user (using Firefox 3.1b3) reported that he's recently started getting sporadic "What should Firefox do with this file?" popups during normal navigation. We haven't had any other reports of this issue, and haven't been able to reproduce it ourselves.
I checked Nginx and apache's logs. Nothing in the error logs, and they both show a normal HTTP 200 for the request.
I had him send me the downloaded file, and it's generated HTML, as it should be -- except it has some trailing and leading bytes tacked on.
The opening byte sequence is the magic gzip header: 1F8B08
Here are the opening characters, C-escaped for convenience:
\x1F\x8B\x089608\r\n<!DOCTYPE HTML ...
and the file ends with:
...</html>\n\r\n0\r\n\r\n
When I fetch the same URL via wget, it starts with <!DOCTYPE HTML as expected; the mysterious opening and closing bytes are nowhere to be seen.
Has anyone ever seen anything similar to this? Could this be a FF 3.1b3 bug?
Never seen an issue exactly like that, but I did have an issue once with a transparent proxy that would claim to the web server that it could handle gzip compressed content when, in fact, it received the gzipped content from the server, stripped the gzip headers without decompressing it, and sent the result to the browser. The behavior we saw was what you describe: a save/open file dialog for what should have been a normal web page. In this case the browser in question was IE.
I'm not sure if your problem is related, but as an experiment, you could look at the requests between the proxy and Apache and see if they are gzipped, or else turn off gzip compression for responses in Apache and see if that fixes the issue. If so, then you probably have a problem with gzip handling in your proxy.
wget doesn't request a compressed response. Try:
curl --compressed <URL>
You could also try adding a -v to print the response headers, and check that a sensible Content-Type is being returned.