I have a ZF2 PHP app and I want my pages never to be cached, forcing the browser to fetch a fresh copy each time.
First: how do I manipulate the response through a controller (ZF 2.4)? My searches only turn up older versions, often 1.x. Maybe I missed something.
Second: do you have any advice about the 'browser never caches' part? On php.net there are a lot of solutions, but in pure PHP; maybe you know a good way with ZF2.
Thanks, have a nice day.
You can access the headers through the response object as follows from a controller action:
$this->getResponse()->getHeaders()->addHeaderLine('Cache-Control', 'no-store, no-cache, must-revalidate, max-age=0');
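For context, here is a sketch of a complete controller action setting the usual belt-and-braces set of no-cache headers (the controller and action names are placeholders, not part of any ZF2 API):

```php
<?php
use Zend\Mvc\Controller\AbstractActionController;

class ExampleController extends AbstractActionController
{
    public function indexAction()
    {
        $headers = $this->getResponse()->getHeaders();

        // Modern clients honour Cache-Control; the other two cover
        // HTTP/1.0 clients and proxies.
        $headers->addHeaderLine('Cache-Control', 'no-store, no-cache, must-revalidate, max-age=0');
        $headers->addHeaderLine('Pragma', 'no-cache');
        $headers->addHeaderLine('Expires', 'Thu, 01 Jan 1970 00:00:00 GMT'); // date in the past

        // ... return a ViewModel or the response as usual ...
        return $this->getResponse();
    }
}
```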
Related
I'm working on a Laravel 5.1 project that uses a lot of AJAX calls returning HTML blocks.
To optimize the speed of the website I want to implement private and public response caching. This works fine using the following code:
return response()
->json($result)
->header('Cache-Control', 'public, max-age=300');
Yet used this way, it won't take into account objects that are updated within the 300 seconds.
Is there a way to clear the response cache of a request if, and only if, the returned objects have been updated?
Maybe you can try server-side caching with something like the code below (sorry, it's crude):
function sometest(User $user)
{
    /* ... conditions to check if some data has changed ... */

    // Note: in Laravel 5.1 the second argument to Cache::remember()
    // is in minutes, so 300 caches for 300 minutes (use 5 for 300 seconds).
    $jsonResponse = Cache::remember(Auth::id() . "_sometest", 300, function () use ($user) {
        return $user->all(); // build the result here
    });

    return response()->json($jsonResponse);
}
You can read more about it here: Cache
You can also try:
config caching: php artisan config:cache
route caching: php artisan route:cache
and utilizing Memcached if you are able to.
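To address the "clear the cache only when the objects changed" part of the question, one common approach is to forget the cached entry from a model event so the next request rebuilds it. A sketch, assuming an Eloquent `User` model and the `"_sometest"` cache key used above:

```php
<?php
// Laravel 5.1 style: register model events in the boot method.
class User extends \Illuminate\Database\Eloquent\Model
{
    protected static function boot()
    {
        parent::boot();

        // Whenever a user is saved or deleted, drop the cached response;
        // it will be regenerated on the next request.
        static::saved(function ($user) {
            \Cache::forget($user->id . "_sometest");
        });

        static::deleted(function ($user) {
            \Cache::forget($user->id . "_sometest");
        });
    }
}
```

The cache key passed to `Cache::forget()` must match the one used in `Cache::remember()` exactly, otherwise the stale entry survives.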
As others said, the client browser needs to make a request to know that the data has been updated. Here are some solutions I would look into in your case:
Server-side cache (data still need to be network transferred):
depending on your environment, I would set up an Nginx + FastCGI cache with a "stale and update" policy: the cache is always served (fast) and always refreshed in the background, so only a few requests (one, or more depending on how long the refresh takes) after a content update are served outdated data. This cache is URL-based, so if your content is cookie/session-based it can get tricky.
as @ZachRobichaud said, you can use the Laravel cache with a low retention time. Say 10s: the response will be outdated for at most 10s after your content update. I'm not aware of a "stale and update" mode in Laravel, but it can be done with queues.
Client-side cache (no data transfer needed):
as I said, the client needs to know the data has been updated to invalidate the cache.
Usually for assets, we do "cache busting" by adding GET parameters to the file URL, like `asset?version=1234`, with the version changing on each deployment. Since the browser cache is URL-based (and header-based), the URL change forces a network load of the file. Not tested with a text/HTML content-type response, but worth a try if you can tag your URLs with a parameter you can change in `.env`, for example. This can be done automatically if you have CI/CD or something you can trigger on deploy. In that case you can cache those responses "forever", since the refresh is done by changing the URL parameter.
You can take a look at the stale-while-revalidate Cache-Control value, which seems to work the same way: always serve from cache, and refresh the cache if it has expired (also look at the other parameters; they may give you ideas). Be careful about browser compatibility on this (no IE or Safari).
The Laravel cache is probably the fastest to implement and test, to see if the results suit you. It also depends on the payload size: if it's huge, browser caching is indeed better. If bandwidth is not the issue, it's mostly about server response time, and in that case the Laravel cache will do the trick.
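As an illustration of the stale-while-revalidate idea mentioned above, sent from PHP (the durations are arbitrary examples):

```php
<?php
// Serve the cached copy as fresh for 60 s; after that, a stale copy may
// still be served for up to 300 s while the client revalidates in the
// background.
header('Cache-Control: max-age=60, stale-while-revalidate=300');
```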
I'm very new to Restler 3. And a very happy new user!
My goal was basically to set up an EC2 instance which, using Restler's clever routing, can rewrite streaming manifests (a manifest basically tells the player which fragments of video/audio/subtitles to stream).
All is very good. Restler gets the manifest, does the rewriting, and easily sends it back to the requestor.
Now I am trying to squeeze something else into Restler. I need Restler to respond with an MP4-header formatted XML-Subtitle-TTML chunk.
You might ask, why squeeze this onto a Restler-platform?
A. The routing in Restler makes everything so much easier.
B. Why not try it out?
So, I have managed to get Restler to do almost what I need. I simply bypass Restler's return statement and echo() the binary data out to the requestor. And amazingly it all works.
My only tiny problem left to sort out is Content-Type.
All my other "normal" XML requests return "text/html" when tested with this awkward way of returning the response, using simple echo statements of nicely handcrafted XML. So I try to override it with
header('Content-type: text/xml');
Which also gets returned.
The problem is that somehow the binary response with an MP4-header gets forced by "someone" into
Content-Type: application/json; charset=utf-8
although I have set
header('Content-type: text/xml');
Any clues what I could do to override this?
The easiest way to fix the Content-Type header is to set it yourself and then exit your function. By calling exit; you trick Restler into replying with the (maybe non-standard) Content-Type:
header('Content-type: text/xml');
exit;
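In context, a sketch of an API method using this workaround (the class name, method name, and helper are placeholders, not part of Restler's API):

```php
<?php
class Subtitles
{
    /**
     * Returns an MP4-wrapped TTML chunk. We bypass Restler's own
     * serialization entirely: set the header, echo the raw bytes,
     * and exit before Restler can force application/json.
     */
    public function chunk($id)
    {
        $binary = $this->buildMp4TtmlChunk($id); // hypothetical helper

        header('Content-Type: text/xml');
        echo $binary;
        exit; // stop Restler from overriding the Content-Type
    }
}
```

The obvious trade-off is that exit; skips any of Restler's post-processing (filters, logging), so it should stay limited to these special binary routes.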
I'm trying to disable the Twig cache in prod mode, or to force it to recompile my views.
I'm using the KnpLabs SnappyBundle to generate some PDFs (the same problem appears with DomPDF), and I have dynamic content to render.
When in dev mode, I can modify some text or even some CSS properties and the changes are effective immediately.
But in prod mode, I need to run cache:clear, or rm -rf app/cache/prod/twig/*, to see the changes.
I tried the following options in the Twig section of my config.yml (not at the same time):
cache: "/dev/null"
cache: false
auto-reload: ~
I also tried some things with headers when generating and rendering my PDF:
$html = $this->renderView("xxxxPdfBundle:Pdf:test.html.twig", array("foo" => $bar));

return new Response(
    $this->get('knp_snappy.pdf')->getOutputFromHtml($html),
    200,
    array(
        'Cache-Control' => 'no-cache, must-revalidate, post-check=0, pre-check=0',
        'Content-Type' => 'application/pdf',
        'Content-Disposition' => 'attachment; filename=' . $file
    )
);
I can't figure out how to force Twig to recompile, or not to use app/cache, because obviously the PDF content will be dynamic in production.
Info update from the comments:
I noticed that even the dynamic template variables were not updated, so the same PDF was generated over and over again in production, but not in development.
After clearing all caches again, that issue is fixed: PDFs are now generated with dynamic content as designed.
Still, a question remains: what if, while my website is in production, I decide to change the CSS styling inside a PDF template? CSS is not a template variable, and I can't force people to empty their cache :/
The correct way to disable Twig's caching mechanism is to set the cache option to false instead of a cache directory:
# config_dev.yml
# ...
twig:
    cache: false
References:
Twig Environment Options
TwigBundle Configuration
The question of client-side caching has several answers.
First, HTTP employs headers that tell the client how to cache. The most aggressive option is to declare that the received resource should be considered fresh for the next X amount of time without revalidating for updates. The less intrusive version is to add a header with a signature of the delivered version, or a last-modified timestamp, so the client revalidates every time whether the resource is still up to date before using it.
The first kind of caching can only be bypassed by deleting the browser's cache on the client. The second can usually be circumvented by force-reloading the page (Ctrl-F5 or so), but that is about as hidden as the menu for clearing the cache.
To play it safe, the usual approach would be to add a tag, revision number, incremented counter or whatever is available, to the query string of the URL used for that resource.
http://example.com/generated/summary.pdf?v=1234
http://example.com/generated/summary.pdf?v=1235
The first URL is from deployment run 1234, the second is from 1235 - this number changes the URL enough to trigger a new request instead of getting the old version from the cache.
I don't know if your system has something available to act like this. You could also add an ever-changing value, such as the current timestamp, to avoid caching entirely if you cannot disable the HTTP caching headers.
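A minimal sketch of the idea in plain PHP; the DEPLOY_VERSION constant is a placeholder for whatever tag, revision number, or counter your deployment provides:

```php
<?php
// Hypothetical deployment tag; bump it on every release.
const DEPLOY_VERSION = '1235';

// Append the version to the resource URL so each deployment produces a
// distinct URL, forcing the browser past any cached copy of the old one.
function versioned(string $url): string
{
    return $url . '?v=' . DEPLOY_VERSION;
}

echo versioned('/generated/summary.pdf'); // /generated/summary.pdf?v=1235
```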
How can I implement jQuery in my Zend Framework application in a custom manner?
appending jquery.js ok
appending script ok
send POST data to controller ok
process POSTed data ok
send 'AjaxContext' response to client now ok (thanks)
I'm using jquery for the first time, what am I doing wrong?
Early on, the best practice for getting Zend to respond to AJAX requests without the full layout was to check a variable made available via the request headers. According to the documentation, many client-side libraries, including jQuery, Prototype, Yahoo UI, and MochiKit, all send the right header for this to work.
if($this->_request->isXmlHttpRequest())
{
    // The request was made via AJAX
}
However, modern practice, and what you're likely looking for, is to use one of two newer helpers:
ContextSwitch
AjaxContext
which make the process considerably more elegant.
class CommentController extends Zend_Controller_Action
{
    public function init()
    {
        $ajaxContext = $this->_helper->getHelper('AjaxContext');
        $ajaxContext->addActionContext('view', 'html')
                    ->initContext();
    }

    public function viewAction()
    {
        // Pull a single comment to view.
        // When the AjaxContext is detected, the comment/view.ajax.phtml
        // view script is used.
    }
}
Please note: this modern approach requires that you request a format for the context to be triggered. It's not made very obvious in the documentation, and it's somewhat confusing when you just end up getting strange results in the browser.
/url/path?format=html
Hopefully there's a workaround we can discover. Check out the full documentation for more details.
Make sure you're using $(document).ready() for any jQuery events that touch the DOM. Also, check the JavaScript/parser error console; in Firefox it's located in Tools -> Error Console. And if you don't already have it installed, I would highly recommend Firebug.
This should have been a comment, but I can't comment yet...
It has nothing to do with the ZF + jQuery combination.
First try a prototype of what you need with a simple PHP file: no framework, just jQuery and straightforward, dirty PHP.
Oh, and don't forget to track what happens with Firebug.
For example, if I have an echo statement, there's no guarantee that the browser will display its output right away; it might display a few dozen echo statements at once, or wait until the entire page is done before displaying anything.
Is there a way to have each echo appear in a browser as it is executed?
You can use flush() to force sending the buffer contents to the browser.
You can enable implicit flushing with ob_implicit_flush(true).
function printnow($str, $bbreak = true)
{
    print $str;
    if ($bbreak) {
        print "<br />";
    }
    if (ob_get_level() > 0) { // avoid a notice when no output buffer is active
        ob_flush();
    }
    flush();
}
Obviously this isn't going to behave well if you pass it complicated objects (or at least ones that don't implement __toString), but you get the idea.
As others pointed out, there are places where things can get hung up besides PHP (e.g., the web server or the client's browser). If you really want to ensure that information is displayed as it becomes available, you'll probably need some AJAX-based solution. You would have one PHP script that handles display and another that does calculations, and have the display script make AJAX requests to the other. jQuery has some pretty simple AJAX functions that might help you there.
You'd also want to have a fallback in case the browser doesn't support/has disabled JavaScript that would just be the standard page that may not display content until the end.
Enabling implicit flush, as blueyed said, should do the trick, since it flushes after every echo; however, some browsers also require no-cache headers to be set. Here is what I use. Your mileage may vary depending on the browser.
header('Cache-Control: no-cache, no-store, max-age=0, must-revalidate');
header('Expires: Mon, 26 Jul 1997 05:00:00 GMT'); // Date in the past
header('Pragma: no-cache');
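Putting the pieces together, a minimal self-contained sketch (the loop and sleep are just stand-ins for real work):

```php
<?php
// Send the no-cache headers before any output (no-ops under the CLI).
header('Cache-Control: no-cache, no-store, max-age=0, must-revalidate');
header('Pragma: no-cache');

// Flush automatically after every output call.
ob_implicit_flush(true);

// Emit one line of progress, flushing so it reaches the browser immediately.
function emit_step(int $i): string
{
    $line = "step $i<br />\n";
    echo $line;
    flush(); // push PHP's output buffer through to the web server
    return $line;
}

for ($i = 1; $i <= 3; $i++) {
    emit_step($i);
    usleep(100000); // stand-in for a slow computation (100 ms)
}
```

Whether each line actually appears incrementally still depends on the web server (e.g. Apache's buffering or gzip output compression) and the browser, as the other answers point out.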
You can call flush() in PHP, but there are several other places that the output may be held (e.g. on the webserver). If you are using output buffering you need to call ob_flush() also.
You may also find that some browsers will not render the page until the HTML is valid, which won't be the case until all the tags (like body and html) are closed.
Start your investigation here:
http://httpd.apache.org/docs/1.3/misc/FAQ-F.html#nph-scripts
flush() is part of the answer. At least until a year ago, using flush was unreliable in Safari, however. Depending on your scenario, I'd look into solutions involving javascript. Maybe the various implementation of progress bars have code/ideas you can recycle.
I'd suggest using AJAX.