I'm working on a Laravel 5.1 project that uses a lot of AJAX calls returning HTML blocks.
To optimize the speed of the website I want to implement private and public response caching. This works fine using the following code:
return response()
    ->json($result)
    ->header('Cache-Control', 'public, max-age=300');
Yet this approach won't take into account objects that are updated within those 300 seconds.
Is there a way to clear the response cache of a request if, and only if, the returned objects have been updated?
Maybe you can try server-side caching with something like the example below. Sorry, this is crude:
function sometest(User $user)
{
    /** ...conditions to check if some data has changed... **/

    // Note: in Laravel 5.1 the second argument to Cache::remember() is
    // minutes, not seconds.
    $jsonResponse = Cache::remember(Auth::id() . "_sometest", 300, function () use ($user) {
        return $user->all(); // get result here
    });

    return response()->json($jsonResponse);
}
You can read more about this in the Laravel Cache documentation.
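To get the "if and only if the objects have been updated" part, one option is to evict the entry from a model event so the next request rebuilds it. A rough sketch, assuming the cache key can be derived from the model (it matches the key scheme above only when the cached user is the authenticated one), registered for example in the User model's boot() method:

// Whenever a User row is saved (created or updated), drop the cached
// response for that user so the next request regenerates it.
User::saved(function ($user) {
    Cache::forget($user->id . '_sometest');
});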
You can also try:
config caching: php artisan config:cache
route caching: php artisan route:cache
and utilizing Memcached if you are able to.
As others said, the client browser needs to make a request to know that the data has been updated. Here are some solutions I would look into in your case:
Server-side cache (data still needs to be transferred over the network):
depending on your environment, I would set up an Nginx + FastCGI cache using a "stale and update" policy: the cache is always served (fast), and the cache is always refreshed in the background. So only a few requests (one, or more, depending on the time to refresh the cache) after a content update are served with outdated data. This cache is URL-based, so if your content is cookie/session-based it can become tricky.
as @ZachRobichaud said, you can use the Laravel cache and set a low cache retention time. Let's say 10s, which means the request will return outdated data for 10s at most after your content update. I'm not aware of a built-in "stale and update" mechanism in Laravel, but it can be done with queues, as sketched below.
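A minimal sketch of that queue-based idea, assuming a hypothetical RefreshUserData job (which you would write yourself) that rebuilds the data and overwrites the cache key:

public function sometest(User $user)
{
    $key = Auth::id() . '_sometest';

    // Serve whatever is cached, stale or not.
    $result = Cache::get($key);
    if ($result === null) {
        // Cold cache: build it synchronously once.
        $result = $user->all();
        Cache::forever($key, $result);
    }

    // Soft TTL: queue at most one refresh job per 5 minutes.
    if (!Cache::has($key . ':fresh')) {
        Cache::put($key . ':fresh', true, 5); // TTL in minutes on Laravel 5.1
        $this->dispatch(new RefreshUserData($key, $user)); // hypothetical job
    }

    return response()->json($result);
}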
Client-side cache (no data transfer needed):
as I said, the client needs to know the data has been updated in order to invalidate its cache.
Usually for assets, we do "cache busting" by adding GET parameters to the file URL, like `asset?version=1234`, with the version changing on each deployment. As the header cache is URL- (and header-) based, a URL change will force network loading of the files. Not tested with a text/HTML content-type response, but worth a try if you can update URLs with a parameter you can change in `.env`, for example. This can be done dynamically if you have CI/CD or things you can trigger on deploy. In this case, you can cache those "infinitely", as the "refresh" will be done by changing the URL parameter.
You can take a look at the stale-while-revalidate Cache-Control header value, which seems to work the same way: always serve the cache, and refresh the cache if it has expired (also look at the other parameters; they may give you ideas). Be careful about browser compatibility here (no IE or Safari). An example follows below.
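Reusing the response from the question, such a header could look like this (the values are illustrative):

return response()
    ->json($result)
    ->header('Cache-Control', 'public, max-age=300, stale-while-revalidate=60');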
The Laravel Cache may be the fastest to implement and test, to see if the results suit you. It also depends on the payload size: if it's huge, browser caching is indeed better. If bandwidth is not the issue, it's mostly about server response time; in that case, the Laravel Cache would do the trick.
Related
I have code similar to what's below (it's example code). Using Guzzle, I make multiple calls to the same site to see if a document exists. If the document exists, I save it. As I make each call, the memory usage goes up. Eventually, if the number of requests is high enough, I run out of memory. I used memory_get_peak_usage to track where the memory use was happening, and it's the Guzzle client.
The code works as expected, but I cannot find a way to tell the Guzzle Client to "reset and dump all previous requests". I'm pretty sure it's caching the results in memory, but as I've written them out to a file, I know I won't be needing said results. How do I dump them?
FWIW, my current solution is making a new client that duplicates the parameters of the original one and unsetting the old one periodically (sketched after the code below). It works, but it's ugly.
// Example code; $pdf_name is assumed to be set per iteration.
$client = new \Guzzle\Http\Client('some_url');

for ($i = 0; $i < 10000; $i++) {
    try {
        $pdf = $client->get($pdf_name)->send();
        $this->filesystem->put(
            $pdf_name,
            $pdf->getBody(true)
        );
    } catch (ClientErrorResponseException $ex) {
        // Document doesn't exist; skip it.
    }
}
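For reference, the workaround described above looks roughly like this (the 500-request interval is arbitrary):

$base_url = 'some_url';
$client = new \Guzzle\Http\Client($base_url);

for ($i = 0; $i < 10000; $i++) {
    // Throw away the old client periodically so whatever it has
    // accumulated internally can be garbage-collected.
    if ($i > 0 && $i % 500 === 0) {
        unset($client);
        $client = new \Guzzle\Http\Client($base_url);
    }

    try {
        $pdf = $client->get($pdf_name)->send();
        $this->filesystem->put($pdf_name, $pdf->getBody(true));
    } catch (ClientErrorResponseException $ex) {
        // Document doesn't exist; skip it.
    }
}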
Based on a cursory glance at the source code for the bundle, the Guzzle client is making use of Doctrine's filesystem cache. References:
Bundle/Resources/config/cache.xml
Bundle/DependencyInjection/MisdGuzzleExtension.php
Bundle/DependencyInjection/Configuration.php
The Bundle documentation also provides information on Caching. So, in theory, to remove/disable the cache, all you have to do is remove the reference to <argument type="service" id="misd_guzzle.cache.filesystem"/> from the addSubscriber section of your MyBundle/Resources/config/services.xml.
I have HHVM running on a VirtualBox VM, with the webroot mapped to my local Laravel install, served out at an internal IP.
Locally, I'm serving the site out under http://[localhost]:8000.
The codebase is identical.
code of MembersController.php (resourceful controller):
public function show($id)
{
    $member = Member::findOrFail($id);

    $data = array();
    $data['id'] = $member->id;
    $data['first_name'] = $member->first_name;
    $data['last_name'] = $member->last_name;

    return Response::json($data);
}
Assuming everything is working normally:
When I run a GET request to LOCALHOST: http://[localhost]:8000/api/v1/member/1, the client receives the JSON as normal - all good.
When I run a GET request to HHVM (same client, identical codebase): http://[vm_ip_address]/api/v1/member/1, the client receives no data.
The data is being passed back through the calls within HHVM though: if I change the 'return' to 'echo', the payload is returned in both cases (headers as well).
It looks like HHVM is interfering with Laravel's Response::json() method and preventing the reply contents from being displayed in the client.
Has anyone else seen this?
This is not something I can set up a unit test for, as it always passes, because the final reply always has data in it :/
Any input would be great - I'm interested to learn how to get around this.
Thanks.
Sadly, you're probably going to have to get your hands dirty debugging. HHVM probably has a very slight difference in how it does something that this code path is sensitive to. We pass 100% of Laravel's unit tests, but there probably isn't one covering this case.
If you can, please trace down the code to where the data changes. Put in echos and error_logs until you can build a very small test case, then open an issue on GitHub and we'll get it fixed.
I'm trying to disable the Twig cache in prod mode, or to force it to recompile my views.
I'm using the KnpLabs SnappyBundle to generate some PDFs (the same problem appears with DomPDF), and I have dynamic content to render.
When in dev mode, I can modify some text, or even some css properties, the changes are effective immediately.
But in prod mode, I need to cache:clear, or to rm -rf app/cache/prod/twig/* to see the changes.
I tried the following options in the Twig section of my config.yml (not at the same time):
cache: "/dev/null"
cache: false
auto-reload: ~
I also tried some header tweaks when generating and rendering my PDF:
$html = $this->renderView("xxxxPdfBundle:Pdf:test.html.twig", array("foo" => $bar));

return new Response(
    $this->get('knp_snappy.pdf')->getOutputFromHtml($html),
    200,
    array(
        'Cache-Control' => 'no-cache, must-revalidate, post-check=0, pre-check=0',
        'Content-Type' => 'application/pdf',
        'Content-Disposition' => 'attachment; filename='.$file
    )
);
I can't figure out how to force Twig to recompile, or how to bypass app/cache, given that the PDF content will obviously be dynamic in production.
Info update from the comments:
I noticed that even the dynamic template variables were not updated, so the same PDF got generated over and over again in production, but not in development.
After clearing all caches again, that issue is fixed: PDFs are now generated with dynamic content as designed.
Still, a question remains: what if, while my website is in production, I decide to change the CSS styling inside a PDF template? CSS is not a template variable, and I can't force people to empty their cache :/
The correct way to disable Twig's caching mechanism is to set the cache environment option to false instead of a cache directory:
# config_dev.yml
# ...
twig:
    cache: false
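Since the question is about prod mode, the same option can also go into the production configuration; note that disabling the compiled-template cache in production costs performance on every request:

# config_prod.yml (or config.yml, to apply to every environment)
twig:
    cache: false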
References:
Twig Environment Options
TwigBundle Configuration
The question of client-side caching has some answers.
First, HTTP employs some headers that tell the client how to do the caching. The most aggressive of them declares that the received resource should be considered cacheable for the next X amount of time without revalidating for updates. The less intrusive version adds a header with a signature of the delivered version, or a last-modified timestamp, and the client should revalidate every time whether the resource is still up to date before using it.
The first kind of caching can only be updated by deleting the client cache in the browser. The second could probably be circumvented by force-reloading the page (Ctrl-F5 or so), but this is just as hidden as the menu that allows clearing the cache.
To play it safe, the usual approach is to add a tag, revision number, incremented counter, or whatever is available, to the query string of the URL used for that resource:
http://example.com/generated/summary.pdf?v=1234
http://example.com/generated/summary.pdf?v=1235
The first URL is from deployment run 1234, the second is from 1235 - this number changes the URL enough to trigger a new request instead of getting the old version from the cache.
I don't know if there is something available in your system that can act like this. You could also always add an ever-changing value, like the current timestamp, to avoid caching entirely if you cannot disable the HTTP caching headers.
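A tiny sketch of the idea in PHP; APP_VERSION is an assumed constant (or environment value) that you would bump on each deployment:

// Append the current deployment version so browsers re-fetch the
// resource after every deploy instead of serving a cached copy.
define('APP_VERSION', '1235'); // assumed: set during deployment

function versioned_url($path)
{
    return $path . '?v=' . urlencode(APP_VERSION);
}

echo versioned_url('/generated/summary.pdf');
// -> /generated/summary.pdf?v=1235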
I need to know if I can improve the way I cache my API calls from inside my CodeIgniter app. The way I do it right now is like this, in an HMVC pattern:
Controller HOME == calls to => module app/application/modules/apis/controllers/c_$api == loads library => app/application/libraries/$api ==> the library returns the response to the module's controller_X, and that controller invokes the view with the data it has
//Note: My app does not use twitter api, but others
Inside the apis module is where all the APC caching happens, like so:
// Load up drivers
$this->load->library('driver');
$this->load->driver('cache', array('adapter' => 'apc'));

// Get tweets from cache
$tweets = $this->cache->get('my_tweets');
if ( ! $tweets)
{
    // No tweets in the cache, so get new ones.
    $url = 'http://api.twitter.com/1/statuses/user_timeline.json?screen_name=gaker&count=5';
    $tweets = json_decode(file_get_contents($url));
    $this->cache->save('my_tweets', $tweets, 300);
}

return $tweets;
as explained in this article: http://www.gregaker.net/2011/feb/12/codeigniter-reactors-caching-drivers/
So I was wondering:
Having 3 scenarios (home, query, result) in each of the apis module's controllers, do you think it would be a good idea to implement caching for each controller in all scenarios? Example:
//for each api1, api2 ... apiX, apply this:
//home
$this->cache->save('api_home', $api_home, 300);
//query
// Note: "$api_{$query}" from the original idea isn't valid PHP outside a
// string; cache whatever variable actually holds the query result.
$this->cache->save("api_$query", $queryResult, 300);
//result
$this->cache->save("api_$queryId", $queryIdResult, 300);
Even though I cached the API call, do you think I should also cache the result in the controller that is calling the api module controller, with the same 3 scenarios (home, query, and result)? Like so:
//modules/{home,fetch,article}/controllers/{home,fetch,article}.php
//home
$homeData['latest'][$api] = modules::run("apis/c_$api/data", array('action' => 'topRated'));
$this->cache->save('home_data', $homeData, 300);
//query
$searchResults[$api] = modules::run("apis/c_$api/data", $parameters);
$this->cache->save("search_results_$query", $searchResults, 300);
//article page
$result = modules::run("apis/c_$api/data", $parameters);
// Braces are needed here: "$api_article_$id" would parse $api_article_ as a
// single (undefined) variable name.
$this->cache->save("{$api}_article_{$id}", $result, 300);
So, what do you think? Is the practice mentioned above a good one, or just an awful, stupid one?
//Note: the suggested caching ideas were not tested.
IMHO it is a good idea to cache API results if you don't need real-time data. If you don't care that you won't see new data for an hour, then by all means cache it for an hour. So for your first question, you just need to ask yourself: "How fresh does the content need to be for my application?" and implement caching accordingly.
For the second question: I don't see a lot of value in caching content if it has only been manipulated in simple ways. At that point you're using up space in your cache and not getting a lot of value. But if database or other API calls are being made using that data, then yes, they should be cached using a technique similar to the one above.
If you're that worried about processor load (the only reason to cache content after manipulation), your best bet is to look at something like Varnish or CloudFront.
My Google-fu hasn't revealed what I'm looking for, so I'm putting this one out to the crowd.
Coming from an ASP.NET development background, I'm used to having the Application and Cache collections available for me to stash rarely-modified but often-used resources (such as lookup rows from a database or the contents of static XML documents) in the memory of the web server, so I don't have to reload these often-used items during every request.
Does PHP have an equivalent? I've read up briefly on the memcache extension, but this won't work for me (as I don't have control over the server configuration). I'm tempted to implement something that would allow me to pre-parse or pre-select the resources and generate a sort of PHP cache "file" that would construct the cached object from literals stored in the file, but this seems like a very hacky solution to me.
Is there something in PHP (or, alternatively, a helper library of some sort) that will allow me to accomplish this using best practices?
In short, no, such a thing is not available natively in PHP. To understand why, you have to understand that PHP builds its entire environment for each request, and tears it down at the end of the request. PHP does give you $_SESSION to store per-session variables, but after digging into the docs you will see that that variable is also rebuilt on each request. PHP (or mod_php, to be more specific) is fundamentally different from other "application servers". Basically, it is not an application server; it is a per-request script runner.
Now, don't get me wrong, PHP lets you do application-level data storage, but you will have to go to a database, or to disk, to get it. Remember this, though: don't worry about optimizing for performance until it is shown that performance is a problem. And I will guess that 99 times out of 100, by the time performance is an issue that isn't due to some poor code you wrote, you will have the resources to build your own pretty little memcached server.
Take a look at the Zend_Cache library, for example. It can cache to multiple backends.
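A minimal Zend_Cache (Zend Framework 1) sketch; the lifetime, cache directory, and loader function are illustrative:

require_once 'Zend/Cache.php';

$cache = Zend_Cache::factory(
    'Core',  // frontend
    'File',  // backend: plain files, no special server configuration needed
    array('lifetime' => 3600, 'automatic_serialization' => true),
    array('cache_dir' => '/tmp/app_cache/')
);

if (($lookups = $cache->load('lookup_rows')) === false) {
    $lookups = load_lookup_rows_from_db(); // assumed: your own data loader
    $cache->save($lookups, 'lookup_rows');
}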
This is a bit of a hack, but it works in PHP 7+.
Basically, you cache your data to a temp file and then use include to read the file, which is cached in memory by the PHP engine's in-memory file caching (OPcache).
function cache_set($key, $val) {
    $val = var_export($val, true);
    // HHVM fails at __set_state, so just use object cast for now
    $val = str_replace('stdClass::__set_state', '(object)', $val);
    // Write to temp file first to ensure atomicity
    $tmp = "/tmp/$key." . uniqid('', true) . '.tmp';
    file_put_contents($tmp, '<?php $val = ' . $val . ';', LOCK_EX);
    rename($tmp, "/tmp/$key");
}
And here’s how we “get” a value from the cache:
function cache_get($key) {
    // @ suppresses the warning on a cache miss (file not yet written)
    @include "/tmp/$key";
    return isset($val) ? $val : false;
}
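Usage then looks like this (the key and data are illustrative):

cache_set('user_123', array('name' => 'Ada', 'visits' => 42));
$user = cache_get('user_123'); // the array on a hit, false on a miss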
from https://medium.com/@dylanwenzlau/500x-faster-caching-than-redis-memcache-apc-in-php-hhvm-dcd26e8447ad