What I'm trying to do is render either a partial or a fragment in Symfony from the cache (the easy part), but if the cache does not exist, I want Symfony to render nothing instead of recreating the cache.
My website pulls data from multiple other websites, which can slow page rendering down dramatically, so instead of loading the info from other websites on the initial page load, I plan on doing it once the initial page has finished loading and a user clicks the appropriate button, then caching the data for later. However, if the data is already cached (from a previous request), then I would rather dump the cached data right into the initial page load.
I tried to clarify it as much as possible, so hopefully it makes sense.
I think you could handle this with a filter and the getViewCacheManager().
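As a rough sketch only (this assumes symfony 1.x, where sfViewCacheManager exposes has() and get() keyed by an internal URI; the module, partial, and cache-key names below are invented placeholders, not a definitive implementation):

<?php
// Sketch: output a cached partial if it exists, otherwise render nothing.
// The '@sf_cache_partial' URI form and the module/action/key values are assumptions.
$cacheManager = sfContext::getInstance()->getViewCacheManager();
$uri = '@sf_cache_partial?module=remote&action=_feed&sf_cache_key=sidebar';

if ($cacheManager && $cacheManager->has($uri)) {
    // A cached copy exists from an earlier request: dump it into the page.
    echo $cacheManager->get($uri);
}
// Otherwise output nothing; the data gets fetched by JavaScript after the
// page loads, and that request populates the cache for next time.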
How can I get CodeIgniter page cache to take inputs into account? For example, say I have a pagination system. With cache enabled, if I go to page 1, then page 2 (same controller)... CI cache will return page 1's content.
function my_controller() {
    // Cache this controller's output for 1 minute.
    $this->output->cache(1);

    // The requested page number comes in via POST...
    $page = $this->input->post("page");
    $data = getData($page);

    // ...but the cached output is keyed only by the URI, not by POST data.
    $this->load->view('my_view', $data);
}
In the above example, you cannot page through the results correctly. It will keep serving the cached page even if the posted input changes.
How can I get the cache system to take $_POST data into account and treat those as different requests from a cache point of view?
Caching dynamic content gets messy when, as far as the caching class can tell, nothing about the request has changed...
The simplest solution is not to cache dynamic pages; it's not really what caching is for anyway. Caching is meant (mostly, at least) for static data.
Another way you can do it (depending on the result sets) is to use JavaScript and paginate client side, using Ajax calls to fetch the data from the server. The client would never actually leave the single view, and only the initial view would be cached.
How often does the content change? Are we talking about a list of orders that changes minute by minute, or a list of products that changes maybe monthly?
If the data is loaded by Ajax, just make sure what you are calling is not exposed to your cache (unless the content rarely changes). As it would be called from a different method, this is not difficult to ensure; on page load you would have it fetch the first xx results fresh every time, while still getting the benefits of caching for the rest of the view.
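A rough sketch of that split, using CodeIgniter-style syntax and the question's getData() helper (the class and method names here are placeholders, not a definitive implementation):

class Orders extends CI_Controller {

    // Initial page load: this output can be cached, since it only
    // contains the page shell and the first batch of results.
    public function index() {
        $this->output->cache(1);
        $this->load->view('my_view', array('data' => getData(1)));
    }

    // Ajax pagination endpoint: no output cache here, so every request
    // honours the posted page number.
    public function page_data() {
        $page = (int) $this->input->post('page');
        $this->output
             ->set_content_type('application/json')
             ->set_output(json_encode(getData($page)));
    }
}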
Here is what I found after investigating cache behavior in Magento.
I'm not sure about this and am asking for correction.
When something like a product is modified, cache entries such as "HTML Block" become "invalidated", which results in them being ignored and not used on the frontend. This makes sense, because that data is now outdated.
It remains "invalidated" until manually "refreshed" through the admin area.
Once manually "refreshed", the first render of a cacheable block constructs its cached copy and adds it to the HTML Block cache. Subsequent renders of the block find this cache usable and use it, until the cache becomes "invalidated" all over again.
Why is this process called "refresh", when it should be something like "reset"? "Refresh" would mean it generates an updated cache snapshot, but instead it merely allows cache entries to be constructed again.
Why doesn't invalidated data get refreshed as soon as it is invalidated?
This makes me question my conclusion: was I correct?
Why is this process called "refresh", when it should be something like "reset"?
Your general take on this is correct -- some people call it "refresh" because although the action you take resets the cache, in a working Magento system the cache will almost immediately rebuild itself the next time you (or another user) loads the page.
Why doesn't invalidated data get refreshed as soon as it is invalidated?
When the cache is invalidated, that means the developer working on whatever backend feature invalidated the cache was smart enough to know their actions required a cache refresh, but that programmatic cache control wasn't sufficient to refresh only their portion of the changed cache.
For example, certain blocks might render a change in a product's price, which means any blocks with the price cached need to be refreshed. However, as a backend programmer, there's no way to know which blocks need that invalidation, nor know which cache system (block cache, FPC, varnish) they're stored in. There's also a question of store performance -- if you're editing 100 products, do you want Magento to rebuild the cache 100 times during peak traffic hours? So, instead of deciding how to handle all that, the developer marks the cache as invalidated. This allows the cache system to take whatever action it deems necessary.
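As a rough illustration of that hand-off (Magento 1.x; where the first call lives, e.g. a product-save observer, is up to the backend code in question), flagging a cache type as invalidated and actually cleaning it are two separate calls:

<?php
// Backend code that knows it changed data shown in cached blocks just
// flags the "Blocks HTML output" cache type as invalidated:
Mage::app()->getCacheInstance()->invalidateType('block_html');

// The admin "Refresh" action (or a maintenance/cron script) is what later
// clears those entries so they can be rebuilt on the next page render:
Mage::app()->getCacheInstance()->cleanType('block_html');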
In a perfect theoretical cache system, there would be automated processes running that would detect an invalidated cache and know what to do and when to refresh it. That's a complex system to implement and maintain, so instead Magento chose to simply notify the store owner of the invalidated cache and let them take whatever action they deemed appropriate.
By default, Magento cache refreshing should happen when data important to the user is modified, e.g. order data, shipping info, etc.
This is the behavior I observed during my years writing extensions for the software. You can manually disable this behavior, but as it stands, by default, dynamic data should be punching "holes" through the cache.
I'm building a web app. The way I started off the app, for testing purposes, is to load lots of data into session arrays from my database so I can use the values easily throughout the pages. I have one page that has numerous selects on it, and each time the PHP page loops through all the variables, chooses the selected one, and outputs the dropdown. One of my arrays, though, has just under 3000 values, and loading this dropdown slows the page down from about 300ms to 1-1.2s. Not terrible, but easy to tell that it is less responsive. So I'd like to know if there is any way for me to improve the load speed, or any thoughts on a substitute for the dropdown.
What I have tried so far:
Session arrays hold all the values; when the page is loaded through the jQuery Ajax method, the PHP page loops through these values and echoes the dropdowns.
PHP include - create PHP or HTML pages with all the values pre-written as selects; this creates a ~100kb page for the problem dropdown, which is then pulled in with include. It takes roughly the same amount of time, plus I'd then have to use JavaScript to set the selected value, but I'd do this if it could be improved. I thought perhaps some caching could provide improvements here. There seemed to be no significant difference between HTML and PHP pages for include, but I'd assume HTML would be better. I'm also assuming that I cannot use regular browser caching, because I am using a PHP function to include these pages.
I have tried just loading the HTML page directly: it takes about 1 sec on first load, and after browser caching it is back down to 100-350ms, so I imagine caching could provide a huge boost in performance.
I have considered:
Creating a cached version of the whole page, but this would be quite a pain to implement, so I'd only do it if people thought it was the right way to go. I would have to use Ajax to retrieve some data for the inputs, which I was originally doing with PHP echoes.
Just removing the problem dropdown.
Just to clarify something I've never had clarified: am I correct in thinking PHP pages can never be cached by the browser, and so by extension any PHP-included files can't be either? But then how come a JavaScript file linked to from a PHP page can be cached, because it is requested like an ordinary static file?
The data being returned and parsed into a dropdown is probably your bottleneck. However, if the bottleneck is actually the PHP code, you could try installing an opcode cache like APC (http://php.net/manual/en/book.apc.php). It will speed up your PHP. (Zend Optimizer is also available at http://www.zend.com/en/products/guard/runtime-decoders.)
If your bottleneck is the database where the items in the dropdown are coming from, you may want to try setting MySQL to cache the results.
You may also want to try an alternative dropdown that uses AJAX to populate the dropdown as the user scrolls down, a few records at a time. You could also create it as a text field that prompts the user for possible matches as they type. These things may be faster.
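A minimal sketch of such an Ajax-populated endpoint, assuming PDO and a hypothetical options table with id and label columns (the table name, column names, connection details, and batch size are all placeholders):

<?php
// options.php - returns a small batch of matching options as JSON.
// Called via Ajax as the user types or scrolls, instead of rendering
// all ~3000 <option> elements into the initial page.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

$term   = isset($_GET['term'])   ? $_GET['term']         : '';
$offset = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;

$sql = sprintf(
    'SELECT id, label FROM options WHERE label LIKE ? ORDER BY label LIMIT 30 OFFSET %d',
    $offset
);
$stmt = $pdo->prepare($sql);
$stmt->execute(array($term . '%'));

header('Content-Type: application/json');
echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));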
I suspect the problem is the raw size of the data you're transmitting, based on the results of number 2 in "What I have tried so far." I don't think you can rely on browser caching, and server-side caching won't change the size of the data transmitted.
Here are a couple of ideas to reduce the amount of data transmitted during page load:
Load the select box separately, after the main page has been delivered, using an asynchronous JavaScript call.
Break the choice into a hierarchical series of choices. The user chooses the top-level category, then another select box is populated with matching sub-categories. When they choose a sub-category, the third box fills with the actual options in that sub-category. Something like this. Of course, this only works if those 2nd and 3rd controls are filled in using an async JavaScript call.
Either way, make sure gzip compression is enabled on your server.
Edit: More on browser caching
The browser caches individual files, and you typically don't ask it to cache PHP pages because they may be different next time. (Individual php includes are invisible to the browser, because PHP combines their contents into the HTML stream.) If you use a browser's developer console (hit f12 on Chrome and go to Network, for example), you can see that most pages cause multiple requests from the browser to the server, and you may even see that some of those files (js, css, images) are coming from the cache.
What the browser caches and for how long is controlled by various HTTP response headers, like Cache-Control and Expires. If you don't override these in php by calling the header function, they are controlled by the web server (Apache) configuration.
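For example, if a PHP response is known to be safe to cache for an hour, it could send something like this before any output (the one-hour lifetime is just an illustrative value):

<?php
// Tell browsers and intermediate caches this response is reusable for an hour.
header('Cache-Control: public, max-age=3600');
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + 3600) . ' GMT');

// ...then output the page or fragment as usual.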
As far as I understand it, when you turn on caching in Smarty, Smarty caches compiled templates. These compiled templates can then be used to speed up rendering of the page. Wouldn't it be a good idea to run your own level of caching on top of your Smarty application that goes like this?
if (a cache for this page exists) {
    // Don't run my application, don't include my files, don't instantiate my classes.
    // Send the cached version of this page to the user.
    // End the script here.
} else { // the cache for this page does not exist or is not current
    // Run my application as usual.
    // Save all the output to a file for next time.
}
Then whenever something happens on my site that would update the content of this page (e.g. the admin makes changes to the site content), delete the cache file. I feel like I must be missing something here. This method would allow me to store an all-HTML version of every page and send that when it is valid. It seems like this would drastically improve the speed of my site.
Edit: OK, so I have discovered that Smarty does in fact store an HTML version of my site. How do I prevent the rest of my application from running if the cache is current? Do I just include and instantiate Smarty first and do something like
if ($smarty->usingcache()) {
    exit; // skip the rest of my application
}
If your site were static, this would work. But in that case, you wouldn't need Smarty...
Suppose you update a record in your database. Then all pages in your site which contain output directly or indirectly affected by the update would have to be invalidated. How would you know which pages to invalidate?
How can you know what the page will look like until you do some request-specific processing? Until you've checked your client's authentication status and performed some database queries (or fetched cached results) to get recent information, you can't know if the most recently rendered page is the same as what would be rendered this time. Smarty solves this with this strategy:
Your app does all of its domain/business logic in response to the request.
Your app populates the smarty template instance with template variables
Smarty generates a hash of the template and the template variables
If the hash is not in the cache, Smarty renders the template and caches it
If the hash is in the cache, Smarty returns the cached template instead of rendering
If the cache is full, Smarty evicts an old cached page to make room.
By default, Smarty uses the filesystem for the cache, but the caching strategy is compatible with any key-value store. In fact, plugins exist for other stores, such as this one for memcached.
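In practice (Smarty 3 syntax; the template name, cache lifetime, and the load_products() call are placeholders), the "skip my application when the cache is current" idea from the question maps onto isCached():

<?php
require_once 'libs/Smarty.class.php'; // adjust the path to your install

$smarty = new Smarty();
$smarty->setCaching(Smarty::CACHING_LIFETIME_CURRENT);
$smarty->setCacheLifetime(3600);

// Only run the expensive application code when Smarty has no valid
// cached copy of this page.
if (!$smarty->isCached('product_list.tpl')) {
    $smarty->assign('products', load_products()); // hypothetical DB call
}

// Either renders and caches the template, or serves the cached output.
$smarty->display('product_list.tpl');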
I'm getting some data from a MySQL database and loading it into the page. None of it really needs to be retrieved at any point other than page load, so that advantage of Ajax is moot. Is there any other advantage to Ajax?
It has the advantage that, if you can defer the retrieval of that data, you can potentially:
Make the page load faster (since the content sent will be smaller).
Provide more up-to-date content.
Additionally, if there is a chance that data never needs to be retrieved at all, you can potentially:
Save bandwidth.
Lower the server load.
Finally, you need to use Ajax if you want to display content more recent than the initial page load without refreshing the page.
EDIT
If you insist on loading everything when the page is loaded, the only possible reason I can concoct is when what you're loading depends on some logic implemented in JavaScript.