How can I get CodeIgniter page cache to take inputs into account? For example, say I have a pagination system. With cache enabled, if I go to page 1, then page 2 (same controller)... CI cache will return page 1's content.
function my_controller() {
    // Cache this page's output for 1 minute
    $this->output->cache(1);
    // The posted page number changes, but the cached output does not
    $page = $this->input->post("page");
    $data = getData($page);
    $this->load->view('my_view', $data);
}
In the above example, you cannot pan through pages correctly. It will keep loading the cached page even if the posted input changes.
How can I get the cache system to take $_POST data into account and treat those as different requests from a cache point of view?
Caching dynamic content gets messy when nothing about the request looks any different to the caching class...
The simplest solution is not to cache dynamic pages; that's not really what caching is for anyway. Caching is meant for (at least, mostly) static data.
Another way you can do it (depending on result sets) is to use JavaScript and paginate client-side, using Ajax calls to fetch the data from the server. The client never actually leaves the single view, and only the initial view would be cached.
How often does the content change? Are we talking about a list of orders that changes minute by minute, or a list of products that changes maybe monthly?
If loaded by Ajax, just make sure the method you are calling is not exposed to your cache (unless the content rarely changes). As it would be called from a different method, this is not difficult to ensure. On page load you would fetch the first xx number of results fresh every time, while still getting the benefits of caching for the rest of the view.
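If you do want a cache that respects inputs, another option is to key the cache entries yourself. CI's built-in page cache keys only on the URI, so a hand-rolled key has to fold the POST data in. A minimal sketch (`cache_key()` is a hypothetical helper, not part of CodeIgniter):

```php
// Sketch: derive a cache key that varies with the request inputs,
// so page 1 and page 2 are cached as distinct entries.
// cache_key() is a hypothetical helper, not a CodeIgniter API.
function cache_key(string $uri, array $inputs): string
{
    ksort($inputs); // normalise order so ['a'=>1,'b'=>2] and ['b'=>2,'a'=>1] match
    return md5($uri . '|' . serialize($inputs));
}

// Same URI + same inputs => same key; a different page => a different key.
$key1 = cache_key('/my_controller', ['page' => 1]);
$key2 = cache_key('/my_controller', ['page' => 2]);
```

You would then read and write your own cache files (or a cache driver) under that key, rather than calling `$this->output->cache()`.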
Related
I use a simple file_get_contents call to grab data from another site and place it on mine.
<?php
$mic1link = "https://example.com/yyy.html";
$mic2link = "https://example.com/zzz.html";
$mic3link...
$mic4link...
$mic5link...
$mic6link...
?>
<?php
$content = file_get_contents($mic1link);
preg_match('#<span id="our_price_displays" class="price" itemprop="price" content=".*">(.*)</span>#Uis', $content, $mic1);
$mic1 = $mic1[1];
?>
<?php
$content = file_get_contents($mic2link);
preg_match('#<span id="our_price_displays" class="price" itemprop="price" content=".*">(.*)</span>#Uis', $content, $mic2);
$mic2 = $mic2[1];
?>
And fired up by
<?php echo "$mic1";?> and <?php echo "$mic2";?>
It works, but it hurts performance (the remote requests add noticeable delay).
Is there any way to optimize this script or maybe another way to achieve this?
As others have said, the first step is to use the Guzzle library for this instead of file_get_contents(). This will help, although ultimately you will always be constrained by the performance of the remote sites.
If at all possible, try to reduce the number of HTTP requests you have to make: can the remote site aggregate the data from all the requests into a single one? Or are you able to obtain the data via other means (e.g. direct requests to a remote database)? The answers here will depend on what the data is and where you're getting it from, but look for ways to achieve this, as those requests are going to be a bottleneck to your system no matter what.
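If the requests can't be combined, you can at least issue them concurrently rather than one after another, so the total wait is roughly the slowest request instead of the sum of all of them. A sketch using PHP's curl_multi interface (the function name is an assumption, not an existing API):

```php
// Sketch: fetch several URLs concurrently with curl_multi instead of
// sequential file_get_contents() calls.
function fetch_all(array $urls): array
{
    $mh = curl_multi_init();
    $handles = [];
    foreach ($urls as $key => $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_multi_add_handle($mh, $ch);
        $handles[$key] = $ch;
    }

    // Drive all transfers until every handle has finished.
    do {
        $status = curl_multi_exec($mh, $active);
        if ($active) {
            curl_multi_select($mh); // wait for activity instead of busy-looping
        }
    } while ($active && $status === CURLM_OK);

    $results = [];
    foreach ($handles as $key => $ch) {
        $results[$key] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $results;
}
```

With six $micNlink URLs this turns six sequential round-trips into one concurrent batch; the preg_match calls then run over the returned array.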
If the resources are static (i.e. they don't change from one request to another), then you should cache them locally and read the local copy rather than the remote content on every page load.
Caching can be done either the first time the page loads (in which case that first page load will still have the performance hit, but subsequent loads won't), or done by a separate background task (in which case your page needs to take account of the possibility of the content not being available in the cache if the page is loaded before the task runs). Either way, once the cache is populated, your page loads will be much faster.
If the resources are dynamic then you could still cache them as above, but you'll need to expire the cache more often, depending on how often the data are updated.
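As a concrete sketch of that local cache with expiry (the function name, TTL, and file layout are all assumptions; $fetch stands in for whatever HTTP client you use):

```php
// Sketch: a TTL-based local file cache for remote content.
// $fetch is whatever actually performs the HTTP request
// (Guzzle, file_get_contents, ...).
function cached_fetch(string $url, string $cacheDir, int $ttl, callable $fetch): string
{
    $file = $cacheDir . '/' . md5($url) . '.cache';
    if (is_file($file) && (time() - filemtime($file)) < $ttl) {
        return file_get_contents($file); // fresh enough: serve the local copy
    }
    $content = $fetch($url);             // stale or missing: hit the remote site
    file_put_contents($file, $content);
    return $content;
}
```

For static resources you can set a very long TTL; for dynamic ones, shorten it to match how often the data actually changes.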
Finally, if the resources are specific to the individual page load (eg time-based data, or session- or user-specific) then you'll need to use different tactics to avoid the performance hit. Caching still has its place, but won't be anything like as useful in this scenario.
In this scenario, your best approach is to limit the amount of data loaded in a single page load. You can do this in a number of ways. Maybe give the user a tabbed interface, where they have to click between tabs to see each bit of data. Each tab would be a separate page load, so you'd be splitting the performance hit between multiple pages and making it less noticeable to the user, especially if you've used caching to make it seamless when they flip back to a tab they previously loaded. Alternatively, if it all needs to be on the same page, you could use Ajax techniques to populate the different bits of data directly into the page. You might even be able to call the remote resources directly from the JavaScript in the browser rather than loading them in your back-end PHP code. This would remove the dog-leg of the data having to go via your server to get to the end user.
Lots to think about there. You'll probably want to mix and match bits of this with other ideas. I hope I've given you some useful tips though.
I'm building a web app. For testing purposes I started off by loading lots of data into session arrays from my database so I can use the values easily throughout the pages. One page has numerous selects on it, and each time, the PHP page loops through all the variables, chooses the selected one, and outputs the dropdown. One of my arrays, though, has just under 3000 values, and loading this dropdown slows the page down from about 300ms to 1-1.2s. Not terrible, but it's easy to tell the page is less responsive. So I'd like to know if there is any way to improve the load speed, or any thoughts on a substitute for the dropdown.
What I have tried so far:
Session arrays hold all the values; when the page is loaded through jQuery's ajax method, the PHP page loops through these values and echoes the dropdowns.
PHP include - creating PHP or HTML pages with all the values pre-written as selects. This creates a ~100kb page for the problem dropdown, which is then pulled in with include. It takes roughly the same amount of time, plus I'd then have to use JavaScript to set the selected value, but I'd do this if it could be improved. I thought perhaps some caching could provide improvements here. There seemed to be no significant difference between HTML and PHP pages for include, but I'd assume HTML would be better. I'm also assuming that I cannot rely on regular browser caching because I am using a PHP function to include these pages.
I have tried just loading the HTML page directly: it takes about 1 sec on first load, and after browser caching it is back down to 100-350ms, so I imagine caching could provide a huge boost in performance.
I have considered:
Creating a cached version of the whole page, but this would be quite a pain to implement, so I'd only do it if people thought it was the right way to go. I would have to use ajax to retrieve some data for the inputs, which I was originally doing with php echos.
Just removing the problem dropdown.
Just to clarify something I've never had clarified: am I correct in thinking PHP pages can never be cached by the browser, and so by extension any PHP-included files can't be either? But then how come a JavaScript file linked to in a PHP page can be cached - is it because it is fetched by a separate plain HTTP request?
The data being returned and parsed into a dropdown is probably your bottleneck. However, if the bottleneck is actually the PHP code, you could try installing an opcode cache like APC at http://php.net/manual/en/book.apc.php. It will speed up your PHP. (Zend Optimizer is also available at: http://www.zend.com/en/products/guard/runtime-decoders)
If your bottleneck is the database the dropdown items are coming from, you may want to try setting MySQL to cache the results.
You may also want to try an alternative dropdown that uses AJAX to populate the dropdown as the user scrolls down, a few records at a time. You could also create it as a text field that prompts the user for possible matches as they type. These things may be faster.
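A sketch of the server side of that type-ahead idea: instead of shipping all ~3000 options at once, filter them on the server and return only the first few matches. The function name and limit are assumptions for illustration:

```php
// Sketch: return only the options matching what the user has typed,
// capped at $limit, instead of echoing all ~3000 values into the page.
function match_options(array $options, string $query, int $limit = 10): array
{
    $matches = [];
    foreach ($options as $option) {
        // Case-insensitive substring match; empty query matches everything
        if ($query === '' || stripos($option, $query) !== false) {
            $matches[] = $option;
            if (count($matches) >= $limit) {
                break; // never send more than $limit rows over the wire
            }
        }
    }
    return $matches;
}

// In the ajax endpoint you would then do something like:
// echo json_encode(match_options($allValues, $_GET['q'] ?? ''));
```

The payload per request drops from ~100kb to a few hundred bytes, which attacks the transfer-size bottleneck directly.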
I suspect the problem is the raw size of the data you're transmitting, based on the results of number 2 in "What I have tried so far." I don't think you can rely on browser caching, and server-side caching won't change the size of the data transmitted.
Here are a couple of ideas to reduce the amount of data transmitted during page load:
1. Load the select box separately, after the main page has been delivered, using an asynchronous JavaScript call.
2. Break the choice into a hierarchical series of choices. The user chooses the top-level category, then another select box is populated with matching sub-categories. When they choose a sub-category, the third box fills with the actual options in that sub-category. Something like this.
Of course, this only works if those 2nd and 3rd controls are filled in using an async JavaScript call.
Either way, make sure gzip compression is enabled on your server.
Edit: More on browser caching
The browser caches individual files, and you typically don't ask it to cache PHP pages because they may be different next time. (Individual php includes are invisible to the browser, because PHP combines their contents into the HTML stream.) If you use a browser's developer console (hit f12 on Chrome and go to Network, for example), you can see that most pages cause multiple requests from the browser to the server, and you may even see that some of those files (js, css, images) are coming from the cache.
What the browser caches and for how long is controlled by various HTTP response headers, like Cache-Control and Expires. If you don't override these in php by calling the header function, they are controlled by the web server (Apache) configuration.
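If you do want to override them from PHP, a small sketch (the one-hour TTL and helper name are assumptions; in a real response you'd emit each string with header() before any output is sent):

```php
// Sketch: build the HTTP headers that opt a response into browser
// caching for a given number of seconds.
function cache_headers(int $ttl, ?int $now = null): array
{
    $now = $now ?? time();
    return [
        'Cache-Control: public, max-age=' . $ttl,
        'Expires: ' . gmdate('D, d M Y H:i:s', $now + $ttl) . ' GMT',
    ];
}

foreach (cache_headers(3600) as $h) {
    // header($h); // uncomment inside a real request, before any output
}
```

This is how a PHP page (or a PHP-generated fragment) can become browser-cacheable despite being dynamic on the server.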
What I'm trying to do is render either a partial or a fragment in Symfony from the cache (the easy part) but if the cache does not exist, then I want Symfony to (instead of recreating the cache) render nothing.
My website pulls data from multiple other websites, which can insanely slow down page rendering speed, so instead of loading the info from other websites on the initial page load, I plan on doing it once the initial page is finished loading and a user clicks the appropriate button, then caching the data for later. However, if the data is cached (from a previous request) then I would rather dump the cached data right into the initial page load.
I tried to clarify it as much as possible, so hopefully it makes sense.
I think you could handle this with a filter and getViewCacheManager().
So I have a PHP CodeIgniter webapp and am trying to decide whether to incorporate caching.
Please bear with me on this one, since I'll happily admit I don't fully understand caching!
So the first user loads up a page of user submitted-content. It takes 0.8 seconds (processing) to load it 'slow'. The next user then loads up that same page, it takes 0.1 seconds to load it 'fast' from cache.
The third user loads it up, also taking 0.1 seconds execution time. This user decides to comment on the page.
The fourth user loads it up 2 minutes later but doesn't see the third user's comment, because there are still another 50 minutes left before the cache expires.
What do you do in this situation? Is it worth incorporating caching on pages like this?
The reason I'd like to use caching is because I ran some tests. Without caching, my page took an average of 0.7864 seconds execution time. With caching, it took an average of 0.0138 seconds. That's an improvement of 5599%!
I understand it's still only a matter of milliseconds, but even so...
Jack
You want a better cache.
Typically, you should never reach your cache's timeout. Instead, some user-driven action will invalidate the cache.
So if you have a scenario like this:
Joe loads the page for the first time (ever). There is no cache, so it takes a while, but the result is cached along the way.
Mary loads the page, and it loads quickly, from the cache.
Mary adds a comment. The comment is added to the database (or whatever), and the software invalidates the cache.
Pete comes along and loads the page. The cache is invalid, so it takes a second to render the page, and the result is cached (as a valid cache entry).
Sam comes along; the page loads fast.
Jenny comes along; the page loads fast.
I'm not a CodeIgniter guy, so I'm not sure what that framework will do for you, but the above is generally what should happen. Your application should have enough smarts built-in to invalidate cache entries when data gets written that requires cache invalidation.
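That invalidate-on-write flow can be sketched with any cache backend. Here it is with a trivial in-memory store; the class and method names are illustrative, not CodeIgniter APIs:

```php
// Sketch: cache a rendered page and invalidate it when underlying
// data changes, instead of waiting for a timeout to expire.
class PageCache
{
    private array $store = [];

    public function get(string $key, callable $render): string
    {
        if (!isset($this->store[$key])) {
            $this->store[$key] = $render(); // slow path: render once, keep it
        }
        return $this->store[$key];
    }

    public function invalidate(string $key): void
    {
        unset($this->store[$key]); // e.g. called right after a comment is saved
    }
}
```

In the scenario above: when Mary's comment is written, the save path calls invalidate() for that page's key, so Pete's request re-renders once and everyone after him gets the fresh cached copy.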
Try CI's query caching instead. The page is still rendered every time, but the DB results are cached... and they can be deleted using native CI functionality (i.e. no third-party libraries).
Since CI offers only page-level caching, without invalidation, I used to handle this issue somewhat differently: the simplest way was to load all the heavy content from the cache, while the comments were loaded via non-cacheable Ajax calls.
Or you might look into custom plugins which solve this, like the one you pointed out earlier.
It all comes down to the granularity at which you want to control the cache. For simple things like blogs, loading comments via external ajax calls (on demand - as in user explicitly requests the comments) is the best approach.
Getting some data from a MySQL database and loading it into the page. None of the stuff really needs to be retrieved at any point other than page load, so that advantage of Ajax is moot. Is there any other advantage to Ajax?
It has the advantage that, if you can defer the retrieval of that data, you can potentially:
Make the page load faster (since the content sent will be smaller).
Provide more up-to-date content.
Additionally, if that data may not be retrieved, you can potentially:
Save bandwidth.
Lower the server load.
Finally, you need to use Ajax if you want to display content more recent than when the page was loaded without refreshing it.
EDIT
If you insist on loading everything when the page is loaded, the only possible reason for Ajax I can concoct is when what you're loading depends on some logic implemented in JavaScript.