I read this link:
http://codeigniter.com/user_guide/general/caching.html
It says:
When a page is loaded for the first time, the cache file will be written to your system/cache folder
and that we can cache a view with $this->output->cache(60);. But how does it actually work? What if my users regularly keep updating and deleting records, so the view changes very often? Will it show the modified data, or will the cache bring back the old, stale data (from before the inserts and updates)? And if it automatically brings fresh data from the database, then what is the purpose of specifying the minutes in the cache function?
Thanks in advance :)
The way CodeIgniter's caching generally works is this:
A page request is made. CodeIgniter (before much of the framework has even been loaded) hashes the current URL, and if it finds a file with that name in the cache directory, it serves that file.
The only way you can get fresh data is to manually delete the files. When CodeIgniter doesn't find the file matching the hash it generated, it dynamically creates the page.
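As a rough sketch (assuming CodeIgniter 2.x, where the cached file name is an md5 of the full URL; the helper name here is made up), manually invalidating a cached page could look something like this:
// Hypothetical helper: delete the cached copy of one page so the next
// request regenerates it. Assumes CI 2.x, which names cache files
// md5(base_url . index_page . uri_string) inside the cache folder.
function delete_page_cache($uri_string)
{
    $CI =& get_instance();

    $cache_path = $CI->config->item('cache_path');
    if ($cache_path === '') {
        $cache_path = APPPATH . 'cache/';   // CI's default location
    }

    $uri = $CI->config->item('base_url')
         . $CI->config->item('index_page')
         . $uri_string;

    $file = $cache_path . md5($uri);

    if (file_exists($file)) {
        @unlink($file);                     // drop the stale full-page cache
    }
}
You would call something like delete_page_cache('blog/comments') right after saving a change that affects that page.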
CodeIgniter's implementation is called "full page" caching and, as such, is limited in its usefulness. There's a partial caching library from Phil Sturgeon that I've looked into here: http://philsturgeon.co.uk/code/codeigniter-cache
Honestly, for most projects, full page caching really isn't all that useful. In fact, for the projects where I do need full page caching, I don't even leave it up to CodeIgniter (I leave it to the web server: it's way faster).
I'd guess what you're looking for is a partial caching method; most people would prefer this. Look into APC if you're using a single server or Memcached if you have multiple servers.
Good Luck.
But how does it actually work?
If a cached version exists that is younger than the cache time, that cached version will be outputted.
Will it show the modified data?
Eventually, yes, but with a lag of up to $cache_time.
What if my users regularly keep updating and deleting records, so the view changes very often?
Reduce the cache time, or don't use caching at all.
Related
I'm a little confused here.
I know that for every PHP request, the entire application is bootstrapped all over again.
Given this, how can a cache be effective, if all of the globals are reloaded for each and every request?
For example:
A user calls the URI /user/view/123. User 123 is loaded from the database and stored in $user.
Why would you cache the contents of $user - when you merely need to refer to the variable in order to get the contents?
Am I missing the point?
Thank you,
It's more like caching images and common database queries.
For instance, say your site has a lot of articles and each article has categories. If you don't change categories very often, then using a cached result of a query on the categories table is preferable to running the query every time. This is a simplified example.
Another example is with images: if your site needs thumbnail versions of photos that users have uploaded, instead of having PHP use the GD library to rescale the image on every request, just save the thumbnail once and serve that instead of running through the GD code again.
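As a rough illustration of the first point (a hand-rolled sketch, not any framework's API; the table name, cache file location and connection details are placeholders), caching the categories query in a file might look like this:
<?php
// Minimal sketch: serve the categories list from a file cache for an
// hour, and only hit the database when the cache is missing or stale.
function get_categories(mysqli $db)
{
    $cache_file = sys_get_temp_dir() . '/categories.cache';
    $ttl        = 3600; // one hour

    if (is_file($cache_file) && (time() - filemtime($cache_file)) < $ttl) {
        return unserialize(file_get_contents($cache_file)); // cached copy
    }

    $rows   = array();
    $result = $db->query('SELECT id, name FROM categories');
    while ($row = $result->fetch_assoc()) {
        $rows[] = $row;
    }

    file_put_contents($cache_file, serialize($rows)); // refresh the cache
    return $rows;
}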
As always, an image is worth a thousand words, so here it is: [flow chart of a CodeIgniter request, showing where the cache check happens during the bootstrap] (source)
As you can see, you reload some PHP libraries (the basic environment: Globals, Requests, Cookies, etc.), but not everything (in this case: Security, Application, various libraries, Views).
You skip what can be cached ;)
I have a home page, index.php, that shows a list of 10 items/products.
Now, I am using the same page as landing page for inward traffic from facebook.
The URL looks like index.php?productID=Q231. This page displays only the product with the specified ID.
I am aware of PHP output caching, but I am new to it. I have learned that if I cache index.php, it will serve the same cached file to all the inward traffic from Facebook.
Is my understanding correct? I have searched a lot about this but I am not clear how one would go about caching in this case.
Is there a way to skip or bypass the server-side cache file/caching if there's a query string in the URL?
I would greatly appreciate if anyone could give me some pointers.
It really depends on your caching model and how you handle this in your code.
If you are creating the whole thing using output buffering, you may want to use a method such as the following (a sketch is shown after the list):
Generate a cache key based on the requested script and/or request parameters, i.e. using productID in your case
Check to see if you have saved the output for a given key to some persistent store
If yes, output
If not, start an output buffer, generate the contents, save them to a persistent store under the generated cache key, and output them
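A minimal sketch of that flow with a file-based store (the cache directory, lifetime and render_page() function are placeholders):
<?php
// Sketch of output-buffer caching keyed on the requested script and its
// query parameters. render_page() and the cache directory are made up.
$cacheDir = __DIR__ . '/cache/';
$ttl      = 600; // 10 minutes

// 1. Build a cache key, so index.php and index.php?productID=Q231
//    end up in separate cache files.
$key  = md5($_SERVER['SCRIPT_NAME'] . '?' . http_build_query($_GET));
$file = $cacheDir . $key . '.html';

// 2. If a fresh copy exists, send it and stop.
if (is_file($file) && (time() - filemtime($file)) < $ttl) {
    readfile($file);
    exit;
}

// 3. Otherwise buffer the output, store it, then send it.
ob_start();
render_page($_GET);                 // hypothetical function that echoes the page
$html = ob_get_clean();
file_put_contents($file, $html);
echo $html;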
Googling brings up this rudimentary example:
http://www.devshed.com/c/a/PHP/Output-Caching-with-PHP/
If I'm not mistaken, caching is not based on the referrer. Internal PHP caching only optimizes code; it will not cache output the way 'external' caching systems do (like the built-in caching in Smarty, for example). I think you'd only need to 'disable' caching for browsers, which means sending the proper headers with header(...).
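For example, the usual "do not cache this" headers look something like this (they must be sent before any output):
<?php
// Ask browsers and intermediate caches not to store this response.
header('Cache-Control: no-store, no-cache, must-revalidate, max-age=0');
header('Pragma: no-cache');                        // HTTP/1.0 fallback
header('Expires: Thu, 01 Jan 1970 00:00:00 GMT');  // a date in the past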
You are using output caching, so you should clear the cache when your Facebook or Twitter links are called. In CodeIgniter (a PHP framework), I have done this by clearing the cache.
In core PHP, I don't know how to clear the cache, but there must be some way to do it, so look into that.
Here are some links that may be useful to you:
How to clear browser cache with php?
http://php.net/manual/en/function.clearstatcache.php
http://www.dreamincode.net/forums/topic/6519-clear-cache/
I'm using Symfony 2 to generate my pages from data in a MySQL database. For most content, users have to be authenticated, but the content itself does not change often and does not need to be customized for the users. So what's a good caching strategy for avoiding database calls while still maintaining the auth check?
Simply put, use Memcache to cache the SQL result set for an extended period of time.
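A minimal sketch of that idea with the Memcached extension (server address, key name, credentials and query are placeholders):
<?php
// Sketch: keep a result set in Memcached for an hour; only query MySQL
// on a cache miss. Connection details and the query are examples.
$memcached = new Memcached();
$memcached->addServer('127.0.0.1', 11211);

$key  = 'pages:static_content';
$rows = $memcached->get($key);

if ($rows === false) {                        // cache miss
    $pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
    $rows = $pdo->query('SELECT id, title, body FROM pages')
                ->fetchAll(PDO::FETCH_ASSOC);
    $memcached->set($key, $rows, 3600);       // keep for one hour
}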
Maybe this is too big a change, but the following scheme may be useful in this case:
Create several sets of pages: one for not-yet-authenticated users (let's put it in the site root), and others for authenticated users who should see the same content (if two or more users should see the same content once authenticated, create only one set for all of them), placed in directories under the root. Then set up simple .htaccess/.htpasswd files for each such 'for-authed-only' directory, and authentication becomes the web server's problem, not your script's.
Hope you get the idea. It is fuzzy to describe, but easy to implement.
Example: say you want only authenticated users to see the page '/topsecret.html' on the site. Create a directory (/authed), set up HTTP auth on it, and put your topsecret.html into that directory (so it becomes '/authed/topsecret.html'). Now edit '/topsecret.html' and simply replace its main content with a 'sorry, please authenticate yourself' link that points to '/authed/topsecret.html'.
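A minimal sketch of the .htaccess for such a directory (the realm name and password-file path are placeholders; the .htpasswd file itself is created separately, e.g. with the htpasswd utility):
# /authed/.htaccess - require HTTP Basic auth for everything in this directory.
# The AuthUserFile path is a placeholder; keep it outside the web root.
AuthType Basic
AuthName "Members only"
AuthUserFile /var/www/.htpasswd
Require valid-user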
If you use Symfony2, you are using Doctrine2, and if you use Doctrine2, caching should be enabled by default.
Choose the cache driver that suits your purposes and there should be no problem.
You might also be specifically interested in query result caching.
Do not use Doctrine without a metadata and query cache! Doctrine is
highly optimized for working with caches. The main parts in Doctrine
that are optimized for caching are the metadata mapping information
with the metadata cache and the DQL to SQL conversions with the query
cache. These 2 caches require only an absolute minimum of memory yet
they heavily improve the runtime performance of Doctrine. The
recommended cache driver to use with Doctrine is APC. APC provides you
with an opcode-cache (which is highly recommended anyway) and a very
fast in-memory cache storage that you can use for the metadata and
query caches.
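For the query result caching mentioned above, a rough sketch with Doctrine 2's DQL API (assuming $em is your entity manager and a result cache driver such as APC is configured, e.g. via result_cache_driver under doctrine.orm in config.yml; the entity name, cache id and lifetime are examples):
<?php
// Sketch: cache a DQL result set for an hour so repeated page loads
// skip the database entirely. Entity name and cache id are examples.
$query = $em->createQuery('SELECT p FROM Acme\PageBundle\Entity\Page p');
$query->useResultCache(true, 3600, 'static_pages'); // enable, lifetime, cache id
$pages = $query->getResult();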
I solved this by using Zend_Cache inside the cacheable actions to store the rendered template result. I then create a new Response object from the cached content. If the cache is empty, I generate the content.
I thought of creating a plugin that checks for an annotation and stores the Response output automatically but it turned out that I only have 3-4 display actions that are cacheable and have very complex cache ID creation rules, so I put the caching logic directly into the controller code.
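A rough sketch of that pattern in a Symfony 2 controller (the bundle, template and cache ID rule are illustrative, and $this->cache is assumed to be a configured Zend_Cache_Core frontend):
<?php
// Sketch: return a cached render if one exists, otherwise render the
// template, store the HTML, and return it. Names are illustrative.
use Symfony\Bundle\FrameworkBundle\Controller\Controller;
use Symfony\Component\HttpFoundation\Response;

class PageController extends Controller
{
    private $cache; // Zend_Cache_Core frontend, set up elsewhere

    public function showAction($id)
    {
        $cacheId = 'page_show_' . (int) $id;       // example cache ID rule

        $html = $this->cache->load($cacheId);
        if ($html === false) {                     // miss: render and store
            $html = $this->renderView(
                'AcmeDemoBundle:Page:show.html.twig',
                array('id' => $id)
            );
            $this->cache->save($html, $cacheId, array(), 3600);
        }

        return new Response($html);
    }
}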
It appears that you have a lot of options for caching with symfony: http://www.symfony-project.org/book/1_2/12-Caching (that's the symfony 1.2 book, not 2, but my guess is not a lot has changed).
You could put your heavy SQL statements in their own action and turn caching on for that action in cache.yml:
list:
  enabled:     on
  with_layout: false   # Default value
  lifetime:    86400   # Default value
Further, if you are sure that the generated page won't change for a while, you could use symfony to tell the user's browser not even to bother your server for the content, which will cause the page to load nearly instantaneously for the user.
$this->getResponse()->addCacheControlHttpHeader('max_age=1200'); // in seconds; must be less than one year
Just make sure your max age is small enough that when something changes (say, a code update) the user doesn't get stuck with an old page, since there is no way to force them to request the page again short of changing the URL.
Is it a good idea to use file caching on article/news-style PHP websites with 10-15k records to solve PHP performance problems?
Is it better to use something like "cache_lite" than to fetch an article or news item from the database with a "SELECT" query?
What about member profile pages? (~200k)
Well, you should definitely use caching. It helps speed things up when you have many/large queries to execute.
You can cache part of an article or whatever you want. You can even cache whole pages if you want.
Check out memcaching: http://php.net/memcache
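A quick sketch with that extension, caching a single article row by ID (host, credentials and table are placeholders):
<?php
// Sketch: cache one article row in memcache so repeat views skip the SELECT.
$memcache = new Memcache();
$memcache->connect('127.0.0.1', 11211);

$id      = (int) $_GET['id'];
$key     = 'article_' . $id;
$article = $memcache->get($key);

if ($article === false) {                        // not cached yet
    $pdo  = new PDO('mysql:host=localhost;dbname=news', 'user', 'pass');
    $stmt = $pdo->prepare('SELECT * FROM articles WHERE id = ?');
    $stmt->execute(array($id));
    $article = $stmt->fetch(PDO::FETCH_ASSOC);

    $memcache->set($key, $article, 0, 3600);     // flag 0, expire in an hour
}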
Hope this helps.
I personally do this with content that is usually static (which profile pages are), only refreshing the cached page when the info is updated. I've seen pages load up to 200 times faster with a custom-tailored caching system, so it is a very good idea.
Yes, it will help.
I would suggest you go with a memory cache instead of a file cache.
Use Memcache.
So I have a PHP CodeIgniter webapp and am trying to decide whether to incorporate caching.
Please bear with me on this one, since I'll happily admit I don't fully understand caching!
So the first user loads a page of user-submitted content. It takes 0.8 seconds of processing to load it: 'slow'. The next user then loads that same page; it takes 0.1 seconds to load it from the cache: 'fast'.
The third user loads it up, also taking 0.1 seconds execution time. This user decides to comment on the page.
The fourth user loads it up 2 minutes later but doesn't see the third user's comment, because there's still another 50 minutes left before the cache expires.
What do you do in this situation? Is it worth incorporating caching on pages like this?
The reason I'd like to use caching is because I ran some tests. Without caching, my page took an average of 0.7864 seconds execution time. With caching, it took an average of 0.0138 seconds. That's an improvement of 5599%!
I understand it's still only a matter of milliseconds, but even so...
Jack
You want a better cache.
Typically, you should never reach your cache's timeout. Instead, some user-driven action will invalidate the cache.
So if you have a scenario like this:
Joe loads the page for the first time (ever). There is no cache, so it takes a while, but the result is cached along the way.
Mary loads the page, and it loads quickly, from the cache.
Mary adds a comment. The comment is added to the database (or whatever), and the software invalidates the cache.
Pete comes along and loads the page; the cache is invalid, so it takes a second to render the page, and the result is cached (as a valid cache entry).
Sam comes along, and the page loads fast.
Jenny comes along, page loads fast.
I'm not a CodeIgniter guy, so I'm not sure what that framework will do for you, but the above is generally what should happen. Your application should have enough smarts built-in to invalidate cache entries when data gets written that requires cache invalidation.
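A rough sketch of that invalidate-on-write idea, independent of any framework ($cache is assumed to be any key/value store with get/set/delete, and render_page() is a made-up renderer):
<?php
// Sketch: whenever a comment is written, drop the cached copy of the page
// it belongs to, so the next reader rebuilds and re-caches it.
function add_comment(PDO $db, $cache, $page_id, $comment)
{
    $stmt = $db->prepare('INSERT INTO comments (page_id, body) VALUES (?, ?)');
    $stmt->execute(array($page_id, $comment));

    $cache->delete('page_' . $page_id);      // invalidate; don't wait for the TTL
}

function show_page(PDO $db, $cache, $page_id)
{
    $html = $cache->get('page_' . $page_id);
    if ($html === false) {                   // miss: rebuild and re-cache
        $html = render_page($db, $page_id);  // hypothetical renderer
        $cache->set('page_' . $page_id, $html, 3600);
    }
    return $html;
}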
Try CI's query caching instead. The page is still rendered every time, but the DB results are cached... and they can be deleted using native CI functionality (i.e. no third-party libraries).
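If I remember CI's database caching correctly, the usage looks roughly like this (it also has to be enabled in the database config, and the 'blog'/'comments' segments below are just examples matching a /blog/comments URL):
// In the controller method that displays the page: cache the read queries.
$this->db->cache_on();
$query = $this->db->get('comments');           // this result set is cached to disk
$this->db->cache_off();

// Wherever a new comment is saved: drop the cached results for that page
// so the next request re-runs the query.
$this->db->insert('comments', $data);
$this->db->cache_delete('blog', 'comments');   // cache stored for /blog/comments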
Since CI offers only page-level caching, without invalidation, I used to handle this issue somewhat differently. The simplest way to handle this problem was to load all the heavy content from the cache, while the comments were loaded via non-cacheable Ajax calls.
Or you might look into custom plugins which solve this, like the one you pointed out earlier.
It all comes down to the granularity at which you want to control the cache. For simple things like blogs, loading comments via external Ajax calls (on demand, as in the user explicitly requesting the comments) is the best approach.