How is a cache system effective in the PHP lifecycle?

I'm a little confused here.
I know that for every PHP request, the entire application is bootstrapped all over again.
Given this, how can a cache be effective, if all of the globals are reloaded for each and every request?
For example:
A user calls the URI /user/view/123. User 123 is loaded from a database and stored in $user.
Why would you cache the contents of $user - when you merely need to refer to the variable in order to get the contents?
Am I missing the point?
Thank you,

It's more like caching images and common database queries.
For instance, say your site has a lot of articles and each article has categories. If you don't change the categories very often, then using a cached result of the categories query is preferable to running the query every time. This is a simplified example.
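A minimal sketch of that idea using a flat-file cache (the cache path and the query_categories_from_db() helper are assumptions, not part of the original answer):

function get_categories() {
    $cache = __DIR__ . '/cache/categories.ser';
    // Serve the cached copy while it is less than an hour old
    if (is_file($cache) && filemtime($cache) > time() - 3600) {
        return unserialize(file_get_contents($cache));
    }
    $categories = query_categories_from_db(); // the real (slow) query
    file_put_contents($cache, serialize($categories));
    return $categories;
}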
Another example is images: if your site needs thumbnailed versions of photos that users have uploaded, then instead of having PHP use the GD library to rescale the image on every request, save the thumbnail once and reuse it rather than running through the GD code again.
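And a rough sketch of the thumbnail idea with GD (the paths and the 100x100 size are illustrative; aspect ratio is ignored for brevity):

function thumbnail_path($photo) {
    $thumb = __DIR__ . '/thumbs/' . $photo;
    if (!is_file($thumb)) {
        // First request only: rescale with GD and save the result
        $src = imagecreatefromjpeg(__DIR__ . '/photos/' . $photo);
        $dst = imagecreatetruecolor(100, 100);
        imagecopyresampled($dst, $src, 0, 0, 0, 0, 100, 100, imagesx($src), imagesy($src));
        imagejpeg($dst, $thumb, 85);
        imagedestroy($src);
        imagedestroy($dst);
    }
    return $thumb; // every later request reuses the saved file
}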

As always, an image is worth a thousand words, here it is :)
[image: framework request-flow diagram showing the cache check early in the lifecycle, so later stages are skipped on a hit]
As you can see, you reload some PHP libraries (the basic environment: Globals, Requests, Cookies, etc.), but not everything (in this case, Security, the Application, various libraries, and Views).
You skip what can be cached ;)

Related

Alternative to eval() when caching and displaying generated PHP pages

I've worked on a CMS which would use Smarty to build the content pages as PHP files, then save them to disk so all subsequent views of the same page could bypass the generation phase, keeping DB load and page load times down. These pages would be completely standalone and not have to run in the context of another script.
The problem was that when a user first visited a page that wasn't cached yet, they'd still need to be shown the generated content. I was hoping I could save my generated file and then include() it, but filesystem latency meant that this wasn't an option.
The only solution I could find was using eval() to run the generated string after it was generated and saved to disk. While this works, it's not nice to debug, so I'd be very interested in finding an alternative.
Is there some method I could use other than eval in the above case?
Given your scenario, I do not think there is an alternative.
As for the debugging part, you could always write the code to disk and include() it during development to test and fix it up that way, and then, once you have the bugs worked out, switch it over to eval().
Not knowing your system, I won't second-guess you, since you know it better than I do, but it seems like a lot of effort, especially since the above scenario will only happen once per page... ever. Is it really worth it to display that one initial page through eval()? And why couldn't you be the initial user and generate the pages yourself?
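A sketch of the pattern being discussed, assuming a CACHE_DIR constant and a Smarty template (both placeholders); the '?>' prefix is the usual trick for eval()ing a string that mixes HTML and PHP:

$file = CACHE_DIR . md5($page_id) . '.php';
if (is_file($file)) {
    include $file; // every later view bypasses generation entirely
} else {
    $code = $smarty->fetch('page.tpl'); // build the standalone page source
    file_put_contents($file, $code);    // save it for subsequent requests
    // Serve this first hit from the in-memory string instead of
    // re-reading the file we just wrote
    eval('?>' . $code);
}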

Persistent data in PHP question

OK, I've written this neat JavaScript 'thing' using jQuery and Ajax. It's all based on the idea that a div has an attribute that lets you write inside it (contenteditable=true). I thought it would be cool to make a chatroom-type thing out of it, and holy cow, it's doing some cool stuff(*), but I have an issue.
Using Ajax, I post to a PHP page that takes the posted data (x, y, text, id) and stuffs it into a JSON-like object. Without writing to a database (overkill, I think), how can I make this data persist? See the problem? The variables in a PHP page are essentially vapor after the page has run, so my JavaScript Ajax call to retrieveNewJSON() would find nothing.
(*) Using jQuery effects and setting colors, I have variably placed text that scrolls and evaporates, Matrix-style, for example. Also, a cursor is placed in the div where the user clicks.
You have to store the data somewhere. If you don't want to use a full-blown database, you can store it in flat files (i.e. .txt) and use PHP's file functions to handle them.
Of course this is not very scalable, and I'd strongly recommend using a database if you are going to be using this a lot.
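A minimal sketch of that flat-file approach, using the field names from the question (messages.json is an assumed filename; the LOCK_EX flag is the easy-to-forget part when two users post at once):

// store.php -- target of the Ajax POST
$file = __DIR__ . '/messages.json';
$messages = is_file($file) ? json_decode(file_get_contents($file), true) : array();
$messages[] = array(
    'x'    => $_POST['x'],
    'y'    => $_POST['y'],
    'text' => $_POST['text'],
    'id'   => $_POST['id'],
);
file_put_contents($file, json_encode($messages), LOCK_EX);

// retrieve.php -- what retrieveNewJSON() calls
$file = __DIR__ . '/messages.json';
header('Content-Type: application/json');
echo is_file($file) ? file_get_contents($file) : '[]';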
You could use cookies (client-side) or session variables (server-side), or you could write to a file for longer-term storage.
You could use the $_SESSION superglobal to persist data.
// Call at the start of the PHP script
session_start();
// ...
// Store the object (the posted data, for example)
$_SESSION['obj'] = json_encode($obj);
In your pull script:
// Call at the start of the PHP script
session_start();
// Retrieve the object
echo $_SESSION['obj'];
Note that when using sessions you have to make sure you call session_start() at the top of every PHP script that uses the session.
I would not recommend trying to store this in a file unless you are supporting a very low number of users and have taken proper data-sanitization steps before physically writing files to the server. If you need this to persist past the length of a session, you should be using a database.
It is worth noting that you can't update one user's session from another user's request without some other form of centralized storage. Unless you have some sort of long-polling / Comet setup, you will have to have some sort of central storage place. Something I would take a look at is memcache.
If you want to avoid using a database engine (which would have a lot of overhead for a multiple-read, multiple-write app like a chat room anyway), you might look at a simple object store like memcache, CouchDB, or MongoDB. Files are also a valid option, provided you store them outside of the web root with proper permissions. The bottom line is, you'll have to use some sort of storage engine on the back end in order to make the data shareable across multiple user sessions.
If this is simply a tech demo or a proof of concept, I wouldn't worry too much about overhead right away.
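A sketch of the shared-storage idea from the two answers above, using the Memcached extension (the server address and the 'chat_messages' key are assumptions; a real chat would need cas() or similar to avoid lost updates between the get and the set):

$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);

// On POST: append the new message where every session can see it
$messages = $mc->get('chat_messages');
if ($messages === false) {
    $messages = array();
}
$messages[] = array('x' => $x, 'y' => $y, 'text' => $text, 'id' => $id);
$mc->set('chat_messages', $messages, 3600); // expire after an hour

// On retrieveNewJSON(): just hand the list back as JSON
echo json_encode($mc->get('chat_messages'));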

How does CodeIgniter output caching work?

I read this link:
http://codeigniter.com/user_guide/general/caching.html
It says:
When a page is loaded for the first time, the cache file will be written to your system/cache folder
and that we can cache a view with $this->output->cache(60);. But how does it actually work? What if my users regularly keep updating and deleting records, so that the view changes very often? Will it show the modified data, or will the cache bring back the old, stale data (from before the inserts and updates)? If it automatically brings fresh data from the database, then what is the purpose of specifying the minutes in the cache function?
Thanks in advance :)
The way CodeIgniter's caching generally works is this:
A page request is made. CodeIgniter (before very much of the framework has even been loaded) does a hash of the current URL, and if it finds that filename in the cache directory, it serves that file.
The only way you can get fresh data before the cache expires is to manually delete the files. When CodeIgniter doesn't find a file for the hash it generated, it dynamically creates the page.
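A simplified sketch of that early check (not CodeIgniter's actual code; the real logic lives in its Output class and also embeds an expiry timestamp inside the cache file):

// Very early in the request, before controllers are loaded
$cache_file = $cache_path . md5($base_url . $uri_string);
if (is_file($cache_file)) {
    readfile($cache_file); // serve the stored page...
    exit;                  // ...and never run the controller
}
// On a miss the framework boots fully, runs the controller, and
// $this->output->cache(60) makes it write the rendered page here.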
CodeIgniter's implementation is called "full page" caching and, as such, is limited in its usefulness. There's a partial caching library I've looked into from Phil Sturgeon here: http://philsturgeon.co.uk/code/codeigniter-cache
Honestly, for most projects, full-page caching really isn't all that useful. In fact, on the projects where I need full-page caching, I don't even leave that up to CodeIgniter (I leave it to the web server: it's way faster).
I'd guess what you're looking for is a partial caching method; most people would prefer this. Look into APC if you're using a single server, or memcached if you have multiple servers.
Good Luck.
But how does it actually work?
If a cached version exists that is younger than the cache time, that cached version will be output.
Will it show the modified data?
Eventually, yes, but with a lag of up to $cache_time.
What if my users regularly keep updating and deleting records, so that the view changes very often?
Reduce the cache time, or don't use caching at all.

Security implications of writing files using PHP

I'm currently trying to create a CMS using PHP, purely in the interest of education. I want the administrators to be able to create content, which will be parsed and saved in pure HTML form on the server to avoid the overhead that executing a PHP script would incur. Unfortunately, I could only think of a few ways of doing so:
Setting write permission on every directory where the CMS might want to write a file. This sounds like quite a bad idea.
Setting write permissions on a single cached directory. A PHP script could then include or fopen/fread/echo the content from a file in the cached directory at request time. This could perhaps be carried out in a MediaWiki-esque fashion: something like index.php?page=xyz could read and echo content from cached/xyz.html at runtime. However, I'll need to ensure the sanity of $_GET['page'] to prevent nasty variations like index.php?page=http://www.bad-site.org/malicious-script.js.
I'm personally not too thrilled by the second idea, but the first one sounds very insecure. Could someone please suggest a good way of getting this done?
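A minimal sketch of the sanity check the second option needs (the cached/ path matches the example above; the slug whitelist regex is one common approach, not the only one):

$page = isset($_GET['page']) ? $_GET['page'] : 'home';
// Allow only simple slugs; this rejects "../", URLs, dots and slashes
if (!preg_match('/^[A-Za-z0-9_-]+$/', $page)) {
    header('HTTP/1.1 404 Not Found');
    exit;
}
$file = __DIR__ . '/cached/' . $page . '.html';
if (is_file($file)) {
    readfile($file);
} else {
    // fall back to generating (and caching) the page
}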
EDIT: I'm not in favour of fetching data from the database on every request. The only time I would want to fetch data from the database would be when the content isn't already cached. Secondly, I do not have access to memcached or any PHP accelerator.
Since you're building a CMS, you'll have to accept that if the user wants to do evil things to visitors, they very likely can. That's true regardless of where you store your content.
If the public site is all static content, there's nothing wrong with letting the CMS write the files directly. However, you'll want to configure the web server to not execute anything in any directory writable by the CMS.
Even though you don't want to hit the database every time, you can set up a cache to minimize database reads. Zend_Cache works very nicely for this, and can be used quite effectively as a stand-alone component.
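A sketch of the stand-alone Zend_Cache usage the answer refers to (the lifetime, cache_dir, and the load_page_from_db() helper are assumptions):

$cache = Zend_Cache::factory('Core', 'File',
    array('lifetime' => 3600, 'automatic_serialization' => true),
    array('cache_dir' => '/tmp/cms-cache/')
);

$id = 'page_' . md5($slug); // cache IDs must stay alphanumeric
if (($page = $cache->load($id)) === false) {
    $page = load_page_from_db($slug); // the read we're trying to avoid
    $cache->save($page, $id);
}
echo $page;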
You should put your pages in a database and retrieve them using parameterized SQL queries.
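That suggestion, sketched with PDO (the table and column names are assumptions):

$pdo = new PDO('mysql:host=localhost;dbname=cms', 'user', 'pass');
// The placeholder keeps the slug out of the SQL string entirely
$stmt = $pdo->prepare('SELECT body FROM pages WHERE slug = ?');
$stmt->execute(array($slug));
echo $stmt->fetchColumn();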
I'd go with the second option, but modify it so the files are retrieved using mod_rewrite rather than a custom PHP function.

Advice on the most efficient way to plan out an HTML/PHP multi-user Ajax project

I'm starting an experimental project and I was looking for advice to choose the best option I have.
The project will be in the 100-1,000 user range. It collects its main data using JavaScript + JSON from the user's Flickr page and then uses that to display specific Flickr photos. The variables that need to be stored include a user-specific URL slug and maybe 4 more short string variables, all of which will need to be looked up on each page request. These variables will not change over time unless the user visits the update page, so they are static 99% of the time.
Each user's page will be located at /user-slug
I have thought out three options, though I do not want to limit myself to these three (please offer your own). Not being an experienced coder at higher access counts, I was looking for the fastest, most static and cacheable, least resource-consuming way of achieving this, and I'm sure you guys are far more clever than I am at achieving it.
For the N users:
Completely static approach: N static HTML pages are created, each user page is updated whenever asked, and .htaccess mod_rewrite is used to make each HTML file resemble a directory access. Updating: PHP is used to rewrite a static page when its user asks for an update, or a full N-user rewrite is performed when the template needs updating. Most of the in-development code resides in a JavaScript file, so the template itself will probably not be edited very frequently. Each time a user page is called, a static HTML file is displayed, and JavaScript collects data from the Flickr server and displays it.
Half-static approach: PHP + mod_rewrite is used to simulate the N different user pages. The user slug, and only the user slug, is stored in a MySQL database; the other user-specific variables are loaded by the browser client via individual static text files named after the user slug (user-slug.txt) using JavaScript (this data is not sensitive). Each time a page is called, 1 MySQL call is made and 1 extra .txt file is loaded in the header via JavaScript. JavaScript collects data from Flickr and displays it.
Fully dynamic approach: PHP + mod_rewrite is used to simulate the N different pages (as in the above method). All user-specific variables are stored in MySQL. Each time a page is called, about 4 MySQL calls are made, and PHP creates the template page using those variables. JavaScript collects data from Flickr and displays it. In this method, which I believe is the more common approach to multi-user websites, I am also looking for ways to make these PHP/MySQL calls cacheable on the server itself. I'm on shared hosting, by the way; I don't have any low-level access to the configuration itself.
Thank you so much for your input
Very, very appreciated!
I'd start with the fully dynamic approach.
Then, based on profiling and performance, move the parts that cost the most resources behind a cache.
As they say, 'premature optimization is the root of all evil'. Don't try to guess what will take the most resources; measure it by profiling time and memory usage.
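A sketch of that measure-then-cache step using APC, as suggested in an earlier answer (the key scheme and the load_user_row() helper are hypothetical):

function get_user_data($slug) {
    $data = apc_fetch('user_' . $slug, $ok);
    if (!$ok) {
        $data = load_user_row($slug);           // the ~4 MySQL lookups
        apc_store('user_' . $slug, $data, 300); // keep for 5 minutes
    }
    return $data;
}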
I'd go with fully dynamic as well. Though, try your best to make whatever JavaScript/CSS you have static, so it can be linked from an external file and not generated.
