CodeIgniter caching?

I'm trying to make my site load as fast as possible, but I'm having a bit of trouble with caching. I tried working with CodeIgniter's built-in caching, but it seems to cache the whole page, and my content changes quite a bit. Is it possible to cache only certain views that I know will not be changing, like the header, footer and the main home page?

The point of CI caching is to reduce the number of database queries, time-consuming PHP calculations, and so on. Basically, it renders a plain HTML page from your controller (and all the views it calls, of course). So it won't really speed up your header and footer unless you pull their data from the database or do anything else dynamic and heavy... and any modern browser will cache those for you anyway unless you specifically disallow caching.
So, the bottom line: CI caching caches full pages only, not separate parts. Of course, there are alternative ways to achieve what you want; for example, you can make the header and the footer separate controllers, enable caching in them, and call them via AJAX... but I'm not sure it's worth it.
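For reference, a minimal sketch of that full-page caching, assuming a typical controller (the class and view names here are just placeholders, not the asker's code):

<?php
// Hypothetical controller showing CodeIgniter's built-in full-page caching.
class Home extends CI_Controller {

    public function index()
    {
        // Cache the rendered output of this whole page for 60 minutes.
        $this->output->cache(60);

        $this->load->view('header');   // placeholder view names
        $this->load->view('home');
        $this->load->view('footer');
    }
}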

Using extensions, this is possible:
https://github.com/philsturgeon/codeigniter-cache
Usually the display itself isn't much of an issue; it's the generation of the data to be displayed that takes longer and benefits most from caching.

A little late to the conversation, but have you taken a look at database caching? A lot of page load delay comes from heavy DB queries, and caching the query results still lets you keep your views dynamic.
This is helpful when you are managing sessions.
$this->db->cache_on();
Place this in your model instead of the controller. Make sure you also have a writable db-cache folder in your application directory.
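A minimal sketch of what that looks like in a model, using CodeIgniter's query caching (the model and table names below are made up, and $db['default']['cachedir'] in application/config/database.php is assumed to point at the writable db-cache folder):

<?php
class Member_model extends CI_Model {

    public function get_members()
    {
        $this->db->cache_on();               // cache the result of the next query on disk
        $query = $this->db->get('members');  // 'members' is a placeholder table name
        $this->db->cache_off();              // leave later queries uncached

        return $query->result();
    }
}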


dynamic database driven website with html pages

I have a small project: a dynamic, database-driven website where users can post, comment, like, and so on.
I was thinking about creating HTML pages instead of PHP pages.
For example, after someone publishes a post, a PHP script would create an HTML page for that post and people could comment on it. When a comment is submitted, an AJAX call to a PHP page would save the comment data to the database and write the comment into the HTML file.
I thought it could be a good way to reduce server load, and it might have advantages for SEO.
Does this technique have a name? Does it have any other advantages or disadvantages?
Thanks in advance
There's a flat-file blogging engine called Kure. It's an open-source project so feel free to check it out. "Flat file system" would be the closest thing to a name for this technique.
I have to agree with my SO colleagues here. Servers and even personal computers are, for the most part, more than capable of handling what you're describing with a real database. Blogging engines such as WordPress are incredibly powerful and flexible and will save you a lot of hassle down the line.
That said, if you insist on creating your own flat-file system... more power to you. Good luck.
Yes, it is a known technique for optimizing the serving of relatively static pages. By 'relatively static' I mean "dynamic, but rarely updated".
For example, Yandex (a search engine) uses this to serve its main page. It's a pretty rich page, and it would require significant resources to generate it on each request.
Also, there is (or at least there was) a plugin for WordPress that does this.
You can't use this technique if your pages update often (it won't be worth it).
You can't use this technique if your pages are personalized (that is, if you can't serve the very same page to all your visitors).
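As a rough illustration of the technique (a sketch, not anyone's production code), generating the static file can be as simple as rendering the page once and writing it to disk; the template path and post data below are made up:

<?php
// Hypothetical post data and template; in practice these would come from the database.
$post = ['id' => 123, 'title' => 'Hello', 'body' => 'First post'];

ob_start();
include __DIR__ . '/templates/post.php';   // assumed template that renders $post into HTML
$html = ob_get_clean();

// Write e.g. /posts/123.html so the web server can serve it directly next time.
file_put_contents(__DIR__ . "/posts/{$post['id']}.html", $html, LOCK_EX);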
I don't know a name for this technique. I know it has been used for some big sites in the past, but I'm not sure it's still used, because it's not always easy to handle: if a file gets corrupted by one script, other scripts working with the page can make it even worse.

Why should MVC for websites require a single point of entry?

I see that many MVC implementations for websites have a single entry point, such as an index.php file, and parse the URL to determine which controller to run. This seems rather odd to me because it requires rewriting the URL with Apache rewrites, and with enough pages that single file will become bloated.
Why not instead just have the individual pages be the controllers? What I mean is: if you have a page on your site that lists all the registered members, then the members.php page users navigate to would be the controller for the members. That PHP file would query the members model for the list of members from the database and pass it to the members view.
I might be missing something because I have only recently discovered MVC, but this one issue has been bugging me. Wouldn't this kind of design be preferable, because instead of one bloated entry file that all pages unintuitively call, the models and views for a specific page are contained, encapsulated, and called from their respective page?
From my experience, having a single entry point has a couple of notable advantages:
It eases centralized tasks such as resource loading (connecting to the DB or to a memcache server, logging execution times, session handling, etc.). If you want to add or remove a centralized task, you just have to change a single file: index.php.
Parsing the URL in PHP decouples the "virtual URL" from the physical file layout on your web server. That means you can easily change your URL scheme (for example, for SEO purposes or for site internationalization) without actually changing the location of your scripts on the server.
However, sometimes having a single entry point can be a waste of server resources. That obviously applies to static content, but also when you have a set of requests with a very specific purpose that need only a small subset of your resources (maybe they don't need DB access, for instance). Then you should consider having more than one entry point. I have done that for the site I am working on: it has one entry point for all the "standard" dynamic content and another for calls to the public API, which need far fewer resources and use a completely different URL scheme.
And a final note: if the site is well implemented, your index.php doesn't necessarily have to become bloated :)
It is all about being DRY: if you have many PHP files handling requests, you will have duplicated code, and that just makes for a maintenance nightmare.
Have a look at the 'main' index page for CakePHP: https://github.com/cakephp/cakephp/blob/master/app/webroot/index.php
No matter how big the app gets, I have never needed to modify that file. So how can it get bloated?
Deep-linking directly into the controllers of an MVC framework eliminates the possibility of implementing controller plugins or filters, depending on the framework you are using. Having a single point of entry standardizes the bootstrapping of the application and its modules and runs the previously mentioned plugins before a controller is accessed.
Zend Framework also uses its own URL rewriting in the form of routing. The applications I work on that use Zend Framework have an .htaccess file of maybe six rewrite rules and conditions.
A single entry point certainly has its advantages, but you can get pretty much the same benefit from a central required file at the top of every single page that handles database connections, sessions, etc. It's not bloated, it conforms to DRY principles (except for that one require line), it separates logic and presentation, and if you change file locations, a simple search and replace will fix it.
I've used both and I can't say one is drastically better or worse for my purposes.
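For illustration, a minimal sketch of that per-page approach; bootstrap.php and the $db connection it provides are assumptions here, not part of the answer above:

<?php
// members.php — one page acting as its own controller.
require __DIR__ . '/bootstrap.php';   // hypothetical: opens $db, starts the session, etc.

$members = $db->query('SELECT name FROM members')->fetchAll();  // placeholder query
include __DIR__ . '/views/members.php';                         // placeholder view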
Software engineers have largely embraced the single-point-of-entry paradigm. "Why not instead just have the individual pages be the controllers?"
Individual pages are already Controllers, in a sense.
In PHP, there is going to be some boilerplate code that runs for every HTTP request: the autoloader require statement (PSR-4), error-handler code, sessions, and, if you are wise, wrapping the core of your code in a try/catch with Throwable as the top exception to catch. By centralizing code, you only need to make changes in one place!
True, the centralized PHP will use at least one require statement (to load the autoloader), but even if you have many require statements they will all be in one file, index.php (not spread out over a galaxy of files under the document root).
If you are writing code with security in mind, again, you may have certain encoding checks, sanitizing, or validating that happen with every request. Using values in $_SERVER or filter_input_array()? Then you might as well centralize that.
The bottom line is this. The more you do on every page, the more you have a good reason to centralize that code.
Note that this way of thinking leads one down the path of looking at your website as a web application. From the web application perspective, a single point of entry is justifiable, because a problem solved once should only need to be modified in one place.
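Pulling those points together, a front controller is often little more than the following sketch (the Dispatcher class and file layout are assumptions, not a particular framework):

<?php
// index.php — the single point of entry.
require __DIR__ . '/vendor/autoload.php';   // PSR-4 autoloader

session_start();

try {
    $page = filter_input(INPUT_GET, 'page') ?? 'home';

    // Hypothetical dispatcher that maps $page to a controller/method.
    (new \App\Dispatcher())->dispatch($page);
} catch (Throwable $e) {
    error_log($e->getMessage());   // centralized error handling
    http_response_code(500);
    echo 'Something went wrong.';
}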

Full website pages in just one page!

Hi,
I am working on a big website (a social network in PHP) and I've decided to create only one PHP page, index.php. This page will contain if conditions and statements based on the $_GET value and will display the requested page (but within the same index.php).
This means that the code (JavaScript + XHTML + PHP) will be very large (nearly the whole project in one page).
I will also use .htaccess to rewrite the URLs of those pages to avoid any malicious requests (so it will look just like a normal website).
But before doing so, I just want to know the advantages and downsides of this technique, from all angles (security, server resources, etc.).
Thank you
I think what you're trying to do is organize your code properly and effectively, which I commend.
However, if I understand correctly, you're going to put all of your JavaScript, HTML, and PHP in one file, which is really bad. You want your code to be modular, not lumped together in a single file.
I think you should look into using a framework (e.g. Zend): PHP frameworks are specifically designed to help your code remain organized, modular, and secure. Your intent (organizing your code effectively) is great, but your idea for how to organize it isn't very good. If you're absolutely adamant about not using a framework (for example, if this is a learning/school project), you should at least make sure you're following best practices.
This approach is not good because of server resource usage. In order to serve, say, jQuery.js, your web server is going to:
Determine that jQuery.js actually passes through index.php
Pass index.php through the php parser
Wait for php to generate a response.
Serve that response.
Or, you could serve it like this:
Determine jQuery.js exists in /var/www/mysite/jQuery.js
Serve it as the response.
Likewise for anything that's "static", i.e. not generated by PHP directly. The more ifs in the PHP script, the more tests need to be done to find your file.
You do not need to pass your static content through some form of URL routing; only your dynamic content. For real speed, it's better to have responses generated ahead of time as well, which is called caching, particularly if the dynamic content is expensive in CPU cycles to generate. Other caching techniques include keeping frequently accessed database data in memory, which is what memcached does.
If you're developing a social network, these things really do matter. Heck, Facebook wrote a PHP-to-C++ compiler to save clock cycles.
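You can see the same static-versus-dynamic distinction with PHP's own built-in development server, whose router script can hand static files straight back to the server and only run PHP for dynamic requests (shown purely as an illustration, not a production setup):

<?php
// router.php, used as: php -S localhost:8000 router.php
$path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);

if ($path !== '/' && is_file(__DIR__ . $path)) {
    return false;                    // serve jQuery.js, CSS, images directly
}

require __DIR__ . '/index.php';      // everything else goes through the entry point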
I second the framework recommendation because it really will make code organisation easier and might integrate with a caching-based solution.
In terms of PHP frameworks, there are many. Here's a list of many web application frameworks in many languages and from the same page, the PHP ones. Take a look and decide which you like best. That's what I did and I ended up learning Python to use Django.
I came by this question while searching; since the best answer is old, here is a more modern one, from this question:
Why use a single index.php page for entire site?
A front controller (index.php) ensures that everything that is common to the whole site (e.g. authentication) is always correctly handled, regardless of which page you request. If you have 50 different PHP files scattered all over the place, it's difficult to manage that. And what if you decide to change the order in which the common library files get loaded? If you have just one file, you can change it in one place. If you have 50 different entry points, you need to change all of them.
Someone might say that loading all the common stuff all the time is a waste of resources and you should only load the files that are needed for this particular page. True. But today's PHP frameworks make heavy use of OOP and autoloading, so this "waste" doesn't exist anymore.
A front controller also makes it very easy for you to have pretty URLs in your site, because you are absolutely free to use whatever URL you feel like and send it to whatever controller/method you need. Otherwise you're stuck with every URL ending in .php followed by an ugly list of query strings, and the only way to avoid this is to use even uglier rewrite rules in your .htaccess file. Even WordPress, which has dozens of different entry points (especially in the admin section), forces most common requests to go through index.php so that you can have a flexible permalink format.
Almost all web frameworks in other languages use single points of entry -- or more accurately, a single script is called to bootstrap a process which then communicates with the web server. Django works like that. CherryPy works like that. It's very natural to do it this way in Python. The only widely used language that allows web applications to be written any other way (except when used as an old-style CGI script) is PHP. In PHP, you can give any file a .php extension and it'll be executed by the web server. This is very powerful, and it makes PHP easy to learn. But once you go past a certain level of complexity, the single-point-of-entry approach begins to look a lot more attractive.
It will be a hell of a mess.
You also won't be able to upgrade parts of the website, or work on them, without messing with the whole thing.
You will not be able to apply a programming architecture like MVC.
It could theoretically be faster, because only one file needs to be fetched from disk, but only under the assumption that all, or at least almost all, of the code is going to be executed.
In practice you will have to load and compile the whole file for every single request, including the parts that are not needed, so it will slow you down.
What you CAN do, however, is have a single point of entry that all requests originate from. That gives you a lot of control and is called a bootstrap file.
But most importantly:
Why would you want that?
From what I know, most CMSes (and probably all modern ones) are built so that the requested page is always the same index.php, but that file is just a dispatcher to other sections. The code is written properly in different files that are pulled together with includes.
Edit: If you're afraid your included scripts are vulnerable, the solution is trivial: put them outside of the web root.
Simplistic example:
<?php
/* This folder shouldn't even be in the site root;
   it should be in a totally different place on the server
   so there is no way someone could request something from it. */
$safeRoot = '/path/to/safe/folder/';

include $safeRoot.'all_pages_need_this.php'; // aka The Bootstrap

switch ($_GET['page']) {
    case 'home':
        include $safeRoot.'home.module.php';
        break;
    case 'blog':
        include $safeRoot.'blog.module.php';
        break;
    case 'store':
        include $safeRoot.'store.module.php';
        break;
    default:
        include $safeRoot.'404.module.php';
}
This means that the code (JavaScript + XHTML + PHP) will be very large (nearly the whole project in one page).
Yes, and it'll be slow.
So you're not going to have any HTML caching?
It's all in one file, hard to update and slow to interpret? Geesh, good luck.
What you are referring to is called single point of entry and is something many web applications (most notably the ones built following the MVC pattern) use.
The code of your entry-point file doesn't have to be huge, as you can simply include() other files as needed. For example:
<?php
if ($_GET['module'] == 'messages') {
    include 'inbox.php';
}
if ($_GET['module'] == 'profile') {
    include 'profile.php';
}
// ...and so on for the other modules.
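If you do go this route, a whitelist keeps $_GET from ever pulling in an unexpected file; here is a small variation on the example above (404.php is an assumed fallback, not mentioned in the original answer):

<?php
$modules = [
    'messages' => 'inbox.php',
    'profile'  => 'profile.php',
];

$module = $_GET['module'] ?? '';
include $modules[$module] ?? '404.php';   // fall back to an assumed 404 page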

Why use a single index.php page for entire site?

I am taking over an existing PHP project, and I noticed that the previous developer uses one index.php page for the entire site, currently 10+ pages. This is the second project I have seen done like this, and I don't see the advantage of this approach. In fact, it seems to overcomplicate everything, because now you can't just add a new page to the site and link to it; you also have to make sure you update the main index page with an if clause to check for that page type and then load the page. It seems that if they are just trying to reuse a template, it would be easier to use includes for the header and footer and then create each new page with those files referenced.
Can someone explain why this approach would be used? Is this some form of an MVC pattern that I am not familiar with? PHP is a second language for me, so I am not as familiar with best practices.
I have tried searching Google for "single index page with php" and the like, but I cannot find any good articles explaining why this approach is used. I really want to kick this old stuff to the curb and not continue down that path, but I want to have some sound reasoning before making the suggestion.
A front controller (index.php) ensures that everything that is common to the whole site (e.g. authentication) is always correctly handled, regardless of which page you request. If you have 50 different PHP files scattered all over the place, it's difficult to manage that. And what if you decide to change the order in which the common library files get loaded? If you have just one file, you can change it in one place. If you have 50 different entry points, you need to change all of them.
Someone might say that loading all the common stuff all the time is a waste of resources and you should only load the files that are needed for this particular page. True. But today's PHP frameworks make heavy use of OOP and autoloading, so this "waste" doesn't exist anymore.
A front controller also makes it very easy for you to have pretty URLs in your site, because you are absolutely free to use whatever URL you feel like and send it to whatever controller/method you need. Otherwise you're stuck with every URL ending in .php followed by an ugly list of query strings, and the only way to avoid this is to use even uglier rewrite rules in your .htaccess file. Even WordPress, which has dozens of different entry points (especially in the admin section), forces most common requests to go through index.php so that you can have a flexible permalink format.
Almost all web frameworks in other languages use single points of entry -- or more accurately, a single script is called to bootstrap a process which then communicates with the web server. Django works like that. CherryPy works like that. It's very natural to do it this way in Python. The only widely used language that allows web applications to be written any other way (except when used as an old-style CGI script) is PHP. In PHP, you can give any file a .php extension and it'll be executed by the web server. This is very powerful, and it makes PHP easy to learn. But once you go past a certain level of complexity, the single-point-of-entry approach begins to look a lot more attractive.
Having a single index.php file in the public directory can also protect you in case the PHP interpreter goes down. A lot of frameworks use the index.php file only to include the bootstrap file that lives outside of the doc root, so if this happens, the user will only be able to see the source code of this single file instead of the entire codebase.
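In that layout, the public index.php can be a one-liner that simply hands off to code stored outside the document root (the paths here are just an assumption):

<?php
// public/index.php — the only PHP file under the document root.
require dirname(__DIR__) . '/app/bootstrap.php';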
Well, if the only thing that changes is the URL, it doesn't seem like it's done for any reason beyond aesthetics...
As for me, a single entry point can help you keep better control of your application: it helps you handle errors easily, route requests, and debug the application.
A single "index.php" is an easy way to make sure all requests to your application flow through the same gate. This way, when you add a second page, you don't have to make sure bootstrapping, authentication, authorization, logging, etc. are all configured; you get that for free by merit of the framework.
In modern web frameworks this would be done using a front controller, but it is impossible to tell from here, since a lot of PHP code/developers suffer from NIH syndrome.
Typically such approaches are used when the contents of the pages are determined by database contents, so all the work gets done in a single file. This is often seen in CMS systems.

Caching one section of a CodeIgniter page

I'm using CodeIgniter to create a web app. I'm creating the drop-down menus in the header; these are included on every page using load->view(). Some quite complex SQL is used to populate the menus. The contents of the menus will change infrequently (once or twice a week), whereas the rest of the data on the page is constantly changing.
Therefore I don't want the overhead of running this SQL every time a page is loaded. I looked at using CI caching, but it works at the page level, and I really only want to cache a small segment.
What do you think the best approach is? I thought about writing the result to a text file, but then you'd have to manually run the code to regenerate it every so often.
Take a look at the query caching page, or alternatively use a caching library to cache sections of a page (the preferred method, IMHO); take a look at KhCache.
Another good caching library I have found is from Phil Sturgeon: http://github.com/philsturgeon/codeigniter-cache
It supports caching models and libraries, or just partial caching like you describe.
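If you'd rather stay with what ships in CodeIgniter, here is a sketch of wrapping just the menu query in the database cache (the model, table and column names are made up; cache_delete_all() can be called whenever the menu data actually changes):

<?php
class Menu_model extends CI_Model {

    public function get_menu_items()
    {
        $this->db->cache_on();    // the heavy menu SQL result is cached on disk
        $items = $this->db
                      ->query('SELECT id, title, parent_id FROM menu_items')
                      ->result();
        $this->db->cache_off();   // the rest of the page stays fully dynamic

        return $items;
    }

    public function refresh_cache()
    {
        $this->db->cache_delete_all();   // run after the menu changes (once or twice a week)
    }
}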
