Recreating index.html from scratch when content changes - php

There is one section of my webpage which changes a couple times a day, the rest is static.
I am afraid that the overhead of having the webpage be rendered by the server/database (PHP/MySQL) every time the page is loaded is significant and not necessary if my content changes just a few times a day.
Would it be wrong to have a php script recreate my index.html using file_put_contents every time there is a change to my site? It seems the "con" is complexity of code, but the "pro" is this generates a clean static index.html which doesn't need server resources every time someone opens the page.
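For what it's worth, the regeneration approach from the question can be sketched in a few lines. The file names and markup below are placeholders, not a definitive implementation:

```php
<?php
// A minimal sketch of the approach from the question (paths and markup
// are placeholders). Call regenerate_index() from whatever script updates
// the content, so the expensive render runs only a few times a day.

function regenerate_index(string $section, string $target): void
{
    $html = "<!DOCTYPE html>\n<html><body>\n" . $section . "\n</body></html>\n";

    // Write atomically: render to a temp file, then rename over the old one,
    // so a visitor never sees a half-written page.
    $tmp = $target . '.tmp';
    file_put_contents($tmp, $html);
    rename($tmp, $target);
}

regenerate_index('<p>Updated section goes here.</p>', '/tmp/index.html');
```

The temp-file-plus-rename step matters here: a request arriving mid-write should get either the old page or the new one, never a truncated file.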

Unless you have huge traffic and really low hardware resources... just use caching. Use something like the Alternative PHP Cache (APC) or memcached.
Then use a template engine like Twig, which also does caching, and you're set.

Related

PHP Includes and Load Time

If I add a PHP include to a page, e.g. include('example.php'), it then has to load that file, which could slow down load time, correct? Right now I am making unnecessary redirects back to the login page when a login is wrong, via a JavaScript snippet inside my login.php page (which does all the login checking against the database). So the address bar shows "admin.php > login.php > admin.php". I never want to reveal which file tests the logins, but I also don't want to include it inside admin.php because I'm afraid it might affect load time.
If you understand my question then suggestions would be helpful.
Do what you feel is best in terms of readability and maintainability on the server side. Then, if you have a performance problem, find where it comes from by measuring (not guessing), and try to optimize.
You're optimizing prematurely, and that is the root of all evil. Compared to the time an HTTP request/response takes, including a few additional lines of PHP is almost certainly negligible.
A new HTTP request (a redirect) is always a bigger performance hit than an include, so there's no reason to use redirects (especially JavaScript ones) if performance is your only concern. If I understand you correctly, I'd suggest you just include the file.
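To make the suggestion concrete, here is a minimal sketch of the include-based check. The file layout and the check_login() function are hypothetical stand-ins for the asker's login.php logic:

```php
<?php
// A minimal sketch of admin.php (check_login() and file names are
// hypothetical). Including the login check inline avoids the redirect
// chain: an include is one filesystem read, while each redirect is a
// full extra HTTP round trip.

if (file_exists(__DIR__ . '/login.php')) {
    include __DIR__ . '/login.php';   // would define check_login($user, $pass)
}

function render_admin(array $post): string
{
    if (function_exists('check_login')
        && check_login($post['user'] ?? '', $post['pass'] ?? '')) {
        return 'admin panel';
    }
    return 'login form';   // render the form in place; no redirect chain
}

echo render_admin($_POST);
```

The address bar stays on admin.php the whole time, so the checking file is never exposed.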

PHP number of includes to use

I am using many includes to show small sections of my site. Is it fine to use many includes, or should I reduce them as much as possible? How much time does an include cost?
My home page loads very slowly. What can I do to make it load faster? (My homepage shows almost the same content for an hour at a time, with different data in only some sections. Can I cache it? What is the best caching solution, or some other way to make things faster?)
If the information only lasts for one hour and then changes, there's little reason to cache that section: the next time people visit they will get different information, and the cached copy goes to waste.
Also, I don't think there is much difference between including a file and pasting that file's contents directly into the current page, since they are executed the same way. Using include() just makes your code cleaner and easier to control and maintain.
As for why your homepage loads so slowly, I think the problem is not your include()s but more likely the way you process data. As somebody commented on your post, use Xdebug to find out what makes your homepage slow.
Good luck.
Maybe the answer to this question helps you:
PHP include(): File size & performance
If the content is updated on an hourly basis, why don't you create a static HTML file (this can easily be done with PHP) every hour, so that only that static HTML is read and served to users instead of the page being generated on every request?
EDIT:
You create a PHP script that generates a file like index.html and fills it with HTML. Then you execute that script every hour, which can be done with a cron job. If you want more information on either of those, please ask another question on that subject.
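A hedged sketch of such a generator script follows; the data source is faked with a placeholder and all names are hypothetical. The comment shows one way a cron entry could invoke it hourly:

```php
<?php
// generate_index.php -- a sketch of the hourly-regeneration idea; the
// content is a placeholder standing in for a real MySQL query, and all
// paths are hypothetical. A cron entry running it at the top of every
// hour could look like:
//   0 * * * * /usr/bin/php /var/www/generate_index.php

function build_page(string $freshSection): string
{
    return "<!DOCTYPE html>\n<html><body>\n" . $freshSection . "\n</body></html>\n";
}

// In the real script this section would be fetched from the database.
$section = '<p>Hourly content</p>';

file_put_contents('/tmp/hourly_index.html', build_page($section));
```

The web server then serves the generated file directly, so no PHP runs on ordinary page views.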

Efficiency of using php to load scripts?

I have a website that's about 10-12 pages strong, using jQuery/JavaScript throughout. Since not all scripts are necessary on each and every page, I'm currently using a switch statement to output only the needed JS on any given page, so as to reduce the number of requests.
My question is, how efficient is that, performance-wise? If it is not, is there any other way to selectively load only the JS a page needs?
This may not be necessary at all.
Bear in mind that if your caching is properly set up, embedding a JavaScript will take time only on first load - every subsequent request will come from the cache.
Unless you have big exceptions (like, a specific page using a huge JS library), I would consider embedding everything at all times, maybe using minification so everything is in one small file.
I don't see any performance issues with the method you are using, though. After all, it's about deciding whether to output a line of code or not. Use whichever method is most readable and maintainable in the long term.
Since you're using JS already, you could go with a pure JS solution, for example yepnope instead of PHP. I don't know the structure of your website, how you determine which page needs what, or at what point something is included (on load, or after some remote call has finished delivering data); but if you use $.ajax extensively, you could also use yepnope to pull in additional JS once $.ajax is done with what it was supposed to do.
You can safely assume the JavaScript is properly cached on the client side.
Since I also assume you serve a minified file, given the size of your website I'd say the performance cost is negligible.
It is much better to place ALL your JavaScript in a single separate ".js" file and reference this file in your pages.
The reason is that the browser will cache this file efficiently and it will only be downloaded once per session (or less!).
The only downside is you need to "refresh" a couple of times if you change your script.
So, after tinkering a bit, I decided to give LABjs a try. It does work well, and my code is much less bloated as a result. No noticeable increase in performance given the size of my site, but the code is much, much more maintainable now.
Funny thing is, I had a Facebook Like button in my header. After analyzing the requests in Firebug I decided to remove it, and gained an astounding 2 seconds in page loading time. Holy crap is this thing inefficient...
Thanks for the answers all !

Alternative to eval() when caching and displaying generated PHP pages

I've worked on a CMS which would use Smarty to build the content pages as PHP files, then save them to disc so all subsequent views of the same page could bypass the generation phase, keeping DB load and page loading times down. These pages would be completely standalone and not have to run in the context of another script.
The problem was the instance where a user first visited a page that wasn't cached, they'd still have to be displayed the generated content. I was hoping I could save my generated file, then include() it, but filesystem latency meant that this wasn't an option.
The only solution I could find was using eval() to run the generated string after it was generated and saved to disc. While this works, it's not nice to have to debug in, so I'd be very interested in finding an alternative.
Is there some method I could use other than eval in the above case?
Given your scenario, I do not think there is an alternative.
As for the debugging part, you could always write the file to disk and include() it during development to test and fix things that way, then switch over to eval() once you have the bugs worked out.
Not knowing your system, I won't second-guess that you know it better than I do, but it seems like a lot of effort, especially since the scenario above will only happen once per page... ever. Is it really worth it to display that one initial view through eval()? And why couldn't you be the initial user who generates the pages?
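For illustration, the save-then-eval() flow the question describes might look like this. The $generated string is a stand-in for the compiled output Smarty would actually produce:

```php
<?php
// A sketch of the save-then-eval() flow from the question; $generated
// here is a stand-in for Smarty's compiled template output.
$generated = '<?php echo strtoupper("hello"); ?>';

// Save the file so every later request can be served without regeneration...
file_put_contents('/tmp/page.php', $generated);

// ...and for this first request, run the freshly generated code directly.
// eval() starts in PHP mode, so prefix '?>' to drop into HTML mode and
// let the string's own '<?php' tag reopen it.
ob_start();
eval('?>' . $generated);
$output = ob_get_clean();

echo $output;   // prints "HELLO"
```

Capturing the output with ob_start()/ob_get_clean() also lets you post-process or log the first render before sending it.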

How does CodeIgniter output caching work?

I read this link:
http://codeigniter.com/user_guide/general/caching.html
It says:
When a page is loaded for the first time, the cache file will be written to your system/cache folder
and that we can cache a view with $this->output->cache(60);. But how does it actually work? What if my users regularly update and delete records, so that the view changes very often? Will it show the modified data, or will the cache bring back old, stale data (from before the inserts and updates)? If it automatically fetches fresh data from the database, then what is the purpose of specifying the minutes in the cache function?
Thanks in advance :)
The way CodeIgniter's caching works is generally this:
A page request is made. CodeIgniter (before very much of the framework has even been loaded) hashes the current URL, and if it finds a file with that name in the cache directory, it serves that file.
The only way to get fresh data before the cache expires is to delete the files manually. When CodeIgniter doesn't find a file matching the hash it generated, it builds the page dynamically.
CodeIgniter's implementation is called "full page" caching and, as such, is limited in its usefulness. There's a partial caching library from Phil Sturgeon that I've looked into: http://philsturgeon.co.uk/code/codeigniter-cache
Honestly, for most projects full-page caching isn't all that useful. In fact, on the projects where I do need full-page caching, I don't even leave it to CodeIgniter: I leave it to the web server, which is much faster.
I'd guess what you're looking for is a partial caching method; most people prefer that. Look into APC if you're on a single server, or Memcached if you have multiple servers.
Good Luck.
But how does it actually work?
If a cached version exists that is younger than the cache time, that cached version will be outputted.
Will it show the modified data?
Eventually yes, but with a lag of up to $cache_time.
What if regularly my users keep updating and deleting records as a result of which view changes very often.
Reduce the cache time or don't use caching at all
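To make the described mechanism concrete, here is a framework-free sketch of URL-hash full-page caching with an expiry time. This illustrates the idea from the answers above; it is not CodeIgniter's actual code, and the directory and TTL values are hypothetical:

```php
<?php
// A framework-free sketch of the mechanism described above: hash the URL,
// serve the cache file if it exists and is still fresh, otherwise rebuild.
// (Not CodeIgniter's actual code; directory and TTL are hypothetical.)

function cached_page(string $url, int $ttl, callable $render, string $dir = '/tmp'): string
{
    $file = $dir . '/' . md5($url) . '.html';

    if (is_file($file) && (time() - filemtime($file)) < $ttl) {
        return file_get_contents($file);   // fresh enough: serve cached copy
    }

    $html = $render();                     // missing or expired: rebuild
    file_put_contents($file, $html);
    return $html;
}

echo cached_page('/welcome', 60, fn () => '<h1>Hello</h1>');
```

This also shows why the minutes parameter exists: within the TTL the callback never runs, so database updates made in that window are simply not seen until the file expires or is deleted.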
