Alternative to eval() when caching and displaying generated PHP pages

I've worked on a CMS which would use Smarty to build the content pages as PHP files, then save them to disc so all subsequent views of the same page could bypass the generation phase, keeping DB load and page loading times down. These pages would be completely standalone and not have to run in the context of another script.
The problem was that when a user first visited a page that wasn't cached, they would still have to be shown the generated content. I was hoping I could save my generated file and then include() it, but filesystem latency meant that this wasn't an option.
The only solution I could find was using eval() to run the generated string after it was generated and saved to disc. While this works, it's not nice to debug, so I'd be very interested in finding an alternative.
Is there some method I could use other than eval in the above case?

Given your scenario, I do not think there is an alternative.
As for the debugging part, you could always write the generated code to disc and include() it during development to test and fix it up that way, and then switch over to eval() once you have the bugs worked out.
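A minimal sketch of that dev/production switch, assuming the generated page source is already in a string (all names here are illustrative):

<?php
// Illustrative only: $source holds the generated PHP page as a string,
// e.g. just produced by Smarty and about to be written to the cache.
$source    = $smarty->fetch('page.tpl');
$cacheFile = 'cache/pages/xyz.php';
file_put_contents($cacheFile, $source);   // later requests include() this file

if (defined('DEBUG') && DEBUG) {
    include $cacheFile;       // development: a real file, usable stack traces
} else {
    eval('?>' . $source);     // production first hit: run the string directly;
                              // the leading '?>' drops out of PHP mode so the
                              // string may contain <?php ... ?> blocks, just
                              // like the cached file does
}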
Not knowing your system, I won't second-guess that you know it better than I do, but it seems like a lot of effort, especially since the above scenario will only happen once per page, ever. Is it really worth using eval() just for that one initial view, and why couldn't you be the first user to generate the pages yourself?

Related

How is a cache system effective in the php lifecycle?

I'm a little confused here.
I know that for every PHP request, the entire application is bootstrapped all over again.
Given this, how can a cache be effective, if all of the globals are reloaded for each and every request?
For example:
A user requests the URI /user/view/123. User 123 is loaded from a database and stored in $user.
Why would you cache the contents of $user, when you merely need to refer to the variable in order to get the contents?
Am I missing the point?
Thank you,
It's more like caching images or common database queries.
For instance, say your site has a lot of articles and each article has categories, and you don't change categories very often. Then using a cached result of a query on the categories table is preferable to running the query again. This is a simplified example.
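A minimal sketch of that idea with a plain file cache (the $db handle and the table are assumptions for illustration):

<?php
// Illustrative file cache for a rarely-changing query.
$cacheFile = 'cache/categories.ser';

if (is_file($cacheFile) && time() - filemtime($cacheFile) < 86400) {
    // Cache hit: skip the database entirely.
    $categories = unserialize(file_get_contents($cacheFile));
} else {
    // Cache miss: run the query once and store the result for next time.
    $categories = $db->query('SELECT id, name FROM categories')->fetchAll();
    file_put_contents($cacheFile, serialize($categories));
}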
Another example is images: if your site needs thumbnailed versions of the photos users have uploaded, then instead of having PHP run the image through the GD library to rescale it on every request, just save the thumbnail version once and use that, rather than running through the GD code again.
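A sketch of the thumbnail case, assuming JPEG uploads and a writable cache/thumbs/ directory:

<?php
// Generate the thumbnail once; every later request reuses the saved file.
function thumbnail($photo, $width = 120)
{
    $cached = 'cache/thumbs/' . md5($photo) . '.jpg';
    if (!is_file($cached)) {                       // cache miss: run GD once
        $src    = imagecreatefromjpeg($photo);
        $height = (int) (imagesy($src) * $width / imagesx($src));
        $thumb  = imagecreatetruecolor($width, $height);
        imagecopyresampled($thumb, $src, 0, 0, 0, 0,
                           $width, $height, imagesx($src), imagesy($src));
        imagejpeg($thumb, $cached);
        imagedestroy($src);
        imagedestroy($thumb);
    }
    return $cached;
}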
As always, an image is worth a thousand words. [The original answer linked a diagram of a framework's request lifecycle, showing which stages a cache hit lets you skip.]
As you can see, you still reload some PHP libraries (the basic environment: Globals, Requests, Cookies, etc.), but not everything (in this case Security, Application, various libraries, and Views).
You skip what can be cached ;)

PHP number of includes to use

I am using many includes to show small sections of my site. Is it fine to use many includes, or should I reduce them as much as possible? How much time does an include cost?
My home page loads very slowly. What is the way to make it load faster? (My homepage shows almost the same content for an hour at a time, with different data in only some sections.) Can I cache it? What is the best solution available for caching, or some other way to make things faster?
If the information only lasts for one hour and then changes, there is little reason to cache that section, because the next time people visit they will get different information and the cached copy goes to waste.
Also, I don't think there is much difference between including a file and pasting that file's content directly into the current page, since both will be executed the same way. The use of include() just makes your code look cleaner and easier to control and maintain.
Turning to the question of why your homepage loads so slowly, I think it's not a problem with your include()s, but could be a problem with the way you process data. As somebody commented on your post, use Xdebug to find out what makes your homepage slow.
Good luck.
Maybe the answer to this question helps you:
PHP include(): File size & performance
If the content is updated on an hourly basis, why don't you create a static HTML file (this can easily be done with PHP) on an hourly basis, so that only that static HTML is read and served to users instead of the page being generated on every web request?
EDIT:
You create a PHP script that generates a file like index.html and fills it with HTML code. Then you execute that PHP script every hour. This can be achieved by using cron jobs. If you want more information on either of those, then please ask another question on that subject.
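A minimal sketch of such a generator, assuming the existing dynamic page lives in home_dynamic.php (the names and paths are made up):

<?php
// build_home.php -- run hourly via cron, e.g.:
//   0 * * * * /usr/bin/php /var/www/build_home.php
ob_start();
include 'home_dynamic.php';                      // render the page as usual
file_put_contents('index.html', ob_get_clean()); // freeze the output as static HTML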

Efficiency of using php to load scripts?

I have a website that's about 10-12 pages strong, using jQuery/JavaScript throughout. Since not all scripts are necessary on each and every page, I'm currently using a switch statement to output only the needed JS on any given page, so as to reduce the number of requests.
My question is, how efficient is that, performance-wise? If it is not, is there any other way to selectively load only the needed JS on a page?
This may not be necessary at all.
Bear in mind that if your caching is properly set up, embedding a JavaScript file will take time only on the first load; every subsequent request will come from the cache.
Unless you have big exceptions (like, a specific page using a huge JS library), I would consider embedding everything at all times, maybe using minification so everything is in one small file.
I don't see any performance issues with the method you are using, though. After all, it's about deciding whether to output a line of code or not. Use whichever method is most readable and maintainable in the long term.
Since you're using JS already, you could go with a pure JS solution: for example, you could use yepnope instead of PHP. I don't know the structure of your website, how you determine which page needs what, or at what point something is included (on load, or after some remote call has finished delivering data); however, if you use $.ajax extensively, you could also use yepnope to pull in additional JS once $.ajax is done with what it was supposed to do.
You can safely assume the JavaScript is properly cached on the client side.
Assuming you also serve a minified file, and given the size of your website, I'd say the performance cost is negligible.
It is much better to place ALL your JavaScript in a single separate ".js" file and reference this file in your pages.
The reason is that the browser will cache this file efficiently and it will only be downloaded once per session (or less!).
The only downside is you need to "refresh" a couple of times if you change your script.
So, after tinkering a bit, I decided to give LABjs a try. It does work well, and my code is much less bloated as a result. No noticeable increase in performance given the size of my site, but the code is much, much more maintainable now.
Funny thing is, I had a Facebook Like button in my header. After analyzing the requests in Firebug I decided to remove it, and gained an astounding 2 seconds in page loading time. Holy crap is this damn thing inefficient...
Thanks for the answers all !

Tricks to speed up page load time in PHP

I know of these two tricks for speeding page load time up some:
ini_set('zlib.output_compression', 1);
which turns on compression
ob_implicit_flush(true);
which implicitly flushes the output buffer, meaning that as soon as anything is output it is immediately sent to the user's browser. This one's a tad tricky, since it merely creates the illusion that the page is loading quickly; in actuality it takes the same amount of time, and the data is just shown sooner.
What other php tricks are there to make your pages load (or appear to load) faster?
It is always better to identify the real bottleneck and then try to eliminate it.
Following a trick that is supposed to make something faster, without understanding whether you actually have that problem, is always the wrong way.
The best way is to ensure that your script isn't creating or destroying unnecessary variables and to make everything as efficient as possible. After that, you can look into a caching service so that the server does not have to re-parse specific parts of a page.
If all that doesn't make it as fast as you need it to be, you can even "compile" the php code. Facebook does this to support faster load times. They created something called "HipHop for PHP" and you can read about it at: https://developers.facebook.com/blog/post/358/
There are other PHP compilers you can use to help.
If all this fails, then I suggest you either recode the website in a different language, or figure out why it is taking so long (more specifically, WHAT is causing it to take so long) and change that part of the website.
There are some tricks that can speed up your website (code customization):
1) If you're looping through an array, for example, count() it beforehand, store the value in a variable, and use that in your loop test. This way you avoid needlessly calling count() on every iteration (see the sketch after this list).
2) Use built-in functions instead of custom functions.
3) Put JavaScript functions and files at the bottom of the page.
4) Use caching.
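A sketch of tip 1, with illustrative data:

<?php
$items = range(1, 100000);

// Slower: count() is called on every single iteration.
for ($i = 0; $i < count($items); $i++) {
    // ... work on $items[$i]
}

// Faster: compute the length once, then test against the variable.
$total = count($items);
for ($i = 0; $i < $total; $i++) {
    // ... work on $items[$i]
}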
Among the best tricks to speed up PHP page loads is to use as little PHP as possible, i.e. use a PHP cache/accelerator such as Zend or APC, or cache as much as you can yourself. PHP that does not need to be parsed again is faster, and PHP that does not run at all is still faster.
The same goes (maybe even more so) for the database. Use as few queries as possible. If you can combine two queries into one, you save a round trip, as in the sketch below.
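For example, assuming a PDO connection in $db and an illustrative schema:

<?php
// Two round trips...
$user  = $db->query('SELECT name FROM users WHERE id = 123')->fetch();
$posts = $db->query('SELECT title FROM posts WHERE user_id = 123')->fetchAll();

// ...versus one JOIN that fetches the same data in a single query.
$rows = $db->query(
    'SELECT u.name, p.title
       FROM users u
       JOIN posts p ON p.user_id = u.id
      WHERE u.id = 123'
)->fetchAll();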

Security implications of writing files using PHP

I'm currently trying to create a CMS using PHP, purely in the interest of education. I want the administrators to be able to create content, which will be parsed and saved on the server storage in pure HTML form to avoid the overhead that executing PHP script would incur. Unfortunately, I could only think of a few ways of doing so:
Setting write permission on every directory where the CMS should want to write a file. This sounds like quite a bad idea.
Setting write permissions on a single cache directory. A PHP script could then include or fopen/fread/echo the content from a file in the cache directory at request time. This could perhaps be carried out in a MediaWiki-esque fashion: something like index.php?page=xyz could read and echo content from cached/xyz.html at runtime. However, I'll need to ensure the sanity of $_GET['page'] to prevent nasty variations like index.php?page=http://www.bad-site.org/malicious-script.js (see the sketch below).
I'm personally not too thrilled by the second idea, but the first one sounds very insecure. Could someone please suggest a good way of getting this done?
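A minimal sketch of such a sanity check for option 2 (the names and paths are illustrative):

<?php
// Illustrative guard: allow only simple slugs, resolved inside the cache directory.
$page = isset($_GET['page']) ? $_GET['page'] : 'home';

if (!preg_match('/^[a-z0-9-]+$/', $page)) {   // rejects paths, URLs, dots, etc.
    http_response_code(404);
    exit('Page not found');
}

$file = __DIR__ . '/cached/' . $page . '.html';
if (!is_file($file)) {
    http_response_code(404);
    exit('Page not found');
}
readfile($file);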
EDIT: I'm not in favour of fetching data from the database on every request. The only time I would want to fetch data from the database is when the content is being cached. Secondly, I do not have access to memcached or any PHP accelerator.
Since you're building a CMS, you'll have to accept that if the user wants to do evil things to visitors, they very likely can. That's true regardless of where you store your content.
If the public site is all static content, there's nothing wrong with letting the CMS write the files directly. However, you'll want to configure the web server to not execute anything in any directory writable by the CMS.
Even though you don't want to hit the database every time, you can set up a cache to minimize database reads. Zend_Cache works very nicely for this, and can be used quite effectively as a stand-alone component.
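A hedged sketch of stand-alone Zend_Cache (Zend Framework 1) usage; the lifetime, cache directory, and the load_page_from_db() helper are assumptions:

<?php
require_once 'Zend/Cache.php';

// File-backed Core cache; results are serialized automatically.
$cache = Zend_Cache::factory('Core', 'File',
    array('lifetime' => 3600, 'automatic_serialization' => true),
    array('cache_dir' => '/tmp/cms-cache/')
);

if (($content = $cache->load('page_xyz')) === false) {
    $content = load_page_from_db('xyz');   // hypothetical database fetch
    $cache->save($content, 'page_xyz');    // reused until the lifetime expires
}
echo $content;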
You should put your pages in a database and retrieve them using parameterized SQL queries.
I'd go with the second option, but modify it so that the files are retrieved using mod_rewrite rather than a custom PHP function.
