I have a website that's about 10-12 pages strong, using jQuery/JavaScript throughout. Since not all scripts are necessary on each and every page, I'm currently using a switch statement to output only the needed JS on any given page, so as to reduce the number of requests.
My question is: how efficient is that, performance-wise? If it is not, is there any other way to selectively load only the needed JS on a page?
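Roughly, the approach looks like this right now (the page and file names are illustrative):

<?php
// decide which scripts each page gets
$page = basename($_SERVER['PHP_SELF'], '.php');
switch ($page) {
    case 'gallery':
        echo '<script src="/js/jquery.js"></script>';
        echo '<script src="/js/gallery.js"></script>';
        break;
    case 'contact':
        echo '<script src="/js/jquery.js"></script>';
        echo '<script src="/js/form-validation.js"></script>';
        break;
    default:
        echo '<script src="/js/jquery.js"></script>';
}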
This may not be necessary at all.
Bear in mind that if your caching is properly set up, loading a JavaScript file will take time only on the first request - every subsequent request will come from the cache.
Unless you have big exceptions (like a specific page using a huge JS library), I would consider including everything at all times, perhaps using minification so everything is in one small file.
I don't see any performance issues with the method you are using, though. After all, it's about deciding whether to output a line of code or not. Use whichever method is most readable and maintainable in the long term.
Since you're using JS already, you can go for a pure JS solution - for example, you could use yepnope instead of PHP. I don't know the structure of your website, how you determine which page needs what, or at what point something gets included (on load, or after some remote call has finished delivering data). However, if you use $.ajax extensively, you could also use yepnope to pull in additional JS once $.ajax is done with what it was supposed to do.
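A minimal sketch of what that could look like, driven from PHP (the page-to-scripts map and the file names are assumptions, and yepnope.js itself must already be on the page):

<?php
$scriptsByPage = array(
    'gallery' => array('/js/lightbox.js', '/js/gallery.js'),
    'contact' => array('/js/form-validation.js'),
);
$page   = 'gallery'; // however you determine the current page
$needed = isset($scriptsByPage[$page]) ? $scriptsByPage[$page] : array();
// yepnope accepts a string or an array of URLs to load asynchronously
echo '<script>yepnope(' . json_encode($needed) . ');</script>';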
You can safely assume the JavaScript is properly cached on the client side.
As I also assume you serve a minified file, and given the size of your website, I'd say the performance impact is negligible.
It is much better to place ALL your JavaScript in a single separate ".js" file and reference this file in your pages.
The reason is that the browser will cache this file efficiently and it will only be downloaded once per session (or less!).
The only downside is that you need to "refresh" a couple of times if you change your script.
So, after tinkering a bit, I decided to give LABjs a try. It does work well, and my code is much less bloated as a result. No noticeable increase in performance given the size of my site, but the code is much, much more maintainable now.
Funny thing is, I had a Facebook like button in my header. After analyzing the requests in Firebug, I decided to remove it, and gained an astounding 2 seconds in page loading time. Holy crap, is this damn thing inefficient...
Thanks for the answers, all!
Think of PHP templating.
I was recently contemplating whether it makes sense to read a template file once, store it in memory, and then parse it (replacing placeholders with values, for example), rather than require-ing that file as many times as you need it. A usage scenario would be a list whose items are templated as separate files. My first thoughts leaned towards the former solution, because I reckoned replacing values would be a cheaper operation than requiring the file from the file system. Later, however, I realized that pretty much all hard disk drives (or other storage, for that matter) have their own caching, so requiring the same file over and over will not result in it being re-read each time, but rather re-served from the cache.
Any thoughts are appreciated.
I assume by "disk cache" you're actually referring to the page cache? Wikipedia: Page Cache
If so, I wouldn't really be inclined to trust something like this with the performance of my application. Don't forget the page cache only uses UNUSED memory and will happily spit it back out when needed.
I would be inclined to use something like APC as an object cache; this has the nice side effect of not requiring you to actually rewrite any of your code, as it's all done behind the scenes. Another possibility would be to just assign your template to a variable and keep reusing that. Or, if you wanted, you could even use Memcache, though that kind of thing is more useful for caching database returns or large datasets.
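For the template-in-a-variable idea, a minimal APC sketch might look like this (the cache key and file path are arbitrary):

<?php
$template = apc_fetch('tpl:list_item');
if ($template === false) {
    // cache miss: read from disk once, then keep it in shared memory
    $template = file_get_contents('/path/to/templates/list_item.tpl');
    apc_store('tpl:list_item', $template, 3600); // cache for an hour
}
// $template is now served from memory on every subsequent request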
Sorry for the slightly incoherent ramblings...
I was recently contemplating
That's quite wrong of you.
Groundless contemplating out of nowhere seldom does any good; most likely it will just get you into trouble.
Instead of contemplating, one has to do profiling.
Not only to measure changes, as H Hatfeld said, but to determine whether you need any changes at all. Most of the time it turns out you were barking up the wrong tree.
Profiling is the right thing to make you bark up the right one.
whether it makes sense to read a template file once, storing it in memory, and then parsing it
For high-load (or bloated) projects, it does.
PHP already has such a feature, called a bytecode cache. There are plenty of them on the market; at our company we are using eAccelerator.
But most of the time, the default parse-on-every-request behavior is enough.
You are absolutely right about the filesystem cache, and about parsing being blazingly fast - much faster than the usual application logic, which is what should be optimized in the first place.
Every time you include a file, PHP has to parse it. This penalty can be offset by using an opcode cache like APC. If your templates don't contain any PHP (which it sounds like they don't), I would recommend loading the template into memory once and then re-using it as needed.
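A minimal sketch of that "read once, reuse" idea, assuming simple {{placeholder}} markers rather than PHP inside the template:

<?php
$template = file_get_contents(__DIR__ . '/templates/list_item.html');

$items = array(
    array('title' => 'First item'),
    array('title' => 'Second item'),
);

foreach ($items as $item) {
    $replacements = array();
    foreach ($item as $key => $value) {
        $replacements['{{' . $key . '}}'] = htmlspecialchars($value);
    }
    // pure string work on the in-memory copy; no include(), no re-parse
    echo strtr($template, $replacements);
}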
Another thing to keep in mind when looking to optimize your code is make sure you can measure the change. Use something like Xdebug to profile your code and measure what effect your changes are having.
Edit
Since the files currently do contain PHP, take a look at this question/answer. I would recommend putting a function in the file, so that the file only needs to be loaded once but the function can be called multiple times with different parameters.
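A sketch of that function-per-template approach (the file name and markup are assumptions):

<?php
// templates/list_item.php (hypothetical)
function render_list_item(array $item)
{
    ob_start(); ?>
    <li><a href="<?php echo htmlspecialchars($item['url']); ?>">
        <?php echo htmlspecialchars($item['title']); ?></a></li>
    <?php return ob_get_clean();
}

// In the caller: one require_once, any number of calls.
// require_once __DIR__ . '/templates/list_item.php';
foreach (array(
    array('url' => '/a', 'title' => 'First item'),
    array('url' => '/b', 'title' => 'Second item'),
) as $item) {
    echo render_list_item($item);
}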
After doing a lot of reading on the subject, I realized that many developers mix JavaScript and PHP in the same file (by adding the .php extension or using other ways).
On the other hand, if I choose to separate the javascript from the php and store it in an external cacheable static file, I gain some performance advantage, but I also need to find creative ways to pass server-side data to the javascript.
For example, since I can't use a PHP foreach loop in the .js file, I need to convert PHP arrays to JSON objects using json_encode. In other cases, I need to declare global JavaScript variables in the original PHP file so I can use them in the external JS file.
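The kind of hand-off I mean looks something like this (variable and file names are made up):

<?php
$products = array(
    array('id' => 1, 'name' => 'Foo'),
    array('id' => 2, 'name' => 'Bar'),
);
?>
<script type="text/javascript">
    // global bridge variable read by the external, cacheable .js file
    var PAGE_DATA = <?php echo json_encode($products); ?>;
</script>
<script type="text/javascript" src="/js/products.js"></script>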
Since server-side processing is considered faster than JavaScript, converting to JS arrays and using global vars may also be a bad idea...
The bottom line is that I'm trying to understand the trade-off here. Which has more impact on performance: enabling caching of JS files, or keeping the code cleaner by avoiding global JS variables and multidimensional JS arrays?
are you talking about performance of the server or the browser?
my personal opinion is that given the choice between making a server slower or making a browser slower, you should always choose to let the browser be slower.
usually, "slow" means something like "takes 100ms" or so, which is not noticeable on an individual browser, but if you have a few hundred requests to a server and they're all delayed by that, the effect is cumulative, and the response becomes sluggish. very noticeable.
let the browser take the hit.
I think it depends on what you're trying to do. My personal opinion is that it's a little bit of a pain to prevent your dynamic JavaScript from being cached.
Your static JS files need to contain your functions and no dynamic data. Your HTML page can contain your dynamic data, either within a SCRIPT block (where you will be able to use a PHP foreach), or by putting your data into the DOM where the JavaScript can read it. The data can be visible (in a table) or invisible (e.g. in a comment), depending on whether it is presentable or not.
You could also use AJAX to fetch your dynamic data, but this will be an additional request, just like an external JS file containing the data would.
As Kae says, adding additional load onto the client would benefit your server in terms of scalability (how many users you can serve at any one time).
Data:
If the amount of dynamic data isn't too big but is constantly changing (so it must not be cached by the browser), I would suggest adding it to the head of the HTML. To prevent it from polluting the global namespace, you can use either a closure or a namespace (object) to contain all related variables. Performance-wise, I don't think that in this case there would be much difference between looping the data into a JS-friendly format and handling it down to the finest detail on the server (JS has become amazingly fast).
Things are a bit more complicated when the amount of data is huge (100+ KBs to megabytes). If the data is pretty much constant and cacheable, you should generate an external data file (not an actual new file, but a unique URL) which you can then include. Using a timestamp in the name, or correctly set cache headers, lets you save time on both the server side (generating the JS-friendly output) and the client side (downloading data) while still serving up-to-date data.
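A sketch of that unique-URL trick (the paths and the build_dataset() helper are assumptions):

<?php
$dataFile = __DIR__ . '/cache/data.js';
if (!file_exists($dataFile) || filemtime($dataFile) < time() - 3600) {
    $data = build_dataset(); // your expensive data generation
    file_put_contents($dataFile, 'var DATA = ' . json_encode($data) . ';');
}
// the timestamp changes whenever the file does, so the browser can cache
// aggressively and still never serve stale data
echo '<script src="/cache/data.js?v=' . filemtime($dataFile) . '"></script>';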
If you have a lot of data that is constantly changing, I'd still use external JS files generated by PHP, but you have to be extra careful to disable browser caching, which would otherwise make your constantly changing data pretty much constant. You could also do dynamic loading, where you pull different parts of the data in parallel and on demand via JS requests.
Code:
The functional part of your code should follow the explanation from before.
Now to the question of whether JS should be inlined in the HTML or separated. This depends highly on the code, mostly on its length and reusability. If it's just 20 lines of JS, 10 of which are arrays and the like generated by PHP, it makes more sense to leave the code inside the HTML, because HTTP requests (the way all resources are delivered to the client) are expensive, and requesting an extra small file isn't necessarily a great idea.
However, if you have a somewhat bigger file with lots of functionality (tens of KBs), it would be sensible to include it as a separate .js file in order to make it cacheable and save it from being downloaded every time.
And there's no difference in PHP or JS performance, whether you include the JS inside templates/PHP or separately. It's just a matter of making a project manageable. Whatever you do, you should seriously look into using templates.
After doing a lot of reading
That's what you are probably doing wrong.
There are many people who are fond of writing articles (and answers on Stack Overflow as well) who have very little experience and whose knowledge is based... on the other articles they have read!
Don't follow their bad example.
Instead of "a lot of reading" you have to do a lot of profiling!.
First of all you have to spot the bottlenecks and see if any of them are caching related.
Next, you have to decide what kind of caching your system requires.
And only then you can start looking for the solution.
Hope it helps.
A quick and dirty fix (QDF) for your problem is to send the data in a hidden HTML table.
HTML tables are easy to generate in PHP and easy to read in JavaScript.
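A sketch of that hidden-table hand-off (the id and the data are made up):

<?php
$rows = array(array('Alice', 30), array('Bob', 25));
echo '<table id="page-data" style="display:none">';
foreach ($rows as $row) {
    echo '<tr><td>' . htmlspecialchars($row[0]) . '</td>'
       . '<td>' . (int) $row[1] . '</td></tr>';
}
echo '</table>';
// the page's JavaScript can then read the cells from the DOM,
// e.g. with jQuery's $('#page-data tr'), no inline dynamic script needed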
I have a solution for when you need to pass info from PHP to JS while keeping most of the JS outside the main PHP file.
Use JS objects or JS functions.
You write some code that needs data from PHP. When the page loads, some small JS code is generated by PHP, like:
<script type="text/javascript">
a(param1, param2, param3);
</script>
and it's done. The server writes param1, param2 and param3 directly into the code.
The function lives inside a .js file that is cached. With this you reduce the server's upload and the time it takes for the page's JS to start. The client's code is a bit slower, but you win on download time and the server becomes faster.
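On the PHP side, emitting that small bootstrap might look like this (a() and the params are the placeholders from above):

<?php
$param1 = 42;
$param2 = 'hello';
$param3 = array(1, 2, 3);
// json_encode() turns each PHP value into a valid JS literal
printf(
    '<script type="text/javascript">a(%s, %s, %s);</script>',
    json_encode($param1),
    json_encode($param2),
    json_encode($param3)
);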
I am using many includes to show small sections of my site. Is it fine to use many includes, or should I reduce them as much as possible? How much time does an include cost?
My home page loads very slowly. What is the way to make it load faster? (My homepage shows almost the same content for an hour at a time, with different data in only a few sections. Can I cache it? What is the best solution available for caching, or some other way to make things faster?)
If the information only lasts for one hour and then changes, there is little reason to cache that section of information, because the next time people visit they will get different information and the cached copy goes to waste.
And I don't think there is much difference between including a file and pasting that file's content into the current page, since they will be executed the same way. The use of include() just makes your code look cleaner and easier to control and maintain.
Turning to the question of why your homepage loads so slowly: I think it's not a problem with your include()'s, but could be a problem with the way you process data. As somebody commented on your post, use Xdebug to find out what makes your homepage slow.
Good luck.
Maybe the answer to this question helps you:
PHP include(): File size & performance
If the content is updated on an hourly basis, why don't you generate a static HTML file (this can easily be done with PHP) every hour, so that only that static HTML is read and served to users instead of the page being generated on each web request?
EDIT:
You create a PHP script that generates a file like index.html and fills it with HTML code. Then you execute that PHP script every hour. This can be achieved using cron jobs. If you want more information on either of those, please ask another question focused on that subject.
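A minimal sketch of such a generator script (the file names are assumptions):

<?php
// generate_home.php -- run hourly from cron, e.g.:
//   0 * * * * php /var/www/generate_home.php
ob_start();
include __DIR__ . '/homepage_template.php'; // builds the page as usual
file_put_contents(__DIR__ . '/index.html', ob_get_clean());
// the web server now serves the static index.html until the next run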
I know of these two tricks for speeding page load time up some:
ini_set('zlib.output_compression', 1);
which turns on compression
ob_implicit_flush(true);
which implicitly flushes the output buffer, meaning as soon as anything is output it is immediately sent to the user's browser. This one's a tad tricky, since it only creates the illusion that the page is loading quickly: the total time is the same, the data just starts appearing sooner.
What other php tricks are there to make your pages load (or appear to load) faster?
It is always better to identify a real bottleneck first and then try to eliminate it.
Following some trick that is supposed to make things faster, without understanding whether you even have the problem, is always the wrong way.
The best way is to ensure that your script isn't creating/destroying unnecessary variables and make everything as efficient as possible. After that, you can look into a caching service so that the server does not have to reparse specific parts of a page.
If all that doesn't make it as fast as you need it to be, you can even "compile" the php code. Facebook does this to support faster load times. They created something called "HipHop for PHP" and you can read about it at: https://developers.facebook.com/blog/post/358/
There are other PHP compilers you can use to help.
If all this fails, then I suggest you either recode the website in a different language, or figure out why it is taking so long (more specifically, WHAT is causing it to take so long) and change that part of the website.
Here are some things that can speed up your website (code customization):
1) If you're looping through an array, for example, count() it beforehand, store the value in a variable, and use that for your test. This way, you avoid needlessly firing the test function with every loop iteration (see the sketch after this list).
2) Use built-in functions instead of custom functions.
3) Put JavaScript functions and files at the bottom of the page.
4) Use caching.
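For tip 1, the count() hoist might look like this:

<?php
$items = range(1, 1000);

$total = count($items);              // evaluated once
for ($i = 0; $i < $total; $i++) {
    // ... work with $items[$i]
}

// instead of:
for ($i = 0; $i < count($items); $i++) { // count() runs every iteration
    // ... same work, needlessly slower
}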
Among the best tricks to speed up PHP page loads is to use as little PHP as possible, i.e. use a PHP cache/accelerator such as Zend or APC, or cache as much as you can yourself. PHP that does not need to be parsed again is faster, and PHP that does not run at all is still faster.
The same goes (maybe even more so) for database. Use as few queries as possible. If you can combine two queries into one, you save one round trip.
I've worked on a CMS which would use Smarty to build the content pages as PHP files, then save them to disc so all subsequent views of the same page could bypass the generation phase, keeping DB load and page loading times down. These pages would be completely standalone and not have to run in the context of another script.
The problem was the case where a user first visited a page that wasn't cached yet: they'd still have to be shown the generated content. I was hoping I could save my generated file and then include() it, but filesystem latency meant that this wasn't an option.
The only solution I could find was using eval() to run the generated string after it was generated and saved to disc. While this works, it's not nice to debug, so I'd be very interested in finding an alternative.
Is there some method I could use other than eval in the above case?
Given your scenario, I do not think there is an alternative.
As for the debugging part, you could always write it to disc and include() it during development to test and fix things that way, and then, once you have the bugs worked out, switch it over to eval().
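A sketch of that dev/prod switch (DEBUG_MODE, the Smarty calls and $cacheFile are assumptions about your setup):

<?php
$code = $smarty->fetch($templateName); // the generated PHP source
file_put_contents($cacheFile, $code);  // saved for all later requests

if (defined('DEBUG_MODE') && DEBUG_MODE) {
    include $cacheFile; // debuggable: real file names and line numbers
} else {
    // first hit: run the string directly; the close-tag prefix lets the
    // generated code's own open tags work inside eval()
    eval('?>' . $code);
}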
Not knowing your system, I won't second-guess you; you know it better than I do. But it seems like a lot of effort, especially since the above scenario will only happen once per page... ever. Is it really worth it to display that one initial page through eval? And why could you not be the initial user who generates the pages?