After doing a lot of reading on the subject, I realized that many developers mix JavaScript and PHP in the same file (by giving the file a .php extension, among other approaches).
On the other hand, if I choose to separate the javascript from the php and store it in an external cacheable static file, I gain some performance advantage, but I also need to find creative ways to pass server-side data to the javascript.
For example, since I can't use a PHP foreach loop in the .js file, I need to convert PHP arrays to JSON using json_encode(). In other cases, I need to declare global JavaScript variables in the original PHP file so I can use them in the external .js file.
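For instance, I end up writing something like this in the .php file (a rough sketch; the variable and function names are made up):

    <?php $entries = get_entries(); // some PHP array, e.g. from the database ?>
    <script type="text/javascript">
    // global variable so the external, cacheable .js file can see the data
    var entries = <?php echo json_encode($entries); ?>;
    </script>
    <script type="text/javascript" src="app.js"></script>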
Since server-side processing is generally considered faster than JavaScript, converting PHP arrays to JS arrays and relying on global variables may also be a bad idea...
The bottom line is that I'm trying to understand the trade-off here. Which has more impact on performance: enabling caching of JS files, or keeping cleaner code by avoiding global JS variables and multidimensional JS arrays?
Are you talking about performance of the server or the browser?
My personal opinion is that given the choice between making a server slower or making a browser slower, you should always choose to let the browser be slower.
Usually, "slow" means something like "takes 100ms" or so, which is not noticeable in an individual browser; but if you have a few hundred requests to a server and they're all delayed by that, the effect is cumulative and the response becomes sluggish. Very noticeable.
Let the browser take the hit.
I think it depends on what you're trying to do. My personal opinion is that it's a bit of a pain to prevent your dynamic JavaScript from being cached.
Your static JS files need to contain your functions and no dynamic data. Your HTML page can contain your dynamic data, either within a SCRIPT block (where you will be able to use a PHP foreach), or by putting the data into the DOM where the JavaScript can read it; it can be visible (in a table) or invisible (e.g. in a comment), depending on whether your data is presentable or not (see the sketch below).
You could also use AJAX to fetch your dynamic data, but this means an additional request, just like an external JS file containing the data would.
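For example, a rough sketch of the invisible-DOM option (the element id and data layout are made up; the static JS stays free of dynamic data):

    <!-- dynamic data, generated by PHP, embedded in the HTML page -->
    <div id="entry-data" data-entries="<?php echo htmlspecialchars(json_encode($entries), ENT_QUOTES); ?>"></div>

    <script type="text/javascript">
    // static, cacheable code: reads the dynamic data back out of the DOM
    var entries = JSON.parse(document.getElementById('entry-data').getAttribute('data-entries'));
    </script>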
As Kae says, adding additional load onto the client would benefit your server in terms of scalability (how many users you can serve at any one time).
Data:
If the amount of dynamic data isn't too big but is constantly changing (and thus must not be cached by the browser), I would suggest adding it to the head of the HTML. To prevent it from polluting the global namespace, you can use either a closure or a namespace (an object) to contain all related variables. Performance-wise, I don't think there would be much difference in this case between looping the data into a JS-friendly format and processing it down to the finest detail on the server (JS has become amazingly fast).
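A minimal sketch of the namespace idea (MyApp and the PHP variables are placeholders):

    <script type="text/javascript">
    // one namespace object in the head instead of many globals
    var MyApp = MyApp || {};
    MyApp.entries = <?php echo json_encode($entries); ?>;
    MyApp.settings = <?php echo json_encode($settings); ?>;
    </script>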
Things are a bit more complicated when the amount of data is huge (100+ kB to megabytes). If the data is pretty much constant and cacheable, you should generate an external data file (not an actual new file, but a unique URL) which you can then include. Using a timestamp in the name, or correctly set cache headers, lets you save time on both the server side (generating the JS-friendly output) and the client side (downloading the data) while still serving up-to-date data.
If you have a lot of data but it's constantly changing, I'd still use external JS files generated by PHP, but you have to be extra careful to disable browser caching, which would otherwise make your constantly changing data look constant. You could also load dynamically, pulling different parts of the data in parallel and on demand via JS requests.
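A sketch of such a PHP-generated data file; the file name, the $dataIsStable flag, and $entries are all assumptions standing in for your own logic:

    <?php
    // data.js.php - serves JavaScript built from server-side data
    header('Content-Type: application/javascript');

    if ($dataIsStable) {
        // cacheable: let it live long, and vary the URL when the data changes
        header('Cache-Control: public, max-age=31536000');
    } else {
        // constantly changing: forbid the browser from caching it
        header('Cache-Control: no-store, no-cache, must-revalidate');
    }

    echo 'var entries = ' . json_encode($entries) . ';';

The page would then include it with something like <script src="data.js.php?v=20120101"></script>, changing the v parameter whenever the data changes so the old cached copy is bypassed.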
Code:
The functional part of your code should follow the explanation from before:
Now to the question of whether JS should be inlined in the HTML or separated. This depends highly on the code, mostly on its length and reusability. If it's just 20 lines of JS, 10 of which are arrays and the like generated by PHP, it makes more sense to leave the code inside the HTML, because HTTP requests (the way all resources are delivered to the client) are expensive, and requesting a small file isn't necessarily a great idea.
However, if you have a somewhat bigger file with lots of functionality (tens of kBs), it makes sense to include it as a separate .js file so that it can be cached instead of being downloaded every time.
And there's no difference in PHP or JS performance, whether you include the JS inside templates/PHP or separately. It's just a matter of making a project manageable. Whatever you do, you should seriously look into using templates.
After doing a lot of reading
That's what you are probably doing wrong.
There are many people fond of writing articles (and answers on Stack Overflow as well) who have very little experience and whose knowledge is based... on the other articles they have read!
Don't follow their bad example.
Instead of "a lot of reading" you have to do a lot of profiling!.
First of all you have to spot the bottlenecks and see if any of them are caching related.
Next, you have to decide what kind of caching your system requires.
And only then you can start looking for the solution.
Hope it helps.
A quick-and-dirty fix (QDF) for your problem is to send the data in a hidden HTML table.
HTML tables are easy to generate in PHP and easy to read in JavaScript.
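A rough sketch of the idea ($rows and the column names are invented):

    <!-- generated by PHP, invisible to the user -->
    <table id="server-data" style="display:none">
    <?php foreach ($rows as $row): ?>
        <tr>
            <td><?php echo htmlspecialchars($row['word']); ?></td>
            <td><?php echo htmlspecialchars($row['definition']); ?></td>
        </tr>
    <?php endforeach; ?>
    </table>

    <script type="text/javascript">
    // read the cells back out in JavaScript
    var cells = document.getElementById('server-data').getElementsByTagName('td');
    </script>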
I have a solution for when you need to pass info from PHP to JS while keeping most of the JS outside the main PHP file.
Use JS objects or JS functions.
You write some code that needs data from PHP. When the page loads, a small piece of JS code is generated by PHP, like:
<script type="text/javascript">
a(param1, param2, param3)
</script>
and it's done. The server fills in param1, param2 and param3 directly in the generated code.
The function lives in a .js file that is cached. With this you reduce what the server has to send and the time it takes for the page's JS to start. The client's code is a bit slower, but you win on download time and the server becomes faster.
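The cached .js file would then contain something like this (a sketch; a() is the hypothetical function called above, and its body is up to you):

    // static.js - cached by the browser, free of any dynamic data
    function a(param1, param2, param3) {
        // do whatever the page needs with the server-provided values
        console.log(param1, param2, param3);
    }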
I've just started learning PHP and have just covered $_POST/$_GET.
Now I want to know: what are the pros and cons of having PHP process the data from a form inside the same file, versus sending the data to another file (action="anotherfile")?
Logically I would think that sending it to another file would increase the processing time, but is that true?
When I have the PHP script inside the same file, the page doesn't seem to reload when I hit the submit button (but the content changes). Or does it? If it does, wouldn't the only difference be that I would have to type the script for the menu (let's say you have the same menu on all pages) in both files? Which would lead to more code/less space?
what are the pros and cons of having PHP process the data from a form inside the same file, versus sending the data to another file (action="anotherfile")?
You are conflating files and URLs.
By having the logic split between different files (and then included where appropriate) you separate concerns and make your code easier to manage.
By having a single URL be responsible for both displaying the form and processing the form data, you avoid the awkward situation where the result of processing requires you to redisplay the form with error messages in it. If you used two different URLs, you would need to either display the form on the processing URL (so that two different URLs display the form) or perform an HTTP redirect back to the original URL while somehow passing details of the errors to it.
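A minimal sketch of that single-URL pattern (the file and field names are hypothetical):

    <?php
    // form.php - one URL both displays and processes the form
    $error = '';
    if ($_SERVER['REQUEST_METHOD'] === 'POST') {
        if (empty($_POST['name'])) {
            $error = 'Name is required.'; // fall through and redisplay the form
        } else {
            // process the data, then redirect so a refresh won't resubmit
            header('Location: thanks.php');
            exit;
        }
    }
    ?>
    <?php if ($error !== ''): ?><p><?php echo $error; ?></p><?php endif; ?>
    <form method="post" action="form.php">
        <input type="text" name="name"
               value="<?php echo isset($_POST['name']) ? htmlspecialchars($_POST['name']) : ''; ?>">
        <input type="submit" value="Send">
    </form>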
Logically I would think that sending it to another file would increase the processing time, but is that true?
No. It makes no difference on the time scales being dealt with.
When I have the PHP script inside the same file, the page doesn't seem to reload when I hit the submit button (but the content changes).
It does reload.
If it does, wouldn't the only difference be that I would have to type the script for the menu (let's say you have the same menu on all pages) in both files?
That's what includes are for.
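For example (the file names are made up):

    <?php include 'header.php'; ?>
    <?php include 'menu.php'; // the shared menu lives in exactly one file ?>

    <p>Page-specific content goes here.</p>

    <?php include 'footer.php'; ?>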
In any language we always try to write clean code. That's why we follow MVC.
Logically I would think that sending it to another file would increase the processing time, but is that true? I think no.
When we send data to another page that simply echoes the POST data at the top and exits, you will see that it takes almost no time. The time is taken when we redirect to or load some HTML page after that.
It does not matter where we send the data (same page or another page); what matters is what is loaded after that.
There is no difference in speed.
Whenever you post the content of your form with a standard submit, the data is sent to the server and a response (after processing) is downloaded.
The only difference is in the organization of your code. The logic that draws the template of the page (the menu or other fixed parts) should be stored in a file that you can include separately or call via a function.
It is also true that when you post your data, you do so for some reason: to register a user, for example. It is good practice for the PHP file that handles user registration to do exactly that and output the messages, and nothing else.
If your file has logic switches that make it output either an empty form or a registration message based on the presence of POST or GET variables, you will notice that when you scale to more complex tasks this adds complexity and makes code maintenance harder.
I'll try to make sure I understand your question by restating it.
If you have a form (/form.php), and the "action" of that submit button leads you to a separate php page (/form_action.php), there is absolutely no difference in speed. Each HTTP request (form.php and form_action.php) is independent - "form_action.php" doesn't remember anything about "form.php" unless you pass that information through (as parameters). This is what people mean when they say that HTTP is stateless. It's worth learning about how HTTP works in general alongside the details of PHP.
If you have a PHP script which in turn includes other PHP scripts, there is a tiny performance impact - too small to measure in pretty much any case I've ever come across.
However, using includes allows you to separate your markup (the HTML) from the logic (the PHP). This is a really good thing if you are doing anything other than tinkering. It allows you to re-use functionality, it makes it much easier to change and maintain the code over time, and it helps you think through what you're trying to achieve.
There are many different ways people have solved the "how do I keep my code clean" puzzle; the current orthodoxy is "Model-View-Controller" (as #monty says). There are also PHP frameworks which make this a little easier to implement - once you've got the basics of the language, you might want to look at Zend or TinyMVC (there are several others, each with their benefits and drawbacks).
For example, you are building a dictionary app where the entries are objects, and all the values are stored in a server-side database. The entries are visible client-side, so they should eventually be JavaScript objects. However, since the data is server-side, there are two ways to construct the object:
Construct the entry objects via PHP, then pass the result to a .js script, which makes JavaScript objects from it.
Construct the entries via JavaScript, calling AJAX methods on the object to request the specific information about the entry (e.g. definition, synonyms, antonyms, etc.) from the server.
The first way ends up constructing each entry twice, once via PHP and once via JavaScript. The second way ends up calling several AJAX methods for every construction, and opening and closing the database connection each time.
Is one preferable to the other, or is there a better way to do this?
I use a rule of thumb: the less AJAX on the initial page load, the better.
If you can push all the information to the user on page load, do it, and use AJAX for subsequent calls. Otherwise the user experience will suffer from the AJAX (rather than benefit), as the page will take longer to load.
Another option, if you're not tied to PHP, would be to have a JS-based back-end like Node.js. This way you can transmit everything in one format. In some cases you can even store the JS objects directly in the database. An example of this kind of back-end would be Node.js + MongoDB, if a document database suits your needs.
If you're tied to PHP/JS, I'd go for minimizing the AJAX calls. Asynchronous transfer (even if it means duplicating objects) should achieve an improved user experience, and the choices made should aim for that. Too many HTTP requests usually end up making the site slow to react, which is one of the things we usually try to get rid of by using AJAX.
One approach that's also sometimes useful is to render a JS object via PHP; this can be used when the data is going to be needed later but should not be shown directly (or at all) to the user.
Totally depends on the project. There are just too many variables to say 'you should do it this way'.
Instead, test it. Do it one way, push it to a high number of requests, and profile it. Then switch and try the other way. Keep it easy to switch the output from the PHP by following the MVC pattern.
The only general rule is 'minimise the number of HTTP requests', as HTTP is by far the biggest bottleneck when a page is loading.
I'm creating a website which will require a lot of AJAX functionality. At the moment I'm creating multiple PHP files, each containing the processing for one AJAX 'function'. I just realised an alternative would be to create an uber-PHP file that all AJAX calls would request, passing a specific "cmd" parameter specifying what functionality to execute server-side.
Are there dis/advantages to either approach? How is it commonly done?
There are some consolidation opportunities with a big AJAX handler/dispatcher script. You can probe for HTTP_X_REQUESTED_WITH in a central place. And if you need proper authorization for some functions, do it for all requests alike.
And obviously you can also unify the output format and handling. Instead of replicating the response headers in every script, you need to write them only once.
But don't bother with assessing the performance theory behind this or that approach. Unless you need a very simple callback feature (e.g. dictionary checks) which might benefit from a separate script, handle it with one big ajax script. Most often a central script would provide the security and maintenance advantage.
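A rough sketch of such a central dispatcher (the file name, the cmd values, and the handler files/functions are all made up):

    <?php
    // ajax.php - a single entry point for all AJAX calls
    // central checks run once, for every command alike
    if (!isset($_SERVER['HTTP_X_REQUESTED_WITH'])) {
        header('HTTP/1.1 403 Forbidden');
        exit;
    }
    header('Content-Type: application/json'); // unified output format

    $cmd = isset($_GET['cmd']) ? $_GET['cmd'] : '';
    switch ($cmd) {
        case 'lookup':
            require_once 'lookup.php';          // each handler in its own file
            $result = lookup_word($_GET['word']);
            break;
        case 'save':
            require_once 'save.php';
            $result = save_entry($_POST);
            break;
        default:
            $result = array('error' => 'unknown command');
    }

    echo json_encode($result);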
I use one monster file when developing and testing, then split it up into smaller chunks for deployment.
Of course there's a performance penalty if the server has to process 1200 lines just to get to the 20 it actually needs for a given task.
The same goes for importing lots of unnecessary "tools" from separate files.
Do the housekeeping tasks once, before deployment, instead of 10,000,000 times a day; it's just common sense.
It helps to leave "markers" in your source code, to enable a script to do the splitting up, and pre-including of tools/tables automatically..
- some smart asses call it unnecessary (and dangerous!) optimization - they're just lazy : )
I personally do a bit of both.
I have a main script that includes separate scripts with require_once(), based on which one is needed (using a switch and a "type" GET/POST variable).
This way I can handle certain things in a central location, like header handling.
And I can also easily separate scripts without needing to read a 2,000-line file each time.
I have a website that's about 10-12 pages strong, using jQuery/JavaScript throughout. Since not all scripts are necessary on each and every page, I'm currently using a switch statement to output only the needed JS on any given page, so as to reduce the number of requests.
My question is, how efficient is that, performance-wise? If it is not, is there any other way to selectively load only the needed JS on a page?
This may not be necessary at all.
Bear in mind that if your caching is properly set up, a referenced JavaScript file will take time to download only on first load; every subsequent request will come from the cache.
Unless you have big exceptions (like, a specific page using a huge JS library), I would consider embedding everything at all times, maybe using minification so everything is in one small file.
I don't see any performance issues with the method you are using, though. After all, it's about deciding whether to output a line of code or not. Use whichever method is most readable and maintainable in the long term.
Since you're using JS already, you could use a pure JS solution; for example, you could use yepnope instead of PHP. I don't know the structure of your website, how you determine which page needs what, or at what point something is included (on load, or after some remote call has finished delivering data); however, if you use $.ajax extensively, you could also use yepnope to pull in additional JS once $.ajax is done with what it was supposed to do.
You can safely assume the JavaScript is properly cached on the client side.
As I also assume you serve a minified file, given the size of your website I'd say the performance impact is negligible.
It is much better to place ALL your JavaScript in a single separate ".js" file and reference this file in your pages.
The reason is that the browser will cache this file efficiently and it will only be downloaded once per session (or less!).
The only downside is you need to "refresh" a couple of times if you change your script.
So, after tinkering a bit, I decided to give LABjs a try. It does work well, and my code is much less bloated as a result. No noticeable increase in performance given the size of my site, but the code is much, much more maintainable now.
Funny thing is, I had a Facebook Like button in my header. After analyzing the requests in Firebug, I decided to remove it, and gained an astounding 2 seconds in page loading time. Holy crap is this damn thing inefficient...
Thanks for the answers all !
I know of these two tricks for speeding page load time up some:
ini_set('zlib.output_compression', 1);
which turns on compression
ob_implicit_flush(true);
which implicitly flushes the output buffer, meaning as soon as anything is output it is immediately sent to the user's browser. This one's a tad tricky, since it just creates the illusion that the page is loading quickly; in actuality it takes the same amount of time, and the data is simply shown sooner.
What other php tricks are there to make your pages load (or appear to load) faster?
It is always better to identify a real bottleneck and then try to eliminate it.
Following any trick that is supposed to make something faster, without understanding whether you actually have that problem, is always the wrong way.
The best way is to ensure that your script isn't creating/destroying unnecessary variables and make everything as efficient as possible. After that, you can look into a caching service so that the server does not have to reparse specific parts of a page.
If all that doesn't make it as fast as you need it to be, you can even "compile" the PHP code. Facebook does this to support faster load times. They created something called "HipHop for PHP" and you can read about it at: https://developers.facebook.com/blog/post/358/
There are other PHP compilers you can use to help.
If all this fails, then I suggest you either recode the website in a different language, or figure out why it is taking so long (more specifically, WHAT is causing it to take so long) and change that part of the website.
Here are some tricks that can speed up your website (code customization):
1) If you're looping through an array, for example, count() it beforehand, store the value in a variable, and use that for your loop test (see the sketch after this list). This way, you avoid needlessly firing count() with every loop iteration.
2) Use built-in functions instead of custom functions.
3) Put JavaScript functions and files at the bottom of the page.
4) Use caching.
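A sketch of point 1 ($items and process() are placeholders):

    <?php
    // bad: count() runs on every iteration
    for ($i = 0; $i < count($items); $i++) {
        process($items[$i]);
    }

    // better: count once, store it, test against the variable
    $total = count($items);
    for ($i = 0; $i < $total; $i++) {
        process($items[$i]);
    }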
Among the best tricks to speed up PHP page loads is to use as little PHP as possible, i.e. use a PHP cache/accelerator such as Zend or APC, or cache as much as you can yourself. PHP that does not need to be parsed again is faster, and PHP that does not run at all is still faster.
The same goes (maybe even more so) for the database. Use as few queries as possible. If you can combine two queries into one, you save a round trip.
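For example (query() stands in for your own DB helper, and the schema is invented):

    <?php
    // two round trips to the database:
    $user  = query('SELECT name FROM users WHERE id = 1');
    $posts = query('SELECT title FROM posts WHERE user_id = 1');

    // one round trip, same data, using a JOIN:
    $rows = query(
        'SELECT u.name, p.title
           FROM users u
           JOIN posts p ON p.user_id = u.id
          WHERE u.id = 1'
    );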