I am using many includes to show small sections of my site. Is it fine to use many includes, or should I reduce them as much as possible? How much extra time does an include() call cost?
My home page also loads very slowly. What is the best way to make it load faster? (The homepage shows almost the same content for an hour at a time, and only a few sections show different data. Can I cache it? What is the best caching solution, or some other way I can make things faster?)
If the information only lasts for an hour before it changes, there is little point caching that section, because the next time people visit they will get different information and the cached copy is wasted.
I also don't think there is much difference between including a file and pasting that file's content directly into the current page, since both are executed the same way. include() just makes your code cleaner and easier to control and maintain.
As for why your homepage loads so slowly, I don't think the problem is your include()s; it is more likely the way you process data. As somebody commented on your post, use Xdebug to find out what makes your homepage slow.
Good luck.
Maybe the answer to this question helps you:
PHP include(): File size & performance
If the content is updated on an hourly basis, why not generate a static HTML file (this can easily be done with PHP) once an hour, so that only the static HTML is served to users instead of the page being generated on every web request?
EDIT:
You create a PHP script that generates a file such as index.html and fills it with HTML. Then you execute that PHP script every hour, which can be done with a cron job. If you want more information on either of those, please ask another question specific to that subject.
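A minimal sketch of that approach, assuming a hypothetical data source and file paths (the table, DSN and paths are placeholders, not from the original post):

<?php
// build_homepage.php - regenerates the static homepage.
// Run hourly via cron, e.g.:  0 * * * * php /var/www/build_homepage.php

// Fetch whatever dynamic data the homepage needs (placeholder query).
$pdo  = new PDO('mysql:host=localhost;dbname=site', 'user', 'pass');
$rows = $pdo->query('SELECT title, url FROM featured LIMIT 5')->fetchAll();

// Build the HTML (in practice this could reuse your existing template).
ob_start();
?>
<html>
<body>
  <h1>Home</h1>
  <ul>
    <?php foreach ($rows as $row): ?>
      <li><a href="<?php echo htmlspecialchars($row['url']); ?>">
        <?php echo htmlspecialchars($row['title']); ?></a></li>
    <?php endforeach; ?>
  </ul>
</body>
</html>
<?php
$html = ob_get_clean();

// Write to a temp file and rename so visitors never see a half-written page.
file_put_contents('/var/www/html/index.html.tmp', $html);
rename('/var/www/html/index.html.tmp', '/var/www/html/index.html');

With the cron entry shown in the comment, the web server only ever serves the plain index.html; PHP and the database are touched once an hour.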
Related
There is one section of my webpage which changes a couple times a day, the rest is static.
I am afraid that the overhead of having the webpage be rendered by the server/database (PHP/MySQL) every time the page is loaded is significant and not necessary if my content changes just a few times a day.
Would it be wrong to have a PHP script recreate my index.html using file_put_contents every time there is a change to my site? It seems the "con" is code complexity, but the "pro" is that this generates a clean static index.html which doesn't need server resources every time someone opens the page.
Unless you have huge traffic and really limited hardware resources... just use caching. Use something like Alternative PHP Cache (APC) and also memcache.
Then use a template engine like Twig, which also has caching, and you're set.
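For example, Twig's compiled-template cache is just a constructor option; a minimal sketch, with the paths and template name invented for illustration:

<?php
require_once '/path/to/vendor/autoload.php';

$loader = new \Twig\Loader\FilesystemLoader('/path/to/templates');
$twig   = new \Twig\Environment($loader, [
    // Compiled templates are written here and reused on later requests,
    // so the .twig files are only re-parsed when they change.
    'cache'       => '/path/to/compile_cache',
    'auto_reload' => true,
]);

echo $twig->render('homepage.twig', ['items' => $items]);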
If I add PHP includes to a page, e.g. include('example.php'), it then has to load that file, which could slow down load time, correct? Right now I am making unnecessary redirects back to the login page when the login is wrong, via a JavaScript snippet inside my login.php page (which does all the login checking against the database). So the address bar shows "admin.php > login.php > admin.php". I never want to expose which file is used to check the login, but I also don't want to include it inside admin.php because I'm afraid it might affect load time.
If you understand my question then suggestions would be helpful.
Do what you feel is best in terms of readability and maintainability on the server side. Then, if you have a performance problem, find where it comes from by measuring (not guessing), and try to optimize.
You're optimizing prematurely, and that is the root of all evil. Compared to the time it takes for an HTTP request/response round trip, including a few extra lines of PHP is almost certainly negligible.
New HTTP requests (redirects) are always a bigger performance hit than includes, so there's no reason to use redirects (especially JavaScript ones) if performance is your only concern. So if I understand you correctly, I'd suggest you simply include the file.
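A rough sketch of what that could look like; the login_check() function and file names are placeholders for whatever your login.php already does, not part of the original post:

<?php
// admin.php - handle the login check in place instead of redirecting.
session_start();

if (!empty($_POST['username'])) {
    // login_check() stands in for the existing database check
    // that currently lives in login.php.
    require_once 'login.php';
    if (login_check($_POST['username'], $_POST['password'])) {
        $_SESSION['logged_in'] = true;
    } else {
        $error = 'Invalid username or password.';
    }
}

if (empty($_SESSION['logged_in'])) {
    // Show the login form (and any $error) without ever leaving admin.php.
    include 'login_form.php';
    exit;
}

// ... rest of the admin page ...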
I have a website that's about 10-12 pages strong, using jQuery/JavaScript throughout. Since not all scripts are necessary on each and every page, I'm currently using a switch statement to output only the needed JS on any given page, so as to reduce the number of requests.
My question is: how efficient is that, performance-wise? If it is not, is there any other way to selectively load only the needed JS on a page?
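For reference, a minimal sketch of the kind of switch I mean; the page and file names here are made up:

<?php
// Decide which extra script a page needs; $page is assumed to be
// something like basename($_SERVER['PHP_SELF'], '.php').
switch ($page) {
    case 'gallery':
        echo '<script src="/js/gallery.js"></script>';
        break;
    case 'contact':
        echo '<script src="/js/contact-form.js"></script>';
        break;
    default:
        // Pages with no extra behaviour get nothing beyond the shared file.
        break;
}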
This may not be necessary at all.
Bear in mind that if your caching is properly set up, embedding a JavaScript file will take time only on first load - every subsequent request will come from the cache.
Unless you have big exceptions (like, a specific page using a huge JS library), I would consider embedding everything at all times, maybe using minification so everything is in one small file.
I don't see any performance issues with the method you are using, though. After all, it's about deciding whether to output a line of code or not. Use whichever method is most readable and maintainable in the long term.
Since you're using JS already, you can solve this entirely in JS - for example you could use yepnope instead of PHP. I don't know the structure of your website, how you determine which page needs what, or at what point something is included (on load, or after some remote call has finished delivering data). However, if you use $.ajax extensively, you could also use yepnope to pull in additional JS that's needed once $.ajax has done what it was supposed to do.
You can safely assume the JavaScript is properly cached on the client side.
Assuming you also serve a minified file, and given the size of your website, I'd say the performance impact is negligible.
It is much better to place ALL your JavaScript in a single separate ".js" file and reference this file in your pages.
The reason is that the browser will cache this file efficiently and it will only be downloaded once per session (or less!).
The only downside is that you need to force a refresh a couple of times (or otherwise bust the cache) when you change your script.
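One common workaround for that downside, sketched here under the assumption that the script tag is emitted from a PHP template, is to append the file's modification time as a version parameter so the browser only re-downloads the script when it actually changes:

<?php
// Hypothetical path to the combined script file.
$js = '/js/site.js';
$version = filemtime($_SERVER['DOCUMENT_ROOT'] . $js);
?>
<script src="<?php echo $js . '?v=' . $version; ?>"></script>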
So, after tinkering a bit, I decided to give LABjs a try. It does work well, and my code is much less bloated as a result. No noticeable increase in performance given the size of my site, but the code is much, much more maintainable now.
Funny thing is, I had a Facebook Like button in my header. After analyzing the requests in Firebug I decided to remove it, and gained an astounding 2 seconds on the page's loading time. Holy crap is this damn thing inefficient...
Thanks for the answers all !
I've worked on a CMS which would use Smarty to build the content pages as PHP files, then save them to disc so all subsequent views of the same page could bypass the generation phase, keeping DB load and page loading times down. These pages would be completely standalone and not have to run in the context of another script.
The problem was that when a user first visited a page that wasn't cached yet, they still had to be shown the freshly generated content. I was hoping I could save my generated file and then include() it, but filesystem latency meant this wasn't an option.
The only solution I could find was using eval() to run the generated string after it had been built and saved to disc. While this works, it's not pleasant to debug, so I'd be very interested in finding an alternative.
Is there some method I could use other than eval in the above case?
Given your scenario, I do not think there is an alternative.
As for the debugging part, you could always write the file to disc and include() it during development, test and fix it that way, and then switch over to eval() once the bugs are worked out.
Not knowing your system, I won't second-guess you; you know it better than I do. But it seems like a lot of effort, especially since the above scenario will only ever happen once per page. Is it really worth it to display that one initial view through eval()? And why couldn't you be the first visitor yourself and generate the pages in advance?
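A sketch of that dev/prod switch; the DEBUG constant and the $code / $cacheFile variables are invented for illustration ($code is the generated markup/PHP string, $cacheFile the path it was saved to):

<?php
if (defined('DEBUG') && DEBUG) {
    // Development: include the saved file so errors report a real
    // file name and line number.
    include $cacheFile;
} else {
    // Production: run the string directly and skip the filesystem read.
    // The leading '?>' drops out of PHP mode so plain HTML passes through.
    eval('?>' . $code);
}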
StackOverflow is telling me this is a subjective question, but I think it's a matter of fact!
I have a number of scripts that I'm using on different parts of my site. In terms of making fewer HTTP requests, I know it's better to combine all of these scripts into one .js file. However, isn't it a waste for a page to call a .js file full of 10 or 15 different functions when it's only using one?
The other method I am using is to use PHP conditional statements...
<?php if( is_page() ) { ?>
$(document).ready(function(){
...
});
<?php } ?>
What's the best method, or combination of these methods?
Seeing as the script file is cached by the browser and needs to be loaded only once, it is usually smartest to combine all the JS code into one file.
It may be different if you have huge libraries upward of 100 kilobytes that get used only by certain users (e.g. users that log in). In such cases, it makes sense to make distinctions. Otherwise, I'd say go with one big file.
I would say, provided you expect users to require the vast majority of these resources as they browse your site, taking the hit all at once isn't much of an issue.
You need to make sure your homepage loads quickly enough, though. Maybe consider a cut-down script file for the homepage and a full version for the other pages. Or separate the features truly required on every page into a base file that is included on the homepage, and the other features into an "extras" file, then include both files on the pages which need them. The browser will already have the base file cached from the homepage.
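A small sketch of that split; the file names and the $needsExtras flag are placeholders for however you detect the heavier pages:

<!-- Loaded on every page, including the homepage -->
<script src="/js/base.min.js"></script>

<?php if ($needsExtras): ?>
<!-- Only on the heavier pages; base.min.js is already in the browser cache -->
<script src="/js/extras.min.js"></script>
<?php endif; ?>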