What is faster to open and process:
Having all the jQuery functions in one file, or
having each function in a separate file that you call whenever you need it?
Ex. I have a blabla.js file that has 4 functions in it,
and my xaxa.php that calls blabla.js.
Now, when I first open my page it's fast enough. No problem (even with cookies cleared and all).
BUT... when I first (and afterwards) click a button that activates a part of my blabla.js, all my links and functions start opening/working more slowly.
So should I separate my functions and load each js file wherever I need it, or is my problem somewhere else?
Thank you
(As I said, I'm starting to suspect something in my structure.)
So here is a sample of my jq:
$(document).ready(function(){
    $(".avoid_ref_add").click(function(){
        var keyValues = {
            pid: $(this).parent().find('input[name="pid"]').val()
        };
        $.post('help_scripts/cartfunct.php', keyValues, function(rsp){
            $('#content').load("p_body.php");
        });
        return false;
    });
    function remove() { /* ... */ }
    function update() { /* ... */ }
});
and I have my items.php, items2.php, items3.php...
Now, MY COOKIES ARE CLEARED, NO CACHE... When I first open the site, it loads fast and all links are fast...
But if I click that add button, everything starts to work REALLY slowly...
If I just refresh the whole page, it starts working fast again, and so on...
To me this is quite strange and I cannot figure out what I did wrong... Because if the page were slow, it wouldn't load fast the first time... Correct? Is it something in my code?
Optimally, you should put all your JavaScript in one file.
That file can be served with gzip compression and far-future expiry headers to limit bandwidth usage (and produce faster page loads). You could even run a minifier on your JavaScript to reduce the file size.
What you are really asking about is the art of minification/optimisation, and this article is a good read: http://developer.yahoo.com/performance/rules.html
It is clearly better to keep all your JS functions in a single file, and it should be faster: whenever a different function needs to be called, you avoid the delay of an extra round trip to the server.
Execution speed should not be affected by the way you distribute functions among files. However, downloading all the resources related to a page is faster when fewer files need to be downloaded. That's because each HTTP request comes with an HTTP request header and each HTTP response comes with an HTTP response header. If the keep-alive feature of HTTP is not used, additional overhead comes from establishing a TCP connection for each request.
You don't really have to separate each function into its own file. I would suggest keeping functions that are related to each other in the same JS file and putting the others in another JS file.
This makes it easier for you to manage the code and increases the reusability/modularity of the JS files.
After that, you can use Minify http://code.google.com/p/minify/ to combine the necessary JS files when you need to, thus reducing the number of requests made to retrieve them.
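For illustration only, and assuming a typical Minify setup served from a /min/ directory (the file names here are made up), the combined include can look something like this:

<!-- instead of one request per file... -->
<script type="text/javascript" src="js/cart.js"></script>
<script type="text/javascript" src="js/catalog.js"></script>

<!-- ...Minify can serve both in a single (minified) response -->
<script type="text/javascript" src="/min/?f=js/cart.js,js/catalog.js"></script>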
Related
I have a very large site and it takes a pretty long time to load, around 120 seconds. What I'm trying to do is make the first half of the site load first; then the user can surf while the other parts are being loaded.
What I'm trying to do is below.
First of all, is this possible?
As far as I know, yes, since Google PageSpeed does it. But the problem is that if I use PageSpeed I would have to change my DNS server settings, etc. I would like to do this myself.
How can I get it done?
What type of technology should I use?
Given that the pages have the .php extension and are written in PHP.
You can use the concept of lazy loading.
You load only the content that is necessary during the initial load; then, using jQuery and Ajax, you load the remaining content.
In this way the user can surf and interact easily with the part already loaded while the other parts load asynchronously.
jQuery's ajax or post methods can help you with this.
A simple example could be:
If there are 5 parts of content on your page and 2 need to be loaded immediately,
the page will load with those 2 parts, so it will take quite a bit less time than loading all 5 parts.
After the document is loaded, you use Ajax to load the remaining 3 parts.
Ajax sends a request to a specific page of your website (possibly named AjaxRequestHandler.php) with some parameters; that page processes the request, generates the HTML and sends it back to your main page, which simply shows the returned HTML. This all happens asynchronously, so the user can interact with the initially loaded 2 parts.
And even if you are new to web technologies, I suppose you have to know at least Ajax and asynchronous calls etc. to achieve lazy loading.
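A minimal sketch of that idea, assuming the hypothetical AjaxRequestHandler.php returns a ready-made HTML fragment for whichever part is requested (the 'section' parameter and the placeholder div ids are made up for this example):

$(document).ready(function() {
    // parts 1 and 2 are already in the initial HTML; fetch the other three afterwards
    $.each(['part3', 'part4', 'part5'], function(i, part) {
        $.post('AjaxRequestHandler.php', { section: part }, function(html) {
            $('#' + part).html(html);   // #part3, #part4, #part5 are empty placeholder divs
        });
    });
});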
Edit:
Regarding this question of yours:
Except AJAX, is there a way around this?
I think you can try iframes.
Load the main content during the page load without an iframe, and load the other contents in iframes after the page has loaded.
This jsFiddle
jsfiddle.net/cGDuV/
can help you understand lazy loading with iframes, mentioned in this post on Stack Overflow.
You can use plain JavaScript for the same thing if you want to avoid jQuery.
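A rough sketch of the iframe approach (the frame id and secondary_content.php are placeholders, not part of the original question):

<iframe id="extraContent" src="about:blank" width="100%" frameborder="0"></iframe>
<script type="text/javascript">
// point the iframe at the real content only after the main page has finished loading
$(window).load(function() {
    $('#extraContent').attr('src', 'secondary_content.php');
});
</script>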
You can manipulate the output buffer so that it flushes early, thus achieving what you're after in the screenshot you posted in your question.
http://www.stevesouders.com/blog/2013/01/31/http-archive-adding-flush/
You can lazy-load all your images. Here's a jQuery plugin that does it easily:
http://www.appelsiini.net/projects/lazyload
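Typical usage of that plugin looks roughly like this (check the plugin page for the exact attribute name expected by your version; the image path is a placeholder):

<img class="lazy" data-original="images/photo1.jpg" width="640" height="480">
<script type="text/javascript">
$(function() {
    // the plugin swaps data-original into src as each image scrolls into view
    $("img.lazy").lazyload();
});
</script>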
You can combine all your JS into one file. Same with your CSS files. This will help the speed.
You can incorporate caching, expires headers and gzip/deflate compression:
https://github.com/h5bp/html5-boilerplate/blob/master/dist/.htaccess
I would suggest you load your third-party JavaScript widgets (Google+ buttons, Facebook Like buttons, social/Twitter stuff) in a non-blocking, asynchronous way so they do not slow the page down at the beginning.
http://css-tricks.com/thinking-async/
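A common non-blocking loading pattern, with a placeholder widget URL standing in for the real third-party script, looks like this:

<script type="text/javascript">
(function() {
    var s = document.createElement('script');
    s.src = 'http://example.com/widgets/like-button.js';  // placeholder third-party widget script
    s.async = true;
    var first = document.getElementsByTagName('script')[0];
    first.parentNode.insertBefore(s, first);  // injected scripts don't block page rendering
})();
</script>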
Optimize your images as much as possible.
http://kraken.io/
Use a CDN
http://www.maxcdn.com/
Finally, test your site, see where the big bottleneck is and where you can improve the site's speed. Use the waterfall chart feature:
http://www.webpagetest.org/
One of the things you can do is load all the essential content (the top half of the page) normally, then use JavaScript/Ajax to load the second half. This is a very common technique (and is often used to load images).
Here is an excellent tutorial from jQuery for Designers walking through how to use jQuery to load images asynchronously after the page loads: http://jqueryfordesigners.com/image-loading/
Having said that, a two-minute load time seems very excessive. Maybe you should check whether something is slowing down your server.
You need to determine why the site is loading slowly. What is the size of the data you are sending? Google and Firefox have web developer tools to help you determine which elements are taking the longest to load. Once you've determined the culprits, try to load the worst offenders asynchronously.
Check out this article on aync requests: https://segment.io/blog/how-to-make-async-requests-in-php/
In my opinion you need an endless-scrolling solution. That is, have a fixed amount of content per "page" (say an estimated 1500px worth of height), and use jQuery to load another "page" when the user scrolls down by a set amount.
If you really want to unconditionally load all the content, use the same approach and trigger the next page to load on document ready, then loop the page loader until the whole thing is loaded. That way you load the first "page" and defer the content "below the fold" to subsequent requests.
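A minimal sketch of that approach, assuming a hypothetical content.php?page=N endpoint that returns one "page" of HTML at a time:

var nextPage = 2, loading = false;   // page 1 is rendered with the initial load

$(window).on('scroll', function() {
    var nearBottom = $(window).scrollTop() + $(window).height() > $(document).height() - 300;
    if (nearBottom && !loading) {
        loading = true;
        $.get('content.php', { page: nextPage }, function(html) {
            $('#content').append(html);   // #content is the container holding the "pages"
            nextPage++;
            loading = false;
        });
    }
});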
What you want is what Facebook does: BigPipe. Here is a relevant SO post: Facebook Bigpipe Technique Algorithm.
There are other solutions involving all sorts of JavaScript, but since you want PHP, and Facebook uses PHP, you should read up on BigPipe. Juho even has an example written in PHP, so that should meet your PHP requirement (yes, it still requires JS, but not Ajax).
Prefetching resources: web pages that require large files can often benefit from changing the order in which those files are requested from the server. Sometimes it makes sense to download files before they are needed, so that they are instantly available once requested. When the resources required for a page can be loaded in advance, the user-perceived network latency for that page can be significantly reduced or even eliminated. When you run Google PageSpeed Insights and look at the result, you will see how to fix the problems in your website.
Some tips to make the site load faster:
Make fewer HTTP requests
Add a far-future expires header
Gzip your page's components
Minify your JavaScript, CSS and HTML
One more thing: if you are using PHP with Smarty, you can use this plugin, which reduces the number of HTTP requests to your server and makes the site load faster by combining all the JS and CSS resource requests into one single HTTP request.
Alternatively, you might be looking for these plugins:
http://masonry.desandro.com/
http://isotope.metafizzy.co/
http://www.wookmark.com/jquery-plugin
Does all this stuff have to be on the same page? Does it make sense to split the content over multiple pages? Can some of it be delayed until the person requests it? Can it be grouped into tabs? Hidden tabs could be lazy-loaded, for instance (see the sketch below).
Give serious thought to restructuring the content in other ways. You might be able to come up with an alternate arrangement that simplifies the problem.
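As a hedged sketch of the hidden-tab idea (the markup, the data-url attribute and reviews.php are assumptions made up for this example):

<!-- the tab panel starts empty; its content lives in a separate PHP fragment -->
<a href="#tab-reviews" class="lazy-tab" data-url="reviews.php">Reviews</a>
<div id="tab-reviews"></div>

<script type="text/javascript">
// .one() fetches each tab's content only on its first click
$('.lazy-tab').one('click', function() {
    $($(this).attr('href')).load($(this).data('url'));
});
</script>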
Having in mind all that was mentioned above, you may also think about caching parts of your data/HTML with memcache or in any other way possible, so you skip generating it every time. Of course, this depends pretty much on how often the data changes.
Don't browsers render the document as it comes in? Whatever you put at the top of the file will be received by the client first, and therefore displayed first. For example, when you view a very large image file online, it loads from top to bottom. The same is true for web pages. Just put the content you want to load first at the top of the page!
Answer to question one: yes.
Answer to question two: above.
Answer to question three: nothing, just put the page in the correct order.
Well, the idea is more or less the same as described by Pawan Nogariya above. You will need to fetch views and data asynchronously and then display them. But this means you will never redirect or post back to any other page; rather, you will get every view via Ajax. This will make your application a SPA (Single Page Application), like Gmail. It will also mean you need to keep track of what has been rendered and what hasn't, which can leave you in a mess. So, instead of doing everything your own way, there are already popular, well-developed frameworks that let you do this, but they also make the application a SPA. That means your application doesn't "post" to the server as in a redirection; everything is done using Ajax.
You can use Backbone (Backbone.js), Knockout (Knockout.js) and many others to achieve this. These are JavaScript-based frameworks that help achieve what you have just asked, and many examples and tutorials are easily available. You can use them with any server language; we are using them with C# (MVC) for a relatively large application.
This is going to be ugly! You should definitely consider using Ajax calls to load page fragments AFTER a first content stage is loaded!
This is going to break almost all known web standards, but it might render the website in parts...
That being said, here's the ugly stuff:
First: get rid of the <html> tag of your website and start with the <head>. DO NOT use a <body> tag either.
Now send your HTML code in the order you want it to be loaded (top first) using echo...
After each closing tag of a group (say </table> or </div>), use flush(); ob_flush(); to send all known content to the browser immediately.
The browser then decides whether it can and will render the known content (based on the browser specifics and user settings), but with few exceptions it will.
Some browsers like to wait for the closing body tag, which is why we dropped it; others even wait for the closing html tag (Safari, as far as I remember), which is why we dropped that too.
If you use the echo-flush approach wisely, you should be able to split the page into renderable parts which most browsers will display without an error.
Again... don't do it this way... it's bad, ugly and nowhere near any web standard.
But you asked for it.
With pure PHP? Not smart.
$(function() {
    $('#body').delay(1).fadeOut();
});
Fiddle example: http://jsfiddle.net/r7MgY/
Does going from
<script type="text/javascript" src="jquery.js"></script>
to
<script type="text/javascript">
<?php echo file_get_contents('jquery.js'); ?>
</script>
really speed things up?
I am thinking it does, because PHP can fetch and embed the file's contents faster than the client's browser can make a full request for the file, since PHP isn't going over the network.
Is the main difference that the traditional method can be cached?
It may be faster on the first page load, but on every subsequent load it will be much slower. In the first example, the client browser can cache the result; in the second, it cannot.
If you only ever serve one single website in your client's life, then yes, because you only have one HTTP request instead of two.
If you are going to serve multiple sites which all link to the same JavaScript source file, then you're duplicating all this redundant data and not giving the client a chance to cache the file.
You need to transfer the bytes to the browser in both cases. The only difference is that you save an HTTP request in the latter case.
Also make sure to escape the JavaScript with CDATA or using htmlspecialchars.
If you include your JS lib inline in your HTML page, it cannot be cached by the browser. It's generally a good idea to keep the JS separate from the normal HTML code, because the browser can cache it and does not need to fetch it on subsequent requests.
So, to make it short, it's an optimization that only works if the page is called once by the user and jQuery is not used on other pages.
Alternatively, you may want to use the jQuery hosted on the Google APIs CDN, with the effect that it is often in the browser's cache anyway, so there is no need to transfer the lib at all.
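For example (the version number is only illustrative):

<script type="text/javascript" src="//ajax.googleapis.com/ajax/libs/jquery/1.9.1/jquery.min.js"></script>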
It does so for that ONE PAGE.
All subsequent pages using the same library (jquery.js downloaded from the same URL) SUFFER: if you include the reference to the external file, yes, it has to be downloaded in an extra request (which is relatively cheap with HTTP/1.1 and pipelining), BUT, provided your webserver serves it with useful headers (an Expires header far in the future), the browser caches that download, while with the "optimization" it has to retrieve it with every single content page.
Also see pages like this one:
http://www.stevesouders.com/blog/2008/08/23/revving-filenames-dont-use-querystring/
(the keyword here is "revving" in connection with those far-future expiration dates)
The first one is better since the browser can cache the script. With the second version it will have to re-download the script every time it loads the page, even if the script didn't change.
The only time the second version is an improvement is for scripts that cannot be cached by the browser.
It depends on how many pages use the same file. However, in most situations this will be slower than your first piece of code, mostly because jquery.js can be cached.
Yes, that would initially be a performance optimization regarding the number of HTTP requests used to serve the page. Your page will, however, become a bit bigger on every load, whereas an external jquery.js would be cached in the browser after the first download.
It does if your page is static.
But if it's not static, the browser will download the whole page every time, including the embedded jQuery, even though jQuery itself doesn't change. If you use src="jquery.js" and the page changes, the browser will load jQuery from the cache and not download it again, so using src="jquery.js" is actually faster.
What is the difference between the following two pieces of code in an HTML file? If I add one more JavaScript file, xyz.js, after including abc.js, is there any priority associated with when the scripts are used?
First code:
<script src="js/abc.js" type="text/javascript" language="javascript"> </script>
Second code:
<script language="javascript">
/*same code of abc.js*/
</script>
The primary difference is that the JavaScript file can be cached by the browser and network devices, so the user doesn't have to download it on every page load.
So if you have 100k of JavaScript files, your visitor only needs to download them once. Otherwise, they'd have to download that same exact 100k on every page load and visit.
This also applies to inline vs. external CSS and images!
Granted, this is only the tip of the iceberg of caching and browser performance (Steve's book is one of the web 'bibles'):
http://yuiblog.com/blog/2006/11/28/performance-research-part-1/
http://www.yuiblog.com/blog/2007/01/04/performance-research-part-2/
http://www.stevesouders.com/
What is the difference between the following two codes in an HTML file?
One requires an extra HTTP request, but gets cached. The other doesn't.
If I add one more javascript file xyz.js after including the abc.js, is there any priority associated when the scripts are being used?
Loading of external scripts is blocking. The first one will be loaded first. Then the second one will be loaded.
They both cause the browser to read the JavaScript and execute it. The first version can leverage caching, while the latter DOES NOT cache.
The first use case also requires another HTTP request, which could be costly.
There is no priority otherwise.
The first difference between loading a script from a file and running a script from a script tag is that loading the file requires an extra HTTP request. This is usually trivial, but you will get a speed increase from having the script embedded in the page. However, loading from an external file does allow the script to be cached. It seems like you cannot rely on caching, though.
Now, I should tell you, having all of your scripts hard-coded in the page is not very manageable. If you want to update one of the scripts but it's tied to a specific HTML file, it becomes that much harder to update.
As for your second question, scripts are loaded in order. All external loading is blocked while scripts are loading. Therefore, it is advisable to put all of your script includes at the bottom of the <body> tag.
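So, for the two files from the question, the usual layout ends up looking something like this:

<body>
    <!-- page content first, so it can render without waiting on the scripts -->
    <script type="text/javascript" src="js/abc.js"></script>
    <script type="text/javascript" src="js/xyz.js"></script>  <!-- executes after abc.js: scripts run in document order -->
</body>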
Besides the primary reason of caching, a secondary and important difference is the maintenance of separation of concerns, which, among other things, says that in web development markup (HTML) should be separated from style (CSS) and behaviors (JS). These elements should be held in separate locations and only linked to from the markup. This is important for project organization, ongoing upkeep and optimization. Writing a mess of spaghetti code with everything inline makes you a sad panda.
Hey guys, quick question: I am currently echoing a lot of JavaScript that is conditional on login status and other variables. I was wondering whether it would be better to simply echo the script include, like <script type="text/javascript" src="javascript/openlogin.js"></script>, pointing at a file that has been run through a minifying program and gzipped, or to echo the full script in raw format. The latter is messier to me, but it reduces HTTP requests, while the former would probably be smaller but take more CPU. Just wondering what some other people think. Thanks in advance for any advice.
I would go with the first option; even though it's an extra request, it means the HTML/PHP page will be smaller. Also, it is my understanding that once the JavaScript is cached it won't be requested again, whereas the HTML/PHP page is requested every time.
Depending on your JavaScript's functionality, you could also add the async attribute to the script include so that it downloads without blocking the rest of the page.
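For example, with the include from the question (note that an async script may execute before or after other scripts, so it should not depend on load order):

<script type="text/javascript" src="javascript/openlogin.js" async="async"></script>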
Include it externally (your first option). Then, when you're doing JavaScript maintenance, you're not doing it inside PHP as well.
Including the raw text is preferred only if you do not expect the page loads per user to go much beyond 1. If you expect your users to request your page multiple times, then the external, cacheable include is the right option. This is usually the case.
Echo the script include so that the JavaScript is in an external file and the browser's cache can do its job.
I have divided the various components of the page into different PHP files. In the navigation PHP file I have the objects I want to use in the JavaScript.
Where should I put the JavaScript <script ...> so that it loads fine? Right now I am putting it in a completely separate file, header.php, but I don't think the JavaScript is picking up the objects from nav.php.
I hope I am making sense ;)
The standard suggestion is that you should put all of your SCRIPT links just before your closing BODY tag at the bottom of your document. This streamlines network connections:
http://developer.yahoo.com/performance/rules.html
It doesn't matter where in the PHP rendering process you put it; it only matters that, when the output HTML and JavaScript are combined, the HTML elements exist before you try to access them from JavaScript.
It's for this reason that most JavaScript toolkits have a function for executing JavaScript once the page elements are loaded, such as jQuery's document.ready function.
Generally the advice is to put the <script> at the bottom of your HTML page.
http://developer.yahoo.com/performance/rules.html
My understanding is that the best speed comes from putting the script at the end of the page?
http://developer.yahoo.com/performance/rules.html
Put Scripts at the Bottom
The problem caused by scripts is that they block parallel downloads. The HTTP/1.1 specification suggests that browsers download no more than two components in parallel per hostname. If you serve your images from multiple hostnames, you can get more than two downloads to occur in parallel. While a script is downloading, however, the browser won't start any other downloads, even on different hostnames.
In some situations it's not easy to move scripts to the bottom. If, for example, the script uses document.write to insert part of the page's content, it can't be moved lower in the page. There might also be scoping issues. In many cases, there are ways to workaround these situations.
An alternative suggestion that often comes up is to use deferred scripts. The DEFER attribute indicates that the script does not contain document.write, and is a clue to browsers that they can continue rendering. Unfortunately, Firefox doesn't support the DEFER attribute. In Internet Explorer, the script may be deferred, but not as much as desired. If a script can be deferred, it can also be moved to the bottom of the page. That will make your web pages load faster.
Note that the performance benefit you gain from repositioning the tags (or using more esoteric methods to avoid blocking) is very small compared to the benefit of getting them cached correctly by the browser.
C.