How to check what's slowing down a page? - php

I'm working on a website that can be found here:
http://odesktestanswers2013.com/Metareviewer
The index page appears to be unusually slow (it slows the browser down as it loads), even though YSlow doesn't seem to see anything particularly wrong with it and my PHP microtime check returns a fine value.
What other things should I be looking into?

Using Chrome Developer Tools, the network tab shows this:
... a timeline of what's loading in your page.
There are also plenty of good practices that aren't being followed here. Some of these can also be flagged up by using Google Chrome's Audit tool (F12 menu), but in my opinion the most important are:
Use a CDN for serving common library code. Do you really need to host jQuery yourself? (Side rant: do you really need jQuery at all?)
Your JavaScript files are taking a long time to load, because they are all served as separate HTTP calls. You can combine them into a single JavaScript file, and also minify them to save lots of bandwidth.
Foundation.css is very large - not that there's a problem with large CSS files, but it looks like there are over 2000 rules in the CSS file that aren't being used on your site. Do you need this file?
CACHE ALL THE THINGS - there are 26 uncached HTTP requests being made, meaning that everyone who visits your site has to download everything again on every request.
Your overall bandwidth can be reduced by about two thirds if you enable gzip compression on your server (or even better, implement SPDY, but that's a newer technology with a smaller community); a minimal PHP-level sketch follows this list.
Take a look at http://caniuse.com - there are a lot of CSS features that are supported in modern browsers without the need for -webkit or -moz prefixes, which could save a fortune in kilobytes.
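If you can't change the server configuration (for example on shared hosting), a minimal PHP-level sketch of the same gzip idea, assuming the zlib extension is loaded, is to compress the script's output with an output buffer handler:

    <?php
    // Compress this script's output for clients that send Accept-Encoding: gzip.
    // Requires the zlib extension; fall back to plain buffering otherwise.
    if (!ob_start('ob_gzhandler')) {
        ob_start();
    }

    echo '<html><body>... your page ...</body></html>';

    ob_end_flush();

Enabling zlib.output_compression in php.ini, or mod_deflate at the Apache level, is usually the cleaner way to get the same effect.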
If I could change one thing on your site...
Having said all of that, each point above will make a very small (but cumulative) difference to the speed of your site, so it's probably a good idea to attack the worst offender first.
Look at the network graph: while all that JavaScript is downloading, it blocks the rest of the page from downloading.
If you're lazy, just move it all to the end of the document body. That way, the rest of the page will download before the JavaScript does, but this could harm the execution of your scripts if they are written to run earlier in the page load.
Hope this helps.

You should also consider using http://www.webpagetest.org/
It's one of the best tools when it comes to benchmarking your site's performance.

You can use this site (http://gtmetrix.com/) to analyse the causes and to fix them. The site provides the reasons as well as solutions, such as optimized versions of your JS and CSS.
As per this site's report, you need to optimize images and minify your JS and CSS files. The optimized images and minified JS and CSS files can be downloaded from the site.

Use Google Chrome -> F12 -> Network and check the connect, send, receive, etc. times for each resource used on your page.
It looks like your CSS and JS files have very long connect and wait times.

You can use YSlow, one of the best add-ons available for both Chrome and Firefox.
YSlow analyzes web pages and suggests ways to improve their performance based on a set of rules for high performance web pages.
It is available as a Firefox add-on; you can also find a Chrome version, and both are freely available.

YSlow gives you details about your website's front end. Most likely you have a script that is looping one too many times in the background.
If you suspect that a sequence of code is hanging server side then you need to do a stack trace to pinpoint exactly where the overhead is taking place.
I recommend using New Relic.
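If a full profiler or New Relic isn't available, a rough first pass (just a sketch, not a substitute for a real stack trace) is to time the suspect sections of your page with microtime(); the section names and helper functions below are placeholders:

    <?php
    // Rough server-side timing of suspect sections.
    $timings = array();

    $start = microtime(true);
    $rows = fetch_articles_from_database();      // hypothetical helper
    $timings['database'] = microtime(true) - $start;

    $start = microtime(true);
    $html = render_template($rows);              // hypothetical helper
    $timings['rendering'] = microtime(true) - $start;

    // Log rather than echo on a production site.
    error_log('timings: ' . json_encode($timings));

Whichever section dominates the total is the one worth profiling properly.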

Try to use Opera. Right click -> Inspect element -> Profiler.
Also look at Inspect Element -> Errors.

Related

Reduce Expires Header SEO

I am using WordPress as my content management system.
I am now trying to improve its performance, and the tool I am using to analyse my website is GTmetrix.
I have done a lot of performance optimization, but there is one problem I can't solve; actually, I don't even know where it's being generated.
The problem is related to the Expires header for the following files:
connect.facebook.net/en_US/sdk.js#xfbml=1&version=v2.3
platform.twitter.com/widgets.js
platform.stumbleupon.com/1/widgets.js
apis.google.com/js/platform.js
www.facebook.com/impression.php/f38617d6c3bf89e/lid=115&payload=%7B%22source%22%3A%22jssdk%22%7D
I am also using WooCommerce.
Can you help me remove these files from my website, or at least add an Expires header to them? I think that's not possible, since these files are loaded from somewhere else, right?
You cannot change the headers sent by websites you don't control.
Facebook, Twitter, StumbleUpon and Google all control their own headers and have their own reasons for the expiration time and other cache settings that they send through.
You can only make these changes on a domain that you control. In this case your options are:
Consider alternative ways to connect to these social networks. For example, you could code your own custom scripts, or find a WordPress plugin that will run locally on your page (even then, you may not be able to completely remove the reliance on these external scripts, depending on the exact functionality you want).
Don't get hung up trying to get good results for everything GTMetrix tells you. Don't get me wrong, GTMetrix's suggestions are great and you can learn a lot from them, but it's rarely possible to do absolutely everything 100% of the time. Do your best, do what's under your control, and you'll definitely notice your site's speed (and hopefully ranking) increase.
For what it's worth, installing a local caching plugin like W3 Total Cache generally deals with several of GTMetrix's metrics in one go!

Mobile vs non-mobile: serving appropriate content

I am writing a mobile version of a website, and I want to:
use WURFL-PHP to perform the mobility check on the client;
re-use the existing .htaccess files. I don't want to parse anything manually; this is better left to Apache.
For simplicity, let's say I have two flat files: file.d.html and file.m.html. In index.php, I am considering making an appropriate HTTP request to one of these files, depending on the client being served (mobile vs non-mobile), and forwarding the response to the client. Everything would then work the way the .htaccess files prescribe.
To make the request, there is a PHP library called pecl_http. The problem is that I only have a shared hosting plan, so my web host is reluctant to make it available.
So, can someone tell me whether this is reasonable, or am I just complicating things?
Also, is there any way to accomplish this without an extra HTTP request? I wonder if Apache has some way to test for mobility. Perhaps I can add something to .htaccess instead of coding it in PHP.
Many thanks.
Personally I wouldn't do this server-side, but with JavaScript and CSS media queries. Specifically, there is a process called 'Responsive Web Design' where the browser picks your styles based on the screen width in pixels. This avoids the evil of browser sniffing and therefore will generally support future mobile platforms without any changes. You can use the display style to hide whole blocks of content if necessary, or just use different values for width and float to change from horizontal to vertical layouts. The Wikipedia page on RWD warns that some mobile platforms don't support media queries, but in practice, as of 2012, that's an extremely small share of the mobile market.

What kind of caching mechanism is used over at Wikipedia?

If you open up your Mozilla Firefox web browser and turn on Firebug to check incoming and outgoing network traffic, you will see that when you look at Wikipedia articles, the amount of cached content is very large.
Unless the article in question has many pictures, most of the content comes from the cache.
I'd like to know whether that is done by the browser itself, or if it's some underlying PHP caching mechanism (is that what they call memcache? APC?). It works very well, so I'd like to know how they do it.
Memcached, APC etc. are server-side data stores. You basically use them as a key-value store so you don't have to hit your database all the time.
However, what you're actually seeing is a site being loaded with a primed cache. This is the technique of having your web server tell the browser that your commonly used resources haven't changed since the last time it fetched them. The effect is achieved by setting far-future headers so that the browser doesn't keep re-requesting the resources. A lot of sites use this technique, including SO.
Here's a great source to read up on, if you want more info : http://developer.yahoo.com/performance/rules.html
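For assets you serve through PHP yourself, a minimal sketch of that "primed cache" idea, with the file path as an example only, looks like this:

    <?php
    // Serve a static asset with far-future caching headers so the
    // browser doesn't re-request it on every page view.
    $file = __DIR__ . '/static/logo.png';   // example path

    header('Content-Type: image/png');
    header('Cache-Control: public, max-age=31536000');   // one year
    header('Expires: ' . gmdate('D, d M Y H:i:s', time() + 31536000) . ' GMT');
    readfile($file);

In practice, large sites like Wikipedia set these headers at the web server or caching-proxy layer rather than in PHP, but the effect on the browser is the same.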

Optimizations to reduce website loading time

What are some important optimizations that can be made to a website to reduce the loading time?
Remove/minimize any bottlenecks on the server side. For this purpose, use a profiler like Xdebug or Zend Debugger to find out where your application is doing expensive and slow operations. Implement caching where possible and use an opcode cache. If this still isn't fast enough, consider investing in more CPU, RAM or SSDs (depending on whether you are CPU, IO or memory bound).
For general server/client side optimizations, see the Yahoo YSlow! User Guide.
It basically boils down to:
Minimize HTTP Requests (see the combiner sketch below)
Use a Content Delivery Network
Add an Expires or a Cache-Control Header
Gzip Components
Put StyleSheets at the Top
Put Scripts at the Bottom
Avoid CSS Expressions
Make JavaScript and CSS External
Reduce DNS Lookups
Minify JavaScript and CSS
Avoid Redirects
Remove Duplicate Scripts
Configure ETags
Make AJAX Cacheable
Use GET for AJAX Requests
Reduce the Number of DOM Elements
No 404s
Reduce Cookie Size
Use Cookie-Free Domains for Components
Avoid Filters
Do Not Scale Images in HTML
Make favicon.ico Small and Cacheable
Also see the comments contributed below, as they contain some additional useful information for other users.
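As an illustration of the "Minimize HTTP Requests" and "Minify JavaScript and CSS" rules, here is a deliberately naive combiner sketch; the file names are placeholders, and real minification is better left to a dedicated tool like YUI Compressor:

    <?php
    // combine.php - serve several JavaScript files as one HTTP response.
    $files = array('js/jquery.js', 'js/plugins.js', 'js/app.js');   // example list

    header('Content-Type: application/javascript');
    header('Cache-Control: public, max-age=86400');   // cache for a day

    foreach ($files as $file) {
        if (is_readable($file)) {
            readfile($file);
            echo "\n;\n";   // guard against files missing a trailing semicolon
        }
    }

A single <script src="combine.php"></script> tag then replaces the individual script tags; in production you would normally pre-build the combined file once rather than concatenating it on every request.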
Before attempting any optimizations, you first need to be able to profile: get Firebug for Firefox. Then you can run some analysis using YSlow that will tell you exactly what to do. Fundamental things that you should do are listed here.
You definitely want to look at caching, as round trips to the DB are expensive.
Also, minify your JS.
Here are a few "best practice" things:
Caching CSS, JavaScript, images, etc.
Minifying JavaScript files.
Gzipping content.
Place links to JavaScript files, JavaScript code, and links to CSS files at the bottom of your page when possible.
Load only what is necessary.
For an existing website, before you do any of this, determine where your bottlenecks are with tools like Firebug and, as someone else mentioned, YSlow (I highly recommend this tool).
Install Firebug and the Page Speed plugin,
follow all the Page Speed directives (where possible) and be happy:
http://code.google.com/intl/it/speed/page-speed/
Anyway, the most important optimization in my experience is to reduce the number of HTTP requests to a minimum...
The simple options I can think of are:
Gzip (X)HTML, so a compressed file arrives at the user more quickly
minify the CSS
minify the JS
use caching where possible
use a content-delivery network
use a tool, such as YSlow, to identify bottlenecks and get further suggestions
There are two sides you can care about when optimizing:
The server side: what matters is generating the output faster.
The client side: what matters is getting everything that has to be displayed to the user faster.
Note: we, as developers, often think about optimizing the server side first... which in most cases represents less than 10% of the loading time of the page!
On the server side, you'll generally want to :
profile, to determine what's taking long
optimize your SQL queries, and reduce their number
use caching
For more information, you can take a look at the answer I gave some time ago to this question: Optimizing Kohana-based Websites for Speed and Scalability
On the client side, the biggest gains are generally achieved by :
Reducing the number of HTTP requests -- the easiest way being to reduce the number of JS/CSS/images files, by combining several files into one
Compressing CSS/JS/HTML, using for instance Apache's mod_deflate.
On that subject, there is a lot of great stuff on Yahoo's Exceptional Performance pages: they've released lots of good practices and tools, such as YSlow.
We recently did this on our web site. Here we outlined nine techniques that seemed to have the highest impact with the least difficulty: http://mentormate.com/blog/easy-ways-speed-website-load-time/
The first optimisation is: Decide if it is slow, and if not, don't bother.
This is trickier than it sounds, because it's not like testing a desktop app or game. A game is slow if when you play it on the target hardware, the frame rate is too low. This is very easy to measure.
A web site is trickier, because you, as the developer, are probably using a local test system with a very fast network. Even when you use your staging / system test servers, you're probably still on the local network. Even your production servers are in all likelihood, on the same continent.
The same is possibly not true for quite a lot of your users.
Therefore the options which exist are:
Find out by asking your users, whether they find it to be slow
Simulate a high latency environment and test it yourself (or your QA team)
Guesswork
The latter is not recommended.
An option which the holier-than-thou Yahoo Web Sites performance book (which yes, is a book you can buy) doesn't mention a lot is HTTPS. Most web applications which handle important data run mostly or entirely over HTTPS, which changes the rules of the game rather a lot. Remember to do all testing with it enabled.
I wrote some things about this; see:
Google page speed test optimization
As already mentioned, you can use the YSlow or PageSpeed Firefox extensions. But you can also use GTmetrix, an online service that scans your page with both tools.
Features I like / use:
a simple, clean and usable presentation
comparison with another page - it's really interesting to see where your friends / competitors stand
(By the way, I'm not affiliated with GTmetrix!)
To reduce network traffic, you can minify static files, such as CSS and Javascript, and use gzip compression on generated content. You can also try using tools such as optipng to reduce the size of images.
However, the first step to take is to actually analyse what's taking all of the time - whether it's sending the bits over the network, or actually generating the content to send. There's no point making your CSS files 10% smaller if it takes a minute to generate each HTML page.
Remove unnecessary whitespace from the code you serve (i.e. minify it).
Load balancing would help to reduce the loading time immensely.

Speed up a PHP web site

What's the best way (or ways?) to speed up a PHP web site, and how much faster can it get using this or that approach?
PHP isn't really the kind of language where you can do micro-optimizations, or just work on the code alone. There's really no point. Although PHP isn't particularly fast, PHP itself is rarely the bottleneck in a given web site.
You need to work out where that bottleneck is before you can fix it. There are a lot of common bottlenecks, with common solutions. It's difficult to generalize, given so few details, but there are a lot of performance hints that apply to most web sites.
The first good place to look is actually on the client side, rather than the server side. How large are your pages (including images, CSS, JavaScript and the like)? How many HTTP requests does a single page view require? Use something like Firebug (and the YSlow add-on for Firebug) to see how long your page actually takes to load, and which bits of your page cause the problem. Some general hints:
Work out ways to shrink the CSS and JavaScript - remove anything you don't need, and run the rest through a tool like YUI Compressor.
If you have multiple CSS and JavaScript files, try to combine them into a single file.
Optimize all of your images as much as possible, and see if you can combine any of those into a single file using CSS sprites or similar. PunyPNG is good for lossless images. A decent JPEG encoder (NOT Photoshop) is good for photos.
Move the CSS to the top of the page, and the JavaScript to the bottom, so the browser can render the page before the JavaScript has finished downloading.
Make sure that all of your CSS, JavaScript and HTML are being served compressed.
Make sure that you're using appropriate caching - if a file hasn't changed, there's no point in re-downloading it.
Once you've got the client side out of the way, you might have to turn your attention to the server side.
Install an opcode cache, like APC, XCache, or Zend Optimizer. It's very easy to do, and will always provide some improvement. Once you've done that, profile your pages, to find out where the time is actually being spent.
More likely than not, you'll be spending most of your time waiting for the database to return results. So, at a bare minimum:
Work out which queries are taking the longest, and work on them first. Use your head though - a query that takes five seconds on an admin page that nobody looks at is not as important as a query that takes one second on the front page.
Make sure that your query uses appropriate indexes. No common query should ever need to do a full table scan. Certain kinds of sorting or grouping may be unable to use indexes - try to avoid them, or modify the query so that it can use indexes.
Make sure that your queries aren't using temporary tables.
Use the EXPLAIN keyword - it's very useful (see the sketch after this list).
Tune the database server itself; the default MySQL configuration is generally not optimized for performance.
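A quick sketch of using EXPLAIN from PHP with PDO; the DSN, credentials and query are placeholders:

    <?php
    // Inspect the execution plan of a suspect query (MySQL).
    $pdo = new PDO('mysql:host=localhost;dbname=example', 'user', 'secret');
    $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

    $stmt = $pdo->query('EXPLAIN SELECT * FROM articles WHERE author_id = 42');
    foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $row) {
        // A "type" of ALL with a large "rows" value usually means a full table scan.
        print_r($row);
    }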
Once you've done that, it's usually best to start working out how to use caching. The best way to speed PHP code up is to reduce the amount of work it has to do.
Make sure your database's query cache is working properly.
Use something like Memcached to store frequently used results, instead of getting them from the database (a sketch follows this list).
If you have enough memory, try to keep everything in Memcached, resorting to the database only when something isn't present in the cache.
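A minimal cache-aside sketch using the Memcached extension; the server address, key name and database helper are assumptions for illustration:

    <?php
    // Try the cache first, fall back to the database, then store the
    // result so the next request doesn't hit the database at all.
    $cache = new Memcached();
    $cache->addServer('127.0.0.1', 11211);   // example server

    $key = 'article:42';
    $article = $cache->get($key);

    if ($article === false && $cache->getResultCode() === Memcached::RES_NOTFOUND) {
        $article = load_article_from_database(42);   // hypothetical helper
        $cache->set($key, $article, 300);            // cache for five minutes
    }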
If you have chunks of pages that are dynamic, but the same for all users, try caching those chunks. For example, if two users are looking at an article, the article itself is going to be exactly the same for each user, even if the rest of the page isn't. Generate the HTML for the article, and chuck it in the cache.
If you have lots of non-authenticated users, it's entirely possible that they'll all be seeing the exact same page. Two non-authenticated users looking at the above article won't just see an identical article - they'll see an identical page, right down to the login links. Set your PHP scripts up so you can use HTTP caching headers (check the last modified date, and return a 304 Not Modified if it hasn't been changed). Once you've done that, stick a Squid reverse proxy in front of the webserver, and let Squid serve pages out of its cache.
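A bare-bones sketch of those HTTP caching headers in PHP; the helper functions are hypothetical stand-ins for however your application loads an article and its last-modified timestamp:

    <?php
    // Answer conditional requests with 304 Not Modified when the content
    // hasn't changed since the browser last fetched it.
    $lastModified = get_article_last_modified(42);   // hypothetical helper (Unix timestamp)

    header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $lastModified) . ' GMT');
    header('Cache-Control: public, max-age=60');

    if (isset($_SERVER['HTTP_IF_MODIFIED_SINCE']) &&
        strtotime($_SERVER['HTTP_IF_MODIFIED_SINCE']) >= $lastModified) {
        header('HTTP/1.1 304 Not Modified');
        exit;
    }

    echo render_article(42);   // hypothetical helper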
After that point, the general approach is to start using more servers, and the problem becomes one of scaling, rather than raw speed. The general plan is to make sure that your website has a shared-nothing architecture - all persistent data is stored in the database. Then, you install multiple webservers, move the database server to a separate machine, and run the entire thing behind a caching reverse proxy. To add more capacity, you add more machines.
One way: PHP accelerators, e.g. APC.
Another: read blog articles, e.g. a performance tuning overview.
A general question, I would say. Try looking for optimization tips online...
Several parameters are involved:
I/O access (using it a lot - file_exists, is_file overheads)
Database access (optimize queries, use stored procedures, check your db cache)
Using an opcode cache (like APC)
Compressing output
Serving js/css minified and compressed (and using subdomains to deliver them to the browser)
Using memcache to cache data into memory for faster access
You can use benchmarking tools to test your environment before and after the optimizations.
Try Apache Bench (ab), for example.
Filesize.
A file of 500 KB takes longer to download than a file of 300 KB. So optimize and crop as much as you can.
Accelerators
Self-explanatory: List of PHP accelerators
Server upgrade
Though this costs money, when dealing with a lot of traffic it will have an impact on how fast the .php files get processed and how fast data is sent to the user.
I don't recommend this though since there are other (free) ways to improve speed.
Don't use external resources
If you are linking to images on other sites, the download speed will not be under your control. Instead, if you plan on using images from others, download them to your own server first (or upload them to your own provider) and serve them from there.
Review and improve your code
Find shortcuts, remove unnecessary code, delete unused variables, reuse others, etc.
There are other ways, but I believe the above information has the most impact on your speed.
You should probably do some search for existing answers to this question, however...
APC for opcode caching
Memcached for object storing (to reduce the number of database queries)
Check for / optimize slow SQL queries
Measure and find bottlenecks
Don't rely on (slow) web services on each page load, etc.
Yahoo has some good basic advice on speeding up web pages, much of it very easy to implement. You may also want to download YSlow + Firebug for Firefox; they will help indicate possible basic bottlenecks from a client request perspective.
The rest of the advice here is good, so I won't add much else other than: don't bother optimising any code until you're 100% sure that you've found a bottleneck. I can't stress that enough. Don't waste time tweaking code or implementing new things (i.e. caching) because you "feel" they will make things quicker; act only on real evidence (i.e. performance profiling).
