I am using WordPress as my content management system.
I have been trying to improve its performance for a while, and the tool I am using to analyse my website is GTmetrix.
I have done a lot of performance optimization, but there is one problem I can't solve; actually, I don't even know where it is being generated.
The problem is related to the Expires headers for the following files:
connect.facebook.net/en_US/sdk.js#xfbml=1&version=v2.3
platform.twitter.com/widgets.js
platform.stumbleupon.com/1/widgets.js
apis.google.com/js/platform.js
www.facebook.com/impression.php/f38617d6c3bf89e/lid=115&payload=%7B%22source%22%3A%22jssdk%22%7D
I am also using WooCommerce.
Can you help me remove these files from my website, or at least add Expires headers to them? I think that is not possible, since these files are being loaded from somewhere else, right?!
You cannot change the headers sent by websites you don't control.
Facebook, Twitter, StumbleUpon and Google all control their own headers and have their own reasons for the expiration time and other cache settings that they send through.
You can only make these changes on a domain that you control. In this case your options are:
Consider alternative ways to connect to these social networks. For example, you could code your own custom scripts, or find a WordPress plugin that runs locally on your page (even with this, you may not be able to remove the reliance on these external scripts entirely, depending on the exact functionality you want).
Don't get hung up trying to get good results for everything GTMetrix tells you. Don't get me wrong, GTMetrix's suggestions are great and you can learn a lot from them, but it's rarely possible to do absolutely everything 100% of the time. Do your best, do what's under your control, and you'll definitely notice your site's speed (and hopefully ranking) increase.
For what it's worth, installing a local caching plugin like W3 Total Cache generally deals with several of GTMetrix's metrics in one go!
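For the files you do serve from your own domain, caching headers are fully under your control. If the response goes through PHP, a minimal sketch like this works (the one-year lifetime is just an example value; static files are normally handled in the server config or by a caching plugin instead):

    <?php
    // Minimal sketch: far-future caching headers for a response your own PHP code serves.
    // The one-year lifetime is an example; adjust it to how often the asset changes.
    $oneYear = 365 * 24 * 60 * 60;
    header('Cache-Control: public, max-age=' . $oneYear);
    header('Expires: ' . gmdate('D, d M Y H:i:s', time() + $oneYear) . ' GMT');
    // ...then output the asset or page as usual.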
Trying to get the loading time of a WordPress website (with three.js) - https://igotchamedia.com/arvr - down from 6 seconds to under 1.5s; the "Waiting" and "Receiving" parts of the page load are taking the bulk of the time. Caching plugins did not help.
Any help much appreciated!
Your times are slow for the initial GET request for the root document of the site; that delay is called Time to First Byte (TTFB). You have a redirect from the non-https site to the https site, and that is part of the slowness issue.
You can get rid of the redirects depending on how you implement SSL on your site and in WordPress: either with a redirect in .htaccess (not the best), or simply by making sure your WordPress site and home address settings are https and all URLs in the database are https, so that no redirects are needed.
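If it helps, one way to pin WordPress to https (so it stops relying on the redirect for its own links) is a pair of constants in wp-config.php; the domain below is a placeholder, and you would still need to search-and-replace any http URLs stored in the database:

    // Sketch for your existing wp-config.php; https://example.com is a placeholder
    // for your own domain. With these set, WordPress generates https URLs itself,
    // so no redirect is needed for internal links.
    define( 'WP_HOME',    'https://example.com' );
    define( 'WP_SITEURL', 'https://example.com' );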
But overall, slow TTFB is a server-lag issue. If you are on a shared host, TTFB can suffer because of all the other users on the server. Your overall speed - 4 seconds - is not bad for a very image-heavy site with a fairly high number of HTTP requests: https://gtmetrix.com/reports/igotchamedia.com/GLQwMRRs
You can talk to the web host about the TTFB issues, but GoDaddy shared hosting is well known to be slow.
If you want to get under a few seconds, don't depend on a caching plugin to do all the work. 1) Get a better server and use a CDN; 2) lower the weight of the images and get the total weight of the site under 1 MB; 3) use a theme that requires fewer scripts and stylesheets, since those drive up the number of HTTP requests; and 4) keep your external requests, like third-party fonts, to a minimum.
The vast majority of the time, a WordPress site (or any website) is slow because of too much rich media (video and images) hosted or used on the site.
Try going back in and optimizing your website's media files for the web. Save them at a smaller weight, and use better practices with file extensions. Also, CSS3 techniques are powerful these days, and in many cases you don't even need to call so many image files, e.g. for backgrounds, gradients, shadows, menus, buttons, etc. If you have lots and lots of media being hosted on your site, consider using a CDN (content delivery network).
Another good practice: go in and make sure your website was coded properly. Call external .js files in the footer, keep the HTML semantic, etc. Test the WordPress plugins you are using. Make sure your WordPress install is up to date and has the correct memory limit set. Check your developer console for any JavaScript errors or conflicts bogging the site down. Check your WordPress database for any corruption.
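For the "scripts in the footer" point: if the theme is yours to edit, a rough sketch of the usual WordPress way looks like this (the handle and file path are made up for illustration):

    // Sketch for your theme's functions.php. 'my-site-scripts' and js/site.js are
    // hypothetical; the final true asks WordPress to print the script just before </body>.
    function my_theme_enqueue_scripts() {
        wp_enqueue_script(
            'my-site-scripts',
            get_template_directory_uri() . '/js/site.js',
            array(),   // dependencies
            '1.0',     // version
            true       // load in the footer
        );
    }
    add_action( 'wp_enqueue_scripts', 'my_theme_enqueue_scripts' );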
Lastly, check your host or server environment. Make sure you're using the correct version of PHP, you have enough space, everything is efficient, etc.
Also, check out these additional site performance optimization tools.
https://www.pingdom.com/
https://developers.google.com/speed/pagespeed/
Hope this helps, g'luck!
I would like to know if I can exclude some of the HTML from being cached. I am using MediaWiki software. So any MediaWiki solution or any other PHP solution will work as well.
My MediaWiki pages are cached, and I am implementing a site notice feature that expires after a few days. But when pages are cached, the notice doesn't honor my expiration date and is displayed all the time, so I want to exclude that part of the code from being cached. I am implementing it as a MediaWiki extension.
Thanks
MediaWiki caching works in many layers. There are a number of server-side caches, apart from the caching in the client. (As you might have noticed, MW is notoriously slow unless you implement at least some of the caching functionality.)
First of all you will want to figure out which sitenotices are cached. As I'm sure you are aware, there is more than one place where you can set the sitenotice:
MediaWiki:Anonnotice
MediaWiki:Sitenotice
$wgSiteNotice in LocalSettings.php
Through a few different extensions
Do they all stay on the page for too long?
Secondly, you can try and figure out where your sitenotice is cached:
Is there any difference if you are logged in or out? Some parts of the interface can be cached more aggressively for anonymous users.
Does the message disappear if you clear your browser's cache?
Try disabling Varnish (or Squid), if you are using either of them.
Temporarily disable $wgEnableParserCache to see what difference it makes.
Set $wgCachePages = false; in your LocalSettings.php to try to disallow client-side caching.
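Roughly, those last few steps could look like this in LocalSettings.php; the exact variable names depend on your MediaWiki version, so treat this as a temporary debugging sketch only:

    // Temporary debugging sketch for your existing LocalSettings.php - re-enable caching
    // when done. Variable availability depends on the MediaWiki version; these are the
    // classic ones.
    $wgUseSquid          = false;  // stop relying on Varnish/Squid in front of the wiki
    $wgEnableParserCache = false;  // bypass the parser cache
    $wgCachePages        = false;  // ask clients not to cache rendered pages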
I'm working on a website that can be found here:
http://odesktestanswers2013.com/Metareviewer
The index appears to be unusually slow (slowing down the browser as it loads), even though YSlow doesn't seem to see anything particularly wrong with it and my PHP microtime() measurements return a fine value.
What other things should I be looking into?
Using Chrome Developer Tools, the network tab shows this:
... a timeline of what's loading in your page.
There are also plenty of good practices that aren't being followed here. Some of these can also be flagged up by using Google Chrome's Audit tool (F12 menu), but in my opinion the most important are:
Use a CDN for serving common library code. Do you really need to host jQuery yourself? (Side rant: do you really need jQuery at all?)
Your JavaScript files are taking a long time to load, because they are all served as separate HTTP calls. You can combine them into a single JavaScript file, and also minify them to save lots of bandwidth.
Foundation.css is very large - not that there's a problem with large CSS files, but it looks like there are over 2000 rules in the CSS file that aren't being used on your site. Do you need this file?
CACHE ALL THE THINGS - there are 26 HTTP requests that are made, that are uncached, meaning that everyone who clicks on your site will have to download everything, every request.
The whole bandwidth can be reduced by about two thirds if you enable gzip compression on your server (or even better, implement SPDY, but that's a newer technology with less of a community) - see the sketch after this list for a PHP-only fallback.
Take a look on http://caniuse.com - there are a lot of CSS technologies that are supported in modern browsers without the need for -webkit or -moz, which could save a fortune of kebabbobytes.
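On the gzip point: compression is best switched on at the server level (mod_deflate on Apache), but if you can only touch the PHP side, a minimal sketch like this compresses the output PHP itself generates:

    <?php
    // Minimal sketch: gzip PHP output when the server config is out of reach.
    // Prefer mod_deflate/mod_gzip at the Apache level if you can change it.
    if (function_exists('ob_gzhandler')) {
        ob_start('ob_gzhandler');   // compresses only for clients that accept gzip
    } else {
        ob_start();                 // zlib not available: plain output buffering
    }
    // ...generate the page as usual...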
If I could change one thing on your site...
Saying all of that, each point above will make a very small (but cumulative) difference to the speed of your site, but it's probably a good idea to attack the worst offender first.
Look at the network graph. While all that JavaScript is being downloaded, it blocks the rest of the page from downloading.
If you're lazy, just move it all to the end of the document body. That way, the rest of the page will download before the JavaScript has to, but this could harm the execution of your scripts if they are programmed in particular styles.
Hope this helps.
You should also consider using http://www.webpagetest.org/
It's one of the best tools when it comes to benchmarking your site's performance.
You can use this site (http://gtmetrix.com/) to analyze the causes and to fix them. The site provides the reasons as well as solutions, such as JS and CSS in optimized formats.
As per this site's report, you need to optimize images and minify the JS and CSS files. The optimized images and JS and CSS files can be downloaded from the site.
Use Google Chrome -> F12 -> Network and check the connect, send, receive, etc. times for each resource used in your page.
It looks like your CSS and JS scripts have very long connect and wait times.
You can use YSlow, an add-on available for both Chrome and Firefox.
YSlow analyzes web pages and suggests ways to improve their performance based on a set of rules for high performance web pages.
The above is the link for the Firefox add-on; you can also search for the Chrome version, and it is freely available.
YSlow gives you details about your website's front end. Most likely you have a script that is looping one too many times in the background.
If you suspect that a sequence of code is hanging server side then you need to do a stack trace to pinpoint exactly where the overhead is taking place.
I recommend using New Relic.
Try to use Opera. Right click -> Inspect element -> Profiler.
Look at Inspect element -> Errors.
I am currently creating a website in PHP that has a database backend (it can be either MySQL or SQL Server), and I realized recently that if my database crashes at any time, my website will not run properly, which will probably cause some headaches.
So what is the proper thing to display on the website if my database (or any crucial outside component) goes down? My particular website relies heavily on its database and will be almost useless without it.
One option I have been told is to email the website admin and display a Error 500 page that says something is wrong with the server and basically make the website unusable till the issue is fixed. Is there anything else I could do to work around this problem? Are there any ways to design a website so that the database (any crucial component) crashing isn't an issue?
I am looking for general rules of thumb as well as specific examples of how people have worked around this in the past. Also, these examples don't just have to be for my website example.
If you only have one database server and the website cannot work without its database, there is no magic: you'll have to display some sort of nice error page, informing users that there is a technical problem and that the website will come back shortly.
Generally speaking:
Chances of such a problem are pretty low
If your website is a normal one, people will tend to accept a problem once in a while, especially if you communicate about it.
If you can afford it (and have the technical knowledge to set this up), you could use two database servers with replication between them (MySQL supports this): one master, which you use, and a slave, which serves as a backup.
Then, if the master falls, your application will use the slave.
Of course, this will greatly reduce the risk of a database-related problem (having two servers crash at the same time is quite unlikely), but you'll still have problems with all the other components -- like your web server: if you only have one, you might want to consider using two, with the second one as a fallback.
After that, if you still have money (and think you need even better uptime for your website), you'll want to think about the case where your datacenter has a problem -- setting up servers in two separate locations...
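Very roughly, the "use the slave if the master falls" idea can look like the sketch below at the application level; the hostnames and credentials are placeholders, and in a real setup failover is usually handled by the infrastructure (a proxy or managed replication) rather than by application code:

    <?php
    // Rough sketch only: placeholder hosts/credentials; real failover normally lives
    // in the infrastructure, not in application code.
    function get_db_connection() {
        $hosts = array('db-master.example.com', 'db-slave.example.com');
        foreach ($hosts as $host) {
            try {
                return new PDO("mysql:host=$host;dbname=app", 'user', 'secret', array(
                    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
                    PDO::ATTR_TIMEOUT => 2,  // fail fast so the next host can be tried
                ));
            } catch (PDOException $e) {
                error_log("Connection to $host failed: " . $e->getMessage());
            }
        }
        return null;  // everything is down: show the "technical problem" page
    }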
The proper thing to display is a simple "oops" error message that gives away no information that would be helpful to hackers. Something along the lines of "We're experiencing technical difficulties" or "website unavailable". This is for security purposes.
It would be good to have an error logging and notification system in place to notify an administrator in case of a crash. That would be fairly simple to write, but I'm sure there are already libraries that handle this. (There's a tutorial with code samples at http://net.tutsplus.com/tutorials/php/404403-website-error-pages-with-php-auto-mailer/ and a simpler example at http://www.w3schools.com/php/php_error.asp)
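A bare-bones version of such a notification hook might look like this; the admin address is a placeholder, and a real site should rate-limit these mails so a prolonged outage doesn't flood the inbox:

    <?php
    // Bare-bones sketch: log uncaught exceptions and mail the admin.
    // 'admin@example.com' is a placeholder; add rate limiting before relying on this.
    set_exception_handler(function ($e) {
        error_log('Uncaught exception: ' . $e->getMessage());
        mail('admin@example.com', 'Website error', $e->getMessage());
        echo 'We are experiencing technical difficulties. Please try again later.';
    });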
There are ways to design the architecture of your web site to handle a database component crashing. It's not just architecting your website, it's architecting the whole environment. For example, database clustering for high availability (http://en.wikipedia.org/wiki/High-availability_cluster). It's not cheap.
Overall, you just need to ensure that you're doing your error handling properly. A database crash is a classic example of why we need error handling. There are plenty of resources and guidance for this.
http://www.google.com/search?q=Error+Handling+Guidelines&rls=com.microsoft:en-us&ie=UTF-8&oe=UTF-8&startIndex=&startPage=1
Edit
I found this and thought it was a very nice resource for answering how to handle the errors:
http://www.nyphp.org/PHundamentals/7_PHP-Error-Handling
It is considered best practice to return an HTTP 500 status code in the event that your database being down, or any other crippled service, prevents your website from functioning properly. Depending on your website's functionality, this could be on a page-by-page basis or site-wide. For example, your "About Us" page may not need database capabilities while your search page would. You could thus keep the "About Us" page up and running but return a 500 status code when someone goes to your search page.
Do not give any technical information about why the site is not working to the end user. This could be a security risk.
If you are using apache, this document will tell you how to setup custom error pages:
http://httpd.apache.org/docs/2.0/custom-error.html
I recommend you use plain HTML for your 500 status code pages. You can also have your PHP pages send a 500 status code via the header() function, documented here:
http://php.net/manual/en/function.header.php
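Putting those two pieces together, a sketch of the database check might look like this; connect_to_database() and 500.html are placeholders for your own connection code and static error page:

    <?php
    // Sketch: return a 500 and a static page when the database is unreachable.
    // connect_to_database() and 500.html are hypothetical stand-ins for your own code/files.
    $db = connect_to_database();
    if ($db === false) {
        header('HTTP/1.1 500 Internal Server Error');
        readfile(__DIR__ . '/500.html');  // plain HTML, no technical details
        exit;
    }
    // ...normal page logic continues here...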
What's the best way (or ways?) to speed up a PHP web site, and how much faster can it get using this or that approach?
PHP isn't really the kind of language where you can do micro-optimizations, or just work on the code alone. There's really no point. Although PHP isn't particularly fast, PHP itself is rarely the bottleneck in a given web site.
You need to work out where that bottleneck is before you can fix it. There are a lot of common bottlenecks, with common solutions. It's difficult to generalize, given so few details, but there are a lot of performance hints that apply to most web sites.
The first good place to look is actually on the client side, rather than the server side. How large are your pages (including images, CSS, JavaScript and the like)? How many HTTP requests does a single page view require? Use something like Firebug (and the YSlow add-on for Firebug) to see how long your page actually takes to load, and which bits of your page cause the problem. Some general hints:
Work out ways to shrink the CSS and JavaScript - remove anything you don't need, and run the rest through a tool like YUI Compressor.
If you have multiple CSS and JavaScript files, try to combine them into a single file.
Optimize all of your images as much as possible, and see if you can combine any of those into a single file using CSS sprites or similar. PunyPNG is good for lossless images. A decent JPEG encoder (NOT Photoshop) is good for photos.
Move the CSS to the top of the page, and the JavaScript to the bottom, so the browser can render the page before the JavaScript has finished downloading.
Make sure that all of your CSS, JavaScript and HTML are being served compressed.
Make sure that you're using appropriate caching - if a file hasn't changed, there's no point in re-downloading it.
Once you've got the client side out of the way, you might have to turn your attention to the server side.
Install an opcode cache, like APC, XCache, or Zend Optimizer. It's very easy to do, and will always provide some improvement. Once you've done that, profile your pages, to find out where the time is actually being spent.
More likely than not, you'll be spending most of your time waiting for the database to return results. So, at a bare minimum:
Work out which queries are taking the longest, and work on them first. Use your head though - a query that takes five seconds on an admin page that nobody looks at is not as important as a query that takes one second on the front page.
Make sure that your query uses appropriate indexes. No common query should ever need to do a full table scan. Certain kinds of sorting or grouping may be unable to use indexes - try to avoid them, or modify the query so that it can use indexes.
Make sure that your queries aren't using temporary tables.
Use the EXPLAIN keyword - it's very useful; there's a short sketch after this list.
Tune the database server itself; the default MySQL configuration is generally not tuned for performance.
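As a concrete illustration of the EXPLAIN suggestion, here is a dev-only sketch; the connection details, table, and query are made up:

    <?php
    // Dev-only sketch: prefix a suspect query with EXPLAIN and dump the plan.
    // Connection details and the articles table are hypothetical.
    $mysqli = new mysqli('localhost', 'user', 'secret', 'app');
    $result = $mysqli->query('EXPLAIN SELECT * FROM articles WHERE author_id = 42');
    while ($row = $result->fetch_assoc()) {
        print_r($row);  // check the "key" and "rows" columns for missing indexes
    }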
Once you've done that, it's usually best to start working out how to use caching. The best way to speed PHP code up is to reduce the amount of work it has to do.
Make sure your database's query cache is working properly.
Use something like Memcached to store frequently used results, instead of getting them from the database.
If you have enough memory, try to keep everything in Memcached, resorting to the database only when something isn't present in the cache.
If you have chunks of pages that are dynamic, but the same for all users, try caching those chunks. For example, if two users are looking at an article, the article itself is going to be exactly the same for each user, even if the rest of the page isn't. Generate the HTML for the article, and chuck it in the cache.
If you have lots of non-authenticated users, it's entirely possible that they'll all be seeing the exact same page. Two non-authenticated users looking at the above article won't just see an identical article - they'll see an identical page, right down to the login links. Set your PHP scripts up so you can use HTTP caching headers (check the last modified date, and return a 304 Not Modified if it hasn't been changed). Once you've done that, stick a Squid reverse-proxy in front of the webserver, and let Squid serve pages out of its cache.
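A minimal sketch of that last point (the Last-Modified / 304 handshake) for a PHP-generated page; where the timestamp comes from is up to your application, so the value below is a placeholder:

    <?php
    // Minimal conditional-GET sketch. $lastModified would come from your own data,
    // e.g. the article's updated_at column; the literal date here is a placeholder.
    $lastModified = strtotime('2015-06-01 12:00:00');
    header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $lastModified) . ' GMT');

    $ifModifiedSince = isset($_SERVER['HTTP_IF_MODIFIED_SINCE'])
        ? strtotime($_SERVER['HTTP_IF_MODIFIED_SINCE'])
        : false;

    if ($ifModifiedSince !== false && $ifModifiedSince >= $lastModified) {
        header('HTTP/1.1 304 Not Modified');
        exit;  // nothing changed: the browser (or Squid) reuses its copy
    }
    // ...otherwise render the full page as usual.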
After that point, the general approach is to start using more servers, and the problem becomes one of scaling, rather than raw speed. The general plan is to make sure that your website has a shared-nothing architecture - all persistent data is stored in the database. Then, you install multiple webservers, move the database server to a separate machine, and run the entire thing behind a caching reverse proxy. To add more capacity, you add more machines.
One way: PHP accelerators, e.g. APC.
Another: read blog articles, e.g. a performance tuning overview.
A general question, I would say. Try looking for optimization tips online...
Several parameters are involved:
I/O access (using it a lot - file_exists, is_file overheads)
Database access (optimize queries, use stored procedures, check your db cache)
Using an opcode cache (like APC)
Compressing output
Serving js/css minified and compressed (and using subdomains to deliver them to the browser)
Using memcache to cache data into memory for faster access
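For the memcache point, a sketch of the usual cache-aside pattern with the Memcached PECL extension; the key name, TTL, and load_articles_from_db() helper are all hypothetical:

    <?php
    // Cache-aside sketch using the Memcached PECL extension.
    // Key, TTL and load_articles_from_db() are hypothetical placeholders.
    $cache = new Memcached();
    $cache->addServer('127.0.0.1', 11211);

    $articles = $cache->get('front_page_articles');
    if ($articles === false) {                  // cache miss: query the database once
        $articles = load_articles_from_db();    // placeholder for your own query code
        $cache->set('front_page_articles', $articles, 300);  // keep for 5 minutes
    }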
You can use benchmarking tools to test your environment before and after the optimizations.
Try Apache Bench (ab), for example.
Filesize.
A file of 500 KB takes longer to download than a file of 300 KB. So optimize and crop as much as you can.
Accelerators
Self-explanatory: see the List of PHP accelerators.
Server upgrade
Though this costs money, when dealing with a lot of traffic it will have an impact on how fast the .php files get processed and how fast data is sent to the user.
I don't recommend this though since there are other (free) ways to improve speed.
Don't use external resources
If you are linking images through other sites, the download speed will not be in your control. Instead, if you plan on using images from others, download them to your own server first (or upload them to your own provider) and load them that way.
Review and improve your code
Find shortcuts, remove unnecessary code, delete unused variables, reuse others, etc.
There are other ways, but I believe the above information has the most impact on your speed.
You should probably do some search for existing answers to this question, however...
APC for opcode caching
Memcached for object storing (to reduce the number of database queries)
Check for / optimize slow SQL queries
Measure and find bottlenecks
Don't rely on (slow) web services on each page load, etc.
Yahoo has got some good basic advice on speeding up web pages, much of it very easy to implement. You may also want to download YSlow + Firebug for Firefox; they will help indicate possible basic bottlenecks from a client request perspective.
The rest of the advice here is good, so I won't add much else other than this: don't bother optimising any code until you're 100% sure that you've found a bottleneck. I can't stress that enough. Don't waste time tweaking code or implementing new things (e.g. caching) because you "feel" they will make things quicker; act only on real evidence (i.e. performance profiling).