Trying to get the loading time of a WordPress website (with three.js) - https://igotchamedia.com/arvr - down from 6 seconds to under 1.5 seconds: the "Waiting" and "Receiving" parts of the page load are taking the bulk of the time. Caching plugins did not help.
Any help much appreciated!
Your times are slow for the initial GET request for the root document of the site; that delay is called Time to First Byte (TTFB). You have a redirect from the non-https site to the https site, and that is part of the slowness.
You can get rid of the redirects depending on how you implement SSL on your site and in WordPress: either with a redirect in .htaccess (not the best), or simply by making sure your WordPress site and home address settings are https and all URLs in the database are https, so that no redirects are needed.
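For the second option, here is a minimal sketch of the wp-config.php side (the domain is this site's; note these constants override the database settings but do not rewrite URLs already stored in post content):

    <?php
    // In wp-config.php: pin the site and home URLs to https so WordPress
    // never generates http links that would trigger a redirect.
    define('WP_HOME',    'https://igotchamedia.com');
    define('WP_SITEURL', 'https://igotchamedia.com');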
But overall, a slow TTFB is a server-lag issue. If you are on a shared host, TTFB can suffer because of all the other users on the server. Your overall load time - 4 seconds - is not bad for a very image-heavy site with a fairly high number of HTTP requests: https://gtmetrix.com/reports/igotchamedia.com/GLQwMRRs
You can talk to your webhost about the TTFB issue, but GoDaddy shared hosting is well known to be slow.
If you want to get under a few seconds, don't depend on a caching plugin to do all the work. 1) Get a better server and use a CDN; 2) lower the weight of the images and get the total weight of the site under 1 MB; 3) use a theme that requires fewer scripts and style sheets, since those drive up the number of HTTP requests; and 4) keep external requests, like third-party fonts, to a minimum.
The vast majority of the time, when a WordPress site (or any website) is slow, it is because of too much rich media (video and images) hosted on or used by the site.
Try going back in and optimizing your website's media files for the web: save them at a smaller weight and use sensible file formats. Also, CSS3 techniques are powerful these days, and in many cases you don't even need to call so many image files, e.g. for backgrounds, gradients, shadows, menus, and buttons. If you have lots and lots of media hosted on your site, consider using a CDN (content delivery network).
Another good practice: go in and make sure your website was coded properly - external .js files are loaded in the footer, the HTML is semantic, etc. Test the WordPress plugins you are using. Make sure your WordPress install is up to date and has an adequate memory limit set. Check your developer console for any JavaScript errors or conflicts bogging the site down. Check your WordPress database for corruption.
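For the footer point, a sketch using the standard WordPress enqueue API (the handle and file path are hypothetical):

    <?php
    // In the theme's functions.php: load a script in the footer instead of
    // the head; the final `true` argument is the $in_footer flag.
    add_action('wp_enqueue_scripts', function () {
        wp_enqueue_script(
            'my-script',                                       // hypothetical handle
            get_template_directory_uri() . '/js/my-script.js', // hypothetical path
            array(), // no dependencies
            '1.0',   // version
            true     // load in footer
        );
    });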
Lastly, check your host or server environment. Make sure you're using the correct version of PHP, that you have enough space, and that everything is running efficiently.
Also, check out these additional site performance optimization tools.
https://www.pingdom.com/
https://developers.google.com/speed/pagespeed/
Hope this helps, g'luck!
I am currently tasked with finding a solution for a serious PHP bottleneck which is apparently caused by server-side minification of CSS and JS when our sites are under high load.
Some details and what I have found out so far
I inherited a web application running on WordPress which uses a complex constellation of Doctrine, Memcached, and W3 Total Cache for minification and caching. Under heavy load our application begins to slow down rapidly. So far we have narrowed part of the problem down to the server-side minification process: preliminary analysis has shown that the number of PHP processes starts to stack up under load, and once the limit of 500 processes is reached, everything slows down. This is also mentioned by the author of the minify library.
Solutions I have evaluated so far
Pre-minification
The most logical solution would be to pre-minify the files before going live. Unfortunately, our workflow demands that non-developers be able to edit said files on our production servers (i.e. after the web app has gone live). Therefore I think pre-processing is out of the question, as it limits the editability of minified files.
Serving unminified files
75% of our users access our web application from mobile devices, especially smartphones. Unminified JS and CSS amounts to 432 KB and shrinks by 60-80% when minified. Serving unminified files would solve the performance and editability problems, but for the sake of mobile users it is out of the question.
I understand that this is as much a workflow problem as a technical one, and I guess we are open to working on both as long as we end up with better overall performance.
My questions
Is there a reasonable compromise which solves the PHP bottleneck problem, allows non-devs to make changes to live CSS/JS, and still serves reasonably sized files to clients?
If there is no such one-size-fits-all solution, what can I do to improve our workflow and/or server-side behaviour?
EDIT: Because there were some questions/comments regarding the server configuration: our servers run Debian and are equipped with 32 GB of RAM and 24-core CPUs.
You can run a CSS/JavaScript build tool like Gulp or Grunt via Node.js that minifies all your JS and CSS assets on change.
This service can run in production, but that is not recommended without some architectural setup (having multiple versioned compiled files and auto-checking them via Gulp or another extension).
I emphasize that patching features into production and editing it directly is strongly discouraged, as it can expose live issues to your visitors and damage your credibility.
http://gulpjs.com/
Using Gulp/Grunt would require you to change how you write your CSS/JavaScript files.
I would solve this with two changes: first, remove any WP-CRON operation that runs every time a user hits the application and move it to a real cron job on the server; second, use load balancing so that a single server is not carrying all of the work. That is your real problem, and even if you fix your perceived code issues you are still faced with the load issue.
I don't believe you need to change your workflow at all or go down the road of major modifications to your existing system.
The WP-CRON tasks that run each time a page is loaded cause significant load and slowness. You can shift that work from visitors' page views to a scheduled job at the server level (see the sketch below the guide). This reduces load, and these processes are most likely part of what you believe is slowing down the site.
See this guide:
http://www.inmotionhosting.com/support/website/wordpress/disabling-the-wp-cronphp-in-wordpress
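The guide boils down to something like this (the domain and the 5-minute interval are examples):

    <?php
    // In wp-config.php: stop WordPress from spawning wp-cron.php on page views.
    define('DISABLE_WP_CRON', true);

    // Then fire it from real cron instead, e.g. with a crontab entry like:
    //   */5 * * * * wget -q -O - "https://example.com/wp-cron.php?doing_wp_cron" >/dev/null 2>&1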
Next: load balancing. Having a single server supply all your users when you have a lot of traffic is a terrible idea; you need to split the web servers' load.
I'm not sure where or how you are hosted, but I would move things to AWS and set up the WordPress site for load balancing there: http://www.mornin.org/blog/scalable-wordpress-amazon-web-services/
This will involve:
Load Balancer
EC2 Instances running PHP/Apache
RDS for your database storage for all EC2 instances
S3 storage for the site's media
For user sessions, I suggest you just set up stickiness on the load balancer so users keep being served by the node they arrived on.
The guide linked above walks through this setup in detail. Server Fault covers another approach:
https://serverfault.com/questions/571658/load-balancing-wordpress-on-amazon-web-services-managing-changes
The assumption here is that if you have high traffic, you are making revenue from it, so any time your service responds slowly it will turn users away or discourage them from returning. Changing the software could help, but you would be treating the symptom, not the illness. The illness is that your server comes under heavy load. This isn't uncommon with WordPress and high traffic, so you need to spread the load instead of trying to micro-optimize; the optimizations will bring small gains, while load balancing actually solves the problem.
Finally, consider using a CDN to serve all of your media. This loads media faster and removes load from your system by reducing the number of requests to your server and its output to clients. It also makes pages load consistently faster for visitors wherever they are, by serving media from the nodes closest to them. At AWS this is called CloudFront. WordPress also offers a similar service free via Jetpack (I believe), but from my understanding it does not handle all media.
I like the idea of using Gulp. One thing you might consider is a wp-cron task, or just a system cron job that runs every 5 minutes or so, which triggers a build task to minify and concatenate your CSS and JS files.
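If you would rather not add a Node toolchain, the same idea works as a plain PHP script run from cron (e.g. */5 * * * * php /path/to/minify-css.php). This is only a rough sketch: the paths are made up, and the "minification" is a naive comment/whitespace strip, not a real parser like the one in the minify library:

    <?php
    // Concatenate and crudely minify all CSS files, out of the request path.
    $srcDir  = '/var/www/assets/css';               // hypothetical source dir
    $outFile = '/var/www/assets/build/all.min.css'; // hypothetical output

    $css = '';
    foreach (glob("$srcDir/*.css") as $file) {
        $css .= file_get_contents($file) . "\n";
    }
    $css = preg_replace('!/\*.*?\*/!s', '', $css); // strip /* comments */
    $css = preg_replace('/\s+/', ' ', $css);       // collapse whitespace

    // Write atomically so a request never sees a half-written file.
    file_put_contents("$outFile.tmp", trim($css));
    rename("$outFile.tmp", $outFile);

Because the work happens out of band, a traffic spike can no longer stack up minification processes, while non-devs can keep editing the source files directly on the server.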
Another option, which doesn't require scheduling but instead watches the file system and triggers a build when something changes, is incron (inotify cron); check out the incron man page. Incron is great in that it triggers actions based on file-system events such as file changes, so you could use it to trigger a Gulp build whenever any CSS file on the file system changes.
One caveat is that this is a Linux solution so if you're hosting on Windows you might have to look for something similar.
Edit:
Incron Documentation
I'm hosting WordPress on a Debian machine with nginx and PHP-FPM. Sometimes, especially on the first request, the performance is not amazing: it takes up to 2.7 seconds just to generate the HTML source! When I refresh the page, it goes down to about 700 ms. To me it looks like an issue with the theme or a plugin, because on the same server I have a second WordPress installation which uses the same server-side configuration but always loads quite fast (~400 ms for HTML generation!).
My suspicion is that the theme or a plugin is making slow remote requests; there are widgets included that, for example, load the number of likes from a Facebook page, which slows generation down even more. I would like to find a way to debug the cause of this, for example by intercepting all remote requests made through functions like file_get_contents, curl, and so on.
I could, of course, disable every single plugin and install another theme to isolate the problem. But since a single plugin can be built on thousands of lines of code, that would cost a lot of time. Is there any kind of debugging that can pinpoint the problem more quickly? Xdebug seems to offer something like this, but I have never worked with it and currently don't really have time to get familiar with it.
Any external API calls on initial page load will indeed slow rendering time. For social widgets, you can fetch the data with Ajax after page load, or even better, query those social sites once a day and store the results - in a simple DB table such as wp_social_data, in a cache, or in a JSON file, whatever works for you - then render the stored data on page load instead of making an external HTTP(S) call. That solves the external-API part.
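To find out which external calls are slow in the first place, one option is to log everything that goes through the WordPress HTTP API. Here is a sketch using two standard WordPress hooks; note it catches wp_remote_get() and friends, but not plugins that call file_get_contents() or curl directly:

    <?php
    // mu-plugin sketch: time every request made through the WordPress HTTP API.
    add_filter('pre_http_request', function ($short_circuit, $args, $url) {
        $GLOBALS['remote_req_start'][$url] = microtime(true);
        return $short_circuit; // false = let the request proceed as normal
    }, 10, 3);

    add_action('http_api_debug', function ($response, $context, $class, $args, $url) {
        $start    = $GLOBALS['remote_req_start'][$url] ?? null;
        $duration = $start ? round(microtime(true) - $start, 3) : '?';
        error_log("remote request: {$url} took {$duration}s"); // check the PHP error log
    }, 10, 5);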
For nginx and PHP-FPM in general: compression should be enabled, caching should be set up for static assets or pages, and realistic values for the other settings should be in place, depending on your application. You can find the recommended nginx server block for WordPress in the WordPress documentation.
I have a dynamic php (Yii framework based) site. User has to login to do anything on the site. I am trying to understand how caching and CDN work; and I am a bit confused.
Caching (memcache):
My site has a good amount of CSS, JS, and images. I've been given to understand that enabling caching (memcache?) will GREATLY speed up my site. But this has me confused: how does caching help? How can you cache something that comes out of the DB differently for each user? For instance, user 1 logs in and sees his control panel; user 2 logs in and sees their own.
How do I determine what to cache? Plus, how do I enable caching (memcaching)?
CDN:
I have been told to use a content delivery network like CloudFlare. It is supposed to automatically cache my site. So when user 1 logs in, what will it cache? Will it cache only the homepage CSS, JS, and homepage images, since everything else requires login? And what happens when the user logs out - do sessions interfere with the workings of a CDN?
Does serving images via a CDN significantly reduce load on my server? I don't have much cash for a clustered-server configuration, so I just want my (shared) server to be able to devote all its resources to processing PHP code. How much load can I save by using caching (something like memcache) and/or a CDN (something like CloudFlare)?
Finally,
What would be the general strategy for caching, a CDN, and basic performance optimization in this scenario? Do I need to make any changes to my PHP code to enable a CDN like CloudFlare and to enable/implement/configure caching? What can I do that would take the least amount of developer/coding time and make my site run much, much faster?
Oh, and some of my pages, like the "about us" page, are going to be static HTML too. But they won't get as many hits - except maybe the iframe page that will be used for my Facebook Page.
I actually work for CloudFlare & thought I would hop in to address some of the concerns.
"do I need to make any changes to my php-code to enable CDN like
CloudFlare and to enable/implement/configure caching? What can I do
that would take least amount of developer/coding time and will make my
site run much much faster?"
No, nothing like needing to rewrite URLs, etc. We automatically cache static content by file extension. It does require changing your DNS to point to us, however.
"Does serving up images via CDN reduce significant load on my server?"
Yes, and it should also help most visitors access the site faster and save you a fair amount on bandwidth.
"Oh wait, some of my pages like "about us" page etc. are going to be
static html too."
CloudFlare doesn't cache HTML by default. You can use Page Rules to set up more advanced caching options for things like static HTML.
Caching helps because instead of performing disk I/O for every request, the data is served from memory (i.e. from memcached). This provides a SIGNIFICANT increase in performance.
Memcache is generally used for caching data, i.e. query results.
http://pureform.wordpress.com/2008/05/21/using-memcache-with-mysql-and-php/
There are lots of tutorials.
I have only ever used Amazon S3, which is not quite a CDN - it is more of a storage platform - but it still helps take the load off my own servers when serving media.
I would put all of your static resources on a CDN so your own server does not have to serve them; this includes JS and CSS. It would not require any modification to your PHP code.
For your static pages (your about page), I'd make sure PHP isn't processing them, since there is no reason for it: your web server should serve them directly.
Caching will require changes to your code. The normal flow for caching is (a PHP sketch follows the list):
1) user makes a request
2) check if data is in cache
3) if it is not in cache do the DB query and put it in cache
4) if it is in cache retrieve it
5) return data.
You can cache anything that requires disk I/O, and you should see a speed-up.
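A minimal sketch of that flow in PHP with the Memcached extension ($db is assumed to be an existing PDO connection; the table, key name, and 5-minute TTL are hypothetical):

    <?php
    $mc = new Memcached();
    $mc->addServer('127.0.0.1', 11211);

    function get_user_profile(Memcached $mc, PDO $db, int $userId)
    {
        $key = "user_profile_{$userId}";

        $profile = $mc->get($key); // 2) check if the data is in the cache
        if ($mc->getResultCode() === Memcached::RES_NOTFOUND) {
            // 3) not in cache: do the DB query and put the result in the cache
            $stmt = $db->prepare('SELECT * FROM users WHERE id = ?');
            $stmt->execute([$userId]);
            $profile = $stmt->fetch(PDO::FETCH_ASSOC);
            $mc->set($key, $profile, 300); // cache for 5 minutes
        }
        return $profile; // 4/5) in cache: retrieve and return it
    }

If the user edits their profile while logged in, delete or overwrite the key ($mc->delete($key)) so the next request repopulates the cache.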
Memcached works by storing data (usually query results from a database, which may be on a remote server or on the same machine) in RAM on the web server. Reading data straight from memory is much, much faster than running a query against the database each time. It is typically useful for data that can safely be stored for a certain period of time because it is not subject to constant change.
For example, say you want to cache a user's account information to speed up pages where that user is logged in. You would load the information once and cache it locally; any subsequent request for that data loads in a fraction of the time it would take to fetch from the database. Obviously you need to update/recache the information if the user changes it while logged in, but a caching system that minimizes time spent waiting on the database will greatly reduce the time it takes to serve up pages.
I'm personally not familiar with CloudFlare, so I can't offer any advice on that, but for implementing caching in your application you should check out:
http://code.google.com/p/memcached/wiki/NewOverview
and read the rest of the wiki entries there, which cover installation, implementation, etc. That should get you started on the right track.
My website is loading slowly and I ran this test: http://www.webpagetest.org/result/120227_MD_3CQZM/1/performance_optimization/
It indicates that files served from gametracker.com are not being cached.
Apache and Joomla already cache the content that lives on my own server.
I'm using a script from gametracker.com to show my TeamSpeak 3 statistics on my website.
However, this script sometimes loads slowly due to issues with gametracker.com's servers, which is why I'd like to store a copy of it on my own web server as a cache and refresh it every 30 minutes from the gametracker website.
If the gametracker website is down (which is quite common), it should keep the last successfully fetched copy.
How would I do this with Apache 2.4.1 and possibly PHP?
If it's possible, I'd also like to use CSS sprites, because webpagetest.org indicates:
The following images served from gametracker.com should be combined into as few images as possible using CSS sprites.
http://cache.www.gametracker.com/images/components/html0/gt_icon.gif
http://cache.www.gametracker.com/images/components/html0/online.gif
http://cache.www.gametracker.com/images/flags/nl.gif
http://cache.www.gametracker.com/images/game_icons/ts3.png
http://cache.www.gametracker.com/images/server_info/16x16_channel_green.png
http://cache.www.gametracker.com/images/server_info/16x16_player_off.png
http://cache.www.gametracker.com/images/server_info/vs_tree_item.gif
http://cache.www.gametracker.com/images/server_info/vs_tree_last.gif
http://cache.www.gametracker.com/images/server_info/vs_tree_outer.gif
http://www.gametracker.com/images/game_icons/ts3.png
CSS sprites are a technique where you combine several icons and other small images into a single image, positioned so that one request loads all of them.
If the images aren't hosted on your site, it will be very difficult to implement, because the sprite sheet needs a strict, predictable layout that you control.
Check: http://coding.smashingmagazine.com/2009/04/27/the-mystery-of-css-sprites-techniques-tools-and-tutorials/
If you have a VPS / dedicated server, you can use mod_pagespeed: it automatically applies several combinations of the optimizations that website testing tools like.
But don't just take website optimizers and testing tools like that at face value; they suggest measures that could help, some practical, some not.
Good luck.
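To address the original caching question directly: here is a rough PHP sketch that keeps a local copy of the remote gametracker snippet, refreshes it every 30 minutes, and falls back to the last good copy when the remote site is down (the cache path and widget URL are placeholders):

    <?php
    $cacheFile = __DIR__ . '/cache/gametracker.html'; // hypothetical path
    $maxAge    = 30 * 60;                             // refresh every 30 minutes

    if (!is_file($cacheFile) || time() - filemtime($cacheFile) > $maxAge) {
        // Short timeout so a slow remote never blocks the page for long.
        $ctx  = stream_context_create(['http' => ['timeout' => 5]]);
        $body = @file_get_contents('http://www.gametracker.com/YOUR-WIDGET-URL', false, $ctx);
        if ($body !== false) {
            file_put_contents($cacheFile, $body); // fresh copy fetched
        } elseif (is_file($cacheFile)) {
            touch($cacheFile); // remote is down: keep serving the stale copy
        }                      // (and wait another 30 minutes before retrying)
    }
    echo is_file($cacheFile) ? file_get_contents($cacheFile) : '';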
I am currently running a small piece of PHP software which is accessed by two groups of people, one group in China, the other in Ireland. The server is hosted in Ireland, where pages load in between 0.3 and 0.8 seconds.
The Chinese users are recording times of between 3.0 and 10 seconds.
I have run speed tests: the Chinese users have an internet connection of 1.5 Mbit/s (tested against the server in Ireland) and a ping of 750 ms.
So far, to improve load time, I have reworked a lot of the MySQL database interaction to maximize efficiency, but this did not help; I presume it only slightly reduces server processing time and has little effect on page download time.
I was thinking: could I get hosting in China and use it as a gateway to the system in Ireland? Surely the latency and download speed between that Chinese server and Ireland would be better than the average person's connection, and the average person's requests would be re-routed through it.
Does this sound at all feasible? Or does anyone else have suggestions in relation to this?
Your biggest problem is the latency. If you can minimize HTTP requests, you should see large gains.
But it's probably easier to just buy the services of a content delivery network and move all your static files to it, making sure it has well-connected servers in China and Ireland. I think this is easier than redesigning your website.
The cheapest bang for the buck would come from sending suitable HTTP headers to tell browsers they don't need to constantly check with your server for freshness validation (i.e. avoid conditional HTTP requests). If you do this for all external page objects (images, CSS, JS, etc.), the first page load will still take 10 seconds or whatever, but subsequent page loads should be very close to your Irish visitors' times. This is just a matter of web-server configuration. To be honest, sending cache-friendly headers is always a good idea; it is how you get those snappy, responsive websites we all love.
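If some of those objects are served through PHP rather than directly by the web server, the headers in question look something like this (the 30-day lifetime is an arbitrary example):

    <?php
    // Far-future caching headers: browsers will reuse their local copy without
    // even sending a conditional (If-Modified-Since) revalidation request.
    $thirtyDays = 30 * 24 * 60 * 60;
    header('Cache-Control: public, max-age=' . $thirtyDays);
    header('Expires: ' . gmdate('D, d M Y H:i:s', time() + $thirtyDays) . ' GMT');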
First of all, follow best practices for speeding up your website, then check YSlow to measure your site's speed and see what else you can do to improve performance.
I don't think changing servers alone will give you better response times; I would worry about optimization first, and then look for a good, powerful hosting company and plan.
Cheers
Hosting/mirroring in China is definitely the way to go: it'll be faster and more reliable for the people there. Ideally you would keep hosting in Ireland too, with some kind of load balancing in place.
I would definitely mirror your site to a server somewhere in Asia. You have already made performance improvements that didn't do much for your Chinese visitors, so I assume network latency is causing most of the load time.
You can deploy your website to one of the cloud services - Amazon EC2 instances are available in Singapore and Tokyo, for example - or you can deploy to a "regular" web host in China.
If you mostly deliver static files, look into a content delivery network (CDN) service.
To speed up the load time, you may consider hosting your static files (images, CSS, JavaScript, documents) on a CDN (content delivery network). CDNs work by hosting your files on multiple nodes around the world; when a request comes in, the server closest to the visitor serves the file, reducing latency.
You may want to try CloudFlare (http://cloudflare.com): it's a free product that serves as a reverse proxy and CDN for your website. Instead of the user accessing your site directly, CloudFlare accesses it and optimizes delivery to the visitor.
Move to the cloud.
http://en.wikipedia.org/wiki/Cloud_computing
I'm in the process of doing the same thing: I will create a site mirror on the aliyun.com cloud. I'm hosting my site (www.discoverwish.com) at SiteGround in Singapore, but load times for my Chinese users without a VPN are incredibly slow.
I will use rsync to keep the information synced, then redirect Chinese users using DNSPod.
Regarding the suggestion above to try CloudFlare as a reverse proxy and CDN: we tried this for a client in China but, unfortunately, CloudFlare doesn't yet have a local version of its CDN working in China. We are currently testing Amazon's CDN, which seems to have positive reviews.