Not seeing improvement in response times after using Cloudflare CDN - php

I have a PHP, MySQL, Apache based website. The hosting server is located in London.
I opened a Cloudflare account, configured my website to route through Cloudflare, and enabled caching for static content.
I ran a page load test from different countries and could not see any improvement.
The test tool, however, detects that I am making effective use of a CDN, but there isn't any performance improvement.
1. My static resources each take around 20ms to download when accessed from London.
2. When accessed from other countries, these resources take roughly 600ms.
Am I missing something?

A couple of things to check:
Are you sending appropriate Cache-Control headers for the static content, and have you confirmed that Cloudflare is actually caching it? (You should see "HIT" in one of the response headers.)
Have you run multiple tests for the resources? Perhaps they just hadn't been cached yet, so you were measuring the timing of the Cloudflare->origin fetch.
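A minimal sketch of both checks, under a few assumptions: the first part sends cache-friendly headers for a response generated by PHP (assets served directly by Apache would get the same headers from the server config instead), and the second part inspects Cloudflare's CF-Cache-Status response header for a placeholder asset URL, which should read "HIT" once the edge has cached the file.

<?php
// Part 1: send cache-friendly headers for a response generated by PHP,
// so Cloudflare (and browsers) are allowed to cache it for a day.
header('Cache-Control: public, max-age=86400');

// Part 2: verify what Cloudflare is actually doing for a static asset.
// The URL below is a placeholder; CF-Cache-Status should eventually be "HIT".
$headers = get_headers('https://www.example.com/css/style.css');
foreach ($headers as $line) {
    if (stripos($line, 'CF-Cache-Status') === 0 || stripos($line, 'Cache-Control') === 0) {
        echo $line, PHP_EOL;
    }
}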

CloudFlare by default does not cache HTML pages, only static resources. That may be why your pages are not any faster. If you want your pages themselves to be cached and faster, you have to set up Page Rules to do so.

Related

Wordpress Site Slow Loading Time in Receiving/Waiting

Trying to get the loading time of a WordPress website (with three.js) - https://igotchamedia.com/arvr - down from 6 seconds to under 1.5s - the "Waiting" and "Receiving" parts of the page load are taking the bulk of the time. Caching plugins did not help.
Any help much appreciated!
Your times are slow for the initial GET request for the root document of the site; that metric is called Time to First Byte (TTFB). You have a redirect from the non-https site to the https site, and that is part of the slowness issue.
You can get rid of the redirects depending on how you implement SSL on your site and in WordPress: either with a redirect in .htaccess (not the best), or simply by making sure your WordPress site and home address settings are https and all URLs in the database are https, so that no redirects are needed.
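As a sketch of the second option, the WordPress address settings can be pinned in wp-config.php so no redirect is needed; the domain below is a placeholder, and URLs already stored in the database still need to be updated to https separately.

<?php
// wp-config.php (excerpt): force the site and home URLs to https so WordPress
// generates https links and no http->https redirect is required.
// 'example.com' is a placeholder for the real domain.
define('WP_HOME', 'https://example.com');
define('WP_SITEURL', 'https://example.com');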
But overall slow TTFB times are a server lag issue. If you are on a shared host, TTFB can suffer because of all the other users on the server. Your overall speed - 4 seconds - is not bad for a very image-heavy site with a fairly high number of HTTP requests: https://gtmetrix.com/reports/igotchamedia.com/GLQwMRRs
You can talk to the web host about the TTFB issues, but GoDaddy shared hosting is well known to be slow.
If you want to get under a few seconds, don't depend on a caching plugin to do all the work. 1) Get a better server and use a CDN; 2) lower the weight of the images and get the total weight of the site under 1 MB; 3) use a theme that requires fewer scripts and style sheets, which drive up the number of HTTP requests; and 4) keep your external requests, like third-party fonts, to a minimum.
The vast majority of the time, when a WordPress site (or any website) is slow, it is because of too much rich media (video and images) hosted or used on the site.
Try going back in and optimizing your website's media files for the web. Save them at a smaller file size, and use sensible file formats. Also, CSS3 techniques are powerful these days, and in many cases you don't even need to call so many image files, e.g. for backgrounds, gradients, shadows, menus, buttons, etc. If you have lots and lots of media hosted on your site, consider using a CDN (content delivery network).
Another good practice: make sure your website is coded properly. Load external .js files in the footer (see the sketch below), keep the HTML semantic, and so on. Test the WordPress plugins you use. Make sure WordPress is up to date and has the correct memory limit set. Check your developer console for any JavaScript errors or conflicts bogging the site down. Check your WordPress database for corruption.
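For example, a theme script can be moved to the footer with WordPress's enqueue API; the handle, file path, and version below are placeholders.

<?php
// functions.php (excerpt): enqueue a script in the footer instead of the <head>.
// The handle 'my-site-scripts' and the file path are placeholder names.
function my_theme_enqueue_scripts() {
    wp_enqueue_script(
        'my-site-scripts',                             // handle (placeholder)
        get_template_directory_uri() . '/js/site.js',  // script URL (placeholder path)
        array('jquery'),                               // dependencies
        '1.0.0',                                       // version
        true                                           // load in the footer
    );
}
add_action('wp_enqueue_scripts', 'my_theme_enqueue_scripts');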
Lastly, check your host or server environment. Make sure you're using the correct version of PHP, that you have enough disk space, and that everything is running efficiently.
Also, check out these additional site performance optimization tools.
https://www.pingdom.com/
https://developers.google.com/speed/pagespeed/
Hope this helps, g'luck!

Cache remotely accessed websites

I have an application that often retrieves remote websites (via cURL), and I was wondering what my options are regarding caching of those HTTP requests. For example:
application -->curl-->www.example.com
The issue is that cURL could be called hundreds of times in an hour, and each time it would need to make hundreds of HTTP requests that are basically the same. So, what could I do to speed things up? I was experimenting with Traffic Server but wasn't very satisfied with the results. I guess DNS caching is a must, but what else can I do here? The system the app is running on is CentOS.
I don't know why Traffic Server didn't give satisfactory results, but in general a forward proxy setup with caching is the way to do this. You would of course make sure that the response from www.example.com is cacheable, either via configuration on the caching proxy server or directly on the origin (example.com). This is probably the biggest source of confusion in the proxy caching world: expectations of what is cacheable often don't match what the origin's headers actually allow.
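If a caching proxy isn't an option, the application itself can cache responses. Below is a minimal sketch of a file-based cache keyed on the URL; the cache directory and TTL are arbitrary placeholders, and a real setup would also want to honour the origin's cache headers.

<?php
// Fetch a URL via cURL, caching the response body on disk for a fixed TTL.
// $cacheDir and $ttl are placeholder values; adjust to the application's needs.
function cached_get(string $url, string $cacheDir = '/tmp/http-cache', int $ttl = 300): string
{
    if (!is_dir($cacheDir)) {
        mkdir($cacheDir, 0777, true);
    }
    $cacheFile = $cacheDir . '/' . sha1($url);

    // Serve from cache while the entry is still fresh.
    if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
        return file_get_contents($cacheFile);
    }

    // Otherwise fetch from the origin and store the body.
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    $body = curl_exec($ch);
    curl_close($ch);

    if ($body !== false) {
        file_put_contents($cacheFile, $body);
    }
    return (string) $body;
}

// Usage:
// $html = cached_get('https://www.example.com/');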

Caching, CDN - are they the same for PHP Yii site? How to use it for a dynamic php site?

I have a dynamic PHP (Yii framework based) site. Users have to log in to do anything on the site. I am trying to understand how caching and CDNs work, and I am a bit confused.
Caching (memcache):
My site has a good amount of CSS, JS, and images. I've been given to understand that enabling caching ("memcache"?) will GREATLY speed up my site. But this has me confused. How does caching help? I mean, how can you cache something that comes out of the DB separately for each user? For instance, user-1 logs in and sees his control panel; user-2 logs in and sees theirs.
How do I determine what to cache? Plus, how do I enable caching (memcaching)?
CDN:
I have been told to use a content delivery network like CloudFlare. It is supposed to automatically cache my site. So, when user-1 logs in, what will it cache? Will it cache only the homepage CSS, JS, and homepage images, because everything else requires login? What happens when the user logs out? I mean, do "sessions" interfere with the working of a CDN?
Does serving up images via a CDN significantly reduce the load on my server? I don't have much cash for a clustered-server configuration, so I just want my (shared) server to be able to devote all its resources to processing PHP code. How much load can I save by using "caching" (something like memcache) and/or a "CDN" (something like CloudFlare)?
Finally,
What would be the general strategy to implement in this scenario for caching, a CDN, and basic performance optimization? Do I need to make any changes to my PHP code to enable a CDN like CloudFlare and to enable/implement/configure caching? What can I do that would take the least amount of developer/coding time and make my site run much, much faster?
Oh wait, some of my pages, like the "about us" page, are going to be static HTML too. But they won't get as many hits, except maybe the iframe page that will be used for my Facebook Page.
I actually work for CloudFlare & thought I would hop in to address some of the concerns.
"do I need to make any changes to my php-code to enable CDN like
CloudFlare and to enable/implement/configure caching? What can I do
that would take least amount of developer/coding time and will make my
site run much much faster?"
No, nothing like needing to rewrite URLs, etc. We automatically cache static content by file extension. This does require changing your DNS to point to us, however.
"Does serving up images via a CDN significantly reduce the load on my server?"
Yes, and it should also help most visitors access the site faster and save you a fair amount on bandwidth.
"Oh wait, some of my pages like "about us" page etc. are going to be
static html too."
CloudFlare doesn't cache HTML by default. You can use Page Rules to set up more advanced caching options for things like static HTML.
Caching helps because, instead of hitting the disk or the database for each user, the data is stored in memory, i.e. in memcached. This provides a SIGNIFICANT increase in performance.
Memcache is generally used for caching data such as query results.
http://pureform.wordpress.com/2008/05/21/using-memcache-with-mysql-and-php/
There are lots of tutorials.
I have only ever used Amazon S3, which is not quite a CDN. It is more of a storage platform, but it still helps take the load off my own servers when serving media.
I would put all of your static resources on a CDN so your own server would not have to serve them. It would not require any modification to your PHP code. This includes JS and CSS.
For your static pages (your about page) I'd make sure that PHP isn't processing them, since there is no reason for it. Your web server should serve them directly.
Caching will require changes to your code. For caching, the normal flow is:
1) user makes a request
2) check if the data is in the cache
3) if it is not in the cache, do the DB query and put the result in the cache
4) if it is in the cache, retrieve it
5) return the data.
You can cache anything that requires disk I/O, and you should see a speed-up.
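A minimal sketch of that flow using PHP's Memcached extension; the server address, cache key, query, and credentials are all placeholders, and this assumes the memcached extension is installed.

<?php
// Read-through cache: check memcached first, fall back to the database,
// then store the result for the next request. All names below are placeholders.
$cache = new Memcached();
$cache->addServer('127.0.0.1', 11211);

$key  = 'recent_posts';            // placeholder cache key
$data = $cache->get($key);

if ($data === false) {
    // Cache miss: run the (placeholder) query and cache the result for 5 minutes.
    $db     = new mysqli('localhost', 'user', 'pass', 'mydb');   // placeholder credentials
    $result = $db->query('SELECT id, title FROM posts ORDER BY id DESC LIMIT 10');
    $data   = $result->fetch_all(MYSQLI_ASSOC);
    $cache->set($key, $data, 300);
}

// $data now holds the cached or freshly queried rows.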
Memcached works by storing data (for example, the results of database queries, whether from a remote server or a database engine on the same machine) in memory as key-value pairs. Reading that data back from RAM is much, much faster than running the same remote query each time. This is typically useful when you have data that can safely be reused for a certain period of time because it is not subject to constant change.
For example, say you want to cache a user's account information to speed up loading pages where that user is logged in: you load the information once and cache it. On any subsequent request for that data, it will load in a fraction of the time it would normally take to fetch from the database. Obviously you will need to update or re-cache that information if the user changes it while logged in, but you will greatly reduce the time it takes to serve pages if you implement a caching system that minimizes the time spent waiting on the database.
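A sketch of that per-user pattern, including invalidating the cached copy after an update; the key format, TTL, and the two database helper functions are placeholder assumptions.

<?php
// Per-user cache with explicit invalidation when the profile changes.
// The key format, TTL, and DB helpers below are placeholders.
$cache = new Memcached();
$cache->addServer('127.0.0.1', 11211);

// Placeholder persistence helpers; a real app would query its own tables here.
function load_profile_from_db(int $userId): array { return ['id' => $userId, 'name' => 'demo']; }
function save_profile_to_db(int $userId, array $data): void { /* UPDATE ... */ }

function get_profile(Memcached $cache, int $userId): array
{
    $key     = "user_profile_{$userId}";
    $profile = $cache->get($key);
    if ($profile === false) {
        $profile = load_profile_from_db($userId);
        $cache->set($key, $profile, 600);          // cache for 10 minutes
    }
    return $profile;
}

function update_profile(Memcached $cache, int $userId, array $newData): void
{
    save_profile_to_db($userId, $newData);
    // Drop the cached copy so the next read re-caches fresh data.
    $cache->delete("user_profile_{$userId}");
}

// Usage:
// $profile = get_profile($cache, 42);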
I'm personally not familiar with CloudFlare so I can't offer any advice to that effect, but in terms of implementing caching in your application, you should check out:
http://code.google.com/p/memcached/wiki/NewOverview
And read the rest of the Wiki entries there which cover installation/implementation/etc. That should get you started on the right track.

How to speed up website loading for opposite side of planet

I am currently running a small piece of PHP software which is being accessed by two groups of people: one group in China, the other in Ireland. The server is hosted in Ireland, and for the Irish users pages load in between 0.3 and 0.8 seconds.
The users in China are recording times of between 3.0 and 10 seconds.
I have run speed tests, and the Chinese users have an internet connection of about 1.5 Mbit to the test server in Ireland and a ping of 750 ms.
So far, to improve load time, I have redone a lot of the MySQL database interaction to maximize efficiency, but this did not help. I presume it only slightly reduces server processing time and has little effect on page download time.
I was thinking: could I get hosting in China and use it as a gateway to the system in Ireland? Surely the latency and download speed between that Chinese server and Ireland would be better than the average person's connection, and the average person's requests would be re-routed through it.
Does this sound at all feasible? Or does anyone else have suggestions in relation to this?
Your biggest problem is the latency. If you can minimize the HTTP requests, you should see large gains.
But it's probably easier to just buy the services of a content delivery network and move all your static files to it, making sure it has well-connected servers in China and Ireland. I think this is easier than redesigning your website.
The cheapest bang for the buck would come from sending suitable HTTP headers to indicate to web browsers that they don't need to constantly check back with your server for freshness validation (i.e. avoid conditional HTTP requests). If you can do this for all external page objects (images, CSS, JS, etc.), then the first page load will still take 10 seconds or whatever, but subsequent page loads should be very close to those of your Irish visitors. This is just a matter of web server configuration; there are plenty of tutorials on it. To be honest, sending cache-friendly headers is always a good idea; this is how you get those snappy, responsive websites we all love.
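As a sketch, for responses that go through PHP the headers might look like the following; static files served directly by Apache would get equivalent headers from the server configuration (e.g. mod_expires), and the one-year lifetime here is an arbitrary placeholder that assumes file names are versioned when assets change.

<?php
// Long-lived caching headers so browsers reuse the object without re-checking
// the server. The one-year lifetime is a placeholder; pair it with versioned
// file names (e.g. style.v2.css) so updates still reach users.
$maxAge = 31536000; // one year, in seconds
header('Cache-Control: public, max-age=' . $maxAge . ', immutable');
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + $maxAge) . ' GMT');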
First of all, follow best practices for speeding up your website. Then also check YSlow to measure your site's speed and see what you can do to improve performance.
I think that changing servers won't give you a better response time.
I think you should focus first on optimization and then look for a good, powerful hosting company and hosting plan.
Cheers
Hosting/mirroring in China is definitely the way to go. It'll be faster for the people there and more reliable. Ideally you would also have it hosted in Ireland too, and have some kind of load balancing in place.
I would definitely mirror your site to a server somewhere in Asia. You have already made performance improvements that didn't result in much better times from China, so I assume network latency is causing most of the load time.
You can deploy your website to one of the cloud services. Amazon EC2 instances are available in Singapore and Tokyo, for example. Or you can deploy to a "regular" web host in China.
If you have to deliver mostly static files, check for a Content Distribution Network (CDN) service.
To speed up the load time, you may consider hosting your static files (images, CSS, JavaScript, documents) on a CDN (Content Delivery Network). CDNs work by hosting your files on multiple nodes around the world. When a request comes in, the server closest to the visitor serves the file, thus reducing latency.
You may want to try CloudFlare (http://cloudflare.com), it's a free product that serves as a reverse proxy and CDN of your website. Instead of the user accessing the site directly, CloudFlare will be the one to access your site and optimize it for delivery to the visitor.
Move to the cloud.
http://en.wikipedia.org/wiki/Cloud_computing
I'm in the process of doing the same thing. I will create a site mirror on the aliyun.com cloud. I'm hosting my site (www.discoverwish.com) at SiteGround in Singapore, but load times for my Chinese users without a VPN are incredibly slow.
I will use rsync to keep the information synced up, then redirect Chinese users using DNSPod.
Regarding the suggestion above to try CloudFlare as a free reverse proxy and CDN: we tried this for a client in China but, unfortunately, CloudFlare doesn't yet have a local version of their CDN working in China. We are currently testing Amazon's CDN, which seems to have positive reviews.

If you have an SSL Cert, why not use https for the whole site?

I was asked this question not too long ago, and didn't have a good answer...
Is there a good reason why a site that has an SSL certificate wouldn't use https:// for its entire site, rather than http://?
Are there SEO issues? Performance overhead for the server?
Just in case it matters, we use LAMP stacks.
Thanks!
A few reasons:
Serving content over SSL takes some extra work, so performance on a busy site could be an issue
Most (all?) browsers stop sending referrer info with requests that go from an HTTPS page to a plain HTTP one, so tracking users coming through your site could be more challenging
You might have to be more deliberate in how you serve pages to get browsers to cache them properly
If the page is SSL, all content loaded on the page should be SSL too, to avoid mixed-content warnings in the browser; serving dependencies like scripts, images, etc. over SSL is not always convenient
Note, however, that a lot of sites do do this. For example, several of the banks I use are always https, even for the parts that don't require it.
For each request your data will be encrypted and decrypted; this adds unnecessary load on the server and can also increase the response time of your site.
Using SSL/TLS no longer adds very much overhead: http://www.imperialviolet.org/2010/06/25/overclocking-ssl.html
(As @erickson said in a comment on this page, the most computationally expensive part is the handshake. Good comment in general.)
I think you may see a loss in performance in some cases because browsers tend not to keep content obtained via HTTPS in the disk cache after you close them (on the assumption that it's sensitive content that shouldn't be kept on disk), so you wouldn't benefit from the browser's cache and would have to reload the content.
