Very high TTFB on shared hosting - PHP

I have a PHP, MySQL, Apache site that performs really fast, in the order of 20 ms to 40 ms for page loads, when tested locally.
However, when I hosted it on a shared hosting server and matched all configurations and settings as closely to the local host as possible, I am seeing a relatively slow website.
The network latency is only about 3 ms, as the shared hosting server is located in the same city, and a normal ping to the IP address confirms it is 3 ms.
The site takes anywhere between 150 ms and 4 seconds to load.
90% of the time it's around 400 ms, but occasionally it takes 2 and up to 4 seconds to load the page.
Upon checking the timeline, I am seeing a very high TTFB: a minimum of 150 ms and sometimes even 1.6 seconds. I also noticed that whenever this happens, the TTFB of all resources, such as font.woff files etc., is very high too.
What could be the possible reasons here? Does it sound like poor performance on the shared hosting server, and should I go for a cloud-based server?

The reason for this is pretty simple: localhost has low latencies because it is simply "local".
You will not get the same kind of performance on shared hosting servers because you are not providing the same latency to every user. E.g. if your client accesses the site from Asia and your hosting is in the US, you are bound to get high latencies. Conversely, if you serve the Asian client from hosting in Asia and the US client from hosting in the US, you'll get much better response times.
Another option is to serve static assets via a CDN. A CDN geo-replicates your assets and serves each client from the closest replica.
Another reason could be your data access patterns. If your web server is accessing a database service that is not in the vicinity of your application (by vicinity I mean the same local network) and has to go over the public Internet, that will definitely cause very high latencies and is a bad architectural pattern in general.
Cloud-based servers provide two main benefits in this regard:
You get a very low-latency local network where you can spawn machines with connections of around 10 Gbps (internal latencies improve drastically!).
You get geo-replication (beware! Not out of the box: you need to design an architecture that can scale, and then use the CDN and geo-replication services provided by your cloud provider).
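To see where the time actually goes, it can help to split the request into DNS, connect, and time-to-first-byte phases. A minimal sketch using PHP's curl extension (the example.com URL is a placeholder; point it at the hosted site from a nearby machine and compare against localhost):

```php
<?php
// Minimal sketch: break a request into DNS, connect, TTFB, and total time
// using PHP's curl extension. The URL below is a placeholder.
function measure_ttfb(string $url): array
{
    $ch = curl_init($url);
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,  // don't echo the response body
        CURLOPT_FOLLOWLOCATION => true,
        CURLOPT_CONNECTTIMEOUT => 5,
        CURLOPT_TIMEOUT        => 10,
    ]);
    curl_exec($ch);
    $info = [
        'dns'     => curl_getinfo($ch, CURLINFO_NAMELOOKUP_TIME),
        'connect' => curl_getinfo($ch, CURLINFO_CONNECT_TIME),
        'ttfb'    => curl_getinfo($ch, CURLINFO_STARTTRANSFER_TIME),
        'total'   => curl_getinfo($ch, CURLINFO_TOTAL_TIME),
    ];
    curl_close($ch);
    return $info;
}

print_r(measure_ttfb('http://example.com/'));
```

If `ttfb` is large while `dns` and `connect` are tiny, the time is being spent on the server (PHP/MySQL or an overloaded shared host), not on the network.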

Related

Adding websockets to existing application

So I wrote this nice SaaS solution and got some real clients. Now a client has requested some functionality that requires websockets.
In order to keep things tidy, I would like to use another server to manage the websockets.
My current stack is an AWS Application Load Balancer with two servers behind it; one is the current application server, an Apache web server with PHP, running the application.
The client side is driven by AngularJS.
The second server (which does not exist yet) is going to run Nginx, and sessions will be stored on a third, Memcached, server.
The websocket and application servers are both on the same domain, using different ports in order to send requests to the right server (AWS ELB allows routing requests to different server groups by port). Both the application and the websockets will be driven by PHP and Ratchet.
My questions are two:
The first is for the more experienced developers: does such an architecture sound reasonable? (I'm not aiming for 100Ks of concurrent connections yet; I need a viable and affordable solution aiming at a max of 5,000 concurrent connections at this stage.)
What would be the best way to send requests from the application server (which has the logic to generate the requests) to the websocket server?
Please note I'm new to websockets, so maybe there are much better ways to do this; I'll be grateful for any idea.
I'm in the middle of using Ratchet with a SPA to power a web app. I'm using Traefik as a front-end proxy, but yes, Nginx is popular here, and I'm sure that would be fine. I like Traefik for its ability to seamlessly redirect traffic based on config file changes, API triggers, and configuration changes from the likes of Kubernetes.
I agree with Michael in the comments - using ports for web sockets other than 80 or 443 may cause your users to experience connection problems. Home connections are generally fine on non-standard ports, but public wifi, business firewalls and mobile data can all present problems, and it's probably best not to risk it.
Having done a bit of reading around, your 5,000 concurrent connections is probably something that is going to need OS-level tweaks. I seem to recall 1,024 connections can be done comfortably, but several times that level would need testing (e.g. see here, though note the comment stream goes back a couple of years). Perhaps you could set up a test harness that fires web socket requests at your app, e.g. using two Docker containers? That will give you a way to understand what limits you will run into at this sort of scale.
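For reference, those OS-level tweaks usually boil down to raising file-descriptor and backlog limits, since each open websocket holds one descriptor. A sketch for Linux (the numbers are starting points to load-test against, not recommendations, and the `www-data` user is an assumption):

```
# /etc/sysctl.d/99-websockets.conf -- kernel-level connection limits
fs.file-max = 100000                   # system-wide open file descriptors
net.core.somaxconn = 4096              # listen() backlog
net.ipv4.ip_local_port_range = 1024 65535

# /etc/security/limits.conf -- per-process descriptor limit for the
# user running the websocket daemon (one socket = one descriptor)
www-data  soft  nofile  65535
www-data  hard  nofile  65535
```

Apply the sysctl changes with `sysctl --system` and verify the per-process limit with `ulimit -n` in the daemon's environment.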
Your maximum number of concurrent users suggests that you're working at an absolutely enormous scale, given that any given userbase will usually not all be live at the same time. One thing you can do (and I plan to do the same in my case) is to add a reconnection strategy in your frontend that does a small number of retries, and then pops up a manual reconnect box (Trello does something like this). Given that some connections will be flaky, it is probably a good idea to give some of them a chance to die off, so you're left with fewer connections to manage. You can also add an idle timer in the frontend, to avoid pointlessly keeping unused connections open.
If you want to do some functional testing, consider PhantomJS - I use PHPUnit Spiderling here, and web sockets seems to work fine (I have only tried one at a time so far, mind you).
There are several PHP libraries to send web socket requests. I use Websocket Client for PHP, and it has worked flawlessly for me so far.
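To make that pattern concrete, here is a minimal sketch of pushing a message from the application server to the websocket server. It assumes the textalk/websocket composer package; the internal address, channel name, and payload fields are all made up for illustration:

```php
<?php
// Sketch only: push a notification from the PHP application server to
// the websocket server over a short-lived internal connection.
// Assumes the textalk/websocket composer package is installed:
// require 'vendor/autoload.php';

// Build the JSON payload separately so the format is easy to test.
function build_push_message(string $channel, array $data): string
{
    return json_encode(['channel' => $channel, 'data' => $data]);
}

// Open a client connection, send, and close. The websocket server
// would then fan the message out to its subscribed browsers.
function push_to_websocket_server(string $url, string $message): void
{
    $client = new WebSocket\Client($url); // textalk/websocket client
    $client->send($message);
    $client->close();
}

// Usage (requires the package and a running websocket server;
// ws://10.0.0.2:8080/internal is a hypothetical internal address):
// push_to_websocket_server(
//     'ws://10.0.0.2:8080/internal',
//     build_push_message('orders', ['id' => 42, 'status' => 'paid'])
// );
```

Keeping the push endpoint on an internal address (or protecting it with a shared secret) matters here, since anything that can reach it can broadcast to your users.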

Filter Traffic Before Hitting Server Environment

We have a pretty robust setup for our websites. Firewall with load balancer with 2 web servers behind it and 2 high availability MySQL servers. Our business is to send as much traffic as possible to our advertiser-supported websites.
Lately, we have been receiving a ton of foreign traffic (mainly China). Several million visitors per hour. It is almost behaving like an attack.
I can analyze the IP using PHP and redirect the visitor based on their country, but even that is causing load to be high and our fCGI slots to fill up completely. Also, our host tells us that applying country filters on the firewall will slow things down and cause issues.
Does anyone know of a service or method to filter certain traffic before it hits our environment? Maybe a way to do it at the DNS level?
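Short of a DNS-level or CDN service in front of you (which is the only way to stop the traffic before it reaches your environment at all), the cheapest improvement over doing it in PHP is to reject requests at the web-server layer, before a FastCGI slot is consumed. A sketch assuming Apache with the legacy mod_geoip module and a GeoIP country database (the path and country code are illustrative):

```
# Apache vhost / .htaccess sketch: reject requests from a given country
# before they ever reach PHP. Requires mod_geoip and a GeoIP database.
GeoIPEnable On
GeoIPDBFile /usr/share/GeoIP/GeoIP.dat

# Flag requests whose source country is CN, then deny them.
SetEnvIf GEOIP_COUNTRY_CODE CN BlockCountry
Deny from env=BlockCountry
```

This still consumes connections on your servers, so for multi-million-per-hour volumes a reverse proxy/CDN in front (CloudFlare and similar services do country filtering at their edge) is the more robust answer.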

How to programmatically change name servers using PHP

I have a website that receives occasional but predictable traffic spikes, and my website crashes or slows down. Is it possible for me to change the name servers programmatically, using PHP or any other language, so that the server changes after traffic reaches a certain peak? I am using GoDaddy Windows hosting, and I do not want to use cloud computing.
You aren't going to have much luck, because load balancing happens before PHP is ever reached, at the web-server (Apache) layer or earlier. You will probably have to upgrade.
You would have to use something like a load balancer or a proxy that distributes traffic before it hits your server. It's very tricky to do, and almost impossible through a GoDaddy-like host, because you can't customise anything the way you can on a VPS.
You can't just change name servers for load balancing (CDNs are based on that principle, but they implement it in the DNS itself, since changes need time to propagate). You could pay for more resources and balance them using a proxy, but a better solution is to move your app to Amazon AWS, Google App Engine, or a similar cloud-based service: they handle load balancing much more efficiently, and that's what you need for occasional spikes (otherwise you end up paying for horsepower you only use occasionally).

Does Apache have built-in DDoS protection? I'm getting interesting JMeter results

I was recently asked by a client to load test their server to see if it could handle 10,000 concurrent users. To do this I've been using JMeter but getting less than favorable results.
Let me just say that this is my first time using JMeter, so I'm not super sure of what I'm doing, BUT here's what I've found.
On a test of 1000 concurrent users all launched at once and each user going to 2 pages, the failure rate is 96%. This seems bad...like really really bad.
Is there something that could possibly be going wrong in JMeter? All I'm doing is sending HTTP GET requests to their server.
I don't know what kind of plan the client is on, but I do know that they are using GoDaddy as their provider, and in my experience GoDaddy's "unlimited" bandwidth is rather limited. Is this the problem, OR (and I'm really hoping that this is the case) is the Apache server for the website blocking the repeated attempts?
I get an error saying org.apache.http.conn.HttpHostConnectException: Connection to ~~~.com refused.
Is this the server being smart?
Or the server being bogged down?
Thanks in advance for your help, let me know if you need any more information.
Apache can't protect you from DDoS attacks on its own, but you can use some modules to reduce the risk, namely mod_qos and mod_evasive.
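For illustration, a typical mod_evasive configuration looks like this (the thresholds are common starting points, to be tuned against real traffic, not recommendations for any specific site):

```
# mod_evasive sketch: throttle clients that hammer the same pages.
<IfModule mod_evasive20.c>
    DOSHashTableSize    3097
    DOSPageCount        5      # max requests for the same page per interval
    DOSPageInterval     1      # page-count interval, in seconds
    DOSSiteCount        100    # max total requests per client per interval
    DOSSiteInterval     1      # site-count interval, in seconds
    DOSBlockingPeriod   10     # seconds an offending IP is blocked (403)
</IfModule>
```

Note that thresholds like these will also block an aggressive load test, so disable or whitelist your JMeter machine when benchmarking.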
If you are using shared hosting from GoDaddy, it seems many websites are loaded on one server, and GoDaddy may block your site or treat your load testing as a DDoS attack. For experiments you need an isolated VDS or cloud server.
If you want to protect your project, you can:
Use a load balancer
Use a caching tool
Use firewall protection
Tune the OS
Use a CDN

How to speed up website loading for the opposite side of the planet

I am currently running a small piece of PHP software which is being accessed by two groups of people, one group in China, the other in Ireland. The server is hosted in Ireland, and pages there load in between 0.3 and 0.8 seconds.
The Chinese people are recording times of between 3.0 seconds and 10 seconds.
I have run speed tests, and the Chinese users have an internet connection of 1.5 Mbit (testing against a server in Ireland) and a ping of 750 ms.
So far, to improve load time, I have redone a lot of the MySQL database interaction to maximize efficiency, but this did not help. I presume this only slightly reduces server processing time and has little effect on page download time.
I was thinking: could I get hosting in China and use it as a gateway to the system in Ireland? Surely the latency and download speed between this Chinese server and Ireland would be better than the average person's, and the average person's requests would be re-routed through it.
Does this sound at all feasible? Or does anyone else have suggestions in relation to this?
Your biggest problem is the latency. If you can minimize the HTTP requests, you should see large gains.
But it's probably easier to just buy the services of a content delivery network and move all your static files to them, making sure they have well-connected servers in China and Ireland. I think this is easier than redesigning your website.
The cheapest bang for the buck would come from sending suitable HTTP headers to tell web browsers that they don't need to constantly check with your server for freshness validation (i.e. avoid conditional HTTP requests). If you can do this for all external page objects (images, CSS, JS, etc.), then the first page load will still take 10 seconds or whatever, but subsequent page loads should be very close to your Irish visitors'. This is just a matter of web server configuration; here's a tutorial if needed. To be honest, sending cache-friendly headers is always a good idea; this is how you get those snappy, responsive websites we all love.
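As a sketch, such cache-friendly headers in Apache might look like this (mod_expires must be enabled; the lifetimes are illustrative, and assets should be renamed or versioned when they change):

```
# .htaccess sketch using mod_expires: let browsers cache static assets
# without revalidating on every page load.
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType image/png              "access plus 1 month"
    ExpiresByType image/jpeg             "access plus 1 month"
    ExpiresByType text/css               "access plus 1 week"
    ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```

With these in place, repeat page views from China only pay the 750 ms round trip for the HTML itself, not for every image, stylesheet, and script.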
First of all, follow best practices for speeding up your website. Then check YSlow to measure your site's speed and see what you can do to improve performance.
I think that changing servers won't give you better response times.
I think you should worry first about optimization, and then look for a good and powerful hosting company and hosting plan.
Cheers
Hosting/mirroring in China is definitely the way to go. It'll be faster for the people there and more reliable. Ideally you would also have it hosted in Ireland too, and have some kind of load balancing in place.
I would definitely mirror your site to a server somewhere in Asia. You already made some performance improvements that didn't result in much better load times from China, so I assume network latency is causing most of the load time.
You can deploy your website to one of the cloud services; Amazon EC2 instances are available in Singapore and Tokyo, for example. Or you can deploy to a "regular" web host in China.
If you mostly deliver static files, look into a Content Distribution Network (CDN) service.
To speed up the load time, you may consider hosting your static files (images, css, javascript, documents) to a CDN (Content Delivery Network). CDNs work by hosting your files in multiple nodes around the world. Once it receives a request, the closest server to the visitor will serve the file - thus reducing latency.
You may want to try CloudFlare (http://cloudflare.com), it's a free product that serves as a reverse proxy and CDN of your website. Instead of the user accessing the site directly, CloudFlare will be the one to access your site and optimize it for delivery to the visitor.
Move to the cloud.
http://en.wikipedia.org/wiki/Cloud_computing
I'm in the process of doing the same thing. I will create a site mirror on the aliyun.com cloud. I'm hosting my site (www.discoverwish.com) at SiteGround in Singapore, but load times for my Chinese users without a VPN are incredibly slow.
I will use rsync to keep the information synced, then redirect Chinese users using DNSPod.
Regarding the CloudFlare suggestion above: we tried this for a client in China but, unfortunately, CloudFlare don't yet have a local version of their CDN working in China. We are currently testing Amazon's CDN, which seems to have positive reviews.
