We are switching web hosts, and I have been asked to find out how many concurrent users the Magento-based websites have in order to estimate the appropriate hardware.
How can I find this information out?
The web server is LiteSpeed (Apache-compatible) and the sites are PHP-based.
Other information that may (or may not) be helpful is that the sites are currently hosted on a shared hosting solution, so I don't think I can install any monitoring software.
I have noticed that Magento has a built-in report that may be similar to this: Admin -> Customers -> Online Customers. But I have a feeling this report isn't really what the new web host is looking for.
Should this question be posted in another Stack Exchange site?
Sign up for Google Analytics at http://www.google.com/analytics/. They will provide you with a tracking code to paste into the HTML of your site; insert it into your header template. It may take half a day or longer for the stats to accumulate.
Then take a look at your peak hourly stats. Click Visitors, switch the graph to hourly, and look at the hour with the maximum visits. That will help you approximate the amount of traffic your new host will need to accommodate. Also look at pageviews, since that's an important metric too. Visits and pageviews are not the same as concurrent users, but they should point you in the right direction.
Are you moving to a dedicated server, or at least a dedicated-resource VPS? If you're currently on a shared host, it's really hard to get a true sense of what type of hardware you are going to need (I'm assuming the reason you are switching is performance problems). I'd suggest starting with a basic dedicated server and then going up or down from there based on your results.
You could try the m1.small instance at Amazon EC2, which will cost you about $70/month, and you'll be able to host multiple sites on it. Of course, you'll have to manage the server yourself.
Run some analysis on your server logs. Or, if no logs are available to you, set up Google Analytics, let it run for a while, and get a good indication of traffic levels.
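A minimal sketch of that kind of log analysis, assuming you can download an access log in the common/combined format (the path below is a placeholder). It counts requests per minute and reports the busiest minute, which is a rough proxy for peak concurrency:

```php
<?php
// Count requests per minute in an Apache/LiteSpeed access log and
// report the busiest minute. The log path is a placeholder.
$logFile = '/path/to/access.log';
$perMinute = [];

foreach (file($logFile) as $line) {
    // Common log format timestamps look like [14/Nov/2015:14:16:41 +0000];
    // capture everything up to the minute.
    if (preg_match('#\[(\d+/\w+/\d+:\d+:\d+)#', $line, $m)) {
        $minute = $m[1];
        $perMinute[$minute] = isset($perMinute[$minute]) ? $perMinute[$minute] + 1 : 1;
    }
}

arsort($perMinute);
printf("Peak: %d requests in the minute starting %s\n",
       current($perMinute), key($perMinute));
```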
I am currently leaning towards AWS EC2. I'm looking at the console right now, where I can host my APIs and database.
My question is: which instance type should I use?
Which database should be used: RDS, or a database on the same instance?
The game will have more than 100k users.
The APIs are for authentication and for saving and retrieving user files.
Sincere suggestions required.
Thanks.
Nobody will be able to give you an answer to this question. And if they do, they will be wrong.
The only way to know what resources a system would consume is to build it, then test it using simulated load. This will help identify the bottlenecks in the system (disk? database? memory? network throughput?).
You will then begin the iterative process of finding the worst bottleneck, rearchitecting the app or changing system components, celebrating, then repeating the whole test-measure-fix cycle. Welcome to the exciting world of application performance management!
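As a starting point for that simulated load, a tool like Apache Bench can drive concurrent requests at an endpoint; the URL and numbers below are placeholders, and a fuller tool (JMeter, Gatling, etc.) would be needed to script realistic user journeys:

```
# 10,000 requests, 200 concurrent, against a hypothetical auth endpoint
ab -n 10000 -c 200 https://api.example.com/auth
```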
You can also take advantage of the cloud by scaling your resources based upon load. When the system is busy (e.g. in the evening), add more capacity. When things are quiet, remove resources to save money. This is exactly what Fortnite does.
See: How would you keep 125 million gamers playing smoothly online? Epic Games shares its Fortnite story. | Amazon Game Tech Blog
Depending upon what data you wish to store, you could consider using Amazon GameLift:
Amazon GameLift is a fully managed service for deploying, operating, and scaling your session-based multiplayer game servers in the cloud. Amazon GameLift replaces the work required to host your own game servers, including buying and setting up hardware, and managing ongoing activity, security, storage, and performance tracking. The Amazon GameLift auto-scaling capabilities provide additional protection from having to pay for more resources than you need, while helping to ensure that your players can find and join games with minimal waiting.
In reading your question, I suspect that your application has not been written yet. So, it's a good time to make general architectural decisions (e.g. using DynamoDB instead of a relational database, since it provides fast, predictable performance with seamless scalability), but don't worry too much about minor details. First, you need to make a product that people want to use. That will be a harder challenge than scaling to meet eventual demand.
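If you do go the DynamoDB route, a minimal write looks roughly like this with the AWS SDK for PHP; the table name and attributes are invented for illustration:

```php
<?php
require 'vendor/autoload.php';

use Aws\DynamoDb\DynamoDbClient;

$client = new DynamoDbClient([
    'region'  => 'us-east-1',
    'version' => 'latest',
]);

// Save a hypothetical player record. DynamoDB handles the scaling,
// so 100k users needs no up-front capacity planning on your side.
$client->putItem([
    'TableName' => 'Players',
    'Item' => [
        'PlayerId' => ['S' => 'player-123'],
        'SaveFile' => ['S' => 'saves/player-123.dat'],
    ],
]);
```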
Concentrate on getting your first 100 users and worry about the other 100,000 later.
This question is very subjective. The reason is that you need to identify the things below before coming to a conclusion:
How much CPU per user per month is required?
How much memory is required per user per month?
How much bandwidth is required per user per month?
Once you have an idea of these values, look at how many concurrent users you want one server to handle before scaling launches another server.
So, for example, if your website has 50,000 users consistently accessing your application, choose an EC2 instance type accordingly, fronted by an Auto Scaling group and a load balancer to serve your traffic.
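A back-of-envelope version of that sizing, with every number invented purely for illustration: if roughly 5% of 50,000 registered users are online at peak, that's 2,500 concurrent users; if load testing shows one instance handles about 500 of them, you'd run 2500 / 500 = 5 instances behind the load balancer, plus some headroom.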
For the database, you should go with RDS (for a relational DB); running the DB on an EC2 instance yourself is not recommended.
For EC2 instance types, please check instance-types.
I have been facing a weird situation for a while now and need guidance regarding it.
Problem:
For the last two days the website has been very slow compared to when we launched the server. We thought it was a temporary issue, but now it has gone dead slow and a page takes at least 3 minutes to load. I also found that CPU utilization has somehow reached 100%, and I believe crawling might be doing this.
We use a third party for SEO, Google dynamic remarketing, and advertising of our Magento website. I firmly believe these services need to crawl the site to index it in the search engines.
I have seen that Google and Bing crawl our website regularly (call them Googlebot and Bingbot), and suddenly there has been a very large spike.
Have a look at the screenshot:
https://www.dropbox.com/s/2c4u04rhtbi99j0/Screenshot%202015-11-14%2014.16.41.png?dl=0
The largest spike was caused by Bing and Google at the same time, and the smaller ones appear to be Googlebot only.
So I have a few quick questions regarding this:
If a bot's IP is not whitelisted (i.e. it gets blocked), will we have a problem with SEO, Google advertising, and dynamic remarketing, since that IP will no longer be allowed to crawl our website?
Is this spam, or is it the bots crawling our store, that is slowing our store's response time, which can hurt search engine ranking and conversions?
Would a larger AWS instance type help solve our CPU-usage problem?
Note: we are already using an m3.large instance.
Is this spam, or is it the bots crawling our store, that is slowing our store's response time, which can hurt search engine ranking and conversions?
Bots and crawlers can cause a sustained traffic and resource spike on a single Magento server, regardless of what's in place to speed up Magento's performance: Magento's default caching, Nginx or Apache settings, installed extensions, etc.
Would a larger AWS instance type help solve our CPU-usage problem? Note: we are already using an m3.large instance.
Absolutely. A burstable t2.large instance can be more cost-effective and could better handle traffic spikes like those caused by bots, as long as you have a semi-predictable traffic pattern. With higher traffic during the day and lower traffic overnight, the instance gains credits that it can use to burst above its baseline CPU capacity. See this for a thorough explanation:
https://aws.amazon.com/blogs/aws/low-cost-burstable-ec2-instances/
The biggest help I saw was having a properly configured robots.txt for Magento. It makes sure crawlers are directed to the right places, so your server only has to serve the pages it needs to. A trimmed-down sketch follows the link; this post is a great place to start:
https://magento.stackexchange.com/questions/14891/how-do-i-configure-robots-txt-in-magento
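A minimal sketch of such a robots.txt for a Magento store, distilled from the patterns discussed in that post; verify the paths against your own store before using it:

```
User-agent: *
# Keep crawlers out of non-content areas that burn CPU
Disallow: /admin/
Disallow: /checkout/
Disallow: /customer/
Disallow: /catalogsearch/
# Ask well-behaved bots to slow down (Bing honors this; Google ignores it)
Crawl-delay: 10
```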
In Google's and Bing's webmaster tools, once you verify your domain, you can lower the crawl rate if necessary.
You can also implement referral spam blocking with Nginx; see:
https://github.com/Stevie-Ray/referrer-spam-blocker
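That repository works by generating an Nginx map of known spam referrers; the core pattern looks roughly like this (the domain below is a stand-in):

```
# In the http block: flag requests whose Referer matches a spam domain.
map $http_referer $bad_referer {
    default                     0;
    "~*spammy-domain\.example"  1;
}

server {
    # ... existing server configuration ...
    if ($bad_referer) {
        return 403;
    }
}
```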
I have developed a social networking website on a WAMP (Windows, Apache, MySQL, PHP) server. I put it up on a free host (which serves it on LAMP), and it works fine.
Now, I have researched a little and found that PHP applications can be difficult to scale and take a lot of parallel work. I would like to test how many users the web host can support for my website, and how many my localhost can.
It's a social network like any other, involving:
Posting data on the main page (with images).
Chat between users (polling every 3 seconds; think of the one in Facebook).
A question-and-answer forum (just like this one, or Yahoo Answers, including upvotes, downvotes, points, etc.).
Two HTML5 server-sent event loops running indefinitely.
Many AJAX requests retrieving data from the MySQL database.
As of now, I haven't applied any caching, which I plan to do later. Also, the chat application has to be switched from polling to WebSockets (HTML5).
My estimated user base will be a lot more than 100,000 users. That may demand some serious scalability.
I need to know what kind of server I may need for this. Should it be a dedicated server, or two of them, or even more?
I tried ab.exe, located in Apache's bin folder, but it only tests the URL we provide. A social network needs login information to access all the data, which unfortunately limits ab.exe to checking the availability of the "Welcome" page, and nothing of the AJAX and HTML5 features I mentioned above.
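One partial workaround, assuming the site uses PHP's default session cookie: log in with a browser, copy the session ID, and pass it to ab with its -C cookie flag so authenticated pages can be exercised. The cookie value and URL below are placeholders:

```
ab -n 500 -c 25 -C "PHPSESSID=paste-your-session-id" http://localhost/feed.php
```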
So, how exactly should I test the scalability of the website on hardware like my laptop (Windows, Intel i5, 4 GB RAM, 2.0 GHz), and what about scalability on the shared servers available out there, or even the dedicated ones?
Simply put: you are counting your chickens before they hatch. If you become overwhelmed by a bunch of new users, that is what we tend to call a "good problem". If you're worried about scalability along the way, you should look into:
Caching with Memcached or Redis (see the sketch below).
Load balancing.
Switching from Apache to Nginx.
Providing a proper CDN.
Since you're using PHP, you should install an opcache.
There are a lot of different ways to squeeze out results. Until you need them, I'd suggest sticking to best practices (normalization, etc.).
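As a concrete example of the caching item above, here's a minimal read-through cache sketch using the phpredis extension; the key name, TTL, and loadPostsFromMysql() helper are hypothetical:

```php
<?php
// Read-through cache: serve the main page's posts from Redis,
// falling back to MySQL on a miss.
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);

$key = 'main_page_posts';
$cached = $redis->get($key);

if ($cached === false) {
    // Cache miss: query MySQL once, then cache the result for 60 seconds.
    $posts = loadPostsFromMysql();  // hypothetical helper
    $redis->setex($key, 60, json_encode($posts));
} else {
    $posts = json_decode($cached, true);
}
```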
If you are worried about the capacity of your hosting company to cope with your application, then the first thing to do is contact them and discuss what capacity they have available and how scalable their environment is.
Unless you are hosting it yourself, you have virtually no control over the situation.
But if you believe the number of users is going to grow very quickly, it would be wise to enter into a dialogue with your provider very early.
I am currently running a small piece of PHP software that is accessed by two groups of people: one group in China, the other in Ireland. The server is hosted in Ireland, and pages load in between 0.3 and 0.8 seconds.
The users in China are recording times of between 3.0 and 10 seconds.
I have run speed tests, and the users in China have an internet connection of 1.5 Mbit (to the test server in Ireland) and a ping of 750 ms.
So far, to improve load time, I have redone a lot of the MySQL database interaction to maximize efficiency, but this did not help. I presume it only slightly reduces server processing time and has little effect on page download time.
I was thinking: could I get hosting in China and use it as a gateway to the system in Ireland? Surely the latency and download speed between this Chinese server and Ireland would be better than the average person's, and the average person's requests would be re-routed through it.
Does this sound at all feasible? Or does anyone else have suggestions in relation to this?
Your biggest problem is the latency. If you can minimize the number of HTTP requests, you should see large gains.
But it's probably easier to just buy the services of a content delivery network and move all your static files to them, making sure they have well-connected servers in China and Ireland. I think this is easier than redesigning your website.
The cheapest bang for the buck would come from sending suitable HTTP headers to tell web browsers they don't need to constantly check back with your server for freshness validation (i.e. avoid conditional HTTP requests). If you can do this for all external page objects (images, CSS, JS, etc.), the first page load will still take 10 seconds or whatever, but subsequent page loads should be very close to your Irish visitors'. This is just a matter of web server configuration; here's a tutorial if needed. To be honest, sending cache-friendly headers is always a good idea; it's how you get those snappy, responsive websites we all love.
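A minimal sketch of such cache-friendly headers for Apache, assuming mod_expires is enabled; the lifetimes are placeholders to tune:

```
# Far-future expiry for static assets so browsers skip revalidation
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType image/png              "access plus 30 days"
    ExpiresByType image/jpeg             "access plus 30 days"
    ExpiresByType text/css               "access plus 7 days"
    ExpiresByType application/javascript "access plus 7 days"
</IfModule>
```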
First of all, follow best practices to speed up your website. Then check YSlow to measure your site's speed and see what you can do to improve performance.
I think that changing servers won't give you better response times.
I think you should worry about optimization first, and then look for a good, powerful hosting company and hosting plan.
Cheers
Hosting/mirroring in China is definitely the way to go. It'll be faster for the people there, and more reliable. Ideally you would keep it hosted in Ireland too, with some kind of load balancing in place.
I would definitely mirror your site to a server somewhere in Asia. You have already made performance improvements that didn't yield much better results from China, so I assume network latency is causing most of the load time.
You can deploy your website to one of the cloud services; Amazon EC2 instances are available in Singapore and Tokyo, for example. Or you can deploy to a "regular" web host in China.
If you have to deliver mostly static files, check for a Content Distribution Network (CDN) service.
To speed up the load time, you may consider hosting your static files (images, CSS, JavaScript, documents) on a CDN (Content Delivery Network). CDNs work by hosting your files on multiple nodes around the world. When a request comes in, the server closest to the visitor serves the file, thus reducing latency.
You may want to try CloudFlare (http://cloudflare.com), it's a free product that serves as a reverse proxy and CDN of your website. Instead of the user accessing the site directly, CloudFlare will be the one to access your site and optimize it for delivery to the visitor.
Move to the cloud.
http://en.wikipedia.org/wiki/Cloud_computing
I'm in the process of doing the same thing. I will create a site mirror on the aliyun.com cloud. I'm hosting my site (www.discoverwish.com) at SiteGround in Singapore, but load times for my Chinese users without a VPN are incredibly slow.
I will use rsync to keep the information synced, then redirect Chinese users using DNSPod.
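A minimal sketch of that sync step, assuming SSH access to the mirror; the user, host, and paths below are placeholders:

```
# Push the web root to the Chinese mirror, deleting files removed locally
rsync -az --delete /var/www/mysite/ deploy@mirror.example.cn:/var/www/mysite/
```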
Regarding this comment: "You may want to try CloudFlare (http://cloudflare.com), it's a free product that serves as a reverse proxy and CDN of your website. Instead of the user accessing the site directly, CloudFlare will be the one to access your site and optimize it for delivery to the visitor."
We tried this for a client in China but, unfortunately, CloudFlare doesn't yet have a local version of their CDN working in China. We are currently testing Amazon's CDN, which seems to have positive reviews.
We are thinking of using DotNetNuke (ASP.NET) as the platform for creating an ASP.NET-based social networking site. Surprisingly, it's difficult to find a web host with unlimited disk space/bandwidth to store and stream lots of photos/videos.
On the other hand, PHP-based web hosting companies do provide unlimited disk space/bandwidth. Check it here.
So, I have two questions:
1) Do disk space/bandwidth offerings in web hosting have anything to do with the technology (PHP/ASP.NET) and OS (Windows/Linux)?
2) Recommendations for hosting providers with enough disk space/bandwidth for hosting an ASP.NET website.
The reason you can't find one is that no host can offer truly unlimited bandwidth; it is a finite resource. Check the terms of all these offerings and you will see that "unlimited" generally means "there's no arbitrary limit", which is not the same as "you can use any amount of bandwidth".
Using the list you gave, I went to HostMonster, which happened to be at the top of the list, and their terms state:
What "Unlimited" means. HostMonster.Com does not set an arbitrary limit or cap on the amount of resources a single Subscriber can use.
...snip ...
As a result, a typical website may experience periods of great popularity and resulting increased storage without experiencing any associated increase in hosting charges.
Basically, if you get slashdotted, they won't cut you off. That's not the same as being permanently busy.
What "Unlimited" DOES NOT mean. ... snip ... HostMonster.Com's offering of "unlimited" services is not intended to allow the actions of a single or few Subscribers to unfairly or adversely impact the experience of other Subscribers.
HostMonster.Com's service is a shared hosting service, which means that multiple Subscriber web sites are hosted from the same server and share server resources. HostMonster.Com's service is designed to meet the typical needs of small business and home business website Subscribers in the United States. It is NOT intended to support the sustained demand of large enterprises, internationally based businesses, or non-typical applications better suited to a dedicated server.
If you want as much bandwidth as you will need, you need to pay for it.
For something as large as a social networking site, you want to look into your own server.
PHP/MySQL is free, as it's open source, so hosting companies can offer you much more in the way of 'extras', such as a greater allowance of disk space and bandwidth. The Microsoft stack costs thousands, and they have to make that money back!
The best solution for your situation seems to be a VPS, all the control you need at a fraction of the cost of a dedicated server.
Look into some of the cloud storage solutions, like Amazon's S3 or Mosso's Cloud Files. They offer effectively unlimited space and bandwidth, but as everyone has said, nothing is free. The prices are very fair, though.
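For instance, offloading an upload to S3 with the AWS SDK for PHP looks roughly like this; the bucket and paths are placeholders:

```php
<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client([
    'region'  => 'us-east-1',
    'version' => 'latest',
]);

// Push a user-uploaded photo to S3 so the web server
// doesn't have to store or stream it itself.
$s3->putObject([
    'Bucket'     => 'my-social-site-media',
    'Key'        => 'photos/user-123/avatar.jpg',
    'SourceFile' => '/tmp/uploaded-avatar.jpg',
]);
```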