My client has their web hosting account on a shared server, and the account has been suspended because it was "causing critical server overload". I have looked at the code: it is procedurally written PHP that makes a lot of database queries, and most of them are "SELECT *". The database has tables with ten or more columns and more than 1,000 records.
I was wondering whether the cause could be SQL results never being freed, but I'm not sure when "script execution" finishes. Is it after a function has finished executing, or after the whole page has been rendered? Could it be the size of the tables (structure or number of records)? Does anyone have any other ideas?
It really depends on the kind of package your client was on, and what type of script it is: custom-coded, or a standard script like WordPress.
"Causing critical server overload" could be a multitude of things:
High memory usage: the script is not using a singleton pattern (e.g. for the database connection), or it's assigning huge amounts of data to arrays and variables, or including lots of files. Basically bad design and code smell.
High CPU: insanely long scripts, long loops with complex calculations inside, or infinite loops (sockets etc.) on each page view.
High network traffic: screen scrapers such as a crawler requesting large amounts of content from other sites, scripts that grab external content very frequently, or something like a torrent tracker.
High disk usage: constantly bombarding the server's I/O stack (reading from and writing to the disk continuously).
A script with lots of database queries could fall into: high disk usage (reading) + high memory usage (iterating over the result) + high CPU (doing stuff with the result).
You should use a tool to profile the script's performance locally, such as Xdebug or PQP, and find out what's really happening.
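As a sketch, assuming Xdebug 3 is installed, profiling can be switched on with a few php.ini lines (the output directory here is just an example); each request then writes a cachegrind file you can open in a profiler viewer:

```ini
; Enable Xdebug's profiler (Xdebug 3 syntax)
zend_extension=xdebug
xdebug.mode=profile
xdebug.output_dir=/tmp/profiles
```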
If your client is serious about their site, then they should invest in a VPS.
Make sure you are closing your SQL connections properly. If you are doing a lot of queries at once, it might be more efficient to leave the connection open for longer periods; or, if you are not closing connections after each query, maybe try doing that. I must say that 10 tables is not a lot, and it would surprise me if this were overloading the shared server.
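To illustrate the general idea of selecting only the columns you need and releasing resources when done, here is a minimal sketch using PDO with an in-memory SQLite database as a stand-in for the real MySQL connection (the table and column names are made up):

```php
<?php
// In-memory SQLite stands in for the client's MySQL database.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$pdo->exec('CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT, bio TEXT)');
$pdo->exec("INSERT INTO users (name, bio) VALUES ('alice', 'a very long biography...')");

// Select only the columns you need instead of SELECT *.
$stmt = $pdo->query('SELECT id, name FROM users');
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);

// Free the statement and close the connection when done.
$stmt = null;
$pdo = null;

print_r($rows);
```

With mysqli the equivalent cleanup calls would be `mysqli_free_result()` and `mysqli_close()`.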
I have a PHP application that is executed up to one hundred times simultaneously, and very often (it's a Telegram anti-spam bot with 250k+ users).
The script itself makes various DB calls (ticker updates, counters, etc.), but on each run it also loads some more or less 'static' data from the database, such as regexes or JSON config files.
My script is also doing image manipulation, so the server's CPU and RAM are sometimes under pressure.
A few days ago I ran into a problem: the OOM killer (triggered while apache2 was running) was killing the MySQL server process due to lack of available memory. The MySQL server was not restarting automatically, leaving my script broken for hours.
I have already made some code optimisations that let my server breathe, but what I'm looking for now is a caching method to store data between script executions, with the possibility of refreshing it at a time interval.
First I thought about a flat file in which I could serialize the data, but I would like to know whether that is a good idea performance-wise.
In my case, is there a benefit to caching data over repeating MySQL queries?
What are the pros/cons regarding speed of access and speed of execution?
Finally, what caching method should I implement?
I know that the simplest solution is to upgrade my server capacity, and I plan to do so soon.
Server is running Debian 11, PHP 8.0
Thank you.
If you could use a NoSQL store to serve those queries, it would speed things up dramatically.
Now if that is a no-go, you can go old school and keep that "static" data in the filesystem.
You can then create a timer of your own that runs, for example, every 20 minutes to update the files.
When you ask about speed of access and speed of execution, the answer will always be "it depends", but from what you describe it would be better to read from the file system than to constantly query the database for the same information.
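As a minimal sketch of the flat-file approach (the function and cache-key names are made up, and the 20-minute TTL mirrors the suggestion above):

```php
<?php
// Return cached data if the file is fresher than $ttl seconds;
// otherwise rebuild it via $refresh and rewrite the file.
function cached(string $key, int $ttl, callable $refresh)
{
    $file = sys_get_temp_dir() . '/cache_' . md5($key);

    if (is_file($file) && (time() - filemtime($file)) < $ttl) {
        return unserialize(file_get_contents($file));
    }

    $data = $refresh(); // e.g. the MySQL query that loads the 'static' data
    file_put_contents($file, serialize($data), LOCK_EX);
    return $data;
}

// Usage: the callback runs at most once every 20 minutes.
$regexes = cached('spam_regexes', 20 * 60, function () {
    return ['/buy now/i', '/free money/i']; // stand-in for a DB query
});
print_r($regexes);
```

For this workload a shared-memory cache such as APCu or Redis would serve the same purpose; the flat file is just the dependency-free version.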
The complexity, consistency, etc, lead me to recommend against a caching layer. Instead, let's work a bit more on other optimizations.
OOM implies that something is tuned improperly. Show us what you have in my.cnf. How much RAM do you have? How much RAM does the image processing take? (PHP's image* library is something of a memory hog.) We need to start by knowing how much RAM MySQL can have.
For tuning, please provide GLOBAL STATUS and VARIABLES. See http://mysql.rjweb.org/doc.php/mysql_analysis
That link also shows how to gather the slowlog. In it we should be able to find the "worst" queries and work on optimizing them. For "one hundred times simultaneously", even fast queries need to be further optimized. When providing the 'worst' queries, please provide SHOW CREATE TABLE.
Another technique is to decrease the number of children that Apache is allowed to run. Apache will queue up others. "Hundreds" is too many for Apache or MySQL; it is better to wait to start some of them rather than having "hundreds" stumbling over each other.
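As a sketch, assuming Apache's prefork MPM, the cap on simultaneous children can be lowered in the Apache configuration; the numbers below are illustrative, not a recommendation:

```apacheconf
<IfModule mpm_prefork_module>
    StartServers          5
    MinSpareServers       5
    MaxSpareServers      10
    # Cap concurrent children; requests beyond this are queued.
    MaxRequestWorkers    50
</IfModule>
```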
In a PHP shared hosting environment, what would be an optimal amount of memory for loading a page? My current PHP script is consuming 3,183,440 bytes of memory. What should I consider good memory usage if I want to serve, say, 10,000 users in parallel?
Please be detailed, as I am a novice at optimization.
Thanks in advance
3MB isn't that bad - keep in mind that parts of PHP are shared. Depending on which server is used (IIS, nginx, Apache, etc.) you can also configure pools and clusters when you have to scale up.
But the old adage "testing is knowledge" applies well here: run load tests on the site with 10 -> 100 -> 1000 concurrent connections and look at the performance metrics; it will give you more insight into how much memory is required.
For comparison, the site I normally work on has an average of 300+ users concurrently online and memory usage just under 600MB; however, when I run certain processes locally, a single one will easily use up 16MB.
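As a sketch of such a stepped load test, ApacheBench can replay a fixed number of requests at increasing concurrency (the URL is a placeholder for your own site):

```
# 1000 requests each at 10, 100, then 1000 concurrent connections
ab -n 1000 -c 10   http://example.com/
ab -n 1000 -c 100  http://example.com/
ab -n 1000 -c 1000 http://example.com/
```

Compare the "Requests per second" and latency percentiles between runs while watching the server's memory.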
I'm building a PHP application with an API that has to be able to respond very rapidly (within 100ms) to all requests, and must be able to handle up to 200 queries per second (requests are in JSON, and every response requires a DB lookup plus a save). My code easily runs fast enough (very consistently around 30ms) for single requests, but as soon as it has to respond to multiple requests per second, the response times start jumping all over the place.
I don't think it's a memory problem (PHP's memory limit is set to 128MB and the code's memory usage is only around 3.5MB) or a MySQL problem (the code before any DB request is as likely to bottleneck as the bit that interacts with the DB).
Because the timing is so important, I need to get the response times as consistent as possible. So my question is: are there any simple tweaks I can make (to php.ini or Apache) to stabilise PHP's response times when handling multiple simultaneous requests?
In my experience, one of the slowest parts of a server (and the easiest bottleneck to fix) is going to be the filesystem and hard drives. I think speeding this up will help in all other areas.
So you could, for example, upgrade the hard drive where your httpdocs and database reside. You could put them on an SSD, for example, or even make a RAM disk and place all files on it.
Alternatively, you can set up your database tables to use the MEMORY storage engine.
Of course, for all of that you'll need a lot of physical memory. It is also important to note that if your web/app hosting is shared, you're going to have problems, since that memory is shared with everyone else on the box.
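As a sketch of the MEMORY engine idea (the table name and columns are made up; note that MEMORY tables lose their contents on server restart, so they only suit reproducible data):

```sql
CREATE TABLE session_cache (
    token   CHAR(32)     NOT NULL PRIMARY KEY,
    payload VARCHAR(255) NOT NULL
) ENGINE=MEMORY;
```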
Tune MySQL
Tune Apache
Performance-tune PHP
Get Zend Optimizer enabled, or look at APC or eAccelerator
Here are some basic LAMP tuning tips from IBM
Here's a slideshare presentation with some good advice as well
Hi guys, I have a question about server RAM and a PHP/MySQL/jQuery script.
Can a script eat up RAM even when it shouldn't need extra RAM? (I know this can happen when RAM usage grows to the maximum, or because of the memory limit, but that isn't the case here.)
I'm testing a script, and every time I do, free RAM quickly goes down.
The script doesn't show a memory-limit error and loads all data correctly. Even when I'm not testing the script, RAM stays down.
The database holds only a handful of records - maybe 350 records across 9 tables (the biggest table has 147 records).
(I don't have any logs, just a simple (really simple) graph of the running server.)
Thanks for your time.
If you're not getting errors in your PHP error log about failing to allocate memory, and you're not seeing other problems with your server running out of RAM (such as extreme performance degradation due to memory pages being written to disk for demand paging) you probably don't need to really worry about it. Any use case where a web server uses up that much memory in a single request is going to be pretty rare.
As for trying to profile the actual memory usage, trying to profile it by watching something like the task manager is going to be pretty unreliable. Most PHP scripts are going to complete in milliseconds, which isn't enough time for the memory allocations to really even register in the task manager.
Even if you have a more reliable method of profiling the memory usage (I don't recall whether PHP has built-in functions for this, but it probably does), bear in mind that memory usage is going to fluctuate tremendously for reasons that may be hard to understand. PHP in particular is very high level: you can open a database connection, which involves everything down to the OS opening network sockets, creating internal data structures, caching things, and much more, all in a single line of code. The script may allocate many megabytes of memory for such a thing for a single database row, but may then deallocate it a millisecond later.
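For what it's worth, PHP does ship memory introspection functions; a minimal sketch:

```php
<?php
// Bytes currently allocated to this script.
$before = memory_get_usage();

// Allocate something sizeable so the difference is visible.
$data = range(1, 100000);

$after = memory_get_usage();
$peak  = memory_get_peak_usage();

echo "grew by " . ($after - $before) . " bytes, peak " . $peak . " bytes\n";
```

Sampling these around suspect code sections is far more reliable than watching the task manager.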
Those database sizes are pretty negligible. Depending on the row sizes it's possibly under a megabyte of data, which is a tiny drop in the bucket for memory on anything remotely modern. Don't worry about memory usage for something like that. Only if you see your scripts failing, and your error log reports running out of memory, should you really worry about it.
The first page I load from my site after not visiting it for 20+ mins is very slow. Subsequent page loads are 10-20x faster. What are the common causes of this symptom? Could my server be sleeping or something when it's not receiving http requests?
I will answer this question generally because I'm sure it's something that confuses a lot of newcomers.
The really short answer is: caching.
Just about every program in your computer uses some form of caching to remember data that has already been loaded/processed recently, so it doesn't have to do the work again.
The size of the cache is invariably limited, so stuff has to be thrown out. And 99% of the time the main criterion for expiring cache entries is: how long ago was this last used?
Your operating system caches file data that is read from disk
PHP caches pages and keeps them compiled in memory
The CPU caches memory in its own special faster memory (although this may be less obvious to most users)
And some things that are not actually a cache, work in the same way as cache:
Virtual memory, aka swap: when there's not enough memory available for certain programs, the operating system has to make room for them by moving chunks of memory onto disk. On more recent operating systems, the OS will do this just so it can make the disk cache bigger.
Some web servers like to run multiple copies of themselves, and share the workload of requests between them. The copies individually cache stuff too, depending on the setup. When the workload is low enough the server can terminate some of these processes to free up memory and be nice to the rest of the computer. Later on if the workload increases, new processes have to be started, and their memory loaded with various data.
(Note, the wikipedia links above go into a LOT of detail. I'm not expecting everyone to read them, but they're there if you really want to know more)
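The least-recently-used eviction described above can be sketched in a few lines of PHP (a toy illustration, not a production cache; real caches handle sizes, expiry, and concurrency):

```php
<?php
// Toy LRU cache: keeps at most $capacity entries, evicting the
// least recently used one when full.
class LruCache
{
    private array $items = [];

    public function __construct(private int $capacity) {}

    public function get(string $key)
    {
        if (!array_key_exists($key, $this->items)) {
            return null;
        }
        // Move the entry to the end: it is now the most recently used.
        $value = $this->items[$key];
        unset($this->items[$key]);
        $this->items[$key] = $value;
        return $value;
    }

    public function put(string $key, $value): void
    {
        unset($this->items[$key]);
        $this->items[$key] = $value;
        if (count($this->items) > $this->capacity) {
            // Evict the least recently used entry (front of the array).
            array_shift($this->items);
        }
    }
}

$cache = new LruCache(2);
$cache->put('a', 1);
$cache->put('b', 2);
$cache->get('a');    // 'a' is now most recently used
$cache->put('c', 3); // evicts 'b', the least recently used
var_dump($cache->get('b')); // NULL
```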
It's probably not sleeping. The site just hasn't been visited for a while, so its resources have been released, and it takes time to get things started again.
If the site were visited frequently by many users, it would respond quickly every time.
It sounds like it could be caching. Is the server running on the same machine as your browser? If not, what's the network configuration (same LAN, etc.)?