Codeigniter and too many database connections? (Unable to connect...) - php

I have a lot of cronjobs (50-100) which all start at the same time (refreshing data for every single client). There are many different jobs to do in a single hour, so I can't stagger the start times of the jobs. And I decided not to make one loop but single jobs, so that a possible error doesn't affect the refreshes of the other clients.
At first everything was OK, but now, with about 100 clients, nearly 30% of the jobs end up with:
A Database Error Occurred
Unable to connect to your database server using the provided settings.
Filename: core/Loader.php
Line Number: 346
But the max. connections of MySQL are NOT reached. I've already tried switching between connect and pconnect, but that has no effect.
Any idea where the bottleneck is? And how to avoid this?

The default max connections setting is 150. If you have 100 clients and 50 to 100 cronjobs that all do database queries, that comes to roughly 100 * 100 = 10,000 connections, at a minimum.
If you have 10,000 connection attempts at the same time you can get weird errors, for example timeouts or concurrency problems (one script locks a table and another tries to access it; this shouldn't produce an "unable to connect" error, but in some cases it does). You can try to bundle the queries.
What happens if you raise the max connections to 400 or so? Does it reduce the number of errors?
A workaround might be that when a job fails, you wait a second or so and try it again. More stable would be the use of a queuing mechanism like Gearman, which helps spread the load.
Edit:
Codeigniter closes connections for you, but you can do it also manually by using
$this->db->close();
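If you want to go the retry route mentioned above, a minimal sketch could look like this; the helper name, attempt count, delay and credentials are arbitrary assumptions for illustration, not CodeIgniter API:
// Hypothetical helper: retry the MySQL connection a few times before
// letting the cronjob fail.
function connect_with_retry($host, $user, $pass, $db, $attempts = 5, $delay = 1)
{
    for ($i = 1; $i <= $attempts; $i++) {
        $conn = @mysqli_connect($host, $user, $pass, $db);
        if ($conn !== false) {
            return $conn;                          // connected, hand it back to the job
        }
        error_log("Connection attempt $i failed: " . mysqli_connect_error());
        sleep($delay);                             // wait a moment before the next attempt
    }
    return false;                                  // let the caller decide how to fail the job
}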

Related

What is the best way to minimize number of connections?

My host has a really, really low number of max connections for a database user. This is the error my users are getting:
User 'username_here' has exceeded the 'max_user_connections' resource (current value: 15).
I don't think it's in my power to raise that value, unless I upgrade to a much more expensive plan, so I'm looking for a way to use these 15 connections effectively.
My way of handling connections is the following: I connect to the database in a script I load at the start of every page load, and then I pass that connection to a function that runs the queries. I thought I could minimize the time a connection is open by opening the connection inside the query function and closing it right after the return statement. Is that fine, or am I making things more complicated for no reason?
As a last resort, I was thinking of putting the connection inside a try/catch and attempting to reconnect every few seconds, a few more times. Would that be a wise thing to do, or is it even worse?
Here's how you can optimize the number of connections:
Make sure that you are not using persistent connections anywhere. This is the easiest way to lose track of open connections and the most common reason for running out of available connections. In mysqli a persistent connection is opened by prepending p: to the hostname when connecting.
Make sure that you are only opening a single connection on each HTTP request. Don't open and close them repeatedly, as this can quickly get out of hand and will hurt your application's performance. Have a single global connection that you pass around to the functions that need it (see the sketch after these points).
Optimize your queries so that they are processed faster and free up the connection quicker. This also applies to optimizing indexes and getting rid of the N+1 problem. (From experience I can say that PDO helps a lot in refactoring your code to avoid poorly designed queries.)
If you need to perform some other time-demanding task in the same process, do all your SQL operations first and then close the connection. Same applies to opening the connection. Open it only when you know you will need it.
If you find yourself exceeding the 'max_user_connections' limit, it means that your web server is not configured properly. In an ideal scenario the MySQL connections would be unlimited, but on shared hosting this limitation has to be put in place to protect against resource abuse (either accidental or deliberate). However, the number of available MySQL connections should match the number of available server threads. This can be a very opinionated topic, but I would say that if your application needs to perform some SQL operation on every request, then the number of available server connections should not exceed the number of available MySQL connections. On Apache, you can calculate the number of possible connections as shown in this link.
On a reasonably designed application, even with 15 concurrent MySQL connections you should still be able to handle a satisfactory number of requests per second. For example, if each request holds its connection for 100 ms, those 15 connections can serve about 150 requests per second.
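To make the single-connection advice above concrete, here is a minimal sketch assuming PDO; the DSN, credentials and the load_user() helper are placeholders for illustration:
// Open one connection per request and pass it to whatever needs it.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'username_here', 'secret', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

// Example consumer: takes the shared connection instead of opening its own.
function load_user(PDO $pdo, int $id): ?array
{
    $stmt = $pdo->prepare('SELECT * FROM users WHERE id = ?');
    $stmt->execute([$id]);
    return $stmt->fetch(PDO::FETCH_ASSOC) ?: null;
}

// ... run all SQL work through $pdo, then drop it before any slow non-SQL work ...
$pdo = null;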

MYSQL optimize for a lot of GET_LOCK and high number of connections (Codeigniter Sessions)

I use Codeigniter 3 with database-backed sessions. It uses SELECT GET_LOCK("$SESSION_ID", 300) to prevent concurrent requests from causing issues with session data.
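Roughly, per request that means something like this (a simplification of the named-lock pattern, not the actual driver code):
// Each request blocks here until the previous request for the same
// session releases the lock, which is why waiting connections pile up.
$this->db->query('SELECT GET_LOCK(?, 300)', array($session_id));
// ... read and update the ci_sessions row for this session ...
$this->db->query('SELECT RELEASE_LOCK(?)', array($session_id));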
This works fine until I have a ton of traffic, causing a bunch of open connections that all call this function and wait for the lock to be released.
It sometimes brings the application to a halt for a couple of minutes. It never reaches my max number of connections in MySQL, and CPU and RAM usage are fine, but performance is slow for all users of that database server (not sure why it doesn't perform OK). I do use pt-kill to remove any queries taking longer than 15 seconds.
I already tried using the Redis driver, but it had its own performance issues, so I am back to the database sessions.
So my questions are:
How do I optimize my application to perform well when I get a ton of traffic that causes GET_LOCK queries to pile up and open connections to MySQL? I was thinking I could use persistent connections, but I'm not sure if that is a good idea, as most people recommend against this.

PHP Gearman too much mysql connections

I'm using Gearman in a custom Joomla application and using Gearman UI to track active workers and the number of jobs.
I'm facing an issue with MySQL load and the number of connections. I'm unable to track down the cause, but I have a few questions that might help me.
1- Do Gearman workers launch a new database connection for each job, or do they share the same connection?
2- If Gearman launches a new connection every time a job runs, how can I change that to make all jobs share the same connection?
3- How can I balance the load between more than one server?
4- Is there something like a "pay-as-you-go" package for MySQL hosting? If yes, please mention them.
Thanks a lot!
This is often an overlooked issue when using any kind of a job queue with workers. 100 workers will each open a separate database connection (they are separate PHP processes). If MySQL is configured to allow 50 connections, workers will start failing. To answer your questions:
1) Each worker runs inside its own PHP process, and that process will open one database connection. Workers do not share database connections.
2) If only one worker is processing jobs, then only one database connection will be opened. If you have 50 workers running, expect 50 database connections. Since these are not web requests, persistent connections will not help and sharing will not work.
3) You can balance the load by adding READ slaves, and using a MySQL proxy to distribute the load.
4) I've never seen a pay-as-you-go MySQL hosting solution. Ask your provider to increase your number of connections. If they won't, it might be time to run your own server.
Also, the Gearman server process itself will only use one database connection to maintain the queue (if you have enabled MySQL storage).
Strategies you can use to make your worker code play nicely with the database (a sketch combining a few of them follows the list):
After each job, terminate the worker and start it up again. Don't open a new database connection until a new job is received. Use supervisor to keep your workers running all the time.
Close database connections after every query. If you see a lot of connections open in a 'sleep' state, this will help clean them up and keep the database connection count low. Try $pdo = null; after each query (if you use PDO).
Cache frequently used queries where the result doesn't change, to keep database connections low.
Ensure your tables are properly indexed so queries run as fast as possible.
Ensure database exceptions are caught in a try/catch block. Add retry logic (a while loop) so the worker fails gracefully after, say, 10 attempts. Make sure the job is put back on the queue after a failure.
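Putting a few of those strategies together, a worker might look something like the sketch below; the job name, DSN, credentials and query are assumptions made up for the example:
// Open a PDO connection lazily per job, retry on failure, and drop the
// connection after every attempt so it never sits idle in 'sleep'.
$worker = new GearmanWorker();
$worker->addServer('127.0.0.1', 4730);

$worker->addFunction('process_order', function (GearmanJob $job) {
    $attempts = 0;
    while (true) {
        $pdo = null;
        try {
            // Open the connection only once a job has actually arrived.
            $pdo = new PDO('mysql:host=db;dbname=shop', 'worker', 'secret', [
                PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
            ]);
            $stmt = $pdo->prepare('UPDATE orders SET status = ? WHERE id = ?');
            $stmt->execute(['processed', $job->workload()]);
            return 'ok';
        } catch (PDOException $e) {
            if (++$attempts >= 10) {
                $job->sendFail();      // fail gracefully so the job can be re-queued
                return '';
            }
            sleep(1);                  // brief pause before retrying
        } finally {
            $pdo = null;               // drop the connection after every attempt
        }
    }
});

while ($worker->work());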
I think the most important thing to look at, before anything else, is the MySQL load. It might be that you have some really heavy queries that are causing this mess. Have you checked the MySQL slow query log? If yes, what did you find? Note that any query that takes more than a second to execute is a slow query.

Mysql PHP dropping connection during long script

I am having a problem that I can't seem to figure out, involving a long PHP script that does not complete due to a database connection failure.
I am using PHP 5.5.25 and MySQL 5.6.23, with the mysqli interface.
The script uses the TCPDF library to create a PDF of a financial report. Overall it runs fine. However, when the data set gets large (the report can iterate over numerous accounts to create a multi-page report with all the accounts that match the criteria), it fails after about 30 seconds (not exactly 30; sometimes a couple of seconds more, going by timestamps). It seems to run fine for about 25-35 loops, but more than that causes the problem.
I don't think it's an issue of timing out (although it certainly could be). I have PHP set to fairly generous resource limits to process this.
max_execution_time = 600
memory_limit = 2048M
The script does hit the DB pretty hard, with hundreds of queries per second. As best as I can tell from some stats from the DB, there are only a couple of active connections at a time, so it does not appear that I am anywhere close to the default setting of 150 max connections.
This is the error I get when it eventually fails with a large data set.
Warning: mysqli::mysqli(): (HY000/2002): Can't assign requested address in...
Fatal error: Database connection failed: Can't assign requested address in...
Does anyone have any suggestions on what may be causing the script to eventually be unable to connect to the DB and fail to complete? I've tried searching for answers, but pretty much everything I have found so far about database connection failures is about not being able to connect at all, rather than failing midway through a large script.
Thanks in advance for any advice.
I don't think it's an issue of timing out
You should know.
It seems strange that the issue is arising so long after the start of execution. Would it have been so hard to check what the timeout is? To try changing it? To add some logging to your code?
The other thing you should be checking is whether the script is opening a single connection and reusing it or constantly opening new connections.
Without seeing the code, it's hard to say for sure, but a single script executing hundreds of queries per second for tens of seconds sounds like the split between SQL and PHP logic has been very poorly thought out.
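To illustrate the reuse point above, compare the two patterns below; the table, account loop and credentials are invented for the example:
// Opening a fresh connection for every account is what eventually
// exhausts local ports and produces "Can't assign requested address".
foreach ($accounts as $account) {
    $db = new mysqli('localhost', 'report', 'secret', 'finance');
    $result = $db->query('SELECT balance FROM ledger WHERE account_id = ' . (int) $account);
    $db->close();
}

// Connecting once and reusing the handle avoids the problem entirely.
$db = new mysqli('localhost', 'report', 'secret', 'finance');
$stmt = $db->prepare('SELECT balance FROM ledger WHERE account_id = ?');
foreach ($accounts as $account) {
    $stmt->bind_param('i', $account);
    $stmt->execute();
    $stmt->store_result();             // buffer the rows; fetch them as needed
}
$stmt->close();
$db->close();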

How to find root cause for "too many connections" error in MySQL/PHP

I'm running a web service which runs algorithms that serve millions of calls daily and run some background processing as well.
Every now and then I see a "Too many connections" error in attempts to connect to the MySQL box for a few seconds. However, this is not necessarily attributable to high-traffic times or anything else I can put my finger on.
I want to find the bottleneck causing it. Other than at the specific times this happens, the server isn't too loaded in terms of CPU and memory, has 2-3 connections (threads) open, and everything works smoothly. (I use Zabbix for monitoring.)
Any creative ideas on how to trace it?
Try to have an open MySQL console when this happens and issue SHOW PROCESSLIST; to see what queries are being executed.
Alternatively, you could enable slow query logging. In my.cnf, insert this line in the [mysqld] section:
log-slow-queries=/var/log/mysql-log-slow-queries.log
and use set-variable=long_query_time=1 to define the minimum time a query should take in order to be considered slow. (Remember to restart MySQL for the changes to take effect.)
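On more recent MySQL versions the slow query log settings are spelled differently; the equivalent my.cnf lines would be roughly:
slow_query_log = 1
slow_query_log_file = /var/log/mysql-log-slow-queries.log
long_query_time = 1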
What MySQL table type are you using, MyISAM or InnoDB (or another one)? MyISAM uses table-level locking, so you could run into a scenario where a heavy SELECT is running, followed by an UPDATE on the same table and numerous further SELECT queries. The later SELECT queries then have to wait until the UPDATE is finished (which in turn has to wait until the first, heavy SELECT is finished).
For InnoDB a tool like innotop could be useful to find the cause of the deadlock (see http://www.xaprb.com/blog/2006/07/31/how-to-analyze-innodb-mysql-locks/).
BTW, the query that is causing the lock should be one of those not in the Locked state.
The SHOW OPEN TABLES command will display the lock status of all the tables in MySQL. If one or more of your queries is causing a connection backlog, combining SHOW PROCESSLIST with the open tables output should narrow down exactly which query is holding up the works.
Old topic. However, I just had this issue, and it was because I had a mysqldump script scheduled to run three times per day. At those times, if my web application was also getting a fair amount of usage, all of the web application queries queued up on top of each other while mysqldump was locking all of the tables in the database. The best option is to set up a replication slave on a separate machine and take your backups from the slave rather than from the production server.
May be related to this bug in MySQL for FULLTEXT search:
http://bugs.mysql.com/bug.php?id=37067
In this case, the FULLTEXT initialization actually hangs MySQL. Unfortunately there doesn't seem to be a solution.
Without knowing too much about your implementation, or PHP in general: are you sure that you do not have any problems with lingering DB connections, e.g. connections that stay open even after the request has been processed?
In PHP a connection is usually closed automatically when the script ends or when calling mysql_close($conn);, but if you use any sort of homegrown connection pooling, that could introduce problems.
