Execution Creates "MySQL server has gone away" Error - php

I am using a PHP script which is meant to execute continuously for a long time (a one-time mapping script) and insert data into a MySQL database. The script works fine; however, eventually (after a minute or two) it fails with this error:
MySQL server has gone away
I have changed the ini settings for both the MySQL connection timeout and the PHP script execution timeout, but neither has changed the outcome.
I have written a VERY similar script in the past that ran on the same server for long stretches without ever hitting this error.
Thank you for your time; hopefully your help can solve this problem for me and for any other frustrated scripter who comes across this post in the future.

There are many reasons for this to happen: timeouts, oversized packets, etc.
Please check the manual page linked below.

Did you restart mysqld after the config changes?
Do you have enough memory, so that the process isn't killed by the OOM killer?
UPDATE: here is the solution - you need to set wait_timeout:
http://dev.mysql.com/doc/refman/5.0/en/gone-away.html
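If you cannot change the server-wide setting (on shared hosting, for example), a minimal sketch of raising the timeout for a single connection from PHP - the credentials and the 28800-second value here are placeholders:
$mysqli = new mysqli('localhost', 'user', 'pass', 'db'); // placeholder credentials
// Raise the idle timeout for this session only (value in seconds).
$mysqli->query('SET SESSION wait_timeout = 28800');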

Check max_allowed_packet in the MySQL server settings. The client cannot send packets larger than this, or the MySQL server will close the connection.
This mainly matters for data inserts, where the query string can get very long. It does not affect SELECTs, as the server will automatically enlarge its sending packet.
A good size would be the size of your data multiplied by two. The multiplication is needed because data is often escaped before sending, and a header and possibly a footer are added to the SQL query.
The point of max_allowed_packet is to control the server's memory usage and to limit DoS attacks.
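To see what the server is currently using, a quick sketch from PHP, assuming an existing mysqli connection in $mysqli:
$result = $mysqli->query("SHOW VARIABLES LIKE 'max_allowed_packet'");
$row = $result->fetch_assoc();
printf("max_allowed_packet: %d bytes\n", $row['Value']);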


Getting "MySQL Server has gone away" in PHP, even if I'm closing my database connection

I have a PHP Socket Server that I can connect to via Telnet. Once connected, I am able to send messages in a certain format and they're saved in the database.
What happens is that when the PHP Socket receives a message, it opens a Database connection, executes the query, then closes the connection. When it receives another message, it opens the connection again, executes the query, then closes the connection.
So far, this works when I'm sending messages at an interval of 5-10 minutes. However, when the interval grows to over an hour or so, I get a MySQL server has gone away error.
Upon doing some research, the common solution seems to be increasing the wait time, which is not an option for me. The PHP Socket server is supposed to be open 24/7, and I doubt there's a way to increase the wait time to infinity.
The other option is to check in PHP itself whether the MySQL server has gone away and, depending on the result, reset the connection and try to establish it again.
Does anyone know how to do this in PHP? Or if anyone has another method of keeping a MySQL connection constantly alive in a PHP Socket server, I would be open to that idea as well.
You'll get this error if the database connection has timed out (perhaps from a long-running process since the last time the connection was used).
You can check the connection, and restore it if necessary, with a single call:
$mysqli->ping()
Note that ping() only reconnects automatically if the mysqli.reconnect ini setting is enabled, and that option is not available with the mysqlnd driver; otherwise you must reconnect yourself when ping() returns false.
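A minimal sketch of the check-and-reconnect pattern (the helper name and credentials are placeholders):
function ensureConnection(mysqli $mysqli)
{
    if (@$mysqli->ping()) {
        return $mysqli; // connection is still alive
    }
    // The connection was dropped; open a fresh one.
    return new mysqli('localhost', 'user', 'pass', 'db'); // placeholder credentials
}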
This error means that the connection to the DBMS was established but has subsequently been lost. I've run sites with hundreds of concurrent connections to the same database instance handling thousands of queries per minute, and have only seen this error when a query was deliberately killed by an admin. You should check your config for interactive_timeout and read your server logs.
Given your description of the error (you really need to gather more data and characterize the circumstances of the error), the only explanation that springs to mind is that there is a deadlock somewhere.
I would question whether there is any benefit to closing the connection after each message (depending on the usage of the system). Artistic phoenix's comment is somewhat confused. If there are capacity issues, I'd suggest using persistent connections with your existing open/close model, but I doubt that is relevant to the problem you describe here.
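For reference, mysqli requests a persistent connection by prefixing the hostname with p: - a sketch, with placeholder credentials:
// The "p:" prefix asks mysqli to reuse a pooled connection instead of opening a new one.
$mysqli = new mysqli('p:localhost', 'user', 'pass', 'db');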

MySQL "Sending to client" processes backing up

I'm currently trying to fix an issue with our production server not being able to handle SQL queries.
Looking at the process list, MySQL is taking 120+ seconds to complete queries that, when I run them myself through HeidiSQL, complete in less than a second. Why would queries coming from PHP take significantly longer (and in most cases time out) than the same queries run directly from HeidiSQL?
You are probably using a persistent connection, which can cause problems like this when a previous PHP script that used the connection was stopped in the middle and never finished.
Read more here: What are the disadvantages of using persistent connection in PDO
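To rule this out, a sketch of explicitly disabling persistence in PDO (the DSN and credentials are placeholders):
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass', [
    PDO::ATTR_PERSISTENT => false, // a fresh connection each time
]);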
Turns out the problem was that the server where PHP was running (at a different hosting provider, as we're migrating to the cloud) had a throttled network connection and was unable to handle all of the data being sent back from MySQL. Turning on caching on the PHP side solved the problem.

MySQL PHP dropping connection during long script

I am having a problem that I can't seem to figure out, involving a long PHP script that does not complete due to a database connection failure.
I am using PHP 5.5.25 and MySQL 5.6.23, with the mysqli interface.
The script uses the TCPDF library to create a PDF of a financial report. Overall it runs fine. However, when the data set gets large (the report can iterate over numerous accounts to create a multi-page report with all the accounts that match the criteria), it will fail after about 30 seconds (not exactly 30; sometimes a couple of seconds more, going by the timestamps). It seems to run fine for about 25-35 loops, but more than that causes the problem.
I don't think it's an issue of timing out (although it certainly could be). I have PHP set to fairly generous resource limits for this.
max_execution_time = 600
memory_limit = 2048M
The script does hit the DB pretty hard, with hundreds of queries per second. As best as I can tell from some stats on the DB, there are only a couple of active connections at a time, so it does not appear that I am anywhere close to the default limit of 150 max connections.
This is the error I get when it eventually fails with a large data set.
Warning: mysqli::mysqli(): (HY000/2002): Can't assign requested address in...
Fatal error: Database connection failed: Can't assign requested address in...
Does anyone have any suggestions on what may be causing the script to eventually fail to connect to the DB and never complete? I've tried searching for answers, but pretty much everything I have found so far about database connection failures covers not being able to connect at all, rather than losing the ability to connect midway through a large script.
Thanks in advance for any advice.
I don't think it's an issue of timing out
You should know.
It seems strange that the issue arises so long after the start of execution. Would it have been so hard to check what the timeout is? To try changing it? To add some logging to your code?
The other thing you should check is whether the script opens a single connection and reuses it, or constantly opens new connections.
Without seeing the code, it's hard to say for sure, but a single script executing hundreds of queries per second for tens of seconds sounds like the split between SQL and PHP logic has been very poorly thought out.
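If the script does reconnect for every query, each new connection consumes a local ephemeral port, and exhausting those ports is a classic cause of the "Can't assign requested address" error above. A sketch of the reuse pattern instead; the credentials, table, and column names are invented for illustration:
$mysqli = new mysqli('localhost', 'user', 'pass', 'reports'); // placeholder credentials
$stmt = $mysqli->prepare('SELECT balance FROM ledger WHERE account_id = ?'); // hypothetical query
foreach ($accountIds as $id) {
    $stmt->bind_param('i', $id);
    $stmt->execute();
    $result = $stmt->get_result(); // requires the mysqlnd driver
    // ... render this account's section of the PDF ...
}
$stmt->close();
$mysqli->close();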

maximum query length in mysqli_query

How can I determine the maximum length of the $query parameter accepted by mysqli_multi_query() (or mysqli_query()) in PHP?
I have a PHP program which generates a large string made of UPDATE SQL commands separated by ';'. The problem is that if that string exceeds a certain length, mysqli_multi_query() generates an error like 'MySQL server has gone away'. I noticed that the length seems to be around 1 MB, but how can I probe it so that I can make sure I never exceed that length?
The script needs to run about 7000 updates on 25 or so fields. Executing one update at a time proved very slow; concatenating multiple updates runs much faster.
Any possibility to run multiple queries even faster?
Thank you for any advice!
You should take a look at MySQL error logs.
If you don't have access to the machine (hosting, etc.), you can ask your administrator or helpdesk for that log.
MySQL supports very big queries. I'm not sure if there is any hard limit, but when you are working over the network you may run into problems with packet size.
Check --max_allowed_packet in the MySQL configuration and try setting a bigger packet size. I'm not sure about the default configuration, but it may be 1 MB, which may be too small a value for a query with 7000 updates at once.
MySQL may need more RAM to process a query like this.
If you can't reconfigure MySQL, you will have to split your big query into smaller queries somehow.
You may also read this for more information:
devshed - MySQL server has gone away
You asked:
Any possibility to run multiple queries even faster?
There is no simple answer to that question. It depends on the query, the database schema, etc.
Increasing the MySQL cache size in the configuration file may help a lot in most cases involving big, simple updates without much computation, because the database engine will then operate on RAM rather than on the hard disk. When a big cache is used, the first big query may sometimes be slower because the data is not yet loaded into RAM, but once it loads, queries that need a lot of read/write operations will run much faster.
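One concrete way to run many small updates faster without building one giant string is a single transaction around a prepared statement - a sketch, with invented table and column names:
$mysqli->begin_transaction();
$stmt = $mysqli->prepare('UPDATE items SET price = ? WHERE id = ?'); // hypothetical statement
foreach ($rows as $row) {
    $stmt->bind_param('di', $row['price'], $row['id']);
    $stmt->execute();
}
$stmt->close();
$mysqli->commit(); // one commit instead of thousands of implicit ones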
Added later:
I assume your data processing needs PHP's unserialize() function, which may be hard to implement in pure SQL, so you have to do it in PHP :) If you have access to the server console, you can create a cron (Linux scheduler) job that calls the PHP script from the shell during the night.
Added even later:
After the discussion in the comments, I have one more idea. You can make a backup of the full database or a single table from phpMyAdmin, download it, and restore the data on your home computer (on Windows you can use XAMPP or WAMP server). On your home computer you can run mysql.exe and process the data locally.
I found a limit of 16 field/value pairs on an INSERT statement. Beyond that number I got a "Forbidden" error. My total INSERT statement length on the working statement was 392 characters.
Use a for loop to do any massive work and just use regular mysqli_query(); I got over 16000 queries through like that. Some things have to be changed in php.ini as well: post_max_size needs to be raised (make it as big as you can if you're sending a lot of characters), max_input_vars should be raised if you're sending a lot of different variables, and memory_limit should be increased to at least 256M. Make sure you're not running out of system memory while the queries run. If you do everything right you can send over 20000 queries. You might also need to increase the timeout settings from 30 and 60 to 200 or higher when sending really large numbers of queries. If you don't change these settings, your POST will fail even if everything else is correct, because PHP aborts the script once you exceed any php.ini limit. Done one query at a time like this, you don't have to change any MySQL settings, though it will take some time if you're inserting or updating anything over 1000 queries.
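For reference, a sketch of the php.ini directives that answer refers to; the values are illustrative, not recommendations:
post_max_size = 64M
max_input_vars = 5000
memory_limit = 256M
max_execution_time = 200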

Increasing MySQL server timeout value

I have a find() call which sometimes takes a long time to complete, depending on the date range selected by the user. This can sometimes cause the server to time out (2006: MySQL server has gone away), causing the find() to fail. I have tried altering the timeout value using the following:
ini_set('mysql.connect_timeout', 5);
My presumption is that this is failing because I cannot override the server settings on the hosting package.
I was advised by the hosting company to use the following code:
SET @@session.wait_timeout=60
I would be very grateful for any advice on increasing the MySQL server timeout through CakePHP.
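For illustration, the advised statement only needs to be issued once per connection; a sketch with a raw mysqli handle (in CakePHP you would typically run it through your model's query() method):
// Per-connection setting; no server-wide privileges required.
$mysqli->query('SET @@session.wait_timeout = 60');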
I guess you should increase the PHP request timeout rather than the MySQL timeout:
set_time_limit(250);
I think you should consider another approach.
PHP run under Apache will time out; you can set the limit to a large value, but some hosts prohibit doing so.
You can try submitting the form by AJAX when the user requests the data, use some backend technology (Node.js) to connect to MySQL and query the data, and then send the results back to the front end.
