I've been trying to debug a randomly dropping mysql connection in my php project.
Things I've discovered so far:
Once the total input over a single MySQL connection (total chars used in queries on that connection) passes 66,946 chars, PHP just locks up indefinitely on the query that pushed it over.
If I reconnect to the MySQL server occasionally, rather than re-using the existing connection the whole time, the connection won't drop, as long as I don't go over 66,946 chars of input on any single connection.
It doesn't matter whether it's a single query or a bunch of little queries. As soon as the 66,947-char threshold is passed on a single MySQL connection, PHP hangs indefinitely.
It's not a memory issue. PHP is only using 10 MB of memory at most, and it has a memory limit of 512 MB.
The mysql connection is remote, if that matters.
This exact code works in my local dev environment (any query length, same remote MySQL host), but not on the production server once the queries add up to more than 66,946 chars in length.
The PHP version & config files for my dev environment and the live environment are identical, and both are running Ubuntu (well, locally it's technically WSL).
Switching between mysqli and PDO doesn't make a significant difference (just a different number of input chars before it hangs; fewer with PDO than with mysqli).
(Update) I also tried this script on another similar Ubuntu host with the same version of PHP, against the same MySQL host. It worked flawlessly there... so I'm really lost as to what the issue could be.
I've narrowed it down to this minimal reproduction case:
<?php
if ( ! empty($argv[1])) {
    $count = intval($argv[1]);
}
if (empty($count) || $count < 34) {
    $count = 34;
}
$longText = str_repeat("a", $count - 34);
$query = "select * from abcdef limit 1/* {$longText} */"; // where "abcdef" is my table name
$mysqliConnection = mysqli_connect("my_host", "my_username", "my_password", "my_database");
echo "query length: " . strlen($query);
$result = mysqli_query($mysqliConnection, $query);
echo "\n\nSuccess!\n\n";
Providing the argument 66946 prints "Success!" instantly, while 66947 makes the production server hang indefinitely (but works fine on my local box!).
Why is there some mysterious input limit on my php mysql connections? How can I stop it (whatever "it" is) from limiting my php mysql connection input length?
You're probably looking for max_allowed_packet. It's defined in my.cnf.
Use SHOW VARIABLES LIKE 'max_allowed_packet'; to check it.
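For reference, this is how you would check and raise the limit (the 64 MB value below is illustrative, not a recommendation; older servers often default to 16 MB or less):

```sql
-- Check the current limit (in bytes) on the server you are connected to
SHOW VARIABLES LIKE 'max_allowed_packet';

-- Raise it for the running server (needs SUPER privilege; lost on restart)
SET GLOBAL max_allowed_packet = 67108864;  -- 64 MB
```

To make the change permanent, set `max_allowed_packet = 64M` in the `[mysqld]` section of my.cnf and restart mysqld. Note that existing connections keep the old value; reconnect to pick up the new one.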
It turns out there was a misconfiguration in the routing table for the server. Fixing the routing table solved the issue.
It took many hours to reach this conclusion; everything pointed to a PHP-FPM or MySQL setting being misconfigured, and I didn't even consider that it could be a routing issue. Hopefully this helps someone in the future.
Related
I'm using PDO in PHP 7 on an Amazon Linux AMI to connect to a SQL Server DB running on Amazon RDS.
Thus far, I've been able to extract all of the data out of the DB just fine except for one column in one table. The column is of the type image and contains binary data.
When I attempt to use PDO in PHP to select from the column in question and then output the binary to the browser window with the necessary header sent (e.g., header('Content-Type: image/jpeg');), I can only see a part of the file (in the case of an image) or I get a corrupt-file error (in the case of a PDF).
In a few edge cases (I'm presuming when the file size is quite small), I can see the full image, but that's rare.
(Screenshot omitted: an example of this problem in Chrome.)
I've done some pretty extensive Googling, but there doesn't seem to be much info on this issue, and all the pages I've found are quite old and relate to either mssql (no longer available in PHP 7) or the sqlsrv drivers.
Here are some examples of (seemingly) related pages I've found:
https://www.reddit.com/r/PHP/comments/m6v1t/need_help_displaying_jpg_from_mssql_image_field/
https://css-tricks.com/forums/topic/blob-display-is-truncated/
PHP is truncating MSSQL Blob data (4096b), even after setting INI values. Am I missing one?
Does anyone have any ideas as to why this is happening and how I can fix it? I feel like PDO should be able to handle this, but if there is an issue with PDO and I need to use some other drivers to handle this, that's fine too.
Thank you.
I finally found the answer. The data coming back to PHP via PDO was in fact being truncated at 64k, thus causing issues.
Alex helped lead me on the right path by suggesting setting TEXTSIZE to -1. The rookie mistake that I made was that I did SET TEXTSIZE -1 from Microsoft SSMS, assuming that it would be set globally for all connections, which was not the case. It only set it for the SSMS connection, thus the problem.
However, when I finally set TEXTSIZE to -1 on the PDO connection itself and then ran the query through that same connection, I got all the data back:
doSqlServerQuery("SET TEXTSIZE -1;");
$results = doSqlServerQuery("SELECT binary-data-image-column AS data
                             FROM table-name
                             WHERE key = 95948578934578934;");

// Function definition
function doSqlServerQuery($query, $dbh = null) {
    if (!isset($dbh)) {
        global $dbh;
    }
    $stmt = $dbh->prepare($query);
    if ($stmt) {
        $stmt->execute();
        $results = [];
        while ($result = $stmt->fetch(PDO::FETCH_ASSOC)) {
            $results[] = $result;
        }
        return $results;
    } else {
        print_r($dbh->errorInfo()); // originally pO(), a custom debug helper
    }
}
Just fixed this myself: the data is being cut off at 64 KB. To fix it, either cap the data at 64 KB or change the column type to LONGBLOB, which gives you 4 gigabytes of storage instead of 64 kilobytes.
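If you go the column-type route described above, the change is a one-line ALTER. Note this answer is describing MySQL column types (table and column names below are placeholders; MySQL's BLOB holds up to 64 KB, MEDIUMBLOB 16 MB, LONGBLOB 4 GB):

```sql
-- Placeholder names; adjust to your schema. This rewrites the table,
-- so it can take a while on large tables.
ALTER TABLE my_table MODIFY my_column LONGBLOB;
```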
Ok, so we've got a new server with
Debian Wheezy 32BIT
PHP 5.5.18
FreeTDS 0.91
This PHP app needs to talk to an old SQL server 2000 server. We used the old code from our previous server (PHP 5.2 and older FreeTDS - can't get the version).
We connect to SQL server 2000 through PDO using dblib driver.
We're experiencing weird behaviour with the fetch function: if we issue a query during a fetch loop on the same PDO connection object, the main query gets reset, and the next fetch call returns false even if there are still records to be fetched.
// PSEUDO CODE
// Here the main query
$q = $sql7->query("SELECT TOP 5 * FROM News ORDER BY Data Desc");
while ($row = $q->fetch(PDO::FETCH_ASSOC)) {
    // Looping through the results
    echo "<h1>Main query</h1>";
    print_r($row);

    // Issue a query on the same PDO connection
    $subq = $sql7->query("SELECT TOP 1 * FROM News WHERE IDNews = " . $row['IDNews'] . " ");
    while ($subResult = $subq->fetch(PDO::FETCH_ASSOC)) {
        echo "<h1>Inner query</h1>";
        print_r($subResult);
    }

    // Here the main query's $q->fetch(PDO::FETCH_ASSOC) will return false
    // on the next iteration; if we remove the subquery, the main loop runs fine
    echo "<hr>";
}
The same code on Windows PHP with the pdo_sqlsrv driver works just fine.
The fetch style we pass as an argument to the fetch function makes no difference.
PHP doesn't throw any warning or error.
I really don't know what's going on here.
As per this reference (PHP bug site):
This is the behavior of MSSQL (TDS), DBLIB and FreeTDS: one statement per connection rule. If you initiate another statement, the previous statement is cancelled.
The previous versions buffered the entire result set in memory, leading to OOM errors on large result sets.
So it seems it was the previous versions of PHP (5.3 and earlier) that were not conforming to the TDS behaviour.
We need to refactor the code then.
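The refactor boils down to buffering the entire outer result set with fetchAll() before issuing any further statements on the same connection. A minimal sketch of the pattern, using an in-memory SQLite database for illustration (requires pdo_sqlite; the table and data are made up, but the buffering idea is the same over dblib):

```php
<?php
// Illustration only: SQLite stands in for the SQL Server/FreeTDS connection.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec("CREATE TABLE News (IDNews INTEGER PRIMARY KEY, Title TEXT)");
$pdo->exec("INSERT INTO News (IDNews, Title) VALUES (1, 'first'), (2, 'second')");

// Buffer the whole outer result set up front...
$rows = $pdo->query("SELECT * FROM News ORDER BY IDNews")
            ->fetchAll(PDO::FETCH_ASSOC);

// ...then the connection is free for the inner statements.
foreach ($rows as $row) {
    $stmt = $pdo->prepare("SELECT Title FROM News WHERE IDNews = ?");
    $stmt->execute([$row['IDNews']]);
    $inner = $stmt->fetch(PDO::FETCH_ASSOC);
    echo $inner['Title'], "\n";
}
```

The trade-off is memory: fetchAll() holds the whole outer result set at once, so keep the outer query narrow (TOP 5, selected columns) on large tables.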
My issue basically revolves around me needing/preferring to use PHP's sqlsrv driver to access a SQL Server 2000 database. I've already built the project using sqlsrv against a SQL Server 2005 instance, and switching to something like the ODBC PHP drivers would be a pretty big headache right now. So now I have the original SQL Server 2000 database and a 2005 instance installed on the same computer, and I've created a linked server between the two. Testing it out in Management Studio Express worked: a simple query against one of the tables ran fine.
Now I'm using the exact same query in PHP via sqlsrv_query and running into an error. My test code looks like this:
$connectionOptions = array("UID" => "user", "PWD" => "password");
$res = sqlsrv_connect("(local)\SQLExpress", $connectionOptions);
if (!$res) die("ERRORS : " . print_r(sqlsrv_errors()));
echo "SELECT * FROM [ServerName].DB.dbo.Table<br/>";
$res = sqlsrv_query($res, "SELECT * FROM [ServerName].DB.dbo.Table");
if ($res === false) {
    die("ERRORS : " . print_r(sqlsrv_errors()));
} else {
    var_dump($res);
    $ary = sqlsrv_fetch_array($res);
}
var_dump($ary);
echo "<hr>";
var_dump(sqlsrv_errors());
The problem with this code is that sqlsrv_query doesn't return false but returns resource(11) of type unknown. Running sqlsrv_fetch_array on that result then tells me an invalid parameter was passed to sqlsrv_fetch_array. I'm not sure what to do at this point. Is there just a problem with running a query against a linked server through sqlsrv?
There seems to be no error in your code. Please try calling sqlsrv_fetch_array in a loop and update us with the result:
while ($row = sqlsrv_fetch_array($result)) {
    echo $row['field_name'];
}
Also remove var_dump($ary);. Often, even when the real error is in how the data is displayed using $row, the error message will point at sqlsrv_fetch_array.
Well, I figured out the problem. In my haste to write some code to test whether the linked server worked, I used the same variable name for the result of sqlsrv_connect and sqlsrv_query. It turns out that was the whole problem. I renamed the variables so that the connection object is $Link and the query result stays $res, and now I get the access to the 2000 database through 2005 that I was trying for. In the future, I'll definitely name my variables a bit more carefully so I don't bang my head against the wall for hours.
I have a query here (PHP with Zend Framework on a MySQL database) that uses one POST parameter as an argument in an SQL statement.
I have a local XAMPP installation on my development machine, and the runtime of this small script is around 150 ms, no matter whether I pass the argument with or without applying mysql_real_escape_string() to it first.
I also have a virtual server with a BitNami WAMP stack installed. When I run the script there (100% same database content), it takes around 260 ms without mysql_real_escape_string(). That's not as "fast" as my local machine (I know 150 ms isn't really fast at all), but it would be okay. However, if I apply even one mysql_real_escape_string() call to the argument from the POST variable, the whole thing takes 1.2 seconds.
I further noticed that every call to mysql_real_escape_string() makes the script around 1 second slower on the virtual server. On my local machine it has no effect at all.
How can this be? Is this a MySQL setup thing, a php.ini thing, or what? Since I have 100% the same database and PHP source on both machines, I guess it can only be configuration?
Thanks for any help in advance!
EDIT:
So here is what I do. First, connecting to the DB (in Bootstrap.php):
$GLOBALS["db"] = new Zend_Db_Adapter_Pdo_Mysql(array(
    'host'     => "localhost",
    'username' => "user",
    'password' => "password",
    'dbname'   => "database"
));
And then later I want to query the database:
global $db;
$query = sprintf("SELECT * FROM table WHERE id = '%s'", mysql_real_escape_string($id) );
$db->query("SET NAMES 'utf8'");
$db->fetchAll($query);
I just ran another test: adding this single test line to my code makes the script ~600 ms slower on the virtual machine:
mysql_real_escape_string("TEST");
Use prepared statements to do this:
http://framework.zend.com/manual/en/zend.db.statement.html
The long-term solution must be switching to prepared statements, that's right. An equivalent to mysql_real_escape_string() for PDO connections seems to be PDO::quote:
http://php.net/manual/de/pdo.quote.php
Are there any disadvantages compared to a mysql_real_escape_string()-based solution?
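Either way, the prepared-statement route sidesteps escaping entirely, since the driver sends the bound value separately from the SQL text. A minimal sketch of the pattern (using an in-memory SQLite database for illustration; the table, column, and values are made up — with Zend_Db the equivalent is passing a bind array to $db->query()):

```php
<?php
// SQLite stands in for the MySQL connection; only the binding pattern matters here.
$db = new PDO('sqlite::memory:');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$db->exec("CREATE TABLE mytable (id TEXT, name TEXT)");
$db->exec("INSERT INTO mytable VALUES ('42', 'answer')");

// No escaping call needed: the value is bound, not spliced into the SQL string,
// so no extra connection is ever opened just to escape it.
$stmt = $db->prepare("SELECT * FROM mytable WHERE id = ?");
$stmt->execute(['42']);
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
```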
I had the same delay on the first call to mysql_real_escape_string(); the solution in my case was this information from the manual: http://php.net/manual/en/function.mysql-real-escape-string.php
"A MySQL connection is required before using mysql_real_escape_string() otherwise an error of level E_WARNING is generated, and FALSE is returned. If link_identifier isn't defined, the last MySQL connection is used."
The only thing missing was information about the delay when no connection is established.
In your case, I suppose that if you are using PDO for the connection, you should use PDO::quote. If you are using mysql_pconnect/mysql_connect, then you can use mysql_real_escape_string() without the delay.
I'm having trouble with this PHP script where I get the error
Fatal error: Maximum execution time of 30 seconds exceeded in /var/www/vhosts/richmondcondo411.com/httpdocs/places.php on line 77
The code hangs here:
function getLocationsFromTable($table) {
    $query = "SELECT * FROM `$table`";
    if ( ! $queryResult = mysql_query($query)) return null;
    return mysql_fetch_array($queryResult, MYSQL_ASSOC);
}
and here (so far):
function hasCoordinates($houseNumber, $streetName) {
    $query = "SELECT lat,lng FROM geocache WHERE House = '$houseNumber' AND StreetName = '$streetName'";
    $row = mysql_fetch_array(mysql_query($query), MYSQL_ASSOC);
    return ($row) ? true : false;
}
both on the line with the mysql_query() call.
I know I use different styles for each code snippet, it's because I've been playing with the first one trying to isolate the issue.
The $table in the first example is 'school' which is a table which definitely exists.
I just don't know why it sits there and waits to time out instead of throwing an error back at me.
The MySQL queries from the rest of the site are working properly. I tried making sure I was connected, like this:
//connection was opened like this:
//$GLOBALS['dbh']=mysql_connect ($db_host, $db_user, $db_pswd) or die ('I cannot connect to the database because: ' . mysql_error());
if( ! $GLOBALS['dbh']) return null;
and it made it past that fine. Any ideas?
Update
It's not the size of the tables. I tried getting only five records and it still timed out. Also, with this line:
$query = "SELECT lat,lng FROM geocache WHERE House = '$houseNumber' AND StreetName = '$streetName'";
it is only looking for one specific record and this is where it's hanging now.
It sounds like MySQL is busy transmitting valid data back to PHP, but there's so much of it that there isn't time to finish the process before Apache shuts down the PHP process for exceeding its maximum execution time.
Is it really necessary to select everything from that table? How much data is it? Are there BLOB or TEXT columns that would account for particular lag?
Analyzing what's being selected and what you really need would be a good place to start.
Time spent waiting for mysql queries to return data does not count towards the execution time. See here.
The problem is most likely somewhere else in the code - the functions that you are blaming are possibly called in an infinite loop. Try commenting out the mysql code to see if I'm right.
Does your code timeout trying to connect or does it connect and hang on the query?
If your code actually gets past the mysql_query call (even if it has to wait a long time to timeout) then you can use the mysql_error function to determine what happened:
mysql_query("SELECT * FROM table");
echo mysql_errno($GLOBALS['dbh']) . ": " . mysql_error($GLOBALS['dbh']) . "\n";
Then, you can use the error number to determine the detailed reason for the error: MySQL error codes
If your code is hanging on the query, you might try describing and running the query in a mysql command line client to see if it's a data size issue. You can also increase the maximum execution time to allow the query to complete and see what's happening:
ini_set('max_execution_time', 300); // Allow 5 minutes for execution
I don't know the size of your table, but try adding LIMIT 10 and see if it still hangs.
It might be that your table is just too big to fetch in one query.
Unless the parameters $houseNumber and $streetName for hasCoordinates() are already sanitized for the MySQL query (very unlikely), you need to run them through mysql_real_escape_string() to prevent (intentional or unintentional) SQL injection. For mysql_real_escape_string() to work properly (e.g. if you have changed the charset via mysql_set_charset), you should also pass the MySQL connection resource to the function.
Is error reporting set to E_ALL, and are you looking at the error log of the webserver (or have display_errors=On set)?
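For completeness, turning full error reporting on at the top of the script looks like this (for debugging only; don't leave display_errors on in production):

```php
<?php
// Report every kind of error and print them to the output while debugging
error_reporting(E_ALL);
ini_set('display_errors', '1');
```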
Try this
function hasCoordinates($houseNumber, $streetName) {
    $houseNumber = mysql_real_escape_string($houseNumber);
    $streetName  = mysql_real_escape_string($streetName);
    $query = "
        EXPLAIN SELECT
            lat, lng
        FROM
            geocache
        WHERE
            House = '$houseNumber'
            AND StreetName = '$streetName'
    ";
    $result = mysql_query($query) or die(mysql_error());
    while ( false !== ($row = mysql_fetch_array($result, MYSQL_ASSOC)) ) {
        echo htmlspecialchars(join(' | ', $row)), "<br />\n";
    }
    die;
}
and refer to http://dev.mysql.com/doc/refman/5.0/en/using-explain.html to interpret the output.
- If you upped the execution time to 300 and it still blew through those 300 seconds, then by definition you've got something like an infinite loop going.
- My first suspect would be your PHP code, since MySQL is used to dealing with large data sets, so definitely make sure you're actually reaching the MySQL query in question (die right before it with an error message or something).
- If that works, then try running the query with known data directly against your database via a database GUI or command-line access if you have it, or by replacing the variables in the code with known-good values if you don't.
- If the query works on its own, then I would check for accidental SQL injection coming in via the $houseNumber or $streetName variables, as VolkerK mentioned.