Stored Procedure causing problems with mysqli when used with fetch_all - php

I have broken this issue down to its essence but am still having problems.
When I try to use fetch_all to get the results of a stored procedure, I get the results returned to an array as expected, but subsequent mysqli calls throw "command out of sync" errors.
I even tried simplifying my stored procedure to...
CREATE PROCEDURE testp()
BEGIN
SELECT * FROM tbl;
END
I made the table small, 10 rows 3 cols.
I use the following PHP
$query = "CALL testp()";
$result = $db->query($query);
$rows = $result->fetch_all(MYSQLI_ASSOC);
$result->free();
print_r($rows);
$query = "SELECT * FROM tbl WHERE id = ?";
$stmt = $db->prepare($query);
echo $db->error;
$stmt->close();
This echoes "Commands out of sync; you can't run this command now"
and throws
Fatal error: Call to a member function close() on a non-object
$rows outputs as expected
Any ideas?

This is actually how mysqli works to be honest!
A CALL returns an extra result set (the call status) on top of your data, so you need to clear every pending result before running anything else on the connection. If you don't -> ERROR!
You can achieve that by using next_result(), example follows:
while($this->_mysqli->next_result());
...then carry on with your code successfully. The loop above belongs right after the fetch_all()/fetch_assoc() step, before you run the next query or return the array.
(I know this was asked 3 months ago, but rather late than never ;)
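Applied to the code in the question, that could look roughly like this (a sketch, assuming $db is the mysqli connection from the question):
$result = $db->query("CALL testp()");
$rows = $result->fetch_all(MYSQLI_ASSOC);
$result->free();
// A CALL returns an extra result set (the call status) besides the data,
// so drain every pending result before the next statement on this connection.
while ($db->more_results() && $db->next_result()) {
    if ($extra = $db->store_result()) {
        $extra->free();
    }
}
// The connection is back in sync, so prepare() works again.
$stmt = $db->prepare("SELECT * FROM tbl WHERE id = ?");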

Related

Why does mysqli_result::free_result not work on mysqli_stmt::get_result

I have the following code:
$post_title = "Some dynamic POST data";
$stmt = $conn->prepare("SELECT * FROM `posts` WHERE title=? ");
$stmt->bind_param("s", $post_title);
$stmt->execute();
$rows = $stmt->get_result()->num_rows;
$stmt->get_result()->free_result(); // throws error: commands out of sync
$stmt = $conn->prepare("... some new condition");
$stmt->bind_param(...);
$stmt->execute();
$rows = $stmt->get_result()->num_rows;
I know I can simply use $stmt->free_result(), but the PHP docs https://www.php.net/manual/en/mysqli-result.free.php mention that you can call free_result() on a mysqli_result object as well, so why can't we use it on mysqli_stmt->get_result(), which is a result object too?
mysqli_stmt::get_result() is not idempotent. You can only call it once. If you call it again you will get an error:
Commands out of sync; you can't run this command now
This error has nothing to do with free_result() which you probably should not be using in the way you showed anyway.
You need to store the result in a variable and only then you can perform all the operations you want.
$stmt = $mysqli->prepare("SELECT ? ");
$stmt->bind_param("s", $post_title);
$stmt->execute();
$result = $stmt->get_result();
$rows = $result->num_rows;
$result->free_result();
I would also recommend that you don't ever use free_result().
Explanation:
When mysqli makes a call to the MySQL server to execute a prepared statement, the server produces a result set. This is not the outcome of the EXECUTE call, but the actual output of the SQL. By default, mysqli prepared statements run in unbuffered mode, which means that upon execution PHP will not fetch the results from the server. You must use one of the fetch functions to retrieve them from the MySQL server. You can do it row by row using fetch(), you can tell PHP to buffer the result set in the statement's internal memory using store_result(), or you can ask PHP to buffer the result encapsulated in a mysqli_result object using get_result(). The connection line will be busy as long as there are pending rows on the MySQL server.
Once you fetch the results from MySQL, there is nothing else to read. If you try to read again, you will get the 'commands out of sync' error mentioned above. This is why you can't call get_result() multiple times and expect the same result. Once the data is fetched to PHP, it's gone from the line, and is now stored in PHP memory. The prepared statement can of course be executed again and a new result set will be produced.
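A rough sketch of those three options, assuming an open $mysqli connection and the posts table from the question:
$stmt = $mysqli->prepare("SELECT title FROM `posts` WHERE title = ?");
$stmt->bind_param("s", $post_title);
$stmt->execute();

// Option 1: unbuffered, row by row (the connection stays busy until the last fetch)
// $stmt->bind_result($title); while ($stmt->fetch()) { /* ... */ }

// Option 2: buffer the rows inside the statement object
// $stmt->store_result(); $rows = $stmt->num_rows;

// Option 3: buffer into a mysqli_result object; call get_result() only once
$result = $stmt->get_result();
$rows = $result->num_rows;
foreach ($result as $row) {
    // $row is an associative array, e.g. $row['title']
}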
See also mysqli - Executing statements.

PHP / mysqli: Prepared Statements with num_rows constantly returning nothing

In my test environment there is a database containing some person information (name, e-mail, address, etc.). Anyone can insert this information into the database via a form; behind the scenes it is inserted with a parameterized INSERT after submission.
What I would now like to do is detect when someone tries to insert the same values again and, if so, skip the insert and show an error message instead. (So every person name in the database is unique; there are no multiple rows linked to one name.)
I had a number of ideas on how to accomplish this. My first one was to use a query like REPLACE or INSERT IGNORE, but that method would not give me the feedback I need to display the error message.
My second attempt was to first do a SELECT query, checking if the row already exists, and if num_rows is greater than 0, exit with the error message (and otherwise do the INSERT part). For this to work I have to use parameterized queries for the SELECT too, as I'm putting user input into it. Since parameterized queries seem to need special functions for everything you could normally do with far fewer lines of code, I researched how to get num_rows from my parameterized statement object $statement. This is what I had in the end:
$connection = new mysqli('x', 'x', 'x', 'x');
if (mysqli_connect_error()) {
die("Connect Error");
}
$connection->set_charset("utf8");
$statement = $connection->stmt_init();
$statement = $connection->prepare('SELECT Name FROM test WHERE Name LIKE ?');
$flags = "s";
$statement->bind_param($flags, $_POST["person_name"]);
$statement->execute();
$statement->store_result();
$result = $statement->get_result(); //Produces error
if ($result->num_rows >= 1) {
$output = "Your already registered";
} else {
$output = "Registering you...";
}
exit($output);
After all, I can't work out why mysqli still won't give me num_rows from my statement. Any help is appreciated, thanks in advance!
Oh, and if you guys could explain to me what I have to do to get affected_rows, that would be awesome!
EDIT: I know I could do this by using unique constraints. I also found out that I can tell whether INSERT IGNORE skipped the INSERT or not. But that won't answer my complete question: why does the SELECT num_rows alternative not work?
ANOTHER EDIT: I changed the code snippet to what I now have. Although my MySQL version seems to be 5.6.33 (I echoed it via $connection->server_info), get_result() produces the following error message:
Fatal error: Call to undefined method mysqli_stmt::get_result() in X on line X (line of get_result)
The behaviour of mysqli_num_rows() depends on whether buffered or unbuffered result sets are being used. For unbuffered result sets, mysqli_num_rows() will not return the correct number of rows until all the rows in the result have been retrieved. Note that if the number of rows is greater than PHP_INT_MAX, the number will be returned as a string.
Also make sure that you call ->store_result() first. Moreover, num_rows doesn't work with LIMIT used jointly with SQL_CALC_FOUND_ROWS; if you want to obtain the total rows found, you must do it manually.
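For example, a sketch of the duplicate check that relies only on store_result() and the statement's num_rows property, so it does not need mysqlnd/get_result():
$statement = $connection->prepare('SELECT Name FROM test WHERE Name LIKE ?');
$statement->bind_param('s', $_POST['person_name']);
$statement->execute();
$statement->store_result();        // buffer the result set first
if ($statement->num_rows >= 1) {   // now num_rows is reliable
    $output = "You're already registered";
} else {
    $output = "Registering you...";
}
$statement->free_result();
exit($output);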
EDIT:
If none of the suggestions work for you, then I would propose rewriting your SQL query:
SELECT `Name`, (SELECT COUNT(*) FROM `Persons`) AS `num_rows` FROM `Persons` WHERE `Name` LIKE ?
This query will return the total number of rows in your Persons table, as well as the Name if it exists.

MySQL Update and Select in one statement

I am attempting to do an UPDATE and a SELECT in the same sql statement. For some reason, the below code is failing.
$sql = "UPDATE mytable SET last_activity=CURRENT_TIMESTAMP,
info1=:info1, info2=:info2 WHERE id = {$id};";
$sql .= "SELECT id, info1, info2 FROM myTable
WHERE info1 >=:valueA AND info2>:valueB;"
$stmt = $conn->prepare($sql);
$stmt->bindParam(":info1", $info1);
$stmt->bindParam(":info2", $info2);
$stmt->bindParam(":valueA", $valueA);
$stmt->bindParam(":valueB", $valueB);
$stmt->execute();
$result = $stmt->fetchAll(PDO::FETCH_ASSOC);
echo json_encode($result);
QUESTION: what might I be doing wrong? I have been spending hours on this issue knowing that it's probably a small error right under my nose.
Edited:
I obtained this error message when loading the page that contains the php code:
Uncaught exception 'PDOException' with message 'SQLSTATE[HY000]:
General error' in ajaxCall.php:89 Stack trace: #0 ajaxCall.php(89):
PDOStatement->fetchAll(2) #1 {main} thrown in ajaxCall.php on line 89
I am using ajax to call the php page that contains the above code, and when I load the php page from the browser, I get the above error message.
Line 89 is: $result = $stmt->fetchAll(PDO::FETCH_ASSOC);
Since you are running two queries, you need to call nextRowset to access the results from the second one.
So, do it like this:
// code
$stmt->execute();
$stmt->nextRowset();
// code
When you run two or more queries, you get a multi-rowset result. That means that you get something like this (representation only, not really this):
Array(
[0] => rowset1,
[1] => rowset2,
...
)
Since you want the second set (the result from the SELECT), you can consume the first one by calling nextRowset. That way, you'll be able to fetch the results from the 'important' set.
(Even though 'consume' might not be the right word for this, it fits for understanding purposes)
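A sketch of that flow with the statement from the question:
$stmt->execute();
$stmt->nextRowset();                          // skip the UPDATE's rowset
$result = $stmt->fetchAll(PDO::FETCH_ASSOC);  // rows from the SELECT
echo json_encode($result);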
Executing two queries with one call is only allowed when you are using mysqlnd. Even then, you must have PDO::ATTR_EMULATE_PREPARES set to 1 when using prepared statements. You can set this using:
$conn->setAttribute(PDO::ATTR_EMULATE_PREPARES, 1);
Alternatively, you can use $conn->exec($sql), which works regardless. However, it will not allow you to bind any data to the executed SQL.
All in all, don't execute multiple queries with one call.
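Following that advice, a minimal sketch of the same logic as two separate prepared statements (binding :id instead of interpolating $id, and reusing the connection, table, and placeholders from the question):
$update = $conn->prepare(
    "UPDATE mytable SET last_activity = CURRENT_TIMESTAMP,
     info1 = :info1, info2 = :info2 WHERE id = :id"
);
$update->execute(array(':info1' => $info1, ':info2' => $info2, ':id' => $id));

$select = $conn->prepare(
    "SELECT id, info1, info2 FROM mytable WHERE info1 >= :valueA AND info2 > :valueB"
);
$select->execute(array(':valueA' => $valueA, ':valueB' => $valueB));
echo json_encode($select->fetchAll(PDO::FETCH_ASSOC));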

multi query select using wrong array?

I have a multi-query SELECT which half works. The first query is straightforward.
$sql = "SELECT riskAudDate, riskClientId, RiskNewId FROM tblriskregister ORDER BY riskId DESC LIMIT 1;";
The second one doesn't seem to work even when I do it on its own:
$sql ="SELECT LAST(riskFacility) FROM tbleClients";
If I get rid of the LAST it returns the first entry in that field of the table. I want to use LAST to get the last entry in that field.
When I do the first query on its own I get the data returned and I can echo it to the screen. When I add the second (without the LAST) I get nothing. Here is what I am using:
$result = $conn->query($sql);
if ($result == TRUE){
$r = $result->fetch_array(MYSQLI_ASSOC);
echo $r['riskAudDate'];
echo $r['riskClientId'];
echo $r['RiskNewId'];
echo $r['riskFacility'];
echo "<pre>";
print_r($r);
echo "</pre>";
}
The last bit is just for me to see what's in the array, just for testing.
So I have worked out that it's the results array that is not right.
If I change the actual query to multi query I get this:
Call to a member function fetch_array() on boolean
So the array bit seems to be wrong for a multi query. The data returned is one row from each table. It works for the top query, but add in the second (which I'm not sure is correct anyway) and the whole thing crashes. So I guess it's a two-part question: what's wrong with my queries and what's wrong with my returned array?
There is no LAST() function in MySQL; it is only supported in MS Access, if I'm not much mistaken. In MySQL you can do what you do in the first query: ORDER BY a column and LIMIT the results to 1.
According to the error message, $conn->query($sql) returns a boolean value (probably false), therefore you cannot call $result->fetch_array(MYSQLI_ASSOC) on it. Since we have no idea what exactly you have in the $sql variable, all I can say is that you need to debug your code to determine why $conn->query($sql) returns a boolean value.
Although it is not that clear from mysqli_query()'s documentation, it only supports the execution of one query at a time. To execute multiple queries in one go, use mysqli_multi_query() (you can call this one in OO mode as well, see the documentation). However, for security reasons I would rather call mysqli_query() twice separately. It is more difficult to execute a successful SQL injection attack if you cannot execute multiple queries.
It seems to me you are trying to do two SQL queries at once.
That is not possible with query(). Do a separate
$result = $conn->query($sql);
if ($result == TRUE){
while( $r = $result->fetch_array(MYSQLI_ASSOC)) {
...
}
}
for each SQL-query.
Concerning:
$sql ="SELECT LAST(riskFacility) FROM tbleClients";
Since the LAST() function does not exist in MySQL, I would recommend doing a sort like this (because I don't know exactly what you mean by 'last'):
$sql ="SELECT riskFacility FROM tbleClients order by riskFacility desc limit 0,1";

PDO “Uncaught exception 'PDOException' .. Cannot execute queries while other unbuffered queries are active. Consider using PDOStatement::fetchAll().”

I know this question has been asked many times, but I've read the answers to many of the questions and still cannot understand why I am receiving this error:
Fatal error: Uncaught exception 'PDOException' with message
'SQLSTATE[HY000]: General error: 2014 Cannot execute queries while
other unbuffered queries are active. Consider using
PDOStatement::fetchAll(). Alternatively, if your code is only ever
going to run against mysql, you may enable query buffering by setting
the PDO::MYSQL_ATTR_USE_BUFFERED_QUERY attribute.'
The first thing that is odd, is that I do not get an error on my localhost (wampserver), but I do get it on my web server. The php version on my localhost is 5.3.10, and on my web server it is 5.3.13.
I have read that the source of this error is making a query while data is left in the buffer from a previous query. This is not the case for me -- I have echoed out all of the data and I know for a fact that every row returned by a query is being fetched.
With that said, I have found that changing one of my queries to fetchAll instead of fetch fixes the problem, but that simply makes no sense, because I know that all of the rows returned are being read. When I used fetchAll for the query (it is made in a loop), I printed out the array on each iteration, and only one item was in the array for each query in the loop.
One more piece of information. It's not the query that I changed to fetchAll (which makes the error go away) that throws the PDO error, there is another query later in my php file that throws the error. My file is basically like this:
... code ...
query 1
... code ...
loop
query 2
end loop
... code ...
query 3
If I comment out query 3, there is no error. If I comment out, or change to fetchAll, query 2, there is no error. Query 1 has no effect whatsoever.
I would also like to add that I have tried adding LIMIT 1 to all of the queries on the page (at the same time), and the error is still there. I think this proves there is no unread data in the buffer, right?
I'm really confused, so I would appreciate your advice. Before someone asks, I can't post the full code for this, but here is a simplified version of my code:
$stmt = $this->db->prepare('SELECT ... :par LIMIT 1');
makeQuery($stmt, array(':par' => $var));
$row = $stmt->fetch(PDO::FETCH_ASSOC);
$stmt = $this->db->prepare('SELECT ... :par LIMIT 1');
for loop
makeQuery($stmt, array(':par' => $var));
$row2 = $stmt->fetch(PDO::FETCH_ASSOC);
... [use row2] ...
end for loop
$stmt = $this->db->prepare('SELECT ... :par LIMIT 1');
makeQuery($stmt, array(':par' => $var));
$row3 = $stmt->fetch(PDO::FETCH_ASSOC);
Here is makeQuery().
/**************************************************************************************************************
* Function: makeQuery *
* Desc: Makes a PDO query. *
* Pre conditions: The statement/query and an array of named parameters (may be empty) must be passed. *
* Post conditions: The PDO query is executed. Exceptions are caught, displayed, and page execution stopped. *
**************************************************************************************************************/
function makeQuery($stmt, $array, $errMsg = '')
{
try
{
$stmt->execute($array);
}
catch (PDOException $e)
{
print $errMsg != ''?$errMsg:"Error!: " . $e->getMessage() . "<br/>";
die();
}
}
Thanks for your help!
EDIT: I also tried doing the following after query 2 (since that seems to be the source of the problem):
$row2 = $stmt->fetch(PDO::FETCH_ASSOC); var_dump($row2);
The output was:
bool(false)
Have I stumbled across a PDO bug?
You need to fetch until a row fetch attempt fails. I know you may only have one row in the result set and think one fetch is enough, but it's not (when you're using unbuffered queries). PDO doesn't know how many rows there are until it reaches the end, where it tries to fetch the next row and fails.
You probably have other statements where you didn't fully "fetch until a fetch failed". Yes, I see that you fetch until the fetch failed for one of the statements, but that doesn't mean you did it for all of them.
To clarify -
When you execute a query via execute(), you create a result set that must be fetched from the db into php. PDO can only handle 1 of these "result set in progress of being fetched" at a time (per connection). You need to completely fetch the result set, all the way to the end of it, before you can start fetching a different result set from a different call to execute().
When you "call fetch() until a fetch() fails", the fact that you reached the end of the results is internally noted by PDO when that final call to fetch() fails due to there being no more results. PDO is then satisfied that the results are fully fetched, and it can clean up whatever internal resources between php and the db that were established for that result set, allowing you to make/fetch other queries.
There are other ways to make PDO "call fetch() until a fetch() fails":
Just use fetchAll(), which simply fetches all rows, and so it will hit the end of the result set.
or just call closeCursor()
*If you look at the source for closeCursor(), the default implementation literally just fetches the rows and discards them until it reaches the end. It's written in C, obviously, but it more or less does this:
function closeCursor() {
while ($row = $this->fetch()) {}
$this->stmtFullyFetched = true;
}
Some db drivers may have a more efficient implementation that doesn't require them to fetch lots of rows that nobody cares about, but that's the default way PDO does it. Anyway...
Normally you don't have these problems when you use buffered queries. The reason is because with buffered queries, right after you execute them, PDO will automatically fully fetch the db results into php memory, so it does the "call fetch() until a fetch() fails" part for you, automatically. When you later call fetch() or fetchAll() yourself, it's fetching results from php memory, not from the db. So basically, the result set is immediately fully fetched when using buffered queries, so there's no opportunity to have more than 1 "result set in progress of being fetched" at the same time (because php is single threaded, so no chance of 2 queries running at the same time).
Given this:
$sql = "select * from test.a limit 1";
$stmt = $dbh->prepare($sql);
$stmt->execute(array());
Ways to fully fetch the result set (assuming you only want the first row):
$row = $stmt->fetch();
$stmt->closeCursor();
or
list($row) = $stmt->fetchAll(); //tricky
or
$row = $stmt->fetch();
while ($stmt->fetch()) {}
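Applied to the question's loop, a sketch of the fix; the query, table, and loop data below are hypothetical stand-ins, since the real code isn't shown:
$stmt2 = $this->db->prepare('SELECT name FROM users WHERE id = :par LIMIT 1'); // hypothetical query
foreach ($ids as $var) {                    // hypothetical loop data
    makeQuery($stmt2, array(':par' => $var));
    $row2 = $stmt2->fetch(PDO::FETCH_ASSOC);
    // ... use $row2 ...
    $stmt2->closeCursor();                  // finish this result set before the next execute()
}
// query 3 can now run without tripping the "unbuffered queries" error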
After struggling with this issue for days, I finally found that this worked for me:
$db = new PDO ($cnstring, $user, $pwd);
$db->setAttribute (PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, true);
This also happens if you are trying to fetch the result of a non-SELECT query (e.g. UPDATE/INSERT/ALTER/CREATE). Make sure to use fetch() or fetchAll() only for SELECT queries.
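For a non-SELECT statement you would typically check the affected-row count instead of fetching anything, for example (a sketch with a hypothetical table):
$stmt = $db->prepare('UPDATE posts SET title = :title WHERE id = :id');
$stmt->execute(array(':title' => $title, ':id' => $id));
echo $stmt->rowCount();   // number of affected rows; there is nothing to fetch here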