I am trying to extract two fields from the tables student and student_accesscode, then update the table borrowers with the data extracted from those tables.
$q=$db->query("SELECT regnum, accesscode FROM student,student_accesscode WHERE student.id=student_accesscode.studentid");
while($qd=$q->fetch(PDO::FETCH_ASSOC))
{
$access=$qd['accesscode'];
$regnum=$qd['regnum'];
$q2=$db->exec("UPDATE borrowers SET cardnumber='$access' WHERE cardnumber='$regnum'");
if($q2)
{
echo $access.' '. $regnum.'<br/>';
}
else
{
echo'erro....<br/>';
}
}
?>
Adding $db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_WARNING); before $db->query() will probably surface the underlying problem as a warning. With that error mode set, you will get this:
PDO::prepare(): SQLSTATE[HY000]: General error: 2014 Cannot execute queries while other unbuffered queries are active. Consider using PDOStatement::fetchAll(). Alternatively, if your code is only ever going to run against mysql, you may enable query buffering by setting the PDO::MYSQL_ATTR_USE_BUFFERED_QUERY attribute.
So, instead of fetch(), use fetchAll() with a foreach loop; it will save you a lot of grief (see the sketch after this answer).
Ref. from php.net
For a statement that you need to issue multiple times, prepare a PDOStatement
object with PDO::prepare() and issue the statement with PDOStatement::execute().
From the PHP exec page at http://www.php.net/manual/en/pdo.exec.php
I don't know if this is the problem, I always use prepare and execute. It could just be for performance reasons. Something to try anyway.
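Putting those suggestions together, a rough sketch of the loop from the question might look like this (it assumes the same $db PDO connection and the table/column names from the question; treat it as an illustration rather than a drop-in fix):
// Sketch only: assumes the same $db PDO connection and the table/column
// names used in the question (student, student_accesscode, borrowers).
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// fetchAll() drains the result set, so no unbuffered query is left active.
$rows = $db->query(
    "SELECT regnum, accesscode
       FROM student
       JOIN student_accesscode ON student.id = student_accesscode.studentid"
)->fetchAll(PDO::FETCH_ASSOC);

// One prepared UPDATE, executed once per row, instead of string interpolation.
$update = $db->prepare(
    "UPDATE borrowers SET cardnumber = :access WHERE cardnumber = :regnum"
);

foreach ($rows as $row) {
    $update->execute(array(
        ':access' => $row['accesscode'],
        ':regnum' => $row['regnum'],
    ));
    echo $row['accesscode'] . ' ' . $row['regnum'] . '<br/>';
}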
Related
We are using PHP >= 7.2 with PDO and the MySQL driver to execute multiple SQL statements in one exec() call. We use this approach for DB migration scripts, which are executed by PHP. These plain SQL scripts contain multiple SQL statements, which our PHP framework executes like this:
$conn = new PDO('mysql:......', 'user', 'pw');
$ret = $conn->exec('
    DROP VIEW IF EXISTS v_my_view;
    CREATE VIEW v_my_view AS SELECT id, name FROM mytable;
');
// $ret now contains 1
if ($ret === FALSE) { /* some error handling */ }
This works fine, even for many statements within one exec() call: The database executes all statements.
Now if the SQL itself contains an error (e.g. a syntax error, or an unknown column), the call to exec() returns exactly the same as before (1), and $conn->errorInfo() also reports no error:
$conn = new PDO('mysql:......', 'user', 'pw');
$ret = $conn->exec('
    DROP VIEW IF EXISTS v_my_view;
    -- The following statement creates an SQL error (unknown column):
    CREATE VIEW v_my_view AS SELECT id, name_not_known FROM mytable;
');
// $ret still contains 1
var_dump($conn->errorInfo()); // Returns Error Code 0, all fine.
So it seems that PDO::exec() does support multiple statements, but does not implement any error handling / abort mechanism.
This also seems to be a MySQL-only problem: The same mechanism works fine on a PostgreSQL database.
Is there any way we can force PDO to "stop on errors" on multiple statement queries?
Yes, there is. It is explained in my PDO tutorial, under the running multiple queries section.
Basically you need to temporarily switch the emulation mode on with
$conn->setAttribute(PDO::ATTR_EMULATE_PREPARES, true);
and then run your SQL dump using the regular query() method. You will then have to loop over the results of every query to catch any error. As your queries don't seem to return any data, the loop can be as simple as
do {} while ($stmt->nextRowset());
Note that you must enable exception mode for PDO in order to get an error exception for the erroneous query. In case you will need some data from the queries, such as insert id, you may add necessary commands inside the curly braces.
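For illustration, a minimal sketch of that approach might look like this (it assumes $conn is an existing PDO connection already set to PDO::ERRMODE_EXCEPTION, and that $sqlDump holds the multi-statement script; the attribute toggling is an assumption about your setup):
// Sketch only: $conn is assumed to be an existing PDO connection with
// PDO::ATTR_ERRMODE set to PDO::ERRMODE_EXCEPTION, and $sqlDump holds the
// multi-statement SQL script.
$conn->setAttribute(PDO::ATTR_EMULATE_PREPARES, true);

$stmt = $conn->query($sqlDump);

// Walk over the result of every statement; an erroneous statement now
// surfaces as a PDOException instead of being silently ignored.
do {
    // fetch data here if a statement returns rows you care about
} while ($stmt->nextRowset());

$conn->setAttribute(PDO::ATTR_EMULATE_PREPARES, false); // restore the previous mode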
I know there are lots of answers, including accepted ones, related to this question, but none of them solves my problem. I am still getting this error.
Procedures:
CREATE PROCEDURE getAllProducts()
BEGIN
SELECT * FROM products;
END //
CREATE PROCEDURE getAllCategories()
BEGIN
SELECT * FROM category;
END //
Connection & calling:
$link = mysql_connect($host,$username,$password) or die(mysql_error());
mysql_select_db($database, $link) or die(mysql_error());
$allProducts = mysql_query("CALL getAllProducts");
while($row = mysql_fetch_array($allProducts)) { }
$allCategory = mysql_query("CALL getAllCategories");
while($row = mysql_fetch_array($allCategory)) { }
I've even called mysql_free_result($allProducts) before executing the next query. But nothing happens.
mysql_get_client_info() return mysqlnd 5.0.5-dev - 081106 - $Revision: 1.3.2.27 $
I found that the problem only arises if I run two queries.
As the MySQL documentation for 'Commands out of sync' points out:
[...] It can also happen if you try to execute two queries that return data
without calling mysql_use_result() or mysql_store_result() in between.
The Documentation for mysql_use_result() says e.g.:
After invoking mysql_query() or mysql_real_query(), you must call
mysql_store_result() or mysql_use_result() for every statement that
successfully produces a result set (SELECT, SHOW, DESCRIBE, EXPLAIN,
CHECK TABLE, and so forth). You must also call mysql_free_result()
after you are done with the result set.
Basically you need to tell your client what it should do with the result.
Well, usually this error occurs because there are still results pending from the query. mysqli has the mysqli_store_result and mysqli_free_result functions for this. Since you are using the mysql extension, which does not have equivalent functions for handling multiple result sets, you can try closing the connection after executing the first procedure and establishing it again to execute the next one. This is not the perfect solution, but it will work in your case.
mysql_close($connection);
$connection = mysql_connect("localhost","username","password");
You can also try
mysql_free_result($allProducts);
Stored procedures always return an extra result set with error/warning status information. So your stored procedures return multiple result sets: the actual result set from your SELECT query, plus the extra status result set. Calling mysql_fetch_array in a loop only exhausts one of them, leaving the other still pending, which causes the error you see.
I don't know how to fix it with the vanilla mysql_ library. With mysqli_ you can issue mysqli_multi_query and then use only the first result set; see the example in the docs. A rough sketch of that idea follows.
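If switching to mysqli is an option, a sketch of that mysqli_multi_query approach (object-oriented style; the credentials and procedure names are assumed from the question) could look like this:
// Sketch only: both CALLs from the question sent via multi_query();
// credentials and procedure names are assumed from the question.
$mysqli = new mysqli($host, $username, $password, $database);

if ($mysqli->multi_query("CALL getAllProducts(); CALL getAllCategories()")) {
    do {
        // store_result() returns false for the extra status result sets
        // that each CALL produces, so they are skipped automatically.
        if ($result = $mysqli->store_result()) {
            while ($row = $result->fetch_assoc()) {
                // ... use $row ...
            }
            $result->free();
        }
    } while ($mysqli->more_results() && $mysqli->next_result());
}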
It's not a fix per se, but if you insist on staying with mysql_* functions, and assuming that you actually want to work with more complicated stored procedures (i.e. that the ones you supplied are just a simplified example) - you can change the stored procedures code to write the resultset into a temporary table (e.g. tmp_getAllProducts) instead of returning it, and then SELECT from it in your PHP.
This is what worked for me a while back when I was stuck with mysql_* and couldn't upgrade...
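As a rough sketch of that temporary-table workaround (it assumes getAllProducts has been rewritten to fill a temporary table named tmp_getAllProducts, as suggested above):
// Sketch only: assumes getAllProducts() has been rewritten to fill a
// session-scoped temporary table tmp_getAllProducts instead of SELECTing
// its rows back to the client.
mysql_query("CALL getAllProducts()") or die(mysql_error());

$result = mysql_query("SELECT * FROM tmp_getAllProducts") or die(mysql_error());
while ($row = mysql_fetch_array($result)) {
    // ... use $row ...
}
mysql_free_result($result);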
This is a known limitation of the mysql extension. You must either not use more than one stored procedure per connection or upgrade to mysqli.
https://bugs.php.net/bug.php?id=39727
Maybe someone else has the same problem as me.
I ran into the error:
Cannot execute queries while other unbuffered queries are active.
Consider using PDOStatement::fetchAll(). Alternatively, if your code
is only ever going to run against mysql, you may enable query
buffering by setting the PDO::MYSQL_ATTR_USE_BUFFERED_QUERY attribute.
on PDO. As mentioned in many threads, the error can be caused by at least one of the following problems:
The query cursor was not closed with closeCursor(), as mentioned here: Causes of MySQL error 2014 Cannot execute queries while other unbuffered queries are active
More than one query is issued in a single statement, as mentioned here: PDO Cannot execute queries while other unbuffered queries are active
A bug in the mysql driver, as mentioned here: What is causing PDO error Cannot execute queries while other unbuffered queries are active?
In my case none of the above helped, and it took some time until I solved the problem. This was my code (pseudo-code):
$stmt->startTransaction();
$stmt = db::getInstance()->prepare("CALL phones(:phone)");
$stmt->prepare('SELECT * FROM database');
$stmt->execute();
$aData = $stmt->fetchAll();
$stmt->closeCursor();
$stmt->query("USE sometable;");
After I changed it to:
$stmt->startTransaction();
$stmt = db::getInstance()->prepare("CALL phones(:phone)");
$stmt->prepare('SELECT * FROM database');
$stmt->execute();
$aData = $stmt->fetchAll();
$stmt->closeCursor();
$stmt->exec("USE sometable;");
It worked for me. What is the difference between query and exec?
PDO::exec() - "Execute an SQL statement and return the number of affected rows"
PDO::query() - "Executes an SQL statement, returning a result set as a PDOStatement object"
Why does PDO::query() not work in this case? The cursor IS closed when it is called.
While it could conceivably be true that you've encountered the mysql driver bug here, we can't be sure of that because you've not given us that information (what version of PHP are you using? Does it use mysqlnd => check with php -i | grep mysqlnd? What does the rest of your code look like?).
There are many other possible explanations for your problem. I suspect the issue is actually that you're failing to close all the cursors and/or fetch all the results, because $stmt is being reused heavily:
Quoted directly from the PDO::query manual page:
If you do not fetch all of the data in a result set before issuing your next call to PDO::query(), your call may fail. Call PDOStatement::closeCursor() to release the database resources associated with the PDOStatement object before issuing your next call to PDO::query().
You call closeCursor on $stmt, that's true, but you've not closed all cursors that have been created by you:
//<-- what is $stmt here?
$stmt->startTransaction();
//no matter, you've reassigned it a PDOStatement instance
$stmt = db::getInstance()->prepare("CALL phones(:phone)");
//Huh? You're preparing yet another query on an instance of PDOStatement?
$stmt->prepare('SELECT * FROM database');
//you're executing this one, though
$stmt->execute();
//and fetching all data
$aData = $stmt->fetchAll();
//and closing this last statement
$stmt->closeCursor();
But what about the first statement you assigned to $stmt (the stored procedure call)? That cursor isn't closed anywhere.
Now for the major difference between PDO::query and PDO::exec. Again, quoting the manual:
PDO::exec() does not return results from a SELECT statement.
Whereas:
PDO::query() executes an SQL statement in a single function call, returning the result set (if any) returned by the statement as a PDOStatement object.
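To make that concrete, here is a hedged sketch of how the pseudo-code could be restructured so that every cursor is handled; db::getInstance() and the "CALL phones(:phone)" / SELECT / USE statements come from that pseudo-code, while $phone and the transaction handling are assumptions for illustration:
// Sketch only: a restructured version of the pseudo-code above.
$db = db::getInstance();
$db->beginTransaction();                     // transactions belong to the connection, not the statement

$phone = '555-0100';                         // hypothetical value for the :phone parameter
$call = $db->prepare("CALL phones(:phone)");
$call->execute(array(':phone' => $phone));
$phones = $call->fetchAll();
while ($call->nextRowset()) { /* drain the extra status result set of the CALL */ }
$call->closeCursor();                        // first cursor closed before the next statement

$select = $db->query('SELECT * FROM database');   // a second, separate statement
$aData  = $select->fetchAll();
$select->closeCursor();

$db->exec("USE sometable;");                 // exec() for statements that return no result set
$db->commit();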
I came across this problem too. It is likely to be a bug. If we take the following code, then you will see how it fails with the message 'General error: 2014 Cannot execute queries while other unbuffered queries are active. Consider using PDOStatement::fetchAll().'
$pdo = new \PDO("mysql:host=localhost", "root", "");
$pdo->setAttribute(\PDO::ATTR_ERRMODE, \PDO::ERRMODE_EXCEPTION);
$pdo->setAttribute(\PDO::ATTR_EMULATE_PREPARES, false);
$pdo->query("USE test");
If you change $pdo->query("USE test"); to $pdo->exec("USE test"); it will work. If you change $pdo->setAttribute(\PDO::ATTR_EMULATE_PREPARES, false); to $pdo->setAttribute(\PDO::ATTR_EMULATE_PREPARES, true); it will also work. I haven't been able to find a proper solution yet though.
I solved the issue with these steps:
After performing:
$stmt = db::getInstance()->prepare("CALL phones(:phone)");
I close the transaction opened by:
$stmt->startTransaction();
And after that, I open the transaction again to run the query below:
$stmt->prepare('SELECT * FROM database');
My solution is: one statement to call the procedure "CALL phones(:phone)" and another to execute the query with "SELECT * FROM database".
That is it.
Be careful: this can also happen if you are trying to fetch from a non-SELECT query (e.g. UPDATE/INSERT/ALTER/CREATE).
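For example, a small sketch (the table and bindings are made up): for an UPDATE, check rowCount() instead of fetching:
// Sketch only: hypothetical table and values; non-SELECT statements have
// no rows to fetch, so read rowCount() instead of calling fetch()/fetchAll().
$stmt = $db->prepare("UPDATE accounts SET status = :status WHERE id = :id");
$stmt->execute(array(':status' => 'active', ':id' => 42));
echo $stmt->rowCount() . " row(s) updated";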
I know this question has been asked many times, but I've read the answers to many of the questions and still cannot understand why I am receiving this error:
Fatal error: Uncaught exception 'PDOException' with message
'SQLSTATE[HY000]: General error: 2014 Cannot execute queries while
other unbuffered queries are active. Consider using
PDOStatement::fetchAll(). Alternatively, if your code is only ever
going to run against mysql, you may enable query buffering by setting
the PDO::MYSQL_ATTR_USE_BUFFERED_QUERY attribute.'
The first thing that is odd, is that I do not get an error on my localhost (wampserver), but I do get it on my web server. The php version on my localhost is 5.3.10, and on my web server it is 5.3.13.
I have read that the source of this error is making a query while data is left in the buffer from a previous query. This is not the case for me: I have echoed out all of the data, and I know for a fact that every row returned by a query is being fetched.
With that said, I have found that changing one of my queries to use fetchAll instead of fetch fixes the problem, but it simply makes no sense, because I know that all of the rows returned are being read. When I used fetchAll for the query (it is made in a loop), I printed out the array on each iteration, and only one item was in the array for each query in the loop.
One more piece of information: it's not the query that I changed to fetchAll (which makes the error go away) that throws the PDO error; another query later in my PHP file throws it. My file is basically like this:
... code ...
query 1
... code ...
loop
query 2
end loop
... code ...
query 3
If I comment out query 3, there is no error. If I comment out query 2, or change it to fetchAll, there is no error. Query 1 has no effect whatsoever.
I would also like to add that I have tried adding LIMIT 1 to all of the queries on the page (at the same time), and the error is still there. I think this proves there is no unread data in the buffer, right?
I'm really confused, so I would appreciate your advice. Before someone asks, I can't post the full code for this, but here is a simplified version of my code:
$stmt = $this->db->prepare('SELECT ... :par LIMIT 1');
makeQuery($stmt, array(':par' => $var));
$row = $stmt->fetch(PDO::FETCH_ASSOC);
$stmt = $this->db->prepare('SELECT ... :par LIMIT 1');
for loop
makeQuery($stmt, array(':par' => $var));
$row2 = $stmt->fetch(PDO::FETCH_ASSOC);
... [use row2] ...
end for loop
$stmt = $this->db->prepare('SELECT ... :par LIMIT 1');
makeQuery($stmt, array(':par' => $var));
$row3 = $stmt->fetch(PDO::FETCH_ASSOC);
Here is makeQuery().
/**************************************************************************************************************
* Function: makeQuery *
* Desc: Makes a PDO query. *
* Pre conditions: The statement/query and an array of named parameters (may be empty) must be passed. *
* Post conditions: The PDO query is executed. Exceptions are caught, displayed, and page execution stopped. *
**************************************************************************************************************/
function makeQuery($stmt, $array, $errMsg = '')
{
try
{
$stmt->execute($array);
}
catch (PDOException $e)
{
print $errMsg != ''?$errMsg:"Error!: " . $e->getMessage() . "<br/>";
die();
}
}
Thanks for your help!
EDIT: I also tried doing the following after query 2 (since that seems to be the source of the problem):
$row2 = $stmt->fetch(PDO::FETCH_ASSOC); var_dump($row2);
The output was:
bool(false)
Have I stumbled across a PDO bug?
You need to fetch until a row fetch attempt fails. I know you may only have one row in the result set and think one fetch is enough, but it's not (when you're using unbuffered queries). PDO doesn't know how many rows there are until it reaches the end, where it tries to fetch the next row and fails.
You probably have other statements where you didn't fully "fetch until a fetch failed". Yes, I see that you fetch until the fetch failed for one of the statements, but that doesn't mean you did it for all of them.
To clarify -
When you execute a query via execute(), you create a result set that must be fetched from the db into php. PDO can only handle 1 of these "result set in progress of being fetched" at a time (per connection). You need to completely fetch the result set, all the way to the end of it, before you can start fetching a different result set from a different call to execute().
When you "call fetch() until a fetch() fails", the fact that you reached the end of the results is internally noted by PDO when that final call to fetch() fails due to there being no more results. PDO is then satisfied that the results are fully fetched, and it can clean up whatever internal resources between php and the db that were established for that result set, allowing you to make/fetch other queries.
There's other ways to make PDO "call fetch() until a fetch() fails".
Just use fetchAll(), which simply fetches all rows, and so it will hit the end of the result set.
or just call closeCursor()
If you look at the source for closeCursor(), the default implementation literally just fetches the rows and discards them until it reaches the end. It's written in C, obviously, but it more or less does this:
function closeCursor() {
    while ($row = $this->fetch()) {}
    $this->stmtFullyFetched = true;
}
Some db drivers may have a more efficient implementation that doesn't require them to fetch lots of rows that nobody cares about, but that's the default way PDO does it. Anyway...
Normally you don't have these problems when you use buffered queries. The reason is because with buffered queries, right after you execute them, PDO will automatically fully fetch the db results into php memory, so it does the "call fetch() until a fetch() fails" part for you, automatically. When you later call fetch() or fetchAll() yourself, it's fetching results from php memory, not from the db. So basically, the result set is immediately fully fetched when using buffered queries, so there's no opportunity to have more than 1 "result set in progress of being fetched" at the same time (because php is single threaded, so no chance of 2 queries running at the same time).
Given this:
$sql = "select * from test.a limit 1";
$stmt = $dbh->prepare($sql);
$stmt->execute(array());
Ways to fully fetch the result set (assuming you only want the first row):
$row = $stmt->fetch();
$stmt->closeCursor();
or
list($row) = $stmt->fetchAll(); //tricky
or
$row = $stmt->fetch();
while ($stmt->fetch()) {}
After struggling with this issue for days, I finally found that this worked for me:
$db = new PDO ($cnstring, $user, $pwd);
$db->setAttribute (PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, true);
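For what it's worth, the same attribute can also be passed via the driver-options array of the PDO constructor; here is a sketch using the same $cnstring/$user/$pwd variables as above:
// Sketch only: same effect as the setAttribute() call above, but applied at
// connection time via the driver-options array of the constructor.
$db = new PDO($cnstring, $user, $pwd, array(
    PDO::MYSQL_ATTR_USE_BUFFERED_QUERY => true,          // buffer result sets on the client
    PDO::ATTR_ERRMODE                  => PDO::ERRMODE_EXCEPTION,
));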
This can also happen if you are trying to fetch from a non-SELECT query (e.g. UPDATE/INSERT/ALTER/CREATE). Make sure to use fetch or fetchAll only for SELECT queries.
So far I have been using PDO->bindParam; however, while reading the manual I found PDO->bindValue. From what I can tell, PDO->bindValue passes by value, whereas PDO->bindParam passes by reference. Is this the only difference?
$modThread = db()->prepare("UPDATE `threads` SET `modtime` = UNIX_TIMESTAMP( ) WHERE `threadid` =:id LIMIT 1");
while(something)
{
$modThread->bindParam(':id', $thread);
$modThread->execute();
//*******************HERE********************//
}
Again, while reading the manual I found PDO->closeCursor. Should I place it where marked? Is it optional or called automatically? It seems only certain drivers need it. Will calling it on a driver that doesn't need/support it cause errors? And what about MySQL?
This isn't true. If you find yourself needing to use closeCursor, one of the best times is after insert/update/delete commands, and rarely after SELECT statements for which you have already fetched the results.
For example, if you select all records from a table, then issue $stmt->fetch(), this actually accomplishes the goal for closeCursor immediately as the rows are now no longer in an unfetched status.
From the manual:
This method is useful for database drivers that do not support executing a PDOStatement object when a previously executed PDOStatement object still has unfetched rows. If your database driver suffers from this limitation, the problem may manifest itself in an out-of-sequence error.
When you will really need closeCursor is during any of the following instances:
If your DB driver doesn't allow for a new stmt to be executed while unfetched rows are available from the previous execute
You have multiple prepared statements and would like to execute them one after another ($stmt1->execute(); $stmt1->closeCursor(); $stmt2->execute(); $stmt2->closeCursor(); $stmt3... etc.); see the sketch after this answer
You have multiple statements that must execute insert/update/delete inside the same block. This is true because, while you don't get MySQL row results back, you DO get a number-of-affected-rows result set back (which is still a result).
When using transactions
When you want to issue select-style prepared statements and execute them, but not retrieve the data until later
When you don't need the closeCursor statement:
If you have already fetched the rows (as with $stmt->fetch()) before your next statement is to be executed. At this point the rows are in a "fetched" state, which frees up the driver to execute new statements.
Just as useful for closing a cursor are unset() (i.e. unset($stmt)) and setting the statement to null ($stmt = null), which open the door for the built-in garbage collector to clean everything up.
See the manual for more information: http://php.net/manual/en/pdostatement.closecursor.php
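For instance, here is a small sketch of the "execute them one after another" case from the list above (the tables and columns are made up):
// Sketch only: hypothetical tables; shows closeCursor() between executes on
// drivers that cannot start a new statement while rows are still unfetched.
$users  = $db->prepare("SELECT id, name FROM users WHERE active = 1");
$orders = $db->prepare("SELECT id, total FROM orders WHERE user_id = :uid");

$users->execute();
$allUsers = $users->fetchAll(PDO::FETCH_ASSOC);
$users->closeCursor();              // release the first result set before the next execute()

foreach ($allUsers as $user) {
    $orders->execute(array(':uid' => $user['id']));
    $userOrders = $orders->fetchAll(PDO::FETCH_ASSOC);
    $orders->closeCursor();         // free the cursor again before re-executing
    // ... use $userOrders ...
}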
The 'recurring' bindParam() here is not really necessary:
$thread = 0;
$modThread->bindParam(':id', $thread);
while($thread < 20)
{
$thread++;
$modThread->execute(); //executing with the new value, which you couldn't do with bindValue
}
You don't need closeCursor() when there is no result set (i.e. it only matters for SELECTs or procedures that return results), but usually I've already done a fetchAll somewhere on the previous statement anyway.
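To make the by-value / by-reference difference from the question concrete, here is a small sketch using the same $modThread statement (the thread ids are made up):
// Sketch only: uses the $modThread statement from the question; thread ids are made up.
// bindValue(): the value is copied at bind time.
$thread = 1;
$modThread->bindValue(':id', $thread);
$thread = 2;
$modThread->execute();   // updates thread 1 (the value bound earlier)

// bindParam(): the variable is bound by reference and read at execute time.
$thread = 1;
$modThread->bindParam(':id', $thread);
$thread = 2;
$modThread->execute();   // updates thread 2 (the current value of $thread)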