I'm using PDO against MSSQL, and I need to run nested queries, all of them prepared statements. If I try to use the fetch() method, the inner queries fail immediately, so I used fetchAll(). So I get something like this, with Programs, Products and Budgets:
$pgm_stmt->execute();
$pgm_res = $pgm_stmt->fetchAll(PDO::FETCH_ASSOC);
foreach ($pgm_res as $pgmrow) {
    $prod_stmt->execute(array($pgmrow['ID']));
    $prod_res = $prod_stmt->fetchAll(PDO::FETCH_ASSOC);
    foreach ($prod_res as $prodrow) {
        $bdgt_stmt->execute(array($pgmrow['ID'], $prodrow['ID']));
        $bdgt_res = $bdgt_stmt->fetchAll(PDO::FETCH_NUM);
        foreach ($bdgt_res as $bdgtrow) {
            // ... work here
        }
    }
}
OK, everything works the first time through, but when it loops back for the second program, the product result set gets corrupted somehow. When I dump the $prod_res variable right after the fetchAll(), the values are randomly assigned from other parts of memory: bits of other arrays, etc. Of course it then fails, because the $prodrow['ID'] value is undefined once that whole result set is mangled.
Can someone help me troubleshoot this? I'm stumped.
Thanks.
Not a bug, but a feature; see: https://bugs.php.net/bug.php?id=65945
This is the behavior of MSSQL (TDS), DBLIB and FreeTDS: one statement per connection. If you initiate another statement, the previous statement is cancelled.
Previous versions buffered the entire result set in memory, leading to OOM errors on large result sets.
The previous behavior can be replicated using fetchAll() and a loop if desired. Another workaround is to open two connection objects, one per statement.
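For illustration, a minimal sketch of the two-connection workaround; the DSN, credentials and SQL here are hypothetical:

    $dsn = 'dblib:host=sqlserver;dbname=mydb'; // hypothetical DSN

    // One connection per active statement works around the
    // one-statement-per-connection rule of TDS/DBLIB/FreeTDS.
    $outerDb = new PDO($dsn, $user, $pass);
    $innerDb = new PDO($dsn, $user, $pass);

    $pgm_stmt  = $outerDb->prepare('SELECT ID FROM Programs'); // illustrative SQL
    $prod_stmt = $innerDb->prepare('SELECT ID FROM Products WHERE ProgramID = ?');

    $pgm_stmt->execute();
    // fetch() on the outer statement is now safe: the inner statement
    // runs on its own connection, so it cannot cancel the outer one.
    while ($pgmrow = $pgm_stmt->fetch(PDO::FETCH_ASSOC)) {
        $prod_stmt->execute(array($pgmrow['ID']));
        $prod_res = $prod_stmt->fetchAll(PDO::FETCH_ASSOC);
        // ... work here
    }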
I see comparisons between mysqli_fetch_array() and mysqli_fetch_all() saying that mysqli_fetch_all() takes more memory and requires iterating over the array.
mysqli_fetch_all() is one call to mysqli but mysqli_fetch_array() is one call per row, for example 100 calls.
I don't know how the mysqli processing works: is calling mysqli_fetch_array() really more efficient when you also take the number of calls into account?
(I already understand that the returned data can be associative arrays or not)
It has nothing to do with efficiency. It's all about usability.
fetch_all() is what is called "syntactic sugar": a shorthand to automate a frequently performed operation. It can be easily implemented as a userland function:
function mysqli_fetch_all($resource, $mode = MYSQLI_BOTH)
{
    $ret = [];
    while ($row = $resource->fetch_array($mode)) {
        $ret[] = $row;
    }
    return $ret;
}
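For example, assuming $mysqli is an open connection and the table exists:

    $result = $mysqli->query('SELECT id, name FROM users'); // hypothetical query
    $rows   = mysqli_fetch_all($result, MYSQLI_ASSOC);      // every row in one call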
Thus the use cases for these functions are:
fetch_all() should be used if you need an array consisting of all the returned rows, to be used elsewhere.
fetch_assoc() in a loop should be used if you're going to process the rows one by one, right in place.
As simple as that. These functions serve different purposes, and thus there is no point in comparing them.
Note that PDO is tenfold sweeter in terms of syntactic sugar, as its fetchAll() method can return data in dozens of different formats.
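For instance, a few of those PDO fetch modes; the table and column names here are hypothetical:

    $rows  = $pdo->query('SELECT id, name FROM users')->fetchAll(PDO::FETCH_ASSOC);    // list of assoc arrays
    $pairs = $pdo->query('SELECT id, name FROM users')->fetchAll(PDO::FETCH_KEY_PAIR); // id => name map
    $names = $pdo->query('SELECT name FROM users')->fetchAll(PDO::FETCH_COLUMN);       // flat array of names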
From a comment on PHP's manual page for mysqli_fetch_all():
I tested fetch_all versus while / fetch_array and:
fetch_all uses less memory (though not by much).
In my case (test1 and test2): 147008 / 262848 bytes (fetch_all) versus 147112 / 262888 bytes (fetch_array & while).
So, memory-wise, both cases are about the same.
However, about the performance:
My test takes 350 ms (worst case) using fetch_all, while it takes 464 ms (worst case) using fetch_array, i.e. about 35% worse using fetch_array and a while loop.
So, using fetch_all for normal code that returns a moderate amount of information is:
a. cleaner (a single line of code)
b. lighter on memory (about 0.01% less)
c. faster
(PHP 5.6 32-bit, Windows 8.1 64-bit)
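For reference, a rough sketch of how such a comparison can be measured; numbers will vary with PHP version and platform, and the table name is hypothetical:

    $result = $mysqli->query('SELECT * FROM test_table'); // hypothetical table

    $start = microtime(true);
    $mem   = memory_get_usage();

    $rows = $result->fetch_all(MYSQLI_ASSOC); // or the while / fetch_array() loop

    printf("time: %.1f ms, memory: %d bytes\n",
        (microtime(true) - $start) * 1000,
        memory_get_usage() - $mem);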
I have a PHP script which processes a "large" dataset (about 100K records) from a PDO query into a single collection of objects, in a typical loop:
$list = [];
$count = 0;
while ($record = $query->fetch()) {
    $obj = new Thing($record);
    /* do some processing */
    $list[] = $obj;
    $count++;
}
error_log('Processed '.$count.' records');
This loop processes about 50% of the dataset and then inexplicably breaks.
Things I have tried:
Memory profiling: memory_get_peak_usage() consistently outputs about 63MB before the loop dies. The memory limit is 512MB, set through php.ini.
Using set_time_limit() to increase script execution time to 1 hour (3600 seconds). The loop breaks long before that and I don't see the usual error in the log for this one.
Setting PDO::MYSQL_ATTR_USE_BUFFERED_QUERY to false to avoid buffering the entire dataset.
Logging out $query->errorInfo() immediately after the loop break. This was no help as the error code was "00000".
Checking the MySQL error log. Nothing of note in there before, after, or while this script runs.
Batching the processing into 20K-record chunks. No difference. Loop broke in the same spot. However, by "cleaning up" the PDO statement object at the end of each batch, I was able to get the processed total to 54%.
Other weird behavior:
When I set the memory limit using ini_set('memory_limit', '1024MB'), the loop actually dies earlier than with a smaller memory limit, at about 20% progress.
During this loop, the PHP process uses 100% CPU, but once the loop breaks, usage drops back down to 2%, even though another processing loop starts immediately afterwards. Likely, the connection with the MySQL server in the first loop is very resource-intensive.
I am doing this all locally using MAMP PRO if that makes any difference.
Is there something else that could be consistently breaking this loop that I haven't checked? Is this simply not a viable strategy for processing this many records?
UPDATE
After using a batching strategy (20K increments), I have started to see a MySQL error consistently around the third batch: MySQL server has gone away; possibly a symptom of a long-running unbuffered query.
If you really need to process 100K records on the fly, you should do the processing in SQL and fetch the results as you need them; it should save a lot of time.
But you probably can't do that for some reason. Since you always process all the rows from the statement, use fetchAll() once, and leave MySQL alone after that, like this:
$records = $query->fetchAll();
foreach ($records as $record) {
    $obj = new Thing($record);
    /* do some processing */
    $list[] = $obj;
    $count++;
}
error_log('Processed '.$count.' records');
Also, select only the rows that you will use.
If this does not help, you can try this: Setting a connect timeout with PDO.
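As a sketch, that timeout is set as a connection attribute; note that PDO::ATTR_TIMEOUT is driver-dependent (for MySQL it acts as a connect timeout, in seconds), and the DSN and value here are hypothetical:

    $pdo = new PDO('mysql:host=localhost;dbname=mydb', $user, $pass, array(
        PDO::ATTR_TIMEOUT => 30, // hypothetical value
    ));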
Here is the edited script without errors, with the two fixes applied. To those who helped in part, thank you. Remarks that the code is unclear or messy are inconsequential, given that most of the following is common structure in MySQL queries; even the example documentation for MySQL follows this flow. Members who reply should refrain from pointless internet banter; it's a better use of your time and mine. Those who stayed on topic and assisted, I thank you.
For example:
$row = mysqli_fetch_row(mysqli_query($con, "SELECT test_table.points FROM test_table WHERE test_table.key = '" . $key . "'"));
if ($row[0] > 0){ // exists
Here $row will hold a non-zero result if true, otherwise 0 on false. There is little need to check mysqli_fetch_row and/or mysqli_query individually in a general exists condition, since checking $row alone works fine and accurately provides exists / does-not-exist results. There is no $result or $query, just $row.
The noted deviation from that normal flow was my desire to use call_user_func, and to pull the function and params in through $_GET. I will be looking more at PDO; however, cleaning the input before exec should do an alright job for now.
All in all, the code works just as it should, and I have since written more to manage a MySQL database: write, write chunk, read, read chunk, delete, delete chunk.
Also, to collect numbered records on request. For example, say you have 6 records for the same John Smith. You can now collate and scan for differences in those records, either for what you want, don't want, etc. Or say you just want to blindly call the first 3 of those records for John Smith.
mysqli_fetch_row & mysqli_query fix:
FROM: calling $con outside the function and passing it in, as per mysql. In mysqli this does not work as expected. There was no error with the functions, only with how $con was being handled.
TO: calling $con inside the function with just an added global $con. I may end up using $GLOBALS for this instead.
Result: calling $con outside the function and passing it in works fine in mysql. In mysqli it requires that a global be set within the function, i.e. global $con, or it fails.
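A minimal sketch of that fix, using a hypothetical function built around the query from the example above; $key should still be escaped or parameterised, as noted in the answers below:

    $con = mysqli_connect('localhost', 'user', 'pass', 'db'); // hypothetical credentials

    function get_points($key) {
        global $con; // without this, $con is undefined inside the function
        $row = mysqli_fetch_row(mysqli_query($con,
            "SELECT test_table.points FROM test_table WHERE test_table.key = '" . $key . "'"));
        return $row ? $row[0] : 0;
    }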
call_user_func non-critical error fix:
FROM: call_user_func($func($_GET['user'],$_GET['key'],$_GET['points'],$_GET['type']));
TO: call_user_func($func,$_GET['user'],$_GET['key'],$_GET['points'],$_GET['type']);
Result: both lines execute correctly, but FROM executed with a following non-critical error; TO does the same thing with no error.
Sample output for both: user=MY_Name;key=34342$ee56i1;points=1234;type=
-- code removed as fixes solved the issues --
You are using call_user_func wrong; read the manual.
call_user_func's first parameter is the callback. In your case it's a method of your class, so it should be something like this:
If you have a non-static method of an object:
class Test
{
    public function doit($a)
    {
        echo $a;
    }
}

$t = new Test();
call_user_func(array($t, 'doit'), 'asfaasfs');
And for static methods of a class:
class Test
{
    public static function doit($a)
    {
        echo $a;
    }
}

call_user_func('Test::doit', 'asfaasfs');
You have a few problems.
$con is declared outside the class, and is thus not available inside the class. You need to pass it into the class (the better option), or specify it as a global (the quick+dirty option).
mysqli_fetch_row(mysqli_query($con,'...'))
This code is obviously converted directly from your old mysql_xx() code, but it's not great.
You're ignoring any possible error condition returned by mysqli_query(). This means that if it fails, it will pass false into mysqli_fetch_row(), which will then fail with the meaningless error "expects parameter 1 to be mysqli_result", rather than actually telling you what the error in the query was.
The thing is, because of my first point above, with $con not being set, mysqli_query() is failing, and this is why you're getting the error in mysqli_fetch_row().
Ideally, you should split this code out into multiple lines. Call mysqli_query() on its own, then do some error checking, then call mysqli_fetch_row() only once you know that the query actually worked.
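A minimal sketch of that split; the error-handling style is up to you:

    $result = mysqli_query($con, $sql);
    if ($result === false) {
        // report the real cause, not the downstream fetch warning
        die('Query failed: ' . mysqli_error($con));
    }
    $row = mysqli_fetch_row($result);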
Hope that helps explain what the problems are here. Solve those two points, and you should be well on the way to sorting the whole thing out.
Once you've got rid of those fatal errors, you should also take the time to address the fact that your code is vulnerable to SQL injection attacks. You're currently passing your $_GET variables directly into the query strings without any sanitisation, which makes your system very fragile and easy to hack. You should consider using parameterised queries, a feature of the mysqli library designed to make it easier to deal with variables in SQL queries in a safe and secure way.
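For example, the lookup from the question could look like this as a parameterised query; a sketch, not a drop-in replacement:

    $stmt = mysqli_prepare($con,
        'SELECT test_table.points FROM test_table WHERE test_table.key = ?');
    mysqli_stmt_bind_param($stmt, 's', $_GET['key']); // value is sent separately, never interpolated
    mysqli_stmt_execute($stmt);
    mysqli_stmt_bind_result($stmt, $points);
    mysqli_stmt_fetch($stmt);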
Your class is pointless at the moment; perhaps stick to writing imperative-style code, as it will at least be cleaner.
For now, you should pass $con into your MYsql class to use as a resource, rather than trying to access it as a global variable.
You are not filtering your users' input either; this is dangerous and could lead to SQL injection attacks on your site.
I'd encourage you to read through these two articles, and once you grok them, I'd also encourage you to simply switch to using PDO with prepared statements. This will stop SQL injection attacks that your code currently allows.
http://net.tutsplus.com/tutorials/php/pdo-vs-mysqli-which-should-you-use/
http://net.tutsplus.com/tutorials/php/why-you-should-be-using-phps-pdo-for-database-access/
I usually try to minimize calls to MySQL where possible, but I've finally encountered the case where I have to do multiple calls to MySQL in a single script.
It looks like you can't use mysql_fetch_assoc twice in a single script(!).
It seems that mysql_data_seek is the solution, but I cannot seem to get it to work.
To be clear, I have two queries I need to make. The second query depends on results from the first... Here's the structure:
$result = mysql_query($query1);
while ($row = mysql_fetch_assoc($result)) {
    $pos = $row['position'];
}
mysql_free_result($result); // result freed per comment below.

$query2 = ' '; // ... dependent on $pos - in mysql shell this returns results!
$result2 = mysql_query($query2);
while ($row = mysql_fetch_assoc($result2)) {
    echo $row['id'];
}
What happens above is that the second while loop returns no results even though the query should have nontrivial rows.
Just to be sure:
Is this how you clear the pointer from the previous result to be able to use mysql_fetch_assoc again?
mysql_data_seek($result,mysql_num_rows($result) - 1);
I'm not sure what to use as the second argument. Admittedly, I am not that clear on pointers, but it seems I should reset the pointer to 0. But I get this error:
Offset 0 is invalid for MySQL result index 8 (or the query data is unbuffered)
Check your connection with mysql_error() and see if you're getting the "commands out of sync" error. If so, you need to call mysql_free_result() after completing the first query and before starting the second. You could also just call mysql_close() to close your database connection and then reopen a new one.
For more details, see this question.
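For example, with the legacy mysql_* API used here, the check could look like this; the quoted message is only illustrative:

    $result2 = mysql_query($query2);
    if ($result2 === false) {
        die('Query failed: ' . mysql_error()); // e.g. "Commands out of sync; ..."
    }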
OP changed the question, so see the edit
*Deleted the posted code here*
EDIT
After you edited your question and made clear that you actually have 2 resources, it looks like something else is wrong. You don't have to worry about the pointer when you use two different resources to supply mysql_fetch_assoc(). The thing with mysql_fetch_assoc() is that it takes its parameter ($result) by reference.
Now to answer your question:
I usually try to minimize calls to MySQL where possible, but I've finally encountered the case where I have to do multiple calls to MySQL in a single script.
Nothing wrong with multiple SQL calls in one script. Although in general you should try to minimize the SQL calls (because they may hurt performance).
It looks like you can't use mysql_fetch_assoc twice in a single script(!).
Plain wrong. Of course you can do it, as long as you note the above. However, when you have two result sets this wouldn't be your problem.
It seems that mysql_data_seek is the solution, but I cannot seem to get it to work.
Again: this has nothing to do with it when you use two (different) result sets.
To be clear, I have two queries I need to make. The second query depends on results from the first.
This shouldn't be any problem at all; it looks like something else is wrong. Have you verified that the second query really is what you think it is? Are you sure there are records? Are you sure there aren't any (MySQL) errors? Do you have error reporting enabled? Have you tried printing out mysql_error()? To be better able to help, can you please provide your real code instead of the "etc etc" stuff? Maybe something else is going on.
Or maybe you are simply trying to run the second query inside the first loop. Which would be bad in so many ways.
I'm having trouble trying to use $stmt->store_result() after executing a prepared statement. The manual says "Transfers a result set from a prepared statement", but to where?
I would ideally like to use it, all nice and buffered in memory, like $mysqli->query() returns.
The PHP documentation examples make it seem like the only good reason to use store_result() is to get the number of rows.
Ideally, I need something that can get the entire result set from a prepared statement and do something like fetch_rows on it. I DID hack the mysqli_stmt class by extending it and adding a fetch_assoc function using some backwards methods, but I would really like to remove that hack if I can get all my data buffered into memory.
So I guess my short question is, I know what store_result() does, but how do you USE it?
(And honestly, I wouldn't even be using $mysqli->prepare() to get a stmt if it weren't so damn useful in preventing SQL injection attacks, sanitizing variables, and making my SQL look so nice and pretty with all those question marks.)
The only thing store_result() does is move the result set from MySQL into PHP memory. On its own that's not very useful, since you still need to fetch the rows into PHP variables.
What you are looking for is get_result(). This function will fetch results from MySQL and store them in PHP memory, additionally returning a familiar mysqli_result object.
$stmt = $mysqli->prepare('SELECT ? as col');
$stmt->bind_param('s', $variable);
$stmt->execute();
// returns mysqli_result object
$result = $stmt->get_result();
// loop to get associative rows
foreach ($result as $row) {
echo $row['col'];
}