I'm having some serious problems with the PHP Data Object functions. I'm trying to loop through a sizeable result set (~60k rows, ~1 GB) using a buffered query to avoid fetching the whole set into memory at once.
No matter what I do, the script just hangs on PDO::query() - it seems the query is running unbuffered (why else would the change in result set size 'fix' the issue?). Here is my code to reproduce the problem:
<?php
$Database = new PDO(
    'mysql:host=localhost;port=3306;dbname=mydatabase',
    'root',
    '',
    array(
        PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
        PDO::MYSQL_ATTR_USE_BUFFERED_QUERY => true
    )
);

$rQuery = $Database->query('SELECT id FROM mytable');

// This is never reached because the result set is too large
echo 'Made it through.';

foreach ($rQuery as $aRow) {
    print_r($aRow);
}
?>
If I limit the query with some reasonable number, it works fine:
$rQuery = $Database->query('SELECT id FROM mytable LIMIT 10');
I have tried playing with PDO::MYSQL_ATTR_MAX_BUFFER_SIZE and using PDO::prepare() with PDOStatement::execute() as well (though there are no parameters in the above query), both to no avail. Any help would be appreciated.
If I understand this right, buffered queries involve telling PHP that you want to wait for the entire result set before you begin processing. Prior to PDO, this was the default and you had to call mysql_unbuffered_query if you wanted to deal with results immediately.
Why this isn't explained on the PDO MySQL driver page, I don't know.
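If streaming the rows is what you actually want, the buffering attribute can simply be turned the other way. Here is a minimal, untested sketch using the connection details from the question; with buffering disabled, PDO::query() returns as soon as MySQL starts sending rows and memory use stays flat:
<?php
$Database = new PDO(
    'mysql:host=localhost;port=3306;dbname=mydatabase',
    'root',
    '',
    array(
        PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
        PDO::MYSQL_ATTR_USE_BUFFERED_QUERY => false // stream rows instead of buffering them
    )
);

$rQuery = $Database->query('SELECT id FROM mytable');
while ($aRow = $rQuery->fetch(PDO::FETCH_ASSOC)) {
    print_r($aRow); // handle one row at a time
}
?>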
Alternatively, you could try to split it up into chunks that aren't big enough to cause problems:
<?php
$id = 0;
do {
    $rQuery = $Database->query(
        'SELECT id FROM mytable ORDER BY id ASC LIMIT 100 OFFSET '.$id
    );
    stuff($rQuery);                    // process this chunk however you like
    $id += 100;
} while ($rQuery->rowCount() === 100); // a short chunk means we've reached the end
?>
...you get the idea, anyway.
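A variant worth mentioning (not part of the suggestion above, just a common refinement): instead of an ever-growing OFFSET, remember the last id you processed and put it in the WHERE clause, so MySQL never has to skip rows. A rough sketch, assuming a stuff() helper that takes a single row here:
$lastId = 0;
do {
    $rQuery = $Database->query(
        'SELECT id FROM mytable WHERE id > '.$lastId.' ORDER BY id ASC LIMIT 100'
    );
    $aRows = $rQuery->fetchAll(PDO::FETCH_ASSOC);
    foreach ($aRows as $aRow) {
        stuff($aRow);            // process one row
        $lastId = $aRow['id'];   // remember where this chunk ended
    }
} while (count($aRows) === 100); // a short chunk means we've reached the end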
Or maybe you could try mysql functions instead:
$query = mysql_query('SELECT id FROM mytable');
while ($row = mysql_fetch_row($query)) {
    // ...
}
which will probably be faster, since iterating the PDO statement with foreach gives the impression of doing a fetchAll() rather than fetch()ing each row.
Related
In my database, I have huge tables with a couple of billion rows, and the problem is that my PHP code can't cope with them. I wait hours for results and almost always end up with an error like 'server has gone away' or one complaining that a maximum data limit has been exceeded.
The tables have indexed columns and the queries are already simplified to the minimum. I have tried a few approaches in PHP, but all of them failed.
// approach #1 with PHP
$pdo = new PDO("mysql:host=".$data["host"].";dbname=".$data["dbname"].";charset=utf8", $data["username"], $data["pass"]);
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$sql = "SELECT id FROM huge_table bt INNER JOIN even_bigger_table ebt ON bt.id = ebt.id WHERE bt.column = 'data1' AND ebt.column2 = 'data2'";
// ends up with error with no results
$pdo->query($sql)->fetchAll();
// approach #2 with PHP
$pdo = new PDO("mysql:host=".$data["host"].";dbname=".$data["dbname"].";charset=utf8", $data["username"], $data["pass"]);
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$resultsCounted = 5000000;
$offset = 0;
for($i = 0; $offset < $resultsCounted; $i += 500) {
$sql = "SELECT id FROM huge_table bt INNER JOIN even_bigger_table ebt ON bt.id = ebt.id WHERE bt.column = 'data1' AND ebt.column2 = 'data2' LIMIT ".$offset.", 500";
// ends up with error with no results
$result = $pdo->query($sql)->fetchAll();
// do some stuff with results
do_stuff($result);
$offset += $i;
}
Strangely, this SQL works in the PhpStorm database console when using the "Execute to file" function:
SELECT id FROM huge_table bt INNER JOIN even_bigger_table ebt ON bt.id = ebt.id WHERE bt.column = 'data1' AND ebt.column2 = 'data2'
I can watch PhpStorm write the results to the file 100 rows at a time until all of them are saved.
I need to find a way to recreate in PHP what PhpStorm is doing there, or some other way to save the results properly.
LIMIT hugenumber, smallnumber generally works poorly: it's a notorious performance antipattern. MySQL has to do the work of skipping over hugenumber rows before it can return smallnumber of them. Then it has to do it all again for the next chunk, and again...
Instead process your rows one at a time.
Most queries in PHP are buffered. That is, fetchAll() and its equivalents slurp up everything in the query result set in one big gulp.
That works badly for queries with vast result sets, partly because PHP scripts run under a memory limit.
Use an unbuffered query for this. Process the rows you need one at a time, then discard them. That way your resultset will flow through your php program in a continuous stream. (I did not debug this example).
// $pdo is the same connection object created in the question
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

$sql = "SELECT bt.id FROM huge_table bt INNER JOIN even_bigger_table ebt ON bt.id = ebt.id WHERE bt.column = 'data1' AND ebt.column2 = 'data2'";

$unbuff = $pdo->query($sql);
if ($unbuff) {
    while ($row = $unbuff->fetch(PDO::FETCH_ASSOC)) {
        echo $row['id']; /* handle the row */
    }
}
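One caveat worth adding here (a general property of unbuffered MySQL queries, not something from the answer above): while an unbuffered result set is still open, the same connection can't run another statement until every row has been read or the cursor has been closed. Something along these lines, reusing the $pdo and $unbuff from the sketch above:
// Drain or close the unbuffered cursor before issuing the next query
// on this connection, or the driver will refuse to run it.
$unbuff->closeCursor();
$total = $pdo->query('SELECT COUNT(*) FROM huge_table')->fetchColumn();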
I am using PDO to execute a query for which I am expecting ~500K results. This is my query:
SELECT Email FROM mytable WHERE flag = 1
When I run the query in Microsoft SQL Server Management Studio I consistently get 544838 results. I wanted to write a small script in PHP that would fetch these results for me. My original implementation used fetchAll(), but this was exhausting the memory available to PHP, so I decided to fetch the results one at a time like so:
$q = <<<QUERY
SELECT Email FROM mytable WHERE flag = 1
QUERY;
$stmt = $conn->prepare($q);
$stmt->execute();
$c = 0;
while ($email = $stmt->fetch()[0]) {
echo $email." $c\n";
$c++;
}
but each time I run the query, I get a different number of results! Typical results are:
445664
445836
445979
The number of results always comes up roughly 100K short, give or take 200 or so. Any help would be greatly appreciated.
The fetch() method fetches one row at a time from the current result set, and $stmt->fetch()[0] is the first column of that row.
Your SQL query has no ordering and probably contains some NULL or empty values.
Since your while condition tests that column value, the loop exits as soon as a row's first column is NULL, empty, or otherwise falsy - long before the result set is exhausted.
Therefore, you should test the return value of fetch() itself, not fetch()[0] or anything like that.
Inside the while loop, read the column by index from the row you already fetched:
$c = 0;
while (($row = $stmt->fetch(PDO::FETCH_NUM)) !== false) { // loop on fetch() itself
    $email = $row[0]; // first column; may legitimately be NULL or '' for some rows
    echo $email . " $c\n";
    $c++;
}
(If you were using the sqlsrv functions directly rather than PDO, the equivalent would be sqlsrv_fetch() to advance the row and sqlsrv_get_field($stmt, 0) to read the first column; note that sqlsrv_get_field() can return false on errors.)
So I want to pull a random record off my DB:
mysql_connect("127.0.0.1","my_shop","xxxxxx");
mysql_select_db("my_shop_db");
error_reporting(E_ALL && ~E_NOTICE);
$random_record = "SELECT * FROM products ORDER BY RAND() LIMIT 1";
$random_product = mysql_query($random_record) or die ($random_record);
echo $random_product['id'];
What I tried so far:
DB Connection works 100%
Tables exist, have exact wordings and contain data
DB User has rights to pull Data
What is wrong with my script?
That can't work. You only run the query - that's it; you never fetch anything from the result. You need to fetch the data first with mysql_fetch_assoc() or another fetch function to get any output.
Also, the mysql_* functions are deprecated; you should switch to mysqli or PDO with prepared statements.
Obligatory comment that mysql_* is deprecated and not safe and should not be used anymore.
mysql_query() returns a result resource, NOT an array. You need to fetch the data into an array before you can access it.
$random_record = "SELECT * FROM products ORDER BY RAND() LIMIT 1";
$random_product = mysql_query($random_record) or die ($random_record);
//fetch data
$data_array = mysql_fetch_assoc($random_product);
echo $data_array['id'];
print_r the array so you understand the structure and how to access the element you want.
A lot of folks on here (correctly) point out that mysql should no longer be used, then go on to answer the question using mysql. I think it's worthwhile to show you how easy it is to make the change to mysqli. Using dan08's answer as a jumping off point:
//set up connection, save it into the $link variable
$link = mysqli_connect("127.0.0.1","my_shop","xxxxxx", "my_shop_db");
//your query, same as before
$random_record = "SELECT * FROM products ORDER BY RAND() LIMIT 1";
//almost the same syntax as mysql_*, but with the $link added in to specify the connection
$random_product = mysqli_query($link, $random_record) or die ($random_record);
//fetch data
$data_array = mysqli_fetch_assoc($random_product);
echo $data_array['id'];
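And for completeness, since PDO was mentioned above as the other option, a minimal PDO version of the same thing might look like this (a sketch only; the credentials are the placeholders from the question):
// connect with PDO and throw exceptions on errors
$pdo = new PDO('mysql:host=127.0.0.1;dbname=my_shop_db', 'my_shop', 'xxxxxx');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// run the query and fetch the single random row
$stmt = $pdo->query('SELECT * FROM products ORDER BY RAND() LIMIT 1');
$data_array = $stmt->fetch(PDO::FETCH_ASSOC);

echo $data_array['id'];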
I'm working on an autosuggest field and it already works. The only thing I'm trying to do now is limit the number of results shown in the list below the search field.
Can someone help me do that? I've tried a few ways already, but ended up with errors. What I want is to limit the number of results that get pulled out of the database. I tried doing this in PHP (I assumed that would be better performance-wise - is it?). Here's the code that already works fine:
<?php
require_once 'connect.php';
if (isset($_POST['search_term']) == true && empty ($_POST['search_term']) == false) {
$search_term = mysql_real_escape_string ($_POST['search_term']);
$query = mysql_query ("SELECT `word` FROM `datalist` WHERE `word` LIKE '$search_term%'");
while (($row = mysql_fetch_assoc($query)) !==false) {
echo '<li>', $row['word'], '</li>';
}
}
?>
Since I'm not an expert, I would be happy for some help to learn more...
You can limit it in your query. It is better to limit the amount of data you retrieve from the database than to retrieve everything and filter the results afterwards. MySQL is built to run these queries quickly, so use that to your advantage.
mysql_query ("SELECT `word` FROM `datalist` WHERE `word` LIKE '$search_term%' LIMIT 5");
However, you can do it in PHP if you want, by fetching in a for loop:
$limit = 5; // make the limit whatever you want
// Make sure you have enough results to fetch
if (mysql_num_rows($query) < $limit) {
    $limit = mysql_num_rows($query);
}
for ($i = 0; $i < $limit; $i++) {
    $row = mysql_fetch_assoc($query);
    echo '<li>', $row['word'], '</li>';
}
The easiest approach would be to update your SQL query and add a LIMIT clause. So if you only want the first 10 results, do it like this:
SELECT `word`
FROM `datalist`
WHERE `word`
LIKE '$search_term%'
LIMIT 10;
You can do that in PHP without modifying your query too, but I find it easiest to pull from the database only what you actually need.
By the way, you're using the old mysql connector. This extension is deprecated as of PHP 5.5.0 and will be removed in the future. The MySQLi or PDO_MySQL extension should be used instead.
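To tie both points together, here is a rough mysqli rewrite of the snippet above, with the LIMIT applied in the query and the search term bound as a parameter (a sketch only; the connection details are placeholders, and get_result() assumes the mysqlnd driver is available):
<?php
$mysqli = new mysqli('localhost', 'db_user', 'db_pass', 'db_name');

if (isset($_POST['search_term']) && $_POST['search_term'] !== '') {
    $stmt = $mysqli->prepare(
        'SELECT `word` FROM `datalist` WHERE `word` LIKE ? LIMIT 10'
    );
    $term = $_POST['search_term'] . '%';   // prefix match, as in the original query
    $stmt->bind_param('s', $term);
    $stmt->execute();
    $result = $stmt->get_result();         // requires mysqlnd; use bind_result() otherwise
    while ($row = $result->fetch_assoc()) {
        echo '<li>', htmlspecialchars($row['word']), '</li>'; // escape output for HTML
    }
}
?>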
I am trying to query a database, but it seems to just load forever and never do anything. It's a simple query and shouldn't take more than a millisecond.
while($row = mysql_fetch_array(getWallPosts($userid)))
{
}
Now when I replace this code with:
echo mysql_num_rows(getWallPosts($userid));
It just displays '1' - and in fact there is only one record in the DB, and it's just a simple SELECT statement.
Here's the getWallPosts function:
function getWallPosts($userid, $limit = "10")
{
$result = dbquery("SELECT * FROM commentpost
WHERE userID = ".$userid."
ORDER BY commentPostID DESC LIMIT 0, ".$limit);
return $result;
}
Also, when I put the SQL string that it's executing into MySQL's query browser. It takes no time at all.
Any ideas?
You appear to be looping indefinitely because you're retrieving a new set (one record) of data each time.
$result = getWallPosts($userid);
while ($row = mysql_fetch_array($result)) {
//printf($row[0]);
}
You need to get the data once and loop through it. Your code is getting the data, running the loop and then getting the data again.
Try:
$posts = getWallPosts($userid);
while($row = mysql_fetch_array($posts))
{
//Code to execute
}
It's an infinite loop. The expression in the while clause is evaluated on every iteration, and it always produces a truthy value.
You're returning a new result set each time the while statement executes. You should call getWallPosts() once, assign it to $results, and then loop over that.