Running an SQL query inside a loop - PHP

I store the array in the session so it's easy to retrieve and work with.
$responses = session('get_all_response');
$responses contains 30 records at most.
I'm aiming to speed up the part that pushes data into the array. With just 10 records in $responses (an array), it takes about 30 seconds to load all the related information for each entry, and in practice the array usually holds closer to the 30-record maximum.
I loop over the array:
foreach ($responses as $res) {
    $bo_images = DB::select('SELECT
            image.bo_hotel_code,
            image.bo_image_type_code,
            image.bo_path,
            imagetypes.bo_content_imagetype_description
        FROM
            bo_images AS image
        RIGHT JOIN bo_content_imagetypes AS imagetypes
            ON imagetypes.bo_content_imagetype_code = image.bo_image_type_code
        WHERE image.bo_hotel_code = "'.$res['code'].'" AND image.bo_image_type_code = "COM" LIMIT 1');
    if ($bo_images != null) {
        foreach ($bo_images as $row) {
            $responses[$res['code']]['information']['bo_images'] = array(
                'image_type_code' => $row->bo_image_type_code,
                'image_path' => 'http://photos.hotelbeds.com/giata/'.$row->bo_path,
                'image_type_description' => $row->bo_content_imagetype_description,
            );
        }
    }
    $bo_categories = DB::select('SELECT
            a.category_code,
            b.bo_content_category_description
        FROM
            bo_hotel_contents AS a
        RIGHT JOIN bo_content_categories AS b
            ON b.bo_content_category_code = a.category_code
        WHERE a.hotel_code = "'.$res['code'].'"');
    if ($bo_categories != null) {
        foreach ($bo_categories as $row) {
            $responses[$res['code']]['information']['rating'] = array(
                'description' => $row->bo_content_category_description,
            );
        }
    }
}
On every iteration, $res['code'] is the key used to fetch the related content from the database, and the result is then pushed back into $responses at the matching index.
It works, but I know this is not the proper way of doing it, and that there must be a much better approach.
Any help is much appreciated.

I'm not familiar with Laravel, so I don't know if prepared statements work with it, but you should do something to clean and/or verify $res['code'] to make sure it is an integer, assuming that's what it's supposed to be.
First, build a string for a WHERE IN clause:
$str = "";
foreach ($responses as $res){
$str .= ','.$res['code'];
}
$str = substr($str,1); // to remove the comma
Then you'll need to change your query to use the IN statement.
WHERE a.hotel_code IN({$str})
I'm guessing image.bo_hotel_code refers to $res['code']. But in case it doesn't, you could modify your SELECT statement (if memory serves):
$code = $res['code'];
SELECT {$code} as code,
image.bo_hotel_code,
image.bo_image_type_code,
...
Then you'll loop over the results and put them into the array in the same manner, where $row['code'] would refer to the code used to select it. It should be MUCH faster than running repeated queries, and there should be one row for each code in the IN statement.
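For completeness, here is a minimal sketch of what the batched rewrite could look like in Laravel, assuming the DB facade's parameter bindings; table and column names are taken from the question, while the grouping logic is illustrative:

// Collect all hotel codes once, then run a single query for all of them.
$codes = array_column($responses, 'code');
$placeholders = implode(',', array_fill(0, count($codes), '?'));

// One query for every hotel's COM image instead of one query per hotel.
// Note: the original per-hotel LIMIT 1 doesn't translate directly; here the
// last row seen for a given hotel wins.
$bo_images = DB::select(
    "SELECT image.bo_hotel_code, image.bo_image_type_code, image.bo_path,
            imagetypes.bo_content_imagetype_description
     FROM bo_images AS image
     RIGHT JOIN bo_content_imagetypes AS imagetypes
         ON imagetypes.bo_content_imagetype_code = image.bo_image_type_code
     WHERE image.bo_hotel_code IN ($placeholders)
       AND image.bo_image_type_code = ?",
    array_merge($codes, ['COM'])
);

// Merge the rows back into $responses, keyed by hotel code.
foreach ($bo_images as $row) {
    $responses[$row->bo_hotel_code]['information']['bo_images'] = array(
        'image_type_code' => $row->bo_image_type_code,
        'image_path' => 'http://photos.hotelbeds.com/giata/' . $row->bo_path,
        'image_type_description' => $row->bo_content_imagetype_description,
    );
}

The $bo_categories query can be batched the same way, so the whole page runs two queries instead of up to sixty.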


What is faster in PHP/MySQL: an Nth-term query in a for loop, or an IN query on an array of ids pushed after the initial query?

I'm wondering whether this kind of logic would improve query performance. For example, rather than checking whether a user likes a post by firing a query for each element in an array,
I could push the primary ids into an array and then perform a single IN query on them. That would reduce 15 Nth-term queries to 2 queries in total, including the initial one.
I'm using PHP PDO and MySQL.
Any advice? Am I on the right track, people? :D
$items is the result set from the database; in this case they are questions that users are asking. I get a response in about 140ms, and I've set a limit on how many items are loaded at once with pagination.
$questionIds = [];
foreach ($items as $item) {
    array_push($questionIds, $item->question_id);
}
$items = loggedInUserLikesQuestions($questionIds, $items, $user_id);
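For illustration, a sketch of what loggedInUserLikesQuestions could look like as a single IN query; the question_likes table and its columns are assumptions, since the schema isn't shown:

function loggedInUserLikesQuestions(array $questionIds, array $items, $user_id)
{
    global $db; // assumed PDO instance

    if (empty($questionIds)) {
        return $items;
    }

    // One placeholder per id, all bound through a single prepared statement.
    $placeholders = implode(',', array_fill(0, count($questionIds), '?'));
    $stmt = $db->prepare(
        "SELECT question_id FROM question_likes
         WHERE user_id = ? AND question_id IN ($placeholders)" // table/columns assumed
    );
    $stmt->execute(array_merge([$user_id], $questionIds));
    $liked = $stmt->fetchAll(PDO::FETCH_COLUMN);

    // Flag each item in one pass instead of firing one query per item.
    foreach ($items as $item) {
        $item->liked_by_user = in_array($item->question_id, $liked);
    }
    return $items;
}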
The IN clause is definitely faster to execute. However, you will only see significant wall-clock benefits once the average number of items in your IN clause gets high.
The reason there is a speed difference, even though each individual update may be lightning-fast, is the per-query overhead: the setup, execution, tear-down, and response of each query, plus the send/receive round trip to the server. When doing thousands (or millions) of these as fast as you can, I've seen throughput go from 500/sec to 200,000/sec by batching. This may give you some idea.
However, with the IN-clause method, you need to make sure your IN clause does not become too big and hit the maximum query size (see the max_allowed_packet server variable).
Here is a simple set of functions that will automatically batch up into IN clauses of 1000 items each:
<?php
$db = new PDO('...');
$__q = [];
$flushQueue = function() use ($db, &$__q) {
if ( count($__q) > 0 ) {
$sanitized_ids = [];
foreach ( $__q as $id ) { $sanitized_ids[] = (int) $id; }
$db->query("UPDATE question SET linked = 1 WHERE id IN (". join(',',$sanitized_ids) .")");
$__q = [];
}
};
$queuedUpdate = function($question_id) use (&$__q, $flushQueue){
$__q[] = $question_id;
if ( count( $__q) > 1000 ) { $flushQueue(); }
};
// Then your code...
foreach ($items as $item) {
$queuedUpdate($item->question_id);
}
$flushQueue();
Obviously, you don't have to use anonymous functions if you are in a class. But the above will work anywhere (assuming you are on PHP >= 5.3).

How to add an element to a PHP array without it showing up as a new array

The intention of the code below is to extract messages from a MySQL table and put each of them inside ONE array, with {} around each output. Each output consists of various parameters, as you can see, and is an array in itself.
What the code actually does is wrap each loop iteration's output in []'s in the JSON this is later converted into, so every iteration creates a new nested array.
What I get is:
[{"sender":"ll","message":"blah","timestamp":"2016-12-21 14:43:04","username":"","msgtype":"","threadid":"32629016712222016034323"},{"sender":"kk","message":"blahblah","timestamp":"2016-12-21 14:43:23","username":"","msgtype":"","threadid":"32629016712222016034323"},{"sender":"ll","message":"blahblahblah","timestamp":"2016-12-21 14:43:47","username":"","msgtype":"","threadid":"32629016712222016034323"}],[{"sender":"ll","message":"blahblahblahblah","timestamp":"2016-12-21 14:43:04","username":"","msgtype":"","threadid":"92337321312222016034304"},{"sender":"kk","message":"blahblahblahblahblah","timestamp":"2016-12-21 14:44:05","username":"","msgtype":"","threadid":"92337321312222016034304"}]]
And what I want is:
[{"sender":"ll","message":"blah","timestamp":"2016-12-21 14:43:04","username":"","msgtype":"","threadid":"32629016712222016034323"},{"sender":"kk","message":"blahblah","timestamp":"2016-12-21 14:43:23","username":"","msgtype":"","threadid":"32629016712222016034323"},{"sender":"ll","message":"blahblahblah","timestamp":"2016-12-21 14:43:47","username":"","msgtype":"","threadid":"32629016712222016034323"}],{"sender":"ll","message":"blahblahblahblah","timestamp":"2016-12-21 14:43:04","username":"","msgtype":"","threadid":"92337321312222016034304"},{"sender":"kk","message":"blahblahblahblahblah","timestamp":"2016-12-21 14:44:05","username":"","msgtype":"","threadid":"92337321312222016034304"}]
How do I proceed to get the right result here?
$data = array();
foreach ($threads as $threadid) {
    $sql = "SELECT sender,message,timestamp,username,msgtype,threadid FROM Messages WHERE threadid = '$threadid' AND subject = '' AND timestamp > '$newtimestamp' ORDER BY timestamp";
    $arrayOfObjects = $conn->query($sql)->fetchAll(PDO::FETCH_OBJ);
    $data[] = $$arrayOfObjects;
}
And FYI, $threadid is another array containing... threadids, and the loop correctly fetches these one by one; that's not where the problem is.
Thanks in advance!!
You are doing O(N) database queries; consider doing just O(1) by using an IN expression in your WHERE clause. There's no need for a foreach loop, and you'll get all your data in one array.
SELECT ... FROM Messages WHERE threadid IN (1, 2, 3, ...) AND ...
You might have to use a prepared statement for that.
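A minimal sketch of that prepared-statement version, generating one placeholder per thread id (variable names follow the question):

// One placeholder per thread id, so the ids are bound rather than interpolated.
$placeholders = implode(',', array_fill(0, count($threads), '?'));
$stmt = $conn->prepare(
    "SELECT sender, message, timestamp, username, msgtype, threadid
     FROM Messages
     WHERE threadid IN ($placeholders) AND subject = '' AND timestamp > ?
     ORDER BY timestamp"
);
$stmt->execute(array_merge($threads, [$newtimestamp]));

// fetchAll gives one flat array of row objects, which is the desired JSON shape.
$data = $stmt->fetchAll(PDO::FETCH_OBJ);
echo json_encode($data);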
I think you are searching for PDO::FETCH_OBJ.
You had FETCH_ASSOC, which will return an array of associative arrays.
FETCH_OBJ will return an array of stdClass objects.
Also, you had a stray extra $ in $data[] = $$arrayOfObjects;, which makes it a variable variable rather than the variable itself.
$data = array();
foreach ($threads as $threadid) {
    $sql = "SELECT sender,message,timestamp,username,msgtype,threadid FROM Messages WHERE threadid = '$threadid' AND subject = '' AND timestamp > '$newtimestamp' ORDER BY timestamp";
    // here it is:
    $arrayOfObjects = $conn->query($sql)->fetchAll(PDO::FETCH_OBJ);
    $data[] = $arrayOfObjects;
}
// now you can encode that as json and show it:
echo json_encode($data);
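One caveat: $data[] = $arrayOfObjects still appends each thread's result set as a nested array, which reproduces the extra []'s from the question. If a single flat array is the goal, merging each batch is one way to get it, e.g. replacing that line with:

// Merge each thread's rows into one flat array instead of nesting them.
$data = array_merge($data, $arrayOfObjects);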
#akuhn
Well, I decided to give your suggestion one more try and managed to do it in a non-prepared way. I'm aware that this is supposed to be risky, but for now this project just needs to work; the PHP code will then be updated to safer versions before going live. It works, so thanks a bunch!
$sql = ("SELECT sender,message,timestamp,username,msgtype,threadid FROM Messages WHERE threadid IN ('" . implode("','",$threadid) . "') AND subject = '' AND timestamp > '$newtimestamp' ORDER BY timestamp");
$data = $conn->query($sql)->fetchAll(PDO::FETCH_OBJ);

How to handle a large POST request

Unfortunately I can't show you the code, but I can give you an idea of what it looks like, what it does, and the problem I have...
<?php
include(db.php);
include(tools.php);
$c = new GetDB(); // Connection to DB
$t = new Tools(); // Classes to clean, prevent XSS and others
if(isset($_POST['var'])){
$nv = json_decode($_POST['var'])
foreach($nv as $k) {
$id = $t->clean($k->id);
// ... goes on for about 10 keys
// this might seems redundant or insufficient
$id = $c->real_escape_string($id);
// ... goes on for the rest of keys...
$q = $c->query("SELECT * FROM table WHERE id = '$id'");
$r = $q->fetch_row();
if ($r[1] > 0) {
// Item exist in DB then just UPDATE
$q1 = $c->query(UPDATE TABLE1);
$q4 = $c->query(UPDATE TABLE2);
if ($x == 1) {
$q2 = $c->query(SELECT);
$rq = $q2->fetch_row();
if ($rq[0] > 0) {
// Item already in table just update
$q3 = $c->query(UPDATE TABLE3);
} else {
// Item not in table then INSERT
$q3 = $c->query(INSERT TABLE3);
}
}
} else {
// Item not in DB then Insert
$q1 = $c->query(INSERT TABLE1);
$q4 = $c->query(INSERT TABLE2);
$q3 = $c->query(INSERT TABLE4);
if($x == 1) {
$q5 = $c->query(INSERT TABLE3);
}
}
}
}
As you can see, it's a very basic INSERT/UPDATE script, so before releasing to full production we ran some tests to see that the script works as it should, and the "results" were excellent...
We ran this code against 100 requests and everything went just fine: less than 1.7 seconds for the 100 requests. But then we saw the amount of data that needed to be sent/posted, and it was a jaw-drop for me: over 20K items. It takes about 3 to 5 minutes to send the post, but the script always crashes. The "data" is an array in JSON:
array (
    [0] => array (
        [id] => 1,
        [val2] => 1,
        [val3] => 1,
        [val4] => 1,
        [val5] => 1,
        [val6] => 1,
        [val7] => 1,
        [val8] => 1,
        [val9] => 1,
        [val10] => 1
    ),
    [1] => array (
        [id] => 2,
        [val2] => 2,
        [val3] => 2,
        [val4] => 2,
        [val5] => 2,
        [val6] => 2,
        [val7] => 2,
        [val8] => 2,
        [val9] => 2,
        [val10] => 2
    ),
    //... about 10 to 20K, depending on the day and time
)
but in JSON... Anyway, sending this information is not the problem; like I said, it can take about 3 to 5 minutes. The problem is the code that receives the data and runs the queries. On normal shared hosting we get a 503 error, which debugging showed to be a timeout. On our VPS we can raise max_execution_time to whatever we need, and processing 10K+ items takes about 1 hour there, but on shared hosting we can't change max_execution_time. So I asked the other developer, the one sending the information, to send a batch of 1K instead of 10K+ in one blow, let it rest for a second, then send another batch, and so on... so far I haven't got an answer. I was thinking of doing the "pause" on my end instead: say, after processing 1K items, wait for a second, then continue. But I don't see that as being as efficient as receiving the data in batches... how would you solve this?
Sorry, I don't have enough reputation to comment everywhere yet, so I have to write this as an answer. I would recommend zedfoxus' method of batch processing above. In addition, I would highly recommend figuring out a way to process those queries faster. Keep in mind that every single PHP function call, etc. gets multiplied by every row of data. Here are just a couple of the ways you might be able to get better performance:
Use prepared statements. This lets MySQL parse and plan each statement once and reuse that work for every successive execution. This is really important.
If you use prepared statements, then you can drop the $c->real_escape_string() calls. I would also scratch my head to see what you can safely leave out of the $t->clean() method.
Next I would evaluate the performance of evaluating every single row individually. I'd have to benchmark it to be sure, but I think running a few PHP statements beforehand will be faster than making umpteen unnecessary MySQL SELECT and UPDATE calls. MySQL is much faster when inserting multiple rows at a time. If you expect multiple rows of your input to be changing the same row in the database, then you might want to consider the following:
a. Think about creating a temporary, precompiled array (depending on memory usage involved) that stores the unique rows of data. I would also consider doing the same for the secondary TABLE3. This would eliminate needless "update" queries, and make part b possible.
b. Consider a single query that selects every id from the database that's in the array. This will be the list of items to use an UPDATE query for. Update each of these rows, removing them from the temporary array as you go. Then, you can create a single, multi-row insert statement (prepared, of course), that does all of the inserts at a single time.
Take a look at optimizing your MySQL server parameters to better handle the load.
I don't know if this would speed up a prepared INSERT statement at all, but it might be worth a try. You can wrap the INSERT statements within a transaction, as detailed in an answer here: MySQL multiple insert performance (a sketch follows below).
I hope that helps, and if anyone else has some suggestions, just post them in the comments and I'll try to include them.
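To illustrate that last point, here is a minimal sketch of a transaction-wrapped, prepared multi-insert. It assumes a PDO connection in $pdo and a hypothetical items table whose columns mirror the sample data above; the asker's real tables aren't shown:

$pdo->beginTransaction();
try {
    // Prepare once, execute many times, commit once.
    $stmt = $pdo->prepare(
        "INSERT INTO items (id, val2, val3) VALUES (?, ?, ?)
         ON DUPLICATE KEY UPDATE val2 = VALUES(val2), val3 = VALUES(val3)"
    );
    foreach ($nv as $k) {
        $stmt->execute([(int) $k->id, $k->val2, $k->val3]);
    }
    $pdo->commit();
} catch (Exception $e) {
    $pdo->rollBack(); // leave the tables unchanged on failure
    throw $e;
}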
Here's a look at the original code with just a few suggestions for changes:
<?php
/* You can make sure that the connection type is persistent and
* I personally prefer using the PDO driver.
*/
include(db.php);
/* Definitely think twice about each tool that is included.
* Only include what you need to evaluate the submitted data.
*/
include(tools.php);
$c = new GetDB(); // Connection to DB
/* Take a look at optimizing the code in the Tools class.
* Avoid any and all kinds of loops–this code is going to be used in
* a loop and could easily turn into O(n^2) performance drain.
* Minimize the amount of string manipulation requests.
* Optimize regular expressions.
*/
$t = new Tools(); // Classes to clean, prevent XSS and others
if(isset($_POST['var'])){ // !empty() catches more cases than isset()
$nv = json_decode($_POST['var'])
/* LOOP LOGIC
* Definitely test my hypothesis yourself, but this is similar
* to what I would try first.
*/
//Row in database query
$inTableSQL = "SELECT id FROM TABLE1 WHERE id IN("; //keep adding to it
foreach ($nv as $k) {
/* I would personally use specific methods per data type.
* Here, I might use a type cast, plus valid int range check.
*/
$id = $t->cleanId($k->id); //I would include a type cast: (int)
// Similarly for other values
//etc.
// Then save validated data to the array(s)
$data[$id] = array($values...);
/* Now would also be a good time to add the id to the SELECT
* statement
*/
$inTableSQL .= "$id,";
}
$inTableSQL .= ");";
// Execute query here
// Then step through the query ids returned, perform UPDATEs,
// remove the array element once UPDATE is done (use prepared statements)
foreach (.....
/* Then, insert the remaining rows all at once...
* You'll have to step through the remaining array elements to
* prepare the statement.
*/
foreach(.....
} //end initial POST data if
/* Everything below here becomes irrelevant */
foreach ($nv as $k) {
    $id = $t->clean($k->id);
    // ... goes on for about 10 keys
    // this might seem redundant or insufficient
    $id = $c->real_escape_string($id);
    // ... goes on for the rest of keys...
    $q = $c->query("SELECT * FROM table WHERE id = '$id'");
    $r = $q->fetch_row();
    if ($r[1] > 0) {
        // Item exists in DB, so just UPDATE
        $q1 = $c->query(UPDATE TABLE1);
        $q4 = $c->query(UPDATE TABLE2);
        if ($x == 1) {
            $q2 = $c->query(SELECT);
            $rq = $q2->fetch_row();
            if ($rq[0] > 0) {
                // Item already in table, just update
                $q3 = $c->query(UPDATE TABLE3);
            } else {
                // Item not in table, so INSERT
                $q3 = $c->query(INSERT TABLE3);
            }
        }
    } else {
        // Item not in DB, so INSERT
        $q1 = $c->query(INSERT TABLE1);
        $q4 = $c->query(INSERT TABLE2);
        $q3 = $c->query(INSERT TABLE4);
        if ($x == 1) {
            $q5 = $c->query(INSERT TABLE3);
        }
    }
}
The key is to minimize queries. Often, where you are looping over data doing one or more queries per iteration, you can replace it with a constant number of queries. In your case, you'll want to rewrite it into something like this:
include('db.php');
include('tools.php');
$c = new GetDB(); // Connection to DB
$t = new Tools(); // Classes to clean, prevent XSS and others
if (isset($_POST['var'])) {
    $nv = json_decode($_POST['var']);
    $table1_data = array();
    $table2_data = array();
    $table3_data = array();
    $table4_data = array();
    foreach ($nv as $k) {
        $id = $t->clean($k->id);
        // ... goes on for about 10 keys
        // this might seem redundant or insufficient
        $id = $c->real_escape_string($id);
        // ... goes on for the rest of keys...
        $table1_data[] = array( ... );
        $table2_data[] = array( ... );
        $table4_data[] = array( ... );
        if ($x == 1) {
            $table3_data[] = array( ... );
        }
    }
    $values = array_to_sql($table1_data);
    $c->query("INSERT INTO TABLE1 (...) VALUES $values ON DUPLICATE KEY UPDATE ...");
    $values = array_to_sql($table2_data);
    $c->query("INSERT INTO TABLE2 (...) VALUES $values ON DUPLICATE KEY UPDATE ...");
    $values = array_to_sql($table3_data);
    $c->query("INSERT INTO TABLE3 (...) VALUES $values ON DUPLICATE KEY UPDATE ...");
    $values = array_to_sql($table4_data);
    $c->query("INSERT IGNORE INTO TABLE4 (...) VALUES $values");
}
While your original code executed between 3 and 5 queries per row of your input data, the above code only executes 4 queries in total.
I leave the implementation of array_to_sql to the reader, but hopefully this should explain the idea. TABLE4 is an INSERT IGNORE here since you didn't have an UPDATE in the "found" clause of your original loop.
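Since array_to_sql is left as an exercise, here is one possible sketch. It assumes the connection object exposes mysqli's real_escape_string (as $c appears to above); note that I pass the connection in explicitly, whereas the answer's version presumably closes over it:

// Turn array(array('a','b'), array('c','d')) into ('a','b'),('c','d')
function array_to_sql(array $rows, $c)
{
    $groups = array();
    foreach ($rows as $row) {
        $escaped = array();
        foreach ($row as $value) {
            // Escape each value; NULLs become SQL NULL.
            $escaped[] = ($value === null)
                ? 'NULL'
                : "'" . $c->real_escape_string((string) $value) . "'";
        }
        $groups[] = '(' . implode(',', $escaped) . ')';
    }
    return implode(',', $groups);
}

Also note that ON DUPLICATE KEY UPDATE only does what you want if TABLE1 through TABLE3 have a PRIMARY or UNIQUE key on the id column.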

While loop to gather data from database until array contains a certain value

I'm trying to create a while loop in PHP which retrieves data from a database and puts it into an array. This while loop should only run until the array it's filling contains a certain value.
Is there a way to scan through the array and look for the value while the loop is still busy?
To put it bluntly:
$array = array();
$sql = mysql_query("SELECT * FROM table");
while ($row = mysql_fetch_array($sql)) {
    // Do stuff
    // add it to the array
    while ($array !=) // <-- I need to check the array here
}
You can use the in_array function and the break statement to check whether the value is in the array and then stop looping.
First off, I think it'd be easier to check what you're filling the array with instead of checking the array itself. As the filled array grows, searching it will take longer and longer. Instead, consider:
$array = array_merge($array, $row);
if (in_array('ThisisWhatIneed', $row)) {
    break; // leaves the while-loop
}
However, if your query is returning more data than you need, consider changing it to return only what you need, and process only the data that needs to be processed. Otherwise, you might end up with code that does something like this:
$stmt = $db->query('SELECT * FROM tbl');
while($row = $stmt->fetch(PDO::FETCH_ASSOC))
{
if ($row['dataField'] === 'username')
{
$user = $row;
break;
}
}
A WHERE clause could help a lot here, don't you think? As would taking advantage of MySQL's SELECT syntax to select only the fields you need FROM the table, which is more efficient.
You may also have noticed that the code above uses PDO, not mysql_*. Why? Simply because the mysql_* extension is deprecated and should not be used anymore.
Read what the red warning boxes tell you on every mysql_* page of the manual. They're not just there to add some colour and liven things up. They are genuine warnings.
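To make that concrete, a minimal sketch of the WHERE-based version with PDO; the table and column names are placeholders carried over from the snippet above:

// Let the database do the filtering instead of scanning rows in PHP.
$stmt = $db->prepare('SELECT * FROM tbl WHERE dataField = :needle LIMIT 1');
$stmt->execute([':needle' => 'username']);
$user = $stmt->fetch(PDO::FETCH_ASSOC); // false if no row matched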
Why don't you just check each value when it gets inserted into the array? That is much more efficient than iterating over the whole array each time you want to check.
$array = array();
$stopValue = ...;
$sql = mysql_query("SELECT * FROM table");
while ($row = mysql_fetch_assoc($sql)) {
    array_push($array, $row['column']);
    if ($row['column'] == $stopValue) {
        // The array now contains the stop value
        break;
    }
}

How to output multiple rows from an SQL query using the mysqli object

Assuming that the mysqli object is already instantiated (and connected) as the global variable $mysql, here is the code I am trying to work with.
class Listing {
    private $mysql;

    function getListingInfo($l_id = "", $category = "", $subcategory = "", $username = "", $status = "active") {
        $condition = "`status` = '$status'";
        if (!empty($l_id)) $condition .= " AND `L_ID` = '$l_id'";
        if (!empty($category)) $condition .= " AND `category` = '$category'";
        if (!empty($subcategory)) $condition .= " AND `subcategory` = '$subcategory'";
        if (!empty($username)) $condition .= " AND `username` = '$username'";
        $result = $this->mysql->query("SELECT * FROM listing WHERE $condition") or die('Error fetching values');
        $this->listing = $result->fetch_array() or die('could not create object');
        foreach ($this->listing as $key => $value) :
            $info[$key] = stripslashes(html_entity_decode($value));
        endforeach;
        return $info;
    }
}
There are several hundred listings in the db, and when I call $result->fetch_array() it puts the first row of the result set into an array.
However, when I call the method in a loop, I can't seem to access more than the first row.
for instance:
$listing_row = new Listing;
while ($listing = $listing_row->getListingInfo()) {
    echo $listing[0];
}
This outputs the same row from the db in an infinite loop. Why does it not advance to the next row?
If I move this code:
$this->listing = $result->fetch_array() or die('could not create object');
foreach ($this->listing as $key => $value) :
$info[$key] = stripslashes(html_entity_decode($value));
endforeach;
outside the class, it works exactly as expected, outputting one row at a time while looping through the while statement.
Is there a way to write this so that I can keep the fetch_array() call in the class and still loop through the records?
Your object is fundamentally flawed: it re-runs the query every time you call the getListingInfo() method. As well, fetch_array() does not fetch the entire result set, it only fetches the next row, so your method boils down to:
run query
fetch first row
process first row
return first row
Each call to the object creates a new query, a new result set, and therefore will never be able to fetch the 2nd, 3rd, etc... rows.
Unless your data set is "huge" (i.e. bigger than you want to, or can, set the PHP memory_limit), there's no reason NOT to fetch the entire set and process it all in one go, as shown in Jacob's answer above.
As a side note, the use of stripslashes makes me wonder if your PHP installation has magic_quotes_gpc enabled. That functionality has long been deprecated and will be removed from PHP whenever v6.0 comes out. If your code runs as-is on such an installation, it may trash legitimate escaping in the data. As well, it's generally a poor idea to store encoded/escaped data in the database. The DB should contain a "virgin" copy of the data, which you then process (escape, quote, etc...) as needed at the point you need the processed version.
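For what it's worth, here is a minimal sketch of a reworked class that runs the query once and returns all rows, so the caller simply iterates over the result. It keeps the question's column names; the constructor injection and the trimmed-down WHERE clause are illustrative choices:

class Listing {
    private $mysql;

    public function __construct(mysqli $mysql) {
        $this->mysql = $mysql;
    }

    // Run the query once and return ALL rows, instead of re-querying per call.
    public function getListingInfo($status = "active") {
        $status = $this->mysql->real_escape_string($status);
        $result = $this->mysql->query("SELECT * FROM listing WHERE `status` = '$status'")
            or die('Error fetching values');

        $rows = array();
        while ($row = $result->fetch_assoc()) {
            foreach ($row as $key => $value) {
                $row[$key] = stripslashes(html_entity_decode($value));
            }
            $rows[] = $row;
        }
        return $rows;
    }
}

// Usage: iterate over the returned rows, not over repeated method calls.
$listing_row = new Listing($mysql);
foreach ($listing_row->getListingInfo() as $listing) {
    echo $listing['L_ID'];
}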
