Unfortunately I can't show you the actual code, but I can give you an idea of what it looks like, what it does, and what problem I have...
<?php
include('db.php');
include('tools.php');
$c = new GetDB(); // Connection to DB
$t = new Tools(); // Classes to clean, prevent XSS and others
if(isset($_POST['var'])){
$nv = json_decode($_POST['var']);
foreach($nv as $k) {
$id = $t->clean($k->id);
// ... goes on for about 10 keys
// this might seem redundant or insufficient
$id = $c->real_escape_string($id);
// ... goes on for the rest of keys...
$q = $c->query("SELECT * FROM table WHERE id = '$id'");
$r = $q->fetch_row();
if ($r[1] > 0) {
// Item exist in DB then just UPDATE
$q1 = $c->query(UPDATE TABLE1);
$q4 = $c->query(UPDATE TABLE2);
if ($x == 1) {
$q2 = $c->query(SELECT);
$rq = $q2->fetch_row();
if ($rq[0] > 0) {
// Item already in table just update
$q3 = $c->query(UPDATE TABLE3);
} else {
// Item not in table then INSERT
$q3 = $c->query(INSERT TABLE3);
}
}
} else {
// Item not in DB then Insert
$q1 = $c->query(INSERT TABLE1);
$q4 = $c->query(INSERT TABLE2);
$q3 = $c->query(INSERT TABLE4);
if($x == 1) {
$q5 = $c->query(INSERT TABLE3);
}
}
}
}
As you can see, it's a very basic INSERT/UPDATE script. Before releasing it to full production we ran some tests to see that the script works as it should, and the "results" were excellent...
We ran this code against 100 requests and everything went just fine: less than 1.7 seconds for the 100 requests. But then we saw the amount of data that needed to be sent/POSTed, and it was a jaw drop for me: over 20K items. It takes about 3 to 5 minutes to send the POST, but the script always crashes. The "data" is an array in JSON:
array (
[0] => array (
[id] => 1,
[val2] => 1,
[val3] => 1,
[val4] => 1,
[val5] => 1,
[val6] => 1,
[val7] => 1,
[val8] => 1,
[val9] => 1,
[val10] => 1
),
[1] => array (
[id] => 2,
[val2] => 2,
[val3] => 2,
[val4] => 2,
[val5] => 2,
[val6] => 2,
[val7] => 2,
[val8] => 2,
[val9] => 2,
[val10] => 2
),
//... about 10 to 20K entries, depending on the day and time
)
but in JSON... Anyway, sending this information is not the problem; like I said, it can take about 3 to 5 minutes. The problem is the code that receives the data and runs the queries. On normal shared hosting we get a 503 error, which debugging showed to be a timeout. On our VPS we can raise max_execution_time to whatever we need, and processing 10K+ items takes about 1 hour, but on shared hosting we can't change max_execution_time. So I asked the other developer, the one sending the information, to send batches of 1K instead of 10K+ in one blow, let it rest for a second, then send another batch, and so on. So far I haven't gotten any answer. I was thinking of doing the "pause" on my end, say, after processing 1K items wait for a second and then continue, but I don't see that as being as efficient as receiving the data in batches. How would you solve this?
Sorry, I don't have enough reputation to comment everywhere yet, so I have to write this as an answer. I would recommend zedfoxus' method of batch processing above. In addition, I would highly recommend figuring out a way to process those queries faster. Keep in mind that every single PHP function call, etc. gets multiplied by every row of data. Here are just a couple of the ways you might be able to get better performance:
Use prepared statements. This lets MySQL parse and plan the statement once and reuse that work for each successive execution. This is really important.
If you use prepared statements, then you can drop the $c->real_escape_string() calls. I would also scratch my head to see what you can safely leave out of the $t->clean() method.
Next, I would question the cost of evaluating every single row individually. I'd have to benchmark it to be sure, but I think running a few PHP statements beforehand will be faster than making umpteen unnecessary MySQL SELECT and UPDATE calls. MySQL is much faster when inserting multiple rows at a time. If you expect multiple rows of your input to be changing the same row in the database, then you might want to consider the following:
a. Think about creating a temporary, precompiled array (depending on memory usage involved) that stores the unique rows of data. I would also consider doing the same for the secondary TABLE3. This would eliminate needless "update" queries, and make part b possible.
b. Consider a single query that selects every id from the database that's in the array. This will be the list of items to use an UPDATE query for. Update each of these rows, removing them from the temporary array as you go. Then, you can create a single, multi-row insert statement (prepared, of course), that does all of the inserts at a single time.
Take a look at optimizing your MySQL server parameters to better handle the load.
I don't know if this would speed up a prepared INSERT statement at all, but it might be worth a try. You can wrap the INSERT statement within a transaction as detailed in an answer here: MySQL multiple insert performance
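Roughly, a prepared statement reused inside a transaction could look like the sketch below. This is only an illustration: it assumes a PDO connection in $pdo and a table TABLE1 with columns id and val2, so adjust the names and columns to your real schema.
$rows = array(array(1, 10), array(2, 20), array(3, 30)); // already-cleaned data
$pdo->beginTransaction();
try {
    // Prepared once, executed for every row; MySQL reuses the parsed statement.
    $stmt = $pdo->prepare('INSERT INTO TABLE1 (id, val2) VALUES (?, ?)
                           ON DUPLICATE KEY UPDATE val2 = VALUES(val2)');
    foreach ($rows as $row) {
        $stmt->execute($row);
    }
    $pdo->commit();   // the rows only become permanent here
} catch (Exception $e) {
    $pdo->rollBack(); // on any failure, none of the rows are written
    throw $e;
}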
I hope that helps, and if anyone else has some suggestions, just post them in the comments and I'll try to include them.
Here's a look at the original code with just a few suggestions for changes:
<?php
/* You can make sure that the connection type is persistent and
* I personally prefer using the PDO driver.
*/
include('db.php');
/* Definitely think twice about each tool that is included.
* Only include what you need to evaluate the submitted data.
*/
include('tools.php');
$c = new GetDB(); // Connection to DB
/* Take a look at optimizing the code in the Tools class.
* Avoid any and all kinds of loops: this code is going to be used in
* a loop and could easily turn into an O(n^2) performance drain.
* Minimize the amount of string manipulation requests.
* Optimize regular expressions.
*/
$t = new Tools(); // Classes to clean, prevent XSS and others
if(isset($_POST['var'])){ // !empty() catches more cases than isset()
$nv = json_decode($_POST['var']);
/* LOOP LOGIC
* Definitely test my hypothesis yourself, but this is similar
* to what I would try first.
*/
//Row in database query
$inTableSQL = "SELECT id FROM TABLE1 WHERE id IN("; //keep adding to it
foreach ($nv as $k) {
/* I would personally use specific methods per data type.
* Here, I might use a type cast, plus valid int range check.
*/
$id = $t->cleanId($k->id); //I would include a type cast: (int)
// Similarly for other values
//etc.
// Then save validated data to the array(s)
$data[$id] = array($values...);
/* Now would also be a good time to add the id to the SELECT
* statement
*/
$inTableSQL .= "$id,";
}
$inTableSQL = rtrim($inTableSQL, ',') . ");";
// Execute query here
// Then step through the query ids returned, perform UPDATEs,
// remove the array element once UPDATE is done (use prepared statements)
foreach (.....
/* Then, insert the remaining rows all at once...
* You'll have to step through the remaining array elements to
* prepare the statement.
*/
foreach(.....
} //end initial POST data if
/* Everything below here becomes irrelevant */
foreach($nv as $k) {
$id = $t->clean($k->id);
// ... goes on for about 10 keys
// this might seem redundant or insufficient
$id = $c->real_escape_string($id);
// ... goes on for the rest of keys...
$q = $c->query("SELECT * FROM table WHERE id = '$id'");
$r = $q->fetch_row();
if ($r[1] > 0) {
// Item exist in DB then just UPDATE
$q1 = $c->query(UPDATE TABLE1);
$q4 = $c->query(UPDATE TABLE2);
if ($x == 1) {
$q2 = $c->query(SELECT);
$rq = $q2->fetch_row();
if ($rq[0] > 0) {
// Item already in table just update
$q3 = $c->query(UPDATE TABLE3);
} else {
// Item not in table then INSERT
$q3 = $c->query(INSERT TABLE3);
}
}
} else {
// Item not in DB then Insert
$q1 = $c->query(INSERT TABLE1);
$q4 = $c->query(INSERT TABLE2);
$q3 = $c->query(INSERT TABLE4);
if($x == 1) {
$q5 = $c->query(INSERT TABLE3);
}
}
}
}
The key is to minimize queries. Often, where you are looping over data doing one or more queries per iteration, you can replace it with a constant number of queries. In your case, you'll want to rewrite it into something like this:
include('db.php');
include('tools.php');
$c = new GetDB(); // Connection to DB
$t = new Tools(); // Classes to clean, prevent XSS and others
if(isset($_POST['var'])){
$nv = json_decode($_POST['var']);
$table1_data = array();
$table2_data = array();
$table3_data = array();
$table4_data = array();
foreach($nv as $k) {
$id = $t->clean($k->id);
// ... goes on for about 10 keys
// this might seem redundant or insufficient
$id = $c->real_escape_string($id);
// ... goes on for the rest of keys...
$table1_data[] = array( ... );
$table2_data[] = array( ... );
$table4_data[] = array( ... );
if ($x == 1) {
$table3_data[] = array( ... );
}
}
$values = array_to_sql($table1_data);
$c->query("INSERT INTO TABLE1 (...) VALUES $values ON DUPLICATE KEY UPDATE ...");
$values = array_to_sql($table2_data);
$c->query("INSERT INTO TABLE2 (...) VALUES $values ON DUPLICATE KEY UPDATE ...");
$values = array_to_sql($table3_data);
$c->query("INSERT INTO TABLE3 (...) VALUES $values ON DUPLICATE KEY UPDATE ...");
$values = array_to_sql($table4_data);
$c->query("INSERT IGNORE INTO TABLE4 (...) VALUES $values");
}
While your original code executed between 3 and 5 queries per row of your input data, the above code only executes 4 queries in total.
I leave the implementation of array_to_sql to the reader, but hopefully this should explain the idea. TABLE4 is an INSERT IGNORE here since you didn't have an UPDATE in the "found" clause of your original loop.
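For what it's worth, here is one possible shape for array_to_sql. It is only a sketch: it takes the connection as an extra argument so it can escape each value (so the calls above would become array_to_sql($c, $table1_data); adjust to however your GetDB class exposes escaping), and it assumes every row carries its values in the same column order.
function array_to_sql($c, array $rows) {
    $groups = array();
    foreach ($rows as $row) {
        $escaped = array();
        foreach ($row as $value) {
            $escaped[] = "'" . $c->real_escape_string($value) . "'";
        }
        $groups[] = '(' . implode(',', $escaped) . ')';
    }
    return implode(',', $groups); // e.g. ('1','a'),('2','b'),('3','c')
}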
This question already has answers here: Is there a way to fetch associative array grouped by the values of a specified column with PDO?
I want to send the overdue tasks that are assigned to a specific employee as an email summary. It is possible that multiple todos are assigned to the same employee, so multiple todos can be assigned to the same employee AND be overdue. That's where the problem starts...
So what I did is grab all the overdue tasks from the database and then grab the employees assigned to those tasks. I created an array that consists of the todoID, the employeeID and the employeeEmail. Now, is there a better way to do this, and if not, how can I group the rows by email address?
The end result should be an array that shows every overdue todo that's assigned to one employee.
// Get all Todos that are not archived
$sql = "SELECT * FROM todo WHERE archiv = 0";
$abfrage = $db->prepare($sql);
$abfrage->execute();
$overdue_array = array();
// Now we get everything that's overdue from the Database
while ($row = $abfrage->fetch()) {
if ($row["status"] !== 3) {
if ($row["archiv"] !== 1) {
if ($row["durchfuehrung"]) {
if (strtotime($row["durchfuehrung"]) < strtotime(date("Y-m-d"))) {
// Here we now get the email of the employee assigned to the todo
$sql2 = "SELECT email FROM mitarbeiter WHERE mitarbeiterID = :mitarbeiterFID";
$abfrage2 = $db->prepare($sql2);
$abfrage2->bindParam("mitarbeiterFID", $row["mitarbeiterFID"]);
$abfrage2->execute();
while ($row2 = $abfrage2->fetch()) {
$overdue_array[] = array("todoID" => $row["todoID"], "mitarbeiterID" => $row["mitarbeiterFID"], "mitarbeiterEmail" => $row2["email"]);
}
}
}
}
}
}
The result is the following (one entry per overdue todo, holding todoID, mitarbeiterID and mitarbeiterEmail):
Let's pretty up your scripting with some best practices...
only add columns to your SELECT clause when you have a use for them
enjoy PDO's very handy fetching modes -- FETCH_GROUP is perfect for your case.
always endeavor to minimize trips to the database
always endeavor to minimize the number of loops that you use.
Recommended code (yes, it is just that simple):
$sql = "SELECT email, todoID, mitarbeiterFID
FROM todo
JOIN mitarbeiter ON mitarbeiterID = mitarbeiterFID
WHERE archiv = 0
AND status != 3
AND durchfuehrung < CURRENT_DATE";
foreach ($db->query($sql)->fetchAll(PDO::FETCH_GROUP) as $email => $rows) {
sendSummary($email, $rows, $company, $db);
}
For the record, I don't know where $company comes from.
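To illustrate what PDO::FETCH_GROUP hands you (hypothetical values, column names taken from the query above, and assuming the connection's default fetch mode is associative): the first selected column, email, becomes the key and the remaining columns become the grouped rows.
$grouped = [
    'anna@example.com' => [
        ['todoID' => 7,  'mitarbeiterFID' => 3],
        ['todoID' => 12, 'mitarbeiterFID' => 3],
    ],
    'bernd@example.com' => [
        ['todoID' => 9,  'mitarbeiterFID' => 5],
    ],
];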
You could greatly improve your query with a JOIN (as #mickmackusa said) and even structure your response with PDO fetch modes (like PDO::FETCH_GROUP or PDO::FETCH_ASSOC). You could directly get the result you want with one (bigger but better) request.
Nevertheless, if you simply want to group your array with PHP, a foreach can do the job.
$newArray = [];
foreach ($array as $todo) {
if (!isset($newArray[$todo["mitarbeiterEmail"]])) {
// Here we create a new array with the email as a key and put the two first key-values in it with array_slice
$newArray[$todo["mitarbeiterEmail"]] = [array_slice($todo, 0, 2)];
} else {
// Here we push another todo if the email is already declared as a key
array_push($newArray[$todo["mitarbeiterEmail"]], array_slice($todo, 0, 2));
}
}
// Check your new array
print_r($newArray);
You could also avoid the explicit foreach by combining array_multisort to sort by emails and then array_reduce to merge duplicate entries while keeping the associated data. Many solutions can be proposed.
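For example, array_reduce alone can produce the same keyed grouping (a sketch using the same $array of rows as above):
$grouped = array_reduce($array, function ($carry, $todo) {
    // key by email, keep only todoID and mitarbeiterID for each entry
    $carry[$todo["mitarbeiterEmail"]][] = array_slice($todo, 0, 2);
    return $carry;
}, []);
print_r($grouped);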
I store the array in the session so it is easy to retrieve and work with.
$responses = session('get_all_response');
$responses contains 30 records maximum.
I'm aiming to make pushing data into the array faster, because with even 10 records in $responses it takes 30 seconds to load all the related info for each entry of the array (in practice the array holds about 30 records at most).
I loop over the array:
foreach($responses as $res)
{
$bo_images = DB::select('SELECT
image.bo_hotel_code,
image.bo_image_type_code,
image.bo_path,
imagetypes.bo_content_imagetype_description
FROM
bo_images AS image
RIGHT JOIN bo_content_imagetypes AS imagetypes
ON imagetypes.bo_content_imagetype_code = image.bo_image_type_code
WHERE image.bo_hotel_code = "'.$res['code'].'" AND image.bo_image_type_code = "COM" LIMIT 1');
if($bo_images != null)
{
foreach($bo_images as $row)
{
$responses[$res['code']]['information']['bo_images'] = array(
'image_type_code' => $row->bo_image_type_code,
'image_path' => 'http://photos.hotelbeds.com/giata/'.$row->bo_path,
'image_type_description' => $row->bo_content_imagetype_description,
);
}
}
$bo_categories = DB::select('SELECT
a.category_code,
b.bo_content_category_description
FROM
bo_hotel_contents AS a
RIGHT JOIN bo_content_categories AS b
ON b.bo_content_category_code = a.category_code
WHERE a.hotel_code= "'.$res['code'].'"');
if($bo_categories != null)
{
foreach($bo_categories as $row)
{
$responses[$res['code']]['information']['rating'] = array(
'description' => $row->bo_content_category_description,
);
}
}
}
In every loop iteration there is a code value that is used as the key to fetch the related content from the database.
Then the content is pushed into the array entry with the matching index.
Other than that it works, but I know this is not the proper way of doing it; I know there must be a much better way.
Any help is much appreciated.
I'm not familiar with Laravel, so I don't know if prepared statements work with it, but you should do something to clean and/or verify $res['code'] to make sure it is an integer, assuming that's what it's supposed to be.
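For example, a minimal safeguard could be (just a sketch, assuming the codes are plain integers):
$code = (int) $res['code']; // or skip bad rows: if (!ctype_digit((string) $res['code'])) continue;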
First, prepare a string for a WHERE IN clause.
$str = "";
foreach ($responses as $res){
$str .= ','.$res['code'];
}
$str = substr($str,1); // to remove the comma
Then you'll need to change your query to use the IN statement.
WHERE a.hotel_code IN({$str})
I'm guessing image.bo_hotel_code refers to $res['code']. But in case it doesn't, you could modify your SELECT statement (if memory serves):
$code = $res['code'];
SELECT {$code} as code,
image.bo_hotel_code,
image.bo_image_type_code,
...
Then you'll loop over the results and put them into the array in the same manner, where $row['code'] would refer to the code used to select it. It should be MUCH faster than running repeated queries, and there should be one row for each code in the IN statement.
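If Laravel's query bindings work the way I'd expect (DB::select() appears to accept a bindings array as its second argument), the image part of the loop might collapse into a single query along these lines. This is only a sketch: it keeps your column names, drops the original LIMIT 1 (if a hotel has several 'COM' rows the last one wins, just as in your inner foreach), and assumes $responses is used the same way as in your code.
$codes = array_column($responses, 'code');
$placeholders = implode(',', array_fill(0, count($codes), '?'));

$bo_images = DB::select("SELECT
        image.bo_hotel_code,
        image.bo_image_type_code,
        image.bo_path,
        imagetypes.bo_content_imagetype_description
    FROM bo_images AS image
    RIGHT JOIN bo_content_imagetypes AS imagetypes
        ON imagetypes.bo_content_imagetype_code = image.bo_image_type_code
    WHERE image.bo_hotel_code IN ($placeholders)
      AND image.bo_image_type_code = 'COM'", $codes);

foreach ($bo_images as $row) {
    $code = $row->bo_hotel_code; // maps the row back to $res['code']
    $responses[$code]['information']['bo_images'] = array(
        'image_type_code'        => $row->bo_image_type_code,
        'image_path'             => 'http://photos.hotelbeds.com/giata/' . $row->bo_path,
        'image_type_description' => $row->bo_content_imagetype_description,
    );
}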
I have 50 databases with the same design and structure. For benchmarking data analysis and evaluation of the several thousand of variables contained therein I need to build SQL queries. I have a SuperAdmin account who has the permission to view all these databases through an SSH access.
Now, I can write SQL queries combining data from one database after another using UNION ALL like this (simplified):
SELECT * FROM db1.table WHERE blah
UNION ALL
SELECT * FROM db2.table WHERE blah
UNION ALL
[...]
SELECT * FROM db50.table WHERE blah
These queries are within a PHP environment and work like a charm - I get all data analyses and evaluations I want.
But:
A single SQL query in my case, however, has about 1,000 lines of query statement and conditional code. Querying 50 databases this sequential way means the query becomes huge, and I have to do a lot of error-prone copying of the SQL statement, adjusting "db1" through "db50" every single time, plus inside a lot of JOINs as well, so it becomes a pain in the you-know-where and very time-consuming.
My question:
Isn't there a way to tell PHP to put all the "dbx" names into an array and then repeat the one basic SQL query, swapping the "dbx" from 1 to 50 using a foreach or while or similar statement?
Such as:
<?php
function getResultValuesFromDatabase() {
global $DATABASE;
$db1 = "database1";
$db2 = "database2";
[...]
$db50 = "database50";
$no_of_databases = array($db1, $db2, [...], $db50);
foreach ($no_of_databases as $value) {
$sql_query = "SELECT * FROM $value.table WHERE blah";
}
$rQuery = mysql_query($qQuery, $oDatabase);
$result = array("count" => mysql_num_rows($rQuery ), "result" => $rQuery);
return $result;
}
I receive an error which affects the reiteration process. Can anybody please tell me where my logical error is?
You don't need an array of the names. Since they are all similar, a simple for loop should work for you. You also need to run the query inside the loop and accumulate the results before you return them.
$result = array();
for ($db = 1; $db <= 50; $db++) {
$sql_query = 'SELECT * FROM database' . str_pad($db,2,'0',STR_PAD_LEFT) . '.table WHERE blah';
$rQuery = mysql_query($sql_query, $oDatabase);
$result[] = array("count" => mysql_num_rows($rQuery), "result" => $rQuery);
}
return $result;
Of course you can do it, but you must select the right database before running each query.
Also, please note that the mysql_* functions are deprecated, and you should consider switching to mysqli or PDO.
Here is how you could do it:
function getResultValuesFromDatabase()
{
$result = [];
$con = mysqli_connect("localhost","my_user","my_password");
for ($i = 1; $i <= 50; $i++) {
$databaseName = 'database' . str_pad($i, 2, '0', STR_PAD_LEFT);
mysqli_select_db($con, $databaseName);
// Probably the database doesn't exist
if (mysqli_errno($con) !== 0) {
continue;
}
$queryResult = mysqli_query($con, "SELECT * FROM table WHERE blah");
$result[$databaseName] = [
'count' => mysqli_num_rows($queryResult),
'result' => mysqli_fetch_all($queryResult, MYSQLI_ASSOC)
];
mysqli_free_result($queryResult);
}
mysqli_close($con);
return $result;
}
I have a bunch of photos on a page and using jQuery UI's Sortable plugin, to allow for them to be reordered.
When my sortable function fires, it writes a new order sequence:
1030:0,1031:1,1032:2,1040:3,1033:4
Each item of the comma delimited string, consists of the photo ID and the order position, separated by a colon. When the user has completely finished their reordering, I'm posting this order sequence to a PHP page via AJAX, to store the changes in the database. Here's where I get into trouble.
I have no problem getting my script to work, but I'm pretty sure it's the incorrect way to achieve what I want and will suffer hugely in performance and resources. I'm hoping somebody could advise me as to what would be the best approach.
This is my PHP script that deals with the sequence:
if ($sorted_order) {
$exploded_order = explode(',',$sorted_order);
foreach ($exploded_order as $order_part) {
$exploded_part = explode(':',$order_part);
$part_count = 0;
foreach ($exploded_part as $part) {
$part_count++;
if ($part_count == 1) {
$photo_id = $part;
} elseif ($part_count == 2) {
$order = $part;
}
$SQL = "UPDATE article_photos ";
$SQL .= "SET order_pos = :order_pos ";
$SQL .= "WHERE photo_id = :photo_id;";
... rest of PDO stuff ...
}
}
}
My concerns arise from the nested foreach loops and from running so many database updates. If a given sequence contained 150 items, would this script cry for help? If it would, how could I improve it?
** This is for an admin page, so it won't be heavily abused **
You can use one UPDATE with some clever code, like so:
Build the $data['order'] array (position => photo ID) in your loop, then:
$q = "UPDATE article_photos SET order_pos = (CASE photo_id ";
foreach($data['order'] as $sort => $id){
$q .= " WHEN {$id} THEN {$sort}";
}
$q .= " END ) WHERE photo_id IN (".implode(",",$data['order']).")";
A little clearer, perhaps:
UPDATE article_photos SET order_pos = (CASE photo_id
WHEN 1 THEN 999
WHEN 2 THEN 1000
WHEN 3 THEN 1001
END)
WHERE photo_id IN (1,2,3)
I use this approach for exactly what you're doing: updating sort orders.
No need for the second foreach: you know it's going to be two parts if your data passes validation (I'm assuming you validated this; if not, you should =) so just do:
if (count($exploded_part) == 2) {
$id = $exploded_part[0];
$seq = $exploded_part[1];
/* rest of code */
} else {
/* error - data does not conform despite validation */
}
As for update hammering: do your DB updates in a transaction. Your db will queue the ops, but not commit them to the main DB until you commit the transaction, at which point it'll happily do the update "for real" at lightning speed.
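A sketch of what that could look like with PDO, assuming your connection is in $pdo and you've collected the validated pairs into a hypothetical $pairs array (photo_id => order_pos) while parsing the sequence:
$pdo->beginTransaction();
$stmt = $pdo->prepare(
    'UPDATE article_photos SET order_pos = :order_pos WHERE photo_id = :photo_id'
);
foreach ($pairs as $photo_id => $order) {
    $stmt->execute([':order_pos' => $order, ':photo_id' => $photo_id]);
}
$pdo->commit(); // all 150 updates are committed in one go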
I suggest making your script even simpler and renaming the variables so the code is much more readable.
$parts = explode(',',$sorted_order);
foreach ($parts as $part) {
list($id, $position) = explode(':', $part);
//Now you can work with $id and $position ;
}
More info about list: http://php.net/manual/en/function.list.php
Also, about performance and your data structure:
The way you store your data is not perfect, but you will not suffer any performance issues this way; you send less data and have less overhead overall.
However, the drawback of your data structure is that you will most probably be unable to establish relationships between tables, make joins, or alter the table structure in a clean way.
Good evening all.
I'm currently working on a small personal project. Its purpose is to retrieve numerous values from a database on my backend and store them as variables. These variables are then used to modify the appearance of some HTML5 Canvas objects (in this case, I'm using arcs).
Please note that the values in the database are Text, and thus my bind statements refer to that. The queries I'm calling (AVG, MIN, MAX) work fine with the values I've got, as the fields store numerical data (this is merely due to another script that deals with adding or updating the data; that one is already running MySQLi, and using Text was the best solution for my situation).
Now, I achieved what I wanted with standard MySQL queries, but it's messy code and its performance could prove to be terrible as the database grows. For that reason, I want to use loops. I also feel that MySQLi's bind_param would be much better for security. The page doesn't accept ANY user input; it's merely for display, so injection is less of a concern, but at some point in the future I'll be looking to expand it to allow users to control what is displayed.
Here's a sample of my original MySQL PHP code:
$T0A = mysql_query('SELECT AVG(Temp0) FROM VTempStats'); // Average
$T0B = mysql_query('SELECT MIN(Temp0) FROM VTempStats'); // Bottom/MIN
$T0T = mysql_query('SELECT MAX(Temp0) FROM VTempStats'); // Top/MAX
$T1A = mysql_query('SELECT AVG(Temp1) FROM VTempStats'); // Average
$T1B = mysql_query('SELECT MIN(Temp1) FROM VTempStats'); // Bottom/MIN
$T1T = mysql_query('SELECT MAX(Temp1) FROM VTempStats'); // Top/MAX
$r_T0A = mysql_result($T0A, 0);
$r_T0T = mysql_result($T0T, 0);
$r_T0B = mysql_result($T0B, 0);
$r_T1A = mysql_result($T1A, 0);
$r_T1T = mysql_result($T1T, 0);
$r_T1B = mysql_result($T1B, 0);
if ($r_T0A == "" ) {$r_T0A = 0;}
if ($r_T1A == "" ) {$r_T1A = 0;}
if ($r_T0B == "" ) {$r_T0B = 0;}
if ($r_T1B == "" ) {$r_T1B = 0;}
if ($r_T0T == "" ) {$r_T0T = 0;}
if ($r_T1T == "" ) {$r_T1T = 0;}
That's shorter than the original, as there are 4x3 sets of queries (Temp0, Temp1, Temp2 and Temp3, with MIN, MAX and AVG for each). Note that the last 6 if statements are merely there to ensure that fields that are null are automatically set to 0 before my canvas script attempts to work with them (see below).
To show that value on the arc, I'd use this in my canvas script (for example):
var endAngle = startAngle + (<?= $r_T0A ?> / 36+0.02);
It worked for me, and what was displayed was exactly what I expected.
Now, in trying to clean up my code and move to a loop and MySQLi, I'm running into problems. Being very new to both SQL and PHP, I could use some assistance.
This is what I tried:
$q_avg = "SELECT AVG(Temp?) FROM VTempStats";
for ($i_avg = 0; $i_avg <= 3; ++$i_avg)
{
if ($s_avg = $mysqli->prepare($q_avg))
{
$s_avg->bind_param('s',$i_avg);
$s_avg->execute();
$s_avg->bind_result($avg);
$s_avg->fetch();
echo $avg;
}
}
Note: $mysqli is the MySQLi connection. I've cut the code down to only show the AVG query loop, but the MIN and MAX loops are nearly identical.
Obviously, that won't work, as it only assigns one variable for each set of queries instead of 4 variables for each loop.
As you can imagine, what I want to do is assign all 12 values to individual variables so that I can work with them in my canvas script. I'm not entirely sure how to go about this, though.
I can echo individual values out through MySQLi, or I can query the database to change or add data through MySQLi, but making a loop that does what I intend with MySQLi (or even MySQL) is something I need help with.
From my reading of your code, you have a fixed number of columns and know their names, and you are applying the AVG(), MIN(), MAX() aggregates to the same table over the same aggregate group, with no WHERE clause applied. Therefore, they can all be done in one query from which you just need to fetch one single row.
SELECT
AVG(Temp0) AS a0,
MIN(Temp0) AS min0,
MAX(Temp0) AS max0,
AVG(Temp1) AS a1,
MIN(Temp1) AS min1,
MAX(Temp1) AS max1,
AVG(Temp2) AS a2,
MIN(Temp2) AS min2,
MAX(Temp2) AS max2,
AVG(Temp3) AS a3,
MIN(Temp3) AS min3,
MAX(Temp3) AS max3
FROM VTempStats
This can be done in a single call to $mysqli->query(), and no parameter binding is necessary so you don't need the overhead of prepare(). One call to fetch_assoc() is needed to retrieve a single row, with columns aliased like a0, min0, max0, etc... as I have done above.
// Fetch one row
$values = $result_resource->fetch_assoc();
print_r($values);
printf("Avg 0: %s, Min 0: %s, Max 0: %s... etc....", $values['a0'], $values['min0'], $values['max0']);
These can be pulled into the global scope with extract(), but I recommend against that. Keeping them in their $values array makes their source more explicit.
As you can imagine, what I want to do is assign all 12 values to individual variables so that I can work with them in my canvas script. I'm not entirely sure how to go about this, though.
Understood. Here is what I would do.
<?php // RAY_temp_scottprichard.php
error_reporting(E_ALL);
echo '<pre>';
// RANGE OF TEMPS
$temps = range(0,3);
// RANGE OF VALUES
$funcs = array
( 'A' => 'AVG'
, 'B' => 'MIN'
, 'T' => 'MAX'
)
;
// CONSTRUCT THE QUERY STRING
$query = 'SELECT ';
foreach ($temps as $t)
{
foreach ($funcs as $key => $func)
{
$query .= PHP_EOL
. $func
. '(Temp'
. $t
. ') AS '
. 'T'
. $t
. $key
. ', '
;
}
}
// REMOVE THE UNWANTED TRAILING COMMA
$query = rtrim($query, ', ');
// ADD THE TABLE NAME
$query .= ' FROM VTempStats';
// ADD ANY ORDER, LIMIT, WHERE CLAUSES HERE
$query .= ' WHERE 1=1';
// SHOW THE WORK PRODUCT
var_dump($query);
See the output query string here: http://www.laprbass.com/RAY_temp_scottpritchard.php
When you run this query, you will fetch one row with mysql_fetch_assoc() or equivalent, and it will have all the variables you want in that row, with named keys. Then you can use something like this to inject the variable names and values into your script: http://php.net/manual/en/function.extract.php
PHP extract() allows the use of a prefix, so you should be able to avoid having to make too many changes to your existing script.
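For example (a sketch, assuming the single row produced by the generated query above was fetched into $values):
extract($values, EXTR_PREFIX_ALL, 'r'); // creates $r_T0A, $r_T0B, $r_T0T, $r_T1A, ...
echo $r_T0A;                            // the same names the original script used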
HTH, ~Ray