I have some data with userid and date.
Sometimes there is a large amount of data that I need to loop through while updating the SQL database, but the database times out.
Is there a better way I can do this? Sample code is below.
foreach($time[$re->userid][$today] as $t){
    if(($re->time >= $t->in_from) && ($re->time < $t->in_to)
        && md5($t->WorkDay."_in".$re->date) != $in){ //in
        $tble = tools::sd("{$t->WorkDay} in");
    }
    if(($re->time >= $t->out_from) && ($re->time < $t->out_to)
        && md5($t->WorkDay."_out".$re->date) != $out){ //out
        $tble = tools::sd("{$t->WorkDay} out");
        if($tble == 'nout'){
            $re->date2 = tools::ndate($re->date . "- 1");
        }
    }
    if(!empty($tble)){
        $q = array(
            "id" => $re->userid,
            "dt" => $re->date2,
            "{$tble}" => $re->time
        );
        dump($q); // insert into sql
    }
}
The dump function:
function dump($d = '')
{
    if (!empty($d)) {
        end($d);
        $tble = key($d);
        $d['ld'] = "{$d['dt']} {$d[$tble]}";
        $r = $GLOBALS['mssqldb']->get_results("
            IF NOT EXISTS (SELECT id, ld, dt, {$tble} FROM clockL
                           WHERE id = '{$d['id']}'
                           AND dt = '{$d['dt']}')
                INSERT INTO clockL (id, ld, dt, {$tble})
                VALUES ('{$d['id']}', '{$d['ld']}', '{$d['dt']}', '{$d[$tble]}')
            ELSE IF EXISTS (SELECT id, {$tble} FROM clockL
                            WHERE id = '{$d['id']}'
                            AND dt = '{$d['dt']}'
                            AND {$tble} = 'NOC')
                UPDATE clockL SET {$tble} = '{$d[$tble]}', ld = '{$d['ld']}'
                WHERE id = '{$d['id']}'
                AND dt = '{$d['dt']}' AND {$tble} = 'NOC'
        ");
        //print_r($GLOBALS['mssqldb']);
    }
}
Thank You.
Do the insert/update outside of the loop. Enclose it in a transaction so that you don't get an inconsistent database state if the script dies prematurely. Using one big query is usually faster than making lots of small queries. You might also set higher values for the time and memory limits, but be aware of the consequences.
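A minimal sketch of that approach, assuming a PDO-style connection in $pdo (hypothetical; the question's $GLOBALS['mssqldb'] wrapper doesn't show its transaction API) and simplifying the dynamic column handling away:

<?php
// Collect rows inside the loop instead of writing them one at a time.
$rows = array();
foreach ($time[$re->userid][$today] as $t) {
    // ... existing in/out logic that builds $q ...
    $rows[] = array($re->userid, $re->date2, $re->time);
}

// One transaction, one prepared statement reused for every row.
$pdo->beginTransaction();
try {
    $stmt = $pdo->prepare("INSERT INTO clockL (id, dt, ld) VALUES (?, ?, ?)");
    foreach ($rows as $row) {
        $stmt->execute($row);
    }
    $pdo->commit();
} catch (Exception $e) {
    $pdo->rollBack(); // no half-written state if the script dies prematurely
    throw $e;
}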
Are you aware of the PHP function set_time_limit()? You can find the detailed documentation in the PHP manual.
It changes the maximum execution time, which defaults to 30 seconds. If you set it to 0, e.g. set_time_limit(0), there is no execution time limit at all.
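For example, at the top of a long-running import script (raising the memory limit as well is an assumption; only do it if you actually need to):

<?php
set_time_limit(0);                // disable the 30-second execution limit
ini_set('memory_limit', '512M');  // optional: raise the memory limit too
// ... long-running loop / import work ...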
The looping may be the reason for the timeout.
When you perform the insert/update operations inside the loop, the connection to the database stays open until the loop terminates, which may cause the timeout problem.
Try doing the insert/update operation outside of the loop.
Unfortunately I can't show you the code, but I can give you an idea of what it looks like, what it does, and what problem I have...
<?php
include('db.php');
include('tools.php');
$c = new GetDB(); // Connection to DB
$t = new Tools(); // Classes to clean, prevent XSS and others
if(isset($_POST['var'])){
    $nv = json_decode($_POST['var']);
    foreach($nv as $k) {
        $id = $t->clean($k->id);
        // ... goes on for about 10 keys
        // this might seem redundant or insufficient
        $id = $c->real_escape_string($id);
        // ... goes on for the rest of the keys...
        $q = $c->query("SELECT * FROM table WHERE id = '$id'");
        $r = $q->fetch_row();
        if ($r[1] > 0) {
            // Item exists in DB, so just UPDATE
            $q1 = $c->query(UPDATE TABLE1);
            $q4 = $c->query(UPDATE TABLE2);
            if ($x == 1) {
                $q2 = $c->query(SELECT);
                $rq = $q2->fetch_row();
                if ($rq[0] > 0) {
                    // Item already in table, just update
                    $q3 = $c->query(UPDATE TABLE3);
                } else {
                    // Item not in table, so INSERT
                    $q3 = $c->query(INSERT TABLE3);
                }
            }
        } else {
            // Item not in DB, so INSERT
            $q1 = $c->query(INSERT TABLE1);
            $q4 = $c->query(INSERT TABLE2);
            $q3 = $c->query(INSERT TABLE4);
            if($x == 1) {
                $q5 = $c->query(INSERT TABLE3);
            }
        }
    }
}
As you can see, it is a very basic INSERT/UPDATE script, so before releasing it to full production we ran some tests to check that the script worked as it should, and the results were excellent...
We ran this code against 100 requests and everything went just fine: less than 1.7 seconds for the 100 requests. But then we saw the amount of data that needed to be sent/posted, and it was a jaw-drop for me: over 20K items. It takes about 3 to 5 minutes to send the POST, but the script always crashes. The "data" is an array in JSON:
array (
    [0] => array (
        [id] => 1,
        [val2] => 1,
        [val3] => 1,
        [val4] => 1,
        [val5] => 1,
        [val6] => 1,
        [val7] => 1,
        [val8] => 1,
        [val9] => 1,
        [val10] => 1
    ),
    [1] => array (
        [id] => 2,
        [val2] => 2,
        [val3] => 2,
        [val4] => 2,
        [val5] => 2,
        [val6] => 2,
        [val7] => 2,
        [val8] => 2,
        [val9] => 2,
        [val10] => 2
    ),
    // ... about 10 to 20K entries, depending on the day and time
)
but in JSON. Anyway, sending this information is not the problem; like I said, it takes about 3 to 5 minutes. The problem is the code that receives the data and runs the queries. On normal shared hosting we get a 503 error, which a debug showed to be a timeout. On our VPS we can increase max_execution_time to whatever we need (processing 10K+ items takes about an hour there), but on shared hosting we can't use max_execution_time. So I asked the other developer, the one sending the information, to send a batch of 1K instead of 10K+ in one blow, let it rest for a second, then send another batch, and so on. So far I haven't gotten any answer. I was also thinking of doing the "pause" on my end, say, after processing 1K items wait for a second and then continue, but I don't see that as being as efficient as receiving the data in batches. How would you solve this?
Sorry, I don't have enough reputation to comment everywhere yet, so I have to write this as an answer. I would recommend zedfoxus' method of batch processing above. In addition, I would highly recommend figuring out a way to process those queries faster. Keep in mind that every single PHP function call, etc. gets multiplied by every row of data. Here are just a couple of the ways you might be able to get better performance:
Use prepared statements. This will allow MySQL to cache the memory operation for each successive query. This is really important.
If you use prepared statements, then you can drop the $c->real_escape_string() calls. I would also scratch my head to see what you can safely leave out of the $t->clean() method.
Next, I would evaluate the performance of processing every single row individually. I'd have to benchmark it to be sure, but I think running a few PHP statements beforehand will be faster than making umpteen unnecessary MySQL SELECT and UPDATE calls. MySQL is much faster at inserting multiple rows at a time. If you expect multiple rows of your input to be changing the same row in the database, then you might want to consider the following:
a. Think about creating a temporary, precompiled array (depending on memory usage involved) that stores the unique rows of data. I would also consider doing the same for the secondary TABLE3. This would eliminate needless "update" queries, and make part b possible.
b. Consider a single query that selects every id from the database that's in the array. This will be the list of items to use an UPDATE query for. Update each of these rows, removing them from the temporary array as you go. Then, you can create a single, multi-row insert statement (prepared, of course), that does all of the inserts at a single time.
Take a look at optimizing your MySQL server parameters to better handle the load.
I don't know if this would speed up a prepared INSERT statement at all, but it might be worth a try. You can wrap the INSERT statement within a transaction as detailed in an answer here: MySQL multiple insert performance
I hope that helps, and if anyone else has some suggestions, just post them in the comments and I'll try to include them.
Here's a look at the original code with just a few suggestions for changes:
<?php
/* You can make sure that the connection type is persistent, and
 * I personally prefer using the PDO driver.
 */
include('db.php');
/* Definitely think twice about each tool that is included.
 * Only include what you need to evaluate the submitted data.
 */
include('tools.php');
$c = new GetDB(); // Connection to DB
/* Take a look at optimizing the code in the Tools class.
 * Avoid any and all kinds of loops; this code is going to be used in
 * a loop and could easily turn into an O(n^2) performance drain.
 * Minimize the amount of string manipulation requests.
 * Optimize regular expressions.
 */
$t = new Tools(); // Classes to clean, prevent XSS and others
if(isset($_POST['var'])){ // !empty() catches more cases than isset()
    $nv = json_decode($_POST['var']);
    /* LOOP LOGIC
     * Definitely test my hypothesis yourself, but this is similar
     * to what I would try first.
     */
    // Row-in-database query
    $inTableSQL = "SELECT id FROM TABLE1 WHERE id IN("; // keep adding to it
    foreach ($nv as $k) {
        /* I would personally use specific methods per data type.
         * Here, I might use a type cast, plus a valid int range check.
         */
        $id = $t->cleanId($k->id); // I would include a type cast: (int)
        // Similarly for other values
        // etc.
        // Then save validated data to the array(s)
        $data[$id] = array($values...);
        /* Now would also be a good time to add the id to the SELECT
         * statement
         */
        $inTableSQL .= "$id,";
    }
    $inTableSQL = rtrim($inTableSQL, ',') . ");"; // drop the trailing comma
    // Execute query here
    // Then step through the query ids returned, perform UPDATEs,
    // remove the array element once UPDATE is done (use prepared statements)
    foreach (.....
    /* Then, insert the remaining rows all at once...
     * You'll have to step through the remaining array elements to
     * prepare the statement.
     */
    foreach(.....
} //end initial POST data if
/* Everything below here becomes irrelevant */
foreach($nv as $k) {
    $id = $t->clean($k->id);
    // ... goes on for about 10 keys
    // this might seem redundant or insufficient
    $id = $c->real_escape_string($id);
    // ... goes on for the rest of the keys...
    $q = $c->query("SELECT * FROM table WHERE id = '$id'");
    $r = $q->fetch_row();
    if ($r[1] > 0) {
        // Item exists in DB, so just UPDATE
        $q1 = $c->query(UPDATE TABLE1);
        $q4 = $c->query(UPDATE TABLE2);
        if ($x == 1) {
            $q2 = $c->query(SELECT);
            $rq = $q2->fetch_row();
            if ($rq[0] > 0) {
                // Item already in table, just update
                $q3 = $c->query(UPDATE TABLE3);
            } else {
                // Item not in table, so INSERT
                $q3 = $c->query(INSERT TABLE3);
            }
        }
    } else {
        // Item not in DB, so INSERT
        $q1 = $c->query(INSERT TABLE1);
        $q4 = $c->query(INSERT TABLE2);
        $q3 = $c->query(INSERT TABLE4);
        if($x == 1) {
            $q5 = $c->query(INSERT TABLE3);
        }
    }
}
The key is to minimize queries. Often, where you are looping over data doing one or more queries per iteration, you can replace it with a constant number of queries. In your case, you'll want to rewrite it into something like this:
include('db.php');
include('tools.php');
$c = new GetDB(); // Connection to DB
$t = new Tools(); // Classes to clean, prevent XSS and others
if(isset($_POST['var'])){
    $nv = json_decode($_POST['var']);
    $table1_data = array();
    $table2_data = array();
    $table3_data = array();
    $table4_data = array();
    foreach($nv as $k) {
        $id = $t->clean($k->id);
        // ... goes on for about 10 keys
        // this might seem redundant or insufficient
        $id = $c->real_escape_string($id);
        // ... goes on for the rest of the keys...
        $table1_data[] = array( ... );
        $table2_data[] = array( ... );
        $table4_data[] = array( ... );
        if ($x == 1) {
            $table3_data[] = array( ... );
        }
    }
    $values = array_to_sql($table1_data);
    $c->query("INSERT INTO TABLE1 (...) VALUES $values ON DUPLICATE KEY UPDATE ...");
    $values = array_to_sql($table2_data);
    $c->query("INSERT INTO TABLE2 (...) VALUES $values ON DUPLICATE KEY UPDATE ...");
    $values = array_to_sql($table3_data);
    $c->query("INSERT INTO TABLE3 (...) VALUES $values ON DUPLICATE KEY UPDATE ...");
    $values = array_to_sql($table4_data);
    $c->query("INSERT IGNORE INTO TABLE4 (...) VALUES $values");
}
While your original code executed between 3 and 5 queries per row of your input data, the above code only executes 4 queries in total.
I leave the implementation of array_to_sql to the reader, but hopefully this should explain the idea. TABLE4 is an INSERT IGNORE here since you didn't have an UPDATE in the "found" clause of your original loop.
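For reference, a minimal sketch of what such an array_to_sql helper might look like. This is not a library function; I'm also passing the connection in explicitly here so the escaping has access to it:

// Hypothetical helper: turns array(array('a', 1), array('b', 2)) into
// "('a','1'),('b','2')" for use in a multi-row VALUES clause.
// $c is the DB connection object exposing real_escape_string().
function array_to_sql(array $rows, $c) {
    $groups = array();
    foreach ($rows as $row) {
        $escaped = array();
        foreach ($row as $v) {
            $escaped[] = "'" . $c->real_escape_string($v) . "'";
        }
        $groups[] = '(' . implode(',', $escaped) . ')';
    }
    return implode(',', $groups);
}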
I am trying to loop over some data coming to me from a SOAP request and insert the records into a custom table in my Drupal install.
At first I created a custom module and used standard mysqli_connect() syntax to connect to the database and loop through the records and insert them. This was working great and fetched and inserted my remote data in about 2 seconds without a hitch.
I then remembered that Drupal has a database API (I am fairly new to Drupal), so I decided to do it right and use the API instead. I converted my code to what I think it should be per the API docs, but now the process takes more like 5 or 6 seconds, sometimes even randomly hangs and doesn't complete at all, and I get weird session errors. The records end up inserting fine, but it just takes forever.
I'm wondering if I am doing it wrong. I would also like to wrap the inserts in a transaction, because I will be deleting ALL of the records in the destination table first and then inserting the new data, and since I am deleting first, I want to be able to roll back if the inserts fail for whatever reason.
I did not add transaction code to my original PHP-only version, but I did attempt it with the Drupal API, although completely removing the transaction/try/catch code doesn't seem to affect the speed or the issues at all.
Anyway here is my original code:
$data = simplexml_load_string($jobsXml);
$connection = mysqli_connect("localhost", "user", "pass", "database");
if (mysqli_connect_errno($connection)) {
    echo "Failed to connect to MySQL: " . mysqli_connect_error();
    exit();
}
// delete all current jobs
mysqli_query($connection, 'TRUNCATE TABLE jobs;');
$recordsInserted = 0;
foreach ($data->NewDataSet->Table as $item) {
    // escape and clean up some fields
    $image = str_replace('http://www.example.com/public/images/job_headers/', '', $item->job_image_file);
    $specialty_description = mysqli_real_escape_string($connection, $item->specialty_description);
    $job_board_title = mysqli_real_escape_string($connection, $item->job_board_title);
    $job_board_subtitle = mysqli_real_escape_string($connection, $item->job_board_subtitle);
    $job_state_code = ($item->job_country_code == 'NZ') ? 'NZ' : $item->job_state_code;
    $sql = "
        INSERT INTO jobs (
            job_number,
            specialty,
            specialty_description,
            division_code,
            job_type,
            job_type_description,
            job_state_code,
            job_country_code,
            job_location_display,
            job_board_type,
            job_image_file,
            job_board_title,
            job_board_subtitle
        ) VALUES (
            $item->job_number,
            '$item->specialty',
            '$specialty_description',
            '$item->division_code',
            '$item->job_type',
            '$item->job_type_description',
            '$job_state_code',
            '$item->job_country_code',
            '$item->job_location_display',
            '$item->job_board_type',
            '$image',
            '$job_board_title',
            '$job_board_subtitle'
        )
    ";
    if (!mysqli_query($connection, $sql)) {
        die('Error: ' . mysqli_error($connection) . $sql);
    }
    $recordsInserted++;
}
mysqli_close($connection);
echo $recordsInserted . ' records inserted';
and this is my Drupal code. Can anyone tell me if I am doing this wrong, or not in the most efficient way?
$data = simplexml_load_string($jobsXml);
// The transaction opens here.
$txn = db_transaction();
// delete all current jobs
$records_deleted = db_delete('jobs')
    ->execute();
$records_inserted = 0;
try {
    $records = array();
    foreach ($data->NewDataSet->Table as $item) {
        $records[] = array(
            'job_number' => $item->job_number,
            'specialty' => $item->specialty,
            'specialty_description' => $item->specialty_description,
            'division_code' => $item->division_code,
            'job_type' => $item->job_type,
            'job_type_description' => $item->job_type_description,
            'job_state_code' => ($item->job_country_code == 'NZ') ? 'NZ' : $item->job_state_code,
            'job_country_code' => $item->job_country_code,
            'job_location_display' => $item->job_location_display,
            'job_board_type' => $item->job_board_type,
            'job_image_file' => str_replace('http://www.example.com/public/images/job_headers/', '', $item->job_image_file),
            'job_board_title' => $item->job_board_title,
            'job_board_subtitle' => $item->job_board_subtitle,
        );
        $records_inserted++;
    }
    $fields = array(
        'job_number',
        'specialty',
        'specialty_description',
        'division_code',
        'job_type',
        'job_type_description',
        'job_state_code',
        'job_country_code',
        'job_location_display',
        'job_board_type',
        'job_image_file',
        'job_board_title',
        'job_board_subtitle'
    );
    $query = db_insert('jobs')
        ->fields($fields);
    foreach ($records as $record) {
        $query->values($record);
    }
    $query->execute();
} catch (Exception $e) {
    // Something went wrong somewhere, so roll back now.
    $txn->rollback();
    // Log the exception to watchdog.
    watchdog_exception('Job Import', $e);
    echo $e;
}
echo $records_deleted . ' records deleted<br>';
echo $records_inserted . ' records inserted';
How big is the dataset you are trying to insert? If the dataset is very large, you might run into query-size issues. Try looping over the records and inserting them one by one, like you did with plain PHP.
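If one giant multi-row insert turns out to be the problem, a middle ground between that and one query per record is chunking. A sketch using the Drupal 7 API from the question ($records and $fields as built there; the chunk size of 500 is an arbitrary assumption to keep each query under the packet limit):

// Insert in chunks of 500 rows instead of one giant query.
$chunk_size = 500; // arbitrary; tune against max_allowed_packet
foreach (array_chunk($records, $chunk_size) as $chunk) {
    $query = db_insert('jobs')->fields($fields);
    foreach ($chunk as $record) {
        $query->values($record);
    }
    $query->execute();
}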
I've asked this question before, but have changed my code since. I'm having trouble with this script which inserts form data into a table. The first insert creates a booking which stores the customer's contact details. The second insert takes the booking ref created in the first and creates a 'JOB' for the customer. The final insert is supposed to create a second 'JOB', the customer's return journey.
The first two inserts run fine, but the final one, the second JOB insert, is ignored.
I have checked the table structures and the data being passed to the script, and everything is okay, so the problem must be in the script (shown below). Any help is greatly appreciated.
Is it correct to use one script to insert into the same table twice?
<?php
$customer_title = $_POST['customer_title'];
$customer_first_name = $_POST['customer_first_name'];
$customer_last_name = $_POST['customer_last_name'];
$billing_address = $_POST['billing_address'];
$customer_tel = $_POST['customer_tel'];
$customer_mobile = $_POST['customer_mobile'];
$customer_email = $_POST['customer_email'];
$passengers = $_POST['passengers'];
$cases = $_POST['cases'];
$return_flight_number = $_POST['return_flight_number'];
$price = $_POST['price'];
$pickup_date = $_POST['pickup_date'];
$pickup_time = $_POST['pickup_time'];
$pickup_address = $_POST['pickup_address'];
$destination_address = $_POST['pickup_destination'];
$return_date = $_POST['return_date'];
$return_time = $_POST['return_time'];
$return_pickup = $_POST['return_pickup'];
$return_destination = $_POST['return_destination'];
$booking_notes = $_POST['booking_notes'];
$booking_status = "Confirmed";
$authorised = "N";
$booking_agent = "ROOT_TEST";
$booking_date = date("Y/m/d");
if (isset($_POST['customer_title'])) {
    include('../assets/db_connection.php');
    $create_booking = $db->prepare("INSERT INTO bookings(customer_name, billing_address, contact_tel, contact_mob, contact_email, party_pax, party_cases, booking_notes, price, booking_agent, booking_date, booking_status, authorised)
        VALUES(:customer_name, :billing_address, :contact_tel, :contact_mob, :contact_email, :party_pax, :party_cases, :booking_notes, :price, :booking_agent, :booking_date, :booking_status, :authorised);");
    $create_booking->execute(array(
        ":customer_name" => $customer_title . ' ' . $customer_first_name . ' ' . $customer_last_name,
        ":billing_address" => $billing_address,
        ":contact_tel" => $customer_tel,
        ":contact_mob" => $customer_mobile,
        ":contact_email" => $customer_email,
        ":party_pax" => $passengers,
        ":party_cases" => $cases,
        ":booking_notes" => $booking_notes,
        ":price" => $price,
        ":booking_agent" => $booking_agent,
        ":booking_date" => $booking_date,
        ":booking_status" => $booking_status,
        ":authorised" => $authorised
    ));
    $booking_ref = $db->lastInsertId('booking_ref'); // Takes the booking ref generated by $create_booking
    $scheduled = "N";
    $create_job = $db->prepare("INSERT INTO jobs(booking_ref, pickup_date, pickup_time, pickup_address, destination_address, scheduled)
        VALUES(:booking_ref, :pickup_date, :pickup_time, :pickup_address, :destination_address, :scheduled)");
    $create_job->execute(array(
        ":booking_ref" => $booking_ref,
        ":pickup_date" => $pickup_date,
        ":pickup_time" => $pickup_time,
        ":pickup_address" => $pickup_address,
        ":destination_address" => $destination_address,
        ":scheduled" => $scheduled
    ));
    $return = "Y";
    $create_return = $db->prepare("INSERT INTO jobs(booking_ref, pickup_date, pickup_time, pickup_address, destination_address, scheduled, return)
        VALUES(:booking_ref, :pickup_date, :pickup_time, :pickup_address, :destination_address, :scheduled, :return)");
    $create_return->execute(array(
        ":booking_ref" => $booking_ref,
        ":pickup_date" => $return_date,
        ":pickup_time" => $return_time,
        ":pickup_address" => $return_pickup,
        ":destination_address" => $return_destination,
        ":scheduled" => $scheduled,
        ":return" => $return
    ));
}
?>
It is incorrect for sure, as inserting the same data twice violates one of the most important database architecture principles: database normalization.
However, there is no technical issue with it. There is some mistake which you have to catch using the error message from MySQL. To get it, add this line right after connecting with PDO:
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
Please note that catching the actual error is the only way to debug SQL queries; just staring at the code won't help.
return is a MySQL reserved word. Write it as
`return`
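Applied to the last query in the question, that means backtick-quoting the column name:

$create_return = $db->prepare("INSERT INTO jobs(booking_ref, pickup_date, pickup_time, pickup_address, destination_address, scheduled, `return`)
    VALUES(:booking_ref, :pickup_date, :pickup_time, :pickup_address, :destination_address, :scheduled, :return)");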
By the way, I can't stand such enormously huge code.
If I were you, I'd write it in 10 lines, not 50:
$allowed = array('customer_name', 'billing_address', 'contact_tel', 'contact_mob',
'contact_email', 'party_pax', 'party_cases', 'booking_notes', 'price');
$insert = $db->filterArray($_POST,$allowed);
$insert['booking_status'] = "Confirmed";
$insert['authorised'] = "N";
$insert['booking_agent'] = "ROOT_TEST";
$insert['booking_date'] = date("Y-m-d");
$db->query("INSERT INTO bookings SET ?u", $insert);
It looks like booking_ref is the primary key of the jobs table; you're trying to insert the same key twice, which is why the final query fails.
You should have a separate field as the primary key on jobs, just an auto-incrementing number, and then create an index on booking_ref.
There's no law against it. What you need to do is check the return value of the last INSERT query. My best guess is there's a unique index on the jobs table that you're violating with the double insert.
It's not obvious whether you're using MySQLi or PDO here, but the execute() methods of both return false on failure, so you should catch that and then call the respective object's error functions to find out what went wrong.
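For instance, with PDO (matching the prepared-statement style in the question; $params stands for the bound-parameter array already being passed to execute()):

// execute() returns false on failure; surface the driver error instead
// of silently moving on.
if (!$create_return->execute($params)) {
    // errorInfo() returns array(SQLSTATE, driver error code, message)
    list($sqlstate, $code, $message) = $create_return->errorInfo();
    die("Insert failed: [$sqlstate/$code] $message");
}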
I have the following PHP loop with an SQL UPDATE query.
for ($i = 0; $i < count($_POST['id']); $i++) {
    if(!isset($_POST['live'][$i])){
        $_POST['live'][$i] = "0";
    } else {
        $_POST['live'][$i] = "1";
    }
    $id = $_POST['id'][$i];
    $live = $_POST['live'][$i];
    $usr2 = $_SESSION['usr'];
    $updated = date("F j, Y, g:i a", time() + 60*60);
    $sql = "UPDATE news SET live = '$live', usr2 = '$usr2', updated = '$updated' WHERE id = $id";
    $result = mysql_query($sql);
    //echo $sql."<br />";
}
if($result) {
    header("location: notes.php");
    exit();
} else {
    die("Query failed");
}
How it works:
I'm submitting a big form with ALL OF THE table rows.
This is received in a different file as an array.
If $_POST['live'] is not set, it is set to '0'; if it is set, it is set to '1'.
The array data is updated within the for loop.
How can I UPDATE only the rows which have actually been changed, i.e. those whose $_POST['live'] value differs from the one saved in the DB? The condition would be a change in our live column.
I guess you're concerned about the updated field and want this value to change only when something has actually been altered. (If that's not the case, forget about this answer.)
You can define an ON UPDATE CURRENT_TIMESTAMP clause for a timestamp field. Each time a record is updated without explicitly setting a value for this field mysql uses the current time as its new value...
...but only if the record is altered; if you "update" the fields with the same value as are already in that record nothing happens.
demo script:
<?php
$pdo = new PDO('mysql:host=localhost;dbname=test;charset=utf8', 'localonly', 'localonly');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->setAttribute(PDO::ATTR_EMULATE_PREPARES, false);
setup($pdo);
$stmt = $pdo->prepare('UPDATE soNews SET somevalue=:v WHERE id=:id');
show('original', $pdo);
$stmt->execute( array(':id'=>1, ':v'=>'z') ); // new value != old value
show('update with new!=old & id=1', $pdo);
$stmt->execute( array(':id'=>2, ':v'=>'y') ); // new value = old value
show('update with new=old & id=2', $pdo);
function setup($pdo) {
    $pdo->exec('
        CREATE TEMPORARY TABLE soNews (
            id int auto_increment,
            somevalue varchar(32),
            updated TIMESTAMP DEFAULT 0 ON UPDATE CURRENT_TIMESTAMP,
            primary key(id)
        )
    ');
    $pdo->exec("INSERT INTO soNews (id,somevalue,updated) VALUES (1,'x', Now()-interval 47 hour),(2,'y', Now()-interval 47 hour)");
}

function show($label, $pdo) {
    echo "------ $label --------\n";
    foreach( $pdo->query('SELECT * FROM soNews', PDO::FETCH_ASSOC) as $row ) {
        echo join(', ', $row), "\n";
    }
}
prints
------ original --------
1, x, 2011-08-16 14:09:53
2, y, 2011-08-16 14:09:53
------ update with new!=old & id=1 --------
1, z, 2011-08-18 13:09:53
2, y, 2011-08-16 14:09:53
------ update with new=old & id=2 --------
1, z, 2011-08-18 13:09:53
2, y, 2011-08-16 14:09:53
As you can see, the first update query (new != old) caused the timestamp field to be updated, while the second query, setting new = old, didn't affect the updated field at all.
Bobby Tables will destroy your database. All your bits are belong to him. (Strictly speaking, this is an exaggeration, but you need to wrap all of your DB inputs in mysql_real_escape_string or, better yet, move to PDO or MySQLi.)
Long and short of it? No, there is no reliable way to determine whether user input is the same as what is in the database without actually querying the database first, or somehow storing the original output from the DB locally (in $_SESSION or the like).
There are legitimate use cases for that, but it looks like you're better off just running the updates. You can reduce them slightly by adding AND live != '$live' AND usr2 != '$usr2' to the WHERE clause, but you'll still need to run that many queries.
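Applied to the UPDATE from the question (same interpolation style, so the injection caveat above still applies), checking just the live column since that's the value being toggled:

// Skip rows whose live value already matches the submitted one.
$sql = "UPDATE news SET live = '$live', usr2 = '$usr2', updated = '$updated'
        WHERE id = $id AND live != '$live'";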
BTW -- I generally advise people not to use traditional for loops in PHP at all. PHP's foreach is better in almost every way. Instead of for ($i=0; $i<count(...); $i++), use foreach ($_POST['id'] as $i => $id); you'll already have $id declared. A sketch follows below.
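The loop from the question rewritten that way:

// Same loop using foreach; $i and $id come from the array itself.
foreach ($_POST['id'] as $i => $id) {
    $live = isset($_POST['live'][$i]) ? "1" : "0";
    $usr2 = $_SESSION['usr'];
    $updated = date("F j, Y, g:i a", time() + 60*60);
    // ... build and run the UPDATE as before ...
}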
Actually, I think a good way of doing it is to:
1) Perform a query to get the old record from the DB, then store the row contents in an associative array, with column names as keys.
2) Create a new array by checking the content of each "column" to be updated. If the content received is different from the value stored in the DB, update the record data; otherwise ignore it and go ahead. Finally, send the updated data back to the DB with an UPDATE.
function updateRecord(array $input_data) {
    // Get the data associated with the record id we want to update.
    $record_data = $yourDBWrapperClass
        ->where("id", $_GET['record_id'])
        ->get(TABLE_NAME);

    // Process column by column and append data to the final array.
    $final_data = [];
    $ignored_columns = ['id', 'last_update'];
    foreach ($record_data as $col_name => $value) {
        // Take the input value only when it differs from the stored one.
        if (array_key_exists($col_name, $input_data)
            && $record_data[$col_name] != $input_data[$col_name]) {
            $final_data[$col_name] = $input_data[$col_name];
        } else if (!in_array($col_name, $ignored_columns)) {
            // Keep the original value for columns that were not changed.
            $final_data[$col_name] = $value;
        }
    }

    // Finally, perform the db update.
    $update_result = $yourDBWrapperClass
        ->where("id", $_GET['record_id'])
        ->update(TABLE_NAME, $final_data);
    return $update_result ? "Record update success" : "Record update failed";
}
Note: you don't need to send back the id or last_update columns; their values are calculated automatically by the server. Sending a wrong value will either cause an error or store wrong information. Think about the last_update column: it's better to leave it to MySQL, which will use the column default to compute the value: NOW() or CURDATE().
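For reference, such a self-maintaining last_update column can be declared in MySQL like this (the table name is hypothetical):

-- MySQL refreshes last_update automatically on every real change.
ALTER TABLE your_table
    MODIFY last_update TIMESTAMP DEFAULT CURRENT_TIMESTAMP
        ON UPDATE CURRENT_TIMESTAMP;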
Expected contents of the variables/arrays:
$_GET['record_id'] = 123;

$record_data = [
    "id" => 123,
    "name" => "John",
    "surname" => "Dahlback",
    "age" => 31,
    "last_update" => "2019-01-01 10:00:00"
];

$input_data = [
    "age" => 32
];

$final_data = [
    // "id" => 123,
    "name" => "John",
    "surname" => "Dahlback",
    "age" => 32,
    // "last_update" => CURDATE();
];