Running queries in For loop - php

I am trying to stop running MySQL queries inside for loops in some code, but I am not sure how best to achieve that.
Using PHP PDO with named parameters, how do I store the queries and then run them as a batch after the loop, so that the only thing that happens inside the for loop is that the queries get built, and nothing is executed until the loop has finished?
here is example code:
for ($i = $rowcount; $i > 0; $i--) {
    $id = $array[$i];
    $query = "DELETE FROM table WHERE ID=:id AND Order=:order";
    $stmt = $conn->prepare($query);
    $stmt->execute(array('id' => $id, 'order' => $i));
}

My initial reaction would be: Why do this? What is your motivation? Is it simply about avoiding SQL in loops, or is there some operational problem you want to avoid?
Your exact query would not be that easy to convert into a single query, because you have tuples of ID and Order values that must match for a row to be deleted. Also note that prepared statements do not accept an arbitrary number of parameters, so even if you were able to transform your query into the form DELETE FROM table WHERE (ID=1 AND Order=1000) OR (ID=4 AND Order=1234) ..., you would have to work out how to fill the first, second, third ... placeholder. Additionally, you would be forced to generate that prepared statement dynamically, which is probably the opposite of how prepared statements should be used when it comes to security.
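For illustration only, a dynamically generated version might be sketched like this (not a recommendation; $conn, $array and $rowcount are taken from the question, and Order is back-quoted here because it is a reserved word):
// Build one "(ID=? AND `Order`=?)" group per row and a flat list of values.
$pairs  = array();
$params = array();
for ($i = $rowcount; $i > 0; $i--) {
    $pairs[]  = '(ID = ? AND `Order` = ?)';
    $params[] = $array[$i]; // value for ID
    $params[] = $i;         // value for Order
}
if ($pairs) {
    $sql  = 'DELETE FROM table WHERE ' . implode(' OR ', $pairs);
    $stmt = $conn->prepare($sql);
    $stmt->execute($params); // positional parameters, filled in order
}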
If you have performance problems because deleting 1000 entries one after the other has a big impact, there are alternatives: if you wrap the deletions inside one single transaction, it probably doesn't matter much how many entries you delete - all of them will be deleted once the transaction is committed. Also note that using prepared statements is one way to speed up database operations - but only if you prepare the statement once before the loop and, inside the loop, only pass new parameters to it again and again.
So to wrap it up: avoiding SQL in loops is not worthwhile if the programming problem you want to solve is best solved with a loop and there is no other problem related to it. If there is such a problem, it has to be analyzed and mentioned - removing the loops isn't an automatic success story.

I suppose you need to ensure that all queries are executed, in other words you need a transaction:
$conn->beginTransaction();
try {
    $query = "DELETE FROM table WHERE ID=:id AND Order=:order";
    $stmt = $conn->prepare($query);
    for ($i = $rowcount; $i > 0; $i--) {
        $id = $array[$i];
        $stmt->execute(['id' => $id, 'order' => $i]);
    }
    $conn->commit();
} catch (Exception $ex) {
    $conn->rollBack();
}

Related

How to execute two sql queries in php?

I am trying to execute this code in a PHP page to store some data in my database.
The thing is that I want to insert data, but due to a foreign key constraint it is impossible. So in my PHP code I want to execute two SQL queries: the first one to disable foreign key checks and the second one to insert the data.
When I try it in phpMyAdmin it works, but only manually. I would like to put it in PHP code.
This is my code. The variable $conexion is the connection that executes my SQL queries.
Any ideas?
$sql = "SET foreign_key_checks=0";
$sql .= "INSERT INTO routes (title, distance, subtitle) VALUES ('".$_POST['title']."','".$_POST['distance']."', '".$_POST['subtitle']."');";
$conexion->multi_query($sql);
Try to avoid using multi_query. Sending an extra small query to the MySQL server doesn't really hurt performance, and avoiding multi_query limits the impact of something like SQL injection.
In your case there's no need for multi_query. If you send two queries in a script, both go over the same connection, and the SET query affects the current connection.
// Protect against SQL injection
$title = $conexion->escape_string($_POST['title']);
$distance = $conexion->escape_string($_POST['distance']);
$subtitle = $conexion->escape_string($_POST['subtitle']);
// Execute queries
$conexion->query("SET forgeign_key_checks=0");
$conexion->query("INSERT INTO routes (title, distance, subtitle) VALUES ('$tittle', '$distance', '$subtitle')");
Apart from the comment above, you need a semicolon between your SQL statements:
multi_query - Executes one or multiple queries which are concatenated by a semicolon.
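For reference, the same insert can also be done with a mysqli prepared statement, which avoids manual escaping entirely. This is only a sketch, assuming $conexion is a mysqli connection object as above:
// Disable the foreign key check for this connection, then insert via a prepared statement.
$conexion->query("SET foreign_key_checks=0");
$stmt = $conexion->prepare("INSERT INTO routes (title, distance, subtitle) VALUES (?, ?, ?)");
$stmt->bind_param('sss', $_POST['title'], $_POST['distance'], $_POST['subtitle']);
$stmt->execute();
$stmt->close();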

INSERT in main foreach() or in separate foreach()?

I have a cache table which is rebuilt occasionally:
$sql = 'INSERT INTO someTable (...fields...) VALUES (...values...)';
$stmt = $pdo->prepare($sql, array(PDO::ATTR_CURSOR => PDO::CURSOR_FWDONLY));
$items_to_insert = array();
foreach ($items as $i) {
    $details = array();
    // Lots of things here, such as:
    $details[':username'] = $i['username'];
    $items_to_insert[] = $details;
}
foreach ($items_to_insert as $i) {
    $stmt->execute($i);
}
What should I take into consideration to decide whether it would be better to run $stmt->execute() in the first foreach and eliminate the second foreach and the $items_to_insert array? Is there any way to have execute() run concurrently with the next foreach loop? This is for an application that will likely run on a variety of hardware, so I am not interested in the specific case of benchmarking on my workstation; rather, I am interested in learning the intricacies of the situation to make better use of PDO best practices.
The best practice would be to use neither of the foreach loops for executing queries. Running SQL queries in a loop is a very bad idea, and the kind of query (INSERT, SELECT, UPDATE or DELETE) doesn't matter.
What you should do instead is build one query in a loop and execute it only once. Unfortunately, PDO doesn't provide any automatic way to do this, so you have to write it manually; a rough sketch follows below.
Look at the accepted answer to this question, it does exactly what you need: PDO Prepared inserts multiple rows single query
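As an illustration of that approach (the column names username and score are made up for the example), the idea is to build a single multi-row INSERT and execute it once:
// Build "(?, ?), (?, ?), ..." with one placeholder group per row and a flat list of values.
$placeholders = array();
$values = array();
foreach ($items as $item) {
    $placeholders[] = '(?, ?)';
    $values[] = $item['username'];
    $values[] = $item['score'];
}
$sql = 'INSERT INTO someTable (username, score) VALUES ' . implode(', ', $placeholders);
$stmt = $pdo->prepare($sql);
$stmt->execute($values); // one round trip for all rows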
This question actually has nothing to do with PDO, just with common sense: running a useless extra loop is obviously worse than not running it at all.
However, that matters more for overall code quality; performance-wise you will hardly ever notice any difference.
It sounds like you want an all-or-nothing situation where either all rows insert or they all fail.
Consider using database transactions for this. Begin your transaction, move the execute() up into the first loop, and then commit at the end, as sketched below.
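A minimal sketch of that structure, reusing the prepared statement and row-building from the question (error handling reduced to a bare rollback):
$pdo->beginTransaction();
try {
    foreach ($items as $i) {
        $details = array();
        // build $details for this row as before, e.g.:
        $details[':username'] = $i['username'];
        $stmt->execute($details); // runs inside the transaction
    }
    $pdo->commit(); // all rows become visible at once
} catch (Exception $ex) {
    $pdo->rollBack(); // none of the rows are kept
    throw $ex;
}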

Most efficient way of determining if a value is in a table

I often run into the situation where I want to determine if a value is in a table. The queries often happen in a short time period with similar values being searched, so I want to do this in the most efficient way. What I have now is:
if ($statement = mysqli_prepare($link, 'SELECT name FROM inventory WHERE name = ? LIMIT 1')) // name and inventory are arbitrarily chosen for this example
{
    mysqli_stmt_bind_param($statement, 's', $_POST['check']);
    mysqli_stmt_execute($statement);
    mysqli_stmt_bind_result($statement, $result);
    mysqli_stmt_store_result($statement); // needed for mysqli_stmt_num_rows
    mysqli_stmt_fetch($statement);
}
if (mysqli_stmt_num_rows($statement) == 0)
    //value in table
else
    //value not in table
Is it necessary to call all the mysqli_stmt_* functions? As discussed in this question, for mysqli_stmt_num_rows() to work the entire result set must be downloaded from the database server. I'm worried this is a waste and takes too long, as I know there will be 1 or 0 rows. Would it be more efficient to use the SQL COUNT() function and not bother with mysqli_stmt_store_result()? Any other ideas?
I noticed the prepared statement manual says "A prepared statement or a parametrized statement is used to execute the same statement repeatedly with high efficiency". What is highly efficient about it and what does it mean same statement? For example if two separate prepared statements evaluated to be the same would it still be more efficient?
By the way I'm using MySQL but didn't want to add the tag as a solution may be non-MySQL specific.
if ($statement = mysqli_prepare($link, 'SELECT name FROM inventory WHERE name = ? LIMIT 1')) // name and inventory are arbitrarily chosen for this example
{
    mysqli_stmt_bind_param($statement, 's', $_POST['check']);
    mysqli_stmt_execute($statement);
    mysqli_stmt_store_result($statement);
}
if (mysqli_stmt_num_rows($statement) == 0)
    //value not in table
else
    //value in table
I believe this would be sufficient. Note that I switched //value not in table and //value in table.
It really depends on the type of the field you are searching. Make sure you have an index on that field and that the index fits in memory. If it does, use SELECT COUNT(*) FROM <your_table> WHERE <cond_which_uses_index> LIMIT 1. The important part is LIMIT 1, which prevents unnecessary lookups. You can run EXPLAIN SELECT ... to see which indexes are used, and possibly add a hint or ban some of them; that's up to you. COUNT(*) works damn fast; it is optimized by design to return its result very quickly (for MyISAM only - for InnoDB the whole story is a bit different due to ACID). The main difference between COUNT(*) and SELECT <some_field(s)> is that COUNT doesn't read any row data, and with (*) it doesn't care whether some field is NULL or not - it just counts rows using the most suitable index (chosen internally). Actually, I would suggest that even for InnoDB this is the fastest technique.
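A small sketch of that existence check, using the same inventory table and mysqli connection as in the question:
$statement = mysqli_prepare($link, 'SELECT COUNT(*) FROM inventory WHERE name = ? LIMIT 1');
mysqli_stmt_bind_param($statement, 's', $_POST['check']);
mysqli_stmt_execute($statement);
mysqli_stmt_bind_result($statement, $count);
mysqli_stmt_fetch($statement); // COUNT(*) always returns exactly one row
$exists = ($count > 0);
mysqli_stmt_close($statement);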
The use case also matters. If you want to insert a unique value, put a unique constraint on that field and use INSERT IGNORE; if you want to delete a value which may not be in the table, run DELETE IGNORE, and the same goes for UPDATE IGNORE.
The query analyzer determines by itself whether two queries are the same or not and manages the query cache; you don't have to worry about it.
The difference between a prepared and a regular query is that the former keeps the statement and the data separate, so the analyzer knows which parts are dynamic and can handle and optimize them better. It can do similar work for a regular query, but with a prepared statement we say up front that we will reuse it later and give a hint about which data is variable and which is fixed. I'm not an expert on MySQL internals, so you may want to ask such questions on more specialized sites to understand the details.
P.S.: Prepared statements in MySQL are global to the session, so they are deallocated when the session in which they were defined ends. The exact behavior and possible internal MySQL caching are a subject for additional investigation.
This is the kind of thing in-memory caches are really good at. Something like this should work better than most micro-optimization attempts (pseudocode!):
function check_if_value_is_in_table($value) {
    if ($cache->contains_key($value)) {
        return $cache->get($value);
    }
    // run the SQL query here, put result in $result
    // note: I'd benchmark if using mysqli_prepare actually helps
    // performance-wise
    $cache->put($value, $result);
    return $result;
}
Have a look at memcache or the various alternatives.
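To make the pseudocode a bit more concrete, here is a sketch of the same idea using PHP's Memcached extension; it assumes a memcached server on localhost and reuses the mysqli existence check from above:
$cache = new Memcached();
$cache->addServer('127.0.0.1', 11211);

function check_if_value_is_in_table($cache, $link, $value) {
    $cached = $cache->get('exists_' . $value);
    if ($cache->getResultCode() === Memcached::RES_SUCCESS) {
        return (bool) $cached; // cache hit, no SQL needed
    }
    // Cache miss: fall back to the database.
    $statement = mysqli_prepare($link, 'SELECT COUNT(*) FROM inventory WHERE name = ?');
    mysqli_stmt_bind_param($statement, 's', $value);
    mysqli_stmt_execute($statement);
    mysqli_stmt_bind_result($statement, $count);
    mysqli_stmt_fetch($statement);
    mysqli_stmt_close($statement);
    $exists = ($count > 0);
    $cache->set('exists_' . $value, $exists ? 1 : 0, 300); // cache for 5 minutes
    return $exists;
}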

When to close Prepared Statement

When to close prepared statements in PHP?
Example:
$query = "insert into web_reviews (title,added_date,reviewer_home_url,read_more_link,summary) values(?,?,?,?,?)";
$stmt = $this->db->prepare($query);
$stmt->bind_params($this->title,$this->added_date,$this->reviewer_home_url,$this->read_more,$this->summary);
$stmt->execute() or die("Cannot add the date to the database, please try again.");
$stmt->close();
$stmt = $this->db->prepare("select id from web_reviews where title = ? and read_more = ?");
$stmt->bind_params($this->title,$this->read_more);
$stmt->execute();
$stmt->bind_results($web_review_id);
$stmt->close();
Should I use $stmt->close(); here?
Edit:
Here is what the PHP manual says, along with one comment from the manual:
Closes a prepared statement. mysqli_stmt_close() also deallocates the statement handle. If the current statement has pending or unread results, this function cancels them so that the next query can be executed.
Comment:
If you are repeating a statement in a loop, using bind_param and so on inside it for a larger operation, I thought it would be good to clean it up with stmt->close. But it always broke with an error after approx. 250 operations. When I tried it with stmt->reset instead, it worked for me.
That is a good use of close, especially since you are planning on making another query. With both PDO statements and MySQLi statements, I find that erring on the side of cleanliness is almost always for the best -- it removes potential bugs down the line.
As for the gentleman with 250 operations... I don't see what the real use case is. Why does he need to query the database 250 different times? Why can't he query the database once with 250 records? Or, more likely, why can't he query the database 25 times with 10 records?
I am unable to comment currently, so I am providing an answer instead. When you run a prepared statement that queries the database for a result, it will not execute another query until you retrieve or remove the result it is storing, for example with $result = $stmt->get_result().
Secondly, if you need the result from the first query saved so that you can use it later, I recommend using two result sets: the first stores the result from the first execution of $stmt and the second from the second execution. This might not answer the question directly, but it may help someone.
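A small sketch of that pattern (assuming a mysqli connection $db, the web_reviews table from the question, and the mysqlnd driver, which get_result() requires): fetch the first result set into a variable before issuing the next query.
$stmt = $db->prepare("select id from web_reviews where title = ?");
$stmt->bind_param('s', $title);
$stmt->execute();
$first = $stmt->get_result();   // grab the whole result set
$row = $first->fetch_assoc();   // safe to use later
$stmt->close();

// The connection is now free for the next query.
$stmt = $db->prepare("select id from web_reviews where read_more = ?");
$stmt->bind_param('s', $readMore);
$stmt->execute();
$second = $stmt->get_result();
$stmt->close();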

PHP/PDO: style of writing many queries on one page?

An example of my scenario is a large setup page for an application; the method I use looks like this:
//query 1
$stmt = $dbh->prepare("...");
$stmt->execute();
//query 2
$stmt = $dbh->prepare("...");
$stmt->execute();
Would this be an accepted way to write more queries? I have no clue how it's supposed to be done (or who does what, rather). I assume reusing $stmt for the second query is the most acceptable way, as there is no need to create other variables - am I right?
I really want to know how people do this sort of thing; I don't want to release 'ugly' code if I don't have to.
Yes, that is a perfectly acceptable way to execute queries. There is no need to create new $stmt variables.
Also, if you ever get the error Lost connection to MySQL server during query when performing multiple queries within a single page, set the following attribute on the connection. It tells the MySQL driver to use the buffered versions of the MySQL API:
$db->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, true);
It can also be passed per statement, so that your query looks like:
$stmt = $db->prepare('select * from tablename', array(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY => true));
$stmt->execute();
