This is a sample PDO execute-and-commit example from php.net:
<?php
/* Begin a transaction, turning off autocommit */
$dbh->beginTransaction();
/* Insert multiple records on an all-or-nothing basis */
$sql = 'INSERT INTO fruit
        (name, colour, calories)
        VALUES (?, ?, ?)';
$sth = $dbh->prepare($sql);
foreach ($fruits as $fruit) {
    $sth->execute(array(
        $fruit->name,
        $fruit->colour,
        $fruit->calories,
    ));
}
/* Commit the changes */
$dbh->commit();
/* Database connection is now back in autocommit mode */
?>
From what I understand, this lets me insert thousands of rows into a table faster. Now, suppose I have a PHP file that is called like this:
http://localhost/insertFruit.php?name=x&color=y&calories=z
which basically collects those values and inserts them into the database.
Now, if I want to take advantage of that PDO loop-and-execute pattern, I can't. If there are 100 submissions to insertFruit.php, how can I collect all those values, combine them into an array and then commit? Using a separate class, calling an object, using global values, etc.?
I am sure there is a better way to do this.
How can I collect all those values, combine them into an array and then commit?
Although you can, you shouldn't.
Neither transactions nor prepared statements are suitable for spanning separate requests.
Not to mention that your initial question is groundless. You don't need a faster insert. Your inserts are already fast.
There is no "better" way. Premature optimization is the root of all evil. By trying to devise a "better" way you'll ruin the whole system.
All you can do is cluster some data into a single request. If you have several fruits in one request, you can use a prepared statement to process them in a loop, as sketched below.
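For illustration, a minimal sketch of what such a clustered request could look like on the insertFruit.php side, assuming the client posts a fruits[] array of name/colour/calories triples in one request; the parameter name, credentials and connection details are assumptions, not from the original question:
<?php
// Hypothetical insertFruit.php that handles several fruits per request
$pdo = new PDO(
    'mysql:host=localhost;dbname=test;charset=utf8mb4',
    'dbuser',
    'dbpass',
    [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]
);
$stmt = $pdo->prepare('INSERT INTO fruit (name, colour, calories) VALUES (?, ?, ?)');
$pdo->beginTransaction();
foreach ($_POST['fruits'] as $fruit) {
    // One execute per fruit; all rows are committed together
    $stmt->execute([$fruit['name'], $fruit['colour'], $fruit['calories']]);
}
$pdo->commit();
?>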
Related
I want to run MySQL queries to insert data from a custom HTML form.
Here I have to insert multiple sets of data, some to one table and some to another. Currently I am using the simple approach of sending the data to a PHP page via jQuery AJAX and running multiple mysqli_query() calls one after another. But I guess when there is a large number of users there will be speed problems, so can anyone suggest a better way to do the same?
There are 5 tables in the database and each table has 7 to 10 columns that need to get a different set of data every time.
I want to run each query only if the previous insert query is successfully completed.
That's why I am checking the result every time before running the next query, which makes me worry about speed once there is a large user base.
The problem is, I need to insert data into the first table, and only if that insert succeeds run the query for the second table.
This means you need a transaction.
A transaction is a set of queries that either all execute successfully or, if one fails, all fail together. This ensures you don't end up with crap data in your tables.
Do not
Do not use multiquery.
Do not use mysql_* function(s).
Do not use bulk inserts.
People telling you to do that just have absolutely no clue what they're doing, ignore them.
Do
Use PDO
Use prepared statements
Prepare the statement(s) ONCE, use them MULTIPLE times
Sample code - do NOT copy paste
$dsn = 'mysql:dbname=testdb;host=127.0.0.1;charset=utf8mb4';
$user = 'dbuser';
$password = 'dbpass';
// Connect and make PDO throw an exception on any error
$pdo = new PDO($dsn, $user, $password);
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
// Prepare each statement once (`table` is backticked because it is a reserved word)
$insert['first'] = $pdo->prepare("INSERT INTO `table` SET col1 = :val, col2 = :val2");
$insert['second'] = $pdo->prepare("INSERT INTO another_table SET col1 = :val, col2 = :val2");
// Run both inserts inside a single transaction
$pdo->beginTransaction();
$insert['first']->bindValue(':val', 'your value');
$insert['first']->bindValue(':val2', 'anothervalue');
$insert['first']->execute();
$insert['second']->bindValue(':val', 'your value');
$insert['second']->bindValue(':val2', 'anothervalue');
$insert['second']->execute();
$pdo->commit();
The code above will save the data in two tables ONLY if both inserts are successful.
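Not part of the original sample, but a sketch of making the all-or-nothing behavior explicit: since ERRMODE_EXCEPTION makes a failed execute() throw, you can catch the exception, roll back and rethrow.
try {
    $pdo->beginTransaction();
    $insert['first']->bindValue(':val', 'your value');
    $insert['first']->bindValue(':val2', 'anothervalue');
    $insert['first']->execute();
    $insert['second']->bindValue(':val', 'your value');
    $insert['second']->bindValue(':val2', 'anothervalue');
    $insert['second']->execute();
    $pdo->commit();
} catch (Exception $e) {
    // Undo the first insert if the second one failed, then let the error propagate
    $pdo->rollBack();
    throw $e;
}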
To paraphrase the accepted answer, but with the focus on mysqli.
The key is to configure mysqli to throw exceptions and to use a transaction.
A transaction will ensure that all operations either complete in their entirety or have no effect whatsoever. Another important advantage of using a transaction is that it makes multiple inserts faster, eliminating all possible delays that could be caused by separate query execution.
To use transactions with mysqli you need to do as follows:
First of all, make sure you have a proper mysqli connection, which, among other things, tells mysqli to throw an exception in case of error. Then just prepare your queries, start a transaction, execute the queries and commit the transaction - just like it is shown in the accepted answer, but with mysqli:
include 'mysqli.php';
// Prepare both statements once (`table` is backticked because it is a reserved word)
$stmt1 = $mysqli->prepare("INSERT INTO `table` SET col1 = ?, col2 = ?");
$stmt2 = $mysqli->prepare("INSERT INTO another_table SET col1 = ?, col2 = ?");
// Run both inserts inside a single transaction
$mysqli->begin_transaction();
$stmt1->bind_param("ss", $col1, $col2);
$stmt1->execute();
$stmt2->bind_param("ss", $col1, $col2);
$stmt2->execute();
$mysqli->commit();
Thanks to exceptions and transactions there is no need to verify the result of each query manually.
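For reference, a minimal sketch of what such an included mysqli.php connection file could contain; this is an assumption on my part, the answer only requires a connection that reports errors as exceptions:
<?php
// mysqli.php - hypothetical connection file
// Make mysqli throw exceptions instead of returning false on errors
mysqli_report(MYSQLI_REPORT_ERROR | MYSQLI_REPORT_STRICT);
$mysqli = new mysqli('localhost', 'dbuser', 'dbpass', 'testdb');
// Always set the connection charset explicitly
$mysqli->set_charset('utf8mb4');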
I am trying to remove MySQL queries that run in for loops inside some code.
But I am not sure how best to achieve that.
Using PHP PDO with named parameters, how do I store the queries and then run them as a batch after the loop, so the only thing that happens in the for loop is that the queries get built, but they are not executed until after the loop has finished?
Here is example code:
for ($i = $rowcount; $i > 0; $i--) {
    $id = $array[$i];
    $query = "DELETE FROM table WHERE ID=:id AND Order=:order";
    $stmt = $conn->prepare($query);
    $stmt->execute(array('id' => $id, 'order' => $i));
}
My initial reaction would be: Why do this? What is your motivation? Simply avoiding SQL in loops, or do you have some sort of operational problems you want to avoid?
Your exact query would not be that easy to convert into a single query because you do have tuples of ID and Order values that have to match in order to be deleted. Also notice that prepared statements usually do not accept an arbitrary number of parameters, so even if you'd be able to transform your query into the form DELETE FROM table WHERE (ID=1 AND Order=1000) OR (ID=4 AND Order=1234)..., you'd have to somehow work out how to fill the first, second, third ... placeholder. Additionally, you'd be forced to generate that prepared statement dynamically, which is probably the opposite of how prepared statements should be done when it comes to security.
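For illustration only (the paragraph above explains why this is discouraged), a sketch of how such a single DELETE with dynamically generated placeholders could be built; it assumes $rowcount is at least 1 and backticks the reserved words:
// Build one (ID, Order) pair of placeholders per row to delete
$pairs  = [];
$params = [];
for ($i = $rowcount; $i > 0; $i--) {
    $pairs[]  = '(ID = ? AND `Order` = ?)';
    $params[] = $array[$i];
    $params[] = $i;
}
// The statement text depends on the number of rows, so it cannot be reused
$stmt = $conn->prepare('DELETE FROM `table` WHERE ' . implode(' OR ', $pairs));
$stmt->execute($params);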
If you have performance problems because deleting 1000 entries one after the other has a big impact, there are alternatives: If you wrap the deletion inside one single transaction, then it probably doesn't matter that much how many entries you delete - ALL of them will be deleted once the transaction is committed. Also note that using prepared statements is one way to speed up database operations - but only if you prepare them only once before you loop, and inside the loop you'd only pass new parameters again and again.
So to wrap it up: avoiding SQL in loops is not the best goal if the programming problem you want to solve is better solved with a loop and there is no other problem related to it. If, however, there is such a problem, it has to be analyzed and mentioned; removing the loops isn't an automatic success story.
I suppose you need to ensure that either all queries are executed or none of them are; in other words, you need a transaction:
$conn->beginTransaction();
try {
    // Prepare the statement once, execute it many times inside the transaction
    $query = "DELETE FROM table WHERE ID=:id AND Order=:order";
    $stmt = $conn->prepare($query);
    for ($i = $rowcount; $i > 0; $i--) {
        $id = $array[$i];
        $stmt->execute(['id' => $id, 'order' => $i]);
    }
    $conn->commit();
} catch (Exception $ex) {
    // Undo every delete if any of them failed
    $conn->rollBack();
}
I have an array $authors with some numbers I want to insert into a table.
I can have a prepared statement that I execute multiple times, once for each element of the array:
$stmt = $pdo->prepare('INSERT INTO authors (article_id, user_id) VALUES (?, ?)');
$stmt->bindParam(1, $article_id);
$stmt->bindParam(2, $author);
foreach ($authors as $author) {
    $stmt->execute();
}
However, I can do a trick using implode() and execute the statement only once:
// here probably $authors = array_map('intval', $authors);
$stmt = $pdo->prepare(
'INSERT INTO authors (article_id, user_id)
VALUES ('.implode(', :article_id), (', $authors).', :article_id)');
$stmt->execute([':article_id' => $article_id]);
The first solution is more conventional and looks more secure.
The second (I think) is faster because there is only one query to the database (and it is shorter: there are no loops except inside implode).
I don't see any security issues here, but it looks safer (to me) when there are no string concatenations in queries.
Which is the proper way in this situation?
Edit: echoing the second query gives this:
INSERT INTO authors (article_id, user_id)
VALUES (121, :article_id), (50, :article_id)
And executes with no errors.
According to PDO's documentation, "You cannot use a named parameter marker of the same name more than once in a prepared statement, unless emulation mode is on." So that alone makes your "implode" solution bad.
That being said, I'll answer on the theory. The point of prepared statements is to compile the query only once, so repeated executions are faster. Prepared statements are therefore meant to be used as in your first example: one simple "template" query, repeated many times.
In your second example, you build a custom query that will hardly ever be repeated (since it depends on the content of your $authors array). A prepared statement is therefore useless in this case: you pay the overhead of the PREPARE without the benefit of repeated executions. That's not the way it's supposed to be used.
Extended insert is a perfectly valid solution, and a good one at that, but use it with a normal query (i.e. exec()), and be sure to use quote() to protect against SQL injection!
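A minimal sketch of the extended insert described above, reusing $pdo, $article_id and $authors from the question; every value goes through quote() before being concatenated into the SQL:
// Quote every value, build one multi-row INSERT, then run it once with exec()
$values = [];
foreach ($authors as $author) {
    $values[] = '(' . $pdo->quote($article_id) . ', ' . $pdo->quote($author) . ')';
}
$pdo->exec('INSERT INTO authors (article_id, user_id) VALUES ' . implode(', ', $values));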
Our PHP/MySQL application uses PDO prepared statements to insert data from the user without any sanitizing. At a later stage the user can copy the created entries, a thousand at a time.
For the copying part we have tested:
$stmt = $pdo->prepare("INSERT INTO files (`a`,`b`,`c`) VALUES (?,?,?)");
$pdo->beginTransaction();
foreach ($data as $row) {
$stmt->execute($row);
}
$pdo->commit();
vs
$pdo->exec("INSERT INTO files (`a`,`b`,`c`) VALUES " . implode(', ', $loop_query));
$row represents a row from the database.
The first approach is 3 times slower than the second, so we would like to implement the second one.
How safe is using data from database without prepared statements?
It is not safe. As you mention, the data in the DB is raw.
If you retrieve it into the programming language (PHP in this case) and use it again in an SQL statement, it must be protected against SQL injection again.
Can't you do an INSERT ... SELECT instead, as sketched below?
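A hedged sketch of that idea, copying the rows inside MySQL itself so the raw values never pass through PHP; the id column and the WHERE condition are assumptions, only the files table and its columns come from the question:
// Hypothetical server-side copy: only the ids of the entries to copy travel through PHP
$stmt = $pdo->prepare(
    'INSERT INTO files (`a`, `b`, `c`)
     SELECT `a`, `b`, `c` FROM files WHERE id = ?'
);
$pdo->beginTransaction();
foreach ($ids as $id) {
    $stmt->execute([$id]);
}
$pdo->commit();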
It is not safe at all. Entirely.