I want to run mysql queries to insert data from a custom html form.
Here I have to insert multiple sets of data, some into one table and some into another. Currently I am using the simple approach of sending the data to a PHP page via jQuery AJAX and running several mysqli_query() calls one after another. But I suspect that with a large number of users there will be speed problems, so can anyone suggest a better way to do this?
There are 5 tables in the database, and each table has 7 to 10 columns that receive a different set of data every time.
I want to run each query only if the previous insert query completed successfully.
That's why I check the result every time before running the next query, which is what makes me worry about speed with a large user base.
The problem is that I need to insert data into the first table, and only if that insert succeeds run the query for the second table.
This means you need a transaction.
A transaction is a set of queries that either all execute successfully or, if any one fails, all fail. This ensures you don't end up with garbage data in your tables.
Do not
Do not use multiquery.
Do not use mysql_* function(s).
Do not use bulk inserts.
People telling you to do any of that simply have no idea what they're doing; ignore them.
Do
Use PDO
Use prepared statements
Prepare the statement(s) ONCE, use them MULTIPLE times
Sample code - do NOT copy paste
$dsn = 'mysql:dbname=testdb;host=127.0.0.1;charset=utf8mb4';
$user = 'dbuser';
$password = 'dbpass';
$pdo = new PDO($dsn, $user, $password);
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$insert['first'] = $pdo->prepare("INSERT INTO `table` SET col1 = :val, col2 = :val2");
$insert['second'] = $pdo->prepare("INSERT INTO another_table SET col1 = :val, col2 = :val2");
$pdo->beginTransaction();
$insert['first']->bindValue(':val', 'your value');
$insert['first']->bindValue(':val2', 'anothervalue');
$insert['first']->execute();
$insert['second']->bindValue(':val', 'your value');
$insert['second']->bindValue(':val2', 'anothervalue');
$insert['second']->execute();
$pdo->commit();
The code above will save the data in two tables ONLY if both inserts are successful.
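If one of the inserts fails, the exception aborts the script before commit() and the open transaction is discarded when the connection closes. To roll back explicitly and handle the error yourself, you can wrap the block in try/catch. A sketch, reusing the hypothetical `$pdo` and `$insert` statements from the example above:

```php
<?php
// Sketch only: assumes $pdo and the prepared $insert statements shown above.
try {
    $pdo->beginTransaction();
    $insert['first']->execute([':val' => 'your value', ':val2' => 'anothervalue']);
    $insert['second']->execute([':val' => 'your value', ':val2' => 'anothervalue']);
    $pdo->commit();
} catch (PDOException $e) {
    $pdo->rollBack(); // undo any insert that did succeed
    throw $e;         // or log it and show an error page instead
}
```

Passing the values as an array to execute() is equivalent to the bindValue() calls above; either style works.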
To paraphrase the accepted answer, but with the accent on mysqli.
The key is to configure mysqli to throw exceptions and to use a transaction.
A transaction will ensure that all operations either complete in their entirety or have no effect whatsoever. Another important advantage of using a transaction is that it makes multiple inserts faster, eliminating all possible delays that could be caused by separate query execution.
To use transactions with mysqli you need to do as follows:
First of all, make sure you have a proper mysqli connection, which, among other things, tells mysqli to throw an exception in case of error. Then just prepare your queries, start a transaction, execute the queries and commit the transaction - just like it is shown in the accepted answer, but with mysqli:
include 'mysqli.php';
$stmt1 = $mysqli->prepare("INSERT INTO `table` SET col1 = ?, col2 = ?");
$stmt2 = $mysqli->prepare("INSERT INTO another_table SET col1 = ?, col2 = ?");
$mysqli->begin_transaction();
$stmt1->bind_param("ss", $col1, $col2);
$stmt1->execute();
$stmt2->bind_param("ss", $col1, $col2);
$stmt2->execute();
$mysqli->commit();
Thanks to exceptions and transactions there is no need to verify the result of each query manually.
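If you want to roll back explicitly rather than rely on the dropped connection discarding the transaction, the same try/catch pattern applies. A sketch, assuming the `$mysqli` connection from above has been configured to throw exceptions (e.g. via `mysqli_report(MYSQLI_REPORT_ERROR | MYSQLI_REPORT_STRICT)`):

```php
<?php
// Sketch only: assumes $mysqli, $stmt1 and $stmt2 from the example above,
// and that mysqli is configured to throw mysqli_sql_exception on errors.
try {
    $mysqli->begin_transaction();
    $stmt1->bind_param("ss", $col1, $col2);
    $stmt1->execute();
    $stmt2->bind_param("ss", $col1, $col2);
    $stmt2->execute();
    $mysqli->commit();
} catch (mysqli_sql_exception $e) {
    $mysqli->rollback(); // neither table is changed
    throw $e;
}
```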
I'm using ORACLE version 11g.
I would like to execute three queries "at the same time" and make sure that if one or more of these queries fails, both tables return to their previous state. The queries are one SELECT to check whether the selected row is still eligible for the action, plus one UPDATE and one INSERT that perform the action.
In my case I need to run the UPDATE on that same locked row (obviously no one else should be able to perform the action on the same row) and later the INSERT on another table, but only if the SELECT confirms that the row is still eligible. The queries look approximately like this:
//this is the row I want to execute the action on
$selectedIdFromTable1 = "1";
$query = "SELECT attr1 FROM table1 WHERE attr1 = ? AND id = ?";
$stmt = $this->oracleDB->prepare($query);
$stmt->bindValue(1, 'oldValueAttr1', "string");
$stmt->bindValue(2, $selectedIdFromTable1, "string");
$stmt->execute();
$result = $stmt->fetchColumn();
if ($result !== false) { // the row still holds the old value, so the action is still possible
    // Here the row must be locked so the update and the insert run only once.
    // Only one user should be able to execute them.
    $query = "UPDATE table1 SET attr1 = ? WHERE id = ?";
    $stmt = $this->oracleDB->prepare($query);
    $stmt->bindValue(1, 'newValueAttr1', "string");
    $stmt->bindValue(2, $selectedIdFromTable1, "string");
    $stmt->execute();
    $query = "INSERT INTO table2 (attr2) VALUES (?)";
    $stmt = $this->oracleDB->prepare($query);
    $stmt->bindValue(1, 'newValueAttr2', "string");
    $stmt->execute();
}
//here the lock can release the row for future actions (but not this same one, because a later SELECT will no longer find the row with the old value)
Also, I'm using the binding system to pass the variables more safely. Not sure if that affects the answer.
I'm quite sure that a transaction with row locking is the answer, and if so, I would really appreciate an example of such a transaction with Oracle for this situation.
All of this will be in a Symfony 3.3 project. This last detail is probably not necessary, but for various reasons the transaction code must live in the Symfony project and not in the Oracle database.
Thank you very much.
If you use Symfony you will most likely use the DBAL connection.
Transactions are handled as described in its documentation.
(To me this seems more like a transaction question than a locking one.)
Transactions:
$conn->beginTransaction();
try {
    // do stuff
    $conn->commit();
} catch (\Exception $e) {
    $conn->rollBack();
    throw $e;
}
Locking is not handled by DBAL
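DBAL will, however, pass a FOR UPDATE clause straight through to Oracle, so you can lock the row yourself inside the transaction with plain SQL. A sketch using the table and column names from the question (the method names are from recent DBAL versions; on the DBAL 2 shipped with Symfony 3.3 use fetchColumn() and executeUpdate() instead of fetchOne() and executeStatement()):

```php
<?php
// Sketch only: $conn is a Doctrine\DBAL\Connection; names from the question.
$conn->beginTransaction();
try {
    // FOR UPDATE locks the matching row until commit/rollback;
    // other sessions running the same SELECT ... FOR UPDATE block here.
    $row = $conn->fetchOne(
        "SELECT attr1 FROM table1 WHERE attr1 = ? AND id = ? FOR UPDATE",
        ['oldValueAttr1', $selectedIdFromTable1]
    );
    if ($row !== false) {
        $conn->executeStatement(
            "UPDATE table1 SET attr1 = ? WHERE id = ?",
            ['newValueAttr1', $selectedIdFromTable1]
        );
        $conn->executeStatement(
            "INSERT INTO table2 (attr2) VALUES (?)",
            ['newValueAttr2']
        );
    }
    $conn->commit();   // releases the lock
} catch (\Exception $e) {
    $conn->rollBack(); // also releases the lock; nothing is changed
    throw $e;
}
```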
I wrote some code that built up a single query of multiple insert and update statements which was executed at the end of a page load. It used to work okay. I am writing similar, optimised code on my dev system (Ubuntu 14.04, PHP 5.5.3-Ubuntu), but I am no longer able to run multiple statements in one PDO query:
What I do
During a page render, I build up an SQL statement that would look a bit like:
insert into <table> (col1,col2,col3) VALUES (?,?,?);
update <table> set col1 = ?, col4 = ? where id = ?;
insert into <table> (col1,col2,col3) VALUES (?,?,?);
...
When the page has been rendered and I'm sure there are no problems, I execute the query using a wrapper for PDO. The important bits of the wrapper function are
$database = new PDO("mysql:host=<host>;dbname=<dbname>", <user>, <pwd>,
array(PDO::MYSQL_ATTR_INIT_COMMAND => "set names 'utf8'"));
$database->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$stmt = $database->prepare($sql);
$stmt->execute($params);
For some reason I am no longer able to execute this statement in one hit; instead, PDO only performs the first query, even though $stmt->queryString still holds the whole query. Can anyone help me with this problem?
Found the problem:
PDO fails silently if one of the queries in a multi-statement call errors out: the statements are executed one rowset at a time, and an error in a later statement does not surface unless you step through the rowsets. In my case the first query was okay, but the second was throwing an integrity-constraint violation, so it looked like only the first query was being run.
A wise man just told me: don't shoot the messenger, break the queries up.
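"Breaking the queries up" can look like this: collect each statement with its parameters during the page render, then run them one at a time inside a transaction, so an error surfaces immediately on the failing statement and nothing is left half-applied. A sketch with hypothetical names, reusing the `$database` PDO connection from the wrapper above:

```php
<?php
// Sketch only: $database is the PDO connection from the wrapper above;
// $queue holds [sql, params] pairs collected during the page render (hypothetical).
$queue = [
    ["insert into t (col1, col2, col3) VALUES (?,?,?)", [1, 2, 3]],
    ["update t set col1 = ?, col4 = ? where id = ?",    [4, 5, 1]],
];

$database->beginTransaction();
try {
    foreach ($queue as [$sql, $params]) {
        $database->prepare($sql)->execute($params); // throws on the failing statement
    }
    $database->commit();
} catch (PDOException $e) {
    $database->rollBack();
    throw $e; // now you can see exactly which query failed
}
```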
My PHP/Mysql script works in stages:
Insert Customer info
Get new customer's ID via mysqli_insert_id(blah)
Insert into ADDRESS using new customer's ID
Insert into PHONE using new customer's ID
Insert into CONTACTS using new customer's ID
My problem is: if something goes wrong with one of the later INSERT statements, MySQL stops and throws an error of course, but the initial customer INSERT has already fired and succeeded. If we then go back, correct the error in the inputs, and try again, we end up with multiple partial entries for this customer.
So the question: is there a way to test ALL the MySQL inserts for errors and, only upon finding none, go ahead and submit them all?
For Example:
Insert Customer info (check for would-be errors; don't actually insert)
Get new customer's ID...
Insert into ADDRESS...(check for would-be errors; don't actually insert)
Insert into PHONE...(check for would-be errors; don't actually insert)
Insert into CONTACTS...(check for would-be errors; don't actually insert)
OK, now that there were no errors thrown and nothing was actually inserted, do all the pre-screened inserts!
Does this exist?
Use a MySQL transaction.
A transaction is a sequential group of database manipulation operations, which is performed as if it were one single work unit. In other words, a transaction will never be complete unless each individual operation within the group is successful. If any operation within the transaction fails, the entire transaction will fail.
Practically, you group several SQL queries together and execute all of them as part of a single transaction.
Read a tutorial about MySQL transactions for the details.
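At the SQL level a transaction is just the queries wrapped between START TRANSACTION and COMMIT. An illustration with made-up table and column names:

```sql
-- Illustration only; table and column names are made up.
START TRANSACTION;
INSERT INTO customer (name) VALUES ('Jane');
INSERT INTO address (customer_id, street) VALUES (LAST_INSERT_ID(), 'Main St');
COMMIT;    -- both rows become visible together
-- on error: ROLLBACK;  -- neither row is kept
```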
Yes, a transaction allows you to perform a series of queries, and commit all those changes at the end, or roll them back in case one of the queries failed.
If you're using PDO, the basic setup is this:
//needed so that failed queries throw PDOException and reach the catch block
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
try
{
    $pdo->beginTransaction();
    //prepare and execute all your queries here, for example:
    //$stmt = $pdo->prepare("INSERT INTO t (col) VALUES (?)");
    //$stmt->execute([$value]);
    $pdo->commit();//commit changes to DB
}
catch (PDOException $e)
{
    $pdo->rollBack();//Nothing is changed in DB
}
Of course, transactions only work on transaction-safe storage engines (InnoDB, BerkeleyDB, ndbcluster). MyISAM does not support transactions: statements against MyISAM tables take effect immediately and cannot be rolled back, so if you must use MyISAM, the closest you can get is locking and unlocking the tables yourself.
See the mysql docs for more details
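Applied to the customer/address/phone/contacts flow from the question, the transaction could look like the sketch below (PDO; the table and column names are hypothetical, and lastInsertId() plays the role of mysqli_insert_id()):

```php
<?php
// Sketch only: hypothetical schema; $pdo is a PDO connection with
// PDO::ATTR_ERRMODE set to PDO::ERRMODE_EXCEPTION.
try {
    $pdo->beginTransaction();

    $pdo->prepare("INSERT INTO customer (name) VALUES (?)")
        ->execute([$name]);
    $customerId = $pdo->lastInsertId(); // PDO's equivalent of mysqli_insert_id()

    $pdo->prepare("INSERT INTO address (customer_id, street) VALUES (?, ?)")
        ->execute([$customerId, $street]);
    $pdo->prepare("INSERT INTO phone (customer_id, number) VALUES (?, ?)")
        ->execute([$customerId, $phone]);
    $pdo->prepare("INSERT INTO contacts (customer_id, email) VALUES (?, ?)")
        ->execute([$customerId, $email]);

    $pdo->commit();   // only now do any of the rows become permanent
} catch (PDOException $e) {
    $pdo->rollBack(); // the customer row is undone too, so no partial entries
    throw $e;
}
```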
A transaction would be best, since it gives you a rollback state. If that's not possible, check the result of each INSERT statement and it should let you know about any errors.
Here's a sample using mysqli to support my comment:
if (isset($_POST['whatever']))
{
    $con = new mysqli("localhost", "user", "pw", "db");
    // begin transaction by setting autocommit to false
    $con->autocommit(FALSE);
    $stmt = $con->prepare("INSERT INTO `something` (`data`) VALUES (?)");
    $stmt->bind_param("s", $_POST['whatever']);
    if (!$stmt->execute())
    {
        $con->rollback();
    }
    else
    {
        $con->commit();
    }
    $stmt->close();
}
You should know that transactions require InnoDB tables. It helps to check the specs:
http://dev.mysql.com/doc/refman/5.0/en/commit.html
See barranka's comments for more details