I am currently using this type of SQL on MySQL to insert multiple rows of values in one single query:
INSERT INTO `tbl` (`key1`,`key2`) VALUES ('r1v1','r1v2'),('r2v1','r2v2'),...
From my reading on PDO, using prepared statements should give me better security than static queries.
I would therefore like to know whether it is possible to insert multiple rows of values with one query using prepared statements.
If yes, how can I implement it?
Multiple Values Insert with PDO Prepared Statements
Inserting multiple values in one execute statement. Why? Because according to this page it is faster than regular inserts.
$datafields = array('fielda', 'fieldb', ... );

$data[] = array('fielda' => 'value', 'fieldb' => 'value' ....);
$data[] = array('fielda' => 'value', 'fieldb' => 'value' ....);
// ... more data values, or you probably have a loop that populates $data
With prepared inserts you need to know the fields you're inserting into, and the number of fields, to create the ? placeholders to bind your parameters.
insert into table (fielda, fieldb, ... ) values (?,?...), (?,?...)....
That is basically how we want the insert statement to look.
Now, the code:
function placeholders($text, $count = 0, $separator = ",") {
    $result = array();
    if ($count > 0) {
        for ($x = 0; $x < $count; $x++) {
            $result[] = $text;
        }
    }
    return implode($separator, $result);
}
$pdo->beginTransaction(); // also helps speed up your inserts
$insert_values = array();
$question_marks = array();
foreach ($data as $d) {
    $question_marks[] = '(' . placeholders('?', sizeof($d)) . ')';
    $insert_values = array_merge($insert_values, array_values($d));
}

$sql = "INSERT INTO table (" . implode(",", $datafields) . ") VALUES " .
       implode(',', $question_marks);

$stmt = $pdo->prepare($sql);
$stmt->execute($insert_values);
$pdo->commit();
In my test, though, there was only a one-second difference between multi-row inserts and regular prepared inserts with a single value.
Same answer as Mr. Balagtas, slightly clearer...
Recent versions of MySQL and PHP PDO do support multi-row INSERT statements.
SQL Overview
The SQL will look something like this, assuming a 3-column table you'd like to INSERT to.
INSERT INTO tbl_name
(colA, colB, colC)
VALUES (?, ?, ?), (?, ?, ?), (?, ?, ?) [,...]
ON DUPLICATE KEY UPDATE works as expected even with a multi-row INSERT; append this:
ON DUPLICATE KEY UPDATE colA = VALUES(colA), colB = VALUES(colB), colC = VALUES(colC)
PHP Overview
Your PHP code will follow the usual $pdo->prepare($qry) and $stmt->execute($params) PDO calls.
$params will be a 1-dimensional array of all the values to pass to the INSERT.
In the above example, it should contain 9 elements; PDO will use every set of 3 as a single row of values. (Inserting 3 rows of 3 columns each = 9 element array.)
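For illustration, the flattened array for those three rows might look like this (the values are made up):
$params = array(
    'a1', 'b1', 'c1',  // row 1
    'a2', 'b2', 'c2',  // row 2
    'a3', 'b3', 'c3',  // row 3
);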
Implementation
The code below is written for clarity, not efficiency. Work with the PHP array_*() functions for better ways to map or walk through your data if you'd like. Whether you can use transactions obviously depends on your MySQL table type.
Assuming:
$tblName - the string name of the table to INSERT to
$colNames - 1-dimensional array of the column names of the table
These column names must be valid MySQL column identifiers; escape them with backticks (``) if they are not
$dataVals - multi-dimensional array, where each element is a 1-d array of a row of values to INSERT
Sample Code
// setup data values for PDO
// memory warning: this creates a copy of all of $dataVals
$dataToInsert = array();
foreach ($dataVals as $row => $data) {
    foreach ($data as $val) {
        $dataToInsert[] = $val;
    }
}
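// note: as one example of the array_*() route mentioned above, the nested
// loops could collapse to a single line (a sketch; argument unpacking
// needs PHP 5.6+):
// $dataToInsert = array_merge(...array_map('array_values', $dataVals));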
// (optional) setup the ON DUPLICATE column names
$updateCols = array();
foreach ($colNames as $curCol) {
    $updateCols[] = $curCol . " = VALUES($curCol)";
}
$onDup = implode(', ', $updateCols);

// setup the placeholders - a fancy way to make the long "(?, ?, ?)..." string
$rowPlaces = '(' . implode(', ', array_fill(0, count($colNames), '?')) . ')';
$allPlaces = implode(', ', array_fill(0, count($dataVals), $rowPlaces));

$sql = "INSERT INTO $tblName (" . implode(', ', $colNames) .
       ") VALUES " . $allPlaces . " ON DUPLICATE KEY UPDATE $onDup";
// and then the usual PHP PDO boilerplate
$pdo->beginTransaction(); // only if your table type supports transactions
$stmt = $pdo->prepare($sql);
$stmt->execute($dataToInsert);
$pdo->commit();
For what it is worth, I have seen a lot of users recommend iterating through INSERT statements instead of building out a single-string query as the selected answer did. I decided to run a simple test with just two fields and a very basic insert statement:
<?php
require('conn.php');
$fname = 'J';
$lname = 'M';
$time_start = microtime(true);
$stmt = $db->prepare('INSERT INTO table (FirstName, LastName) VALUES (:fname, :lname)');
for ($i = 1; $i <= 10; $i++) {
    $stmt->bindParam(':fname', $fname);
    $stmt->bindParam(':lname', $lname);
    $stmt->execute();

    $fname .= 'O';
    $lname .= 'A';
}
$time_end = microtime(true);
$time = $time_end - $time_start;
echo "Completed in ". $time ." seconds <hr>";
$fname2 = 'J';
$lname2 = 'M';
$time_start2 = microtime(true);
$qry = 'INSERT INTO table (FirstName, LastName) VALUES ';
$qry .= "(?,?), ";
$qry .= "(?,?), ";
$qry .= "(?,?), ";
$qry .= "(?,?), ";
$qry .= "(?,?), ";
$qry .= "(?,?), ";
$qry .= "(?,?), ";
$qry .= "(?,?), ";
$qry .= "(?,?), ";
$qry .= "(?,?)";
$stmt2 = $db->prepare($qry);
$values = array();
for ($j = 1; $j <= 10; $j++) {
    $values2 = array($fname2, $lname2);
    $values = array_merge($values, $values2);

    $fname2 .= 'O';
    $lname2 .= 'A';
}
$stmt2->execute($values);
$time_end2 = microtime(true);
$time2 = $time_end2 - $time_start2;
echo "Completed in ". $time2 ." seconds <hr>";
?>
While the overall query itself took milliseconds or less, the latter (single-string) query was consistently 8 times faster or more. If this were built out to, say, reflect an import of thousands of rows on many more columns, the difference could be enormous.
The accepted answer by Herbert Balagtas works well when the $data array is small. With larger $data arrays the array_merge function becomes prohibitively slow. My test file to create the $data array has 28 columns and is about 80,000 lines. The final script took 41s to complete.
Using array_push() to create $insert_values instead of array_merge() resulted in a 100x speed-up, with an execution time of 0.41s.
The problematic array_merge():
$insert_values = array();
foreach ($data as $d) {
    $question_marks[] = '(' . placeholders('?', sizeof($d)) . ')';
    $insert_values = array_merge($insert_values, array_values($d));
}
To eliminate the need for array_merge(), you can build the following two arrays instead:
//Note that these fields are empty, but the field count should match the fields in $datafields.
$data[] = array('','','','',... n );
//getting rid of array_merge()
array_push($insert_values, $value1, $value2, $value3 ... n );
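A concrete, non-elided version of that per-row push (a sketch, assuming $data holds the row arrays as above):
$insert_values = array();
foreach ($data as $d) {
    foreach (array_values($d) as $value) {
        $insert_values[] = $value; // append in place instead of re-copying via array_merge()
    }
}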
These arrays can then be used as follows:
function placeholders($text, $count = 0, $separator = ",") {
    $result = array();
    if ($count > 0) {
        for ($x = 0; $x < $count; $x++) {
            $result[] = $text;
        }
    }
    return implode($separator, $result);
}

$pdo->beginTransaction();

foreach ($data as $d) {
    $question_marks[] = '(' . placeholders('?', sizeof($d)) . ')';
}

$sql = "INSERT INTO table (" . implode(",", $datafields) . ") VALUES " . implode(',', $question_marks);
$stmt = $pdo->prepare($sql);
$stmt->execute($insert_values);
$pdo->commit();
Two possible approaches:
$stmt = $pdo->prepare('INSERT INTO foo VALUES (:v1_1, :v1_2, :v1_3),
                                              (:v2_1, :v2_2, :v2_3),
                                              (:v3_1, :v3_2, :v3_3)');
$stmt->bindValue(':v1_1', $data[0][0]);
$stmt->bindValue(':v1_2', $data[0][1]);
$stmt->bindValue(':v1_3', $data[0][2]);
// etc...
$stmt->execute();
Or:
$stmt = $pdo->prepare('INSERT INTO foo VALUES (:a, :b, :c)');
foreach ($data as $item) {
    $stmt->bindValue(':a', $item[0]);
    $stmt->bindValue(':b', $item[1]);
    $stmt->bindValue(':c', $item[2]);
    $stmt->execute();
}
If the data for all the rows are in a single array, I would use the second solution.
That's simply not the way you use prepared statements.
It is perfectly okay to insert one row per query, because you can execute one prepared statement multiple times with different parameters. In fact that is one of its greatest advantages, as it allows you to insert a great number of rows in an efficient, secure and comfortable manner.
So it may be possible to implement the scheme you are proposing, at least for a fixed number of rows, but it is almost guaranteed that this is not really what you want.
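A minimal sketch of that approach, assuming $rows is an array of (key1, key2) pairs:
$pdo->beginTransaction(); // one transaction around the loop keeps it fast
$stmt = $pdo->prepare('INSERT INTO `tbl` (`key1`, `key2`) VALUES (?, ?)');
foreach ($rows as $row) {
    $stmt->execute($row);
}
$pdo->commit();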
A shorter answer: flatten the array of data, ordered by columns, then
//$array = array('1','2','3','4','5', '1','2','3','4','5'); // two 5-column rows, flattened
$arCount = count($array);
$rowCount = $arCount / 5; // 5 values per row
$rCount = ($rowCount ? $rowCount - 1 : 0);
$criteria = sprintf("(?,?,?,?,?)%s", str_repeat(",(?,?,?,?,?)", $rCount));
$sql = "INSERT INTO table(c1,c2,c3,c4,c5) VALUES$criteria";
When inserting 1,000 or so records, you don't want to have to loop through every record to insert it when all you need is a count of the values.
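To round this off, a sketch of preparing and executing with the flattened array (assuming a PDO handle $pdo):
$stmt = $pdo->prepare($sql);
$stmt->execute($array); // the flattened values map onto the placeholders in order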
Here is my simple approach.
$values = array();
foreach ($workouts_id as $value) {
    // note: values are interpolated directly, not bound; this is only safe
    // if $value and $plan_id are trusted integers
    $_value = "(" . $value . "," . $plan_id . ")";
    array_push($values, $_value);
}
$values_ = implode(",", $values);

$sql = "INSERT INTO plan_days(id,name) VALUES" . $values_;
$stmt = $this->conn->prepare($sql);
$stmt->execute();
Here's a class I wrote to do multiple inserts with a purge option:
<?php

/**
 * $pdo->beginTransaction();
 * $pmi = new PDOMultiLineInserter($pdo, "foo", array("a","b","c","e"), 10);
 * $pmi->insertRow($data);
 * // ...
 * $pmi->insertRow($data);
 * $pmi->purgeRemainingInserts();
 * $pdo->commit();
 */
class PDOMultiLineInserter {
    private $_purgeAtCount;
    private $_bigInsertQuery, $_singleInsertQuery;
    private $_currentlyInsertingRows = array();
    private $_currentlyInsertingCount = 0;
    private $_numberOfFields;
    private $_error;
    private $_insertCount = 0;

    function __construct(\PDO $pdo, $tableName, $fieldsAsArray, $bigInsertCount = 100) {
        $this->_numberOfFields = count($fieldsAsArray);
        $insertIntoPortion = "INSERT INTO `$tableName` (`" . implode("`,`", $fieldsAsArray) . "`) VALUES";
        $questionMarks = " (?" . str_repeat(",?", $this->_numberOfFields - 1) . ")";

        $this->_purgeAtCount = $bigInsertCount;
        $this->_bigInsertQuery = $pdo->prepare($insertIntoPortion . $questionMarks . str_repeat(", " . $questionMarks, $bigInsertCount - 1));
        $this->_singleInsertQuery = $pdo->prepare($insertIntoPortion . $questionMarks);
    }

    function insertRow($rowData) {
        // #todo Compare speed
        // $this->_currentlyInsertingRows = array_merge($this->_currentlyInsertingRows, $rowData);
        foreach ($rowData as $v) array_push($this->_currentlyInsertingRows, $v);

        if (++$this->_currentlyInsertingCount == $this->_purgeAtCount) {
            if ($this->_bigInsertQuery->execute($this->_currentlyInsertingRows) === FALSE) {
                $this->_error = "Failed to perform a multi-insert (after {$this->_insertCount} inserts), the following errors occurred:" . implode('<br/>', $this->_bigInsertQuery->errorInfo());
                return false;
            }
            $this->_insertCount++;
            $this->_currentlyInsertingCount = 0;
            $this->_currentlyInsertingRows = array();
        }
        return true;
    }

    function purgeRemainingInserts() {
        while ($this->_currentlyInsertingCount > 0) {
            $singleInsertData = array();
            // #todo Compare speed - http://www.evardsson.com/blog/2010/02/05/comparing-php-array_shift-to-array_pop/
            // for ($i = 0; $i < $this->_numberOfFields; $i++) $singleInsertData[] = array_pop($this->_currentlyInsertingRows); array_reverse($singleInsertData);
            for ($i = 0; $i < $this->_numberOfFields; $i++) array_unshift($singleInsertData, array_pop($this->_currentlyInsertingRows));

            if ($this->_singleInsertQuery->execute($singleInsertData) === FALSE) {
                $this->_error = "Failed to perform a small-insert (whilst purging the remaining rows); the following errors occurred:" . implode('<br/>', $this->_singleInsertQuery->errorInfo());
                return false;
            }
            $this->_currentlyInsertingCount--;
        }
    }

    public function getError() {
        return $this->_error;
    }
}
Based on my experiments, I found that a MySQL INSERT statement with multiple value rows in a single transaction is the fastest one.
However, if there is too much data, then MySQL's max_allowed_packet setting might restrict the single-transaction insert with multiple value rows. Hence, the following functions will fail when the data is greater than MySQL's max_allowed_packet size:
singleTransactionInsertWithRollback
singleTransactionInsertWithPlaceholders
singleTransactionInsert
The most successful one in the huge-data scenario is the transactionSpeed method, but it takes more time than the methods mentioned above. So, to handle this problem, you can either split your data into smaller chunks and call the single-transaction insert multiple times, or give up execution speed by using the transactionSpeed method.
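As a sketch of the chunking option (assuming a PDO handle $pdo and two-column rows in $rows; the chunk size of 1000 is arbitrary):
foreach (array_chunk($rows, 1000) as $chunk) {
    $placeholders = implode(', ', array_fill(0, count($chunk), '(?, ?)'));
    $params = array();
    foreach ($chunk as $row) {
        $params[] = $row[0];
        $params[] = $row[1];
    }
    $stmt = $pdo->prepare('INSERT INTO `name` (`first_name`, `last_name`) VALUES ' . $placeholders);
    $stmt->execute($params);
}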
Here's my research
<?php

class SpeedTestClass
{
    private $data;

    private $pdo;

    public function __construct()
    {
        $this->data = [];
        $this->pdo = new \PDO('mysql:dbname=test_data', 'admin', 'admin');
        if (!$this->pdo) {
            die('Failed to connect to database');
        }
    }

    public function createData()
    {
        $prefix = 'test';
        $postfix = 'unicourt.com';
        $salutations = ['Mr.', 'Ms.', 'Dr.', 'Mrs.'];
        $csv[] = ['Salutation', 'First Name', 'Last Name', 'Email Address'];
        for ($i = 0; $i < 100000; ++$i) {
            $csv[] = [
                $salutations[$i % \count($salutations)],
                $prefix.$i,
                $prefix.$i,
                $prefix.$i.'#'.$postfix,
            ];
        }
        $this->data = $csv;
    }

    public function truncateTable()
    {
        $this->pdo->query('TRUNCATE TABLE `name`');
    }

    public function transactionSpeed()
    {
        $timer1 = microtime(true);
        $this->pdo->beginTransaction();
        $sql = 'INSERT INTO `name` (`first_name`, `last_name`) VALUES (:first_name, :last_name)';
        $sth = $this->pdo->prepare($sql);
        foreach (\array_slice($this->data, 1) as $values) {
            $sth->execute([
                ':first_name' => $values[1],
                ':last_name' => $values[2],
            ]);
        }
        // $timer2 = microtime(true);
        // echo 'Prepare Time: '.($timer2 - $timer1).PHP_EOL;
        // $timer3 = microtime(true);
        if (!$this->pdo->commit()) {
            echo "Commit failed\n";
        }
        $timer4 = microtime(true);
        // echo 'Commit Time: '.($timer4 - $timer3).PHP_EOL;

        return $timer4 - $timer1;
    }

    public function autoCommitSpeed()
    {
        $timer1 = microtime(true);
        $sql = 'INSERT INTO `name` (`first_name`, `last_name`) VALUES (:first_name, :last_name)';
        $sth = $this->pdo->prepare($sql);
        foreach (\array_slice($this->data, 1) as $values) {
            $sth->execute([
                ':first_name' => $values[1],
                ':last_name' => $values[2],
            ]);
        }
        $timer2 = microtime(true);

        return $timer2 - $timer1;
    }

    public function noBindAutoCommitSpeed()
    {
        $timer1 = microtime(true);
        foreach (\array_slice($this->data, 1) as $values) {
            $sth = $this->pdo->prepare("INSERT INTO `name` (`first_name`, `last_name`) VALUES ('{$values[1]}', '{$values[2]}')");
            $sth->execute();
        }
        $timer2 = microtime(true);

        return $timer2 - $timer1;
    }

    public function singleTransactionInsert()
    {
        $timer1 = microtime(true);
        foreach (\array_slice($this->data, 1) as $values) {
            $arr[] = "('{$values[1]}', '{$values[2]}')";
        }
        $sth = $this->pdo->prepare('INSERT INTO `name` (`first_name`, `last_name`) VALUES '.implode(', ', $arr));
        $sth->execute();
        $timer2 = microtime(true);

        return $timer2 - $timer1;
    }

    public function singleTransactionInsertWithPlaceholders()
    {
        $placeholders = [];
        $timer1 = microtime(true);
        $sql = 'INSERT INTO `name` (`first_name`, `last_name`) VALUES ';
        foreach (\array_slice($this->data, 1) as $values) {
            $placeholders[] = '(?, ?)';
            $arr[] = $values[1];
            $arr[] = $values[2];
        }
        $sql .= implode(', ', $placeholders);
        $sth = $this->pdo->prepare($sql);
        $sth->execute($arr);
        $timer2 = microtime(true);

        return $timer2 - $timer1;
    }

    public function singleTransactionInsertWithRollback()
    {
        $placeholders = [];
        $timer1 = microtime(true);
        $sql = 'INSERT INTO `name` (`first_name`, `last_name`) VALUES ';
        foreach (\array_slice($this->data, 1) as $values) {
            $placeholders[] = '(?, ?)';
            $arr[] = $values[1];
            $arr[] = $values[2];
        }
        $sql .= implode(', ', $placeholders);
        $this->pdo->beginTransaction();
        $sth = $this->pdo->prepare($sql);
        $sth->execute($arr);
        $this->pdo->commit();
        $timer2 = microtime(true);

        return $timer2 - $timer1;
    }
}
$s = new SpeedTestClass();
$s->createData();
$s->truncateTable();
echo "Time Spent for singleTransactionInsertWithRollback: {$s->singleTransactionInsertWithRollback()}".PHP_EOL;
$s->truncateTable();
echo "Time Spent for single Transaction Insert: {$s->singleTransactionInsert()}".PHP_EOL;
$s->truncateTable();
echo "Time Spent for single Transaction Insert With Placeholders: {$s->singleTransactionInsertWithPlaceholders()}".PHP_EOL;
$s->truncateTable();
echo "Time Spent for transaction: {$s->transactionSpeed()}".PHP_EOL;
$s->truncateTable();
echo "Time Spent for AutoCommit: {$s->noBindAutoCommitSpeed()}".PHP_EOL;
$s->truncateTable();
echo "Time Spent for autocommit with bind: {$s->autoCommitSpeed()}".PHP_EOL;
$s->truncateTable();
The results for 100,000 entries for a table containing only two columns are as below:
$ php data.php
Time Spent for singleTransactionInsertWithRollback: 0.75147604942322
Time Spent for single Transaction Insert: 0.67445182800293
Time Spent for single Transaction Insert With Placeholders: 0.71131205558777
Time Spent for transaction: 8.0056409835815
Time Spent for AutoCommit: 35.4979159832
Time Spent for autocommit with bind: 33.303519010544
Here is another (slim) solution for this issue:
At first you need to count the data of the source array (here: $aData) with count(). Then you use array_fill() and generate a new array with as many entries as the source array has, each with the value "(?,?)" (the number of placeholders depends on the fields you use; here: 2). Then the generated array is imploded, with a comma used as glue.
Within the foreach loop, you need to generate another index based on the number of placeholders you use (number of placeholders * current array index + 1). You have to add 1 to the generated index after each bound value.
$do = $db->prepare("INSERT INTO table (id, name) VALUES " . implode(',', array_fill(0, count($aData), '(?,?)')));

foreach ($aData as $iIndex => $aValues) {
    $iRealIndex = 2 * $iIndex + 1;
    $do->bindValue($iRealIndex, $aValues['id'], PDO::PARAM_INT);

    $iRealIndex = $iRealIndex + 1;
    $do->bindValue($iRealIndex, $aValues['name'], PDO::PARAM_STR);
}

$do->execute();
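For reference, $aData is assumed to be shaped like this:
$aData = array(
    array('id' => 1, 'name' => 'Alice'),
    array('id' => 2, 'name' => 'Bob'),
);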
This is how I did it:
First define the column names you'll use, or leave it blank and PDO will assume you want to use all the columns on the table - in which case you'll need to list the row values in the exact order they appear on the table.
$cols = '(name, middleName, eMail)';
$table = 'people';
Now, suppose you have a two-dimensional array already prepared. Iterate it, and build a string with your row values, as such:
$rowVals = '';
foreach ($people as $person) {
    // after the first value group, separate each one with a comma
    $sep = $rowVals ? ',' : '';
    $rowVals .= $sep . "('{$person['name']}','{$person['middleName']}','{$person['eMail']}')";
}
Now, what you just did was check whether $rowVals was already started, and if not, start it, then append each group of row values with the necessary SQL syntax so it will be a valid statement. Note that strings should go inside double quotes and single quotes, so they will be promptly recognized as such. Be aware that this interpolates the values into the SQL rather than binding them, so it is only safe for trusted data.
All that's left to do is prepare the statement and execute it, as such:
$stmt = $db->prepare("INSERT INTO $table $cols VALUES $rowVals");
$stmt->execute();
Tested with up to 2000 rows so far, and the execution time is negligible. Will run some more tests and will get back here in case I have something further to contribute.
Regards.
Since it has not been suggested yet, I'm pretty sure LOAD DATA INFILE is still the fastest way to load data, as it disables indexing, inserts all the data, and then re-enables the indexes - all in a single request.
Saving the data as a CSV should be fairly trivial, keeping in mind fputcsv. MyISAM is fastest, but you still get good performance with InnoDB. There are other disadvantages, though, so I would go this route if you are inserting a lot of data, and not bother with it for under 100 rows.
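A hedged sketch of that route, assuming $rows holds (key1, key2) pairs and that local infile is enabled on both the client (PDO::MYSQL_ATTR_LOCAL_INFILE => true) and the server:
// write the rows to a temporary CSV, then bulk-load it in one statement
$file = tempnam(sys_get_temp_dir(), 'bulk');
$fh = fopen($file, 'w');
foreach ($rows as $row) {
    fputcsv($fh, $row);
}
fclose($fh);
$pdo->exec("LOAD DATA LOCAL INFILE " . $pdo->quote($file) . "
    INTO TABLE `tbl`
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
    LINES TERMINATED BY '\\n'
    (`key1`, `key2`)");
unlink($file);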
Although it's an old question, all the contributions helped me a lot, so here's my solution, which works within my own DbContext class. The $rows parameter is simply an array of associative arrays representing rows or models: field name => insert value.
If you use a pattern that uses models, this fits in nicely when passed model data as an array, say from a ToRowArray method within the model class.
Note: it should go without saying, but never allow the arguments passed to this method to be exposed to the user or reliant on any user input, other than the insert values, which have been validated and sanitised. The $tableName argument and the column names should be defined by the calling logic; for instance, a User model could be mapped to the user table, which has its column list mapped to the model's member fields.
public function InsertRange($tableName, $rows)
{
    // Get column list
    $columnList = array_keys($rows[0]);
    $numColumns = count($columnList);
    $columnListString = implode(",", $columnList);

    // Generate pdo param placeholders
    $placeHolders = array();
    foreach ($rows as $row) {
        $temp = array();
        for ($i = 0; $i < count($row); $i++)
            $temp[] = "?";
        $placeHolders[] = "(" . implode(",", $temp) . ")";
    }
    $placeHolders = implode(",", $placeHolders);

    // Construct the query
    $sql = "insert into $tableName ($columnListString) values $placeHolders";
    $stmt = $this->pdo->prepare($sql);

    $j = 1;
    foreach ($rows as $row) {
        for ($i = 0; $i < $numColumns; $i++) {
            $stmt->bindParam($j, $row[$columnList[$i]]);
            $j++;
        }
    }
    $stmt->execute();
}
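A hypothetical call, using the question's table and columns (assuming $db is an instance of the DbContext class containing InsertRange):
$db->InsertRange('tbl', array(
    array('key1' => 'r1v1', 'key2' => 'r1v2'),
    array('key1' => 'r2v1', 'key2' => 'r2v2'),
));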
You can insert multiple rows in a single query with this function:
function insertMultiple($query, $rows, $PDO) { // $PDO is the PDO connection handle
    if (count($rows) > 0) {
        $args = array_fill(0, count($rows[0]), '?');
        $params = array();
        $values = array();
        foreach ($rows as $row) {
            $values[] = "(" . implode(',', $args) . ")";
            foreach ($row as $value) {
                $params[] = $value;
            }
        }
        $query = $query . " VALUES " . implode(',', $values);
        $stmt = $PDO->prepare($query);
        $stmt->execute($params);
    }
}
$rows is an array of arrays of values.
In your case you would call the function with
insertMultiple("INSERT INTO tbl (`key1`,`key2`)", array(array('r1v1','r1v2'), array('r2v1','r2v2')), $pdo);
This has the benefit that you use prepared statements, while inserting multiple rows with a single query. Security!
Here is my solution: https://github.com/sasha-ch/Aura.Sql
based on the auraphp/Aura.Sql library.
Usage example:
$q = "insert into t2(id,name) values (?,?), ... on duplicate key update name=name";
$bind_values = [ [[1,'str1'],[2,'str2']] ];
$pdo->perform($q, $bind_values);
Bug reports are welcome.
My real-world example to insert all German postcodes into an empty table (to add town names later):
// obtain column template
$stmt = $db->prepare('SHOW COLUMNS FROM towns');
$stmt->execute();
$columns = array_fill_keys(array_values($stmt->fetchAll(PDO::FETCH_COLUMN)), null);
// multiple INSERT
$postcode = '01000';// smallest german postcode
while ($postcode <= 99999) {// highest german postcode
    $values = array();

    while ($postcode <= 99999) {
        // reset row
        $row = $columns;
        // now fill our row with data
        $row['postcode'] = sprintf('%05d', $postcode);
        // build INSERT array
        foreach ($row as $value) {
            $values[] = $value;
        }
        $postcode++;
        // avoid memory kill
        if (!($postcode % 10000)) {
            break;
        }
    }

    // build query
    $count_columns = count($columns);
    $placeholder = ',(' . substr(str_repeat(',?', $count_columns), 1) . ')';//,(?,?,?)
    $placeholder_group = substr(str_repeat($placeholder, count($values) / $count_columns), 1);//(?,?,?),(?,?,?)...
    $into_columns = implode(',', array_keys($columns));//col1,col2,col3

    // this part is optional:
    $on_duplicate = array();
    foreach ($columns as $column => $row) {
        $on_duplicate[] = $column;
        $on_duplicate[] = $column;
    }
    $on_duplicate = ' ON DUPLICATE KEY UPDATE' . vsprintf(substr(str_repeat(', %s = VALUES(%s)', $count_columns), 1), $on_duplicate);

    // execute query
    $stmt = $db->prepare('INSERT INTO towns (' . $into_columns . ') VALUES' . $placeholder_group . $on_duplicate);//INSERT INTO towns (col1,col2,col3) VALUES(?,?,?),(?,?,?)... {ON DUPLICATE...}
    $stmt->execute($values);
}
As you can see, it's fully flexible. You don't need to check the number of columns, or check which position your column is in. You only need to set the insert data:
$row['postcode'] = sprintf('%05d', $postcode);
I'm proud of some of the query-string constructors, as they work without heavy array functions like array_merge. Especially vsprintf() was a good find.
Finally I needed to add the 2x while() to avoid exceeding the memory limit. This depends on your memory limit, but it's a good general solution to avoid problems (and having 10 queries is still much better than 10,000).
This worked for me:
$sql = 'INSERT INTO table(pk_pk1,pk_pk2,date,pk_3) VALUES ';
$qPart = array_fill(0, count($array), "(?, ?,UTC_TIMESTAMP(),?)");
$sql .= implode(",", $qPart);
$stmt = DB::prepare('base', $sql); // framework-specific prepare wrapper
$i = 1;
foreach ($array as $value) {
    // three binds per row, matching the three ? placeholders
    $stmt->bindValue($i++, $pk_pk1);
    $stmt->bindValue($i++, $pk_pk2);
    $stmt->bindValue($i++, $value);
}
$stmt->execute();
Most of the solutions given here to create the prepared query are more complex than they need to be. Using PHP's built-in functions you can easily create the SQL statement without significant overhead.
Given $records, an array of records where each record is itself an associative array (in the form field => value), the following function will insert the records into the given table $table, on a PDO connection $connection, using only a single prepared statement. Note that this is a PHP 5.6+ solution because of the use of argument unpacking in the call to array_push:
private function import(PDO $connection, $table, array $records)
{
    $fields = array_keys($records[0]);
    $placeHolders = substr(str_repeat(',?', count($fields)), 1);

    $values = [];
    foreach ($records as $record) {
        array_push($values, ...array_values($record));
    }

    $query = 'INSERT INTO ' . $table . ' (';
    $query .= implode(',', $fields);
    $query .= ') VALUES (';
    $query .= implode('),(', array_fill(0, count($records), $placeHolders));
    $query .= ')';

    $statement = $connection->prepare($query);
    $statement->execute($values);
}
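Hypothetical usage from inside the same class (the table name and data are illustrative):
$records = [
    ['key1' => 'r1v1', 'key2' => 'r1v2'],
    ['key1' => 'r2v1', 'key2' => 'r2v2'],
];
$this->import($connection, 'tbl', $records);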
I had the same problem, and this is how I accomplished it for myself; I made a function for it (and you can use it if that helps you).
Example:
INSERT INTO countries (country, city) VALUES ('Germany', 'Berlin'), ('France', 'Paris');
$arr1 = Array("Germany", "Berlin");
$arr2 = Array("France", "Paris");
insertMultipleData("countries", Array($arr1, $arr2));
// Inserting multiple data into the database.
public function insertMultipleData($table, $multi_params) {
    try {
        $db = $this->connect();

        $beforeParams = "";
        $paramsStr = "";
        $valuesStr = "";

        for ($i = 0; $i < count($multi_params); $i++) {
            foreach ($multi_params[$i] as $j => $value) {
                if ($i == 0) {
                    $beforeParams .= " " . $j . ",";
                }
                $paramsStr .= " :" . $j . "_" . $i . ",";
            }
            $paramsStr = substr_replace($paramsStr, "", -1);
            $valuesStr .= "(" . $paramsStr . "),";
            $paramsStr = "";
        }

        $beforeParams = substr_replace($beforeParams, "", -1);
        $valuesStr = substr_replace($valuesStr, "", -1);

        $sql = "INSERT INTO " . $table . " (" . $beforeParams . ") VALUES " . $valuesStr . ";";
        $stmt = $db->prepare($sql);

        for ($i = 0; $i < count($multi_params); $i++) {
            foreach ($multi_params[$i] as $j => &$value) {
                $stmt->bindParam(":" . $j . "_" . $i, $value);
            }
        }

        // execute before closing the connection
        $stmt->execute();
        $this->close($db);

        return true;
    } catch (PDOException $e) {
        return false;
    }
}

// Making a connection to the database
public function connect() {
    $host = Constants::DB_HOST;
    $dbname = Constants::DB_NAME;
    $user = Constants::DB_USER;
    $pass = Constants::DB_PASS;

    $mysql_connect_str = 'mysql:host=' . $host . ';dbname=' . $dbname;
    $dbConnection = new PDO($mysql_connect_str, $user, $pass);
    $dbConnection->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

    return $dbConnection;
}

// Closing the connection
public function close($db) {
    $db = null;
}
If insertMultipleData($table, $multi_params) returns TRUE, your data has been inserted into your database.
what about something like this:
if (count($types_of_values) > 0) {
    $uid = 1;
    $x = 0;
    $sql = "";
    $values = array();
    foreach ($types_of_values as $k => $v) {
        $sql .= "(:id_$k,:kind_of_val_$k), ";
        $values[":id_$k"] = $uid;
        $values[":kind_of_val_$k"] = $v;
    }
    $sql = substr($sql, 0, -2);
    $query = "INSERT INTO table (id,value_type) VALUES $sql";
    $res = $this->db->prepare($query);
    $res->execute($values);
}
The idea behind this is to cycle through your array values, adding "id numbers" to each loop for your prepared statement placeholders while, at the same time, adding the values to your array for the binding parameters. If you don't like using the "key" index from the array, you could add $i = 0 and $i++ inside the loop. Either works in this example, and even if you have associative arrays with named keys, it would still work, providing the keys are unique. With a little work it would be fine for nested arrays too.
Note that substr() strips the $sql variable's last space and comma; if you don't have a space, you'd need to change this to -1 rather than -2.
Related
I am currently using this type of SQL on MySQL to insert multiple rows of values in one single query:
INSERT INTO `tbl` (`key1`,`key2`) VALUES ('r1v1','r1v2'),('r2v1','r2v2'),...
On the readings on PDO, the use prepared statements should give me a better security than static queries.
I would therefore like to know whether it is possible to generate "inserting multiple rows of values by the use of one query" using prepared statements.
If yes, may I know how can I implement it?
Multiple Values Insert with PDO Prepared Statements
Inserting multiple values in one execute statement. Why because according to this page it is faster than regular inserts.
$datafields = array('fielda', 'fieldb', ... );
$data[] = array('fielda' => 'value', 'fieldb' => 'value' ....);
$data[] = array('fielda' => 'value', 'fieldb' => 'value' ....);
more data values or you probably have a loop that populates data.
With prepared inserts you need to know the fields you're inserting to, and the number of fields to create the ? placeholders to bind your parameters.
insert into table (fielda, fieldb, ... ) values (?,?...), (?,?...)....
That is basically how we want the insert statement to look like.
Now, the code:
function placeholders($text, $count=0, $separator=","){
$result = array();
if($count > 0){
for($x=0; $x<$count; $x++){
$result[] = $text;
}
}
return implode($separator, $result);
}
$pdo->beginTransaction(); // also helps speed up your inserts.
$insert_values = array();
foreach($data as $d){
$question_marks[] = '(' . placeholders('?', sizeof($d)) . ')';
$insert_values = array_merge($insert_values, array_values($d));
}
$sql = "INSERT INTO table (" . implode(",", $datafields ) . ") VALUES " .
implode(',', $question_marks);
$stmt = $pdo->prepare ($sql);
$stmt->execute($insert_values);
$pdo->commit();
Although in my test, there was only a 1 sec difference when using multiple inserts and regular prepared inserts with single value.
Same answer as Mr. Balagtas, slightly clearer...
Recent versions MySQL and PHP PDO do support multi-row INSERT statements.
SQL Overview
The SQL will look something like this, assuming a 3-column table you'd like to INSERT to.
INSERT INTO tbl_name
(colA, colB, colC)
VALUES (?, ?, ?), (?, ?, ?), (?, ?, ?) [,...]
ON DUPLICATE KEY UPDATE works as expected even with a multi-row INSERT; append this:
ON DUPLICATE KEY UPDATE colA = VALUES(colA), colB = VALUES(colB), colC = VALUES(colC)
PHP Overview
Your PHP code will follow the usual $pdo->prepare($qry) and $stmt->execute($params) PDO calls.
$params will be a 1-dimensional array of all the values to pass to the INSERT.
In the above example, it should contain 9 elements; PDO will use every set of 3 as a single row of values. (Inserting 3 rows of 3 columns each = 9 element array.)
Implementation
Below code is written for clarity, not efficiency. Work with the PHP array_*() functions for better ways to map or walk through your data if you'd like. Whether you can use transactions obviously depends on your MySQL table type.
Assuming:
$tblName - the string name of the table to INSERT to
$colNames - 1-dimensional array of the column names of the table
These column names must be valid MySQL column identifiers; escape them with backticks (``) if they are not
$dataVals - mutli-dimensional array, where each element is a 1-d array of a row of values to INSERT
Sample Code
// setup data values for PDO
// memory warning: this is creating a copy all of $dataVals
$dataToInsert = array();
foreach ($dataVals as $row => $data) {
foreach($data as $val) {
$dataToInsert[] = $val;
}
}
// (optional) setup the ON DUPLICATE column names
$updateCols = array();
foreach ($colNames as $curCol) {
$updateCols[] = $curCol . " = VALUES($curCol)";
}
$onDup = implode(', ', $updateCols);
// setup the placeholders - a fancy way to make the long "(?, ?, ?)..." string
$rowPlaces = '(' . implode(', ', array_fill(0, count($colNames), '?')) . ')';
$allPlaces = implode(', ', array_fill(0, count($dataVals), $rowPlaces));
$sql = "INSERT INTO $tblName (" . implode(', ', $colNames) .
") VALUES " . $allPlaces . " ON DUPLICATE KEY UPDATE $onDup";
// and then the PHP PDO boilerplate
$stmt = $pdo->prepare ($sql);
$stmt->execute($dataToInsert);
$pdo->commit();
For what it is worth, I have seen a lot of users recommend iterating through INSERT statements instead of building out as a single string query as the selected answer did. I decided to run a simple test with just two fields and a very basic insert statement:
<?php
require('conn.php');
$fname = 'J';
$lname = 'M';
$time_start = microtime(true);
$stmt = $db->prepare('INSERT INTO table (FirstName, LastName) VALUES (:fname, :lname)');
for($i = 1; $i <= 10; $i++ ) {
$stmt->bindParam(':fname', $fname);
$stmt->bindParam(':lname', $lname);
$stmt->execute();
$fname .= 'O';
$lname .= 'A';
}
$time_end = microtime(true);
$time = $time_end - $time_start;
echo "Completed in ". $time ." seconds <hr>";
$fname2 = 'J';
$lname2 = 'M';
$time_start2 = microtime(true);
$qry = 'INSERT INTO table (FirstName, LastName) VALUES ';
$qry .= "(?,?), ";
$qry .= "(?,?), ";
$qry .= "(?,?), ";
$qry .= "(?,?), ";
$qry .= "(?,?), ";
$qry .= "(?,?), ";
$qry .= "(?,?), ";
$qry .= "(?,?), ";
$qry .= "(?,?), ";
$qry .= "(?,?)";
$stmt2 = $db->prepare($qry);
$values = array();
for($j = 1; $j<=10; $j++) {
$values2 = array($fname2, $lname2);
$values = array_merge($values,$values2);
$fname2 .= 'O';
$lname2 .= 'A';
}
$stmt2->execute($values);
$time_end2 = microtime(true);
$time2 = $time_end2 - $time_start2;
echo "Completed in ". $time2 ." seconds <hr>";
?>
While the overall query itself took milliseconds or less, the latter (single string) query was consistently 8 times faster or more. If this was built out to say reflect an import of thousands of rows on many more columns, the difference could be enormous.
The Accepted Answer by Herbert Balagtas works well when the $data array is small. With larger $data arrays the array_merge function becomes prohibitively slow. My test file to create the $data array has 28 cols and is about 80,000 lines. The final script took 41s to complete.
Using array_push() to create $insert_values instead of array_merge() resulted in a 100X speed up with execution time of 0.41s.
The problematic array_merge():
$insert_values = array();
foreach($data as $d){
$question_marks[] = '(' . placeholders('?', sizeof($d)) . ')';
$insert_values = array_merge($insert_values, array_values($d));
}
To eliminate the need for array_merge(), you can build the following two arrays instead:
//Note that these fields are empty, but the field count should match the fields in $datafields.
$data[] = array('','','','',... n );
//getting rid of array_merge()
array_push($insert_values, $value1, $value2, $value3 ... n );
These arrays can then be used as follows:
function placeholders($text, $count=0, $separator=","){
$result = array();
if($count > 0){
for($x=0; $x<$count; $x++){
$result[] = $text;
}
}
return implode($separator, $result);
}
$pdo->beginTransaction();
foreach($data as $d){
$question_marks[] = '(' . placeholders('?', sizeof($d)) . ')';
}
$sql = "INSERT INTO table (" . implode(",", array_keys($datafield) ) . ") VALUES " . implode(',', $question_marks);
$stmt = $pdo->prepare($sql);
$stmt->execute($insert_values);
$pdo->commit();
Two possible approaches:
$stmt = $pdo->prepare('INSERT INTO foo VALUES(:v1_1, :v1_2, :v1_3),
(:v2_1, :v2_2, :v2_3),
(:v2_1, :v2_2, :v2_3)');
$stmt->bindValue(':v1_1', $data[0][0]);
$stmt->bindValue(':v1_2', $data[0][1]);
$stmt->bindValue(':v1_3', $data[0][2]);
// etc...
$stmt->execute();
Or:
$stmt = $pdo->prepare('INSERT INTO foo VALUES(:a, :b, :c)');
foreach($data as $item)
{
$stmt->bindValue(':a', $item[0]);
$stmt->bindValue(':b', $item[1]);
$stmt->bindValue(':c', $item[2]);
$stmt->execute();
}
If the data for all the rows are in a single array, I would use the second solution.
That's simply not the way you use prepared statements.
It is perfectly okay to insert one row per query because you can execute one prepared statement multiple times with different parameters. In fact that is one of the greatest advantages as it allows you to insert you a great number of rows in an efficient, secure and comfortable manner.
So it maybe possible to implement the scheme you proposing, at least for a fixed number of rows, but it is almost guaranteed that this is not really what you want.
A shorter answer: flatten the array of data ordered by columns then
//$array = array( '1','2','3','4','5', '1','2','3','4','5');
$arCount = count($array);
$rCount = ($arCount ? $arCount - 1 : 0);
$criteria = sprintf("(?,?,?,?,?)%s", str_repeat(",(?,?,?,?,?)", $rCount));
$sql = "INSERT INTO table(c1,c2,c3,c4,c5) VALUES$criteria";
When inserting a 1,000 or so records you don't want to have to loop through every record to insert them when all you need is a count of the values.
Here is my simple approach.
$values = array();
foreach($workouts_id as $value){
$_value = "(".$value.",".$plan_id.")";
array_push($values,$_value);
}
$values_ = implode(",",$values);
$sql = "INSERT INTO plan_days(id,name) VALUES" . $values_."";
$stmt = $this->conn->prepare($sql);
$stmt->execute();
Here's a class I wrote do multiple inserts with purge option:
<?php
/**
* $pdo->beginTransaction();
* $pmi = new PDOMultiLineInserter($pdo, "foo", array("a","b","c","e"), 10);
* $pmi->insertRow($data);
* ....
* $pmi->insertRow($data);
* $pmi->purgeRemainingInserts();
* $pdo->commit();
*
*/
class PDOMultiLineInserter {
private $_purgeAtCount;
private $_bigInsertQuery, $_singleInsertQuery;
private $_currentlyInsertingRows = array();
private $_currentlyInsertingCount = 0;
private $_numberOfFields;
private $_error;
private $_insertCount = 0;
function __construct(\PDO $pdo, $tableName, $fieldsAsArray, $bigInsertCount = 100) {
$this->_numberOfFields = count($fieldsAsArray);
$insertIntoPortion = "INSERT INTO `$tableName` (`".implode("`,`", $fieldsAsArray)."`) VALUES";
$questionMarks = " (?".str_repeat(",?", $this->_numberOfFields - 1).")";
$this->_purgeAtCount = $bigInsertCount;
$this->_bigInsertQuery = $pdo->prepare($insertIntoPortion.$questionMarks.str_repeat(", ".$questionMarks, $bigInsertCount - 1));
$this->_singleInsertQuery = $pdo->prepare($insertIntoPortion.$questionMarks);
}
function insertRow($rowData) {
// #todo Compare speed
// $this->_currentlyInsertingRows = array_merge($this->_currentlyInsertingRows, $rowData);
foreach($rowData as $v) array_push($this->_currentlyInsertingRows, $v);
//
if (++$this->_currentlyInsertingCount == $this->_purgeAtCount) {
if ($this->_bigInsertQuery->execute($this->_currentlyInsertingRows) === FALSE) {
$this->_error = "Failed to perform a multi-insert (after {$this->_insertCount} inserts), the following errors occurred:".implode('<br/>', $this->_bigInsertQuery->errorInfo());
return false;
}
$this->_insertCount++;
$this->_currentlyInsertingCount = 0;
$this->_currentlyInsertingRows = array();
}
return true;
}
function purgeRemainingInserts() {
while ($this->_currentlyInsertingCount > 0) {
$singleInsertData = array();
// #todo Compare speed - http://www.evardsson.com/blog/2010/02/05/comparing-php-array_shift-to-array_pop/
// for ($i = 0; $i < $this->_numberOfFields; $i++) $singleInsertData[] = array_pop($this->_currentlyInsertingRows); array_reverse($singleInsertData);
for ($i = 0; $i < $this->_numberOfFields; $i++) array_unshift($singleInsertData, array_pop($this->_currentlyInsertingRows));
if ($this->_singleInsertQuery->execute($singleInsertData) === FALSE) {
$this->_error = "Failed to perform a small-insert (whilst purging the remaining rows; the following errors occurred:".implode('<br/>', $this->_singleInsertQuery->errorInfo());
return false;
}
$this->_currentlyInsertingCount--;
}
}
public function getError() {
return $this->_error;
}
}
Based on my experiments I found out that mysql insert statement with multiple value rows in single transaction is the fastest one.
However, if the data is too much then mysql's max_allowed_packet setting might restrict the single transaction insert with multiple value rows. Hence, following functions will fail when there is data greater than mysql's max_allowed_packet size:
singleTransactionInsertWithRollback
singleTransactionInsertWithPlaceholders
singleTransactionInsert
The most successful one in insert huge data scenario is transactionSpeed method, but it consumes time more the above mentioned methods. So, to handle this problem you can either split your data into smaller chunks and call single transaction insert multiple times or give up speed of execution by using transactionSpeed method.
Here's my research
<?php
class SpeedTestClass
{
private $data;
private $pdo;
public function __construct()
{
$this->data = [];
$this->pdo = new \PDO('mysql:dbname=test_data', 'admin', 'admin');
if (!$this->pdo) {
die('Failed to connect to database');
}
}
public function createData()
{
$prefix = 'test';
$postfix = 'unicourt.com';
$salutations = ['Mr.', 'Ms.', 'Dr.', 'Mrs.'];
$csv[] = ['Salutation', 'First Name', 'Last Name', 'Email Address'];
for ($i = 0; $i < 100000; ++$i) {
$csv[] = [
$salutations[$i % \count($salutations)],
$prefix.$i,
$prefix.$i,
$prefix.$i.'#'.$postfix,
];
}
$this->data = $csv;
}
public function truncateTable()
{
$this->pdo->query('TRUNCATE TABLE `name`');
}
public function transactionSpeed()
{
$timer1 = microtime(true);
$this->pdo->beginTransaction();
$sql = 'INSERT INTO `name` (`first_name`, `last_name`) VALUES (:first_name, :last_name)';
$sth = $this->pdo->prepare($sql);
foreach (\array_slice($this->data, 1) as $values) {
$sth->execute([
':first_name' => $values[1],
':last_name' => $values[2],
]);
}
// $timer2 = microtime(true);
// echo 'Prepare Time: '.($timer2 - $timer1).PHP_EOL;
// $timer3 = microtime(true);
if (!$this->pdo->commit()) {
echo "Commit failed\n";
}
$timer4 = microtime(true);
// echo 'Commit Time: '.($timer4 - $timer3).PHP_EOL;
return $timer4 - $timer1;
}
public function autoCommitSpeed()
{
$timer1 = microtime(true);
$sql = 'INSERT INTO `name` (`first_name`, `last_name`) VALUES (:first_name, :last_name)';
$sth = $this->pdo->prepare($sql);
foreach (\array_slice($this->data, 1) as $values) {
$sth->execute([
':first_name' => $values[1],
':last_name' => $values[2],
]);
}
$timer2 = microtime(true);
return $timer2 - $timer1;
}
public function noBindAutoCommitSpeed()
{
$timer1 = microtime(true);
foreach (\array_slice($this->data, 1) as $values) {
$sth = $this->pdo->prepare("INSERT INTO `name` (`first_name`, `last_name`) VALUES ('{$values[1]}', '{$values[2]}')");
$sth->execute();
}
$timer2 = microtime(true);
return $timer2 - $timer1;
}
public function singleTransactionInsert()
{
$timer1 = microtime(true);
foreach (\array_slice($this->data, 1) as $values) {
$arr[] = "('{$values[1]}', '{$values[2]}')";
}
$sth = $this->pdo->prepare('INSERT INTO `name` (`first_name`, `last_name`) VALUES '.implode(', ', $arr));
$sth->execute();
$timer2 = microtime(true);
return $timer2 - $timer1;
}
public function singleTransactionInsertWithPlaceholders()
{
$placeholders = [];
$timer1 = microtime(true);
$sql = 'INSERT INTO `name` (`first_name`, `last_name`) VALUES ';
foreach (\array_slice($this->data, 1) as $values) {
$placeholders[] = '(?, ?)';
$arr[] = $values[1];
$arr[] = $values[2];
}
$sql .= implode(', ', $placeholders);
$sth = $this->pdo->prepare($sql);
$sth->execute($arr);
$timer2 = microtime(true);
return $timer2 - $timer1;
}
public function singleTransactionInsertWithRollback()
{
$placeholders = [];
$timer1 = microtime(true);
$sql = 'INSERT INTO `name` (`first_name`, `last_name`) VALUES ';
foreach (\array_slice($this->data, 1) as $values) {
$placeholders[] = '(?, ?)';
$arr[] = $values[1];
$arr[] = $values[2];
}
$sql .= implode(', ', $placeholders);
$this->pdo->beginTransaction();
$sth = $this->pdo->prepare($sql);
$sth->execute($arr);
$this->pdo->commit();
$timer2 = microtime(true);
return $timer2 - $timer1;
}
}
$s = new SpeedTestClass();
$s->createData();
$s->truncateTable();
echo "Time Spent for singleTransactionInsertWithRollback: {$s->singleTransactionInsertWithRollback()}".PHP_EOL;
$s->truncateTable();
echo "Time Spent for single Transaction Insert: {$s->singleTransactionInsert()}".PHP_EOL;
$s->truncateTable();
echo "Time Spent for single Transaction Insert With Placeholders: {$s->singleTransactionInsertWithPlaceholders()}".PHP_EOL;
$s->truncateTable();
echo "Time Spent for transaction: {$s->transactionSpeed()}".PHP_EOL;
$s->truncateTable();
echo "Time Spent for AutoCommit: {$s->noBindAutoCommitSpeed()}".PHP_EOL;
$s->truncateTable();
echo "Time Spent for autocommit with bind: {$s->autoCommitSpeed()}".PHP_EOL;
$s->truncateTable();
The results for 100,000 entries for a table containing only two columns is as below
$ php data.php
Time Spent for singleTransactionInsertWithRollback: 0.75147604942322
Time Spent for single Transaction Insert: 0.67445182800293
Time Spent for single Transaction Insert With Placeholders: 0.71131205558777
Time Spent for transaction: 8.0056409835815
Time Spent for AutoCommit: 35.4979159832
Time Spent for autocommit with bind: 33.303519010544
Here is another (slim) solution for this issue:
At first you need to count the data of the source array (here: $aData) with count(). Then you use array_fill() and generate a new array wich as many entries as the source array has, each with the value "(?,?)" (the number of placeholders depends on the fields you use; here: 2). Then the generated array needs to be imploded and as glue a comma is used.
Within the foreach loop, you need to generate another index regarding on the number of placeholders you use (number of placeholders * current array index + 1). You need to add 1 to the generated index after each binded value.
$do = $db->prepare("INSERT INTO table (id, name) VALUES ".implode(',', array_fill(0, count($aData), '(?,?)')));
foreach($aData as $iIndex => $aValues){
$iRealIndex = 2 * $iIndex + 1;
$do->bindValue($iRealIndex, $aValues['id'], PDO::PARAM_INT);
$iRealIndex = $iRealIndex + 1;
$do->bindValue($iRealIndex, $aValues['name'], PDO::PARAM_STR);
}
$do->execute();
This is how I did it:
First define the column names you'll use, or leave it blank and pdo will assume you want to use all the columns on the table - in which case you'll need to inform the row values in the exact order they appear on the table.
$cols = 'name', 'middleName', 'eMail';
$table = 'people';
Now, suppose you have a two dimensional array already prepared. Iterate it, and construct a string with your row values, as such:
foreach ( $people as $person ) {
if(! $rowVals ) {
$rows = '(' . "'$name'" . ',' . "'$middleName'" . ',' . "'$eMail'" . ')';
} else { $rowVals = '(' . "'$name'" . ',' . "'$middleName'" . ',' . "'$eMail'" . ')';
}
Now, what you just did was check if $rows was already defined, and if not, create it and store row values and the necessary SQL syntax so it will be a valid statement. Note that strings should go inside double quotes and single quotes, so they will be promptly recognized as such.
All it's left to do is prepare the statement and execute, as such:
$stmt = $db->prepare ( "INSERT INTO $table $cols VALUES $rowVals" );
$stmt->execute ();
Tested with up to 2000 rows so far, and the execution time is dismal. Will run some more tests and will get back here in case I have something further to contribute.
Regards.
Since it has not been suggested yet, I'm pretty sure LOAD DATA INFILE is still the fastest way to load data as it disables indexing, inserts all data, and then re-enables the indexes - all in a single request.
Saving the data as a csv should be fairly trivial keeping in mind fputcsv. MyISAM is fastest, but you still get big performance in InnoDB. There are other disadvantages, though so I would go this route if you are inserting a lot of data, and not bother with under 100 rows.
Although an old question all the contributions helped me a lot so here's my solution, which works within my own DbContext class. The $rows parameter is simply an array of associative arrays representing rows or models: field name => insert value.
If you use a pattern that uses models this fits in nicely when passed model data as an array, say from a ToRowArray method within the model class.
Note: It should go without saying but never allow the arguments passed
to this method to be exposed to the user or reliant on any user input, other than the insert values, which have been validated and sanitised. The $tableName argument and the column names should be defined by the calling logic; for instance a User model could be mapped to the user table, which has its column list mapped to the model's member fields.
public function InsertRange($tableName, $rows)
{
// Get column list
$columnList = array_keys($rows[0]);
$numColumns = count($columnList);
$columnListString = implode(",", $columnList);
// Generate pdo param placeholders
$placeHolders = array();
foreach($rows as $row)
{
$temp = array();
for($i = 0; $i < count($row); $i++)
$temp[] = "?";
$placeHolders[] = "(" . implode(",", $temp) . ")";
}
$placeHolders = implode(",", $placeHolders);
// Construct the query
$sql = "insert into $tableName ($columnListString) values $placeHolders";
$stmt = $this->pdo->prepare($sql);
$j = 1;
foreach($rows as $row)
{
for($i = 0; $i < $numColumns; $i++)
{
$stmt->bindParam($j, $row[$columnList[$i]]);
$j++;
}
}
$stmt->execute();
}
You can insert multiple rows in a single query with this function:
function insertMultiple($query,$rows) {
if (count($rows)>0) {
$args = array_fill(0, count($rows[0]), '?');
$params = array();
foreach($rows as $row)
{
$values[] = "(".implode(',', $args).")";
foreach($row as $value)
{
$params[] = $value;
}
}
$query = $query." VALUES ".implode(',', $values);
$stmt = $PDO->prepare($query);
$stmt->execute($params);
}
}
$row is an array of arrays of values.
In your case you would call the function with
insertMultiple("INSERT INTO tbl (`key1`,`key2`)",array(array('r1v1','r1v2'),array('r2v1','r2v2')));
This has the benefit that you use prepared statements, while inserting multiple rows with a single query. Security!
Here is my solution: https://github.com/sasha-ch/Aura.Sql
based on auraphp/Aura.Sql library.
Usage example:
$q = "insert into t2(id,name) values (?,?), ... on duplicate key update name=name";
$bind_values = [ [[1,'str1'],[2,'str2']] ];
$pdo->perform($q, $bind_values);
Bugreports are welcome.
My real world example to insert all german postcodes into an empty table (to add town names later):
// obtain column template
$stmt = $db->prepare('SHOW COLUMNS FROM towns');
$stmt->execute();
$columns = array_fill_keys(array_values($stmt->fetchAll(PDO::FETCH_COLUMN)), null);
// multiple INSERT
$postcode = '01000';// smallest german postcode
while ($postcode <= 99999) {// highest german postcode
$values = array();
while ($postcode <= 99999) {
// reset row
$row = $columns;
// now fill our row with data
$row['postcode'] = sprintf('%05d', $postcode);
// build INSERT array
foreach ($row as $value) {
$values[] = $value;
}
$postcode++;
// avoid memory kill
if (!($postcode % 10000)) {
break;
}
}
// build query
$count_columns = count($columns);
$placeholder = ',(' . substr(str_repeat(',?', $count_columns), 1) . ')';//,(?,?,?)
$placeholder_group = substr(str_repeat($placeholder, count($values) / $count_columns), 1);//(?,?,?),(?,?,?)...
$into_columns = implode(',', array_keys($columns));//col1,col2,col3
// this part is optional:
$on_duplicate = array();
foreach ($columns as $column => $row) {
$on_duplicate[] = $column;
$on_duplicate[] = $column;
}
$on_duplicate = ' ON DUPLICATE KEY UPDATE' . vsprintf(substr(str_repeat(', %s = VALUES(%s)', $count_columns), 1), $on_duplicate);
// execute query
$stmt = $db->prepare('INSERT INTO towns (' . $into_columns . ') VALUES' . $placeholder_group . $on_duplicate);//INSERT INTO towns (col1,col2,col3) VALUES(?,?,?),(?,?,?)... {ON DUPLICATE...}
$stmt->execute($values);
}
As you can see its fully flexible. You don't need to check the amount of columns or check on which position your column is. You only need to set the insert data:
$row['postcode'] = sprintf('%05d', $postcode);
I'm proud of some of the query string constructors as they work without heavy array-functions like array_merge. Especially vsprintf() was a good find.
Finally I needed to add 2x while() to avoid exceeding the memory limit. This depends on your memory limit but at all its a good general solution to avoid problems (and having 10 queries is still much better than 10.000).
This worked for me
$sql = 'INSERT INTO table(pk_pk1,pk_pk2,date,pk_3) VALUES ';
$qPart = array_fill(0, count($array), "(?, ?,UTC_TIMESTAMP(),?)");
$sql .= implode(",", $qPart);
$stmt = DB::prepare('base', $sql);
$i = 1;
foreach ($array as $value) {
$stmt->bindValue($i++, $value);
$stmt->bindValue($i++, $pk_pk1);
$stmt->bindValue($i++, $pk_pk2);
$stmt->bindValue($i++, $pk_pk3);
}
$stmt->execute();
Most of the solutions given here to create the prepared query are more complex that they need to be. Using PHP's built in functions you can easily creare the SQL statement without significant overhead.
Given $records, an array of records where each record is itself an indexed array (in the form of field => value), the following function will insert the records into the given table $table, on a PDO connection $connection, using only a single prepared statement. Note that this is a PHP 5.6+ solution because of the use of argument unpacking in the call to array_push:
private function import(PDO $connection, $table, array $records)
{
$fields = array_keys($records[0]);
$placeHolders = substr(str_repeat(',?', count($fields)), 1);
$values = [];
foreach ($records as $record) {
array_push($values, ...array_values($record));
}
$query = 'INSERT INTO ' . $table . ' (';
$query .= implode(',', $fields);
$query .= ') VALUES (';
$query .= implode('),(', array_fill(0, count($records), $placeHolders));
$query .= ')';
$statement = $connection->prepare($query);
$statement->execute($values);
}
I had the same problem and this is how i accomplish for myself, and i made a function for myself for it ( and you can use it if that helps you).
Example:
INSERT INTO countries (country, city) VALUES (Germany, Berlin), (France, Paris);
$arr1 = Array("Germany", "Berlin");
$arr2 = Array("France", "France");
insertMultipleData("countries", Array($arr1, $arr2));
// Inserting multiple data to the Database.
public function insertMultipleData($table, $multi_params){
try{
$db = $this->connect();
$beforeParams = "";
$paramsStr = "";
$valuesStr = "";
for ($i=0; $i < count($multi_params); $i++) {
foreach ($multi_params[$i] as $j => $value) {
if ($i == 0) {
$beforeParams .= " " . $j . ",";
}
$paramsStr .= " :" . $j . "_" . $i .",";
}
$paramsStr = substr_replace($paramsStr, "", -1);
$valuesStr .= "(" . $paramsStr . "),";
$paramsStr = "";
}
$beforeParams = substr_replace($beforeParams, "", -1);
$valuesStr = substr_replace($valuesStr, "", -1);
$sql = "INSERT INTO " . $table . " (" . $beforeParams . ") VALUES " . $valuesStr . ";";
$stmt = $db->prepare($sql);
for ($i=0; $i < count($multi_params); $i++) {
foreach ($multi_params[$i] as $j => &$value) {
$stmt->bindParam(":" . $j . "_" . $i, $value);
}
}
$this->close($db);
$stmt->execute();
return true;
}catch(PDOException $e){
return false;
}
return false;
}
// Making connection to the Database
public function connect(){
$host = Constants::DB_HOST;
$dbname = Constants::DB_NAME;
$user = Constants::DB_USER;
$pass = Constants::DB_PASS;
$mysql_connect_str = 'mysql:host='. $host . ';dbname=' .$dbname;
$dbConnection = new PDO($mysql_connect_str, $user, $pass);
$dbConnection->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
return $dbConnection;
}
// Closing the connection
public function close($db){
$db = null;
}
If insertMultipleData($table, $multi_params) returns TRUE, your data has been inserted to your database.
what about something like this:
if(count($types_of_values)>0){
    $uid = 1;
    $sql = "";
    $values = array();
    foreach($types_of_values as $k=>$v){
        $sql .= "(:id_$k,:kind_of_val_$k), ";
        $values[":id_$k"] = $uid;
        $values[":kind_of_val_$k"] = $v;
    }
    $sql = substr($sql,0,-2);
    $query = "INSERT INTO table (id,value_type) VALUES $sql";
    $res = $this->db->prepare($query);
    $res->execute($values);
}
The idea behind this is to cycle through your array values, appending an index number to each placeholder name while, at the same time, adding the values to the array of binding parameters. If you don't like using the array's own keys, you could add $i = 0 before the loop and $i++ inside it. Either works in this example; even with associative arrays using named keys it would still work, provided the keys are unique. With a little work it would be fine for nested arrays too.
Note that substr strips the $sql variable's trailing space and comma; if you don't have a space, you'd need to change the -2 to -1.
I am attempting to bind params to a SQL statement using call_user_func_array as described in Dynamically Bind Params in Prepared Statements with MySQLi; however, my mysqli_prepare keeps returning false.
Here is my data function that is called to store data:
function storeData($form_data, $table_name, $cxn){
    if(!is_array($form_data)){
        return false;
    }
    $types = str_repeat("s", count($form_data));
    $params = array();
    $params[] = &$types;
    $keys = array_keys($form_data);
    $values = array_values($form_data);
    for ($i = 0; $i < count($values); $i++) {
        $params[] = &$values[$i];
    }
    $sql = "INSERT INTO $table_name (" . implode(',', $keys) . ") VALUES (" .
        implode(',', array_fill(0, count($values), '?')) . ")
        ON DUPLICATE KEY UPDATE ";
    $updates = implode(',', array_map(function($col) {
        return "$col = VALUES($col)";
    }, $keys));
    $sql .= $updates;
    if($stmt = mysqli_prepare($cxn, $sql)){
        call_user_func_array(array($stmt, 'bind_param'), $params);
        return mysqli_stmt_execute($stmt);
    }
    return false;
}
Here is my $sql string at the time of prepare:
$sql = "INSERT INTO interest (Baseball,Basketball,Camping,Canoeing,Cycling,Football,Gaming,Golf,Hiking,Parks,Photography,Runway,Skydiving,Soccer,Username) VALUES (?,?,?,?,?,?,?,?,?,?,?,?,?,?,?) ON DUPLICATE KEY UPDATE Baseball = VALUES(Baseball),Basketball = VALUES(Basketball),Camping = VALUES(Camping),Canoeing = VALUES(Canoeing),Cycling = VALUES(Cycling),Football = VALUES(Football),Gaming = VALUES(Gaming),Golf = VALUES(Golf),Hiking = VALUES(Hiking),Parks = VALUES(Parks),Photography = VALUES(Photography),Runway = VALUES(Runway),Skydiving = VALUES(Skydiving),Soccer = VALUES(Soccer),Username = VALUES(Username)"
Here are my $keys, $params and $values outputs at the same point:
$keys (array[15]): "Baseball", "Basketball", "Camping", "Canoeing", "Cycling", "Football", "Gaming", "Golf", "Hiking", "Parks", "Photography", "Runway", "Skydiving", "Soccer", "Username"
$params (array[16]): "sssssssssssssss" (the type string), followed by references to the fifteen values listed below
$values (array[15]): "0" (fourteen times), then "test0613"
The error turned out to be a column I was attempting to map which did not exist in the table. The error-reporting procedure was found here; it allowed me to produce fatal errors noting that a column did not exist in the table I was referencing.
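For reference, one minimal way to surface such schema errors in mysqli is to promote them to exceptions before preparing the statement (a sketch):
// make mysqli raise exceptions, so a missing column fails loudly at prepare time
mysqli_report(MYSQLI_REPORT_ERROR | MYSQLI_REPORT_STRICT);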
The problem is that my insert function inserts my record as two rows.
This is my code to connect to the database, in a file named connect.php:
<?php
try{
$db = new PDO("mysql:host=localhost;dbname=NPD" , "root" , "");
echo "connected";
}
catch(Exception $e){
echo $e->getMessage();
}
This is my database class, in a file named database.php:
<?php
require 'connect.php';
class DB {
public function insertInto($tableName , $info){
global $db;
foreach ($info as $coloumnName => $coloumnValue) {
$stmt = $db->prepare("INSERT INTO $tableName ($coloumnName) VALUES ('$coloumnValue') ");
$stmt->execute();
}
}
}
$da = new DB;
$da->insertInto('tableOne',array('name' => 'lolo' , 'deg' => '100'));
The result in the database is two rows in tableOne, one per column.
How can I make the insert function insert my record as one row?
Note: I want to insert any number of columns and values.
Try to do something like this:
$arr = array('name' => 'lolo' , 'deg' => '100');
$columns = array_keys($arr);
$values = array_values($arr);
// note: the values are interpolated directly into the SQL here, not bound,
// so this is only safe for trusted input
$str = "INSERT INTO $tableName (" . implode(',', $columns) . ") VALUES ('" . implode("', '", $values) . "')";
echo $str; // your sql
// $stmt = $db->prepare($str);
// $stmt->execute(); // uncomment to execute
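If you want the same one-liner flavour but with bound values, a minimal sketch using the $columns and $values from above:
// positional placeholders instead of interpolated values
$placeholders = implode(',', array_fill(0, count($values), '?'));
$stmt = $db->prepare("INSERT INTO $tableName (" . implode(',', $columns) . ") VALUES ($placeholders)");
$stmt->execute($values);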
Like this, but there are some concerns (also, I haven't tested this):
class DB {
protected $_conn;
public function __construct( $user, $pass, $database='NPD', $host='localhost' ){
try{
$this->_conn = new PDO("mysql:host={$host};dbname={$database}" , $user , $pass);
echo "connected";
}catch(Exception $e){
echo $e->getMessage();
}
}
public function insertInto($tableName , $info){
$sql = "INSERT INTO $tableName (";
$sql .= '`' . implode('`,`', array_keys($info[0])) . '`';
$sql .= ') VALUES ';
$params = array();
foreach ($info as $index => $row) {
$sql .= '(';
foreach( $row as $column => $value){
$sql .= ':'.$column.$index.',';
$params[':'.$column.$index] = $value;
}
$sql = rtrim($sql, ',');
$sql .= '),';
}
$sql = rtrim($sql, ',');
$stmt = $this->_conn->prepare($sql);
$stmt->execute($params);
}
}
$da = new DB('root', '');
$da->insertInto('tableOne',array( array('name' => 'lolo' , 'deg' => '100') ) );
First of all, you lose any SQL injection protection on the column names. If you can manage the placeholders on the values then that is OK, but without using them there you lose protection on that as well. This can be solved by using the db schema itself, via SHOW COLUMNS, but that gets a wee bit complex.
https://dev.mysql.com/doc/refman/5.7/en/show-columns.html
Second, your input array structure is wrong: it needs to be array(0=>array(...), 1=>array(...)) instead of just array(...).
Third, I would make this class a "Singleton", but that's just me.
http://coderoncode.com/design-patterns/programming/php/development/2014/01/27/design-patterns-php-singletons.html
Fourth, if you just want to do a single row at a time, you can change the method like this:
public function insertInto($tableName , $info){
    $sql = "INSERT INTO $tableName (";
    $sql .= '`' . implode('`,`', array_keys($info)) . '`';
    $sql .= ') VALUES (';
    $params = array();
    foreach( $info as $column => $value){
        // single row, so no row index is needed on the placeholder
        $sql .= ':'.$column.',';
        $params[':'.$column] = $value;
    }
    $sql = rtrim($sql, ',');
    $sql .= ')';
    $stmt = $this->_conn->prepare($sql);
    $stmt->execute($params);
}
And use the current input array structure you have.
This is how I coded my own insert function:
public function insertRecord($table, $records){
    // An array to hold one '?' placeholder per value
    $placeholder = array();
    for ($i = 0; $i < sizeof($records); $i++){
        $placeholder[$i] = '?';
    }
    // Implode ',' in between the placeholders
    $placeholder = implode(',', $placeholder);
    $sql = "INSERT INTO " . $table . " VALUES (" . $placeholder . ")";
    $query = $this->dbh->prepare($sql);
    $query->execute($records);
}
It might not be the best, but it worked for me.
As some other answers/comments have stated, there are quite a few critiques one could make of this overall process. However, in the interest of simply answering the question, you may want to just build the statement by looping through the columns, then looping through the values, then executing the finished statement (the code below is just an example and hasn't been tested):
require 'connect.php';
class DB {
    public function insertInto($tableName , $info){
        global $db;
        $query = "INSERT INTO $tableName (";
        $columns = array_keys($info);
        // build the columns in the statement
        $length = count($columns);
        foreach($columns as $index => $column) {
            $query .= "$column";
            if ($index+1 < $length) {
                $query .= ',';
            }
        }
        $query .= ") VALUES (";
        // build the values in the statement
        $i = 1;
        $length = count($info);
        foreach($info as $value) {
            $query .= "'$value'";
            if ($i < $length) {
                $query .= ",";
            }
            $i++;
        }
        $query .= ")";
        $stmt = $db->prepare($query);
        $stmt->execute();
    }
}
$da = new DB;
$da->insertInto('tableOne',array('name' => 'lolo' , 'deg' => '100'));
For what it is worth, I have seen a lot of users recommend iterating through INSERT statements instead of building out a single string query as the selected answer did. I decided to run a simple test with just two fields and a very basic insert statement:
<?php
require('conn.php');
$fname = 'J';
$lname = 'M';
$time_start = microtime(true);
$stmt = $db->prepare('INSERT INTO table (FirstName, LastName) VALUES (:fname, :lname)');
for($i = 1; $i <= 10; $i++ ) {
$stmt->bindParam(':fname', $fname);
$stmt->bindParam(':lname', $lname);
$stmt->execute();
$fname .= 'O';
$lname .= 'A';
}
$time_end = microtime(true);
$time = $time_end - $time_start;
echo "Completed in ". $time ." seconds <hr>";
$fname2 = 'J';
$lname2 = 'M';
$time_start2 = microtime(true);
$qry = 'INSERT INTO table (FirstName, LastName) VALUES ';
$qry .= "(?,?), ";
$qry .= "(?,?), ";
$qry .= "(?,?), ";
$qry .= "(?,?), ";
$qry .= "(?,?), ";
$qry .= "(?,?), ";
$qry .= "(?,?), ";
$qry .= "(?,?), ";
$qry .= "(?,?), ";
$qry .= "(?,?)";
$stmt2 = $db->prepare($qry);
$values = array();
for($j = 1; $j<=10; $j++) {
$values2 = array($fname2, $lname2);
$values = array_merge($values,$values2);
$fname2 .= 'O';
$lname2 .= 'A';
}
$stmt2->execute($values);
$time_end2 = microtime(true);
$time2 = $time_end2 - $time_start2;
echo "Completed in ". $time2 ." seconds <hr>";
?>
While the overall query itself took milliseconds or less, the latter (single-string) query was consistently 8 times faster or more. If this were built out to reflect, say, an import of thousands of rows on many more columns, the difference could be enormous.
The accepted answer by Herbert Balagtas works well when the $data array is small. With larger $data arrays the array_merge function becomes prohibitively slow. My test file to create the $data array had 28 columns and about 80,000 lines; the final script took 41 s to complete.
Using array_push() to create $insert_values instead of array_merge() resulted in a 100x speed-up, with an execution time of 0.41 s.
The problematic array_merge():
$insert_values = array();
foreach($data as $d){
$question_marks[] = '(' . placeholders('?', sizeof($d)) . ')';
$insert_values = array_merge($insert_values, array_values($d));
}
To eliminate the need for array_merge(), you can build the following two arrays instead:
//Note that these fields are empty, but the field count should match the fields in $datafields.
$data[] = array('','','','',... n );
//getting rid of array_merge()
array_push($insert_values, $value1, $value2, $value3 ... n );
These arrays can then be used as follows:
function placeholders($text, $count=0, $separator=","){
$result = array();
if($count > 0){
for($x=0; $x<$count; $x++){
$result[] = $text;
}
}
return implode($separator, $result);
}
$pdo->beginTransaction();
foreach($data as $d){
$question_marks[] = '(' . placeholders('?', sizeof($d)) . ')';
}
$sql = "INSERT INTO table (" . implode(",", array_keys($datafield) ) . ") VALUES " . implode(',', $question_marks);
$stmt = $pdo->prepare($sql);
$stmt->execute($insert_values);
$pdo->commit();
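For concreteness, a minimal sketch of that loop with the per-element push spelled out (on PHP 5.6+ you could equally write array_push($insert_values, ...array_values($d))):
$insert_values = array();
$question_marks = array();
foreach($data as $d){
    $question_marks[] = '(' . placeholders('?', sizeof($d)) . ')';
    // push each value individually instead of re-copying the whole array
    foreach (array_values($d) as $value) {
        array_push($insert_values, $value);
    }
}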
Two possible approaches:
$stmt = $pdo->prepare('INSERT INTO foo VALUES(:v1_1, :v1_2, :v1_3),
    (:v2_1, :v2_2, :v2_3),
    (:v3_1, :v3_2, :v3_3)');
$stmt->bindValue(':v1_1', $data[0][0]);
$stmt->bindValue(':v1_2', $data[0][1]);
$stmt->bindValue(':v1_3', $data[0][2]);
// etc...
$stmt->execute();
Or:
$stmt = $pdo->prepare('INSERT INTO foo VALUES(:a, :b, :c)');
foreach($data as $item)
{
$stmt->bindValue(':a', $item[0]);
$stmt->bindValue(':b', $item[1]);
$stmt->bindValue(':c', $item[2]);
$stmt->execute();
}
If the data for all the rows are in a single array, I would use the second solution.
That's simply not the way you use prepared statements.
It is perfectly okay to insert one row per query, because you can execute one prepared statement multiple times with different parameters. In fact that is one of the greatest advantages, as it allows you to insert a great number of rows in an efficient, secure and comfortable manner.
So while it may be possible to implement the scheme you propose, at least for a fixed number of rows, it is almost guaranteed that this is not really what you want.
A shorter answer: flatten the array of data, ordered by columns, then:
//$array = array( '1','2','3','4','5', '1','2','3','4','5');
$colCount = 5; // values per row, matching (c1,...,c5)
$rowCount = (int) (count($array) / $colCount);
$criteria = sprintf("(?,?,?,?,?)%s", str_repeat(",(?,?,?,?,?)", $rowCount ? $rowCount - 1 : 0));
$sql = "INSERT INTO table(c1,c2,c3,c4,c5) VALUES$criteria";
When inserting a thousand or so records, you don't want to have to loop through every record to insert them when all you need is a count of the values.
Here is my simple approach.
$values = array();
foreach($workouts_id as $value){
    // note: these values are interpolated, not bound, so this is only
    // safe when $value and $plan_id are trusted integers
    $_value = "(".$value.",".$plan_id.")";
    array_push($values, $_value);
}
$values_ = implode(",", $values);
$sql = "INSERT INTO plan_days(id,name) VALUES" . $values_;
$stmt = $this->conn->prepare($sql);
$stmt->execute();
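A variant of the same approach that binds the values instead of interpolating them (a sketch under the same assumptions about $workouts_id and $plan_id):
$placeholders = array();
$params = array();
foreach ($workouts_id as $value) {
    $placeholders[] = '(?,?)';
    $params[] = $value;
    $params[] = $plan_id;
}
$sql = 'INSERT INTO plan_days(id,name) VALUES ' . implode(',', $placeholders);
$stmt = $this->conn->prepare($sql);
$stmt->execute($params);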
Here's a class I wrote to do multiple inserts with a purge option:
<?php
/**
* $pdo->beginTransaction();
* $pmi = new PDOMultiLineInserter($pdo, "foo", array("a","b","c","e"), 10);
* $pmi->insertRow($data);
* ....
* $pmi->insertRow($data);
* $pmi->purgeRemainingInserts();
* $pdo->commit();
*
*/
class PDOMultiLineInserter {
private $_purgeAtCount;
private $_bigInsertQuery, $_singleInsertQuery;
private $_currentlyInsertingRows = array();
private $_currentlyInsertingCount = 0;
private $_numberOfFields;
private $_error;
private $_insertCount = 0;
function __construct(\PDO $pdo, $tableName, $fieldsAsArray, $bigInsertCount = 100) {
$this->_numberOfFields = count($fieldsAsArray);
$insertIntoPortion = "INSERT INTO `$tableName` (`".implode("`,`", $fieldsAsArray)."`) VALUES";
$questionMarks = " (?".str_repeat(",?", $this->_numberOfFields - 1).")";
$this->_purgeAtCount = $bigInsertCount;
$this->_bigInsertQuery = $pdo->prepare($insertIntoPortion.$questionMarks.str_repeat(", ".$questionMarks, $bigInsertCount - 1));
$this->_singleInsertQuery = $pdo->prepare($insertIntoPortion.$questionMarks);
}
function insertRow($rowData) {
// #todo Compare speed
// $this->_currentlyInsertingRows = array_merge($this->_currentlyInsertingRows, $rowData);
foreach($rowData as $v) array_push($this->_currentlyInsertingRows, $v);
//
if (++$this->_currentlyInsertingCount == $this->_purgeAtCount) {
if ($this->_bigInsertQuery->execute($this->_currentlyInsertingRows) === FALSE) {
$this->_error = "Failed to perform a multi-insert (after {$this->_insertCount} inserts), the following errors occurred:".implode('<br/>', $this->_bigInsertQuery->errorInfo());
return false;
}
$this->_insertCount++;
$this->_currentlyInsertingCount = 0;
$this->_currentlyInsertingRows = array();
}
return true;
}
function purgeRemainingInserts() {
while ($this->_currentlyInsertingCount > 0) {
$singleInsertData = array();
// #todo Compare speed - http://www.evardsson.com/blog/2010/02/05/comparing-php-array_shift-to-array_pop/
// for ($i = 0; $i < $this->_numberOfFields; $i++) $singleInsertData[] = array_pop($this->_currentlyInsertingRows); array_reverse($singleInsertData);
for ($i = 0; $i < $this->_numberOfFields; $i++) array_unshift($singleInsertData, array_pop($this->_currentlyInsertingRows));
if ($this->_singleInsertQuery->execute($singleInsertData) === FALSE) {
$this->_error = "Failed to perform a small-insert (whilst purging the remaining rows); the following errors occurred:".implode('<br/>', $this->_singleInsertQuery->errorInfo());
return false;
}
$this->_currentlyInsertingCount--;
}
}
public function getError() {
return $this->_error;
}
}
Based on my experiments, I found that a MySQL INSERT statement with multiple value rows in a single transaction is the fastest one.
However, if the data is too large, MySQL's max_allowed_packet setting might restrict the single-transaction insert with multiple value rows. Hence, the following functions will fail when there is data greater than MySQL's max_allowed_packet size:
singleTransactionInsertWithRollback
singleTransactionInsertWithPlaceholders
singleTransactionInsert
The most successful one in the huge-data scenario is the transactionSpeed method, but it consumes more time than the above-mentioned methods. So, to handle this problem, you can either split your data into smaller chunks and call the single-transaction insert multiple times (a chunking sketch follows), or give up execution speed by using the transactionSpeed method.
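A minimal chunking sketch for the splitting approach (assuming $rows is an array of [first_name, last_name] pairs; the 1,000-row chunk size is an arbitrary value to tune against max_allowed_packet):
foreach (array_chunk($rows, 1000) as $chunk) {
    // one multi-row INSERT per chunk keeps each packet below the limit
    $placeholders = implode(', ', array_fill(0, count($chunk), '(?, ?)'));
    $stmt = $pdo->prepare('INSERT INTO `name` (`first_name`, `last_name`) VALUES ' . $placeholders);
    $params = array();
    foreach ($chunk as $row) {
        $params[] = $row[0];
        $params[] = $row[1];
    }
    $stmt->execute($params);
}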
Here's my research
<?php
class SpeedTestClass
{
private $data;
private $pdo;
public function __construct()
{
$this->data = [];
$this->pdo = new \PDO('mysql:dbname=test_data', 'admin', 'admin');
if (!$this->pdo) {
die('Failed to connect to database');
}
}
public function createData()
{
$prefix = 'test';
$postfix = 'unicourt.com';
$salutations = ['Mr.', 'Ms.', 'Dr.', 'Mrs.'];
$csv[] = ['Salutation', 'First Name', 'Last Name', 'Email Address'];
for ($i = 0; $i < 100000; ++$i) {
$csv[] = [
$salutations[$i % \count($salutations)],
$prefix.$i,
$prefix.$i,
$prefix.$i.'#'.$postfix,
];
}
$this->data = $csv;
}
public function truncateTable()
{
$this->pdo->query('TRUNCATE TABLE `name`');
}
public function transactionSpeed()
{
$timer1 = microtime(true);
$this->pdo->beginTransaction();
$sql = 'INSERT INTO `name` (`first_name`, `last_name`) VALUES (:first_name, :last_name)';
$sth = $this->pdo->prepare($sql);
foreach (\array_slice($this->data, 1) as $values) {
$sth->execute([
':first_name' => $values[1],
':last_name' => $values[2],
]);
}
// $timer2 = microtime(true);
// echo 'Prepare Time: '.($timer2 - $timer1).PHP_EOL;
// $timer3 = microtime(true);
if (!$this->pdo->commit()) {
echo "Commit failed\n";
}
$timer4 = microtime(true);
// echo 'Commit Time: '.($timer4 - $timer3).PHP_EOL;
return $timer4 - $timer1;
}
public function autoCommitSpeed()
{
$timer1 = microtime(true);
$sql = 'INSERT INTO `name` (`first_name`, `last_name`) VALUES (:first_name, :last_name)';
$sth = $this->pdo->prepare($sql);
foreach (\array_slice($this->data, 1) as $values) {
$sth->execute([
':first_name' => $values[1],
':last_name' => $values[2],
]);
}
$timer2 = microtime(true);
return $timer2 - $timer1;
}
public function noBindAutoCommitSpeed()
{
$timer1 = microtime(true);
foreach (\array_slice($this->data, 1) as $values) {
$sth = $this->pdo->prepare("INSERT INTO `name` (`first_name`, `last_name`) VALUES ('{$values[1]}', '{$values[2]}')");
$sth->execute();
}
$timer2 = microtime(true);
return $timer2 - $timer1;
}
public function singleTransactionInsert()
{
$timer1 = microtime(true);
foreach (\array_slice($this->data, 1) as $values) {
$arr[] = "('{$values[1]}', '{$values[2]}')";
}
$sth = $this->pdo->prepare('INSERT INTO `name` (`first_name`, `last_name`) VALUES '.implode(', ', $arr));
$sth->execute();
$timer2 = microtime(true);
return $timer2 - $timer1;
}
public function singleTransactionInsertWithPlaceholders()
{
$placeholders = [];
$timer1 = microtime(true);
$sql = 'INSERT INTO `name` (`first_name`, `last_name`) VALUES ';
foreach (\array_slice($this->data, 1) as $values) {
$placeholders[] = '(?, ?)';
$arr[] = $values[1];
$arr[] = $values[2];
}
$sql .= implode(', ', $placeholders);
$sth = $this->pdo->prepare($sql);
$sth->execute($arr);
$timer2 = microtime(true);
return $timer2 - $timer1;
}
public function singleTransactionInsertWithRollback()
{
$placeholders = [];
$timer1 = microtime(true);
$sql = 'INSERT INTO `name` (`first_name`, `last_name`) VALUES ';
foreach (\array_slice($this->data, 1) as $values) {
$placeholders[] = '(?, ?)';
$arr[] = $values[1];
$arr[] = $values[2];
}
$sql .= implode(', ', $placeholders);
$this->pdo->beginTransaction();
$sth = $this->pdo->prepare($sql);
$sth->execute($arr);
$this->pdo->commit();
$timer2 = microtime(true);
return $timer2 - $timer1;
}
}
$s = new SpeedTestClass();
$s->createData();
$s->truncateTable();
echo "Time Spent for singleTransactionInsertWithRollback: {$s->singleTransactionInsertWithRollback()}".PHP_EOL;
$s->truncateTable();
echo "Time Spent for single Transaction Insert: {$s->singleTransactionInsert()}".PHP_EOL;
$s->truncateTable();
echo "Time Spent for single Transaction Insert With Placeholders: {$s->singleTransactionInsertWithPlaceholders()}".PHP_EOL;
$s->truncateTable();
echo "Time Spent for transaction: {$s->transactionSpeed()}".PHP_EOL;
$s->truncateTable();
echo "Time Spent for AutoCommit: {$s->noBindAutoCommitSpeed()}".PHP_EOL;
$s->truncateTable();
echo "Time Spent for autocommit with bind: {$s->autoCommitSpeed()}".PHP_EOL;
$s->truncateTable();
The results for 100,000 entries, for a table containing only two columns, are as below:
$ php data.php
Time Spent for singleTransactionInsertWithRollback: 0.75147604942322
Time Spent for single Transaction Insert: 0.67445182800293
Time Spent for single Transaction Insert With Placeholders: 0.71131205558777
Time Spent for transaction: 8.0056409835815
Time Spent for AutoCommit: 35.4979159832
Time Spent for autocommit with bind: 33.303519010544
Here is another (slim) solution for this issue:
At first you need to count the data of the source array (here: $aData) with count(). Then you use array_fill() and generate a new array which has as many entries as the source array, each with the value "(?,?)" (the number of placeholders depends on the fields you use; here: 2). Then the generated array needs to be imploded, with a comma as glue.
Within the foreach loop, you need to generate another index based on the number of placeholders you use (number of placeholders * current array index + 1). You need to add 1 to the generated index after each bound value.
$do = $db->prepare("INSERT INTO table (id, name) VALUES ".implode(',', array_fill(0, count($aData), '(?,?)')));
foreach($aData as $iIndex => $aValues){
$iRealIndex = 2 * $iIndex + 1;
$do->bindValue($iRealIndex, $aValues['id'], PDO::PARAM_INT);
$iRealIndex = $iRealIndex + 1;
$do->bindValue($iRealIndex, $aValues['name'], PDO::PARAM_STR);
}
$do->execute();
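Generalised to N fields, the same index arithmetic looks like this (a sketch; $aFields is an assumed list of the keys in placeholder order):
$aFields = array('id', 'name'); // keys in the order of the (?,?) groups
$iFieldCount = count($aFields);
foreach ($aData as $iIndex => $aValues) {
    foreach ($aFields as $iOffset => $sField) {
        // placeholder index = fields-per-row * row-index + offset + 1
        $do->bindValue($iFieldCount * $iIndex + $iOffset + 1, $aValues[$sField]);
    }
}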
This is how I did it:
First define the column names you'll use, or leave it blank and the database will assume you want to use all the columns on the table, in which case you'll need to supply the row values in the exact order they appear on the table.
$cols = '(name, middleName, eMail)';
$table = 'people';
Now, suppose you have a two-dimensional array already prepared. Iterate it, and construct a string with your row values, as such:
foreach ($people as $person) {
    list($name, $middleName, $eMail) = $person;
    if (!isset($rowVals)) {
        $rowVals = "('$name','$middleName','$eMail')";
    } else {
        $rowVals .= ",('$name','$middleName','$eMail')";
    }
}
Now, what you just did was check whether $rowVals was already defined and, if not, create it with the first row's values and the necessary SQL syntax so it will be a valid statement. Note that string values must go inside single quotes within the double-quoted string, so they will be promptly recognized as such.
All that's left to do is prepare the statement and execute, as such:
$stmt = $db->prepare("INSERT INTO $table $cols VALUES $rowVals");
$stmt->execute();
Tested with up to 2,000 rows so far, and the execution time is negligible. Will run some more tests and will get back here in case I have something further to contribute.
Regards.
Since it has not been suggested yet, I'm pretty sure LOAD DATA INFILE is still the fastest way to load data, as it disables indexing, inserts all the data, and then re-enables the indexes, all in a single request.
Saving the data as a CSV should be fairly trivial, keeping in mind fputcsv. MyISAM is fastest, but you still get big performance in InnoDB. There are other disadvantages, though, so I would only go this route if you are inserting a lot of data, and not bother for under 100 rows.
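A minimal sketch of that route (the tbl table and two-column $rows are assumptions; LOAD DATA LOCAL INFILE also requires local_infile to be enabled on the server and, for PDO, PDO::MYSQL_ATTR_LOCAL_INFILE => true on the connection):
// write the rows to a temporary CSV, then bulk-load it in one request
$file = tempnam(sys_get_temp_dir(), 'bulk');
$fh = fopen($file, 'w');
foreach ($rows as $row) {
    fputcsv($fh, $row);
}
fclose($fh);
$pdo->exec("LOAD DATA LOCAL INFILE " . $pdo->quote($file) . "
    INTO TABLE tbl
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
    LINES TERMINATED BY '\n'");
unlink($file);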
Although this is an old question, all the contributions helped me a lot, so here's my solution, which works within my own DbContext class. The $rows parameter is simply an array of associative arrays representing rows or models: field name => insert value.
If you use a pattern that uses models, this fits in nicely when passed model data as an array, say from a ToRowArray method within the model class.
Note: it should go without saying, but never allow the arguments passed
to this method to be exposed to the user or reliant on any user input other than the insert values, which have been validated and sanitised. The $tableName argument and the column names should be defined by the calling logic; for instance, a User model could be mapped to the user table, which has its column list mapped to the model's member fields.
public function InsertRange($tableName, $rows)
{
    // Get column list
    $columnList = array_keys($rows[0]);
    $numColumns = count($columnList);
    $columnListString = implode(",", $columnList);
    // Generate pdo param placeholders
    $placeHolders = array();
    foreach($rows as $row)
    {
        $temp = array();
        for($i = 0; $i < count($row); $i++)
            $temp[] = "?";
        $placeHolders[] = "(" . implode(",", $temp) . ")";
    }
    $placeHolders = implode(",", $placeHolders);
    // Construct the query
    $sql = "insert into $tableName ($columnListString) values $placeHolders";
    $stmt = $this->pdo->prepare($sql);
    $j = 1;
    foreach($rows as $row)
    {
        for($i = 0; $i < $numColumns; $i++)
        {
            // bindValue, not bindParam: bindParam binds by reference,
            // which is fragile with a reused loop variable
            $stmt->bindValue($j, $row[$columnList[$i]]);
            $j++;
        }
    }
    $stmt->execute();
}
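A hypothetical usage sketch, with $dbContext standing in for an instance of the class:
$rows = [
    ['name' => 'lolo', 'deg' => '100'],
    ['name' => 'koko', 'deg' => '90'],
];
$dbContext->InsertRange('tableOne', $rows);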
You can insert multiple rows in a single query with this function:
function insertMultiple($PDO, $query, $rows) {
    // the PDO connection is passed in, since it would otherwise be
    // undefined inside the function's scope
    if (count($rows) > 0) {
        $args = array_fill(0, count($rows[0]), '?');
        $params = array();
        $values = array();
        foreach($rows as $row)
        {
            $values[] = "(".implode(',', $args).")";
            foreach($row as $value)
            {
                $params[] = $value;
            }
        }
        $query = $query." VALUES ".implode(',', $values);
        $stmt = $PDO->prepare($query);
        $stmt->execute($params);
    }
}
$rows is an array of arrays of values.
In your case you would call the function with
insertMultiple($PDO, "INSERT INTO tbl (`key1`,`key2`)", array(array('r1v1','r1v2'),array('r2v1','r2v2')));
This has the benefit that you use prepared statements, while inserting multiple rows with a single query. Security!
Here is my solution: https://github.com/sasha-ch/Aura.Sql, based on the auraphp/Aura.Sql library.
Usage example:
$q = "insert into t2(id,name) values (?,?), ... on duplicate key update name=name";
$bind_values = [ [[1,'str1'],[2,'str2']] ];
$pdo->perform($q, $bind_values);
Bug reports are welcome.
My real-world example to insert all German postcodes into an empty table (to add town names later):
// obtain column template
$stmt = $db->prepare('SHOW COLUMNS FROM towns');
$stmt->execute();
$columns = array_fill_keys(array_values($stmt->fetchAll(PDO::FETCH_COLUMN)), null);
// multiple INSERT
$postcode = '01000';// smallest german postcode
while ($postcode <= 99999) {// highest german postcode
$values = array();
while ($postcode <= 99999) {
// reset row
$row = $columns;
// now fill our row with data
$row['postcode'] = sprintf('%05d', $postcode);
// build INSERT array
foreach ($row as $value) {
$values[] = $value;
}
$postcode++;
// avoid memory kill
if (!($postcode % 10000)) {
break;
}
}
// build query
$count_columns = count($columns);
$placeholder = ',(' . substr(str_repeat(',?', $count_columns), 1) . ')';//,(?,?,?)
$placeholder_group = substr(str_repeat($placeholder, count($values) / $count_columns), 1);//(?,?,?),(?,?,?)...
$into_columns = implode(',', array_keys($columns));//col1,col2,col3
// this part is optional:
$on_duplicate = array();
foreach ($columns as $column => $row) {
$on_duplicate[] = $column;
$on_duplicate[] = $column;
}
$on_duplicate = ' ON DUPLICATE KEY UPDATE' . vsprintf(substr(str_repeat(', %s = VALUES(%s)', $count_columns), 1), $on_duplicate);
// execute query
$stmt = $db->prepare('INSERT INTO towns (' . $into_columns . ') VALUES' . $placeholder_group . $on_duplicate);//INSERT INTO towns (col1,col2,col3) VALUES(?,?,?),(?,?,?)... {ON DUPLICATE...}
$stmt->execute($values);
}
As you can see, it's fully flexible. You don't need to check the number of columns or check at which position your column is. You only need to set the insert data:
$row['postcode'] = sprintf('%05d', $postcode);
I'm proud of some of the query-string constructors, as they work without heavy array functions like array_merge. Especially vsprintf() was a good find.
Finally I needed to add the 2x while() to avoid exceeding the memory limit. This depends on your memory limit, but overall it's a good general solution to avoid problems (and having 10 queries is still much better than 10,000).
I'm writing some PHP to accept an array of numbers and names in POST and insert them into a MySQL table (named Contact_table). Here's the version that works fine without any error:
<?php
// Includes
require_once 'Admin/Connector.php';
// Test if payload exists
if($_POST){
// Read payload into arrays
$ar = 0;
foreach($_POST as $entry){
$namenum = explode(',', $entry);
$names[$ar] = $namenum[1];
$numbers[$ar] = $namenum[0];
$ar += 1;
}
$namenum = NULL;
// Build SQL query
$sql = 'INSERT INTO Contact_table (NAME, PHONE) VALUES ';
$insertQuery = array();
$insertData = array();
$n = 0;
foreach ($numbers as $num) {
$insertQuery[] = '(?, ?)';
$insertData[] = $names[$n];
$insertData[] = $num;
$n++;
}
$sql .= implode(', ', $insertQuery);
$sql .= ' ON DUPLICATE KEY UPDATE name = COALESCE(VALUES(name), name);';
$n = NULL;
$num = NULL;
// Connect to MySQL database
$connect = dbconn(PROJHOST,PROJDB,PROJDBUSER,PROJDBPWD);
// Execute SQL query
$query = $connect->prepare($sql);
$query->execute($insertData);
$insertQuery = NULL;
$insertData = NULL;
$sql = NULL;
$query = NULL;
// Close connection to MySQL database
$connect = NULL;
}
?>
However, as you can see, I'm not using the bindParam() function here, just feeding the values directly to the execute() function. Many have recommended that I use bindParam() instead for server performance gains. Is that true, or am I better off with this program as it stands? I did try writing and running a version of the above code using bindParam:
<?php
// Includes
require_once 'Admin/Connector.php';
// Test if payload exists
if($_POST){
// Read payload into arrays
$ar = 0;
foreach($_POST as $entry){
$namenum = explode(',', $entry);
$names[$ar] = $namenum[1];
$numbers[$ar] = $namenum[0];
$ar += 1;
}
$namenum = NULL;
// Build SQL query
$sql = 'INSERT INTO Contact_table (NAME, PHONE) VALUES ';
$insertQuery = array();
$insertData = array();
$n = 0;
foreach ($numbers as $num) {
$insertQuery[] = '(?, ?)';
$insertData[] = $names[$n];
$insertData[] = $num;
$n++;
}
$sql .= implode(', ', $insertQuery);
$sql .= ' ON DUPLICATE KEY UPDATE name = COALESCE(VALUES(name), name);';
$n = NULL;
$num = NULL;
// Connect to MySQL database
$connect = dbconn(PROJHOST,PROJDB,PROJDBUSER,PROJDBPWD);
// Prepare SQL query
$query = $connect->prepare($sql);
// Bind variables
foreach($insertData as $key => &$ins) {
$connect->bindParam($key+1,$ins);
}
// Execute SQL query
$query->execute();
$insertQuery = NULL;
$insertData = NULL;
$sql = NULL;
$query = NULL;
$key = NULL;
$ins = NULL;
// Close connection to MySQL database
$connect = NULL;
}
?>
But this code refuses to run and returns a fatal error: Call to undefined method PDO::bindParam(). What am I doing wrong here? I understand it's possible to write much simpler code if I include execute() within the loop, but that would spawn multiple queries, which I want to avoid at all costs. My goal is a single query no matter what.
You can't call $connect->bindParam($key+1, $ins), because the PDO object doesn't have such a method; only PDOStatement does. That is why you got the error message.
You should use:
$query->bindValue($key+1, $ins);
And you should use bindValue because, if not, all your inserted values will get the same value (the last value $ins holds before you call execute()).
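Applied to the binding loop in the question, a minimal corrected sketch:
// bind on the statement, by value, so each placeholder keeps its own value
foreach ($insertData as $key => $ins) {
    $query->bindValue($key + 1, $ins);
}
$query->execute();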