I am currently trying to use multi-valued INSERT queries with SQLite3 and PDO.
I did some research and found that the multi-valued INSERT syntax was not supported before SQLite version 3.7.11. That changed back in 2012.
My phpinfo() is informing me that:
PDO Driver for SQLite 3.x enabled
SQLite Library 3.7.7.1
Regardless of that, PDO doesn't seem to support these kinds of INSERT queries with SQLite3.
My question is whether there is any workaround for this issue. I am working on an application that is compatible with both SQLite3 and MySQL. Seeing that both of them support multi-value inserts, I would hate to use two kinds of query and INSERT logic only because PDO is not up to date.
Some edits - adding code specifics:
Opening the DB connection:
public function useSQLite3($file)
{
    $dsn = "sqlite:$file";
    $this->dbService = new PDO($dsn);
    $this->dbService->query('PRAGMA journal_mode=WAL;');
    $this->dbService->setAttribute(PDO::ATTR_DEFAULT_FETCH_MODE, PDO::FETCH_ASSOC);
}
Method that handles the bulk insert to the DB:
public function bulkInsertLink(array $links)
{
    $insertRows = array();
    $placeholders = array();
    $j = 0;
    $i = 0;
    foreach ($links as $linkData) {
        $placeholders[$j] = '(';
        foreach ($linkData as $columnData) {
            $placeholders[$j] .= '?,';
            $insertRows[$i] = $columnData;
            $i++;
        }
        $placeholders[$j] = rtrim($placeholders[$j], ',');
        $placeholders[$j] .= ')';
        $j++;
    }
    $query = 'INSERT INTO links (status, date, lang, group_ID, group_link_ID, link, sitemap_link) VALUES ';
    $query .= implode(',', $placeholders);
    $preparedQuery = $this->dbService->prepare($query);
    $preparedQuery->execute($insertRows);
}
$links is an array, where each element represents the information for one row to be inserted.
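For illustration, $links might look something like this (the column values here are purely hypothetical; only the ordering matters, since it must match the column list in the INSERT statement):
$links = array(
    // status, date, lang, group_ID, group_link_ID, link, sitemap_link
    array(1, '2014-05-01', 'en', 10, 3, 'http://example.com/a', 'http://example.com/sitemap-a.xml'),
    array(1, '2014-05-01', 'en', 10, 4, 'http://example.com/b', 'http://example.com/sitemap-b.xml'),
);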
Using PDO you can do multi-value inserts like this:
$statement = $pdo->prepare('INSERT INTO t VALUES (?, ?), (?, ?)');
$statement->execute([1, 2, 3, 4]);
But personally I'd prepare a single insert statement and execute it multiple times with different parameters.
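A minimal sketch of that approach, using the table t from the snippet above; wrapping the loop in a transaction keeps it fast, and the same code runs unchanged on both the SQLite and MySQL PDO drivers:
// Prepare once, execute per row, all inside one transaction.
// $rows is assumed to be an array of [col1, col2] pairs.
$rows = [[1, 2], [3, 4]];
$pdo->beginTransaction();
$statement = $pdo->prepare('INSERT INTO t VALUES (?, ?)');
foreach ($rows as $row) {
    $statement->execute($row);
}
$pdo->commit();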
Related
I'm currently struggling to solve a small problem with MySQLi's prepared statements. I'm trying to write a custom PHP function that takes a few variables and uses them to write some data into a table in my database, via prepared statement.
The issue is that I need the function to accept any number of parameters.
An example of the function in action:
db_insert_secure("Goals", "(Name, Description, Type, ProjectID)", $data);
This is supposed to write all of the info stored in the $data array into the (4) specified columns in the Goals table.
However, because the amount of parameters can change, I can't think of a way to bind them in an efficient manner.
The one idea I had was to use a switch statement to handle binding different numbers of parameters, but I'm sure that isn't the most elegant or efficient method.
The script in its entirety:
function db_insert_secure($table, $columns, $data)
{
    $link = db_connect();
    $inputData = explode(",", $columns);
    $query = "INSERT INTO ".$table." ".$columns." VALUES (";
    for ($i = 0; $i < sizeof($inputData); $i++)
    {
        $query .= "?";
        if ($i != sizeof($inputData) - 1)
        {
            $query .= ", ";
        }
    }
    $query .= ")";
    echo $query;
    $check_statement = mysqli_prepare($link, $query);
    //mysqli_stmt_bind_param($check_statement, 's', $data[0]);
    echo $check_statement;
    db_disconnect($link);
}
NOTE: the db_connect and db_disconnect scripts are custom scripts for opening and closing a connection to the database. db_connect simply returns the connection object.
Can anyone think of a solution to this problem that does not involve using eval?
Funnily enough, after struggling with this for the past 12 hours or so, I actually managed to find a solution to the problem within a few minutes of starting this thread.
You can pass an array as the data for a prepared statement by placing the "..." (argument unpacking) operator in front of it.
For example, in my case:
mysqli_stmt_bind_param($preparedStatement, 'ssss', ...$data);
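If the number of columns varies, the type string can be built dynamically as well. A small sketch, assuming every value is bound as a string:
// str_repeat() builds the 'ssss...' type string to match count($data).
$types = str_repeat('s', count($data));
mysqli_stmt_bind_param($preparedStatement, $types, ...$data);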
Basically, I am trying to implement a method in one of my PHP classes that makes entries in a junction table for a many-to-many relationship.
This is the method here:
public function setTags($value, $id){
    global $db;
    $tags = $value;
    $query .= "DELETE FROM directorycolumntags
               WHERE directorycolumn_id = $id; ";
    foreach($tags as $tag){
        $query .= "INSERT INTO directorycolumntags (directorycolumn_id, tag_id)
                   VALUES (".$id.",".$tag.");";
    }
    mysql_query($query);
}
The SQL it produces works fine, as I've echoed it and manually executed it via phpMyAdmin. However, if I leave it as above, the data is never inserted. Does anyone know why this might be happening?
This is the sql it is generating which works fine when I type it manually in:
DELETE FROM directorycolumntags WHERE directorycolumn_id = 178;
INSERT INTO directorycolumntags (directorycolumn_id, tag_id) VALUES (178,29);
INSERT INTO directorycolumntags (directorycolumn_id, tag_id) VALUES (178,30);
INSERT INTO directorycolumntags (directorycolumn_id, tag_id) VALUES (178,32);
The old, unsafe, deprecated mysql_* extension never supported multiple queries. You could, conceivably, do this using its replacement extension, mysqli_*, which has the mysqli_multi_query function.
Personally, I'd not use this approach, though. I'd do what most devs would do: use a prepared statement in a transaction to execute each query safely, and commit the results on success, or rollback on failure:
$db = new PDO(
    'mysql:host=127.0.0.1;dbname=db;charset=utf8',
    'user',
    'pass',
    array(
        PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION
    )
);
try
{
    $db->beginTransaction();
    $stmt = $db->prepare('DELETE FROM tbl WHERE field = :id');
    $stmt->execute(array(':id' => $id));
    $stmt = $db->prepare('INSERT INTO tbl (field1, field2) VALUES (:field1, :field2)');
    foreach ($tags as $tag)
    {
        $stmt->execute(
            array(
                ':field1' => $id,
                ':field2' => $tag
            )
        );
        $stmt->closeCursor();//<-- optional for MySQL
    }
    $db->commit();
}
catch (PDOException $e)
{
    $db->rollBack();
    echo 'Something went wrong: ', $e->getMessage();
}
Going slightly off-topic: you really ought to consider using type-hints. From your code, it's clear that $value is expected to be an array. A type-hint can ensure that the value being passed is in fact an array. You should also get rid of that ugly global $db; and instead pass the connection as an argument, too. That's why I'd strongly suggest you change your function's signature from:
public function setTags($value, $id){
To:
public function setTags(PDO $db, array $value, $id)
{
}
That way, debugging gets a lot easier:
$instance->setTags(123, 123);//in your current code will not fail immediately
$instance->setTags($db, [123], 123);//in my suggestion works but...
$instance->setTags([123], null, '');// fails with a message saying argument 1 instance of PDO expected
http://docs.php.net/mysql_query says:
mysql_query() sends a unique query (multiple queries are not supported) to the currently active database on the server that's associated with the specified link_identifier
If you can use mysqli, perhaps this interests you: mysqli.multi-query
Executes one or multiple queries which are concatenated by a semicolon.
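A rough sketch of what that could look like for the SQL from the question; note that multi_query() offers no parameter binding, so $id and the tag IDs are cast to integers here as a minimal precaution (the connection credentials are placeholders):
// Send the DELETE plus all INSERTs in one multi_query() call.
$mysqli = new mysqli('localhost', 'user', 'pass', 'db');
$id  = (int) $id;
$sql = "DELETE FROM directorycolumntags WHERE directorycolumn_id = $id;";
foreach ($tags as $tag) {
    $sql .= " INSERT INTO directorycolumntags (directorycolumn_id, tag_id) VALUES ($id, " . (int) $tag . ");";
}
if ($mysqli->multi_query($sql)) {
    // Flush every result set, otherwise the connection stays blocked.
    do {
        if ($result = $mysqli->store_result()) {
            $result->free();
        }
    } while ($mysqli->more_results() && $mysqli->next_result());
}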
You cannot run multiple queries using mysql_query; try modifying your function like this. It would be better to use mysqli or PDO instead of mysql, because mysql will soon be deprecated and will not work on newer versions of PHP.
public function setTags($value, $id){
    global $db;
    $tags = $value;
    mysql_query("DELETE FROM directorycolumntags WHERE directorycolumn_id = $id");
    foreach($tags as $tag){
        mysql_query("INSERT into directorycolumntags (directorycolumn_id, tag_id) VALUES (".$id.",".$tag.")");
    }
}
First of all, I apologize if this is answered somewhere else, but I couldn't find anything.
I have problems with the following code:
function register_user ($register_data) {
    global $db;
    array_walk ($register_data, 'array_sanitize');
    $register_data ['password'] = md5 ($register_data ['password']);
    $fields = '`' . implode ('`, `', array_keys ($register_data)) . '`';
    $data = '\'' . implode ('\', \'', $register_data) . '\'';
    $query = $db -> prepare ("INSERT INTO `users` (:fields) VALUES (:data)");
    $query -> bindParam (':fields', $fields);
    $query -> bindParam (':data', $data);
    $query -> execute ();
}
The problem is that this executes without errors, but the query is not run and the row is not inserted in the database.
Now, if I just do this:
$query = $db -> prepare ("INSERT INTO `users` ($fields) VALUES ($data)");
//$query -> bindParam (':fields', $fields);
//$query -> bindParam (':data', $data);
$query -> execute ();
everything works like a charm, so I am guessing the problem is with how I am passing data to the placeholders.
Can someone please explain to me why this is not working? I'd like to understand it properly in the first place.
Thanks in advance for any help.
There are two different use cases that could be described as passing an imploded array to a query placeholder. One is using prepared statements with an IN() clause in SQL; that case is already fully covered in this answer.
Another use case is an insert helper function, like one featured in your question. I've got an article that explains how to create an SQL injection proof insert helper function for PDO_MYSQL.
Given that such a function is not only adding data values to the query but also table and column names, a prepared statement won't be enough to protect from SQL injection. Hence, such a function will need a helper function of its own, to protect table and field names. Here is one for MySQL:
function escape_mysql_identifier($field){
    return "`".str_replace("`", "``", $field)."`";
}
And now we can finally have a function that accepts a table name and an array with data and runs a prepared INSERT query against a database:
function prepared_insert($pdo, $table, $data) {
    $keys = array_keys($data);
    $keys = array_map('escape_mysql_identifier', $keys);
    $fields = implode(",", $keys);
    $table = escape_mysql_identifier($table);
    $placeholders = str_repeat('?,', count($keys) - 1) . '?';
    $sql = "INSERT INTO $table ($fields) VALUES ($placeholders)";
    $pdo->prepare($sql)->execute(array_values($data));
}
that can be used like this:
prepared_insert($pdo, 'users', ['name' => $name, 'password' => $hashed_password]);
The full explanation can be found in the article linked above, but in brief: we create a list of column names from the input array's keys and a list of comma-separated placeholders for the SQL VALUES() clause, and finally we send the input array's values into PDO's execute(). Safe, convenient and concise.
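Since most of the questions on this page are about inserting several rows at once, here is a hypothetical multi-row variant of the same idea. The function name prepared_insert_multi is my own; it reuses escape_mysql_identifier from above and assumes every row has the same keys:
// Build one multi-value INSERT from an array of associative rows.
function prepared_insert_multi($pdo, $table, array $rows) {
    $keys = array_map('escape_mysql_identifier', array_keys(reset($rows)));
    $fields = implode(",", $keys);
    $table = escape_mysql_identifier($table);
    // One "(?,?,...)" group per row.
    $group = '(' . str_repeat('?,', count($keys) - 1) . '?)';
    $placeholders = implode(',', array_fill(0, count($rows), $group));
    $values = [];
    foreach ($rows as $row) {
        $values = array_merge($values, array_values($row));
    }
    $sql = "INSERT INTO $table ($fields) VALUES $placeholders";
    $pdo->prepare($sql)->execute($values);
}
// Usage (hypothetical data):
prepared_insert_multi($pdo, 'users', [
    ['name' => 'Alice', 'password' => $hash1],
    ['name' => 'Bob',   'password' => $hash2],
]);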
I am looking to do multiple inserts using PHP PDO.
The closest answer I have found is this one
how-to-insert-an-array-into-a-single-mysql-prepared-statement
However, the example that's been given uses ?? instead of real placeholders.
I have looked at the examples on the PHP doc site for placeholders:
php.net pdo.prepared-statements
$stmt = $dbh->prepare("INSERT INTO REGISTRY (name, value) VALUES (:name, :value)");
$stmt->bindParam(':name', $name);
$stmt->bindParam(':value', $value);
Now let's say I wanted to achieve the above, but with an array:
$valuesToInsert = array(
    0 => array('name' => 'Robert', 'value' => 'some value'),
    1 => array('name' => 'Louise', 'value' => 'another value')
);
How would I go about it with PDO and multiple inserts per transaction?
I imagine it would start off with a loop?
$stmt = $dbh->prepare("INSERT INTO REGISTRY (name, value) VALUES (:name, :value)");
foreach($valuesToInsert as $insertRow){
    // now loop through each inner array to match binded values
    foreach($insertRow as $column => value){
        $stmt->bindParam(":{$column}", value);
    }
}
$stmt->execute();
However, the above does not work, but hopefully it demonstrates what I'm trying to achieve.
First of all, ? symbols are real placeholders (most drivers allow you to use both syntaxes, positional and named placeholders). Secondly, prepared statements are nothing more than a tool to inject raw input into SQL statements; the syntax of the SQL statement itself is unaffected. You already have all the elements you need:
How to insert multiple rows with a single query
How to generate SQL dynamically
How to use prepared statements with named placeholders.
It's fairly trivial to combine them all:
$sql = 'INSERT INTO table (memberID, programID) VALUES ';
$insertQuery = [];
$insertData = [];
$n = 0;
foreach ($data as $row) {
    $insertQuery[] = '(:memberID' . $n . ', :programID' . $n . ')';
    $insertData['memberID' . $n] = $memberid;
    $insertData['programID' . $n] = $row;
    $n++;
}
if (!empty($insertQuery)) {
    $sql .= implode(', ', $insertQuery);
    $stmt = $db->prepare($sql);
    $stmt->execute($insertData);
}
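Applied to the REGISTRY table and the $valuesToInsert array from the question, the same pattern might look like this (a sketch, not a drop-in tested solution):
// One numbered pair of named placeholders per row.
$placeholders = [];
$bindings = [];
foreach ($valuesToInsert as $n => $row) {
    $placeholders[] = "(:name$n, :value$n)";
    $bindings["name$n"]  = $row['name'];
    $bindings["value$n"] = $row['value'];
}
if (!empty($placeholders)) {
    $stmt = $dbh->prepare(
        'INSERT INTO REGISTRY (name, value) VALUES ' . implode(', ', $placeholders)
    );
    $stmt->execute($bindings);
}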
I'm assuming you are using InnoDB so this answer is only valid for that engine (or any other transaction-capable engine, meaning MyISAM isn't included).
By default InnoDB runs in auto-commit mode. That means each query is treated as its own contained transaction.
To translate that to something us mortals can understand, it means that every INSERT query you issue forces the hard disk to commit it by confirming that it wrote down the query information.
Considering that mechanical hard disks are super slow, since their input/output operations per second are low (if I'm not mistaken, the average is around 300 IOPS), your 50,000 queries will be, well, super slow.
So what do you do? You commit all of your 50k queries in a single transaction. It might not be the best solution for various purposes but it'll be fast.
You do it like this:
$dbh->beginTransaction();
$stmt = $dbh->prepare("INSERT INTO REGISTRY (name, value) VALUES (:name, :value)");
foreach($valuesToInsert as $insertRow)
{
    // now loop through each inner array to match bound values
    foreach($insertRow as $column => value)
    {
        $stmt->bindParam(":$column", value);
        $stmt->execute();
    }
}
$dbh->commit();
A few modifications to the solution provided by N.B:
$stmt->execute() should be outside the inner loop, because you may have more than one column to bind before calling $stmt->execute(); otherwise you'll get the exception "Invalid parameter number: number of bound variables does not match number of tokens".
Second, the "value" variable was missing its dollar sign.
function batchinsert($sql, $params){
    global $db; // assumes $db is an existing PDO connection
    try {
        $db->beginTransaction();
        $stmt = $db->prepare($sql);
        foreach($params as $row)
        {
            // now loop through each inner array to match bound values
            foreach($row as $column => $value)
            {
                $stmt->bindParam(":$column", $value);
            }
            $stmt->execute();
        }
        $db->commit();
    } catch(PDOException $e) {
        $db->rollback();
    }
}
Test:
$sql = "INSERT INTO `test`(`name`, `value`) VALUES (:name, :value)" ;
$data = array();
array_push($data, array('name'=>'Name1','value'=>'Value1'));
array_push($data, array('name'=>'Name2','value'=>'Value2'));
array_push($data, array('name'=>'Name3','value'=>'Value3'));
array_push($data, array('name'=>'Name4','value'=>'Value4'));
array_push($data, array('name'=>'Name5','value'=>'Value5'));
batchinsert($sql,$data);
Your code was actually OK, but it had a problem in $stmt->bindParam(":$column", value);. It should be $stmt->bindValue(":{$column}", $value); and it will work perfectly. This will assist others in the future.
Full code:
foreach($params as $row)
{
    // now loop through each inner array to match bound values
    foreach($row as $column => $value)
    {
        $stmt->bindValue(":{$column}", $value); //EDIT
    }
    // Execute statement to add to transaction
    $stmt->execute();
}
Move execute inside of the loop.
$stmt = $dbh->prepare("INSERT INTO REGISTRY (name, value) VALUES (:name, :value)");
foreach($valuesToInsert as $insertRow)
{
    $stmt->execute($insertRow);
}
If you run into any problems with this recommended approach, ask a separate question describing those specific problems.
How can I implement recursive MySQL queries? I have been trying to find resources on this, but they are not very helpful.
I am trying to implement logic similar to this:
public function initiateInserts()
{
    //Open Large CSV File(min 100K rows) for parsing.
    $this->fin = fopen($file,'r') or die('Cannot open file');
    //Parsing Large CSV file to get data and initiate insertion into schema.
    $query = "";
    while (($data=fgetcsv($this->fin,5000,";"))!==FALSE)
    {
        $query = $query + "INSERT INTO dt_table (id, code, connectid, connectcode)
                 VALUES (" + $data[0] + ", " + $data[1] + ", " + $data[2] + ", " + $data[3] + ")";
    }
    $stmt = $this->prepare($query);
    // Execute the statement
    $stmt->execute();
    $this->checkForErrors($stmt);
}
#Author: Numenor
Error Message: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '0' at line 1
This approach inspired me to look for a MySQL recursive query approach.
Here is the approach I was using earlier:
Current code:
public function initiateInserts()
{
    //Open Large CSV File(min 100K rows) for parsing.
    $this->fin = fopen($file,'r') or die('Cannot open file');
    //Parsing Large CSV file to get data and initiate insertion into schema.
    while (($data=fgetcsv($this->fin,5000,";"))!==FALSE)
    {
        $query = "INSERT INTO dt_table (id, code, connectid, connectcode)
                  VALUES (:id, :code, :connectid, :connectcode)";
        $stmt = $this->prepare($query);
        // Then, for each line : bind the parameters
        $stmt->bindValue(':id', $data[0], PDO::PARAM_INT);
        $stmt->bindValue(':code', $data[1], PDO::PARAM_INT);
        $stmt->bindValue(':connectid', $data[2], PDO::PARAM_INT);
        $stmt->bindValue(':connectcode', $data[3], PDO::PARAM_INT);
        // Execute the statement
        $stmt->execute();
        $this->checkForErrors($stmt);
    }
}
Updated Code
public function initiateInserts()
{
    //Open Large CSV File(min 100K rows) for parsing.
    $this->fin = fopen($file,'r') or die('Cannot open file');
    //Prepare insertion query to insert data into schema.
    $query = "INSERT INTO dt_table (id, code, connectid, connectcode)
              VALUES (:id, :code, :connectid, :connectcode)";
    $stmt = $this->prepare($query);
    // Then, for each line : bind the parameters
    $stmt->bindValue(':id', $data[0], PDO::PARAM_INT);
    $stmt->bindValue(':code', $data[1], PDO::PARAM_INT);
    $stmt->bindValue(':connectid', $data[2], PDO::PARAM_INT);
    $stmt->bindValue(':connectcode', $data[3], PDO::PARAM_INT);
    //Loop through CSV file and execute inserts prepared, but this is not working
    //and no data is being populated into the database.
    while (($data=fgetcsv($this->fin,5000,";"))!==FALSE)
    {
        // Execute the statement
        list($id, $code, $connid, $conncode)=$data;
        $stmt->execute();
        $this->checkForErrors($stmt);
    }
}
This is my main question, for which I am looking for suggestions.
There's nothing recursive in that code snippet.
The wrong operator is used to concatenate the strings; it's . (dot), not +.
You'd have to use something like mysqli::multi_query() to execute more than one statement with a single function call, and the statements would have to be separated by a delimiter character (by default a semicolon).
Since you're already using prepare() and execute() why not simply make it a parametrized prepared statement and then assign the values in each iteration of the loop and execute the statement? (Exactly what is $this and what type of object does $this->prepare() return?)
edit and btw: $this->prepare() indicates that your class extends a database class. And it also holds a file descriptor $this->fin. This has a certain code smell. My guess is that your class uses/has a database/datasink object and a file/datasource, but is not itself a database+readfile class. Only extend a class if your derived class is that kind of thing.
edit: a simple example
class Foo {
    protected $pdo;

    public function __construct(PDO $pdo) {
        $this->pdo = $pdo;
    }

    public function initiateInserts($file)
    {
        $query = '
            INSERT INTO
                dt_table_tmp
                (id, code, connectid, connectcode)
            VALUES
                (:id, :code, :connid, :conncode)
        ';
        $stmt = $this->pdo->prepare($query);
        $stmt->bindParam(':id', $id);
        $stmt->bindParam(':code', $code);
        $stmt->bindParam(':connid', $connid);
        $stmt->bindParam(':conncode', $conncode);

        $fin = fopen($file, 'r') or die('Cannot open file');
        while ( false!==($data=fgetcsv($fin,5000,";")) ) {
            list($id, $code, $connid, $conncode)=$data;
            $stmt->execute();
        }
    }
}
$pdo = new PDO("mysql:host=localhost;dbname=test", 'localonly', 'localonly');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
// set up a demo table and some test data
$pdo->exec('CREATE TEMPORARY TABLE dt_table_tmp (id int, code int, connectid int, connectcode int)');
$sourcepath = 'sample.data.tmp';
$fh = fopen($sourcepath, 'wb') or die('!fopen(w)');
for($i=0; $i<10000; $i++) {
    fputcsv($fh, array($i, $i%4, $i%100, $i%3), ';');
}
fclose($fh); unset($fh);
// test script
$foo = new Foo($pdo);
$foo->initiateInserts($sourcepath);
A few tips about speeding up MySQL data import:
check if your data really needs to be parsed; sometimes LOAD DATA works just fine for CSV
if possible, create an SQL file first via PHP and then execute it with the mysql command-line client
use multi-value inserts
disable keys before inserting
A multi-value insert statement looks something like this:
INSERT INTO users(name, age) VALUES
("Sam", 13),
("Joe", 14),
("Bill", 33);
This is much faster than three distinct insert statements.
Disabling keys is important to prevent indexing each time you're executing an INSERT:
ALTER TABLE whatever DISABLE KEYS;
INSERT INTO whatever .....
INSERT INTO whatever .....
INSERT INTO whatever .....
ALTER TABLE whatever ENABLE KEYS;
Further reading: http://dev.mysql.com/doc/refman/5.1/en/insert-speed.html
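Putting the multi-value tip together with the CSV loop from the earlier question, a rough sketch might batch the rows in chunks so no single statement grows too large (the chunk size of 500, the $pdo variable, and the insertChunk helper name are all assumptions):
// Build one multi-value prepared INSERT per chunk of CSV rows.
$chunkSize = 500;
$fin = fopen($file, 'r') or die('Cannot open file');
$rows = [];
while (($data = fgetcsv($fin, 5000, ";")) !== FALSE) {
    $rows[] = $data;
    if (count($rows) === $chunkSize) {
        insertChunk($pdo, $rows);
        $rows = [];
    }
}
if ($rows) {
    insertChunk($pdo, $rows); // flush the remainder
}
fclose($fin);

// Hypothetical helper: one placeholder group per row, all values flattened.
function insertChunk(PDO $pdo, array $rows) {
    $sql = 'INSERT INTO dt_table (id, code, connectid, connectcode) VALUES '
         . implode(',', array_fill(0, count($rows), '(?,?,?,?)'));
    $values = [];
    foreach ($rows as $row) {
        $values = array_merge($values, array_slice($row, 0, 4));
    }
    $pdo->prepare($sql)->execute($values);
}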
Inspired by this question, I would say you should do something similar. If you really have that much data, then a bulk import is the most appropriate approach, and you already have the data in a file.
Have a look at the LOAD DATA INFILE command.
The LOAD DATA INFILE statement reads rows from a text file into a table at a very high speed. The file name must be given as a literal string.
If you are interested in the speed differences then read Speed of INSERT Statements.
E.g. you can do this:
$query = "LOAD DATA INFILE 'data.txt' INTO TABLE tbl_name
FIELDS TERMINATED BY ';'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES;
"
This will also ignore the first line, assuming that it only contains the column names.
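For completeness, here is a rough sketch of running such a statement through PDO. LOAD DATA LOCAL INFILE needs the PDO::MYSQL_ATTR_LOCAL_INFILE option enabled at connection time; the credentials and file path are placeholders:
// Bulk-load the CSV through PDO using LOAD DATA LOCAL INFILE.
$pdo = new PDO(
    'mysql:host=localhost;dbname=test;charset=utf8',
    'user',
    'pass',
    [
        PDO::ATTR_ERRMODE            => PDO::ERRMODE_EXCEPTION,
        PDO::MYSQL_ATTR_LOCAL_INFILE => true, // required for LOCAL INFILE
    ]
);
$affected = $pdo->exec(
    "LOAD DATA LOCAL INFILE 'data.txt' INTO TABLE tbl_name
     FIELDS TERMINATED BY ';'
     LINES TERMINATED BY '\r\n'
     IGNORE 1 LINES"
);
echo "Loaded $affected rows\n";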