Looping through a variable array and inserting using bind_param - php

I'm trying to loop through several arrays to insert data into a MySQL database, and I'm trying to bind the data so that I can loop through it. There can be a varying number of columns to which data is bound.
It appears that the data I'm binding is not being processed as expected, and the insert ultimately fails.
I have a columns array that stores the column names and data types, and a values array that stores the values to be inserted. Sample data:
$colArr = array (
array('i', 'ID'),
array('s', 'Date')
);
$valArr = array(
array(1, 'now()'),
array(2, 'now()'),
);
//I create my type and query strings as well as the array referencing the columns for binding.
$valStrForQry = rtrim(str_repeat('?, ', count($colArr)), ', '); //result: '?, ?'
$params = array();
$colsForQry = '';
$typeStr = '';
$cntr = 0;
foreach ($colArr as $cols) {
$colsForQry .= $cols[1] . ', ';
$typeStr .= $cols[0];
$params[] = &$valArr[$cntr][1];
$cntr++;
}
$colsForQry = rtrim($colsForQry, ', '); //result: 'ID, Date'
$qry = 'INSERT INTO table (' . $colsForQry . ') VALUES (' . $valStrForQry . ')';
$stmt = $mysqli->prepare($qry);
//Bind the parameters.
call_user_func_array(array($stmt, 'bind_param'), array_merge(array($typeStr), $params));
//Loop through the values array, assign them using eval, and execute the statement. I'm open to suggestions if there's a better way to do this.
foreach ($valArr as $vals) {
$cntr = 0;
foreach ($colArr as $c) {
eval('$' . $c[1] . ' = ' . $vals[$cntr] . ';');
$cntr++;
}
if ($stmt->execute() === FALSE) {
//show $stmt->error for this iteration
} else {
//show success for this iteration
}
}
The first iteration results in a successful insertion of incorrect data. That is, the inserted ID is 0, not 1, and no other info is inserted. The second iteration (and every one after it) results in the following error message: Duplicate entry '0' for key 'PRIMARY'
What am I doing wrong here? Is it the eval or something else? I'm not sure how to track this one down.

Instead of continuing to try to get the existing code working, I'm going to suggest a KISS starting point, without the prepare(), the eval(), or the bind_param().
$cols = ['ID', 'Date'];
$vals = [
[1, '\'now()\''],
[2, '\'now()\''],
];
foreach ($vals as $val)
{
$sql = 'INSERT INTO table (' . implode(', ', $cols) . ') VALUES (' . implode(', ', $val) . ')';
// exec here
}
To make this a bit safer, you'll probably want to escape all the values before the implode, or before/as they are put into the array you're working with. The existing code is, IMHO, trying to be too "clever" to do something so simple.
Alternately, you may want to consider switching to using the PDO library instead of mysqli. PDO supports binding of named parameters on a per-parameter basis, which could be done in a loop without the eval().
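For illustration, here is a minimal sketch of that PDO approach, with made-up connection details and the now() call replaced by a PHP-side timestamp just to keep the binding uniform; treat it as a starting point rather than a drop-in fix:
<?php
// Hypothetical columns and rows, mirroring the question's sample data.
$cols = ['ID', 'Date'];
$rows = [
    ['ID' => 1, 'Date' => date('Y-m-d H:i:s')],
    ['ID' => 2, 'Date' => date('Y-m-d H:i:s')],
];

$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Build "INSERT INTO `table` (ID, Date) VALUES (:ID, :Date)" once.
$placeholders = array_map(function ($c) { return ':' . $c; }, $cols);
$sql = 'INSERT INTO `table` (' . implode(', ', $cols) . ') VALUES (' . implode(', ', $placeholders) . ')';
$stmt = $pdo->prepare($sql);

// Bind each named parameter in a loop, then execute once per row.
foreach ($rows as $row) {
    foreach ($cols as $c) {
        $stmt->bindValue(':' . $c, $row[$c]);
    }
    $stmt->execute();
}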
Someone else may get the provided "clever" solution working instead of course.


Laravel 5.2 update/insert batch similar to Codeigniter's update_batch/insert_batch()

Does Laravel have an update-batch functionality similar to CodeIgniter's?
CodeIgniter uses $this->db->update_batch('mytable', $data, 'title'); to do a batch update. More info can be found here.
But Laravel's update seems to only handle a single row per query, which feels bad when you have multiple rows to update inside a for loop. Something similar to this:
foreach ($rows as $row) {
DB::table('users')->where('id', $row['row_id'])->update(['votes' => 1]);
}
At least you get the picture, right?
If you look at this code, your database could get knocked out pretty quickly as it keeps on connecting, unlike update_batch(), where only a single query is thrown.
TL;DR - it is not clear that the CodeIgniter (CI) method is more efficient than a series of UPDATE queries. In addition, the CI method is less clear than a looped series of UPDATEs.
It is not correct to say that Laravel "keeps on connecting" - it will open a connection at the start of the request and keep using that connection until the request is finished. I think what you mean to say is that you are sending a whole lot of queries to the server.
It is true that doing this:
INSERT INTO table VALUES (1, ...);
INSERT INTO table VALUES (2, ...);
INSERT INTO table VALUES (3, ...);
...
INSERT INTO table VALUES (n, ...);
is going to be less efficient than doing this:
INSERT INTO table VALUES (1, ...),
(2, ...),
(3, ...),
...
(n, ...);
But this is not a simple INSERT. Look at the code generated by the CI library in the link you posted. What we're looking at is the difference in efficiency between:
UPDATE table SET value=1 WHERE id=1;
UPDATE table SET value=2 WHERE id=2;
UPDATE table SET value=3 WHERE id=3;
...
UPDATE table SET value=n WHERE id=n;
and
UPDATE table
SET value = CASE
WHEN id = 1 THEN 1
WHEN id = 2 THEN 2
WHEN id = 3 THEN 3
...
WHEN id = n THEN n
ELSE value END
WHERE id IN (1, 2, 3, ..., n);
I'm no SQL expert, but I'm not convinced that the second is more efficient than the first. That long chain of WHEN clauses is going to be processed by the SQL server - a bunch of short UPDATE statements is (in my opinion) going to be more efficient, particularly when you're using an indexed column in the WHERE clause.
Finally, looking at the CI documentation, that update_batch method makes some assumptions about the structure of the data you pass to it - for example, that the first item in the array is the key for the update statements. That (to me at least) is not going to be clear when you look at your code six months from now.
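To make the comparison concrete, here is a rough, hypothetical sketch of how a CASE-based batch UPDATE of that shape could be assembled in plain PHP; the table and column names are invented, and it is only meant to show what the single big statement looks like, not to endorse it:
<?php
// Hypothetical rows: each entry maps an id to its new `votes` value.
$rows = [
    ['id' => 1, 'votes' => 10],
    ['id' => 2, 'votes' => 20],
    ['id' => 3, 'votes' => 30],
];

$cases = [];
$ids   = [];
foreach ($rows as $row) {
    $id      = (int) $row['id'];
    $votes   = (int) $row['votes'];
    $cases[] = "WHEN id = {$id} THEN {$votes}";
    $ids[]   = $id;
}

// One long statement, roughly what update_batch() sends to the server.
$sql = 'UPDATE users SET votes = CASE '
     . implode(' ', $cases)
     . ' ELSE votes END WHERE id IN (' . implode(', ', $ids) . ')';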
Maybe this will help:
trait InsertOrUpdate {
static function InsertOrUpdate(array $rows) {
$table = DB::getTablePrefix().with(new self)->getTable();
if (empty($table)) {
return false;
}
$maxRowData = DB::table($table)->where('id', DB::raw('(select max(`id`) from ' . $table . ')'))->first();
if (! empty($maxRowData)) {
$maxId = $maxRowData->id;
$result = DB::statement('ALTER TABLE ' . $table . ' AUTO_INCREMENT = ' . $maxId . ';');
}
$tableColumns = DB::getSchemaBuilder()->getColumnListing($table);
$datetime = Carbon::now()->toDateTimeString();
if (in_array('created_at', $tableColumns)) {
foreach ($rows as $key => $row) {
$rows[$key]['created_at'] = $datetime;
$rows[$key]['updated_at'] = $datetime;
}
}
$first = reset($rows);
$columns = implode(',',
array_map(function($value) {
return "$value";
},
array_keys($first))
);
$values = implode(',', array_map(function($row) {
return '('.implode( ',',
array_map(function($value) { return '"' . str_replace('"', '""', $value) . '"'; }, $row)
).')';
} , $rows)
);
$updates = '';
if (in_array('updated_at', $tableColumns)) {
unset($first['created_at']);
unset($first['updated_at']);
$first['deleted_at'] = NULL;
$updateString = '(CASE WHEN ';
$lastColumn = count($first);
$columnNum = 1;
foreach (array_keys($first) as $column) {
$updateString .= $column . ' <> VALUES(' . $column . ')';
if ($columnNum != $lastColumn) {
$updateString .= ' OR ';
}
$columnNum++;
}
$updateString .= ' THEN \'' . $datetime . '\' ELSE `updated_at` END), ';
$updates .= 'updated_at = ' . $updateString;
}
$updates .= implode(',',
array_map(function($value) {return "$value = VALUES($value)"; } , array_keys($first) )
);
$sql = "INSERT INTO {$table}({$columns}) VALUES {$values} ON DUPLICATE KEY UPDATE {$updates};";
return DB::statement($sql);
}
}
Look into database transactions...
https://laravel.com/docs/5.2/database#database-transactions
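As a rough sketch (assuming the same users table and $rows structure as in the question), the looped updates could be wrapped in a single transaction like this:
<?php
use Illuminate\Support\Facades\DB;

// $rows is assumed to be the same array the question loops over.
// All per-row UPDATEs share one connection and are committed together;
// if any of them fails, the whole batch is rolled back.
DB::transaction(function () use ($rows) {
    foreach ($rows as $row) {
        DB::table('users')
            ->where('id', $row['row_id'])
            ->update(['votes' => 1]);
    }
});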

Is there a shortcut for binding named PDO params for MySQL inserts?

PDO seems to require a lot of repetition if you want to use named parameters. I was looking for a way to make it simpler, using a single instance of column/data pairs -- without having to re-type column names or even variable names multiple times.
I'm answering this question myself because I wrote a function that I think does this pretty elegantly, and basically, I wanted to show it off (and help people looking to do the same).
I'm not at all sure if I'm the first one to think of this, or if there are any issues I didn't foresee. Feel free to let me know, or supply your own solution, if you have something better.
Starting from @equazcion's answer, but using a slightly different approach:
function bindFields($fields) {
return implode(",", array_map(function ($f) { return "`$f`=:$f"; },
array_keys($fields)));
}
Or if you want traditional INSERT syntax instead of the MySQL-specific INSERT...SET syntax:
function bindFields($fields) {
return "(" . implode(",", array_map(function ($f) { return "`$f`"; },
array_keys($fields))) . ")"
. " VALUES (" . implode(",", array_map(function ($f) { return ":$f"; },
array_keys($fields))) . ")";
}
function bindFields($fields){
end($fields);
$lastField = key($fields);
$bindString = ' ';
foreach($fields as $field => $data){
$bindString .= $field . '=:' . $field;
$bindString .= ($field === $lastField ? ' ' : ',');
}
return $bindString;
}
Supply the data to be inserted using a single associative array. Then, use bindFields() on that array, to generate a string of column = :column pairs for the MySQL query:
$data = array(
'first_column' => 'column data string',
'second_column' => 'another column data string',
'another_column' => 678,
'one_more_field' => 'something'
);
$query = "INSERT INTO tablename SET" . bindFields($data);
$link = new PDO('mysql:host=your-hostname.com;dbname=your_dbname', 'db_username', 'db_pass');
$prepared = $link->prepare($query);
$prepared->execute($data);
bindFields($data) output:
first_column=:first_column,second_column=:second_column,another_column=:another_column,one_more_field=:one_more_field

PDO INSERT not working with $_POST Variables

I'm pulling my hair out over this- hopefully it's an easy oversight.
I'm planning on sending a bunch of variables to this PHP file with a jQuery AJAX function. I wrote this section to assign all $_POST variables to PHP variables:
foreach($_POST as $key => $value){
$$key = $value;
}
and it seems to be working, because I can manipulate the variables like so:
echo 'name: ' . $name . '<br>';
echo 'main_pic: ' . $main_pic . '<br>';
echo 'product_pic: ' . $product_pic . '<br>';
echo 'more_pic: ' . $more_pic . '<br>';
echo 'paypal_code: ' . $paypal_code . '<br>';
echo 'category_id: ' . $category_id . '<br>';
echo 'price: ' . $price . '<br>';
echo 'description: ' . $description . '<br>';
echo 'product_color: ' . $product_color . '<br>';
echo 'design_color: ' . $design_color;
So now that I have those, I want to insert them into my table-
$qry = $pdo->prepare("INSERT INTO inventory (name, main_pic, product_pic, more_pic, paypal_code, category_id, price, description, product_color, design_color)
VALUES (:name, :main_pic, :product_pic, :more_pic, :paypal_code, :category_id, :price, :description, :product_color, :design_color)");
$qry->bindParam(':name', $name);
$qry->bindParam(':main_pic', $main_pic);
$qry->bindParam(':product_pic', $product_pic);
$qry->bindParam(':more_pic', $more_pic);
$qry->bindParam(':paypal_code', $paypal_code);
$qry->bindParam(':category_id', $category_id);
$qry->bindParam(':price', $price);
$qry->bindParam(':description', $description);
$qry->bindParam(':product_color', $product_color);
$qry->bindParam(':design_color', $design_color);
$qry->execute();
This doesn't run, and I'm not sure of the best way to log an error to see why. If I manually assign the variables and comment out my earlier $_POST shenanigans, everything seems to work and the INSERT runs fine.
Any clues? I thought it might be because the database is expecting a certain variable type, but I think I've fully explored that.
Anyone know of any reasons that manually assigning the variables would work but grabbing them from $_POST wouldn't?
EDIT: following those suggestions, I've got an error message
SQLSTATE[23000]: Integrity constraint violation: 1062 Duplicate entry '0' for key 1
Alright, so I'm not assigning a unique Primary key when I'm trying this insert- I was under the assumption that PDO would handle that. What is the best way to handle assigning a unique Primary key? I'd like to avoid having the user manually assign it.
This is not an answer, but it will look better than a comment. Why are you doing the following:
foreach($_POST as $key => $value){
$$key = $value;
}
This is register globals all over again (sort of). With the above code you can easily overwrite local variables with unexpected results, or even worse, introduce security vulnerabilities.
What if I post $_POST['is_admin'] = 1 or something like that? Either way I think you get the idea. What you just did is bad and can be dangerous.
Like @PeeHaa said, this is not an answer, but to expand on my comment on his: drop the pseudo register-globals functionality and opt for a whitelist of fields, coordinating the names with those posted.
$fields = ['name', 'age', 'sex'];
$query = $pdo->prepare(sprintf('INSERT INTO `table` (%s) VALUES (%s)',
implode(',', $fields),
implode(',', array_map(function($field) {
return ":{$field}";
}, $fields))));
foreach($fields as $field) {
$query->bind(":{$field}", $_POST[$field]);
}
$query->execute();
Obviously, this needs more validation, empty() checking, etc., but you get the idea. Furthermore, you can add more validation/sanitization with a callback lookup:
$sanitizers = [
'sex' => function($value) {
$value = strtolower($value);
return in_array($value, ['male', 'female', 'unknown'])
? $value : 'unknown';
},
];
foreach($sanitizers as $field => $sanitizer) {
if (isset($_POST[$field])) {
$_POST[$field] = $sanitizer($_POST[$field]);
}
}
If the POST contains "genderless" for "sex", you'll get "unknown" instead.
More complete example:
// whitelist keys and sanitizer values
$fields = [
// limit to 255
'name' => function($value) {
return substr($value, 0, 255);
},
// you can't be that old
'age' => function($value) {
return min(max((int) $value, 0), 100);
},
// starfish need not apply
'sex' => function($value) {
$value = strtolower($value);
return in_array($value, ['male', 'female', 'unknown'])
? $value : 'unknown';
},
];
// build ye' old query
$query = $pdo->prepare(sprintf('INSERT INTO `table` (%s) VALUES (%s)',
implode(',', array_keys($fields)),
implode(',', array_map(function($field){
return ":{$field}";
}, array_keys($fields)))));
// loop dee doop to sanitize and bind
foreach ($fields as $field => $sanitizer) {
if (is_callable($sanitizer)) {
$query->bindValue(":{$field}", $sanitizer($_POST[$field]));
continue;
}
$query->bindValue(":{$field}", $_POST[$field]);
}
// fire the cannons!
$query->execute();
Empty your database table and see if it works... Likely you have a primary key in your db table that you are trying to overwrite.
Check your db table as well, using EXPLAIN inventory
If you want, you can have an auto-incremented primary key. Supposing you use MySQL:
http://dev.mysql.com/doc/refman/5.0/en/example-auto-increment.html
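For example, a minimal sketch with made-up table and column names: give the id column AUTO_INCREMENT, leave it out of the INSERT, and let MySQL (not PDO) generate the key; lastInsertId() then returns it.
<?php
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// The primary key is generated by MySQL itself.
$pdo->exec('CREATE TABLE IF NOT EXISTS inventory_demo (
    id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    name VARCHAR(255) NOT NULL
)');

// No id in the column list: MySQL assigns the next value automatically.
$stmt = $pdo->prepare('INSERT INTO inventory_demo (name) VALUES (:name)');
$stmt->execute([':name' => 'Widget']);

echo $pdo->lastInsertId(); // e.g. "1"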
The OP said:
Alright, so I'm not assigning a unique Primary key when I'm trying
this insert- I was under the assumption that PDO would handle that.
What is the best way to handle assigning a unique Primary key? I'd
like to avoid having the user manually assign it.
Just addressing this part of your question - you just use an unquoted 0:
insert into table (id, name) values (0, 'Bob');
But as you say, you should not have to if it is an id which is auto-incrementing correctly.
In old versions of MySQL (< 5) you could use an empty string, which blew up when 5.0 came out; just in case anyone reading this fell foul of that previously undocumented feature.

PHP MySQL INSERT 1-3,000 rows as quickly and efficently as possible

I am looking for the fastest way to INSERT 1-3,000 rows into a MySQL database using PHP. My current solution takes around 42 seconds to insert the rows, which I think could be much faster.
I am using a self-written DB class; the insert() method takes two params, (string) $table and (array) $vars. $vars is an associative array where the key is the column name in the table and the value is the value to insert. This works really well for me because I sometimes have 30 columns in a table and already have the data there in an array. The insert() method is below:
function insert($table,$vars) {
if(empty($this->sql_link)){
$this->connection();
}
$cols = array();
$vals = array();
foreach($vars as $key => $value) {
$cols[] = "`" . $key . "`";
$vals[] = "'" . $this->esc($value) . "'";
}
//join the columns and values to insert into sql
$fields = join(', ', $cols);
$values = join(', ', $vals);
$insert = mysql_query("INSERT INTO `$table` ($fields) VALUES ($values);", $this->sql_link);
return $insert;
}
It should be self-explanatory, but basically I take the keys and values from $vars and create an INSERT statement. It works; I think the problem I am having is sending the queries one at a time.
Should I build a long query string?
INSERT INTO table (field, field2, etc) VALUES (1, 2, etc);INSERT INTO table (field, field2, etc) VALUES (1, 2, etc);INSERT INTO table (field, field2, etc) VALUES (1, 2, etc);INSERT INTO table (field, field2, etc) VALUES (1, 2, etc);INSERT INTO table (field, field2, etc) VALUES (1, 2, etc); and send it all at one time? If so, can this handle 3,000 insert statements in one call?
Is there another way I am not looking at? Any info is appreciated.
Thanks
The most performant way is to use the multiple-row insert syntax:
INSERT INTO table (field, field2, etc) VALUES (1, 2, etc),(1, 2, etc),(1, 2, etc);
Manual:
INSERT statements that use VALUES syntax can insert multiple rows. To do this, include multiple lists of column values, each enclosed within parentheses and separated by commas. Example:
INSERT INTO tbl_name (a,b,c) VALUES(1,2,3),(4,5,6),(7,8,9);
The values list for each row must be enclosed within parentheses.
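As a sketch of what that might look like from PHP with PDO (table and column names follow the question's placeholders, and the connection details are made up), you can build one multi-row statement and bind everything positionally; for very large batches you would still want to chunk it to stay under max_allowed_packet and the placeholder limit:
<?php
// Hypothetical data: 3,000 rows of [field, field2] pairs.
$rows = [];
for ($i = 0; $i < 3000; $i++) {
    $rows[] = [$i, 'value ' . $i];
}

$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// One "(?, ?)" group per row, with all values flattened for execute().
$placeholders = implode(', ', array_fill(0, count($rows), '(?, ?)'));
$values = [];
foreach ($rows as $row) {
    $values[] = $row[0];
    $values[] = $row[1];
}

$stmt = $pdo->prepare("INSERT INTO `table` (field, field2) VALUES $placeholders");
$stmt->execute($values);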
Two ways to improve insertion speed:
At the start, before any INSERT, do a mysql_query("START TRANSACTION"); or the simpler mysql_query("BEGIN");. At the end, do a mysql_query("COMMIT");. These two lines speed up the bulk insertion by 5-10x.
If the table engine is MyISAM (NOT InnoDB), prefix the INSERTs with the word DELAYED. For example, instead of INSERT INTO table use INSERT DELAYED INTO table for an additional 10-15x speed-up.
If you combine the two methods, it is possible to achieve a speed-up of 100 times.
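A rough sketch of the first suggestion, assuming the DB class from the question and that every insert goes over the same (last-opened) mysql_* connection:
<?php
// $db is assumed to be the self-written DB class; $allRows is an array
// of associative row arrays, one per row to insert.
mysql_query('START TRANSACTION'); // or simply mysql_query('BEGIN');
foreach ($allRows as $row) {
    $db->insert('mytable', $row);
}
mysql_query('COMMIT');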
MySQL can import data directly from a file, which can significantly speed up importing data. See:
LOAD DATA INFILE Syntax
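As a hedged sketch of what that could look like from PHP (the file path, table, and column names are all hypothetical; LOCAL loading also has to be enabled on both client and server):
<?php
// Hypothetical CSV file whose columns match the target table.
$pdo = new PDO(
    'mysql:host=localhost;dbname=test',
    'user',
    'pass',
    [PDO::MYSQL_ATTR_LOCAL_INFILE => true] // required for LOAD DATA LOCAL
);

$sql = "LOAD DATA LOCAL INFILE '/tmp/rows.csv'
        INTO TABLE `table`
        FIELDS TERMINATED BY ','
        LINES TERMINATED BY '\\n'
        (field, field2)";

$affected = $pdo->exec($sql);
echo "$affected rows loaded\n";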
<?php
$data = "data/fullz.txt";
$db = new PDO("sqlite:db/ssninfo.db");
$db->beginTransaction();
$stmt = $db->prepare('INSERT INTO ssninfo (fname,lname,ssn,address,city,state,zip,phone,birth,email) VALUES (?,?,?,?,?,?,?,?,?,?)');
if($file=fopen($data, "r")){
while(!feof($file)){
$line = fgets($file);
$part = explode('|', $line);
$stmt->execute($part);
}
}
$db->commit();
As usual, it depends; you don't even mention which engine you're using, which is a big determinant. But I've found the MySQL manual guidance pretty reliable.
http://dev.mysql.com/doc/refman/5.0/en/insert-speed.html
Auto-discovering the maximum amount of inserts.
To insert that kind of amount (3,000), there should not be any problem doing something like this (assuming you use PDO):
$stmt = $dbh->prepare("INSERT INTO yourtable(name, id) VALUES " . str_repeat('(?,?),', $amountOfRows - 1) . '(?,?)');
You can improve on that to create a generic way to build big statements like the one above for tables with a different number of fields:
$fields = array("name", "id");
$fieldList = implode(", ", $fields);
$params = '(' . str_repeat('?,', count($fields) - 1) . '?)';
$values = str_repeat($params . ',', $ammountOfRows - 1) . $params;
$stmt = $dbh->prepare("INSERT INTO $table($fieldList) VALUES " . $values);
But the problem with the above solution is that it won't work with every combination of rows and number of fields.
It seems that MySQL is not only limited by the number of rows but that the number of parameters is also taken into account.
But you don't want to be changing your code whenever a new MySQL release changes the limit on the parameters, the rows, or even the size of the SQL statement.
So, a much better approach to create a generic way to generate big statements would be to probe the underlying database engine:
/**
* Creates an insert SQL statement with the maximum allowed number of parameters
* @param string $table
* @param array $attributeList
* @param int $max
* @param int &$amountInserts returns the amount of inserts
* @return \PDOStatement
*/
public static function getBiggestInsertStatement($table, $attributeList, $max, &$amountInserts)
{
$previousSize = null;
$size = 10;
$sql = 'INSERT INTO ' . $table . '(' . implode(',', $attributeList) . ') values ';
$return = null;
$params = '(' . str_repeat('?,', count($attributeList) - 1) . '?)';
do {
try {
$previousSize = $size;
$values = str_repeat($params . ',', $size - 1) . $params;
$return = Db::getInstance()->prepare($sql . $values);
if ($size > $max) {
$values = str_repeat($params . ',', $max - 1) . $params;
$return = Db::getInstance()->prepare($sql . $values);
$amountInserts = $max;
break;
}
$amountInserts = $size;
$size *= 2;
} catch(\Exception $e) {
}
} while($previousSize != $size);
return $return;
}
One thing you must keep in mind is that, since you don't know those limits, the query may only be able to push a smaller number of items than the total you need to insert.
So you would have to create a strategy like the one below to successfully insert them all in any possible scenario:
$insert = Db::getBiggestInsertStatement($table, array('field1','field2'), $numrows, $maximumInserts);
$i = 0;
$values = array();
for ($j = 0; $j < $numrows; $j++) {
if ($i === $maximumInserts) {
$insert->execute($values);
$i = 0;
$values = array();
}
$values[] = "value1" . $j;
$values[] = "value2" . $j;
$i++;
}
if ($i > 0) {
$insertRemaining = Db::getBiggestInsertStatement($table, array('field1', 'field2'), $i, $maximumInserts);
$insertRemaining->execute($values);
}
I have tried to insert 1,000,000 rows into a table with a single column, and it was done within seconds, against the minutes it would take to insert them one by one.
The standard technique for speeding up bulk inserts is to use a prepared SQL statement inside a loop inside a transaction. That will make it pretty well optimal. After that you could try tweaking it in various ways, but you are probably wasting your time.
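For completeness, a minimal sketch of that pattern with mysqli (the connection details, table, and $rows array are all illustrative):
<?php
$mysqli = new mysqli('localhost', 'user', 'pass', 'test');

// Prepare once, execute many times inside one transaction.
$stmt = $mysqli->prepare('INSERT INTO `table` (field, field2) VALUES (?, ?)');

$mysqli->begin_transaction();
foreach ($rows as $row) {            // $rows: array of [int, string] pairs
    $stmt->bind_param('is', $row[0], $row[1]);
    $stmt->execute();
}
$mysqli->commit();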

php $_POST to get values - not the best way

EDIT:
Thank you so much for your answers, you really amaze me with so much wisdom :)
I am trying to rely on TuteC's code, changed a bit, but I can't figure out how to make it work properly:
$valor = $_POST['valor'];
$post_vars = array('iphone3g1', 'iphone3g2', 'nome', 'iphone41', 'postal', 'apelido');
foreach($post_vars as $var) {
   $$var = "'" . mysql_real_escape_string($_POST[$var]). "', ";
}
$sql = "INSERT INTO clientes (iphone3g1, iphone3g2, nome, iphone41, postal, apelido, valor) VALUES ($$var '$valor')";
$query= mysql_query($sql);
I know there's a bit of cheating in the code; I would need to use a substring so the $$var wouldn't output a "," at the end where I need the values. Instead, I tried to insert a variable that holds a value ($valor = $_POST['valor'];).
What is going wrong?
And for the others who tried to help me, thank you very much; I am learning so much with you here at Stack Overflow.
I have a form with several field values, and when I tried to write a PHP file that reads those values it came out a monstrosity:
$codigounico= md5(uniqid(rand()));
$modelo=$_POST['selectName'];
$serial=$_POST['serial'];
$nif=$_POST['nif'];
$iphone3g1=$_POST['iphone3g1'];
$iphone3g2=$_POST['iphone3g2'];
$iphone3g3=$_POST['iphone3g3'];
$iphone3g4=$_POST['iphone3g4'];
$iphone3gs1=$_POST['iphone3gs1'];
$iphone3gs2=$_POST['iphone3gs2'];
$iphone3gs3=$_POST['iphone3gs3'];
$iphone3gs4=$_POST['iphone3gs4'];
$iphone41=$_POST['iphone41'];
$iphone42=$_POST['iphone42'];
$iphone43=$_POST['iphone43'];
$iphone44=$_POST['iphone44'];
$total=$_POST['total'];
$valor=$_POST['valor'];
$nome=$_POST['nome'];
$apelido=$_POST['apelido'];
$postal=$_POST['postal'];
$morada=$_POST['morada'];
$notas=$_POST['notas'];
$sql="INSERT INTO clientes (postal, morada, nome, apelido, name, serial, iphone3g1, iphone3g2, iphone3g3, iphone3g4, total, valor, iphone3gs1, iphone3gs2, iphone3gs3, iphone3gs4, iphone41, iphone42, iphone43, iphone44, nif, codigounico, Notas)VALUES('$postal', '$morada', '$nome', '$apelido', '$modelo', '$serial', '$iphone3g1', '$iphone3g2', '$iphone3g3', '$iphone3g4', '$total', '$valor', '$iphone3gs1', '$iphone3gs2', '$iphone3gs3', '$iphone3gs4', '$iphone41', '$iphone42', '$iphone43', '$iphone44', '$nif', '$codigounico', '$notas')";
$result=mysql_query($sql);
This is very difficult code to maintain;
can I make my life easier?
To restrict which POST variables you "import", you can do something like:
$post_vars = array('iphone3g1', 'iphone3g2', '...');
foreach($post_vars as $var) {
$$var = mysql_real_escape_string($_POST[$var]);
}
EDIT: Replaced addslashes with mysql_real_escape_string (thanks @Czechnology).
The issue I see is repetition of the same names four times over. This is how I would reduce it to two occurrences (you could drop it to one with more finagling).
$sql = 'INSERT INTO clientes (postal, morada, nome, apelido, name, serial, iphone3g1, iphone3g2, iphone3g3, iphone3g4, total, valor, iphone3gs1, iphone3gs2, iphone3gs3, iphone3gs4, iphone41, iphone42, iphone43, iphone44, nif, codigounico, Notas) VALUES(:postal, :morada, :nome, :apelido, :modelo, :serial, :iphone3g1, :iphone3g2, :iphone3g3, :iphone3g4, :total, :valor, :iphone3gs1, :iphone3gs2, :iphone3gs3, :iphone3gs4, :iphone41, :iphone42, :iphone43, :iphone44, :nif, :codigounico, :notas)';
preg_match_all('/:(\w+)/', $sql, $inputKeys);
$tokens = $inputKeys[0];
$values = array_map(function($k){
return "'" . mysql_real_escape_string($_POST[$k]) . "'";
}, $inputKeys[1]);
$sql = str_replace($tokens, $values, $sql);
$result = mysql_query($sql);
Depending on how you want to separate your logic, a reversed approach might be more useful, where you would specify the array of key names and iterate over that to generate the SQL string.
<?php
$inputKeys = array('postal', 'morada', 'nome', 'apelido', 'name', 'serial', 'iphone3g1', 'iphone3g2', 'iphone3g3', 'iphone3g4', 'total', 'valor', 'iphone3gs1', 'iphone3gs2', 'iphone3gs3', 'iphone3gs4', 'iphone41', 'iphone42', 'iphone43', 'iphone44', 'nif', 'codigounico', 'Notas');
$keyList = '(' . implode(',', $inputKeys) . ')';
$valueList = 'VALUES (';
foreach ($inputKeys as $k) {
$valueList .= "'" . mysql_real_escape_string($_POST[$k]) . "'";
$valueList .= ',';
}
$valueList = rtrim($valueList, ',');
$valueList .= ')';
$sql = 'INSERT INTO clientes '.$keyList.' '.$valueList;
$result = mysql_query($sql);
This approach drops the occurrences of the keys to one and will probably fit more naturally with your application.
TuteC had a good aim but failed on the details.
It makes me wonder why no one has a ready-made solution, but everyone has to devise it on the fly. Has nobody faced the same problem before?
And why do most people try to solve only part of the problem, getting the variables only?
The goal is not to get variables.
The goal is to get a query. So, get yourself a query.
//quite handy way to define an array, saves you from typing zillion quotes
$fields = explode(" ","postal morada nome apelido name serial iphone3g1 iphone3g2 iphone3g3 iphone3g4 total valor iphone3gs1 iphone3gs2 iphone3gs3 iphone3gs4 iphone41 iphone42 iphone43 iphone44 nif codigounico Notas");
$sql = "INSERT INTO clientes SET ";
foreach ($fields as $field) {
if (isset($_POST[$field])) {
$sql.= "`$field`='".mysql_real_escape_string($_POST[$field])."', ";
}
}
$sql = substr($sql, 0, -2);
This code will create a query for you without the boring repetition of the same field name many times.
But that's still not all the improvements you can make.
A really neat thing is called a function.
function dbSet($fields) {
$set = '';
foreach ($fields as $field) {
if (isset($_POST[$field])) {
$set.="`$field`='".mysql_real_escape_string($_POST[$field])."', ";
}
}
return substr($set, 0, -2);
}
Put this function into your code library, which gets included into all your scripts (you have one, don't you?),
and then use it for both INSERT and UPDATE queries:
$_POST['codigounico'] = md5(uniqid(rand()));//a little hack to add custom field(s)
if ($action=="update") {
$id = intval($_POST['id']);
$sql = "UPDATE $table SET ".dbSet($fields)." WHERE id = $id";
}
if ($action=="insert") {
$sql = "INSERT $table SET ".dbSet($fields);
}
So your code becomes extremely short, reliable, and even reusable.
The only thing you have to change to handle another table is the $fields array.
It seems your database is not well planned as it contains seemingly repetitive fields (iphone*). You have to normalize your database.
The same approach used with prepared statements can be found in this question of mine: Insert/update helper function using PDO
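For reference, a hedged sketch of the same idea with PDO placeholders instead of string escaping; the helper name dbSetPdo() is made up here, and $fields and $pdo are assumed to exist as above:
<?php
// Builds "`field1` = :field1, `field2` = :field2" for whichever
// whitelisted fields are present in $_POST, collecting the bind values.
function dbSetPdo(array $fields, array &$params) {
    $set = [];
    foreach ($fields as $field) {
        if (isset($_POST[$field])) {
            $set[] = "`$field` = :$field";
            $params[":$field"] = $_POST[$field];
        }
    }
    return implode(', ', $set);
}

$params = [];
$sql = 'INSERT INTO clientes SET ' . dbSetPdo($fields, $params);
$stmt = $pdo->prepare($sql);
$stmt->execute($params);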
You could use a rather ugly part of PHP called variable variables, but it is generally considered a poor coding practice. You could include your database escaping at the same time. The code would look something like:
foreach($_POST as $key => $value){
$$key = mysql_real_escape_string($value);
}
The variable variables manual section says they do not work with superglobal arrays like $_POST inside functions or class methods, but I think it may work in this case. I am not somewhere where I can test right now.
PHP: extract
Be careful though and make sure you clean the data before using it.
$set = array();
$keys = array('forename', 'surname', 'email');
foreach($keys as $val) {
$safe_value = mysqli_escape_string($db, $_POST[$val]);
array_push($set, "$val='$safe_value'");
}
$set_query = implode(',', $set);
Then make your MySQL query something like UPDATE table SET $set_query WHERE... or INSERT INTO table SET $set_query.
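For example (the table name here is made up, and $db is the connection used above):
// $set_query was built above from the whitelisted keys.
$sql = "INSERT INTO members SET $set_query";
if (!mysqli_query($db, $sql)) {
    echo 'Insert failed: ' . mysqli_error($db);
}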
If you need to validate, trim, etc, do it before the above code like this:
$_POST["surname"] = trim($_POST["surname"];
Actually, you could make your life easier by making your code a bit more complicated - escape the input before inserting into the database!
$sql =
"INSERT INTO clientes SET " .
"postal = '" . mysql_real_escape_string($_POST['postal']) . "', " .
"morada = '" . mysql_real_escape_string($_POST['morada']) . "', " .
...
First, I recommend that you create a key-value array like this:
$newClient = array(
'codigounico' => md5(uniqid(rand())),
'postal' => $_POST['postal'],
'modelo' => $_POST['selectName'],
...
);
In this array, the key is the column name in your MySQL table.
In the code you've provided, not every field is copied straight from the POST array (some are calculated, and some POST keys don't match the table's column names), so you should use a flexible method.
You should still specify all columns and values, but only once, so the code stays maintainable and you won't have any security errors if someone sends you a broken POST. To me it looks more like configuration than coding.
Then I recommend you write a function similar to this:
function buildInsertQuery($tableName, $keyValue) {
$result = '';
if (!empty($keyValue)) {
$delimiter = ', ';
$columns = '';
$values = '';
foreach ($keyValue as $key => $value) {
$columns .= $key . $delimiter;
$values .= "'" . mysql_real_escape_string($value) . "'" . $delimiter;
}
$columns = substr($columns, 0, -strlen($delimiter));
$values = substr($values, 0, -strlen($delimiter));
$result = 'INSERT INTO `' . $tableName . '` (' . $columns . ') VALUES (' . $values . ')';
}
return $result;
}
And then you can simply build your query with just one function call:
$query = buildInsertQuery('clientes', $newClient);
