Best way to do a multi-row INSERT into a database in PHP? - php

What is the best way to insert multiple rows into a database at once?
For example, I have an array and I would like to add every element of it to the database. I could create a foreach loop and insert each element:
$array = ['apple', 'orange'];
foreach ($array as $v) {
    $stmt = $db->exec("INSERT INTO test (fruit) VALUES ('$v')");
}
It works, but maybe I should use a transaction? Or do it some other way?

Use a prepared statement.
$sql = "INSERT INTO test (fruit) VALUES ";
$sql .= implode(', ', array_fill(0, count($array), '(?)'));
$stmt = $db->prepare($sql);
$stmt->execute($array);
The SQL will look like:
INSERT INTO test (fruit) VALUES (?), (?), (?), ...
where there are as many (?) as the number of elements in $array.
Doing a single query with many VALUES is much more efficient than performing separate queries in a loop.
If you have an associative array with input values for a single row, you can use a prepared query like this:
$columns = implode(',', array_keys($array));
$placeholders = implode(', ', array_fill(0, count($array), '?'));
$sql = "INSERT INTO test($columns) VALUES ($placeholders)";
$stmt = $db->prepare($sql);
$stmt->execute(array_values($array));
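If you instead need to insert several rows that are associative arrays sharing the same keys, the two ideas can be combined. A sketch, assuming $rows is non-empty, every row has identical keys, and those keys are trusted (they are interpolated into the SQL), with $db a PDO connection:
$rows = [
    ['fruit' => 'apple',  'color' => 'red'],
    ['fruit' => 'orange', 'color' => 'orange'],
];

$columns        = implode(', ', array_keys($rows[0]));
$rowPlaceholder = '(' . implode(', ', array_fill(0, count($rows[0]), '?')) . ')';
$placeholders   = implode(', ', array_fill(0, count($rows), $rowPlaceholder));

$stmt = $db->prepare("INSERT INTO test ($columns) VALUES $placeholders");

// Flatten the row values into one positional parameter list.
$params = [];
foreach ($rows as $row) {
    foreach ($row as $value) {
        $params[] = $value;
    }
}
$stmt->execute($params);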

The way you've done it is, in many ways, the worst option. So the good news is that any other way of doing it will probably be better. As it stands, the code may fail depending on what's in the data; consider:
$v="single ' quote";
$stmt = $db->exec("Insert into test(fruit) VALUES ('$v')");
But without knowing what your criteria are for "best", it's rather hard to advise. Using a parametrised query with data binding, often described as a "prepared statement", is one solution to the problem described above. Escaping the values appropriately before interpolating them into the string (which is how most PHP implementations of data binding work behind the scenes) is another common solution.
Leaving aside the question of how you get the parameters into the SQL statement, then there is the question of performance. Each round trip to the database has a cost associated with it. And doing a single insert at a time also has a performance impact - for each query, the DBMS must parse the query, apply the appropriate concurrency controls, execute the query, apply the writes to the journal, then to the data tables and indexes then tidy up before it can return the thread of execution back to PHP to construct the next query.
Wrapping multiple queries in a transaction (you are using transactions already - but they are implicit and applied to each statement) can reduce some of the overhead I have described here, but can introduce other problems, the nature of which depends on which concurrency model your DBMS uses.
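For illustration, a minimal sketch of wrapping a loop of single-row prepared inserts in one explicit transaction, so the commit overhead is paid once; this assumes $db is a PDO connection configured to throw exceptions:
$stmt = $db->prepare("INSERT INTO test (fruit) VALUES (?)");

$db->beginTransaction();
try {
    foreach ($array as $v) {
        $stmt->execute([$v]);   // still one round trip per row, but only one commit
    }
    $db->commit();
} catch (Exception $e) {
    $db->rollBack();            // undo the partial batch on failure
    throw $e;
}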
To get the data into the database as quickly as possible, and minimising index fragmentation, the "best" solution is to batch up multiple inserts:
$q="INSERT INTO test (fruit) VALUES ";
while (count($array)) {
$s=$q;
$j='';
for ($x=0; $x<count($array) && $x<CHUNKSIZE; $x++) {
$s.=$j." ('" . mysqli_real_escape_string($db,
array_shift($array)) . "')";
$j=',';
}
mysqli_query($db,$s);
}
This is similar to Barmar's method, but I think easier to understand when you are working with more complex record structures, and it won't break with very large input sets.
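For completeness, the same chunking idea can be expressed with PDO prepared statements instead of manual escaping; a sketch, assuming $db is a PDO connection and a chunk size of 500 (an arbitrary choice):
foreach (array_chunk($array, 500) as $chunk) {
    $placeholders = implode(', ', array_fill(0, count($chunk), '(?)'));
    $stmt = $db->prepare("INSERT INTO test (fruit) VALUES $placeholders");
    $stmt->execute($chunk);
}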

Related

Mysql IN clause parameterization in PHP, Python

I'm used to doing something like the following for my queries:
$array = array(123,456,10,57,1024,768); //random data
$sql = "select * from table where field in(".implode(',',$array).")";
mysql_query($sql);
This works just fine, and is fast even when the array may have 1,000 or more items to search for (effectively a join against a nonexistent table).
For me, this is quite useful because I am joining data from multiple sources together.
However, I know it is somewhat insecure -- SQL injection is possible if I don't escape my incoming data correctly.
I'd like to try to figure out a parametrized equivalent of this; some way to do that sort of "array IN" using more "hardened" code.
It may not be possible to do this directly, it may require a helper function that does the replacements. If so, I'm looking for a good one I can drop into my code, hopefully something that will allow me to just pass an array as a parameter.
While my example is for PHP, I also need to figure this out for Python, as I'm doing the same thing there.
I've read some similar questions here, but haven't seen any (good) answers.
Using PDO prepared statements:
$placeholders = str_repeat('?, ', count($array)-1) . '?';
$stmt = $pdo->prepare("SELECT * FROM table WHERE field IN ($placeholders)");
$stmt->execute($array);
$placeholders will contain a sequence of ?, ?, ? placeholders, with the same number of ? as the size of the array. Then when you execute the statement, the array values are bound to the placeholders.
I have personally done:
$in = substr(str_repeat('?,', count($array)), 0, -1);
$sql = "SELECT * FROM table WHERE field IN ($in)";
This gives you a ?, for each array element and removes the trailing comma.
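Putting either variant together, a minimal end-to-end sketch (assuming a PDO connection in $pdo; note that an empty $array would produce an invalid IN () clause, so it is guarded here):
$array = [123, 456, 10, 57, 1024, 768];

if (count($array) > 0) {                       // an empty IN () list is invalid SQL
    $placeholders = implode(',', array_fill(0, count($array), '?'));
    // `table` is backticked only because the question's placeholder name is a reserved word
    $stmt = $pdo->prepare("SELECT * FROM `table` WHERE field IN ($placeholders)");
    $stmt->execute($array);
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
}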

Dynamically create contents of IN operator SQL php [duplicate]

This question already has answers here:
Can I bind an array to an IN() condition in a PDO query?
(23 answers)
Closed 7 years ago.
Is it possible to use prepared statements with either MySQLi or PDO and still be able to dynamically add items to the IN part of the query, for example...
$somearray = ['tagvalue1', 'tagvalue2', 'tagvalue3'];
$sql = "SELECT foo FROM bar
WHERE tag IN(?)";
I ask this because I have a situation whereby the number of elements in the IN part is not known until runtime.
You asked:
Is it possible to use prepared statements with either MySQLi or PDO
and still be able to dynamically add items to the IN part of the
query, for example...
No, unfortunately it is not. ColdFusion happens to do this, but PHP does not.
While you can't do exactly what you want with a prepared query, you can dynamically generate the $sql string for the query to accomplish the same thing.
Given some array $array = (n, n1, n2, ... nN)
$sql = "SELECT foo FROM bar WHERE tag IN (";
foreach($array as $value) {
$sql .= "'" . $value . "', ";
}
// Strip off the last comma and space from the IN clause
$sql = substr($sql, 0, strlen($sql) - 2);
$sql .= ")";
It's certainly not the most elegant solution and you'll have to do some more data validation or escaping of dangerous characters that a prepared query would handle better, but it will do the job.
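A safer variant of the same idea is to generate one placeholder per element and keep the prepared statement; a sketch with PDO, assuming the connection is in $pdo:
$somearray = ['tagvalue1', 'tagvalue2', 'tagvalue3'];

$placeholders = implode(', ', array_fill(0, count($somearray), '?'));
$stmt = $pdo->prepare("SELECT foo FROM bar WHERE tag IN ($placeholders)");
$stmt->execute($somearray);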
As a side note, there are ORM (Object Relational Mapper) libraries that support things like accepting an array of values to generate the IN clause in a database statement. Propel is the one I have most experience with, but I'm sure others like Doctrine would have a similar method.
A Propel-ish example would look like:
$results = BarQuery::create()
    ->select('foo')
    ->filterByTag(array($value1, $value2, ..., $valueN))
    ->find();
Lots of added functionality and support, but does increase your initial set up time for a project.

Running queries in PDO without binding

Can you run queries in PDO without preparing them? I am aware of the SQL-Injection issues that can arise with this but I am in a test environment.
I want to be able to write pure MySQL queries and just execute them, not have to prepare the query, bind the placeholders etc...
I would like to be able to execute a query like the following instantly.
INSERT INTO table (table_id, car, bike, date) VALUES (1, 'bmw', 'suzuki', 2004)
I seem to be getting errors running execute() directly on this query.
Thanks in advance.
The idea of prepared statements is not primarily that you can bind parameters, but that you can reuse the compiled statement multiple times, which should increase efficiency.
The prepare-execute workflow isn't too inconvenient for one-off use cases, but PDO offers other methods as well:
exec executes a statement and returns the number of affected rows. It is useful for initialization stuff, but not for SELECTs.
query is useful for static queries that don't involve untrusted input. It is similar to prepare-execute, but does not allow parameters, and does not allow the reuse of the compiled query.
Due to these limitations, they should generally only be used on static queries (i.e. the query is a plain string and not constructed from concatenations with variables).
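As a small illustration of the difference (the table and columns here are invented for the example):
// exec(): no result set; returns the number of affected rows. Handy for DDL/initialization.
$dbh->exec("CREATE TABLE IF NOT EXISTS cars (id INT PRIMARY KEY, name VARCHAR(50))");

// query(): runs a static statement and returns a PDOStatement you can iterate over.
foreach ($dbh->query("SELECT id, name FROM cars") as $row) {
    echo $row['name'], "\n";
}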
You can safely escape user input with the quote method, so you could do something like
// untrusted data:
$car = 'bmw';
$bike = 'suzuki';
$year = 2004;
...
$dbh->exec('INSERT INTO table (table_id, car, bike, date) VALUES (1, '. $dbh->quote($car) .', '. $dbh->quote($bike) .', '. $dbh->quote($year) .')');
But this is so inconvenient that you'll end up using
$dbh->prepare('INSERT INTO table (table_id, car, bike, date) VALUES (1, :car, :bike, :year)')
->execute(array(':car' => $car, ':bike' => $bike, ':year' => $year));
instead.
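And since the main point of prepare() is reuse, the same statement handle can be executed repeatedly with different values; a sketch reusing the question's column names, with made-up values for the second row (the table name is backticked because `table` is a reserved word):
$stmt = $dbh->prepare(
    'INSERT INTO `table` (table_id, car, bike, date) VALUES (?, ?, ?, ?)'
);

// Parsed once, executed many times.
$stmt->execute([1, 'bmw',  'suzuki', 2004]);
$stmt->execute([2, 'audi', 'yamaha', 2005]);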
Don't use a PDOStatement; just use the PDO connection object directly.
$conn = new PDO("....");
$result = $conn->exec("INSERT INTO table (table_id, car, bike, date) VALUES (1, 'bmw', 'suzuki', 2004)");
or
$result = $conn->query("SELECT * FROM table");
http://www.php.net/manual/en/pdo.query.php

PHP/MySQL: Using array elements in WHERE clause using prepared statements

I want to make a "dynamic" WHERE clause in my query based on an array of strings, and I want to run the resulting query using mysqli's prepared statements.
My code so far, PHP:
$searchArray = explode(' ', $search);
$searchNumber = count($searchArray);
$searchStr = "tags.tag LIKE ? ";
for ($i = 1; $i <= $searchNumber - 1; $i++) {
    $searchStr .= "OR tags.tag LIKE ? ";
}
My query:
SELECT tag FROM tags WHERE $searchStr;
More PHP:
$stmt -> bind_param(str_repeat('s', count($searchArray)));
Now this obviously gives me an error since the bind_param part only contains half the details it needs.
How should I proceed?
Are there any other (better) way of doing this?
Is it secure?
Regarding the security part of the question, prepared statements with placeholders are as secure as the validation mechanism involved in filling these placeholders with values up. In the case of mysqli prepared statements, the documentation says:
The markers are legal only in certain places in SQL statements. For example, they are allowed in the VALUES() list of an INSERT statement (to specify column values for a row), or in a comparison with a column in a WHERE clause to specify a comparison value.
However, they are not allowed for identifiers (such as table or column names), in the select list that names the columns to be returned by a SELECT statement, or to specify both operands of a binary operator such as the = equal sign. The latter restriction is necessary because it would be impossible to determine the parameter type. It's not allowed to compare marker with NULL by ? IS NULL too. In general, parameters are legal only in Data Manipulation Language (DML) statements, and not in Data Definition Language (DDL) statements.
This clearly excludes any possibility of modifying the general semantic of the query, which makes it much harder (but not impossible) to divert it from its original intent.
Regarding the dynamic part of your query, you could use str_repeat in the query condition building part, instead of doing a loop:
$searchStr = 'tags.tag LIKE ?' .
    str_repeat(' OR tags.tag LIKE ?', $searchNumber - 1);
For the bind_param call, you should use call_user_func_array like so:
$bindArray[0] = str_repeat('s', $searchNumber);
array_walk($searchArray, function (&$v, $k) use (&$bindArray) { $bindArray[] = &$v; });
call_user_func_array(array($stmt, 'bind_param'), $bindArray);
Hopefully the above snippet should bind every value of the $bindArray with its corresponding placeholder in the query.
Addenum:
However, you should be wary of two things:
call_user_func_array expects an integer indexed array for its second parameter. I am not sure how it would behave with a dictionary.
mysqli_stmt_bind_param requires its parameters to be passed by reference.
For the first point, you only need to make sure that $bindArray uses integer indices, which is the case in the code above (or alternatively check that call_user_func_array doesn't choke on the array you're providing it).
For the second point, it will only be a problem if you intend to modify the data within $bindArray after calling bind_param (ie. through the call_user_func_array function), and before executing the query.
If you wish to do so - for instance by running the same query several times with different parameters' values in the same script, then you will have to use the same array ( $bindArray) for the following query execution, and update the array entries using the same keys. Copying another array over won't work, unless done by hand:
foreach ($bindArray as $k => $v)
    $bindArray[$k] = some_new_value();
or
foreach ($bindArray as &$v)
    $v = some_new_value();
The above would work because it would not break the references on the array entries that bind_param bound with the statement when it was called earlier. Likewise, the following should work because it does not change the references which have been set earlier up.
array_walk($bindArray, function (&$v, $k) { $v = some_new_value(); });
A prepared statement needs to have a well-defined number of arguments; it can't have any element of dynamic functionality. That means you'll have to generate the specific statement that you need and prepare just that.
What you can do – in case your code actually gets called multiple times during the existence of the database connection - is make cache of those prepared statements, and index them by the number of arguments that you're taking. This would mean that the second time you call the function with three arguments, you already have the statement done. But as prepared statements don't survive the disconnect anyway, this really only makes sense if you do multiple queries in the same script run. (I'm deliberately leaving out persistent connections, because that opens up an entirely different can of worms.)
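A rough sketch of that caching idea (the helper function and cache array are hypothetical, assuming a mysqli connection in $mysqli and the question's tags table):
$stmtCache = [];   // hypothetical cache, keyed by the number of placeholders

function getLikeStatement(mysqli $mysqli, array &$cache, int $n)
{
    if (!isset($cache[$n])) {
        $sql = 'SELECT tag FROM tags WHERE tags.tag LIKE ?'
             . str_repeat(' OR tags.tag LIKE ?', $n - 1);
        $cache[$n] = $mysqli->prepare($sql);
    }
    return $cache[$n];
}

$stmt = getLikeStatement($mysqli, $stmtCache, count($searchArray));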
By the way, I'm not a MySQL expert, but would it not make a difference to not have the WHERE conditions joined with OR, but rather to write WHERE tag IN (tag1, tag2, tag3, tag4)?
Solved it by the help of an answer found here.
$query = "SELECT * FROM tags WHERE tags.tag LIKE CONCAT('%',?,'%')" . str_repeat(" OR tags.tag LIKE CONCAT('%',?,'%')", $searchNumber - 1)
$stmt = $mysqli -> prepare($query);
$bind_names[] = str_repeat('s', $searchNumber);
for ($i = 0; $i < count($searchArray); $i++){
$bind_name = 'bind'.$i; //generate a name for variable bind1, bind2, bind3...
$$bind_name = $searchArray[$i]; //create a variable with this name and put value in it
$bind_names[] = & $$bind_name; //put a link to this variable in array
}
call_user_func_array(array($stmt, 'bind_param'), &$bind_names);
$stmt -> execute();
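On PHP 5.6 and later, the same binding can usually be written more compactly with argument unpacking, which avoids call_user_func_array and the by-reference juggling (a sketch, assuming $mysqli, $query and $searchArray as above):
$stmt = $mysqli->prepare($query);
$stmt->bind_param(str_repeat('s', count($searchArray)), ...$searchArray);
$stmt->execute();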

PHP: Multiple SQL queries vs one monster

I have some really funky code. As you can see from the code below, I have a series of filters that I add to the query. Would it be easier to just run multiple queries, each with its own set of filters, and store the results in an array, or keep this mess?
Does anyone have a better solution? I need to be able to filter by keyword and item number, and it needs to be able to filter using multiple values without knowing which is which.
//Prepare filters and values
$values = array();
$filters = array();
foreach ($item_list as $item) {
    $filters[] = "ItemNmbr = ?";
    $filters[] = "ItemDesc LIKE ?";
    $filters[] = "NoteText LIKE ?";
    $values[] = $item;
    $values[] = '%' . $item . '%';
    $values[] = '%' . $item . '%';
}

//Prepare the query
$sql = sprintf(
    "SELECT ItemNmbr, ItemDesc, NoteText, Iden, BaseUOM FROM ItemMaster WHERE %s LIMIT 21",
    implode(" OR ", $filters)
);

//Set up the types
$types = str_repeat("s", count($filters));
array_unshift($values, $types);

//Execute it
$state = $mysqli->stmt_init();
$state->prepare($sql) or die ("Could not prepare statement:" . $mysqli->error);
call_user_func_array(array($state, "bind_param"), $values);
$state->bind_result($ItemNmbr, $ItemDesc, $NoteText, $Iden, $BaseUOM);
$state->execute() or die ("Could not execute statement");
$state->store_result();
I don't see anything particularly monstrous about your query.
The only thing I would do differently is separate the search terms.
I.e., $item_list could be split into numeric items and text items.
Then you could make the search something like:
...WHERE ItemNmbr IN ( number1, number2, number3) OR LIKE .... $text_items go here....
IN is a lot more efficient and if your $item_list doesn't contain any text part... then you are just searching a bunch of numbers which is really fast.
Now the next part if you are using a lot of LIKEs in your query maybe you should consider using MySQL Full-text Searching.
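A sketch of the split this suggests (the variable names are made up; $item_list is the question's, and the types string would still be built from count($values)):
$numbers = array_filter($item_list, 'is_numeric');
$texts   = array_diff($item_list, $numbers);

$filters = [];
$values  = [];

if ($numbers) {
    // One IN (...) covers all numeric item numbers at once.
    $filters[] = 'ItemNmbr IN (' . implode(',', array_fill(0, count($numbers), '?')) . ')';
    foreach ($numbers as $n) {
        $values[] = $n;
    }
}
foreach ($texts as $t) {
    $filters[] = '(ItemDesc LIKE ? OR NoteText LIKE ?)';
    $values[]  = '%' . $t . '%';
    $values[]  = '%' . $t . '%';
}

$sql = 'SELECT ItemNmbr, ItemDesc, NoteText, Iden, BaseUOM FROM ItemMaster WHERE '
     . implode(' OR ', $filters) . ' LIMIT 21';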
Your answer depends on what exactly you need. The advantage of using only one query is resource use: one query takes only one connection and one round trip to the SQL server, and depending on what exactly you are attempting to do, it might take less SQL work to do it in one statement than in several.
However, it might be more practical from a programmer's point of view to use a few less complex SQL statements that take less effort to create than one large one. Remember, ultimately you are programming this and you need to make it work. It might not really make a difference: script processing vs. SQL processing. Only you can make the ultimate call about which is more important. I would generally recommend SQL processing over script processing when dealing with large databases.
A single-table query can be cached by the SQL engine; MySQL and its ilk do not cache joined tables. The general rule for performance is to use joins only when necessary. This encourages the DB engine to cache table indexes aggressively and also makes your code easier to adapt to (faster) object databases, like the ones Amazon/Google cloud services force you to use.
