Do I need to sanitize results from database query? - php

Our application (PHP / MySQL) uses PDO prepared statements to insert user data without any sanitizing. At a later stage the user can copy the created entries, thousands at a time.
For the copying part we have tested:
$stmt = $pdo->prepare("INSERT INTO files (`a`,`b`,`c`) VALUES (?,?,?)");
$pdo->beginTransaction();
foreach ($data as $row) {
    $stmt->execute($row);
}
$pdo->commit();
vs
$pdo->exec("INSERT INTO files (`a`,`b`,`c`) VALUES " . implode(', ', $loop_query));
$row represents a row from the database.
The first approach is about 3 times slower than the second, so we would like to implement the second one.
How safe is it to reuse data from the database in a new query without prepared statements?

It is not safe. As you mention, the data in the DB is raw.
If you retrieve it into the programming language (PHP in this case) and use it again in an SQL statement, it must be protected against SQL injection again.
Can't you do an INSERT ... SELECT instead?
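For the copying use case above, something along these lines keeps the data inside MySQL entirely (a minimal sketch: the WHERE condition and $userId are assumptions, while the files table and $pdo come from the question):
$stmt = $pdo->prepare(
    "INSERT INTO files (`a`, `b`, `c`)
     SELECT `a`, `b`, `c` FROM files WHERE user_id = ?"
);
$stmt->execute([$userId]);
Because the copied values never pass through PHP, there is nothing to re-escape, and the server performs the bulk copy in a single statement.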

It is not safe at all. Entirely.

Related

Speeding up the process of MySQL insert

I have a DB table which has approximately 40 columns, and the main motive is to insert the records into the database as quickly as possible. I am using PHP for this.
The problem is that, to create the insert statement, I have to loop through a foreach. I am not sure I am doing this correctly. Please suggest the best alternative. Here is the example:
/// to loop through the available data ///
$sqc = "";
for ($i = 1; $i <= 100; $i++) {
    if ($sqc == "") {
        $sqc = "('".$array_value["col1"]."'.. till .. '".$array_value["col40"]."')";
    } else {
        $sqc .= ",('".$array_value["col1"]."'.. till .. '".$array_value["col40"]."')";
    }
}
/// finally the sql query ///
$sql_query = "INSERT INTO table_name (`col1`,.. till.. ,`col40`) VALUES ".$sqc;
This concatenation of $sqc is taking a lot of time, and so is the insertion into the DB. Is there an alternative way of doing this? I need to find a way to speed this up by something like 100x.
Thank you
As suggested on the MySQL Optimizing INSERT Statements page, there are a few ways to do this:
If you are inserting many rows from the same client at the same time, use INSERT statements with multiple VALUES lists to insert several rows at a time. This is considerably faster (many times faster in some cases) than using separate single-row INSERT statements. If you are adding data to a nonempty table, you can tune the bulk_insert_buffer_size variable to make data insertion even faster.
When loading a table from a text file, use LOAD DATA INFILE. This is usually 20 times faster than using INSERT statements.
See the MySQL guide: https://dev.mysql.com/doc/refman/5.7/en/insert-optimization.html
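For the LOAD DATA route, a minimal sketch via PDO is shown below. The connection credentials, the /tmp/rows.csv path and the three column names are assumptions, and the MySQL server must allow local_infile:
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass', [
    PDO::MYSQL_ATTR_LOCAL_INFILE => true, // required for LOCAL INFILE
]);
$pdo->exec("
    LOAD DATA LOCAL INFILE '/tmp/rows.csv'
    INTO TABLE table_name
    FIELDS TERMINATED BY ','
    LINES TERMINATED BY '\\n'
    (`col1`, `col2`, `col3`)
");
Writing the rows to a CSV file first (for example with fputcsv()) and letting the server bulk-load them is usually much faster than issuing INSERT statements one by one.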
Just a small contribution, but you can avoid the concatenation by using binding:
$stmt = mysqli_prepare($your_conn,
    "INSERT INTO table_name (`col1`,.. till.. ,`col40`) VALUES (?, ... till.. ?)");
mysqli_stmt_bind_param($stmt, 'ss.......s',
    $array_value["col1"], $array_value["col2"], .. till..,
    $array_value["col40"]);

Using PDO execute and commit for multiple HTTP requests

This is a sample PDO execute-and-commit example from php.net:
<?php
/* Begin a transaction, turning off autocommit */
$dbh->beginTransaction();
/* Insert multiple records on an all-or-nothing basis */
$sql = 'INSERT INTO fruit
(name, colour, calories)
VALUES (?, ?, ?)';
$sth = $dbh->prepare($sql);
foreach ($fruits as $fruit) {
    $sth->execute(array(
        $fruit->name,
        $fruit->colour,
        $fruit->calories,
    ));
}
/* Commit the changes */
$dbh->commit();
/* Database connection is now back in autocommit mode */
?>
From what I understand, this lets me insert thousands of rows into a table faster. Now, suppose I have a PHP file that is called like this:
http://localhost/insertFruit.php?name=x&color=y&calories=z
which basically collects those values and inserts them into the database.
Now, if I want to take advantage of that PDO loop and execute, I can't. If there are 100 submissions to insertFruit.php, how can I collect all those values, combine them into an array and then commit? Using a separate class, calling an object, using global values, etc.?
I am sure there is a better way to do this.
How can I collect all those values, combine them into an array and then commit?
Although you can, you shouldn't.
Neither transactions nor prepared statements are suitable across separate requests.
Not to mention that your initial question is groundless. You don't need a faster insert. Your inserts are already fast.
There is no "better" way. Premature optimization is the root of all evil. By trying to devise a "better" way you'll ruin whole system.
All you can do is cluster some data into single request. If you have several fruits in one request, you can use prepared statements to process them in a loop.
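For example, here is a minimal sketch of insertFruit.php accepting several fruits in one request. The JSON request body is an assumption; $dbh is the PDO connection from the example above:
$fruits = json_decode(file_get_contents('php://input'), true);
$dbh->beginTransaction();
$sth = $dbh->prepare('INSERT INTO fruit (name, colour, calories) VALUES (?, ?, ?)');
foreach ($fruits as $fruit) {
    // each element is expected to look like {"name": "apple", "colour": "red", "calories": 52}
    $sth->execute([$fruit['name'], $fruit['colour'], $fruit['calories']]);
}
$dbh->commit();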

insert using implode or prepared statements?

I have an array $authors with some numbers I want to insert into a table.
I can use a prepared statement and execute it multiple times, once for each element of the array:
$stmt = $pdo->prepare('INSERT INTO authors (article_id, user_id) VALUES (?, ?)');
$stmt->bindParam(1, $article_id);
$stmt->bindParam(2, $author);
foreach ($authors as $author) {
    $stmt->execute();
}
However, I can do a trick using implode() and execute the statement only once:
// here probably $authors = array_map('intval', $authors);
$stmt = $pdo->prepare(
'INSERT INTO authors (article_id, user_id)
VALUES ('.implode(', :article_id), (', $authors).', :article_id)');
$stmt->execute([':article_id' => $article_id]);
The first solution is more conventional and looks more secure.
The second is (I think) faster because there is only one query to the database (and it is shorter: there are no loops except inside implode()).
I don't see any security issues here, but it feels safer (to me) when there is no string concatenation in queries.
Which is the proper way in this situation?
Edit: echo of the second query gives this:
INSERT INTO authors (article_id, student_id)
VALUES (121, :article_id), (50, :article_id)
And it executes with no errors.
According to the PDO docs, "You cannot use a named parameter marker of the same name more than once in a prepared statement, unless emulation mode is on." That alone makes your "implode" solution a bad one.
That being said, let me answer on the theory. The point of prepared statements is to compile the query only once, so that repeated executions are faster. So prepared statements are meant to be used as in your first example: one simple "template" query, repeated many times.
In your second example, you build a custom query that will hardly ever be repeated (since it depends on the content of your $authors array). Therefore, a prepared statement is completely useless in this case: you pay the overhead of the PREPARE without the benefit of repeated executions. It's not the way prepared statements are supposed to be used.
An extended insert is a perfectly valid solution, and a good one at that, but use it with a normal query (i.e. exec()), and be sure to use quote() to protect against SQL injection!
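For illustration, a minimal sketch of such an extended insert with exec() and quote(), using $pdo, $article_id and $authors from the question:
$values = [];
foreach ($authors as $author) {
    // quote() escapes each value so it can be embedded safely in the SQL string
    $values[] = '(' . $pdo->quote($article_id) . ', ' . $pdo->quote($author) . ')';
}
$pdo->exec('INSERT INTO authors (article_id, user_id) VALUES ' . implode(', ', $values));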

inserting multiple records using a single mysql_query, possible or not?

I'm working on a site in which I have to insert values into different tables. So, keeping this need in view, is it possible to run multiple queries in a single mysql_query() call in PHP?
for example:
mysql_query("insert into tableA (e-mail, name) values ('xxx', 'xxx'); insert into tableB (xxx, xxx, xxx) values ('value1','value2','value3')")
I want to run multiple queries in a single statement. Please suggest a solution.
No, it is not possible. The obsolete mysql_* API only allows for one query to be executed at a time. To do this you need to use the mysqli API and mysqli_multi_query().
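For illustration, a minimal sketch of mysqli_multi_query(); $conn is assumed to be an open mysqli connection, and the column names are adjusted to valid identifiers:
$sql = "INSERT INTO tableA (e_mail, name) VALUES ('xxx', 'xxx');"
     . "INSERT INTO tableB (col1, col2, col3) VALUES ('value1', 'value2', 'value3')";
if (mysqli_multi_query($conn, $sql)) {
    // drain the result queue: INSERT returns no result set, but the loop must
    // still run so that later queries on this connection do not fail
    do {
        if ($result = mysqli_store_result($conn)) {
            mysqli_free_result($result);
        }
    } while (mysqli_next_result($conn));
}
if (mysqli_errno($conn)) {
    echo 'Error: ' . mysqli_error($conn);
}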
A single MySQL "INSERT" statement can support multiple VALUE tuples if they're for the same table.
mysql_query("insert into tableA (e-mail, name) values ('xxx', 'xxx'), ('yyy','yyy')")
However, what you're trying to do is not possible with the mysql_* functions.
Although the mysqli_* API allows you to run multiple queries at once, I recommend AGAINST doing that, for at least two reasons:
It's always a good (actually, great) idea to use prepared statements, for security reasons. Prepared statements can be used with the MySQLi API as well as with PDO.
As you can see from the docs for mysqli_multi_query(), getting errors out of that function can be cumbersome. Indeed, the function returns false only if the first query fails; to get the results of the other queries you need to call another function.
In general, why would you need to combine multiple queries? In the end, the time you'd save would be minimal.
Instead, if your goal is to have more than one query executed together, and to have the whole set fail if any one of them fails, you can use transactions (which can also speed up inserts in some cases). Both MySQLi and PDO support transactions: see the PDO examples here: http://php.net/manual/en/pdo.transactions.php
PS: in general, it's a good idea to avoid the mysql_* functions entirely, as that API is deprecated.
Welcome to PDO:
With PDO I am able to do something like:
$sql = "
insert into tableA (e_mail, name) values (:e_mail, :name);
insert into tableB (xxx1, xxx2, xxx3) values (:xxx1, :xxx2, :xxx3)
";
Just have the query prepared first, then execute it. Voilà!
OR, using the transaction method:
$con->beginTransaction();
$sql1 = "insert into tableA (e_mail, name) values (:e_mail, :name)";
$sql2 = "insert into tableB (xxx, fff) values (:xxx, :fff)";
$sql3 = "insert into tableC (qqq, bbb) values (:qqq, :bbb)";
// prepare and execute each statement before committing
$con->prepare($sql1)->execute([':e_mail' => $e_mail, ':name' => $name]);
$con->prepare($sql2)->execute([':xxx' => $xxx, ':fff' => $fff]);
$con->prepare($sql3)->execute([':qqq' => $qqq, ':bbb' => $bbb]);
$con->commit();

Use prepared statements everywhere in PHP? (PDO)

I'm going to be switching the database class that I use in several sites/projects from a custom mysql_query method* to PDO and prepared statements. However, I have a question first: do I want to use prepared statements everywhere, even in places where the query will only be run once? What about situations where I need to do something like:
INSERT INTO `table` (`column`, `column`) VALUES ('value','value'), ('value','value'),('value','value'), etc.
Should I use a single prepared statement (and a single VALUES tuple) and execute it with different variables each time, or should I use the style above? If I do use a prepared statement here, how bad a performance hit are we talking about? Do I need to use transactions in this situation?
*My mysql_query method is similar to a prepared statement in that the user can call $mysql->Query("SELECT * FROM %s WHERE '%s'='%s'", $var, $var, $var), and the method auto-escapes everything with mysql_real_escape_string.
Prepared statements provide a good degree of protection against SQL injection, and they also provide a performance benefit for some types of query. Personally, I would use them everywhere.
If you discover that a particular query is causing performance problems, you can do some profiling to track down the cause of the problem, and then optimise the code or query as required. But don't try to micro-optimise before you have a problem.
As for transactions, just use them when you need them. For example, when you need to perform a sequence of all-or-nothing updates, where if one fails, the whole lot must fail. These can be useful for things like many-to-many relationships, where three tables must be updated, and you don't want partial relationships remaining if a failure occurs.
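As an illustration, here is a minimal sketch of such an all-or-nothing update. The articles, tags and article_tag tables are hypothetical, and PDO is assumed to be configured to throw exceptions on errors:
try {
    $pdo->beginTransaction();
    $pdo->prepare('INSERT INTO articles (title) VALUES (?)')->execute([$title]);
    $articleId = $pdo->lastInsertId();
    $pdo->prepare('INSERT INTO tags (name) VALUES (?)')->execute([$tagName]);
    $tagId = $pdo->lastInsertId();
    $pdo->prepare('INSERT INTO article_tag (article_id, tag_id) VALUES (?, ?)')
        ->execute([$articleId, $tagId]);
    $pdo->commit();
} catch (Exception $e) {
    // roll back so that no partial relationship is left behind
    $pdo->rollBack();
    throw $e;
}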
Use only PDO parameters to pass variables into the query.
You can use a prepared statement for a multi-row insert as well:
$insertQuery = 'INSERT INTO table (col1, col2) VALUES ';
$insertQueryData = array();
$insertData = array();
foreach ($data as $record) {
    $insertQueryData[] = '(?, ?)';
    $insertData[] = $record['col1'];
    $insertData[] = $record['col2'];
}
$insertQuery .= implode(', ', $insertQueryData);
$statement = $db->prepare($insertQuery);
$statement->execute($insertData);
You should use a prepared statement every time. But you may want to write a small helper that prepares, binds, and runs the query in one shot, so you don't need multiple lines of code each time.
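For example, a minimal sketch of such a helper (the function name run() and the table/column names are just illustrative):
function run(PDO $pdo, string $sql, array $params = []): PDOStatement
{
    // prepare and execute in one call; the caller can fetch from the returned statement
    $stmt = $pdo->prepare($sql);
    $stmt->execute($params);
    return $stmt;
}

// usage
run($pdo, 'INSERT INTO `table` (`column1`, `column2`) VALUES (?, ?)', ['value1', 'value2']);
$rows = run($pdo, 'SELECT * FROM `table` WHERE `column1` = ?', ['value1'])->fetchAll();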
