I have an array with a variable number of values.
Is there a more efficient or better way to INSERT them into my DB than a loop with a query inside it?
At this site there is a nice example of a MySQL multi-insert query. It is valid SQL to write:
INSERT INTO [table]
VALUES
(row1),
(row2),
...
On request, a PHP snippet:
// start the query with the first value
$query = "INSERT INTO mytable\nVALUES\n(" . $values[0] . ")";
array_shift($values);
// append each remaining value as an extra row
foreach ($values as $value) {
    $query .= ",(" . $value . ")";
}
In my experience, multi-row inserts are processed MUCH faster than an equivalent number of single-row inserts. If you're inserting a large amount of data at a time, that's a good way to go. I've watched a process that entered thousands of rows of data be condensed from 5-10 minutes down to literally seconds using this method.
As far as the code goes, I've been a fan of using implode() to join arrays of fields and values together. There's no reason you can't do the same for rows of data; you just need to be able to identify which fields need to be quoted, escaped, etc.
For the sake of argument assume $rows is an array of properly formatted SQL values...
$sql = "INSERT INTO `table` VALUES (" . implode("), (", $rows) . ")";
You could apply something similar to assemble the individual fields, should you desire; a rough sketch follows.
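For illustration, a minimal sketch of assembling each entry of $rows from an associative row array, with escaping applied per field (this assumes a mysqli connection in $db and an array of row arrays in $data, both hypothetical names):
$rows = array();
foreach ($data as $record) {
    // escape each value and wrap it in quotes before joining the fields
    $fields = array_map(function ($v) use ($db) {
        return "'" . mysqli_real_escape_string($db, $v) . "'";
    }, $record);
    $rows[] = implode(", ", $fields);
}
$sql = "INSERT INTO `table` VALUES (" . implode("), (", $rows) . ")";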
If the DB you are using allows multi-value inserts, then you could create a multi-insert statement and send that to the DB - one connection with one command to do multiple inserts.
If you cannot do multiple inserts - (as older versions of MSSQL did not allow) - then I think you are stuck.
Related
I have a DB table which has approximately 40 columns, and the main motive is to insert the records into the database as quickly as possible. I am using PHP for this.
The problem is that to create the insert statement I have to loop through a foreach. I am not sure if I am doing this correctly. Please suggest the best alternative. Here is the example:
/// to loop through the available data ///
$sqc = "";
for ($i = 1; $i <= 100; $i++) {
    if ($sqc == "") {
        $sqc = "('" . $array_value["col1"] . "'.. till .. '" . $array_value["col40"] . "')";
    } else {
        $sqc .= ",('" . $array_value["col1"] . "'.. till .. '" . $array_value["col40"] . "')";
    }
}
/// finally the sql query ///
$sql_query = "INSERT INTO table_name (`col1`,.. till.. ,`col40`) VALUES " . $sqc;
This concatenation of $sqc is taking a lot of time, and so is the insertion into the DB. Is there an alternate way of doing this? I need to find a way to speed this up, like 100X.. :(
Thank you
As suggested on the MySQL Optimizing INSERT Statements page, there are a couple of approaches:
If you are inserting many rows from the same client at the same time, use INSERT statements with multiple VALUES lists to insert several rows at a time. This is considerably faster (many times faster in some cases) than using separate single-row INSERT statements. If you are adding data to a nonempty table, you can tune the bulk_insert_buffer_size variable to make data insertion even faster.
When loading a table from a text file, use LOAD DATA INFILE. This is usually 20 times faster than using INSERT statements.
See the MySQL guide: https://dev.mysql.com/doc/refman/5.7/en/insert-optimization.html
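To illustrate the LOAD DATA INFILE suggestion, here is a rough sketch of how it might be driven from PHP (this assumes a mysqli connection in $dbcon, that $all_rows is a hypothetical array of row arrays, and that LOCAL infile loading is permitted by both server and client configuration):
// write the rows to a temporary CSV file
$file = tempnam(sys_get_temp_dir(), 'bulk');
$fp = fopen($file, 'w');
foreach ($all_rows as $row) {
    fputcsv($fp, $row);
}
fclose($fp);

// bulk-load the file; LOCAL sends the file from the client to the server
$sql = "LOAD DATA LOCAL INFILE '" . addslashes($file) . "'
        INTO TABLE table_name
        FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
        LINES TERMINATED BY '\\n'";
mysqli_query($dbcon, $sql) or die(mysqli_error($dbcon));
unlink($file);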
Just a small contribution, but you can avoid the concatenation by using parameter binding:
$stmt = mysqli_prepare($your_conn,
    "INSERT INTO table_name (`col1`,.. till.. ,`col40`) VALUES (?, ... till.. ?)");
mysqli_stmt_bind_param($stmt, 'ss.......s',
    $array_value["col1"], $array_value["col2"], .. till..,
    $array_value["col40"]);
I am trying to make an inventory / invoice web application. The user enters information such as order ID, date, order total, and then each of the products bought along with their respective quantity. I'm using PDO for the sql queries.
I do not know in advance how many unique products are going to be in an invoice so I have an associative array that stores the products and their quantities (product name is used as the key) when the form is submitted.
On submit a prepared statement is built/executed.
Right now I have the order_id, date, and order_total query done.
$stmt = $connection->prepare("INSERT INTO table_1 (order_id, order_date, order_total) VALUES ('$orderid', '$date', '$total_cost')");
$stmt->execute();
That part is simple enough. The aim of the other query is the following.
$testStmt = $connection->prepare("INSERT INTO table_2 (keys from the assoc array are listed here) VALUES (values from the assoc arrays are listed here)");
$testStmt->execute();
My array would end up looking like this once the user inputs some products:
$array
(
    "product1" => quantity1,
    "product2" => quantity2
)
The idea I have had so far is to make a string for the columns that need to be included in the SQL query and a string for the values, then iterate through the array and append the keys and values to the respective strings in such a way that I could use them in the SQL query. I haven't gotten it to work and am worried that it could open me up to SQL injection (I am still quite unfamiliar with SQL injection, so I have been trying to read up on it).
$columns = "";
$values_input = "";
foreach ($assoc_array as $product => $quant)
{
    $columns .= "'" . $product . "', ";
    $values_input .= "'" . $quant . "', ";
}
The idea being that the $columns and $values_input strings would end up containing all the appropriate column names and the quantities to be entered into those columns. Then I figured I could use those strings as part of the SQL query. Something like this:
INSERT INTO table_2 ($columns) VALUES ($values_input)
I'd appreciate any help or insight. If I'm way off here or doing something in a silly way, feel free to shout about it; I'd rather fix a screw-up than continue on with it if that's the case.
You are already using PDO, which is a good thing if you want to protect yourself from SQL injection. You are even trying to prepare your statement, but since you are not binding any parameters, one could argue if that is really what you are doing. Example 5 on the PHP docs page is in fact pretty close to what you want to do. Allow me to adapt it to your use case:
// create a placeholder string looking like "?, ?, ..., ?"
$placeholders = implode(', ', array_fill(0, count($params), '?'));
// column names cannot be bound as parameters, so build the column list from the array keys
$columns = '`' . implode('`, `', array_keys($params)) . '`';
// prepare the statement
$qry = $connection->prepare("INSERT INTO table_2 ($columns) VALUES ($placeholders)");
// bind the values to the placeholders on execution
$qry->execute(array_values($params));
This should result in a query that looks exactly like your first example, but with a dynamic number of columns. Since you are preparing your statement and binding the values on execution, PDO handles the quoting and escaping of the values. Note that column names cannot be sent as bound parameters, so if the keys of $params come from user input they should be checked against a whitelist of known column names.
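For example, with an associative array like the one in the question (hypothetical quantities), the sketch above behaves roughly like this:
$params = array("product1" => 3, "product2" => 5);
// prepared SQL:  INSERT INTO table_2 (`product1`, `product2`) VALUES (?, ?)
// executed with: array(3, 5)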
As a side note, your table structure seems a bit off to me. I don't think you normalized your data correctly, though it is a bit hard to tell with the table names you are using. I believe your structure should look something like this, and I fear it doesn't:
TABLE orders (id, date, total, client_id)
TABLE products (id, name, price, ...)
TABLE order_lines (id, order_id, product_id, quantity)
TABLE clients (...)
The exact structure obviously depends on your use case, but I believe this is about the simplest structure you can get away with if you want to build an order system that you can easily query and that can serve as a base for possible expansion in the future.
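To make that concrete, here is a rough, hypothetical sketch of saving one invoice into such a structure with PDO prepared statements (it assumes the products array is keyed by product ID rather than by name):
$connection->beginTransaction();

// one row in orders per invoice
$orderStmt = $connection->prepare(
    "INSERT INTO orders (id, date, total) VALUES (?, ?, ?)");
$orderStmt->execute(array($orderid, $date, $total_cost));

// one row in order_lines per product on the invoice
$lineStmt = $connection->prepare(
    "INSERT INTO order_lines (order_id, product_id, quantity) VALUES (?, ?, ?)");
foreach ($assoc_array as $product_id => $quantity) {
    $lineStmt->execute(array($orderid, $product_id, $quantity));
}

$connection->commit();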
Since you are trying to make an inventory/invoice application, do you happen to have a product database? If you do, you may want to use the product ID instead of the product name as the key, since product names could be duplicated or change. If a product name changes, you will have problems querying.
Do you accept products that are not in the DB to be entered into the invoice? If so, that adds some complications.
On SQL injection, you should sanitize input before using it in queries. Read: What's the best method for sanitizing user input with PHP?
Most modern frameworks have built-in protections against SQL injection if you do not query manually, so consider using them.
Many of them use the active record pattern; see: http://www.phpactiverecord.org/projects/main/wiki/Basic_CRUD (so you don't have to deal with writing queries manually like you do).
An example of active record in a framework: https://www.codeigniter.com/user_guide/database/query_builder.html
I am pulling data from an IDX/RETS server and trying to insert the data into my own MySQL database using PHP.
The data comes in as multiple associative arrays, and I loop through them, entering each array into its own row.
The table itself has 215 columns. In my first test I only pull 10 records/arrays, and after running my script I only get 1 row entered.
I have been able to get all 10 rows to insert, but only by reducing the number of columns to about 6. For whatever reason, when trying to enter all 215 columns of the 10 records it keeps timing out.
I have tried:
ini_set('max_execution_time', 500);
&
set_time_limit(0);
I have also tried entering these values into my php.ini file, but whatever I do the script only seems to run for about 15-30 seconds.
Is there something else I am missing or should be doing when entering so many columns?
My code is just a simple while loop that is looping through the arrays. And my insert is like this:
$sql = "INSERT INTO rets_property_residentialproperty";
$sql .= " (`".implode("`, `", array_keys($my_array))."`)";
$sql .= " VALUES ('".implode("', '", $my_array)."') ";
$result = mysqli_query($dbcon, $sql) or die(mysql_error());
For one thing, don't mix mysql and mysqli interface functions. The call to mysql_error should be replaced with a call to mysqli_error.
Not at all clear what's "timing out".
Most likely, the INSERT statement is throwing an error, the code is going into the die, and the problem is with the mysql_error function. Get that function replaced with mysqli_error, and I venture that you'll get a MySQL error.
... or die(mysqli_error($dbcon));
                ^       ^^^^^^
Likely, the underlying issue is a SQL syntax error. For example, one of the values in the array may contain a character that needs to be escaped, such as a single quote.
(But, I'm just guessing here.)
For debugging, you might consider echoing out the SQL text
echo $sql;
before the call to mysqli_query. (Then you can compare the error to the actual SQL text that you intended to send to the database.)
To handle single quotes and other "dangerous" characters in the values (and to close a gaping SQL injection vulnerability), you'd either need to call mysqli_real_escape_string on each of the array values before the value is included in the SQL text, or use a prepared statement with bind placeholders and then supply the array values as the values for those placeholders.
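A rough sketch of the escaping approach, keeping the implode style of the original code and assuming every value can be treated as a string:
// escape each value before it is placed into the SQL text
$escaped = array_map(function ($v) use ($dbcon) {
    return mysqli_real_escape_string($dbcon, $v);
}, $my_array);

$sql  = "INSERT INTO rets_property_residentialproperty";
$sql .= " (`" . implode("`, `", array_keys($escaped)) . "`)";
$sql .= " VALUES ('" . implode("', '", $escaped) . "') ";
$result = mysqli_query($dbcon, $sql) or die(mysqli_error($dbcon));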
I need some suggestions and ideas.
Here's the scenario. The server receives a bunch of IDs from the client via Ajax. Some of these IDs may already exist in the database, some may not. I need to save those that do not.
One way would be to run a SELECT query for each ID I have, but that means one SELECT statement per ID. Each time I receive about 300 IDs, which means 300 SQL queries; I think this would slow the server. So what do you think is a better way to do this? Is there a way to extract the non-existing IDs with one SQL query?
P.S. The server is running on CakePHP.
I think what you need is SQL's IN keyword:
SELECT id FROM table WHERE id IN (?)
Where you would insert your IDs separated by commas, e.g.
$id_str = implode(',', $ids);
Make sure that $ids is an array of integers to prevent SQL injection.
The outcome is a MySQL result containing all ids that exist. Build them into an array and use PHP's array_diff to get all IDs that do not exist. Full code:
$existent = array();
$result = $connection->query('SELECT id FROM table WHERE id IN (' .
    implode(',', $ids) . ')');
while ($row = $result->fetch_row()) {
    $existent[] = $row[0];
}
$not_existent = array_diff($ids, $existent);
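If you would rather not rely on the IDs being clean integers, one placeholder per ID can be bound instead. A rough sketch with mysqli prepared statements (assuming $connection is a mysqli connection, as the fetch_row() call above suggests, and PHP 5.6+ for argument unpacking):
// one "?" per ID, e.g. "?,?,?"
$placeholders = implode(',', array_fill(0, count($ids), '?'));

$stmt = $connection->prepare("SELECT id FROM table WHERE id IN ($placeholders)");
$stmt->bind_param(str_repeat('i', count($ids)), ...$ids);
$stmt->execute();

$existent = array();
$result = $stmt->get_result();   // get_result() requires the mysqlnd driver
while ($row = $result->fetch_row()) {
    $existent[] = $row[0];
}
$not_existent = array_diff($ids, $existent);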
If I understand you correctly, an INSERT IGNORE could do the trick:
INSERT IGNORE INTO `table` (`id`,`col`,`col2`) VALUES ('id','val1','val2');
Then any duplicate IDs will be silently dropped, as long as id is a unique or primary key.
Also, the keyword IN can be useful for finding rows with a value in a set, e.g.
SELECT * FROM `table` WHERE `id` IN (2,4,6,7)
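Putting the two together for the roughly 300 IDs from the question, a minimal sketch, assuming the IDs are integers, only the id column needs to be stored, and $db is an existing mysqli connection:
// force the IDs to integers, then issue one multi-row INSERT IGNORE
$ids = array_map('intval', $ids);
$sql = "INSERT IGNORE INTO `table` (`id`) VALUES (" . implode("),(", $ids) . ")";
mysqli_query($db, $sql);
// IDs that already exist are silently skipped; the rest are inserted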
I have an HTML page where I collect an array of values from checkboxes to insert into a database. The HTML page posts to a PHP page that collects the data and then stores it in the database.
For each value, there are a few other fields I would like to include that are the same for all the values, such as the time entered. I can easily convert the captured array into a comma-delimited list using implode, and I use such a comma-delimited list of IDs to update and delete records. However, when I want to insert them, MySQL does not seem to allow a comma-delimited list. My question is: what is the easiest way to insert records, one for each value in the comma-delimited list, without using a loop?
html page
<input type="checkbox" name="var[]" value=1>
<input type="checkbox" name="var{}" value=2>
PHP page
$vars = $_POST['var'];
This gives me an array that I can convert to a comma delimited list using implode.
To delete, I can go:
$sql = "DELETE FROM table WHERE id IN ($vars)";
To update I can go:
$sql = "UPDATE table SET ... WHERE id IN ($vars)";
But there does not seem to be an equivalent for INSERT.
The following would work:
$sql = "INSERT INTO table (var, timeentered) VALUES (1, now()), (2, now())";
However, that's not how I have my data. What I would like to do is something like:
$sql = "INSERT INTO table (var, timeentered) VALUES ($vars), now()", but of course that doesn't work.
Do I have to convert my nice comma-delimited list that works so well for UPDATE and DELETE into something that looks like (1, now()), (2, now()) for inserting, or is there an alternative?
Thanks for any suggestions.
Unfortunately you have to build the whole query yourself:
$sql  = "INSERT INTO table (var, timeentered) VALUES ";
$sql .= "(" . implode(", now()), (", $vars) . ", now())";
You need to loop through your data set and create the multi-row insert query manually. There is no other alternative if you want to insert multiple rows with a single query, outside of using certain DB frameworks which might present a better interface for doing this. Of course, at the end of the day, such a framework would in essence be building the multi-item insert query manually at some point.
The question might come down to how many items you are going to insert. If you are only going to be inserting a few records at a time, then you might want to consider just using prepared statements with individual inserts. However, if you are going to be inserting hundreds of records at a time, that would probably not be a good idea.
In your MySQL database you can set the default for the "time_created" column to TIMESTAMP DEFAULT CURRENT_TIMESTAMP. This way you don't have to worry about it; just use a regular insert and it will automatically set the "time_created" column.
For your other issue of multi-line inserts, you can create an $update array and use a foreach loop to issue an SQL insert command for every row of data. A possible one-time setup for that default is sketched below.
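A possible one-time setup for that column default (table and column names here are assumptions, and $dbcon is a hypothetical mysqli connection):
// run once so the column fills itself in on insert
$sql = "ALTER TABLE `table`
        MODIFY `time_created` TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP";
mysqli_query($dbcon, $sql);

// afterwards the column can simply be omitted from the INSERT
mysqli_query($dbcon, "INSERT INTO `table` (`var`) VALUES (1), (2)");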
Two options I can think of.
Build a dynamic insert query like you suggest; however, do not call now() each time but insert a single date, e.g.
$time = gmdate('Y-m-d H:i:s');
$sql = "INSERT INTO table (var, timeentered) VALUES (1, '$time'), (2, '$time')";
Or use a prepared statement of the single insert below, turn off autocommit, start a transaction, execute the prepared statement in a loop for the number of inserts needed, then commit the transaction.
$sql = "INSERT INTO table (var, timeentered) VALUES (?, ?)";
Mostly you will have to build your query using some type of looping structure. Convention and best practice aside, if you just want to know how to make your array acceptable for a multiple-insert statement, then why not just do this:
$values = '('.implode('),(', $array).')';
or, if it is already CSV, then:
$values = '(' . implode('),(', explode(',', $csv)) . ')';
Then you can just use $values in your query inside double quotes.
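For example, with two checkbox values (the timeentered column would still need to be handled separately, e.g. via a column default or the now() approaches above):
$vars = array(1, 2);
$values = '(' . implode('),(', $vars) . ')';   // gives "(1),(2)"
$sql = "INSERT INTO table (var) VALUES $values";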