I have the following code, which is working fine, but I need a way to batch process it and insert everything in one go.
for ($i = 0; $i < sizeof($my_array); $i++)
{
    $sql = "INSERT INTO exclude_resource (id, resource_id, config_id)
            VALUES (DEFAULT, '$my_array[$i]', '$insert_id')";
    $command = $connection->createCommand($sql);
    $result = $command->execute();
}
Your query should resemble something like this (see the MySQL docs):
INSERT INTO table (field1, field2, field3)
VALUES
(value1a, value2a, value3a),
(value1b, value2b, value3b),
(value1c, value2c, value3c),
...
So build up the value lists in an array, join them with commas, and execute the resulting query. In PHP:
$values = array();
for ($i = 0; $i < sizeof($my_array); $i++) {
    $values[] = "(DEFAULT, '$my_array[$i]', '$insert_id')";
}
$sql = "INSERT INTO exclude_resource (id, resource_id, config_id) VALUES "
     . join(',', $values);
$command = $connection->createCommand($sql);
$result = $command->execute();
$insert_id needs to have a value, but that's the same as in your code snippet.
If you have more than 5k or 10k rows to insert, you should run an INSERT every few thousand rows and reset the array; a sketch of that follows.
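A minimal sketch of that flushing approach, assuming the same $connection, $my_array, and $insert_id as above (the batch size of 5000 is arbitrary):
$values = array();
foreach ($my_array as $resource_id) {
    $values[] = "(DEFAULT, '$resource_id', '$insert_id')";
    if (count($values) >= 5000) {
        // Flush a full batch and start collecting the next one.
        $connection->createCommand(
            "INSERT INTO exclude_resource (id, resource_id, config_id) VALUES " . join(',', $values)
        )->execute();
        $values = array();
    }
}
if (!empty($values)) {
    // Insert whatever is left over.
    $connection->createCommand(
        "INSERT INTO exclude_resource (id, resource_id, config_id) VALUES " . join(',', $values)
    )->execute();
}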
Assuming I have an array as follows:
$array = array('first_value',
'second_value',
'third_value', 'and so on');
And a column into which I'd want to insert those values, each value in a separate row.
Would it be possible to do that?
Obviously there are some answers to this; one would be to just loop through the array elements and execute an INSERT statement on every iteration, but that just seems unwise.
Having an ID column would help a lot, but I don't have one.
The amount of data to be inserted is not terribly large, so the loop is perfectly viable; I just want to make sure there isn't some easier way to do this that I may not be aware of.
You could use prepared statements; the first query will send the SQL statement and the subsequent calls will only send the data, thereby reducing the load:
$stmt = $db->prepare('INSERT INTO mytable (colname) VALUES (?)');
foreach ($array as $value) {
    $stmt->execute(array($value));
}
If you're using PDO, as in the above example, make sure to disable prepared statement emulation.
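With PDO that is a one-line attribute change on the connection (assuming $db is your PDO instance):
// Ask the server to prepare the statement for real instead of emulating it in PHP.
$db->setAttribute(PDO::ATTR_EMULATE_PREPARES, false);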
// connect to database and store the resource in $connection
$array = array('first_value',
               'second_value',
               'third_value', 'and so on');
foreach ($array as $value)
{
    $value = mysqli_real_escape_string($connection, $value);
    mysqli_query($connection, "INSERT INTO yourTABLE (columnName) VALUES ('$value')");
}
You can put them all into a single INSERT statement with multiple VALUES lists.
$values = implode(',', array_map(function($v) use ($mysqli) {
    return "('" . $mysqli->real_escape_string($v) . "')";
}, $array));
$query = "INSERT INTO yourTable (Column) VALUES $values";
$mysqli->query($query) or die($mysqli->error);
From the MySQL manual for INSERT, you may try this:
INSERT INTO yourtable (column_name) VALUES (value_a), (value_b), (value_c);
$array = array('first_value', 'second_value', 'third_value');
$SQL = "INSERT INTO `table` (column) VALUES ('" . implode("'),('", $array) . "')";
OR
$values = '';
foreach ($array as $val) {
    $values .= !empty($values) ? ",('{$val}')" : "('{$val}')";
}
$SQL = "INSERT INTO `table` (column) VALUES{$values}";
I have a text file to read which has around 10,000 points, one comma-separated pair per line:
x1,y1
x2,y2
x3,y3
.
.
10000 times
I read them using a loop in PHP, store them in an array, and then loop again inserting one row at a time into my database. It takes a really long time. Is there any way I can insert the whole array at once?
for ($i = 0; $i < 10000; $i++)
{
    $sql = '
        INSERT INTO `firefly`.`FreeFormPoly`
        (`markedObjectID`, `order`, `x`, `y`)
        VALUES
        (' . $markedObjectsID . ', ' . $order . ', ' . $valuesx[$i] . ',' . $valuesy[$i] . ')';
    $db->query($sql);
}
Try using a multi-row INSERT statement: generate one INSERT and submit the entire statement.
INSERT INTO tbl_name (a,b,c) VALUES (1,2,3), (4,5,6), (7,8,9);
So:
$sql = 'INSERT INTO `firefly`.`FreeFormPoly` (`markedObjectID`, `order`, `x`, `y`) VALUES';
for ($i = 0; $i < 10000; $i++) {
    if ($i != 0) $sql .= ',';
    $sql .= '(' . $markedObjectsID . ', ' . $order . ', ' . $valuesx[$i] . ',' . $valuesy[$i] . ')';
}
$db->query($sql);
I would do something like this:
$sql = 'INSERT INTO `firefly`.`FreeFormPoly` (`markedObjectID`, `order`, `x`, `y`) VALUES ';
for ($i = 0; $i < $length; $i++) {
    $sql .= '(' . $markedObjectsID . ', ' . $order . ', ' . $valuesx[$i] . ',' . $valuesy[$i] . '),';
}
$sql = substr($sql, 0, -1);
$db->query($sql);
Explanation:
The syntax for inserting multiple records is
INSERT INTO TABLE_NAME VALUES (val1, val2, ...), (...), (...);
In the SQL you are concatenating (val1,val2,val3), every time the loop executes, hence you get an extra , in the last position, and substr() trims it off.
More preferably I would do
$sql = 'INSERT INTO `firefly`.`FreeFormPoly` (`markedObjectID`, `order`, `x`, `y`) VALUES ';
for ($i = 0; $i < $length; $i++) {
    $sql .= '(' . $markedObjectsID . ', ' . $order . ', ' . $valuesx[$i] . ',' . $valuesy[$i] . '),';
}
$sql = substr($sql, 0, -1);
$result = mysqli_query($db, $sql)
    or die('Error in querying the database');
You should be able to speed it up considerably by sending BEGIN TRANSACTION before the loop and COMMIT after the loop. One time I had to insert 14,000 data points on SQLite, and it took 20 minutes, but when I put the data in as a transaction it completed in 0.3 seconds.
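A minimal sketch, assuming your $db wrapper passes raw SQL straight through to the driver (MySQL also accepts START TRANSACTION):
$db->query('BEGIN TRANSACTION');
for ($i = 0; $i < 10000; $i++) {
    // ... run the single-row INSERT from the question here ...
}
$db->query('COMMIT');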
First off, you can use prepared statements to reduce the network overhead. Assuming PDO:
$stmt = $db->prepare('INSERT INTO mytable (foo, bar, baz) VALUES (:foo, :bar, :baz)');
for ($i = 0; $i < $length; ++$i) {
    $stmt->execute(array(
        ':foo' => $data[$i]['foo'],
        ':bar' => $data[$i]['bar'],
        ':baz' => $data[$i]['baz'],
    ));
}
Second, you could wrap the whole code inside $db->beginTransaction() and $db->commit().
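A sketch of that wrapping, reusing the prepared statement from above:
$db->beginTransaction();
// ... prepare once and execute per row, exactly as in the loop above ...
$db->commit();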
I'm trying to switch some hard-coded queries to use parameterized inputs, but I've run into a problem: How do you format the input for parameterized bulk inserts?
Currently, the code looks like this:
$data_insert = "INSERT INTO my_table (field1, field2, field3) ";
$multiple_inserts = false;
while ($my_condition)
{
    if ($multiple_inserts)
    {
        $data_insert .= " UNION ALL ";
    }
    $data_insert .= " SELECT myvalue1, myvalue2, myvalue3 ";
    $multiple_inserts = true; // without this, the SELECTs are never separated by UNION ALL
}
$recordset = sqlsrv_query($my_connection, $data_insert);
A potential solution (modified from How to insert an array into a single MySQL Prepared statement w/ PHP and PDO) appears to be:
$sql = 'INSERT INTO my_table (field1, field2, field3) VALUES ';
$parameters = array();
$data = array();
while ($my_condition)
{
    $parameters[] = '(?, ?, ?)';
    $data[] = $value1;
    $data[] = $value2;
    $data[] = $value3;
}
if (!empty($parameters))
{
    $sql .= implode(', ', $parameters);
    $stmt = sqlsrv_prepare($my_connection, $sql, $data);
    sqlsrv_execute($stmt);
}
Is there a better way to accomplish a bulk insert with parameterized queries?
Well, you have three options.
Build once - execute multiple. Basically, you prepare the insert once for one row, then loop over the rows executing it. Since the SQLSERVER extension doesn't support re-binding of a query after it's been prepared (you need to do dirty hacks with references), that may not be the best option (a sketch of the by-reference approach follows after the Option 3 code below).
Build once - execute once. Basically, you build one giant insert as you said in your example, bind it once, and execute it. This is a little bit dirty and misses some of the benefits that prepared queries give. However, due to the requirement of references from Option 1, I'd do this one. I think it's cleaner to build a giant query than to depend on variable references.
Build multiple - execute multiple. Basically, take the method you're doing, and tweak it to re-prepare the query every so many records. This prevents overly big queries and "batches" the queries. So something like this:
$sql = 'INSERT INTO my_table (field1, field2, field3) VALUES ';
$parameters = array();
$data = array();
$execute = function($params, $data) use ($my_connection, $sql) {
    $query = $sql . implode(', ', $params);
    $stmt = sqlsrv_prepare($my_connection, $query, $data);
    sqlsrv_execute($stmt);
};
while ($my_condition) {
    $parameters[] = '(?, ?, ?)';
    $data[] = $value1;
    $data[] = $value2;
    $data[] = $value3;
    if (count($parameters) % 25 == 0) {
        //Flush every 25 records
        $execute($parameters, $data);
        $parameters = array();
        $data = array();
    }
}
if (!empty($parameters)) {
    $execute($parameters, $data);
}
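For completeness, Option 1 could look like the following sketch; sqlsrv binds parameters by reference, so reassigning the variables and calling sqlsrv_execute() re-runs the statement with the new values ($rows here is a placeholder for your data source):
$f1 = $f2 = $f3 = null;
$stmt = sqlsrv_prepare($my_connection,
    'INSERT INTO my_table (field1, field2, field3) VALUES (?, ?, ?)',
    array(&$f1, &$f2, &$f3));
foreach ($rows as $row) {
    // Reassign the bound variables, then re-execute the prepared statement.
    list($f1, $f2, $f3) = $row;
    sqlsrv_execute($stmt) or die(print_r(sqlsrv_errors(), true));
}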
Any of the three will suffice. Do what you think fits your requirements best...
Why not just use the "prepare once, execute multiple" method? I know you want it to either all fail or all work, but it's not exactly hard to handle that with transactions:
http://www.php.net/manual/en/pdo.begintransaction.php
http://www.php.net/manual/en/pdo.commit.php
http://www.php.net/manual/en/pdo.rollback.php
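A minimal sketch of that combination, assuming a PDO connection in $db and the rows collected in $rows (both are placeholders):
$db->beginTransaction();
try {
    $stmt = $db->prepare('INSERT INTO my_table (field1, field2, field3) VALUES (?, ?, ?)');
    foreach ($rows as $row) {
        $stmt->execute($row); // each $row is array(field1, field2, field3)
    }
    $db->commit();   // all rows made it in together
} catch (Exception $e) {
    $db->rollBack(); // any failure undoes the whole batch
    throw $e;
}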
Here is my code:
$things = mysql_real_escape_string(implode(',', $_POST['things']), $link);
$q = "INSERT INTO tblslider(src) VALUES ('" . $things . "')";
print_r($q);
$result = $mysqli->query($q) or die(mysqli_error($mysqli));
but the query being generated is
INSERT INTO tblslider(src) VALUES ('4368122.jpg,5440051.jpg,1047428.jpg')
when it should be
INSERT INTO tblslider(src) VALUES ('4368122.jpg'),('5440051.jpg'),('1047428.jpg')
That's why it is treated as one record, not three.
You could do:
$things = array_map('mysql_real_escape_string', $_POST['things']);
$q = "INSERT INTO tblslider(src) VALUES ('". implode("'),('", $things)."')";
It generates (with my test data):
INSERT INTO tblslider(src) VALUES ('a.jpg'),('b.jpg'),('c.jpg')
I forgot: only use functions like mysql_real_escape_string on the actual data, not on the SQL string. In your example you apply the function to the already-concatenated data.
You have imploded $_POST['things'] into a string; iterate over the original array instead with a foreach loop such as...
foreach ($_POST['things'] as $item) {
    $q = "INSERT INTO tblslider(src) VALUES ('" . mysql_real_escape_string($item, $link) . "')";
    echo '<br />' . $q;
    $result = $mysqli->query($q) or die(mysqli_error($mysqli));
}
You could echo $q to make sure you're getting the queries right for each item also.
Try this:
$formatVals = function($x) { $rx = mysql_real_escape_string($x); return "('$rx')"; };
$valString = implode(',', array_map($formatVals, $_POST['things']));
$sql = "INSERT INTO tblslider (src) VALUES $valString";
I'm trying to load data from a few hundred text files into a database.
I believe MySQL is exiting the loop without inserting all the rows.
Can anyone suggest how to insert the rows in blocks of 1000 until the end of the data, with PHP code?
$filenames_array = array(); // filled with the paths of the text files to load
foreach ($filenames_array as $filename)
{
    $file_array = file($filename);
    $file_value = $file_array[0];
    $new_array = explode(",", $file_value);
    $length = count($new_array);
    for ($i = 0; $i < $length; $i++)
    {
        $sql = "INSERT INTO `names`
                (`id`, `name`)
                VALUES
                ('',
                '" . $new_array[$i] . "'
                )";
        $result = mysql_query($sql) or die(mysql_error());
        echo $i . ' Row Inserted<br />';
    }
}
You're probably trying to run too many separate INSERT statements.
Look into PDO and prepared statements, or use SQL syntax like this:
INSERT INTO tbl_name (a,b,c) VALUES (1,2,3), (4,5,6), (7,8,9);
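A sketch of the 1000-row batching the question asks for, assuming the $new_array of names from the question's code and its legacy mysql_* connection:
foreach (array_chunk($new_array, 1000) as $chunk) {
    $values = array();
    foreach ($chunk as $name) {
        $values[] = "('', '" . mysql_real_escape_string($name) . "')";
    }
    // One multi-row INSERT per block of 1000 names.
    $sql = "INSERT INTO `names` (`id`, `name`) VALUES " . implode(',', $values);
    mysql_query($sql) or die(mysql_error());
}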
Is it possible that one of the entries you're trying to insert contains a single quote '? In this case, an error would occur and the loop wouldn't finish.
You should always escape the values you insert into the database with mysql_real_escape_string to prevent problems like that, and to make sure you're not vulnerable to sql injection.
$sql = "INSERT INTO `names`
        (`id`, `name`)
        VALUES
        ('',
        '" . mysql_real_escape_string($new_array[$i]) . "'
        )";
Why not combine every text file into one big text file and read it line by line? See the examples at http://php.net/manual/en/function.fgets.php
Mainly:
<?php
$handle = @fopen("/tmp/inputfile.txt", "r");
if ($handle) {
    while (!feof($handle)) {
        $buffer = fgets($handle, 4096);
        echo $buffer;
    }
    fclose($handle);
}
?>
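Combining this with the multi-row INSERT above, a sketch that reads line by line and flushes a batch every 1000 lines (the mysql_* calls and `names` table match the question's code):
$handle = @fopen("/tmp/inputfile.txt", "r");
if ($handle) {
    $values = array();
    while (($buffer = fgets($handle, 4096)) !== false) {
        $values[] = "('', '" . mysql_real_escape_string(trim($buffer)) . "')";
        if (count($values) >= 1000) {
            // Flush a full batch of 1000 rows.
            mysql_query("INSERT INTO `names` (`id`, `name`) VALUES " . implode(',', $values)) or die(mysql_error());
            $values = array();
        }
    }
    if (!empty($values)) {
        // Insert the remainder.
        mysql_query("INSERT INTO `names` (`id`, `name`) VALUES " . implode(',', $values)) or die(mysql_error());
    }
    fclose($handle);
}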