So, I have a series of data that I need to insert into a table. Right now I am using a for loop to iterate through each entry and save the model one by one, but that doesn't seem like a good way to do it, and using a transaction would be awkward. What's a better way to do this that improves performance and also lets me use a transaction? Here's the code I am currently using:
foreach ($sheetData as $data)
{
    $newRecord = new Main;
    $newRecord->id = $data['A'];
    $newRecord->name = $data['B'];
    $newRecord->unit = $data['C'];
    $newRecord->save();
}
If you can skip validation, you can generate a single SQL insert and execute it once, like this:
$count = 0;
$sql = '';
foreach ($sheetData as $data)
{
    if (!$count) {
        $sql .= 'INSERT INTO tbl_main (id, name, unit) VALUES (' . $data['A'] . ', "' . $data['B'] . '", "' . $data['C'] . '")';
    } else {
        $sql .= ', (' . $data['A'] . ', "' . $data['B'] . '", "' . $data['C'] . '")';
    }
    $count++;
}
Yii::app()->db->createCommand($sql)->execute();
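If you also want the whole batch to be atomic, you can wrap the command in a transaction. Here is a minimal sketch using Yii 1.x's transaction API; it assumes the same $sheetData array and tbl_main table as above, and escapes each value with quoteValue() instead of concatenating it raw:

$connection = Yii::app()->db;
$transaction = $connection->beginTransaction();
try {
    $values = array();
    foreach ($sheetData as $data) {
        // quoteValue() escapes each value so the generated SQL is safe to run
        $values[] = '(' . $connection->quoteValue($data['A']) . ', '
                        . $connection->quoteValue($data['B']) . ', '
                        . $connection->quoteValue($data['C']) . ')';
    }
    $sql = 'INSERT INTO tbl_main (id, name, unit) VALUES ' . implode(', ', $values);
    $connection->createCommand($sql)->execute();
    $transaction->commit();
} catch (Exception $e) {
    // roll back the whole batch if any part of it fails
    $transaction->rollback();
    throw $e;
}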
Does laravel have an update batch functionality similar to Codeigniter?
CodeIgniter uses $this->db->update_batch('mytable', $data, 'title'); to do a batch update. More info can be found here.
But as for Laravel's update, it seems it can only update one row per query. I feel that this is kind of bad when you have multiple rows to update, since they end up inside a for loop. Something similar to this:
foreach ($rows as $row) {
DB::table('users')->where('id', $row['row_id'])->update(['votes' => 1]);
}
At least you get the picture, right?
If you look at this code, your database could pretty much get knocked out, because it keeps on connecting, unlike update_batch(), where only a single query is thrown.
TL;DR - it is not clear that the CodeIgniter (CI) method is more efficient than a series of UPDATE queries. In addition, the CI method is less clear than a looped series of UPDATEs.
It is not correct to say that Laravel "keeps on connecting" - it will open a connection at the start of the request and keep using that connection until the request is finished. I think what you mean to say is that you are sending a whole lot of queries to the server.
It is true that doing this:
INSERT INTO table VALUES (1, ...);
INSERT INTO table VALUES (2, ...);
INSERT INTO table VALUES (3, ...);
...
INSERT INTO table VALUES (n, ...);
is going to be less efficient than doing this:
INSERT INTO table VALUES (1, ...),
(2, ...),
(3, ...),
...
(n, ...);
But this is not a simple INSERT. Look at the code generated by the CI library in the link you posted. What we're looking at is the difference in efficiency between:
UPDATE table SET value=1 WHERE id=1;
UPDATE table SET value=2 WHERE id=2;
UPDATE table SET value=3 WHERE id=3;
...
UPDATE table SET value=n WHERE id=n;
and
UPDATE table
SET value = CASE
WHEN id = 1 THEN 1
WHEN id = 2 THEN 2
WHEN id = 3 THEN 3
...
WHEN id = n THEN n
ELSE value END
WHERE id IN (1, 2, 3, ..., n);
I'm no SQL expert, but I'm not convinced that the second is more efficient than the first. That long chain of WHEN clauses is going to be processed by the SQL server - a bunch of short UPDATE statements is (in my opinion) going to be more efficient, particularly when you're using an indexed column in the WHERE clause.
Finally, looking at the CI documentation, that update_batch method makes some assumptions about the structure of the data you pass to it - for example, that the first item in the array is the key for the update statements. That (to me at least) is not going to be clear when you look at your code six months from now.
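If the real concern is the overhead of firing many UPDATEs, one option is to wrap the loop in a transaction so all the statements are committed together. This is a minimal sketch using Laravel's query builder; it assumes the same $rows structure from the question, with each row carrying a row_id and a votes value:

use Illuminate\Support\Facades\DB;

DB::transaction(function () use ($rows) {
    foreach ($rows as $row) {
        // still one UPDATE per row, but committed as a single transaction
        DB::table('users')
            ->where('id', $row['row_id'])
            ->update(['votes' => $row['votes']]);
    }
});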
Maybe this will help:
use Carbon\Carbon;
use Illuminate\Support\Facades\DB;

trait InsertOrUpdate {
static function InsertOrUpdate(array $rows) {
$table = DB::getTablePrefix().with(new self)->getTable();
if (empty($table)) {
return false;
}
$maxRowData = DB::table($table)->where('id', DB::raw('(select max(`id`) from ' . $table . ')'))->first();
if (! empty($maxRowData)) {
$maxId = $maxRowData->id;
$result = DB::statement('ALTER TABLE ' . $table . ' AUTO_INCREMENT = ' . $maxId . ';');
}
$tableColumns = DB::getSchemaBuilder()->getColumnListing($table);
$datetime = Carbon::now()->toDateTimeString();
if (in_array('created_at', $tableColumns)) {
foreach ($rows as $key => $row) {
$rows[$key]['created_at'] = $datetime;
$rows[$key]['updated_at'] = $datetime;
}
}
$first = reset($rows);
$columns = implode(',',
array_map(function($value) {
return "$value";
},
array_keys($first))
);
$values = implode(',', array_map(function($row) {
return '('.implode( ',',
array_map(function($value) { return '"' . str_replace('"', '""', $value) . '"'; }, $row)
).')';
} , $rows)
);
$updates = '';
if (in_array('updated_at', $tableColumns)) {
unset($first['created_at']);
unset($first['updated_at']);
$first['deleted_at'] = NULL;
$updateString = '(CASE WHEN ';
$lastColumn = count($first);
$columnNum = 1;
foreach (array_keys($first) as $column) {
$updateString .= $column . ' <> VALUES(' . $column . ')';
if ($columnNum != $lastColumn) {
$updateString .= ' OR ';
}
$columnNum++;
}
$updateString .= ' THEN \'' . $datetime . '\' ELSE `updated_at` END), ';
$updates .= 'updated_at = ' . $updateString;
}
$updates .= implode(',',
array_map(function($value) {return "$value = VALUES($value)"; } , array_keys($first) )
);
$sql = "INSERT INTO {$table}({$columns}) VALUES {$values} ON DUPLICATE KEY UPDATE {$updates};";
return DB::statement($sql);
    }
}
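Usage would look roughly like this; the Main model name, table name, and column names are placeholders for whatever Eloquent model owns the target table:

class Main extends \Illuminate\Database\Eloquent\Model
{
    use InsertOrUpdate;

    protected $table = 'tbl_main';
}

// Inserts new rows, or updates existing ones when a unique key collides.
Main::InsertOrUpdate([
    ['id' => 1, 'name' => 'Alpha', 'unit' => 'kg'],
    ['id' => 2, 'name' => 'Beta',  'unit' => 'pcs'],
]);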
Look into database transactions...
https://laravel.com/docs/5.2/database#database-transactions
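For example, something along these lines - a minimal sketch adapted from the loop in the question; the closure form commits automatically on success and rolls back if an exception is thrown:

use Illuminate\Support\Facades\DB;

DB::transaction(function () use ($sheetData) {
    foreach ($sheetData as $data) {
        $newRecord = new Main;
        $newRecord->id   = $data['A'];
        $newRecord->name = $data['B'];
        $newRecord->unit = $data['C'];
        $newRecord->save();
    }
});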
How could I make this code faster? It's fine when I insert 100 records into my database, but it takes a really long time when I insert, say, 500K records.
I've tried to use implode in my code but it's not working.
The code has two foreach loops, one inside the other, and I can't find a way to make it work. Does anyone have an idea?
My framework is CodeIgniter.
Here's what the code looks like:
<?php
function Add_multiple_users($values)
{
$err = '';
foreach($values as $rows)
{
$clientQuery = 'INSERT INTO
client
(
admin_id,
create_time
)
VALUES
(
"'.$this -> session -> userdata('user_id').'",
"'.date('Y-m-d H:i:s').'"
)';
$clientResult = #$this -> db -> query($clientQuery);
if($clientResult)
{
$client_id = $this -> db -> insert_id();
foreach($rows as $row)
{
$attrQuery = 'INSERT INTO
client_attribute_value
(
attribute_id,
client_id,
value
)
VALUES
(
"'.$row['attribute_id'].'",
"'.$client_id.'",
"'.addslashes(trim($row['value'])).'"
)';
$attrResult = #$this -> db -> query($attrQuery);
if(!$attrResult)
{
$err .= '<p class="box error">Could not add attribute for<br>
Attribute ID: '.$row['attribute_id'].'<br>
Client ID: '.$client_id.'<br>
Attribute Value: '.trim($row['value']).'</p>';
}
}
}
}
return $err;
}
?>
Here's what I've tried:
$attrQuery = "INSERT INTO client_attribute_value (attribute_id, client_id, value) VALUES ";
$vls = array();
foreach($rows as $row) {
$myattribute_id = $row['attribute_id'];
$myclient_id = $row[$client_id];
$myvalue = addslashes(trim($row['value']));
$vls[] = " ( '$myattribute_id ', '$myclient_id ', '$myvalue ')";
$attrQuery .= implode(', ', $vls);
$attrResult = #$this -> db -> query($attrQuery);
(Sample rows from the client and client_attribute_value tables omitted.)
Answering my own question; hope this helps someone else in the future.
In the model .php file of your CodeIgniter application, add this:
$this->db->trans_start();
MY CODE
$this->db->trans_complete();
Problem solved. :)
It sped up my INSERTs: approximately 15K records in 30 seconds.
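Applied to the original Add_multiple_users() method, that would look roughly like this. This is a sketch, not the exact original code; it also batches the attribute rows with CodeIgniter's insert_batch(), which sends one multi-row INSERT per client instead of one query per attribute:

function Add_multiple_users($values)
{
    $this->db->trans_start();

    foreach ($values as $rows) {
        // one client row per outer entry
        $this->db->insert('client', array(
            'admin_id'    => $this->session->userdata('user_id'),
            'create_time' => date('Y-m-d H:i:s'),
        ));
        $client_id = $this->db->insert_id();

        // collect this client's attribute rows and insert them in one query
        $batch = array();
        foreach ($rows as $row) {
            $batch[] = array(
                'attribute_id' => $row['attribute_id'],
                'client_id'    => $client_id,
                'value'        => trim($row['value']),
            );
        }
        if (!empty($batch)) {
            $this->db->insert_batch('client_attribute_value', $batch);
        }
    }

    $this->db->trans_complete();
    return $this->db->trans_status() ? '' : '<p class="box error">Import failed.</p>';
}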
My guess is that you are making multiple connections to the database. Try to open one connection and keep its resource open for all the changes, then close it only at the end of the process.
I have a strange problem with inserting rows in a loop into a MySQL table.
Let me show you the PHP code first, then I'll describe some statistics.
I tend to think it is some MySQL issue, but I have absolutely no idea what kind. A maximum number of inserts per minute? (It can't be a max-rows limit; there's plenty of space on disk.)
echo " For character=" . $row[1];
$xml = simplexml_load_file($api_url);
$i=0;
foreach ($xml->result->rowset->row as $value) {
$newQuery = 'INSERT INTO '.$tableName.' (transactionDateTime, quantity, typeName, price, clientName, station, transactionType, seller) VALUES ("'.$value['transactionDateTime'].'",'.$value['quantity'].',"'.$value['typeName'].'","'.$value['price'].'","'.$value['clientName'].'","'.$value['stationName'].'","'.$value['transactionType'].'","'.$row[1].'")';
$i++;
if (!mysqli_query($conn, $newQuery)) {
die('Error while adding transaction record: ' . mysqli_error($conn));
} // if END
} // foreach END
echo " added records=" . $i;
I have the same data in the XML, and it doesn't change. (The XML has something like 1400+ rows that I would insert.)
It always inserts a different number of rows. The most it ever inserted was around 800+.
If I put something like a 10-second delay into the foreach loop at $i == 400, it adds even fewer rows. More delays, fewer rows.
It never reaches the part of the code where mysqli_error($conn) is printed.
It never reaches the echo " added records=" . $i; part of the code either.
Since it always stops on a different record, I have to assume nothing is wrong with the INSERT query.
Since it never reaches the line after the foreach loop (echo " added records=" . $i;), I also assume the XML data wasn't processed to the end.
If I use another source of data (another character) with fewer records in the XML, this code works just fine.
What could possibly be my problem?
Could be that you're firing multiple queries at your SQL server. Better to build a single SQL query via your foreach and then fire it once.
Something like this, basically:
$db = new mysqli($hostname, $username, $password, $database);
if($db->connect_errno > 0)
{
$error[] = "Couldn't establish connection to the database.";
}
$commaIncrement = 1;
$commaCount = count($result);
$SQL[] = "INSERT INTO $table $columns VALUES";
foreach ($result as $value)
{
$comma = $commaCount == $commaIncrement ? "" : ",";
$SQL[] = "(";
$SQL[] = "'$value[0]',"."'$value[1]',"."'$value[2]',"."'$value[3]'";
$SQL[] = ")$comma";
$commaIncrement++;
}
$SQL[] = ";";
$completedSQL = implode(' ',$SQL);
$query = $db->prepare($completedSQL);
if($query)
{
$db->query($completedSQL);
}
$db->close();
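An alternative that avoids building one huge string, and keeps values escaped, is to prepare the INSERT once and execute it with bound parameters inside the loop. This is a sketch against the question's variables ($conn, $tableName, $xml, $row[1]); the column list and type string mirror the question's query and are assumptions here:

// Prepare the INSERT once; execute it with bound parameters for every XML row.
$stmt = $conn->prepare(
    'INSERT INTO ' . $tableName . ' (transactionDateTime, quantity, typeName, price,
        clientName, station, transactionType, seller)
     VALUES (?, ?, ?, ?, ?, ?, ?, ?)'
);
$stmt->bind_param('sisdssss', $dateTime, $quantity, $typeName, $price,
    $clientName, $station, $transactionType, $seller);

$i = 0;
foreach ($xml->result->rowset->row as $value) {
    // cast the SimpleXML attributes to plain PHP scalars before executing
    $dateTime        = (string) $value['transactionDateTime'];
    $quantity        = (int)    $value['quantity'];
    $typeName        = (string) $value['typeName'];
    $price           = (float)  $value['price'];
    $clientName      = (string) $value['clientName'];
    $station         = (string) $value['stationName'];
    $transactionType = (string) $value['transactionType'];
    $seller          = (string) $row[1];

    if (!$stmt->execute()) {
        die('Error while adding transaction record: ' . $stmt->error);
    }
    $i++;
}
$stmt->close();
echo " added records=" . $i;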
Scrowler is right, your PHP is timing out. As a test, you can add
set_time_limit(0);
to the start of your php script.
WARNING - Don't use this in production or anywhere else. Always set a reasonable time limit for the script.
I thought I would edit my question, as the comment points out that this is a very insecure way of doing what I am trying to achieve.
What I want to do is allow the user to import a .csv file but I want them to be able to set the fields they import.
Is there a way of doing this apart from the way I tried to demonstrate in my original question?
Thank you
Daniel
This problem has been driving me mad for weeks now; everything I try that should work fails.
Basically I have a database with a bunch of fields in.
In one of my pages I have the following code
$result = mysql_query("SHOW FIELDS FROM my_database.products");
while ($row = mysql_fetch_array($result)) {
$field = $row['Field'];
if ($field == 'product_id' || $field == 'product_name' || $field == 'product_description' || $field == 'product_slug' || $field == 'product_layout') {
} else {
echo '<label class="label_small">'.$field.'</label>
<input type="text" name="'.$field.'" id="input_text_small" />';
}
}
This then echoes a list of fields that have the label of the database field and also include the database field in the name of the text box.
I then post the results with the following code
$result = mysql_query("SHOW FIELDS FROM affilifeed_1000.products");
$i = 0;
while ($row = mysql_fetch_array($result)) {
$field = $row['Field'];
if ($field == 'product_name' || $field == 'product_description' || $field == 'product_slug' || $field == 'product_layout') {
} else {
$input_field = $field;
$output_field = mysql_real_escape_string($_POST[''.$field.'']);
}
if ($errorcount == 0) {
$insert = "INSERT INTO my_database.products ($input_field)
VALUES ('$output_field')";
$result_insert = mysql_query($insert) or die ("<br>Error in database<b> ".mysql_error()."</b><br>$result_insert");
}
}
if ($result_insert) {
echo '<div class="notification_success">Well done you have sucessfully created your product, you can view it by clicking here</div>';
} else {
echo '<div class="notification_fail">There was a problem creating your product, please try again later...</div>';
}
It posts successfully, but the problem is that it creates a new row for every insert.
For example in row 1 it will post the first value and then the rest will be empty, in row 2 it will post the second value but the rest will be empty, row 3 the third value and so on...
I have tried many things to get this working and have researched foreach loops (which I wasn't familiar with before), variable binding, imploding, and exploding, but none of them seem to do the trick.
I can kind of understand why it is doing it as it is wrapped in the while loop but if I put it outside of this it only inserts the last value.
Can anyone shed any light as to why this is happening?
If you need any more info please let me know.
Thank you
Daniel
You're treating each field you're displaying as its own record to be inserted. Since you're trying to create a SINGLE record with MULTIPLE fields, you need to build the query dynamically, e.g.
foreach ($_POST as $key => $value) {
$fields[] = mysql_real_escape_string($key);
$values[] = "'" . mysql_real_escape_string($value) . "'";
} // build arrays of the form's field/value pairs
$field_str = implode(',', $fields); // turn those arrays into comma-separated strings
$values_str = implode(',', $values);
$sql = "INSERT INTO yourtable ($field_str) VALUES ($values_str);";
// insert those strings into the query
$result = mysql_query($sql) or die(mysql_error());
which will give you
INSERT INTO yourtable (field1, field2, ...) VALUES ('value1', 'value2', ...)
Note that I'm using the mysql library here, but you should avoid it. It's deprecated and obsolete. Consider switching to PDO or mysqli before you build any more code that could be totally useless in short order.
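For illustration, the same dynamic insert with PDO and bound placeholders might look something like this. This is only a sketch: the DSN, credentials, and table name are placeholders, and the column names still come from the form, so they should be whitelisted as discussed below.

$pdo = new PDO('mysql:host=localhost;dbname=yourdb;charset=utf8', 'user', 'pass', array(
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
));

$fields       = array_keys($_POST);
$placeholders = implode(',', array_fill(0, count($fields), '?'));
// Column names cannot be bound as parameters, so sanitise them separately.
$fieldStr     = implode(',', array_map(function ($f) {
    return '`' . str_replace('`', '', $f) . '`';
}, $fields));

$stmt = $pdo->prepare("INSERT INTO yourtable ($fieldStr) VALUES ($placeholders)");
$stmt->execute(array_values($_POST));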
On a security basis, you should not be passing the field values directly through the database. Consider the case where you might be doing a user permissions management system. You probably wouldn't want to expose a "is_superuser" field, but your form would allow anyone to give themselves superuser privileges by hacking up their html form and putting a new field saying is_superuser=yes.
This kind of code is downright dangerous, and you should not be using it in a production system, no matter how much SQL injection protection you build into it.
Alright... I can't say that I know exactly what's going on, but let's try this.
First off....
$result = mysql_query("SHOW FIELDS FROM my_database.products");
$hideArray = array("product_id","product_name","product_description", "product_slug","product_layout");
while ($row = mysql_fetch_array($result)) {
if (!in_array($row['Field'], $hideArray)){
echo '<label class="label_small">'.$field.'</label>
<input type="text" name="'.$field.'" id="input_text_small" />';
}
}
Now, why you would want to post this data makes no sense to me, but I am going to ignore that. What's really strange is that you aren't even using the post data... maybe I'm not getting something. I would recommend using a DB wrapper class; that way you can just throw the post vars into it, i.e. $db->insert($_POST). But if you are doing it the long way...
$fields = "";
$values = "";
$query = "INSERT INTO table ";
foreach ($_POST as $key => $data) {
    $values .= "'" . mysql_real_escape_string($data) . "',";
    $fields .= $key . ",";
}
$values = substr($values, 0, -1);
$fields = substr($fields, 0, -1);
$query .= "(".$fields.") VALUES (".$values.");";
This is untested....you can also look into http://php.net/manual/en/function.implode.php so you don't have to do the loop.
Basically you don't seem to understand what is going on in your script. If you echo the SQL statements, you can get a better idea of what's happening. Learn what your code is doing and then try to understand what the correct approach is. Don't just copy and paste my code.
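The implode-based version mentioned above could look roughly like this (a sketch; it assumes every $_POST key matches a column in the target table):

$columns = implode(',', array_keys($_POST));
$values  = "'" . implode("','", array_map('mysql_real_escape_string', array_values($_POST))) . "'";

$query = "INSERT INTO table ($columns) VALUES ($values);";
mysql_query($query) or die(mysql_error());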
Another question: what is the best way to copy a whole table from one database to another without using something like this:
CREATE TABLE DB2.USER SELECT * FROM DB1.USER;
This does not work because I can't use data from two databases at the same time. So I decided to do this with PHP: cache all the data, then create the table in the other DB.
But now, what would be the fastest way to cache the data? I guess there are almost always fewer than 1000 records per table.
Thanks for your input
Export it to a .sql file and import to the new database.
In phpMyAdmin click the database you want to export, click export, check save as file, then go.
Then upload it to the new database that you want to duplicate the data in.
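If you have shell access and exec() is allowed, the command-line equivalent is a one-liner; this is a sketch with placeholder credentials and database names:

$command = 'mysqldump -h localhost -u user -ppassword db1'
         . ' | mysql -h localhost -u user -ppassword db2';
exec($command);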
In a strict PHP sense, the only way I could think to do it would be to use SHOW TABLES and DESCRIBE {$table} to return all the field names and structures, parse that out, create your CREATE TABLE statements, then loop through each table and create INSERT statements.
I'm afraid the best I can do for you is a sort of prototype code that I would imagine would be incredibly server intensive, which is why I recommend you go an alternate route.
Something like:
<?php
// connect to first DB
$tables = mysql_query("SHOW TABLES") or die(mysql_error());
while ($row = mysql_fetch_assoc($tables)) {
    foreach ($row as $value) {
        $aTables[] = $value;
    }
}
$i = 0;
foreach ($aTables as $table) {
    $desc = mysql_query("describe " . $table);
    while ($row = mysql_fetch_assoc($desc)) {
        $aFields[$i][] = array($row["Field"], $row["Type"], $row["Null"], $row["Key"], $row["Default"], $row["Extra"]);
    }
    // cache the table's rows while still connected to the first DB
    $aData[$i] = array();
    $data = mysql_query("SELECT * FROM " . $table);
    while ($row = mysql_fetch_assoc($data)) {
        $aData[$i][] = $row;
    }
    $i++;
}
// connect to second DB
for ($i = 0; $i < count($aTables); $i++) {
    // Build the CREATE TABLE statement from the described fields
    // (keys, defaults and extras are ignored in this rough prototype)
    $defs = array();
    foreach ($aFields[$i] as $f) {
        // $f = array(Field, Type, Null, Key, Default, Extra)
        $defs[] = "`{$f[0]}` {$f[1]}" . ($f[2] == "NO" ? " NOT NULL" : "");
    }
    mysql_query("CREATE TABLE IF NOT EXISTS `{$aTables[$i]}` (" . implode(", ", $defs) . ")");

    // Copy the cached rows over with one multi-row INSERT per table
    $rows = array();
    foreach ($aData[$i] as $row) {
        $values = array();
        foreach ($aFields[$i] as $f) {
            $values[] = "'" . mysql_real_escape_string($row[$f[0]]) . "'";
        }
        $rows[] = "(" . implode(",", $values) . ")";
    }
    if (!empty($rows)) {
        mysql_query("INSERT INTO `{$aTables[$i]}` VALUES " . implode(",", $rows));
    }
}
?>
This is based off of this script which may come in handy for reference.
Hopefully that's something to get you started, but I would suggest you look for a non PHP alternative.
This is the way I do it, not original with me:
http://homepage.mac.com/kelleherk/iblog/C711669388/E2080464668/index.html
$servername = "localhost";
$username = "root";
$password = "*******";
$dbname = "dbname";
$sqlfile = "/path/backupsqlfile.sql";
$command='mysql -h' .$servername .' -u' .$username .' -p' .$password .' ' .$dbname .' < ' .$sqlfile;
exec($command);