Efficient way of inserting array of data in database using CodeIgniter - PHP

I have the following table fields in the database:
id, user_id, permission_id, value, created.
I have an array that may contain more than 20 different values at a time. It holds the permission_id values; the rest of the fields are the same for every row: user_id is a single value repeated with each permission_id, value is always 1, and created is simply the current date and time.
Now I am able to insert into database with following code:
$user_perms = $this->input->post('permissions');

foreach ($user_perms as $perms) {
    $userPerms['permission_id'] = $perms;
    $userPerms['user_id']       = $uid;
    $userPerms['value']         = 1;
    $userPerms['created']       = date("Y-m-d H:i:s");

    $this->admins->insertPerms($userPerms);
}
This runs fine, but I want to make it more efficient and fast. As you can see, the insert query runs inside the foreach loop, so when the user clicks submit on the back end the query may run more than 30 times in a row, which is not a good idea.
So, how can I insert all the data at once, without running a query per loop iteration?

You can use $this->db->insert_batch() to insert multiple rows at once:
$this->db->insert_batch(
    'table_name',
    array(
        // first row:
        array('permission_id' => 1, 'user_id' => 1, 'value' => 1, 'created' => '...'),
        // second row:
        array('permission_id' => 2, 'user_id' => 1, 'value' => 1, 'created' => '...'),
    )
);
However, that doesn't really avoid the foreach loop, because you still need it to build the array of rows that you pass in.
Another way to optimize this is to run the inserts inside a transaction. As far as SQL is concerned, that is roughly the equivalent of inserting them all at once in a single query, because the COMMIT is the most expensive operation, so 1 commit is faster than 20 commits.
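Putting the two together, a minimal sketch (the user_permissions table name is an assumption): the loop only builds the batch array, and a single insert_batch() call runs inside a transaction:

$user_perms = $this->input->post('permissions');
$created    = date("Y-m-d H:i:s");

// Build the batch array in memory; no queries run here.
$batch = array();
foreach ($user_perms as $perms) {
    $batch[] = array(
        'permission_id' => $perms,
        'user_id'       => $uid,
        'value'         => 1,
        'created'       => $created,
    );
}

// One multi-row INSERT, wrapped in a transaction.
$this->db->trans_start();
$this->db->insert_batch('user_permissions', $batch);
$this->db->trans_complete();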

Related

Bulk Insert in Laravel 5

I received POST data as a string, like this:
data01,data02
data11,
data21,data22
...
dataxx,dataxx
The data can be as long as 10,000 rows; column 1 is mandatory and unique, column 2 is optional.
I have sanitized the string (ensured column 1 is unique, but only within the input data, not against the DB; fixed the format; etc.) and converted it to an array:
[
    0    => [col1 => data01, col2 => data02],
    1    => [col1 => data11, col2 => ""],
    2    => [col1 => val11,  col2 => val12],
    .....
    9999 => [col1 => dataxx, col2 => dataxx],
]
That array now sits in my controller.
In my opinion, it's best to send that array to MySQL through a stored procedure and let the SP do the bulk insert job (let me know if you have a better way, and why).
My questions:
How do I pass that array to MySQL?
And should I use an "insert ... select ... on duplicate key ignore" approach?
I need to produce a report for the view, telling the user which column 1 values were duplicated (and therefore ignored).
And just a side question: if I do

$validator = Validator::make($request->all(), [
    'array.*.column1' => 'unique:items'
]);

will it actually query the DB once per item of my data (e.g. 10,000 times)? I'm just afraid the performance cost would be too expensive.
My solution for this:
First, define the validation rules; they always run against the request, not against the DB.

$rules = array("column 1" => "unique:KEY_NAME|required");
$validator = Validator::make($request->all(), $rules);

Then build an array of rows:

$data = array(
    array('k1' => 'v1', 'k2' => 'v2'),
    // ... all the data
);

Now use Eloquent's insert() and pass that data into it (see the sketch below).
As for the duplicate report, don't query the database. Just take the array difference between the final data and the request data, and add one more column that states whether a row is a duplicate or not. That's going to save a good amount of time.
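A minimal sketch of that idea, assuming the model is called Items (with fillable col1/col2), that $rows is the sanitized array, and that $duplicates holds the column 1 values filtered out during sanitization; all of these names are assumptions, not from the original post:

use App\Items;                      // assumed model name
use Illuminate\Support\Facades\DB;

// $rows is the sanitized array: [['col1' => ..., 'col2' => ...], ...]
// $duplicates was collected while sanitizing (column 1 values seen more than once).

DB::transaction(function () use ($rows) {
    // insert() builds one multi-row INSERT per chunk instead of one query per row
    foreach (array_chunk($rows, 1000) as $chunk) {
        Items::insert($chunk);
    }
});

// Pass the duplicate report to the view so the user can see what was ignored.
return view('items.import', ['ignored' => $duplicates]);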

How to insert Array to Database using firstOrCreate in Laravel?

I have an array like this and it has 120 elements in it
array (size=120)
  0 =>
    array (size=8)
      'name' => string 'That the quick brown fox jumps over the lazy - 7' (length=53)
      'url' => string 'google.com/zyx' (length=134)
      'category' => string 'search-engine' (length=6)
  1 =>
    array (size=8)
      'name' => string 'Mr. john brandy gave me a wall nut of quite' (length=67)
      'url' => string 'yahoo.com/dzxser' (length=166)
      'category' => string 'indian' (length=6)
I want to insert them into my bookmark table, for which I have created a model, and I want to make sure duplication doesn't occur. I found https://laravel.com/docs/5.4/eloquent#other-creation-methods, especially the firstOrCreate method.
I assume I have to use a foreach, but I am not sure how. Can anyone help me with a workaround?
Actually, you don't need firstOrCreate, you need updateOrCreate. Checking Laravel's "Other Creation Methods" you will find that method.
Say that array is in $alldata:

foreach ($alldata as $data) {
    MyModel::updateOrCreate($data); // the fields must be fillable in the model
}

This will run an update-or-create query for each of the 120 elements while cycling through the loop. The advantage is that you cannot end up with a duplicate; if there is a repetition, it only performs an update on the table.
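For reference, updateOrCreate() can also take two arrays: the attributes to match on and the values to fill. A hedged variation, assuming url is the column you treat as the unique key and Bookmark is your model (both of those choices are mine, not the asker's):

foreach ($alldata as $data) {
    // Match existing rows on 'url' only; create or update the other fields.
    Bookmark::updateOrCreate(
        ['url' => $data['url']],
        ['name' => $data['name'], 'category' => $data['category']]
    );
}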
However, the best way to ensure there is no duplication, no matter how the data arrives, is to set it up when creating your database table. You can put unique constraints on as many fields as your case requires.
If you don't want duplication to occur when inserting an array of records, then all you have to do is set a constraint making sure the fields are unique.
If you're using migrations to create the database schema, you can use something like this: $table->string('name')->unique();
This will make sure the 'name' column only accepts unique values, so the database itself rejects duplicates.
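A minimal migration sketch showing that constraint in context (the bookmarks table and its columns follow the question, but the exact schema is an assumption):

use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

Schema::create('bookmarks', function (Blueprint $table) {
    $table->increments('id');
    $table->string('name')->unique(); // the database now rejects duplicate names
    $table->string('url');
    $table->string('category');
    $table->timestamps();
});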

PDO Sqlite General error: 25 bind or column index out of range

I have read some relevant questions on this, but a simple query still triggers an error:
SQLSTATE[HY000]: General error: 25 bind or column index out of range
The $query:

INSERT OR IGNORE INTO `menu` (`id`,`name`,`name_clean`,`display`) VALUES (:idInsert,:nameInsert,:name_cleanInsert,:displayInsert);
UPDATE `menu` SET id=:idUpdate, name=:nameUpdate, name_clean=:name_cleanUpdate, display=:displayUpdate WHERE id = 1;
The $values
[:idInsert] => 1
[:idUpdate] => 1
[:nameInsert] => 2
[:nameUpdate] => 2
[:name_cleanInsert] => 3
[:name_cleanUpdate] => 3
[:displayInsert] => 1
[:displayUpdate] => 1
The snippet ($this->db->handle is the DB handle). As stated in one of the references above, I have set \PDO::ATTR_EMULATE_PREPARES to true so that I can execute multiple queries:
$statement = $this->db->handle->prepare($query);
$statement->execute($values);
I've been fighting with this one for hours and feel like I'm running in circles. What am I missing here?
Update
Table definition as required
DROP TABLE IF EXISTS `menu`;
CREATE TABLE `menu` (`id` INTEGER PRIMARY KEY NOT NULL ,`name` VARCHAR,`name_clean` VARCHAR,`sequence` INTEGER, `display` INTEGER);
I think you're running into this:
The keys from input_parameters must match the ones declared in the SQL. Before PHP 5.2.0 this was silently ignored.
Reference: PHP: PDOStatement::execute
My guess is that PHP tries to match the parameters on each of the two queries.
Your input and update parameters look the same, so I don't think there is a requirement to have the two sets. Try collapsing them into one set
[:id] => 1
[:name] => 2
[:name_clean] => 3
[:display] => 1
and referencing them in both queries.
Another note: are you sure you want
WHERE id = 1
This should probably also be
WHERE id=:id
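Putting both suggestions together, a hedged sketch; it relies on emulated prepares being on, as in the question, because native prepares don't let you reuse a named placeholder across the combined statement:

$query = '
    INSERT OR IGNORE INTO `menu` (`id`, `name`, `name_clean`, `display`)
    VALUES (:id, :name, :name_clean, :display);
    UPDATE `menu`
    SET name = :name, name_clean = :name_clean, display = :display
    WHERE id = :id;
';

$values = [
    ':id'         => 1,
    ':name'       => 2,
    ':name_clean' => 3,
    ':display'    => 1,
];

$statement = $this->db->handle->prepare($query);
$statement->execute($values);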
The issue is that PDO wants to bind 8 values to the 4 parameters of the first query, as split by the ';'.
Seeing your SQL code (and your added database structure), I would collapse it into a single upsert. Note that ON DUPLICATE KEY UPDATE is MySQL syntax; since this is SQLite, the equivalent (SQLite 3.24+) is INSERT ... ON CONFLICT ... DO UPDATE:

INSERT INTO menu (id, name, name_clean, display)
VALUES (:id, :name, :name_clean, :display)
ON CONFLICT(id) DO UPDATE SET
    name       = excluded.name,
    name_clean = excluded.name_clean,
    display    = excluded.display;

This is written by hand and might need adjusting, but it removes the need for two statements and two sets of parameters.

What is a proper way to insert, update, or delete records based on an outside source?

I have two externally hosted third-party .txt files that are updated on an irregular basis by someone other than myself. I have written a script that pulls this information in, manipulates it, and creates a merged array of data suitable for use in a database. I'm not looking for exact code but rather a description of a good process that will work efficiently in inserting a new row from this array if it doesn't already exist, updating a row in the table if any values have changed, or deleting a row in the table if it no longer exists in the array of data.
The data is rather simple and has the following structure:
map (string) | route (string) | time (decimal) | player (string) | country (string)
where a map and route combination must be unique.
Is there any way to do all needed actions without having to loop through all of the external data and all of the data from the table in my database? If not, what would be the most efficient method?
Below is what I have written. It takes care of all but the delete part:
require_once('includes/db.php');
require_once('includes/helpers.php');

$data = array_merge(
    custom_parse_func('http://example1.com/ex.txt'),
    custom_parse_func('http://example2.com/ex.txt')
);

try {
    $dsn = "mysql:host=$dbhost;dbname=mydb";
    $dbh = new PDO($dsn, $dbuser, $dbpass);
    $dbh->setAttribute(PDO::ATTR_EMULATE_PREPARES, false);
    $dbh->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

    foreach ($data as $value) {
        $s = $dbh->prepare('INSERT INTO table SET map=:map, route=:route, time=:time, player=:player, country=:country ON DUPLICATE KEY UPDATE map=:map2, route=:route2, time=:time2, player=:player2, country=:country2');
        $s->execute(array(
            ':map'      => $value['map'],
            ':route'    => $value['route'],
            ':time'     => $value['time'],
            ':player'   => $value['player'],
            ':country'  => $value['country'],
            ':map2'     => $value['map'],
            ':route2'   => $value['route'],
            ':time2'    => $value['time'],
            ':player2'  => $value['player'],
            ':country2' => $value['country']
        ));
    }
} catch (PDOException $e) {
    echo $e;
}
You mention that you're using MySQL, which has a handy INSERT ... ON DUPLICATE KEY UPDATE ... statement. You will have to iterate over your collection of data (but not over the existing table). I would handle it a little differently than @Tim B does (see the sketch below):
create a temporary table to hold the new data.
loop through your new data and insert it into the temporary table.
run an INSERT ... ON DUPLICATE KEY UPDATE ... statement inserting from the temporary table into the existing table - that takes care of both inserting new records and updating changed records.
run a DELETE ... FROM [existing table] LEFT JOIN [temporary table] ON [whatever key(s) you have] WHERE [temporary table key] IS NULL - this deletes everything from the existing table that does not appear in the temporary table.
The nice thing about temporary tables is that they are automatically dropped when the connection closes (as well as having some other nice features, like being invisible to other connections).
The other nice thing about this method is that you can do some (or all) of your data manipulation in the database after you have loaded it into the temporary table. It is often faster and simpler to do this kind of thing in SQL than by looping through and changing values in your array.
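A rough PHP/SQL sketch of those steps, reusing the question's $dbh and $data and assuming the existing table is named records and keyed on (map, route); the temporary table name is also an assumption:

// Step 1: temporary table with the same structure as the real one.
$dbh->exec('CREATE TEMPORARY TABLE records_tmp LIKE records');

// Step 2: fill the temporary table from the parsed $data array.
$ins = $dbh->prepare('INSERT INTO records_tmp (map, route, time, player, country)
                      VALUES (:map, :route, :time, :player, :country)');
foreach ($data as $value) {
    $ins->execute(array(
        ':map'     => $value['map'],
        ':route'   => $value['route'],
        ':time'    => $value['time'],
        ':player'  => $value['player'],
        ':country' => $value['country'],
    ));
}

// Step 3: upsert from the temporary table into the real table.
$dbh->exec('INSERT INTO records (map, route, time, player, country)
            SELECT map, route, time, player, country FROM records_tmp
            ON DUPLICATE KEY UPDATE time = VALUES(time), player = VALUES(player), country = VALUES(country)');

// Step 4: delete rows that no longer appear in the source data.
$dbh->exec('DELETE r FROM records r
            LEFT JOIN records_tmp t ON r.map = t.map AND r.route = t.route
            WHERE t.map IS NULL');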
The simplest way would be to truncate the table and then insert all the values. This will handle all of your requirements.
Assuming that is not viable, though, you need to remember which rows have been modified; that can be done with a flag, a version number, or a timestamp. For example (see the sketch after these steps):
Update the table, set the "updated" flag to 0 on every row
Loop through doing an upsert for every item (http://dev.mysql.com/doc/refman/5.6/en/insert-on-duplicate.html). Set the flag to 1 in each upsert.
Delete every entry from the database with the flag set to 0.
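A hedged sketch of that flag approach in the same PDO style as the question; the updated column and the records table name are assumptions:

// 1. Clear the flag on every existing row.
$dbh->exec('UPDATE records SET updated = 0');

// 2. Upsert every item from the source data, setting the flag to 1.
$s = $dbh->prepare('INSERT INTO records SET map=:map, route=:route, time=:time,
                        player=:player, country=:country, updated=1
                    ON DUPLICATE KEY UPDATE time=:time2, player=:player2, country=:country2, updated=1');
foreach ($data as $value) {
    $s->execute(array(
        ':map'      => $value['map'],
        ':route'    => $value['route'],
        ':time'     => $value['time'],
        ':player'   => $value['player'],
        ':country'  => $value['country'],
        ':time2'    => $value['time'],
        ':player2'  => $value['player'],
        ':country2' => $value['country'],
    ));
}

// 3. Remove rows the source no longer contains.
$dbh->exec('DELETE FROM records WHERE updated = 0');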

Laravel - multi-insert rows and retrieve ids

I'm using Laravel 4, and I need to insert some rows into a MySQL table, and I need to get their inserted IDs back.
For a single row, I can use ->insertGetId(), but it has no support for multiple rows. If I could at least retrieve the ID of the first row, as plain MySQL does, that would be enough to figure out the other ones.
This is MySQL's last-insert-id behavior. From the MySQL documentation:
Important: If you insert multiple rows using a single INSERT statement, LAST_INSERT_ID() returns the value generated for the first inserted row only. The reason for this is to make it possible to reproduce easily the same INSERT statement against some other server.
So you can either run many single inserts and collect their IDs, or, after a save, use $data->id, which should be the last inserted ID.
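A quick sketch of the "many single inserts" option; simple, but it costs one query per row (the invoices table name and $rows variable are assumptions):

use Illuminate\Support\Facades\DB;

$ids = [];
foreach ($rows as $row) {
    // insertGetId() runs one INSERT and returns that row's auto-increment ID
    $ids[] = DB::table('invoices')->insertGetId($row);
}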
If you are using INNODB, which supports transaction, then you can easily solve this problem.
There are multiple ways that you can solve this problem.
Let's say that there's a table called users which has 2 columns, id and name, and that it is backed by a User model.
Solution 1
Your data looks like
$data = [['name' => 'John'], ['name' => 'Sam'], ['name' => 'Robert']]; // this will insert 3 rows
Let's say that the last id on the table was 600. You can insert multiple rows into the table like this
DB::beginTransaction();
User::insert($data); // remember: $data is an array of associative arrays, not just a single assoc array
$startID = DB::select('select last_insert_id() as id'); // returns an array that has only one item in it
$startID = $startID[0]->id; // this will return 601
$lastID  = $startID + count($data) - 1; // this will return 603
DB::commit();
Now, you know the rows are between the range of 601 and 603
Make sure to import the DB facade at the top using this
use Illuminate\Support\Facades\DB;
Solution 2
This solution requires that you have a varchar or some other text field on the table.
$randomstring = Str::random(8);
$data = [['name' => "John$randomstring"], ['name' => "Sam$randomstring"]];
You get the idea here. You add that random string to a varchar or text field.
Now insert the rows like this
DB::beginTransaction();
User::insert($data);
// this will return the last inserted ids
$lastInsertedIds = User::where('name', 'like', '%' . $randomstring)
->select('id')
->get()
->pluck('id')
->toArray();
// now you can update that row to the original value that you actually wanted
User::whereIn('id', $lastInsertedIds)
->update(['name' => DB::raw("replace(name, '$randomstring', '')")]);
DB::commit();
Now you know what are the rows that were inserted.
As user Xrymz suggested, DB::raw('LAST_INSERT_ID();') returns the first.
According to the Schema API, insertGetId() accepts an array:
public int insertGetId(array $values, string $sequence = null)
So you should be able to do
DB::table('table')->insertGetId($arrayValues);
That said, if you're using MySQL, you could retrieve the first ID this way and calculate the rest. There is also a DB::getPdo()->lastInsertId(); function that could help.
Or, if one of these methods returns the last ID instead, you can calculate back to the first inserted one too.
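A minimal sketch of the lastInsertId() variant, under the same assumption Solution 1 makes (the batch gets a consecutive auto-increment range, hence the transaction):

use Illuminate\Support\Facades\DB;

DB::beginTransaction();
DB::table('users')->insert($data);             // $data is an array of row arrays
$firstId = (int) DB::getPdo()->lastInsertId(); // on MySQL this is the first ID of the batch
$ids = range($firstId, $firstId + count($data) - 1);
DB::commit();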
EDIT
According to the comments, my suggestions may be wrong.
Regarding the question of "what if a row is inserted by another user in between", it depends on the storage engine. If an engine with table-level locking (MyISAM, MEMORY, MERGE) is used, the question is irrelevant, since there cannot be two simultaneous writes to the table.
If a row-level locking engine (InnoDB) is used, another possibility might be to just insert the data and then retrieve all the rows by some known field with the whereIn() method, or to work out the table-level locking yourself.
$result = Invoice::create($data);

if ($result) {
    $id = $result->id;
}

This worked for me.
Note: Laravel version 9.
