get record ids after bulk insert in laravel - php

I'm using Laravel 5.4 and I want to get the record IDs after I insert them into the table. Here's the data I want to insert; it is stored in an array:
$data = [
    [
        "count" => "100",
        "start_date" => 1515628800
    ],
    [
        "count" => "102",
        "start_date" => 1515715200
    ]
];
Here I insert the array of items at once:
\Auth::user()->schedule()->insert($data);
But this method returns a boolean, and I want to get the IDs (or all columns) of these new items after they have been inserted. How should I do this? It doesn't matter whether it's done with Eloquent or the query builder. I already tried the insertGetId method, but it doesn't seem to work with a multidimensional array. If it's not possible with Laravel, what would be the best way to implement this?
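For illustration, a simple but chatty alternative is inserting the rows one at a time through the relation, since create() returns each model together with its id (this assumes count and start_date are mass assignable on the related model):
// One query per row, but every returned model carries its fresh id.
$created = collect($data)->map(function ($row) {
    return \Auth::user()->schedule()->create($row);
});

$ids = $created->pluck('id')->all();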

Workaround: select the highest PK value after the insert, and from that you can work out which IDs the new rows got. You might need locking to prevent a race with other users, though.
Edit:
LOCK TABLES schedule WRITE;
-- select the last ID
UNLOCK TABLES;
The list of IDs then runs from (last ID - insert count + 1) up to the last ID, assuming a consecutive auto-increment key.
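A rough sketch of that workaround in Laravel terms (the table name schedule, a MySQL auto-increment primary key, and consecutive IDs for the multi-row insert are all assumptions here):
// Lock the table so nobody else inserts between our insert and the MAX(id) read.
\DB::unprepared('LOCK TABLES schedule WRITE');

\Auth::user()->schedule()->insert($data);

$lastId = \DB::table('schedule')->max('id');

\DB::unprepared('UNLOCK TABLES');

// IDs of the freshly inserted rows, assuming they were assigned consecutively.
$newIds = range($lastId - count($data) + 1, $lastId);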

My approach is to generate the IDs server-side whenever I run into this issue, but you have to make sure the IDs are unique. Also, you can do it for exactly the tables you need (not the entire database). Hope this helps.
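One way that generate-the-IDs-up-front idea might look (a sketch only; the answer doesn't name a scheme, so UUIDs via the ramsey/uuid package and a string primary key column are assumptions):
use Ramsey\Uuid\Uuid;

// Assign an ID to every row before inserting, so we already know them afterwards.
$data = array_map(function ($row) {
    $row['id'] = Uuid::uuid4()->toString();
    return $row;
}, $data);

\Auth::user()->schedule()->insert($data);

$ids = array_column($data, 'id');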

Related

Cross-reference array elements by named index - PHP/Laravel/Lumen

TL;DR:
I want to take data from a one-dimensional array of arbitrary size, created from user input, and fill its values into the appropriate fields of a two-dimensional array of arbitrary size, created via a query from the database.
I have a web application where the user can access the DB's data in both read mode and write mode.
The DB records accessible to a user are determined by the departments they belong to.
The records are organized in a DB structure where a core table contains data visible to ALL departments, while extension tables referencing the core table via FK contain data that is visible only to users who belong to the department associated with that extension table.
I'm using Lumen/Laravel and its Eloquent model to interact with the DB from my backend. Some example code for accessing the DB looks like this:
$join = coretable::with($permittedTables)->find(1);
Here, $permittedTables is an array of table names referencing the extension tables accessible to the user.
The above code then fetches the following result:
{
    "id": 1,
    "Internal_key": "TESTKEY_1",
    "extensiontable_itc": {
        "description": "EXTENSION_iTC_1"
    },
    "extensiontable_sysops": {
        "description": "EXTENSION_SYSOPS_1"
    }
}
Now, the user will have a list-like view in the front-end where all this data has been merged into one big table. In this list, the user can click a field and change its value, then send an HTTP request to persist these changes to the DB.
This HTTP request will be an array in JSON format, which I will json_decode() in my backend, and I will use the transmitted id to fetch the model as seen above.
Now, at this point, two sets of data, organized as associative arrays, will face each other. The input from the HTTP request will likely be a one-dimensional array, while the model from the DB will almost certainly be the multidimensional array you've seen above.
Furthermore, there is a huge number of possible combinations of datasets. It can be the combination seen above, but it can also be a combination of data from other tables not listed here, and both a bigger or a smaller set of tables can be aggregated into the model and the user input.
Therefore, the process writing the incoming input to the DB must be able to determine dynamically which field of the input must be put into which field of the DB.
I'm doing this for the first time and I don't have many ideas how to do this. The only thing that came to my mind was mirroring the DB column names in the indexes of the input array, then looping through the model and comparing its indexes to the index of the currently selected element of the input. If they match, the value from the respective field of the input is written to the respective field of the DB.
I am aware that, for this to work, each column of the tables affected by this process MUST have a name that is unique across the DB. This is doable, though.
Still, this is currently the only idea I could come up with.
However, I wanted to ask you two things about this:
1) Are there other, less "hacky" approaches to solving the problem outlined above?
2) Can someone give me a custom function (or a set of custom functions) capable of iterating over two arrays, of which at least one will be multidimensional, comparing their indexes and setting the value from array A in array B when the indexes match?
Below is a little code example creating two arrays that reproduce the described situation, so you can fiddle around with them (a possible sketch of such a function follows after the arrays):
First, the input side:
$inputArray = array(
    "id" => 1,
    "internal_key" => "TESTKEY_1",
    "CPU" => "intelTest1",
    "GPU" => "nvidiaTest1",
    "Soundcard" => "AsusTest1",
    "MAC" => "macTest1",
    "IP" => "ipTest1",
    "VLAN" => "vlanTest1"
);
Then, the DB side:
$modelArray = array(
    "id" => 1,
    "internal_key" => "TESTKEY_2",
    "extensiontable_itc" => array(
        "CPU" => "intelTest1",
        "GPU" => "nvidiaTest2",
        "Soundcard" => "AsusTest1"
    ),
    "extensiontable_sysops" => array(
        "MAC" => "macTest2",
        "IP" => "ipTest1",
        "VLAN" => "vlanTest1"
    )
);
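And here is a minimal sketch of what such a helper could look like, assuming (as stated above) that column names are unique across the involved tables; treat it as an illustration rather than production code:
/**
 * Walk the (possibly nested) model array and overwrite every leaf whose key
 * also exists in the flat input array.
 */
function crossReference(array $input, array $model): array
{
    foreach ($model as $key => $value) {
        if (is_array($value)) {
            // Recurse into nested extension-table arrays.
            $model[$key] = crossReference($input, $value);
        } elseif (array_key_exists($key, $input)) {
            $model[$key] = $input[$key];
        }
    }

    return $model;
}

$merged = crossReference($inputArray, $modelArray);
// $merged now carries the user's values arranged in the structure of $modelArray.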

cakephp - get results from two unrelated tables

I've been googling this for a while but to no avail. I'm finding the possible answers confusing and hoped someone could clear it up for me.
I've two tables (tasks and installs) which contain similar data, but not the same, and there's no relationship between the two tables, other than the fact they both belong to the same branch. So for example:
Tasks Table
id
branch_id
task_name
to_be_billed
created
Installs Table
id
branch_id
install_details
to_be_billed
created
I'm trying to figure out how to get a result set that shows each record from either table, ordered by created date and only where the 'to_be_billed' column is '1'.
Can anyone give me some pointers please?
Thanks
I'm assuming that you want to get the results using the branch_id and that these two tables (Tasks and Installs) have a relationship with the Branch table.
I'm also assuming the Tasks and Installs tables have multiple records per Branch.
$branchTable->find()
    ->contain([
        'Tasks' => [
            'sort' => ['Tasks.created' => 'ASC']
        ]
    ])
    ->contain([
        'Installs' => [
            'sort' => ['Installs.created' => 'ASC']
        ]
    ])
    ->matching('Tasks', function ($q) {
        return $q->andWhere(['Tasks.to_be_billed' => 1]);
    })
    ->matching('Installs', function ($q) {
        return $q->andWhere(['Installs.to_be_billed' => 1]);
    })
    ->where(['Branch.id' => $foo]);
If these assumptions don't match your case, let me know.
If you are trying to get the data using one DB query then you would need to use the UNION operator.
In that case you would need these two queries to have the same columns, so for example:
select
id,
branch_id,
task_name,
NULL as install_details,
'task' as type,
to_be_billed,
created
from
tasks_table
UNION
select
id,
branch_id,
NULL as task_name,
install_details,
'install' as type,
to_be_billed,
created
from
install_table
but that's a rather dirty solution.
If I knew what exactly you are trying to achieve, maybe I could suggest a better answer.
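If the raw UNION feels too dirty, another possibility (a sketch only; it assumes CakePHP table classes named Tasks and Installs and the getTableLocator() helper of recent CakePHP versions, where older ones would use TableRegistry::get()) is to query both tables separately and merge/sort the results in PHP:
// Fetch billable rows from both tables, then combine and sort them by "created".
$tasks = $this->getTableLocator()->get('Tasks')
    ->find()
    ->where(['to_be_billed' => 1])
    ->all()
    ->toList();

$installs = $this->getTableLocator()->get('Installs')
    ->find()
    ->where(['to_be_billed' => 1])
    ->all()
    ->toList();

$combined = array_merge($tasks, $installs);

// Oldest first; swap $a and $b for newest first.
usort($combined, function ($a, $b) {
    return $a->created <=> $b->created;
});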

Handling concurrency in Laravel

I'm creating an API that interacts with a MySQL inventory database. We have 15 users that can reserve products, updating the database in the following way:
Decreasing the on-hand value and increasing the reserved value of a product.
The inventory table looks like this:
id int
sku varchar
on-hand int
reserved int
The problem is: how do I handle the update of the row if 2 users try to update it at the same time?
The first approach I was thinking about was using transactions:
<?php
function reserveStock()
{
    \DB::beginTransaction();
    // SELECT on-hand, reserved FROM inventory
    // Update inventory values
    \DB::commit();

    return response()->json(['success' => 1, 'data' => $data]);
}
The second one was using pessimistic locking:
<?php
function reserveStock()
{
    // SELECT on-hand, reserved FROM inventory with ->sharedLock()
    // Update inventory values
    return response()->json(['success' => 1, 'data' => $data]);
}
The third one was to create an updating field with a default value of zero. When selecting the products to update, I'd check the updating field before doing anything with those rows. The problem I see here is that I'd have to keep looping over the rows with updating != 0 until they become available. More selects and updates result from this approach.
Which course of action is the best? There may be more options than the ones I've written here.
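For reference, a rough sketch of what the second approach could look like with Laravel's query builder (table and column names follow the question, except that the column is written on_hand here; lockForUpdate() is used in place of sharedLock(), which is an assumption, not something the question prescribes):
// Sketch of the pessimistic-locking variant: read and write inside one
// transaction, holding a row lock until commit.
return \DB::transaction(function () use ($sku, $quantity) {
    $row = \DB::table('inventory')
        ->where('sku', $sku)
        ->lockForUpdate()          // or ->sharedLock(), as in the question
        ->first();

    \DB::table('inventory')
        ->where('id', $row->id)
        ->update([
            'on_hand'  => $row->on_hand - $quantity,
            'reserved' => $row->reserved + $quantity,
        ]);

    return response()->json(['success' => 1, 'data' => $row]);
});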
Do not use transactions or pessimistic locking here.
Do not use UPDATE either.
For a race-condition situation like this, you are better off reviewing your database design and getting rid of the UPDATE; use INSERT instead. Make a pivot table to record the connection between users and products, and treat only the first connecting record (the one with the smaller id) as the actual reservation.
If more explanation is needed, here is an example:
Say 2 users are racing to get the product. They both create records in the pivot table, almost simultaneously, but one of them has to be first, right? They both commit a very small transaction to persist their data, and then they both read the pivot table to verify who succeeded. The first record will be the same for both of them, and there is no (at least explicit) blocking involved. So one customer gets his record and is happy; the other one has to retry with another product.
And the problem is solved.
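A minimal sketch of that pivot-table idea (the table and column names product_user, product_id and user_id are assumptions, not from the answer):
// Both competitors simply insert; nobody updates the contested row.
\DB::table('product_user')->insert([
    'product_id' => $productId,
    'user_id'    => $userId,
    'created_at' => \Carbon\Carbon::now(),
]);

// Read back the earliest reservation for this product; that one wins.
$winner = \DB::table('product_user')
    ->where('product_id', $productId)
    ->orderBy('id')
    ->first();

if ((int) $winner->user_id === (int) $userId) {
    // This user got the product.
} else {
    // Someone else was first; try to reserve another product.
}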
I know I am late, but I can help someone who, like me, came here with a similar issue.
You can solve this by using jobs/queues. That way no 'transaction' related to one user will run at 'the same time' as another's. Consider checking this article about concurrency attacks and I am sure you will appreciate it.

Swapping array values with Eloquent model IDs

I need to make an import method that takes a CSV file and imports everything into the database.
I've done the parsing with one of Laravel's CSV addons and it works perfectly, giving me a big array of values set as:
[
    'col1_name' => 'col1 value',
    'col2_name' => 'col2 value',
    'col3_name' => 'col3 value',
    '...' => '...'
]
This is also perfect, since all the column names fit my model, which makes the database inserts a breeze.
However, a lot of the column values are strings that I'd like to move out into separate tables/relations. For example, one column contains the name of the item manufacturer, and I have a manufacturer table set up in my database.
My question is - what's the easy way to go through the imported CSV and swap the strings with the corresponding ID from the relationship table, making it compatible with my database design?
Something that would make the imported line:
[
'manufacturer' => 'Dell',
]
into:
[
'manufacturer' => '32',
]
I know I could just do a foreach loop comparing the needed values with values from the relationship models, but I'm sure there's an easier and cleaner way of doing it.
I don't think there's any "nice" way to do this - you'll need to look up each value for "manufacturer" - the question is how many queries you will run to do so.
A consideration you need to make here is how many rows you will be importing from your CSV file.
You have a couple of options.
1) Querying 1 by 1
I'm assuming you're going to be looping through every line of the CSV file anyway, and then making a new model? In which case, you can add an extra database call in here;
$model->manufacturer_id = Manufacturer::whereName($colXValue)->first()->id;
(You'd obviously need to put in your own checks etc. here to make sure manufacturers exist)
This method is fine for relatively small datasets; however, if you're importing lots and lots of rows, it might end up sluggish, with a lot of arguably unnecessary database calls.
2) Mapping ALL your Manufacturers
Another option would be to create a local map of all your Manufacturers before you loop through your CSV lines;
$mappedManufacturers = Manufacturer::all()->pluck('id', 'name');
This will make $mappedManufacturers an array of manufacturers that has name as a key, id as a value. This way, when you're building your model, you can do;
$model->manufacturer_id = $mappedManufacturers[$colXValue];
This method is also fine, unless you have tens of thousands of Manufacturers!
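Putting option 2 together, a sketch might look like this ($csvRows and the Item model are placeholders for whatever you're actually importing into):
// Build the name => id map once, then resolve every CSV row against it.
$mappedManufacturers = Manufacturer::pluck('id', 'name');

foreach ($csvRows as $row) {
    $model = new Item();
    // ... assign the other CSV columns to $model here ...
    $model->manufacturer_id = $mappedManufacturers[$row['manufacturer']] ?? null;
    $model->save();
}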
3) Where in - then re-looping
Another option would be to build up a list of manufacturer names while looping through your CSV lines, go to the database with one whereIn query, and then re-loop through your models to populate the manufacturer IDs.
So in your initial loop through the CSV, you can temporarily set a property to store the name of the manufacturer while adding the model to another collection;
$models = collect();
$model->..... = ....;
$model->manufacturer = $colXValue;
$models->push($model);
Then you'll end up with a collection of models. You then query the database for ONLY manufacturers which have appeared:
$manufacturers = Manufacturer::whereIn('name', $models->pluck('manufacturer'))->get()->keyBy('name')->toArray();
This will give you array of manufacturers, keyed by their name.
You then loop through your $models collection again, assigning the correct manufacturer id using the map;
$model->manufacturer_id = $manufacturers[$model->manufacturer]['id'];
Hopefully this will give you some ideas of how you can achieve this. I'd say the solution mostly depends on your use case - if this was going to be a heavy-duty task, I'd definitely queue it and be tempted to use option 1! :P
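For completeness, option 3 stitched together as one sketch (Item, $csvRows and the temporary manufacturer property are illustrative names, not anything prescribed above):
$models = collect();

foreach ($csvRows as $row) {
    $model = new Item();
    // ... assign the other CSV columns to $model here ...
    $model->manufacturer = $row['manufacturer'];   // temporary property
    $models->push($model);
}

// One query, limited to manufacturers that actually appear in the file.
$manufacturers = Manufacturer::whereIn('name', $models->pluck('manufacturer'))
    ->get()
    ->keyBy('name');

// Second pass: swap the temporary name for the real foreign key and persist.
foreach ($models as $model) {
    $model->manufacturer_id = $manufacturers[$model->manufacturer]->id;
    unset($model->manufacturer);
    $model->save();
}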

Laravel 5.2 - Updating multiple rows with one query

So I do a lot of calculations and at the end I have rates that need to be saved to existing rows in a table.
The array I have will be similar to the following:
[
    <model_id> => [
        'rate' => <some rate>
    ],
    <model_id_2> => [
        'rate' => <some other rate>
    ],
    .....
]
Now, obviously, I could foreach through this array and do an update for each and every item, but I could end up with 100 update calls. Is there a way (through Laravel's Eloquent or even a raw SQL query) to do all these updates in one call?
You may try Eloquent's update() for updating multiple records. Here is some code which I am using to update multiple records in my table:
\App\Notification::where('to_id', '=', 0)
    ->update(['is_read' => 1]);
If you are worried about the time the request takes, you can handle this by firing an event and then queueing your listener/job, which will save your models, so the work is processed asynchronously. For examples, see the Laravel docs on queues.
As far as I know, you cannot update multiple rows like that in Laravel.
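That said, if a single statement is really wanted, one common workaround (not mentioned in the answers above; the table name rates and column rate are assumptions) is a raw UPDATE built with CASE ... WHEN so each id gets its own value:
// $rates is the [model_id => ['rate' => value]] array from the question.
$cases    = [];
$bindings = [];
$ids      = [];

foreach ($rates as $id => $row) {
    $cases[]    = 'WHEN ? THEN ?';
    $bindings[] = $id;
    $bindings[] = $row['rate'];
    $ids[]      = $id;
}

$idPlaceholders = implode(',', array_fill(0, count($ids), '?'));

$sql = 'UPDATE rates SET rate = CASE id ' . implode(' ', $cases) . ' END'
     . ' WHERE id IN (' . $idPlaceholders . ')';

// Bindings appear in the same order as the placeholders: CASE pairs first, then the IN list.
\DB::update($sql, array_merge($bindings, $ids));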
