Has anyone here tried inserting multiple records at once into related tables?
Here's the scenario. I have a Drug table and a DrugMovement table. I want to insert records into both tables using an xls file as the source. This xls file contains thousands of records, and as soon as the user uploads the file, all of its contents should be inserted into the tables. I'm thinking of a batch upload, but I have no idea what the best approach would be.
Below is the schema
======== Drug Table ==========
class Drug extends Model
{
    public function drugMovements()
    {
        return $this->hasMany('App\DrugMovement');
    }
}
======== Drug Movement ===========
class DrugMovement extends Model
{
    public function drug()
    {
        return $this->belongsTo('App\Drug');
    }
}
Now I want to save records to both of these tables, with thousands of records being inserted at once. How can I achieve this?
If I do something like the following, it will be a waste of resources, since I need to loop over all the records and do thousands of inserts.
foreach ($datas as $data) {
    $drug = Drug::create([
        "pharmacy_id" => 1,
        "name" => $data->drug_name,
        "strength" => $data->strength,
    ]);

    $drug_movements = new DrugMovement([
        "drug_id" => $drug->id,
        "quantity" => $data->quantity,
        "pharmacist_id" => 1,
    ]);

    $drug->drugMovements()->save($drug_movements);
    $drug->save();
}
As you can see, if the data has thousands of records, it will run thousands of inserts. How can I optimize this?
The best approach would be to build two arrays and then insert them into the database in bulk.
$latestDrugID = Drug::max('id') ?? 0; // if the table is empty, start counting from 0 so the first new drug id is 1

$drugs = [];
$drug_movements = [];

foreach ($datas as $data) {
    $drugs[] = [
        "pharmacy_id" => 1,
        "name" => $data->drug_name,
        "strength" => $data->strength,
    ];

    $drug_movements[] = [
        "drug_id" => ++$latestDrugID, // pre-increment so each movement points at the matching new drug id
        "quantity" => $data->quantity,
        "pharmacist_id" => 1,
    ];
}

Drug::insert($drugs);
DrugMovement::insert($drug_movements);
PS: You may want to chunk these arrays into little pieces depending on how large your data is.
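For illustration, a minimal sketch of that chunking, assuming the $drugs and $drug_movements arrays built above and an arbitrary chunk size of 500 (tune it to your data and database limits):

use Illuminate\Support\Facades\DB;

// Insert both tables in batches of 500 rows; the transaction keeps
// drugs and their movements consistent if one of the inserts fails.
DB::transaction(function () use ($drugs, $drug_movements) {
    foreach (array_chunk($drugs, 500) as $chunk) {
        Drug::insert($chunk);
    }

    foreach (array_chunk($drug_movements, 500) as $chunk) {
        DrugMovement::insert($chunk);
    }
});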
Related
I'm using Maatwebsite Laravel Excel to import some data from Excel into the database, but I want to add a custom ID with an incremental value for each row of data.
For now, I'm able to import the data together with some values from the input form.
DataImport file
class DataImport implements ToModel, WithStartRow
{
    public function model(array $row)
    {
        return new Tempdat([
            'employee_id' => '???', // combination of client_code + an incremental number
            'name' => $row[1],
            'gender' => $row[2],
            'bod' => $this->transformDate($row[3]),
            'engagement_code' => request('engagement_code'), // from the input form
            'client_code' => request('client_code'), // from the input form
        ]);
    }

    public function transformDate($value, $format = 'Y-m-d')
    {
        try {
            return \Carbon\Carbon::instance(\PhpOffice\PhpSpreadsheet\Shared\Date::excelToDateTimeObject($value));
        } catch (\ErrorException $e) {
            return \Carbon\Carbon::createFromFormat($format, $value);
        }
    }

    public function startRow(): int
    {
        return 2;
    }
}
DataController file
public function ImportExcel(Request $request)
{
    $this->validate($request, [
        'file' => 'required|mimes:xls,xlsx',
        'engagement_code' => 'required',
    ]);

    $file = $request->file('file');
    $clientCode = request('client_code');
    $engagementCode = request('engagement_code');
    $todayDate = date('dFY');

    $file_name = $engagementCode.'_'.$todayDate.$file->getClientOriginalName();
    $file->move('tempdat', $file_name);

    Excel::import(new DataImport, public_path('/tempdat/'.$file_name));

    return redirect()->route('dashboard.tempdat.index');
}
What I'd like to do is add an "employee_code", which is a combination of "client_code" plus an incremental number for every row. For example, if client_code is ABCD and 3 rows of data are imported, then the employee_code values will be:
ABCD0001
ABCD0002
ABCD0003
...
I've already searched for ways to count rows, but haven't found anything yet.
Counting rows will bring you problems whenever you remove a single record from the table: either you won't be able to insert new rows if your employee_id field has a UNIQUE key, or you will start inserting duplicate IDs.
Generating the ID in PHP isn't my recommendation either, since you could face integrity problems if two records are stored simultaneously.
I would use the model's default auto-incrementing id field to make sure there are no problems with ID assignment, and immediately after saving the record I would update the employee_id field, which can also be indexed (e.g. with a unique key) so it is accessed efficiently.
For example:
class MyModel extends Model
{
    public function save(array $options = [])
    {
        $saved = parent::save($options);

        // Fill employee_id only once, right after the initial save,
        // so the inner update() call does not recurse forever.
        if ($saved && empty($this->employee_id)) {
            $this->update(['employee_id' => 'ABCD'.$this->id]);
        }

        return $saved;
    }
}
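If you need the zero-padded format from the question (ABCD0001, ABCD0002, ...), the update line could instead pad the auto-increment id. This is only a sketch; note that the id is global to the table, so the numbers will not restart per client_code:

// Hypothetical variant: prefix taken from the request, padded to 4 digits.
$prefix = request('client_code'); // e.g. "ABCD"
$this->update(['employee_id' => $prefix.str_pad($this->id, 4, '0', STR_PAD_LEFT)]);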
I haven't yet worked out how the Excel library you're using handles the model, but I suppose the model class can be specified somewhere along the process.
I was able to generate an incremental ID in Laravel Excel using this package:
https://github.com/haruncpi/laravel-id-generator
$config = [
    'table' => 'table_name',
    'length' => 7,
    'field' => 'employee_id',
    'prefix' => request('client_code'),
    'reset_on_prefix_change' => true,
];

$employee_id = IdGenerator::generate($config);
So every time the import is executed, an employee_id will be generated automatically with the client_code prefix (e.g. ABCD001, ...).
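For completeness, a rough sketch of how that could plug into the DataImport class from the question; the namespace import and the tempdats table name are assumptions on my side, so check them against the package README and your migration:

use Haruncpi\LaravelIdGenerator\IdGenerator; // assumed namespace, verify against the package

public function model(array $row)
{
    // Generate one prefixed id per imported row (placement here is an assumption).
    $employee_id = IdGenerator::generate([
        'table' => 'tempdats', // assumed table name for the Tempdat model
        'length' => 7,
        'field' => 'employee_id',
        'prefix' => request('client_code'),
        'reset_on_prefix_change' => true,
    ]);

    return new Tempdat([
        'employee_id' => $employee_id,
        'name' => $row[1],
        'gender' => $row[2],
        // ...remaining fields as in the question
    ]);
}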
The file I am importing has thousands of records, which is slowing everything down. I want to read only the columns that I need before inserting them into the database. When the file is processed, it should first pick out those columns rather than the whole file, and fetch the rows for those columns.
Excel::load($path, function ($reader) {
    // Getting the headers
    $headers = $reader->first()->keys()->toArray();

    // This is the array of required columns
    $headings = array('registrant_name', 'registrant_address', 'registrant_phone', 'registrant_zip',
        'registrant_email', 'registrant_country', 'registrant_state', 'registrant_city');
});
Data insertion after the file is read:
if ($data->count() > 0)
{
    foreach ($data->toArray() as $value)
    {
        $insert[] = array(
            'registrant_name' => $value['registrant_name'],
            'registrant_address' => $value['registrant_address'],
            'registrant_phone' => $value['registrant_phone'],
            'registrant_zip' => $value['registrant_zip'],
            'registrant_email' => $value['registrant_email'],
            'registrant_country' => $value['registrant_country'],
            'registrant_state' => $value['registrant_state'],
            'registrant_city' => $value['registrant_city']
        );
    }
}

if (!empty($insert))
{
    DB::table('customers')->insert($insert);
}
On collections, you can use the only method (see the collection documentation), i.e.:
$headers = $reader->first()->only(['registrant_name','registrant_address','registrant_phone','registrant_zip','registrant_email','registrant_country','registrant_state','registrant_city']);
You might also use chunk to insert the data in smaller batches rather than one big insert.
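Putting the two ideas together, a rough sketch assuming the Excel::load / row collection API shown in the question (the 500-row batch size is arbitrary):

Excel::load($path, function ($reader) use ($headings) {
    $insert = array();

    // Keep only the required columns from every row.
    foreach ($reader->get() as $row) {
        $insert[] = $row->only($headings)->toArray();
    }

    // Insert in batches instead of one huge query.
    foreach (array_chunk($insert, 500) as $chunk) {
        DB::table('customers')->insert($chunk);
    }
});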
Could you update the post with the App\Import class so we can help you further?
I am developing a simple task management web application using Laravel. The requirement states that we need to save general information, such as TaskDate and AssignedTo, in a taskinfo table. The list of tasks for one specific person is saved in another table called tasks. The tasks table has TaskDetailID (PK), TaskID (FK from the table above), TaskDescription, HoursRequired, etc.
The form allows users to add as many rows as they want, which means a person could be assigned an unlimited number of tasks.
My problem now is saving the task data in the tasks table. I've successfully saved the data for the taskinfo table, and I can even save data for the tasks table, but only when it's a single column.
Here is my store function in TaskInfoController:
public function store(Request $request)
{
    $validator = Validator::make(
        $request->all(),
        [
            'TaskDate.*' => 'required',
            'AssignedTo.*' => 'required',
        ],
        [
            'TaskDate.*.required' => 'Task Date is required.',
            'AssignedTo.*.required' => 'Please assign the task to someone.',
        ]
    );

    if ($validator->fails())
    {
        // redirect errors to mars
    }

    $taskinfo = new TaskInfo();
    $taskinfo->TaskDate = Carbon::createFromFormat("m/d/Y", $request->input('TaskDate'));
    $taskinfo->TaskAssignedTo = $request->input('TaskAssignedTo');
    // Some more columns here
    $taskinfo->save();

    // Now for the tasks table
    $tasksbulkinsert = array();

    foreach ($request->input('TaskDescription') as $taskdescription)
    {
        $tasksbulkinsert[] = array('TaskID' => Uuid::uuid4(), 'TaskDescription' => $taskdescription);
    }

    Task::insert($tasksbulkinsert);

    return redirect()->action('TaskInfoController@index')->with('flash_message', 'Successfully Saved!');
}
The above code actually works perfectly, but I don't know how I can insert HoursRequired, or any additional value, alongside the corresponding TaskDescription in the tasks table.
I tried a few approaches:
Having an incremental counter such as $i++ to know which row (so to speak) of TaskDescription the query is currently processing, and having another foreach loop with its own counter for the HoursRequired input, taking the value where the TaskDescription counter equals the HoursRequired counter. But it didn't work, and even if it did, I don't think having multiple foreach loops for every column is good for performance.
Having different arrays, each with its own foreach loop, to get the values from the inputs and then somehow merging the arrays.
Here is my HTML form
<input class="form-control" name="TaskDescription[]" type="text">
<input class="form-control" name="HoursRequired[]" type="text">
Main Question.
How can I save the TaskDescription and the HoursRequired into the database with one query?
Not so important question
The array validation at the top works, but is there a way to have an error message that states the row where the error occurred?
For example, Date is required for row number n.
You can simply:
foreach ($request->input('TaskDescription') as $i => $taskdescription)
{
    $tasksbulkinsert[] = array(
        'TaskID' => Uuid::uuid4(),
        'TaskDescription' => $taskdescription,
        'HoursRequired' => $request->input('HoursRequired')[$i]
    );
}
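An equivalent way to pair the two inputs without indexing by hand, sketched under the assumption that TaskDescription and HoursRequired always arrive with the same length and order:

// array_map with two input arrays walks them in lockstep.
$tasksbulkinsert = array_map(function ($description, $hours) {
    return array(
        'TaskID' => Uuid::uuid4(),
        'TaskDescription' => $description,
        'HoursRequired' => $hours,
    );
}, $request->input('TaskDescription'), $request->input('HoursRequired'));

Task::insert($tasksbulkinsert);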
For your second question:
$messages = [];

foreach ($request->input('TaskDescription') as $key => $val)
{
    $messages['TaskDescription.'.$key.'.required'] = "TD is required for row $key";
    $messages['some_field.'.$key.'.some_rule'] = "some_field custom message on row $key";
}

$validator = Validator::make(
    $request->all(),
    [
        'TaskDate.*' => 'required',
        'AssignedTo.*' => 'required',
    ],
    $messages
);
I have a case where a user uploads a file containing a large number of rows (let's say 1000). Each row contains information about a user.
This data is turned into a PHP array of records, e.g.:
$usersArray = [
    ['first_name' => 'Peter', 'last_name' => 'Pan', 'email' => 'peter@example.org'],
    ['first_name' => 'David', 'last_name' => 'Hasslehof', 'email' => 'david@example.org'],
    // ...
];
Now for each row I would have to do something like this:
foreach ($usersArray as $user) {
    $createdUser = User::create(array('email' => $user['email'], 'pin' => $user['id_code']));
    $profile = Userprofile::create(array('first_name' => $user['first_name'], 'last_name' => $user['last_name']));

    $createdUser->profile()->associate($profile);
    $createdUser->save();

    $usergroup->addMember($createdUser);
}
This would mean that with 1000 rows I end up with at least 4000 queries, which is obviously too much. Is there an Eloquent way to do this more elegantly?
I tried using the query builder to first insert all the profiles and then the users, but this did not work, because I don't know which profiles to retrieve (first_name and last_name are not unique fields) and therefore cannot link the profile IDs to the users I would like to create.
I think you can use this Laravel library for the import.
However, to import multiple relationships, I think there is no other way than to use associate() or sync().
$model_object = new YourModelHere();

// your model id here
$model_object->id = $sheet->id;

// your model fields
$model_object->field_1 = (int) $sheet->field_1;
$model_object->field_2 = (int) $sheet->field_2;

// ids of the related models
$id_of_related_1 = (int) $sheet->id_of_related_1;
$id_of_related_2 = (int) $sheet->id_of_related_2;

// in your Model.php the relation (i.e. hasMany or belongsTo) must be declared first
$model_object->relationToModelOne()->associate(RelatedModelOne::find($id_of_related_1));
$model_object->relationToModelTwo()->associate(RelatedModelTwo::find($id_of_related_2));

$model_object->save();
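If the per-row save() calls are the main bottleneck, one incremental improvement (not a replacement for the relationship handling above, just a sketch) is to wrap the import loop from the question in a single transaction so everything commits once:

use Illuminate\Support\Facades\DB;

// Same per-row logic as in the question, but committed in one transaction.
DB::transaction(function () use ($usersArray, $usergroup) {
    foreach ($usersArray as $user) {
        $createdUser = User::create(['email' => $user['email'], 'pin' => $user['id_code']]);
        $profile = Userprofile::create(['first_name' => $user['first_name'], 'last_name' => $user['last_name']]);

        $createdUser->profile()->associate($profile);
        $createdUser->save();

        $usergroup->addMember($createdUser);
    }
});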
I have a PHP application which requires large 2D arrays of hard-coded data. I currently just have this defined in a script which is included at the start of every script execution.
I would like to know if anyone has a better idea for how to do this.
My script looks something like the following, except that I have many more rows and each row has many more fields.
function getData() {
    return array(
        1 => array('name' => 'a something', 'price' => 123, 'field1' => 1,  'field2' => 3, 'field3' => 2),
        2 => array('name' => 'b something', 'price' => 123, 'field1' => 3,  'field2' => 3, 'field3' => 2),
        3 => array('name' => 'c something', 'price' => 234, 'field1' => 2,  'field2' => 3, 'field3' => 2),
        4 => array('name' => 'd something', 'price' => 345, 'field1' => 8,  'field2' => 3, 'field3' => 2),
        5 => array('name' => 'e something', 'price' => 655, 'field1' => 12, 'field2' => 3, 'field3' => 2),
        6 => array('name' => 'f something', 'price' => 124, 'field1' => 11, 'field2' => 3, 'field3' => 2),
    );
}
Each row has the same fields, so it is very much like a DB table result set. I suppose I could put it in a DB table, but I find this script easier to edit, and I would think it's much faster to run than querying a DB.
The problem with my current solution is that it can be hard to read because there are so many fields. What I would like is a system that is easy to read and edit, but it must also be very fast.
Would a DB table be better? Or perhaps reading in a CSV file from a spreadsheet?
As a general idea, write your data in any way that's convenient, possibly CSV or something similar. Create a function that can process that data into the format you need. Cache the processed data so this doesn't need to be done every time.
function getData() {
    $cache = 'data.dat';

    if (file_exists($cache)) {
        return unserialize(file_get_contents($cache));
    }

    $data = <<<DATA
a something, 123, 1, 3, 2
b something, 123, 3, 3, 2
...
DATA;

    $data = array_map(function ($row) {
        $row = array_map('trim', explode(',', $row));
        return array_combine(array('name', 'price', 'field1', 'field2', 'field3'), $row);
    }, explode("\n", $data));

    file_put_contents($cache, serialize($data));

    return $data;
}
In the above example I'm just using a HEREDOC string to store the data. You may want to put that into an external CSV file, or an XML file that you can generate automatically from somewhere or whatever fits your needs.
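If you go the external-CSV route, the same caching idea applies; a minimal sketch, with the file names and column list as placeholders:

function getData() {
    $cache = 'data.dat';
    $csv   = 'data.csv'; // hypothetical source file, one record per line

    // Rebuild the cache whenever the CSV is newer than the cached copy.
    if (file_exists($cache) && filemtime($cache) >= filemtime($csv)) {
        return unserialize(file_get_contents($cache));
    }

    $keys = array('name', 'price', 'field1', 'field2', 'field3');
    $data = array();

    $handle = fopen($csv, 'r');
    while (($row = fgetcsv($handle)) !== false) {
        $data[] = array_combine($keys, array_map('trim', $row));
    }
    fclose($handle);

    file_put_contents($cache, serialize($data));

    return $data;
}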
If it's very large, storing it in a database may be a better idea. Something lightweight like SQLite will do.