How to optimize large data inserts using laravel eloquent - php

Good day! Basically I want to insert some related data all at once using Eloquent. My current code is:
Study::chunk(50, function ($studies) use ($request, $questionData, $answerData) {
    foreach ($studies as $study) {
        $evaluationInsert = Evaluation::create([
            'study_id'      => $study->id,
            'questionnaire' => $request->questionnaire,
            'description'   => $request->description
        ]);
        $evaluationQuestions = $evaluationInsert
            ->question()
            ->createMany($questionData);
        foreach ($evaluationQuestions as $question) {
            $question->answer()->createMany($answerData);
        }
    }
});
There are currently around 150 Study records being chunked over. $questionData is a static array of arrays with 38 elements (one per question), and $answerData is an array of arrays holding the 4 answer options for each question. The code works, but it takes too long to execute because of the big nested loops, and increasing PHP's timeout doesn't seem like an ideal way to solve this. What is an elegant way to handle this kind of case?
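One possible direction (a hedged sketch, not from the original thread): build plain arrays and issue one bulk insert() per table per chunk, instead of one INSERT per model via create()/createMany(). The table names (evaluations, questions, answers) and the MySQL assumption (its default consecutive auto-increment mode lets the batch ids be computed from the first one) are mine; note that raw inserts also skip Eloquent timestamps and model events.
DB::transaction(function () use ($request, $questionData, $answerData) {
    Study::chunk(50, function ($studies) use ($request, $questionData, $answerData) {
        // One INSERT for all evaluations in this chunk
        $evaluations = [];
        foreach ($studies as $study) {
            $evaluations[] = [
                'study_id'      => $study->id,
                'questionnaire' => $request->questionnaire,
                'description'   => $request->description,
            ];
        }
        DB::table('evaluations')->insert($evaluations);
        $firstEvaluationId = (int) DB::getPdo()->lastInsertId();

        // One INSERT for all questions of all evaluations in this chunk;
        // ids are computed assuming a consecutive auto-increment run
        $questions = [];
        foreach (array_keys($evaluations) as $i) {
            foreach ($questionData as $question) {
                $question['evaluation_id'] = $firstEvaluationId + $i;
                $questions[] = $question;
            }
        }
        DB::table('questions')->insert($questions);
        $firstQuestionId = (int) DB::getPdo()->lastInsertId();

        // One INSERT for all answer options of all questions
        $answers = [];
        foreach (array_keys($questions) as $i) {
            foreach ($answerData as $answer) {
                $answer['question_id'] = $firstQuestionId + $i;
                $answers[] = $answer;
            }
        }
        DB::table('answers')->insert($answers);
    });
});
This reduces each chunk to three INSERT statements instead of thousands of single-row ones.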

Related

Predicting future IDs used before saving to the DB

I am saving a complex dataset in Laravel 4.2 and I am looking for ways to improve this.
A $bit has several $bobs. A single $bob can be one of several different classes. I am trying to duplicate a single $bit and all its associated $bobs, and save all of this to the DB with as few calls as possible.
$newBit = $this->replicate();
$newBit->save();

$newBobs = [];
$bobsPivotData = [];
foreach ($this->bobs as $index => $bob) {
    $newBob = $bob->replicate();
    $newBobs[] = $newBob->toArray();
    $bobsPivotData[] = [
        'bit_id'   => $newBit->id,
        'bob_type' => get_class($newBob),
        'bob_id'   => $newBob->id,
        'order'    => $index
    ];
}
// I now want to save all the $bobs kept in $newBobs[]
DB::table('bobs')->insert($newBobs);
// Saving all the pivot data in one go
DB::table('bobs_pivot')->insert($bobsPivotData);
My problem is here: I can't access $newBob->id before I have inserted the $newBob after the loop.
I am looking for how best to reduce saves to the DB. My best guess is that if I can predict the ids that are going to be used, I can do all of this in one loop. Is there a way I can predict these ids?
Or is there a better approach?
You could insert the bobs first and then use the generated ids to insert the pivot records. This isn't a great solution in a multi-user environment, as new bobs could be inserted in the interim and mess things up, but it could suffice for your application.
$newBit = $this->replicate();
$newBit->save();

$newBobs = [];
foreach ($this->bobs as $bob) {
    $newBob = $bob->replicate();
    $newBobs[] = $newBob->toArray();
}
// With MySQL, insertGetId() on a batch returns the id of the
// first inserted row
$insertId = DB::table('bobs')->insertGetId($newBobs);

$bobsPivotData = [];
$insertedBobs = DB::table('bobs')->where('id', '>=', $insertId)->orderBy('id')->get();
foreach ($insertedBobs as $index => $newBob) {
    $bobsPivotData[] = [
        'bit_id'   => $newBit->id,
        // rows fetched via the query builder are stdClass objects,
        // so take the class name from the original model instead
        'bob_type' => get_class($this->bobs[$index]),
        'bob_id'   => $newBob->id,
        'order'    => $index
    ];
}
// Saving all the pivot data in one go
DB::table('bobs_pivot')->insert($bobsPivotData);
I have not tested this, so expect some pseudo-code.
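A follow-up thought on the race the answer mentions (an untested variation; it assumes MySQL, whose default consecutive innodb_autoinc_lock_mode gives a single multi-row INSERT an unbroken run of ids): compute the ids from the first one instead of reading the rows back, so other users' rows can never slip into the range.
DB::table('bobs')->insert($newBobs);
$firstId = (int) DB::getPdo()->lastInsertId(); // id of the first row in the batch

$bobsPivotData = [];
foreach ($newBobs as $index => $newBobRow) {
    $bobsPivotData[] = [
        'bit_id'   => $newBit->id,
        'bob_type' => get_class($this->bobs[$index]), // class of the source model
        'bob_id'   => $firstId + $index,              // computed, not queried
        'order'    => $index,
    ];
}
DB::table('bobs_pivot')->insert($bobsPivotData);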

Constructing Arrays Programmatically

I'm writing a PHP web app that uses XML documents from an API. I have to construct many products from the returned XML data.
What I'm doing right now is using xpath and a loop to pull all of the objects into an array of sub-arrays. Then I loop through the array of arrays and pull specific arrays into their own named array. Is there a way to do this with a function or class constructor?
My current code looks something like this:
if ($products_results_list[$key]["sChgDesc"] == "Small Box") {
    $small_box = array(
        "ChargeDescID" => $products_results_list[$key]["ChargeDescID"],
        "sChgDesc"     => $products_results_list[$key]["sChgDesc"],
        "dcPrice"      => $products_results_list[$key]["dcPrice"],
        "dcTax1Rate"   => ".0" . $products_results_list[$key]["dcTax1Rate"],
        "dcInStock"    => $products_results_list[$key]["dcInStock"],
    );
}
After writing the above if statement about 8 times, with many more times needed, I'm thinking there must be a better practice than just writing everything out.
I want to do something like:
function product_constructor($argument, $product_name) {
    if ($argument) {
        $product_name = array(
            "ChargeDescID" => $products_results_list[$key]["ChargeDescID"],
            "sChgDesc"     => $products_results_list[$key]["sChgDesc"],
            "dcPrice"      => $products_results_list[$key]["dcPrice"],
            "dcTax1Rate"   => ".0" . $products_results_list[$key]["dcTax1Rate"],
            "dcInStock"    => $products_results_list[$key]["dcInStock"],
        );
    }
}
and then just call the function or constructor as needed like:
product_constructor( '$products_results_list[$key]["sChgDesc"] == "Small Box"', '$small_box');
I actually tried the above code, but it was throwing errors. Despite working with PHP daily, I still don't know much about constructors, and figured this might be the perfect opportunity to learn how to do this correctly.
I'm not sure if classes are the right choice for this since I'm going to need to pull the products themselves into product package classes later.
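Not from the original post, but as a sketch of the function-based route: pass the row and the label to match as plain values (not strings of PHP code), and return the array instead of assigning it to an outer variable. The helper name buildProduct is hypothetical.
function buildProduct(array $row, $label) {
    if ($row["sChgDesc"] != $label) {
        return null; // this row is not the product we asked for
    }
    return array(
        "ChargeDescID" => $row["ChargeDescID"],
        "sChgDesc"     => $row["sChgDesc"],
        "dcPrice"      => $row["dcPrice"],
        "dcTax1Rate"   => ".0" . $row["dcTax1Rate"],
        "dcInStock"    => $row["dcInStock"],
    );
}
// usage:
$small_box = buildProduct($products_results_list[$key], "Small Box");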

MongoDB PHP count $within method extremely slow for large datasets

Hi guys, I have the method below for counting points within polygons in MongoDB:
public function countWithinPolygon($polygon, $tags = array())
{
    // var_dump($polygon);
    // var_dump($polygon->getPoints());exit();
    $query = array(
        'point' => array(
            '$within' => array(
                '$polygon' => $polygon->getPoints()->first()->toArray(true)
            )
        )
    );
    if ($tags) {
        $query['tags'] = array(
            '$all' => $tags
        );
    }
    return parent::count($query);
}
For some queries with small amounts of data it is just fine. On larger datasets involving 4000+ calls, though, the execution time is truly dreadful and can take hours; on average it takes three hours to execute. Any ideas or hints on a better way to write this to save time and optimize the query?
The issue was fixed by ensuring an index, like so: db.polygon.ensureIndex({'GeoJSON.geometry': '2dsphere'});
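For reference, the same index can be created from PHP; a small sketch assuming the legacy Mongo driver (which the $within operator above suggests) and a hypothetical database name:
// Create the 2dsphere index once, mirroring the shell command above
$mongo = new MongoClient();
$collection = $mongo->selectDB('yourdb')->selectCollection('polygon');
$collection->ensureIndex(array('GeoJSON.geometry' => '2dsphere'));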

PHP | Updating array items through function while iterating - Big performance issue

I am iterating over an array with more than 3000 items in it.
The array looks something like this:
[
    [
        'id' => 1,
        'type' => 'income',
        'amount' => 10
    ],
    [
        'id' => 2,
        'type' => 'expense',
        'amount' => 20
    ],
    .......
]
While iterating I call functions that manipulate another array in the same class, something like this:
$this->data->revenue->each(function ($row) use ($user) {
    if ($row->isIncome()) {
        $this->addRevenueRowToColumn($row, 'income');
        $this->addRevenueRowToColumn($row, 'total');
    }
    if ($row->isExpense()) {
        $this->addRevenueRowToColumn($row, 'expense');
        $this->subtractRevenueRowToColumn($row, 'total');
    }
});
This is what the functions do:
protected function addRevenueRowToColumn(&$row, $columnName)
{
    $this->report['byMonth'][$row->getMonthTableKey()]['byDepartment'][$row->department_id][$columnName] += $row->amount;
    $this->report['byMonth'][$row->getMonthTableKey()]['total'][$columnName] += $row->amount;
    $this->report['totals']['byDepartment'][$row->department_id][$columnName] += $row->amount;
    $this->report['totals']['total'][$columnName] += $row->amount;
}

protected function subtractRevenueRowToColumn(&$row, $columnName)
{
    $this->report['byMonth'][$row->getMonthTableKey()]['byDepartment'][$row->department_id][$columnName] -= $row->amount;
    $this->report['byMonth'][$row->getMonthTableKey()]['total'][$columnName] -= $row->amount;
    $this->report['totals']['byDepartment'][$row->department_id][$columnName] -= $row->amount;
    $this->report['totals']['total'][$columnName] -= $row->amount;
}
It takes about 11 seconds to process the data and display it.
What should I do?
Thanks in advance!
Speed up the PHP-only solution
Apart from the suggested use of a database to organize your stuff, if you still want it the hard way ;) you can avoid iterating over the whole array yourself (let PHP do it internally) by using one of the functions below:
array_map — applies the callback to the elements of the given arrays
array_walk — applies a user function to every member of an array
I see in your code that you have the 'use' clause, so I presume you are on PHP >= 5.3. In that case, you can do the following:
$yourdata = array_map(function ($row) use ($user) {
    /*$user->doStuff();*/
    return $row;
}, $yourdata);
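For completeness (not in the original answer), the array_walk variant takes each row by reference, so it can modify $yourdata in place without returning anything:
array_walk($yourdata, function (&$row) use ($user) {
    /*$user->doStuff();*/
    // changes to $row stick, since it is passed by reference
});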
Furthermore, a lot of the overhead is in the display-rendering part. If you have a lot of things to display, for example using simple echo calls, it's faster to do:
$result = "";
$result .= $something;
$result .= $somethingelse;
echo $result;
than
echo $something;
echo $somethingelse;
Enhance the solution to use a database
But besides that, it will surely be beneficial to use a database. The most obvious approach is to store your data in some db tables and use an SQL-ish solution to query it. It will speed your script up for sure.
Speed up the DB+PHP solution
Next, you can get a major performance boost by doing most of your calculations (business logic) inside the database engine in the form of stored procedures.
For example, if you went with MySQL, you could create a cursor and iterate over your table rows in a loop. For every row you do some work, while of course having direct access to all the tables/columns of your schema (and possibly other stored procedures/functions). If you are doing a lot of math-ish calculations it's a great solution, but of course IMHO it's usually less convenient (more sophisticated) to write your logic in SQL rather than PHP ;)
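As an illustration of pushing the work into the database (a plain GROUP BY here rather than a stored procedure; the table and column names are hypothetical, and DB::select assumes Laravel, which the collection syntax in the question suggests):
// Hypothetical schema: revenue(id, department_id, month_key, type, amount);
// one aggregate query replaces the 3000-iteration PHP loop
$totals = DB::select("
    SELECT month_key,
           department_id,
           SUM(CASE WHEN type = 'income'  THEN amount ELSE 0 END) AS income,
           SUM(CASE WHEN type = 'expense' THEN amount ELSE 0 END) AS expense,
           SUM(CASE WHEN type = 'income'  THEN amount ELSE -amount END) AS total
    FROM revenue
    GROUP BY month_key, department_id
");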

How to merge multiple arrays that are depending on a function in PHP?

I am seeking someone's knowledge out here.
Right now, I need to merge several arrays into a bigger one, but all of those arrays depend on a function.
The function returns a numeric array containing different quantities of numbers between 1 and 7:
function Possible($i, $j, $grid)
$possible = Possible($i, $j, $grid)
I'm working with a grid, and the function returns a different array for every cell of the grid. What I would like to do is merge those 7 arrays into another one. Some numbers may be present more than once in this big array, but I want it this way.
I tried using for loops, while loops and some other techniques, but nothing worked. In this case, it is not possible to manually define every array, since they change depending on what is contained in the grid and there are too many. It has to be done automatically, and this is where I get stuck.
for ($jj = 0; $jj < 7; $jj++) {
    $possRow = array_merge( ###what do I add here or what do I change in this code to make everything work###
Thank you if someone can help me out!
Etpi
Hope this helps:
$biggerOneArray = array();
for ($k = 0; $k < 7; $k++) {
    $biggerOneArray[] = Possible($i, $j, $grid);
}
Then you can check your bigger array; it should contain the arrays from all seven iterations of the loop.
var_dump($biggerOneArray);
The output should be this:
array(
(int) 0 => array(
'key' => 'value',
'key2' => 'value2'
),
(int) 1 => array(
'key3' => 'value3',
'key4' => 'value4'
)
)
etc...
I'm sorry, but your description isn't very clear. Just to get you started, though, you might look at this solution.
function Possible($i, $j, $grid) {
    // some code ... e.g. $array[] = "some data";
    return $array;
}
By creating a small array for each grid cell and returning it using return $array, you get a few little arrays which you can in turn merge into one larger array inside a for loop. However, I believe the variable $jj must have some meaning in the function itself as well.
for ($jj = 0; $jj < 7; $jj++) {
    $merged_array[$jj] = Possible($i, $j, $grid);
}
Maybe if you describe your problem a little more and post an example of the arrays you're working with, I can give you a better answer.
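To get the single flat array the question describes (duplicates kept), merging inside the loop also works; a sketch assuming the second index is what varies across the seven calls, as the asker's loop suggests:
$possRow = array();
for ($jj = 0; $jj < 7; $jj++) {
    // append this cell's numbers onto the running array, keeping duplicates
    $possRow = array_merge($possRow, Possible($i, $jj, $grid));
}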
