Following up on this question: How to chunk results from a custom query in Laravel
I tried:
DB::connection('mgnt')->select($query)->chunk(200, function ($orders) {
    foreach ($orders as $order) {
        // a bunch of code...
    }
});
But I get the following error:
FatalErrorException in MigrationController.php line 98:
Call to a member function chunk() on array
Is chunking possible without having an appropriate Eloquent ORM Model?
I'm trying to chunk because I get a blank page (I can't find any errors in any log) if the query returns too many results.
Right now the limit seems to be about 50,000 results per query. Is that due to some restriction or limitation in Laravel?
Well, since the query will just return an array of objects, you can simply use PHP's array_chunk():
$result = DB::connection('mgnt')->select($query);

foreach (array_chunk($result, 200) as $orders) {
    foreach ($orders as $order) {
        // a bunch of code...
    }
}
Here's what chunk() on an Eloquent model does:
$results = $this->forPage($page = 1, $count)->get();

while (count($results) > 0)
{
    // On each chunk result set, we will pass them to the callback and then let the
    // developer take care of everything within the callback, which allows us to
    // keep the memory low for spinning through large result sets for working.
    call_user_func($callback, $results);

    $page++;

    $results = $this->forPage($page, $count)->get();
}
You could try to do something similar (although I think it should be possible to run your query all at once, I can't help you with that...):
Add a limit to your SQL query: LIMIT 200.
Increase the offset with every query you run: first 0, then 1 * 200, then 2 * 200, and so on.
Do that until the result comes back empty (e.g. with a while loop like the one above); see the sketch below.
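For example, a minimal sketch of that manual approach, assuming $query is a plain SELECT statement without its own LIMIT/OFFSET clause:
$size = 200;
$page = 0;
do {
    // append LIMIT/OFFSET to the raw query and fetch the next slice
    $orders = DB::connection('mgnt')->select($query . ' LIMIT ' . $size . ' OFFSET ' . ($page * $size));
    foreach ($orders as $order) {
        // a bunch of code...
    }
    $page++;
} while (count($orders) > 0);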
Related
I'm trying to make a notification system with Laravel. My idea was to get the data and instantly update the "is_delivered" flag.
This is the code:
Model:
public function scopeGetForView($query)
{
    $query->orderBy('created_at', 'DESC');

    $return = $query->get();

    if ($return->count() > 0) {
        $query->update(array("is_delivered" => 1));
    }

    return $return;
}
Controller:
$notifications = Auth::user()->notifications()->limit(10)->offset(10)->getForView();
Well, this would work fine without the offset, because MySQL only supports LIMIT (without OFFSET) when updating.
But how can I update the whole collection without looping through it? Looping with an update on each iteration would lead to many queries. The other way I can think of would be to collect the IDs in an array and update them with whereIn(). Is this the only way of doing it?
You can run an update on the whole collection:
DB::table('table_name')->whereIn('id', $collection->modelKeys())->update(['is_delivered' => 1]);
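Applied to the scope from the question, that could look roughly like this (just a sketch; the Notification model name is an assumption):
public function scopeGetForView($query)
{
    $notifications = $query->orderBy('created_at', 'DESC')->get();

    if ($notifications->count() > 0) {
        // a single UPDATE for the whole result set, keyed by primary key
        Notification::whereIn('id', $notifications->modelKeys())
            ->update(['is_delivered' => 1]);
    }

    return $notifications;
}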
I have a situation where I have data in two columns of a table which need to be multiplied on a row-by-row basis. Each value from the multiplication then has to be added up to give a net result.
I tried this, but it does not work:
public function getCampaignStats($item)
{
    $query = CampaignStats::where('item', $item);

    foreach ($query as $q) {
        $q->p_c;
        $q->caa;
        dd($q->p_c);
    }

    return $query;
}
I get this exception when I try to do this:
Object of class Illuminate\Database\Eloquent\Builder could not be converted to string
Is there a better way to do this foreach loop in Laravel?
You can use raw queries for such a case and do the math within your query. Have a look at the docs:
https://laravel.com/docs/5.4/queries#raw-expressions
https://laravel.com/docs/5.4/queries#where-exists-clauses
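For the multiply-then-sum case above, a rough sketch with a raw expression (assuming the p_c and caa columns from the question hold the two values to multiply):
public function getCampaignStats($item)
{
    // let the database multiply the two columns per row and sum the products
    return CampaignStats::where('item', $item)
        ->selectRaw('SUM(p_c * caa) AS total')
        ->value('total');
}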
$users = User::all();
// User::all() is based on the Laravel Eloquent model; it's similar to a 'SELECT * FROM users' query
$total_spend_amount = 0;

foreach ($users as $user) {
    $total_spend_amount += $user->spend_amount;
}

echo "Total spend amount: " . $total_spend_amount;
// Note: in this simple example I just want the sum of all the spend_amount values
The above is just a simple example. How do I run some algorithm on the resulting query without looping/foreach in PHP?
User::all()->sum('spend_amount'); // this is the answer below, and it is correct
// but what if I want a 'specific algorithm' to be performed on the resulting query instead of a sum?
But what if I want a 'specific algorithm' to be performed on the resulting query instead of a sum?
I am not sure about the syntax, but try:
User::all()->sum('spend_amount');
or
User::select(DB::raw('SUM(spend_amount) as total'))->get();
Laravel provides sum() by default through the query builder and Eloquent.
$total_spend_amount = DB::table('users')->sum('spend_amount');
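As for performing a 'specific algorithm' instead of a plain sum: once you have the results as a collection, methods such as map() and reduce() let you apply your own logic without writing the foreach yourself. A sketch (the doubling is just a made-up placeholder for whatever per-row calculation you need):
$total = User::all()->reduce(function ($carry, $user) {
    // replace this with your own per-row calculation
    return $carry + ($user->spend_amount * 2);
}, 0);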
I'm having issues with an array returned from DB::select(). I'm heavily using skip and take on collections of Eloquent models in my API. Unfortunately, DB::select returns an array, which obviously doesn't work with skip and take calls.
I've tried
\Illuminate\Support\Collection::make(DB::select(...));
Which doesn't quite work as I expected, as it wraps the entire array in a Collection, not the individual results.
Is it possible to convert the return from a DB::select to a 'proper' Collection that can use skip and take methods?
Update
I've also tried:
$query = \Illuminate\Support\Collection::make(
    DB::table('survey_responses')
        ->join('people', 'people.id', '=', 'survey_responses.recipient_id')
        ->select('survey_responses.id', 'survey_responses.response', 'survey_responses.score', 'people.name', 'people.email')
        ->get()
);
Which still tells me:
FatalErrorException in QueryHelper.php line 36:
Call to a member function skip() on array
Cheers
I would try:
$queryResult = DB::table('...')->get();
$collection = collect($queryResult);
If the query result is an array, the collection is filled with your results. See the official documentation for collections: Laravel 5 Collections
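For example, a quick sketch against a raw select (here $sql is a placeholder for your raw query string); slice() plus take() give the same skip/take behaviour on a plain collection:
$responses = collect(DB::select($sql));

// drop the first 10 rows, then keep the next 10
$pageOfResponses = $responses->slice(10)->take(10)->values();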
For anyone else that's having this sort of problem in Laravel, I figured out a work around with the following solution:
$query = DB::table('survey_responses')->join('people', 'people.id', '=', 'survey_responses.recipient_id')
->select('survey_responses.id', 'survey_responses.response', 'survey_responses.score', 'people.name', 'people.email');
if (isset($tags)) {
    foreach ($tags as $tag) {
        $query->orWhere('survey_responses.response', 'like', '%'.$tag.'%');
    }
}
// We apply the pagination headers on the complete result set - before any limiting
$headers = \HeaderHelper::generatePaginationHeader($page, $query, 'response', $limit, $tags);
// Now limit and create 'pages' based on passed params
$query->offset(
        isset($page) ? ($page - 1) * (isset($limit) ? $limit : env('RESULTS_PER_PAGE', 30)) : 0
    )
    ->take(
        isset($limit) ? $limit : env('RESULTS_PER_PAGE', 30)
    );
Basically, I wasn't aware that you could build the query up incrementally like this, which enabled me to generate the pagination headers from the complete result set before limiting the data returned.
I know this is weird, but I would like to know if it is possible. I am creating my own pagination, and I have been looking into eager loading with pagination, but some say it is still not available.
// this query will get the record count
$qry_counter = LeadsModel::with('emails', 'contacts', 'create_by_name', 'sources');

// this query will get the data
$query = LeadsModel::with('emails', 'contacts', 'create_by_name', 'sources');

if (isset($inputs['page_no']) && isset($inputs['records_per_page'])) {
    // setting pagination variables
    $paginate_start_record = ($inputs['page_no'] - 1) * $inputs['records_per_page'];

    // get the corresponding results based on the page and records per page
    $query->skip($paginate_start_record)->take($inputs['records_per_page']);
}

// loop through the conditions
foreach ($conditions as $condition) {
    $query->orWhere($condition[0], $condition[1], $condition[2]);
    $qry_counter->orWhere($condition[0], $condition[1], $condition[2]);
}

$results_counter = $qry_counter->get()->count();
$results = $query->get();
Any advice on how to optimize this code? Is it possible to get the total record count first and then also return the records based on the skip and take settings? Thanks in advance.
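One way to trim this down (just a sketch, assuming page_no and records_per_page are always set): build the conditions once on a single base query, let the database do the counting with count() instead of loading every row, and clone the builder before applying skip and take.
$base = LeadsModel::with('emails', 'contacts', 'create_by_name', 'sources');

foreach ($conditions as $condition) {
    $base->orWhere($condition[0], $condition[1], $condition[2]);
}

// COUNT(*) in the database instead of hydrating every model
$results_counter = (clone $base)->count();

$results = $base
    ->skip(($inputs['page_no'] - 1) * $inputs['records_per_page'])
    ->take($inputs['records_per_page'])
    ->get();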