Laravel controller function times out, 50000 users - php

In my controller I have a function that checks each user's subscription status and updates it in my database. The problem is that with 50,000 users it takes too long to finish and times out.
public function UpdateStatus()
{
    $users = User::all();
    foreach ($users as $user) {
        $user->createOrGetStripeCustomer();
        $stripeSubs = $user->asStripeCustomer()->subscriptions->all();
        $dbSubs = DB::table('subscriptions')->select('stripe_id')->where('user_id', $user->id)->get();
        foreach ($dbSubs as $check) {
            $canDelete = 0;
            foreach ($stripeSubs as $value) {
                if ($check->stripe_id == $value->id) {
                    $canDelete++;
                }
            }
            if ($canDelete == 0) {
                DB::table('subscriptions')
                    ->where('user_id', $user->id)
                    ->where('stripe_id', $check->stripe_id)
                    ->update(['stripe_status' => 'ended']);
            }
        }
    }
    return redirect('/dashboard');
}
I'm sure I shouldn't even be processing that many at a time, but I got stuck here and am not sure how exactly to approach this. My goal is to make this work and to optimize it.

You can use a queued job for this:
php artisan make:job checkSubscription
In your .env file, set QUEUE_CONNECTION=database,
then run php artisan queue:table and php artisan migrate.
In your checkSubscription job, put this in the handle method:
public function handle()
{
    $users = User::all();
    foreach ($users as $user) {
        $user->createOrGetStripeCustomer();
        $stripeSubs = $user->asStripeCustomer()->subscriptions->all();
        $dbSubs = DB::table('subscriptions')->select('stripe_id')->where('user_id', $user->id)->get();
        foreach ($dbSubs as $check) {
            $canDelete = 0;
            foreach ($stripeSubs as $value) {
                if ($check->stripe_id == $value->id) {
                    $canDelete++;
                }
            }
            if ($canDelete == 0) {
                DB::table('subscriptions')
                    ->where('user_id', $user->id)
                    ->where('stripe_id', $check->stripe_id)
                    ->update(['stripe_status' => 'ended']);
            }
        }
    }
}
In your controller:
public function UpdateStatus()
{
    checkSubscription::dispatch(); // you can use the chunk method and pass your $users as params
    return redirect('/dashboard');
}
Then run php artisan queue:work.
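The chunk idea mentioned in the comment above could look roughly like this; it is only a sketch and assumes checkSubscription's constructor is changed to accept a list of user IDs and its handle() only processes those users:
// Dispatch one job per chunk of user IDs instead of one huge job,
// so each job stays short and can be retried independently.
User::select('id')->chunkById(1000, function ($users) {
    checkSubscription::dispatch($users->pluck('id')->all());
});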

Firstly, you should do this in a Queued Job: https://laravel.com/docs/8.x/queues
That way the script won't time out.
However, loading 50k Eloquent objects into memory isn't optimal. To resolve this you can use Lazy Collections: https://laravel.com/docs/8.x/collections#lazy-collections
$users = User::cursor();

foreach ($users as $user) {
    // do stuff
}
This will only execute one query, but will only ever keep one User object in memory.
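Putting the two answers together, the job's handle() could stream users with cursor() and replace the nested loops with a single whereNotIn() update per user. A rough sketch, keeping the same Cashier calls as in the question:
public function handle()
{
    foreach (User::cursor() as $user) {
        $user->createOrGetStripeCustomer();

        // Collect the IDs of the subscriptions Stripe still reports for this customer
        $stripeIds = [];
        foreach ($user->asStripeCustomer()->subscriptions->all() as $sub) {
            $stripeIds[] = $sub->id;
        }

        // Any local subscription row Stripe no longer returns is marked as ended
        DB::table('subscriptions')
            ->where('user_id', $user->id)
            ->whereNotIn('stripe_id', $stripeIds)
            ->update(['stripe_status' => 'ended']);
    }
}
This keeps only one User in memory at a time and does one update query per user instead of one per stale subscription.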

Laravel get memory_limit error, when create folder from database

I want to create folders based on user_id inside /laravel/public/file/FolderUserXXX.
If I have 1000 user records in the database, I want to create 1000 folders inside that path (I need to create those folders because my server went down and the folders disappeared).
I have created a job to execute this and tried it locally with a small amount of data (it works), but in production I always get the error Allowed memory size of xxxxxxxxxxxxx bytes exhausted (tried to allocate xxxxxxxx bytes). I raised memory_limit to 4GB in php.ini but still get the problem.
Here is my code in app/Console/Commands.
Scenario:
The create-folder job runs for users with status >= 3 and remark_restore null. On every successful creation, remark_restore is changed to 1. I tried limiting it to 1 record per job and running it every minute, but I still get the error.
If I change memory_limit to -1 only while running this job and then change it back to the previous value, will there be any impact on my site in the future?
Any suggestions on how to prevent the memory_limit error?
public function handle()
{
    try {
        ini_set('max_execution_time', 300000);
        ini_set('memory_limit', '4096M');
        $take = 1;
        $make_folder = User::where('status', '>=', '3')->whereNull('remark_restore')->limit($take)->get();
        foreach ($make_folder as $make_folder) {
            $path = public_path().'/file/'.$make_folder->id;
            if (!(file_exists($path))) {
                $data = User::where('id', $make_folder->id)->update(array('remark_restore' => '1'));
                File::makeDirectory($path, $mode = 0777, true, true);
            }
        }
        $count = User::where('status', '>=', '3')->whereNull('remark_restore')->count();
        echo 'Pending : '.$count;
    } catch (\Exception $e) {
        \Log::warning($e);
    }
}
Okay, you are confusing things...
A correct approach here is to dispatch a job for each folder instead of having a single job to do every creation.
So, you should have a Job like CreateFolderForUser and in it have something like:
class CreateFolderForUser implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    protected $user;

    public function __construct(User $user)
    {
        $this->user = $user;
    }

    public function handle()
    {
        try {
            $path = public_path("/file/{$this->user->id}");
            if (!file_exists($path)) {
                File::makeDirectory($path, 0777, true, true);
                $this->user->remark_restore = 1;
                $this->user->save();
            }
        } catch (\Exception $e) {
            \Log::warning($e);
        }
    }
}
See that I have removed everything about iterating users; this job does just one thing: "create a folder for this user", nothing else.
Also, see that I have moved the remark_restore update to after creating the directory, because in your previous code, if the creation failed you had already updated the user's remark_restore to 1, which would be wrong since there was an error.
Now you should have a scheduler that runs code, or put it in a command so the scheduler calls that command every 5 minutes (or however often you want). That code looks at the database and dispatches this job for each user that still needs a folder created.
public function dispatchUserFolderCreation()
{
    $take = 1000;

    $users = User::where('status', '>=', '3')
        ->whereNull('remark_restore')
        ->limit($take)
        ->chunk(100, function (Collection $users) {
            foreach ($users as $user) {
                dispatch(new CreateFolderForUser($user));
            }
        });
}
You could pass the $take you want dynamically if you are using a command, so the default is 1000 but you can use less or more (do not send 10,000 at once; keep batches to 2,000 at most).
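For the scheduler part, a minimal sketch of app/Console/Kernel.php could look like this; folders:dispatch is a hypothetical command name whose handle() would run dispatchUserFolderCreation() above:
// app/Console/Kernel.php
protected function schedule(Schedule $schedule)
{
    // "folders:dispatch" is a hypothetical artisan command that dispatches the jobs
    $schedule->command('folders:dispatch')->everyFiveMinutes();
}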

What is the best practice to load and update data using Redis and MySQL?

I am using Redis as a caching layer and I'd like to know the best practice for using it properly alongside the DB (in this case MySQL).
Here is an example of a user dashboard function:
public function updateDashboardUser(Request $request)
{
    $user = app('redis')->hGetAll($request->userID); // Get data from cache
    if ($user) { // if there is data, use this
        $id = $user['id'];
        $name = $user['name'];
    } else { // otherwise use this
        $user = User::select('id', 'name')->where('id', '=', $request->userID)->first();
        $id = $user->id;
        $name = $user->name;
    }
    return response()->json(['id' => $id, 'name' => $name], 200);
}
However, the else branch is somehow never reached, even though $user from the cache might be empty. Is there a better way to do this?
Also, while updating: is there a better way to automatically update both (cache and DB) when the data in one of them changes?
public function editDashboard(Request $request)
{
    $user = Route::find($request->userID);
    $user->name = $request->name;
    $user->save();

    $cacheEdit = app('redis')->hSet($user->id, 'name', $request->name);

    return response()->json(['status' => '200'], 200);
}
At the moment I do it like this, but sometimes only one of them gets changed and then the cache data (or vice versa, the DB data) is not synchronized/updated.
This is my first experience with Redis and caching in general, so any help is appreciated.
Instead of using the Redis API directly, you should use the Laravel cache API: it gives you an abstraction, and you don't even need to know which cache backend is underneath.
By using Eloquent instead of the query builder, you unlock some very powerful features such as model events. For example, in your User model:
protected static function booted()
{
    parent::booted();

    $cache = app('cache');

    static::updated(function ($user) use ($cache) {
        $cacheKey = $user->getCacheKey();
        if ($cache->has($cacheKey)) {
            $cache->put($cacheKey, $user, 30); // 30 is the life duration of this value in cache, you're free to change it
        }
    });

    static::deleted(function ($user) use ($cache) {
        $cache->forget($user->getCacheKey());
    });
}

public function getCacheKey()
{
    return 'users.' . $this->getKey();
}
These "event hooks" get automatically called by Laravel whenever you update or delete a User by using Eloquent.
It allows you then to fluently do this:
use App\User;
use Illuminate\Http\Request;
use Illuminate\Contracts\Routing\ResponseFactory;
use Illuminate\Contracts\Cache\Repository as Cache;

public function updateDashboardUser(Request $request, Cache $cache, ResponseFactory $responseFactory)
{
    $id = $request->userID;

    $user = $cache->remember('users.' . $id, 30, function () use ($id) {
        return User::findOrFail($id);
    });

    return $responseFactory->json($user->only(['id', 'name']), 200);
}
As mentioned here https://laravel.com/docs/7.x/cache#retrieving-items-from-the-cache , you can use remember (or rememberForever) to retrieve something from the cache and automatically fall back to a closure if it is not found. findOrFail will then retrieve the user from the database, or throw an Illuminate\Database\Eloquent\ModelNotFoundException if it doesn't exist, because it makes no sense to send a successful response in that case. I also replaced your helpers such as response() with dependency injection on contracts (Laravel interfaces), which is the cleaner practice.
https://laravel.com/docs/7.x/contracts
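Applied to the editDashboard method from the question, the same pattern could look roughly like this; it is a sketch that assumes the booted() hooks above are in place, so saving through Eloquent keeps the cached copy in sync:
public function editDashboard(Request $request, ResponseFactory $responseFactory)
{
    $user = User::findOrFail($request->userID);
    $user->name = $request->name;
    $user->save(); // fires the static::updated hook, which refreshes the cache entry if present

    return $responseFactory->json(['status' => 200], 200);
}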

Laravel Excel queued export failing

I have been having a lot of trouble getting the Laravel Excel package to export a large amount of data. I need to export about 80-100k rows, so I implemented the queued export as mentioned in the docs. It works fine when I export a smaller number of rows, but when I try to do 60-80k rows, it fails every time. While the jobs are being processed, I watch the temp file that is created, and I can see that its size is increasing. I also watch the jobs in the database (I'm using the database queue driver), and I can see the jobs completing for a while. It seems that the jobs take incrementally more time until the job fails. I don't get why the first several jobs are quick and then they start taking more and more time to complete.
I'm using supervisor to manage the queue, so here's my config for that:
[program:laravel-worker]
process_name=%(program_name)s_%(process_num)02d
command=php /var/www/html/site/artisan queue:work --sleep=3 --tries=3 --timeout=120 --queue=exports,default
autostart=true
autorestart=true
user=www-data
numprocs=8
redirect_stderr=true
stdout_logfile=/var/log/supervisor/worker.log
loglevel=debug
And then my controller code to create the export:
(new NewExport($client, $year))
    ->queue('public/exports/' . $name)
    ->allOnQueue('exports')
    ->chain([
        new NotifyUserOfCompletedExport($request->user(), $name),
    ]);
I'm using:
Laravel 5.8,
PHP 7.2,
PostgreSQL 10.10
I should also mention that I have played around with the chunk size a bit, but in the end I've always run into the same problem. I tried chunk sizes of 500, 2000, 10000 but no luck.
In the failed_jobs table, the exception is MaxAttemptsExceededException, although I have also gotten InvalidArgumentException File '/path/to/temp/file' does not exist exceptions. I'm not quite sure what else to do. I guess I could make it so it doesn't time out, but that seems like it would just cause more problems. Any help would be appreciated.
EDIT
Here is the content of my Export Class:
class NewExport implements FromQuery, WithHeadings, WithMapping, WithStrictNullComparison
{
    public function __construct($client, $year)
    {
        $this->year = $year;
        $this->client = $client;
    }

    public function query()
    {
        $data = $this->getDataQuery();
        return $data;
    }

    public function headings(): array
    {
        $columns = [
            //....
        ];
        return $columns;
    }

    public function map($row): array
    {
        $mapping = [];
        foreach ($row as $key => $value) {
            if (is_bool($value)) {
                if ($value) {
                    $mapping[$key] = "Yes";
                } else {
                    $mapping[$key] = "No";
                }
            } else {
                $mapping[$key] = $value;
            }
        }
        return $mapping;
    }

    private function getDataQuery()
    {
        $query = \DB::table('my_table')->orderBy('my_field');
        return $query;
    }
}
The NotifyUserOfCompletedExport class is just creating a job to email the logged in user that the export is finished with a link to download it.
class NotifyUserOfCompletedExport implements ShouldQueue
{
    use Queueable, SerializesModels;

    public $user;
    public $filename;

    public function __construct(User $user, $filename)
    {
        $this->user = $user;
        $this->filename = $filename;
    }

    public function handle()
    {
        // This just sends the email
        $this->user->notify(new ExportReady($this->filename, $this->user));
    }
}
EDIT 2:
So I read this post, and I verified that eventually my server was just running out of memory. That led to the MaxAttemptsExceededException error. I added more memory to the server, but I am still getting the InvalidArgumentException File '/path/to/temp/file' does not exist after the jobs have completed. It's even weirder, though, because I can see that /path/to/temp/file actually does exist. So I have no idea what is going on here, but it's super frustrating.

Relationship is working in Laravel Tinker but not within controller

I set up a many-to-many relationship and want to return it from my PHP code, which is not working, but when I run the same code in Tinker it works for some reason. What am I missing?
// Firma
public function auftraege()
{
    return $this->belongsToMany("Auftrag", 'auftraege_firma');
}

// Auftrag
public function firmen()
{
    return $this->belongsToMany("Firma", 'auftraege_firma');
}

// works in tinker
$firma = App\Firma::first();
$firma->auftraege

// Does not work in php Controller
$firma = App\Firma::first();
return $firma->auftraege
Getting 500 Error
Looking at your controller code, I can only notice two things. Change your controller code like this:
$firma = \App\Firma::first();
return $firma->auftraege;
You are missing the \ before the App namespace, and the semicolon is also missing in the return statement.
Please also change the relationships like this:
public function auftraege()
{
    return $this->belongsToMany(Auftrag::class, 'auftraege_firma');
}

public function firmen()
{
    return $this->belongsToMany(Firma::class, 'auftraege_firma');
}
The reason it works in Tinker is that, by default, Tinker assumes the App namespace for the current session. That's why Tinker was able to resolve the proper namespace even though you didn't specify it.
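Equivalently, you could import the model at the top of the controller instead of writing the leading backslash each time; a small sketch (the method name here is just an example):
use App\Firma;

public function show()
{
    $firma = Firma::first();
    return $firma->auftraege;
}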

Return value from one Artisan command to another

I am attempting to call one Artisan (Laravel) command from another command. However, I need to be able to retrieve an array from the command that is called from the "main" command...
i.e.
// Command 1
public function handle()
{
    $returnedValue = $this->call('test:command');
    dump($returnedValue); // <-- is 5
}

// Command 2
public function handle()
{
    return $this->returnValue();
}

private function returnValue()
{
    $val = 5;
    return $val;
}
I have looked through the documentation and can't find a way to do this, so I was wondering if there was a way or if I need to re-think my approach.
Thanks!
Artisan Commands don't behave the same way as, for example, Controller functions. They return an exitCode, which in my testing was always 0 (couldn't get anything to return if an error is thrown).
Your approach won't work if you try to get a return value, but you can use \Artisan::output() to see exactly what was output by the command you called.
// FirstCommand.php
public function handle()
{
    \Artisan::call("second:command");

    if (\Artisan::output() == 1) {
        $this->info("This Worked");
    } else {
        $this->error("This Didn't Work");
    }
}
Note: I used \Artisan::call(); there are some apparent differences between the two, where $this->call() didn't work as expected but \Artisan::call() did. $this->call() sent both 0 and 1 back regardless of the actual code being executed; not sure what's up there. Tested on Laravel 5.0, which is quite a bit behind current, so maybe that's it.
// SecondCommand.php
public function handle()
{
    try {
        $test = 1 / 1;
    } catch (\Exception $ex) {
        $this->error("0");
    }
    $this->info("1");
}
Running php artisan first:command in my console returns:
$ php artisan first:command
This Worked
Now, if I switch the code in $test to
$test = 1 / 0;
I get this in my console:
$ php artisan first:command
This Didn't Work
So, the rule here I guess is to avoid outputting anything in the second command prior to the result you want to check with \Artisan::output().
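If you need an actual array back rather than a success flag, one workaround along the same lines is to have the second command print JSON and decode it in the caller. A rough sketch, assuming nothing else writes to the output first:
// SecondCommand.php -- output only the JSON payload
public function handle()
{
    $this->line(json_encode(['value' => 5]));
}

// FirstCommand.php -- call the command and decode whatever it printed
public function handle()
{
    \Artisan::call('second:command');
    $data = json_decode(trim(\Artisan::output()), true);
    dump($data['value']); // 5
}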
