Laravel memory_limit error when creating folders from database - php

I want to create folders based on user_id inside /laravel/public/file/FolderUserXXX.
If I have 1000 user records in the database, I want to create 1000 folders inside that path (I need to recreate these folders because my server went down and the folders disappeared).
I created a job to do this and tested it locally with a small data set (it works), but in production I always get the error Allowed memory size of xxxxxxxxxxxxx bytes exhausted (tried to allocate xxxxxxxx bytes). I raised memory_limit to 4GB in php.ini but the problem persists.
Here is my code in app/Console/Commands
Scenario:
The job creates a folder for every user with status >= 3 and remark_restore set to null. On every successful creation, remark_restore is changed to 1. I tried limiting it to 1 record per job and running it every minute, but I still get the error.
If I change memory_limit to -1 just to run this job and then change it back to the previous value, will that have any impact on my site in the future?
Any suggestions on how to prevent the memory_limit error?
public function handle()
{
    try {
        ini_set('max_execution_time', 300000);
        ini_set('memory_limit', '4096M');

        $take = 1;
        $make_folder = User::where('status', '>=', '3')->whereNull('remark_restore')->limit($take)->get();

        foreach ($make_folder as $make_folder) {
            $path = public_path().'/file/'.$make_folder->id;
            if (!(file_exists($path))) {
                $data = User::where('id', $make_folder->id)->update(array('remark_restore' => '1'));
                File::makeDirectory($path, $mode = 0777, true, true);
            }
        }

        $count = User::where('status', '>=', '3')->whereNull('remark_restore')->count();
        echo 'Pending : '.$count;
    }
    catch (\Exception $e) {
        \Log::warning($e);
    }
}

Okay, you are confusing things...
A better approach here is to dispatch a job for each folder instead of having a single job do every creation.
So you should have a job like CreateFolderForUser containing something like:
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\File;

class CreateFolderForUser implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    protected $user;

    public function __construct(User $user)
    {
        $this->user = $user;
    }

    public function handle()
    {
        try {
            $path = public_path("/file/{$this->user->id}");
            if (!file_exists($path)) {
                File::makeDirectory($path, 0777, true, true);
                $this->user->remark_restore = 1;
                $this->user->save();
            }
        }
        catch (\Exception $e) {
            \Log::warning($e);
        }
    }
}
Notice that I have removed everything about iterating users; this job only needs to do one thing: "create a folder for this user", nothing else...
Also, note that I moved the remark_restore update to after the directory creation, because in your previous code, if the creation fails you have already set the user's remark_restore to 1, which is wrong since you actually had an error...
Now you should have a scheduler that runs this code, or put it in a command so the scheduler calls that command every 5 minutes or however often you want. This code queries the database and dispatches the job for each user that still needs a folder.
public function dispatchUserFolderCreation()
{
    $take = 1000;

    $users = User::where('status', '>=', '3')
        ->whereNull('remark_restore')
        ->limit($take)
        ->chunk(100, function (Collection $users) {
            foreach ($users as $user) {
                dispatch(new CreateFolderForUser($user));
            }
        });
}
You could pass $take dynamically if you are using a command, so the default is 1000 but you can ask for less or more (do not send 10000 at once; keep batches to 2000 at most). A sketch of such a command and its schedule entry follows.
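For reference, here is a minimal sketch of how that dispatch code could live in an Artisan command and be scheduled. The command name folders:dispatch, the job namespace, and the User model namespace are assumptions for illustration only. The sketch fetches users with take() and cursor() rather than limit() plus chunk(), because chunk() paginates the query internally and overrides any limit you set.

// app/Console/Commands/DispatchUserFolderCreation.php (hypothetical command)
namespace App\Console\Commands;

use App\Jobs\CreateFolderForUser; // assuming the job lives in app/Jobs
use App\User;                     // adjust to your User model namespace
use Illuminate\Console\Command;

class DispatchUserFolderCreation extends Command
{
    // {take=1000} makes the batch size optional, defaulting to 1000
    protected $signature = 'folders:dispatch {take=1000}';

    protected $description = 'Queue a CreateFolderForUser job for users missing their folder';

    public function handle()
    {
        $take = (int) $this->argument('take');

        $users = User::where('status', '>=', '3')
            ->whereNull('remark_restore')
            ->take($take)
            ->cursor();

        foreach ($users as $user) {
            dispatch(new CreateFolderForUser($user));
        }
    }
}

And inside the existing schedule() method of app/Console/Kernel.php you would register it, for example every five minutes:

protected function schedule(Schedule $schedule)
{
    $schedule->command('folders:dispatch')->everyFiveMinutes();
}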

Related

Laravel controller function times out, 50000 users

In my controller I have a function to check subscription status and update it in my database. The problem is that with 50,000 users it takes too long to finish and times out.
public function UpdateStatus(){
    $users = User::all();
    foreach($users as $user){
        $user->createOrGetStripeCustomer();
        $stripeSubs = $user->asStripeCustomer()->subscriptions->all();
        $dbSubs = DB::table('subscriptions')->select('stripe_id')->where('user_id', $user->id)->get();
        foreach($dbSubs as $check){
            $canDelete = 0;
            foreach($stripeSubs as $value){
                if($check->stripe_id == $value->id){
                    $canDelete++;
                }
            }
            if($canDelete == 0){
                DB::table('subscriptions')->where('user_id', $user->id)->where('stripe_id', $check->stripe_id)->update(['stripe_status' => 'ended']);
            }
        }
    }
    return redirect('/dashboard');
}
I'm sure I shouldn't even process that many at a time but I kinda got stuck here and am not sure how exactly to approach this. My goal is to make this work and optimize it.
Use a job for this, like so:
php artisan make:job CheckSubscription
In your .env file set QUEUE_CONNECTION=database,
then run php artisan queue:table and php artisan migrate.
In your CheckSubscription job's handle method, put your existing logic:
public function handle(){
    $users = User::all();
    foreach($users as $user){
        $user->createOrGetStripeCustomer();
        $stripeSubs = $user->asStripeCustomer()->subscriptions->all();
        $dbSubs = DB::table('subscriptions')->select('stripe_id')->where('user_id', $user->id)->get();
        foreach($dbSubs as $check){
            $canDelete = 0;
            foreach($stripeSubs as $value){
                if($check->stripe_id == $value->id){
                    $canDelete++;
                }
            }
            if($canDelete == 0){
                DB::table('subscriptions')->where('user_id', $user->id)->where('stripe_id', $check->stripe_id)->update(['stripe_status' => 'ended']);
            }
        }
    }
}
In your controller:
public function UpdateStatus(){
    CheckSubscription::dispatch(); // you can use the chunk method and pass your $users as params
    return redirect('/dashboard');
}
then run php artisan queue:work
Firstly, you should do this in a Queued Job: https://laravel.com/docs/8.x/queues
That way the script won't time out.
However, loading 50k Eloquent objects into memory isn't optimal. To resolve this you can use Lazy Collections: https://laravel.com/docs/8.x/collections#lazy-collections
$users = User::cursor();

foreach ($users as $user) {
    // do stuff
}
This will still execute only one query, but it will only ever keep one User object in memory at a time.
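Putting the two answers together, the queued job's handle() method could stream users with cursor() instead of User::all(). This is only a sketch reusing the Stripe calls from the question; the job class itself is whatever you generated with make:job.

public function handle()
{
    // Stream users one at a time instead of loading all 50k Eloquent models
    foreach (User::cursor() as $user) {
        $user->createOrGetStripeCustomer();
        $stripeSubs = $user->asStripeCustomer()->subscriptions->all();

        $dbSubs = DB::table('subscriptions')
            ->select('stripe_id')
            ->where('user_id', $user->id)
            ->get();

        foreach ($dbSubs as $check) {
            $canDelete = 0;
            foreach ($stripeSubs as $value) {
                if ($check->stripe_id == $value->id) {
                    $canDelete++;
                }
            }
            if ($canDelete == 0) {
                DB::table('subscriptions')
                    ->where('user_id', $user->id)
                    ->where('stripe_id', $check->stripe_id)
                    ->update(['stripe_status' => 'ended']);
            }
        }
    }
}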

Laravel Excel queued export failing

I have been having a lot of trouble getting the Laravel Excel package to export a large amount of data. I need to export about 80-100k rows, so I implemented the queued export as mentioned in the docs. It works fine when I export a smaller number of rows, but when I try to do 60-80k rows, it fails every time. While the jobs are being processed, I watch the temp file that is created, and I can see that the size of the file is increasing. I also watch the jobs in the database (I'm using the database queue driver), and I can see the jobs completing for a while. It seems that the jobs take incrementally more time until the job fails. I don't get why the first several jobs are quick, and then they start taking more and more time to complete.
I'm using supervisor to manage the queue, so here's my config for that:
[program:laravel-worker]
process_name=%(program_name)s_%(process_num)02d
command=php /var/www/html/site/artisan queue:work --sleep=3 --tries=3 --timeout=120 --queue=exports,default
autostart=true
autorestart=true
user=www-data
numprocs=8
redirect_stderr=true
stdout_logfile=/var/log/supervisor/worker.log
loglevel=debug
And then my controller to create the export
(new NewExport($client, $year))->queue('public/exports/' . $name)->allOnQueue('exports')->chain([
    new NotifyUserOfCompletedExport($request->user(), $name),
]);
I'm using:
Laravel 5.8,
PHP 7.2,
Postgresql 10.10
I should also mention that I have played around with the chunk size a bit, but in the end I've always run into the same problem. I tried chunk sizes of 500, 2000, 10000 but no luck.
In the failed_jobs table, the exception is MaxAttemptsExceededException, although I have also got exceptions for InvalidArgumentException File '/path/to/temp/file' does not exist. I'm not quite sure what else to do. I guess I could make it so it doesn't timeout, but that seems like it will just cause more problems. Any help would be appreciated.
EDIT
Here is the content of my Export Class:
class NewExport implements FromQuery, WithHeadings, WithMapping, WithStrictNullComparison
{
    protected $client;
    protected $year;

    public function __construct($client, $year)
    {
        $this->year = $year;
        $this->client = $client;
    }

    public function query()
    {
        $data = $this->getDataQuery();
        return $data;
    }

    public function headings(): array
    {
        $columns = [
            //....
        ];
        return $columns;
    }

    public function map($row): array
    {
        $mapping = [];
        foreach($row as $key => $value) {
            if(is_bool($value)) {
                if($value) {
                    $mapping[$key] = "Yes";
                } else {
                    $mapping[$key] = "No";
                }
            } else {
                $mapping[$key] = $value;
            }
        }
        return $mapping;
    }

    private function getDataQuery()
    {
        $query = \DB::table('my_table')->orderBy('my_field');
        return $query;
    }
}
The NotifyUserOfCompletedExport class just queues a job that emails the logged-in user that the export is finished, with a link to download it.
class NotifyUserOfCompletedExport implements ShouldQueue
{
    use Queueable, SerializesModels;

    public $user;
    public $filename;

    public function __construct(User $user, $filename)
    {
        $this->user = $user;
        $this->filename = $filename;
    }

    public function handle()
    {
        // This just sends the email
        $this->user->notify(new ExportReady($this->filename, $this->user));
    }
}
EDIT 2:
So I read this post, and I verified that eventually my server was just running out of memory. That led to the MaxAttemptsExceededException error. I added more memory to the server, and I am still getting the InvalidArgumentException File '/path/to/temp/file' does not exist after the jobs have completed. It's even weirder, though, because I can see that /path/to/temp/file actually does exist. So I have no idea what is going on here, but it's super frustrating.

How to upload a file with Laravel Excel using the chunk method?

public function uploadNotaCorte(Request $request, EstadoRepository $estadoRepository)
{
    $error = array();
    $path = $request->file('file')->getRealPath();
    $notasCorte = Excel::load($path, function($reader) {
    })->get();
    $chunk = $notasCorte->chunk(100);
    foreach ($notasCorte as $key => $notaCorte) {
        //RULES
    }
    return $error;
}
Hi everyone, I'm new to programming and I'm having a hard time implementing the chunk method. The code above works fine on small files but errors on larger files because of their size.
I need to upload a file with 120,000 records and I am trying to use chunking for this. I don't know what I'm doing wrong; I have already looked at the documentation, which seems simple enough, but I could not solve the problem. Can anyone help me??
Assuming you're using the maatwebsite/excel package, this link should help: http://www.maatwebsite.nl/laravel-excel/docs/import#chunk
You'll want to change your code to something like this:
public function uploadNotaCorte(Request $request, EstadoRepository $estadoRepository)
{
    $error = array();
    $path = $request->file('file')->getRealPath();

    Excel::filter('chunk')->load($path)->chunk(100, function($results)
    {
        foreach($results as $row)
        {
            // RULES
        }
    });

    return $error;
}
This isn't tested and I've never used that package (though good to know it exists) so your mileage may vary.
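Note that the snippet above targets the 2.x API of maatwebsite/excel. If you are on the 3.x releases, chunked reading is done through an import class instead; the following is only a rough sketch, and the import class name is made up for illustration:

use Illuminate\Support\Collection;
use Maatwebsite\Excel\Concerns\ToCollection;
use Maatwebsite\Excel\Concerns\WithChunkReading;

// Hypothetical import class for the spreadsheet
class NotasCorteImport implements ToCollection, WithChunkReading
{
    public function collection(Collection $rows)
    {
        foreach ($rows as $row) {
            // RULES go here, applied to each row of the current chunk
        }
    }

    public function chunkSize(): int
    {
        return 100; // rows read per chunk
    }
}

It would then be used from the controller with Excel::import(new NotasCorteImport, $request->file('file')); via the Maatwebsite\Excel\Facades\Excel facade.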

Why is my controller so slow?

I'm working on an app that retrieves data (a CSV-formatted string of about 7k lines) from an external server to update my own entity. Each row is an item in a stock.
The job gets done correctly today, but it is very, very slow: more than 60s (in prod) to retrieve the data, push it into a 2D array, update the database, and finally load a page that displays the database content.
Just displaying the page takes about 20s (still in prod).
Here is the profiler's timeline while only displaying records: (Symfony profiler timeline screenshot)
Moreover, I'm not able to profile the updateAction because it doesn't appear in the last ten requests list.
Two days ago I was checking each row of the CSV file so it would be added only if needed, soft-deleting items to restore them later when they come back into stock, etc., but given that speed I have tried many things to get normal performance.
At the beginning everything was in the controller; I moved the add/remove function into a dedicated service, then into the repository, and finally back into my controller. To get decent results I tried emptying the database and refilling it without any checks. First with LOAD DATA LOCAL INFILE, but it is not compatible with my table layout (or I misunderstood something), so now I'm simply truncating the table before filling it from the CSV (without any checks). The timings I gave earlier are from this last attempt (which is the best one).
But enough talk, here is my controller:
public function majMatosClanAction()
{
    $resMaj = $this->majClanCavernes();

    if ($resMaj === NULL)
    {
        $this->get('session')->getFlashBag()->add('alert-danger', 'Unidentified');
        return $this->redirect($this->generateUrl('loki_gbl'));
    } else if ($resMaj === FALSE)
    {
        $this->get('session')->getFlashBag()->add('alert-warning', 'password update required');
        return $this->redirect($this->generateUrl('loki_gbl_ST'));
    } else
    {
        $this->get('session')->getFlashBag()->add('alert-success', 'success');
        return $this->redirect($this->generateUrl('loki_gbl_voirMatosClan'));
    }
}
Here is the function that my controller calls:
public function majClanCavernes()
{
    $user = $this->get('security.token_storage')->getToken()->getUser();
    $outils = $this->container->get('loki_gbl.outils');

    if ($user !== NULL)
    {
        $pwd = $user->getGob()->getPwd();
        $num = $user->getGob()->getNum();

        if($outils->checkPwd($num, $pwd) !== TRUE) return FALSE;

        $em = $this->getDoctrine()->getManager();

        //This is a temporary solution
        //////////////////////////////////////////////
        $connection = $em->getConnection();
        $platform = $connection->getDatabasePlatform();
        $connection->executeUpdate($platform->getTruncateTableSQL('MatosClan', true));
        //////////////////////////////////////////////

        $repository = $em->getRepository('LokiGblBundle:MatosClan');

        $urlMatosClan = "http://ie.gobland.fr/IE_ClanCavernes.php?id=".$num."&passwd=".$pwd;

        //encode and format the string via a service
        $infosBrutes = $outils->fileGetInfosBrutes($urlMatosClan);

        //$csv is a 2D array containing the data
        $csv = $outils->getDatasFromCsv($infosBrutes);

        foreach($csv as $item)
        {
            $newItem = new MatosClan;
            $newItem->setNum($item[0]);
            $newItem->setType($item[1]);
            [...]
            $em->persist($newItem);
        }

        $em->flush();

        return TRUE;
    }
    else{
        return NULL;
    }
}
What is wrong? 7k lines is not that big!
Could it be a hardware limitation?
Check out doctrine's batch processing documentation here.
You can also disable logging:
$em->getConnection()->getConfiguration()->setSQLLogger(null);
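To illustrate the batch-processing idea from the linked docs: the persist loop in the question could flush and clear the EntityManager every N rows, so Doctrine doesn't keep all 7k managed entities in memory at once. A rough sketch, with a batch size of 50 chosen arbitrarily:

$batchSize = 50;
$i = 0;

foreach ($csv as $item)
{
    $newItem = new MatosClan;
    $newItem->setNum($item[0]);
    $newItem->setType($item[1]);
    // ... set the remaining fields as in the original loop

    $em->persist($newItem);

    // Push the pending inserts and detach the managed entities every $batchSize rows
    if ((++$i % $batchSize) === 0) {
        $em->flush();
        $em->clear();
    }
}

// Flush whatever is left from the last partial batch
$em->flush();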

PHP Eval alternative to include a file

I am currently running a queue system with beanstalk + supervisor + PHP.
I would like my workers to automatically die when a new version is available (basically code update).
My current code is as follows:
class Job1Controller extends Controller
{
    public $currentVersion = 5;

    public function actionIndex()
    {
        while (true) {
            // check if a new version of the worker is available
            $file = '/config/params.php';
            $paramsContent = file_get_contents($file);
            $params = eval('?>' . file_get_contents($file));
            if ($params['Job1Version'] != $this->currentVersion) {
                echo "not the same version, exit worker \n";
                sleep(2);
                exit();
            } else {
                echo "same version, continue processing \n";
            }
        }
    }
}
When I update the code, the params file will change to a new version number, which will force the worker to terminate. I cannot use include because the file would be loaded into memory inside the while loop. Since params.php isn't security-critical, I wanted to know whether there is another way of doing this?
Edit: params.php looks as follows:
<?php
return [
    'Job1Version' => 5
];
$params = require($file);
Since your file has a return statement, the returned value will be passed along.
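As a sketch, the eval line in the worker loop above would then simply become a require; the rest of the loop stays as it is:

while (true) {
    // Re-read the params file on every pass; require (not require_once)
    // executes the file again and returns the array it returns
    $file = '/config/params.php';
    $params = require $file;

    if ($params['Job1Version'] != $this->currentVersion) {
        echo "not the same version, exit worker \n";
        sleep(2);
        exit();
    } else {
        echo "same version, continue processing \n";
    }
}

Depending on your opcache settings (opcache.validate_timestamps and opcache.revalidate_freq), the updated file may take a couple of seconds to be picked up.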
After a few tests I finally found a solution which doesn't require versioning anymore.
$reflectionClass = new \ReflectionClass($this);
$lastUpdatedTimeOnStart = filemtime($reflectionClass->getFileName());

while (true) {
    clearstatcache();
    $reflectionClass = new \ReflectionClass($this);
    $lastUpdatedTime = filemtime($reflectionClass->getFileName());

    if ($lastUpdatedTime != $lastUpdatedTimeOnStart) {
        // An update has been made, exit
    } else {
        // worker hasn't been modified since running
    }
}
Whenever the file is updated, the worker will automatically exit.
Thanks to #Rudie who pointed me in the right direction.
