Laravel dispatch job run async - php

I have a function that posts some content, pushes a job onto a queue, and returns a response to the user even before the queue completes the job.
For that I changed QUEUE_DRIVER in .env to database, and records are saved in the jobs table. But to execute these jobs I have to run the command php artisan queue:work, and that is my question: how do I call this command from code, or what should I do whenever there are jobs in the table?

The command
php artisan queue:work
should always be running; it will check whether there are new jobs and dispatch them.
But it has to keep running, so you can't execute it from your code.
You can also run
php artisan queue:work --tries=5
which, for example, will attempt each job up to 5 times before marking it as failed.
Plus you can install Supervisor, which will restart queue:work whenever it fails.
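For reference, a minimal Supervisor program definition might look like this (the program name and paths are placeholders you would adapt to your project):

```ini
[program:laravel-worker]
command=php /path-to-your-project/artisan queue:work --tries=5
autostart=true
autorestart=true
numprocs=1
redirect_stderr=true
stdout_logfile=/path-to-your-project/storage/logs/worker.log
```

With this saved under Supervisor's conf.d directory, supervisorctl reread and supervisorctl update will pick it up and keep the worker alive.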

Related

How to process php artisan queue:listen

I am facing a serious problem with the Laravel queue system; please help me fix this issue.
Once I queue my mail using
$mailer = Mail::to($email_to)->queue(new ContactGeneral($data));
it is stored in the database, and running php artisan queue:listen from the terminal works fine, but once I close my terminal it no longer listens to my queue.
For that, I set up a schedule in Kernel.php like this, which runs every minute:
protected function schedule(Schedule $schedule)
{
    $schedule->command('queue:listen')->everyMinute();
}
I put this line in a cron job and it works fine:
* * * * * cd /path-to-your-project && php artisan schedule:run >> /dev/null 2>&1
The problem is that, running every minute, it does not kill the previous process and starts another one the next minute, which slows down my server.
Please can you let me know the best way to implement this?
Thanks in advance
The best way is to use Supervisor. However, if you are running the application in a shared hosting environment, you can process the queue once and then exit the process, freeing up memory, with the following command:
php artisan queue:work --once
Depending on how many jobs you'll have, set the queue to run once every 1, 2 or 3 minutes to make sure the previous process has had time to consume the jobs, so they won't often interfere with each other. You can use the following cron entry:
* * * * * cd /path-to-your-project && php artisan queue:work --once
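If you want the wider spacing suggested above, standard crontab interval syntax works; a sketch running the worker every two minutes (the path is a placeholder):

```
# Run one queue:work pass every 2 minutes, discarding output
*/2 * * * * cd /path-to-your-project && php artisan queue:work --once >> /dev/null 2>&1
```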
No, you don't need to schedule this process.
As long as a queue:work process is running, it will watch your jobs table and run the tasks one by one.
What you need is something to make sure the process doesn't end when you close the console. As user8555937 and Webinion said, you need Supervisor and its configuration file; once you start it, it will run in the background and you can forget about it.

Cron jobs on shared hosting using laravel scheduler not working

I am having an issue running the Laravel scheduler to send mails in the queue.
The setup is as follows: Laravel 5.7.
I have configured the scheduler (App/Console/Kernel.php) as shown below:
protected function schedule(Schedule $schedule)
{
    $schedule->command('queue:work --tries=3')->everyFiveMinutes()->withoutOverlapping();
}
The db is set up as per the Laravel docs. As soon as I click the link in my UI, I can see the entry in the db.
The .env has QUEUE_CONNECTION=database, and the same setting is in config/queue.php.
(If I change database to sync, it works perfectly.)
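For reference, the database connection entry in config/queue.php normally looks like the following in Laravel 5.7 (these are the framework defaults; yours may differ):

```php
// config/queue.php (excerpt)
'connections' => [
    'database' => [
        'driver' => 'database',
        'table' => 'jobs',       // table created by php artisan queue:table
        'queue' => 'default',
        'retry_after' => 90,     // seconds before a stuck job is retried
    ],
],
```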
My cron job on the server is as follows (I just tried to log the cron output):
/usr/local/bin/php /home/XXX/YYY/artisan schedule:run 1>> /home/XXX/public_html/junk/cron_log.php 2>&1
I can see the cron logs getting updated every five minutes, but they only say:
"No scheduled commands are ready to run"
Exactly the same code and settings worked last night (before going to bed). I had tested more than 40 email send attempts and the db entries were getting deleted. I only changed the scheduler to everyFiveMinutes(), but now it is not working.
I can understand mails arriving slowly, but why are the db entries not deleted like last night?
This may be useful to others who are using Laravel 5.7 on GoDaddy shared hosting.
The email jobs dispatched above were not executing (I mean the cron jobs were running, but the database entries were not cleared). The issue seems to be with
->withoutOverlapping();
After I deleted this method call, I now see the cron_log entries correctly and I have also received the mails. My cron_log entries are as shown below:
Running scheduled command: '/opt/alt/php71/usr/bin/php' 'artisan' queue:work --tries=3 > '/dev/null' 2>&1
I am guessing the withoutOverlapping() method has a problem in cron execution. I have not changed anything else in the code.
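A likely explanation: withoutOverlapping() takes a cache-based mutex, and if a previous run never releases it (for example, a long-running queue:work process that gets killed), later runs are skipped with "No scheduled commands are ready to run". As an alternative to removing it entirely, recent Laravel versions accept an expiry in minutes, so a stale lock cannot block future runs; a sketch:

```php
// App/Console/Kernel.php: keep overlap protection, but let the mutex
// expire after 10 minutes so a stale lock can't wedge the schedule
// (the expiry argument is available in recent Laravel versions)
protected function schedule(Schedule $schedule)
{
    $schedule->command('queue:work --tries=3')
             ->everyFiveMinutes()
             ->withoutOverlapping(10);
}
```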

Laravel 5.5 Job with delay fires instantly instead of waiting

In my application I am dispatching a job on the work queue with a delay, but it runs instantly instead of waiting for the delay. In my config and .env I am using the database driver.
No job has been inserted into my jobs table so far.
My config:
'default' => env('QUEUE_DRIVER', 'database')
My controller code:
Log::info('Request Status Check with Queues Begins', ['method' => __METHOD__]);
MyGetInfo::dispatch($this->name,$this->password,$this->id,$trr->id)->onQueue('work')->delay(12);
return json_encode($data);
The value of QUEUE_DRIVER must be set to database in the .env file (with the sync driver, jobs run immediately in-process and delays are ignored, so nothing is written to the jobs table).
Make sure to run this afterwards:
php artisan config:clear
Then start a worker:
php artisan queue:listen
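Putting it together, a sketch of the dispatch call from the question with the delay given as an explicit time instance (the class and constructor arguments are taken from the question):

```php
// Delay the job by 12 seconds; this requires a real queue driver such
// as database, since the sync driver executes the job inline
MyGetInfo::dispatch($this->name, $this->password, $this->id, $trr->id)
    ->onQueue('work')
    ->delay(now()->addSeconds(12));
```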

How can I learn more about why my Laravel Queued Job failed?

The Situation
I'm using Laravel Queues to process large numbers of media files; an individual job is expected to take minutes (let's just say up to an hour).
I am using Supervisor to run my queue, and I am running 20 processes at a time. My supervisor config file looks like this:
[program:duplitron-worker]
process_name=%(program_name)s_%(process_num)02d
command=php /var/www/duplitron/artisan queue:listen database --timeout=0 --memory=500 --tries=1
autostart=true
autorestart=true
user=duplitron
numprocs=20
redirect_stderr=true
stdout_logfile=/var/www/duplitron/storage/logs/duplitron-worker.log
In my duplitron-worker.log I occasionally notice Failed: Illuminate\Queue\CallQueuedHandler@call, and I would like to better understand what exactly is failing. Nothing appears in my laravel.log file (which is where exceptions would normally appear).
The Question
Is there a handy way for me to learn more about what is causing my job to fail?
In newer Laravel versions there's an exception column in the failed_jobs table that has all the info you need. Thanks cdarken and Toskan for pointing this out!
==== OLD METHOD BELOW
Here's what I always do, but first, make sure you have a failed_jobs table! It's well documented; look it up :)
Run the php artisan queue:failed command to list all the failed jobs, and pick the one you're after. Write the ID down.
Then, make sure to stop your queue with supervisorctl stop duplitron-worker:*
Lastly, make sure your .env has APP_DEBUG=true.
Then run php artisan queue:retry {step_job_1_id}
Now manually run php artisan queue:listen --timeout=XXX
If the error is structural (and most are), you should get the failure with a debug stack trace in your log file.
Good luck with debugging :-)
As #cdarken pointed out, the exception can be found in the exception column of your failed_jobs database table. Thanks #cdarken; I wish this were posted as an answer and not a comment.
Run these commands to create the failed_jobs table in the db:
php artisan queue:failed-table
php artisan migrate
Run the queue worker: php artisan queue:work --tries=2
Check the exception reason in the failed_jobs table you've just created.
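To read the recorded exception without leaving the terminal, you can also query the table from php artisan tinker; a sketch assuming the default failed_jobs schema:

```php
// In php artisan tinker: show the exception for the most recent failure
DB::table('failed_jobs')
    ->orderBy('id', 'desc')
    ->value('exception');
```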
If you're using the database driver, then go to the failed_jobs table and look for the exception there.
Worked for me:
in vendor/laravel/framework/src/Illuminate/Notifications/SendQueuedNotifications.php
just remove the "use Illuminate\Queue\SerializesModels;" statement (line 6)
and modify line 11 to "use Queueable;"

What would prevent jobs in a queue from processing? [PHP / Laravel 5]

I have a queue I set up in Laravel 5 to delete companies and associated records. Each time this happens, a lot of work happens on the back-end, so queues are my best option.
I set up my config/queue.php file along with my .env file so that the database driver will be used. I am using the Queue::pushOn method to push jobs onto a queue called company_deletions. Ex.
Queue::pushOn('company_deletions', new CompanyDelete($id));
Where CompanyDelete is a command created with php artisan command:make CompanyDelete --queued
I have tried to get my queue to process using the following commands:
php artisan queue:work
php artisan queue:work company_deletions
php artisan queue:listen
php artisan queue:listen company_deletions
php artisan queue:work database
php artisan queue:listen database
Sometimes when looking at the output of the above commands, I get the following error:
[InvalidArgumentException]
No connector for []
Even when I don't get an error, I cannot get it to actually process the jobs. When I look in my jobs table, I can see the job on the queue; however, the attempts column shows 0, reserved shows 0, and reserved_at is null. Am I missing some steps? I have looked over the documentation several times and cannot for the life of me figure out what is wrong. I don't see anything in the Laravel error logs either. What would prevent these jobs from being processed once they are in the jobs table? Any help is appreciated.
I ran into a similar issue because I didn't add the jobs to the default queue:
$job = (new EmailJob(
    $this->a,
    $this->b,
    $this->c,
    $this->d,
    $e
))->onQueue('emails');
Then I have to listen to that specific queue:
php artisan queue:listen --queue=emails
In your case it would be:
php artisan queue:listen --queue=company_deletions
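A worker can also be pointed at several queues at once; names earlier in the comma-separated list are drained first. A sketch using the queue name from the question plus the default queue:

```
# Process company_deletions first, then fall back to default
php artisan queue:work --queue=company_deletions,default
```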
