I am trying to dispatch jobs in Laravel onto a Redis queue. If I do
Queue::push('LogMessage', array('message' => 'Time: '.time()));
then a job is placed into the queues:default key in redis. (note that I do not yet have the listener running; I'm just trying to show that the queue is being used.) However, if I do
$this->dispatch(new AudienceMetaReportJob(1));
then nothing is added to redis and the job executes immediately.
The Job definition:
<?php
use App\Jobs\Job;
use Illuminate\Queue\InteractsWithQueue;
/**
* Simple Job used to build an Audience Report. Its purpose is to
* allow code to dispatch the command to build the report.
*/
class AudienceMetaReportJob extends Job {
use InteractsWithQueue;
protected $audience_id;
/**
* @param $audience_id
*/
public function __construct($audience_id){
$this->audience_id = $audience_id;
}
/**
* Execute the job.
*
* @return void
*/
public function handle(){
$audience_meta = new AudienceMeta();
$audience_meta->reportable = Audience::findOrFail($this->audience_id);
$audience_meta->clear()
->build()
->save();
}
}
Notes:
The Audience job appears to complete properly, but does so as if the QUEUE_DRIVER env variable were set to sync
Redis has been configured properly, as I can manually use the Queue::push() function and can read from redis
I have set the QUEUE_DRIVER env variable to redis, and the problem exists even if I hardcode redis in the config/queue.php file: 'default' => 'redis',
I do not have php artisan queue:listen running
What other configuration am I missing? Why does the Queue class work with redis, while the dispatch function does not?
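For reference, in Laravel 5.x the dispatcher only pushes a job to the queue when the job class implements the Illuminate\Contracts\Queue\ShouldQueue contract; without it, dispatch runs the job synchronously, exactly as described above. A minimal sketch of the job with the contract added (the handle() body is elided):

```php
<?php
// Sketch: adding ShouldQueue makes the dispatcher queue the job
// instead of running it synchronously.
use App\Jobs\Job;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Contracts\Queue\ShouldQueue;

class AudienceMetaReportJob extends Job implements ShouldQueue
{
    use InteractsWithQueue;

    protected $audience_id;

    public function __construct($audience_id)
    {
        $this->audience_id = $audience_id;
    }

    public function handle()
    {
        // ...build the report as before...
    }
}
```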
Related
I am developing a Laravel application for a client company of mine. That company has a server guy. In the middle of development, I asked him to set up a server for the application. But he set up the server without any kind of automated deployment procedure; instead he gave me a cPanel to deploy changes manually. The application has become so complex that manual deployments are now a very tedious task: I am manually building the js and css and uploading them to the server. After a heated argument with him, I finally got him to work on automated deployments. Instead of setting up deployments properly, he has put it this way.
<?php
//app/Console/Commands/GitPull.php
namespace App\Console\Commands;
use Illuminate\Console\Command;
class GitPull extends Command
{
/**
* The name and signature of the console command.
*
* @var string
*/
protected $signature = 'git:pull';
/**
* The console command description.
*
* @var string
*/
protected $description = 'Get updates from git server';
/**
* Create a new command instance.
*
* @return void
*/
public function __construct()
{
parent::__construct();
}
/**
* Execute the console command.
*
* @return mixed
*/
public function handle()
{
exec('git pull origin master');
}
}
<?php
// app/Console/Kernel.php
namespace App\Console;
use Illuminate\Console\Scheduling\Schedule;
use Illuminate\Foundation\Console\Kernel as ConsoleKernel;
class Kernel extends ConsoleKernel
{
/**
* The Artisan commands provided by your application.
*
* @var array
*/
protected $commands = [
Commands\GitPull::class,
];
/**
* Define the application's command schedule.
*
* @param \Illuminate\Console\Scheduling\Schedule $schedule
* @return void
*/
protected function schedule(Schedule $schedule)
{
$schedule->command('git:pull')
->everyMinute();
}
/**
* Register the commands for the application.
*
* @return void
*/
protected function commands()
{
$this->load(__DIR__.'/Commands');
require base_path('routes/console.php');
}
}
Basically he has set up a scheduler to execute git pull every minute. Is this the correct way to do deployments? What are the drawbacks of this approach? Do deployments have to depend on the application itself? Is it possible to add automated deployments without involving the Laravel framework? I am not very familiar with dev-ops.
Also, my source code is in Bitbucket. I am ready to answer any questions regarding this. Thanks in advance.
I certainly would not use a Laravel command to run deployments. Automated deployments can be as simple or complex as needed, but here are some things I expect a build agent to do:
Listen to changes to master to do a production deploy
Run tests (and alert and stop deployment if they fail)
Build my static assets (js/css/etc)
Copy code to server (several ways to do this)
Run migrations
Restart queues so they are running with latest code
Since you're on Bitbucket, you could look into Bitbucket Pipelines for your CI/CD.
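As a rough sketch of what that could look like (the image, step names, server host, and paths are placeholders, not a working config), a bitbucket-pipelines.yml covering the steps above might be:

```yaml
# bitbucket-pipelines.yml (sketch; hosts and paths are placeholders)
image: php:7.2

pipelines:
  branches:
    master:
      - step:
          name: Test and build
          script:
            - composer install
            - vendor/bin/phpunit            # stop the pipeline if tests fail
            - npm ci && npm run prod        # build js/css assets
      - step:
          name: Deploy
          script:
            # copy code to the server, run migrations, restart queues
            - rsync -az --delete --exclude=node_modules ./ deploy@example.com:/var/www/app/
            - ssh deploy@example.com "cd /var/www/app && php artisan migrate --force && php artisan queue:restart"
```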
I am developing a Laravel 5.7 application.
I have created a command that should set up my database:
<?php
namespace App\Console\Commands;
use Illuminate\Console\Command;
use Illuminate\Support\Facades\Artisan;
class TestSetupCommand extends Command
{
protected $signature = 'test:data';
protected $description = 'Basic Setup for Test Data';
public function __construct()
{
parent::__construct();
}
/**
* Execute the console command.
*
* @return mixed
*/
public function handle()
{
Artisan::call('migrate:refresh', ['--seed' => '']);
Artisan::call('basis:cc');
Artisan::call('tick:exchange');
$this->info("DB freshly setup: DONE");
$this->info("Coin: DONE");
$this->info("Exchange Scrapped: DONE");
}
}
My problem is that each command takes several minutes to run. In total it takes 25 minutes to fill the whole database with data.
I would like to run each command for only one minute and kill it afterwards.
Any suggestions on how to accomplish this within my Laravel command?
I think the best way to do this is to extract these commands into background jobs. The artisan command then becomes code that queues up the new job (or jobs).
Why? It's very easy to configure jobs to time out after a given amount of time by overriding a property, like so:
<?php
namespace App\Jobs;
use Illuminate\Contracts\Queue\ShouldQueue;
class ProcessPodcast implements ShouldQueue
{
/**
* The number of seconds the job can run before timing out.
*
* @var int
*/
public $timeout = 120;
}
Also, why are you refreshing the database? That seems like a risky idea unless this is purely an analytics platform (no user data at all). It would probably be bad if that refresh command timed out - you may look into job chaining so that the refresh is guaranteed to succeed before the other commands (now jobs, each with its own timeout) run.
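In Laravel 5.7, such a chain might look like the following (the job class names are hypothetical stand-ins for jobs wrapping the three artisan commands):

```php
<?php
// Dispatch a chain: each job runs only if the previous one succeeded.
// RefreshDatabase, ImportCoins, and ScrapeExchange are placeholder
// job classes wrapping the migrate:refresh, basis:cc, and
// tick:exchange commands respectively.
use App\Jobs\RefreshDatabase;
use App\Jobs\ImportCoins;
use App\Jobs\ScrapeExchange;

RefreshDatabase::withChain([
    new ImportCoins,
    new ScrapeExchange,
])->dispatch();
```

If any job in the chain fails, the remaining jobs are not run, so the refresh is never left half-applied while later steps proceed.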
I'm trying to schedule an email to remind users who have to-do tasks due tomorrow. I made a custom command email:reminder. Here is my code in custom command:
<?php
namespace App\Console\Commands;
use Illuminate\Console\Command;
use App\Todo;
use Illuminate\Support\Facades\Mail;
class SendReminderEmail extends Command
{
/**
* The name and signature of the console command.
*
* @var string
*/
protected $signature = 'email:reminder';
/**
* The console command description.
*
* @var string
*/
protected $description = 'Remind users of items due to complete next day';
/**
* Create a new command instance.
*
* @return void
*/
public function __construct()
{
parent::__construct();
}
/**
* Execute the console command.
*
* @return mixed
*/
public function handle()
{
//
/*
* Send mail dynamically
*/
/*
* hardcoded email
*/
Mail::queue('emails.reminder', [], function ($mail) {
$mail->to('example@email.com')
->from('todoreminder@gmail.com', 'To-do Reminder')
->subject('Due tomorrow on your To-do list!');
}
);
$this->info('Reminder email sent successfully!');
}
}
I hardcoded the email for now to test it, but when I ran php artisan email:reminder, I got an exception of
[InvalidArgumentException]
Only mailables may be queued.
I then checked Laravel's documentation, but Task Scheduling and Email Queues are two separate topics.
How can I send a queued email with task scheduling in Laravel 5.6?
Also, how can I pass data (i.e. to-do items from the database) into my email view?
Any help is greatly appreciated!
Using the console kernel to schedule queued jobs is easy to do. Laravel offers several wrapper methods that make the cron integration trivial. Here's a basic example:
$schedule->job(new SendTodoReminders())->dailyAt('9:00');
You should create a command which does exactly as you described, but without the scheduling. You can use the crontab for scheduling or some other task scheduler.
Have you followed Laravel's documentation about mailing? https://laravel.com/docs/5.6/mail
Once you get to the Sending Mail section, you should not create a controller but a command instead.
When that command works, add it to the task scheduler (eg. crontab) to run on a daily basis.
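If you use the Laravel scheduler instead of scheduling the command directly, the only crontab entry needed is the standard one from the docs, which invokes the scheduler every minute (adjust the path to your project):

```
* * * * * php /path/to/artisan schedule:run >> /dev/null 2>&1
```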
Mail::queue('emails.reminder', [], function ($mail) {
$mail->to('example@email.com')
->from('todoreminder@gmail.com', 'To-do Reminder')
->subject('Due tomorrow on your To-do list!');
}
);
is deprecated since Laravel 5.3. Only Mailables can be queued, and the mailable should implement the ShouldQueue interface.
To process the queued mail you have to configure the queue driver and run php artisan queue:work.
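A minimal mailable implementing ShouldQueue might look like this (the class name, view name, and $todos payload are illustrative, not from the original code); it also shows how to pass data into the email view, since public properties on a mailable are automatically available in its Blade template:

```php
<?php
namespace App\Mail;

use Illuminate\Bus\Queueable;
use Illuminate\Mail\Mailable;
use Illuminate\Queue\SerializesModels;
use Illuminate\Contracts\Queue\ShouldQueue;

class TodoReminder extends Mailable implements ShouldQueue
{
    use Queueable, SerializesModels;

    // Public properties are automatically exposed to the view,
    // so emails.reminder can iterate over $todos directly.
    public $todos;

    public function __construct($todos)
    {
        $this->todos = $todos;
    }

    public function build()
    {
        return $this->from('todoreminder@gmail.com', 'To-do Reminder')
                    ->subject('Due tomorrow on your To-do list!')
                    ->view('emails.reminder');
    }
}
```

It would then be queued from the command's handle() with something like Mail::to($user->email)->queue(new TodoReminder($todos));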
I am using the latest version of Homestead.
I also have Laravel Horizon set up.
I am using Redis as the queue driver.
Laravel is version 5.6 and is a fresh install.
What's happening is my jobs are all failing (even though the job exits correctly).
I am running the job through command line by using a custom command:
vagrant#homestead:~/myapp$ artisan crawl:start
vagrant#homestead:~/myapp$ <-- No CLI errors after running
app/Console/Commands/crawl.php
<?php
namespace MyApp\Console\Commands;
use Illuminate\Console\Command;
use MyApp\Jobs\Crawl;
class crawl extends Command
{
/**
* The name and signature of the console command.
*
* @var string
*/
protected $signature = 'crawl:start';
/**
* The console command description.
*
* @var string
*/
protected $description = 'Start long running job.';
/**
* Create a new command instance.
*
* @return void
*/
public function __construct()
{
parent::__construct();
}
/**
* Execute the console command.
*
* @return mixed
*/
public function handle()
{
Crawl::dispatch();
}
}
app/Jobs/Crawl.php
<?php
namespace MyApp\Jobs;
use Illuminate\Bus\Queueable;
use Illuminate\Queue\SerializesModels;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
class Crawl implements ShouldQueue
{
use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;
/**
* The number of seconds the job can run before timing out.
*
* @var int
*/
public $timeout = 3600;
/**
* The number of times the job may be attempted.
*
* @var int
*/
public $tries = 1;
/**
* Create a new job instance.
*
* @return void
*/
public function __construct()
{
}
/**
* Execute the job.
*
* @return void
*/
public function handle()
{
$crawl = new Crawl();
$crawl->start();
}
}
app/Crawl.php
<?php
namespace MyApp;
class Crawl
{
public function start()
{
ini_set('memory_limit','256M');
set_time_limit(3600);
echo "Started.";
sleep(30);
echo "Exited.";
exit();
}
}
worker.log
[2018-03-21 10:14:27][1] Processing: MyApp\Jobs\Crawl
Started.
Exited.
[2018-03-21 10:15:59][1] Processing: MyApp\Jobs\Crawl
[2018-03-21 10:15:59][1] Failed: MyApp\Jobs\Crawl
From Horizon's failed job detail
Failed At 18-03-21 10:15:59
Error Illuminate\Queue\MaxAttemptsExceededException:
MyApp\Jobs\Crawl has been attempted too many
times or run too long. The job may have previously
timed out. in /home/vagrant/app/vendor/laravel
/framework/src/Illuminate/Queue/Worker.php:396
laravel-worker.conf
[program:laravel-worker]
process_name=%(program_name)s_%(process_num)02d
command=php /home/vagrant/myapp/artisan queue:work --sleep=3 --tries=1 --timeout=3600
autostart=true
autorestart=true
user=vagrant
numprocs=1
redirect_stderr=true
stdout_logfile=/home/vagrant/myapp/worker.log
config/queue.php
'redis' => [
'driver' => 'redis',
'connection' => 'default',
'queue' => 'default',
'retry_after' => 90,
'block_for' => null,
],
.env
QUEUE_DRIVER=redis
Synopsis
Looking at my worker.log I can see that the output from my class has worked:
Started.
Exited.
But the job is reported as failed. Why?
Strangely, also in the worker.log, it says Processing twice for one job:
[2018-03-21 10:15:59][1] Processing: MyApp\Jobs\Crawl
[2018-03-21 10:15:59][1] Failed: MyApp\Jobs\Crawl
Any help is greatly appreciated!
UPDATE
Removing the exit() has resolved the issue - this is strange as the PHP manual says that you can use exit() to exit the program "normally":
https://secure.php.net/manual/en/function.exit.php
<?php
//exit program normally
exit;
exit();
exit(0);
Removing the exit() has resolved the issue - this is strange as the PHP manual says that you can use exit() to exit the program "normally"
This is true for regular programs, but a queued job in Laravel doesn't follow the same lifecycle.
When the queue system processes a job, that job executes in an existing queue worker process. Specifically, the queue worker fetches the job data from the backend and then calls the job's handle() method. When that method returns, the queue worker runs some code to finalize the job.
If we exit from a job—by calling exit(), die(), or by triggering a fatal error—PHP stops the worker process running the job as well, so the queue system never finishes the job lifecycle, and the job is never marked "complete."
We don't need to explicitly exit from a job. If we want to finish the job early, we can simply return from the handle() method:
public function handle()
{
// ...some code...
if ($exitEarly) {
return;
}
// ...more code...
}
Laravel also includes a trait, InteractsWithQueue, that provides an API which enables a job to manage itself. In this case, we can call the delete() method from a job that uses this trait:
public function handle()
{
if ($exitEarly) {
$this->delete();
}
}
But the job is reported as failed. Why? Strangely, also in the worker.log, it says Processing twice for one job
As described above, the job could not finish successfully because we called exit(), so the queue system dutifully attempted to retry the job.
In my API I am using Redis to queue jobs dispatched from my controllers. This is how my controller looks:
class FormSubmissionsController extends Controller
{
/**
* @param StoreRequest $request
* @return \Illuminate\Http\JsonResponse
*/
public function store(StoreRequest $request, FormSubmission $formSubmission)
{
JobStore::dispatch($formSubmission, $request->get('tracking_code'), $request->get('form'));
return response()->json([
'id' => $formSubmission->id
]);
}
}
Everything works, and the only change I made to use Redis was a few config vars in the .env file. My question:
In another controller I want to use Amazon SQS for queued jobs. Any idea how to configure the queue, and how should I dispatch each job to a particular queue connection?
You can pick the connection used to dispatch a job with the onConnection() method:
JobStore::dispatch()->onConnection('sqs');
See https://laravel.com/docs/5.5/queues#dispatching-jobs for more details.
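For completeness, a sketch of the configuration side: config/queue.php already ships with an sqs connection entry, which you fill from environment variables (the env var names and region below are placeholders following the stock config, not values from the question):

```php
<?php
// config/queue.php - the 'sqs' connection (values come from your .env)
'sqs' => [
    'driver' => 'sqs',
    'key' => env('SQS_KEY'),
    'secret' => env('SQS_SECRET'),
    'prefix' => env('SQS_PREFIX'), // e.g. https://sqs.us-east-1.amazonaws.com/your-account-id
    'queue' => env('SQS_QUEUE', 'default'),
    'region' => env('SQS_REGION', 'us-east-1'),
],
```

You can also combine onConnection() with onQueue() to target a specific queue on that connection, e.g. JobStore::dispatch(...)->onConnection('sqs')->onQueue('emails');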