I have added some jobs to a queue in Laravel. However, I forgot to put $job->delete() in the function, and there is an error in my function. This means the job never ends: it keeps getting released back onto the queue and keeps erroring in my log file. How can I delete it from the command line?
I am using beanstalkd for my queuing.
I am using Redis instead of Beanstalkd, but this should be the same for both. Restarting Redis doesn't solve the problem. I looked at RedisQueue in the Laravel 4.2 API docs and found:
public Job|null pop(string $queue = null)
//Pop the next job off of the queue.
This is the same if you look at BeanstalkdQueue.
I threw it into app/routes.php inside dd*, loaded that page, and voila.
Route::get('/', function() {
dd(Queue::pop());
#return View::make('hello');
});
NOTE: Reload the page once per queue.
The job was pulled off the queue. I would like to see a cleaner solution, but this has worked for me more than once.
*dd($var) = Laravel's dump-and-die function = die(var_dump($var))
Edit 1: For Redis
The above obviously isn't the best solution, so here is a better way. Be careful!
FLUSHDB - Delete all the keys of the currently selected DB. This command never fails.
For Redis, use FLUSHDB. This flushes the selected Redis database, not Laravel's SQL database - but note that it deletes every key in that Redis DB, including any cache or session data stored there. In the terminal:
$ redis-cli
127.0.0.1:6379> FLUSHDB
OK
127.0.0.1:6379> exit
Restart Beanstalk (note: this clears pending jobs only if beanstalkd is not persisting them to a binlog via the -b option). On Ubuntu:
sudo service beanstalkd restart
I made an artisan command which will clear all the jobs in your queue. You can optionally specify the connection and/or the pipe.
https://github.com/morrislaptop/laravel-queue-clear
Important note: This solution works only for Beanstalkd.
There are two solutions:
1- From Your PHP Code
To delete jobs programmatically, you can do this:
// Queue the job first (YourJobProcessor is a class with a fire($job, $data) method)
$res = Queue::later(5, YourJobProcessor::class, $data, 'queue_name');
// Peek at the job you just pushed, using the id returned by Queue::later()
$job = Queue::getPheanstalk()->useTube("queue_name")->peek($res);
// Delete the job from the tube
$res = Queue::getPheanstalk()->useTube("queue_name")->delete($job);
If everything went well, the job will not execute; otherwise it will run after 5 seconds.
2- From Command Line (Linux and Mac only)
From the command line you can use beanstool.
For example, if you want to delete 100 ready jobs from the queue_name tube you can do the following:
for i in {1..100}; do beanstool delete -t queue_name --state=ready; done
For Redis users, instead of flushing, I ran this command with redis-cli on the Redis instance holding the queued jobs:
KEYS *queue*
and then deleted whichever keys came back:
DEL queues:default queues:default:reserved
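If many keys come back, the same cleanup can be done in one pass. This is a sketch, assuming the default queues: key prefix and that redis-cli is pointed at the instance/database holding the jobs (xargs -r is GNU-specific; drop it on macOS):

```shell
# Scan for Laravel queue keys and delete whatever comes back.
# SCAN is non-blocking, unlike KEYS, so it is safer on busy instances.
redis-cli --scan --pattern 'queues:*' | xargs -r redis-cli DEL
```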
The only way I could do it was to restart my computer. I could not find a way to delete a single job.
I've used this php-based web admin console in the past.
Otherwise, I believe you'll find yourself using Terminal + telnet, although I can't find any documentation for deleting via telnet (just viewing a list of jobs in the queue).
It seems that most articles tell you to use your code + library of choice and loop over the queued jobs to delete them in this situation.
Here is Laravel 5.1 compatible command, which allows you to clear Beanstalkd queue. The command takes queue name as argument ('default' by default). Do not forget to register it in app/Console/Kernel.php
Related
I'm looking for a sustainable solution to fetch data every x seconds (let's say 20) and store it in a relational database using PHP. After doing some research I found a few options:
1) Cronjobs (with shell scripts)
See https://askubuntu.com/questions/800/how-to-run-scripts-every-5-seconds for more information. This basically comes down to run a shell script (looping/sleeping)
This doesn't feel right, as I could not catch exceptions and/or race conditions might occur. Also, cron jobs themselves are not made for this kind of task.
2) Web-worker (with queued jobs)
Laravel provides a queue worker that can process new jobs (asynchronously) as they are pushed onto the queue. I could push multiple (say, a lot of) jobs onto the queue at once, which should be processed every x seconds consecutively.
This sounds like a more robust solution, as I could catch exceptions and make sure the worker is running (using observers). The downside: it's slower and might be over-engineered.
3) Web socket
I could use node.js to run a websocket client like socket.io and implement some kind of timing mechanism to store the data every x seconds.
This solution feels odd, as I was taught that sockets are used to push data to clients (realtime); I have never seen them used to insert data.
All help is appreciated.
What you are looking for are artisan commands.
You would start by creating a command:
php artisan make:command FetchData
This creates a FetchData class. In this class you can edit the handle function.
public function handle()
{
//fetch your data here
}
You also need to edit the $signature variable.
protected $signature = 'fetch:data';
The next step is to register the command in the Kernel.php in the Console namespace.
You need to add your newly created FetchData class to the $commands array.
protected $commands = [
Commands\FetchData::class
];
You can now call this command from the console with php artisan fetch:data
After you registered your command in the Kernel.php you can schedule this command.
You start by adding the following line to your crontab on your server (type crontab -e):
* * * * * php /path-to-your-project/artisan schedule:run >> /dev/null 2>&1
You can now add the following command to the schedule function in Kernel.php:
$schedule->command('fetch:data')->everyThirtyMinutes();
There is no built-in option for a job to run every twenty minutes, so in this example I chose thirty minutes. You can check the available options here.
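As a sketch of a workaround (hedged: this relies on the scheduler's cron() method, which accepts a raw five-field cron expression), twenty minutes can be expressed directly:

```php
// In the schedule() function of app/Console/Kernel.php:
// a raw cron expression firing at minute 0, 20 and 40 of every hour.
$schedule->command('fetch:data')->cron('*/20 * * * *');
```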
Hello, I've set up queues with Laravel 5.1.
I perform the HTTP request (Post), this is routed to the respective controller.
Controller executes the following:
//try saving the model
try {
    $lg = new User();
    $lg->fill($request->all()); // fill() expects an array, not the Request object itself
    $lg->save();
} catch (QueryException $e) {
    Log::error($e->getCode());
}
//creates Job instance
$job = new ProcessUser($lg);
//dispatching job
$queue_id = $this->dispatch($job);
Also, if I
dump($queue_id);
instead of getting the ID of the queued job, I get back 0.
Everything works as expected on my local dev env with Homestead.
But on production, where I have CentOS...
I expected the job just to be queued. Instead it seems it's processed right away (I can never see the job inserted in my queue).
On my server (CentOS 6) I installed supervisor.
And it is stopped:
$ service supervisord status
supervisord is stopped
And also... I highly doubt it could work, since I didn't configure it in
/etc/supervisor.conf
What am I missing?
How can I check how the queue is being processed?
I have never issued any artisan command like
$ php artisan queue:*
Sorry all,
I realised I hadn't configured the .env file properly: it should be
QUEUE_DRIVER=database
but it was set to
QUEUE_DRIVER=sync
I didn't know that the "sync" driver processes jobs immediately (synchronously) instead of queuing them...
I have a table users that I need to update continuously, so once the update finishes, I would like to relaunch the command directly.
Currently I'm using a cron that launches itself each minute with Laravel ($schedule->command('update:users')->everyMinute();), but I'm losing time if the job is quicker than one minute, or I will overload my server if it takes more than one minute.
I was thinking of maybe using a queue, and once the script terminates, relaunching itself, like this:
// Do My stuff
Queue::push(new UpdateUsers($));
But if the script crashes, it will not reload itself, and I need to launch it at least once. I know that I could use the pcntl_fork function, but I would like a turnkey solution with Laravel. How should I do this?
I would suggest running a command from the CLI.
In the command, place a
while (true)
loop so it will run forever. After you have created this script, you can run it with supervisord.
This service runs the command you give it, and when it fails it relaunches it automatically. Just be aware that after X failures it will stop retrying, depending on how you configured it.
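A minimal sketch of what such a command's handle() method could look like (class and command names here are placeholders, not from the original post; the sleep() keeps a crashing job from hot-looping):

```php
// In a hypothetical app/Console/Commands/RunUpdatesForever.php
public function handle()
{
    while (true) {
        try {
            $this->call('update:users'); // re-run the existing artisan command
        } catch (\Exception $e) {
            \Log::error($e->getMessage()); // log the failure and keep looping
        }
        sleep(1); // brief pause between runs
    }
}
```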
Example for conf file in:
/etc/supervisord/conf.d/my_infinite_script.conf
and contents could be:
[program:laravel_queue]
command=php artisan your_command:here
directory=/path/to/laravel
autostart=true
autorestart=true
stderr_logfile=/var/log/your_command.err.log
stdout_logfile=/var/log/your_command.out.log
I've used the approach suggested by Tzook Bar Noy in some cases, but have also used a slightly uglier method which avoids scripts looping forever, in case that might cause problems. This can be called every minute from a cronjob:
$runForSeconds = 55;
$runMinute = date('i');
do {
....code....
} while (date('i') == $runMinute && intval(date('s')) < $runForSeconds);
But the best solution would be to use a Jobs queue and run that using supervisor and:
command=php artisan queue:listen
I have created a new Job in a laravel 5.1 app, running in Homestead VM. I've set it to be queued and have code in the handle method.
The handle() method previously expected a parameter to be passed, but it is no longer required, and I've removed the parameter from the method signature.
However, when the queue runs the job I get an error saying:
[2015-06-17 14:08:46] local.ERROR: exception 'ErrorException' with message 'Missing argument 1 for Simile\Jobs\SpecialJob::handle()' in /home/vagrant/Code/BitBucket/simile-app/app/Jobs/SpecialJob.php:31
line 31 of that file is:
public function handle()
It's no longer expecting any parameters, unless there's a default one that's not documented.
Now ANY changes I make, including commenting out ALL the content in the Job file, are not seen when I run the queue. I still get the same error.
I've tried restarting nginx, php5-fpm, supervisor, and beanstalkd, and running: artisan cache:clear, artisan clear-compiled, artisan optimize, composer dumpautoload.
Nothing works.
The only way I can get Laravel to see any updates to the Job file is to restart the VM: vagrant halt, then vagrant up.
The job is triggered in a console command like this:
$this->dispatch(new SpecialJob($site->id));
Here is the full code of the SpecialJob.php file:
http://laravel.io/bin/qQQ3M#5
I tried creating another new Job and tested it; I get the same result.
All other non-Job files update instantly, no issue. It's just the Job files, as if an old copy is being cached somewhere I can't find.
When running the queue worker as a daemon, you must tell the worker to restart after a code change.
Since daemon queue workers are long-lived processes, they will not pick up changes in your code without being restarted. So, the simplest way to deploy an application using daemon queue workers is to restart the workers during your deployment script. You may gracefully restart all of the workers by including the following command in your deployment script:
php artisan queue:restart
I have an event that is fired when I receive certain notifications. I want to Queue the event so that they aren't all fired at the same time but are Queued up as I receive them and then fired after the previous event completes. I want to know the best way to do this.
Edit: Just for anyone in the future, setting up the database queue driver is very straightforward and simple. You run php artisan queue:table and change the driver to 'database'. My problem was that my app wasn't recognizing my QUEUE_DRIVER setting in my .env file for some reason.
Laravel 5 has its own way of dealing with queued jobs, but you can still use the options that were available in Laravel 4. I've personally been curious as to how it all works, so I threw together a blank project and ran a couple of queued jobs with a little help from the docs. This may not be a full answer, but I hope it helps you on your way.
First you will want to set your config to use the database queue driver. This can be done in config/queue.php, or, for me, it was a matter of going to the .env file and setting QUEUE_DRIVER=database.
Then you want to set up the database table to hold the queued jobs. You can do this by running an artisan command: php artisan queue:table. This creates the migration; then create the table by running php artisan migrate, and you'll have your jobs table in your DB.
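The two commands from that step, as run from the project root:

```shell
php artisan queue:table   # generates the migration for the jobs table
php artisan migrate       # runs it, creating the jobs table in your DB
```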
Following that, you'll want to set up a queued job, which comes in the form of a Command. For example, I'll set up a job that writes some text to the log file. You can create jobs or commands using an artisan command; here's what I did to create a command: php artisan make:command WriteToLog --queued. And here's what my command class looks like after adding a little code to get it to write to the log file...
app/Commands/WriteToLog.php
use App\Commands\Command;
use Illuminate\Queue\SerializesModels;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Contracts\Bus\SelfHandling;
use Illuminate\Contracts\Queue\ShouldBeQueued;
class WriteToLog extends Command implements SelfHandling, ShouldBeQueued {
use InteractsWithQueue, SerializesModels;
protected $secs;
/**
* Create a new command instance.
*
* @return void
*/
public function __construct($secs)
{
$this->secs = $secs;
}
/**
* Execute the command.
*
* @return void
*/
public function handle()
{
\Log::info('Writing to the log in ' . $this->secs);
}
}
After creating a command, to test it out I wrote a route in my routes file ...
app/Http/routes.php
Route::get('/', function(){
// some time to delay the job
$fiveSecs = \Carbon\Carbon::now()->addSeconds(5);
$tenSecs = \Carbon\Carbon::now()->addSeconds(10);
// adds job to queue
Queue::later($fiveSecs, new App\Commands\WriteToLog('5 secs'));
Queue::later($tenSecs, new App\Commands\WriteToLog('10 secs'));
return 'All done';
});
Before we hit the route, we want to listen for any jobs in order to process them; just run php artisan queue:listen. Then go to the route in your browser; after hitting it, the console shows:
$ php artisan queue:listen
Processed: Illuminate\Queue\CallQueuedHandler@call
Processed: Illuminate\Queue\CallQueuedHandler@call
And if I check my log file I see the following:
[2015-05-19 19:25:08] local.INFO: Writing to the log in 5 secs
[2015-05-19 19:25:10] local.INFO: Writing to the log in 10 secs
Not exactly 5 and 10 seconds apart but hopefully you get the idea!
For me this is really just the tip of the iceberg; queued jobs are something very powerful in Laravel. I highly recommend checking out the docs here: http://laravel.com/docs/5.0/queues and here: http://laravel.com/docs/5.0/bus
You can also fire events from your queued jobs or queue an event handler, see here for more details: http://laravel.com/docs/5.0/events#queued-event-handlers
Laravel makes queues pretty straightforward, but a bit long to explain fully here. Check out these guides:
If you are using forge, it is really painless:
https://mattstauffer.co/blog/laravel-forge-adding-a-queue-worker-with-beanstalkd
If you aren't using forge, it is still pretty ok: http://fideloper.com/ubuntu-beanstalkd-and-laravel4