I'm currently working on a simple reminder Laravel app for Heroku. I've written my API and everything works well. Now I want to run a scheduled task on Heroku.
Basically I'm looking for something to execute the following command on a separate dyno:
php EmailScheduler.php
or
php artisan schedule:emails
This command should be a long-running process that executes the same task over and over, e.g. once a minute.
In this task I want to query my database, fetch all reminders that are due, and send them out as emails, so I need access to my application's business logic, such as the Eloquent models.
I know about the Scheduler add-on and about various "cron" solutions, but I would like a PHP-only solution: just a long-running script or task that sleeps most of the time and wakes up every minute or so.
In Ruby (Rails), for example, I could require the environment, thereby gaining access to my business logic, and then fire up a timer task using the concurrent-ruby library.
I already tried to require the
bootstrap/autoload.php
bootstrap/app.php
files, but it seems I have to do more than just that, such as booting the application.
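For reference, the skeleton I've been experimenting with looks roughly like this (just a sketch; the Reminder model and due_at column are placeholders):

// reminder-worker.php - what I have tried so far (standard Laravel 5.x layout assumed)
require __DIR__.'/bootstrap/autoload.php';
$app = require_once __DIR__.'/bootstrap/app.php';

while (true) {
    // Here I would like to use my Eloquent models, e.g. something like
    // App\Reminder::where('due_at', '<=', Carbon\Carbon::now())->get(),
    // and send an email for each one, but the application does not seem
    // to be fully booted at this point.
    sleep(60);
}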
How can I solve this using the Laravel framework and possibly other third-party libraries?
Task scheduling in Laravel is done by creating a cron job. By defining a cron job, you can schedule a task to be executed at a defined interval.
Follow the official Laravel documentation for scheduling: https://laravel.com/docs/5.3/scheduling
In 2017, you indeed couldn't use cron on Heroku. As of 2021, you can use Cron To Go, a Heroku add-on, to set up a cron job that runs on one-off dynos in Heroku. Simply follow the guide here to run Laravel's scheduler once a minute.
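A minimal sketch of what the scheduler entry could look like, assuming your app has a schedule:emails Artisan command as mentioned in the question:

// app/Console/Kernel.php (sketch)
protected function schedule(Schedule $schedule)
{
    // Cron To Go (or any cron) runs `php artisan schedule:run` once a minute,
    // and the scheduler then fires the command below on that cadence.
    $schedule->command('schedule:emails')->everyMinute();
}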
Related
I have a Product model with id, name, and price.
The price value is stored in an external API and I need to fetch it every minute in order to update it in the database.
Looking through the Laravel documentation I found two ways to implement this:
Create an artisan command (https://laravel.com/docs/8.x/artisan) and add it to task scheduling (https://laravel.com/docs/8.x/scheduling#scheduling-artisan-commands)
Create a job (https://laravel.com/docs/8.x/queues) and add it to task scheduling (https://laravel.com/docs/8.x/scheduling#scheduling-artisan-commands)
First of all, is there any other approach I should take into consideration?
If not, which one of the above would be the best approach, and why is it right for my use case?
As per my comments on one of your previous questions on this topic, whether you use a queue or not depends on your use case.
An Artisan command is a process that executes once and performs a task or tasks and then exits when that task is complete. It is generally run from the command line rather than through a user action. You can then use the task scheduling of your command's host operating system (e.g. a CRON job) to execute that command periodically. It will faithfully execute it when you schedule it to be done.
A Queued job will execute when the Job turns up next in the queue, in priority order. Let's say you send your API call (from your other post) to the queue to be processed. Another system then decides it needs to send out emails urgently (with a higher priority). Suddenly, your Job, which was next, is now waiting for 2000 other Jobs to finish (which might take a half hour). Then, you're no longer receiving new data until your Job executes.
With a scheduled job, you have a time critical system in place. With queues, you have a "when I get to it" approach.
Hope this makes the difference clearer.
With Laravel it is a lot easier to use the built-in scheduler. You only have to add one entry to the crontab, which runs the command php artisan schedule:run on your project every minute. After that you don't have to think about configuring the crontab on the server; you just add commands to the Laravel scheduler and they will work as expected.
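The crontab entry typically looks like this (the project path is a placeholder):

* * * * * cd /path-to-your-project && php artisan schedule:run >> /dev/null 2>&1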
You should probably use cron-based task scheduling, which is the first approach you mentioned.
For this type of use case, commands are usually the easiest and cleanest approach.
There are a few things to do in order to make it work as expected (see the sketch after this list):
Create a new command that takes care of hitting the endpoint and storing the retrieved data in the database
In the Kernel.php file, register your command and how frequently it should run (every minute)
Run php artisan schedule:run
You can read more about how to create it here:
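For illustration, a minimal sketch of such a command and its scheduler entry (the command name, API endpoint, and column names are assumptions):

// app/Console/Commands/FetchProductPrices.php (sketch)
namespace App\Console\Commands;

use App\Models\Product;
use Illuminate\Console\Command;
use Illuminate\Support\Facades\Http;

class FetchProductPrices extends Command
{
    protected $signature = 'products:fetch-prices';
    protected $description = 'Fetch current prices from the external API and update the products table';

    public function handle()
    {
        foreach (Product::all() as $product) {
            // Hypothetical endpoint; replace with the real price API
            $response = Http::get('https://api.example.com/prices/'.$product->id);
            $product->update(['price' => $response->json('price')]);
        }
    }
}

// app/Console/Kernel.php (sketch)
protected function schedule(Schedule $schedule)
{
    $schedule->command('products:fetch-prices')->everyMinute();
}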
I have a cron job that runs every 5 hours. It calls a PHP script, and this script makes a call to an external API to sync some data.
The problem is that sometimes I get a timeout from the API and the job fails.
Are there any mechanisms to make crontab automatically retry or recover failed jobs?
I have tried adding an extra job and calling it manually in case of failures.
What is the best approach to do so?
Cron only runs a job once at a specific time or every minute/hour/day etc. It doesn't check the return code. So it's not that easy peasy lemon squeezy at all...
In my opinion you have a few options for how to do it:
Create some kind of scheduler where you can re-queue your CRON job if it fails; in this case you will need one more CRON job to read the scheduler and run the proper command. The scheduler can be database / file / NoSQL based. In the scheduler you can have a flag like (bool) executed, which lets the scheduler know which tasks are already done.
Use queues (e.g. RabbitMQ) so the task can re-queue itself when it fails.
Use a framework. I'm using Symfony to manage my own commands and execute them based on a database (check the second link below), and I also use the enqueue/enqueue-bundle package to manage queues in Symfony.
I think if you are not so advanced with PHP, I'd recommend going for a self-made scheduler based on a database (MySQL / PostgreSQL / NoSQL) with Symfony (check the second link below). In this case you just have to SELECT all non-executed records (commands) from the database and run them, as sketched below.
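A rough sketch of that idea in plain PHP (the table and column names, and the database credentials, are assumptions):

// retry-scheduler.php (sketch) - run this from one extra CRON entry
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');

// Pick up every command that has not finished successfully yet
$pending = $pdo->query('SELECT id, command FROM scheduled_commands WHERE executed = 0');

foreach ($pending as $row) {
    exec($row['command'], $output, $exitCode);

    if ($exitCode === 0) {
        // Mark it as done only after a successful run, so failures are retried next time
        $stmt = $pdo->prepare('UPDATE scheduled_commands SET executed = 1 WHERE id = ?');
        $stmt->execute([$row['id']]);
    }
}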
Further reading
Laravel - Queues, retrying failed jobs
Symfony - calling other commands from a command
Queues package for PHP (incl. Symfony)
enqueue/enqueue-bundle
What you can do is something like this:
https://crontab.guru/#1_0-23_13_*_*
1 0-23 13 * *
Start the job at minute 1 past every hour on the 13th of each month:
"At minute 1 past every hour from 0 through 23 on day-of-month 13."
...then in your code you'd have some logic to detect if the process/script already ran correctly... if yes, skip the run attempt; otherwise let it run and then set a flag to check against on the subsequent run attempt.
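A rough sketch of that flag logic (the flag file path and the 5-hour interval are placeholders):

// sync.php (sketch) - called by cron more often than strictly needed
$flag = '/tmp/last_successful_sync';

// Skip this attempt if the previous sync already succeeded recently
if (file_exists($flag) && filemtime($flag) > strtotime('-5 hours')) {
    exit(0);
}

// ... call the external API and sync the data here ...

// Mark success only after the sync actually completed
touch($flag);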
Hope you get the idea.
You can use supervisord:
supervisord website
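A minimal supervisord program entry could look like this (the program name and script path are placeholders):

[program:api-sync]
command=php /path/to/sync-script.php
autostart=true
autorestart=true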
Or handle API timeout in code.
At the moment I have a shared hosting server.
In my app I want to use Laravel's queue system, but I can't keep the command php artisan queue:work running because I can't install Supervisor.
With a little bit of effort, I can move my app to a VPS, but I don't have much experience with servers and I'm a little scared that my app will be offline a lot of the time.
Considering my lack of experience on the server side, I have these questions:
Is it OK to use Laravel queues with cron jobs? Can it break in any way?
For this problem alone, should I upgrade to a VPS or should I stay on this shared hosting server (I have SSH access here)?
Quick answer: you should not use Laravel queues without a process monitor such as Supervisor.
It all depends on what you want to achieve, but an alternative to queues would be using the Laravel scheduler: you can trigger the scheduler with a cron task (every minute, for example) and dispatch jobs easily.
And if you really want to use the queues, a solution could be to add your jobs to the queue and process them with a cron task that runs the following command every minute: php artisan queue:work. But I would recommend the previous solution.
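If you do go the cron route, it's worth making the worker exit once the queue is empty so the processes don't pile up; a sketch of that crontab entry (the --stop-when-empty option exists on recent Laravel versions, and the project path is a placeholder):

* * * * * cd /path-to-your-project && php artisan queue:work --stop-when-empty >> /dev/null 2>&1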
I feel a little bit silly for asking this question but I can't seem to find an answer on the internet for this problem. After searching for several hours I figured out that on a linux server you use Supervisor to run "php artisan queue:listen" (either with or without daemon) continuously on your website to handle jobs pushed to the queue. This is all well and good, but what if I want to do this on a Windows Azure web app? After searching around the solutions I found were:
Make a cron job to run "php artisan queue:listen" every minute (or every X minutes); I really dislike this solution and want to avoid it, especially if the site gets more traffic;
Add a WebJob that runs "php artisan queue:listen" continuously (the problem here is I don't know how to write the script for the WebJob...);
I want to ask you guys for help in figuring out which of these is the correct solution, whether there is a better one, and if the WebJob is the best option, how do I write the script for it? Thanks in advance.
In short, Supervisor is a modern alternative to nohup (no hang up) with a few other bits and pieces tacked on. There are other resources that can keep a task running in the background (as a daemon), and the solution I use for Windows-based projects (very few, tbh) is Forever, which I discovered via: https://stackoverflow.com/a/18226392/5912664
C:\myprojectroot > forever -c php artisan queue:listen --queue=some_nice_queue --tries=3
How?
Install Node for Windows, then install Forever with npm:
C:\myprojectroot > npm install -g forever
If you're stuck getting Node running on Windows, I recommend the Windows package manager Chocolatey:
https://chocolatey.org/packages?q=node
Be sure to check for any logfiles that Forever creates, as I had left one long enough to consume 30Gb of disk space!
For Azure you can add a new WebJob to your web app and upload a .cmd file containing a command like this:
php %HOME%\site\wwwroot\artisan queue:work --daemon
Then define it as a triggered WebJob with a 0 * * * * * CRON frequency (every minute).
That worked for me.
Best.
First of all you cannot use a WebJob with Laravel on Azure. The Azure PHP Web App is hosted on Linux. WebJobs do not work with Linux at this moment.
The best way to do cron jobs in Laravel on Azure is to create an Azure Logic App. You use the Recurrence trigger and then an HTTP action to send a POST request to your Laravel web app. You use this periodic heartbeat to run whatever actions you need to do. Be sure to add authentication to your POST request.
The next problem you will have is that the POST will be synchronous, so the work you are doing cannot be extensive, or your HTTP request will time out or you will hit the time limit on PHP scripts (60 seconds).
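A sketch of what the receiving side in Laravel could look like (the route, header name, and config key are assumptions, not anything the Logic App requires):

// routes/api.php (sketch)
use Illuminate\Http\Request;
use Illuminate\Support\Facades\Artisan;
use Illuminate\Support\Facades\Route;

Route::post('/heartbeat', function (Request $request) {
    // Simple shared-secret check; use whatever authentication fits your app
    abort_unless(
        $request->header('X-Heartbeat-Token') === config('services.heartbeat_token'),
        403
    );

    // Do the periodic work here, keeping it well under the PHP time limit,
    // e.g. let Laravel's scheduler decide what is due right now
    Artisan::call('schedule:run');

    return response()->json(['status' => 'ok']);
});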
The solution is not Laravel Jobs because here again you need something running in the background to process the queues.
The solution is also not PHP threads. The standard Azure PHP Web App does not support PHP Threads. You can of course build your own Web App and enable PHP threads, but this is really swimming upstream.
You simply have to live with synchronous logic. So the work you are doing with the heartbeat should take no more than about 60 seconds.
If you need more extensive processing then you really need to off load it to another place: another Web App, an Azure Function, etc.
But why not do that in the first place? The reason is cost and complexity. If you have something simple...like a daily report...you simply connect the report to the heartbeat and all the facilities for producing the report are right there in Laravel. To separate the daily report into its own container would require setup and the Web App it runs in would incur costs...not worth it in my view for something simple.
We have a website running on multiple Azure instances - typically between 2 and 5.
There is a PHP script I would like to schedule to run every few minutes on each instance. (It just makes a local copy of data from a system that couldn't handle the load from all our users hitting it in real-time.)
If it were just one instance, that would be easy - I'd use Azure Scheduler to call www.example.com/my-scheduled-task.php every 5 minutes.
But the script needs to run on each instance, so that every instance has a reasonably up-to-date copy of the data. How would you achieve this? I can't work out if it's something in Azure Scheduler, or if I should be looking at some sort of startup script?
You can use a continuous webjob for that.
Just tweak your PHP script to have a loop and add a sleep of a few minutes between runs of your code.
The continuous WebJob will run on all of your instances, and even if something fails it will be brought back up.
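A minimal version of that loop might look like this (the wrapper script name and the interval are placeholders; my-scheduled-task.php is the script from the question):

// run-continuous.php (sketch) - started by the continuous WebJob on every instance
while (true) {
    // Refresh this instance's local copy of the external data
    include __DIR__ . '/my-scheduled-task.php';

    // Wait a few minutes before the next refresh
    sleep(300);
}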
In my experience, a PHP WebJob running on each of your web app instances is a good solution, as @AmitApple said. However, I think you can try using a scheduled WebJob with a CRON expression to ensure a start time, rather than a continuous one with a sleep. And please make sure the script can complete within the interval.
You can refer to the section Create a scheduled WebJob using a CRON expression of the doc Run Background tasks with WebJobs to learn how to get started.
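For example, a settings.job file that runs the script every 5 minutes could look like this (note that Azure WebJob CRON expressions include a seconds field; the 5-minute interval is just an illustration):

{
  "schedule": "0 */5 * * * *"
}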
Please see the note in the section Create a continuously running WebJob: https://azure.microsoft.com/en-us/documentation/articles/web-sites-create-web-jobs/#CreateContinuous.
Note:
If your web app runs on more than one instance, a continuously running WebJob will run on all of your instances. On-demand and scheduled WebJobs run on a single instance selected for load balancing by Microsoft Azure.
For Continuous WebJobs to run reliably and on all instances, enable the Always On configuration setting for the web app, otherwise they can stop running when the SCM host site has been idle for too long.