I'm running a cron job on a DirectAdmin server every minute, and each minute the connections/processes almost triple, from about 30 to about 90. They die right away and drop back down to about 30.
https://laravel.com/docs/5.8/scheduling#scheduling-queued-jobs
I'm using the scheduler cron job from the docs:
* * * * * cd /path-to-your-project && php artisan schedule:run >> /dev/null 2>&1
What would be creating all these connections?
I also have a problem when restarting the queue with php artisan queue:restart.
A sleeping process starts that never exits and eventually crashes the server, so I have to kill it manually.
Edit: here's a screenshot.
I think your job is throwing an error. Try php artisan [your job command] and check whether it runs fine. Besides, don't you have a cron table to record your job status?
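If the spike turns out to come from scheduled tasks being started on top of each other, one thing worth ruling out is overlapping runs stacking up. A minimal sketch of app/Console/Kernel.php, assuming Laravel 5.8 as in the linked docs (the command name here is a placeholder, not from your project):

<?php

namespace App\Console;

use Illuminate\Console\Scheduling\Schedule;
use Illuminate\Foundation\Console\Kernel as ConsoleKernel;

class Kernel extends ConsoleKernel
{
    protected function schedule(Schedule $schedule)
    {
        // withoutOverlapping() skips a run while the previous one is still
        // executing, so overlapping runs cannot pile up extra processes
        // and database connections.
        $schedule->command('emails:send')->everyMinute()->withoutOverlapping();
    }
}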
I am trying to run queue:work on shared hosting.
I set it up with a cron job that runs every minute, but it leads to a server error: after a few hours the project starts throwing "Too many connections". Here is my cron job command:
/usr/local/bin/php /home/username/public_html/system/artisan queue:work --timeout=60 >> /dev/null 2>&1
Please, can anyone share the best practice for starting queue:work? I also don't want it to stay down if it crashes or times out; I want it to be restarted automatically.
By default, when the queue:work command finishes executing the current jobs, it keeps waiting for new work and never exits. That is not what you want when launching it from cron.
You should use it with the --stop-when-empty option:
/usr/local/bin/php /home/username/public_html/system/artisan queue:work --stop-when-empty --timeout=60 >> /dev/null 2>&1
Then the worker process will exit when there is no work left, and will be restarted by cron a minute later.
Without this option, a new worker process is started every minute and keeps waiting for new work, but the previous ones never exit, so you end up with a lot of processes holding connections.
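If the project is already running the Laravel scheduler, the same idea can be expressed in the schedule() method of app/Console/Kernel.php instead of a raw crontab entry. A sketch, assuming a Laravel version where --stop-when-empty is available (5.8+):

protected function schedule(Schedule $schedule)
{
    // Drain the queue once a minute, exit when it is empty, and never start
    // a second worker while a previous one is still running.
    $schedule->command('queue:work --stop-when-empty --timeout=60')
             ->everyMinute()
             ->withoutOverlapping();
}

Either way, only one short-lived worker runs at a time, so idle workers cannot accumulate and hold connections open.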
I have the Laravel schedule command triggered in the crontab as:
* * * * * php /home/forge/site/artisan schedule:run
This is set via Forge.
Then, in app/Console/Kernel.php I'm triggering my job to run hourly:
$schedule->job(new GetRecentArtists())->hourly();
But it's still running every minute.
My understanding was that the artisan command needs to run every minute so that the job schedules can then be checked to see whether each job needs triggering as per its specific schedule.
Update
I've tried restarting the queue as suggested in the comment by Sahidul, and have confirmed that none of the built-in schedules (hourly, daily, etc) prevent it from running every minute.
When I remove the schedule from the crontab and run it via php artisan schedule:run, I get 'No scheduled commands are ready to run' but the task runs anyway.
This was a red herring. The log message that showed the job as "running" was in the __construct method of the job, rather than in the handle method.
The job class is constructed on every scheduler run and only added to the queue as per the schedule - so the constructor was being called every minute, but the job wasn't actually handled until the scheduled time.
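For illustration, this is roughly what the fix looked like in the job class; the real job body is omitted here, and only the placement of the log call matters:

<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\Log;

class GetRecentArtists implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public function __construct()
    {
        // Runs every time the scheduler evaluates the schedule, i.e. every
        // minute, which made it look as though the job itself was running.
    }

    public function handle()
    {
        // Runs only when a queue worker actually processes the job, i.e. on
        // the hourly schedule, so this is where the log call belongs.
        Log::info('GetRecentArtists running');

        // ... fetch recent artists here ...
    }
}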
I'm running vTiger 7.0 and I noticed on the first of the month, none of my invoices were created. I then took a look at the Scheduler and noticed that the "Last scan started" and "Last scan ended" fields show that none of the cron jobs had fired in days. The cron jobs are scheduled to fire in 15 minute intervals, with the exception of "RecurringInvoice" which runs every 12 hours.
If I visit /myvtigerinstall/vtigercron.php, the cron jobs will all fire but nobody wants to have to manually run cron jobs!
Has anyone had a similar issue before with vTiger?
I'm not exactly sure how to troubleshoot this error effectively and efficiently. I've checked permissions and they all seem to be in order.
If you've installed vTiger CRM on a dedicated server, you may have added a line to the Linux crontab so that the cron executes...
For instance:
* * * * * sh /home/vtiger/vtigerCRM5/apache/htdocs/vtigerCRM/cron/vtigercron.sh
If the cron doesn't launch automatically, it means it isn't being launched by the cron daemon...
vTiger's minimum cron frequency is 15 minutes.
Add the following line to the crontab:
*/15 * * * * wget -O- --spider "http://vtigercrmurl/vtigercron.php" >/dev/null 2>&1
or use one of the following free cron services:
https://www.easycron.com/
https://cron-job.org
I'm trying to make Laravel handle the email queue automatically, but I can't get the task scheduler working. The problem is as follows:
The jobs are already being added to the database table successfully, and in Kernel.php I have:
$schedule->command('queue:work')->everyMinute();
On the remote server, I've set up this cron command under the project folder:
* * * * * php artisan schedule:run >> /dev/null 2>&1
But the scheduler still refuses to work, as the jobs remain in the table. If I manually run
php artisan queue:work
then the email is sent.
What am I getting wrong here? Many thanks!
Firstly, I would suggest not using Laravel's command scheduler.
Pros and cons of using Laravel's task scheduler:
Pros
Your cron tasks get embedded in your code, so if you change your server you don't need to remember which cron tasks you had.
Cons
Let's say you have several cron tasks: task T1 runs every minute, task T2 runs every day, and task T3 runs every Tuesday. Just to support this, you end up running the scheduler every minute to check whether any task is due. Your queue also has to respect each job and its respective timing.
Instead, what you can do is create a separate artisan command for every task and run a cron job for each of them, as sketched below.
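For example, a dedicated command for the email queue might look like this; the class name and signature are made up for illustration, and it assumes a Laravel version that supports queue:work --stop-when-empty:

<?php

namespace App\Console\Commands;

use Illuminate\Console\Command;

class ProcessEmailQueue extends Command
{
    protected $signature = 'emails:process';

    protected $description = 'Process pending queued emails once and exit';

    public function handle()
    {
        // Work through whatever is currently on the queue, then exit so the
        // next cron run starts a fresh worker.
        $this->call('queue:work', [
            '--stop-when-empty' => true,
            '--timeout' => 60,
        ]);
    }
}

with a matching crontab entry:

* * * * * cd /path_to_your_laravel_project && php artisan emails:process >> /dev/null 2>&1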
But even if you want to keep doing what you were already doing, or just want to know why your cron task was not running, here is what you were forgetting: running the artisan command in your project directory.
* * * * * cd path_to_your_laravel_project && php artisan schedule:run
I am using PHP's file_get_contents function to get some data from other websites. I also use a cron job to run that script automatically. The cron job works fine, but sometimes it fails to run.
This is my cron command (in cPanel):
*/10 * * * * /usr/bin/php -q public_html/include/imp.php > /dev/null 2>&1
This command should run every 10 minutes, but sometimes it doesn't run for as long as 48 hours. It's fine when I run the PHP script manually.
Thank you.
I had a similar issue: the script ran fine when the URL was visited directly, and the cron would only run sometimes, seemingly at random. It turned out to be a host server resource issue: HostGator won't run a cron job while I'm connected via SSH with PuTTY on two computers.
From their help pages:
"SSH access is limited to two simultaneous connections on Shared and Reseller plans.
Note: Any cron jobs configured will require one of these sessions to be available in order to run, since cron jobs run under the same shell as SSH."
When I close the SSH connection, the cron job immediately runs.
Maybe your host has a similar rule?
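If the host limit turns out not to be the cause, it may also help to stop throwing the output away and make the script record its own failures, so the silent runs leave a trace. A rough sketch of that idea for imp.php; the URL and log path are placeholders, not from the question:

<?php
// imp.php - fetch remote data, but record failures instead of failing silently.

$url     = 'https://example.com/data-source';   // placeholder URL
$logFile = __DIR__ . '/imp-cron.log';

$data = @file_get_contents($url);

if ($data === false) {
    $error   = error_get_last();
    $message = $error ? $error['message'] : 'unknown error';
    file_put_contents($logFile, date('c') . " fetch failed: $message\n", FILE_APPEND);
    exit(1);
}

file_put_contents($logFile, date('c') . ' fetch ok, ' . strlen($data) . " bytes\n", FILE_APPEND);

// ... process $data as before ...

Checking that log (or temporarily pointing the cron's > /dev/null at a file instead) will at least show whether the script ran and failed, or never ran at all.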