Execute long-running PHP scripts via cron jobs - php

I have a PHP script that updates a database. It takes around 5 minutes to complete. I want to set it to run automatically via a cron job on cPanel, but it doesn't execute via cron.
Is there an execution time limit on cron jobs, or is there another effective way to automate this process on the server?

set_time_limit() might help you; see the PHP documentation for that function.
You can use Supervisor to keep command-line tasks running.
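For reference, a minimal supervisord program definition might look something like this (the program name, paths, and log locations are placeholders, not taken from the question):

[program:db-updater]
command=/usr/bin/php /home/user/scripts/update.php
autostart=true
autorestart=true
stdout_logfile=/var/log/db-updater.out.log
stderr_logfile=/var/log/db-updater.err.log

With autorestart=true, Supervisor restarts the script whenever it exits, which is what makes it suitable for keeping long-running tasks alive.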

Related

Perform Cron Job using PHP once

I am making a game, and I have to make a cron job which will execute another PHP file after 60 seconds, once only. How can I perform that using PHP?
Better to use at instead of cron for such jobs (that is, if you can't do it inside your script anyway).
The at command schedules a command to be run once at a particular time.
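As a rough sketch of that approach from PHP itself (the job path is a placeholder, and the atd daemon must be installed and running): at's granularity is minutes, so "after 60 seconds" becomes now + 1 minute.

<?php
// Hypothetical sketch: queue a one-time run of job.php via the at daemon.
// at reads the command to run from stdin; the time spec is its argument.
exec('echo "php /path/to/job.php" | at now + 1 minute 2>&1', $output, $status);
if ($status !== 0) {
    error_log('at scheduling failed: ' . implode("\n", $output));
}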

Execute PHP script sequentially and continuously

I have developed a script which should execute continuously and sequentially. On creating a cron job for this script, it keeps executing asynchronously.
1) I kept a while loop in the script and thought of executing it once, so I used @reboot to run the script at startup; but if Apache crashes, the script will not start executing again on its own.
2) Using a * * * * * cron entry, the script executes, but multiple instances spawn and the cron runs eventually all overlap on the server.
I have run out of ideas for how to execute a server script continuously and sequentially, even if the Apache server restarts.
You're asking for:
a script which should execute continuously and sequentially
That's the definition of a daemon. You can use upstart to easily create a daemon from your PHP code.
Here is a good article explaining how to create a daemon with upstart and node.js (it is easy to adapt to a PHP script): http://kvz.io/blog/2009/12/15/run-nodejs-as-a-service-on-ubuntu-karmic/
cron is for repeating jobs on a timed basis (minimum interval: 1 minute). It's intended for scripts that eventually exit and need to be restarted from scratch. It sounds like your script is essentially trying to be a daemon: started once, then left running permanently.
Instead of using cron to repeatedly start new copies of the script, simply start your script ONCE via an init script, e.g. /etc/rc.local.
There are some ideas for checking whether a script is already running in this question, and you could have a cron run every couple of minutes to ensure it's running and start it if not, as sketched below.
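A common pattern for that watchdog setup, sketched here with placeholder paths, is to guard the worker with an exclusive lock so that extra copies started by cron exit immediately:

<?php
// Hypothetical sketch: a single-instance worker loop guarded by flock().
// A watchdog cron entry can safely try to start this every few minutes;
// if another copy already holds the lock, the new one exits at once.
set_time_limit(0);

$fp = fopen('/tmp/worker.lock', 'c'); // lock file path is a placeholder
if (!flock($fp, LOCK_EX | LOCK_NB)) {
    exit; // another instance is already running
}

while (true) {
    // ... your sequential task goes here ...
    sleep(5); // yield between iterations instead of busy-looping
}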

Can a PHP script run on my server the whole time?

I am pretty new to PHP, but I want to create a website uptime checker which should email me whenever my websites are down.
This means the script would have to run non-stop. Can this be done in PHP? How?
No. You want to run a script at a set interval instead. Cron is what you want.
Here's a tutorial to get started. http://net.tutsplus.com/tutorials/php/managing-cron-jobs-with-php-2/
You could use a cron task to schedule your script to run at regular intervals. Or, if you are hosting on Windows, use the Windows Task Scheduler.
Normally there's an execution time limit that will stop scripts from running longer than a specified time. A better solution would be to set up a cron job that executes the script periodically.
You can do it with a cron job (a script that checks your website) set to run at regular intervals.
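As an illustration only (the URL and email address are placeholders, and PHP's mail() must be configured on the host), a minimal checker run from cron every few minutes could be:

<?php
// Hypothetical sketch: cron-driven uptime check.
// get_headers() returns false when the connection fails outright.
$url = 'https://example.com/';
$headers = @get_headers($url);
if ($headers === false || strpos($headers[0], '200') === false) {
    mail('you@example.com', 'Site down', "No 200 response from $url");
}

Scheduled with a crontab entry such as */5 * * * * /usr/bin/php /path/to/check.php, this emails you whenever the site stops answering.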

How to assign cron jobs through PHP

Is it possible to schedule some tasks in PHP to run automatically on the server at a certain time of the day? I have heard about cron jobs. Is there any way to set one up via PHP code?
http://php.net/system lets you run any command-line utility.
As for the exact command to set up a cron job, you'll either have to google a little or ask on Server Fault.
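A rough sketch of that approach (all paths and the schedule are placeholders; it assumes the PHP user is permitted to use crontab):

<?php
// Hypothetical sketch: append an entry to the current user's crontab.
// crontab -l exits non-zero when no crontab exists yet, hence 2>/dev/null.
$entry    = '0 3 * * * /usr/bin/php /path/to/task.php';
$existing = (string) shell_exec('crontab -l 2>/dev/null');
$tmp      = tempnam(sys_get_temp_dir(), 'cron');
file_put_contents($tmp, $existing . $entry . PHP_EOL);
exec('crontab ' . escapeshellarg($tmp));
unlink($tmp);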

Long-running cron job via wget/curl?

I am working on cron jobs for my PHP app and planning to run them via wget/curl from cron.
Some of my PHP cron jobs can take 2-3 hours. How do I let PHP work for 2-3 hours when started from crontab? Is it good practice to run such long jobs via cron and wget/curl? Does anyone have experience with this? I also have an email queue that needs to run every 10 seconds, but crontab is minute-level. Any suggestions for this case?
Thanks for reading.
When you use wget/curl, you are requesting a page from a webserver. Every webserver has a timeout period, so the request might time out.
Also, some hosting providers may kill processes that run beyond a certain number of minutes (basically done to control rogue threads).
So it is not advisable to schedule jobs via wget/curl if they take more than a few minutes.
Try scheduling them with an actual scheduler instead. You can run PHP from the command line:
php [options] [-f] <file> [--] [args...]
The php command should be on the PATH.
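For example, a crontab entry that calls the CLI binary directly (the paths are assumptions) bypasses the webserver and its timeouts entirely:

0 2 * * * /usr/bin/php -f /home/user/jobs/update.php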
You can use the following at the start of your script to tell PHP to effectively never time out:
set_time_limit(0);
What may be wiser, if crontab is going to run the script every 24 hours, is to set the timeout to 24 hours so that you don't get two copies running at the same time:
set_time_limit(24*60*60);
Crontab only allows minute-level scheduling because that's about as often as you should really be launching a script: there are startup/shutdown costs that make more rapid scheduling inefficient.
If your application requires a queue to be checked every 10 seconds, a better strategy might be a single long-running script that does the checking and uses sleep() to stop itself from hogging system resources, as sketched below.
On a UNIX system such a script should really run as a daemon; have a look at the PEAR System_Daemon package to see how that can be accomplished simply.
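A bare-bones version of that strategy (the queue handling itself is a placeholder) might look like:

<?php
// Hypothetical sketch: a long-running email queue worker.
set_time_limit(0); // never time out

while (true) {
    // ... fetch pending emails from the queue and send them ...
    sleep(10); // poll roughly every 10 seconds without hogging the CPU
}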
I've just run into this problem with a long-running PHP script executed via cron and wget. The script terminated after 15 minutes due to the default timeout in wget.
However, I believe this method does in fact work if you set the correct timeouts. The PHP script you run needs an unlimited run time; best to set this at the start of your long-running script:
set_time_limit(0);
When using wget you also need to remove its timeout by passing -T 0, e.g.
wget -q -T 0 http://yourhost/job.php
Be very careful not to overload your server with long running scripts.
If using wget to call the long-running cron job, consider some of its defaults that aren't desirable in this situation. By default it will retry up to --tries 20 times if no data is read within the --read-timeout of 900 seconds. So set --tries to 1 to ensure the cron job is only called once, since you of course wouldn't want a long-running script launched again while it's still running.
wget -t 1 https://url/
If you're on cPanel and don't care for output emails after each cron job run, also use the -O and -q flags to discard output, and redirect errors with 2>&1:
wget -O /dev/null -q -t 1 https://url/ 2>&1
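Putting the pieces together, a cron entry along these lines (the schedule is a placeholder) runs the job nightly with retries capped and wget's timeout disabled:

0 3 * * * wget -O /dev/null -q -t 1 -T 0 https://url/ 2>&1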
