I have developed a PHP script that I want to run continuously.
For example, the script run.php executes a list of tasks, but I don't know how long it will take: it can take 30 seconds, 1 minute, 2 minutes, or more.
The problem is that this script must never run twice at the same time.
So I can't use a plain cron job: if I set it up to run every minute and the script takes more than 1 minute, I'll have bugs.
In fact, I want this job to run perpetually: every time run.php ends, run.php is launched again, over and over...
I don't know how to resolve this problem.
Any help please ?
Thank you.
You've [at least] two options:
In run.php itself, determine whether it is already running (by setting up some external control that is checked at the beginning of the script) and set up the crontab job as you described. [I recommend this approach; see the sketch after this list.]
Use something like (while :; do php run.php; done) &, run it once, and put it inside /etc/rc.local so it starts at every boot. I don't really like this approach, but it's a possible solution.
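For the first option, here is a minimal sketch using flock(); the lock path is illustrative, and the nice property of flock() over a plain marker file is that the OS releases the lock even if the script crashes:

// at the top of run.php: refuse to start if another instance holds the lock
$fp = fopen('/tmp/run.lock', 'c');
if ($fp === false || !flock($fp, LOCK_EX | LOCK_NB)) {
    exit(0); // another instance is still running; the next cron tick will retry
}

// ... execute the list of tasks here ...

flock($fp, LOCK_UN); // also released automatically when the process exits
fclose($fp);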
I wrote some TV-guide scrapers in PHP. I run them from a script that is executed by a cron job. This script runs every minute and checks whether a scraper needs to start, so I can alter and manage these scraping jobs without having to modify the cron itself.
These scraping scripts vary in runtime: some take no more than 1 minute, others can take up to 4 hours. When I run them one after another there is no problem, but when I try to run two scripts simultaneously, one or both of them hang, resulting in an email from the cron:
sh: line 1: 700865 Hangup /usr/local/bin/php /home/id789533/domains/erdesigns.eu/public_html/tvg_schedules/scraper.php --country=dk --provider=1 --scraper=tv2 2>&1
where /usr/local/..... is the command for the script, as called from the scheduler script.
I just can't find anything related to this message, and I have no idea how to fix it. I can send the script itself if needed.
All advice and help would be appreciated.
[Edit] I also took a look at the resource usage: the load never gets higher than 150 MB of memory and 15% CPU. I have a limit of 400% and 1 GB.
I execute the scripts from the PHP scheduler script like so:
shell_exec(sprintf("/usr/local/bin/php %s 2>&1", $scraper));
where $scraper is the filename. It executes the script as it should, but after a while I get the message sh: line 1: 000000 Hangup.
I know for sure that it is not allocating too much memory. Can someone point me in the right direction? I don't know where to look right now.
PHP is a language intended for the web, with features like a cap on maximum execution time to make sure scripts do not run indefinitely and block resources. Therefore, PHP is not the best choice for this task.
If it is only a short script, I would advise you to convert it into a Bash or Python script. However, if you want to stick with PHP, check your php.ini file for settings restricting execution time.
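A hedged sketch of both checks (the paths and log file are illustrative): the time limit can be lifted at the top of the script, and since the "Hangup" text means the child process received SIGHUP, typically because its parent exited, the scheduler can launch scrapers detached with nohup:

set_time_limit(0); // at the top of the long-running scraper: no execution-time cap

// in the scheduler: nohup plus '&' detach the child so a parent exit can't SIGHUP it
shell_exec(sprintf(
    'nohup /usr/local/bin/php %s >> /tmp/scraper.log 2>&1 &',
    escapeshellarg($scraper)
));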
I was wondering how to make a PHP daemon script that runs once a day.
Do you know any good frameworks for this, and what are their benefits?
Or is it just a small amount of code?
Thanks
I was wondering how to make a PHP daemon script that runs once a day.
In order to do this, get familiar with cron jobs. A cron job is a command that the server executes on a schedule. Usually you'd edit your crontab by executing crontab -e
Then, once inside, you'd write the interval you want, followed by the command.
Typically it looks like:
30 18 * * * rm /home/someuser/tmp/* > /home/someuser/cronlogs/clean_tmp_dir.log
Since it's PHP, you can either a) run your PHP script as a CLI command, or b) make the work happen when a particular page is requested, and have cron fetch that page via curl -X GET 'http://url/' (etc.)
Also, note that you can put all of your commands in a shell script file and run that file as your cron command; that reduces line-item complexity.
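For example, a once-a-day crontab entry for a CLI PHP script might look like this (the path and time are illustrative):

# run the script every day at 03:00 and keep a log of its output
0 3 * * * /usr/bin/php /home/someuser/daily.php >> /home/someuser/cronlogs/daily.log 2>&1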
Sorry I haven't closed this one.
I actually discovered that my host didn't allow cron jobs to run. So I found a site that offers a free service to make the request for me when needed: in my case, I specified a URL of my RESTful API that should be requested each day.
The link is here and works like a charm :)
I have a PHP script that needs to be run at certain times every weekday. Are cron and the Windows Task Scheduler the only ways to do this?
Is there a way to set this up from within another PHP script?
Depends how exact the timing needs to be. A technique I've seen used (dubbed "poor man's cron") is to have a frequently accessed script (a site's home page, for example) fire off a task on the first page load after a certain time (kept track of in the database).
If you need any sort of guaranteed accuracy, though, cron or a Windows scheduled task is really the best option. If your host doesn't support them, it's time to get a better one.
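A minimal sketch of the "poor man's cron" idea, using a timestamp file instead of a database row (the path, the interval, and run_daily_task() are all placeholders):

// near the top of a frequently accessed page
$marker = '/tmp/daily-task.marker';
if (!file_exists($marker) || (time() - filemtime($marker)) >= 86400) {
    touch($marker);    // claim this slot first to reduce the chance of double runs
    run_daily_task();  // hypothetical function holding the real work
}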
Apart from cron or Scheduled Tasks, you can have your PHP script run all the time. The process should sleep (within a loop) until the target time is reached, then execute the remaining functions/methods. Alternatively, you can set up another script (which will act as a daemon) to check the time and execute the other script.
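A hedged sketch of that sleep-loop approach (the 02:00 target and the script path are illustrative):

// daemon.php: wait for 02:00 each day, then run the real script
while (true) {
    $next = strtotime('today 02:00');
    if ($next <= time()) {
        $next = strtotime('tomorrow 02:00'); // today's slot has already passed
    }
    time_sleep_until($next);                 // block until the target time
    shell_exec('/usr/bin/php /path/to/task.php');
}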
Well, since the web is a pull mechanism, you need some action that triggers a PHP script to execute. cron is an option on *nix and Task Scheduler on Windows. You could also write your own service with a timer, but only if needed; this is common in Windows services for updaters, jobs, etc.
One way you could do it is to have the cron task call a PHP script for each action needed, or one PHP script that executes the other tasks. The problem with web-based tasks in PHP, though, is timeouts: make sure your tasks run in under 60-90 seconds. If not, you might look at using Python, Perl, Ruby, or even Bash scripts to do the work rather than PHP.
cron seems like the best option for you, though. You will have to call your script with wget. There are examples here: http://www.thesitewizard.com/general/set-cron-job.shtml
For instance, this runs the script every day at 11:30:
30 11 * * * /usr/bin/wget http://www.example.com/cron.php
Cron, of course, is by far the best way to schedule anything on *nix.
If this is on a remote server you do not have cron access to, you can set up cron/Windows Scheduler on your own computer to open a web browser to the page that contains the script you wish to run.
You probably want to use cron (or windows scheduled tasks).
If you really wanted, you could set up another PHP script to run continuously in an infinite loop (with a sleep command inside the loop, say for 30 seconds or so) and then, when you reach the desired day/time, execute the other script via a shell command. While possible, I can't think of a single good reason to use this method rather than cron/scheduled tasks.
You can write a long-running script that runs your main script at predefined times, but it will be unnecessary and error-prone; it would basically be a cron rewrite in PHP.
Using the real cron itself is easier and a more robust solution. If you are packaging an application, you can put a file in /etc/cron.d which contains a single cron line running your application.
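For example, a packaged app might ship a file like this (the name, user, and paths are illustrative; note that /etc/cron.d entries include a user field):

# /etc/cron.d/myapp: run the app's maintenance script daily at 04:15 as user 'myapp'
15 4 * * * myapp /usr/bin/php /opt/myapp/maintenance.php >> /var/log/myapp-cron.log 2>&1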
You'll need to use a cron job (under Linux/Unix) or a scheduled task under Windows. You could have another script running on a continuous basis which checks the time and executes a script at a specified interval, but using the OS-supplied mechanism is easier to manage and resilient to restarts, etc.
The Uniform Server project has some good suggestions on mimicking cron in environments where cron is unacceptable. Still though, if cron is at all an option, use it.
I have a cron job that executes a PHP script. The cron is set up to run every minute; this is done only for testing purposes. The PHP script it executes is designed to convert videos uploaded to the server by users to a Flash format (e.g. .flv). The script runs fine when invoked manually via the command line; however, when executed via cron it starts fine but just stops after one minute.
It seems that when the next cron is executed it "kills" the last cron execution.
I added the following PHP function:
ignore_user_abort(true);
In hopes that it would not abort the previous execution. I tested setting the cron to run every 5 minutes, which works fine; however, converting a video may take over 5 minutes, so I need to figure out why it stops when another cron run starts.
Any help would be appreciated.
Thank you!
EDIT:
My cron looks like:
*/1 * * * * php /path_to_file/convert.php
I don't think cron kills any processes. However, cron isn't really suitable for long-running processes. What may be happening here is that your script tramples all over itself when executed multiple times: for example, both PHP processes may be trying to write to the same file at the same time.
First, make sure you not only look in the PHP error log but also capture output from the PHP file itself, e.g.:
*/1 * * * * php /path/to/convert.php >> /var/log/convert.log 2>&1
You could also use a simplistic lockfile to ensure that convert.php isn't executed multiple times. Something like:
// simplistic lock: skip this run if a previous one is still in progress
if (file_exists('/tmp/convert.lock')) {
    exit();
}
touch('/tmp/convert.lock');
// convert here
// note: if the script dies before this point, the stale lock blocks future runs
unlink('/tmp/convert.lock');
cron itself won't stop a previous instance of a job running, so if there's a problem, there's almost certainly something in your PHP doing it. You'll need to post that code.
No, it will not. You can keep a second process from running by creating a lock file that the script checks for on each run. If the file exists, it does not run. This should also, if appropriate, be used in conjunction with a maximum execution time so that one process does not stall future executions indefinitely. The lock file can just be an empty plain text file called /tmp/foo.lock.
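One way to implement that, treating the lock as stale after a maximum age (the 2-hour figure is an assumption; pick something longer than any legitimate run):

// skip this run only if the lock file exists AND is recent
$lock = '/tmp/foo.lock';
$maxAge = 2 * 60 * 60; // assumed upper bound on a legitimate run
if (file_exists($lock) && (time() - filemtime($lock)) < $maxAge) {
    exit; // a recent instance is probably still running
}
touch($lock);
// ... do the work ...
unlink($lock);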
I am working on cron jobs for my PHP app and planning to trigger them via wget/curl.
Some of my PHP cron jobs can take 2-3 hours. How do I let PHP work for 2-3 hours from the crontab, and is it good practice to run such long jobs via cron with wget/curl? Does anyone have experience with this? I also have an email queue that needs to be processed every 10 seconds, but crontab only goes down to the minute. Any suggestions for this case?
Thanks for reading.
When you use wget/curl, you are requesting a page from a webserver. Every webserver has a timeout period, so the request might time out.
Also, some hosting providers may stop processes that run beyond a certain number of minutes (basically done to control rogue threads).
So it is not advisable to schedule a job using wget/curl if it takes more than a few minutes.
Try scheduling it with an actual scheduler instead. You can run PHP from the command line:
php [options] [-f] <file> [--] [args...]
The php command should be on the PATH.
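For example, a 2-3 hour job could be scheduled directly, bypassing the webserver and its timeouts entirely (the path and time are illustrative):

0 2 * * * /usr/bin/php /path/to/longjob.php >> /var/log/longjob.log 2>&1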
You can use the following at the start of your script to tell PHP to effectively never time out:
set_time_limit(0);
What may be wiser is, if crontab is going to run the script every 24 hours, to set the timeout to 24 hours so that you don't get two copies running at the same time:
set_time_limit(24*60*60);
Crontab only allows minute-level execution because that's the most often you should really be launching a script; there are startup/shutdown costs that make more rapid scheduling inefficient.
If your application requires a queue to be checked every 10 seconds, a better strategy might be a single long-running script that does that checking and uses sleep() occasionally to stop itself from hogging system resources.
On a UNIX system such a script should really run as a daemon - have a look at the PEAR System_Daemon package to see how that can be accomplished simply.
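A sketch of that long-running checker (process_email_queue() is a placeholder for your queue logic):

set_time_limit(0);         // be explicit, although the CLI defaults to no limit
while (true) {
    process_email_queue(); // hypothetical function: send whatever is pending
    sleep(10);             // wake every 10 seconds, finer-grained than cron's minute
}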
I've just run into this problem with a long-running PHP script executed via cron and wget: the script terminates after 15 minutes due to the default timeout in wget.
However, I believe this method does in fact work if you set up the correct timeouts. The PHP script you run needs an unlimited run time; it's best to set this at the start of your long-running script:
set_time_limit(0);
When using wget you also need to remove its timeout by passing -T 0, e.g.:
wget -q -T 0 http://yourhost/job.php
Be very careful not to overload your server with long running scripts.
If using wget to call the long-running cron job, consider some of its defaults that aren't desirable in this situation. By default it will retry up to --tries 20 times if no data is read within the --read-timeout of 900 seconds, so set tries to 1 to ensure the job is only called once; you of course wouldn't want a long-running script called multiple times while it's still running.
wget -t 1 https://url/
If you're on cPanel and don't care about output emails after each cron job run, also use the -O and -q flags to hide the output, and redirect errors with 2>&1.
wget -O /dev/null -q -t 1 https://url/ 2>&1