I have searched SO, but can't seem to find any topic covering my little problem.
I'm quite new to cron jobs.
I have an IP-based alarm. This alarm can control wireless power outlets, turning them on and off from its web-based control panel. I can also control the power outlets with a simple HTTP command, making them turn on and off.
I have made PHP scripts taking care of this. Right now there are 2 separate scripts, one for turning on and one for turning off. The scripts only control one specific power outlet.
My problem is that I need a time-based switching scheme. First I thought of making the PHP script sleep, but as the sleep time would be 1 hour, that would not be my first choice.
So here we go.
Is it possible to set up a cron job to:
1: Run the on script for 1 sec, just to trigger the HTTP command in the script.
2: Wait 1 hour.
3: Run the off script for 1 sec to trigger the HTTP command.
4: Wait 2 hours.
5: Start all over again.
There is no problem with the alarm system sending the OFF HTTP command even if the power outlet is already off, and vice versa.
You can combine all commands into a single cron entry and run it every 3 hours.
0 */3 * * * /path/to/1st_script; sleep 3600; /path/to/2nd_script
This will run 1st_script every 3 hours at minute 0, then wait 1 hour, then run 2nd_script.
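If you would rather not keep a sleep running for an hour, a rough alternative (an untested sketch, using the same placeholder paths as above) is to give each script its own schedule: the on script every 3 hours starting at hour 0 and the off script every 3 hours starting at hour 1, which gives 1 hour on and 2 hours off.
# on at 00:00, 03:00, 06:00, ...; off at 01:00, 04:00, 07:00, ...
0 0-23/3 * * * /path/to/1st_script
0 1-23/3 * * * /path/to/2nd_script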
Now I just have to figure out how to make the system accept my PHP script in the cron job.
I have read that I have to add /usr/bin/php to the cron job command, but that path does not exist on my Raspberry Pi server (nginx with FPM/FastCGI).
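Two common ways around a missing /usr/bin/php (a hedged sketch; the power_on.php / power_off.php names and the localhost URL are only illustrative) are to locate or install the PHP CLI binary, or to skip the CLI entirely and let nginx/PHP-FPM execute the scripts over HTTP with curl or wget:
# find the CLI binary, if a php-cli package is installed (path varies by distribution)
which php
# or trigger the scripts through the web server instead of the CLI
0 */3 * * * curl -s http://localhost/power_on.php; sleep 3600; curl -s http://localhost/power_off.php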
Related
I need to set a PHP script to run once every 5 minutes between 8pm and 9pm, nightly.
My understanding of cronjobs is that they run on a fixed time interval such as once every day, once every week etc.
What I need is some way of running regular cronjobs (every 5 minutes) within a specific time frame (8pm to 9pm). The script that runs is the same script.
Ideas
Run one PHP script at 8pm via cronjob and loop the code inside a timer loop so the script runs constantly (with sleep) for the hour. This seems to me to be a poor solution; inefficient.
Run a secondary cronjob triggered by the first, so an 8pm cronjob installs a second cronjob that runs every 300 seconds to call the script. Is this even possible?
Install multiple duplicate cronjobs at each required time interval, so one at 20:00, one at 20:05, one at 20:10, 20:15, etc. This is not very DRY programming.
Run the cronjob every 5 minutes 24/7 and exit the PHP script early when the time does not fall between 8pm and 9pm. This seems wasteful/inefficient as 95% of cronjob script triggers will simply exit.
None of these ideas seem to really efficiently fit the bill. Are there choices I have missed?
To be absolutely clear:
I use cronjobs already but they run on single-interval timeframes. I have read all about cronjobs on here and elsewhere and can't find any guidance for multi-interval time frames.
I am looking for something that runs within two time frames rather than simply one!
Question:
What method(s) can I use to run a standalone PHP script every 5 minutes between two timepoints (8pm and 9pm, in this case)?
Typical: after posting this question I found a solution from this useful link.
The answer being:
*/5 20 * * * /usr/bin/php /www/virtual/username/cron.php
The */5 in the minute field with 20 in the hour field runs the script every five minutes during the 20:00 hour. As pointed out by Barmar, this means the final cronjob execution is at 20:55.
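If a run at 21:00 itself is also wanted (the pattern above stops at 20:55), one possible addition is a second entry for that single minute:
*/5 20 * * * /usr/bin/php /www/virtual/username/cron.php
0 21 * * * /usr/bin/php /www/virtual/username/cron.php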
I was looking for a way to continuously run a PHP script every 15 seconds online, so that I may manage some accounts using an API. I was able to find a script that satisfies what I was looking for as follows:
#!/bin/bash
# This script runs php test.php every 15 seconds
# It is run from /etc/rc.local (startup)
while (sleep 15 && php test.php) &
do
    wait $!
done
This script works perfectly and runs every 15 seconds. However, now I want to modify the script such that it may
do what the script is already doing
run a second script, every 5 minutes
Is there a way to modify the current while loop so that I may achieve this? Thanks!
If you really want to use a loop
It runs forever until you terminate it, although yours does too.
Change the sleep to whatever interval you want; I just set it to 1 for this example.
Set up if statements and use the modulus to set the time frame for each one. You also probably want to set count back to 0 in the highest-timed if, to stop it from getting too large.
You can add as many as you want for as many times as you want :)
#!/bin/bash
count=0
while true
do
    sleep 1
    if (( count % 15 == 0 )); then
        php test.php
    fi
    if (( count % 300 == 0 )); then
        SecondScript    # placeholder for whatever your second script is
        count=0
    fi
    (( count = count + 1 ))
done
There are much better ways to do this, because PHP requires an HTTP request to run that file.
Run a cron job to run this request every set interval (not recommended)
Use another program like python which can run a daemon to do this
Do the calculation every time the page is requested (recommended and almost always possible)
The right way to do this kind of thing is via a cron job.
The software utility Cron is a time-based job scheduler in Unix-like
computer operating systems. People who set up and maintain software
environments use cron to schedule jobs (commands or shell scripts) to
run periodically at fixed times, dates, or intervals.
source: http://en.wikipedia.org/wiki/Cron
Useful Source: How to get a unix script to run every 15 seconds?
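For reference, since cron itself cannot fire more than once per minute, the usual workaround for a 15-second interval (a sketch only; /path/to/test.php is a placeholder path) is four staggered entries:
* * * * * php /path/to/test.php
* * * * * sleep 15 && php /path/to/test.php
* * * * * sleep 30 && php /path/to/test.php
* * * * * sleep 45 && php /path/to/test.php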
We're building an application that uses data from an API. We don't want to process all this data because it's too much. We want to capture the data 4 times per day. Therefore we need a PHP script that saves this to the database without a button or human help.
We know how to get the data and understand how to upload this to a database. We don't know how to do this on a timer, or if this is even possible.
We tried to do this with this code, but the page needs to be refreshed for it to echo.
if (date('His') == 114300) {
    echo date('His');
}
A PHP script only runs when it's called via a request from a browser or another client, or when run through the command line.
You could, in theory, create a script with infinite execution time, and have that running. The script would need to check the server time and make the requests. The problem would be that it'd stop if the server restarted, etc. It's also highly inefficient.
Just call your script with a cronjob, four times a day, e.g. like so:
0 */6 * * * curl -k http://example.com/download.php >/dev/null 2>&1
The */6 in the hour field means every six hours (00:00, 06:00, 12:00 and 18:00). The curl command calls the script and >/dev/null discards the output.
The best way to do this is by creating a cron job that runs 4 times a day
0 13,14,15,16 * * * /usr/bin/php path/to/cron.php &> /dev/null
your path/to/cron.php will be run at 13:00, 14:00, 15:00 and 16:00
http://www.thegeekstuff.com/2011/07/php-cron-job/
If you have a server (standalone / vps, with shell access), install a cron job.
If you have cpanel access, find the cron job setting.
I'm developing a mobile app that displays the live scores for Premiership football games. When a game starts, I want to query an external API every 30 seconds to retrieve the latest scores. Most games start at 3pm on a Saturday, but some start at 12.45pm, others at 1.30pm, 2pm and 3pm on Sundays, and some during the week at the latest time of 8pm.
I have a table in my database populated with all the fixtures for the season and the times they start at.
So I'm guessing I need a cron that runs every 15 minutes between 12.45 and 8pm (games never start outside of these times) that checks my database to see if a game is starting. Then, if there is a game starting, another cron must begin that queries the external API every 30 seconds for the latest score in that game. This cron would run for approximately 1 hour and 45 minutes.
What would be the best way to achieve this sort of cron setup? I'm on a shared server with Plesk software running on it, and don't have ssh access to the server.
Judging by one of your comments
the main problem is I'm on a shared server with no ssh access...
I think the issue here is that you don't have access to the command shell and can only deploy your files to the server using Plesk. Hence, to implement your solution, you are going to use cron jobs.
In that case, I recommend double-checking with your hosting provider for any restrictions on the number of cron jobs you can run or the minimum frequency permitted for cron jobs. If there are restrictions on the processing of cron jobs, you can use an online cron scheduler (like this). Note that an external cron service can only hit publicly accessible URLs, so you will need to write and deploy code accordingly.
From here on, I will assume your cron jobs are working and there are no issues with them.
When a game starts, I want to query an external API every 30 seconds to retrieve the latest scores. Most games start at 3pm on a Saturday, but some start at 12.45pm, others at 1.30pm, 2pm and 3pm on Sundays, and some during the week at the latest time of 8pm.
Approach 1
Use a single updateMatchesAndScores.php file to update match information for any new matches (mark in the db as active/closed), and update scores for currently active matches.
Cron jobs cannot handle logic of the kind "if a match is on, only then run this"; that logic has to go into the script.
You can then run it from 12 PM to 10 PM like the following:
* 12-22 * * * php -f updateMatchesAndScores.php
* 12-22 * * * sleep 30 && php -f updateMatchesAndScores.php
If the script is exposed at a URL such as http://some.server.address.com/updateMatchesAndScores, this becomes
* 12-22 * * * wget http://some.server.address.com/updateMatchesAndScores
* 12-22 * * * sleep 30 && wget http://some.server.address.com/updateMatchesAndScores
You can break this into multiple cron jobs (12:45-12:59, 13:00-20:59, 21:00-21:45), assuming games happen in the time range [12:45, 21:45]. This avoids the unnecessary runs from 12:00-12:44 and so on.
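As a rough illustration (assuming the same script name as above and that games only occur between 12:45 and 21:45), the split could look like this; each line would be duplicated with a sleep 30 && prefix for the 30-second polling, as in the examples above.
45-59 12 * * * php -f updateMatchesAndScores.php
* 13-20 * * * php -f updateMatchesAndScores.php
0-45 21 * * * php -f updateMatchesAndScores.php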
Approach 2
Start a daemon process once using a one-time cron job, and then check every minute whether it's still running.
Let's call the script updateMatchesAndScores.php. Put the sleep functionality in it: (1) 15 minutes if no game is on, (2) 30 seconds if any game is on, (3) sleep from 21:46 to 12:44 the next day. You can spawn a separate subprocess for every game, so that you don't have to implement (2), the 30-second sleep, in this script.
Caveats: (1) the execution time of the script will start delaying the schedule a bit, so 15 minutes will soon turn into 15 minutes and x seconds; (2) there is a max execution time within PHP, so you will have to set it accordingly; (3) based on your code quality, there might be memory leaks affecting you slightly.
An optimization here could be to run the process every day using a cron job (which stops after the last game is over, irrespective of whether it's 21:46 or 4:30) and restart it accordingly if not already running.
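One way to sketch that "restart if not already running" check (assuming flock from util-linux is available; the lock-file path is arbitrary) is a minute-level watchdog entry:
# every minute, start the daemon script unless another instance already holds the lock
* * * * * flock -n /tmp/updateMatchesAndScores.lock php -f /path/to/updateMatchesAndScores.php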
It might be more straightforward to have just one cron job, that looks for game start times, as you have suggested, but which then starts a job for each game that runs for the entire length of the game, looping until the game is over, and sleeping for 30 seconds at the bottom of the loop.
From what you describe, a cron-based solution does not sound like the best approach.
It would be better to set up a daemon (or daemon-like) process, which periodically checks for games starting (e.g. every 15 minutes, playing the role of your first cron job), and spawns a subprocess for each game that is starting to carry out its tasks (the role of your second cron job).
If you particularly like cron, you can have a cron job checking the daemon process is running, and starting it if not ;)
I am working on cron jobs for my php app and planning to use cron via wget/curl.
Some of my PHP cron jobs can take 2-3 hours. How do I let PHP work for 2-3 hours when started from crontab? Is it good practice to run such long-running jobs via cron with wget/curl? Does anyone have experience with this? I also have an email queue that needs to be run every 10 seconds, but crontab is minute-level. Any suggestions for this case?
Thanks for reading.
When you use wget/curl, you are requesting a page from a webserver. Every webserver has a timeout period, so the request might time out.
Also, some hosting providers may kill processes that run beyond a certain number of minutes (basically done to control rogue threads).
So it is not advisable to schedule the job using wget/curl if it takes more than a few minutes.
Try scheduling it with the scheduler directly. You can run PHP from the command line:
php [options] [-f] <file> [--] [args...]
The php command should be on the PATH.
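For example, a crontab entry that bypasses the webserver entirely might look like this (a sketch; the php binary location, the long_job.php name and the log path are assumptions for illustration):
# run the long job directly through the PHP CLI at 02:00, logging instead of mailing the output
0 2 * * * /usr/bin/php -f /path/to/long_job.php >> /var/log/long_job.log 2>&1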
You can use the following at the start of your script to tell PHP to effectively never time out:
set_time_limit(0);
What may be wiser, if crontab is going to run the script every 24 hours, is to set the timeout to 24 hours so that you don't get two copies running at the same time.
set_time_limit(24*60*60);
Crontab only allows minute-level execution because that's the most often you should really be launching a script; there are startup/shutdown costs that make more rapid scheduling inefficient.
If your application requires a queue to be checked every 10 seconds a better strategy might be to have a single long-running script that does that checking and uses sleep() occasionally to stop itself from hogging system resources.
On a UNIX system such a script should really run as a daemon - have a look at the PEAR System_Daemon package to see how that can be accomplished simply.
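If a full PEAR System_Daemon setup feels like too much, a minimal shell-level sketch of the same idea (process_email_queue.php is a hypothetical name for your queue script) is a loop started once at boot:
#!/bin/bash
# simple queue worker: process the email queue, then pause 10 seconds
while true
do
    php -f /path/to/process_email_queue.php
    sleep 10
done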
I've just run into the problem with a long running PHP script executed via cron and wget. The script terminates after 15 minutes due to the default timeout in wget.
However, I believe this method does in fact work if you set up the correct timeouts. The PHP script you run needs to be set to have an unlimited run time. It's best to set this at the start of your long-running script.
set_time_limit(0);
When using wget you also need to remove its timeout by passing -T 0, e.g.
wget -q -T 0 http://yourhost/job.php
Be very careful not to overload your server with long running scripts.
If using wget to call the long-running cron job, consider some of its defaults that aren't desirable in this situation. By default it will retry (--tries) 20 times if no data is read within the --read-timeout of 900 seconds. So set tries to 1 to ensure the cron job is only called once, since you of course wouldn't want a long-running script called multiple times while it's still running.
wget -t 1 https://url/
If you are on cPanel and don't care for any output emails after each cron job run, also use -O /dev/null and -q to discard the output, and redirect errors with 2>&1.
wget -O /dev/null -q -t 1 https://url/ 2>&1