I have a PHP script that parses live scores from a JSON URL; when any scores change, it calls another PHP script that pushes notifications to iOS devices.
My question is: how can I make the first script run every 20 seconds?
It depends on the host OS that you're using. For Linux/Unix/etc. the preferred tool for scheduling tasks is cron. For Windows or Mac or anything like that there are other built-in schedulers.
Using cron, to schedule a task to run every minute might look something like this:
* * * * * /path/to/command
The granularity, however, only goes down to the minute. So without a more granular task scheduler, you may need to handle the 20-second delay yourself.
One idea is to wrap the task in a script which runs the command, sleeps for 20 seconds, runs it again, sleeps another 20 seconds, and runs it a third time. The cron job would then call that wrapper script. My bash is a little rusty, but the idea would look something like this:
#!/bin/sh
# Run the command three times within the minute, 20 seconds apart
/path/to/command
sleep 20
/path/to/command
sleep 20
/path/to/command
<?php
$total_time = 0;
$start_time = microtime(true);
while ($total_time < 60) { // run while less than a minute
    // DoSomething;
    echo $total_time . "\n";
    sleep(20); // wait amount in seconds
    $total_time = microtime(true) - $start_time;
}
Add this to a cron job that runs every minute.
Maybe you can try sleep:
while (1) {
    exec('php path/to/script.php');
    sleep(20);
}
Generally a cron job is perfect for this kind of task. Have a look at this here: http://www.thegeekstuff.com/2011/07/php-cron-job/
If that's not applicable, you might be able to script something for the PHP CLI (http://php.net/manual/en/features.commandline.php), which allows you unlimited runtime. Just put a while(true) loop with a sleep() in there and run it with the php command, like php yourscript.php.
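A minimal sketch of such a CLI script, assuming the feed URL and the comparison logic are placeholders for your own code:

<?php
// scores_daemon.php - started once from the command line: php scores_daemon.php
set_time_limit(0); // the CLI has no time limit by default, but be explicit

while (true) {
    // Placeholder: fetch the JSON feed, compare it with the previous scores,
    // and call your push-notification script when something has changed.
    $json = file_get_contents('http://example.com/livescores.json'); // assumed URL
    // ... compare $json with the stored state, notify on changes ...

    sleep(20); // wait 20 seconds before the next check
}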
Related
I have a file on the server that extracts some data from pages, like a crawler. Sometimes the execution of the script takes 5 seconds, but sometimes it takes a minute or even two.
Setting a cron job to access the file every 2, 3, or 5 minutes doesn't work for me, because I need this crawler to run as often as possible.
So my question is:
Can I set a cron job to run, let's say, every 5 minutes and have PHP re-run the script again and again and again?
More clear:
*/5 * * * * wget -O - site.com/index.php?cron >/dev/null 2>&1
index.php?cron
function cron_action()
{
    // some action here
    // Call the function again
    cron_action();
}
cron_action();
As I don't understand very well how cron reacts to my script, I don't know what will happen when, 5 minutes later, the cron job accesses the URL again.
Will I end up in an infinite loop?
How would you do this? I need some advice, please. I really need the cron job to run more often and, in my opinion, the PHP function has to keep calling itself forever.
Set up the cron job:
* * * * * php /path/to/your/file.php --arg1=value1 --arg2=value2 >> /dev/null 2>&1
Your file.php
<?php
// test your args
var_dump($argv);

$minute = date('i', time());

// run every 10 minutes
if (intval($minute) % 10 == 0) {
    // run your cron job code here
    cron_action();
} else {
    // do other code
}

function cron_action()
{
    // do stuff
}
And this is how to use cron jobs with Laravel: https://laravel.com/docs/5.3/scheduling. You can learn from that.
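For reference, a minimal sketch of the Laravel 5.3 scheduler, defined in app/Console/Kernel.php; the command name my:crawl is just a stand-in for your own artisan command:

<?php
// app/Console/Kernel.php (excerpt)
protected function schedule(Schedule $schedule)
{
    // Run a hypothetical custom artisan command every five minutes
    $schedule->command('my:crawl')->everyFiveMinutes();
}

The scheduler itself is driven by a single crontab entry:

* * * * * php /path/to/artisan schedule:run >> /dev/null 2>&1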
Possibly you can use the crontab to run it more often, but you can add a pid-lock (of sorts) so that each time the crontab fires, only one copy of the script can run.
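A minimal sketch of such a lock, assuming flock() on a lock file (the path /tmp/crawler.lock is just an example):

<?php
$fp = fopen('/tmp/crawler.lock', 'c');
if (!flock($fp, LOCK_EX | LOCK_NB)) {
    // A previous run is still going; bail out quietly
    exit;
}

// ... do the crawling here ...

flock($fp, LOCK_UN);
fclose($fp);

The non-blocking flag means a new run started by cron simply exits immediately while an earlier run is still working.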
I have this cron job (split onto multiple lines for readability):
8,18,28,38,48,58 * * * *
/usr/local/bin/setlock
-n /tmp/cronlock.1618472.147531 sh
-c $'/home/ryannaddy/trendie.co/get/next_trends'
It runs this file:
#!/usr/local/php54/bin/php-cgi -q
<?php
set_time_limit(90);
require_once __DIR__ . "/../includes/setup.php";
$db = new Database();
$lock = (bool)$db->getOne("select get_lock('get_trends', 0)");
// File is already running; don't run again.
if (!$lock) {
    echo "Lock Exists\n";
    exit;
}
$trendie = new Trendie();
$trendie->prepare();
$trendie->setNextId();
$trendie->getCandidates();
$trendie->selectFinal();
$db->getOne("select release_lock('get_trends')");
For some reason, the cron job doesn't always end; it can run for hours, but it shouldn't. On average it completes successfully in about 30 seconds, but from time to time it doesn't end, and I have to log into my server and kill the process manually to allow it to run again.
I have tried removing the MySQL get_lock, but that doesn't fix it. I also added set_time_limit(90), and that doesn't fix it either. The $trendie->getCandidates() method makes a lot of HTTP requests (15-20), either through a website's API or with file_get_contents(), depending on the application. But as stated before, they usually all finish within 30 seconds.
So... why isn't this being limited to 90 seconds by set_time_limit(90) when it takes too long to run?
I have the cron set to email me any output, and I am getting this when it doesn't work:
setlock: fatal: unable to lock /tmp/cronlock.1618472.147531: temporary failure
What I did was create another cron job, one that runs 1 minute before this one, with this code in it:
#!/bin/sh
pkill -f next_trends
So far it has worked like a charm!
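For example, assuming the kill script is saved as /path/to/kill_next_trends.sh and made executable, the extra crontab entry might look like this (one minute before the main job):

7,17,27,37,47,57 * * * * /path/to/kill_next_trends.sh

As an aside, this is probably also why set_time_limit(90) never fires: on Unix, time spent outside of script execution, such as in stream operations like file_get_contents(), is not counted towards the limit, so a hung HTTP request can keep the process alive indefinitely.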
I am running a script which constantly works on my database. However, it is necessary to restart the script once an hour. Obviously I can't do that automatically. I don't want to use a daemon; it's too complex for me right now. The easier solution is a cron job, but its biggest drawback is that it won't stop the previous script. The script runs in an infinite while(true) loop.
However, is it possible if I make a function in the script, let's say
function exitScript()
{
exit;
}
And then in the cron job do something like:
php /home/Path/public_html/webservice/myScript.php exitScript
and then
php /home/Path/public_html/webservice/myScript.php
What should the format be, and how can I run both, one after the other, using a cron job, or make another PHP script that does so?
I need advice.
Here is an easy little trick you can use.
1st, set your cron job to run every hour:
0 */1 * * * cronjob.php
2nd, at the start of your script, define a constant with the time:
define('SCRIPT_START_TIME', time());
3rd, in your exit function, add a condition that checks whether 59 minutes have passed between this constant and the current time, and exits if so. :)
function exitScript()
{
    if ((time() - SCRIPT_START_TIME) > 59 * 60) {
        exit();
    }
}
4th, call the exit function at the start of each while loop iteration.
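Putting the pieces together, a minimal sketch of the worker script (doWork() is just a placeholder name for your own database logic):

<?php
define('SCRIPT_START_TIME', time());

function exitScript()
{
    // Stop shortly before the next hourly cron instance starts
    if ((time() - SCRIPT_START_TIME) > 59 * 60) {
        exit();
    }
}

function doWork()
{
    // placeholder for your own database processing
    sleep(1);
}

while (true) {
    exitScript();
    doWork();
}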
I've written a PHP script that gathers all the data from an IceCast stream and stores it in an array. I want to measure how many listeners the stream has every five minutes. Is there a way to remotely run the script so that it "refreshes" every five minutes and sticks the number of listeners into a database? Thanks!
A cron job is what you are looking for. You can search on SO/Google/etc. for how to create and set up a cron job.
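As a rough sketch, assuming your script lives at /path/to/listeners.php, the crontab entry would be:

*/5 * * * * php /path/to/listeners.php

and the script could end with something like this (the PDO DSN, the table name, and how you read the listener count out of your existing array are all assumptions):

<?php
// tail end of listeners.php - assumed names throughout
$listeners = 123; // placeholder: take this value from the array your script already builds

$pdo = new PDO('mysql:host=localhost;dbname=stats', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO listener_counts (counted_at, listeners) VALUES (NOW(), ?)');
$stmt->execute([$listeners]);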
I have used the following snippets to periodically run a script.
The main advantage is that the next run starts $min minutes (configurable) after the current process finishes. That cannot be achieved with cron, which runs exactly every X minutes regardless of how long the previous run took. See the difference? This way you can be sure a given amount of time passes between processes.
Maybe it's not exactly what you want, but I would like to share this useful technique.
script_at.php file:
<?php
function init_at()
{
    // my code
    runNextPlease();
}

function runNextPlease()
{
    $min = 5;
    exec("at now + $min minutes -f " . PATH_TO_SOURCE . "script_at.sh", $output, $out);
    my_logger("at return status: $out");
}

init_at();
script_at.sh file:
#!/bin/bash
/usr/bin/wget -c -t0 -o /dev/null -O /dev/null http://domain/script_at.php
Is there a way to make a PHP/JS function or a cron job call a link/URL every 12 hours, for example?
Thanks in advance!
To run a PHP script via cron job every 12 hours, you can do the following:
0 */12 * * * curl -s -o /dev/null http://example.com/your-php-script.php
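If the script can also be run from the command line instead of over HTTP, the same schedule can invoke PHP directly (the path is just an example):

0 */12 * * * php /path/to/your-php-script.php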
Of course you could do it yourself with a cron job, but there are monitoring services that will check that a page is accessible and returns the expected data, and email or text you if it fails.
There are attempts at this, e.g.
https://github.com/StefanLiebenberg/cron.js
http://elijahr.blogspot.de/2009/03/javascript-cron.html (https://gist.github.com/4403566)
Otherwise it might be simpler to just write a traditional cron script, if possible.