30 second cron does not end execution - php

I have this cron job (split into multiple lines for readability):
8,18,28,38,48,58 * * * *
/usr/local/bin/setlock
-n /tmp/cronlock.1618472.147531 sh
-c $'/home/ryannaddy/trendie.co/get/next_trends'
It runs this file:
#!/usr/local/php54/bin/php-cgi -q
<?php
set_time_limit(90);
require_once __DIR__ . "/../includes/setup.php";

$db = new Database();
$lock = (bool)$db->getOne("select get_lock('get_trends', 0)");

// File is already running; don't run again.
if (!$lock) {
    echo "Lock Exists\n";
    exit;
}

$trendie = new Trendie();
$trendie->prepare();
$trendie->setNextId();
$trendie->getCandidates();
$trendie->selectFinal();

$db->getOne("select release_lock('get_trends')");
For some reason the cron job doesn't always end; it can run for hours, though it shouldn't. On average it finishes successfully in about 30 seconds, but from time to time it doesn't end, and I have to log into my server and kill the process manually before it can run again.
I have tried removing the MySQL get_lock, but that doesn't fix it. I also added the set_time_limit(90), and that doesn't fix it either. The $trendie->getCandidates() method makes lots of HTTP requests (15-20), either using a website's API or file_get_contents() depending on the application. But as stated before, they usually all finish within 30 seconds.
So... why isn't set_time_limit(90) capping this at 90 seconds when it runs too long?
I have the cron set to email me any output, and I am getting this when it doesn't work:
setlock: fatal: unable to lock /tmp/cronlock.1618472.147531: temporary failure

What I did was create another cron job, one that runs 1 minute before this one, with this code in it:
#!/bin/sh
pkill -f next_trends
So far it has worked like a charm!
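Two notes that may explain and simplify this. First, on Linux, set_time_limit() counts only time spent executing the script itself; time spent in stream operations (such as the file_get_contents() HTTP calls) and in database queries is not counted, which is why a hung request never trips the 90-second limit. Second, assuming GNU coreutils is available on the server, the job could be bounded at the cron level with timeout(1) instead of a second pkill cron, roughly like this:
8,18,28,38,48,58 * * * *
/usr/local/bin/setlock
-n /tmp/cronlock.1618472.147531
timeout 90 /home/ryannaddy/trendie.co/get/next_trends
timeout sends SIGTERM after 90 seconds, so a stuck run dies on its own and the next invocation can acquire the lock.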

Related

Cron job could not open input file

I have a cron job on my hosting server that is supposed to execute a PHP script every 30 minutes. This PHP script is used to scrape a CSV file and update that data into a database (I'm using MySQL for that). When executed manually the script works; it takes about 45 seconds to finish the process.
Now when the cron job runs I'm getting this message:
Could not open input file.
At first I thought this might be because it takes over 30 seconds to execute the script, so I decided to set the max execution time at the beginning of the PHP script:
ini_set('max_execution_time', 300);
But again the same message came up.
This is my cron job:
/usr/local/bin/php /home/emmaein/domain.com/folder/script.php?token=d8cn3j
P.S.: that token GET variable is sort of a password, so the script can't be executed by just anybody. Basically my PHP script looks like this:
if ($_GET['token'] === 'd8cn3j') {
    // open csv
    // get data
    // update db
} else {
    exit('I see you >:)');
}
The immediate cause of the error is that the CLI treats /home/emmaein/domain.com/folder/script.php?token=d8cn3j as a literal file name, and no file with a ?token=d8cn3j suffix exists. More generally, when PHP is not running inside a web server, you can't access $_GET variables (there's no such thing). Instead you should use command line arguments:
<?php
if ($argc > 1 && $argv[1] === 'd8cn3j') {
    // Do stuff
}
And then your crontab becomes:
/usr/local/bin/php /home/emmaein/domain.com/folder/script.php d8cn3j
The concept of $argc (the number of arguments) and $argv (an array of the arguments) is fairly standard among CLI programs, and is documented on PHP's website.
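If the script still needs to work from a browser as well as from cron, a small sketch (assuming PHP 7+ for the null-coalescing operator) can accept the token from either source:
<?php
// Accept the token from the CLI (cron) or from a web request.
$token = (php_sapi_name() === 'cli')
    ? ($argv[1] ?? '')          // cron: php script.php d8cn3j
    : ($_GET['token'] ?? '');   // browser: script.php?token=d8cn3j

if ($token !== 'd8cn3j') {
    exit('I see you >:)');
}
// open csv, get data, update db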

Set a cron job and reaccess forever using PHP

I have a file on the server that extracts some data from pages, like a crawler. Sometimes the execution of the script takes 5 seconds, but sometimes it takes 1 or 2 minutes.
Setting a cron job to access the file every 2, 3, or 5 minutes is not comfortable for me, because I need this crawler to run as often as possible.
So my question is:
Can I set a cron job to run, let's say, every 5 minutes and have PHP re-run the script again and again?
To be more clear:
*/5 * * * * wget -O - site.com/index.php?cron >/dev/null 2>&1
index.php?cron
function cron_action()
{
    // some action here
    // Call the function again
    cron_action();
}
cron_action();
As I don't understand very well how cron reacts to my script, I also don't know what will happen when, 5 minutes later, the cron job accesses the URL again.
Will I be in an infinite loop?
How would you do this? I need some advice, please. I really need the cron job to run faster and, in my opinion, the PHP function must be recalled forever.
Set up the cron job:
* * * * * php /path/to/your/file.php --arg1=value1 --arg2=value2 >> /dev/null 2>&1
Your file.php
<?php
// test your args
var_dump($argv);

$minute = date('i', time());

// run every 10 minutes
if (intval($minute) % 10 == 0) {
    // run your cron job code here
    cron_action();
} else {
    // do other code
}

function cron_action()
{
    // do stuff
}
And here is how to use cron jobs with Laravel: https://laravel.com/docs/5.3/scheduling. You can learn from that.
Possibly you can use crontab to run it more often, but you should have a pid-lock (of sorts) so that each time the cron job fires, only one instance of the script can be running; see the sketch below.
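One way to get that pid-lock behaviour is flock(); here is a minimal sketch (the lock file path is illustrative):
<?php
// Take an exclusive, non-blocking lock; bail out if another run holds it.
$fp = fopen('/tmp/crawler.lock', 'c');
if (!$fp || !flock($fp, LOCK_EX | LOCK_NB)) {
    exit("Another instance is already running\n");
}

cron_action(); // the crawler work from above

flock($fp, LOCK_UN);
fclose($fp);
The operating system releases the lock automatically if the script dies, so a crashed run cannot block future ones.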

Windows PHP repeating script via popen

I'm trying to create a browser-started self-calling/repeating PHP script on Windows with PHP (currently 5.3.24 but soon will be latest). It will act as a daemon to monitor changes in a database (every few seconds, so cron/schedule is out) and then call other PHP scripts to perform work when changes are found. For the purposes of this question please ignore the fact that I'd be better off doing this in C# or some other language :)
To keep things simple I started out by trying to use popen to run a second PHP script in the background...
// BatchMonitor.php
SaveToMonitorTable(1); // save 1st test entry to see if the script reached this point
$Command = '"" "C:\Program Files (x86)\PHP\v5.3\php.exe" C:\inetpub\wwwroot\Test.php --Instance=' . $Data->Instance;
pclose(popen("start /B $Command", "r"));
SaveToMonitorTable(2); // save 2nd test entry to see if the script reached this point
exit();
// Test.php
SaveToTestTable(1);
Sleep(10);
SaveToTestTable(2);
exit();
If I run BatchMonitor.php in the browser it works fine. As expected, it saves 1 to the monitor table and calls Test.php, which saves 1 to the test table; the original BatchMonitor.php continues without waiting for a response and saves 2 to the monitor table before exiting, then 10 seconds later the test page saves 2 to the test table before exiting. The second script starts fine, the first script does not wait for a reply, and all parameters are correctly passed between scripts. With everything working as intended, I then changed the system to work as a repeating loop by calling itself (with delay) instead of another script...
// BatchMonitor.php
SaveToMonitorTable(1); // save 1st test entry to see if the script reached this point
$Command = '"" "C:\Program Files (x86)\PHP\v5.3\php.exe" C:\inetpub\wwwroot\BatchMonitor.php --Instance=' . $Data->Instance;
pclose(popen("start /B $Command", "r"));
SaveToMonitorTable(2); // save 2nd test entry to see if the script reached this point
exit();
If I run BatchMonitor.php in the browser it runs once and that is it. It will save 1 to the database, wait 10 seconds and then save 2 to the database before exiting. The page returns successfully with no script or PHP errors but it doesn't repeat as it should.
Both BatchMonitor.php and Test.php use line-for-line identical functions to get the parameters, and both files run correctly and identically on the first iteration. If I use exec instead of popen then the page loops correctly with all logic working as expected (with the one obvious flaw of creating a never-ending chain of scripts waiting for response values that will never come).
Am I missing something obvious? Does popen have some sort of secret rule that prevents a page/process from opening duplicates of itself? Are there any alternatives to using popen or exec? I read about WScript.Shell but it might be a while before I can schedule that to get enabled so for now it's not an option and I'm hoping there is something more standard that I can use.
I don't feel like this should be your actual answer, but why abandon scheduled tasks/cron jobs just because you want something done every X seconds? Having a script minute.php call 5seconds.php with, of course, 5-second intervals in between would create a repeated task every 5 seconds, right?
Strangely enough, you are already using much the same mechanism from your browser.
My only concern would be to take the processing time into account and write a safe script which ensures no more than one '5seconds.php' can run at any given time; see the sketch below.
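A sketch of that idea, assuming the Windows Task Scheduler starts it once per minute (the 5seconds.php name and paths are illustrative, and quoting of paths with spaces may need tweaking on Windows):
<?php
// minute.php - started once per minute; runs the worker roughly
// every 5 seconds, then stops before the next scheduled instance.
$deadline = time() + 55;
while (time() < $deadline) {
    // exec() waits for the worker to finish before continuing.
    exec('"C:\Program Files (x86)\PHP\v5.3\php.exe" C:\inetpub\wwwroot\5seconds.php');
    sleep(5);
}
Because each wrapper exits before its successor starts, at most one chain is active at a time, which covers the overlap concern.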

Making Crons to quit running an existing script

I am running a script which constantly works over my database. However, it is necessary to restart the script once an hour. Obviously I can't keep doing that by hand. I don't want to use a daemon; it's too complex for me right now. The easier solution is a cron job, but its biggest drawback is that it won't stop the previous run of the script. The script runs in an infinite while(true) loop.
However, is it possible to add a function to the script, let's say:
function exitScript()
{
    exit;
}
And then from the cron job do something like:
php /home/Path/public_html/webservice/myScript.php exitScript
and then
php /home/Path/public_html/webservice/myScript.php
What would the format be, and how can I run both, one after the other, from cron, or write another PHP script that does so?
I need advice.
Here is a little trick, easy to make, which you can use.
1st, set your cron job to run every hour:
0 * * * * php /path/to/cronjob.php
2nd, at the start of your script, define a constant with the start time:
define('SCRIPT_START_TIME', time());
3rd, in your exit function, check whether 59 minutes have passed from this constant to the current time:
function exitScript()
{
    if ((time() - SCRIPT_START_TIME) > 59 * 60) {
        exit();
    }
}
4th, call the exit function at the start of each iteration of the while loop.
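Putting the four steps together, a minimal sketch of the whole pattern (the database work is a placeholder):
<?php
define('SCRIPT_START_TIME', time());

function exitScript()
{
    // Quit once 59 minutes have passed, so the next hourly cron
    // start never competes with this instance.
    if ((time() - SCRIPT_START_TIME) > 59 * 60) {
        exit();
    }
}

while (true) {
    exitScript();
    // ... one unit of database work here ...
}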

execute php function every 20 seconds

I have a PHP script that parses live scores from a JSON URL and then, if any scores have changed, calls another PHP script to push notifications to iOS devices.
My question is: how can I make the first script run every 20 seconds?
It depends on the host OS that you're using. For linux/unix/etc. the preferred tool for scheduling tasks is cron. For Windows or Mac or anything like that there are other built-in schedulers.
Using cron, to schedule a task to run every minute might look something like this:
* * * * * /path/to/command
The granularity, however, only goes down to the minute, so without a more granular task scheduler you may need to handle the 20-second delay yourself.
One idea could be to wrap the task in a script which runs the command, sleeps for 20 seconds, runs the command again, sleeps for 20 seconds, and runs the command again. Then the cron job would call that wrapping script. My bash is a little rusty, but the idea would look something like this:
/path/to/command
sleep 20
/path/to/command
sleep 20
/path/to/command
$total_time = 0;
$start_time = microtime(true);
while ($total_time < 60) { // run while less than a minute
    // DoSomething;
    echo $total_time . "\n";
    sleep(20); // wait amount in seconds
    $total_time = microtime(true) - $start_time;
}
Add this to a cron job that runs every minute.
Maybe you can try sleep:
while (1) {
    exec('php path/to/script.php');
    sleep(20);
}
Generally a cron job is perfect for this kind of task. Have a look at this: http://www.thegeekstuff.com/2011/07/php-cron-job/
If that's not applicable, you might be able to script something for the PHP CLI (http://php.net/manual/en/features.commandline.php), which allows unlimited runtime. Just put a while(true) loop with a sleep() in there and run it with the php command, like php yourscript.php.
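For completeness, a hedged sketch of that CLI approach (the feed URL and the notification script name are placeholders for whatever the real files are):
<?php
// poller.php - run once with `php poller.php`; loops forever.
$previous = null;
while (true) {
    $json = file_get_contents('https://example.com/livescores.json');
    if ($json !== false && $json !== $previous) {
        // Scores changed: trigger the existing push script.
        exec('php push_notifications.php');
        $previous = $json;
    }
    sleep(20);
}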
