I want to run a PHP script every 5 minutes that processes some simple MySQL queries to identify potential errors; if an error is recorded as a database entry, the script sends out an email.
From my research, it seems like cron jobs (or task schedulers) usually take care of running scripts at specified times, however I cannot find this option anywhere at the hosting service I am using (who runs "Parallels Plesk Panel 11.0.9" as the management interface I can access).
Therefore I tried the following "trick":
<?php
$active = $_GET["a"];
set_time_limit(0);
while ($active == 1) {
    include 'alert_exe.php';
    sleep(300); // execute script every 5 mins
}
?>
To activate the script I enter the URL (".../alert.php?a=1"). This works fine for a couple of minutes, but after 2 or 3 minutes the script stops executing.
Any idea how I can prevent the script from stopping, or alternative suggestions for how to execute a script automatically every 5 minutes (without access to cron jobs)?
Thanks!
It would not surprise me if a hosting service protected its servers from being overloaded by long-running scripts and configured a time-out after which such scripts are aborted (see PHP: Runtime Configuration Settings, and max_execution_time in particular).
If you have a PC that can stay turned on, an alternative solution would be to let it send a request to the server every 5 minutes:
<?php
// Your PHP code that has to be executed every 5 minutes comes here
?>
<script>
    setTimeout(function () { window.location.reload(); }, 5 * 60 * 1000);
    // just show the current timestamp to see the time of the last refresh
    document.write(new Date());
</script>
There is a max_execution_time parameter which stops the script if it takes too long (30 seconds by default); see the docs: http://php.net/manual/en/info.configuration.php#ini.max-execution-time.
You can try to do set_time_limit(0) at the beginning of your script (see http://php.net/manual/en/function.set-time-limit.php), or change the max_execution_time parameter itself.
But in general, I would not go with such a solution; it is not very reliable. Better to find a hosting provider where you can use cron, or look for an external service that will ping your script every 5 minutes (you can probably use services that monitor web application health).
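If you go the external-ping route, it helps to make the endpoint safe to hit as often as the ping service likes, so an extra or early request cannot trigger duplicate alert emails. A minimal sketch, assuming a hypothetical wrapper around the alert_exe.php from the question and a writable timestamp file (last_run.txt is just an example name):

<?php
// alert.php - hypothetical wrapper that can be pinged as often as you like
$stateFile = __DIR__ . '/last_run.txt';              // assumes this directory is writable
$last = is_file($stateFile) ? (int) file_get_contents($stateFile) : 0;

if (time() - $last >= 300) {                          // run at most once every 5 minutes
    file_put_contents($stateFile, time());            // record this run first
    include __DIR__ . '/alert_exe.php';               // run the checks / send the email
    echo "checks executed";
} else {
    echo "skipped - ran less than 5 minutes ago";
}
?>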
Try this solution:
<?php
$interval = 5; // minutes
set_time_limit(0);
while (true) {
    $now = time();
    include("alert_exe.php");
    // sleep for the remainder of the interval, accounting for how long the include took
    sleep($interval * 60 - (time() - $now));
}
?>
Stop the script by restarting Apache, or build in a return value from your included script that turns the while (true) condition false.
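One variant of that idea is a flag file that the loop checks on every pass, so the worker can be shut down without touching Apache. A minimal sketch (the stop.flag name is just an example):

<?php
set_time_limit(0);
$interval = 5; // minutes

while (!file_exists(__DIR__ . '/stop.flag')) {   // create stop.flag to end the loop cleanly
    $now = time();
    include __DIR__ . '/alert_exe.php';
    sleep(max(0, $interval * 60 - (time() - $now)));
}
?>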
Related
I'm currently running an Apache server (2.2) on my local machine (Windows) which I'm using to run some PHP scripts to take care of some tedious work. One of the scripts involves a ton of moving, resizing, and download / uploading files to another server. I would very much like the script to run constantly so that I don't have to baby the script by starting it up again every time it times out.
set_time_limit(0);
ignore_user_abort(1);
Both are set in my script, but after about 30 minutes to an hour the script stops and I get the 504 Gateway Time-out message in my browser. Is there something I'm missing in Apache or PHP to prevent the timeout? Or should I be running the script a different way?
Or should I be running the script a different way?
Definitely. You should run your script from the command line (CLI).
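If you move the work to the CLI, you may also want the script to refuse to run through the web server at all, so a stray browser request cannot start a second long-running copy. A small sketch using PHP's SAPI name (the worker.php name is hypothetical):

<?php
// Abort early unless the script was started from the command line
if (PHP_SAPI !== 'cli') {
    http_response_code(403);
    exit("This script must be run from the command line, e.g.: php worker.php\n");
}

set_time_limit(0);   // the CLI has no execution limit by default, but be explicit
// ... the long-running move/resize/upload work goes here ...
?>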
If I had to implement something like this, I would use 2 different scripts:
A. process_controller.php
B. process.php
The workflow should be:
the user calls script A from a browser
script A starts script B using system() or exec() and passes it a "process token" via the command line
script B writes its execution status into a shared space: a file named after the token, a database table, or in general anything that script A can also read using the token as a reference
the page served by script A contains an AJAX call that polls script A for the status of the process for the given token (a rough sketch of such a controller follows the token notes below)
Ajax polling:
<script>
    var $myToken; // set this to the token returned when the process was started

    function ajaxPolling() {
        $.get('process_controller.php?action=getStatus&token=' + $myToken, function (data) {
            $('.result').html(data);
        });
    }

    setInterval(ajaxPolling, 60 * 1000); // every minute
</script>
There are some considerations about the communication between the two processes, depending on how many instances of script B you want to be able to run in parallel:
Just one: you don't need a random/unique token
One per user: session_start(); $token = session_id();
More than one per user: session_start(); $token = session_id().microtime();
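As a rough sketch (not a fixed API, names and file locations are illustrative), process_controller.php could look roughly like this, assuming process.php writes its progress into a status file named after the token as it works:

<?php
// process_controller.php - illustrative sketch
session_start();
// "one per user" token from the list above; the AJAX call passes it back on every poll
$token = isset($_GET['token']) ? preg_replace('/[^a-zA-Z0-9]/', '', $_GET['token']) : session_id();
$statusFile = sys_get_temp_dir() . "/status_" . $token . ".txt";
$action = isset($_GET['action']) ? $_GET['action'] : '';

if ($action === 'start') {
    // launch process.php (script B) in the background and return to the browser immediately
    $cmd = 'php ' . escapeshellarg(__DIR__ . '/process.php') . ' '
         . escapeshellarg($token) . ' > /dev/null 2>&1 &';
    exec($cmd);
    echo $token;   // hand the token back so the page can poll with it
} elseif ($action === 'getStatus') {
    // process.php is assumed to write its progress into $statusFile as it works
    echo is_file($statusFile) ? file_get_contents($statusFile) : 'no status yet';
}
?>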
If you need to run it from your browser, you should make sure that there is no PHP execution limit in the php.ini file, and also that no limit is set in mod_php (or whatever you are using) under Apache.
Use php's system() to call a shell script which starts a service/background task.
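Note that system() normally blocks until the command finishes; if the shell script is meant to keep running as a background service, redirect its output and detach it, roughly like this (the path is a placeholder):

<?php
// fire-and-forget: the trailing '&' plus redirected output lets PHP return immediately
system('nohup /path/to/start_service.sh > /dev/null 2>&1 &');
echo 'background task started';
?>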
I have a PHP script that runs on a shared hosting environment server. This PHP script takes a long time to run. It may take 20 to 30 minutes to finish a run. It is a recurring background process. However I do not have control over when the process starts (it could be triggered every five minutes, or every three hours, no one knows).
Anyway, at the beginning of this script I would like to detect whether the previous run is still going. If the earlier run is still running and has not finished, the script should not run again. If it is not running, then the new process starts.
In other words, here is a pseudo code. Let's call the script abc.php
1. Start script abc.php
2. Check if an older version of abc.php is still running. If it is running, then terminate
3. If it is not running, then continue with abc.php and do your work which might take 30 minutes or more
How can I do that? Please keep in mind this is shared hosting.
UPDATE: I was thinking of using a DB detection mechanism. So, when the script starts, it will set a value in a DB as 'STARTED=TRUE', and when done, it will set 'STARTED=FALSE'. However this solution is not proper, because there is no guarantee that the script will terminate properly. It might get interrupted, and therefore may not update the STARTED value to FALSE. So the DB solution is out of the question. It has to be process detection of some sort, or maybe a different solution that I did not think of. Thanks.
If this is a CGI process, I would try using exec + ps, if the latter is available in your environment. A quick SO search turns up this answer: https://stackoverflow.com/a/7182595/177920
You'll need a script that is responsible for (and separate from) checking whether your target script is running, of course; otherwise you'll always see that your target script is running, based on the order of operations in your pseudo code.
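Building on that, a common pattern on shared hosts is to record the PID when the script starts and have the next run ask ps whether that PID is still alive; unlike the STARTED flag in the database, a PID left behind by a crashed run simply won't be found. A rough sketch (the abc.pid file name is arbitrary):

<?php
// abc.php - skip this run if the previous one is still alive
$pidFile = __DIR__ . '/abc.pid';

if (is_file($pidFile)) {
    $oldPid = (int) trim(file_get_contents($pidFile));
    // ps exits with status 0 only if the process still exists
    exec('ps -p ' . $oldPid, $out, $exitCode);
    if ($oldPid > 0 && $exitCode === 0) {
        exit("previous run (pid $oldPid) still active\n");
    }
}

file_put_contents($pidFile, getmypid());   // record this run's PID

// ... the 20-30 minute job goes here ...

unlink($pidFile);   // optional; a stale file is harmless with the ps check above
?>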
You can implement a simple locking mechanism: create a temporary lock file when the script starts, and check beforehand whether the lock file already exists. If it does, don't run the script; if it doesn't, create the lock file and run the script. At the end of a successful run, delete the lock file so that it will run properly next time.
if (!locked()) {
    lock();
    // your code here
    unlock();
} else {
    echo "script already running";
}

function lock()   { file_put_contents("write.lock", 'running'); }
function locked() { return file_exists("write.lock"); }
function unlock() { return unlink("write.lock"); }
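If the stale-lock concern from the UPDATE matters (the script dying before unlock() would block every later run), flock() is a variant worth considering: the operating system releases the lock automatically when the process exits, cleanly or not. A minimal sketch (the abc.lock name is arbitrary):

<?php
$fp = fopen(__DIR__ . '/abc.lock', 'c');   // create the lock file if it does not exist

if (!flock($fp, LOCK_EX | LOCK_NB)) {      // non-blocking: fail fast if another run holds the lock
    exit("script already running\n");
}

// ... long-running work here; the lock is held for the whole run ...

flock($fp, LOCK_UN);                        // released automatically on crash or exit as well
fclose($fp);
?>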
I am using the paid host Hosting24 to run my website. I got a cron job which execute the following code every 1 minute.
<?php
require_once('connect.php');

for ($c = 0; $c < 60; $c = $c + 5)
{
    // php to mysql queries SELECT/ UPDATE/ INSERT etc...
    sleep(5);
}

mysql_close($my_connection);
?>
I tried to use the for loop to make the script run for 1 minute. That way the script should effectively run for as long as I want, because the server will execute it again every minute.
However, after my website had been up for a short while I could no longer connect to it. I cannot even access my cPanel.
Is my cron job script overloading the system, so that the system goes down?
How should I set up my cron job script so that it runs every minute and lasts for 1 minute?
Thanks.
It's been my experience that cron jobs that need to include files should contain the full path to that file (the CLI environment can differ greatly from the environment inside the web server). Try that and see if it helps.
If not, the next thing you need to do is turn the cron job off and run it from the CLI yourself, using top to look at the process usage. See how long it takes for your cron to run.
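On the full-path point above, one way to make the include from the question work under both the web server and cron without hard-coding an absolute path is to anchor it to the script's own directory, for example:

<?php
// resolves relative to this file, regardless of the cron job's working directory
require_once(__DIR__ . '/connect.php');
// on PHP < 5.3, dirname(__FILE__) can be used instead of __DIR__
?>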
I have an issue which stops my (slow) process. I start my background slow process using a php page with a button as follow:
<form id="trial" method="post" action=""><input name="trial" value="Start!" type="submit">
<?php
set_time_limit(0);
if (isset($_POST['trial'])) {
system("/srv/www/cgi-bin/myscript.sh");
}
?>
At some point after about 1.5 days the process stops. I have modified php.ini and the Apache config file, inserting a very high number in the timeout directives, but it seems this does not work, or some other process is stopping myscript.sh. Do you have any suggestions?
thanks!
I'm assuming you have access to the server via SSH based on your post.
If the real goal is to get your script to run continuously, why not log in and
nohup myscript.sh
As long as your script behaves, it will continue to run as long as it needs to after you close the terminal.
Check the Logs
To determine why your script is failing, you'll definitely want to check /var/log/kern.log and /var/log/syslog. Look for any entries containing your script or any of its children. Your script may be getting killed off by the kernel (exceeding limits) or erroring out at runtime.
Executing the script continuously can cause problems, so set up a cron job that runs it every 30 minutes instead.
set_time_limit(30);
system("/srv/www/cgi-bin/myscript.sh");
Cron setup:
*/30 * * * * php /path/to/your/php/file.php