Making cron quit a running script - PHP

I am running a script which constantly works over my database; however, it is necessary to restart the script once an hour. Obviously I can't keep doing that by hand. I don't want to use a daemon, it's too complex for me right now. The easier solution is a cron job, but its biggest drawback is that it won't stop the previous instance of the script, which runs in an infinite while(true) loop.
However, is it possible if I make a function in the script, let's say
function exitScript()
{
exit;
}
and then in the cron job do something like
php /home/Path/public_html/webservice/myScript.php exitScript and then
php /home/Path/public_html/webservice/myScript.php
What would the format be, and how can I run both, one after the other, from a cron job (or from another PHP script that does so)?
I need advice.

Here is a little trick, easy to set up, which you can use.
1st: set your cron job to run every hour:
0 * * * * ..... cronjob.php
2nd: at the start of your script, define a constant with the start time:
define('SCRIPT_START_TIME', time());
3rd: in your exit function, add a condition that exits if 59 minutes have passed between that constant and the current time:
function exitScript()
{
    if ((time() - SCRIPT_START_TIME) > 59 * 60) {
        exit();
    }
}
4th: call exitScript() at the start of each iteration of the while loop.
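Put together, the cron-friendly loop might look like this minimal sketch (the sleep() call and the work placeholder are illustrative additions, not part of the answer above):
<?php
// cronjob.php -- a minimal sketch tying the four steps together.
define('SCRIPT_START_TIME', time());

function exitScript()
{
    // Quit once 59 minutes have passed, so the next hourly cron run takes over cleanly.
    if ((time() - SCRIPT_START_TIME) > 59 * 60) {
        exit();
    }
}

while (true) {
    exitScript();   // check elapsed time at the top of every iteration
    // ... do one unit of work against the database here ...
    sleep(1);       // optional: keeps the loop from spinning flat out
}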

Related

How to Execute a PHP task and put on hold/sleep another task that is being executed right after until the first one is completed/half through?

I have a PHP script that does some tasks.
Let's say someone executes the code by requesting https://localhost/code.php.
I have an employee who executes the script over cURL from a separate server. What is the best way to prevent him from launching the script a second time before the already-running script has actually completed/finished?
TL;DR: I need a way for the second invocation to wait (sleep for a few seconds, or until the first task completes) before it runs.
TLDR2: Looking for a function [the title says it].
Any ideas? Thanks.
While a session won't work with cURL, the idea is valid -- you need to store something persistent outside of your script. So how about writing to a local file, or to a database?
if (file_exists('lock.txt')) die;
file_put_contents('lock.txt', 'This file prevents script execution', LOCK_EX);
// ... your script code here ...
unlink('lock.txt');
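If two requests can arrive at almost the same instant, the file_exists check and the write are not atomic. A variant using flock() closes that window (a sketch only; lock.txt is just an example path):
<?php
// Sketch: take an exclusive, non-blocking lock instead of checking for file existence.
$fp = fopen('lock.txt', 'c');                 // 'c' creates the file if it does not exist
if (!$fp || !flock($fp, LOCK_EX | LOCK_NB)) {
    die('Another instance is already running.');
}

// ... your script code here ...

flock($fp, LOCK_UN);                          // release the lock (also released on exit)
fclose($fp);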
If you know that only one user will ever hit your server, you can simply use session data.
<?php
session_start();
if (true === ($_SESSION["NOT_FINISHED"] ?? false)) {
    die("Previous job is not finished yet!");
} else {
    $_SESSION["NOT_FINISHED"] = true;
    // start whatever job needs to be done here
    // ...
    // when the job is done and finished, release the busy flag
    unset($_SESSION["NOT_FINISHED"]);
}

How to run a php file at a specific time?

So, I'm working on a time-sensitive website in PHP on my CentOS server. I have a random time selected in the future, within 24 hours of the present. At that point, a PHP file needs to execute, then a new time must be selected and the same file run again at that time. How can I accomplish this? I looked briefly at cron jobs, but I couldn't find a way to make them run at a specific, random time.
You can use the at command: run your PHP file and, at the end, make another call to at for the next time. Something like this:
<?php
// your PHP code in here, and then find out when is the next call time
$time = date('H:i', intval($time)); // or another good way to make sure time value is safe to use as a shell argument, like using escapeshellarg()
$run_me = "/usr/bin/env php " . __FILE__;
exec("echo '$run_me' | at '$time'");
One possible workaround is to run a script from a cron job, say, every 10 minutes. At the top of the script, check a specific file that contains a timestamp. If the current time is greater than the value from the file, do the job and write the new timestamp value into the file.
$time_to_run = intval(file_get_contents('my.timestamp'));
if (time() >= $time_to_run) {
    // ... do stuff ...
    file_put_contents('my.timestamp', time() + $random_value); // note: filename first, then data
}
If you need more granularity, a better option would be to run it as a daemon (see the advice here) and just loop forever (probably with some sleep() inside) until the time comes.
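A minimal sketch of that daemon-style loop, reusing the my.timestamp file from above (the 30-second poll interval and the rand() range are assumptions):
<?php
// Sketch: long-running loop that wakes up periodically and fires once the stored time passes.
while (true) {
    $time_to_run = intval(@file_get_contents('my.timestamp'));

    if ($time_to_run > 0 && time() >= $time_to_run) {
        // ... do the time-sensitive work here ...

        // pick the next random run time within the next 24 hours
        file_put_contents('my.timestamp', time() + rand(60, 24 * 60 * 60));
    }

    sleep(30);   // how often we check; adjust to the granularity you need
}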

Don't run script if it's already running

I've been completely unsuccessful finding an answer to this question. Hopefully someone here can help.
I have a PHP script (a WordPress template, to be specific) that automatically imports and processes images when a user hits it. The problem is that the image processing takes up a lot of memory, particularly if multiple users are accessing the template at the same time and initiating the image processing. My server crashed multiple times because of this.
My solution to this was to not execute the image-processing function if it was already running. Before the function started running, I would check a database entry named image_import_running to see if it was set to false. If it was, the function then ran. The very first thing the function did was set image_import_running to true. Then, after it was all finished, I set it back to false.
It worked great -- in theory. The site hasn't crashed since, I can tell you that. But there are two major problems with it:
If the user closes the page while it's loading, the script never finishes processing the images and therefore never sets image_import_running back to false. The template will never process images again until it's manually set to false.
If the script times out while it's processing images -- and that's a strong possibility if there are many images in the queue -- you have essentially the same problem as No. 1: the script never gets to the point where it sets image_import_running back to false.
To handle No. 1 (the first of the two problems I noticed), I added ignore_user_abort(true) to the script. Did it work? I don't know, because No. 2 is still an issue. That's where I'm stumped.
If I could ask the server whether the script was running or not, I could do something like this:
if($import_running && $script_not_running) {
$import_running = false;
}
But how do I set that $script_not_running variable? Beats me.
I've shared this entire story with you just in case you have some other brilliant solution.
Try using ignore_user_abort(true); it will continue to run even if the user leaves and closes the browser.
You might also want to store a number instead of true/false in the DB record and set a maximum number of processes that can run together.
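A sketch of that counter idea using plain PDO (the job_status table, running_jobs column, connection details and the limit of 3 are all made up for illustration; in WordPress you would go through $wpdb instead):
<?php
// Sketch: cap concurrent runs with a counter row in the database.
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$max_processes = 3;

$pdo->beginTransaction();
$running = (int) $pdo->query('SELECT running_jobs FROM job_status FOR UPDATE')->fetchColumn();

if ($running >= $max_processes) {
    $pdo->rollBack();
    die('Too many imports already running.');
}

$pdo->exec('UPDATE job_status SET running_jobs = running_jobs + 1');
$pdo->commit();

try {
    // ... image processing here ...
} finally {
    // decrement even if the processing throws
    $pdo->exec('UPDATE job_status SET running_jobs = running_jobs - 1');
}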
As others have suggested, it would be best to move the image processing out of the request itself.
As an interim "fix", store a timestamp alongside image_import_running when a processing job begins (e.g., image_import_commenced). This is a very crude mechanism, but if you know the maximum time that a job can run before timing out, the script can check whether that period of time has elapsed.
e.g., if image_import_running is still true but the current time is more than 10 minutes past image_import_commenced, run the processing anyway.
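A sketch of that check using WordPress options (the option names come from the question, the 10-minute ceiling from the example above):
<?php
// Sketch: treat the lock as stale once it is older than the longest a job can possibly run.
$max_job_seconds = 10 * 60;

$running   = get_option('image_import_running', false);
$commenced = (int) get_option('image_import_commenced', 0);

if (!$running || (time() - $commenced) > $max_job_seconds) {
    update_option('image_import_running', true);
    update_option('image_import_commenced', time());

    // ... image processing here ...

    update_option('image_import_running', false);
}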
What about setting a transient with an expiry time that would throttle the operation?
if(!get_transient( 'import_running' )) {
set_transient( 'import_running', true, 30 ); // set a 30 second transient on the import.
run_the_import_function();
}
I would rather store the job in the database, flag it as pending, and set up a cron job to execute the processing, one job at a time.
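A rough sketch of that approach (the jobs table, its columns and the connection details are invented for illustration): the page request only inserts a row, and a cron-driven worker picks up one pending job at a time.
<?php
// worker.php -- sketch of a cron-driven worker (run it every minute from cron).
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');

// Claim one pending job, marking it as running so a second worker skips it.
$pdo->beginTransaction();
$job = $pdo->query("SELECT id, payload FROM jobs
                    WHERE status = 'pending' ORDER BY id LIMIT 1 FOR UPDATE")
           ->fetch(PDO::FETCH_ASSOC);

if (!$job) {
    $pdo->rollBack();
    exit; // nothing to do this minute
}

$pdo->prepare("UPDATE jobs SET status = 'running' WHERE id = ?")->execute([$job['id']]);
$pdo->commit();

// ... process $job['payload'] here (e.g. import the images) ...

$pdo->prepare("UPDATE jobs SET status = 'done' WHERE id = ?")->execute([$job['id']]);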
For me, I just use this simple idea with a text file, for example run.txt.
At the top of the script use:
if (file_get_contents('run.txt') != 'run') { // here the script will do its work
    $file = fopen('run.txt', 'w+');
    fwrite($file, 'run');
    fclose($file);
} else {
    exit(); // if it finds 'run' in run.txt, the script stops
}
And add this at the end of your script file:
$file = fopen('run.txt', 'w+');
fwrite($file, ''); // clears the 'run' word for the next try ;)
fclose($file);
That will check whether the script is already running by looking at the contents of run.txt;
if the word 'run' is there in run.txt, it will not run.
Running it from a cron job would definitely be a better solution. The idea of storing the URL in a table is a good one.
To answer the original question, you can run a ps auxwww command with exec (see this page: How to get list of running php scripts using PHP exec()? ) and move your function into a separate PHP file.
exec("ps auxwww|grep myfunction.php|grep -v grep", $output);
Just add the following at the top of your script.
<?php
// Ensures single instance of script run at a time.
$fileName = basename(__FILE__);
$output = shell_exec("ps -ef | grep -v grep | grep $fileName | wc -l");
//echo $output;
if ($output > 2)
{
echo "Already running - $fileName\n";
exit;
}
// Your php script code.
?>

PHP sleep delay

In PHP, I want to put a delay of a number of seconds on each iteration of the loop.
for ($i=0; $i <= 10; $i++) {
$file_exists=file_exists($location.$filename);
if($file_exists) {
break;
}
//sleep for 3 seconds
}
How can I do this?
Use the PHP sleep() function. http://php.net/manual/en/function.sleep.php
It halts execution for the given number of seconds before the next iteration. So something like this:
for ($i=0; $i <= 10; $i++) {
$file_exists=file_exists($location.$filename);
if($file_exists) {
break;
}
sleep(3); // this should halt for 3 seconds for every loop
}
I see what you are doing... you're delaying a script to repeatedly check for a file on the filesystem (one that is being uploaded or written by another script, I assume). This is a BAD way to do it.
Your script will run slowly, choking the server if several users are running it.
Your server may time out for some users.
HDD access is a costly resource.
There are better ways to do this.
You could use Ajax, with a timeout to call your PHP script every few seconds. This avoids the slow-loading script, and you can keep checking indefinitely (the current for loop only runs for about 33 seconds and then stops).
You can use a database. In some cases database access is faster than HDD access, especially with views and caching. The script creating or uploading the file can set a flag in a table (e.g. file_exists), and another script can check that field.
You can use sleep(3), which sleeps the thread for 3 seconds.
Correction: sleep() in PHP works in seconds.
Here are two ways to make a PHP script sleep for a period of time. When you want to pause the script for a while, use these functions.
In these examples, the first part of the code runs immediately and the second part runs after the time delay.
With the sleep() function you define the sleep time in seconds.
Example:
echo "Message 1";
// The first part of code.
$timeInSeconds = 3;
sleep($timeInSeconds);
// The second part of code.
echo "Message 2";
This way you can make a PHP script sleep for 3 seconds. With this function the sleep time must be a whole number (integer) of seconds.
With the usleep() function you define the sleep time in microseconds. This is convenient for intervals that need more precision than one second.
Example:
echo "Message 1";
// The first part of code.
$timeInMicroSeconds = 2487147;
usleep($timeInMicroSeconds);
// The second part of code.
echo "Message 2";
Use this function when you want to sleep PHP for less than a second (a fractional value). In this example the script sleeps for 2.487147 seconds.
Have you considered running a PHP daemon script under supervisord? I use it for multiple tasks that need to be running all the time.
The catch is making sure that each time your script runs, you check its memory use. If it gets too high, stop the process and let supervisord start it up again.
I have successfully used this approach for scripts that constantly check database records for tasks to process.
It might be overkill, but it's worth considering.
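The memory check inside such a worker can be as small as this sketch (the 100 MB ceiling and the 5-second pause are arbitrary choices); when the script exits, supervisord with autorestart enabled simply starts a fresh process:
<?php
// Sketch: a supervisord-managed worker that exits when its memory use grows too large.
$memory_limit_bytes = 100 * 1024 * 1024;   // arbitrary 100 MB ceiling

while (true) {
    // ... check database records for tasks to process and handle them here ...

    if (memory_get_usage(true) > $memory_limit_bytes) {
        exit(0);   // let supervisord restart the worker with a clean memory footprint
    }

    sleep(5);      // pause between polls
}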

execute php function every 20 seconds

I have a PHP script that parses live scores from a JSON URL and then, if some scores have changed, calls another PHP script to push notifications to iOS devices.
My question is: how can I make the first script run every 20 seconds?
It depends on the host OS that you're using. For Linux/Unix etc. the preferred tool for scheduling tasks is cron. Windows, macOS and the like have other built-in schedulers.
Using cron, to schedule a task to run every minute might look something like this:
* * * * * /path/to/command
The granularity, however, only goes down to the minute. So without a more granular task scheduler, you may need to handle the 20-second delay yourself.
One idea is to wrap the task in a script which runs the command, sleeps for 20 seconds, runs the command again, sleeps for 20 seconds, and runs it one more time. The cron job then calls that wrapper script. My bash is a little rusty, but the idea looks something like this:
/path/to/command
sleep 20
/path/to/command
sleep 20
/path/to/command
$total_time = 0;
$start_time = microtime(true);
while ($total_time < 60) { // run while less than a minute
    // DoSomething();
    echo $total_time . "\n";
    sleep(20); // wait amount in seconds
    $total_time = microtime(true) - $start_time;
}
Add this to a cron job to run every minute.
Maybe you can try sleep:
while(1) {
exec('php path/to/script.php');
sleep(20);
}
Generally a cron job is perfect for this kind of task. Have a look at this here: http://www.thegeekstuff.com/2011/07/php-cron-job/
If that's not applicable, you might be able to script something for the PHP CLI (http://php.net/manual/en/features.commandline.php), which allows unlimited runtime. Just put a while(true) loop with a sleep() in it and run it with the php command, like > php yourscript.php
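A minimal sketch of such a CLI script (the score-checking body is a placeholder; run it as php worker.php, ideally under a process manager so it restarts if it dies):
<?php
// worker.php -- sketch: check the live-score feed every 20 seconds from the CLI.
set_time_limit(0);   // the CLI has no time limit by default, but be explicit

while (true) {
    // ... fetch the JSON scores, compare them with the previous ones,
    //     and call the push-notification script when something changed ...

    sleep(20);       // wait 20 seconds before the next check
}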
