I have 3 scripts that do some stuff.
I want to run them continuously and concurrently.
Let's say for example:
The first script takes 30 minutes to finish.
The second - 20 mins.
The third - 5 mins.
So I need each of them to restart immediately after it finishes.
The 3 scripts run UPDATEs against the same DB, but they need to work separately.
They can all run at the same time, but no script should have more than one instance running at once (sorry for my English).
Let's explain what I mean with example:
firstScript.php is running
secondScript.php is running
thirdScript.php is running
firstScript.php tries to start, but the previous instance is still running. Wait (till it finishes).
Maybe some shell script will do the job, but how?
Make a bash script that takes one argument, and have it do something like this:
#!/bin/bash
# Takes the name of a PHP script; skips this run if that script is still going.
if [ -f "/tmp/$1" ]
then
    echo "Script already running"
else
    touch "/tmp/$1"
    php "$1"
    rm "/tmp/$1"
fi
Set up a cron job to run this script, passing it the name of the PHP script you want to run.
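For example, the three crontab entries might look like this (the wrapper name and paths are assumptions; each entry fires every minute, and the wrapper exits immediately if that script is still running):
* * * * * /path/to/runlock.sh firstScript.php
* * * * * /path/to/runlock.sh secondScript.php
* * * * * /path/to/runlock.sh thirdScript.php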
You could execute a shell command just before the PHP script dies. Example:
<?php
// ... the script's real work ...
$i = 0;
while ($i < 1000) {
    $i++;
}
// as the very last step, launch the script again
shell_exec("bin/php.exe some_script.php");
If you are working on a shared hosting account, this might not work due to security restrictions.
Note: "bin/php.exe" needs to be edited for your server; point it to wherever your PHP is installed.
Related
I have a CentOS VPS with WHM/cPanel installed. I want to run a PHP script from the command line for an unlimited time.
My script looks like this:
<?php
set_time_limit(0);
while(true)
{
//code to send me email
sleep(600);
}
?>
I know that this script should run for an unlimited time.
I have used these commands:
php myfile.php &
nohup php myfile.php &
I found these commands on Stack Overflow, and they run fine. But after one hour, the script stops automatically.
I think I am doing this right, but I do not know what is killing the process.
If I'm not, I want to know how to run this script for an unlimited time.
What you are doing is correct. It should run. I have PHP scripts that run for much longer than an hour. They run for days on end. I also have programs that are not PHP that should run forever, but don't. It is because they die due to a bug in the program. For example, xscreensaver dies on me about once a week. To keep it running, I use this shell script (which you can use to keep your PHP running):
while :        # loop forever
do
    xscreensaver &
    wait       # block until the program exits, then loop around and restart it
done
Now, running that shell script will start the program again if it ever dies for any reason.
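A sketch of the same loop adapted to the script from this question (the wrapper file name is my own invention):
#!/bin/sh
# keepalive.sh - restart myfile.php whenever it exits, forever
while :
do
    php myfile.php
done
Start it with nohup ./keepalive.sh & so it keeps running after you disconnect.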
If you have console access, try using a cron job to run this file.
see : https://www.centos.org/docs/5/html/Deployment_Guide-en-US/ch-autotasks.html
also : http://alvinalexander.com/linux/linux-crontab-file-format-example
Make sure you use the php command in the cron job.
Note: you'll need admin rights to edit or work with cron jobs.
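For example, an @reboot entry can start the script once when the machine boots (the php path is an assumption; check yours with which php):
@reboot /usr/bin/php /path/to/myfile.php >/dev/null 2>&1 &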
I have a php-cli script that is run by cron every 5 minutes. Because this interval is short, multiple processes run at the same time. That's not what I want, since this script has to write into a text file a numeric id that is incremented each time. It happens that several writers hit this text file at the same time, and the value written is incorrect.
I have tried to use PHP's flock function to block writing to the file while another process is writing to it, but it doesn't work.
$fw = fopen($path, 'r+');
if (flock($fw, LOCK_EX)) {  // blocks here until the lock is free
    ftruncate($fw, 0);
    fwrite($fw, $latestid);
    fflush($fw);
    flock($fw, LOCK_UN);
}
fclose($fw);
So I suppose the solution is to create a bash script that checks whether an instance of this PHP script is running, and if so waits until it has finished. But I don't know how to do that. Any ideas?
The solution I'm using with a bash script is this:
exec 9>/path/to/lock/file
if ! flock -n 9 ; then
echo "another instance is running";
exit 1
fi
# this now runs under the lock until 9 is closed (it will be closed automatically when the script ends)
File descriptor 9 is opened on /path/to/lock/file, and flock makes any newly started instance exit at once unless no other instance of the script is running.
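Putting it together, the full wrapper that cron would invoke might look like this (the PHP script path is illustrative):
#!/bin/bash
# exit immediately if another copy holds the lock, otherwise run the job
exec 9>/path/to/lock/file
if ! flock -n 9 ; then
    echo "another instance is running"
    exit 1
fi
php /path/to/increment-script.php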
How can I ensure that only one instance of a script is running at a time (mutual exclusion)?
I don't really understand how incrementing a counter every 5 minutes will result in multiple processes trying to write the counter file at the same time, but...
A much simpler approach is to use a simple locking mechanism like the one below:
<?php
$lock_filename = 'nobodyshouldincrementthecounterwhenthisfileishere';
if (file_exists($lock_filename)) {
    return; // another run is in progress, so bail out
}
touch($lock_filename);
// your stuff...
unlink($lock_filename);
Being simple, this approach will not deal with the situation where the script breaks before it can remove the lock file; in that case the script would never run again until the file is removed manually.
More sophisticated approaches are also possible, as you suggest: e.g. fork the job into its own process, write the PID into a file, and then, before running the job, check whether that PID is still running.
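A rough sketch of that PID-file idea (the file name is illustrative; posix_kill requires the posix extension, and signal 0 merely tests whether the process exists):
<?php
$pidfile = '/tmp/counter-job.pid';
if (file_exists($pidfile)) {
    $pid = (int) file_get_contents($pidfile);
    if ($pid > 0 && posix_kill($pid, 0)) {
        exit; // the previous run is still alive
    }
}
file_put_contents($pidfile, getmypid());
// ... do the actual work here ...
unlink($pidfile);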
To prevent the next session of a program (such as the next cron job) from starting while the previous session is still running, I recommend either a check built into your program or an external check for a running process of that program. Just execute, before starting your program,
ps -ef | grep <process_name> | grep -v grep | wc -l
and check whether its result is 0. Only in that case should your program be started.
I suppose you must guarantee the absence of third-party processes with a similar name (for this purpose, give your program a longer, unique name), and the name of your program must not contain the pattern "grep".
This works well in combination with normal, regular starting of your program configured in a cron table and handled by the cron daemon.
If your check is written as an external script, the crontab entry might look like
<time_specification> <your_starter_script> <your_program> ...
Two important remarks: the exit code of your_starter_script must be 0 when it does not start your program, and it is better to suppress all writing to stdout or stderr by this script.
Such a starter is very short and a simple programming exercise, so I won't provide its complete code.
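Still, a rough sketch of the idea (the argument handling and the "$0" self-exclusion are illustrative, not tested):
#!/bin/sh
# starter: $1 is the program to run, the rest are its arguments
prog="$1"; shift
# count running instances, excluding the grep and this starter itself
count=$(ps -ef | grep "$prog" | grep -v grep | grep -v "$0" | wc -l)
if [ "$count" -eq 0 ]; then
    "$prog" "$@" >/dev/null 2>&1
fi
exit 0  # always exit 0 and write nothing, as noted above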
Instead of using cron to run your script every 5 minutes, how about using at to schedule the script to run again 5 minutes after it finishes? Near the end of your script, you can use shell_exec() to run an at command that schedules the next run, like so:
echo "/path/to/script" | at now + 5 minutes
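From PHP, that could be done near the end of the script like this (assuming at is installed and permitted for the user the script runs as):
<?php
// schedule this same script to run again in 5 minutes, then finish up
shell_exec('echo "php ' . escapeshellarg(__FILE__) . '" | at now + 5 minutes');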
Or, perhaps even simpler than my previous answer (using at to schedule the script to run again in 5 minutes): make your script a daemon by using a non-terminating loop, like so:
while (1) {
    // whatever your script does here....
    sleep(300); // wait 5 minutes
}
Then you can do away with scheduling via cron or at altogether. Simply run your script in the background from the command line, like so:
/path/to/your/script &
Or add /path/to/your/script to /etc/rc.local to make your script start automatically when the machine boots.
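For the rc.local route, the line might look like this (redirected and backgrounded so it doesn't block the boot sequence):
/path/to/your/script >/dev/null 2>&1 &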
I have seen this question on here before, so I am sorry for the repetition, but I have still not found an answer to my problem.
I have a bash script that takes a while to run. It needs to be passed variables set by a user on a webpage (don't worry, there will be plenty of validation for security etc.).
I can get the bash file to start, but it dies after maybe 20 seconds when run from the webpage.
When run from the terminal, it runs absolutely fine.
OK, so I have the following:
$bashFile = shell_exec('./CoinCreationBashFile.sh "'.$coinName.'" "'.$coinNameAbreviation.'" "'.$blockReward.'" "'.$blockSpacing.'" "'.$targetTimespan.'" "'.$totalCoins.'" "'.$firstBitAddy.'" "'.$seedNode.'" "'.$seedName.'" "'.$headline.'" ');
Now this executes the bash file, but after reading up on PHP's shell_exec with nohup I came up with the following:
$bashFile = shell_exec('nohup ./CoinCreationBashFile.sh "'.$coinName.'" "'.$coinNameAbreviation.'" "'.$blockReward.'" "'.$blockSpacing.'" "'.$targetTimespan.'" "'.$totalCoins.'" "'.$firstBitAddy.'" "'.$seedNode.'" "'.$seedName.'" "'.$headline.'" >/dev/null 2>&1 &');
But this still died after a short time :(
So I read up on set_time_limit and max_execution_time and set these to something like 10000000 in the php.ini file... yet still no joy :(
I just want to run a bash script without it timing out and exiting. I don't really want to have to put an intermediate step in there, but someone suggested I look at ZeroMQ to "detach the worker from the process", so I may have to go that route.
Many thanks in advance.
Don't try running a script via the browser if it takes more than 60 seconds. Instead, run it over SSH or as a cron job.
My project calls for 3 PHP scripts that are run with if-else conditions. The first script is loaded on the index page of the site to check if a condition is set, and if it is, it calls the second script. The second script checks whether other conditions are set, and it finally calls the last script if everything is good.
Now I could do this by just including the scripts in the if statement, but since the final result is a resource-hogging MySQL dump, I need it to run independently of the original trigger page.
Also, those scripts should continue doing their thing once triggered, regardless of the user's actions on the index page.
One last thing: it should be able to run on Windows and *nix.
How would you do this?
Does the following code make any sense?
if ($blah != $blah_size) {
    shell_exec('php first-script.php > /dev/null 2>/dev/null &');
}
// If the size matches, die
else {
}
Thanks a million in advance.
UPDATE: just in case someone else is going through the same deal.
There seems to be a bug in PHP when running scripts as CGI, but the command line works with all the Apache versions I've tested.
See the bug https://bugs.php.net/bug.php?id=11430
So instead I call the script like this:
exec("php-cli mybigfile.php > /dev/null 2>/dev/null &");
Or you could call it from the shell. It works on *nix systems, but my local Windows box is hopeless, so if anyone runs it on Windows and it works, please update this.
I would not do this via shell_exec, because you'd have no control over how many of these resource-hogging processes were running at any one time. A user could go click-click-click-click and essentially halt your machine.
Instead, I'd build a work queue. Instead of running the dump directly, the script would submit a record to some sort of FIFO queue (could be a database table or a text file in a dir somewhere) and then immediately return. Next you'd have a cron script that runs at regular intervals and checks the queue to see if there's any work to do. If so, it picks the oldest thing, and runs it. This way, you're assured that you're only ever running one dump at a time.
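A bare-bones sketch of that idea using a spool directory as the queue (all paths and file names here are mine, not from the question):
<?php
// enqueue.php - called by the trigger page; returns immediately
$dir = '/var/spool/dumpqueue';
file_put_contents($dir . '/' . microtime(true) . '.job', "dump requested\n");

<?php
// worker.php - run from cron; handles at most one job per invocation
$dir = '/var/spool/dumpqueue';
$jobs = glob($dir . '/*.job');
if ($jobs) {
    sort($jobs);      // microtime-based names sort oldest first
    unlink($jobs[0]); // claim the oldest job before doing the work
    // ... run the resource-hogging MySQL dump here ...
}
Because the worker takes only one job per cron run, only one dump can ever be in flight at a time.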
The easiest way I can think of is to do
exec("screen -d -m php long-running-script.php");
and then it will return immediately and run in the background. screen will allow you to connect to it and see what's happening.
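For example, to check on it later:
screen -ls    # list running screen sessions
screen -r     # reattach (add the session id if there is more than one)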
You can also do what you're doing with 'nohup php long-running-script.php', or by writing a simple C app that does daemonize() and then execs your script.
I'm working in a WAMP development environment and testing how long it takes to get the site indexed. I'm doing this by running cron manually.
The problem is that if there are 700 jobs in the job_queue, each run of cron does only some of them, so I need to run cron several times. How could I keep calling cron in a loop until there are no jobs left in the job_queue?
Also, I'm open to any drush alternatives. I'm aware of drush cron, but it also does only some of the jobs each run, so it needs to be run again manually.
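In other words, something like this shell loop is what I'm after (assuming the jobs live in a job_queue table and drush can reach the site; the tail skips the column header):
#!/bin/sh
# keep running cron until the job_queue table is empty
while :
do
    left=$(drush sql-query "SELECT COUNT(*) FROM job_queue" | tail -n1)
    [ "$left" -eq 0 ] && break
    drush cron
done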
If you want to run something all at once until it's done, cron is the wrong tool for the job; the Batch API is the tool for that. Unfortunately the search module is written to update only on cron, and there's not a lot of code in search_cron to copy into a batch function. Still, I'd suggest going the batch route rather than wrapping some sort of pseudo-batch around cron, as your end goal doesn't seem to involve cron at all.
Step 1: run cron every minute.
Step 2: check whether the script is already running; if so, exit the new instance.
$lid = '';
if (is_array($argv) && isset($argv[1])) {
    $lid = $argv[1];
}
if ($lid !== '') {
    $xx = array();
    // look for another instance started with the same id; the grep itself is excluded
    exec('ps x | grep "thefile.php ' . $lid . '" | grep -v "grep"', $xx);
    if (count($xx) > 1) {
        die('Script ' . $lid . ' running already... exiting... ' . "\n");
    }
}
Put it in cron to run every minute:
* * * * * php thefile.php 1
* * * * * php thefile.php 2
* * * * * php thefile.php 3
to run 3 scripts at the same time.
drush search-index will generate the remaining search index for you.