I'm working in a WAMP development environment and testing how long it takes to get the site indexed. I'm doing this by running cron manually.
The problem is that if there are 700 jobs in the job_queue, each run of cron does only some of them, so I need to run cron several times. How could I keep calling cron in a loop until there are no more jobs left in the job_queue?
I'm also open to drush alternatives. I'm aware of drush cron, but it too does only some of the jobs each run, so it also has to be run again manually.
If you want to run something all at once until it's done, cron is the wrong tool for the job; the Batch API is the tool for that. Unfortunately the search module is written to update only on cron, but there's not a lot of code in search_cron to copy into a batch function. So I'd suggest going the batch route rather than wrapping some sort of pseudo-batch around cron, as your end goal doesn't seem to involve cron at all.
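That said, if all you need on a dev box is a quick pseudo-batch wrapper, a rough sketch (assuming drush is on your PATH and the queue really is a table named job_queue, as in your question) could just keep invoking cron until the table is empty:
<?php
// Keep running cron until the job_queue table is empty.
// "job_queue" is taken from the question; adjust the table name for your site.
do {
    $out = shell_exec('drush sql-query "SELECT COUNT(*) FROM job_queue"');
    $remaining = (int) preg_replace('/\D/', '', (string) $out); // ignore any column header in the output
    if ($remaining > 0) {
        echo "$remaining jobs left, running cron again...\n";
        shell_exec('drush cron');
    }
} while ($remaining > 0);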
Step 1: run cron every minute.
Step 2: have the script check whether it is already running; if so, exit the "new" instance.
$lid = '';
if (is_array($argv) && isset($argv[1])) {
    $lid = $argv[1];
}
if ($lid !== '') {
    $xx = array();
    // Count how many copies of "thefile.php <id>" are already running.
    // Pass $xx normally; call-time pass-by-reference (&$xx) is a fatal error in PHP 5.4+.
    exec('ps x | grep "thefile.php ' . $lid . '" | grep -v "grep"', $xx);
    if (count($xx) > 1) {
        die('Script ' . $lid . ' running already... exiting... ' . "\n");
    }
}
Put it in cron to run every minute. To run three scripts at the same time, add three entries:
php thefile.php 1
php thefile.php 2
php thefile.php 3
drush search-index will generate the remaining search index for you.
I am currently following a tutorial that teaches how to create a queue in php. An infinite loop was created in a php script. I simplified the code in order to focus on the question at hand:
while(1) {
echo 'no jobs to do - waiting...', PHP_EOL;
sleep(10);
}
I use PuTTY (with an SSH connection) to connect to the Linux terminal in my shared hosting account (GoDaddy). If I run php queuefile.php, I know it will run with no problems (I already tested the code with a finite for loop instead of the infinite while loop).
QUESTION: How could I exit out of the infinite loop once it has started? I have already read online the option of creating code that "checks" if it should continue looping with something like the following code:
$bool = TRUE;
while ($bool) {
    if (!file_exists('allow.txt')) { // stop looping once the file is removed
        $bool = FALSE;
    }
    // ... the rest of the code
}
though I am curious if there might be a command I can type in the terminal, or a set of keys I can push that will cause the script to terminate. If there is any way of terminating the script, or if there is a better way to make the previous "check", I would love your feedback!
Pressing Ctrl+C should stop a program that is running in the foreground.
You could also kill it from another session: log in, run ps aux | grep my-php-script.php to confirm it is your program, then use pkill -f my-php-script.php to kill the process.
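If you want the loop itself to react to a termination request instead of just being killed, a small sketch using the pcntl extension (often not enabled on shared hosts, so treat this as an assumption) looks like this:
<?php
// Exit the loop cleanly on Ctrl+C (SIGINT) or kill/pkill (SIGTERM).
declare(ticks=1); // let PHP check for pending signals regularly

$running = true;
pcntl_signal(SIGTERM, function () use (&$running) { $running = false; });
pcntl_signal(SIGINT,  function () use (&$running) { $running = false; });

while ($running) {
    echo 'no jobs to do - waiting...', PHP_EOL;
    sleep(10); // sleep() returns early when a signal arrives
}

echo 'caught a signal, shutting down cleanly', PHP_EOL;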
It sounds like you want to set up cron on your server. Log in to the server via PuTTY and create a cron job.
For example:
After logging in, run
crontab -e
Then add a line of the form
<minute> <hour> <day-of-month> <month> <day-of-week> /path/to/command arg1 arg2
I have a php-cli script that is run by cron every 5 minutes. Because this interval is short, multiple processes end up running at the same time. That's not what I want, since this script has to write a numeric id, incremented each time, into a text file. When several writers hit the text file at once, the value written comes out incorrect.
I have tried to use PHP's flock function to block writing to the file while another process is writing to it, but it doesn't work.
$fw = fopen($path, 'r+');
if (flock($fw, LOCK_EX)) {
    ftruncate($fw, 0);
    fwrite($fw, $latestid);
    fflush($fw);
    flock($fw, LOCK_UN);
}
fclose($fw);
So I suppose the solution is to create a bash script that checks whether an instance of this PHP script is already running and, if so, waits until it has finished. But I don't know how to do it. Any ideas?
The solution I'm using with a bash script is this:
exec 9>/path/to/lock/file
if ! flock -n 9 ; then
echo "another instance is running";
exit 1
fi
# this now runs under the lock until 9 is closed (it will be closed automatically when the script ends)
A file descriptor (9) is opened on the lock file; flock -n fails immediately when another instance already holds the lock, so the new process exits instead of running a second copy.
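The same idea also works from inside PHP, which may be closer to what you tried with flock(): take a non-blocking exclusive lock on a dedicated lock file and bail out if another instance already holds it (the lock file path is just an example):
<?php
// Single-instance guard using a dedicated lock file.
// Mode 'c' creates the file if needed without truncating it.
$lock = fopen('/tmp/increment-counter.lock', 'c');
if (!$lock || !flock($lock, LOCK_EX | LOCK_NB)) {
    exit("another instance is running\n");
}

// ... do the real work here (e.g. update the counter file) ...

flock($lock, LOCK_UN);
fclose($lock);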
How can I ensure that only one instance of a script is running at a time (mutual exclusion)?
I don't really understand how incrementing a counter every 5 minutes will result in multiple processes trying to write the counter file at the same time, but...
A much simpler approach is to use a simple locking mechanism similar to the below:
<?php
$lock_filename = 'nobodyshouldincrementthecounterwhenthisfileishere';
if (file_exists($lock_filename)) {
    return;
}
touch($lock_filename);
// your stuff...
unlink($lock_filename);
As a simple approach this will not deal with the situation where the script dies before it can remove the lock file, in which case it will never run again until the file is removed by hand.
More sophisticated approaches are also possible, as you suggest: e.g. fork the job in its own process, write the PID into a file, and then, before running the job, check whether that PID still belongs to a running process.
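A rough sketch of that PID-file variant (the path is an example, and posix_kill() needs the posix extension):
<?php
$pidFile = '/tmp/myjob.pid';

// If a previous run left a PID behind, check whether that process still exists.
// posix_kill($pid, 0) sends no signal; it only probes for the process.
if (file_exists($pidFile)) {
    $oldPid = (int) file_get_contents($pidFile);
    if ($oldPid > 0 && posix_kill($oldPid, 0)) {
        exit; // previous run is still active
    }
}

file_put_contents($pidFile, getmypid());

// your stuff...

unlink($pidFile);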
To prevent the next run of a program (such as the next cron job) from starting while the previous run is still going, I recommend checking for a running process of that program, either from inside the program itself or from an external starter. Just execute, before starting your program,
ps -ef|grep <process_name>|grep -v grep|wc -l
and check the result; only if it is 0 should your program be started.
You must of course guarantee that no third-party process has a similar name (so give your program a long, unique name), and the name of your program must not contain the pattern "grep".
This works well in combination with the normal, regular starting of your program by the cron daemon, as configured in the crontab.
If the check is written as an external starter script, the crontab entry might look like
<time_specification> <your_starter_script> <your_program> ...
Two important remarks: the exit code of your_starter_script must be 0 when it decides not to start your program, and it is better for that script to write nothing to stdout or stderr.
Such a starter is a short and simple programming exercise, so I don't feel a need to provide its complete code.
Instead of using cron to run your script every 5 minutes, how about using at to schedule your script to run again 5 minutes after it finishes? Near the end of your script, you can use shell_exec() to run an at command to schedule your script to run again in 5 minutes, like so:
echo "/path/to/script" | at now + 5 minutes
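In PHP that could look roughly like this (the script path is an example; at reads the command from standard input, hence the echo and pipe):
<?php
// Near the end of the script: queue the next run five minutes from now.
$cmd = '/usr/bin/php /path/to/script.php';
shell_exec('echo ' . escapeshellarg($cmd) . ' | at now + 5 minutes 2>&1');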
Or, perhaps even simpler than my previous answer (using at to schedule the script to run again in 5 minutes) is make your script a daemon, by using a non-terminating loop, like so:
while (1) {
    // whatever your script does here....
    sleep(300); // wait 5 minutes
}
Then, you can do away with scheduling by way of cron or at altogether. Just simply run your script in the background from the command line, like so:
/path/to/your/script &
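Note that if you start it this way over SSH, the process will usually be killed when you log out unless you detach it first (for example with nohup or a terminal multiplexer such as screen).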
Or, add /path/to/your/script in /etc/rc.local to make your script start automatically when the machine boots.
I am using the paid host Hosting24 to run my website. I have a cron job which executes the following code every minute.
<?php
require_once('connect.php');

for ($c = 0; $c < 60; $c = $c + 5) {
    // php to mysql queries SELECT / UPDATE / INSERT etc...
    sleep(5);
}

mysql_close($my_connection);
?>
I used the for loop to make the script run for roughly one minute. Effectively the script should then run for as long as I want, because the server starts it again every minute.
However, after the site had been up for a short while I could no longer connect to it; I couldn't even access my cPanel.
Is my cron job overloading the system and bringing it down?
How should I set up the cron job so that it runs every minute and lasts for one minute?
Thanks.
It's been my experience that cron jobs that need to include files should contain the full path to that file (the CLI environment can differ greatly from the environment inside the web server). Try that and see if it helps.
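For example, instead of a bare require_once('connect.php'), something like this is more robust under cron (the absolute path is just an illustration):
<?php
// Resolve the include relative to this file, not the current working directory.
require_once __DIR__ . '/connect.php';
// or spell the path out completely:
// require_once '/home/youruser/public_html/connect.php';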
If not, the next thing you need to do is turn the cron job off and run it from the CLI yourself, using top to look at the process usage. See how long it takes for your cron to run.
I have 3 scripts that do some stuff.
I want to run them continuously and concurrently.
Let's say for example:
First script took 30 minutes to finish.
Second - 20 mins.
Third - 5 mins.
So I need each of them to start again immediately after it finishes.
The 3 scripts UPDATE the same DB, but they need to work independently.
All three can run at the same time, but no script should have more than one copy of itself running at once.
Let's explain what I mean with example:
firstScript.php is running
secondScript.php is running
thirdScript.php is running
firstScript.php tries to start again, but the previous run is still going, so it waits until that run finishes.
Maybe some shell script will do the job, but how?
Make a bash script that takes one argument, and have it do something like this:
if [ -f /tmp/$1 ]
then
    echo "Script already running"
else
    touch /tmp/$1    # create the lock file
    php $1           # run the PHP script
    rm /tmp/$1       # release the lock when it finishes
fi
Set up a cron to run this script and pass it the name of the php script you want to run.
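For example, entries like these (the wrapper name and paths are only placeholders; the cd matters because both the lock file and the php call use the bare script name):
* * * * * cd /var/www/jobs && /path/to/runlocked.sh firstScript.php
* * * * * cd /var/www/jobs && /path/to/runlocked.sh secondScript.php
* * * * * cd /var/www/jobs && /path/to/runlocked.sh thirdScript.php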
You could execute a shell command just before the PHP script dies. Example:
$i = 0;
while ($i < 1000) {
    $i++;
}
shell_exec("bin/php.exe some_script.php");
Keep in mind that shell_exec() waits for the command it launches to finish, so you may want to background the command and redirect its output if the current process should exit right away. If you are working on a shared hosting account this might not work due to security restrictions.
Note that "bin/php.exe" needs to be edited for your server; point it to wherever your PHP is installed.
I have a PHP script that should be run automatically every day.
Since the PHP script normally only runs on a request, how can I do this?
Is there any way other than using a cron job task?
Two options:
Use the crontab daemon
Hire a worker and make him open script in a browser every 24 hours
The choice is yours :)
To use crontab, type crontab -e in console, the text file opens. Add a line at the end:
0 0 * * * /usr/bin/php /var/www/mysite/httpdocs/daily_stats.php
Where:
0 0 * * * - run every day at 00:00
/usr/bin/php - path to your PHP binary (can be determined with the which php command)
/var/www/mysite/httpdocs/daily_stats.php - path to your PHP script
if which php outputs nothing, install PHP cli by running:
sudo aptitude install php5-cli
Good luck!
Use the cron job option; it starts automatically and gives you the result every 24 hours.
Use the cron job; this is the best solution.
Otherwise, you can run an infinite loop inside PHP and sleep for 24 hours.
That's a horrible solution, though.
If cron isn't available for some reason, you could use Google App Engine's cron for this, because cron is the way to go.
If cron is not available you could also run a PHP script from the CLI that keeps running all the time.
In the script you can make an infinite while loop.
In the loop, check a file on disk or a DB record (you can control this file or DB record from an external script, telling the looping script what to do, e.g. execute another script at a given hour, and when to exit).
If you use a database, don't forget to open and close the DB connection on each pass of the loop.
I'd sleep the loop for a minute or so each iteration. You could use this instead of Linux cron for many more things.
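A rough sketch of that idea (the file names, the target hour, and run_daily_task() are all made-up examples):
<?php
// CLI daemon: keeps looping while a control file exists,
// and fires the daily task once per day at around 02:00.
$lastRunDate = null;

while (file_exists('/tmp/keep_running')) {
    if ((int) date('G') === 2 && date('Y-m-d') !== $lastRunDate) {
        run_daily_task();               // placeholder for your real daily script
        $lastRunDate = date('Y-m-d');   // make sure it fires only once that day
    }
    sleep(60); // check roughly once a minute
}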