I'd like to improve the script below, or find out whether there is a better way to rewrite it for better results.
I use it in two files, cron1.php and cron2.php, each executed every 5 seconds, and I need to prevent either of them from running twice.
Execution time depends on the file size; most of the time it takes around 2 seconds, but for huge files it can take 25-30 seconds, which is why I need to stop a second run from starting.
Am I on the right track? Any suggestions for improvement?
$fp = fopen("cron.lock", "a+");
if (flock($fp, LOCK_EX | LOCK_NB))
{
echo "task started\n";
// Here is my long script
// Cron run every 5 seconds
sleep(2);
flock($fp, LOCK_UN);
}
else
{
echo "task already running\n";
exit;
}
fclose($fp);
I generally do a file operation such as dumping getmypid() into the lock file, so from outside I can tell which PID holds the lock; in some debugging cases that is helpful.
Finally, unlink the lock file when you are done.
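A minimal sketch of that idea, built on the script from the question (the cron.lock name comes from there; the rest is an assumption about how you might wire it in):
$fp = fopen("cron.lock", "a+");

if (!flock($fp, LOCK_EX | LOCK_NB)) {
    echo "task already running\n";
    exit;
}

// Record which process holds the lock, handy for debugging from the shell.
ftruncate($fp, 0);
fwrite($fp, (string) getmypid());
fflush($fp);

// ... long-running task here ...

flock($fp, LOCK_UN);
fclose($fp);

// Remove the lock file once the work is done.
unlink("cron.lock");
If you prefer, you can also simply leave the lock file in place between runs; flock works on the open handle, so the file's continued existence does no harm.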
I came across this link while trying to learn how to lock files, to prevent one script reading from a file while another is writing to it, or two scripts writing to the same file simultaneously.
I created two scripts, readandwritelock.php and readlock.php. The first retrieves the file with file_get_contents, appends to it, and writes it back to the same file with file_put_contents($file, $data, LOCK_EX); the second just retrieves the file with file_get_contents after flock($file, LOCK_SH).
<?php
//readandwritelock.php
$myfile = fopen('15-11-2018.txt', 'r+');
if (flock($myfile, LOCK_SH)) {
    echo "Gotten lock<br>";
    $current = file_get_contents('15-11-2018.txt');
    /* I commented this out on my second test to see if file_put_contents would still work.
       After uncommenting it for the third test, it does not work anymore.
    if (flock($myfile, LOCK_UN)) {
        echo "Unlocked<br>";
    }
    */
    $current .= "appending";
    if (file_put_contents('15-11-2018.txt', $current, LOCK_EX)) {
        echo "Success";
    }
    else {
        echo "Failed";
        // browser loads indefinitely, so this never runs
    }
    fclose($myfile);
}
?>
The problem I am facing: on the first try I was able to call file_get_contents after getting the lock, then release the lock, append, and call file_put_contents($file, $data, LOCK_EX). On the second try I decided to comment out the release of the LOCK_SH lock to see what would happen; the script then loaded indefinitely ("Waiting for localhost...") in my browser. I reverted the change for my third try, but the script still loads indefinitely. It's as if the LOCK_SH was never released.
I must be doing something wrong, but I do not know what exactly. Could someone explain?
This was tested with XAMPP on macOS High Sierra, using Chrome.
<?php
//readlock.php
//works as normal
$myfile = fopen('15-11-2018.txt', 'r');
if (flock($myfile, LOCK_SH)) {
    echo "Gotten lock<br>";
    $current = file_get_contents('15-11-2018.txt');
    echo $current;
    if (flock($myfile, LOCK_UN)) {
        echo "<br>Unlocked";
    }
    fclose($myfile);
}
?>
The reason your browser seems to load indefinitely is that your PHP script never finishes.
First you take a LOCK_SH (a shared, or read, lock) on your file, which is fine while you are reading its content.
The problem is that you then also try to take a LOCK_EX (an exclusive lock) on the same file inside the file_put_contents call. file_put_contents therefore blocks until all other locks (shared AND exclusive) are released, which can never happen here: this is a deadlock.
For your code to work properly, you can either take an exclusive lock in the first place:
if (flock($myfile, LOCK_EX)) {
// ...
or you unlock the shared lock before you write:
flock($myfile, LOCK_UN);
if ( file_put_contents('15-11-2018.txt', $current, LOCK_EX) ) {
// ...
In general it is a good idea to keep a lock's life as short as possible. If you plan to make extensive manipulations to your data between reading and writing, I would recommend unlocking the file right after reading and locking it again just before writing.
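One possible minimal sketch of that pattern, using the 15-11-2018.txt file from the question (note that another process could modify the file between the unlock and the re-lock, so re-read or merge if that matters to you):
$myfile = fopen('15-11-2018.txt', 'r+');

// Read under a shared lock, then release it straight away.
flock($myfile, LOCK_SH);
$current = stream_get_contents($myfile);
flock($myfile, LOCK_UN);

// ... any lengthy manipulation happens here, without holding a lock ...
$current .= "appending";

// Re-acquire an exclusive lock only for the write itself.
flock($myfile, LOCK_EX);
ftruncate($myfile, 0);
rewind($myfile);
fwrite($myfile, $current);
fflush($myfile);
flock($myfile, LOCK_UN);
fclose($myfile);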
I have a cron job that does some updates. After that process completes, I need to run another cron job that inserts the updated records into another table. For this I am trying this script:
$output = shell_exec('ps -C php -f');
if (strpos($output, "php do_update.php") === false) {
    shell_exec('php insert_process.php');
}
But it does not do anything, and I get "ps is not an internal / external command".
So how can I find out whether the first cron job has finished executing? Once it has, I will run the second cron job to insert the data. Any help would be greatly appreciated.
You could use some kind of lock file (a sketch follows below).
First script:
at the beginning of the first script, create an empty file on disk called lockfile.txt (or any other name)
at the end, remove the file from the disk
Second script:
check whether the file called lockfile.txt exists; if not, run the code
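A rough sketch of that approach (the script names and lockfile.txt come from the question and answer; error handling is omitted):
// do_update.php (sketch)
touch('lockfile.txt');   // mark the update as running
// ... update work here ...
unlink('lockfile.txt');  // mark the update as finished

// insert_process.php (sketch)
if (file_exists('lockfile.txt')) {
    exit; // the update is still running, try again on the next cron run
}
// ... insert the updated records here ...
Keep in mind that if do_update.php dies before the unlink(), the lock file stays behind and insert_process.php will never run until you remove it by hand; the flock-based answer below does not have that problem.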
You can use flock: when running do_update.php, add a lock to it, and in insert_process.php run the process only if the previous lock has been released.
In your do_update.php add this:
$fp = fopen("lock.txt", "r+"); // note: "r+" requires lock.txt to already exist
if (flock($fp, LOCK_EX)) {
    // do your process
    flock($fp, LOCK_UN); // release the lock
}
In your insert_process.php add this:
$fp = fopen("lock.txt", "r+");
if (flock($fp, LOCK_EX | LOCK_NB)) { // checks whether the lock is free
    // do your process
}
I am using a crontab that executes a PHP file. I want to implement the flock() command to help prevent duplicate crontabs from running at one time. If I have:
* * * * * php /var/www/html/welcome.php
How can I add this flock() command? Thanks!
Try this:
$fh = fopen('mutex.txt','r'); // Any convenient file (MUTual EXclusion)
flock($fh, LOCK_EX); // get exclusive lock. Will block until lock is acquired
// Do your exclusive stuff...
flock($fh, LOCK_UN); // release lock
fclose($fh); // close Mutex file.
To complete this answer: since your crontab runs every minute, you may run into a problem.
If for any reason your script does not manage to finish its job within 1 minute, or it fails somewhere and never releases the lock (stuck inside a while loop, for instance), the next cron run will start and sit in your process list until the previous one releases its lock, and so on...
A better approach would be:
$fh = fopen('/path/to/mutex.txt', 'r'); // Any convenient file (MUTual EXclusion)
if (!flock($fh, LOCK_EX | LOCK_NB)) {   // Exit if the lock is still held
    exit(-1);
}

// Your code here

flock($fh, LOCK_UN); // release lock
fclose($fh);         // close mutex file
That will avoid any pile-up of PHP processes.
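Putting it together for the crontab line from the question: the crontab entry stays exactly as it is, and the lock check goes at the top of welcome.php (the lock-file path below is just an example):
<?php
// welcome.php — run by "* * * * * php /var/www/html/welcome.php"
$fh = fopen('/tmp/welcome.lock', 'c'); // 'c' creates the file if needed without truncating it
if (!flock($fh, LOCK_EX | LOCK_NB)) {
    exit; // a previous run is still busy, skip this minute
}

// ... the actual work of welcome.php ...

flock($fh, LOCK_UN);
fclose($fh);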
I want to have a PHP file that is used as a counter. It will a) echo the current value stored in a txt file, and b) increment that value, using an exclusive lock so no other script can read or write the file while it is being used.
User A will write and increment this number while User B requests to read the file. Is it possible for User A to lock this file so no one can read or write to it until User A's write is finished?
I've used flock in the past, but I'm not sure how to make the script wait until the file is available, rather than quitting if it's already locked.
My goal is:
LOCK counter.txt; write to counter.txt;
while at the same time
Read counter.txt; realize it's locked so wait until that lock is finished.
$fp = fopen("counter.txt", 'w+');
if(flock($fp, LOCK_EX)) {
fwrite($fp, $counter + 1);
flock($fp, LOCK_UN);
} else {
// try again??
}
fclose($fp);
From the documentation: "By default, this function will block until the requested lock is acquired."
So simply use flock in your reader (LOCK_SH) and writer (LOCK_EX), and it is going to work.
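For instance, a minimal reader sketch (counter.txt as in the question) will simply wait while a writer holds LOCK_EX:
$fp = fopen('counter.txt', 'r');
if (flock($fp, LOCK_SH)) { // blocks here until any exclusive lock is released
    echo stream_get_contents($fp);
    flock($fp, LOCK_UN);
}
fclose($fp);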
However, I highly discourage using a blocking flock without a timeout, because if something goes wrong your program will hang forever. To avoid this, use a non-blocking request like this (again, it is in the docs):
/* Activate the LOCK_NB option on an LOCK_EX operation */
if (!flock($fp, LOCK_EX | LOCK_NB)) {
    echo 'Unable to obtain lock';
}
Then wrap it in a for loop, with a sleep between attempts and a break after n failed tries (or after a total wait time).
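Something along these lines (the attempt count, sleep interval and file name are arbitrary examples):
$fp = fopen('counter.txt', 'c+');
$locked = false;

for ($try = 0; $try < 5; $try++) { // give up after 5 attempts
    if (flock($fp, LOCK_EX | LOCK_NB)) {
        $locked = true;
        break;
    }
    sleep(1); // wait a bit before retrying
}

if (!$locked) {
    echo 'Unable to obtain lock';
    exit;
}

// ... exclusive work on the file ...

flock($fp, LOCK_UN);
fclose($fp);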
EDIT: You can also look at some usage examples here. This class is part of the ninja-mutex library, which may interest you too.
I have a script and I want to make sure only one instance of it is running. I am doing:
$fp = fopen("lock.txt", "w");
if (flock($fp, LOCK_EX|LOCK_NB)) { // do an exclusive lock
//Processing code
flock($fp, LOCK_UN);
}else{
echo "Could not lock file!";
}
fclose($fp);
The problem is that when I start the second script, it just sits there waiting. If I then stop the first script, the second script prints "Could not lock file!". Why doesn't the second script report that message and stop immediately?
If the second script actually ran, it would see that the file is locked and should exit. When I watch my processes I can see the second script sitting there... it's just waiting.
Thanks.
EDIT: I have just tried a quick and dirty database lock, i.e. setting a running field to -1 and checking for it when the script starts, but that doesn't work. I have also tried using sockets as described here. It seems like the second file won't even run... Should I even be worried?
$fp = fopen("lock.txt", "w");
$block = 0;
if (flock($fp, LOCK_EX|LOCK_NB, $block)) { // do an exclusive lock
sleep(5);
flock($fp, LOCK_UN);
echo "Could lock file!";
}else{
echo "Could not lock file!";
}
fclose($fp);
works for me. Try adding the third parameter (the by-reference $wouldblock flag).
Edit:
You may have a session locking problem:
when you call session_start(), it will block (put further requests into a waiting state) any other request to your script until the original script that started the session has finished. To test this, try either accessing the page with two different browsers, or avoid calling session_start() in this script.
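If the script genuinely needs the session, a common workaround (an assumption about your setup, not something stated in the question) is to release the session lock before the long-running part with session_write_close():
session_start();
// ... read whatever is needed from $_SESSION ...
session_write_close(); // releases the session lock so a second request is not held up

// the long-running, flock-protected work continues here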