I have a JavaScript application which posts messages to a server. I have to gather those messages on the server side and analyse them later, so I'm simply writing them to a file. The problem is that when I open the file for reading, for example in Notepad, messages are not being written. Since flock() is blocking and locks should be mandatory on Windows, I expected the script to simply wait until I close the file and then save all pending messages, but it doesn't seem to work this way.
Is there a way to make sure that all messages will eventually be saved to the file, even if another process has exclusive access to it? I can't lose any message, even if someone opens the file for reading or copies it. Can I achieve this with PHP, or should I rather send the messages to a database instead?
The PHP version is 7.0.4, and my script looks like this:
<?php
$f = fopen('log.csv', 'a+');
flock($f, LOCK_EX);
$text = date('Y-m-d H:i:s'). ";" .htmlspecialchars($_POST["message"]). PHP_EOL;
clearstatcache();
fwrite($f, $text);
fflush($f);
flock($f, LOCK_UN);
fclose($f);
?>
flock() returns true on success and false on failure. With LOCK_NB added, the call returns immediately instead of blocking, so you can keep retrying in a loop until the lock is acquired:
while (!flock($f, LOCK_EX | LOCK_NB)) {
    sleep(5);
}
This won't fix the problem of your script timing out if another process has it locked for a long time. In that case, you might want to close the file and try opening a different file name.
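For completeness, here is a rough sketch of that fallback idea, combining the non-blocking retry loop with a limited number of attempts before switching files. The fallback file name log_fallback.csv is just an example, not something from the original question:
<?php
// Sketch only: retry the primary log a few times, then fall back to a
// second file (log_fallback.csv is a hypothetical name).
$f = fopen('log.csv', 'a+');
$attempts = 0;
while (!flock($f, LOCK_EX | LOCK_NB)) {
    if (++$attempts >= 3) {
        fclose($f);                            // give up on the primary file
        $f = fopen('log_fallback.csv', 'a+');  // hypothetical fallback name
        flock($f, LOCK_EX);                    // blocking lock on the fallback
        break;
    }
    sleep(5);
}
$text = date('Y-m-d H:i:s') . ";" . htmlspecialchars($_POST["message"]) . PHP_EOL;
fwrite($f, $text);
fflush($f);
flock($f, LOCK_UN);
fclose($f);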
I'd like to allow only one access at a time to a .txt file until the script finishes, so that the message "Busy" is shown when I open the same PHP page again.
I have tried this:
<?php
$file = fopen('file.txt', 'w+');
if (flock($file, LOCK_EX)) {
    sleep(60);
    flock($file, LOCK_UN);
} else {
    echo 'Busy';
}
fclose($file);
However it never shows the "Busy" message. Every new tab I open is in "sleep" mode. What am I doing wrong?
The documentation for flock is extremely poor.
As stated in the top comment:
If the file has been locked with LOCK_EX in another process, the CALL WILL BLOCK UNTIL ALL OTHER LOCKS have been released.
The result of this is that you will always block on the call to flock(). Your else clause will only execute if there is some error that prevents the file lock from being created - not when the file is already locked.
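If you want the "Busy" message, a non-blocking lock is one way to get it. Here is a minimal sketch of your script using LOCK_NB, which makes flock() return false immediately when another request already holds the lock:
<?php
$file = fopen('file.txt', 'w+');
if (flock($file, LOCK_EX | LOCK_NB)) { // returns false right away if locked
    sleep(60);                         // simulate the long-running work
    flock($file, LOCK_UN);             // release the lock
} else {
    echo 'Busy';                       // another tab/request holds the lock
}
fclose($file);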
In PHP, I have created an advisory lock using flock like this:
$fileHandle = fopen($filePath, 'c');
flock($fileHandle, LOCK_EX | LOCK_NB);
Running the same code in another process will subsequently fail since the lock is exclusive. But the second process is able to run:
$fileHandle = fopen($filePath, 'c');
flock($fileHandle, LOCK_UN); // returns true
The file is still locked though, as confirmed by running a third process. So why does the unlocking request return true?
While the PHP Manual page about flock() seems ambiguous on this front, I would suggest that the true return value is there because the function call itself executed successfully, rather than because any file was actually locked or unlocked.
Please also note that the PHP Manual page lists a lot of caveats regarding this function, and I would go so far as to suggest looking for an alternative routine to lock access to a file.
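If what you really want is to find out whether another process still holds the lock, one option (just a sketch, reusing your $filePath) is a non-blocking LOCK_EX attempt together with flock()'s third $wouldBlock argument:
<?php
$fileHandle = fopen($filePath, 'c');
if (flock($fileHandle, LOCK_EX | LOCK_NB, $wouldBlock)) {
    echo "Lock acquired, so nobody else was holding it\n";
    flock($fileHandle, LOCK_UN);
} elseif ($wouldBlock) {
    echo "Another process still holds the lock\n";
} else {
    echo "Locking failed for some other reason\n";
}
fclose($fileHandle);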
I came across this link while trying to learn how to lock files, to prevent a script from reading a file while another is writing to it, or two scripts from writing to the same file simultaneously.
I created two scripts, readandwritelock.php and readlock.php. The first retrieves the file with file_get_contents, appends to it and then writes it back to the same file with file_put_contents($file, $data, LOCK_EX); the second just retrieves the file with file_get_contents after flock($file, LOCK_SH).
<?php
//readandwritelock.php
$myfile = fopen('15-11-2018.txt', 'r+');
if (flock($myfile, LOCK_SH)) {
    echo "Gotten lock<br>";
    $current = file_get_contents('15-11-2018.txt');
    /*I commented this on my second test to see if file_put_contents will work.
    After uncommenting and third test, it does not work anymore.
    if (flock($myfile, LOCK_UN)) {
        echo "Unlocked<br>";
    }*/
    $current .= "appending";
    if (file_put_contents('15-11-2018.txt', $current, LOCK_EX)) {
        echo "Success";
    } else {
        echo "Failed";
        //browser loads indefinitely so this does not run
    }
    fclose($myfile);
}
?>
The problem I am facing is that on the first try I was able to call file_get_contents after getting the lock, then release the lock and proceed to append and call file_put_contents($file, $data, LOCK_EX). On the second try I commented out the release of the LOCK_SH lock to see what would happen. The script then loads indefinitely ("Waiting for localhost...") in my browser, so I reverted the change for my third try, but the script still loads indefinitely. It's as if the LOCK_SH was never released.
I must be doing something wrong, but I do not know what exactly it is. Could someone explain?
This was tested on XAMPP and macOS High Sierra and Chrome.
<?php
//readlock.php
//works as normal
$myfile = fopen('15-11-2018.txt', 'r');
if (flock($myfile, LOCK_SH)) {
    echo "Gotten lock<br>";
    $current = file_get_contents('15-11-2018.txt');
    echo $current;
    if (flock($myfile, LOCK_UN)) {
        echo "<br>Unlocked";
    }
    fclose($myfile);
}
?>
The reason your browser seems to load indefinitely is that your PHP script never finishes.
First you get a LOCK_SH (a shared or read lock) for your file, which is fine while you are reading the content.
The problem is that you also try to get a LOCK_EX (an exclusive lock) on the same file inside the file_put_contents() call. Therefore file_put_contents() blocks until all other locks (shared AND exclusive ones) have been released, which can never happen here because your own shared lock is still held (this is a deadlock).
For your code to work properly, you can either try to get an exclusive lock in the first place:
if( flock($myfile, LOCK_EX) ) {
// ...
or you unlock the shared lock before you write
flock($myfile, LOCK_UN);
if ( file_put_contents('15-11-2018.txt', $current, LOCK_EX) ) {
// ...
In general it is a good idea to keep a lock's lifetime as short as possible. If you plan to make extensive manipulations to your data between reading and writing, I would recommend unlocking the file right after reading and locking it again right before writing.
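Here is a sketch of the first option applied to your readandwritelock.php: hold a single exclusive lock for the whole read-modify-write cycle and avoid the extra lock taken inside file_put_contents():
<?php
$myfile = fopen('15-11-2018.txt', 'r+');
if (flock($myfile, LOCK_EX)) {
    $current = stream_get_contents($myfile); // read through the locked handle
    $current .= "appending";
    rewind($myfile);                         // back to the start of the file
    ftruncate($myfile, 0);                   // drop the old contents
    fwrite($myfile, $current);               // write the modified contents
    fflush($myfile);
    flock($myfile, LOCK_UN);                 // release the lock
}
fclose($myfile);
?>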
I have seen several script examples of PHP file writing where the file is locked inside an if statement, but I have not found any examples of what I should write in the else branch. I use this to create simple txt log files and do not want to print anything to the user if it fails. Can I skip the if/else, or what should I put in the else?
$fh = fopen($logFile, 'w');
if (flock($fh, LOCK_EX)) {
    foreach ($datecounts as $datecount) {
        fwrite($fh, $datecount . PHP_EOL);
    }
    flock($fh, LOCK_UN);
} else {
    //couldn't lock.
}
fclose($fh);
Of course you can skip it. But you can also use it for debugging purposes (like logging in case of errors).
If you don't want to do anything there, just release the lock and close the file:
flock($fh, LOCK_UN);
so that it will not cause an issue the next time you try.
do not want to print anything to the user if it fails...
So what do you want to do if it fails?
Or to put it another way, what are the implications of failure?
It looks like you're writing a log file, and if it can't lock the file you just want to silently fail. That means you won't get a log, nor any indication that logging isn't happening.
If logging is important to you then this isn't great. But if you're happy with that situation, then you're right; there's no need for the else clause.
But you should also consider what a failure to lock the file means. It could mean that you have a problem with file permissions, that someone else has the file open, that the disk is full, or even that there is a disk error. You may not care that the log file doesn't get written, but any of these problems could lead to a worse situation later in your program, and reporting it early could save you from that.
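For example, one middle ground (just a sketch built on your snippet) is to keep the user-facing output silent but record the failure with error_log(), so that a permissions or disk problem is still visible somewhere:
$fh = fopen($logFile, 'w');
if (flock($fh, LOCK_EX)) {
    foreach ($datecounts as $datecount) {
        fwrite($fh, $datecount . PHP_EOL);
    }
    flock($fh, LOCK_UN);
} else {
    // Nothing is shown to the user; the failure goes to the server error log.
    error_log("Could not lock $logFile; skipped writing " . count($datecounts) . " entries.");
}
fclose($fh);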
I have a file on a website. A PHP script modifies it like this:
$contents = file_get_contents("MyFile");
// ** Modify $contents **
// Now rewrite:
$file = fopen("MyFile","w+");
fwrite($file, $contents);
fclose($file);
The modification is pretty simple. It grabs the file's contents and adds a few lines. Then it overwrites the file.
I am aware that PHP has a function for appending contents to a file rather than overwriting it all over again. However, I want to keep using this method since I'll probably change the modification algorithm in the future (so appending may not be enough).
Anyway, I was testing this out, making like 100 requests. Each time I call the script, I add a new line to the file:
First call:
First!
Second call:
First!
Second!
Third call:
First!
Second!
Third!
Pretty cool. But then:
Fourth call:
Fourth!
Fifth call:
Fourth!
Fifth!
As you can see, the first, second and third lines simply disappeared.
I've determined that the problem isn't the contents string modification algorithm (I've tested it separately). Something is messed up either when reading or writing the file.
I think it is very likely that the issue is when the file's contents are read: if $contents, for some odd reason, is empty, then the behavior shown above makes sense.
I'm no expert with PHP, but perhaps the fact that I performed 100 calls almost simultaneously caused this issue. What if there are two processes, and one is writing the file while the other is reading it?
What is the recommended approach for this issue? How should I manage file modifications when several processes could be writing/reading the same file?
What you need to do is use flock() (a file lock).
What I think is happening is that your script grabs the file while the previous request is still writing to it. Since the file is still being written to, it is effectively empty at the moment PHP grabs it, so PHP gets an empty string, and once the later process is done it overwrites what the earlier ones wrote.
The solution is to have the script usleep() for a few milliseconds when the file is locked and then try again. Just be sure to put a limit on how many times your script can try.
NOTICE:
If another PHP script or application accesses the file, it may not necessarily use/check for file locks. This is because file locks are often seen as an optional extra, since in most cases they aren't needed.
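Here is a rough sketch of that retry idea, assuming the MyFile name from the question: a non-blocking lock attempt, a short usleep() between attempts, and a hard limit on the number of tries.
<?php
$fh = fopen('MyFile', 'c+');       // open for read/write without truncating
$locked = false;
for ($i = 0; $i < 50; $i++) {      // limit the number of attempts
    if (flock($fh, LOCK_EX | LOCK_NB)) {
        $locked = true;
        break;
    }
    usleep(10000);                 // wait 10 ms before trying again
}
if ($locked) {
    $contents = stream_get_contents($fh);
    // ** Modify $contents **
    rewind($fh);                   // move back to the start before rewriting
    ftruncate($fh, 0);             // drop the old contents
    fwrite($fh, $contents);
    flock($fh, LOCK_UN);
}
fclose($fh);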
So the issue is parallel access to the same file: while one instance is writing to the file, another is reading it before it has been updated.
Luckily PHP has a mechanism for locking the file so that no one can read from it until the lock is released and the file has been updated.
flock() can be used for this; see the PHP manual for the documentation.
You need to create a lock, so that any concurrent requests will have to wait their turn. This can be done using the flock() function. You will have to use fopen(), as opposed to file_get_contents(), but it should not be a problem:
$file = 'file.txt';
$fh = fopen($file, 'r+');
if (flock($fh, LOCK_EX)) {                   // Get an exclusive lock
    $data = fread($fh, filesize($file));     // Get the contents of the file
    // Do something with data here...
    rewind($fh);                             // Move the pointer back to the start
    ftruncate($fh, 0);                       // Empty the file
    fwrite($fh, $newData);                   // Write the new data to the file
    flock($fh, LOCK_UN);                     // Release the lock
    fclose($fh);                             // Close the handle
} else {
    die('Unable to get a lock on file: '.$file);
}