Any point to using else in a file lock? - php

I have seen several script examples of PHP file writing where they lock the file with an if statement. I have not found any examples of what I should put in the else. I use this to create simple txt log files and do not want to print anything to the user if it fails. Can I skip the if/else, or what should I have in the else?
$fh = fopen($logFile, 'w');
if (flock($fh, LOCK_EX)) {
    foreach ($datecounts as $datecount) {
        fwrite($fh, $datecount.PHP_EOL);
    }
    flock($fh, LOCK_UN);
} else {
    // couldn't lock.
}
fclose($fh);

Of course you can skip it. But you can also use it for debugging purposes (like logging in case of errors).

If you don't want to do anything, just release the lock:
flock($fh, LOCK_UN);
so that it will not cause an issue for you the next time you try.

do not want to print anything to the user if it fails...
So what do you want to do if it fails?
Or to put it another way, what are the implications of failure?
It looks like you're writing a log file, and if it can't lock the file you just want to silently fail. That means you won't get a log, nor any indication that logging isn't happening.
If logging is important to you then this isn't great. But if you're happy with that situation, then you're right; there's no need for the else clause.
But you should also consider what a failure to lock the file means. It could mean that you have a problem with file permissions or that someone else has it open, or maybe that the disk is full or even a disk error. You may not care that the log file doesn't get written, but any of these problems could lead to a worse situation later in your program, and reporting it early could save you from that.
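For example, a minimal sketch of failing silently for the user while still leaving a trace for yourself (this assumes error_log() output ends up somewhere you actually check, such as the web server's error log):

$fh = fopen($logFile, 'w');
if ($fh === false) {
    // could not even open the file; record it server-side, show the user nothing
    error_log("Unable to open log file: $logFile");
} elseif (flock($fh, LOCK_EX)) {
    foreach ($datecounts as $datecount) {
        fwrite($fh, $datecount.PHP_EOL);
    }
    flock($fh, LOCK_UN);
    fclose($fh);
} else {
    // opened but couldn't lock; again, record it server-side only
    error_log("Unable to lock log file: $logFile");
    fclose($fh);
}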

Related

How to rename() a file in PHP that needs to remain locked while doing so?

I have a text file which multiple users will be simultaneously editing (limited to an individual line per edit, per user). I have already found a solution for the "line editing" part of the required functionality right here on StackOverflow.com, specifically, the 4th solution (for large files) offered by #Gnarf in the following question:
how to replace a particular line in a text file using php?
It basically rewrites the entire file contents to a new temporary file (with the user's edit included) and then renames the temporary file to the original file once finished. It's great!
To avoid one user's edit causing a conflict with another user's edit if they are both attempting an edit at the same time, I have introduced flock() functionality, as can be seen in my variation on the code here:
$reading = fopen($file, 'r');
$writing = fopen($temp, 'w');
$replaced = false;
if ((flock($reading, LOCK_EX)) and (flock($writing, LOCK_EX))) {
    echo 'Lock acquired.<br>';
    while (!feof($reading)) {
        $line = fgets($reading);
        $values = explode("|", $line);
        if ($values[0] == $id) {
            $line = $id."|comment edited!".PHP_EOL;
            $replaced = true;
        }
        fputs($writing, $line);
    }
    flock($reading, LOCK_UN);
    flock($writing, LOCK_UN);
    fclose($reading);
    fclose($writing);
} else {
    echo 'Lock not acquired.<br>';
}
I've made sure the $temp file always has a unique filename. Full code here: https://pastebin.com/E31hR9Mz
I understand that flock() will force any other execution of the script to wait in a queue until the first execution has finished and the flock() has been released. So far so good.
However, the problem starts at the end of the script, when the time has come to rename() the temporary file to replace the original file.
if ($replaced) {
    rename($temp, $file);
} else {
    unlink($temp);
}
From what I have seen, rename() will fail if the original file still has a flock(), so I need to release the flock() before this point. However, I also need it to remain locked, or rename() will fail when another user running the same script immediately opens a new flock() as soon as the previous flock() is released. When this happens, it will return:
Warning: rename(temporary.txt,original.txt): Access is denied. (code: 5)
tl;dr: I seem to be in a bit of a Catch-22. It looks like rename() won't work on a locked file, but unlocking the file will allow another user to immediately lock it again before the rename() can take place.
Any ideas?
Update: After some extensive research into how flock() works (in layman's terms, there is no guarantee that another script will respect the "lock", so it is not really a "lock" at all in the literal sense of the word), I have opted for this solution instead, which works like a charm:
https://docstore.mik.ua/orelly/webprog/pcook/ch18_25.htm
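The gist of that pattern, as a rough sketch (assuming every writer agrees to go through the same dedicated .lock file, which is the part that makes it work):

$lock = fopen($file . '.lock', 'w');        // lock a separate file; the data file itself is never flock()ed
if (flock($lock, LOCK_EX)) {
    // ... read $file and write the edited copy to $temp, exactly as before ...
    if ($replaced) {
        rename($temp, $file);               // no lock is held on $file, so rename() is free to replace it
    } else {
        unlink($temp);
    }
    flock($lock, LOCK_UN);
}
fclose($lock);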
"Good lock" on your locking adventures.

PHP script file loads indefinitely on browser when flock (LOCK_SH/LOCK_EX)

I came across this link while trying to learn how to lock files to prevent a script reading from a file as another is writing, or two scripts writing to the same file simultaneously.
I created two scripts, readandwritelock.php and readlock.php: the first retrieves the file with file_get_contents, appends to it and then writes back to the same file with file_put_contents($file, $data, LOCK_EX); the second just retrieves the file with file_get_contents after flock($file, LOCK_SH).
<?php
// readandwritelock.php
$myfile = fopen('15-11-2018.txt', 'r+');
if (flock($myfile, LOCK_SH)) {
    echo "Gotten lock<br>";
    $current = file_get_contents('15-11-2018.txt');
    /* I commented this on my second test to see if file_put_contents will work.
       After uncommenting and third test, it does not work anymore.
    if (flock($myfile, LOCK_UN)) {
        echo "Unlocked<br>";
    }
    */
    $current .= "appending";
    if (file_put_contents('15-11-2018.txt', $current, LOCK_EX)) {
        echo "Success";
    } else {
        echo "Failed";
        // browser loads indefinitely so this does not run
    }
    fclose($myfile);
}
?>
The problem I am facing is that on the first try I was able to file_get_contents after getting the lock, then release the lock and proceed to append and file_put_contents($file, $data, LOCK_EX). However, on the second try I decided to comment out the release of the LOCK_SH lock to test and see what would happen. The script now loads indefinitely (Waiting for localhost...) in my browser, so I reverted the changes for my third try, but this time the script still loads indefinitely. It's as if the LOCK_SH was never released.
I must be doing something wrong, but I do not know what exactly it is. Could someone explain?
This was tested on XAMPP and macOS High Sierra and Chrome.
<?php
// readlock.php
// works as normal
$myfile = fopen('15-11-2018.txt', 'r');
if (flock($myfile, LOCK_SH)) {
    echo "Gotten lock<br>";
    $current = file_get_contents('15-11-2018.txt');
    echo $current;
    if (flock($myfile, LOCK_UN)) {
        echo "<br>Unlocked";
    }
    fclose($myfile);
}
?>
The reason your browser seems to load indefinitely is that your PHP script never finishes.
First you get a LOCK_SH (a shared or read lock) on your file, which is fine while you are reading the content.
The problem is that you then also try to get a LOCK_EX (an exclusive lock) on the same file inside file_put_contents. That call blocks until all other locks (shared AND exclusive) are released, which can never happen here because your own script is still holding the shared lock: a deadlock.
For your code to work properly, you can either try to get an exclusive lock in the first place
if( flock($myfile, LOCK_EX) ) {
// ...
or you unlock the shared lock before you write
flock($myfile, LOCK_UN);
if ( file_put_contents('15-11-2018.txt', $current, LOCK_EX) ) {
// ...
In general it is a good idea to keep a lock's lifetime as short as possible. If you plan to make extensive manipulations to your data between reading and writing, I would recommend unlocking the file right after reading and locking it again right before writing.
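To make the first suggestion concrete, here is a minimal sketch using the question's file name, doing the whole read-modify-write under one exclusive lock (note it reads through the already-locked handle instead of calling file_get_contents() on the path a second time):

<?php
$myfile = fopen('15-11-2018.txt', 'r+');
if (flock($myfile, LOCK_EX)) {                 // one exclusive lock covers the whole read-modify-write
    $current = stream_get_contents($myfile);   // read through the locked handle
    $current .= "appending";
    ftruncate($myfile, 0);                     // rewrite the file in place
    rewind($myfile);
    fwrite($myfile, $current);
    fflush($myfile);
    flock($myfile, LOCK_UN);
}
fclose($myfile);
?>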

How to make flock block if other process locked the file?

I have a JavaScript application which posts messages to a server. I have to gather those messages on the server side and analyse them later, so I'm simply writing them to a file. The problem is, when I open the file for reading, e.g. in Notepad, messages are not being written. Since flock() is blocking and the locks should be mandatory on Windows, I expected the script to simply wait until I close the file and then save all pending messages, but it doesn't seem to work this way. Is there a way to make sure that all messages will finally be saved to the file, even if another process got exclusive access to it? I can't lose any message, even if someone opens the file for reading or copies it. Can I achieve it with PHP, or maybe I should rather send messages to a database instead? PHP version is 7.0.4, my script looks like this:
<?php
$f = fopen('log.csv', 'a+');
flock($f, LOCK_EX);
$text = date('Y-m-d H:i:s') . ";" . htmlspecialchars($_POST["message"]) . PHP_EOL;
clearstatcache();
fwrite($f, $text);
fflush($f);
flock($f, LOCK_UN);
fclose($f);
?>
flock() returns true if successful or false if it failed. Note that flock() with a plain LOCK_EX blocks until it gets the lock, so a retry loop like this only makes sense with the non-blocking flag:
while (!flock($f, LOCK_EX | LOCK_NB)) {
    sleep(5);
}
This won't fix the problem of your script timing out if another process has it locked for a long time. In that case, you might want to close the file and try opening a different file name.
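One way to bound the wait yourself is a capped non-blocking retry; a small sketch (the five-attempt limit and the fallback file name are just examples, not anything the question requires):

$f = fopen('log.csv', 'a+');
$text = date('Y-m-d H:i:s') . ";" . htmlspecialchars($_POST["message"]) . PHP_EOL;
$tries = 0;
while (!flock($f, LOCK_EX | LOCK_NB)) {
    if (++$tries >= 5) {                    // roughly 25 seconds, well under a typical script timeout
        fclose($f);
        // fall back to a secondary file so the message isn't lost
        file_put_contents('log-overflow.csv', $text, FILE_APPEND | LOCK_EX);
        exit;
    }
    sleep(5);
}
fwrite($f, $text);
fflush($f);
flock($f, LOCK_UN);
fclose($f);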

PHP write to included file

I need to include one PHP file and execute a function from it.
After execution, at the end of the PHP script, I want to append something to that file.
But I'm unable to open the file. Is it possible to close the included file (or anything similar) so that I'll be able to append info to the PHP file?
include 'something.php';
echo $somethingFromIncludedFile;
//Few hundred lines later
$fh = fopen('something.php', 'a') or die('Unable to open file');
$log = "\n".'$usr[\''.$key.'\'] = \''.$val.'\';';
fwrite($fh, $log);
fclose($fh);
How to achieve that?
In general you should never modify your PHP code using PHP itself. It's a bad practice, first of all from a security standpoint. I am sure you can achieve what you need another way.
As Alex says, self-modifying code is very, VERY dangerous. And NOT separating data from code is just dumb. On top of both these warnings is the fact that PHP arrays are relatively slow and do not scale well (so you could file_put_contents('data.ser', serialize($usr)) / $usr = unserialize(file_get_contents('data.ser')), but it's only going to work for small numbers of users).
Then you've got the problem of using conventional files to store data in a multi-user context - this is possible, but you need to build sophisticated locking and queue management. This usually entails using a daemon to manage the queue / mutex and is invariably more effort than it's worth.
Use a database to store data.
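For instance, a minimal sketch of keeping those per-user values in SQLite via PDO instead of appending PHP code to the script (assumes the pdo_sqlite extension; the file, table and column names are just placeholders):

$db = new PDO('sqlite:' . __DIR__ . '/usr.sqlite');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$db->exec('CREATE TABLE IF NOT EXISTS usr (name TEXT PRIMARY KEY, value TEXT)');

// write: replaces appending $usr[...] lines to something.php
$stmt = $db->prepare('REPLACE INTO usr (name, value) VALUES (?, ?)');
$stmt->execute([$key, $val]);

// read it back later instead of include-ing the file
$stmt = $db->prepare('SELECT value FROM usr WHERE name = ?');
$stmt->execute([$key]);
$val = $stmt->fetchColumn();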
As you already know, this approach is not one of the good ones. If you REALLY want to include your file and then append something to it, you can do it the following way.
Be aware that using eval() is risky if you cannot be 100% sure that the content of the file does not contain harmful code.
// This part is a replacement for your include.
// Note: eval() expects code without an opening <?php tag, so drop out of PHP mode
// with "?>" first and let the file's own opening tag switch back into PHP.
$fileContent = file_get_contents("something.php");
eval('?>' . $fileContent);
// your echo goes here
// billion lines of code ;)
// file append mechanics
$fp = fopen("something.php", "a") or die("Unexpected file open error!");
fputs($fp, "\n" . '$usr[\'' . $key . '\'] = \'' . $val . '\';');
fclose($fp);

PHP - Open TXT file, add +1 to contents when link clicked

How can I make it so when a user clicks on a link on my web page, it writes to a .txt file named "Count.txt", which contains only a number and adds 1 to that number? Thank you.
If you forego any validity checking you could do it with something as simple as:
file_put_contents($theCounterFile, file_get_contents($theCounterFile)+1);
Addition:
There's talk about concurrency in this thread, and it should be noted that it is a good idea to use a database and transactions to deal with concurrency; I'd highly recommend against writing a bunch of plumbing code to do this in a file.
If you've ever had, or think you might ever have two requests for the resource in the same second you should look into PDO with mysql, or PDO with SQLite instead of a file, use transactions (and InnoDB or better if you're going for mysql).
But really, even if you get two requests in the same microsecond (highly unlikely), chances of locking the file are slim as it will not be kept open and the two requests will probably not be handled parallel enough to lock anyway. Reality check: how many hits on the same resource do you get on average in the same minute?...
If you decide to do anything more advanced, like say two numbers, you may want to consider using SQLite. It's about as fast and as simple as opening and closing a file, but is much more flexible.
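If you do go the SQLite route, a small sketch of the counter with an atomic increment (the database file and table names are just examples, and it assumes the pdo_sqlite extension is available):

$db = new PDO('sqlite:' . __DIR__ . '/counter.sqlite');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$db->exec('CREATE TABLE IF NOT EXISTS hits (id INTEGER PRIMARY KEY, total INTEGER NOT NULL)');
$db->exec('INSERT OR IGNORE INTO hits (id, total) VALUES (1, 0)');

// a single UPDATE is atomic, so concurrent requests cannot lose increments
$db->exec('UPDATE hits SET total = total + 1 WHERE id = 1');
$count = $db->query('SELECT total FROM hits WHERE id = 1')->fetchColumn();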
Open the file, lock the file (VERY important), read the number currently in there, add 1 to the number, write number back to file, release the lock and close the file.
i.e. something like:
$fp = fopen("count.txt", "r+");
if (flock($fp, LOCK_EX)) { // do an exclusive lock
$num = fread($fp, 10);
$num++;
fseek($fp, 0);
fwrite($fp, $num);
flock($fp, LOCK_UN); // release the lock
} else {
// handle error
}
fclose($fp);
should work (not tested).
Generally this is quite easy:
$count = (int)file_get_contents('/path/to/Count.txt');
file_put_contents('/path/to/Count.txt', $count + 1, LOCK_EX); // $count + 1, not $count++, which would write back the old value
But you'll run into concurrency problems using this code. One way to generate a lock safe from any race condition is:
$countFile = '/path/to/Count.txt';
$countTemp = tempnam(dirname($countFile), basename($countFile));
$countLock = $countFile . '.lock';
$f_lock = fopen($countLock, 'w');
if (flock($f_lock, LOCK_EX)) {
    $currentCount = (int)file_get_contents($countFile);
    $f_temp = fopen($countTemp, 'w');
    if (flock($f_temp, LOCK_EX)) {
        fwrite($f_temp, $currentCount + 1);   // write the incremented value
        flock($f_temp, LOCK_UN);
        fclose($f_temp);
        if (!rename($countTemp, $countFile)) {
            unlink($countTemp);
        }
    }
    flock($f_lock, LOCK_UN);
    fclose($f_lock);
}
