I have a PHP script that logs ads (banners) for a website and stores them in a .dat file. Inside this file an ID, URL, and other important information is saved. The problem I am having is that at any given time there are 4 ads on the page, so the .dat file is often corrupted when the PHP script attempts to write to it while it is already open.
I checked and tried this solution, but it did not help me:
PHP Simultaneous file access / flock() issue
The function I am using at the moment looks like this:
function writeads() {
    global $bannerAdsPath, $ads, $bannerAds;
    // 'w' truncates the file as soon as it is opened
    $data = fopen($bannerAdsPath, 'w') or die();
    flock($data, LOCK_EX) or die();           // 2 == LOCK_EX
    fputs($data, @join("\n", $ads)."\n");
    while (list($key, $val) = each($bannerAds)) {
        if (($key != '') && ($val != '')) {
            fputs($data, $key.'='.$val."\n");
        }
    }
    flock($data, LOCK_UN);                    // 3 == LOCK_UN
    fclose($data);
    reset($bannerAds);
}
Any help would be appreciated as I have been scratching my head over this for a while.
Side bit of information, the client did not want to have their code rewritten to use a Database instead of a file so that option is out.
Thanks!
fopen with 'w' truncates the file before you have the option of flocking it.
You almost never want to use flock to unlock a file; just use fclose; the file will be unlocked when the handle is closed, and that way you know that no buffered writes will happen after you unlock.
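A minimal sketch of how that could look (the function name is illustrative): open the file in 'c' mode, which creates it if needed but does not truncate it, take the lock, and only then truncate and write.

function writeads_locked($path, array $ads, array $bannerAds) {
    // 'c' opens for writing without truncating, creating the file if missing
    $fh = fopen($path, 'c');
    if ($fh === false || !flock($fh, LOCK_EX)) {
        return false;
    }
    ftruncate($fh, 0);                 // safe to truncate now that we hold the lock
    rewind($fh);
    fwrite($fh, join("\n", $ads)."\n");
    foreach ($bannerAds as $key => $val) {
        if ($key != '' && $val != '') {
            fwrite($fh, $key.'='.$val."\n");
        }
    }
    fclose($fh);                       // releases the lock after buffers are flushed
    return true;
}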
Related
I wrote an AJAX function to call a script which creates a zip archive of about 500 files. There is a loop that adds the files to the archive, and that loop contains a file counter. Now I want to update the status in the browser every 50 files, but the PHP script only sends the whole output once the script ends and the zip file is complete.
The principle is quite similar to the following post:
Echo 'string' while every long loop iteration (flush() not working)
(Except the solution is not working on my server)
I found a lot of possible solutions, but nothing works...
I tried the flush/ob_flush method (ob_implicit_flush as well), but flushing the content doesn't work for me. I played a little with the server configuration but it didn't help, and none of the examples worked either. Maybe a server problem.
I tried SSE, but the response also only arrived after the script ended.
I tried to use WebSocket but I had some problems with the handshake.
The code may look like this
$filecounter = 0;
foreach ($files as $file) {
    // add $file to zip archive
    $filecounter++;
    if ($filecounter % 50 == 0) {
        echo $filecounter;
    }
}
Are there other options to get this working? Or how do I get the output 'flushed'?
Thanks for your help!
You could store the progress in the Session and use a second ajax call to track the progress:
$filecounter = 0;
foreach ($files as $file) {
    // add $file to zip archive
    $filecounter++;
    if ($filecounter % 50 == 0) {
        session_start();
        $_SESSION['progress'] = $filecounter;
        session_write_close();
    }
}
You need session_write_close(); to make the Session var accessible to the second script.
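The second script that the polling AJAX call hits could then be as simple as this sketch (the endpoint name progress.php is just an example):

<?php
// progress.php - polled by the browser while the zip is being built
session_start();
$progress = isset($_SESSION['progress']) ? $_SESSION['progress'] : 0;
session_write_close();          // release the session lock immediately
header('Content-Type: application/json');
echo json_encode(array('progress' => $progress));

Note that the zip script must not hold the session open between updates, which is why it calls session_write_close() after each write; otherwise the polling request would block on the session lock.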
I have a text file which multiple users will be simultaneously editing (limited to an individual line per edit, per user). I have already found a solution for the "line editing" part of the required functionality right here on StackOverflow.com; specifically, the 4th solution (for large files) offered by @Gnarf in the following question:
how to replace a particular line in a text file using php?
It basically rewrites the entire file contents to a new temporary file (with the user's edit included) and then renames the temporary file to the original file once finished. It's great!
To avoid one user's edit causing a conflict with another user's edit if they are both attempting an edit at the same time, I have introduced flock() functionality, as can be seen in my variation on the code here:
$reading = fopen($file, 'r');
$writing = fopen($temp, 'w');
$replaced = false;
if ((flock($reading, LOCK_EX)) and (flock($writing, LOCK_EX))) {
    echo 'Lock acquired.<br>';
    while (!feof($reading)) {
        $line = fgets($reading);
        $values = explode("|", $line);
        if ($values[0] == $id) {
            $line = $id."|comment edited!".PHP_EOL;
            $replaced = true;
        }
        fputs($writing, $line);
    }
    flock($reading, LOCK_UN);
    flock($writing, LOCK_UN);
    fclose($reading);
    fclose($writing);
} else {
    echo 'Lock not acquired.<br>';
}
I've made sure the $temp file always has a unique filename. Full code here: https://pastebin.com/E31hR9Mz
I understand that flock() will force any other execution of the script to wait in a queue until the first execution has finished and the flock() has been released. So far so good.
However, the problem starts at the end of the script, when the time has come to rename() the temporary file to replace the original file.
if ($replaced) {
    rename($temp, $file);
} else {
    unlink($temp);
}
From what I have seen, rename() will fail if the original file still has a flock() on it, so I need to release the flock() before this point. However, I also need it to remain locked, or rename() will fail when another user running the same script immediately acquires a new flock() as soon as the previous one is released. When this happens, it returns:
Warning: rename(temporary.txt,original.txt): Access is denied. (code: 5)
tl;dr: I seem to be in a bit of a Catch-22. It looks like rename() won't work on a locked file, but unlocking the file will allow another user to immediately lock it again before the rename() can take place.
Any ideas?
update: After some extensive research into how flock() works (in layman's terms: there is no guarantee that another script will respect the "lock", so it is not really a "lock" at all in the literal sense of the word), I have opted for this solution instead, which works like a charm:
https://docstore.mik.ua/orelly/webprog/pcook/ch18_25.htm
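Whether or not it is exactly what the recipe does, one common way to sidestep this Catch-22 is to lock a separate, dedicated lock file rather than the data file itself, so the data file carries no lock when rename() runs. A rough sketch of that idea (filenames are illustrative, and this is a paraphrase of the pattern, not the recipe's code):

$lock = fopen($file.'.lock', 'c');   // dedicated lock file; never renamed itself
if ($lock && flock($lock, LOCK_EX)) {
    // ... rewrite $temp from $file as in the code above ...
    rename($temp, $file);            // $file itself carries no lock, so this succeeds
    flock($lock, LOCK_UN);
    fclose($lock);
}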
"Good lock" on your locking adventures.
I have a script that re-writes a file every few hours. This file is inserted into end users' HTML via a PHP include.
How can I check whether my script, at this exact moment, is working on (e.g. re-writing) the file when it is called up for display? Is it even an issue: what happens if a user accesses the file at the same time, what are the odds of that, and will the user just have to wait until the script has finished its work?
Thanks in advance!
More on the subject...
Is this a way forward, using file_put_contents() with LOCK_EX?
When the script saves its data every now and then:
file_put_contents("text", $content, LOCK_EX);
and when the user opens the page:
if (file_exists("text")) {
    function include_file() {
        $file = fopen("text", "r");
        if (flock($file, LOCK_EX)) {
            include_file();
        } else {
            echo file_get_contents("text");
        }
    }
} else {
    echo 'no such file';
}
Could anyone advise me on the syntax? Is this a proper way to call include_file() after the condition, and how can I limit the number of such calls?
I guess this solution is also good, apart from the same recursive call to include_file(). Would it even work?
function include_file() {
    $time = time();
    $file = filectime("text");
    if ($file + 1 < $time) {
        echo "good to read";
    } else {
        echo "have to wait";
        include_file();
    }
}
To check whether the file is currently being written, you can use the filectime() function to get the time at which the file was last changed.
You can store the current timestamp in a variable at the top of your script, and whenever you need to access the file, compare that timestamp with the filectime() of the file. If the file's change time is more recent, you are in the scenario where you have to wait for the file to be written, and you can log that to a database or another file.
To prevent this scenario from happening, you can change the script which writes the file so that it first creates a temporary file and, once it's done, replaces (moves or renames) the temporary file with the original file. This action takes very little time compared to writing the file, which makes the scenario a very rare possibility.
Even if a read and a replace operation occur simultaneously, the time the reading script has to wait will be very short.
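A minimal sketch of that write-to-temp-then-rename pattern (filenames and $content are just examples):

$target = 'incfile.php';             // the file that gets included
$tmp    = $target . '.tmp';
file_put_contents($tmp, $content);   // the slow write happens on the temp file
rename($tmp, $target);               // near-instant swap on the same filesystem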
Depending on the size of the file, this might be an issue of concurrency. But you can solve it quite easily: before starting to write the file, create a kind of "lock file", i.e. if your file is named "incfile.php" you create an "incfile.php.lock". Once you're done with writing, you remove this file.
On the include side, you can check for the existence of "incfile.php.lock" and wait until it has disappeared; this needs some looping and sleeping for the unlikely case of a concurrent access, as in the sketch below.
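A sketch of what that could look like on the include side (the retry count and sleep interval are arbitrary):

$tries = 0;
while (file_exists('incfile.php.lock') && $tries < 50) {
    usleep(100000);   // wait 0.1 s before checking again
    $tries++;
}
include 'incfile.php';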
Basically, you should consider another solution: just write the data which is rendered into that file to a database (locks etc. are available) and render it in a module which then gets included in your page. Solutions like yours are hard to maintain in the long run...
This question is old, but I add this answer because the other answers have no code.
function write_to_file(string $fp, string $string) : bool {
    $timestamp_before_fwrite = date("U");      // current Unix timestamp
    $stream = fopen($fp, "w");
    fwrite($stream, $string);
    while (is_resource($stream)) {
        fclose($stream);                       // ensure the handle really is closed
    }
    $file_last_changed = filemtime($fp);
    if ($file_last_changed < $timestamp_before_fwrite) {
        // file not changed
        return false;
    }
    return true;
}
This is the function I use to write to a file. It first gets the current timestamp before making changes to the file, and then compares that timestamp to the last time the file was changed.
Is there any way to check if a file is being accessed or modified by another process from a PHP script? I have attempted to use the filemtime(), fileatime() and filectime() functions, but since the script checks continuously in a loop, it seems that once the script has been executed it only ever reports the time from the first check. An example would be uploading files to an FTP or SMB share; I attempted this below:
while (true)
{
    $LastMod = filemtime("file");
    if (($LastMod + 60) > time())
    {
        echo "file in use please wait... last modified : $LastMod";
        sleep(10);
    } else {
        // process file
    }
}
I know the file is constantly changing, but the $LastMod variable is not updating. Ending the process and executing the script again will pick up a new $LastMod from the file, but it doesn't seem to update each time the file is checked within the loop.
I have also attempted this by looking at filesize(), with the same symptoms. I also looked into flock(), but as the file is created or modified outside PHP, I don't see how that would work.
If anyone has any solutions please let me know
thanks Vip32
PS. I'm using PHP to process the files as it requires interaction with MySQL and querying external websites.
The file metadata functions all work off stat() output, which caches its data, as a stat() call is a relatively expensive function. You can empty that cache to force stat() to fetch fresh data with clearstatcache()
There are other mechanisms that allow you to monitor for file changes. Instead of doing a loop in PHP and repeatedly stat()ing, consider using an external monitoring app/script which can hook into the OS-provided mechanism and call your PHP script on-demand when the file truly does change.
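For example, if the PECL inotify extension happens to be available (Linux only; its presence is an assumption here), the script can block until the OS reports a change instead of polling:

// requires the PECL inotify extension (Linux only)
$fd = inotify_init();
$watch = inotify_add_watch($fd, "/path/to/file", IN_CLOSE_WRITE);
$events = inotify_read($fd);   // blocks until the file is written and closed
inotify_rm_watch($fd, $watch);
fclose($fd);
// process file here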
Add clearstatcache(); to your loop:
while (true)
{
    clearstatcache();              // drop the cached stat() data before re-checking
    $LastMod = filemtime("file");
    if (($LastMod + 60) > time())
    {
        echo "file in use please wait... last modified : $LastMod";
        sleep(10);
    } else {
        // process file
    }
}
I'm experimenting with the Twitter streaming API.
I use Phirehose to connect to Twitter and fetch the data, but I'm having problems storing it in files for further processing.
Basically what I want to do is to create a file named
date("YmdH")."."txt"
for every hour of connection.
Here is what my code looks like right now (not handling the hourly change of files):
public function enqueueStatus($status)
{
    $data = json_decode($status, true);
    if (isset($data['text']) /* more conditions here */) {
        $fp = fopen("/tmp/$time.txt");
        fwrite($fp, $status);
        fclose($fp);
    }
}
Help is as always much appreciated :)
You want the 'append' mode in fopen - this will either append to a file or create it.
if (isset($data['text']) /* more conditions here */) {
    $fp = fopen("/tmp/" . date("YmdH") . ".txt", "a");
    fwrite($fp, $status);
    fclose($fp);
}
From the Phirehose googlecode wiki:
As of Phirehose version 0.2.2 there is an example of a simple "ghetto queue" included in the tarball (see files: ghetto-queue-collect.php and ghetto-queue-consume.php) that shows how statuses could be easily collected on to the filesystem for processing and then picked up by a separate process (consume).
This is a complete working sample of doing what you want to do. The rotation time interval is configurable too. Additionally, there's another script to consume and process the written files.
Now if only I could find a way to stop the whole script; my log keeps filling up (the script continues execution) even if I close the browser tab :P