I am looking to numerically increment a file's content if the file was last modified within the last 24 hours, and otherwise reset the file's content to 1. However, I want to ensure this keeps working regardless of how many users hit the script at the same time (the script always needs to execute, but it must not overwrite or calculate incorrectly; I believe this is where flock comes into use).
Please see below code:
$host_limit = 50;
$file = 'timer.txt';

$fh = fopen($file, 'r+');
if (flock($fh, LOCK_EX)) {
    $content = fgets($fh);
    // FILE HAS NOT BEEN MODIFIED IN LAST 24 HOURS
    if (strtotime('-24 hours') > filemtime($file)) {
        $content = 1;
    } else {
        $content = ($content + 1);
    }
    fwrite($fh, $content);
    fflush($fh);
    flock($fh, LOCK_UN);
}
fclose($fh);

if ($content < $host_limit) {
    //do stuff
}
Would the above work as I would like? (I have no way to simulate the concurrency I am anticipating, so I cannot test it.)
Instead of using fopen and fwrite, you could use
file_get_contents($file);
and
file_put_contents($file, $content, LOCK_EX);
Check the manual:
http://php.net/manual/en/function.file-get-contents.php
http://php.net/manual/en/function.file-put-contents.php
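Put together, the suggestion might look like this (a minimal sketch based on the code in the question, assuming timer.txt already exists; note that file_put_contents() with LOCK_EX only holds the lock for the duration of the write, so the read itself is not protected by the same lock):

$host_limit = 50;
$file = 'timer.txt';

// Reset to 1 if the file was not modified in the last 24 hours,
// otherwise increment the stored value.
if (strtotime('-24 hours') > filemtime($file)) {
    $content = 1;
} else {
    $content = (int)file_get_contents($file) + 1;
}

file_put_contents($file, $content, LOCK_EX);

if ($content < $host_limit) {
    // do stuff
}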
Related
I developed a very simple counter in PHP. It works as expected, but occasionally it resets to zero, and I have no idea why. I suspect it could be related to concurrent visitors, but if that is the case I have no idea how to prevent it. Here is the code:
function updateCounter($logfile) {
    $count = (int)file_get_contents($logfile);
    $file = fopen($logfile, 'w');
    if (flock($file, LOCK_EX)) {
        $count++;
        fwrite($file, $count);
        flock($file, LOCK_UN);
    }
    fclose($file);
    return number_format((float)$count, 0, ',', '.');
}
Thank you in advance.
file_get_contents() on a locked file will probably return false (which casts to 0), and the logfile has probably been unlocked again by the time it comes to writing.
A classic race condition...
Note also that fopen($logfile, 'w') truncates the file immediately, before the lock is acquired, so a concurrent file_get_contents() can see an empty file.
Because file_get_contents() can return false when accessing a previously locked file, the subsequent fwrite() may write a 0 or a 1, resetting the counter to zero.
So we read the counter file only after the lock has been acquired for us.
function updateCounter($logfile) {
    // $count = (int)file_get_contents($logfile);
    if (file_exists($logfile)) {
        $mode = 'r+';
    } else {
        $mode = 'w+';
    }

    $file = fopen($logfile, $mode);

    if (flock($file, LOCK_EX)) {
        // read the counter file:
        $count = (int)fgets($file);
        $count++;

        // point to the beginning of the file:
        rewind($file);
        fwrite($file, $count);
        flock($file, LOCK_UN);
    }
    fclose($file);
    return number_format((float)$count, 0, ',', '.');
}

$logfile = "counter.log";
echo updateCounter($logfile);
Please see the user notes on https://www.php.net/manual/en/function.flock.php.
I would append a character to the file for each hit and use strlen() on the file contents to get the hit count. Note that your file will get big over time, but this can easily be solved with a cron job that sums it up and caches the total in another read-only file.
You can also use !is_writeable() to check whether the file is locked, and if so either drop the hit or wait in a while loop until it is writable. Tricky, but it works. It depends on how valuable each hit is and how much effort you would like to invest in this counter.
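A rough sketch of that idea (the file name is hypothetical; each hit appends a single byte, so there is no read-modify-write cycle to race on):

$hits_file = 'hits.log'; // hypothetical file name

// Record a hit: a single atomic append under LOCK_EX.
file_put_contents($hits_file, '.', FILE_APPEND | LOCK_EX);

// Read the hit count: the number of bytes equals the number of hits.
$hits = strlen(file_get_contents($hits_file));
echo $hits;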
When I run this function from several scripts at the same time, one script generates the warning:
fread(): Length parameter must be greater than 0
function test($n) {
    echo "<h4>$n at " . time() . "</h4>";
    for ($i = 0; $i < 50; $i++) {
        $fp = fopen("$n.txt", "r");
        $s = fread($fp, filesize("$n.txt"));
        fclose($fp);

        $fp = fopen("$n.txt", "w");
        $s = $_SERVER['HTTP_USER_AGENT'] . ' ' . time();
        if (flock($fp, LOCK_EX)) { // acquire an exclusive lock
            fwrite($fp, $s);
            // fflush($fp); // flush output before releasing the lock
            flock($fp, LOCK_UN); // release the lock
        } else {
            echo "Couldn't get the lock!";
        }
    }
}
I am trying to let multiple users read the file, while only one user at a time can write it. I know that when I use fwrite() with flock() (LOCK_EX), the next scripts must wait until the write is finished. But here it seems like filesize() doesn't wait until the write operation is finished. My guess is that it hits the file while its size is 0, and as a result 0 bytes are read from the file while it is being written by the original script.
Is it possible to fix this for the fread() call?
The purpose of this script is to test fread() with some length limit, and to check later whether the data were really written when I did not use fflush().
function test($n) {
    echo "<h4>$n at " . time() . "</h4>";
    for ($i = 0; $i < 50; $i++) {
        $start = microtime(true);
        $fp = fopen("$n.txt", "r");
        if (filesize("$n.txt") > 0) {
            $s = fread($fp, filesize("$n.txt"));
            fclose($fp);

            $fp = fopen("$n.txt", "w");
            $s = $_SERVER['HTTP_USER_AGENT'] . ' ' . time();
            if (flock($fp, LOCK_EX)) { // acquire an exclusive lock
                fwrite($fp, $s);
                // fflush($fp); // flush output before releasing the lock
                flock($fp, LOCK_UN); // release the lock
            } else {
                echo "Couldn't get the lock!";
            }
        } else {
            echo "Filesize must be greater than 0";
        }
    }
}
Please rename one of the $s variables; the same name is used for two different things.
$fp = fopen("$n.txt", "r");
$s = fread($fp, filesize("$n.txt") );
fclose($fp);
The error occurs in the middle line of the above three lines.
Firstly, these three lines could be rewritten into a single line as follows:
$s = file_get_contents("$n.txt");
However, this isn't necessary, as these three lines are entirely redundant in your code. They don't do anything useful.
What they do is open a file, store its contents to $s and then close it.
But you are then immediately setting $s to a different value, thus throwing away the previous value, and making it pointless to have read it from the file in the first place.
If you need to keep the original contents of the file, then use file_get_contents() and make sure you don't overwrite the contents of the variable.
If you don't need the original contents of the file, then just delete those three lines from your code.
Incidentally, this error highlights a couple of good coding practices that you should take on board: Firstly, never re-use a variable for two different things, and secondly always give your variables (and functions) good names. $s is not a good name; $previousFileContents would be a better name; it would have made the error much more obvious.
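For illustration, here is the loop body with those three lines deleted (a sketch; the rest of the function from the question is unchanged, and an fclose() for the write handle is added, which the original was also missing):

for ($i = 0; $i < 50; $i++) {
    $fp = fopen("$n.txt", "w");
    $s = $_SERVER['HTTP_USER_AGENT'] . ' ' . time();
    if (flock($fp, LOCK_EX)) { // acquire an exclusive lock
        fwrite($fp, $s);
        flock($fp, LOCK_UN); // release the lock
    } else {
        echo "Couldn't get the lock!";
    }
    fclose($fp); // the original never closed this handle
}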
I am having this strange issue and can't figure it out.
On some websites I run, this script works perfectly... same code, same server settings...
In PHP, there is a simple page-view hit counter that stores the count locally in a txt file.
Then I echo out the value in the footer copyright area of my websites to give the client a quick statistic... it's pretty cool how fast it grows.
Anyway, I have a client, corner grill ny . com (for SEO purposes I added spaces).
On that website it's been working great for years.
Now on another website, and a bunch more (for example savianos . com)...
This breaks... and the text value is blank.
This is the counter.php code
<?php
session_start();
$counter_name = "counter/hits.txt";

// Check if a text file exists. If not, create one and initialize it to zero.
if (!file_exists($counter_name)) {
    $f = fopen($counter_name, "w");
    fwrite($f, "0");
    fclose($f);
}

// Read the current value of our counter file
$f = fopen($counter_name, "r");
$counterVal = fread($f, filesize($counter_name));
fclose($f);

// Has the visitor been counted in this session?
// If not, increase the counter value by one
if (!isset($_SESSION['hasVisited'])) {
    $_SESSION['hasVisited'] = "yes";
    $counterVal++;
    $f = fopen($counter_name, "w");
    fwrite($f, $counterVal);
    fclose($f);
}
?>
Now, if I put a value in the txt file, like 1040, and go to the website, it starts to work... then after a week or so I check it... it's blank again.
Any ideas?
I am thinking this may be happening because the website gets a TON of views during dinner time on Friday night, and the simple script can't handle it; while it's trying to write an incremented number it just breaks, goes blank, and never starts back up again.
The structure is this.
/counter/ folder has
counter.php and a hits.txt file
On every page of the website, the very first thing is
<?php include ('counter/counter.php'); ?>
and in the footer of the website we have
<?php echo $counterVal; ?>
Your code looks fine, but let's understand the situation. You have a file which can be accessed concurrently by many users, because page visits can happen from multiple users at the same time. That doesn't seem right, does it? You have to lock the file against other users while one of them is modifying it. Please have a look at:
Visits counter without database with PHP
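For illustration, a minimal locked counter along those lines (a sketch: reading and incrementing happen under a single exclusive lock, so concurrent visits cannot clobber each other; the session check from your code can wrap it as before):

$f = fopen('counter/hits.txt', 'c+'); // 'c+' opens for read/write without truncating, creating the file if needed
if (flock($f, LOCK_EX)) {
    $counterVal = (int)fgets($f) + 1; // an empty/new file reads as 0
    rewind($f);
    fwrite($f, (string)$counterVal);
    fflush($f);
    flock($f, LOCK_UN);
}
fclose($f);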
It is most likely because you have two concurrent scripts that tried to open the file at once, and one of them failed. You have to use flock() when there are multiple instances of the script that could operate at the same time. Counters are some of the heaviest things to build on file reading and writing. I wrote this wrapper to easily implement file locking.
If you want to check out one of my counters in operation, try http://ozlu.org. That dynamic counter image was self-built. fileReadAll() will read the entire file in one shot. The file writer only has two modes, write or append. You can pass fileWriteAll() an array or a string and it will write it to the file. The function will not add any \n to format your text, so you would have to add that yourself. The default mode for fileWriteAll() is w if you do not set the third argument.
function fileWriteAll($file, $content, $mode = "w") {
    $mode = $mode === "w" || $mode === "a" ? $mode : "w";
    $FILE = fopen($file, $mode);
    while (!flock($FILE, LOCK_EX)) { usleep(1); }
    if (is_array($content)) {
        for ($i = 0; $i < count($content); $i++) {
            fwrite($FILE, $content[$i]);
        }
    } else {
        fwrite($FILE, $content);
    }
    flock($FILE, LOCK_UN);
    fclose($FILE);
}

function fileReadAll($file) {
    $FILE = fopen($file, 'r');
    while (!flock($FILE, LOCK_SH)) { usleep(1); }
    $content = fread($FILE, filesize($file));
    flock($FILE, LOCK_UN);
    fclose($FILE);
    return $content;
}
Your modified code:
session_start();
$counterName = './views.txt';

if (!file_exists($counterName)) {
    $file = fopen($counterName, 'w');
    fwrite($file, '0');
    fclose($file);
}

$file = fopen($counterName, 'r');
$value = fread($file, filesize($counterName));
fclose($file);

if (!isset($_SESSION['visited'])) {
    $_SESSION['visited'] = 'yes';
    $value++;
    $file = fopen($counterName, 'w');
    fwrite($file, $value);
    fclose($file);
}
session_unset();
echo $value;
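If you want the file access in that snippet to go through the locking wrappers, it could look like this (a sketch using the fileWriteAll() and fileReadAll() functions above; note the read and the write are still two separate lock acquisitions, so the read-increment-write sequence as a whole is not atomic):

session_start();
$counterName = './views.txt';

if (!file_exists($counterName)) {
    fileWriteAll($counterName, '0'); // initialize so fileReadAll() never sees an empty file
}

$value = fileReadAll($counterName);

if (!isset($_SESSION['visited'])) {
    $_SESSION['visited'] = 'yes';
    $value++;
    fileWriteAll($counterName, (string)$value);
}

echo $value;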
I have an issue with writing to and reading from a text file.
I first have to write some values from one text file to another text file, and then I need to read those values back. Below are the code snippets:
Write to text file:
$fp = #fopen ("text1.txt", "r");
$fh = #fopen("text2.txt", 'a+');
if ($fp) {
//for each line in file
while(!feof($fp)) {
//push lines into array
$thisline = fgets($fp);
$thisline1 = trim($thisline);
$stringData = $thisline1. "\r\n";
fwrite($fh, $stringData);
fwrite($fh, "test");
}
}
fclose($fp);
fclose($fh);
Read from the written text file:
$page = join("", file("text2.txt"));
$kw = explode("\n", $page);
for ($i = 0; $i < count($kw); $i++) {
    echo rtrim($kw[$i]);
}
But, if I am not mistaken, due to the "\r\n" I used to insert the newlines, there are issues when reading back, and I need to pass the values read from only the even lines to a function for further processing.
How do I resolve this issue? Basically, I need to write certain values to a text file and then read back only the values on the even lines.
I'm not sure whether you have issues with the even line numbers or with reading the file back in.
Here is the solution for the even line numbers.
$page = join("", file("text2.txt"));
$kw = explode("\n", $page);
for ($i = 0; $i < count($kw); $i++) {
    $myValue = rtrim($kw[$i]);
    if ($i % 2 == 0) {
        echo $myValue;
    }
}
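As an aside, the same even-line filtering can be written more compactly with file(), which splits the file into an array of lines for you (a sketch; rtrim() still removes the trailing "\r\n"):

$lines = file("text2.txt");
foreach ($lines as $i => $line) {
    if ($i % 2 == 0) { // even line numbers, counting from 0
        echo rtrim($line);
    }
}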
I have a script which, each time it is called, gets the first line of a file. Each line is known to be exactly the same length (32 alphanumeric chars) and terminates with "\r\n".
After getting the first line, the script removes it.
This is done in this way:
$contents = file_get_contents($file);
$first_line = substr($contents, 0, 32);
file_put_contents($file, substr($contents, 32 + 2)); // +2 because we also remove the "\r\n"
Obviously it works, but I was wondering whether there is a smarter (or more efficient) way to do this?
In my simple solution I basically read and rewrite the entire file just to take and remove the first line.
I came up with this idea yesterday:
function read_and_delete_first_line($filename) {
    $file = file($filename);
    $output = $file[0];
    unset($file[0]);
    file_put_contents($filename, $file);
    return $output;
}
There is no more efficient way to do this other than rewriting the file.
No need to create a second temporary file, nor put the whole file in memory:
if ($handle = fopen("file", "c+")) {              // open the file in reading and editing mode
    if (flock($handle, LOCK_EX)) {                // lock the file, so no one can read or edit it
        while (($line = fgets($handle, 4096)) !== FALSE) {
            if (!isset($write_position)) {        // move each line to the previous position, except the first line
                $write_position = 0;
            } else {
                $read_position = ftell($handle);  // remember the current read position
                fseek($handle, $write_position);  // move to the previous position
                fputs($handle, $line);            // put the current line in the previous position
                fseek($handle, $read_position);   // return to the current read position
                $write_position += strlen($line); // set the write position for the next loop
            }
        }
        fflush($handle);                          // write any pending change to the file
        ftruncate($handle, $write_position);      // drop the repeated last line
        flock($handle, LOCK_UN);                  // unlock the file
    }
    fclose($handle);
}
This will shift off the first line of a file. You don't need to load the entire file into memory as you do with the file() function. For small files it is maybe a bit slower than file() (but I bet it is not), and it can handle the largest files without problems.
$firstline = false;
if ($handle = fopen($logFile, 'c+')) {
    if (flock($handle, LOCK_EX)) {
        $offset = 0;
        $len = filesize($logFile);
        while (($line = fgets($handle, 4096)) !== false) {
            if (!$firstline) {
                // remember the first line and how many bytes to shift by
                $firstline = $line;
                $offset = strlen($firstline);
                continue;
            }
            $pos = ftell($handle);
            fseek($handle, $pos - strlen($line) - $offset); // move back by one line plus the offset
            fputs($handle, $line);
            fseek($handle, $pos);
        }
        fflush($handle);
        ftruncate($handle, $len - $offset); // drop the now-duplicated tail
        flock($handle, LOCK_UN);
    }
    fclose($handle);
}
You can iterate over the file instead of loading it all into memory:
$handle = fopen($file, "r");
$first = fgets($handle, 2048); // get (and discard) the first line
$outfile = "temp";
$o = fopen($outfile, "w");
while (!feof($handle)) {
    $buffer = fgets($handle, 2048);
    fwrite($o, $buffer);
}
fclose($handle);
fclose($o);
rename($outfile, $file);
I wouldn't usually recommend opening up a shell for this sort of thing, but if you're doing this infrequently on really large files, there's probably something to be said for:
$lines = `wc -l myfile` - 1;
`tail -n $lines myfile > newfile`;
It's simple, and it doesn't involve reading the whole file into memory.
I wouldn't recommend this for small files, or extremely frequent use though. The overhead's too high.
You could store positional info into the file itself. For example, the first 8 bytes of the file could store an integer. This integer is the byte offset of the first real line in the file.
So, you never delete lines anymore. Instead, deleting a line means altering the start position. fseek() to it and then read lines as normal.
The file will grow big eventually. You could periodically clean up the orphaned lines to reduce the file size.
But seriously, just use a database and don't do stuff like this.
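For illustration, a minimal sketch of that offset-header idea (hypothetical code; it assumes the file was created with an 8-byte zero-padded decimal header, e.g. "00000000", followed by the data lines):

define('HEADER_LEN', 8);

function pop_first_line($filename) {
    $fp = fopen($filename, 'r+');
    if (!$fp) {
        return false;
    }
    if (!flock($fp, LOCK_EX)) {
        fclose($fp);
        return false;
    }
    $offset = (int)fread($fp, HEADER_LEN); // byte offset of the first live line, stored in the header
    fseek($fp, HEADER_LEN + $offset);
    $line = fgets($fp);                    // read the first live line
    if ($line !== false) {
        // "Delete" the line by advancing the stored offset past it.
        fseek($fp, 0);
        fwrite($fp, str_pad((string)($offset + strlen($line)), HEADER_LEN, '0', STR_PAD_LEFT));
    }
    flock($fp, LOCK_UN);
    fclose($fp);
    return $line === false ? false : rtrim($line, "\r\n");
}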
Here's one way:
$contents = file($file, FILE_IGNORE_NEW_LINES);
$first_line = array_shift($contents);
file_put_contents($file, implode("\r\n", $contents));
There are countless other ways to do this as well, but all the methods involve separating the first line somehow and saving the rest; you cannot avoid rewriting the whole file. An alternative take:
list($first_line, $contents) = explode("\r\n", file_get_contents($file), 2);
file_put_contents($file, $contents);
My problem was large files. I just needed to edit or remove the first line, without loading the complete file into a variable. This was the solution I used. It currently echoes the result, but you could always save the contents instead.
$fh = fopen($local_file, 'rb');
echo "add\tfirst\tline\n"; // add your new first line.
fgets($fh); // moves the file pointer to the next line.
echo stream_get_contents($fh); // flushes the remaining file.
fclose($fh);
I think this works well for any file size:
$myfile = fopen("yourfile.txt", "r") or die("Unable to open file!");
$ch = 1;
while (!feof($myfile)) {
    $dataline = fgets($myfile) . "<br>";
    if ($ch == 2) {
        echo str_replace(' ', '&nbsp;', $dataline) . "\n";
    }
    $ch = 2;
}
fclose($myfile);
The solutions here didn't perform well for me.
My solution grabs the last line (not the first line; in my case getting the first or last line was not relevant) from the file and removes it from the file.
This is very quick even with very large files (>150,000,000 lines).
function file_pop($file)
{
    if ($fp = @fopen($file, "c+")) {
        if (!flock($fp, LOCK_EX)) {
            fclose($fp);
            return false;
        }
        $pos = -1;
        $found = 0;
        while ($found < 2) {
            if (fseek($fp, $pos--, SEEK_END) < 0) { // cannot seek to position
                rewind($fp); // rewind to the beginning of the file
                break;
            }
            if (ord(fgetc($fp)) == 10) { // newline
                $found++;
            }
        }
        $lastpos = ftell($fp);    // get the current position in the file
        $lastline = fgets($fp);   // get the last line
        ftruncate($fp, $lastpos); // truncate the file at the start of the last line
        flock($fp, LOCK_UN);      // unlock
        fclose($fp);              // close the file
        return trim($lastline);
    }
}
You could use the file() function.
This gets the first line:
$content = file('myfile.txt');
echo $content[0];