PHP: read & write from file at the same time

Basically I have an XML file to populate data with, and I will have a cron job (in PHP) that updates it every 5 minutes. But at the same time I will have users accessing this file constantly (and I'm talking about thousands of users).
When I tried a script myself, writing 2 million text lines to a .txt file while reading it at the same time, file_get_contents() of course returned whatever was currently in the .txt file; it does not wait for the write to finish before getting the contents. So what I did is write to a temporary file and then rename it to the original .txt file. The renaming process on my PC takes 0.003 seconds (measured using microtime()).
Do you think this is a suitable solution, or will some users end up getting an error that the file does not exist?

Of course this is not suitable on its own. You have to guard against reads during that 0.003-second window.
A very simple way is a flag file.
For example, create a file called isReplacing before the rename starts.
After the replacement is done, delete the isReplacing file.
When a user wants the file, say in getfile.php:
while (file_exists("isReplacing")) {
    // busy-wait until the replacement is done
}
echo file_get_contents("data.txt"); // your data file
Better, fall back to another source instead of spinning:
if (file_exists("isReplacing")) {
    // get the data from the database instead
} else {
    // echo the file
}
NOTE: this is a crude approach, but I just want to demonstrate the idea.
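For reference, the write-to-temp-then-rename pattern from the question can be sketched as below. The filenames and the helper name are my assumptions, not from the question; on a POSIX filesystem, rename() within the same directory is atomic, so a reader sees either the old file or the new one, never a half-written file.

```php
<?php
// Minimal sketch of atomic replacement: write the new contents to a
// temp file, then rename it over the target in one step.

function replaceAtomically(string $target, string $contents): bool
{
    // Write to a temp file in the same directory, so rename() stays on
    // one filesystem (a cross-filesystem rename is a copy + delete).
    $tmp = $target . '.tmp.' . getmypid();
    if (file_put_contents($tmp, $contents, LOCK_EX) === false) {
        return false;
    }
    return rename($tmp, $target);
}

replaceAtomically('data.xml', '<players count="11"/>');
echo file_get_contents('data.xml');
```

With this pattern the flag file becomes less critical, since readers never observe a partially written target at all.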

Related

PHP: Cannot read from a .txt file

So I have a gaming server that automatically sends the number of online players to a .php file, which updates a .txt file to the current player count every minute.
However, when I try to write the contents of the .txt file into my website, it doesn't read the .txt file at all. The .txt file just contains one number.
Example:
players.txt contains one number, that number is 11 (for players online).
<h5> Come play with <?php echo file_get_contents("players.txt");?> other players</h5>
The outcome is "Come play with other players".
It reads the file but finds it empty. The server would throw an error if the file did not exist or could not be opened because of permissions or other reasons.
Try this:
1. Echo a literal string, e.g. <?php echo '12'; ?>
2. Create another file, e.g. players1.txt, with a random number and get its contents the same way: <?php echo file_get_contents('players1.txt'); ?>
I'm not a great web developer but a .txt file shouldn't be this hard to read, especially when it's 1 number.
That assumes you know it's a txt file and that it should contain a number; the issue you're encountering is probably logical rather than specific to the file type. It could just as well be an image file or a .dat file: if a process is writing to the file, the file could be locked so other processes cannot read it, or the reader may be denied access due to ownership, regardless of the file type or contents.
What does your PHP error log say?
Check the file has the correct group/owner permissions and update them as required.
Have you been clearing PHP's stat cache (see clearstatcache()) to keep any retrieved values up to date?
Are you writing to the file with correct locking?
Using a database is far, far more efficient and performant than using the filesystem to track this sort of data.
If you're using Unity, be aware that its local-file and online security provisions are absolutely appalling. That's not to say this is an issue for you specifically, but as a general proviso, limit accessibility as much as possible, and always sanitize your data:
$valueToPlaceIntoFile = (int)$_POST['playerCounter']; // cast to int so only a number is ever written
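Putting the suggestions above together, a minimal write/read round trip for the counter might look like this. The filename matches the question; the count value and error messages are my assumptions. The explicit error checks surface permission problems instead of silently printing nothing.

```php
<?php
// Write the player count with an exclusive lock, then read it back.

$file = 'players.txt';
$count = 11;

if (file_put_contents($file, (string)$count, LOCK_EX) === false) {
    die("cannot write $file: check owner/group permissions");
}

clearstatcache(true, $file); // drop PHP's cached stat info for this path
$players = file_get_contents($file);
if ($players === false) {
    die("cannot read $file");
}
echo "Come play with $players other players";
```

If this prints the number but the website still does not, the problem is likely that the web server user lacks read permission on the file's directory.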

What happens when file_put_content writes to the same file at the same time?

If a script is set up to save text to a given .txt file (via file_put_contents) on page load, what happens when 2 users load the page at the exact same time?
Will one of the users (or both) receive an error? Will the .txt file be corrupted? Will the text written to the .txt file be garbled? Or something else?
I have written a cache function that fetches content from an API and saves it to a .txt file. It writes new data to the .txt file if it has been longer than 3 minutes since last time new data was fetched.
(It decides upon pageload if new data needs to be fetched, or if data from .txt file should be used).
Will this work without problems? Or is there anything I can do to prevent any errors from happening?
No error will be raised, but without locking both processes can write to the file at the same time and their writes may interleave, corrupting the contents.
You need to use the LOCK_EX flag to ensure that only one process is writing to the file at a time:
if (false === file_put_contents('path/to/file', 'data', LOCK_EX)) {
// Writing to the file failed
}
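Applied to the cache described in the question, the whole flow might look like the sketch below. The fetchFromApi() helper, the cache path, and the 3-minute threshold are assumptions standing in for the real API call.

```php
<?php
// Refresh the cache file when it is older than 3 minutes, otherwise
// serve its contents as-is.

function fetchFromApi(): string
{
    return 'fresh data'; // placeholder for the real API call
}

function cachedContent(string $cacheFile, int $maxAgeSeconds = 180): string
{
    clearstatcache(true, $cacheFile);
    if (!file_exists($cacheFile)
        || time() - filemtime($cacheFile) > $maxAgeSeconds) {
        $data = fetchFromApi();
        // LOCK_EX: if two requests refresh at once, the writes cannot
        // interleave; the loser simply overwrites with equivalent data.
        file_put_contents($cacheFile, $data, LOCK_EX);
        return $data;
    }
    return file_get_contents($cacheFile);
}
```

Note that two requests arriving in the same instant may both call the API; the lock only protects the file itself, which is usually acceptable for a cache.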

Using PHP to update file after a new copy is uploaded

So I'm trying to see if something like this is possible WITHOUT using database.
A file is uploaded to the server /files/file1.html
PHP tracks the upload time by checking the last update time in a database
If the file (file1.html) has been updated since the last DB time, PHP makes changes; otherwise, no changes are made
Basically, for a text simulation game (basketball), it outputs HTML files for rosters/stats/standings/etc., and I'd like to be able to insert each team's logo at the top (which the outputted files don't include). Obviously, this would need to happen often, as the outputted files are uploaded to the server daily. I don't want to have to go through each team's roster manually inserting images at the top.
Don't have an example as the league hasn't started.
I've been thinking of just creating a button on the league's website (not created yet) that when pushed would update the pages, but I'm hoping to have PHP do it by itself.
Yes, you could simply let PHP check the file's last modification time (the point in time when the file was last changed on the server, not when its content was made). Check http://php.net/manual/en/function.filemtime.php and you should be done within 30 mins ;)
quick & dirty, unproven code:
$filename = 'somefile.txt';
// compare against the time of the last processing run, stored in a marker
// file; comparing filemtime() against time() would never match, since a
// file's mtime is not normally in the future
$lastRun = (int)@file_get_contents('last_run.txt');
if (filemtime($filename) > $lastRun) {
    // the file changed since the last run: rewrite it here
    file_put_contents('last_run.txt', (string)time());
}
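A fuller no-database sketch of the idea: a marker file's mtime records the last processing run, and any roster page modified since then gets the logo prepended. All paths, the demo page, and the logo tag are my assumptions.

```php
<?php
// Prepend the team logo to every HTML file changed since the last run.

@mkdir('files'); // ensure the demo directory exists
file_put_contents('files/roster.html', "<h1>Roster</h1>\n"); // demo page

$marker = 'files/.last_processed';
$last = file_exists($marker) ? filemtime($marker) : 0;

foreach (glob('files/*.html') as $page) {
    if (filemtime($page) > $last) {
        $html = file_get_contents($page);
        $logo = '<img src="/logos/team.png" alt="team logo">';
        // naive prepend; a real version would insert just after <body>
        // and check the logo isn't already there
        file_put_contents($page, $logo . "\n" . $html, LOCK_EX);
    }
}
touch($marker); // remember this run for next time
```

Running this from a daily cron job, shortly after the simulation's upload, would remove the need for a manual button.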

Mark a .CSV file if it has been read

I have two WordPress sites on two different servers. Site B creates a .CSV file with the latest comments. Site A reads that file and gets the information and performs some functions on it. These are two independent processes that are on separate servers.
I create the CSV in 'append' mode so that I can compile the new comments without fear of skipping any while running the function on the other side:
$fp = fopen('new_comments.csv', 'a');
However, once I get the .CSV on Site A, I have no way to write to the .CSV and tell it that I have read the contents.
I suppose I could overwrite the .CSV data in 'w' mode and only run it once right before I run the other function, but is there any other way to make sure that once I read from the .CSV on Site A, I can refresh the .CSV on Site B?
Use the filesystem function filemtime to get the last modification time of the file. Site A can then store that timestamp and compare it on the next run to see whether the file has already been read.
Similarly, you could compare against the size of the file.
(You could also allow a grace period between the two servers before doing the check.)
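On Site A, that suggestion could be sketched as below. The file names, state file, and demo rows are my assumptions; the point is that Site A keeps its own "last read" marker instead of writing back to Site B's CSV.

```php
<?php
// Process new_comments.csv only when its mtime is newer than the one
// recorded at the last successful read.

$csv = 'new_comments.csv';
$state = 'last_read_mtime.txt';

file_put_contents($csv, "1,alice,hello\n2,bob,hi\n"); // demo data

clearstatcache(true, $csv);
$mtime = filemtime($csv);
$lastRead = (int)@file_get_contents($state); // 0 if never read before

if ($mtime > $lastRead) {
    $rows = array_map('str_getcsv', file($csv, FILE_IGNORE_NEW_LINES));
    // ... process the new comments here ...
    file_put_contents($state, (string)$mtime, LOCK_EX); // mark as read
}
```

This leaves Site B free to keep appending; Site A just skips runs where nothing has changed. (Because the file only ever grows, comparing sizes works the same way.)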

File opened by a script accessible by the user

I need to write a lot of data into a file while, almost at the same time (at least while the file is still open from fopen()), a user's browser needs to access it.
I found it's impossible until fclose() is called or the script ends.
Is there any way to make it possible?
Perhaps it's better to store the data in memory, or to work with a temporary file, and then write to the master file at designated points rather than holding the file open for the entire execution of the script.
An option would be to send the file's mimetype to the user (text/plain, for example) and echo the current file contents. After that, write everything both to the file and to the output, so that what the user receives mirrors the file.
