How to keep a file open in PHP? [duplicate]

This question already has answers here:
Possible Duplicate: File resource persistence in PHP
(Closed 10 years ago)
If you open a file with fopen, it is closed and its resource freed at the end of the PHP script even without fclose:
$fp = fopen('data.txt', 'w');
fwrite($fp, 'text');
fclose($fp);
Now, if the script is called frequently, the file is opened and closed against the filesystem over and over (a lot of file I/O). It would be better to keep the file open; that is the technique database systems use.
Is there a way in PHP to leave a file open and not re-open it on the next run?
Or how can we set up a semi-server that keeps a file open for frequent access from PHP?

No, you can't. You can open the file at the start of your script and close it at the end, and operate on the handle as much as you like in between. For example, you can open the file in the header of your site and close it in the footer.
To solve the task you describe, you might want to take a look at a PHP extension called memcached. It stores pieces of information in the RAM of the machine so that you can reuse them later, and you can give each piece an expiration time.
Take a look at this module here: http://www.php.net/manual/en/book.memcached.php
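A minimal sketch of that idea, assuming a memcached daemon is reachable on localhost:11211 (the key name data_txt and the 60-second TTL are just illustrative choices):
$mc = new Memcached();
$mc->addServer('localhost', 11211);
$data = $mc->get('data_txt');
if ($data === false) {
    // Cache miss: hit the filesystem once, then keep the contents in RAM for 60 seconds.
    $data = file_get_contents('data.txt');
    $mc->set('data_txt', $data, 60);
}
echo $data;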

You could lock the file using flock(). Since PHP 5.3.2 the file remains locked until explicitly unlocked, so make sure the PHP version on the server running this code is at least 5.3.2.
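A small sketch of that pattern, reusing data.txt from the question (the ftruncate() step is an assumption about what the writer wants to do):
$fp = fopen('data.txt', 'c');       // 'c' creates the file if missing without truncating it
if ($fp) {
    if (flock($fp, LOCK_EX)) {      // blocks until the exclusive lock is granted
        ftruncate($fp, 0);          // safe to replace the contents now
        fwrite($fp, 'text');
        fflush($fp);                // flush the buffer before giving up the lock
        flock($fp, LOCK_UN);        // since PHP 5.3.2 unlocking must be done explicitly
    }
    fclose($fp);
}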

There is no way to keep a file open across several executions of the same script (luckily :)
Opening a file is not very expensive; I suspect it is not worth setting up a "semi-server" just to keep the file open.
Why do you need that?

Related

HTTP server doesn't work while updating a PHP file via FTP [duplicate]

This question already has answers here:
How to update a website without making it go down? (3 answers)
(Closed 3 years ago)
When I update a PHP file via FTP (FileZilla), pages using that file stop working until the transfer is complete. The server is Linux with nginx/php-fpm, but I had the same problem with Apache. The only "solution" I have found is to edit the PHP file directly in a remote shell on the server, but that is very uncomfortable.
Is there anybody with a better solution?
Thanks
It is normal if you are uploading via FTP. The best solution is to use a continuous deployment service with a zero-downtime approach:
Continuous deployment without downtime
But if you are talking about a single file, you can check whether the new PHP file exists and was uploaded correctly, and otherwise fall back to the old copy of the file.
Something like this:
$file = 'uploaded.php';
$oldFile = 'uploaded_old.php';
// Use the freshly uploaded file if it is in place; fall back to the old copy otherwise.
if (file_exists($file)) {
    require_once($file);
} else {
    require_once($oldFile);
}
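Another common trick, sketched here with illustrative file names: upload under a temporary name and then swap it in with rename(). On Linux, rename() within one filesystem is atomic, so a request sees either the complete old file or the complete new one, never a half-uploaded script:
// Upload to uploaded.php.tmp via FTP first, then:
rename('uploaded.php.tmp', 'uploaded.php');  // atomic swap on the same filesystem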

Can high load cause PHP to not be able to write to a text file?

I have a PHP script which is under high load with multiple calls per second (originating from another computer). It's running on PHP 5.5.14 on an IIS server. Every request and response to the script is logged using
file_put_contents('log_2019-09-12.txt', $msg, FILE_APPEND);
Every request and response is also logged on the client computer, and there I see occasional PHP errors like this one:
PHP ERROR 2: file_put_contents(C:\\WWW\\project-x\\logs\\log_2019-09-11.txt): failed to open stream: Permission denied
These seem to happen about every ~140 minutes, usually there are 8 of them in a row and then things work for another ~140 minutes, handling several requests per second and logging successfully to the log file.
Could it be that PHP is usually writing to an in-memory file and then actually writes the contents to disk every ~140 minutes, and that's what's causing this error? If so, how can I circumvent it?
Answered by Magnus Eriksson in the comments:
Try adding the LOCK_EX argument when writing: file_put_contents($file, $text, FILE_APPEND | LOCK_EX). From the manual: use the LOCK_EX flag to prevent anyone else writing to the file at the same time.
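On Windows, a file opened by one process without shared write access can make concurrent opens fail with exactly this "Permission denied" error, so besides LOCK_EX a short retry loop is a common mitigation. A sketch (the helper name, retry count and sleep interval are all our own choices):
function log_append($file, $msg, $retries = 5) {
    for ($i = 0; $i < $retries; $i++) {
        // Append with an exclusive lock; @ suppresses the warning on a failed open.
        if (@file_put_contents($file, $msg, FILE_APPEND | LOCK_EX) !== false) {
            return true;
        }
        usleep(100000);  // wait 100 ms before retrying
    }
    return false;
}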
It could be that you have no permission to write the file. Make sure the target directory exists, creating it with mkdir($path, 0755, true) if necessary, and check that you still have write permission at the moment you write.
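A small sketch of that check, reusing the log directory from the question:
$path = 'C:\\WWW\\project-x\\logs';
if (!is_dir($path)) {
    mkdir($path, 0755, true);   // create the directory tree recursively
}
if (!is_writable($path)) {
    die("Cannot write to $path - check the permissions of the IIS user");
}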

PHP ftp_get download zero bytes files

In a PHP project I need to download CSV files from an FTP server, and I am using PHP's ftp_XXX functions to do it.
I am working on two separate computers: one can download FTP files with no problem; the other one initiates the FTP connection and opens and creates a file on my disk, but after a few seconds (it looks like a timeout) the script ends with this error:
PHP Warning: ftp_get(): Opening BINARY mode data connection for...
I have already tried passive mode, the connection is closed at the end of my script, and the strange thing is that it works on the other computer and on my server.
So here are my questions:
1) Do you have any idea why this is happening?
2) Are there settings in php.ini or Apache needed to enable PHP FTP properly?
Thank you.
Cyril
Maybe you exceeded the maximum execution time.
Try to increase it:
http://php.net/manual/en/function.set-time-limit.php
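A sketch of both knobs, assuming $conn is the handle returned by ftp_connect():
set_time_limit(300);                          // let the script run up to five minutes
ftp_set_option($conn, FTP_TIMEOUT_SEC, 300);  // also raise the FTP network timeout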

How to open a file on a remote server with XMLReader in PHP?

I have a huge XML file (~85 MB) and I would like to open it with XMLReader (my script then reads selected lines). I downloaded it to my PC and my script works there (using WAMP).
Now I would like to do the same online. The server login is aaa and the password is bbb (an example, of course).
I tried the following statement:
$xml = new XMLReader();
if ($xml->open('ftp://aaa:bbb@ftp.website.com/myfile.xml')) {
    echo 'OK';
    while ($xml->read()) {
        // my script here...
    }
}
It seems I am doing something wrong, because my web browser tells me the page takes too long to load. What is the right way to proceed? Or did I miss something important?
Since the file in question is XML, it is not wise to download it partially, as that would probably break the XML structure and make the parser fail.
You could set up a cronjob to retrieve the file occasionally and open it from a local location on the server, or retrieve it once and cache it locally to speed up subsequent requests.
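A sketch of the cache-it-locally variant (the paths, the one-hour lifetime and the FTP URL are illustrative, and copy() over ftp:// needs allow_url_fopen enabled):
$local = __DIR__ . '/cache/myfile.xml';
if (!file_exists($local) || filemtime($local) < time() - 3600) {
    copy('ftp://aaa:bbb@ftp.website.com/myfile.xml', $local);  // refresh the stale copy
}
$xml = new XMLReader();
$xml->open($local);  // parse the local copy, never the remote one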
http://davidwalsh.name/increase-php-script-execution-time-limit-ini_set
The default execution time is 30 seconds; increase it to a minute and retry.
I would use cURL to connect to the FTP server and download the file:
http://php.net/manual/en/book.curl.php
You're probably best off downloading the file to the server first using PHP's FTP functions and then opening it locally. I wouldn't rely on FTP support inside the XML library; you'll get more flexibility this way.
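A sketch combining the two suggestions, with the credentials and file names taken from the question (the local path is illustrative):
$conn = ftp_connect('ftp.website.com');
ftp_login($conn, 'aaa', 'bbb');
ftp_pasv($conn, true);  // passive mode is friendlier to firewalls
ftp_get($conn, '/tmp/myfile.xml', 'myfile.xml', FTP_BINARY);  // local path, then remote path
ftp_close($conn);
$xml = new XMLReader();
if ($xml->open('/tmp/myfile.xml')) {
    while ($xml->read()) {
        // my script here...
    }
    $xml->close();
}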

Serving file via PHP script and safe replacing this file on-the-fly

My PHP script serves files using readfile() and manually added HTTP headers. The files are about 1 MB each, so no partial reading is necessary.
What is the best way to replace such a file with an updated one while ensuring that already-started downloads won't break? I'd like pending downloads to finish with the old file version they started fetching.
Platform: Linux, PHP 5, Apache 2.
Use version numbering in the filename and always serve the most recent file:
$candidates = glob($dir . $prefix . '*');
sort($candidates);              // sort() sorts in place and returns a bool, so don't assign its result
$most_recent = array_pop($candidates);
e.g.
file0001.pdf
file0002.pdf
file0003.pdf
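The writer side of this scheme might look like the sketch below; the directory, prefix and staging name are assumptions, and it presumes at least one versioned file already exists. Staging the upload and then renaming keeps the swap atomic:
$dir = '/var/www/downloads/';
$prefix = 'file';
$staged = $dir . 'upload.tmp';                  // the freshly uploaded replacement
$candidates = glob($dir . $prefix . '*');
sort($candidates);
$most_recent = array_pop($candidates);          // e.g. "file0003.pdf"
preg_match('/(\d+)/', basename($most_recent), $m);
$next = $dir . $prefix . sprintf('%04d', (int)$m[1] + 1) . '.pdf';
rename($staged, $next);                         // atomic on the same filesystem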
C.
You could use flock() to get a lock on the file handle; your writer process will then block until it is able to get an exclusive lock.
So your readers would obtain a LOCK_SH shared lock, simply use fpassthru() to output the file, and finally release the lock.
The writer would obtain a LOCK_EX exclusive lock, perform the write in safety, and then release the lock, allowing any blocked readers to start fetching the new file.
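A sketch of both sides ($path, the Content-Type and the replacement data are placeholders):
// Reader: a shared lock lets many downloads overlap, but none overlaps a write.
$path = '/var/www/files/report.pdf';
$fp = fopen($path, 'rb');
if ($fp) {
    if (flock($fp, LOCK_SH)) {       // blocks only while a writer holds LOCK_EX
        header('Content-Type: application/pdf');
        header('Content-Length: ' . filesize($path));
        fpassthru($fp);
        flock($fp, LOCK_UN);
    }
    fclose($fp);
}

// Writer: the exclusive lock waits until every current reader has finished.
$newContents = file_get_contents('report_new.pdf');  // the updated file (illustrative)
$fp = fopen($path, 'cb');            // 'c' opens for writing without truncating yet
if ($fp) {
    if (flock($fp, LOCK_EX)) {
        ftruncate($fp, 0);           // only truncate once the lock is held
        fwrite($fp, $newContents);
        fflush($fp);
        flock($fp, LOCK_UN);
    }
    fclose($fp);
}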
