Why is an open file deleted? On Windows XAMPP I get the message "still working", but on another PHP server the file is deleted even while it is open, and I get the message "file deleted". I can also delete the file over FTP while the first script is still running :(
<?php
// Script 1: create the file and keep the handle open for ten seconds.
$handle = fopen("resource.txt", "x");
sleep(10);
fclose($handle);
?>
<?php
// Script 2: try to delete the file while script 1 may still hold it open.
if (file_exists("resource.txt") && @unlink("resource.txt") === false) {
    echo "still working";
    exit;
} else {
    echo "file deleted";
}
?>
UNIX systems typically let you do this, yes. The underlying C unlink function is documented as such:
The unlink() function removes the link named by path from its directory
and decrements the link count of the file which was referenced by the
link. If that decrement reduces the link count of the file to zero, and
no process has the file open, then all resources associated with the file
are reclaimed. If one or more process have the file open when the last
link is removed, the link is removed, but the removal of the file is
delayed until all references to it have been closed.
In other words, you can basically mark the file for deletion at any time, but the system will actually keep it around as long as applications are still accessing it. Only when all applications have let go of the file will it finally be removed. Windows apparently does not do it that way. Update: since PHP 7.3 it is possible to unlink open files on Windows as well.
As a side note, UNIX's behaviour is the only sane behaviour in a multi-process environment. If you had to wait for all processes to close a file before the system let you remove it, it would be basically impossible to remove frequently accessed files at all. Yes, that's where those Windows dialog boxes about "Cannot delete file, still in use, retry?" come from, which you can never get rid of.
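A minimal sketch illustrating this behaviour on a UNIX system (run from a writable directory):
<?php
// UNIX: the handle stays valid after unlink(); the inode survives
// until the last open descriptor is closed.
$fp = fopen("demo.txt", "w+");
fwrite($fp, "hello");
unlink("demo.txt");   // the directory entry is gone immediately
rewind($fp);
echo fread($fp, 5);   // still prints "hello"
fclose($fp);          // only now is the storage reclaimed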
I am wrestling with a problem. I stripped the example down to this script, which can be run as a stand-alone application:
<?php
if (file_exists("x")) {
    print "<div>Deleting dir</div>";
    rmdir("x");
} else {
    print "<div>Not exists</div>";
}
clearstatcache();
mkdir("x");
If I call it repeatedly (F5 in the browser), then sometimes this error occurs:
Deleting dir
Warning: mkdir(): Permission denied in F:\EclipseWorkspaces\Ramses\www\deploy\stripped_example.php on line 9
It works OK 10-20 times, then this error occurs. I googled and found other users with this problem, but no solution.
https://github.com/getgrav/grav/issues/467
My example creates the directory in the cwd, where anybody has full control. In addition, the mkdir() $mode parameter is ignored on Windows. After the error, the "x" directory truly does not exist, and on the next attempt (F5) it is always created without error. I hoped the later-added clearstatcache() would help, but no.
In my full application I am using the full file path. The deleted directory is not empty and I must clean it first. After successful deletion, the error occurs almost always.
My system is Windows 7, PHP 7.0.5, Apache 2.4
Windows doesn't let you delete things if another process is accessing them.
Check whether your antivirus or some other process is holding the folder open.
You can verify this in Resource Monitor, reachable from Task Manager.
Try the code with an additional existence check:
<?php
if (is_dir("x")) {
    print "<div>Deleting dir</div>";
    rmdir("x");
} else {
    print "<div>Not exists</div>";
}
clearstatcache();
if (!is_dir("x")) {
    mkdir("x");
}
Had the same problem with Windows 10, XAMPP and PHP 7. The culprit was Kaspersky Internet Security scanning and blocking the directory. With KIS disabled, mkdir() always works for me. If disabling security software is not an option, you can try rename() instead of directly recreating the directory:
$time = time();
mkdir($path . $time);         // create under a unique temporary name
rename($path . $time, $path); // then move it into place
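Alternatively, a hedged workaround is to retry mkdir() briefly, since a scanner may release the old directory handle a moment later (the retry count and delay here are arbitrary):
<?php
// Retry mkdir() a few times in case another process still holds the handle.
$attempts = 0;
while (!@mkdir("x") && $attempts < 10) {
    usleep(100000); // wait 100 ms between attempts
    clearstatcache();
    $attempts++;
}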
Just trim whitespace from the folder name with PHP's trim() function before using it.
I'm having a strange problem using ftp_get() on one of two identical instances. One is on localhost and the other on an actual server. I'm using the following to download a file via FTP. Both instances download from the same FTP server with the same credentials and the same paths.
$result = ftp_get($connection, $downloadPath, $serverPath, FTP_BINARY);
if ($result) {
    $successfulWrites[] = $downloadPath; // file name only, without path
} else {
    // on a second attempt to download a file with the same name,
    // ftp_get() returns false; this is where I throw an exception in my code
}
On my localhost, I can download the same file over and over, and it doesn't matter what the file name on the FTP server is or where it's located.
On the second instance, which is identical to the localhost one in terms of code (i.e. pulled from the same git repo), I can download a file once, but the same file cannot be downloaded again: ftp_get() returns false. If I change the name of the file on the FTP server, I can download it, but after that it won't work again, i.e. ftp_get() returns false.
I don't have access to the FTP server log. If it's available, I'm going to try to get it today from the host. But can anyone think of a reason this might be happening? ftp_get() just returns true or false without any explanation, so I'm pretty stuck with this.
I'm using PHP 5.4, and I have no idea what the spec of the (regular) FTP server is.
As discussed, it sounded like ftp_get was successfully obtaining the file and writing it locally. I wonder whether, due to a permissions problem, it fails when it tries to write the file locally again. Thus, the FTP channel itself is fine, and the problem is purely local.
I'm somewhat surprised at this though, as I would imagine PHP would have raised a warning. Is your error_reporting set to allow this whilst you are debugging?
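One way to test that theory (a sketch; the variable names follow the question's snippet):
<?php
// Surface warnings while debugging, then verify the local target can be
// (re)written before attempting the transfer.
error_reporting(E_ALL);
ini_set('display_errors', '1');
if (file_exists($downloadPath) && !is_writable($downloadPath)) {
    echo "Local file $downloadPath is not writable" . PHP_EOL;
}
$result = ftp_get($connection, $downloadPath, $serverPath, FTP_BINARY);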
I make a site map with a PHP file that generates it from MySQL. I changed host and now I have a problem with writing into the file. There is something I can't understand.
Here is my example:
<?php
$xml = 'bla bla xml'; // ... some XML-generating code
$fp = fopen($_SERVER['DOCUMENT_ROOT'].'/my_site_map.xml', 'w');
if ($fp)
    echo 'we opened it';
else
    echo 'we failed';
$fwrite = fwrite($fp, $xml, strlen($xml));
if ($fwrite === false)
    echo "another fail";
fclose($fp);
echo "we done";
?>
The question is: my file my_site_map.xml has permissions 664 (rw-rw-r--), and I can't use this script when I open the PHP page from the browser; if I try, I see: "we failed another fail we done". But if I run it through crontab and look at the log file, I see: "we opened it we done". That is exactly what I want, but the main problem is that the file has not been rewritten. Why? And how can I fix this? Thanks.
My server is nginx, not Apache; I didn't think this info would be valuable.
Well I don't have enough rep to comment so this will have to be an answer.
I'm going to take a stab in the dark and say the file is owned by your user or root, not the process that is running the webserver. Nor is the file owned by the group the webserver process is run under.
So either chown/chgrp the file to be owned by the user the webserver process runs as, e.g. chown apache file, or give the file write permissions for everyone, e.g. chmod 666 file.
Don't chmod 777 as commented above unless it's an executable file and you want anyone to be able to run it. The first solution is a better practice than just giving everyone write access to a file.
Edit: in response to the comments on the original answer above: if the file isn't an executable, then don't give it 7 for any permission digit. 6 is read/write and is suitable for a text file you are opening to write to (even 2 is, if it comes to that).
Edit 2: note that fopen() does not throw an exception on failure; it raises a warning and returns false. So instead of a try/catch block, check the return value and inspect the last error:
$fp = @fopen($_SERVER['DOCUMENT_ROOT'].'/my_site_map.xml', 'w');
if ($fp === false) {
    $error = error_get_last();
    echo "The error is " . $error['message'];
}
For PHP code, here are the links to change it on the fly. You can change it to whatever you need, make your edits, then change it back as needed.
http://php.net/manual/en/function.chmod.php
http://php.net/manual/en/function.chown.php
http://php.net/manual/en/function.chgrp.php
Examples are included with the documentation on each link. Find the permissions that work best for what you're doing. There isn't a one-size-fits-all for permissions, since it really depends on your end product (web app, page, whatever).
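For instance, a brief sketch using those functions (the path and group are placeholders, and chown()/chgrp() generally require the script to run as root):
<?php
$file = $_SERVER['DOCUMENT_ROOT'] . '/my_site_map.xml';
chmod($file, 0664);       // rw-rw-r--
chgrp($file, 'www-data'); // assumption: the webserver worker runs as www-data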
I have a directory with files that need processing in a batch with PHP. The files are copied to the server via FTP. Some of the files are very big and take a long time to copy. How can I determine in PHP whether a file is still being transferred (so I can skip processing that file and handle it in the next run of the batch process)?
A possibility is to get the file size, wait a few moments, and check whether the file size has changed. This is not foolproof, because there is a slight chance that the transfer simply stalled for a few moments...
One of the safest ways of doing this is to upload the files with a temporary name, and rename them once the transfer is finished. Your program should skip files with the temporary name (a simple extension works just fine). Obviously this requires the client (uploader) to cooperate, so it's not ideal.
[This also allows you to delete failed (partial) transfers after a given time period if you need that.]
Anything based on polling the file size is racy and unsafe.
Another scheme (that also requires cooperation from the uploader) can involve uploading the file's hash and size first, then the actual file. That allows you to know both when the transfer is done, and if it is consistent. (There are lots of variants around this idea.)
Something that doesn't require cooperation from the client is checking whether the file is open by another process or not. (How you do that is OS dependent - I don't know of a PHP builtin that does this. lsof and/or fuser can be used on a variety of Unix-type platforms, Windows has APIs for this.) If another process has the file open, chances are it's not complete yet.
Note that this last approach might not be fool-proof if you allow restarting/resuming uploads, or if your FTP server software doesn't keep the file open for the entire duration of the transfer, so YMMV.
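A minimal sketch of the batch side of the rename scheme, assuming uploads arrive with a hypothetical ".part" extension:
<?php
// Process everything in the incoming directory except in-progress uploads.
foreach (glob('/incoming/*') as $path) {
    if (substr($path, -5) === '.part') {
        continue; // still being transferred (or a failed transfer)
    }
    // process_file($path); // your processing routine goes here
}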
Our server admin suggested ftpwho, which outputs which files are currently being transferred.
http://www.castaglia.org/proftpd/doc/ftpwho.html
So the solution is to parse the output of ftpwho to see whether a file in the directory is being transferred.
Some FTP servers allow running commands when certain events occur. So if your FTP server allows this, then you can build a simple signalling scheme to let your application know that the file has been uploaded more or less successfully ("more or less" because you don't know whether the user intended to upload the file completely or in parts). The signalling scheme can be as simple as creating an "uploaded_file_name.ext.complete" file, and you then monitor the existence of files with the ".complete" extension.
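For illustration, a sketch of the monitoring side of that signalling scheme (the directory and marker extension are assumptions):
<?php
// Only handle uploads that have a matching ".complete" marker file.
foreach (glob('/incoming/*.complete') as $marker) {
    $file = substr($marker, 0, -strlen('.complete'));
    if (is_file($file)) {
        // process_file($file); // your processing routine
        unlink($marker);        // consume the marker once processed
    }
}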
Alternatively, you can check whether you can open the file for writing: most FTP servers won't let you do this while the file is still being uploaded.
One more approach, mentioned by Mat, is using system-specific techniques to check whether the file is opened by another process.
The best way to check would be to try to get an exclusive lock on the file using flock(); the SFTP/FTP process will be using the fopen libraries.
// try to get an exclusive lock on the file
$fp = fopen($pathname, "r+");
if ($fp && flock($fp, LOCK_EX | LOCK_NB)) { // non-blocking exclusive lock
    flock($fp, LOCK_UN); // release the lock
    fclose($fp);
} else {
    error_log("Failed to get exclusive lock on $pathname. File may still be uploading.");
    if ($fp) fclose($fp);
}
It's not a really nice trick, but it's simple :-), and you can do the same with filemtime():
function upload_seems_complete($filepath)
{
    $result = false;
    $tries = 5;
    if (file_exists($filepath)) {
        $sizes = array();
        for ($i = 0; $i < $tries; $i++) {
            sleep(1);
            clearstatcache(); // filesize() results are cached by PHP
            $sizes[] = filesize($filepath);
        }
        // a stable size across all samples suggests the upload is finished
        $result = (count(array_unique($sizes)) == 1);
    }
    return $result;
}
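Usage might then look like this (upload_seems_complete() is just the illustrative name used above):
if (upload_seems_complete('/incoming/bigfile.zip')) {
    // size was stable across five one-second samples; probably safe to process
}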
What I would like to script: a PHP script to find a certain string in loads of files.
Is it possible to read the contents of thousands of text files from another FTP server without actually downloading those files (ftp_get)?
If not, would downloading them ONCE (skip if the file already exists / re-download if the filesize differs), then searching for the string, be the easiest option?
If URL fopen wrappers are enabled, then file_get_contents can do the trick and you do not need to save the file on your server.
<?php
$find = 'mytext'; // text to find
$files = array('http://example.com/file1.txt', 'http://example.com/file2.txt'); // source files
foreach ($files as $file) {
    $data = file_get_contents($file);
    if (strpos($data, $find) !== false)
        echo "found in $file" . PHP_EOL;
}
?>
[EDIT]: If the files are accessible only by FTP:
In that case, you have to use FTP URLs, like this:
$files = array('ftp://user:pass@domain.com/path/to/file', 'ftp://user:pass@domain.com/path/to/file2');
If you are going to store the files after you download them, then you may be better served to just download or update all of the files, then search through them for the string.
The best approach depends on how you will use it.
If you are going to be deleting the files after you have searched them, then you may want to also keep track of which ones you searched, and their file date information, so that later, when you go to search again, you won't waste time searching files that haven't changed since the last time you checked them.
When you are dealing with so many files, try to cache any information that will help your program to be more efficient next time it runs.
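A sketch of that caching idea: remember each file's modification time in a hypothetical scan_cache.json, and skip files whose mtime has not changed ($files is assumed to hold the paths being searched):
<?php
$cacheFile = 'scan_cache.json';
$cache = is_file($cacheFile) ? json_decode(file_get_contents($cacheFile), true) : array();
foreach ($files as $file) {
    $mtime = filemtime($file);
    if (isset($cache[$file]) && $cache[$file] === $mtime) {
        continue; // unchanged since the last run
    }
    // ... search this file for the string here ...
    $cache[$file] = $mtime;
}
file_put_contents($cacheFile, json_encode($cache));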
PHP's built-in file reading functions, such as fopen()/fread()/fclose() and file_get_contents() do support FTP URLs, like this:
<?php
$data = file_get_contents('ftp://user:password@ftp.example.com/dir/file');
// The file's contents are stored in the $data variable
If you need to get a list of the files in the directory, check out opendir(), readdir() and closedir(), which I'm pretty sure support FTP URLs.
An example:
<?php
$dir = opendir('ftp://user:password@ftp.example.com/dir/');
if (!$dir)
    die;
while (($file = readdir($dir)) !== false)
    echo htmlspecialchars($file) . '<br />';
closedir($dir);
If you can connect via SSH to that server, and if you can install new PECL (and PEAR) modules, then you might consider using PHP SSH2. Here's a good tutorial on how to install and use it. This is a better alternative to FTP. If that is not possible, your only solution is file_get_contents('ftp://domain/path/to/remote/file');.
** UPDATE **
Here is a PHP-only implementation of an SSH client: SSH in PHP.
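For reference, a minimal sketch of reading a remote file with the PECL ssh2 extension (host, credentials and path are placeholders):
<?php
// Assumes the PECL ssh2 extension is installed and loaded.
$conn = ssh2_connect('example.com', 22);
ssh2_auth_password($conn, 'user', 'pass');
$sftp = ssh2_sftp($conn);
// Read a remote file through the ssh2.sftp:// stream wrapper.
$data = file_get_contents('ssh2.sftp://' . intval($sftp) . '/path/to/remote/file');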
With FTP you'll always have to download to check.
I do not know what kind of bandwidth you have or how big the files are, but this might be an interesting use case for running it from the cloud, like Amazon EC2 or Google App Engine (if you can download the files within the time limit).
In the EC2 case you would spin up the server for an hour to check for updates to the files and shut it down again afterwards. This would cost a couple of bucks per month and save you from potentially upgrading your line or hosting contract.
If this is a regular task, then it might be worth using a simple queue system so you can run multiple processes at once (which will hugely increase speed). It would involve these steps:
Get a list of all files on the remote server.
Put the list into a queue (you can use memcached for a basic message-queuing system).
Use a separate script to get the next item from the queue.
The processing script would contain simple functionality (in a do-while loop):
ftp_connect
do
    item = next item from queue
    $contents = file_get_contents(item);
    preg_match(..., $contents);
while (queue is not empty);
ftp_close
You could then, in theory, fork off multiple processes through the command line without needing to worry about race conditions.
This method is probably best suited to cron/batch processing, however it might work in this situation too.
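For what it's worth, a hedged sketch of such a worker, assuming a Memcached-stored array used as the queue (Memcached has no native queue type, so the get/set pop below is race-prone and purely illustrative) and placeholder FTP credentials:
<?php
// Worker: pop file names off a shared "queue" and scan each one.
$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);
$base = 'ftp://user:pass@ftp.example.com/'; // placeholder credentials
while (($files = $mc->get('file_queue')) && count($files) > 0) {
    $item = array_shift($files);
    $mc->set('file_queue', $files); // not atomic; fine for a sketch only
    $contents = file_get_contents($base . $item);
    if ($contents !== false && strpos($contents, 'mytext') !== false) {
        echo "found in $item\n";
    }
}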