Won't delete all files with 'delete recursive dir' function - php

I'm trying to delete a big directory with a lot of subfolders and files (>1000). There are many functions written for this purpose; I use the following one:
function rrmdir($dir) {
    if (is_dir($dir)) {
        $objects = scandir($dir);
        foreach ($objects as $object) {
            if ($object != "." && $object != "..") {
                if (filetype($dir . "/" . $object) == "dir") {
                    log_message(201, array(), 'Try to delete folder: ' . $dir . '/' . $object);
                    rrmdir($dir . "/" . $object);
                } else {
                    log_message(201, array(), 'Try to delete FILE: ' . $dir . '/' . $object);
                    unlink($dir . "/" . $object);
                }
            }
        }
        reset($objects);
        rmdir($dir);
    }
}
The problem is that many files are left behind. Is this usual behavior, or is something wrong with my code? If it is usual behavior, how can I get around this problem?
Thanks in advance.

Different operating systems handle this differently. On most operating systems and filesystems, files can be locked exclusively for read or write operations.
If another process holds a handle to the file with such a lock, your process may not be able to modify (or delete) the file. The same can be true for different threads within one process.
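If a transient lock is the culprit, one workaround is to retry a failing unlink() a few times with a short pause before giving up. A minimal sketch (the helper name, retry count and delay are purely illustrative):
function unlink_with_retry($path, $attempts = 3, $delayMicros = 250000) {
    // Illustrative helper: retry unlink() in case another process briefly
    // holds a lock on the file; the attempt count and delay are arbitrary.
    for ($i = 0; $i < $attempts; $i++) {
        if (@unlink($path)) {
            return true;
        }
        clearstatcache(true, $path); // drop stale stat info before the next try
        usleep($delayMicros);
    }
    return false; // still not deletable after all attempts
}
You could then call unlink_with_retry($dir . '/' . $object) in place of the plain unlink() inside rrmdir() and log the paths for which it still returns FALSE.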

Related

PHP unlink() Error: "Directory not empty"

I have the following recursive method to delete a directory and all its sub-directories and files:
protected function _rrmdir($dir)
{
    if (is_dir($dir)) {
        $objects = scandir($dir);
        foreach ($objects as $object) {
            if ($object != '.' && $object != '..') {
                if (filetype($dir . '/' . $object) == 'dir') {
                    $this->_rrmdir($dir . '/' . $object);
                } else {
                    unlink($dir . '/' . $object);
                }
            }
        }
        reset($objects);
        rmdir($dir);
    }
}
On occasion, I get a warning: "Directory not empty".
The directory is actually created as a temporary holder for files. The files are downloaded from the Internet using the following snippet:
file_put_contents($filename, file_get_contents($file))
After they are downloaded (a write operation), they are then uploaded to a website (a read operation). Once done uploading, the temporary folder and its files are then deleted.
The odd thing is that when I look inside the temporary folder, there are no files there. It's as if the code tried to delete the folder while the last file was in the process of being deleted?
Any ideas what might be wrong and how to resolve it? I need this code to run on Windows and *nix, so a *nix only solution is not an option.
The constant DIRECTORY_SEPARATOR might help you with Windows/Unix compatibility.
For the folder not empty, try this:
protected function _rrmdir($dir)
{
    if (is_dir($dir)) {
        $objects = scandir($dir);
        foreach ($objects as $object) {
            if ($object != '.' && $object != '..') {
                if (is_dir($dir . DIRECTORY_SEPARATOR . $object)) {
                    $this->_rrmdir($dir . DIRECTORY_SEPARATOR . $object);
                } else {
                    if (is_file($dir . DIRECTORY_SEPARATOR . $object)) {
                        if (!unlink($dir . DIRECTORY_SEPARATOR . $object)) {
                            // code in case the file was not removed
                        }
                        // wait a bit here?
                    } else {
                        // code to debug file permission issues
                    }
                }
            }
        }
        reset($objects);
        rmdir($dir);
    }
}
It can happen that you try to remove a file whose permissions don't allow the PHP process to delete it.
is_file() will only return FALSE here if the path can't be read at all; keep in mind that the user PHP runs as also needs write permission (on the file's parent directory, on *nix) to delete files.
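For the permissions case, a small debugging sketch that records what PHP can see about a path when unlink() fails (error_log() is used here as a stand-in for whatever logging you already have):
$target = $dir . DIRECTORY_SEPARATOR . $object;
if (!@unlink($target)) {
    // Log whatever PHP can determine about the path and its parent directory.
    error_log(sprintf(
        'unlink failed for %s (exists: %d, is_file: %d, parent writable: %d, perms: %o)',
        $target,
        (int) file_exists($target),
        (int) is_file($target),
        (int) is_writable(dirname($target)),
        @fileperms($target) & 0777
    ));
}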

Can't get scandir to scan root directory

I've tried every path I can think of.
''
'/'
'htdocs/'
No matter what I try, I can't figure out how to scan the root directory.
So, how do you do it?
Current Code:
function pathing() {
    $files = scandir('/');
    foreach ($files as $file) {
        if ($file === '.' OR $file === '..') {
        } else {
            print_r($file . ' ');
        }
    }
}
I'm moving my comment to an answer in order to bring attention to the solution for others who stumble across this question.
The document root location is accessible in the $_SERVER array.
$_SERVER['DOCUMENT_ROOT']
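For example, to scan the web server's document root instead of the filesystem root '/' (a minimal adjustment of the pathing() function from the question):
function pathing() {
    // $_SERVER['DOCUMENT_ROOT'] points at the web root, e.g. .../htdocs,
    // which is usually what "root directory" means in this context.
    $files = scandir($_SERVER['DOCUMENT_ROOT']);
    foreach ($files as $file) {
        if ($file !== '.' && $file !== '..') {
            print_r($file . ' ');
        }
    }
}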

PHP GAE readdir() doesn't work as intended

I'm trying to recursively list every file that is in my bucket. It's not too many files but I'd like to list them to test a few things. This code works on a normal file system but it's not working on Google Cloud Storage.
Anyone have any suggestions?
function recurse_look($src) {
    $dir = opendir($src);
    while (false !== ($file = readdir($dir))) {
        if (($file != '.') && ($file != '..')) {
            if (is_dir($src . '/' . $file)) {
                recurse_look($src . '/' . $file);
            } else {
                echo $src . '/' . $file;
                echo "<br />";
            }
        }
    }
    closedir($dir);
}
recurse_look("gs://<BUCKET>");
Personally, I would recommend not using a filesystem-impersonation abstraction layer on top of Google Cloud Storage, for a task such as listing everything inside a bucket -- rather, just reach out for the underlying functionality.
In particular, see https://cloud.google.com/storage/docs/json_api/v1/json-api-php-samples for everything about authentication etc., and, once that's taken care of, focus on just one line in the example:
$objects = $storageService->objects->listObjects(DEFAULT_BUCKET);
This is all you need to list all the objects in a bucket (which is not the same thing as "files in a directory"; the "filesystem simulations" layered on top of buckets and objects, in my personal opinion, end up hurting rather than helping despite their excellent intentions:-).
Now, if the objects' names contain e.g. slashes and you want to treat that as symbolically signifying something or other, go right ahead, but at least this way you are sure you are getting all the objects that actually exist in the bucket, and nothing but those!-)
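As a hedged sketch of what the listing loop might look like once the authentication from the linked sample is in place ($storageService, DEFAULT_BUCKET and the getItems()/getName()/getNextPageToken() accessors are assumed to match the generated PHP client; verify them against the client version you install):
$params = array();
do {
    // List one page of objects, then follow the page token until exhausted.
    $objects = $storageService->objects->listObjects(DEFAULT_BUCKET, $params);
    foreach ($objects->getItems() as $object) {
        echo $object->getName() . "<br />";
    }
    $params['pageToken'] = $objects->getNextPageToken();
} while ($params['pageToken']);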
Now that glob is working, you can try something like this:
function lstree($dir) {
    foreach (glob($dir . '/*') as $path) {
        if (is_dir($path)) {
            echo $path;
            lstree($path);
        } else {
            echo $path;
        }
    }
}

lstree('gs://{bucket}/');

List files and subfolders from database in PHP

Beginner: I can't seem to get my head around the logic of it. I have searched, but everything I find is about listing files and folders from an actual directory, i.e. with opendir().
My problem is:
I'm trying to work out (in PHP) how to list files and subfolders from a path stored in a database (without any access to the file or dir, so just from the path name).
For example database shows:
main/home/television.jpg
main/home/sofa.jpg
main/home/bedroom/bed.jpg
main/home/bedroom/lamp.jpg
So if I specify main/home, it shows: television.jpg, sofa.jpg, and the name of the subfolder: bedroom.
scanFolder('main/home');

function scanFolder($dir) {
    foreach (scandir($dir) as $file) {
        if (!in_array($file, array('.', '..'))) {
            if (is_dir($dir . '/' . $file)) {
                scanFolder($dir . '/' . $file);
            } else {
                echo $dir . '/' . $file . "\n";
            }
        }
    }
}
You would probably want to check on each iteration if the filename is a directory or not. If it is, open it up and read its contents and output them. A recursive function would work best in this situation.
http://php.net/manual/en/function.is-dir.php
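Since the paths may only exist as strings in the database, another option is plain string processing instead of scandir(). A minimal sketch, assuming the paths have already been fetched into an array (the array, the listLevel() name and the return shape are all illustrative):
function listLevel(array $paths, $prefix) {
    // Derive the direct files and subfolders of $prefix from a flat list of
    // path strings, without touching the real filesystem.
    $prefix = rtrim($prefix, '/') . '/';
    $files = array();
    $folders = array();
    foreach ($paths as $path) {
        if (strpos($path, $prefix) !== 0) {
            continue; // not under this prefix
        }
        $rest = substr($path, strlen($prefix));
        $slash = strpos($rest, '/');
        if ($slash === false) {
            $files[] = $rest; // e.g. television.jpg
        } else {
            $folders[] = substr($rest, 0, $slash); // first segment is a subfolder
        }
    }
    return array('files' => $files, 'folders' => array_unique($folders));
}

$paths = array(
    'main/home/television.jpg',
    'main/home/sofa.jpg',
    'main/home/bedroom/bed.jpg',
    'main/home/bedroom/lamp.jpg',
);
print_r(listLevel($paths, 'main/home'));
// files: television.jpg, sofa.jpg -- folders: bedroom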

PHP / Windows - Opendir() fails opening subdirectories within symbolic linked directories

Does anyone know a solution to this problem? I'm unable to open a subdirectory within a symlink'd directory. I've confirmed that the paths are correct (I even copied & pasted the path into Explorer, which parsed it fine). This is a strange, annoying bug :|.
Example:
C:\folder\symbolic_link\dir1\dir2 - opening dir2 fails.
C:\folder\symbolic_link\dir1 - works
C:\folder\real_directory\dir1\dir2 - works
C:\folder\real_directory\dir1 - works
Alright, I finally found a hack to work around this bug in PHP's handling of symlinks on Windows. The bug occurs when recursively iterating through files/directories using opendir(). If a symlink to a directory exists in the current directory, opendir() will fail to read the directories inside that symlink. It is caused by something funky in PHP's stat cache, and can be resolved by calling clearstatcache() before calling opendir() on the directory symlink (also, the parent directory's file handle must be closed first).
Here is an example of the fix:
<?php
class Filesystem
{
    public static function files($path, $stats = FALSE)
    {
        clearstatcache();
        $ret = array();
        $handle = opendir($path);
        $files = array();
        // Store files in directory; subdirectories can't be read until the current handle is closed & statcache cleared.
        while (FALSE !== ($file = readdir($handle)))
        {
            if ($file != '.' && $file != '..')
            {
                $files[] = $file;
            }
        }
        // Handle _must_ be closed before statcache is cleared, cache from open handles won't be cleared!
        closedir($handle);
        foreach ($files as $file)
        {
            // Note: clearstatcache()'s first parameter is the clear_realpath_cache flag,
            // so a non-empty $path here simply clears the entire cache.
            clearstatcache($path);
            if (is_dir($path . '/' . $file))
            {
                $dir_files = self::files($path . '/' . $file);
                foreach ($dir_files as $dir_file)
                {
                    $ret[] = $file . '/' . $dir_file;
                }
            }
            else if (is_file($path . '/' . $file))
            {
                $ret[] = $file;
            }
        }
        return $ret;
    }
}

var_dump(Filesystem::files('c:\\some_path'));
Edit: It seems that clearstatcache($path) must be called before any file-handling functions on the symlink'd dir. PHP isn't caching symlink'd dirs properly.
