I have been working on a project that involves a step during which the script needs to automatically remove a certain directory in Linux (and all its contents).
I am currently using the following code to do that:
# Perform a recursive removal of the obsolete folder
$dir_to_erase = $_SESSION['path'];

function removeDirectory($dir_to_erase) {
    $files = glob($dir_to_erase . '/*');
    foreach ($files as $file) {
        is_dir($file) ? removeDirectory($file) : unlink($file);
    }
    rmdir($dir_to_erase);
    return;
}
Where $_SESSION['path'] is the folder to erase. It had been working like a charm, but I recently had to add a .htaccess file to the folder, and I noticed the script stopped working correctly (it keeps removing the rest of the files fine, but not the .htaccess files).
Can anyone point me to what I should add to the code to include the hidden dot files in the removal process?
Simply put, you can rely on DirectoryIterator:
The DirectoryIterator class provides a simple interface for viewing
the contents of filesystem directories.
function removeDirectory($dir_to_erase) {
    $files = new DirectoryIterator($dir_to_erase);
    foreach ($files as $file) {
        // check if not . or ..
        if (!$file->isDot()) {
            $file->isDir() ? removeDirectory($file->getPathname()) : unlink($file->getPathname());
        }
    }
    rmdir($dir_to_erase);
    return;
}
There are a lot of features there you can make use of, such as checking the file's owner, which is pretty useful to make sure you don't remove a critical file.
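For example, a minimal sketch of that owner check (the $path variable is illustrative, and posix_getuid() needs the POSIX extension):

$file = new SplFileInfo($path);
// getOwner() returns the numeric UID of the file's owner
if (function_exists('posix_getuid') && $file->getOwner() === posix_getuid()) {
    unlink($path); // only remove files owned by the current process user
}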
You can slightly modify your function to remove hidden files also:
function removeDirectory($dir)
{
    if (is_dir($dir)) {
        // scandir() lists hidden dot files too, unlike glob('*')
        $objects = scandir($dir);
        foreach ($objects as $object) {
            if ($object != "." && $object != "..") {
                if (is_dir($dir."/".$object))
                    removeDirectory($dir."/".$object);
                else
                    unlink($dir."/".$object);
            }
        }
        rmdir($dir);
    }
}
As per this answer:
PHP glob() doesn't find .htaccess
glob(".*") will find .htaccess
I'm trying to create a function which will run periodically to delete any old "user" folders which don't have an active user. This is the function I have:
function delete_temp_user_files() {
    $dir = $_SERVER['DOCUMENT_ROOT'] . "/users/";
    $dir_files = scandir($dir);
    foreach ($dir_files as $username) {
        if ($username == "." || $username == "..") continue;
        if (!user_exist($username)) {
            $dir1 = $_SERVER['DOCUMENT_ROOT'] . "/users/" . $username;
            if (!is_dir($dir1)) continue;
            if (file_exists($dir1)) unlink($dir1);
        }
    }
}
But when it tries to delete the directory I get the error "Warning: unlink(/path/to/users/delete1/): Is a directory in /page.php on line...". I know the directory exists because I can see it in the filesystem, and scandir() found it anyway.
I use unlink to delete these same folders in other scripts and it works fine. The directories aren't empty, so I can't use rmdir().
I'm not familiar with permissions or anything like that; is this some kind of issue with that? And do I need to worry about permissions if I only ever use PHP scripts to delete files and folders (like when the user clicks on the delete button which runs the script I wrote)?
UPDATE:
After scouring the web I finally found out how to delete a directory, and it's not easy! Add this function into the previous function and it works:
function delete_dir($directory) {
    // Note: glob('*') skips hidden dot files, so those would be left behind
    foreach (glob("{$directory}/*") as $file)
    {
        if (is_dir($file)) {
            delete_dir($file);
        } else {
            unlink($file);
        }
    }
    rmdir($directory);
}
unlink() is used to delete files; use rmdir() for directories.
Please note, you must first delete all the files in the directory.
Also, your code is very dangerous. Assume that, with time, you have 100,000 users; you will then have 100,000 folders. Can you imagine how much time this line will take?
foreach ($dir_files as $username) {
Please think of an alternate approach.
A good way: don't delete users from your database. Find users who haven't logged in for six months (say) and disable them. This way, your loop is smaller.
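A sketch of that idea (the users table, its last_login and disabled columns, and the $pdo connection are all hypothetical):

// Disable, rather than delete, accounts with no login for 6 months
$pdo->exec(
    "UPDATE users SET disabled = 1
     WHERE last_login < DATE_SUB(NOW(), INTERVAL 6 MONTH)"
);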
There is a function given on the rmdir documentation page:
function rrmdir($dir) {
    if (is_dir($dir)) {
        $objects = scandir($dir);
        foreach ($objects as $object) {
            if ($object != "." && $object != "..") {
                if (filetype($dir."/".$object) == "dir")
                    rrmdir($dir."/".$object);
                else
                    unlink($dir."/".$object);
            }
        }
        reset($objects);
        rmdir($dir);
    }
}
Sure it will make your life easier.
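Usage is then a single call; since it is built on scandir(), hidden dot files such as .htaccess are removed too:

rrmdir($_SESSION['path']); // the folder from the question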
I know that this is a late answer, but I solved it by checking whether the file exists or not.
I'm using the database for the path of the file, so you may try the example code below.
if (!empty($course->picture) && file_exists($course->picture)) {
    unlink($course->picture);
}
I have a lot of functions and classes that I have included in my website.
With the help of Stack Overflow I received a script that automatically includes all files in a folder and its subfolders: PHP: Automatic Include
When testing the script it always worked and I never had any problems with it.
But recently, when switching from a Windows server to a Linux server, it started giving problems with the extension of classes.
PHP Fatal error: Class 'AcumulusExportBase' not found in path/functions/classes/Acumulus/AcumulusExportWorkshop.php on line 3, referer: pagesite/?page_id=346
AcumulusExportWorkshop extends AcumulusExportBase.
This all fully works on Windows but refuses to work on Linux.
I can fix this by adding include_once 'AcumulusExportBase.php';, but if there is a better solution, it all seems unnecessary and annoying work.
The code I use is the following:
load_folder(dirname(__FILE__));

function load_folder($dir, $ext = '.php') {
    if (substr($dir, -1) != '/') { $dir = "$dir/"; }
    if ($dh = opendir($dir)) {
        $files = array();
        $inner_files = array();
        while ($file = readdir($dh)) {
            if ($file != "." and $file != ".." and $file[0] != '.') {
                if (is_dir($dir . $file)) {
                    $inner_files = load_folder($dir . $file);
                    if (is_array($inner_files)) $files = array_merge($files, $inner_files);
                } else {
                    array_push($files, $dir . $file);
                }
            }
        }
        closedir($dh);
        foreach ($files as $file) {
            if (is_file($file) and file_exists($file)) {
                $length = strlen($ext);
                if (substr($file, -$length) == $ext && $file != 'loader.php') { require_once($file); }
            }
        }
    }
}
Can anyone tell me why Windows has no problems with extending classes and Linux does? Also, is there a fix for the problem without having to manually include the base classes?
Have you verified that AcumulusExportBase is included before AcumulusExportWorkshop under Linux? PHP is sensitive to the order of imports.
Both other answers are correct (and I've upvoted them both). Your problem will be the order the files are loaded (see Mark's response) and the recursion is also wrong (see KIKO).
However there is a better way of doing what you want: use an autoloader. http://php.net/manual/en/language.oop5.autoload.php
First time is confusing, but once you've grasped it, it's a lovely way of loading files.
Basically you say "If I need class X and it's not loaded, then load file Y.php".
If you're being super-lazy and don't want to specify each class then you can say "If I need class X and it's not loaded, run through the directory structure looking for a file called X.php and load that, my class will be in there." You can mix in what you have above to do this.
This way, you can load AcumulusExportWorkshop first, and then it looks for AcumulusExportBase afterwards and runs happily.
And, more beneficially, you only load what you need. If you never need the class, it never gets loaded.
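A minimal sketch of that lazy variant, assuming one class per file, with each file named after its class and kept under a classes/ directory:

spl_autoload_register(function ($className) {
    $file = __DIR__ . '/classes/' . $className . '.php';
    if (is_file($file)) {
        require_once $file;
    }
});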
I would like to answer your question, but regrettably I do not have a Windows PHP server installed. I can, however, look at, and test, your code. The first thing I notice is the malformed recursion. To get the 'inner_files', recursion is used, which is fine, but this requires your function to return a value, namely the array of files. It does not. Furthermore, although you're using 'require_once', this is called on each recursion, meaning you try to include 'deep' files many times. In short: it's time to somewhat simplify your code.
load_folder(dirname(__FILE__));

function load_folder($dir, $ext = '.php')
{
    if (substr($dir, -1) != '/') $dir = $dir.'/';
    if ($handle = opendir($dir))
    {
        while ($file = readdir($handle))
        {
            if (($file != '.') && ($file != '..') && ($file[0] != '.'))
            {
                if (is_dir($dir.$file)) load_folder($dir.$file, $ext);
                else
                {
                    if ((substr($file, -strlen($ext)) == $ext) &&
                        ($file != 'loader.php') &&
                        file_exists($dir.$file)) require_once($dir.$file);
                }
            }
        }
        closedir($handle);
    }
}
This works under Linux, and performs the same task. I corrected the fact that $ext was missing from the internal load_folder() call.
My advice is to never blindly copy code you find on the internet. Always check it, and then check again. Make sure you understand how it works. If you do not, your projects will be littered with bugs and impossible for anyone to maintain.
As Robbie stated, both of the other answers are correct, and an autoloader is the ideal solution.
An autoloader may seem (at first) to be slightly more complicated, but it presents benefits that are genuinely significant and make it well worth using.
Here are a few of them:
You do not need to manually include or require files.
You do not need to worry about files being loaded in the correct sequence - the interpreter will load any dependencies automatically. (In your case, the differences between the Windows and Linux operating systems exposed this weakness in the existing code.)
You can avoid loading files that are not needed.
The interpreter does not need to parse unnecessary classes and code.
Here are some things you should know about autoloaders:
You can have as many autoloaders as you need and want - they are stored in a stack and executed in sequence. If the first one does not load the class, the next one is used, and so on, until the class is loaded or there are no more autoloaders to try.
An autoloader is a callable - either a method of a class, or a function.
Exactly how you implement the autoloader is up to you - so if your project has a specific directory structure that relates to the class type or hierarchy, you can instruct it to look in specific directories, making it more efficient.
Most of us like to keep our classes in separate files. This makes it easier to find the classes we are interested in, and keeps the files smaller, which makes them easier to understand.
PHP does not enforce any kind of naming convention when it comes to the names of the files we use, but most developers prefer to save Classes in files with file names that relate to the class name.
The autoloader feature assumes that there is a way to load the correct file when presented with the file name. So a good practice is to have a simple way of generating the file name from the class name - the simplest is to use the class name as the file name.
Here is my preferred autoloader - which I have adapted from code by Jess Telford that I found online when I was learning PHPUnit - (http://jes.st/2011/phpunit-bootstrap-and-autoloading-classes/)
class ClassDirectoryAutoLoader {

    static private $classNamesDirectory = array();

    public static function crawlDirectory($directory) {
        $dir = new DirectoryIterator($directory);
        foreach ($dir as $file) {
            self::addClassesAndCrawlDirectories($file);
        }
    }

    private static function addClassesAndCrawlDirectories($file) {
        if (self::isRealDirectory($file)) {
            self::crawlDirectory($file->getPathname());
        } elseif (self::isAPhpFile($file)) {
            self::saveClassFilename($file);
        }
    }

    private static function isRealDirectory($file) {
        // ignore links, self and parent
        return $file->isDir() && !$file->isLink() && !$file->isDot();
    }

    private static function isAPhpFile($file) {
        // ends in .php
        return substr($file->getFilename(), -4) === '.php';
    }

    private static function saveClassFilename($file) {
        // assumes that the filename is the same as the classname
        $className = substr($file->getFilename(), 0, -4);
        self::registerClass($className, $file->getPathname());
    }

    public static function registerClass($className, $fileName) {
        self::$classNamesDirectory[$className] = $fileName;
    }

    public static function loadClass($className) {
        if (isset(self::$classNamesDirectory[$className])) {
            require_once(self::$classNamesDirectory[$className]);
        }
    }
}

$classDir = dirname(__FILE__) . '/../classes'; // replace with the root directory for your class files
ClassDirectoryAutoLoader::crawlDirectory($classDir);
spl_autoload_register(array('ClassDirectoryAutoLoader', 'loadClass'));
What this code does is:
Recurse through the directories (from the classDir), looking for .php files.
Builds an associative array that maps the classname to the full filename.
Registers an autoloader (the loadClass method).
When the interpreter tries to instantiate a class that is not defined, it will run this autoloader, which will:
Check if the file is stored in the associative array.
Require the file if it is found.
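So, taking the classes from the question as an example, this is enough on its own:

// Instantiating the class fires loadClass(), which requires
// AcumulusExportWorkshop.php; its `extends` clause then triggers the
// autoloader again for AcumulusExportBase.php. No manual includes needed.
$export = new AcumulusExportWorkshop();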
I like this autoloader because:
It's simple.
It's general - you can use it in virtually any project that follows a few simple conventions (see below).
It only crawls the directory tree once, not every time a new class is instantiated.
It only requires the files that are needed.
The loadClass method is super-efficient, simply performing a lookup and a require.
This code makes some assumptions:
All of the classes are stored in a specific directory.
All of the files in that directory contain classes.
The file name exactly matches the class name.
There are no side effects from requiring a file (i.e. the file contains only a class definition, no procedural code).
Breaking these assumptions will break this autoloader.
These are the conventions you need to follow to make use of this autoloader:
Keep all classes under a single directory.
Keep only class definition files under the class directory.
Make a separate file for each public class.
Name each file after the class it contains.
No procedural code with side effects in these class definition files.
Well, I am building a system which uses an autoloader, so here's what I made:
function endswith($string, $tidbit) {
    // Length of string
    $strlen = strlen($string);
    // Length of ending
    $tidlen = strlen($tidbit);
    // If the tidbit is of the right length (less than or equal to the length of the string)
    if ($tidlen <= $strlen) {
        // Substring requires a place to start the copying
        $tidstart = $strlen - $tidlen;
        // Get $tidlen characters off the end of $string
        $endofstring = substr($string, $tidstart, $tidlen);
        // If the $tidbit matches the end of the string
        return ($endofstring == $tidbit);
    } else {
        // Failure: the tidbit is longer than the string, so it cannot match
        // (returning false rather than -1, which would be truthy)
        return false;
    }
}
// Working
function ush_load_path($path) {
    if (is_dir($path)) {
        // Look for a <dirname>.inc file named after the directory itself
        $basename = basename($path);
        if (is_file($path . '/' . $basename . '.inc')) {
            require_once $path . '/' . $basename . '.inc';
        }
        ush_load_path_recursive($path);
    // If it is a file
    } else if (is_file($path)) {
        require_once $path;
    // Failure
    } else {
        return false;
    }
}
function ush_load_path_recursive($path) {
    // Directory RESOURCE
    $path_dir = opendir($path);
    // Go through the entries of the specified directory
    // (strict comparison, so an entry named "0" doesn't end the loop)
    while (false !== ($entry = readdir($path_dir))) {
        if ($entry != '.' && $entry != '..') {
            // Create Full Path
            $path_ext = $path . '/' . $entry;
            // Development
            if (is_dir($path_ext)) {
                ush_load_path_recursive($path_ext);
            } else if (is_file($path_ext)) {
                if (ush_is_phplib($path_ext)) {
                    print $path_ext . '<br />';
                    require_once $path_ext;
                } else {
                    // Do nothing
                }
            }
        }
    }
    closedir($path_dir);
}
// Working
function ush_is_phplib($path) {
return endswith($path, '.inc');
}
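As an aside, on PHP 8+ the built-in str_ends_with() can replace the hand-rolled endswith() check entirely:

function ush_is_phplib($path) {
    return str_ends_with($path, '.inc'); // PHP 8.0+
}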
Can you do a print_r($files) after the closedir($dh); and before the foreach, so we can see which files are actually being loaded and in which order?
load_folder(dirname(__FILE__));

function load_folder($dir, $ext = '.php') {
    if (substr($dir, -1) != '/') { $dir = "$dir/"; }
    clearstatcache(); // added to clear path cache
    if ($dh = opendir($dir)) {
        $files = array();
        $inner_files = array();
        while ($file = readdir($dh)) {
            if ($file != "." and $file != ".." and $file[0] != '.') {
                if (is_dir($dir . $file)) {
                    $inner_files = load_folder($dir . $file);
                    if (is_array($inner_files)) $files = array_merge($files, $inner_files);
                } else {
                    array_push($files, $dir . $file);
                }
            }
        }
        closedir($dh);
        clearstatcache(true, $dir); // added to clear the cache for this path, realpath cache included
        foreach ($files as $file) {
            if (is_file($file) and file_exists($file)) {
                $length = strlen($ext);
                if (substr($file, -$length) == $ext && $file != 'loader.php') { require_once($file); }
            }
        }
    }
}
It seems that clearstatcache() must be called before any file-handling functions on the symlinked dir; PHP isn't caching symlinked dirs properly.
I might be missing some point here, but it gives me the creeps just seeing that script... I am making the assumption that you call that function with a folder where you keep all your PHP function files and whatnot, which are all included, even if only one of those files is needed for the actual script to work.
Am I missing something here, or is this how it is working? If not, I am misled by the description and function code.
If this is really what you are doing, there are better ways of including the needed files, without all those unnecessary inclusions.
I have a class that handles all my script loading. All I have to do is register the loading function:
spl_autoload_register('Modulehandler::Autoloader');
Then, whenever a new file is required, PHP will use my function to lookup the file.
Here is the function itself:
// A method of the Modulehandler class; self::$modules holds the registered modules
static function Autoloader($className) {
    $files = array($className);
    $lowerClass = strtolower($className);
    if (strcmp($className, $lowerClass) != 0) $files[] = $lowerClass;
    foreach (self::$modules as $moduleName => $module) {
        foreach ($files as $className) {
            $file = "{$module}/classes/{$className}.php";
            if (file_exists($file)) {
                require $file;
                break;
            }
        }
    }
}
This class has a little more to it than just the loading: I can also add modules to the loader, so I only search the folders of the included modules, which gives some performance gain over searching through everything. Besides that, there is the obvious benefit of only including the necessary files.
I hope this fits into what you need, and helps you out.
Have you checked whether AcumulusExportBase is properly included in AcumulusExportWorkshop?
And please keep in mind that Linux is very much case-sensitive, so the file name should be in the proper case.
For example, a.JPG can be called a.jpg on Windows, but on Linux we need to maintain the proper case.
The primary difference is the type of filesystem. Mac and Windows filesystems are typically not case-sensitive, but Linux is: it treats "Capitalized.php" and "capitalized.php" as two different files.
That is why most popular frameworks have lower-cased filenames.
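A quick illustration of the difference (file names are hypothetical):

// On Windows both lines load the same file; on Linux the second one
// fails unless a file with exactly that lower-case name exists.
require_once 'AcumulusExportBase.php';
require_once 'acumulusexportbase.php';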
I'm trying to delete ALL text files from a directory using a PHP script.
Here is what I have tried:
<?php array_map('unlink', glob("/paste/*.txt")); ?>
I don't get an error when I run this, yet it doesn't do the job.
Is there a snippet for this? I'm not sure what else to try.
Your implementation works; all you need to do is use the full path.
Example:
$fullPath = __DIR__ . "/test/";
array_map('unlink', glob("$fullPath*.txt"));
I expanded the submitted answers a little bit so that you can flexibly and recursively unlink text files located in subdirectories, as is often the case.
// @param string Target directory
// @param string Target file extension
// @return boolean True on success, False on failure
function unlink_recursive($dir_name, $ext) {
    // Exit if there's no such directory
    if (!file_exists($dir_name)) {
        return false;
    }
    // Open the target directory
    $dir_handle = dir($dir_name);
    // Take entries in the directory one at a time
    while (false !== ($entry = $dir_handle->read())) {
        if ($entry == '.' || $entry == '..') {
            continue;
        }
        $abs_name = "$dir_name/$entry";
        if (is_file($abs_name) && preg_match("/^.+\.$ext$/", $entry)) {
            if (unlink($abs_name)) {
                continue;
            }
            return false;
        }
        // Recurse on the children if the current entry happens to be a "directory"
        if (is_dir($abs_name) || is_link($abs_name)) {
            unlink_recursive($abs_name, $ext);
        }
    }
    $dir_handle->close();
    return true;
}
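A call against the directory from the original question might look like this (the path is illustrative):

unlink_recursive($_SERVER['DOCUMENT_ROOT'] . '/paste', 'txt');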
You could modify the method below, but be careful: make sure you have permissions to delete the files. If all else fails, send an exec command and let Linux do it.
static function getFiles($directory) {
    $out = array(); // collect the paths that were removed
    $looper = new RecursiveDirectoryIterator($directory);
    foreach (new RecursiveIteratorIterator($looper) as $filename => $cur) {
        $ext = trim($cur->getExtension());
        if ($ext == "txt") {
            // remove the file and remember its path
            unlink($cur->getPathname());
            $out[] = $filename;
        }
    }
    return $out;
}
I have modified the submitted answers and made my own version, in which I have made a function that iterates recursively over the current directory and all of its child-level directories, and unlinks all the files with an extension of .txt, or whatever .[extension] you want to remove, from all the directories, sub-directories and their child-level directories.
I have used:
glob(). From the PHP doc:
The glob() function searches for all the pathnames matching pattern
according to the rules used by the libc glob() function, which is
similar to the rules used by common shells.
I have used the GLOB_ONLYDIR flag because it iterates through only directories, so it is easier to get only the directories and unlink the desired files from each one.
<?php
// extension of files you want to remove.
$remove_ext = 'txt';

// remove desired extension files in the current directory
array_map('unlink', glob("./*.$remove_ext"));

// the function below removes files with the desired extension from all directories recursively.
function removeRecursive($directory, $ext) {
    array_map('unlink', glob("$directory/*.$ext"));
    foreach (glob("$directory/*", GLOB_ONLYDIR) as $dir) {
        removeRecursive($dir, $ext);
    }
    return true;
}

// traverse through all the directories in the current directory
foreach (glob('./*', GLOB_ONLYDIR) as $dir) {
    removeRecursive($dir, $remove_ext);
}
?>
For anyone who wonders how to delete, for example, all PDF files under the public directory (using Laravel's public_path() helper), you can do this:
array_map('unlink', glob(public_path('*.pdf')));
Trying to add the ability to delete a folder using FTP, along with all subfolders and files contained within that folder.
I have built a recursive function to do so, and I feel like the logic is right, but it still doesn't work.
I did some testing: I am able to delete on the first run if the path is just an empty folder or just a file, but I can't delete if it is a folder containing one file, or a folder containing one empty subfolder. So it seems to be a problem with traversing through the folder(s) and using the function to delete.
Any ideas?
function ftpDelete($directory)
{
    if (empty($directory)) // Validate that a directory was sent, otherwise will delete ALL files/folders
        return json_encode(false);
    else {
        global $conn_id;
        # here we attempt to delete the file/directory
        if (!(@ftp_rmdir($conn_id, $directory) || @ftp_delete($conn_id, $directory)))
        {
            # if the attempt to delete fails, get the file listing
            $filelist = @ftp_nlist($conn_id, $directory);
            # loop through the file list and recursively delete the FILE in the list
            foreach ($filelist as $file)
                ftpDelete($file);
            # if the file list is empty, delete the DIRECTORY we passed
            ftpDelete($directory);
        }
        else
            return json_encode(true);
    }
}
I took some time to write my own version of a recursive delete function over FTP; this one should be fully functional (I tested it myself).
Try it out and modify it to fit your needs. If it's still not working, there are other problems. Have you checked the permissions on the files you are trying to delete?
function ftp_rdel($handle, $path) {
    if (@ftp_delete($handle, $path) === false) {
        if ($children = @ftp_nlist($handle, $path)) {
            foreach ($children as $p)
                ftp_rdel($handle, $p);
        }
        @ftp_rmdir($handle, $path);
    }
}
OK, found my problem. Since I wasn't moving into the exact directory I was trying to delete, the path for each recursive file being called wasn't absolute:
function ftpDeleteDirectory($directory)
{
    global $conn_id;
    if (empty($directory)) // Validate that a directory was sent, otherwise will delete ALL files/folders
        return json_encode(false);
    else {
        # here we attempt to delete the file/directory
        if (!(@ftp_rmdir($conn_id, $directory) || @ftp_delete($conn_id, $directory)))
        {
            # if the attempt to delete fails, get the file listing
            $filelist = @ftp_nlist($conn_id, $directory);
            # loop through the file list and recursively delete the FILE in the list
            foreach ($filelist as $file)
            {
                // return json_encode($filelist);
                ftpDeleteDirectory($directory.'/'.$file); /*** THIS IS WHERE I MUST RESEND THE ABSOLUTE PATH TO THE FILE ***/
            }
            # if the file list is empty, delete the DIRECTORY we passed
            ftpDeleteDirectory($directory);
        }
    }
    return json_encode(true);
}
function recursiveDelete($handle, $directory)
{
    echo $handle;
    # here we attempt to delete the file/directory
    if (!(@ftp_rmdir($handle, $directory) || @ftp_delete($handle, $directory)))
    {
        # if the attempt to delete fails, get the file listing
        $filelist = @ftp_nlist($handle, $directory);
        // var_dump($filelist); exit;
        # loop through the file list and recursively delete the FILE in the list
        foreach ($filelist as $file) {
            recursiveDelete($handle, $file);
        }
        recursiveDelete($handle, $directory);
    }
}
You have to test (using ftp_chdir) every "file" you get from ftp_nlist to see whether it is a directory:
foreach ($filelist as $file)
{
    $inDir = @ftp_chdir($conn_id, $file);
    ftpDelete($file);
    if ($inDir) @ftp_cdup($conn_id);
}
This easy trick will work, because if ftp_chdir works, the current $file is actually a folder, and you've moved into it. Then you call ftpDelete recursively, to let it delete the files in that folder. Afterwards, you move back (ftp_cdup) to continue.
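For completeness, here is a sketch that assembles the same pieces without the chdir test, simply attempting both delete calls and rebuilding absolute paths; it assumes $conn_id is an authenticated FTP connection and that ftp_nlist may return bare names on some servers:

function ftpDeleteRecursive($conn_id, $path) {
    // Cheap cases first: an empty directory or a plain file
    if (@ftp_rmdir($conn_id, $path) || @ftp_delete($conn_id, $path)) {
        return true;
    }
    // Otherwise list the directory and recurse into each entry
    $filelist = @ftp_nlist($conn_id, $path);
    if ($filelist === false) {
        return false;
    }
    foreach ($filelist as $file) {
        $name = basename($file);
        if ($name === '.' || $name === '..') {
            continue;
        }
        // Some servers return bare names, so rebuild the absolute path
        $full = (strpos($file, '/') === false) ? "$path/$file" : $file;
        ftpDeleteRecursive($conn_id, $full);
    }
    // The directory should be empty now
    return @ftp_rmdir($conn_id, $path);
}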