I'm working on a Minecraft Server Dashboard, and one of its functions is to back up and restore your world (a directory). I already have a function (see below), but, as you can probably see, it's pretty bad code. Does anyone know of a better, cleaner function?
function backupOrRestoreWorld($source, $target){
    foreach(glob($target.'*.*') as $v){
        unlink($v);
    }
    if(is_dir($source)){
        @mkdir($target);
        $d = dir($source);
        while(FALSE !== ($entry = $d->read())){
            if($entry == '.' || $entry == '..'){
                continue;
            }
            $Entry = $source.'/'.$entry;
            if(is_dir($Entry)){
                backupOrRestoreWorld($Entry, $target.'/'.$entry);
                continue;
            }
            copy($Entry, $target.'/'.$entry);
        }
        $d->close();
    }
    else{
        copy($source, $target);
    }
    if($source == "server/world"){
        return "World backed up.";
    }
    else {
        return "World restored from backup.";
    }
}
I wouldn't do this in PHP. Just use system("cp -a $source $dest"). (And make sure the user cannot in any way control the contents of $source and $dest, or you will be hacked.)
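If you do shell out, the usual way to keep path values from reaching the shell unescaped is escapeshellarg(); here is a minimal sketch (the helper name is mine, not from the post):

```php
<?php
// Sketch: shell out to `cp -a`, quoting both paths so that shell
// metacharacters in a path value cannot inject extra commands.
function shellCopy($source, $target)
{
    $cmd = sprintf('cp -a %s %s', escapeshellarg($source), escapeshellarg($target));
    exec($cmd, $output, $exitCode);
    return $exitCode === 0;   // cp exits 0 on success
}
```

Note this only mitigates injection; it does not validate that the paths are ones the user should be allowed to touch.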
I would create more functions out of it, each one doing a distinct job, probably encapsulated in a class, like
empty_directory
copy_directory
You can still maintain your single function, which then makes use of the subroutines / objects to provide a façade into your application, for example dealing with exceptions for error handling and such.
Apart from that, your code is not really bad. It's a recursive function for copying the data, which might stress the file system a bit, but that can presumably be neglected. If you move the distinct functionality into units of its own, you can change that over time if you run into actual problems.
But the first benefit would be to make use of exceptions in the subroutines, I think:
function backupOrRestoreWorld($source, $target)
{
    empty_directory($target);
    copy_directory($source, $target);
    if ($source == "server/world")
    {
        return "World backed up.";
    }
    else
    {
        return "World restored from backup.";
    }
}
function empty_directory($path)
{
    $path = rtrim($path, '/');
    if (!is_dir($path))
    {
        throw new InvalidArgumentException(sprintf('Not a directory ("%s").', $path));
    }
    if (!is_writable($path))
    {
        throw new InvalidArgumentException(sprintf('Directory ("%s") is not writeable.', $path));
    }
    $paths = glob($path.'/*.*');
    if (false === $paths)
    {
        throw new Exception(sprintf('Unable to get path list on path "%s" (glob failed).', $path));
    }
    foreach ($paths as $v)
    {
        unlink($v);
    }
}
function copy_directory($source, $target)
{
    $source = rtrim($source, '/');
    $target = rtrim($target, '/');
    if (!is_dir($source))
    {
        throw new InvalidArgumentException(sprintf('Source ("%s") is not a valid directory.', $source));
    }
    if (!is_readable($source))
    {
        throw new InvalidArgumentException(sprintf('Source ("%s") is not readable.', $source));
    }
    if (!is_dir($target))
    {
        mkdir($target);
    }
    if (!is_dir($target))
    {
        throw new InvalidArgumentException(sprintf('Target ("%s") is not a valid directory.', $target));
    }
    if (!is_writable($target))
    {
        throw new InvalidArgumentException(sprintf('Target ("%s") is not writeable.', $target));
    }
    $dirs = array('');
    while (count($dirs))
    {
        $dir = array_shift($dirs);
        // make sure the corresponding target subdirectory exists before copying into it
        if ($dir !== '' && !is_dir($target.'/'.$dir) && !mkdir($target.'/'.$dir))
        {
            throw new Exception(sprintf('Unable to create target directory "%s".', $target.'/'.$dir));
        }
        $base = $source.'/'.$dir;
        $d = dir($base);
        if (!$d)
        {
            throw new Exception(sprintf('Unable to open directory "%s".', $base));
        }
        while (false !== ($entry = $d->read()))
        {
            // skip self and parent directories
            if (in_array($entry, array('.', '..')))
            {
                continue;
            }
            // put subdirectories on stack
            if (is_dir($base.'/'.$entry))
            {
                $dirs[] = $dir.'/'.$entry;
                continue;
            }
            // copy file
            $from = $base.'/'.$entry;
            $to = $target.'/'.$dir.'/'.$entry;
            $result = copy($from, $to);
            if (!$result)
            {
                throw new Exception(sprintf('Failed to copy file (from "%s" to "%s").', $from, $to));
            }
        }
        $d->close();
    }
}
This example basically introduces two functions, one to empty a directory and another to copy the contents of one directory to another. Both functions now throw exceptions with more or less useful descriptions of what happened. I tried to reveal errors early, by checking input parameters and performing some additional tests.
The empty_directory function might be a bit short. I don't know, for example, whether unlink would work if there are subdirectories and those aren't empty. I leave this as an exercise for you.
The copy_directory function works non-recursively. This is done by maintaining a stack of directories to process. When a subdirectory is found, it is put on the stack and processed after all files of the current directory have been copied. This helps to avoid switching directories too often and is normally faster. But as you can see, it's very similar to your code.
So these functions concentrate on the file-system work. As it's clear what they do and what they are for, you can concentrate inside your own function on the main work, like the logic to determine in which direction the copying has been done. As the helper functions now throw exceptions, you can catch those, too. Additionally, you could verify that $source and $target actually contain values that you explicitly allow. For example, you probably don't want .. or / inside them, just characters from a-z.
This will help you as well to find other causes of error, like overwriting attempts etc.:
function backupOrRestoreWorld($source, $target)
{
    $basePath = 'path/to/server';
    $world = 'world';
    $pattern = '[a-z]+';
    try
    {
        if (!preg_match("/^{$pattern}\$/", $source))
        {
            throw new InvalidArgumentException('Invalid source path.');
        }
        if (!preg_match("/^{$pattern}\$/", $target))
        {
            throw new InvalidArgumentException('Invalid target path.');
        }
        if ($source === $target)
        {
            throw new InvalidArgumentException('Cannot back up or restore to itself.');
        }
        $targetPath = $basePath.'/'.$target;
        if (is_dir($targetPath))
        {
            empty_directory($targetPath);
        }
        copy_directory($basePath.'/'.$source, $targetPath);
        if ($source === $world)
        {
            return "World backed up.";
        }
        else
        {
            return "World restored from backup.";
        }
    }
    catch (Exception $e)
    {
        return 'World not backed up. Error: '.$e->getMessage();
    }
}
In the example, backupOrRestoreWorld still acts like the original function but now returns an error message and more specifically checks for error conditions in its own logic. It's a bit borked because it converts exceptions into the return value, which mixes two sorts of error handling (which you might not want), but it's a compromise to interface with your existing code (the façade) while covering its own input validation with exceptions.
Additionally, the values it works with (parameters) are specified at the top of the function instead of later on in the function's code.
Hope this helps. What's left?
The copy_directory function could check that, if the target directory already exists, it is empty. Otherwise it would not truly copy a world, but mix two worlds.
The empty_directory function should be properly checked to make sure it actually empties a directory in a fail-safe manner, especially when dealing with subdirectories.
You can make up your mind about a general way of doing error handling inside your application so that you can more easily deal with errors inside your application.
You can think about creating objects to deal with storing and retrieving worlds so things can be extended easily in the long run.
1. Using the @ operator will lead you into trouble. NEVER use it. If a file may not exist, CHECK IT first!
if (!file_exists($target)) mkdir($target);
2. I am quite surprised that you use glob in the first part but not in the second.
3. Your "clean target directory first" code won't clean subdirectories.
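To also clean subdirectories, a depth-first delete with SPL iterators is a common approach; a sketch (the helper name is mine, not from the post):

```php
<?php
// Sketch: delete a directory tree depth-first, so children are removed
// before their parent directories (rmdir only works on empty dirs).
function deleteTree($path)
{
    $it = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator($path, FilesystemIterator::SKIP_DOTS),
        RecursiveIteratorIterator::CHILD_FIRST
    );
    foreach ($it as $item) {
        $item->isDir() ? rmdir($item->getPathname()) : unlink($item->getPathname());
    }
    rmdir($path);
}
```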
I used this, from here: http://codestips.com/php-copy-directory-from-source-to-destination/
<?php
function copy_directory( $source, $destination ) {
if ( is_dir( $source ) ) {
mkdir( $destination );
$directory = dir( $source );
while ( FALSE !== ( $readdirectory = $directory->read() ) ) {
if ( $readdirectory == '.' || $readdirectory == '..' ) {
continue;
}
$PathDir = $source . '/' . $readdirectory;
if ( is_dir( $PathDir ) ) {
copy_directory( $PathDir, $destination . '/' . $readdirectory );
continue;
}
copy( $PathDir, $destination . '/' . $readdirectory );
}
$directory->close();
}else {
copy( $source, $destination );
}
}
?>
This works as long as you don't try to copy a directory inside itself, which would cause an infinite loop.
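One way to guard against that case is to check, before copying, whether the destination resolves to a path at or below the source; a sketch (the helper name is mine):

```php
<?php
// Sketch: detect whether $destination lies inside $source, which would
// make a recursive copy loop forever.
function isInsideItself($source, $destination)
{
    $src = realpath($source);
    // The destination may not exist yet, so resolve its parent directory.
    $dstParent = realpath(dirname($destination));
    if ($src === false || $dstParent === false) {
        return false;
    }
    $dst = $dstParent . '/' . basename($destination);
    // True when $dst is $src itself or sits below it.
    return $dst === $src || strpos($dst . '/', $src . '/') === 0;
}
```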
I have some recursive copying code that follows, but first a few thoughts.
Conversely to what some others think, I believe the file system functions are about the only place where the @ operator makes sense. It allows you to raise your own exceptions rather than having to handle the functions' built-in warnings. With all file system calls you should check for failure conditions.
I would not do something that was suggested to you:
if (!file_exists($target)) mkdir($target);
(which assumes that mkdir will succeed). Always check for failure; the file_exists check does not cover the complete set of cases where mkdir would fail (e.g. broken symlinks, inaccessible safe-mode files).
You must always decide how you will handle exceptions, and what are the initial conditions required for your function to succeed. I treat any operation that fails with the file system as an exception which should be caught and dealt with.
Here is the recursive copying code that I use:
/** Copy file(s) recursively from source to destination.
* \param from \string The source of the file(s) to copy.
* \param to \string The destination to copy the file(s) into.
* \param mode \int Octal integer to specify the permissions.
*/
public function copy($from, $to, $mode=0777)
{
    if (is_dir($from))
    {
        // Recursively copy the directory.
        $rDir = new RecursiveDirectoryIterator(
            $from, FilesystemIterator::SKIP_DOTS);
        $rIt = new RecursiveIteratorIterator(
            $rDir, RecursiveIteratorIterator::SELF_FIRST);
        // Make the directory - recursively creating the path required.
        if (!@mkdir($to, $mode, true))
        {
            throw new Exception(
                __METHOD__ .
                ' Unable to make destination directory: ' . var_export($to, true));
        }
        foreach ($rIt as $file)
        {
            $src = $file->getPathname();
            $dest = $to . '/' . $rIt->getInnerIterator()->getSubPathname();
            if (is_dir($src))
            {
                if (!@mkdir($dest, $mode))
                {
                    throw new Exception(
                        __METHOD__ .
                        ' From: ' . $from . ' To: ' . $to .
                        ' Copying subdirectory from: ' . $src . ' to: ' . $dest);
                }
            }
            else
            {
                if (!@copy($src, $dest))
                {
                    throw new Exception(
                        __METHOD__ .
                        ' From: ' . $from . ' To: ' . $to .
                        ' Copying file from: ' . $src . ' to: ' . $dest);
                }
            }
        }
    }
    else
    {
        if (!@copy($from, $to))
        {
            throw new Exception(
                __METHOD__ .
                ' Copying single file from: ' . $from . ' to: ' . $to);
        }
    }
}
Before copying all files, you must make the "/destination" directory writable (e.g. set its permissions to 0777).
$dst = '/destination'; // destination path
$src = '/source'; // source path
$files = glob($src.'/*.*');
foreach($files as $file){
$file_to_go = str_replace($src, $dst, $file);
copy($file, $file_to_go);
}
$dir_array = array();
$dir_array[] = $src;
while ($dir_array != null) {
$newDir = array ();
foreach ($dir_array as $d) {
$results = scandir($d);
foreach ($results as $r) {
if ($r == '.' or $r == '..') continue;
if (is_file($d . '/' . $r)) {
} else {
$path = $d.'/'.$r;
$newDir[] = $path;
$new = str_replace($src, $dst, $path);
mkdir($new, 0777, true);
$files = glob($path.'/*.*');
foreach($files as $file) {
$file_to_go = str_replace($src, $dst, $file);
copy($file, $file_to_go);
}
continue;
}
}
}
$dir_array = $newDir;
}
all done.
Thanks.
I have a zip file containing one folder, that contains more folders and files, like this:
myfile.zip
-firstlevel
--folder1
--folder2
--folder3
--file1
--file2
Now, I want to extract this file using PHP's ZipArchive, but without the "firstlevel" folder. At the moment, the results look like this:
destination/firstlevel/folder1
destination/firstlevel/folder2
...
The result I'd like to have would look like this:
destination/folder1
destination/folder2
...
I've tried extractTo, which produces the first mentioned result, and copy(), as suggested here, but this doesn't seem to work at all.
My current code is here:
if($zip->open('myfile.zip') === true) {
$firstlevel = $zip->getNameIndex(0);
for($i = 0; $i < $zip->numFiles; $i++) {
$entry = $zip->getNameIndex($i);
$pos = strpos($entry, $firstlevel);
if ($pos !== false) {
$file = substr($entry, strlen($firstlevel));
if(strlen($file) > 0){
$files[] = $file;
}
}
}
//attempt 1 (extractTo):
//$zip->extractTo('./test', $files);
//attempt 2 (copy):
foreach($files as $filename){
copy('zip://'.$firstlevel.'/'.$filename, 'test/'.$filename);
}
}
How can I achieve the result I'm aiming for?
Take a look at my Quick Unzipper script. I wrote it for personal use a while back when uploading large zip files to a server. It was for a backup, and thousands of files take forever over FTP, so using a zip file was faster. I use Git and everything, but there wasn't another option for me. I place this PHP file in the directory I want the files to go, and put the zip file in the same directory. For my script, everything has to operate in the same directory. That was an easy way to secure it for my needs, as everything I needed was in the same dir.
Quick Unzipper: https://github.com/incomepitbull/QuickUnzipper/blob/master/unzip.php
I linked the file because I am not showcasing the repo, just the code that makes the unzip tick. With modern versions of PHP, there shouldn't be anything that isn't included in your setup, so you shouldn't need to make any server config changes to use this.
Here is the PHP Doc for the ZipArchive class it uses: http://php.net/manual/en/class.ziparchive.php
There isn't any built-in way to do what you want, which is a shame. So I would unzip the file to a temp directory, then use another function to copy the contents to where you want them. When using ZipArchive, you will need to read the first index entry to get the folder name if it is unknown. If the folder name is known, i.e. the same pesky folder name every time, you could hard-code it.
I have made it return the first item from the index. So if you ALWAYS have a zip with one folder inside it, and everything in that folder, this will work. However, if you have a zip file without everything consolidated inside one folder, it will fail. The code I have added takes care of your question; you will need to add further logic to handle alternate cases.
Also, you will still be left with the old directory from when we extracted to the temp directory for "processing", so I included code to delete it too.
NOTE: The code uses a lot of ifs to show the processing steps and print a message for testing purposes. You will need to modify it to your needs.
<?php
function copyDirectoryContents($source, $destination, $create=false)
{
    if ( ! is_dir($source) ) {
        return false;
    }
    if ( ! is_dir($destination) && $create === true ) {
        @mkdir($destination);
    }
    if ( is_dir($destination) ) {
        $files = array_diff(scandir($source), array('.','..'));
        foreach ($files as $file)
        {
            if ( is_dir("$source/$file") ) {
                copyDirectoryContents("$source/$file", "$destination/$file", true);
            } else {
                @copy("$source/$file", "$destination/$file");
            }
        }
        return true;
    }
    return false;
}
function removeDirectory($directory, $options=array())
{
    if (!isset($options['traverseSymlinks']))
        $options['traverseSymlinks'] = false;
    $files = array_diff(scandir($directory), array('.','..'));
    foreach ($files as $file)
    {
        if (is_dir("$directory/$file"))
        {
            if (!$options['traverseSymlinks'] && is_link("$directory/$file")) {
                unlink("$directory/$file");
            } else {
                removeDirectory("$directory/$file", $options);
            }
        } else {
            unlink("$directory/$file");
        }
    }
    return rmdir($directory);
}
$file = dirname(__FILE__) . '/file.zip'; // full path to zip file needing extracted
$temp = dirname(__FILE__) . '/zip-temp'; // full path to temp dir to process extractions
$path = dirname(__FILE__) . '/extracted'; // full path to final destination to put the files (not the folder)
$firstDir = null; // holds the name of the first directory
$zip = new ZipArchive;
$res = $zip->open($file);
if ($res === TRUE) {
$firstDir = $zip->getNameIndex(0);
$zip->extractTo($temp);
$zip->close();
$status = "<strong>Success:</strong> '$file' extracted to '$temp'.";
} else {
$status = "<strong>Error:</strong> Could not extract '$file'.";
}
echo $status . '<br />';
if ( empty($firstDir) ) {
echo 'Error: first directory was empty!';
} else {
$firstDir = realpath($temp . '/' . $firstDir);
echo "First Directory: $firstDir <br />";
if ( is_dir($firstDir) ) {
if ( copyDirectoryContents($firstDir, $path, true) ) {
echo 'Directory contents copied!<br />';
if ( removeDirectory($temp) ) {
echo 'Temp directory deleted!<br />';
echo 'Done!<br />';
} else {
echo 'Error deleting temp directory!<br />';
}
} else {
echo 'Error copying directory contents!<br />';
}
} else {
echo 'Error: Could not find first directory';
}
}
I'm not a native PHP developer but I make do with what I can find and hack together out there, so please excuse me if this doesn't make too much sense:
I have a simple (so it seems) script that takes two arguments in the URL. One is a simple string (title of the ZIP to be created) and the other is a serialized array of audio tracks that point to the files on the server that need zipping. These then pass just fine and are unserialized etc. Here's the script:
<?php
$audioTitleVar = unserialize(rawurldecode($_GET['audioTitle']));
$audioArrayVar = unserialize(rawurldecode($_GET['audioArray']));
function create_zip( $files = array(), $destination = '', $overwrite = true ) {
if(file_exists($destination) && !$overwrite) { return false; }
$valid_files = array();
if(is_array($files)) {
foreach($files as $file) {
if( file_exists($_SERVER['DOCUMENT_ROOT'] . str_replace("http://mydomain.com","", $file)) ) {
$valid_files[] = $_SERVER['DOCUMENT_ROOT'] . str_replace("http://mydomain.com","", $file);
}
}
}
if(count($valid_files)) {
$zip = new ZipArchive();
if($zip->open($destination,$overwrite ? ZIPARCHIVE::OVERWRITE : ZIPARCHIVE::CREATE) !== true) {
return false;
}
foreach( $valid_files as $file ) {
$just_name = preg_replace("/(.*)\/?([^\/]+)/","$2",$file);
$zip->addFile($file,$just_name);
//$zip->addFile($file,$file);
}
//echo '<p>The zip archive contains ' . $zip->numFiles . ' files with a status of ' . $zip->status . '</p>';
$zip->close();
return file_exists($destination);
} else {
return false;
}
}
$fileName = $_SERVER['DOCUMENT_ROOT'] . '/wp-content/themes/jones/zip/' . $audioTitleVar . '.zip';
create_zip( $audioArrayVar, $fileName, true );
//echo '<p>File Path: ' . $fileName . '</p>';
var_dump(file_exists($fileName));
?>
I guess my real issue here is that, although the script DOES NOT error, no ZIP is created. Over time I have placed outputs in the function at certain points to see if it was even getting there, and it is, so I'm stumped.
What I really need is for one of you guys to scan the script over to see if there's anything glaringly obvious that just won't work!
Could it be a permissions thing? Should PHP be running in CGI mode or Apache? Or does this not make a difference?
The bones of this script were taken from: http://davidwalsh.name/create-zip-php but it's never worked even with that version alone.
PS - I'd just like to add that if I uncomment the line that reports how many files are in the ZIP, it seems to return the correct info... but the final check on whether the file exists returns false.
Ok, I have finally got it working.
It seems there was something in the addFile() call that was breaking the archive.
I changed this:
foreach( $valid_files as $file ) {
$just_name = preg_replace("/(.*)\/?([^\/]+)/","$2",$file);
$zip->addFile($file,$just_name);
//$zip->addFile($file,$file);
}
To this:
foreach( $valid_files as $file ) {
// Use basename to get JUST the file name from the path.
$zip->addFile($file, basename($file) );
}
The basename function gets only the file NAME to place in the ZIP. Before, it was the whole path on the server, and this was causing Windows to choke on it.
Hope this helps someone someday.
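For illustration, the effect of basename on a full server path (the path below is hypothetical):

```php
<?php
// basename() strips the directory portion and keeps only the final
// path component, which is what you want as the in-archive entry name.
$path = '/var/www/site/audio/track01.mp3';  // hypothetical full path
$entryName = basename($path);               // 'track01.mp3'
```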
I have this code so far, which works perfectly but relies on the directory being in place:
$path = '/home/sites/therealbeercompany.co.uk/public_html/public/themes/trbc/images/backgrounds/'.$this->slug;
$bgimagearray = array();
$iterator = new DirectoryIterator($path);
foreach ($iterator as $fileinfo) {
if ($fileinfo->isFile() && preg_match('/\.jpg$/', $fileinfo->getFilename())) {
$bgimagearray[] = "'" . $fileinfo->getFilename() . "'";
}
}
I need to work in a bit at the top so that if the directory doesn't exist, it defaults to the images sitting in the root of the backgrounds directory...
Any help would be appreciated.
You want is_dir. Test your slug directory, and if it doesn't exist, use the root background directory instead.
Use is_dir to see if the dir is there, and if not, set $path to the current path (where the script is running from)
if (!is_dir($path)) {
$path = $_SERVER["PATH_TRANSLATED"];
}
I very much dangerously assumed that $path is not going to be used anywhere else :)
(is_dir is better, thanks!)
DirectoryIterator will throw an UnexpectedValueException when the path cannot be opened, so you can wrap the call in a try/catch block and then fall back to the root path. In a function:
function getBackgroundImages($path, $rootPath = NULL)
{
$bgImages = array();
try {
$iterator = new DirectoryIterator($path);
// foreach code
} catch(UnexpectedValueException $e) {
if($rootPath === NULL) {
throw $e;
}
$bgImages = getBackgroundImages($rootPath);
}
return $bgImages;
}
But of course file_exists or is_dir are valid options too.
You could also use the file_exists function to check for the directory.
I wrote a basic content-management system for my website, including an administration panel. I understand basic file IO as well as copying via PHP, but my attempts at a backup script callable from the script have failed. I tried doing this:
//... authentication, other functions
foreach(scandir($homedir) as $buffer){
    if(is_dir($buffer)){
        //Add $buffer to an array
    }
    else{
        //Back up the file
    }
}
foreach($founddirectories as $dir){
    foreach(scandir($dir) as $b){
        //Backup as above, adding to $founddirectories
    }
}
But it did not seem to work.
I know that I can do this using FTP, but I want a completely server-side solution that can be accessed anywhere with sufficient authorization.
Here is an alternative though: why don't you Zip the source directory instead?
function Zip($source, $destination)
{
if (extension_loaded('zip') === true)
{
if (file_exists($source) === true)
{
$zip = new ZipArchive();
if ($zip->open($destination, ZIPARCHIVE::CREATE) === true)
{
$source = realpath($source);
if (is_dir($source) === true)
{
$files = new RecursiveIteratorIterator(new RecursiveDirectoryIterator($source), RecursiveIteratorIterator::SELF_FIRST);
foreach ($files as $file)
{
$file = realpath($file);
if (is_dir($file) === true)
{
$zip->addEmptyDir(str_replace($source . '/', '', $file . '/'));
}
else if (is_file($file) === true)
{
$zip->addFromString(str_replace($source . '/', '', $file), file_get_contents($file));
}
}
}
else if (is_file($source) === true)
{
$zip->addFromString(basename($source), file_get_contents($source));
}
}
return $zip->close();
}
}
return false;
}
You can even unzip it afterwards and achieve the same effect, although I must say I prefer having my backups compressed in zip format.
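The round trip can be sketched in a few lines; the file names below are placeholders, not from the post:

```php
<?php
// Sketch: create a small zip, then extract it into another directory.
$zipFile = sys_get_temp_dir() . '/demo_backup.zip';  // hypothetical name
@unlink($zipFile);                                   // start fresh
$zip = new ZipArchive();
if ($zip->open($zipFile, ZipArchive::CREATE) === true) {
    $zip->addFromString('hello.txt', 'hello world'); // stand-in for real files
    $zip->close();
}
// Restoring is just extractTo() on the destination directory.
$dest = sys_get_temp_dir() . '/demo_restore';
$zip = new ZipArchive();
if ($zip->open($zipFile) === true) {
    $zip->extractTo($dest);
    $zip->close();
}
```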
If you have access to execute the tar binary through the exec function, it would be faster and better, I think:
exec('tar -zcvf backup.tar.gz ' . escapeshellarg(realpath('some directory')));
or
chdir('some directory');
exec('tar -zcvf backup.tar.gz ./*');
You can use recursion.
foreach(scandir($dir) as $dir_contents){
    if(is_dir($dir_contents)){
        backup($dir_contents);
    }else{
        //back up the file
    }
}
You got it almost right:
$dirs = array($homedir);
$files = array();
while(count($dirs)) {
$dir = array_shift($dirs);
foreach(glob("$dir/*") as $e)
if(is_dir($e))
$dirs[] = $e;
else
$files[] = $e;
}
// here $files[] contains all files from $homedir and below
glob() is better than scandir() because of more consistent output
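The "more consistent output" point can be seen side by side (the directory name below is just for the demo):

```php
<?php
// scandir() returns bare entry names and includes the '.' and '..' entries;
// glob() returns ready-to-use paths and skips the dot entries, so its
// results can be fed straight into is_dir()/copy() without re-prefixing.
$dir = sys_get_temp_dir() . '/gr_glob_demo_' . getmypid();
@mkdir($dir);
touch($dir . '/a.txt');

$fromScandir = scandir($dir);      // ['.', '..', 'a.txt'], names only
$fromGlob    = glob($dir . '/*');  // [$dir . '/a.txt'], full paths
```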
I use something called UPHP. Just call zip() to do it:
<?php
include "uphplib.php";
$folder = "data";
$dest = "backup/backup.zip";
zip($folder, $dest);
?>
UPHP is a PHP library. download: here
Here is a backup script with ftp, scp, mysqldump, pg_dump and filesystem capabilities https://github.com/skywebro/php-backup
I use a simple function to back up a file:
<?php
$oldfile = 'myfile.php';
$newfile = 'backuped.php';
copy($oldfile, $newfile) or die("Unable to backup");
echo 'Backup is Completed';
?>
I'm writing a photo gallery script in PHP and have a single directory where the user will store their pictures. I'm attempting to set up page caching and have the cache refresh only if the contents of the directory has changed. I thought I could do this by caching the last modified time of the directory using the filemtime() function and compare it to the current modified time of the directory. However, as I've come to realize, the directory modified time does not change as files are added or removed from that directory (at least on Windows, not sure about Linux machines yet).
So my question is: what is the simplest way to check if the contents of a directory have been modified?
As already mentioned by others, a better way to solve this would be to trigger a function whenever a particular event changes the folder.
However, if your server is a unix, you can use inotifywait to watch the directory, and then invoke a PHP script.
Here's a simple example:
#!/bin/sh
inotifywait --recursive --monitor --quiet --event modify,create,delete,move --format '%f' /path/to/directory/to/watch |
while read FILE ; do
php /path/to/trigger.php $FILE
done
See also: http://linux.die.net/man/1/inotifywait
What about touching the directory after a user has submitted his image?
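That idea takes only a couple of lines; a sketch (the directory name is hypothetical):

```php
<?php
// Sketch: after an upload lands, bump the directory's mtime explicitly,
// so a cached page can compare filemtime() of the directory later.
$galleryDir = sys_get_temp_dir() . '/gr_gallery_' . getmypid();
@mkdir($galleryDir);
// ... after move_uploaded_file() has placed the new image ...
touch($galleryDir);          // update the directory's modification time
clearstatcache();            // drop PHP's cached stat data before re-reading
$lastChanged = filemtime($galleryDir);
```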
Changelog says: Requires php 5.3 for windows to work, but I think it should work on all other environments
with inotifywait inside php
$watchedDir = 'watch';
$in = popen("inotifywait --monitor --quiet --format '%e %f' --event create,moved_to '$watchedDir'", 'r');
if ($in === false)
throw new Exception ('fail start notify');
while (($line = fgets($in)) !== false)
{
list($event, $file) = explode(' ', rtrim($line, PHP_EOL), 2);
echo "$event $file\n";
}
Uh. I'd simply store the md5 of a directory listing. If the contents change, the md5(directory-listing) will change. You might get the very occasional md5 clash, but I think that chance is tiny enough..
Alternatively, you could store a little file in that directory that contains the "last modified" date. But I'd go with md5.
PS. on second thought, seeing as how you're looking at performance (caching) requesting and hashing the directory listing might not be entirely optimal..
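A minimal version of the md5-of-a-listing idea (the function name is mine); note it reacts to names and mtimes, not to content edits that preserve both:

```php
<?php
// Sketch: fingerprint a directory by hashing its sorted listing plus each
// entry's mtime; adding, removing, or touching a file changes the hash.
function directoryFingerprint($path)
{
    $entries = array_diff(scandir($path), array('.', '..'));
    sort($entries);
    $listing = '';
    foreach ($entries as $entry) {
        $listing .= $entry . '|' . filemtime($path . '/' . $entry) . ';';
    }
    return md5($listing);
}
```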
IMO edubem's answer is the way to go, however you can do something like this:
if (sha1(serialize(Map('/path/to/directory/', true))) != /* previous stored hash */)
{
// directory contents has changed
}
Or a more weak / faster version:
if (Size('/path/to/directory/', true) != /* previous stored size */)
{
// directory contents has changed
}
Here are the functions used:
function Map($path, $recursive = false)
{
$result = array();
if (is_dir($path) === true)
{
$path = Path($path);
$files = array_diff(scandir($path), array('.', '..'));
foreach ($files as $file)
{
if (is_dir($path . $file) === true)
{
$result[$file] = ($recursive === true) ? Map($path . $file, $recursive) : Size($path . $file, true);
}
else if (is_file($path . $file) === true)
{
$result[$file] = Size($path . $file);
}
}
}
else if (is_file($path) === true)
{
$result[basename($path)] = Size($path);
}
return $result;
}
function Size($path, $recursive = true)
{
$result = 0;
if (is_dir($path) === true)
{
$path = Path($path);
$files = array_diff(scandir($path), array('.', '..'));
foreach ($files as $file)
{
if (is_dir($path . $file) === true)
{
$result += ($recursive === true) ? Size($path . $file, $recursive) : 0;
}
else if (is_file($path . $file) === true)
{
$result += sprintf('%u', filesize($path . $file));
}
}
}
else if (is_file($path) === true)
{
$result += sprintf('%u', filesize($path));
}
return $result;
}
function Path($path)
{
if (file_exists($path) === true)
{
$path = rtrim(str_replace('\\', '/', realpath($path)), '/');
if (is_dir($path) === true)
{
$path .= '/';
}
return $path;
}
return false;
}
Here's what you may try: store all pictures in a single directory (or in /username subdirectories inside it, to speed things up and lessen the stress on the FS) and set up Apache (or whatever you're using) to serve them as static content with "expires-on" set to 100 years in the future. File names should contain some unique prefix or suffix (timestamp, SHA1 hash of the file content, etc.), so whenever the user changes a file its name changes, and Apache will serve the new version, which will get cached along the way.
You're thinking the wrong way.
You should execute your directory indexer script as soon as someone's uploaded a new file and it's moved to the target location.
Try deleting the cached version when a user uploads a file to his directory.
When someone tries to view the gallery, look if there's a cached version first. If there's a cached version, load it, otherwise, generate the page, cache it, done.
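The flow described above, as a sketch (names are placeholders, not from the post):

```php
<?php
// Sketch: serve a cached gallery page if one exists, otherwise generate,
// cache, and return it.
function renderGallery($cacheFile)
{
    if (is_file($cacheFile)) {
        return file_get_contents($cacheFile);      // cache hit
    }
    $html = '<ul><li>generated gallery</li></ul>'; // stand-in for real generation
    file_put_contents($cacheFile, $html);          // cache for next request
    return $html;
}
```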
I was looking for something similar and I just found this:
http://www.franzone.com/2008/06/05/php-script-to-monitor-ftp-directory-changes/
For me looks like a great solution since I'll have a lot of control (I'll be doing an AJAX call to see if anything changed).
Hope that this helps.
Here is a code sample that will return 0 if the directory was changed.
I use it in backups.
The changed status is determined by the presence of files and their file sizes.
You can easily change this to compare file contents by replacing
$longString .= filesize($file);
with
$longString .= crc32(file_get_contents($file));
but it will affect execution speed.
#!/usr/bin/php
<?php
$dirName = $argv[1];
$basePath = '/var/www/vhosts/majestichorseporn.com/web/';
$dataFile = './backup_dir_if_changed.dat';
# startup checks
if (!is_writable($dataFile))
die($dataFile . ' is not writable!');
if (!is_dir($basePath . $dirName))
die($basePath . $dirName . ' is not a directory');
$dataFileContent = file_get_contents($dataFile);
$data = @unserialize($dataFileContent);
if ($data === false)
$data = array();
# find all files and concatenate their sizes to calculate crc32
$files = glob($basePath . $dirName . '/*', GLOB_BRACE);
$longString = '';
foreach ($files as $file) {
$longString .= filesize($file);
}
$longStringHash = crc32($longString);
# do changed check
if (isset ($data[$dirName]) && $data[$dirName] == $longStringHash)
die('Directory did not change.');
# save hash to DB
$data[$dirName] = $longStringHash;
file_put_contents($dataFile, serialize($data));
die('0');