I'm creating a PHP site where a company will upload a lot of images. I'd like each folder to hold up to 500-1000 files, and PHP should automatically create a new one once the previous folder contains more than 1000 files.
For example, px300 has a folder dir1 which stores 500 files; then a new one, dir2, will be created.
Are there any existing solutions?
This task is simple enough not to require an existing solution. You can use scandir to count the number of files in a directory, and mkdir to create a new one.
// Make sure we don't count . and .. as proper directories
if (count(scandir("dir")) - 2 > 1000) {
    mkdir("newdir");
}
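Putting the two together, here is a rough sketch that picks (or creates) the current upload directory. The dir1, dir2, ... naming and the 1000-file limit come from the question; px300 is its example root:
// Find the highest-numbered dirN that still has room, creating the next one when full
function current_upload_dir($root, $limit = 1000) {
    $n = 1;
    // scandir() includes . and .., hence the - 2
    while (is_dir("$root/dir$n") && count(scandir("$root/dir$n")) - 2 >= $limit) {
        $n++;
    }
    if (!is_dir("$root/dir$n")) {
        mkdir("$root/dir$n", 0755, true);
    }
    return "$root/dir$n";
}
$target = current_upload_dir("px300");
// move_uploaded_file($_FILES['image']['tmp_name'], "$target/" . $_FILES['image']['name']);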
A common approach is to create one-letter directories based on the file name. This works particularly well if you assign random names to files (and random names are good to avoid name conflicts in user uploads):
/files/a/c/acbd18db4cc2f85cedef654fccc4a4d8
/files/3/7/37b51d194a7513e45b56f6524f2d51f2
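A minimal sketch of that layout, assuming the random name is an md5 hash; the /files root and the helper name are placeholders:
// Derive a sharded storage path from a randomly generated name
function sharded_path($root, $originalName) {
    // random, collision-resistant name: md5 of the original name plus some entropy
    $name = md5($originalName . uniqid('', true));
    // the first two characters become two one-letter directory levels
    $dir = $root . '/' . $name[0] . '/' . $name[1];
    if (!is_dir($dir)) {
        mkdir($dir, 0755, true); // create missing levels recursively
    }
    return $dir . '/' . $name; // e.g. /files/a/c/acbd18db4cc2f85cedef654fccc4a4d8
}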
Alternatively, you can count the files with opendir()/readdir() and create a new directory once the count passes the limit:
if ($h = opendir('/dir')) {
    $files = 0;
    while (false !== ($file = readdir($h))) {
        if ($file === '.' || $file === '..') {
            continue; // don't count . and ..
        }
        $files++;
    }
    closedir($h);
    if ($files > 1000) {
        // create dir
        mkdir('/newdir');
    }
}
You could use the glob function. It returns an array of paths matching your pattern, which you can count to get the number of files.
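For example (directory names and the limit are placeholders):
// glob() returns an array of matching paths; count() gives the number of files
if (count(glob("dir/*")) > 1000) {
    mkdir("newdir");
}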
The project I am working on requires creating .tar.gz archives and feeding them to an external service. This external service works only with .tar.gz, so any other archive type is out of the question. The server where the code I am working on will execute does not allow access to system calls, so system, exec, backticks etc. are no bueno. Which means I have to rely on a pure PHP implementation to create the .tar.gz files.
Having done a bit of research, it seems that PharData will be helpful to achieve the result. However I have hit a wall with it and need some guidance.
Consider the following folder layout:
parent folder
- child folder 1
- child folder 2
- file1
- file2
I am using the code snippet below to create the .tar.gz archive, which does the trick, but there's a minor issue with the end result: it doesn't contain the parent folder, only everything within it.
$pd = new PharData('archive.tar');
$dir = realpath("parent-folder");
$pd->buildFromDirectory($dir);
$pd->compress(Phar::GZ); // produces archive.tar.gz
unset($pd);
unlink('archive.tar'); // keep only the .gz
When the archive is created it must contain the exact folder layout mentioned above. Using the above-mentioned code snippet, the archive contains everything except the parent folder, which is a deal breaker for the external service:
- child folder 1
- child folder 2
- file1
- file2
The description of buildFromDirectory does mention the following, so it's understandable that the parent folder isn't contained in the archive:
Construct a tar/zip archive from the files within a directory.
I have also tried using buildFromIterator, but the end result is the same, i.e. the parent folder isn't included in the archive. I was able to get the desired result using addFile, but that is painfully slow.
Having done a bit more research I found the following library: https://github.com/alchemy-fr/Zippy . But this requires Composer support, which isn't available on the server. I'd appreciate it if someone could guide me in achieving the end result. I am also open to using other methods or libraries, so long as it's a pure PHP implementation and doesn't require any external dependencies. Not sure if it helps, but the server where the code will be executed has PHP 5.6.
Use the parent of "parent-folder" as the base for Phar::buildFromDirectory() and use its second parameter to limit the results only to "parent-folder", e.g.:
// Build from the parent of "parent-folder" so "parent-folder" itself becomes the top-level entry
$pd = new PharData('archive.tar');
$parent = dirname(realpath("parent-folder"));
$pd->buildFromDirectory($parent, '#^'.preg_quote("$parent/parent-folder/", "#").'#');
$pd->compress(Phar::GZ);
I ended up having to do this, and as this question is the first result on Google for the problem, here's the optimal way to do it without using a regexp (which does not scale well if you want to extract one directory from a directory that contains many others).
function buildFiles($folder, $dir, $retarr = []) {
    $i = new DirectoryIterator("$folder/$dir");
    foreach ($i as $d) {
        if ($d->isDot()) {
            continue;
        }
        if ($d->isDir()) {
            $newdir = "$dir/" . basename($d->getPathname());
            $retarr = buildFiles($folder, $newdir, $retarr);
        } else {
            $dest = "$dir/" . $d->getFilename();
            $retarr[$dest] = $d->getPathname();
        }
    }
    return $retarr;
}
$out = "/tmp/file.tar";
$sourcedir = "/data/folder";
$subfolder = "folder2";
$p = new PharData($out);
$filemap = buildFiles($sourcedir, $subfolder);
$iterator = new ArrayIterator($filemap);
$p->buildFromIterator($iterator);
$p->compress(\Phar::GZ);
unlink($out); // $out.gz has been created, remove the original .tar
This allows you to pick /data/folder/folder2 from /data/folder, even if /data/folder contains several million OTHER folders. It then creates a tar.gz with the contents all being prepended with the folder name.
I have an HTML form and one of the inputs creates a folder. The folder name is chosen by the website visitor. Every visitor creates his own folder on my website, so they are randomly generated. They are created using PHP code.
Now I would like to write a PHP code to copy a file to all of the child directories regardless the quantity of directories being generated.
I do not wish to keep writing a PHP line for every directory that is created - i.e. inserting the folder name manually (e.g. folder01, xyzfolder, folderabc, etc...) - but rather have it happen automatically.
I Googled but I was unsuccessful. Is this possible? If yes, how can I go about it?
Kindly ignore security, etc... I am testing it internally prior to rolling out on a larger scale.
Thank you
It's sad that I cannot comment, so here goes...
//get the new folder name
$newfolder = $_POST['newfoldername'];
//create it if it does not exist
if (!is_dir("./$newfolder")) {
    mkdir("./$newfolder", 0777, true);
}
//list all folders
$dirname = './';
$dir = opendir($dirname);
while ($file = readdir($dir)) {
    // both checks must be AND-ed, otherwise . and .. slip through
    if ($file != '.' AND $file != '..' AND is_dir($dirname.$file)) {
        //generate a random name
        $str = 'abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789';
        $randomname = str_shuffle($str);
        $actualdir = $dirname.$file;
        //copy the uploaded file into this folder ($_FILES['uploadedfile'] is the assumed upload field)
        copy($_FILES['uploadedfile']['tmp_name'], $actualdir.'/'.$randomname);
    }
}
closedir($dir);
I just want to say, you seem to be a bit lazy in looking for what you want to do, because when I read "I would like to write a PHP code to copy", the answer is in your sentence: copy in PHP, and a list of folders regardless of how many? Then just simply list them!
Maybe you need to learn how to use Google... If you search for "I would like to write a PHP code to copy a file to all of the child directories regardless the quantity of directories being generated", sure, you will never find anything.
I have a PHP/MySQL website where users create listings and upload multiple images (3 image sizes for each). I made the mistake of storing the images in folders as below:
images/listings/listing_id/image-name.jpg
images/listings/listing_id/thumbs/image-name.jpg
images/listings/listing_id/large/image-name.jpg
Unfortunately I have now come across the problem that the maximum number of subdirectories is 30,000, and my code breaks.
I want to now change my folder structure to the one below:
images/listings/yyyy/mm/dd/listing_id/image-name.jpg
images/listings/yyyy/mm/dd/listing_id/thumbs/image-name.jpg
images/listings/yyyy/mm/dd/listing_id/large/image-name.jpg
I decided the best way forward would be to create a PHP script to loop over all directories in the 'images/listings/' folder and copy every image into a new directory as specified above. For each 'listing_id' folder, I would need to look up the listing_id using MySQL, get the created_date, and then split the date to get the yyyy mm dd.
I'm totally lost in creating the PHP script. Would it be possible to rename the existing directory structure without copying and deleting the old images, or would I need to copy the old images, create the new directories, move them and then delete the old folders?
So long as you don't have any listing_ids that are the same as a 4-digit year you should be able to create the new folder structure alongside the existing one.
You seem to already have a plan for how to do it, you should try actually writing some code.
Hint: rename() is the same as 'move'. Don't do "copy and delete" unless you want it to take 10x longer.
Lastly, as a rule of thumb you should try to avoid having any folder with more than about 1000 entries if possible. If there are a lot of filesystem operations that have to use the directory index for a folder that large it can drastically reduce filesystem performance.
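For reference, here is a rough sketch of the plan from the question, assuming a listings table with id and created_date columns (table, column and connection details are assumptions):
// Move each images/listings/<listing_id> folder under yyyy/mm/dd based on created_date
$pdo = new PDO('mysql:host=localhost;dbname=site', 'user', 'pass');
foreach (glob('images/listings/*', GLOB_ONLYDIR) as $old) {
    $listingId = basename($old);
    $stmt = $pdo->prepare('SELECT created_date FROM listings WHERE id = ?');
    $stmt->execute(array($listingId));
    $created = $stmt->fetchColumn();
    if ($created === false) {
        continue; // no matching listing, leave the folder alone
    }
    $new = 'images/listings/' . date('Y/m/d', strtotime($created)) . '/' . $listingId;
    if (!is_dir(dirname($new))) {
        mkdir(dirname($new), 0755, true);
    }
    rename($old, $new); // rename() moves the whole folder, no copy needed
}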
You can use
$origin = "images/listings";
$final = "images/final";
$start = strlen($origin) + 1;
$di = new RecursiveIteratorIterator(new RecursiveDirectoryIterator($origin, RecursiveDirectoryIterator::SKIP_DOTS));
foreach ( $di as $file ) {
if ($file->isFile()) {
$mtime = $file->getMTime();
$date = sprintf("%d/%d/%d", date("Y", $mtime), date("m", $mtime), date("d", $mtime));
$new = sprintf("%s/%s/%s", $final, $date, substr($file, $start));
$dir = dirname($new);
is_dir($dir) or mkdir($dir, null, true);
rename($file, $new);
}
}
For moving the image, you can use PHP's rename() function.
For example
rename("../app/temp/natural.jpeg","../public/public_photo/natural.jpeg");
The above code moves natural.jpeg from the folder ../app/temp/ to the folder ../public/public_photo/.
I made a site... where I am storing user-uploaded files in separate directories, like
user_id = 1
so
img/upload_docs/1/1324026061_1.txt
img/upload_docs/1/1324026056_1.txt
Same way if
user_id = 2
so
img/upload_docs/2/1324026061_2.txt
img/upload_docs/2/1324026056_2.txt
...
n
So if in the future I get 100000 users, then in my upload_docs folder I will have 100000 folders.
And there is no restriction on user uploads, so it can be 1000 files for one user or 10 files - any number of files...
So is this a proper way?
Or if not, can anyone suggest how to store these files in this kind of structure?
What I would do is name the images with UUIDs and create subfolders based on the file names. You can do this pretty easily with chunk_split. For example, if you create a folder every 4 characters you end up with a structure like this:
img/upload_docs/1/1324/0260/61_1.txt
img/upload_docs/1/1324/0260/56_1.txt
By storing the image name 1324026056_1.txt you could then very easily determine where it belongs or where to fetch it using chunk_split.
This is a similar method to how git stores objects.
As code, it could look something like this.
// pass filename ('123456789.txt' from db)
function get_path_from_filename($filename) {
    $path = 'img/upload_docs';
    list($name, $ext) = explode('.', $filename); // remove the extension
    $folders = rtrim(chunk_split($name, 4, '/'), '/'); // '123456789' becomes '1234/5678/9'
    // now we store it here
    $newpath = $path.'/'.$folders.'.'.$ext; // img/upload_docs/1234/5678/9.txt
    return $newpath;
}
Now when you need to deliver the file to the user, use a function based on these steps to recreate where the file is (from the filename, which is still stored as '123456789.txt' in the DB).
Now to deliver or store the file, use get_path_from_filename.
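For example, a usage sketch (the 'doc' upload field name is an assumption):
// Make sure the target directory exists before storing the upload there
$target = get_path_from_filename('1324026061_1.txt'); // img/upload_docs/1324/0260/61_1.txt
if (!is_dir(dirname($target))) {
    mkdir(dirname($target), 0755, true);
}
move_uploaded_file($_FILES['doc']['tmp_name'], $target);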
Another option is to shard the per-user folders by the first digits of the user id, e.g.:
img/upload_docs/1/0/10000/1324026056_2.txt
img/upload_docs/9/7/97555/1324026056_2.txt
img/upload_docs/2/3/23/1324026056_2.txt
I want a web page to display several random images on load. I've thought of several solutions, but I would really like to be able to dump images into a folder without renaming them, and have the web page choose from the images in that folder and display them.
I can imagine a PHP solution that will display random images from a folder, but it will need to look for certain names.
So my next step was to have an SQL database where every image got a key, and 10 keys would be chosen at random by a query - the images associated with them would then be passed into an array that the document will load the elements of.
But now I guess I need to know how to automatically populate an SQL database by having it read a folder?
Insight appreciated; the less I have to reinvent the wheel, the better.
glob() works like globbing on the filesystem, i.e. it supports wildcards:
$files = glob('/path/to/files/*.jpg');
$yourRandomFile = $files[array_rand($files)]; // array_rand() returns a random key, not the value
This will return a random JPG file, based on its extension.
This should do it:
function your_dir($directory)
{
    $results = array();
    $handler = opendir($directory);
    while (false !== ($file = readdir($handler))) {
        if ($file != "." && $file != "..") {
            $results[] = $file;
            //or an sql query for each file
            $sql = mysql_query("SELECT * FROM your_table WHERE your_file = '".$file."'");
            if (mysql_num_rows($sql) !== 0) {
                //your query...
            }
        }
    }
    closedir($handler);
    return $results;
}
$directory = '/path/to/your/directory';
your_dir($directory);
It would probably be better to select the existing files from the DB first, put them into an array and exclude them, rather than running a query for each file.
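A rough sketch of that suggestion, keeping the mysql_* style of the answer above (table and column names are assumptions):
// Load the file names already in the DB once, then skip them while scanning the directory
$known = array();
$res = mysql_query("SELECT your_file FROM your_table");
while ($row = mysql_fetch_assoc($res)) {
    $known[$row['your_file']] = true;
}
$newFiles = array();
foreach (array_diff(scandir('/path/to/your/directory'), array('.', '..')) as $file) {
    if (!isset($known[$file])) {
        $newFiles[] = $file; // only files not yet in the database
    }
}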
You can do it as below (a sketch follows the list).
1. Read the complete directory from which you want the page to load the images.
2. Store the image names in an array.
3. Generate a random number between 0 and the size of the array minus one.
4. Use the generated number as the key into the array and use that image name to output the image.
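A minimal sketch of those steps; the folder path is a placeholder and GLOB_BRACE may not be available on every platform:
$dir = '/path/to/images';
// 1-2. read the directory and collect the image names into an array
$images = glob($dir . '/*.{jpg,jpeg,png,gif}', GLOB_BRACE);
if (!empty($images)) {
    // 3-4. pick a random key and use that entry to output the image
    $random = $images[array_rand($images)];
    echo '<img src="' . htmlspecialchars($random) . '" alt="">';
}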