Finding latest file's name and modification time in PHP

I search a directory with the glob function and get the list of matched files. Then, by checking filemtime of each file, I build a map, sort the map by file date, and finally take the latest file's name and modification time. My code is shown below. It works well for small directories, but it's slow for big ones. Is there a faster or cleverer way?
$fileList = array();
// $id is the argument of this function
$files = glob('myfolder'.DS.'someone'.$id.'*.txt');
if (empty($files)) {
    return 0;
}
foreach ($files as $file) {
    $fileList[filemtime($file)] = $file;
}
if (sizeof($files) > 1) {
    ksort($fileList);
}
$latestFilename = end($fileList);
$fileLastModifDate = filemtime($latestFilename);

I suspect that your "mapping" is actually creating a hash table, so the sort is not as efficient as one might expect. I would try this (this is the basic idea; you can fancy it up):
class file_data
{
    public $time; // or whatever you use
    public $file;
}

$arr = array();
foreach ($files as $file) {
    $new_file = new file_data();
    $new_file->time = filemtime($file);
    $new_file->file = $file;
    array_push($arr, $new_file);
}

function file_object_compare($a, $b)
{
    if ($a->time > $b->time) return -1;
    if ($a->time < $b->time) return 1;
    return 0;
}

// and then you sort
usort($arr, "file_object_compare");

// or, and this is probably better if you only need this particular sorting,
// compare the file names directly by modification time:
function file_mtime_compare($a, $b)
{
    if (filemtime($a) > filemtime($b)) return -1;
    if (filemtime($a) < filemtime($b)) return 1;
    return 0;
}
usort($files, "file_mtime_compare");
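If you only need the newest file, you can also skip sorting altogether and keep track of the maximum in a single pass; a minimal sketch, reusing the glob pattern from the question:
$latestFilename = null;
$fileLastModifDate = 0;
foreach (glob('myfolder'.DS.'someone'.$id.'*.txt') as $file) {
    $mtime = filemtime($file);
    if ($mtime > $fileLastModifDate) {
        // remember the newest file seen so far
        $fileLastModifDate = $mtime;
        $latestFilename = $file;
    }
}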

Related

what is the best way for search in json file in php?

Hi, I have many data files in JSON format in a folder.
Now I want to search for a field in them. My search word may not exist in some of them and may exist in only one of the files.
I have written this function, and if the value does not exist in a file, I call the function again to read another file.
When I echo the result it shows and works fine, but return is not working and no data is returned.
function get_shenavari_in_files($search, $type)
{
    static $counter = 1;
    $darsadi = 0;
    $find = false;
    $file_name = get_files_in_dir(); // make an array of file names
    $file_number = count($file_name) - $counter;
    $file = "files/" . $file_name[$file_number];
    $file_data = read_json($file);
    for ($i = 0; $i < count($file_data); $i++)
    {
        if ($file_data[$i][$type] == $search)
        {
            $darsadi = $file_data[$i]['darsadi'];
            $find = true;
            echo $darsadi; // this works and shows the data
            return $darsadi; // this is my problem: no data is returned
            break;
        }
    }
    if ($find == false)
    {
        $counter++;
        get_shenavari_in_files($search, $type);
    }
}
var_dump(get_shenavari_in_files('Euro','symbol')); // returns NULL
Once you recurse into get_shenavari_in_files, any found value is never returned back to the initial caller, i.e. instead of
if($find == false)
{
...
get_shenavari_in_files($search,$type);
}
you simply need to prepend the function call with a return statement:
if($find == false)
{
...
return get_shenavari_in_files($search,$type);
}
Having said that, I would try a much simpler (and thereby less error-prone) approach, e.g.:
function get_shenavari_in_files($search, $type) {
    $files = glob("files/*.json"); // Get names of all JSON files in a given path
    $matches = [];
    foreach ($files as $file) {
        $data = json_decode(file_get_contents($file), true);
        foreach ($data as $row) {
            if (array_key_exists($type, $row) && $row[$type] == $search) {
                $matches[$file] = $search;
            }
        }
    }
    return $matches;
}
This way, you would be able to eliminate the need for a recursive call to get_shenavari_in_files. Also, the function itself would become more performant because it doesn't have to scan the file system over and over again.
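A hypothetical call, reusing the 'Euro'/'symbol' lookup from the question (the file name in the comment is just an illustration):
$matches = get_shenavari_in_files('Euro', 'symbol');
var_dump($matches); // e.g. array('files/currencies.json' => 'Euro')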

Folders first from RecursiveDirectoryIterator

I wrote this code a long time ago to get files from a folder structure given in $dir.
$return = array(); // initialise the result array before the loop
$recursiveIterator = new RecursiveIteratorIterator(new RecursiveDirectoryIterator($dir), RecursiveIteratorIterator::CHILD_FIRST);
$ritit = new RegexIterator($recursiveIterator, $filter);
foreach ($ritit as $splFileInfo) {
    if (($splFileInfo->getFileName() != ".") && ($splFileInfo->getFileName() != "..")) {
        $path = $splFileInfo->isDir()
            ? array($splFileInfo->getFilename() => array())
            : array($splFileInfo->getFilename());
        for ($depth = $ritit->getDepth() - 1; $depth >= 0; $depth--) {
            $path = array($ritit->getSubIterator($depth)->current()->getFilename() => $path);
        }
        $return = array_merge_recursive($return, $path);
    }
}
And as the title suggests, I want the $return array to list the folders first. I first attempted to correct this with a foreach after the loop, sorting entries into $folders and $files arrays, but this wouldn't change the contents inside the folders when there were multiple levels of children.
Is there a way to modify the above loop so that all folders appear first in the array and files after, including children and children's children?
I don't think you can modify the loop to get the output array the way you want it. Instead, I'd use a recursive sorting function to sort the array after the loop.
First, create a function that defines the logic for sorting elements. In your case, you want the array-type elements to be the first elements in a tier, so the sorting function could look like this:
function dirFirstSorting($a, $b)
{
    if (is_array($a) && is_string($b)) {
        return -1;
    } elseif (is_string($a) && is_array($b)) {
        return 1;
    } else {
        return 0;
    }
}
Then, create a function that recursively sorts the elements of the array:
function sortFilesListing(&$array)
{
    if (is_array($array)) {
        uasort($array, "dirFirstSorting");
        array_walk($array, "sortFilesListing");
    }
}
All you need to do now is to call the sortFilesListing function with the $return array:
sortFilesListing($return);
The $return array elements should now be sorted accordingly.
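For illustration (these file and folder names are made up), each tier is reordered so that the sub-arrays come first:
// before: array('readme.txt', 'img' => array('logo.png'), 'notes.txt')
// after:  array('img' => array('logo.png'), 'readme.txt', 'notes.txt')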

After using usort on a DirectoryIterator, method calls don't seem to work anymore

I am trying to create a function that removes the oldest files based on date, for a max of 30 files. I grab all the files in the dir. If there are more than 30, they get sorted by date. Then the oldest gets deleted.
public function cleanUpFolder($path) {
    $files = [];
    try {
        $dir = new DirectoryIterator($path);
        foreach ($dir as $fileinfo) {
            if (!$fileinfo->isDot()) {
                $files[] = $fileinfo;
                // in here I can call any valid method like getPathname()
            }
        }
        $fileCount = count($files);
        if ($fileCount > self::MAX_BACKUPS) {
            // sort with the youngest file first
            usort($files, function($a, $b) {
                // in here, I can call methods like getMTime()
                // and even getPath()
                // but getPathname() or getFilename() return false or ""
                return $a->getMTime() < $b->getMTime();
            });
            for ($i = $fileCount - 1; $i > 30; $i--) {
                unlink($files[$i]->getPathname());
            }
        }
        return true;
    }
    catch (Exception $e) {
        return false;
    }
}
What is working:
Getting the files.
What is not working:
The sort? I cannot tell whether the sort works.
Calling some methods on the DirectoryIterator while looping through the $files array.
It seems like, after putting $fileinfo into an array, most method calls no longer work.
I found that the $files array's values got messed up outside of the foreach loop. A var_dump of the array showed all the values were empty, which is odd. Not really a solution, but a workaround I found from this question:
$files = array();
$dir = new DirectoryIterator($path);
foreach ($dir as $fileinfo) {
    if (!$fileinfo->isDot()) {
        $files[] = array(
            "pathname" => $fileinfo->getPathname(),
            "modified" => $fileinfo->getMTime()
        );
    }
}
and then your usort becomes:
usort($files, function($a, $b) {
return $a['modified'] < $b['modified'];
});
and your unlink becomes:
for ($i = $fileCount - 1; $i > 30; $i--) {
    unlink($files[$i]['pathname']);
}
=== EDIT ===
I'm no expert in PHP, so this could be wrong, but the top comment on FilesystemIterator could be a clue as to why the methods are returning empty:
When you iterate using DirectoryIterator, each "value" returned is the same DirectoryIterator object. The internal state is changed so that when you call isDir(), getPathname(), etc. the correct information is returned. If you were to ask for a key while iterating, you would get an integer index value.
FilesystemIterator (and RecursiveDirectoryIterator), on the other hand, returns a new, different SplFileInfo object for each iteration step. The key is the full pathname of the file. This is by default. You can change what is returned for the key or value using the "flags" argument to the constructor.
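Building on that explanation, here is a minimal sketch (not the original code, just one way it could look) that sidesteps the problem by using FilesystemIterator, so every stored element is its own SplFileInfo object:
$files = iterator_to_array(new FilesystemIterator($path, FilesystemIterator::SKIP_DOTS), false);
// newest first; the SplFileInfo objects survive being stored and sorted
usort($files, function ($a, $b) {
    return $b->getMTime() - $a->getMTime();
});
// delete everything beyond the newest 30 (self::MAX_BACKUPS in the question)
foreach (array_slice($files, 30) as $old) {
    unlink($old->getPathname());
}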

Omit certain results from a random array PHP

I'm trying to get 2 random results from an array of files in the "related" directory. I've managed to pull two results randomly from the directory but I need to avoid certain results depending on a variable relating to a specific file name.
My code so far is:
$foo = "bar.php";
function random_file($dir) {
$files = opendir($dir . '/*.php');
$rand_files = array_rand($files, 2);
return array(include $files[$rand_files[0]], include $files[$rand_files[1]]);
}
list($file_1,$file_2) = random_file("related");
I'm trying to pull two random results but avoid the file bar.php. Does anyone know of a way to omit certain results from the array? I can't find anything online that comes even close.
You can use the glob function to select only the names you want. Note that glob takes shell-style wildcard patterns rather than a regular expression, so you cannot express "everything except bar.php" in the pattern itself; instead, glob all the .php files and drop the excluded name before doing the random sampling.
// entries matching bar.php will not be included
function random_file($dir) {
    $files = array_values(array_diff(glob($dir . '/*.php'), array($dir . '/bar.php')));
    $rand_files = array_rand($files, 2);
    return array(include $files[$rand_files[0]], include $files[$rand_files[1]]);
}
list($file_1, $file_2) = random_file("related");
You also need to consider directories such as '.' and '..'. I think a switch statement and an unset() for those values in your overall array would work. Then, once you have an array that no longer includes what you don't want, you can pull two random entries and return them.
This code may not be 100% perfect but should get you in the right direction.
function random_file($dir) {
    $fileArray = array();
    if (is_dir($dir)) {
        if ($dh = opendir($dir)) {
            while (($file = readdir($dh)) !== false) {
                // note: readdir returns every entry; add a check here if only .php files should be kept
                $fileArray[] = $file;
            }
            closedir($dh);
            $total = count($fileArray);
            for ($i = 0; $i < $total; $i++) {
                switch ($fileArray[$i]) {
                    case '.':
                        unset($fileArray[$i]);
                        break;
                    case '..':
                        unset($fileArray[$i]);
                        break;
                    case 'bar.php':
                        unset($fileArray[$i]);
                        break;
                }
            }
        }
    }
    $rand_files_keys = array_rand($fileArray, 2);
    return array($fileArray[$rand_files_keys[0]], $fileArray[$rand_files_keys[1]]);
}
Try this. It will keep re-randomizing until the two picks do not include bar.php. Note that array_rand returns keys, so the check has to look at the corresponding values:
$rand_files = array_rand($files, 2);
while ($files[$rand_files[0]] == "bar.php" || $files[$rand_files[1]] == "bar.php")
{
    $rand_files = array_rand($files, 2);
}

How to detect files with same name in various directories?

Suppose there are 2 directories on my server:
/xyz/public_html/a/
/xyz/public_html/b/
And both of them contain many files. How do I detect the files that are common to both folders in terms of their name and file extension? This program is to be implemented in PHP. Any suggestions?
Using FileSystemIterator, you might do something like this...
<?php
$it = new FilesystemIterator('/xyz/public_html/a/');
$commonFiles = array();
foreach ($it as $file) {
    // FilesystemIterator already skips "." and ".."; just skip subdirectories
    if ($file->isDir()) continue;
    if (file_exists('/xyz/public_html/b/' . $file->getFilename())) {
        $commonFiles[] = $file->getFilename();
    }
}
Basically, you have to loop through all the files in one directory, and see if any identically-named files exist in the other directory. Remember that the file name includes the extension.
If it's just two directories, you could use an algorithm similar to the merge step of merge sort, where you have two sorted lists and walk them simultaneously while comparing the current items (note that this assumes both iterators return entries sorted by file name, which is not guaranteed on every filesystem):
$iter1 = new FilesystemIterator('/xyz/public_html/a/');
$iter2 = new FilesystemIterator('/xyz/public_html/b/');
while ($iter1->valid() && $iter2->valid()) {
    $diff = strcmp($iter1->current()->getFilename(), $iter2->current()->getFilename());
    if ($diff === 0) {
        // duplicate found; advance both iterators so the loop keeps moving
        $iter1->next();
        $iter2->next();
    } else if ($diff < 0) {
        $iter1->next();
    } else {
        $iter2->next();
    }
}
Another solution would be to use the uniqueness of array keys: put each item of the first directory into an array as a key, then check for each item of the other directory whether such a key exists:
$arr = array();
$iter1 = new FilesystemIterator('/xyz/public_html/a/');
foreach ($iter1 as $item) {
    $arr[$item->getFilename()] = true;
}
$iter2 = new FilesystemIterator('/xyz/public_html/b/');
foreach ($iter2 as $item) {
    if (array_key_exists($item->getFilename(), $arr)) {
        // duplicate found
    }
}
If you just want to find out which files are in common, you can simply use scandir twice and intersect the results, for example:
// scandir returns a sorted listing, so the constant "." and ".." entries are the first two elements
$filesInA = array_slice(scandir('/xyz/public_html/a/'), 2);
$filesInB = array_slice(scandir('/xyz/public_html/b/'), 2);
$filesInCommon = array_intersect($filesInA, $filesInB);
