ftp_get() file with partial filename (not a wildcard) - PHP

I need to get a file via FTP based on the second half of its filename with PHP.
The filename will always have the structure NAME_123456789.dat, where the number is a unique tracking_id.
The name part is John, Mel, Bronson, etc., and the number part is the tracking_id.
To explain the process: a person enters their tracking_id, the script takes it from the search field and uses it to look up the file in a specific directory on the FTP server. Because the tracking_id is unique it should only return one result, hence ftp_get(), right?
Any help is greatly appreciated.

Given the relatively small directory size (100+ files, from the comments above), you should be OK using ftp_nlist() first to list all the files and then searching for and downloading the file you want.
$search = '_' . $trackingId . '.dat';
$searchLen = strlen($search);
$dir = '.'; // example directory
$files = ftp_nlist($connection, $dir);
foreach ($files as $file) {
    // check if $file ends with $search
    if (strrpos($file, $search) === strlen($file) - $searchLen) {
        // found it, download it (binary mode is the safe choice for .dat files)
        ftp_get($connection, 'some/local/file/path', $dir . '/' . $file, FTP_BINARY);
        // the tracking_id is unique, so stop after the first match
        break;
    }
}
Better and more future-proof options can be found in Michael Berkowski's comment above...
How many files do you expect to have in the directory at any given time? If it is a small number, listing the contents via FTP may work suitably. If it is many thousands of files, you might want to store some sort of text manifest file to read from, or index them in a database.
These approaches do hinge on how and when the files are uploaded to the FTP server, though, so since we don't know anything about that, I can't recommend one over the other.
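For completeness, here is a minimal sketch of the FTP connection the snippet above assumes is already open in $connection; the host, credentials, and paths are placeholders, not values from the question.
// hypothetical connection details - replace with your own
$connection = ftp_connect('ftp.example.com');
if ($connection === false) {
    die('Could not connect to FTP server');
}
if (!ftp_login($connection, 'username', 'password')) {
    die('FTP login failed');
}
ftp_pasv($connection, true); // passive mode usually plays nicer with firewalls/NAT

// ... run the ftp_nlist()/ftp_get() search shown above ...

ftp_close($connection);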

Related

Remove files which have no filename duplicates

For each document (.pdf, .txt, .docx, etc.) I also have a corresponding JSON file with the same filename.
Example:
file1.json,
file1.pdf,
file2.json,
file2.txt,
filex.json,
filex.pdf,
But I also have some JSON files which are not accompanied by a corresponding document.
I want to delete all JSON files which have no corresponding document. I'm really stuck because I can't find a proper solution to my problem.
I know how to scandir() a directory, get the filename and extension from pathinfo(), etc., but the issue is that for each JSON file I find in the directory I would have to run another foreach over that directory, excluding all JSON files, and check whether the same filename exists, so that I can decide whether to delete it. (This is how I think it could be solved.)
The problem here is performance, since there are millions of files, and for each JSON file I would have to loop over millions of files.
Can anyone guide me to a better solution?
Thank you!
Edit: Since no one will help without a piece of code being posted first (and I think that approach on Stack Overflow is definitely wrong), here is what I'm trying:
<?php
$dir = "2000/";
$files = scandir($dir);
foreach ($files as $file) {
    $fullName = pathinfo($file);
    if ($fullName['extension'] === 'json') {
        if (!in_array($fullName['filename'] . '.pdf', $files)) {
            unlink($dir . $file);
        }
    }
}
As you can see, this can only search for one type of document (.pdf in this case). I want to match every extension except .json, and I also don't want to run a foreach/in_array() for every JSON file; ideally all of this would happen in a single pass.
Maybe you should consider it the other way around: iterate through all the JSON files, try to find a corresponding document for each one, and remove the JSON file if none is found.
It would look like follows:
$dir = "2000/";
foreach (glob($dir . "*.json") as $file) {
$file = new \SplFileInfo($dir . $file);
if (count(glob($dir . $file->getBasename('.' . $file->getExtension()) . ".*")) === 1) {
unlink($dir . $file->getFilename());
}
}
Manual
PHP: SplFileInfo
PHP: glob
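Given the concern about millions of files, calling glob() once per JSON file can still be slow. Here is a hedged alternative sketch (the directory name and extension handling are assumptions) that builds a lookup table of non-JSON basenames in one pass and then removes orphaned JSON files in a second pass:
$dir = "2000/";
$documents = [];

// first pass: remember the basename of every non-JSON file
foreach (new DirectoryIterator($dir) as $info) {
    if ($info->isFile() && strtolower($info->getExtension()) !== 'json') {
        $documents[$info->getBasename('.' . $info->getExtension())] = true;
    }
}

// second pass: delete JSON files whose basename was never seen
foreach (new DirectoryIterator($dir) as $info) {
    if ($info->isFile() && strtolower($info->getExtension()) === 'json') {
        $basename = $info->getBasename('.json');
        if (!isset($documents[$basename])) {
            unlink($info->getPathname());
        }
    }
}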

PHP copy a file to multiple child folders with random names

I have an HTML form and one of its inputs creates a folder. The folder name is chosen by the website visitor, so every visitor creates their own folder on my website and the names are effectively random. They are created using PHP code.
Now I would like to write PHP code to copy a file into all of the child directories, regardless of how many directories have been generated.
I do not want to keep writing a PHP line for every directory that is created, i.e. inserting the folder names manually (e.g. folder01, xyzfolder, folderabc, etc.), but rather have it done automatically.
I Googled but I was unsuccessful. Is this possible? If yes, how can I go about it?
Kindly ignore security, etc... I am testing it internally prior to rolling out on a larger scale.
Thank you
It's a shame I can't comment yet, so here goes...
// get the new folder name
$newfolder = $_POST['newfoldername'];

// create it if it does not exist
if (!is_dir("./$newfolder")) {
    mkdir("./$newfolder", 0777, true);
}

// list all folders
$dirname = './';
$dir = opendir($dirname);
while (false !== ($file = readdir($dir))) {
    // skip the '.' and '..' entries and anything that is not a directory
    if ($file != '.' && $file != '..' && is_dir($dirname . $file)) {
        // generate a random name
        $str = 'abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789';
        $randomname = str_shuffle($str);
        $actualdir = $dirname . $file;
        // copy the uploaded file into this folder
        // ($uploadedfile is assumed to come from $_FILES)
        copy($uploadedfile['tmp_name'], $actualdir . '/' . $randomname);
    }
}
closedir($dir);
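As a shorter sketch of the same idea, glob() with the GLOB_ONLYDIR flag lists the child directories directly; the source file path below is a placeholder, not something from the question:
$dirname = './';
$source = '/path/to/source/file'; // placeholder for the file to distribute

foreach (glob($dirname . '*', GLOB_ONLYDIR) as $folder) {
    // copy the file into each child folder under a random name
    $randomname = bin2hex(random_bytes(8));
    copy($source, $folder . '/' . $randomname);
}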
I just want to say, the answer is already in your own sentence. When I read "I would like to write a PHP code to copy", what you need is PHP's copy() plus a list of the folders, regardless of how many there are. So just list them!
Maybe you need to refine how you use Google... if you search for the whole sentence "I would like to write a PHP code to copy a file to all of the child directories regardless the quantity of directories being generated", of course you will never find anything.

Trying to echo contents of multiple text files while sorting the output by the file name - PHP

I'm not a developer, but I'm the default developer at work now. : ) Over the last few weeks I've found a lot of my answers here and at other sites, but this latest problem has me confused beyond belief. I KNOW it's a simple answer, but I'm not asking Google the right questions.
First... I have to use text files, as I don't have access to a database (things are locked down TIGHT where I work).
Anyway, I need to look into a directory for text files stored there, open each file and display a small amount of text, while making sure the text I display is sorted by the file name.
I'm CLOSE, I know it... I finally managed to figure out sorting, and I know how to read into a directory and display the contents of the files, but I'm having a heck of a time merging those two concepts together.
Can anyone provide a bit of help? With the script as it is now, I echo the sorted file names with no problem. My line of code that I thought would read the contents of a file and then display it is only echoing the line breaks, but not the contents of the files. This is the code I've got so far - it's just test code so I can get the functionality working.
<?php
$dirFiles = array();
if ($handle = opendir('./event-titles')) {
    while (false !== ($file = readdir($handle))) {
        if ($file != "." && $file != "..") {
            $dirFiles[] = $file;
        }
    }
    closedir($handle);
}
sort($dirFiles);
foreach ($dirFiles as $file) {
    $fileContents = file_get_contents($file); //////// This is what's not working
    echo $file . "<br>" . $fileContents . "<br/><br/>";
}
?>
Help? : )
Dave
$files = scandir('./event-titles') will return an array of filenames in filename-sorted order. You can then do
foreach ($files as $file) {
    // skip the '.' and '..' entries that scandir() also returns
    if ($file == '.' || $file == '..') {
        continue;
    }
    $fileContents = file_get_contents('./event-titles/' . $file);
    echo $file . "<br/>" . $fileContents . "<br/><br/>";
}
Note that I use the directory name in the file_get_contents call, as the filename by itself will cause file_get_contents to look in the current directory, not the directory you were specifying in scandir.

Check if a file matches a wildcarded spec, in a given directory, with PHP

I have a directory that files are uploaded to, and I want to be able to display a download link if the file exists. The file however has to match a particular pattern as this is the identifier of who uploaded it.
The path starts with /ClientFiles/, then it needs to find all files that start with the user ID. So for example: /ClientFiles/123-UploadData.xls
So it would need to look in the ClientFiles directory and find all files that start with '123-' no matter what comes after.
Cheers
To look for files by a certain pattern you can use glob, then use is_readable to check if you can read the files.
$files = array();
foreach (glob($dirname . DIRECTORY_SEPARATOR . $clientId . '-*') as $file) {
    if (is_readable($file)) {
        $files[] = $file;
    }
}
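As a hedged usage sketch (the directory, user ID, and link format are assumptions, not taken from the question), the matched files could then be turned into download links:
$dirname  = __DIR__ . '/ClientFiles'; // assumed upload directory
$clientId = 123;                      // assumed user ID

// $files as built in the loop above
foreach ($files as $file) {
    $name = basename($file);
    echo '<a href="/ClientFiles/' . rawurlencode($name) . '">' . htmlspecialchars($name) . '</a><br/>';
}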
Simply use the file_exists() function
PHP has a file_exists() function. Use that to build the logic for whether or not to show the link.

What is the best method to read a folder's sub-directories and the files in those sub-directories?

I have an image folder which contains a sub-directory for each album of images, like:
Images
Images/Album1
Images/Album2
In the PHP file I create a link for each album, using a thumbnail for the album and glob() to read all the folders under Images:
$dir = glob('images/*');
$dir_listing = array();
foreach ($dir as $list) {
    if (is_dir($list)) {
        $dir_listing[] = basename($list);
    }
}
$thumbs = glob('images/thumbnails/*');
$count = 0;
foreach ($thumbs as $th) {
    echo " $dir_listing <br/>";
    echo "<a href='$dir_listing[$count]'><img src='$th' /></a>";
    $count++;
}
I use glob() on each page load to get the list of directories and images.
I want to know if there is a better way of doing this.
I also want to get a list of all files and folders based on their last modified time in descending order (latest files and folders first).
Is using glob() correct, or should we save the sub-directories and files in a text file and read from that?
I can't tell you for sure if there is a better way of doing this, but your code should definitely work.
Using glob() is the right approach only if you have a relatively low number of files in the directory (<10,000), because if you have a lot of files you could get an "Allowed memory size of XYZ bytes exhausted ..." error. In that case, it is best to use opendir():
if ($handle = opendir($path)) {
    while (false !== ($file = readdir($handle))) {
        // do something with the file
        // note that '.' and '..' are also returned
    }
    closedir($handle);
}
Finally, you can pass the GLOB_NOSORT flag to glob() so that it skips its own alphabetical sorting and returns entries in whatever order the filesystem lists them; that leaves you free to sort the results yourself, for example by last modified date.
Hope this helps.
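To address the last-modified ordering from the question, here is a hedged sketch (the images/* path is an assumption) that collects entries with glob(), then sorts them by filemtime() in descending order:
// collect albums/files without glob()'s own alphabetical sorting
$entries = glob('images/*', GLOB_NOSORT);

// sort by last modified time, newest first
usort($entries, function ($a, $b) {
    return filemtime($b) - filemtime($a);
});

foreach ($entries as $entry) {
    echo basename($entry) . ' - ' . date('Y-m-d H:i:s', filemtime($entry)) . "<br/>";
}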
