I have a specific directory which may contain zip files.
I would like to loop through each sub-element of my directory, check whether it is a zip, unzip it, and then process the other files.
I'm using flysystem to work with my files.
So I went for this:
$contents = $this->manager->listContents('local://my_directory', true);
foreach ($contents as $file) {
    if ($file['extension'] == 'zip') {
        // Unzip in same location
    }
}
The problem is that the unzipped files are not part of the loop, so if a zip file contains another zip, the second one will never be unzipped.
So I thought about this:
function loopAndUnzip() {
    $contents = $this->manager->listContents('local_process://' . $dir['path'], true);
    foreach ($contents as $file) {
        if ($file['extension'] == 'zip') {
            // Unzip and then call loopAndUnzip()
        }
    }
}
But the initial function will never finish and will be called over and over if there are zips inside zips.
Isn't that a performance issue?
How to manage this kind of thing?
You can use glob to find them and make the function recursive: start at a certain dir, unzip all the zips found into it, and check whether new zips have appeared.
I also recommend extracting each zip into its own directory. If A.zip and B.zip both contain a file called example.txt, one overwrites the other; with separate dirs it won't:
function unzipAll(string $dirToScan = "/someDir", int $depth = 0): void {
    if ($depth > 10) {
        throw new Exception("Maximum zip depth reached");
    }
    $zipfiles = glob($dirToScan . "/*.zip");
    // Unzip all zips found this round:
    foreach ($zipfiles as $zipfile) {
        // Extract each zip into its own subdirectory named after the zip
        $zipLocation = "/" . pathinfo($zipfile, PATHINFO_FILENAME);
        // unzip here to $dirToScan . $zipLocation
        // and now check if in the zip dir there is stuff to unzip:
        unzipAll($dirToScan . $zipLocation, $depth + 1);
    }
}
The $depth is optional, but this way you can't zipbomb yourself to death.
loopAndUnzip will process all files again, so you will just unpack the same zip file once more and start over with the entire folder, ad infinitum.
Some possibilities:
Keep a list of items that were already processed or skipped and don't process those again: while iterating over $contents, maintain a separate array and do something like:
PHP:
$processedFiles = [];
foreach ($contents as $file) {
    if (!in_array($file, $processedFiles)) {
        if ($file['extension'] == 'zip') {
            // Unzip in same location
        }
    }
    $processedFiles[] = $file;
}
Use an unzipper that returns a list of files/folders created, so you can explicitly process those instead of the full directory contents.
If the unzipper can't do it, you could fake it by extracting to a separate location, getting a listing of that location, then moving all the files to the original location and processing the list you got.
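A minimal sketch of that fallback, using plain ZipArchive rather than Flysystem (the archive path and target directory are hypothetical examples):
// Extract to a temp dir, list exactly what was created,
// then move the entries back and process only that list.
$zip = new ZipArchive();
if ($zip->open('/path/to/archive.zip') === true) {
    $tmpDir = sys_get_temp_dir() . '/unzip_' . uniqid();
    mkdir($tmpDir);
    $zip->extractTo($tmpDir);
    $zip->close();

    // Everything in $tmpDir was created by this extraction.
    $created = array_diff(scandir($tmpDir), ['.', '..']);

    foreach ($created as $entry) {
        // Move back next to the archive, then process $entry explicitly
        // (recurse here if $entry is itself a zip).
        rename($tmpDir . '/' . $entry, '/path/to/' . $entry);
    }
    rmdir($tmpDir);
}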
I'm writing a JSON file inside a generated folder. After an hour I want to delete the folder and its contents automatically.
I tried:
$dir = "../tmpDir";
$cdir = scandir($dir);
foreach ($cdir as $key => $value)
{
if (!in_array($value,array(".","..")))
{
if (is_dir($dir.'/'.$value))
{
if(filectime($dir.'/'.$value)< (time()-3600))
{ // after 1 hour
$files = glob($dir.'/'.$value); // get all file names
foreach($files as $file)
{ // iterate files
if(is_file($file))
{
unlink($file); // delete file
}
}
rmdir($dir.'/'.$value);
/*destroy the session if the folder is deleted*/
if(isset($_SESSION["dirname"]) && $_SESSION["dirname"] == $value)
{
session_unset(); // unset $_SESSION variable for the run-time
session_destroy(); // destroy session data in storage
}
}
}
}
}
I get: rmdir(../tmpDir/1488268867): Directory not empty in /Applications/MAMP/htdocs/.... on line 46
If I remove the
if(is_file($file))
{
}
check, I get a permission error instead.
Does anyone know why I get this error?
When it comes to things like these, it's much easier to let your native OS delete the directory, so you don't have to write a horrible loop that may have edge cases you could miss and end up deleting things you shouldn't have!
$path = 'your/path/here';
// PHP_OS is e.g. 'WINNT' on Windows, so check the prefix rather than 'WINDOWS'
if (stripos(PHP_OS, 'WIN') !== 0)
{
    exec(sprintf('rm -rf %s', $path));
}
else
{
    exec(sprintf('rd /s /q %s', $path));
}
Of course, tailor the above to your needs. You can also use the backtick operator if you want to avoid the overhead of a function call (negligible in this case, anyway). It's also probably a good idea to use escapeshellarg() for your $path variable.
For a one-liner:
exec(sprintf(stripos(PHP_OS, 'WIN') === 0 ? 'rd /s /q %s' : 'rm -rf %s', escapeshellarg($path)));
Regardless, sometimes it's easier to let the O/S of choice perform file operations.
rmdir() removes a directory; if you want to remove a file, you should use the unlink() function.
The proper approach would be to use DirectoryIterator or glob(), loop through the files and delete them, and after that remove the now-empty directory, as sketched below.
You can also call the system command rm -rf directory_name with exec() or shell_exec().
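A minimal sketch of that iterator-based approach, assuming $dir points at the folder to remove (the path below is just an example):
$dir = '../tmpDir/1488268867'; // example path, adjust as needed

// Walk children before their parent so each directory is empty when rmdir() runs
$it = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($dir, FilesystemIterator::SKIP_DOTS),
    RecursiveIteratorIterator::CHILD_FIRST
);

foreach ($it as $item) {
    if ($item->isDir()) {
        rmdir($item->getPathname());
    } else {
        unlink($item->getPathname());
    }
}

rmdir($dir); // finally remove the now-empty top-level directory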
useful links:
Delete directory with files in it?
How to delete a folder with contents using PHP
PHP: Unlink All Files Within A Directory, and then Deleting That Directory
I've found a pretty useful function on php.net that removes hidden files as well:
function delTree($dir) {
    $files = array_diff(scandir($dir), array('.', '..'));
    foreach ($files as $file) {
        (is_dir("$dir/$file")) ? delTree("$dir/$file") : unlink("$dir/$file");
    }
    return rmdir($dir);
}
In my program I need to read .png files from a .tar file.
I am using pear Archive_Tar class (http://pear.php.net/package/Archive_Tar/redirected)
Everything is fine if the file I'm looking for exists, but if it is not in the .tar file then the function times out after 30 seconds. The class documentation states that it should return null if it does not find the file...
$tar = new Archive_Tar('path/to/mytar.tar');
$filePath = 'path/to/my/image/image.png';
$file = $tar->extractInString($filePath); // This works fine if $filePath is correct.
// If the path to the file does not exist,
// the script will time out after 30 seconds.
var_dump($file);
return;
Any suggestions on solving this or any other library that I could use to solve my problem?
The listContent method will return an array of all files (and other information about them) present in the specified archive. So if you check if the file you wish to extract is present in that array first, you can avoid the delay that you are experiencing.
The code below isn't optimised (for multiple calls to extract different files, for example, the $files array should only be populated once), but it is a good way forward.
include "Archive/Tar.php";
$tar = new Archive_Tar('mytar.tar');
$filePath = 'path/to/my/image/image.png';
$contents = $tar->listContent();
$files = array();
foreach ($contents as $entry) {
$files[] = $entry['filename'];
}
$exists = in_array($filePath, $files);
if ($exists) {
$fileContent = $tar->extractInString($filePath);
var_dump($fileContent);
} else {
echo "File $filePath does not exist in archive.\n";
}
I currently have a program that connects to an FTP directory and, if it finds CSV files, runs a script; after the script has run on the files, it creates a backup folder with the date and moves the CSV files into this newly created backup folder in the FTP directory.
However, if there are no CSV files in the root directory, I do not want a backup folder to be created, as there are no files to move. I know the solution is probably really simple, but I cannot seem to figure it out!
logMessage("Creating backups");
$ftp_connection = #ftp_connect($ftp_url, $ftp_port, 6000);
if(!#ftp_login($ftp_connection, $ftp_username, $ftp_password )) {
logMessage("Could not connect to FTP: [$ftp_url], with Username: [$ftp_username], and Password: [$ftp_password]");
die();
}
$date = date('Y_m_d_(His)');
$newBackup = $ftp_root."/".$ftp_backup."backup_$date";
if (ftp_mkdir($ftp_connection, $newBackup)) {
logMessage ("Successfully created [$newBackup\n]");
foreach($filesToProcess as $file){
$pathData = pathinfo($file);
if(isset($pathData['extension']) && $pathData['extension'] == 'csv'){
if(!#ftp_rename($ftp_connection,
$ftp_root.'/'.$file,
$newBackup."/".$file)
){
logMessage("Unable to move file: $file")
}
}
}
}
You have used foreach($filesToProcess as $file), so $filesToProcess is an array of files. You can use count($filesToProcess) first to get the number of files, and only execute the backup code if the count is greater than 0, as in the sketch below.
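For example, a rough sketch reusing the question's own variables and logMessage() helper (the CSV filter is an assumption about how the files are named):
// Collect only the CSV files first
$csvFiles = array_filter($filesToProcess, function ($file) {
    return pathinfo($file, PATHINFO_EXTENSION) === 'csv';
});

// Only create the backup folder when there is something to move
if (count($csvFiles) > 0) {
    if (ftp_mkdir($ftp_connection, $newBackup)) {
        logMessage("Successfully created [$newBackup]");
        foreach ($csvFiles as $file) {
            if (!@ftp_rename($ftp_connection, $ftp_root . '/' . $file, $newBackup . '/' . $file)) {
                logMessage("Unable to move file: $file");
            }
        }
    }
}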
$ftp_files = ftp_nlist($ftp_connection, ".");
foreach ($ftp_files as $file) {
    // your check for a name ending in .csv
    if (substr($file, -4) === '.csv') {
        // makedir
    }
}
Maybe something like this, in very basic syntax. ftp_nlist returns an array of all files in a particular directory.
Problem
I am building an online file manager. For downloading a whole directory structure I generate a zip file of all subdirectories and files (recursively), so I use the RecursiveDirectoryIterator.
It all works well, but empty directories are not in the generated zip file, although the dir is handled correctly. This is what I am currently using:
<?php
$dirlist = new RecursiveDirectoryIterator($path, FilesystemIterator::SKIP_DOTS);
$filelist = new RecursiveIteratorIterator($dirlist, RecursiveIteratorIterator::SELF_FIRST);
$zip = new ZipArchive();
if ($zip->open($tmpName, ZipArchive::CREATE) !== TRUE) {
die();
}
foreach ($filelist as $key=>$value) {
$result = false;
if (is_dir($key)) {
$result = $zip->addEmptyDir($key);
//this message is correctly generated!
DeWorx_Logger::debug('added dir '.$key .'('.$this->clearRelativePath($key).')');
}
else {
$result = $zip->addFile($key, $key);
}
}
$zip->close();
If I omit the FilesystemIterator::SKIP_DOTS flag, I end up with a . entry in every directory.
Conclusion
The iterator works and the addEmptyDir call gets executed correctly (the result is checked too!); creating a zip file with various zip tools handles empty directories as intended.
Is this a bug in PHP's ZipArchive (the php.net lib), or am I missing something? I don't want to end up creating dummy files just to keep the directory structure intact.
I am trying to list files in subdirectories and write these lists into separate text files.
I managed to get the directory and subdirectory listings and even to write all the files into a text file.
I just don't seem to manage to break out of the loops I am creating. I either end up with a single text file, or the second and later files include all preceding subdirectories' content as well.
What I need to achieve is:
dir A/AA/a1.txt,a2.txt >> AA.log
dir A/BB/b1.txt,b2.txt >> BB.log
etc.
Hope this makes sense.
I've found the RecursiveDirectoryIterator method described in PHP SPL RecursiveDirectoryIterator RecursiveIteratorIterator retrieving the full tree to be a great help. I then use a for and a foreach loop to iterate through the directories and write the text files, but I cannot break the output into multiple files.
Most likely you are not filtering out the directories . and .. .
$maindir = opendir('A');
if (!$maindir) die('Cannot open directory A');
while (true) {
    $dir = readdir($maindir);
    if ($dir === false) break;
    if ($dir == '.') continue;
    if ($dir == '..') continue;
    if (!is_dir("A/$dir")) continue;
    $subdir = opendir("A/$dir");
    if (!$subdir) continue;
    $fd = fopen("$dir.log", 'wb');
    if (!$fd) continue;
    while (true) {
        $file = readdir($subdir);
        if ($file === false) break;
        if (!is_file("A/$dir/$file")) continue;
        fwrite($fd, file_get_contents("A/$dir/$file"));
    }
    fclose($fd);
}
I thought I'd demonstrate a different way, as this seems like a nice place to use glob.
// Where to start recursing, no trailing slash
$start_folder = './test';
// Where to output files
$output_folder = $start_folder;
chdir($start_folder);
function glob_each_dir ($start_folder, $callback) {
$search_pattern = $start_folder . DIRECTORY_SEPARATOR . '*';
// Get just the folders in an array
$folders = glob($search_pattern, GLOB_ONLYDIR);
// Get just the files: there isn't an ONLYFILES option yet so just diff the
// entire folder contents against the previous array of folders
$files = array_diff(glob($search_pattern), $folders);
// Apply the callback function to the array of files
$callback($start_folder, $files);
if (!empty($folders)) {
// Call this function for every folder found
foreach ($folders as $folder) {
glob_each_dir($folder, $callback);
}
}
}
glob_each_dir('.', function ($folder_name, Array $filelist) {
// Generate a filename from the folder, changing / or \ into _
$output_filename = $GLOBALS['output_folder']
. trim(strtr(str_replace(__DIR__, '', realpath($folder_name)), DIRECTORY_SEPARATOR, '_'), '_')
. '.txt';
file_put_contents($output_filename, implode(PHP_EOL, $filelist));
});