'ZipArchive::close()' not working on web server - php

I am building a folder with files in it, and at the end I want to zip this folder. On my local machine using Homestead, everything works correctly.
However, on my web server I am getting the error:
ZipArchive::close(): Can't remove file: No such file or directory
Why? The folder does contain all of the files...
My code:
$zip_file = storage_path('app\\takeouts\\takeout_' . $this->dataExports->uuid . '.zip');
$this->zip = new \ZipArchive();
$this->zip->open($zip_file, \ZipArchive::CREATE | \ZipArchive::OVERWRITE);
$this->addAllFilesToZipArchive($this->folder_name);
$this->zip->close();
Storage::deleteDirectory($this->folder_name);
private function addAllFilesToZipArchive($dir)
{
    $dirs = Storage::directories($dir);
    $files = Storage::files($dir);

    foreach ($files as $file) {
        if (Storage::exists(storage_path("app\\" . $file))) {
            $this->zip->addFile(storage_path("app\\" . $file), str_replace($this->folder_name, "", "/" . $file));
        }
    }

    foreach ($dirs as $dir2) {
        $this->addAllFilesToZipArchive($dir2);
    }
}

It may seem a little obvious to some, but it was an oversight on my behalf: make sure the files you have added to the ZipArchive object still exist on disk when you call the close() function. If the files added to the object aren't available at save time, the zip file will not be created.
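A minimal sketch of that order of operations (the variable names here are illustrative, not the original job code): only add files that still exist, check the return value of close(), and delete the source folder only after the archive has actually been written.

$zip = new \ZipArchive();
$zip->open($zip_file, \ZipArchive::CREATE | \ZipArchive::OVERWRITE);

foreach ($files_to_add as $absolute_path => $name_in_archive) {
    if (is_file($absolute_path)) {               // skip anything that has gone missing
        $zip->addFile($absolute_path, $name_in_archive);
    }
}

if ($zip->close() === false) {                   // close() is when the archive is actually written
    // handle the failure: at least one added file was unreadable at save time
}

Storage::deleteDirectory($folder_name);          // clean up only after the zip exists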

Related

Taking a backup of code in PHP, where the files to back up are read from another specified (patch) folder?

I need to take a backup of the code for my project, but the condition is:
I generate a patch folder through SVN, which has files structured the same way as the main project folder (the patch has fewer files than the main project). I want to write code which reads the main project, picks only those files which are present in the patch, and saves them in a new backup folder.
The code which I wrote is:
/* Patch folder. */
$frmFoldr = 'patch_test/';
$basepath = '/var/www/html/TestingBackupCron/';
/* The folder the files are backed up to; the path is built from $basepath and $toFoldr. */
$toFoldr = 'axisdirectcheck/';
/* Read/copy files from this folder. */
$prodMainFolder = '';
$prodBasePath = '/var/www/html/TestingBackupCron/sijucheckfiles/';

$dir_iterator = new RecursiveDirectoryIterator($basepath . $frmFoldr);
$iterator = new RecursiveIteratorIterator($dir_iterator, RecursiveIteratorIterator::SELF_FIRST);

foreach ($iterator as $file) {
    $readDir = strstr($file->getPathname(), $frmFoldr);
    $tofileminpath = str_replace($frmFoldr, "", $readDir);
    $fromExPath = $prodBasePath . $tofileminpath;
    $toExPath = $basepath . $toFoldr . $tofileminpath;

    if (strpos($fromExPath, '.php') || strpos($fromExPath, '.js') || strpos($fromExPath, '.css')) {
        if (file_exists($fromExPath)) {
            copy($fromExPath, $toExPath);
            print_r('Files copied to ' . $toExPath . ".\n");
        } else {
            print_r($fromExPath . " ::: Read path not found.\n");
        }
    }
}
The above code gives me the error "failed to open stream: No such file or directory". I think copy() doesn't create folders. Please help, guys.
Found a solution. The code below creates the folders first and then, in a second loop, copies the files into those folders.
public function run($args)
{
    /* Patch folder. */
    $frmFoldr = 'patch_test/';
    $basepath = '/var/www/html/TestingBackupCron/';
    /* The folder the files are backed up to; the path is built from $basepath and $toFoldr. */
    $toFoldr = 'axisdirectcheck/';
    /* Read/copy files from this folder. */
    $prodMainFolder = '';
    $prodBasePath = '/var/www/html/TestingBackupCron/sijucheckfiles/';

    if (file_exists($basepath . $frmFoldr)) {
        $dir_iterator = new RecursiveDirectoryIterator($basepath . $frmFoldr);
        $iterator = new RecursiveIteratorIterator($dir_iterator, RecursiveIteratorIterator::SELF_FIRST);
    } else {
        print_r("Read directory '$basepath$frmFoldr' not found. Please check the read directory name.\n");
        exit;
    }

    $fromDir = array();
    $sendToArray = array();

    /* This foreach creates the folders inside the target path ($toFoldr). */
    foreach ($iterator as $file) {
        $readDir = strstr($file->getPathname(), $frmFoldr);
        $tofileminpath = str_replace($frmFoldr, "", $readDir);
        $fromExPath = $basepath . $readDir;
        $toExPath = $basepath . $toFoldr . $tofileminpath;
        $sendToArray[] = $tofileminpath;

        if (strpos($file, '/.')) {
            if (!file_exists($toExPath)) {
                $oldmask = umask(0);
                mkdir("$toExPath", 0777, true);
                umask($oldmask);
            }
        }
    }

    /* This foreach copies files from PROD/UAT into the target path ($toExPath). */
    foreach ($sendToArray as $fPath) {
        $fromExPath = $prodBasePath . $fPath;
        $toExPath = $basepath . $toFoldr . $fPath;

        if (strpos($fromExPath, '.php') || strpos($fromExPath, '.js') || strpos($fromExPath, '.css')) {
            if (file_exists($fromExPath)) {
                copy($fromExPath, $toExPath);
                print_r('Files copied to ' . $toExPath . ".\n");
            } else {
                print_r($fromExPath . " ::: Read path not found.\n");
            }
        }
    }
}
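The key point is that copy() will not create missing directories. A more compact sketch of the same fix (the paths here are illustrative): build the destination directory with mkdir()'s recursive flag before copying.

$from = '/var/www/html/TestingBackupCron/sijucheckfiles/some/dir/example.php';   // illustrative source
$to   = '/var/www/html/TestingBackupCron/axisdirectcheck/some/dir/example.php';  // illustrative target

$toDir = dirname($to);
if (!is_dir($toDir)) {
    mkdir($toDir, 0777, true);   // the third argument creates intermediate folders recursively
}
copy($from, $to);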
Thanks..

Extract a folder and then search for specific file in PHP

I'm building a PHP program which uploads a zip file, extracts it, and generates a link to a specific file in the extracted folder. Uploading and extracting the folder works fine. Now I'm a bit stuck on what to do next: I have to address the just-extracted folder and find the (only) HTML file that is in it. Then a link to that file has to be generated.
Here is the code I'm currently using:
$zip = new ZipArchive();
if ($zip->open($_FILES['zip_to_upload']['name']) === TRUE) {
    $folderName = trim($zip->getNameIndex(0), '/');
    $zip->extractTo(getcwd());
    $zip->close();
} else {
    echo 'There was an error extracting the file';
}
$dir = getcwd();
$scandir = scandir($dir);

foreach ($scandir as $key => $value) {
    if (!in_array($value, array(".", ".."))) { // filter the . and .. directories on Linux systems
        if (is_dir($dir . DIRECTORY_SEPARATOR . $value) && $value == $folderName) {
            foreach (glob($value . "/*.html") as $filename) {
                $htmlFiles[] = $filename; // this is for later use
                echo "<a href='" . SK_PICS_SRV . DIRECTORY_SEPARATOR . $filename . "'>" . SK_PICS_SRV . DIRECTORY_SEPARATOR . $filename . "</a>";
            }
        }
    }
}
So this code seems to be working. I just noticed a rather strange problem: $zip->getNameIndex(0) behaves differently depending on the program that created the zip file. When I make a zip file with 7-Zip, everything works without a problem; $folderName contains the right name of the main folder which I just extracted, for example "folder 01". But when I zip the exact same folder (same structure and same contained files) with the normal Windows zip program, $zip->getNameIndex(0) contains the wrong value, for example something like "folder 01/images/" or "folder 01/example.html". So it seems to read the zip file differently / in a wrong way. Do you guys know where that error comes from or how I can avoid it? This really seems strange to me.
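One way to sidestep the ordering difference is not to rely on entry 0 at all, and instead derive the top-level folder from the first path segment of every entry. A rough sketch of that idea (not the original code):

$zip = new ZipArchive();
if ($zip->open($_FILES['zip_to_upload']['name']) === TRUE) {
    $folderName = null;
    for ($i = 0; $i < $zip->numFiles; $i++) {
        // "folder 01/images/pic.png" -> "folder 01"
        $topLevel = strtok($zip->getNameIndex($i), '/');
        if ($folderName === null) {
            $folderName = $topLevel;
        } elseif ($folderName !== $topLevel) {
            $folderName = null; // more than one top-level entry; handle as needed
            break;
        }
    }
    $zip->extractTo(getcwd());
    $zip->close();
}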
Because you specify the extract path yourself, you can try finding your file with PHP's glob() function; have a look at the manual:
Glob
This function returns the names of the files matching the search pattern, so together with your extract path you have your link to the file.
$dir = "../../suedkurier/werbung/"
$scandir = scandir($dir);
foreach ($scandir as $key => $value)
{
if (!in_array($value,array(".",".."))) //filter . and .. directory on linux-systems
{
if (is_dir($dir . DIRECTORY_SEPARATOR . $value))
{
foreach (glob($dir . DIRECTORY_SEPARATOR . $value . "/*.html") as $filename) {
$files[] = $value . DIRECTORY_SEPARATOR $filename;
}
}
}
}
The matched files will now be saved in the array $files (with their subfolder), so you can build each path like this:
foreach ($files as $file) {
    echo $dir . DIRECTORY_SEPARATOR . $file;
}
$dir = "the/Directory/You/Extracted/To";
$files1 = scandir($dir);
foreach($files1 as $str)
{
if(strcmp(pathinfo($str, PATHINFO_EXTENSION),"html")===0||strcmp(pathinfo($str, PATHINFO_EXTENSION),"htm")===0)
{
echo $str;
}
}
Get an array of each file in the directory, check the extension of each one for htm/html, then echo the name if true.

PHP - Moving multiple files with different file names to their own directories

Hi, I wonder if you can help.
I'm looking to do this in PHP. I have a number of files that look like this:
"2005532-JoePharnel.pdf"
and
"1205121-HarryCollins.pdf"
Basically, I want to write PHP code so that when someone FTP-uploads those files to the upload folder, it will 1) create a directory named after the person if it doesn't already exist, and 2) move the files into the correct directory (e.g. JoePharnel's files into the JoePharnel directory, ignoring the number at the beginning).
I have looked through a lot of code, found the following, and adapted it:
<?php
$attachments = array();
preg_match_all('/([^\[]+)\[([^\]]+)\],?/', $attachments, $matches, PREG_SET_ORDER);

foreach ($matches as $file) {
    $attachments[$file[1]] = $file[2];
}

foreach ($attachments as $file => $filename) {
    $uploaddir = "upload" . $file;
    $casenumdir = "upload/JoePharnel" . $CaseNumber;
    $newfiledir = "upload/JoePharnel" . $CaseNumber . '/' . $file;
    $each_file = $CaseNumber . '/' . $file;

    if (is_dir($casenumdir) == false) {
        mkdir("$casenumdir", 0700); // create directory if it does not exist
    }

    if (file_exists($casenumdir . '/' . $file) == false) {
        chmod($uploaddir, 0777);
        copy($uploaddir, $newfiledir);
    }

    $allfiles = $CaseNumber . '/' . $file . "[" . $filename . "]" . ",";
    $filelistfinished = preg_replace("/,$/", "", $allfiles);
    echo $filelistfinished;
    // displays multiple file attachments on the page as: casenumber/testfile1.pdf[testfile1.pdf],casenumber/testfile2.pdf[testfile2.pdf],
}
?>
Sorry for the state of the code, but it's the best I could find.
Any help is much appreciated.
Thanks.
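For reference, a minimal sketch of the described flow (not the poster's code; the upload path is an illustrative assumption): split each file name on the first "-", create a per-person directory, and move the file into it.

$uploadDir = __DIR__ . '/upload';                          // illustrative upload folder

foreach (glob($uploadDir . '/*.pdf') as $path) {
    $basename = basename($path);                           // e.g. "2005532-JoePharnel.pdf"
    $dashPos = strpos($basename, '-');
    if ($dashPos === false) {
        continue;                                          // skip files without a number prefix
    }
    $person = pathinfo(substr($basename, $dashPos + 1), PATHINFO_FILENAME); // "JoePharnel"
    $targetDir = $uploadDir . '/' . $person;

    if (!is_dir($targetDir)) {
        mkdir($targetDir, 0700, true);                     // create the per-person directory
    }
    rename($path, $targetDir . '/' . $basename);           // move the file into it
}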

Can I use php to update an outdated script with a ZIP packed script from an external domain?

I am writing a script and I was looking into providing some sort of tool that updates the script when the client requests it.
The project logic is simple:
The client server sends a request to the mother server (which stores the updated scripts in ZIP files) and downloads the .ZIP file from it.
The client server receives the .ZIP file and unpacks it, overwriting the outdated script files.
Can someone give me a basic guide on what I need to do to implement such a function?
I can use PclZip to create/extract archives, but I do not know where to start.
Thanks for any help, appreciated.
You will need to use FTP (using PHP) to upload the files.
For example, in Joomla we can do the following:
jimport('joomla.client.ftp');

$ftp = JFTP::getInstance($server['ip'], 21, null, $server['user'], $server['password']);

try {
    foreach ($files as $file) {
        $fullpath = JPATH_SITE . $file;
        $this->createdirs($ftp, $rootdir . $file);
        if ($ftp->store($fullpath, $rootdir . $file) == false) {
            throw new Exception("Cannot transfer file " . $file);
        }
    }
    foreach ($adminfiles as $file) {
        $fullpath = JPATH_SITE . $file;
        $this->createdirs($ftp, $rootdir . $file);
        if ($ftp->store($fullpath, $rootdir . $file) == false) {
            throw new Exception("Cannot transfer file " . $file);
        }
    }
} catch (Exception $e) {
    $ftp->quit();
    die($e->getMessage());
}
$ftp->quit();
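For the download-and-unpack side described in the question, a minimal sketch using ZipArchive (the URL and paths are illustrative assumptions, not a real update server; file_get_contents on a URL also assumes allow_url_fopen is enabled):

$updateUrl = 'https://updates.example.com/latest.zip';     // illustrative mother-server URL
$tmpZip = sys_get_temp_dir() . '/update.zip';

// 1) download the packaged update from the mother server
if (file_put_contents($tmpZip, file_get_contents($updateUrl)) === false) {
    die('Could not download the update package.');
}

// 2) unpack it over the outdated script files
$zip = new ZipArchive();
if ($zip->open($tmpZip) === TRUE) {
    $zip->extractTo(__DIR__);   // overwrites existing files with the same names
    $zip->close();
    unlink($tmpZip);
} else {
    die('Could not open the update package.');
}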

PHP / Windows - Opendir() fails opening subdirectories within symbolic linked directories

Does anyone know a solution to this problem? I'm unable to open a subdirectory within a symbolically linked directory. I've confirmed that the paths are correct (I even copied and pasted the path into Explorer, which parsed it fine). This is a strange, annoying bug :|
Example:
C:\folder\symbolic_link\dir1\dir2 - opening dir2 fails.
C:\folder\symbolic_link\dir1 - works
C:\folder\real_directory\dir1\dir2 - works
C:\folder\real_directory\dir1 - works
Alright, I finally found a hack to solve this bug in php's handling of symlinks on windows. The bug occurs when recursively iterating through files/directories using opendir(). If a symlink to a directory exists in the current directory, opendir() will fail to read the directories in the directory symlink. It is caused by something funky in php's statcache, and can be resolved by calling clearstatcache() before calling opendir() on the directory symlink (also, the parent directory's file-handle must be closed).
Here is an example of the fix:
<?php
class Filesystem
{
    public static function files($path, $stats = FALSE)
    {
        clearstatcache();
        $ret = array();
        $handle = opendir($path);
        $files = array();

        // Store files in the directory; subdirectories can't be read until the current handle is closed & the statcache cleared.
        while (FALSE !== ($file = readdir($handle))) {
            if ($file != '.' && $file != '..') {
                $files[] = $file;
            }
        }

        // Handle _must_ be closed before the statcache is cleared; cache from open handles won't be cleared!
        closedir($handle);

        foreach ($files as $file) {
            clearstatcache($path);
            if (is_dir($path . '/' . $file)) {
                $dir_files = self::files($path . '/' . $file);
                foreach ($dir_files as $dir_file) {
                    $ret[] = $file . '/' . $dir_file;
                }
            } else if (is_file($path . '/' . $file)) {
                $ret[] = $file;
            }
        }
        return $ret;
    }
}

var_dump(Filesystem::files('c:\\some_path'));
Edit: It seems that clearstatcache($path) must be called before any file-handling functions on the symlinked dir. PHP isn't caching symlinked dirs properly.
