Laravel - moving file causing error - php

I am trying to copy a file from one directory to another as follows:
$files = File::allFiles($temp);
foreach ($files as $f)
{
    $f->move($destination, File::name($f));
}
I am getting this error:
Call to undefined method Symfony\Component\Finder\SplFileInfo::move()
In the following line:
$f->move($destination, File::name($f));
It seems like $f is not being detected as a file, because every time I try to use any of its methods, such as
getClientOriginalName()
I get an error. It is as if $f is not being registered as a file.
Another thing to keep in mind: I don't know the name of the file, which is why I get all of the files in the directory (there is only ever one at a time).

Try this:
File::get($f)
Let me know what happens.

Change this line of code:
$files = File::allFiles($temp);
to
$files = Input::file('files');

Try:
$files = File::files($temp);
foreach ($files as $file)
{
    File::move($file, $destination . basename($file));
    // [or]
    // rename($file, $destination . basename($file));
}
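Alternatively, the original File::allFiles() loop can be kept: File::allFiles() returns Symfony SplFileInfo objects, which have no move() method, but they do expose the path information that Laravel's File::move() needs. A minimal sketch, assuming $temp and $destination are the existing directory paths from the question:
$files = File::allFiles($temp);
foreach ($files as $f)
{
    // getRealPath() gives the absolute source path, getFilename() the plain file name.
    File::move($f->getRealPath(), $destination . '/' . $f->getFilename());
}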

Related

Use copy() in a foreach loop to copy multiple files php

I'm trying to copy multiple files from one domain on a web server to another using copy() and looping through a list of files, but it's only copying the last file on the list.
Here is the contents of files-list.txt:
/templates/template.php
/admin/admin.css
/admin/codeSnippets.php
/admin/editPage.php
/admin/index.php
/admin/functions.php
/admin/style.php
/admin/editPost.php
/admin/createPage.php
/admin/createPost.php
/admin/configuration.php
This script runs on the website that I'm trying to copy the files to. Here's the script:
$filesList = file_get_contents("http://copyfromhere.com/copythesefiles/files-list.txt");
$filesArray = explode("\n", $filesList);
foreach($filesArray as $file) {
    $filename = trim('http://copyfromhere.com/copythesefiles' . $file);
    $dest = "destFolder" . $file;
    if(!@copy($filename, $dest))
    {
        $errors = error_get_last();
        echo "COPY ERROR: ".$errors['type'];
        echo "<br />\n".$errors['message'];
    } else {
        echo "$filename copied to $dest from remote!<br/>";
    }
}
I get the affirmative message for each and every file individually just as I should, but when I check the directory, only the last file from files-list.txt is there. I've tried changing the order, so I know the problem lies with the script, not any individual file.
The output from the echo statements looks something like this:
http://copyfromhere.com/copythesefiles/admin/admin.css copied to updates/admin/editPage.php from remote!
http://copyfromhere.com/copythesefiles/admin/admin.css copied to updates/admin/editPost.php from remote!
http://copyfromhere.com/copythesefiles/admin/admin.css copied to updates/admin/index.php from remote!
Etc
I've modified your code slightly, and tested it on my local dev server. The following seems to work:
$fileURL = 'http://copyfromhere.com/copythesefiles';
$filesArray = file("$fileURL/files-list.txt", FILE_IGNORE_NEW_LINES);
foreach ($filesArray as $file) {
    $fileName = "$fileURL/$file";
    $dest = str_replace($fileURL, 'destFolder', $fileName);
    if (!copy($fileName, $dest)) {
        $errors = error_get_last();
        echo "COPY ERROR: ".$errors['type'];
        echo "<br />\n".$errors['message'];
    } else {
        echo "$fileName copied to $dest from remote!<br/>";
    }
}
This uses the same fix that Mark B pointed out, but also consolidates the code a little.
Unless the data you're fetching from that remote site has a leading / in the path/filename, you're not generating proper paths:
$file = 'foo.txt'; // example only
$dest = "destFolder" . $file;
produces destFolderfoo.txt, and you end up littering your script's working directory with a bunch of wonky filenames. Perhaps you wanted
$dest = 'destFolder/' . $file;
^----note this
instead.

How to get contents of a file in Laravel 5.1

I am trying to get the contents of all the files in my directory and I am getting an error that says:
ErrorException in Util.php line 114:
preg_match() expects parameter 2 to be string, array given
Below is the code that I am using.
public function store(Request $request)
{
    $directory = storage_path('app/xmlentries/uploads');
    $files = File::files($directory);
    foreach ($files as $file)
    {
        $contents = Storage::get($file);
        dd($contents);
    }
}
How would I get the contents of all the files in this folder?
Try this:
$directory = storage_path('app/xmlentries/uploads/');
foreach (glob($directory . "*") as $file) {
    $fileContent = file_get_contents($file);
    dd($fileContent); // change this per your need
}
Please note that dd() will dump the first file's contents and then stop execution!
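If the goal is to read every file rather than only dump the first one, a small variation (just a sketch, assuming the same uploads directory) collects the contents into an array keyed by file name:
$directory = storage_path('app/xmlentries/uploads/');
$contents = array();
foreach (glob($directory . "*") as $file) {
    // Key each entry by its base name so the caller can tell the files apart.
    $contents[basename($file)] = file_get_contents($file);
}
// $contents now holds the text of every file in the folder.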

How to move/copy a file that has a wildcard in the name in PHP?

I am trying to copy a file that I downloaded. The file name is test1234.txt, but I want to access it using a wildcard like this: test*.txt, and after that move it to another folder (I don't know exactly what the file name will look like, but I know that it begins with test and the rest changes every time I download a new one). I tried some code:
$myFile = 'C:/Users/Carl/Downloads/'. date("y-m-d") . '/test*.txt';
$myNewFile = 'C:/Users/Carl/Downloads/'. date("y-m-d").'/text.xml';
if(preg_match("([0-9]+)", $myFile)) {
    echo 'ok';
    copy($myFile, $myNewFile);
}
I am getting an error because of the * in $myFile. Any help is much appreciated.
$myFile= 'C:/Users/Carl/Downloads/'. date("y-m-d") . '/test*.txt';
$myNyFile = 'C:/Users/Carl/Downloads/'.date("y-m-d").'/test.txt';
foreach (glob($myFile) as $fileName) {
    copy($fileName, $myNyFile);
}
For a more complete answer, if you want to move only the *.txt files into NewFolder:
$myFiles = 'C:/Users/Carl/Downloads/*.txt';
$myFolderDest = 'C:/Users/Carl/NewFolder/';
foreach (glob($myFiles) as $file) {
    copy($file, $myFolderDest . basename($file));
}
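Since the question is about moving rather than copying, a variation using rename() (a sketch with the same example paths) removes each file from the source folder as well, which is effectively a move:
$myFiles = 'C:/Users/Carl/Downloads/*.txt';
$myFolderDest = 'C:/Users/Carl/NewFolder/';
foreach (glob($myFiles) as $file) {
    // rename() copies the file to the destination and removes the original, i.e. a true move.
    rename($file, $myFolderDest . basename($file));
}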

On creating a zip file with PHP I get two files instead of one

I'm struggling with a simple piece of PHP functionality: creating a ZIP archive with some files in it.
The problem is that it does not create only one file called filename.zip, but two files called filename.zip.a07600 and filename.zip.b07600. Please see the following screenshot:
The two files are perfect in size, and I can even rename each of them to filename.zip and extract it without any problems.
Can anybody tell me what is going wrong?
function zipFilesAndDownload_Defect($archive_file_name, $archiveDir, $file_path = array(), $files_array = array()) {
    // Archive File Name
    $archive_file = $archiveDir."/".$archive_file_name;
    // Time-to-live
    $archiveTTL = 86400; // 1 day
    // Delete old zip file
    @unlink($archive_file);
    // Create the object
    $zip = new ZipArchive();
    // Create the file and throw the error if unsuccessful
    if ($zip->open($archive_file, ZIPARCHIVE::CREATE) !== TRUE) {
        $response->res = "Cannot open '$archive_file'";
        return $response;
    }
    // Add each file of $file_name array to archive
    $i = 0;
    foreach($files_array as $value){
        $expl = explode("/", $value);
        $file = $expl[(count($expl)-1)];
        $path_file = $file_path[$i] . "/" . $file;
        $size = round((filesize($path_file) / 1024), 0);
        if(file_exists($path_file)){
            $zip->addFile($path_file, $file);
        }
        $i++;
    }
    $zip->close();
    // Then send the headers to redirect to the ZIP file
    header("HTTP/1.1 303 See Other"); // 303 is technically correct for this type of redirect
    header("Location: $archive_file");
    exit;
}
The code that calls the function is in a file with a switch-case; it is itself triggered by an AJAX call:
case "zdl":
$files_array = array();
$file_path = array();
foreach ($dbh->query("select GUID, DIRECTORY, BASENAME, ELEMENTID from SMDMS where ELEMENTID = ".$osguid." and PROJECTID = ".$osproject.";") as $subrow) {
$archive_file_name = $subrow['ELEMENTID'].".zip";
$archiveDir = "../".$subrow['DIRECTORY'];
$files_array[] = $archiveDir.DIR_SEPARATOR.$subrow['BASENAME'];
$file_path[] = $archiveDir;
}
zipFilesAndDownload_Defect($archive_file_name, $archiveDir, $file_path, $files_array);
break;
One more piece of code... I tried to rename the latest 123456.zip.a01234 file to 123456.zip and then unlink the old 123456.zip.a01234 file (and all previously added .a01234 files) with this function:
function zip_file_exists($pathfile){
    $arr = array();
    $dir = dirname($pathfile);
    $renamed = 0;
    foreach(glob($pathfile.'.*') as $file) {
        $path_parts = pathinfo($file);
        $dirname = $path_parts['dirname'];
        $basename = $path_parts['basename'];
        $extension = $path_parts['extension'];
        $filename = $path_parts['filename'];
        if($renamed == 0){
            $old_name = $file;
            $new_name = str_replace(".".$extension, "", $file);
            @copy($old_name, $new_name);
            @unlink($old_name);
            $renamed = 1;
            //file_put_contents($dir."/test.txt", "old_name: ".$old_name." - new_name: ".$new_name." - dirname: ".$dirname." - basename: ".$basename." - extension: ".$extension." - filename: ".$filename." - test: ".$test);
        }else{
            @unlink($file);
        }
    }
}
In short: copy() works, rename() didn't work, and unlink() doesn't work at all... I'm out of ideas now. :(
ONE MORE TRY: I placed the output of $zip->getStatusString() in a variable and wrote it to a log file... the log entry it produced is: "Renaming temporary file failed: No such file or directory".
But as you can see in the graphic above, the file 43051221.zip.a07200 is located in the directory where the zip library opens it temporarily.
Thank you in advance for your help!
So, after struggling with this for days... it was so simple:
I normally work only on *nix servers, so in my scripts I created the folders dynamically with 0777 permissions. I didn't know that IIS doesn't accept this permissions format at all!
So I remoted onto the server, right-clicked on the Documents folder (the topmost folder of all the dynamically added files and folders) and gave full control to all the users I found.
Now it works perfectly! The only thing that would be interesting now is: is this dangerous for any reason?
Thanks for your well-meaning answers...
My suspicion is that your script is hitting the PHP script timeout. ZipArchive creates a temporary file to zip into, whose filename is yourfilename.zip.some_random_number. This file is renamed to yourfilename.zip when the zip file is closed. If the script times out, the temporary file will probably just get left there.
Try reducing the number of files to zip, or increasing the script timeout with set_time_limit():
http://php.net/manual/en/function.set-time-limit.php
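For illustration, a rough sketch of raising the limit before the archive is built; the 300 seconds is arbitrary (0 would remove the limit entirely), and $archive_file / $files_array are assumed to be the same variables as in the question's function:
set_time_limit(300); // give the zipping step more time to finish
$zip = new ZipArchive();
if ($zip->open($archive_file, ZipArchive::CREATE) === TRUE) {
    foreach ($files_array as $path) {
        if (file_exists($path)) {
            $zip->addFile($path, basename($path));
        }
    }
    $zip->close(); // the temporary filename.zip.xxxxxx is renamed to filename.zip here
}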

Check if file exists in .tar using PHP

In my program I need to read .png files from a .tar file.
I am using pear Archive_Tar class (http://pear.php.net/package/Archive_Tar/redirected)
Everything is fine if the file I'm looking for exists, but if it is not in the .tar file then the function times out after 30 seconds. The class documentation states that it should return null if it does not find the file...
$tar = new Archive_Tar('path/to/mytar.tar');
$filePath = 'path/to/my/image/image.png';
$file = $tar->extractInString($filePath); // This works fine if the $filePath is correct
// if the path to the file does not exists
// the script will timeout after 30 seconds
var_dump($file);
return;
Any suggestions on solving this or any other library that I could use to solve my problem?
The listContent method will return an array of all files (and other information about them) present in the specified archive. So if you first check whether the file you wish to extract is present in that array, you can avoid the delay that you are experiencing.
The code below isn't optimised (for multiple calls extracting different files, for example, the $files array should only be populated once), but it is a good way forward.
include "Archive/Tar.php";
$tar = new Archive_Tar('mytar.tar');
$filePath = 'path/to/my/image/image.png';
$contents = $tar->listContent();
$files = array();
foreach ($contents as $entry) {
    $files[] = $entry['filename'];
}
$exists = in_array($filePath, $files);
if ($exists) {
    $fileContent = $tar->extractInString($filePath);
    var_dump($fileContent);
} else {
    echo "File $filePath does not exist in archive.\n";
}
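If several files need to be pulled out of the same archive, the listing can be built once and reused for every lookup, roughly like this (a sketch against the same Archive_Tar API; the .png paths are only placeholders):
include "Archive/Tar.php";
$tar = new Archive_Tar('mytar.tar');
// Build the list of file names once.
$names = array();
foreach ($tar->listContent() as $entry) {
    $names[] = $entry['filename'];
}
// Reuse it for as many lookups as needed.
foreach (array('path/to/image1.png', 'path/to/image2.png') as $wanted) {
    if (in_array($wanted, $names)) {
        $fileContent = $tar->extractInString($wanted);
        var_dump($fileContent);
    } else {
        echo "File $wanted does not exist in archive.\n";
    }
}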
