I have folders on the server that contain image files. I'm trying to get those files and then upload them further, but I think I'm using the wrong function.
My code:
$dir = "../uploads/".$folderimage."/";
if ($handle = opendir($dir)) {
    while (false !== ($entry = readdir($handle))) {
        if ($entry != "." && $entry != "..") {
            echo "$entry\n";
            $handle = fopen($entry,"wb");
            $mug->images_upload(array( "AlbumID" => "#####", "File" => $handle));
        }
    }
    closedir($handle);
}
Not sure what I am doing, all I need to do is pass the file to the class function $mug->images->upload. It works from a $_POST request but I need to move the files already uploaded to the folder.
It's a bit tricky to emulate a file upload POST request for your $mug object. You would be better off if you could refactor the code in $mug as follows:
$mug fetches the uploaded file and puts it to its destination place.
You create a new function that implements the processing you wish to use here and in $mug.
Call this function from here, and from $mug with the appropriate filename.
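A minimal sketch of that refactor, assuming a hypothetical storeImage() helper (the name and its body are placeholders, not part of phpSmug; the demo directory replaces the original $folderimage path so the sketch runs on its own):

```php
<?php
// Hypothetical sketch of the suggested refactor: one function holds the
// shared processing, and both the POST handler and this batch loop call it.
function storeImage(string $sourcePath, string $albumId): bool
{
    // ...validate, resize, move into place, register in $albumId, etc.
    return is_readable($sourcePath);
}

// Batch path: walk a directory of already-uploaded files.
$dir = sys_get_temp_dir() . "/uploads-demo/";
@mkdir($dir);
file_put_contents($dir . "photo.jpg", "fake image bytes");

foreach (new FilesystemIterator($dir) as $fileinfo) {
    var_dump(storeImage($fileinfo->getPathname(), "#####")); // bool(true) for the demo file
}
```

The POST handler in $mug would call the same storeImage() with the freshly uploaded file's path, so the processing logic lives in exactly one place.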
If $mug->images_upload is expecting the file path then pass $dir.$entry to it
<?php
$dir = "../uploads/".$folderimage."/";
if ($handle = opendir($dir)) {
    while (false !== ($entry = readdir($handle))) {
        if ($entry != "." && $entry != "..") {
            echo $entry.PHP_EOL;
            $mug->images_upload(array("AlbumID" => "#####", "File" => $dir.$entry));
        }
    }
    closedir($handle);
}

// Or a simpler (but slightly slower) way. Note that glob() already
// returns each path with $dir prepended, so don't prepend it again.
foreach (glob($dir."*.{png,jpg,gif}", GLOB_BRACE) as $file) {
    echo $file.PHP_EOL;
    $mug->images_upload(array("AlbumID" => "#####", "File" => $file));
}
?>
There are a number of apparent issues with the code that you have posted.
$dir = "../uploads/".$folderimage."/";
if ($handle = opendir($dir)) {
    while (false !== ($entry = readdir($handle))) {
        if ($entry != "." && $entry != "..") {
            echo "$entry\n";
            $handle = fopen($entry,"wb");
            $mug->images_upload(array( "AlbumID" => "#####", "File" => $handle));
        }
    }
    closedir($handle);
}
Clashing variables.
You have used the $handle variable to store the directory handle, only to overwrite it later with a file resource inside the loop. As soon as it is overwritten, the next call to readdir($handle) no longer makes sense, because you are now calling the function on a file resource. This can easily lead to an infinite loop: when readdir() is given rubbish it may return NULL, and since NULL !== false, the while condition stays true forever.
Incorrect path for fopen()
Given $folderimage = "images" and $entry = "photo.jpg", then the fopen() line will try to open the image at photo.jpg rather than ../uploads/images/photo.jpg. You likely wanted to use something like $dir . $entry, but read on as you shouldn't be using fopen() at all.
Incorrect usage of the phpSmug library
The File argument must be a string containing "the path to the local file that is being uploaded" (source). You instead try to pass a file resource from fopen() to it.
opendir()/readdir() for directory iteration is ancient
There are better ways to traverse directory contents in PHP. As mentioned in Lawrence's answer, glob() might be useful.
I would also advocate using the filesystem iterator from the SPL.
$dir = "../uploads/$folderimage/";
foreach (new FilesystemIterator($dir) as $fileinfo) {
    $image_pathname = $fileinfo->getPathname();
    $mug->images_upload("AlbumID=#####", "File=$image_pathname");
}
Related
I have the below code to get the contents of a remote directory.
$dirHandle = opendir("ssh2.sftp://$sftp/".PATH_OUT);
while (false !== ($file = readdir($dirHandle))) {
// something...
}
Now, the thing is, the above code is inside a for loop. When I put $dirHandle = opendir("ssh2.sftp://$sftp/".PNB_PATH_OUT); outside of the for loop, it gives the required result only for the first iteration; readdir is obviously not working for the second iteration of the for loop.
How can I do this in such a way that I call opendir only once and reuse that handle more than one time?
Required Solution
$dirHandle = opendir("ssh2.sftp://$sftp/".PATH_OUT);
for(...){
    while (false !== ($file = readdir($dirHandle))) {
        // something...
    }
}
Your while loop is traversing the entire directory until there are no more files, in which case readdir returns false. Therefore any time readdir is called after the first traversal, it will just return false because it is already at the end of the directory.
You could use rewinddir() in the for loop to reset the pointer of the directory handle to the beginning.
$dirHandle = opendir("ssh2.sftp://$sftp/".PATH_OUT);
for(...){
    rewinddir($dirHandle);
    while (false !== ($file = readdir($dirHandle))) {
        // something...
    }
}
Since the SFTP stream appears not to support seeking, you should instead store the results you need and run the for loop after the while loop. You are, after all, traversing the same directory multiple times.
$dirHandle = opendir("ssh2.sftp://$sftp/".PATH_OUT);
$files = array();
while (false !== ($file = readdir($dirHandle))) {
    $files[] = $file;
}
for(...){
    // use the $files array
}
I have created a directory with some files in there:
index.php
one.txt
two.txt
three.txt
four.txt
In the index.php page, I am currently using this code to echo out all of the files within the directory:
<?php
$blacklist = array("index.php");
if ($handle = opendir('.')) {
    while (false !== ($entry = readdir($handle))) {
        if ($entry != "." && $entry != ".." && !in_array($entry, $blacklist)) {
            echo "$entry\n";
        }
    }
    closedir($handle);
}
?>
Now, if anyone views the index.php page, this is what they'll see:
one.txt two.txt three.txt four.txt
As you can see from the PHP code, index.php is blacklisted so it is not echoed out.
However, I would like to go a step further than this and echo out the contents of each text file rather than the filenames. With the new PHP code (that I need help with creating), whenever someone visits the index.php page, this is what they'll now see:
(Please ignore what is in the asterisks, they are not a part of the code, they just indicate what each text file contains)
Hello ** this is what the file **one.txt** contains **
ok ** this is what the file **two.txt** contains **
goodbye ** this is what the file **three.txt** contains **
text ** this is what the file **four.txt** contains **
Overall:
I would like to echo out the contents of every file in the directory (they are all text files) aside from index.php.
You could use file_get_contents to put the file into a string.
<?php
$blacklist = array("index.php");
if ($handle = opendir('.')) {
    while (false !== ($entry = readdir($handle))) {
        if ($entry != "." && $entry != ".." && !in_array($entry, $blacklist)) {
            echo "$entry " . file_get_contents($entry) . "\n";
        }
    }
    closedir($handle);
}
?>
Furthermore, you could use PHP's glob function to select only the .txt files; that way you do not have to blacklist files if you're going to be adding more files to that directory that need to be ignored.
Here is how it would be done using the glob function.
<?php
foreach (glob("*.txt") as $filename) {
    echo "$filename " . file_get_contents($filename) . "\n";
}
?>
This will print the contents of the files. If the files are not in the current directory, prepend the path to $entry, and you may also want to write some kind of boundary between the files' contents.
<?php
$blacklist = array("index.php");
if ($handle = opendir('.')) {
    while (false !== ($entry = readdir($handle))) {
        if ($entry != "." && $entry != ".." && !in_array($entry, $blacklist)) {
            echo file_get_contents($entry) . "\n";
        }
    }
    closedir($handle);
}
?>
I hope this helps you.
Never reinvent the wheel. Use composer.
Require symfony/finder (composer require symfony/finder):
use Symfony\Component\Finder\Finder;

class Foo
{
    public function getTextFileContents($dir)
    {
        $finder = (new Finder())->files()->name('*.txt');
        foreach ($finder->in($dir) as $file) {
            $contents = $file->getContents();
            // do something with the file contents...
        }
    }
}
I would give the SPL filesystem iterators a chance to accomplish a task like this:
$dir = '/home/mydirectory';
$rdi = new \RecursiveDirectoryIterator($dir, \FilesystemIterator::SKIP_DOTS);
$iterator = new \RecursiveIteratorIterator($rdi, \RecursiveIteratorIterator::CHILD_FIRST);
$iterator = new \RegexIterator($iterator, '/\.txt$/i');
foreach ($iterator as $file) {
    echo 'Contents of '.$file->getPathname().' is: ';
    echo file_get_contents($file->getPathname());
}
This will recursively find & iterate all .txt files in given directory, including sub-directories.
Since each $file in the iteration is an SplFileInfo instance, you can use all of its methods for additional checks, like $file->isLink() (true for symbolic links), $file->isReadable() (false for unreadable files), etc.
If you don't want to look into sub-folders, drop the recursive iterators and use the flat equivalents instead (note that DirectoryIterator does not accept flags, so FilesystemIterator is the non-recursive counterpart to use here):
$it = new \FilesystemIterator($dir, \FilesystemIterator::SKIP_DOTS);
$it = new \RegexIterator($it, '/\.txt$/i');
Hope it helps.
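As a small runnable sketch of those SplFileInfo checks (the throwaway demo directory is an assumption), skipping symlinks and unreadable entries before reading each file:

```php
<?php
// Demo setup: a temporary directory with one text file.
$dir = sys_get_temp_dir() . '/txt-demo';
@mkdir($dir);
file_put_contents($dir . '/a.txt', 'hello');

$it = new FilesystemIterator($dir, FilesystemIterator::SKIP_DOTS);
foreach ($it as $file) {
    // SplFileInfo methods let us skip anything we should not read.
    if (!$file->isFile() || $file->isLink() || !$file->isReadable()) {
        continue;
    }
    echo $file->getFilename() . ': ' . file_get_contents($file->getPathname()) . PHP_EOL;
}
```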
As #brock-b said, you could use glob to get the full list of files and file_get_contents to grab the contents:
$blacklist = array('index.php');
$files = glob('*.txt'); # could be *.* if needed
foreach ($files as $file) {
    if (!in_array(basename($file), $blacklist)) {
        echo file_get_contents($file);
    }
}
Note: the blacklist won't be hit since you're searching for *.txt files. It is only useful when doing an *.* or *.php file search.
I'm simply trying to loop through a folder of images and put them in an Amazon S3 bucket. I just can't figure out why it doesn't like the DATA part of the PUTFILE function.
If I try:
$res = $s3 -> putFile("test-bucket/".$entry);
It creates a blank file (0 bytes) in the bucket, so I know it's connecting OK, and the filenames are correct also. I've also tried looping through the folders files and echoing the filenames to the screen, without using the Zend S3 function, which proved this page can access the files behind root.
The documentation regarding this function says:
putFile($path, $object, $meta) puts the content of the file in $path
into the object named $object.
The optional $meta argument is the same as for putObject. If the
content type is omitted, it will be guessed basing on the source file
name.
This is what I've tried:
if ($handle = opendir('d:/web-library-photos/temp-previews')) {
    $loopCounter = 0;
    while (false !== ($entry = readdir($handle))) {
        ///
        $loopCounter = $loopCounter + 1;
        if ($loopCounter < 3) { // for test purposes only
            $localPath = "d:/web-library-photos/temp-previews/".$entry;
            $s3 = new Zend_Service_Amazon_S3($aws_access_key_id, $aws_s3_secret);
            $res = $s3->putFile($localPath, "test-bucket/".$entry);
            echo "Success for ".$entry.": " . ($res ? 'Yes' : 'No') . "<br>";
        }
        ///
    }
    closedir($handle);
}
This particular "try" yields the error:
Cannot read file d:/web-library-photos/temp-previews/.' in C:\Inetpub\vhosts\.....thispage.php
So I thought I would try file_get_contents() and readfile() but still no luck!
Please, please, put me straight on this one - it's driving me round the bend.
UPDATE
Right, I have a clue what's wrong but not how to solve it.
If I simply do this:
if ($handle = opendir('d:/web-library-photos/temp-previews')) {
    while (false !== ($entry = readdir($handle))) {
        echo $entry."<BR>";
    }
    closedir($handle);
}
I get a . and a .. at the beginning of my results, which obviously can't be read as filenames!! What is going on here, how do I get around it?
Add a check at the top of your while (false !== ($entry = readdir($handle))) loop to skip the . and .. entries:
while (false !== ($entry = readdir($handle))) {
    if ($entry == "." || $entry == "..") continue;
    /* rest of the code */
}
Every 72 hours I upload a new PHP file to my server (well, actually it is an XML file transformed on the server with PHP). Is there a method to create a link on an HTML page that links to the "new" PHP doc automatically every time a new file is uploaded?
I don't want to manually change the link every 72 hours. I would ultimately like to have an HTML page with a list of links to every new doc that is uploaded. I found this for images, but I need something like it for PHP files and links.
http://net.tutsplus.com/articles/news/scanning-folders-with-php/
Any help would be very appreciated.
I found a solution that adds links to the XML files. Now I just need to figure out how to automatically add a link referencing the XSLT sheet for each new XML file that is uploaded. I am not sure how to do this, so any help would be very welcome. Thanks for everyone's help.
<?php
$count = 0;
if ($handle = opendir('.')) {
    while (false !== ($file = readdir($handle))) {
        if ($file != "." && $file != "..") {
            $count++;
            print('<a href="'.$file.'">'.$file.'</a><br />'."\n");
        }
    }
    echo '<br /><br />Return';
    closedir($handle);
}
?>
To read in a directory of files and then sort them by upload time you can just use:
$files = glob("files/*.xml");
$files = array_combine($files, array_map("filemtime", $files));
arsort($files);
print "link: " . current($files); // make that an actual <a href=
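Wrapped into a function, a sketch of the full link list (the directory name, the *.xml pattern, and the href format are assumptions for illustration):

```php
<?php
// Sketch: render every *.xml file in a directory as a link, newest first.
function xmlLinks(string $dir): array
{
    $files = glob($dir . "/*.xml") ?: [];
    $files = array_combine($files, array_map("filemtime", $files));
    arsort($files); // newest upload first

    $links = [];
    foreach (array_keys($files) as $path) {
        $name = htmlspecialchars(basename($path));
        $links[] = '<a href="' . $name . '">' . $name . '</a>';
    }
    return $links;
}

echo implode("<br />\n", xmlLinks("files"));
```

The first entry of the returned array is always the most recently uploaded file, so the "current" link never has to be edited by hand.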
You can do that pretty easily with PHP function readdir:
http://php.net/manual/en/function.readdir.php
Simply loop through the files in the directory where you upload files and have php output a link for each.
ie:
<?php
if ($handle = opendir('/path/to/upload_dir')) {
    while (false !== ($file = readdir($handle))) {
        if ($file != "." && $file != "..") {
            echo '<a href="http://www.example.com/upload_dir/' . $file . '">' . $file . '</a><br />';
        }
    }
    closedir($handle);
}
?>
You'll need to edit the http:// URL on the href to point to the correct download URL for your server, as well as the server path for opendir.
Hope that helps.
list by filetype
<?php
if ($handle = opendir('/path/to/dir')) {
    while (false !== ($file = readdir($handle))) {
        $ext = strtolower(pathinfo($file, PATHINFO_EXTENSION));
        if ($ext === 'php' || $ext === 'xml') {
            echo "<p>$file</p>";
        }
    }
    closedir($handle);
}
I am trying to write a php function to save and then display comments on an article.
In my save.php, I am formulating the file with:
$file = "article1/comments/file".time().".txt";
Then using fwrite() to write to a directory.
In my index I have:
if ($handle = opendir('article1/comments')) {
    while (false !== ($file = readdir($handle))) {
        if ($file != "." && $file != "..") {
            $files = array($file);
            sort($files);
            foreach ($files as $comments) {
                echo "<div class='message'>";
                readfile('article1/comments/'.$comments);
                echo "</div>";
            }
        }
    }
    closedir($handle);
}
For the most part this displays the comments in the correct order, but for some reason, some files are displaying out of order. Furthermore, if I change sort() to rsort(), there is no change in how they are displayed.
I presume this is because readfile() is not following the sorted array's order. So I am wondering for one, why readfile does not display the files in order from newest to oldest, and two, how can I make it display them correctly?
Thanks.
edit: I copied the directory of comments from the live site to my local xampp installation, and the comments are displayed in order locally, but using the same code on my site results in comments not being in order.
Take a look at DirectoryIterator; make sure to check the first comment on the manual page for DirectoryIterator's isFile() method, it should be enough to solve this question.
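A short sketch of that approach (the throwaway demo directory is an assumption); isFile() already filters out the . and .. entries, and sorting happens once, outside the read loop:

```php
<?php
// Demo setup: a temporary comments directory with one file.
$dir = sys_get_temp_dir() . '/comments-demo';
@mkdir($dir);
file_put_contents($dir . '/file1.txt', 'first comment');

$names = [];
foreach (new DirectoryIterator($dir) as $fileinfo) {
    if ($fileinfo->isFile()) {          // skips "." and ".." automatically
        $names[] = $fileinfo->getFilename();
    }
}
sort($names); // the filenames embed time(), so sorting gives oldest-first
foreach ($names as $name) {
    echo "<div class='message'>" . file_get_contents($dir . '/' . $name) . "</div>\n";
}
```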
Try this:
$files = array();
if ($handle = opendir('article1/comments')) {
    while (false !== ($file = readdir($handle))) {
        if ($file != "." && $file != "..") {
            $files[] = $file; // add the file to the array
        }
    }
    closedir($handle);
}

// proceed only if the array is not empty
if (count($files) > 0) {
    sort($files);
    foreach ($files as $comments) {
        echo "<div class='message'>";
        readfile('article1/comments/'.$comments); // note the trailing slash in the path
        echo "</div>";
    }//~foreach
}//~if
Please use a database for this kind of thing instead of flat files! Files are not really secure for this and perform poorly.