Get text data name list from folder on server - php

Not sure if this is possible or not?
I need the contents of a file directory on one server in order to make a photo gallery on another server.
Let's say I have a Folder on server1 named "folderName1" and the contents in the folder are images, like:
2005-14-05-this-that.jpg
2005-14-06-this-that.jpg
2005-14-07-this-that.jpg
2005-14-08-this-that.jpg
2005-14-09-this-that.jpg....
In order to use this gallery script, I need a text file with this information in it. Some folders have thousands of photos in them and it takes too long to write them all down.
Wondering if there is a shortcut to GET all the contents of a folder and spit them out into a text file?
Thanks!!

http://php.net/manual/en/function.readdir.php
Place a script on server1 (perhaps in each directory that has photos) called 'imagelist.php'. This script loops over all files using the function linked above and echoes every image filename on its own line.
Then server2 can request this file using file_get_contents(), loop over every line, and use the filenames to create a gallery.
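A minimal sketch of such an imagelist.php, assuming the images sit in the same directory as the script (the extension filter is illustrative):

```php
<?php
// imagelist.php — hypothetical script placed on server1 next to the photos.
// Echoes every image filename in this directory on its own line.

function list_images(string $dir): array {
    $images = [];
    foreach (scandir($dir) as $entry) {
        // Keep only common image extensions; "." and ".." fall through.
        if (preg_match('/\.(jpe?g|png|gif)$/i', $entry)) {
            $images[] = $entry;
        }
    }
    sort($images);
    return $images;
}

foreach (list_images(__DIR__) as $image) {
    echo $image, "\n";
}
```

On server2, something like `array_filter(explode("\n", file_get_contents('http://server1/folderName1/imagelist.php')))` would then yield the filename list to build the gallery from.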

If the server containing the images is under your control, you can have a PHP script list out all the image names using the readdir() function. Then you call this script from the other server and read/parse all the file names.
If you don't control the server hosting the files, then this is not really possible unless directory listing is enabled on that images directory.


how to avoid duplicate filename in wordpress

As we all know, WordPress stores your uploaded files (for me, it's just JPG files) in a folder named "uploads" under "wp-content". Files are separated into folders based on year and month.
Now I want to copy every file from every folder into a single folder on another server (for some purposes). I want to know: does WordPress rename duplicate files? Is it possible that my files will be overwritten on the new server?
If yes, how can I avoid this? Is there a way to make WordPress rename files before storing them?
You can scan your uploaded-files folder, and you have two options:
1.- Set a random name for each file
2.- Set a naming convention that includes the path in the file name, for example: my_path_my_filename.jpg
By the way, your files won't be overwritten by WordPress itself, since they are on another server.
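Option 2 could be sketched like this, copying every upload into one flat target folder with the year/month path baked into each name so duplicates cannot collide (directory names are illustrative):

```php
<?php
// Sketch: flatten wp-content/uploads into a single folder, encoding the
// relative path into the file name so "2021/05/a.jpg" and "2022/01/a.jpg"
// cannot overwrite each other.

function flatten_name(string $relativePath): string {
    // "2021/05/photo.jpg" -> "2021_05_photo.jpg"
    return str_replace(['/', '\\'], '_', $relativePath);
}

function copy_uploads(string $uploadsDir, string $targetDir): void {
    $iterator = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator($uploadsDir, FilesystemIterator::SKIP_DOTS)
    );
    foreach ($iterator as $file) {
        // Path relative to the uploads root, e.g. "2021/05/photo.jpg".
        $relative = substr($file->getPathname(), strlen($uploadsDir) + 1);
        copy($file->getPathname(), $targetDir . '/' . flatten_name($relative));
    }
}
```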
This question seems to be about export/import...
Check the exported XML (WordPress eXtended RSS file format); you can find all media URLs in the <wp:attachment_url> tags... Use any XML parser.
Example without a parser, at the terminal:
grep wp:attachment_url exportedSite.xml
will list all URLs. Each parsed URL can then be downloaded with curl or wget.
If you want to restore the XML backup, replace (only) the URLs in the wp:attachment_url tags with the new repository URLs.
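A parser-based version of the same extraction, sketched with PHP's SimpleXML; the wp namespace URI is read from the export file itself, since it varies between WXR versions:

```php
<?php
// Sketch: pull every <wp:attachment_url> out of a WordPress WXR export.

function attachment_urls(string $xml): array {
    $doc = new SimpleXMLElement($xml);
    // Register the "wp" prefix under whatever URI the export declares.
    foreach ($doc->getDocNamespaces(true) as $prefix => $uri) {
        if ($prefix === 'wp') {
            $doc->registerXPathNamespace('wp', $uri);
        }
    }
    $urls = [];
    foreach ($doc->xpath('//wp:attachment_url') as $node) {
        $urls[] = (string) $node;
    }
    return $urls;
}
```

Each returned URL can then be fetched with curl, wget, or file_get_contents().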

Save txt file in specific location

I have a recursive function which generates about 200 .txt files. I am running this application on a local server.
Basically, the front end just has a file upload field where you choose a .csv file, and the application then generates all the .txt files from it. But rather than saving them on the WAMP server, is it possible to save them in a specific location?
Example, if I put another field in my front end called 'fileLocation', and the user types in the pathname.
Obviously I'd have to check if it's a directory etc., but is it possible to, say, save all the files in:
/Volumes/computer/Users/username/Desktop/test/
I'm not sure where to proceed with this.
No, it is not possible for a localhost server to write to arbitrary locations on the client's computer this way. You could zip all the files and make the browser download the archive instead.
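A sketch of that zip-and-download approach using PHP's ZipArchive (the file names and headers are illustrative):

```php
<?php
// Sketch: bundle the generated .txt files into one zip, then send it to the
// browser so the user saves it wherever they like.

function zip_files(array $files, string $zipPath): void {
    $zip = new ZipArchive();
    $zip->open($zipPath, ZipArchive::CREATE | ZipArchive::OVERWRITE);
    foreach ($files as $file) {
        // Store each file under its bare name at the archive root.
        $zip->addFile($file, basename($file));
    }
    $zip->close();
}

// In the upload handler, after generating the files:
// zip_files(glob($outputDir . '/*.txt'), $tmpZip);
// header('Content-Type: application/zip');
// header('Content-Disposition: attachment; filename="generated.zip"');
// readfile($tmpZip);
```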

Calling All Files From A Different Directory Located Above Domain Name Directory

I am attempting to create a script in PHP which reads and includes all files from a directory which is above the domain name directory.
For example:
My domain name is example.com and located in /var/www/html/example.com/ and I want /var/www/html/example.com/file.php to be able to read from:
/var/www/html/videos/video1/ which contains index.html and the folders:
/var/www/html/videos/video1/images/ and /var/www/html/videos/video1/scripts/
e.g. www.example.com/file.php?dir=/var/www/html/videos/video1/index.html
If I use include('/var/www/html/videos/video1/index.html') it calls only the HTML file, which works perfectly. However, all the files in the images and scripts folders fail to load.
I don't want to copy or call each file separately. I want to be able to call only index.html and then make the browser think it's in that directory and automatically read any file within it.
I know this works because Moodle uses this method (in file.php) to protect learning files by storing them in a moodledata folder which is one level above the public folder.
I've had a look but cannot make sense of it and I've searched the Internet to achieve the method I have explained above but have not had any success.
The reason I want to do this is to avoid having duplicate video files on the server for other sites that are hosted on the same server.
Many thanks in advance and for taking the time to assist.
$dir = realpath(dirname(__FILE__)."/../");
This would be the directory you are looking for. Require files relative to that.
To output a different file, look at readfile(), which outputs straight to the output buffer, or use file_get_contents(), whose result can be held in a variable.
See if you can load an image in the browser directly; if you can, it's probably a problem with your code, and if you can't, it may be a rights issue.
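Putting those answers together, a file.php-style gateway could be sketched as below; the base directory is illustrative, and realpath() keeps requests from escaping it:

```php
<?php
// Sketch: resolve a requested asset under a fixed base directory that sits
// above the web root, refusing anything that escapes it.

function resolve_asset(string $baseDir, string $relative): ?string {
    $base = realpath($baseDir);
    $path = realpath($baseDir . '/' . $relative);
    // realpath() resolves "../" tricks; reject misses and escapes.
    if ($path === false || !is_file($path)
            || strncmp($path, $base, strlen($base)) !== 0) {
        return null;
    }
    return $path;
}

// In file.php:
// $path = resolve_asset('/var/www/html/videos', $_GET['dir'] ?? '');
// if ($path === null) { http_response_code(404); exit; }
// header('Content-Type: ' . (mime_content_type($path) ?: 'application/octet-stream'));
// readfile($path);
```

With a rewrite rule mapping e.g. /file.php?dir=video1/images/x.jpg onto relative asset URLs, index.html's own images and scripts can be served through the same gateway.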

.htaccess for images only upload

I'm developing a very simple PHP upload feature on my site to permit users to upload ONLY images. Apart from checking the MIME type through PHP, I wanted a .htaccess file to control what can be uploaded and what can't.
I want to put the .htaccess file in my root folder and from there write the rules for all the folders I need to control.
It's the first time I've worked with .htaccess, and from the internet I was able to find this:
http://pastebin.com/0KNHEbw0
But it doesn't work. I'm testing it locally with XAMPP on Win7, and I see that I can upload any type of file into the "oggetti" folder.
What is wrong here?
And then, to rule other folders should I write something like this?
http://pastebin.com/dFMUu1g0
Thank you in advance!
You can't control what files are uploaded through a .htaccess file: Apache, the web server parsing those commands, deals with serving the files only.
You will need to do these checks in the PHP script that handles the upload process. Note that checking the MIME type sent with the file is not a reliable method to determine a file's type! The value sent can be manipulated by an attacker. To ensure a file is an image file, you could e.g. use getimagesize().
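A minimal sketch of that server-side check with getimagesize(); the field name 'photo' and the set of allowed types are assumptions:

```php
<?php
// Sketch: validate an upload by inspecting the image data itself, rather
// than trusting the client-supplied MIME type.

function is_allowed_image(string $tmpPath): bool {
    $info = @getimagesize($tmpPath);   // false if not a readable image
    if ($info === false) {
        return false;
    }
    // Index 2 of the result is an IMAGETYPE_* constant.
    $allowed = [IMAGETYPE_JPEG, IMAGETYPE_PNG, IMAGETYPE_GIF];
    return in_array($info[2], $allowed, true);
}

// In the upload handler:
// if (!is_allowed_image($_FILES['photo']['tmp_name'])) { die('Images only'); }
```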
This cannot be accomplished using .htaccess. I'm guessing what you're trying to do is prevent malicious scripts from accidentally being executed on the server. The way I normally handle file uploads like this is:
1. Insert the filename, MIME type, etc. into a database with an auto_increment ID.
2. Use the ID as the file name, with no extension, and place the file in a directory outside of your webroot. This way you're certain nobody can execute the file.
3. When a file is requested, query the database for filename, MIME type, and ID, and send the file to the user with readfile().
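The insert step of that scheme could be sketched as follows; the uploads table, its columns, and the storage path are illustrative, and $pdo is an already-connected PDO handle:

```php
<?php
// Sketch: record the upload's metadata and get back the auto_increment ID
// that will serve as the extensionless on-disk file name.

function record_upload(PDO $pdo, string $originalName, string $mimeType): int {
    $stmt = $pdo->prepare(
        'INSERT INTO uploads (original_name, mime_type) VALUES (?, ?)'
    );
    $stmt->execute([$originalName, $mimeType]);
    return (int) $pdo->lastInsertId();
}

// Upload side:
// $id = record_upload($pdo, $_FILES['doc']['name'], $_FILES['doc']['type']);
// move_uploaded_file($_FILES['doc']['tmp_name'], '/srv/private_uploads/' . $id);
//
// Download side:
// header('Content-Type: ' . $row['mime_type']);
// header('Content-Disposition: attachment; filename="' . $row['original_name'] . '"');
// readfile('/srv/private_uploads/' . $row['id']);
```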

Is it possible to upload multiple files using a text file that contains the path to all files to be uploaded?

Is there any way to upload multiple files using a single file? Basically, I want to upload multiple PDF files at once, using one single file that contains the path to each of the PDF files, and store the information in a MySQL database...
PS: I don't want to merge all the files into one huge PDF... I want each of the PDF files to be uploaded to a server directory at once and then store the file info (e.g. path, file info, filename) in the database for later use.
In order for a file to be uploaded, the user has to select that file manually. It's a security measure (otherwise websites could examine arbitrary files on your computer without your knowledge, which would be bad).
No - Because it would break the Javascript sandbox model (i.e. would be a security problem).
For security reasons, JavaScript cannot do this either; it would mean websites could access files on users' local machines.
Why not just pack them up into a zip file and then unzip it on the server side?
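A sketch of that zip approach on the server side, using ZipArchive to unpack only the PDF entries; directory names are illustrative, and real code would also record each extracted path in the database as described above:

```php
<?php
// Sketch: the user uploads one .zip of PDFs; extract only the PDF entries.

function extract_pdfs(string $zipPath, string $targetDir): array {
    $zip = new ZipArchive();
    if ($zip->open($zipPath) !== true) {
        return [];
    }
    $extracted = [];
    for ($i = 0; $i < $zip->numFiles; $i++) {
        $name = $zip->getNameIndex($i);
        if (strpos($name, '..') !== false) {
            continue;   // reject path-traversal entries ("zip slip")
        }
        if (preg_match('/\.pdf$/i', $name)) {
            $zip->extractTo($targetDir, $name);
            $extracted[] = $name;   // record path/filename in the DB here
        }
    }
    $zip->close();
    return $extracted;
}
```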
