I have two servers for my web site: the first holds the database and the PHP files, the second stores users' uploaded files.
So, if I upload a file to server-1 (xxx.com), how can I save it to server-2 (yyy.com)?
If you want the two servers to be exact clones (containing the same files), you can run an rsync script after your first upload has completed. It's very easy, and best of all you don't have to specify individual files.
Let's say you want to transfer all files in the directory /files/ to server-2's directory /files/2/. You can run this:
rsync -av /files/ yyy.com:~/files/2/
If you ONLY want specific files (extensions) to be synced, you can do this:
rsync /files/*.mp3 yyy.com:~/files/2/
The above will sync ONLY the MP3 files.
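If the upload is handled by PHP on server-1, a minimal sketch of triggering the sync right after the upload finishes (assuming key-based SSH access to yyy.com is already set up; paths and host are the illustrative values from above):
<?php
// Run the sync and capture output/exit status for error handling.
exec('rsync -av /files/ yyy.com:~/files/2/ 2>&1', $output, $status);

if ($status !== 0) {
    // Log the rsync output so failed transfers can be investigated.
    error_log("rsync failed: " . implode("\n", $output));
}
?>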
You can simply upload a file from server-1 to server-2 using PHP's FTP functions.
See code example here: http://www.jonasjohn.de/snippets/php/ftp-example.htm
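As a rough sketch (hypothetical credentials; ftp_put() pushes a local file on server-1 over to server-2):
<?php
// Hypothetical credentials for server-2; adjust paths as needed.
$conn = ftp_connect('yyy.com');
ftp_login($conn, 'username', 'password');
ftp_pasv($conn, true);

// Push the freshly uploaded file from server-1 over to server-2.
ftp_put($conn, '/files/2/image.png', '/files/image.png', FTP_BINARY);
ftp_close($conn);
?>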
Use shared storage (SAN). Or SMB shares if on Windows. Or NFS if on Unix. Or scp (ssh) with public key authentication if on Unix.
An ugly way I used once was to pass the file along using cURL and FTP commands.
Of course, you need access to your server-2 FTP...
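Something along these lines (a sketch with made-up credentials; CURLOPT_UPLOAD with an ftp:// URL makes cURL upload the stream):
<?php
// Made-up credentials and paths; server-2's FTP must be reachable.
$local = fopen('/files/image.png', 'rb');

$ch = curl_init('ftp://user:password@yyy.com/files/2/image.png');
curl_setopt($ch, CURLOPT_UPLOAD, true);
curl_setopt($ch, CURLOPT_INFILE, $local);
curl_setopt($ch, CURLOPT_INFILESIZE, filesize('/files/image.png'));
curl_exec($ch);
curl_close($ch);
fclose($local);
?>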
I'm building a web app with MySQL/PHP/JS.
I have files in the htdocs/mywebsite/foo/foo1/ path. For example:
htdocs/mywebsite/foo/foo1/image.png
I need to move these files to:
htdocs/mywebsite/foo3/image.png
Any help?
PHP has a rename() function that renames a file, or moves it from one directory to another when the source and destination directories differ.
<?php
rename("htdocs/mywebsite/foo/foo1/image.png",
"htdocs/mywebsite/foo3/image.png");
?>
In order to move files on a remote fileserver you can use FTP (File Transfer Protocol). Programs like FileZilla can help you with that. A proper webhost should also offer an FTP client in its control panel.
Using FileZilla:
Log in to your webserver, then simply cut and paste or drag and drop.
If you simply need to move all the files, use an FTP tool such as FileZilla or any other FTP client.
Alternatively, from a shell session (e.g., via PuTTY) use the mv command to move the files.
I'm using the phpseclib 0.3.7 library to connect via SFTP to my web server. The existing user directories have the following structure:
/dirhome/FirstUser/FirstUser/
/dirhome/SecondUser/SecondUser/
/dirhome/....../.....
Where:
dirhome:
own -> root:root
permission -> 700
FirstUser: (the outer directory; FirstUser is the name of a user (example))
own -> root:root
permission -> 755
FirstUser: (the inner directory with the same name)
own -> FirstUser:mygroup
permission -> 700
The same structure applies to the second user and so on. With this setup, users can view/edit/create content only within their own directory.
Using the library I can easily connect in PHP and view folders/files, but I don't know how to upload files from the user's local PC into their personal folder on the remote server.
The library provides the method:
->put($remotepath, $localpath, NET_SFTP_LOCAL_FILE);
The problem is:
How can I open a dialog box to let the user select a file on their PC and upload it into their personal directory?
If I put: $localpath = "C:\\Users\\****\\Desktop\\test.txt"
I get:
C:\\Users\\****\\Desktop\\test.txt is not a valid file
But if I use a file that resides locally on the server, it works fine; that doesn't make sense, though, because users cannot place their files there. I have been trying for several days without success.
The download does not work either, essentially for the same reason.
PHP is a server-side scripting language. When your browser requests a PHP-generated page, PHP runs on the server and your browser only gets to see the result. For that reason, PHP has no way to access client-side files: it has already run before anything even reaches the client computer.
I'm assuming the server with the PHP files on it and the SFTP server are different servers, otherwise your whole question doesn't make too much sense.
So, what you need here is a two-step approach: first, upload the files the regular way, via an HTTP POST request, to the server that runs the PHP files. You can send the request to a PHP script that then uses SFTP to move the files to the other server.
For downloads (as you asked in your comments) it works similarly: the browser requests a PHP script that fetches the file from the SFTP server and then sends it to the browser as the HTTP response. The browser should then display a regular file-download dialog and the user can select where to store it.
For uploading you should consider using some kind of cron job, or a background job started via PHP's exec(), as you will most likely either run into max-execution timeouts or have to raise them far higher than you should if you upload through PHP, especially for large files. The alternative is to use a separate PHP configuration (depending on your PHP version you can use .htaccess files, .user.ini files, or separate PHP-FPM pools for that) to increase the execution time for only the upload script.
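Putting the upload half together, a minimal sketch (hypothetical host and credentials; phpseclib 0.3.x style, matching the put() call from the question):
<?php
// The browser POSTs the file to this script over HTTP; the script
// then pushes it to the SFTP server. Host/credentials are made up.
include 'Net/SFTP.php';

if (!is_uploaded_file($_FILES['file']['tmp_name'])) {
    die('Invalid upload');
}

$sftp = new Net_SFTP('sftp.example.com');
if (!$sftp->login('FirstUser', 'password')) {
    die('SFTP login failed');
}

// Write into the user's personal directory from the question.
$remote = '/dirhome/FirstUser/FirstUser/' . basename($_FILES['file']['name']);
$sftp->put($remote, $_FILES['file']['tmp_name'], NET_SFTP_LOCAL_FILE);
?>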
I'm pretty new to the cURL library so I hope the solution to this isn't too trivial.
Basically I have a remote directory, let's say http://www.something.com/dir. In this directory the following files are present:
file1.xml
file2.xml
file3_active.xml
Is there a way I can get the files whose filenames match the phrase 'active' into a string? Would the solution work for both HTTP and FTP?
EDIT: How can I get the list/array of filenames in a remote dir? If I could do that, I could simply use strpos, get the full filename and use cURL the simple way.
How can I get the list/array of filenames in a remote dir.
Off the top of my head:
Via an FTP directory listing, if you have FTP access (see the sketch after this list).
Via a custom PHP (or whatever) script on the remote server which generates a machine-parsable list for you.
Via shell_exec()/popen()/ssh2_exec() running a shell command like ls or find over SSH.
By parsing HTML from a web-server generated directory listing (i.e., as generated by Apache mod_autoindex) on the remote server.
Each of these options is going to require some action on the part of the person hosting the remote server -- so if it's completely out of your control, I think you're SOL.
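For the FTP option, a minimal sketch (assumed credentials; ftp_nlist() returns the directory listing, and the ftp:// stream wrapper reads each match into a string):
<?php
// Assumed credentials; www.something.com/dir is from the question.
$conn = ftp_connect('www.something.com');
ftp_login($conn, 'username', 'password');
ftp_pasv($conn, true);

$matches = array();
foreach (ftp_nlist($conn, '/dir') as $name) {
    if (strpos($name, 'active') !== false) {
        // Read the whole file into a string via the ftp:// wrapper.
        $matches[$name] = file_get_contents(
            'ftp://username:password@www.something.com/dir/' . basename($name));
    }
}
ftp_close($conn);
?>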
I have a directory on a remote machine into which my clients upload files (via different tools and protocols, from WebDAV to FTP). I also have a PHP script that returns the directory structure. Now, the problem is: if a client uploads a large file and I make a request while the upload is in progress, the PHP script will list the file even though it's not completely uploaded. Is there a way to check whether a file has been completely uploaded, using PHP?
Set up your remote server to move uploaded files to another directory once they are complete, and only query that destination directory for files.
AFAIK, there is no way (at least cross-machine) to tell if a file is still being uploaded, without doing something like:
Query the file's length
Wait a few seconds
Query the file's length
If it's the same, it has probably completed
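In PHP (run on the machine that holds the directory), that polling idea could look like this sketch:
<?php
// $path points at the file being checked; 5 seconds is an arbitrary
// polling interval.
$size1 = filesize($path);
sleep(5);
clearstatcache(); // filesize() results are cached per request
$size2 = filesize($path);

if ($size1 === $size2) {
    // Size unchanged for 5 seconds: the upload has probably finished.
}
?>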
Most UNIX/Linux/BSD-like operating systems have a command called lsof (short for "list open files") which outputs a list of all currently open files on the system. You can run that command to see whether any process is still working with the file; if not, your upload has finished. In this example, awk is used to filter so that only files open with write or read/write file handles are shown:
if (shell_exec("lsof | awk '\$4 ~ /.*[uw]/' | grep " . escapeshellarg($uploaded_file_name)) == '') {
    /* No file handles open for this file, so the upload is finished. */
}
I'm not very familiar with Windows servers, but this thread might help you to do the same on a Windows machine (if that is what you have): How can I determine whether a specific file is open in Windows?
Some operating systems and transfer tools write to a temporary ".part" file while a transfer is in progress, so there may be a way to check for the existence of such a file. Otherwise, I agree with Brian's answer. If the script ran on the same system and the file were uploaded by a PHP script, it would be simple enough to use move_uploaded_file()'s return value, but it becomes a challenge when monitoring a remote directory that can be written to over different protocols.
I am adding a feature to my site so that users can upload files (of any type).
In order to secure the upload form, I made a blacklist of non-accepted filetypes. But to protect my server (in case malicious scripts are uploaded in some way), I thought of tarring the uploaded files (using the tar class) so that they are stored as .tar archives.
So if the user wants to download it, he will receive a .tar file.
My question is: is this secure enough (since the files cannot be executed then)?
(I have this reservation because I can see fread() used in the tar class's code.)
Thanks!
Two points here:
Using a blacklist is a bad idea: you will never think of all possible evil filetypes.
Do not store the uploaded files in a public directory of your server:
Store those files in a directory that is not served by Apache, outside of your DocumentRoot.
And use a PHP script to send those files' contents to the user who wants to download them (even if Apache cannot serve the files through HTTP, PHP can read them).
This will make sure that those uploaded files are never executed.
Of course, make sure your PHP script that sends the content of a file doesn't allow anyone to download any possible file that's on the server...
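Such a script could look like this sketch (/var/uploads is a hypothetical directory outside the DocumentRoot; basename() blocks ../ traversal):
<?php
// Only serve files that actually exist inside the upload directory.
$name = basename($_GET['file']);
$path = '/var/uploads/' . $name;

if (!is_file($path)) {
    die('No such file');
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $name . '"');
header('Content-Length: ' . filesize($path));
readfile($path);
?>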
You can upload the files to a non-web-accessible location (outside your webroot) and then use a download script to serve the file.
The best way of handling uploaded files, in my opinion, is to place them in a folder that's not reachable through HTTP. Then, when a file is requested, use a PHP script to send the download headers and then call readfile() to stream the file to the user. This way, files are never executed.
That might work, assuming that the users who download the files can untar them (most non-UNIX systems just have zip; I'd give them the option to download either format).
Also, I think it's better to create a list of allowed filetypes rather than banned ones. It's easy to forget to ban a specific type, whereas you will probably have a better idea of what users can upload.
Don't block/allow files based on extension. Make sure you are using the MIME type that the server identifies the file as; this way it's hard for them to fake it.
Also, store the files in a non-web-accessible directory and serve them through a download script.
Even if a bad file gets through, it can't be exploited if it can't be accessed directly.
When saving the files make sure you use these functions:
http://php.net/manual/en/function.is-uploaded-file.php
http://php.net/manual/en/function.move-uploaded-file.php
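For example, a minimal sketch combining those functions with a server-side MIME check (the allowed types and the /var/uploads destination are assumptions):
<?php
// Allow-list of MIME types as detected by the server, not the client.
$allowed = array('image/png', 'image/jpeg', 'application/pdf');

if (!is_uploaded_file($_FILES['file']['tmp_name'])) {
    die('Not a valid upload');
}

$finfo = finfo_open(FILEINFO_MIME_TYPE);
$mime  = finfo_file($finfo, $_FILES['file']['tmp_name']);
finfo_close($finfo);

if (!in_array($mime, $allowed, true)) {
    die('File type not allowed');
}

// /var/uploads is a hypothetical directory outside the web root.
move_uploaded_file($_FILES['file']['tmp_name'],
                   '/var/uploads/' . uniqid('upload_', true));
?>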