Transfer files between 2 servers with feedback - php

I need to implement a WordPress (or some other CMS) web site on Server_1 such that when a user uploads files, they are transferred to another server, Server_2, processed there by some application, and returned to Server_1 so the user can download or view them. Because the files can be large, I figured the best solution would be an FTP transfer, something like: http://www.designaeon.com/transfer-files-bw-servers-php
Once a file is transferred, an application on Server_2 should be started, and after it processes the file, the result should be returned to Server_1.
So my question is: what is the best way to implement this?
Should I use PHP with an FTP transfer and some listener that checks a folder on Server_2 to see whether a file has been processed, or an external application that checks the folder every few minutes and copies finished files back to the other server? I would appreciate any pointers on how to implement this and where to look.
Thank you in advance!

This is one possible approach:
On Server_1, host the freshly uploaded files in a secret folder that is accessible via HTTP.
On Server_2, host a php that is capable of downloading, processing and outputting the file.
When the file has been uploaded, put it in that accessible folder, then use curl or wget to call the PHP script on Server_2, passing the URL of the newly uploaded file (e.g. wget http://server_2/path/to/processor.php?file=http://server_1/path/to/secret/dir/original.pdf).
processor.php will then download the file, modify it, and write it back as the response to the curl or wget process on Server_1.
Have that curl or wget process on Server_1 save the modified file to your desired location.
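A rough sketch of both halves, assuming illustrative URLs and paths, and a hypothetical process_file() standing in for the real application on Server_2:

<?php
// Server_1: after the upload lands in the secret folder, ask Server_2 to
// process it and stream the response straight to disk.
$fileUrl   = 'http://server_1/path/to/secret/dir/original.pdf';
$processor = 'http://server_2/path/to/processor.php?file=' . urlencode($fileUrl);

$out = fopen('/var/www/processed/original.pdf', 'wb');
$ch  = curl_init($processor);
curl_setopt($ch, CURLOPT_FILE, $out);        // write the response body to $out
curl_setopt($ch, CURLOPT_FAILONERROR, true); // treat HTTP errors as failures
curl_exec($ch);
curl_close($ch);
fclose($out);

<?php
// Server_2: processor.php fetches the file, processes it, and echoes it back.
$tmp = tempnam(sys_get_temp_dir(), 'job');
copy($_GET['file'], $tmp);                   // requires allow_url_fopen

process_file($tmp);                          // hypothetical processing step

header('Content-Type: application/octet-stream');
readfile($tmp);
unlink($tmp);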

Related

php file_exists on FTP Uploading file

My application keeps watch on a set of folders where users can upload files. When a file upload is finished I have to process the file, but I don't know how to detect that a file has not finished uploading yet.
Is there any way to detect whether a file has not yet been released by the FTP server?
There's no generic solution to this problem.
Some FTP servers lock the file being uploaded, preventing you from accessing it while the upload is still in progress. The IIS FTP server does that, for example; most other FTP servers do not. See my answer at Prevent file from being accessed as it's being uploaded.
There are some common workarounds to the problem (originally posted in SFTP file lock mechanism, but relevant for FTP too):
You can have the client upload a "done" file once the upload finishes. Make your automated system wait for the "done" file to appear.
You can have a dedicated "upload" folder and have the client (atomically) move the uploaded file to a "done" folder. Make your automated system look to the "done" folder only.
Have a file naming convention for files being uploaded (".filepart") and have the client (atomically) rename the file after upload to its final name. Make your automated system ignore the ".filepart" files.
See (my) article Locking files while uploading / Upload to temporary file name for an example of implementing this approach.
Also, some FTP servers have this functionality built-in. For example ProFTPD with its HiddenStores directive.
A gross hack is to periodically check the file's attributes (size and timestamp) and consider the upload finished if the attributes have not changed for some time interval (see the sketch after this list).
You can also make use of the fact that some file formats have a clear end-of-file marker (like XML or ZIP), so you can tell that a file is incomplete.
Some FTP servers allow you to configure a hook to be called when an upload is finished. You can make use of that. For example ProFTPD has a mod_exec module (see the ExecOnCommand directive).
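A minimal PHP sketch of that last-resort attribute check, using the ftp extension (the host, credentials, file name, and 30-second interval are all illustrative):

<?php
$ftp = ftp_connect('ftp.example.com');
ftp_login($ftp, 'user', 'password');
ftp_pasv($ftp, true);

$file  = 'incoming/upload.dat';
$size1 = ftp_size($ftp, $file);   // -1 if the server can't report a size
$time1 = ftp_mdtm($ftp, $file);   // -1 if MDTM is unsupported

sleep(30);                        // wait; longer intervals are safer

$size2 = ftp_size($ftp, $file);
$time2 = ftp_mdtm($ftp, $file);

if ($size1 === $size2 && $time1 === $time2 && $size1 !== -1) {
    // Attributes were stable for the whole interval: assume the upload finished.
    ftp_get($ftp, '/local/upload.dat', $file, FTP_BINARY);
}
ftp_close($ftp);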
I use ftputil to implement this work-around:
connect to the FTP server
list all files in the directory
call stat() on each file
wait N seconds
For each file, call stat() again. If the result differs, skip the file, since it was modified during the wait.
If the stat() result is unchanged, download the file.
This whole FTP-fetching approach is old, obsolete technology. I hope the customer will use a modern HTTP API next time :-)
If you are reading files of particular extensions, use WinSCP for the transfer. It creates a temporary file with the extension .filepart and renames it to the actual file name once the transfer is complete.
I hope it helps someone.
This is a classic problem with FTP transfers. The only mostly reliable method I've found is to send a file, then send a second short "marker" file just to tell the recipient the transfer of the first is complete. You can use a file naming convention and just check for existence of the second file.
You might get fancy and make the content of the second file a checksum of the first file. Then you could verify the first file. (You don't have the problem with the second file because you just wait until file size = checksum size).
And of course this only works if you can get the sender to send a second file.
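An illustrative receiver-side check for that convention, assuming the marker is named after the data file and holds its hex MD5 checksum (all names here are assumptions):

<?php
// report.csv is considered complete once report.csv.md5 has arrived;
// the marker's content is the MD5 checksum of the data file.
$data   = '/incoming/report.csv';
$marker = $data . '.md5';

if (file_exists($marker) && filesize($marker) >= 32) {  // hex MD5 is 32 chars
    $expected = trim(file_get_contents($marker));
    if (hash_file('md5', $data) === $expected) {
        // Transfer is complete and verified; safe to process $data.
    }
}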

Upload file with sftp and php

I'm using the library phpseclib 0.3.7 to connect via SFTP to my web server. The existing users have the following structure:
/dirhome/FirstUser/FirstUser/
/dirhome/SecondUser/SecondUser/
/dirhome/....../.....
Where:
dirhome:
owner -> root:root
permissions -> 700
FirstUser: (the outer directory, named after the user (example))
owner -> root:root
permissions -> 755
FirstUser: (the inner directory)
owner -> FirstUser:mygroup
permissions -> 700
The same structure applies for the second user and so on. With this setup, users can view/edit/create content only within their own directory.
Using the library I can easily connect in PHP and view folders/files, but I don't know how to upload files from the user's local PC to the remote server, into their personal folder.
The library provides the method:
->put($remotepath, $localpath, NET_SFTP_LOCAL_FILE);
The problem is:
How can I open a dialog box that lets the user select a file on their own PC and upload it to their personal directory?
If I put: $localpath = "C:\\Users\\****\\Desktop\\test.txt"
I get:
C:\\Users\\****\\Desktop\\test.txt is not a valid file
But if I use a file that resides locally on the server, it works fine; that doesn't help, though, because users cannot supply their own files that way. I have been trying for several days without success.
The download does not work either, essentially for the same reason.
PHP is a server side scripting language. When your browser requests a PHP generated site, PHP runs on the server and your browser only gets to see the result. For that reason, PHP obviously has no way to access client side files: it already ran before anything even reaches the client computer.
I'm assuming the server with the PHP files on it and the SFTP server are different servers, otherwise your whole question doesn't make too much sense.
So, what you need to do here is a two-step approach: first, upload the files the regular way, via an HTTP POST request, to the server that runs the PHP scripts. The receiving PHP script can then use SFTP to move the files to the other server.
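A minimal sketch of that receiving script, using the phpseclib 0.3.x API shown in the question (the host, credentials, form field name, and paths are illustrative):

<?php
include 'Net/SFTP.php';

// Step 1: the browser POSTs the file to this script and PHP stores it in a
// temporary location. Step 2: push it to the SFTP server.
if (is_uploaded_file($_FILES['userfile']['tmp_name'])) {
    $sftp = new Net_SFTP('sftp.example.com');
    if (!$sftp->login('FirstUser', 'password')) {
        exit('SFTP login failed');
    }
    $sftp->put(
        '/dirhome/FirstUser/FirstUser/' . basename($_FILES['userfile']['name']),
        $_FILES['userfile']['tmp_name'],
        NET_SFTP_LOCAL_FILE
    );
}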
For downloads (as you asked in the comments) it works similarly: the browser requests a PHP script that fetches the file from the SFTP server and sends it to the browser as the HTTP response. The browser will then display a regular file download dialog and the user can choose where to store it.
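The download direction could look like this (again with illustrative names):

<?php
include 'Net/SFTP.php';

// Fetch the file from the SFTP server and relay it as the HTTP response.
$sftp = new Net_SFTP('sftp.example.com');
$sftp->login('FirstUser', 'password');

$data = $sftp->get('/dirhome/FirstUser/FirstUser/test.txt'); // returns a string
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="test.txt"');
header('Content-Length: ' . strlen($data));
echo $data;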
For uploading you should consider using some kind of cron job or a background job started via PHP's exec() instead, as you will most likely either run into max-execution timeouts or have to raise them far higher than you should if you upload the files using PHP, especially for large files. The alternative is to use a separate PHP configuration (depending on your PHP version you can use .htaccess files, .user.ini files or a different PHP-FPM pool for that) to increase the execution time for the upload script only.

When user clicks on link I want to unzip file and then open it

I would be grateful for help concerning this issue:
User clicks on a link:
the link itself has a parameter that tells which file needs to be unzipped to the /unzip folder. After the file is unzipped, I would like to open it.
How can I do this? I have the unzip part coded already.
I can suggest a following solution:
You create a folder on the server where you will unzip the files to.
You create an .htaccess file there which specifies your own PHP script as the 404 error handler
In your PHP script you parse the URL, identify which file to unzip, unzip it, and redirect the user to the newly created file
If you need to clean up the unzipped files, you can create a cron job which removes files older than a certain age
What you get from that is:
File transfer from server to user is handled by the web server
You effectively cache your work, as the 404 handler won't run if the file is already in place
You can significantly lower the server load, as this approach reduces the number of operations performed on the server side (when the file exists)
The description above assumes Apache as the web server
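A sketch of such a handler. The .htaccess in the /unzip folder would contain a single line like "ErrorDocument 404 /unzip/handler.php"; the archive location and the rule that the zip is named after the requested file are assumptions for illustration:

<?php
// handler.php: runs only when the requested file is missing from /unzip.
$wanted = basename(parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH));

$zip = new ZipArchive();
if ($zip->open('/var/archives/' . $wanted . '.zip') === true) {
    $zip->extractTo(__DIR__);               // unzip into the /unzip folder itself
    $zip->close();
    header('Location: /unzip/' . $wanted);  // the next request is served statically
    exit;
}
http_response_code(404);
echo 'File not found';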

Is there a way to check whether a file is completely uploaded using PHP?

I have a directory on a remote machine into which my clients upload files (via different tools and protocols, from WebDAV to FTP). I also have a PHP script that returns the directory structure. Now, the problem is: if a client uploads a large file and I make a request while the upload is in progress, the PHP script will return the file even though it's not completely uploaded. Is there a way to check whether a file is completely uploaded using PHP?
Set up your remote server to move uploaded files to another directory, and only query that directory for files.
AFAIK, there is no way (at least cross-machine) to tell if a file is still being uploaded, without doing something like:
Query the file's length
Wait a few seconds
Query the file's length
If it's the same, it's possibly complete
Most UNIX/Linux/BSD-like operating systems have a command called lsof ("list open files") which outputs a list of all currently open files on the system. You can run that command to see whether any process is still working with the file; if not, the upload has finished. In this example, awk filters the output so only files opened with write or read/write handles are shown:
// Assumes $uploaded_file_name holds the file's name; escapeshellarg()
// prevents shell injection through the file name.
if (shell_exec("lsof | awk '\$4 ~ /.*[uw]/' | grep " . escapeshellarg($uploaded_file_name)) == '') {
    /* No file handles open for this file, so the upload is finished. */
}
I'm not very familiar with Windows servers, but this thread might help you to do the same on a Windows machine (if that is what you have): How can I determine whether a specific file is open in Windows?
I would think that some clients create a ".part" file while transferring, so there may be a way to check for the existence of such a file. Otherwise, I agree with Brian's answer. If the script ran on the same system, it would be simple enough to use move_uploaded_file()'s return value to tell whether a PHP script was doing the uploading, but it becomes a challenge when pulling from a remote directory that can be written to over different protocols.

Securing upload form at php

I am adding a feature to my site so that users can upload files (of any type).
In order to secure the upload form, I made a blacklist of non-accepted file types. But in order to protect my server (in case malicious scripts are uploaded in some way), I thought of tarring the uploaded files (using the tar class) so that they are stored as .tar archives.
So if the user wants to download a file, he will receive a .tar file.
My question is: is this secure enough (since the files cannot be executed then)?
(I have this reservation because I can see fread() being used in the tar class's code.)
Thanks!
Two points here:
Using a blacklist is a bad idea: you will never think of all possible evil file types.
Do not store the uploaded files in a public directory on your server:
Store those files in a directory that is not served by Apache, outside of your DocumentRoot.
And use a PHP script (even if Apache cannot serve the files through HTTP, PHP can read them) to send those files' contents to the user who wants to download them.
This will make sure that those uploaded files are never executed.
Of course, make sure your PHP script that sends the content of a file doesn't allow anyone to download any possible file that's on the server...
You can upload the files to a non-web-accessible location (outside your web root) and then use a download script to serve the file.
The best way of handling uploaded files, in my opinion, is to place them in a folder that's not reachable through HTTP. Then, when a file is requested, use a PHP script to send the download headers and then call readfile() to send the file to the user. This way, files are never executed.
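A minimal version of such a script (the storage path and parameter name are illustrative; the basename() call is the bare minimum needed to stop path traversal):

<?php
$storage = '/var/uploads';            // directory outside the DocumentRoot
$name    = basename($_GET['file']);   // strip any directory components
$path    = $storage . '/' . $name;

if (!is_file($path)) {
    http_response_code(404);
    exit('Not found');
}
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $name . '"');
header('Content-Length: ' . filesize($path));
readfile($path);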
That might work, assuming that the users who download the files can untar them (most non-UNIX systems only have zip out of the box, so I'd give them the option to download either format).
Also, I think it's better to create a list of allowed file types rather than banned ones. It's easy to forget to ban a specific type, whereas you probably have a better idea of what users should be able to upload.
Don't block or allow files based on their extension. Make sure you are using the MIME type that the server identifies the file as; that way it's much harder to fake.
Also, store the files in a non-web-accessible directory and serve them through a download script.
Even if a bad file gets uploaded, it can't be exploited if it can't be accessed directly.
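An example of that server-side MIME check using PHP's fileinfo extension (the whitelist and form field name are only illustrations):

<?php
$allowed = array('image/png', 'image/jpeg', 'application/pdf');

// Ask the server what the file actually is, instead of trusting the
// extension or the client-supplied type.
$finfo = new finfo(FILEINFO_MIME_TYPE);
$mime  = $finfo->file($_FILES['userfile']['tmp_name']);

if (!in_array($mime, $allowed, true)) {
    exit('File type not allowed');
}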
When saving the files make sure you use these functions:
http://php.net/manual/en/function.is-uploaded-file.php
http://php.net/manual/en/function.move-uploaded-file.php
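Typical usage of the two (the field name "userfile" and the target path are illustrative):

<?php
$tmp = $_FILES['userfile']['tmp_name'];

// Verify the file really came from this HTTP POST before touching it,
// then move it out of PHP's temporary upload directory.
if (is_uploaded_file($tmp)) {
    move_uploaded_file($tmp, '/var/uploads/' . basename($_FILES['userfile']['name']));
}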
Dan
