I'm using the phpseclib 0.3.7 library to connect to my web server via SFTP. The user home directories have the following structure:
/dirhome/FirstUser/FirstUser/
/dirhome/SecondUser/SecondUser/
/dirhome/....../.....
Where:
dirhome:
owner -> root:root
permissions -> 700
FirstUser (the outer directory, named after the user):
owner -> root:root
permissions -> 755
FirstUser (the inner directory, same name):
owner -> FirstUser:mygroup
permissions -> 700
The same structure applies to the second user and so on. With this setup, users can view/edit/create content only within their own directory.
I can easily connect in PHP using the library and view folders/files, but I don't know how to upload files from the user's local PC to their personal folder on the remote server.
The library provides the method:
->put($remotepath, $localpath, NET_SFTP_LOCAL_FILE);
The problem is:
How can I open a dialog box that lets the user select a file on their own PC and upload it to their personal directory?
If I set: $localpath = "C:\\Users\\****\\Desktop\\test.txt"
I get:
C:\\Users\\****\\Desktop\\test.txt is not a valid file
But if I point it at a file that resides locally on the server, it works fine. That doesn't help, though, because users cannot place their files there. I have been trying for several days without success.
The download does not work either, for essentially the same reason.
PHP is a server-side scripting language. When your browser requests a PHP-generated page, PHP runs on the server and your browser only gets to see the result. For that reason, PHP obviously has no way to access client-side files: it has already run before anything even reaches the client computer.
I'm assuming the server with the PHP files on it and the SFTP server are different machines, otherwise your whole question doesn't make much sense.
So, what you need to do here is a two-step approach: first, upload the files to the server that runs the PHP scripts the regular way, using an HTTP POST request. You can send the request to a PHP script that then uses SFTP to move the files to the other server.
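The dialog box you ask about is just a regular file input in an HTML form with enctype="multipart/form-data"; once the POST arrives, the file sits in PHP's temporary directory on the web server, where put() can read it as a local file. A minimal sketch of that receiving script, assuming a form field named "userfile" and placeholder host/credentials (the put() call is the one from your question):
<?php
// Receiving script on the web server; "userfile", host and credentials
// are placeholders. Uses the phpseclib 0.3.x API from the question.
require 'Net/SFTP.php';

$sftp = new Net_SFTP('sftp.example.com');
if (!$sftp->login('FirstUser', 'password')) {
    exit('SFTP login failed');
}

// The POSTed file now sits in PHP's tmp dir on the web server,
// so put() can read it as a *local* file.
$tmp  = $_FILES['userfile']['tmp_name'];
$name = basename($_FILES['userfile']['name']);
$sftp->put('/dirhome/FirstUser/FirstUser/' . $name, $tmp, NET_SFTP_LOCAL_FILE);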
For downloads (as you asked in your comments) it works similarly: the browser requests a PHP script that fetches the file from the SFTP server and then sends it to the browser as the HTTP response. The browser will then display a regular file download dialog and the user can select where to store it.
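A hedged sketch of that download script, with the same placeholder host, credentials and paths:
<?php
// Fetch a file from the SFTP server and stream it to the browser
// as a download; host, credentials and paths are placeholders.
require 'Net/SFTP.php';

$sftp = new Net_SFTP('sftp.example.com');
if (!$sftp->login('FirstUser', 'password')) {
    exit('SFTP login failed');
}

$data = $sftp->get('/dirhome/FirstUser/FirstUser/test.txt'); // returns the contents as a string
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="test.txt"');
echo $data;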
For uploading you should consider using some kind of cron job or a background job started via PHP's exec() instead, as you will most likely either run into max execution timeouts or have to raise them far higher than you should if you upload through PHP, especially for large files. The alternative is to use a separate PHP configuration (depending on your PHP version you can use .htaccess files, .user.ini files or different PHP-FPM pools for that) to increase the execution time for the upload script only.
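If you go the separate-configuration route, the relevant directives might look like this; a sketch for a .user.ini file (which only applies under CGI/FastCGI/FPM), with arbitrary example values:
; .user.ini placed next to the upload script; values are examples only
max_execution_time = 600
upload_max_filesize = 512M
post_max_size = 512M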
Related
I have GoDaddy shared webspace with FTP access that contains a folder with images. These images change every day.
I'm looking for advice on what I need to do to get these images onto my server in the workplace, maybe every hour or so. The server doesn't have IIS installed; is there any other way to do this?
Am I able to write a PHP script that can put all the images onto the server using the IP or something?
I had the same issue, as I had two images on a remote server which I needed to copy to my local server at a predefined time each day, and this is the code I was able to come up with:
try {
    // Copy the remote image to the local server, naming the copy after today's date.
    if (!@copy('url/to/source/image.ext', 'local/absolute/path/on/server/' . date("d-m-Y") . ".gif")) {
        $errors = error_get_last();
        throw new Exception($errors['type'] . " - " . $errors['message']);
    }
} catch (Exception $e) {
    // An error occurred downloading the requested image;
    // place exception handling code here.
}
I then created a cron job which ran this file on a daily basis, but you would be able to run it as frequently as you need based on how often the source image changes on the source server.
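For reference, a daily 03:00 crontab entry for this could look like the following (the script path is hypothetical):
0 3 * * * php /path/to/copy-image.php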
A few points to explain the code...
As the code runs in the background, I use the @ symbol to silence errors from the copy command, since they wouldn't be viewable anyway. Instead, if copy fails it returns false, which triggers a new exception carrying the error type and message; that can then be handled in the catch block and actioned however you need, such as making a local log entry or sending an error email.
In the second parameter of the copy command you will see that the path names the file after the current date. The name can be anything, but it needs to be unique in the destination for the copy to work. If you plan on overwriting the image each time, you can hard-code the name; if you want to maintain a history, you will need a file-naming scheme on your local server that ensures files don't get overwritten each time the cron runs.
The first parameter of the copy statement assumes the source image file is named the same each time. If the name changes, you will need to identify how the naming works and build the source filename in a variable.
This code does not alter the format of the source image file. To ensure no corruption occurs and the image can still be displayed after the copy, make sure the source file and the local copy have the same file extension: if the source is a .gif file, the extension in the second copy parameter must also be .gif.
I'll try to answer here instead of continuing the comment-spam ;)
You've got your webspace with FTP access; let's just call it webspace.
Then you've got your server at your workplace; let's just call it workplace.
Finally, you need one server (which can also be the webspace, for example) where you are able to run PHP; let's call it php-server.
step 1
At the workplace, set up an FTP server, for example FileZilla. Configure it so that you can connect to it: create an account and give it access to the folder(s) where you want to save your images. Also make sure it is reachable from outside your workplace (firewall settings etc.).
step 2
If you can run your PHP scripts on your webspace, this is the easiest way: you can access the image files directly, establish a connection to the FTP server at your workplace, and upload the files.
If the PHP server is somewhere else, you have to establish a connection from the php-server to your webspace, download the files to the php-server, and then upload them to the FTP server at your workplace.
It will be a bit of work, as you can see, but it should be possible to do.
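A rough sketch of the first variant, using PHP's standard FTP functions; host, credentials and paths are placeholders:
<?php
// Push every image in a local folder to the workplace FTP server.
// Host, credentials and paths are placeholders.
$conn = ftp_connect('workplace.example.com');
ftp_login($conn, 'username', 'password');
ftp_pasv($conn, true); // passive mode usually cooperates better with firewalls

foreach (glob('/path/to/images/*.jpg') as $image) {
    ftp_put($conn, '/images/' . basename($image), $image, FTP_BINARY);
}
ftp_close($conn);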
Create a .bat file that uses the command-line ftp client to get the data you want from the server. Save that .bat file somewhere and create a Scheduled Task to run the script every hour.
You can even store the actual sequence of ftp commands in a separate file (e.g. ftpcmd.dat) and call them from the script:
ftp -n -s:ftpcmd.dat SERVERNAME.COM
Example ftp command file:
user MyUserName
Password
bin
lcd C:\Temp\
prompt
cd \your\path\images
mget *
quit
Note that mget only takes remote file patterns, so lcd sets the local download directory, and prompt turns off the per-file confirmation that mget would otherwise ask for.
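Creating the hourly Scheduled Task from the command line could look like this; the task name and script path are placeholders:
schtasks /create /tn "FetchImages" /tr "C:\scripts\getimages.bat" /sc hourly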
I need to implement a WordPress (or some other CMS) web site on Server_1 such that when a user uploads files, they are transferred to Server_2, processed there by some application, and returned to Server_1 so the user can download or view them. Because the files can be large, I found that the best solution would be an FTP transfer, something like: http://www.designaeon.com/transfer-files-bw-servers-php
When a file is transferred, the application on Server_2 should be started, and after it processes the files they should be returned to Server_1.
So my question is: What is the best way to implement this?
Should I use PHP with FTP transfer and some listener that checks a folder on Server_2 for processed files, or some external application that checks the folder every few minutes and copies files to the other server? I would appreciate any pointers on how to implement this and where to look.
Thank you in advance!
This is one possible approach:
On Server_1, host the freshly uploaded files in a secret folder that is accessible via HTTP.
On Server_2, host a PHP script that is capable of downloading, processing and outputting the file.
When the file is uploaded, put it in the accessible folder, then use curl or wget to call the script on Server_2, passing the URL of the newly uploaded file (i.e. wget http://server_2/path/to/processor.php?file=http://server_1/path/to/secret/dir/original.pdf).
processor.php will then download the file, modify it and write it back as the response to the curl or wget process on Server_1.
Have that curl or wget process on Server_1 save the modified file to your desired location, as in the sketch below.
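A rough sketch of the Server_1 side of that flow; all URLs and paths are hypothetical:
<?php
// Ask Server_2 to process the freshly uploaded file and save the
// modified version it sends back; URLs and paths are hypothetical.
$source = 'http://server_1/path/to/secret/dir/original.pdf';
$ch = curl_init('http://server_2/path/to/processor.php?file=' . urlencode($source));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$processed = curl_exec($ch);
curl_close($ch);

if ($processed !== false) {
    file_put_contents('/path/to/processed/original.pdf', $processed);
}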
I have a directory on a remote machine into which my clients upload files (via different tools and protocols, from WebDAV to FTP). I also have a PHP script that returns the directory structure. Now, the problem is: if a client uploads a large file and I make a request during the upload, the PHP script will return the file even though it is not completely uploaded. Is there a way to check whether a file is completely uploaded using PHP?
Set up your remote server to move uploaded files to another directory once they are complete, and only query that directory for files.
AFAIK, there is no way (at least cross-machine) to tell if a file is still being uploaded, without doing something like:
Query the file's length
Wait a few seconds
Query the file's length
If it's the same, it has possibly completed (a rough PHP sketch of this follows).
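In PHP, that polling heuristic might look like the following sketch; the path and wait time are arbitrary:
<?php
// Size-polling heuristic: if the file didn't grow during the wait,
// the upload has possibly finished. Path and delay are arbitrary.
function seemsUploaded($path, $wait = 5) {
    clearstatcache();
    $before = filesize($path);
    sleep($wait);
    clearstatcache(); // filesize() results are cached per request
    return $before === filesize($path);
}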
Most UNIX/Linux/BSD-like operating systems have a command called lsof (short for "list open files") which outputs a list of all currently open files on the system. You can run that command to see whether any process is still working with the file; if not, your upload has finished. In this example, awk is used to filter so that only files opened with write or read/write file handles are shown:
if (shell_exec("lsof | awk '\$4 ~ /.*[uw]/' | grep " . $uploaded_file_name) == '') {
/* No file handles open for this file, so upload is finished. */
}
I'm not very familiar with Windows servers, but this thread might help you to do the same on a Windows machine (if that is what you have): How can I determine whether a specific file is open in Windows?
Some operating systems write a ".part" file while a download is in progress, so there may be a way to check for the existence of such a file. Otherwise, I agree with Brian's answer. If you were using the script on the same system, it would be simple enough to tell from move_uploaded_file()'s return value whether it was being uploaded by a PHP script, but it does become a challenge when pulling from a remote directory that can be written to over different protocols.
I have a question.
I have a script that handles file uploads, and after the file is done uploading, I send the $_FILES data over to another local script via curl, which handles the file and puts it into the proper place.
The problem is, it works perfectly on my local machine, using the following curl settings:
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_POST, count($fields));
curl_setopt($ch, CURLOPT_POSTFIELDS, $fields_string);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
I run Windows 7 locally, but when I moved the script to my dedicated server (CentOS), it stopped working.
After doing some research I found that when the file is uploaded, it is stored in the /tmp directory.
It turns out the file uploaded to /tmp is deleted right before my curl call; PHP is known to delete temporary upload files once the script finishes executing.
Is there a setting I could use in curl to get around this problem? It works fine locally; I just don't understand why it wouldn't work on my CentOS server.
UPDATE: It worked on my other server, which also runs Linux. I don't know what particular setting changes this behaviour, but it seems every server configuration handles it differently.
Because you are using curl, you are sending the web server a new request.
There are a couple of things you are asking the web server to do:
get uploaded file
process script
send new request to web server
process second request
In the meantime, the engine knows it needs to perform the following task:
tidy up uploaded files
The different environments open the possibility of processing those tasks in a different order (e.g. Linux+Apache vs Windows+IIS). Where the engine decides to slot in the tidy-up of the uploaded files explains what you are seeing:
get uploaded file
process script
send new request to web server
<- here
process second request
<- here
In the first position your script breaks; in the second it works. This is because in the first spot your uploaded file has been deleted before your second request/script is processed. As Marc mentioned, this is core functionality, so you will need to modify your script to use move_uploaded_file() and then pass the new location of the file to your other script.
Hopefully that sheds some light on why it's operating differently in the different environments.
You'd have to move the file out of /tmp to another directory using move_uploaded_file() when the original upload finishes, or initiate the curl upload from within the same script. Otherwise PHP will clean up the file and there's not much else you can do.
The automatic cleanup is core PHP functionality, and curl can't affect it.
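In practice that could look like the following sketch; the field name and target directory are hypothetical:
<?php
// Persist the upload before the request ends so PHP's automatic cleanup
// can't remove it; field name and target directory are hypothetical.
$target = '/var/uploads/' . basename($_FILES['userfile']['name']);
if (move_uploaded_file($_FILES['userfile']['tmp_name'], $target)) {
    // $target survives this request; pass this path to the curl call
    // instead of the original /tmp path.
}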
I have two servers for my web site: the first for the database and PHP files, the second for saving users' uploaded files.
So, if I upload a file to server 1 (xxx.com), how can I save it on server 2 (yyy.com)?
If you want two servers to be exact clones (containing the same files), you can run an rsync script after your first upload has completed. It's very easy, and best of all you don't have to specify individual files.
Let's say you want to transfer all files in the directory /files/ to server 2's directory /files/2/. You can run this (-a recurses into subdirectories and preserves file attributes):
rsync -a /files/ yyy.com:~/files/2/
If you ONLY want specific files (extensions) to be synced, you can do this:
rsync /files/*.mp3 yyy.com:~/files/2/
The above will copy only the MP3 files.
You can simply upload a file from server 1 to server 2 using PHP's FTP functions.
See code example here: http://www.jonasjohn.de/snippets/php/ftp-example.htm
Use shared storage (SAN). Or SMB shares if on Windows. Or NFS if on Unix. Or scp (ssh) with public key authentication if on Unix.
An ugly way I used once was to go through cURL and FTP commands.
Of course, you need to have access to your server-2 FTP...