I want to keep a folder on one machine in sync with a folder on another. This is for a WordPress deployment plugin so I can't rely on rsync or other commands being present on either machine. PHP and a web server will be available on both machines and ideally it would work over HTTP.
My current thinking is that the requesting machine posts the local file list with last modified dates to a script on the other machine. The other machine compares with its files and responds with the modified files - either a list of files to be fetched individually or with the changed files inlined in the response.
I'd rather use an existing solution if one exists, though. Any ideas?
I've created a simple set of classes to implement this: https://github.com/outlandishideas/sync
On the server, e.g. example.com/remote.php:
const SECRET = '5ecR3t'; //make this long and complicated
const PATH = '/path/to/source'; //sync all files and folders below this path
$server = new \Outlandish\Sync\Server(SECRET, PATH);
$server->run(); //process the request
On the client:
const SECRET = '5ecR3t'; //this must match the secret key on the server
const PATH = '/path/to/destination'; //target for files synced from server
$client = new \Outlandish\Sync\Client(SECRET, PATH);
$client->run('http://example.com/remote.php'); //connect to server and start sync
Your best bet is checking when the script was last run and then uploading the folder with ftp_* functions.
<?php
$username = 'root'; // change this
$password = 'password'; // and this
$host = 'my-remote-server.com'; // and this
$remote_backup = 'backups/'; // folder on remote server to upload to
$backup_folder = 'to_backup/'; // folder to back up
$temp_folder = 'temp_files/'; // a folder on the local server I can write to
$last_run = file_get_contents("{$temp_folder}last_run.txt"); // You'll probably want to get this from a database instead
if ($last_run <= strtotime('-1 day'))
{
    file_put_contents("{$temp_folder}last_run.txt", time()); // Update the last time this was run
    $zip_name = time() . '_backup.zip'; // what the file will be called both remotely and locally
    $ftp = ftp_connect($host); // connect to the FTP server
    ftp_login($ftp, $username, $password); // log in to the FTP server
    $zip = new ZipArchive; // create a new instance of ZipArchive
    $zip->open($temp_folder . $zip_name, ZipArchive::CREATE); // Create a new archive
    foreach (glob($backup_folder . '*') as $file) // Loop through all files in the local backup directory
    {
        $zip->addFile($file); // add that file
    }
    $zip->close(); // write the archive to disk before uploading
    ftp_chdir($ftp, $remote_backup); // cd into the remote backup folder
    $upload = ftp_nb_put($ftp, $zip_name, $temp_folder . $zip_name, FTP_BINARY); // non-blocking put, uploads the local backup onto the remote server
    while ($upload === FTP_MOREDATA)
    {
        // do something else while we're waiting for the non-blocking upload to finish
        $upload = ftp_nb_continue($ftp); // keep the transfer going
    }
    ftp_close($ftp); // close the connection
}
It should be non-blocking (well, the upload to the remote server is), so if you don't have many files to zip it should be fine to include on the index page, for example. There isn't any error handling, so you may want to add that. It also doesn't delete the local backup; you may want to handle that too.
I would not recommend doing this in PHP, for a number of reasons.
I have exactly what you need as a Python app.
This app is built to run as a service: you simply start it and forget about it :)
App: https://gist.github.com/8f62786582c6933395eb
Shell: https://gist.github.com/e08a99937c6f5deac4ab
Note: the shell file should be called fsyncd not fsyncd.sh :)
The PHP version of the above:
https://gist.github.com/3963cbc58793ff7e9773
Note: You need to get it running on both sites, configure each to connect to the other, and set them to be executed by cron. Preferably not by WP-Cron.
I have the path to the directory that will be synced defined here:
define("PATH_DATA", PATH_ROOT . "data" . DIRECTORY_SEPARATOR);
In my case the data folder is in the script folder. You should just set an absolute path or use the WP core to get the WP uploads dir.
The principle is:
Find a way to get the two servers talking to each other.
I used a socket server/client approach.
You could do an HTTP POST processor (server) and an HTTP POST maker (client).
Keep a record of the last sync time.
At certain intervals, read the folder and record any files modified since the last sync time.
Send the list of files to be updated, with their modified timestamps, to the other server.
It should compare your list to its records and tell you which of the files it does not have.
Send those files.
The receiver will write the files and set the modified date to the one on the other server. (This is important to avoid infinite loops.)
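The comparison step in the list above can be sketched in plain PHP. The function name and the manifest shape (path => modified timestamp) here are illustrative, not taken from the actual gists:

```php
<?php
// Given the sender's manifest (path => mtime) and the receiver's own
// records, return the paths the receiver is missing or has older copies of.
function filesToSend(array $senderManifest, array $receiverManifest): array
{
    $needed = [];
    foreach ($senderManifest as $path => $mtime) {
        if (!isset($receiverManifest[$path]) || $receiverManifest[$path] < $mtime) {
            $needed[] = $path;
        }
    }
    return $needed;
}

$sender   = ['a.txt' => 100, 'b.txt' => 200, 'c.txt' => 300];
$receiver = ['a.txt' => 100, 'b.txt' => 150];
print_r(filesToSend($sender, $receiver)); // b.txt (outdated) and c.txt (missing)
```

The receiver would run this against the POSTed list and reply with the paths it needs, and the sender then transfers only those files.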
Good luck.
Related
I have a list of videos (.mp4) on a remote server.
From my CodeIgniter application, I connect to the remote server with FTP and can list those videos.
I want to rename those files, but I should verify whether a file is open in another process (the video is playing) before renaming it.
How can I check whether a file is open in another process with PHP?
This my code:
$ftp = new Simple_ftp();
$ftp->init("server",'login','password');
$conn_id = $ftp->connexion();
if($conn_id == 3){
$files = $ftp->ls('path_files');
foreach($files as $file){
...
}
}
You cannot check for that with only FTP access.
If the server is running a Linux-based OS and you have SSH access, you can connect remotely and use the lsof program.
There is no way of knowing via FTP. However, if you can run code on the server itself, take a look at flock, which checks for advisory file locks.
I'm stuck with a project I'm currently working on.
I have to make a PHP script that uploads a file to a specific FTP server. The file gets processed on the fly by another script which is observing the FTP server. After the processing is done, a new file is generated with one of 4 possible file extensions and the original file gets deleted automatically. Here's where my problem starts: I'm not that much into PHP, as I work with it far too rarely.
So I have to search for the file with one of the four possible extensions and download it to the machine the PHP script is running on, and the search needs to be done by this PHP script. Any suggestions on how to achieve this? I don't have a clue :(
You cannot search through the FTP protocol.
You have to list a directory and then search for desired file(s) locally:
$ftp = ftp_connect( $ftpHost );
ftp_login( $ftp, $ftpUsername, $ftpPassword ) or die( 'Oh No!' );
$files = ftp_nlist( $ftp, 'www/myDir' );
$filteredFiles = preg_grep( '/\.php$/i', $files );
ftp_close($ftp);
With the above example, all the files in the www/myDir directory with a .php extension are now in the $filteredFiles array.
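Since the original question mentions four possible extensions (the actual ones aren't given, so the ones below are placeholders), the preg_grep pattern can use alternation to match any of them:

```php
<?php
// Hypothetical listing as returned by ftp_nlist(); the four extensions
// below are placeholders for whatever the processing script produces.
$files = ['www/myDir/a.foo', 'www/myDir/b.bar', 'www/myDir/c.txt', 'www/myDir/d.baz'];

// Match any of the four possible extensions, case-insensitively
$matches = preg_grep('/\.(foo|bar|baz|qux)$/i', $files);

print_r(array_values($matches)); // a.foo, b.bar and d.baz match; c.txt does not
```

Each matched name can then be fetched with ftp_get() to finish the download step.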
Alternatives:
If your remote server allows SSH2 connections, you can retrieve the file list through an SSH2 connection;
If your remote server is PHP/HTTP enabled, you can write a PHP script on the remote server to search for the file(s) and then perform an HTTP request.
I'm making a utility that provides a GUI to easily edit certain values in a CSV file on a remote server. My boss wants the utility in PHP, running on the private web server. I'm new to PHP, but I was able to get the GUI file modifier working locally without issues. The final piece now is that, rather than using the local test file, I need to grab a copy of the requested file off the remote server, edit it, and then replace the old file with the edited one. My issue is uploading and downloading the file.
When I searched for a solution I found the following:
(note: in each of these I am just trying to move a test file)
$source = "http://<IP REMOTE SERVER>/index.html";
$dest = $_SERVER['DOCUMENT_ROOT']."index.html";
copy($source, $dest);
This solution ran into a permissions error.
$source ="http://<IP REMOTE SERVER>/index.html";
$destination = $_SERVER['DOCUMENT_ROOT']."newfile.html";
$data = file_get_contents($source);
$handle = fopen($destination, "w");
fwrite($handle, $data);
fclose($handle);
This also had a permissions error.
$connection = ssh2_connect('<IP REMOTE SERVER>', 22);
ssh2_auth_password($connection, 'cahenk', '<PASSWORD>');
ssh2_scp_recv($connection, '/tmp/CHenk/CHenk.csv', 'Desktop/CHenk.csv');
This solution produces the error Fatal error: Call to undefined function ssh2_connect(), which I have learned is because the function is not part of the default PHP installation.
In closing, is there any easy way to read/write files on the remote server through PHP, whether by changing permissions, having the PHP extension installed, or a different way entirely? Basically I'm trying to find the solution that requires the fewest settings changes on the server, because I am not the administrator and would have to go through a roundabout process to get any changes made. If something does need to be changed, instructions on doing so, or a link to instructions, would be greatly appreciated.
Did you set allow_url_fopen in your php.ini? (On older PHP versions this was the enable-url-fopen-wrapper build option.)
Please look at example 2 (storing remote files) on the PHP manual's "Remote files" page.
I'd like to upload large files to my server, but I would like to be able to pause the upload (for example, the user must be able to shut down his computer and continue after a reboot).
I think I can handle the client-side upload, but I don't know how to build the server side. What is the best way to do it on the server side? Is PHP able to do that? Is PHP the most efficient option?
Thanks a lot
If you manage to get the client side to post the file in chunks, you could do something like this on the server side:
// set the path of the file you upload
$path = $_GET['path'];
// set the `append` parameter to 1 if you want to append to an existing file, i.e. you are uploading a new chunk of data
$append = intval($_GET['append']);
// map the client-supplied path to a physical filename on the server
// (convertToPhysicalPath() is application-specific; be sure it sanitizes $path)
$filename = convertToPhysicalPath($path);
// get the temporary uploaded file
$tmp_file = $_FILES['file']['tmp_name'];
// if this is not appending
if ($append == 0) {
    // first chunk: just move the uploaded file into place
    move_uploaded_file($tmp_file, $filename);
} else {
    // later chunks: append the uploaded contents to the existing file
    $write_handle = fopen($filename, "ab");
    $read_handle = fopen($tmp_file, "rb");
    stream_copy_to_stream($read_handle, $write_handle);
    fclose($write_handle);
    fclose($read_handle);
}
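The append logic can be exercised locally without an actual upload. A minimal sketch that reassembles two chunks into one file (using a temp file in place of the uploaded data):

```php
<?php
// Simulate two uploaded chunks being written to the same target file
$target = tempnam(sys_get_temp_dir(), 'chunk');

// first chunk: plain write (the $append == 0 branch)
file_put_contents($target, "Hello, ");

// second chunk: append (the $append == 1 branch)
file_put_contents($target, "world!", FILE_APPEND);

echo file_get_contents($target); // Hello, world!
unlink($target);
```

As long as the client sends the chunks in order and marks every chunk after the first as an append, the file on the server grows back to the original, which is what makes resuming after a shutdown possible.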
If you are trying to design a web interface to allow anyone to upload a large file and resume the upload partway through, I don't know how to help you. But if all you want to do is get files from your computer to a server in a resumable fashion, you may be able to use a tool like rsync. Rsync compares the files on the source and destination, and then only copies the differences between the two. This way, if you have 50 GB of files that you upload to your server and then change one, rsync will very quickly check that all the other files are the same and then only send your one changed file. This also means that if a transfer is interrupted partway through, rsync will pick up where it left off.
Traditionally rsync is run from the command line (terminal), and it is installed by default on most Linux and Mac OS X systems.
rsync -avz /home/user/data server:src/data
This would transfer all files from /home/user/data to src/data on the server. If you then change any file in /home/user/data, you can run the command again to resync it.
If you use Windows, the easiest solution is probably DeltaCopy, which is a GUI around rsync.
I have WAMP (on a server called Emerald) running, and MAMP running on my Mac. People register on MAMP. Emerald is basically file hosting.
Emerald connects to MAMP's MySQL database to log in users. However, I want to create directories for new registrations on Emerald using PHP.
How can I do this? I have tried using this code:
$thisdir = "192.168.1.71";
$name = "Ryan-Hart";
if(mkdir($thisdir ."/documents/$name" , 0777))
{
echo "Directory has been created successfully...";
}
But I had no luck. It basically needs to connect to the other server and create a directory in the name of the user.
I hope this is clear.
You can't create directories over HTTP. You need a filesystem connection to the remote location (a local hard disk, or a network share, for example).
The easiest way that doesn't require setting up FTP, SSH or a network share would be to put a PHP script on Emerald:
<?php
// Skipping sanitation because it's only going to be called
// from a friendly script. If "dir" is user input, you need to sanitize
$dirname = $_GET["dir"];
$secret_token = "10210343943202393403";
if ($_GET["token"] != $secret_token) die ("Access denied");
// Alternatively, you could restrict access to one IP
error_reporting(0); // Suppresses errors; set to E_ALL to see mkdir's error messages
$success = mkdir("/home/www/htdocs/docs/".$dirname);
if ($success) echo "OK"; else echo "FAIL";
and call it from the other server:
$success = file_get_contents("http://192.168.1.71/create_script.php?token=10210343943202393403&dir=HelloWorld");
echo $success; // "OK" or "FAIL"
Create a script on the other server that creates the dir, and call it remotely.
Make sure you have a security check (at least a simple password).
There is no generic method to access remote server filesystems. You have to use a file transfer protocol, and server software to go with it. One option would be SSH, which however requires some setup.
$thisdir = "ssh2.sftp://user:pass@192.168.1.71/directory/";
On Windows you might get FTP working more easily, so using an ftp:// URL as the directory might work.
As a last alternative you could enable WebDAV on your WAMP web server (the PUT method alone works for file transfers, but not for creating directories). But then you probably can't use the raw PHP file functions; you'd likely need a wrapper class or curl to utilize it.
I know this is old, but I think this might be useful. In my experience:
if(mkdir($thisdir ."/documents/name" , 0777))
doesn't work on its own; I needed to do:
mkdir($thisdir, 0777);
mkdir($thisdir ."/documents" , 0777);
mkdir($thisdir ."/documents/name" , 0777);
hope it helps :)
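For the same effect in a single call, mkdir's third parameter creates all the missing parent directories at once (the path below is just a throwaway example under the system temp dir):

```php
<?php
$base = sys_get_temp_dir() . '/mkdir_demo_' . uniqid();

// The `true` argument makes mkdir create missing parent directories
mkdir($base . '/documents/name', 0777, true);

var_dump(is_dir($base . '/documents/name')); // bool(true)

// clean up
rmdir($base . '/documents/name');
rmdir($base . '/documents');
rmdir($base);
```

This replaces the three separate mkdir calls, but (as with those calls) it still only works on a filesystem the PHP process can actually reach, not on a bare IP address.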