I have copied and tried many PHP scripts from SO posts. I am trying to download files from a server running CentOS. Via psftp (PuTTY) I can log in manually and copy files, but I want to automate the process, hence the need for a script.
On a similar server running Windows I am able to download files by FTP via a simple Perl script. On the CentOS server I get connection refused with the Perl script, so I tried several PHP scripts. Are the scripts below (from SO posts) up to the job, or what is wrong with them?
Script 1
#!/usr/bin/php
<?php
include('Net/SSH2.php');
$sftp = new Net_SFTP('xx.xx.xxx.xxx');
if (!$sftp->login('myuser', 'mypasswd')) {
exit('Login Failed');
}
// outputs the contents of filename.remote to the screen
echo $sftp->get('gateway_data*');
?>
Script 2
#!/usr/bin/php
<?php
include('Net/SSH2.php');
username='myuser';
password='mypasswd';
// Create SCP connection using a username and password
$scp = new SCP(
'xx.xx.xxx.xxx',
new SSH2Password($username, $password)
);
#################################
$sftp = ssh2_sftp($conn);
// Create a new local folder
ssh2_sftp_mkdir($sftp, './data');
// Retrieve a list of files
$files = scandir('ssh2.sftp://' . $sftp . '/data/gateway_data*');
################################################################
?>
In the first PHP script you posted you're doing echo $sftp->get('gateway_data*');, whereas in the Perl script you're doing cp gateway_data_301.txt. Try doing the same in the PHP script, e.g. echo $sftp->get('gateway_data_301.txt');.
As it is, it's unclear what you're expecting to happen. Unless the file name actually has a wildcard in it, are you expecting it to download every file that starts with gateway_data and just concatenate them in the output? Personally, I think just returning false or NULL would be better than that.
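A rough sketch of that approach with phpseclib, assuming the files sit in the remote login directory and noting that Net_SFTP lives in Net/SFTP.php rather than Net/SSH2.php (adjust host, credentials and the name prefix to your setup):
#!/usr/bin/php
<?php
include('Net/SFTP.php');
$sftp = new Net_SFTP('xx.xx.xxx.xxx');
if (!$sftp->login('myuser', 'mypasswd')) {
    exit('Login Failed');
}
// list the remote directory and download every file matching the prefix
foreach ($sftp->nlist('.') as $name) {
    if (strpos($name, 'gateway_data') === 0) {
        $sftp->get($name, $name); // save under the same name locally
    }
}
?>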
You can use your script 2 in PHP, but something is missing: you are only reading the source directory. You must loop over all the files in that folder:
// Retrieve a list of files (note: scandir() lists a directory, so the wildcard has to go)
$files = scandir('ssh2.sftp://' . $sftp . '/data');
foreach ($files as $key => $value) {
    // filter and download each gateway_data file here
}
See the example of how to send a file with SFTP using SFTPConnection.
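Putting it together with the ssh2 extension, a rough sketch of the whole download loop might look like this (host, credentials and the /data path are taken from the question; on newer PHP versions the SFTP resource usually needs to be wrapped in intval() for the stream wrapper):
$conn = ssh2_connect('xx.xx.xxx.xxx', 22);
ssh2_auth_password($conn, 'myuser', 'mypasswd');
$sftp = ssh2_sftp($conn);
@mkdir('./data'); // local target folder
$dir = opendir('ssh2.sftp://' . intval($sftp) . '/data');
while (($name = readdir($dir)) !== false) {
    if (fnmatch('gateway_data*', $name)) {
        // stream each matching remote file into the local folder
        copy('ssh2.sftp://' . intval($sftp) . '/data/' . $name, './data/' . $name);
    }
}
closedir($dir);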
I'm stuck on a project I'm currently working on.
I have to write a PHP script that uploads a file to a specific FTP server, where another script watching that location processes it on the fly. After processing, a new file is generated with one of four possible file extensions and the original file is deleted automatically. Here's where my problem starts: I'm not that familiar with PHP because I work with it far too rarely.
So I have to search for the file with one of the four possible extensions and download it to the machine the PHP script is running on, and the search needs to be done by this PHP script. Any suggestions on how to achieve this? I don't have a clue :(
You cannot search through the FTP protocol.
You have to list a directory and then filter for the desired file(s) locally:
$ftp = ftp_connect( $ftpHost );
ftp_login( $ftp, $ftpUsername, $ftpPassword ) or die( 'Oh No!' );
$files = ftp_nlist( $ftp, 'www/myDir' );
$filteredFiles = preg_grep( '/\.php$/i', $files );
ftp_close($ftp);
With the above example, all the files in the www/myDir directory with a .php extension are now in the $filteredFiles array.
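If you also need to download the matches (as in the question, where the file could have one of four extensions), a hedged follow-up before the ftp_close() call might be (the extension list below is a placeholder):
$filteredFiles = preg_grep( '/\.(ext1|ext2|ext3|ext4)$/i', $files ); // replace with the real four extensions
foreach ( $filteredFiles as $remoteFile ) {
    // save each match locally under its basename
    ftp_get( $ftp, basename( $remoteFile ), $remoteFile, FTP_BINARY );
}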
Alternatives:
If your remote server allows SSH2 connections, you can retrieve the file list over SSH2;
If your remote server is PHP/HTTP enabled, you can put a PHP script on the remote server to search for the file(s) and then call it with an HTTP request (a sketch follows below).
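That second alternative could look roughly like this (the search.php name, URL, path and pattern are placeholders, not anything from the original post):
// on the remote server: search.php (hypothetical name)
header( 'Content-Type: application/json' );
echo json_encode( glob( '/path/to/www/myDir/*.php' ) ); // adjust path and pattern
// on the machine running your script
$files = json_decode( file_get_contents( 'http://remote.example.com/search.php' ), true );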
I'm making a utility that provides a GUI to easily edit certain values in a CSV file on a remote server. My boss wants the utility in PHP running on the private webserver. I'm new to PHP, but I was able to get the GUI file modifier working locally without issues. The final piece is that, rather than using the local test file, I need to grab a copy of the requested file from the remote server, edit it, and then replace the old file with the edited one. My issue is uploading and downloading the file.
When I searched for a solution I found the following (note: in each of these I am just trying to move a test file):
$source = "http://<IP REMOTE SERVER>/index.html";
$dest = $_SERVER['DOCUMENT_ROOT']."index.html";
copy($source, $dest);
This solution ran into a permissions error.
$source ="http://<IP REMOTE SERVER>/index.html";
$destination = $_SERVER['DOCUMENT_ROOT']."newfile.html";
$data = file_get_contents($source);
$handle = fopen($destination, "w");
fwrite($handle, $data);
fclose($handle);
This also had a permissions error.
$connection = ssh2_connect('<IP REMOTE SERVER>', 22);
ssh2_auth_password($connection, 'cahenk', '<PASSWORD>');
ssh2_scp_recv($connection, '/tmp/CHenk/CHenk.csv', 'Desktop/CHenk.csv');
This solution fails with Fatal error: Call to undefined function ssh2_connect(), which I have learned is because the function is not part of the default PHP installation.
In closing: is there an easy way to read and write files on the remote server through PHP, whether by changing permissions, installing a PHP extension, or some different approach entirely? Basically I'm trying to find the solution that requires the fewest settings changes on the server, because I am not the administrator and would have to go through a roundabout process to get anything changed. If something does need to be changed, instructions on doing so, or a link to instructions, would be greatly appreciated.
Did you enable allow_url_fopen in your php.ini? (On very old PHP builds this was the --enable-url-fopen-wrapper configure option.)
Please look at the PHP manual page on remote files, Example #2 (storing data on a remote server).
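That manual example boils down to something like this sketch, assuming the remote box exposes FTP and allow_url_fopen is on (the credentials and target path are placeholders):
$source  = 'http://<IP REMOTE SERVER>/index.html';
$target  = 'ftp://username:password@<IP REMOTE SERVER>/httpdocs/newfile.html'; // placeholder credentials and path
$context = stream_context_create(array('ftp' => array('overwrite' => true)));
// fetch the page over HTTP and write it straight to the remote server over FTP
file_put_contents($target, file_get_contents($source), 0, $context);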
I want to keep a folder on one machine in sync with a folder on another. This is for a WordPress deployment plugin so I can't rely on rsync or other commands being present on either machine. PHP and a web server will be available on both machines and ideally it would work over HTTP.
My current thinking is that the requesting machine posts the local file list with last modified dates to a script on the other machine. The other machine compares with its files and responds with the modified files - either a list of files to be fetched individually or with the changed files inlined in the response.
I'd rather use an existing solution if one exists, though. Any ideas?
I've created a simple set of classes to implement this: https://github.com/outlandishideas/sync
On the server, e.g. example.com/remote.php:
const SECRET = '5ecR3t'; //make this long and complicated
const PATH = '/path/to/source'; //sync all files and folders below this path
$server = new \Outlandish\Sync\Server(SECRET, PATH);
$server->run(); //process the request
On the client:
const SECRET = '5ecR3t'; //this must match the secret key on the server
const PATH = '/path/to/destination'; //target for files synced from server
$client = new \Outlandish\Sync\Client(SECRET, PATH);
$client->run('http://example.com/remote.php'); //connect to server and start sync
Your best bet is checking when the script was last run and then uploading the folder with ftp_* functions.
<?php
$username = 'root'; // and this
$password = 'password'; // this also
$host = 'my-remote-server.com'; // and this
$remote_backup = 'backups/'; // folder on remote server to upload to
$backup_folder = 'to_backup/'; // folder to backup
$temp_folder = 'temp_files/'; // a folder on the local server i can write to
$last_run = file_get_contents("{$temp_folder}last_run.txt"); // You'll probably want to get this from a database instead
if($last_run <= strtotime('-1 day'))
{
    file_put_contents("{$temp_folder}last_run.txt", time()); // Update the last time this was run
    $file = time() . '_backup.zip'; // what the file will be called both remotely and locally
    $ftp = ftp_connect($host); // connect to the ftp server
    ftp_login($ftp, $username, $password); // login to the ftp server
    $zip = new ZipArchive; // create a new instance of ZipArchive
    $zip->open($temp_folder . $file, ZipArchive::CREATE); // Create a new archive
    foreach(glob($backup_folder . '*') as $backup_file) // Loop through all files in the local backup directory (don't reuse $file here)
    {
        $zip->addFile($backup_file); // add that file
    }
    $zip->close(); // write the archive to disk before uploading it
    ftp_chdir($ftp, $remote_backup); // cd into the remote backup folder
    $upload = ftp_nb_put($ftp, $file, $temp_folder . $file, FTP_BINARY); // non-blocking put, uploads the local backup onto the remote server
    while($upload === FTP_MOREDATA)
    {
        // do something else while we're waiting for the non-blocking upload to finish
        $upload = ftp_nb_continue($ftp); // keep the transfer going
    }
    ftp_close($ftp); // closes the connection
}
It should be non-blocking (well, the upload to the remote server is), so if you don't have many files to zip it'll be fine to include on the index page, for example. There isn't any error handling, so you may want to add that in. It also doesn't delete the local backup; you may want to handle that too.
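If you do want minimal error handling and cleanup, a rough addition just before ftp_close() (reusing the variables from the script above) could be:
if ($upload === FTP_FAILED) {
    error_log("Backup upload failed: {$file}"); // react however suits your setup
}
unlink($temp_folder . $file); // remove the local copy of the zip once the upload attempt is done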
In PHP I would not recommend it for a bunch of reasons.
I have exactly what you need as a python app.
This app is built to run as a service, you simply start it and forget about it :)
App: https://gist.github.com/8f62786582c6933395eb
Shell: https://gist.github.com/e08a99937c6f5deac4ab
Note: the shell file should be called fsyncd not fsyncd.sh :)
The PHP version of the above:
https://gist.github.com/3963cbc58793ff7e9773
Note: you need to get it running on both sites, configure each to connect to the other, and set them to be executed by cron. Preferably not by WP cron.
I have the path to the directory that will be synced defined here:
define("PATH_DATA", PATH_ROOT . "data" . DIRECTORY_SEPARATOR);
In my case the data folder is in the script folder. You should just set an absolute path or use the WP core to get the WP uploads dir.
The principle is:
Find a way for the two servers to talk to each other.
I used a socket server/client approach.
You could instead use an HTTP POST handler (server) and an HTTP POST sender (client).
Keep a record of the last sync time.
At certain intervals, read the folder and record any files modified since the last sync time.
Send the list of files to be updated, with their modified timestamps, to the other server (a rough sketch of this exchange follows below).
The other server compares your list against its own records and tells you which of the files it does not have.
Send those files.
The receiver writes the files and sets their modified dates to match the other server's (this is important, to avoid infinite loops).
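A very rough sketch of that list/compare exchange (the sync.php name and URL are placeholders; PATH_DATA is the constant defined above):
// client: post the local file list (name => mtime) to the other server
$list = array();
foreach (glob(PATH_DATA . '*') as $path) {
    $list[basename($path)] = filemtime($path);
}
$context = stream_context_create(array('http' => array(
    'method'  => 'POST',
    'header'  => 'Content-Type: application/json',
    'content' => json_encode($list),
)));
$needed = json_decode(file_get_contents('http://other-server.example/sync.php', false, $context), true);
// $needed now lists the files the other side is missing or has older copies of
// server (sync.php): compare the posted list against local mtimes and answer
$theirs = json_decode(file_get_contents('php://input'), true);
$wanted = array();
foreach ($theirs as $name => $mtime) {
    $local = PATH_DATA . basename($name); // basename() guards against path tricks
    if (!file_exists($local) || filemtime($local) < $mtime) {
        $wanted[] = $name;
    }
}
echo json_encode($wanted);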
Good luck.
Suppose there is a file (test.txt) with the following content:
testing php data
employee data
country data
I want to write this content to the /tmp dir of a remote Linux machine. I am using the following code:
// $con contains all content of the test.txt file
$con = file_get_contents("C:/wamp/www/test.txt");
// $ssh is the SSH object for the remote machine
$ssh->exec('echo "$con" > /tmp/text1.txt');
It creates an empty file on the remote machine. What should I do to copy the content to the remote machine?
Check your quotes:
$ssh->exec('echo "'.$con.'" > /tmp/text1.txt');
$con won't be interpolated inside single quotes.
You should write the code like this:
// $con contains all content of the test.txt file
$con = file_get_contents("C:/wamp/www/test.txt");
// $ssh is the sshobject for the remote machine
$ssh->exec("echo '{$con}' > /tmp/text1.txt");
I need to run a series of six .sh files on the server.
An example of one of the .sh files:
wget ftp://xxxxxx:xxxxxx#ftp.interhome.com/accommodation.xml.zip
unzip accommodation.xml.zip
php accommodation.php
rm -rf accommodation.xml.zip
rm -rf accommodation.xml
I tried running the following from a php file:
echo shell_exec('sh accomodation.sh');
Which was stupid because the file appears to execute repeatedly and I think I've just taken down the server. Whoops.
I've inherited this site and have never used .sh files before. I'm also a php novice.
How would I go about running the files only once and then running the next?
Many thanks
You can do all of this from within PHP; you do not need any shell script.
/* get the file via ftp */
// connect to server
$ftp = ftp_connect('ftp.interhome.com');
// login
$login = ftp_login($ftp,"username","password");
// download file to tmp.zip
$file = ftp_get($ftp, 'tmp.zip', 'accommodation.xml.zip', FTP_BINARY);
// disconnect from server
ftp_close($ftp);
/* unzip the file */
// new zip-instance
$zip = new ZipArchive;
// open downloaded file
// open downloaded file
$res = $zip->open('tmp.zip');
// check if file is readable
if ($res === TRUE) {
    // extract to current directory
    $zip->extractTo('./');
    // close zip-file
    $zip->close();
}
/* your code from accommodation.php goes here */
// delete files
unlink('tmp.zip');
unlink('accommodation.xml');
voila
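If the other five .sh files follow the same download/unzip/process/delete pattern (their feed names aren't shown here, so the list below is a placeholder), you could wrap the steps above in a function and run the feeds one after the other:
function processFeed($ftp, $name) {
    // download <name>.xml.zip over the already-open FTP connection
    ftp_get($ftp, 'tmp.zip', $name . '.xml.zip', FTP_BINARY);
    // unzip it into the current directory
    $zip = new ZipArchive;
    if ($zip->open('tmp.zip') === TRUE) {
        $zip->extractTo('./');
        $zip->close();
    }
    /* the processing code for this feed (e.g. accommodation.php) goes here */
    // clean up
    unlink('tmp.zip');
    unlink($name . '.xml');
}
$ftp = ftp_connect('ftp.interhome.com');
$login = ftp_login($ftp, "username", "password");
foreach (array('accommodation' /* , ...the other five feed names */) as $feed) {
    processFeed($ftp, $feed);
}
ftp_close($ftp);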