PHP - Download from URL and Upload via FTP

Slightly weird concept here... A client of ours wants data pushed to them over FTP/S.
The idea is that we download one of our reports (a CSV file) from a URL, then push this to the client over FTP/S. I know I can do this in bash scripts using wget and ftp, but this needs to be triggered from a web interface, so PHP is the best way forward.
As this is a background task, I can extend timeouts, etc.
I know also that I can use fopen to download and save a file, then find it and upload it using the PHP FTP library. I'm just looking for a way to download using fopen and hold the data in memory, to upload straight away.
Any help appreciated in advance!

To retrieve the data from the URL you have a few options. You say you want the data in memory only, to push it directly to the FTP host.
One approach (the simplest to use, I find, but lacking in reliability and error handling) is file_get_contents().
Example:
$url = 'http://www.domain.com/csvfile';
$data = file_get_contents($url);
Now you have your CSV data in $data; on to how to push it to an FTP server.
Again, the simplest way to do this is to use the built-in stream wrappers, as in the get example above. (Note, however, that this requires PHP 4.3.0 or later.)
Simply build up the connection string like this:
$protocol = 'ftps';
$hostname = 'ftp.domain.com';
$username = 'user';
$password = 'password';
$directory = '/pub';
$filename = 'filename.csv';

// ftps://user:password@ftp.domain.com/pub/filename.csv
$connectionString = sprintf('%s://%s:%s@%s%s/%s',
    $protocol, $username, $password,
    $hostname, $directory, $filename);

file_put_contents($connectionString, $data);
Have a look at the FTP wrappers manual: http://php.net/manual/en/wrappers.ftp.php
If this does not work there are other options.
You could use curl to get the data and the FTP Extension to push it.
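A minimal sketch of that route, assuming the cURL and FTP extensions are loaded (the URL, host, path and credentials below are all placeholders):
<?php
// Fetch the CSV into memory with cURL
$ch = curl_init('http://www.domain.com/csvfile');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$data = curl_exec($ch);
curl_close($ch);

// Push it with the FTP extension straight from an in-memory stream
// (use ftp_ssl_connect() instead if you need FTPS)
$conn = ftp_connect('ftp.domain.com');
ftp_login($conn, 'user', 'password');
ftp_pasv($conn, true);

$fp = fopen('php://temp', 'r+'); // memory-backed buffer, no file on disk
fwrite($fp, $data);
rewind($fp);
ftp_fput($conn, '/pub/filename.csv', $fp, FTP_BINARY);

fclose($fp);
ftp_close($conn);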

To avoid saving the file to disk and to "upload straight away", i.e. to start pushing to FTP as soon as the first chunk of data is downloaded, try stream_copy_to_stream():
http://www.php.net/manual/en/function.stream-copy-to-stream.php
You'll need an FTP server and client library which support resuming uploads.
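A rough sketch of that streaming approach, assuming both the HTTP and FTPS wrappers are enabled (the URL and credentials are placeholders):
$src = fopen('http://www.domain.com/csvfile', 'r');
$dst = fopen('ftps://user:password@ftp.domain.com/pub/filename.csv', 'w');
if ($src === false || $dst === false) {
    die('Could not open source or destination stream');
}
// Copies chunk by chunk as the download arrives; nothing is buffered on disk
stream_copy_to_stream($src, $dst);
fclose($src);
fclose($dst);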

Related

PHP Read/Write files on remote server

I'm making a utility that provides a GUI to easily edit certain values in a CSV file on a remote server. My boss wants the utility in PHP running on the private webserver. I'm new to PHP, but I was able to get the GUI file modifier working locally without issues. The final piece now is that rather than the local test file, I need to grab a copy of the requested file off of the remote server, edit it, and then replace the old file with the edited one. My issue is uploading and downloading the file.
When I searched for a solution I found the following:
(note in each of these I am just trying to move a test file)
$source = "http://<IP REMOTE SERVER>/index.html";
$dest = $_SERVER['DOCUMENT_ROOT']."/index.html";
copy($source, $dest);
This solution ran into a permissions error.
$source ="http://<IP REMOTE SERVER>/index.html";
$destination = $_SERVER['DOCUMENT_ROOT']."/newfile.html";
$data = file_get_contents($source);
$handle = fopen($destination, "w");
fwrite($handle, $data);
fclose($handle);
This also had a permissions error.
$connection = ssh2_connect('<IP REMOTE SERVER>', 22);
ssh2_auth_password($connection, 'cahenk', '<PASSWORD>');
ssh2_scp_recv($connection, '/tmp/CHenk/CHenk.csv', 'Desktop/CHenk.csv');
This solution has the error Fatal error: Call to undefined function ssh2_connect() which I have learned is because the function is not a part of the default php installation.
In closing: is there any easy way to read/write files on the remote server through PHP, whether by changing permissions, having the PHP extension installed, or some different way entirely that will work? Basically I'm trying to find the solution that requires the fewest settings changes on the server, because I am not the administrator and would have to go through a roundabout process to get any changes made. If something does need to be changed, instructions on doing so, or a link to instructions, would be greatly appreciated.
Did you enable URL fopen wrappers (the allow_url_fopen setting) in your php.ini? (The enable-url-fopen-wrapper name only applies if your PHP version is older.)
Please look at the PHP manual's remote files page; storing to a remote host is shown in Example 2.
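For reference, a minimal sketch along the lines of that manual example, writing to a remote host over FTP (host and credentials are placeholders):
<?php
// Requires allow_url_fopen; the ftp:// wrapper refuses to overwrite an existing
// file unless you pass a stream context with 'overwrite' => true
$file = fopen('ftp://user:password@ftp.example.com/somefile.txt', 'w');
if (!$file) {
    die('Unable to open remote file for writing');
}
fwrite($file, 'Edited CSV contents here');
fclose($file);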

Is it possible to download a file from one server (SFTP) and upload it to my server using PHP?

I have been told this cannot be done but I want to get some other opinions here. I am a bit of a newbie when it comes to things like this.
My Site: ExampleSiteA.com
File to download: ExampleSiteB.com
Basically, I am downloading a csv file from ExampleSiteB.com to make updates to my site, ExampleSiteA.com. To do this, I am downloading the csv file manually through CoreFTP and then uploading it manually to ExampleSiteA.com. The file changes daily and I would like to skip this step so I can automate the process.
Keep in mind that I need to download the csv file from ExampleSiteB.com through SFTP...
I am not sure if it is possible to directly download/upload a file from one server to another when one of them is SFTP. The file size is also quite large; it averages about 25,000 KB / 25 MB.
Another option that I haven't explored yet is requiring or including a file from another server... is that an option or a possibility? The file is located in a folder exclusively for my site and a login is required for SFTP download.
Any insight will be appreciated. Thanks in advance!
Go here and download what you need: http://phpseclib.sourceforge.net/
UPDATE
FOR SFTP
Then in your script:
<?php
include('Net/SFTP.php');

// Fetch the CSV over HTTP first (wget must be available on the server)
$url = 'http://www.downloadsite.com';
$fileToDownload = "yourCSV.csv";
$cmd = "wget -q \"$url\" -O $fileToDownload";
exec($cmd);

// Then push it to the destination host over SFTP
$sftp = new Net_SFTP('www.uploadsite.com');
if (!$sftp->login('username', 'password')) {
    exit('Login Failed');
}
echo $sftp->pwd() . "\r\n";
$sftp->put('remote.file.csv', 'yourCSV.csv', NET_SFTP_LOCAL_FILE);
print_r($sftp->nlist());
?>
If you need to connect to a second server for download:
$sftp2 = new Net_SFTP('www.serverFromWhichToDownload.com');
if (!$sftp2->login('username', 'password')) {
    exit('Login Failed');
}
echo $sftp2->pwd() . "\r\n";
// Net_SFTP::get() takes the remote name first, then the local target
$sftp2->get('remoteFileName.csv', 'localFileName.csv');
print_r($sftp2->nlist());
Read the docs for further help and examples: http://phpseclib.sourceforge.net/documentation/net.html#net_sftp_get
To log what your connection is doing if it fails, etc., use this:
include('Net/SSH2.php');

define('NET_SSH2_LOGGING', true);

$ssh = new Net_SSH2('www.domain.tld');
$ssh->login('username', 'password');

echo $ssh->getLog();
FOR FTP upload:
$file = 'somefile.txt';
$remote_file = 'readme.txt';

// $ftp_server, $ftp_user_name and $ftp_user_pass are your own settings
$conn_id = ftp_connect($ftp_server);
$login_result = ftp_login($conn_id, $ftp_user_name, $ftp_user_pass);

if (ftp_put($conn_id, $remote_file, $file, FTP_ASCII)) {
    echo "successfully uploaded $file\n";
} else {
    echo "There was a problem while uploading $file\n";
}

ftp_close($conn_id);
Yes, that's possible using ssh2_sftp.
http://php.net/manual/en/function.ssh2-sftp.php
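A short sketch of that route, assuming the ssh2 PECL extension is installed (host, credentials and paths are placeholders):
$conn = ssh2_connect('examplesiteb.com', 22);
ssh2_auth_password($conn, 'username', 'password');
$sftp = ssh2_sftp($conn);

// Read the remote CSV through the ssh2.sftp:// stream wrapper
$data = file_get_contents('ssh2.sftp://' . intval($sftp) . '/path/to/file.csv');

// ...then write it wherever it needs to go
file_put_contents('/local/path/file.csv', $data);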
I have had good luck with cURL in the past. If you are on a Linux box, it would be trivial to set up a cron job to do this update process for you. A good reference for CLI HTTP scripting in cURL can be found here; however, you may need the -T flag (for file transport) to accomplish the upload portion. Speaking of uploading, if you can run the script/process/crontab from the server you would like to update, I would recommend downloading from the web server, to save one trip and avoid a third party. Or, if you need to update on demand, you could write a PHP script that uses the built-in PHP cURL functions. If you take the Linux+CLI route, you could also use sftp.
Update: In testing cURL with SFTP (curl -u uname:pword sftp://domain.tld) I get the following error: curl: (1) Protocol sftp not supported or disabled in libcurl on Kubuntu 12.04. So cURL may not be a good idea. I also tested CLI sftp (sftp uname@domain.tld:/dir/file.ext) but could not find a way (short of using ssh keys) to send authentication. Thus, this would necessarily be a manual process unless you did set up ssh keys between the servers. As it does not sound like you have that kind of access to ExampleSiteB.com, this probably isn't acceptable.
Update 2: Since my initial answer turned out to be of little use, I figured I would expand upon one of the above answers. I was trying to find a solution that did not involve a PECL extension, but I did not have much luck with ftp_ssl_connect(). I recommend trying it; you may have better luck and could forgo the PECL extension route.
Sigh, on further reading, it appears ftp_ssl_connect() is, understandably, incompatible with the SFTP protocol (it speaks FTPS, not SFTP). However, I found a nice blog post about utilizing ssh2_connect() and ssh2_sftp() (as mentioned in a previous answer) and figured I would post that to give you some additional assistance. It is not as simple as calling the functions for most PHP distributions. Here is the blog post. Some of those steps may not be necessary, or you may need to do some additional things listed in another blog post I ran across, here.
On my system, all I had to do was run apt-get install libssh2-1-dev libssh2-php and I was able to find ssh2 in my php -m output.
Using an include, as long as you have read/write permissions on the website you're getting the file from, should work; however, this is just guesswork at the moment, as I don't have any means of checking it. Good luck though!
Yes, you should be able to do this.
Whoever told you that you can't do this might be getting confused with JavaScript and the browser's cross-site restrictions, which prevent JavaScript downloaded from one domain from accessing content in a different domain.
That being said, if you are using PHP, which to me implies that you are talking about PHP running on a web server, you should be able to use PHP or any other scripting or programming language to download the file from SiteB.com, then update the file, and then finally FTP the file to a different web server (SiteA.com).
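A rough outline of that whole pipeline, reusing the SFTP read from the sketch above and assuming the FTP extension for the final push (every host, path and value here is a placeholder):
// 1. Download from ExampleSiteB.com over SFTP (see the ssh2_sftp sketch above)
$csv = file_get_contents('ssh2.sftp://' . intval($sftp) . '/exports/data.csv');

// 2. Update the data (hypothetical edit)
$csv = str_replace('old-value', 'new-value', $csv);

// 3. FTP the result to ExampleSiteA.com from an in-memory stream
$fp = fopen('php://temp', 'r+');
fwrite($fp, $csv);
rewind($fp);
$conn = ftp_connect('examplesitea.com');
ftp_login($conn, 'user', 'password');
ftp_fput($conn, '/data/data.csv', $fp, FTP_BINARY);
ftp_close($conn);
fclose($fp);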

PHP FTP transfer

I have files that are automatically uploaded onto a server from mobile phones, and I need to automatically transfer these files from the server to another server using PHP.
Could someone please explain how I would do this?
Thanks for any help
PHP has FTP functionality built in with FTP wrappers:
Allows read access to existing files and creation of new files via FTP. If the server does not support passive mode ftp, the connection will fail.
This means you can use FTP like any other file - an extremely simple example:
<?php
$data = file_get_contents('some/other/file.txt');
$fname = "ftp://name:yourpassword@127.55.41.10:21/some/path/filename.txt";
file_put_contents($fname, $data);
?>

Read content of 12000 files from another FTP server

What I would like to script: a PHP script to find a certain string in loads of files
Is it possible to read contents of thousands of text files from another ftp server without actually downloading those files (ftp_get) ?
If not, would downloading them ONCE (if a file already exists, skip it; if the filesize differs, redownload it), then searching each one for the certain string, be the easiest option?
If URL fopen wrappers are enabled, then file_get_contents can do the trick and you do not need to save the file on your server.
<?php
$find = 'mytext'; // text to find
$files = array('http://example.com/file1.txt', 'http://example.com/file2.txt'); // source files

foreach ($files as $file) {
    $data = file_get_contents($file);
    if (strpos($data, $find) !== FALSE) {
        echo "found in $file" . PHP_EOL;
    }
}
?>
[EDIT]: If the files are accessible only by FTP, use ftp:// URLs instead:
$files = array('ftp://user:pass@domain.com/path/to/file', 'ftp://user:pass@domain.com/path/to/file2');
If you are going to store the files after you download them, then you may be better served to just download or update all of the files, then search through them for the string.
The best approach depends on how you will use it.
If you are going to be deleting the files after you have searched them, then you may want to also keep track of which ones you searched, and their file date information, so that later, when you go to search again, you won't waste time searching files that haven't changed since the last time you checked them.
When you are dealing with so many files, try to cache any information that will help your program to be more efficient next time it runs.
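As a sketch of that caching idea, using ftp_mdtm() and ftp_size() from the FTP extension to skip files that haven't changed since the last run (the cache file name, host and paths are invented for illustration):
$conn = ftp_connect('ftp.example.com');
ftp_login($conn, 'user', 'pass');
ftp_pasv($conn, true);

$cacheFile = 'seen_files.json'; // hypothetical cache location
$cache = is_file($cacheFile) ? json_decode(file_get_contents($cacheFile), true) : array();

foreach (ftp_nlist($conn, '/exports') as $path) {
    // Fingerprint each file by modification time and size
    $key = ftp_mdtm($conn, $path) . ':' . ftp_size($conn, $path);
    if (isset($cache[$path]) && $cache[$path] === $key) {
        continue; // unchanged since last run, skip the download
    }
    $data = file_get_contents("ftp://user:pass@ftp.example.com$path");
    // ... search $data for the string here ...
    $cache[$path] = $key;
}

file_put_contents($cacheFile, json_encode($cache));
ftp_close($conn);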
PHP's built-in file reading functions, such as fopen()/fread()/fclose() and file_get_contents() do support FTP URLs, like this:
<?php
$data = file_get_contents('ftp://user:password#ftp.example.com/dir/file');
// The file's contents are stored in the $data variable
If you need to get a list of the files in the directory, you might want to check out opendir(), readdir() and closedir(), which I'm pretty sure support FTP URLs.
An example:
<?php
$dir = opendir('ftp://user:password@ftp.example.com/dir/');
if (!$dir) {
    die;
}
while (($file = readdir($dir)) !== false) {
    echo htmlspecialchars($file) . '<br />';
}
closedir($dir);
If you can connect via SSH to that server, and if you can install new PECL (and PEAR) modules, then you might consider using PHP SSH2. Here's a good tutorial on how to install and use it. This is a better alternative to FTP. But if that is not possible, your only solution is file_get_contents('ftp://domain/path/to/remote/file');.
** UPDATE **
Here is a PHP-only implementation of an SSH client : SSH in PHP.
With FTP you'll always have to download to check.
I do not know what kind of bandwidth you have or how big the files are, but this might be an interesting use case for running the job from the cloud, like Amazon EC2 or Google App Engine (if you can download the files within the time limit).
In the EC2 case you then spin up the server for an hour to check for updates in the files, and shut it down again afterwards. This will cost a couple of bucks per month and saves you from potentially upgrading your line or hosting contract.
If this is a regular task then it might be worth using a simple queue system so you can run multiple processes at once (this will hugely increase speed). It would involve these steps:
Get a list of all files on the remote server.
Put the list into a queue (you can use memcached for a basic message queuing system).
Use a separate script to get the next item from the queue.
The processing script would contain simple functionality, in a do-while loop, roughly:
do {
    $item = next_item_from_queue();        // e.g. pulled from memcached
    $contents = file_get_contents($item);  // $item being an ftp:// URL
    preg_match($pattern, $contents);       // check for the string
} while ($item);
You could then in theory fork off multiple processes through the command line without needing to worry about race conditions.
This method is probably best suited to cron/batch processing; however, it might work in this situation too.
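A sketch of the memcached-backed queue mentioned above (the key names are invented): the producer numbers each file path with an incrementing tail counter, and each worker atomically claims the next index with Memcached::increment(), which is what avoids the race conditions.
$m = new Memcached();
$m->addServer('127.0.0.1', 11211);

// Producer: enqueue the remote file list ($files from ftp_nlist, say)
$m->add('queue_head', 0); // add() only sets the key if it doesn't exist yet
$m->add('queue_tail', 0);
foreach ($files as $path) {
    $i = $m->increment('queue_tail'); // atomic, so every item gets a unique slot
    $m->set("queue_item_$i", $path);
}

// Worker (run several in parallel once the queue is filled):
while (true) {
    $i = $m->increment('queue_head'); // atomic claim, no two workers collide
    $item = $m->get("queue_item_$i");
    if ($item === false) {
        break; // queue drained
    }
    $contents = file_get_contents($item); // $item is an ftp://user:pass@host/path URL
    if (strpos($contents, 'mytext') !== false) {
        echo "found in $item" . PHP_EOL;
    }
}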

How do I use PHP CLI to automate FTP when I don't have access to PHP's native FTP handle?

I'm writing an automation script on a production server that, among other things, needs to grab a list of remote files via FTP (FTP is the only option for interacting with the remote filesystem) and selectively download them.
Why I can't use PHP's native FTP wrappers
This is a production server in a very brittle environment. I'm writing it using PHP CLI, since most of the existing automation scripts are written this way. However, although I have a very new PHP 5.1.2 installation, I'm not able to recompile it with --with-ftp, and that option is not enabled.
The remaining options
So, my options are to connect, get my file list, and selectively download using either shell_exec() of FTP commands, or PHP's native filesystem functions over an FTP stream.
Unfortunately, I'm not able to find good code examples of either. When I try to shell_exec using FTP commands, the program hangs, presumably because control stays at the shell once I open up the FTP prompt.
$ftp_connect_command = "ftp -v -n $bl_ftp_host";
$ftp_login_command = "user $bl_ftp_user $bl_ftp_password";
$ftp_bye_command = "bye";
$ftp_connect_response = shell_exec("$ftp_connect_command");
// this never executes, because it hangs here waiting for a return to shell
$ftp_login_response = shell_exec($ftp_login_command);
Or, I imagine the stream-based way to do this would be:
$ftp_path = "ftp://$bl_ftp_user:$bl_ftp_password@$bl_ftp_host/";
$stream_options = array('ftp' => array('overwrite' => false));
$context = stream_context_create($stream_options);
if ($dh = opendir($ftp_path, $context))
{
    while (($filename = readdir($dh)) !== false)
    {
        print($filename);
    }
}
But I'm not sure if this is considered a reliable method.
Can anyone provide code samples showing how to capture a directory list and download files by either of these methods?
Apparently, WordPress uses pemftp for pure-PHP FTP (on systems compiled without FTP support).
Have you looked at the native PHP FTP library?
http://us2.php.net/ftp
