I'm trying to write to an existing file on an AS400 FTP server in append mode, but it keeps failing: the server reports "operation completed etc ..." yet nothing is actually written to the file. What am I doing wrong?
// set up a connection or die
$conn_id = ftp_connect($ftp_server) or die("Couldn't connect to $ftp_server");
// try to login
if (@ftp_login($conn_id, $ftp_user, $ftp_pass)) {
echo "Connected as $ftp_user@$ftp_server\n";
} else {
echo "Couldn't connect as $ftp_user\n";
}
// execute command
if (ftp_raw($conn_id, "cd /QSYS.LIB/DWFOR800.LIB")) {
debug("cd /QSYS.LIB/DWFOR800.LIB executed successfully");
} else {
debug("could not execute cd /QSYS.LIB/DWFOR800.LIB");
}
// open file for reading
$file = CODE_PHP_PATH."test.txt";
// execute command
if (ftp_raw($conn_id, "app $file EDI50FTP.FILE")) {
debug("app $file EDI50FTP.FILE executed successfully");
} else {
debug("could not execute app $file EDI50FTP.FILE");
}
// close this connection and file handler
ftp_close($conn_id);
This may depend on the version of IBM i; it could be that you are on an old release. You used to have to include:
quote site namefmt 1
as the first command, so that the server recognizes the naming format being used. It defaults to the library file system but, on IBM i V7.2 and later at least, it will default to the integrated file system if your first command is cd.
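If namefmt is the issue, a minimal sketch of setting it from PHP before changing into the library, reusing $conn_id and the debug() helper from the question (ftp_site() sends a SITE command, the equivalent of quote site in a console client):
// ask the server to use library file system naming (NAMEFMT 1)
if (ftp_site($conn_id, "NAMEFMT 1")) {
    debug("NAMEFMT 1 accepted");
} else {
    debug("server rejected NAMEFMT 1");
}
// then change into the target library
ftp_chdir($conn_id, "/QSYS.LIB/DWFOR800.LIB");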
This is my ftp log for Windows Console FTP, and it works for me.
230 MMURPHY logged on.
ftp> cd /QSYS.LIB/JMMTEST.LIB
250-NAMEFMT set to 1.
250 "/QSYS.LIB/JMMTEST.LIB" is current library.
ftp> append test.txt TEST.FILE
200 PORT subcommand request successful.
150 Sending file to member TEST in file TEST in library JMMTEST.
226 File transfer completed successfully.
ftp: 934 bytes sent in 0.21Seconds 4.36Kbytes/sec.
So it looks like your FTP commands are correct if you are using IBM i V7.2 or greater. Given that you are using PHP 5.6, it is possible that you are also running an outdated version of IBM i.
Another possibility is that the file you are appending to is not a physical file with a single character field. FTP doesn't really transfer database files, but it does transfer stream files. A database file can be done, but I'm not sure it can be appended, and you typically have to use binary mode to do it.
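If it does come down to pushing the data yourself rather than relying on the raw APPE subcommand, a rough sketch using ftp_put() in binary mode (note this overwrites rather than appends; ftp_append() only arrived in PHP 7.2, so it isn't available on PHP 5.6):
// $conn_id and $file as in the question's code; FTP_BINARY per the note above
if (ftp_put($conn_id, "EDI50FTP.FILE", $file, FTP_BINARY)) {
    debug("upload of $file to EDI50FTP.FILE succeeded");
} else {
    debug("upload of $file to EDI50FTP.FILE failed");
}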
Related
I am currently struggling with the built-in SSH2 library for PHP (running version 5.5). I am trying to upload a file to an SFTP server, as the title states, but I keep getting a "stream operation failed" error message.
After debugging the code, the connection works and the sftp resource is assigned an ID correctly; however, when fopen() is called to write the file directly to the remote server, it fails.
// open Live environment if we are not in dev
$connection = ssh2_connect($this->_settings['source_host'], 22);
$authSuccess = ssh2_auth_password($connection, $this->_settings['source_user'], $this->_settings['source_password']);
$sftp = ssh2_sftp($connection);
And finally the fopen() call:
if($operation == 'export') {
$handle = fopen("ssh2.sftp://".$sftp."/remotecopy/IN/".$filename, $mode);
}
I added debug messages in my own code to verify if the data from the _settings array is also used correctly and it is, however I can't explain the stream error.
Message: fopen(): Unable to open ssh2.sftp://Resource id #173/PATH GOES HERE/filename.xxx on remote host
Message: fopen(ssh2.sftp://Resource id #173/PATH GOES HERE/filename.xxx): failed to open stream: operation failed
As a note, the file does not exist on the remote host, but to my knowledge 'w' mode in PHP's fopen() should create the file if it does not exist.
I can't use the other PHP library, as our whole project uses the built-in ssh2 library and the person in charge told me not to use anything else, since it works fine everywhere else.
I think you'd have an easier time if you used phpseclib, a pure PHP SFTP implementation. E.g.
<?php
include('Net/SFTP.php');
$sftp = new Net_SFTP('www.domain.tld');
if (!$sftp->login('username', 'password')) {
exit('Login Failed');
}
// puts a three-byte file named filename.remote on the SFTP server
$sftp->put('filename.remote', 'xxx');
// puts an x-byte file named filename.remote on the SFTP server,
// where x is the size of filename.local
$sftp->put('filename.remote', 'filename.local', NET_SFTP_LOCAL_FILE);
?>
One of the nice things about phpseclib is its logging, so if that doesn't work you can do define('NET_SSH2_LOGGING', NET_SSH2_LOG_COMPLEX); after including Net/SFTP.php and then echo $sftp->getLog() after the point where it fails. That might provide some insight into what's going on if it still isn't working.
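For example, a minimal sketch of that logging setup, using the same placeholder host and credentials as above:
<?php
include('Net/SFTP.php');
// enable detailed logging before the connection is used
define('NET_SSH2_LOGGING', NET_SSH2_LOG_COMPLEX);
$sftp = new Net_SFTP('www.domain.tld');
if (!$sftp->login('username', 'password')) {
    echo $sftp->getLog();   // dump the packet log to see where it failed
    exit('Login Failed');
}
if (!$sftp->put('filename.remote', 'filename.local', NET_SFTP_LOCAL_FILE)) {
    echo $sftp->getLog();
}
?>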
The answer was easy: I had an incorrectly formatted path on the remote server. After correcting my settings it works just fine.
Thank you all for the hints and help.
Our customers run our php program on their own server. Some are Linux, some are Windows. To update the program I use ftp. Recently I've changed the FTP server, i.e. the physical machine running the server.
Now some of our customers experience trouble when they want to update. If they execute a test script they get the following message:
Warning: ftp_get() [function.ftp-get]: Opening BINARY mode data connection for _testupdate.txt (68 bytes). in C:\Programme\Zend\Apache2\htdocs_testupdate.php on line 65
I've tried to download the file in both passive and active mode, and with both ASCII and BINARY transfer modes, but nothing changes. Here is the code:
echo "<br> Testfilegröße wurde richtig ermittelt.";
$bstat = ftp_get ( $conn_id, "_testupdate.txt", "_testupdate.txt", FTP_BINARY); #FTP_ASCII oder FTP_BINARY
$exists = file_exists("_testupdate.txt");
At first I thought that the firewall could cause the problem, but this seems unlikely, because the test script can connect and log in to the FTP server.
Does somebody have an idea what I can try to solve the problem?
Try this:
ftp_pasv($conn_id, TRUE);
Helps if a firewall is indeed the culprit.
Otherwise, make sure you are transferring the file with the mode that matches its contents (FTP_BINARY for binary files, FTP_ASCII for text); this warning can be thrown when the wrong mode is used.
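Putting it together with the question's code (variable names taken from the question; ftp_pasv() has to be called after ftp_login() and before the transfer), roughly:
// after ftp_connect() and ftp_login() ...
ftp_pasv($conn_id, TRUE);   // switch the data connection to passive mode
$bstat = ftp_get($conn_id, "_testupdate.txt", "_testupdate.txt", FTP_BINARY);
if (!$bstat) {
    echo "Download failed";
}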
I'm trying to write a sizable chunk of data to a file that is opened via fopen() in php. The protocol wrapper I'm using is ftp, so the file is remote to the server running the php code. The file I'm writing to is on a Windows server.
I verified that the file does, in fact, get created by my PHP code, but the problem is that the data within the file is either non-existent (0 KB) or writing to the file stops prematurely. Not sure why this is the case.
Here is the code I am using for handling the operation:
$file_handle = fopen($node['ftp'].$path_to_lut, "wb", 0, $node['ftp_context']);
include_once($file);
if ($file_handle)
{
fwrite($file_handle, $string); //$string is inside included $file
fclose($file_handle);
} else {
die('There was a problem opening the file.');
}
This code works fine when I host it on my local machine, but when I upload it to my webhost (Rackspace Cloud), it fails. This leads me to believe it's an issue related to the configuration of my server at Rackspace, but I want to know if there is anything I can do to my PHP code to make it more robust.
Any ideas to ensure fwrite actually finishes writing the string to the remote machine?
Thanks!
Okay, I changed the code that writes to the file like so:
if ($file_handle)
{
if ($bytesWritten = fwrite($file_handle, $string) ) {
echo "There were " . $bytesWritten . " bytes written to the text file.";
}
if (!fflush($file_handle)) {
die("There was a problem outputting all the data to the text file.");
}
if (!fclose($file_handle)) {
die("There was a problem closing the text file.");
}
} else {
die("No file to write data to. Sorry.");
}
What is strange is that the echo statement shows the following:
There were 10330 bytes written to the text file.
And yet, when I verify the text file size via FTP it shows it to be 0K and the data inside the file is, in fact, truncated. I can't imagine it has to do with the FTP server itself because it works if the PHP is hosted on a machine other than the one on Rackspace Cloud.
** UPDATE **
I spoke to a Rackspace Cloud rep who mentioned that they require passive FTP if you're going to FTP from their servers. I set up the remote server to handle passive FTP connections, and have verified that passive FTP now works on the remote server via the OS X Transmit FTP client. I added:
ftp_pasv($file_handle, true);
Right after the fopen() statement, but I get an error from PHP saying that I didn't provide a valid resource to ftp_pasv(). How can I ensure that the connection PHP makes to the FTP site is PASV and not ACTIVE while still using fwrite()? Incidentally, I've noticed that the Windows machine reports that the file being written by my PHP code is 4096 bytes on disk. It never gets beyond that amount. This led me to change the output_buffering PHP value to 65536 just to troubleshoot, but that didn't fix the issue either...
** UPDATE PART DEUX **
Troubleshooting the problem on my virtual server on the Rackspace Cloud Sites product was proving too difficult because they don't offer enough admin rights. I created a very small cloud server on Rackspace's Cloud Server product and configured everything to the point where I'm still seeing the same error with fwrite(). To make sure that I could write a file from that server to a remote server, I used basic ftp commands within my bash shell on the cloud server. It worked fine. So I assume there is a bug within the PHP implementation of fwrite(), probably due to some kind of data throttling issue. When I write to the remote server from my local environment, which has a slow up-speed compared to what is offered on the Rackspace Cloud server, it works fine. Is there any way to effectively throttle down the speed of the write? Just askin' :)
** UPDATE PART III **
So, I took the suggestion from @a sad dude and implemented a function that might help somebody trying to write to a new file and send it off in its entirety via ftp:
function writeFileAndFTP($filename=null, $data=null, $node=null, $local_path=null, $remote_path=null)
{
// !Determine the path and the file to upload from the webserver
$file = $local_path.'/'.$filename;
// !Open a new file to write to on the local machine
if (!($file_handle = fopen($file, "wb", 0))) {
die("There was a problem opening ".$file." for writing!");
}
// !Write the file to local disk
if ($bytesWritten = fwrite($file_handle, $data) ) {
//echo "There were " . $bytesWritten . " bytes written to " . $file;
}
// !Close the file from writing
if (!fclose($file_handle)) {
die("There was a problem closing " . $file);
}
// !Create connection to remote FTP server
$ftp_cxn = ftp_connect($node['addr'], $node['ftp_port']) or die("Couldn't connect to the ftp server.");
// !Login to the remote server
ftp_login($ftp_cxn, $node['user'], getPwd($node['ID'])) or die("Couldn't login to the ftp server.");
// !Set PASV or ACTIVE FTP
ftp_pasv($ftp_cxn, true);
// !Upload the file
if (!ftp_put($ftp_cxn, $remote_path.'/'.$filename, $file, FTP_ASCII)) {
die("There was an issue ftp'ing the file to ".$node['addr'].$remote_path);
}
// !Close the ftp connection
ftp_close($ftp_cxn);
}
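For reference, a hypothetical call to the function above might look like this (every value below is a made-up placeholder; getPwd() is whatever credential lookup the function already uses):
// hypothetical usage only; adjust $node, paths and data to your setup
$node = array('addr' => 'ftp.example.com', 'ftp_port' => 21, 'user' => 'ftpuser', 'ID' => 42);
writeFileAndFTP('lut.txt', $string, $node, '/tmp/luts', '/remote/luts');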
The length of the string fwrite can write in one go is limited on some platforms (which is why it returns the number of bytes written). You can try running it in a loop, but a better idea is to simply use file_put_contents, which guarantees that the whole string will be written.
http://www.php.net/manual/en/function.file-put-contents.php
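A rough sketch of both approaches against the question's FTP stream (reusing $node, $path_to_lut and $string from the question; the loop guards against short writes, and file_put_contents() does the same thing in one call):
// option 1: loop until the whole string has been written (or a write fails)
$file_handle = fopen($node['ftp'].$path_to_lut, "wb", 0, $node['ftp_context']);
if ($file_handle) {
    $written = 0;
    $total = strlen($string);
    while ($written < $total) {
        $chunk = fwrite($file_handle, substr($string, $written));
        if ($chunk === false || $chunk === 0) {
            die("Write failed after $written bytes.");
        }
        $written += $chunk;
    }
    fclose($file_handle);
}

// option 2: one call, context passed as the fourth argument
file_put_contents($node['ftp'].$path_to_lut, $string, 0, $node['ftp_context']);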
I need to write a script that is run as a cron job every night which transfers some report files via sftp to another server.
The report files are created every night using another cron in the format 'support_[date].csv' & 'download_[date].csv'.
I'm wondering if you had any pointers on how to do the following:
Find the 2 files created on latest [date]
Copy these files to another server using SFTP
I've tried several PHP scripts utilising the ssh2 extension, but to no avail. Is there a way to do it using a shell script? It's not something I am hugely familiar with, to be honest (hence going down the PHP route initially).
This was one of my PHP scripts which didn't work:
$src = 'test.csv';
$filename = 'test.csv';
$dest = '/destination_directory_on_server/'.$filename;
$connection = ssh2_connect('example.com', 22);
ssh2_auth_password($connection, 'username', 'password');
// Create SFTP session
$sftp = ssh2_sftp($connection);
$sftpStream = fopen('ssh2.sftp://'.$sftp.$dest, 'w');
try {
if (!$sftpStream) {
throw new Exception("Could not open remote file: $dest<br>");
}
$data_to_send = file_get_contents($src);
if ($data_to_send === false) {
throw new Exception("Could not open local file: $src.<br>");
}
if (fwrite($sftpStream, $data_to_send) === false) {
throw new Exception("Could not send data from file: $src.<br>");
} else {
//Upload was successful, post-upload actions go here...
}
fclose($sftpStream);
} catch (Exception $e) {
//error_log('Exception: ' . $e->getMessage());
echo 'Exception: ' . $e->getMessage();
if($sftpStream) {fclose($sftpStream);}
}
These were the error messages I got:
Warning: fopen() [function.fopen]: URL file-access is disabled in the server configuration in /path_to_script/sftp-test.php on line 17
Warning: fopen(ssh2.sftp://Resource id #3/destination_directory_on_server/test.csv) [function.fopen]: failed to open stream: no suitable wrapper could be found in /path_to_script/sftp-test.php on line 17
Exception: Could not open remote file: /destination_directory_on_server/test.csv
Using the terminal, you can find the file with the latest date with ls -1tr, then use scp (not sftp) to copy the files over.
For example:
#!/bin/bash
latest_download=$(ls -1tr download*csv | tail -1)
latest_support=$(ls -1tr support*csv | tail -1)
scp $latest_download user@somehost.com:somedir # syntax from memory, check man page for correct syntax
scp $latest_support user@somehost.com:somedir
check the man page of scp for usage
Muchos kudos to ghostdog74! Managed to get this working, but with sftp.
First I managed to set up key authentication, then partly using ghostdog74's script I did this and it worked perfectly!
cd /directorywithfilesin
latest_download=$(ls -1tr download* | tail -1)
latest_support=$(ls -1tr support* | tail -1)
sftp username@example.com <<EOF
cd /dir_to_copy_to
put $latest_download
put $latest_support
EOF
Thanks!
Among other problems, ghostdog74's method is non-portable. My recommendation would be to use phpseclib, a pure PHP SFTP implementation.
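A rough sketch with phpseclib 1.x (host, credentials and the file names below are placeholders; NET_SFTP_LOCAL_FILE tells put() to read from a local file rather than treating the second argument as data):
<?php
include('Net/SFTP.php');
$sftp = new Net_SFTP('example.com');
if (!$sftp->login('username', 'password')) {
    exit('Login Failed');
}
$sftp->chdir('/dir_to_copy_to');
// upload the two latest report files found by the calling script
$sftp->put('download_latest.csv', '/directorywithfilesin/download_latest.csv', NET_SFTP_LOCAL_FILE);
$sftp->put('support_latest.csv', '/directorywithfilesin/support_latest.csv', NET_SFTP_LOCAL_FILE);
?>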
This will not work from PHP on your server because your php.ini has remote wrappers disabled:
allow_url_fopen boolean
This option enables the URL-aware fopen wrappers that enable accessing URL object like files. Default wrappers are provided for the access of remote files using the ftp or http protocol, some extensions like zlib may register additional wrappers.
Note: This setting can only be set in php.ini due to security reasons.
However, you could simply let your cron job call a shell script that uses sftp or rsync directly. You don't have to do this with PHP.
I'm voting to move this to ServerFault to get better support for shell scripting.
The answer is right there, in the error message:
Warning: fopen() [function.fopen]: URL file-access is disabled in the server configuration
means that file-access through URL wrappers is disabled in the server configuration.
Check your PHP config, especially allow_url_fopen. PHP documentation says "This setting can only be set in php.ini due to security reasons", so check it there.
See also fopen: "If PHP has decided that filename specifies a registered protocol, and that protocol is registered as a network URL, PHP will check to make sure that allow_url_fopen is enabled. If it is switched off, PHP will emit a warning and the fopen call will fail." As far as I can tell, that's exactly what is happening there.
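As a quick sanity check from the failing script itself (ini_get() only reads the value; changing it still has to happen in php.ini):
<?php
// prints the live allow_url_fopen setting as the script sees it
var_dump(ini_get('allow_url_fopen'));
?>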
If you can't or won't enable allow_url_fopen, you still have some options:
call sftp directly
mount a share with sshfs and then use it as a normal folder
Try the following (shell script):
SFTP=<sftp path>
KEY_FILE=<your key>
USERNAME=<remote username>
SERVER=<remote server>
REMOTE_DIR=<remote location>
APP_HOME=<App location>
FILENAME=<file name>
${SFTP} -o IdentityFile=${KEY_FILE} ${USERNAME}@${SERVER} <<_COMMAND
lcd ${APP_HOME}
cd ${REMOTE_DIR}
put ${FILENAME}
bye
_COMMAND
I want to set up a cron job that runs a PHP script which in turn moves an XML file (holding non-sensitive information) from one server to another.
I have been given the proper username/password, and want to use SFTP protocol. The jobs will run daily. There is the potential that one server is Linux and the other is Windows. Both are on different networks.
What is the best way to move that file?
If both servers were on Linux you could use rsync for any kind of file (php, xml, html, binary, etc.). Even if one of them is Windows, there are rsync ports for Windows.
Why not try using PHP's FTP functions?
Then you could do something like:
// open some file for reading
$file = 'somefile.txt';
$fp = fopen($file, 'r');
// set up basic connection
$conn_id = ftp_connect($ftp_server);
// login with username and password
$login_result = ftp_login($conn_id, $ftp_user_name, $ftp_user_pass);
// try to upload $file
if (ftp_fput($conn_id, $file, $fp, FTP_ASCII)) {
echo "Successfully uploaded $file\n";
} else {
echo "There was a problem while uploading $file\n";
}
// close the connection and the file handler
ftp_close($conn_id);
fclose($fp);
Why not use shell_exec and scp?
<?php
$output = shell_exec('scp file1.txt dvader@deathstar.com:somedir');
echo "<pre>$output</pre>";
?>
I had a similar situation.
After some tries, I did something different.
We have 2 servers:
a (which has the original files)
b (which the files should be moved to)
And, to be clear, the data is NOT sensitive.
Now on server a I made a script that does the following when called:
1. Choose the file to move
2. Zip the file
3. Print the .zip file location
4. Delete the .zip file (and the original file) if a delete parameter is passed
On server b the script should:
1. Call the script on server a
2. Download the zip file
3. Unzip it and copy it to the proper location
4. Call the delete function on server a
This way I have more control over my functions, tests and operations!
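A very rough PHP sketch of the server-b side under those assumptions (the URLs, paths and endpoint names are all hypothetical; server a is assumed to expose a script that zips the file, prints the .zip location, and deletes it when asked):
<?php
// 1. ask server a to prepare the zip and tell us where it is (hypothetical endpoint)
$zipUrl = trim(file_get_contents('http://server-a.example.com/prepare_zip.php?file=report.xml'));

// 2. download the zip to a local temp file
file_put_contents('/tmp/transfer.zip', file_get_contents($zipUrl));

// 3. unzip and copy the contents to the proper location
$zip = new ZipArchive();
if ($zip->open('/tmp/transfer.zip') === TRUE) {
    $zip->extractTo('/var/www/data/');
    $zip->close();
}

// 4. tell server a it can now delete the zip (and the original file)
file_get_contents('http://server-a.example.com/prepare_zip.php?file=report.xml&delete=1');
?>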