ftp_put() working sometimes but most of the time doesn't - php

I'm developing a PHP web application that uploads PDF files to a web server, which merges and compresses them and then uploads the resulting files to another FTP server. That second FTP server is mounted as a network drive in our offices so people can work with the uploaded files.
Each upload consists of 2 to 7 files.
I'm having the following problem: sometimes the files are uploaded, but most of the time they aren't. Sometimes one file out of three goes through and the other two don't.
I almost always get this:
Warning: ftp_nb_put(): The connection timed out in ...
Warning: ftp_nb_put(): Opening data channel for uploading files to server from...
Warning: ftp_nb_put(): File not found in...
I tested with another (personal) FTP server from a web host and it worked perfectly, 10 times out of 10.
So I suspect the problem comes from that FTP server.
I also tried passive mode, but I get even more errors, even with:
ftp_pasv($ftp, true);
ftp_set_option($ftp, FTP_USEPASVADDRESS, false);
I don't have access to the FTP server's settings, so I'd like to know what could possibly be blocking the uploads, so that I can pass the information on to the service that manages it.
This is my code for the FTP part:
$ftp = ftp_connect($ftp_hostname, 21);
ftp_login($ftp, $ftp_username, $ftp_password);
ftp_mkdir($ftp, $targetdir);

foreach ($file as $filename => $path) {
    if (file_exists($path)) {
        $upload = ftp_nb_put($ftp, $targetdir.$filename, $path, FTP_BINARY, FTP_AUTORESUME);
        while ($upload == FTP_MOREDATA) {
            $upload = ftp_nb_continue($ftp);
        }
        if ($upload != FTP_FINISHED) {
            echo "Error with: ".$path;
        }
    } else {
        echo "File not found: ".$path;
    }
}
ftp_close($ftp);
I hope I didn't miss any information.
Thanks!
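Intermittent "connection timed out" / "opening data channel" warnings like the ones above usually point at a firewall or NAT interfering with the FTP data connection rather than at the PHP code itself. As a sketch of a more defensive connection setup (the 30-second connect timeout, the `FTP_TIMEOUT_SEC` value and the `ftp_join_path()` helper are illustrative additions, not part of the asker's code):

```php
<?php
// Hypothetical helper: ensure exactly one slash between dir and file,
// so "uploads" . "a.pdf" cannot silently become "uploadsa.pdf".
function ftp_join_path(string $dir, string $file): string
{
    return rtrim($dir, '/') . '/' . $file;
}

// Open a connection with passive mode and generous timeouts.
// Returns an FTP connection on success, or false on failure.
function ftp_open_defensive(string $host, string $user, string $pass)
{
    $ftp = ftp_connect($host, 21, 30);          // 30 s connect timeout
    if ($ftp === false || !ftp_login($ftp, $user, $pass)) {
        return false;
    }
    // Ignore the (often bogus) IP the server advertises in its PASV
    // reply and reuse the control-channel address instead.
    ftp_set_option($ftp, FTP_USEPASVADDRESS, false);
    // Passive mode: the client opens the data connection, which
    // usually survives NAT/firewalls better than active mode.
    ftp_pasv($ftp, true);
    // Raise the timeout so large PDFs don't trip "connection timed out".
    ftp_set_option($ftp, FTP_TIMEOUT_SEC, 120);
    return $ftp;
}
```

The `FTP_USEPASVADDRESS` option only exists on PHP 5.6.18+/7.0.2+; on older versions that line can simply be dropped.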

Related

FTP corrupts files after certain delay and/or size

I am building a few Excel files with PHPExcel and storing them on the server running the script.
This part works well.
I then try to send them to my client's FTP server using ftp_put, but the files arrive at that server corrupted.
When I download them back from the FTP server with FileZilla and open them, I get an error saying there is a problem with the content of the file, offering to repair it as much as possible; accepting the repair doesn't help.
If the Excel file is under 100-120 kB it arrives uncorrupted; anything bigger gets corrupted.
Script to send via FTP:
$conn_id = ftp_connect($hostFTP);
if ($login_result = ftp_login($conn_id, $userFTP, $passwordFTP)) {
    ftp_pasv($conn_id, true);
    if (ftp_put($conn_id, $remote_path.$output_filename, $localPath.$output_filename, FTP_BINARY)) {
        $log = new Logs("listes.log", "Uploaded $output_filename");
    } else {
        $log = new Logs("listes.log", "FAIL Uploading $output_filename");
    }
    ftp_close($conn_id);
}
Am I doing something wrong? What can I do?
Edit:
Comparing the files as text shows some differences starting at line 231, which I assume is around the 100-120 kB mark.
I tried with a JPEG image and it also gets corrupted (the top of the image is fine, but at some point it corrupts and the rest of the image is pretty much a single color), so the problem isn't specific to Excel files.
Is it possible that the server (openSUSE) limits FTP, and if so, how?
Well, I've found the problem, and it's not the code: it's the Internet service. We changed Internet providers yesterday and FTP now works fine. I guess the FTP connection was being interrupted by the previous provider.

Uploading a local file to remote server with PHP and FTP

I've been trying to build a system that can upload large files. Originally I used HTTP, but that had a number of problems with settings that needed to be changed, so I thought I'd give FTP a go.
I now have an FTP connection in PHP and it works fine: I can view folders and files as well as make directories. What I can't seem to figure out, though, is how to get hold of a local file and upload it.
I have been reading lots of information and tutorials, such as the PHP manual and a tutorial I found on Nettuts, but I'm struggling. The tutorial says you can upload a local file, but I must be missing something.
Here is the upload method i'm using:
public function uploadFile($fileFrom, $fileTo)
{
    // *** Set the transfer mode
    $asciiArray = array('txt', 'csv');
    $extension = pathinfo($fileFrom, PATHINFO_EXTENSION); // end(explode(...)) raises a strict-standards notice
    if (in_array($extension, $asciiArray)) {
        $mode = FTP_ASCII;
    } else {
        $mode = FTP_BINARY;
    }
    // *** Upload the file
    $upload = ftp_put($this->connectionId, $fileTo, $fileFrom, $mode);
    // *** Check upload status
    if (!$upload) {
        $this->logMessage('FTP upload has failed!');
        return false;
    } else {
        $this->logMessage('Uploaded "' . $fileFrom . '" as "' . $fileTo . '"');
        return true;
    }
}
When trying to upload a file I use this:
$fileFrom = 'c:\test_pic.jpg';
$fileTo = $dir . '/test_pic.jpg';
$ftpObj -> uploadFile($fileFrom, $fileTo);
I thought this would get the file stored in C:\ on my machine and upload it to the destination, but it fails (I don't know why). So I changed it a little: I set $fileFrom = 'test_pic.jpg' and put the picture in the same folder on the remote server. When I ran the code, the script copied the file from one location on the server to the other.
So how would I go about getting the file from my local machine sent up to the server?
Thanks in advance.
With this code you would upload a file from your PHP server to your FTP server, which doesn't actually seem to be your goal.
Create an upload form that submits to this PHP file. Store the file temporarily on your server and then upload it to your FTP server from there.
If your attempt actually worked, it would be a major security issue, because a PHP script would have access to arbitrary files on my local machine.
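To make the suggested two-step flow concrete, here is a minimal sketch that reuses the `uploadFile()` wrapper from the question; the `userfile` field name, the `/uploads` target directory and the `remote_target()` helper are illustrative assumptions, not names from the original code:

```php
<?php
// Hypothetical helper: build the remote target path from a directory
// and the client-side file name (basename() strips any path tricks).
function remote_target(string $dir, string $clientName): string
{
    return rtrim($dir, '/') . '/' . basename($clientName);
}

// Step 1: the browser POSTs the file to this script
// (<input type="file" name="userfile"> in a multipart/form-data form),
// and PHP stores it in a temporary location on the web server.
if (isset($_FILES['userfile']) && $_FILES['userfile']['error'] === UPLOAD_ERR_OK) {
    $tmp  = $_FILES['userfile']['tmp_name'];   // temp copy on the PHP server
    $dest = remote_target('/uploads', $_FILES['userfile']['name']);

    // Step 2: relay the temporary copy from the PHP server to the FTP server.
    $ftpObj->uploadFile($tmp, $dest);
}
```

The file never moves directly from the visitor's machine to the FTP server; it always transits through the web server's temporary upload directory.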

PHP / FTP - Simple ftp_get won't work on local server but does in production - Troubleshooting

I'm having a little problem with ftp_get: the script won't work when running on our local development server, which runs CentOS 6.
I've done some research on Stack Overflow and tried most of the solutions without getting it to work.
I have tested the same script on a production server running CentOS 5.x + cPanel, and there it works.
I am wondering what could cause this on the local server?
Are there any specific settings needed for the file transfer to work?
Here's the list of things I've tried so far:
Errors: error_reporting(E_ALL) to see if there were any errors. There were none.
Passive connection: ftp_pasv($connection, true), but it doesn't change anything.
Transfer modes: tried FTP_BINARY and FTP_ASCII. Nothing changes.
Tried using a file handler to save on the local server; that didn't work either.
The original script runs in batch (downloading all files in a folder with a specified filename). When running it, about 3/4 of the files get transferred successfully to the 'dev' server,
and about 1/4 fail. So the following script was used to troubleshoot/test-download some of the files that failed during the batch transfer.
They also fail when transferred with this script.
I've checked all the permissions (chmod) on the remote server. Everything is set to 666.
Here's the code I'm using. It's very similar to the example on php.net:
// Connect to the FTP server
$connection = ftp_connect($hostname);

// Login to the FTP server
$login = ftp_login($connection, $username, $password);

// Passive might help?
ftp_pasv($connection, true);

// Test connection and login
if ((!$connection) || (!$login)) {
    echo "FTP Connection failed<br /><br />";
    exit;
} else {
    echo "Connection success<br /><br />";
}

$local_file = 'file.zip';
$server_file = 'file.zip';

// Download and save the file
if (ftp_get($connection, $local_file, $server_file, FTP_ASCII)) {
    echo "Download win <br />";
} else {
    echo "Download failed <br />";
}
Thanks for your time.
First try checking the same thing from the command line/shell of your local CentOS machine: use the `ftp` command and see if it works there. It mostly looks like a firewall problem to me.

php fwrite() doesn't finish writing string data to file, why?

I'm trying to write a sizable chunk of data to a file opened via fopen() in PHP. The protocol wrapper I'm using is FTP, so the file is remote to the server running the PHP code. The file I'm writing to is on a Windows server.
I verified that the file does, in fact, get created by my PHP code, but the problem is that the data within the file is either non-existent (0 KB) or writing to the file stops prematurely. I'm not sure why this is the case.
Here is the code I am using for handling the operation:
$file_handle = fopen($node['ftp'].$path_to_lut, "wb", 0, $node['ftp_context']);
include_once($file);
if ($file_handle) {
    fwrite($file_handle, $string); // $string is inside the included $file
    fclose($file_handle);
} else {
    die('There was a problem opening the file.');
}
This code works fine when I host it on my local machine, but when I upload it to my web host (Rackspace Cloud), it fails. This leads me to believe it's an issue related to the configuration of my server at Rackspace, but I want to know if there is anything I can do to my PHP code to make it more robust.
Any ideas to ensure fwrite actually finishes writing the string to the remote machine?
Thanks!
Okay, I changed the code that writes to the file like so:
if ($file_handle)
{
    if ($bytesWritten = fwrite($file_handle, $string)) {
        echo "There were " . $bytesWritten . " bytes written to the text file.";
    }
    if (!fflush($file_handle)) {
        die("There was a problem outputting all the data to the text file.");
    }
    if (!fclose($file_handle)) {
        die("There was a problem closing the text file.");
    }
} else {
    die("No file to write data to. Sorry.");
}
What is strange is that the echo statement shows the following:
There were 10330 bytes written to the text file.
And yet, when I verify the text file's size via FTP, it shows 0 KB, and the data inside the file is, in fact, truncated. I can't imagine it has to do with the FTP server itself, because it works when the PHP is hosted on a machine other than the one on Rackspace Cloud.
** UPDATE **
I spoke to a Rackspace Cloud rep who mentioned that they require passive FTP if you're going to FTP from their servers. I set up the remote server to handle passive FTP connections and verified that passive FTP now works on it via the OS X Transmit FTP client. I added:
ftp_pasv($file_handle, true);
Right after the fopen() statement, but I get an error from PHP saying that I didn't provide a valid resource to ftp_pasv(). How can I ensure that the connection PHP makes to the FTP site is PASV and not ACTIVE while still using fwrite()? Incidentally, I've noticed that the Windows machine reports the file being written by my PHP code as 4096 bytes on disk. It never gets beyond that amount. This led me to change the output_buffering PHP value to 65536 just to troubleshoot, but that didn't fix the issue either...
** UPDATE PART DEUX **
Troubleshooting the problem on my virtual server on Rackspace's Cloud Sites product was proving too difficult because they don't offer enough admin rights. I created a very small cloud server on Rackspace's Cloud Servers product and configured everything to the point where I'm still seeing the same error with fwrite(). To make sure I could write a file from that server to a remote server, I used basic ftp commands in a bash shell on the cloud server. It worked fine. So I assume there is a bug within the PHP implementation of fwrite(), probably due to some kind of data-throttling issue. When I write to the remote server from my local environment, which has a slow upstream speed compared to what the Rackspace Cloud server offers, it works fine. Is there any way to effectively throttle down the speed of the write? Just askin' :)
** UPDATE PART III **
So, I took the suggestion from @a sad dude and implemented a function that might help somebody trying to write a new file and send it off in its entirety via FTP:
function writeFileAndFTP($filename=null, $data=null, $node=null, $local_path=null, $remote_path=null)
{
    // !Determine the path and the file to upload from the webserver
    $file = $local_path.'/'.$filename;

    // !Open a new file to write to on the local machine
    if (!($file_handle = fopen($file, "wb", 0))) {
        die("There was a problem opening ".$file." for writing!");
    }

    // !Write the file to local disk
    if ($bytesWritten = fwrite($file_handle, $data)) {
        //echo "There were " . $bytesWritten . " bytes written to " . $file;
    }

    // !Close the file from writing
    if (!fclose($file_handle)) {
        die("There was a problem closing " . $file);
    }

    // !Create a connection to the remote FTP server
    $ftp_cxn = ftp_connect($node['addr'], $node['ftp_port']) or die("Couldn't connect to the ftp server.");

    // !Login to the remote server
    ftp_login($ftp_cxn, $node['user'], getPwd($node['ID'])) or die("Couldn't login to the ftp server.");

    // !Set PASV or ACTIVE FTP
    ftp_pasv($ftp_cxn, true);

    // !Upload the file (binary mode, so non-text data isn't mangled)
    if (!ftp_put($ftp_cxn, $remote_path.'/'.$filename, $file, FTP_BINARY)) {
        die("There was an issue ftp'ing the file to ".$node['addr'].$remote_path);
    }

    // !Close the ftp connection
    ftp_close($ftp_cxn);
}
The length of the string fwrite() can write in one go is limited on some platforms (which is why it returns the number of bytes written). You can call it in a loop, but a better idea is to simply use file_put_contents(), which guarantees that the whole string will be written.
http://www.php.net/manual/en/function.file-put-contents.php
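The loop-based variant mentioned above could look like the following sketch; the helper name `fwrite_all()` is made up for illustration, and the demonstration writes to a local temp file rather than an FTP stream:

```php
<?php
// Write $data to an already-open stream, retrying until every byte is
// written or fwrite() reports a hard failure. Returns true on success.
function fwrite_all($handle, string $data): bool
{
    $total   = strlen($data);
    $written = 0;
    while ($written < $total) {
        $chunk = fwrite($handle, substr($data, $written));
        if ($chunk === false || $chunk === 0) {
            return false;           // give up on a hard error
        }
        $written += $chunk;         // advance past the partial write
    }
    return true;
}

// Local demonstration against a temp file.
$path = tempnam(sys_get_temp_dir(), 'fw');
$fh   = fopen($path, 'wb');
$payload = str_repeat('x', 10330);  // same size as in the question
fwrite_all($fh, $payload);
fclose($fh);
unlink($path);
```

The same loop works on a stream opened through the ftp:// wrapper, since `fwrite()` only reports how many bytes the underlying stream accepted.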

What is the best way to move files from one server to another with PHP?

I want to set up a cron job that runs a PHP script which moves an XML file (holding non-sensitive information) from one server to another.
I have been given the proper username/password and want to use the SFTP protocol. The job will run daily. Potentially one server is Linux and the other is Windows, and the two are on different networks.
What is the best way to move that file?
If both servers were on Linux you could use rsync for any kind of file (PHP, XML, HTML, binary, etc.). Even if one of them is Windows, there are rsync ports for Windows.
Why not try using PHP's FTP functions?
Then you could do something like:
// open some file for reading
$file = 'somefile.txt';
$fp = fopen($file, 'r');

// set up basic connection
$conn_id = ftp_connect($ftp_server);

// login with username and password
$login_result = ftp_login($conn_id, $ftp_user_name, $ftp_user_pass);

// try to upload $file
if (ftp_fput($conn_id, $file, $fp, FTP_ASCII)) {
    echo "Successfully uploaded $file\n";
} else {
    echo "There was a problem while uploading $file\n";
}

// close the connection and the file handler
ftp_close($conn_id);
fclose($fp);
Why not use shell_exec and scp?
<?php
$output = shell_exec('scp file1.txt dvader@deathstar.com:somedir');
echo "<pre>$output</pre>";
?>
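Since the question explicitly asks for SFTP, which neither the ftp_* functions nor plain FTP speak, one option is the PECL ssh2 extension (assuming it is installed); the host, credentials and the `sftp_url()` helper below are illustrative, not part of any answer above:

```php
<?php
// Hypothetical helper: build the ssh2.sftp:// stream URL for a remote path.
// The SFTP resource is cast to int, as the stream wrapper expects.
function sftp_url($sftp, string $remotePath): string
{
    return 'ssh2.sftp://' . intval($sftp) . $remotePath;
}

// Upload one local file over SFTP. Returns true on success.
function sftp_upload(string $host, string $user, string $pass,
                     string $localFile, string $remoteFile): bool
{
    $conn = ssh2_connect($host, 22);
    if ($conn === false || !ssh2_auth_password($conn, $user, $pass)) {
        return false;
    }
    $sftp = ssh2_sftp($conn);
    // Copy the local file through the SFTP stream wrapper.
    return copy($localFile, sftp_url($sftp, $remoteFile));
}
```

The extension also offers ssh2_scp_send() if plain scp semantics are enough, and key-based auth via ssh2_auth_pubkey_file() avoids storing the password in the cron script.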
I had a similar situation.
After some tries, I did something different.
We have two servers:
a (which has the original files)
b (which the files should be moved to)
And, to be sure, the data is NOT sensitive.
On server a I made a script that does the following when called:
1. Choose the file to move
2. Zip the file
3. Print the .zip file's location
4. Delete the .zip file (and the original file) if a delete parameter is passed
On server b the script should:
1. Call the script on server a
2. Download the zip file
3. Unzip it and copy it to the proper location
4. Call the delete function on server a
This way I have more control over my functions, tests and operations!
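The zip-and-fetch scheme above can be sketched with PHP's ZipArchive; the function names and paths are illustrative, and the HTTP call from server b to server a (plus any authentication) is elided:

```php
<?php
// Server "a": zip one file and report success. Real code should
// authenticate the caller before zipping or deleting anything.
function zip_for_transfer(string $sourceFile, string $zipPath): bool
{
    $zip = new ZipArchive();
    if ($zip->open($zipPath, ZipArchive::CREATE | ZipArchive::OVERWRITE) !== true) {
        return false;
    }
    $zip->addFile($sourceFile, basename($sourceFile));
    return $zip->close();
}

// Server "b": after downloading the archive, unpack it into place.
function unzip_to(string $zipPath, string $destDir): bool
{
    $zip = new ZipArchive();
    if ($zip->open($zipPath) !== true) {
        return false;
    }
    $ok = $zip->extractTo($destDir);
    $zip->close();
    return $ok;
}
```

Zipping also sidesteps the text/binary transfer-mode issues that plague plain FTP, since the archive is always moved as a single binary blob.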
