I'm trying to write a sizable chunk of data to a file that is opened via fopen() in PHP. The protocol wrapper I'm using is ftp, so the file is remote to the server running the PHP code. The file I'm writing to is on a Windows server.
I verified that the file does, in fact, get created by my PHP code, but the problem is that the data within the file is either non-existent (0 KB) or writing to the file stops prematurely. I'm not sure why this is the case.
Here is the code I am using for handling the operation:
$file_handle = fopen($node['ftp'].$path_to_lut, "wb", 0, $node['ftp_context']);
include_once($file);

if ($file_handle) {
    fwrite($file_handle, $string); // $string is defined inside the included $file
    fclose($file_handle);
} else {
    die('There was a problem opening the file.');
}
This code works fine when I host it on my local machine, but when I upload it to my webhost (Rackspace Cloud), it fails. This leads me to believe it's an issue related to the configuration of my server at Rackspace, but I want to know if there is anything I can do to my PHP code to make it more robust.
Any ideas to ensure fwrite actually finishes writing the string to the remote machine?
Thanks!
Okay, I changed the code that writes to the file like so:
if ($file_handle) {
    if ($bytesWritten = fwrite($file_handle, $string)) {
        echo "There were " . $bytesWritten . " bytes written to the text file.";
    }
    if (!fflush($file_handle)) {
        die("There was a problem outputting all the data to the text file.");
    }
    if (!fclose($file_handle)) {
        die("There was a problem closing the text file.");
    }
} else {
    die("No file to write data to. Sorry.");
}
What is strange is that the echo statement shows the following:
There were 10330 bytes written to the text file.
And yet, when I verify the text file size via FTP it shows it to be 0K and the data inside the file is, in fact, truncated. I can't imagine it has to do with the FTP server itself because it works if the PHP is hosted on a machine other than the one on Rackspace Cloud.
** UPDATE **
I spoke to a Rackspace Cloud rep who mentioned that they require passive FTP if you're going to FTP from their servers. I set up the remote server to handle passive FTP connections, and have verified that passive FTP now works on the remote server via the OSX Transmit FTP client. I added:
ftp_pasv($file_handle, true);
right after the fopen() statement, but I get an error from PHP saying that I didn't provide a valid resource to ftp_pasv(). How can I ensure that the connection PHP makes to the FTP site is PASV and not ACTIVE while still using fwrite()? Incidentally, I've noticed that the Windows machine reports that the file being written by my PHP code is 4096 bytes on disk. It never gets beyond that amount. This led me to change the output_buffering PHP value to 65536 just to troubleshoot, but that didn't fix the issue either.
** UPDATE PART DUEX **
Troubleshooting the problem on my virtual server on the Rackspace Cloud Sites product was proving too difficult because they don't offer enough admin rights. I created a very small cloud server on Rackspace's Cloud Server product and configured everything to the point where I'm still seeing the same error with fwrite(). To make sure that I could write a file from that server to a remote server, I used basic ftp commands within my bash shell on the cloud server. It worked fine. So I assume that there is a bug within the PHP implementation of fwrite(), and that it is probably due to some type of data throttling issue. When I write to the remote server from my local environment, which has a slow upload speed compared to what is offered on the Rackspace Cloud server, it works fine. Is there any way to effectively throttle down the speed of the write? Just askin' :)
** UPDATE PART III **
So, I took the suggestion from #a sad dude and implemented a function that might help somebody trying to write to a new file and send it off in its entirety via ftp:
function writeFileAndFTP($filename=null, $data=null, $node=null, $local_path=null, $remote_path=null)
{
    // !Determine the path and the file to upload from the webserver
    $file = $local_path.'/'.$filename;

    // !Open a new file to write to on the local machine
    if (!($file_handle = fopen($file, "wb", 0))) {
        die("There was a problem opening ".$file." for writing!");
    }

    // !Write the file to local disk
    if ($bytesWritten = fwrite($file_handle, $data)) {
        //echo "There were " . $bytesWritten . " bytes written to " . $file;
    }

    // !Close the file from writing
    if (!fclose($file_handle)) {
        die("There was a problem closing " . $file);
    }

    // !Create connection to remote FTP server
    $ftp_cxn = ftp_connect($node['addr'], $node['ftp_port']) or die("Couldn't connect to the ftp server.");

    // !Login to the remote server
    ftp_login($ftp_cxn, $node['user'], getPwd($node['ID'])) or die("Couldn't login to the ftp server.");

    // !Set PASV or ACTIVE FTP
    ftp_pasv($ftp_cxn, true);

    // !Upload the file
    if (!ftp_put($ftp_cxn, $remote_path.'/'.$filename, $file, FTP_ASCII)) {
        die("There was an issue ftp'ing the file to ".$node['addr'].$remote_path);
    }

    // !Close the ftp connection
    ftp_close($ftp_cxn);
}
The length of the string fwrite() can write in one go is limited on some platforms (which is why it returns the number of bytes written). You can call it in a loop until the whole string has been written, but a simpler option is to use file_put_contents(), which takes care of this for you.
http://www.php.net/manual/en/function.file-put-contents.php
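The loop approach can be sketched like this (a minimal example; fwrite_all is a hypothetical helper name, demonstrated here against a local temp file, though the same stream API applies to an ftp:// handle opened with fopen()):

```php
<?php
// Write an entire string to a stream, looping until every byte is accepted.
// fwrite() may perform a partial write, so we track how much went out.
function fwrite_all($handle, $string)
{
    $total  = 0;
    $length = strlen($string);
    while ($total < $length) {
        $written = fwrite($handle, substr($string, $total));
        if ($written === false || $written === 0) {
            return false; // the stream refused more data
        }
        $total += $written;
    }
    return $total;
}

// Demonstration with a local file; in the question this would be the
// ftp:// stream instead.
$path = tempnam(sys_get_temp_dir(), 'lut');
$fh   = fopen($path, 'wb');
$data = str_repeat('0123456789', 1033); // ~10330 bytes, as in the question
var_dump(fwrite_all($fh, $data) === strlen($data)); // bool(true)
fclose($fh);
unlink($path);
```

Checking the return value this way also tells you exactly how far the write got before the connection stalled, which is useful when diagnosing the 4096-byte truncation described above.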
Related
I'm developing a web application with PHP to upload files to a web server, which merges and compresses those files (PDF) and then uploads the new files to another FTP server. This second FTP server is used as a network drive in my offices so people can work with the uploaded files.
The number of files goes from 2 to 7 per upload.
I'm having the following problem: sometimes files are uploaded, but most of the time they aren't. Sometimes one file out of three is uploaded and the other two are not.
I almost always get this:
Warning: ftp_nb_put(): The connection timed out in ...
Warning: ftp_nb_put(): Opening data channel for uploading files to server from...
Warning: ftp_nb_put(): File not found in...
I tested with another (personal) FTP server from a webhost and it worked perfectly 10/10 times.
So I guess that the problem is coming from the ftp server.
I also tested with passive mode, but I get more errors, even with:
ftp_pasv($ftp, true);
ftp_set_option($ftp, FTP_USEPASVADDRESS, false);
I don't have access to the FTP server settings, so I would like to know what could possibly be blocking these uploads, to pass that information on to the service that manages this server.
This is my code for the FTP part:
$ftp = ftp_connect($ftp_hostname, 21);
ftp_login($ftp, $ftp_username, $ftp_password);
ftp_mkdir($ftp, $targetdir);

foreach ($file as $filename => $path) {
    if (file_exists($path)) {
        $upload = ftp_nb_put($ftp, $targetdir.$filename, $path, FTP_BINARY, FTP_AUTORESUME);
        while (FTP_MOREDATA == $upload) {
            $upload = ftp_nb_continue($ftp);
        }
        if ($upload != FTP_FINISHED) {
            echo "Error with : ".$path;
        }
    } else {
        echo("File not found : ".$path);
    }
}
ftp_close($ftp);
I hope I didn't miss any information.
Thanks!
I feel like this should be a pretty straightforward process.
I have the following code:
<?php
$filename = "c:/TestFolder/config.txt";
echo "Attempting to read: ".$filename."<br/>";
$fh = fopen($filename, 'r') or die("file doesnt exist");
$readtext = fread($fh, filesize($filename));
fclose($fh);
echo "The Text is: ".$readtext;
?>
I have checked that I do indeed have "config.txt" in a folder called "TestFolder" on my C:/ drive... but I keep getting an error that the file doesn't exist.
I checked my PHPInfo to ensure that "allow_url_fopen" is turned on.
I have also tried different file path variations, such as:
C:\\TestFolder\\config.txt
C:\/TestFolder\/config.txt
This doesn't seem to make a difference.
Any idea what might be preventing me from opening this file?
Edits:
It should be noted that this file is uploaded to my web host, while I am attempting to access a file on my local machine. Not sure if this changes things at all.
This is not possible. "local files" are files on the server where PHP is running, not the client running the web browser. While they might be the same machine when you're testing locally, once you upload the script to a web host, PHP tries to read files on the web host, not your machine.
The only way for PHP to access a file on the client machine is for the application to upload it.
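A minimal sketch of that upload route (the form field name config, the uploads/ directory, and the sanitize_upload_name helper are all hypothetical names, not an existing API):

```php
<?php
// Reduce a client-supplied filename to a safe basename before storing it.
function sanitize_upload_name($name)
{
    $name = basename($name); // drop any directory components
    $name = preg_replace('/[^A-Za-z0-9._-]/', '_', $name); // neutralize odd chars
    return $name === '' ? 'upload.dat' : $name;
}

// Hypothetical handler for a form such as:
//   <form method="post" enctype="multipart/form-data">
//     <input type="file" name="config"><input type="submit">
//   </form>
if (isset($_FILES['config']) && $_FILES['config']['error'] === UPLOAD_ERR_OK) {
    $target = __DIR__ . '/uploads/' . sanitize_upload_name($_FILES['config']['name']);
    move_uploaded_file($_FILES['config']['tmp_name'], $target);
    echo "The Text is: " . file_get_contents($target);
}
```

Once the browser has uploaded the file, the script reads it from the web host's own disk, which is the only filesystem PHP can see.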
I am building a few Excel files with PHPExcel and storing them on the server running the script.
This part works well.
I then try to send them to my client's FTP server using ftp_put, but the files arrive corrupted.
When I download them back from the FTP server with FileZilla and open them, I get an error saying that there is a problem with the content of the file and offering to repair it as much as possible. Doing so doesn't help.
If the Excel file is under 100-120k it arrives uncorrupted; bigger than that, it gets corrupted.
Script to send via FTP:
$conn_id = ftp_connect($hostFTP);
if ($login_result = ftp_login($conn_id, $userFTP, $passwordFTP)) {
    ftp_pasv($conn_id, true);
    if (ftp_put($conn_id, $remote_path.$output_filename, $localPath.$output_filename, FTP_BINARY)) {
        $log = new Logs("listes.log", "Uploaded $output_filename");
    } else {
        $log = new Logs("listes.log", "FAIL Uploading $output_filename");
    }
    ftp_close($conn_id);
}
Am I doing something wrong? What can I do?
Edit:
Comparing the files as text shows some differences starting at line 231, which I assume is around the 100-120k mark.
I tried with an image (jpg) and it also gets corrupted (the top of the image is fine, but at some point it corrupts and the rest of the image is pretty much a single color), so the problem isn't with Excel.
Is it possible that the server (openSUSE) limits FTP? And if so, how?
Well, I've found the problem, and it's not the code: it's the internet service. We changed internet provider yesterday and the FTP now works fine. I guess the FTP connection was being interrupted by the previous provider.
I'm having a little problem with ftp_get. The script won't work on our local development server running CentOS 6.
I've done some research on Stack Overflow and tried most of the solutions without getting it to work.
I have tested the same script on a production server running CentOS 5.x + cPanel, and there it works.
I am wondering what could cause this on the local server?
Is there any specific settings for the file transfer to work?
Here's the list of things I've tried so far:
Errors: turned on error_reporting(E_ALL) to check for any errors. There were none.
Passive connection, ftp_pasv($connection, true) but it doesn't change anything.
Transfer modes, tried FTP_BINARY and FTP_ASCII. Nothing changes.
Tried using a file handler to save on the local server, didn't work either.
The original script runs in batch (downloading all files in a folder with specified filenames). When running that script, about 3/4 of the files get transferred successfully to the 'dev' server.
The remaining 1/4 or so fail. The following script was used to troubleshoot / test-download some of the files that failed during the batch transfer.
They also failed to transfer using this script.
I've checked all the permissions (chmod) on the remote server. Everything is set to 666.
Here's the code I'm using. It's very similar to the example on php.net
// Connection to ftp
$connection = ftp_connect($hostname);

// Login to the FTP
$login = ftp_login($connection, $username, $password);

// Passive might help?
ftp_pasv($connection, true);

// Test login
if ((!$connection) || (!$login)) {
    echo "FTP Connection failed<br /><br />";
    exit;
} else {
    echo "Connection success<br /><br />";
}

$local_file  = 'file.zip';
$server_file = 'file.zip';

// Download and save file
if (ftp_get($connection, $local_file, $server_file, FTP_ASCII)) {
    echo "Download win <br />";
} else {
    echo "Download failed <br />";
}
Thanks for your time,
First, try checking the same thing from the command line / shell of your local CentOS machine. Use the ftp command and see if it works there. Mostly, it seems like a firewall problem to me.
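If the command-line transfer works but the batch script still fails intermittently, a small retry wrapper can paper over transient data-connection failures (retry is a hypothetical helper; it is demonstrated here with a stand-in callable rather than a live ftp_get() call):

```php
<?php
// Retry a flaky operation a few times before giving up.
// $op is any callable returning true on success, false on failure.
function retry(callable $op, $attempts = 3, $delaySeconds = 0)
{
    for ($i = 0; $i < $attempts; $i++) {
        if ($op()) {
            return true;
        }
        if ($delaySeconds > 0) {
            sleep($delaySeconds);
        }
    }
    return false;
}

// In the batch script this would wrap the real transfer, e.g.:
//   retry(function () use ($connection, $local_file, $server_file) {
//       return ftp_get($connection, $local_file, $server_file, FTP_BINARY);
//   }, 3, 2);
// Demonstrated here with a stand-in that fails twice, then succeeds:
$calls = 0;
$ok = retry(function () use (&$calls) {
    $calls++;
    return $calls >= 3;
});
var_dump($ok, $calls); // bool(true), int(3)
```

Note that a .zip file should be fetched with FTP_BINARY; FTP_ASCII rewrites line endings and will corrupt binary payloads.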
Our customers run our PHP program on their own servers. Some are Linux, some are Windows. To update the program I use FTP. Recently I changed the FTP server, i.e. the physical machine running the server.
Now some of our customers experience trouble when they want to update. If they execute a test script they get the following message:
Warning: ftp_get() [function.ftp-get]: Opening BINARY mode data connection for _testupdate.txt (68 bytes). in C:\Programme\Zend\Apache2\htdocs_testupdate.php on line 65
I've tried to download the file via passive and active mode, and also with ASCII and BINARY mode, but nothing changes. Here is the code:
echo "<br> Testfilegröße wurde richtig ermittelt."; // "Test file size was determined correctly."
$bstat = ftp_get($conn_id, "_testupdate.txt", "_testupdate.txt", FTP_BINARY); # FTP_ASCII or FTP_BINARY
$exists = file_exists("_testupdate.txt");
At first I thought that the firewall could be causing the problem, but this seems unlikely, because the test script can connect and log in to the FTP server.
Does anybody have an idea what I can try to solve the problem?
Try this:
ftp_pasv($conn_id, TRUE);
It helps if a firewall is indeed the culprit.
Otherwise, make sure the transfer mode matches the file: FTP_BINARY for binary files, FTP_ASCII for plain text. This warning can appear when the wrong mode is used.
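If you're unsure which mode a given file needs, a crude heuristic is to scan a sample of it for NUL bytes, which plain text never contains (looks_binary is a hypothetical helper, and the check can misfire on e.g. UTF-16 text):

```php
<?php
// Guess whether a file is binary by scanning its first 8 KB for NUL bytes.
// Plain text contains no NULs; most binary formats (zip, xlsx, jpg) do.
function looks_binary($path)
{
    $sample = file_get_contents($path, false, null, 0, 8192);
    return $sample !== false && strpos($sample, "\0") !== false;
}

// Pick the FTP transfer mode accordingly:
//   $mode = looks_binary($local) ? FTP_BINARY : FTP_ASCII;

// Quick demonstration with two temp files:
$text = tempnam(sys_get_temp_dir(), 'txt');
file_put_contents($text, "just some text\r\n");
$bin = tempnam(sys_get_temp_dir(), 'bin');
file_put_contents($bin, "\x00\x01\x02binary payload");
var_dump(looks_binary($text)); // bool(false)
var_dump(looks_binary($bin));  // bool(true)
unlink($text);
unlink($bin);
```

When in doubt, FTP_BINARY is the safer default, since it transfers the bytes unchanged.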