How to download/copy file from server A to server B? - php

I am using this code to download a package from server A and copy it to server B, but it does not always work: sometimes the transfer does not complete and the file arrives incomplete, and other times it goes well. Can I improve this code in any way, or use cURL to do the same thing?
This is my code:
// from server a to server b
$filename = 'http://domain.com/file.zip';
$dest_folder = TEMPPATH.'/';
$out_file = @fopen(basename($filename), 'w');
$in_file = @fopen($filename, 'r');
if ($in_file && $out_file) {
while ($chunk = @fgets($in_file)) {
@fputs($out_file, $chunk);
}
@fclose($in_file);
@fclose($out_file);
$zip = new ZipArchive();
$result = $zip->open(basename($filename));
if ($result) {
$zip->extractTo($dest_folder);
$zip->close();
}
}
The problem is that it is not consistent. The file does not get transferred every time; it often arrives incomplete and the script does not finish properly.

$filename = 'http://domain.com/file.zip';
$local = basename($filename); // file.zip
echo `wget $filename`;
echo `unzip $local`;
or
$filename = 'http://domain.com/file.zip';
$destfile = basename($filename);
$ch = curl_init();
$timeout = 5;
curl_setopt($ch, CURLOPT_URL, $filename);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
$data = curl_exec($ch);
curl_close($ch);
// write the downloaded data to disk and close the handle
$fp = fopen($destfile, 'w');
fwrite($fp, $data);
fclose($fp);
Really though, you need to figure out why it is failing. Is the zip operation killing it? Is the PHP script timing out because it took too long to execute? Is it running out of memory? Is the server on the other end timing out? Get some error reporting and debug data and try to figure out why it's not working. The code you have should be fine and reliable.
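For instance, here is a minimal debugging sketch along those lines. It reuses the URL from the question, drops the @ suppression so errors are visible, and swaps fgets() for a fixed-size fread(), which is safer for binary data; the logging is illustrative, not part of the original code:
// Surface every PHP error while debugging (remove in production).
error_reporting(E_ALL);
ini_set('display_errors', '1');
// Rule out a script timeout for large downloads.
set_time_limit(0);
$filename = 'http://domain.com/file.zip';
$local = basename($filename);
$in = fopen($filename, 'r'); // no @ so failures are visible
$out = fopen($local, 'w');
if ($in === false || $out === false) {
die('Could not open source or destination file');
}
while (!feof($in)) {
fwrite($out, fread($in, 8192)); // fixed-size chunks, safe for binary data
}
fclose($in);
fclose($out);
// Log what actually arrived before trying to unzip it.
error_log('Downloaded ' . filesize($local) . ' bytes to ' . $local);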

Have you checked the timeout settings on your server? They may be causing your script to time out before the code finishes executing.
Make sure your server's settings allow fopen to open external URLs (allow_url_fopen), and that you have the access rights needed to fetch the file.
Make sure the firewall of server A allows server B and is not simply blocking its IP.
Try using cURL, or file_get_contents and file_put_contents. That is likely to work as well and avoids the manual read loop (see the sketch after this list).
Check whether the problem is in the ZipArchive step or in fetching the file itself.
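A minimal sketch of the file_get_contents / file_put_contents approach mentioned above (the URL and the TEMPPATH constant come from the question; the error handling is illustrative):
$filename = 'http://domain.com/file.zip';
$dest_folder = TEMPPATH.'/';
$local_zip = $dest_folder . basename($filename);
// Fetch the whole file in one call; false means the request failed.
$data = file_get_contents($filename);
if ($data === false) {
die('Download failed: ' . $filename);
}
// Write it locally and make sure every byte made it to disk.
if (file_put_contents($local_zip, $data) !== strlen($data)) {
die('Could not write ' . $local_zip);
}
$zip = new ZipArchive();
if ($zip->open($local_zip) === true) {
$zip->extractTo($dest_folder);
$zip->close();
}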

The fact that you are having temperamental issues suggests you may be having the same issue I've encountered - which has nothing to do with the code.
I'm pulling my zip from a remote server using cURL and then extracting the locally saved zip. Sometimes it works, sometimes not... this caused some serious hair pulling to begin with.
I upload my zip via FileZilla, and what I have found is that it frequently crashes out, retries a few times, and eventually works. The uploaded file has the correct file size and looks like it uploaded successfully, but if I download it again it is sometimes simply corrupted and can't be unzipped.
As long as I make sure my uploaded zip is intact, my script works fine... so here it is:
$zip_url = "http://www.mydomain.com.au/";
$version = "1.0.1.zip"; // zip name
$ch = curl_init();
$tmp_zip = fopen($version, 'w'); // open local file for writing
curl_setopt($ch, CURLOPT_URL, "$zip_url$version"); // pull remote file
curl_setopt($ch, CURLOPT_FILE, $tmp_zip); // save to local file
$data = curl_exec($ch); // do execute
curl_close($ch);
fclose($tmp_zip); // close local file
// extract latest build
$zip = new ZipArchive;
$zip->open($version);
$result = $zip->extractTo("."); // extract to this directory
$zip->close();
if ($result) @unlink($version); // delete local zip if extracted
else echo "failed to unzip";
One big difference between my code and the previous answer is that I'm using CURLOPT_FILE rather than CURLOPT_RETURNTRANSFER. You can read why CURLOPT_FILE is better for large transfers at:
www.phpriot.com/articles/download-with-curl-and-php
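As a hedged follow-up (my addition, not part of the answer above): before deleting the downloaded zip you can ask ZipArchive to verify it, since ZipArchive::CHECKCONS runs extra consistency checks on open:
// Quick sanity checks on the downloaded zip before extracting it.
if (!file_exists($version) || filesize($version) === 0) {
die("Download produced an empty file");
}
$zip = new ZipArchive;
if ($zip->open($version, ZipArchive::CHECKCONS) !== true) {
// CHECKCONS makes open() fail if the archive is inconsistent or corrupted.
die("Zip archive is corrupted: $version");
}
$zip->close();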

Related

WP All Export - PHP snippet works on one server but not another

I'm using the WordPress plugin WP All Export to export a CSV file and send it via FTP. The plugin provider actually had a code snippet on their site to do exactly what I need. The only thing I changed in their code was adding my own FTP details.
See the code example below.
function wpae_after_export( $export_id ) {
// Retrieve export object.
$export = new PMXE_Export_Record();
$export->getById($export_id);
// Check if "Secure Mode" is enabled in All Export > Settings.
$is_secure_export = PMXE_Plugin::getInstance()->getOption('secure');
// Retrieve file path when not using secure mode.
if ( !$is_secure_export) {
$filepath = get_attached_file($export->attch_id);
// Retrieve file path when using secure mode.
} else {
$filepath = wp_all_export_get_absolute_path($export->options['filepath']);
}
// Path to the export file.
$localfile = $filepath;
// File name of remote file (destination file name).
$remotefile = basename($filepath);
// Remote FTP server details.
// The 'path' is relative to the FTP user's login directory.
$ftp = array(
'server' => 'ftp URL',
'user' => 'ftp Username',
'pass' => 'ftp Password',
'path' => '/'
);
// Ensure username is formatted properly (percent-encode any @).
$ftp['user'] = str_replace('@', '%40', $ftp['user']);
// Ensure password is formatted properly (percent-encode reserved characters).
$ftp['pass'] = str_replace(array('#','?','/','\\'), array('%23','%3F','%2F','%5C'), $ftp['pass']);
// Remote FTP URL.
$remoteurl = "ftp://{$ftp['user']}:{$ftp['pass']}@{$ftp['server']}{$ftp['path']}/{$remotefile}";
// Retrieve cURL object.
$ch = curl_init();
// Open export file.
$fp = fopen($localfile, "rb");
// Proceed if the local file was opened.
if ($fp) {
// Provide cURL the FTP URL.
curl_setopt($ch, CURLOPT_URL, $remoteurl);
// Prepare cURL for uploading files.
curl_setopt($ch, CURLOPT_UPLOAD, 1);
// Provide the export file to cURL.
curl_setopt($ch, CURLOPT_INFILE, $fp);
// Provide the file size to cURL.
curl_setopt($ch, CURLOPT_INFILESIZE, filesize($localfile));
// Start the file upload.
curl_exec($ch);
// If there is an error, write error number & message to PHP's error log.
if($errno = curl_errno($ch)) {
if (version_compare(phpversion(), '5.5.0', '>=')) {
// If PHP 5.5.0 or greater is used, use newer function for cURL error message.
$error_message = curl_strerror($errno);
} else {
// Otherwise, use legacy cURL error message function.
$error_message = curl_error($ch);
}
// Write error to PHP log.
error_log("cURL error ({$errno}): {$error_message}");
}
// Close the connection to remote server.
curl_close($ch);
} else {
// If export file could not be found, write to error log.
error_log("Could not find export file");
}
}
add_action('pmxe_after_export', 'wpae_after_export', 10, 1);
I first set this up on a HostGator dedicated server and it worked perfectly, but when I tried to copy the exact same process onto another server hosted on AWS, the file never arrives at my FTP destination.
I assume there is some sort of configuration in the cPanel on the AWS server, but I cannot figure out what. I don't get any error message when running the export either; the only thing I can see is that the export runs to 100% quickly but still runs for about two minutes before it is complete.
I also checked with the FTP owner and confirmed there is nothing on their side blocking the file from one server but not the other.
Someone also suggested I ensure both servers have cURL installed, which I did confirm.
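Since nothing reaches the error log, one debugging step I would try (my suggestion, not something from the original snippet) is to turn on cURL's verbose output inside wpae_after_export() and write it to the PHP error log, so the FTP conversation on the AWS server can be compared with the one on the working server:
// Assumed modification to the snippet above, around the existing curl_exec($ch) call:
// capture cURL's verbose FTP conversation so it can be inspected in the error log.
$verbose = fopen('php://temp', 'w+');
curl_setopt($ch, CURLOPT_VERBOSE, true);
curl_setopt($ch, CURLOPT_STDERR, $verbose);
curl_exec($ch);
rewind($verbose);
error_log("cURL verbose output:\n" . stream_get_contents($verbose));
fclose($verbose);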

MAMP strange behaviour: PHP reading an external file from http:// is very slow, but from https:// is quick

I have a simple PHP script that reads a remote file line by line and then JSON-decodes it. On the production server everything works fine, but on my local machine (MAMP stack, OS X) PHP hangs: it is very slow and takes more than two minutes to produce the JSON file. I thought json_decode() was freezing. Why only on MAMP?
Now I think it's stuck in the while loop, because I can't output the final $str variable that should hold all the lines.
In case you are wondering why I need to read the file line by line: in the real scenario the remote JSON file is a 40 MB text file, and this approach is the only one that has given me good performance. Any better suggestions?
Is there a configuration in php.ini that could help solve this?
// The path to the JSON File
$fileName = 'http://www.xxxx.xxx/response-single.json';
//Open the file in "reading only" mode.
$fileHandle = fopen($fileName, "r");
//If we failed to get a file handle, throw an Exception.
if($fileHandle === false){
error_log("erro handle");
throw new Exception('Could not get file handle for: ' . $fileName);
}
//While we haven't reach the end of the file.
$str = "";
while(!feof($fileHandle)) {
//Read the current line in.
$line = fgets($fileHandle);
$str .= $line;
}
//Finally, close the file handle.
fclose($fileHandle);
$json = json_decode($str, true); // decode the JSON into an associative array
Thanks for your time.
I found the cause: it is the protocol in the path.
With
$filename = 'http://www.yyy/response.json';
it freezes the server for one to two minutes.
I moved the file to another server that uses the https protocol and used
$filename = 'https://www.yyy/response.json';
and it works.
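If moving the file is not an option, a possible workaround (my assumption about the cause, not something confirmed above) is that the http:// endpoint stalls and PHP waits on the default socket timeout; a stream context with an explicit timeout makes the request fail fast instead of freezing MAMP:
// Assumed workaround: give the HTTP stream an explicit timeout so a stalled
// connection errors out quickly instead of hanging for minutes.
$context = stream_context_create(array(
'http' => array('timeout' => 10) // seconds
));
$fileName = 'http://www.xxxx.xxx/response-single.json';
$str = file_get_contents($fileName, false, $context);
if ($str === false) {
throw new Exception('Could not read: ' . $fileName);
}
$json = json_decode($str, true);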

Saving graphic to local folder from cron job

I have a cron job that runs once a week and checks a remote site for any updates, which may include new data, and may include new graphics.
I check my local folder to see if the graphic already exists, and if it doesn't then I want to save a local copy of that graphic.
My code seems to work when I run it from the browser, but when I check the folder every week, there are a lot of empty graphic files.
They have the correct filenames, but are all zero bytes.
My code inside the php file that the cron job runs is this:
if (!file_exists($graphic)) {
$imageString = file_get_contents($graphicurl);
file_put_contents($graphic, $imageString);
}
$graphic will be something like "filename.jpg"
$graphicurl will be something like "https://remotegraphicfile.jpg"
And I see many files such as "filename.jpg" in my local folder, but with a file size of zero bytes.
Is there any reason why this wouldn't work when called by a cron job?
I have now managed to get this working by changing my original code from:
if (!file_exists($graphic)) {
$imageString = file_get_contents($graphicurl);
file_put_contents($graphic, $imageString);
}
To this:
if (!file_exists($graphic)) {
$file = fopen ($graphic, 'w');
$c = curl_init($graphicurl);
curl_setopt($c, CURLOPT_FILE, $file);
curl_exec($c);
curl_close($c);
fclose($file);
}
This now works, and graphics save to my folder when running the cron job.
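One hedged addition to the cURL version above (the extra options and the logging are my assumptions, not part of the original fix): check the result of curl_exec() so the cron log records why a particular graphic ends up empty:
// Log failures so the weekly cron run leaves a trace of what went wrong.
if (!file_exists($graphic)) {
$file = fopen($graphic, 'w');
$c = curl_init($graphicurl);
curl_setopt($c, CURLOPT_FILE, $file);
curl_setopt($c, CURLOPT_FOLLOWLOCATION, true); // assumption: the remote site may redirect
curl_setopt($c, CURLOPT_FAILONERROR, true); // treat HTTP errors (403, 404, ...) as failures
if (curl_exec($c) === false) {
error_log('Download failed for ' . $graphicurl . ': ' . curl_error($c));
}
curl_close($c);
fclose($file);
}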

PHP fwrite() not working

I'm writing a function in PHP. Client side, I have a canvas image which I convert with toDataURL() and send along with a file name to save the image on the server. Here's the code:
<?php
$imageData=$GLOBALS['HTTP_RAW_POST_DATA'];
$data = json_decode($imageData, true);
$file = $data["file"];
$image = $data["data"];
$filteredData=substr($image, strpos($image, ",")+1);
$unencodedData=base64_decode($filteredData);
$fp = fopen( 'image/' . $file , 'wb' );
fwrite( $fp, $unencodedData);
fclose( $fp );
?>
The thing is that this code works, and for two out of three of the pages I used it on it works fine. The problem is that when I copied and pasted it a third time to implement it again, for some reason the file is created on the server but no data gets written into it. I don't think it's a problem client side, because I put a debug alert message in the JavaScript and a debug echo in the PHP, and both are able to print out the data fine. I made this short debug file:
<?php
$fp = fopen('data.txt', 'wb');
if(is_writable('data.txt')){
echo "file is writable<br>";
}
if(fwrite($fp, 'test') == FALSE){
echo "failed to write data<br>";
}
fclose($fp);
?>
And the output is
file is writable
failed to write data
I've tried using chmod to set everything (the folder and the text file) to 0777 before writing, and I still get the same result: the file is made but no data is written into it. Is there anything I'm missing, or any other approaches that might help? I haven't found anything on Google and am still baffled as to why the same code worked exactly as expected twice before suddenly stopping for no apparent reason.
Thanks in advance.
I know this is an old post, but I had a very similar problem and found a solution (for me at least)! I had run out of disk space on my server, so it could create a 0-byte file but couldn't write to it. After I cleared out some space (deleted a 13 GB error.log file), everything started working again as expected.
If fopen works but fwrite mysteriously doesn't, check your disk space. 'df -h' is the command to check disk space on a Linux server.
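If you would rather check from PHP itself, here is a small sketch using the built-in disk_free_space() (the path and the threshold are illustrative assumptions):
// Log a warning before writing if the partition is nearly full.
$freeBytes = disk_free_space(__DIR__);
if ($freeBytes !== false && $freeBytes < 10 * 1024 * 1024) { // less than ~10 MB free
error_log('Low disk space: ' . round($freeBytes / 1024 / 1024, 1) . ' MB left');
}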
Instead of $fp = fopen('data.txt', 'wb');, use $fp = fopen('data.txt', 'w'); and try again.
Changed "wb" to "w".
When you write $fp = fopen('data.txt', 'w'); for your domain website.com with its root at /var/www/website/, and the PHP file is located at /var/www/website/php/server/file/admin.php or something similar, it will actually create the file at /var/www/website/data.txt.
Try giving an absolute path, or a path relative to your domain root, when creating files, like:
$fp = fopen('php/server/file/data.txt', 'w');
Use the find command to check whether the file was created somewhere else in the directory tree, for example on Ubuntu:
find /var/www/website/ -name 'data.txt'
I had this issue; this may help you solve it if you have a similar one.

How can I resume an upload to FTP from PHP

I have a PHP script that uploads files to an FTP server from a location with a very low-bandwidth, low-reliability connection. I am currently using the FTP functions of PHP.
Sometimes the connection drops in the middle of a transfer. Is there a way to later resume the upload? How can this be done?
Edit:
Some people assumed this is happening from a browser, but that is not the case. It is a PHP CLI script, so it is about a local file being uploaded to FTP with no user interaction.
Try getting the remote file's size and then issuing the APPE FTP command with the difference; this will append to the file. See http://webmasterworld.com/forum88/4703.htm for an example. If you want real control over this, I recommend using PHP's cURL functions. A sketch of the resume idea is shown below.
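A minimal sketch of that resume idea using PHP's built-in FTP extension rather than a raw APPE command (ftp_size() reports how much already reached the server, and ftp_fput()'s offset parameter continues from there; the host, credentials and file names are placeholders):
// Hypothetical connection details.
$conn = ftp_connect('ftp.example.com');
ftp_login($conn, 'user', 'password');
ftp_pasv($conn, true);
$local_file = '/path/to/local/file.zip';
$remote_file = 'file.zip';
// How much of the file already made it to the server (-1 if it is not there yet).
$remote_size = ftp_size($conn, $remote_file);
$offset = ($remote_size > 0) ? $remote_size : 0;
// Skip the part that was already transferred and resume from the offset.
$fp = fopen($local_file, 'rb');
fseek($fp, $offset);
if (!ftp_fput($conn, $remote_file, $fp, FTP_BINARY, $offset)) {
echo "Resume failed\n";
}
fclose($fp);
ftp_close($conn);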
I think all the previous answers are wrong. PHP manages the resume itself: FTP_AUTOSEEK must be activated, and you have to pass FTP_AUTORESUME during the upload. The code should be something like this:
$ftp_connect = ftp_connect($host, $port);
ftp_login($ftp_connect, $user, $password); // log in before setting options
ftp_set_option($ftp_connect, FTP_AUTOSEEK, TRUE);
$stream = fopen($local_file, 'r');
$upload = ftp_fput($ftp_connect, $file_on_ftp, $stream, FTP_BINARY, FTP_AUTORESUME);
fclose($stream);
ftp_(f)put has a $startpos parameter. You should be able to do what you need using this parameter (it starts the transfer from the file size already on your server).
However, I have never used it, so I don't know about its reliability. You should try it.
EDIT:
Another example, by cballou:
Look here. A very simple and good example.
You could do something like this:
<?php
$host = 'your.host.name';
$user = 'user';
$passwd = 'yourpasswd';
$ftp_stream = ftp_connect($host);
//EDIT
ftp_set_option($ftp_stream, FTP_AUTOSEEK, 1);
if ( ftp_login($ftp_stream,$user,$passwd) ){
//activate passive mode
//ftp_pasv($ftp_stream, TRUE);
$ret = ftp_nb_put($ftp_stream, $remotefile, $localfile, FTP_BINARY,FTP_AUTORESUME);
while ( FTP_MOREDATA == $ret ) {
// continue transfer
$ret = ftp_nb_continue($ftp_stream);
}
if (FTP_FINISHED !== $ret){
echo 'Failure occurred on transfer ...';
}
}
else {
print "FAILURE while login ...";
}
//free
ftp_close($ftp_stream);
?>
Do you mean that you want to allow the user to resume an upload from your web interface?
Does this suit you? http://www.webmasterworld.com/php/3931706.htm
Otherwise, just use FileZilla; it can handle the uploading process for you. It keeps the transfer data, so after you close and reopen it you will be able to continue the process.
For automation and tracking, we simply use our own way, which is Git and Capistrano; there is an option for PHP as well, try googling it.
