PHP - Merging files with FTP

So, I'm trying to copy a file from my local server to a remote server with FTP. The problem is, I need to do this in chunks.
Thus far, based on my research, it looks like it will probably be easiest to do this by making two files on my local server: the file to be copied, and a small, temporary file that holds the current chunk. Then, I should simply merge that chunk with the remote file.
The problem is, I'm having trouble appending files. I can't figure out how the ftp:// protocol works, and I can't find a comprehensive explanation on how to do it with cURL. The closest I've found is this, but I couldn't get it working.
Below is what I've written up so far. I've commented it so you can just skim through it to get the idea, and see where I'm stuck--the code isn't complicated. What do you recommend I do? How do I append a file on my local server to a file on a remote server with FTP?
<?php
// FTP credentials
$server = HIDDEN;
$username = HIDDEN;
$password = HIDDEN;

// Connect to FTP
$connection = ftp_connect($server) or die("Failed to connect to <b>$server</b>.");

// Login to FTP
if (!@ftp_login($connection, $username, $password))
{
    echo "Failed to login to <b>$server</b> as <b>$username</b>.";
}

// Destination file (where the copied file should go)
$destination = 'final.txt';

// The file on my server that we're copying (in chunks) to $destination.
$read = 'readme.txt';

// Current chunk of $read.
$temp = 'temp.tmp';

// If the file we're trying to copy exists...
if (file_exists($read))
{
    // Set a chunk size (this is tiny, but I'm testing
    // with tiny files just to make sure it works)
    $chunk_size = 4;

    // For reading through the file we want to copy to the FTP server.
    $read_handle = fopen($read, 'r');

    // For writing the chunk to its own file.
    $temp_handle = fopen($temp, 'w+');

    // Loop through $read until we reach the end of the file.
    while (!feof($read_handle))
    {
        // Read a chunk of the file we're copying.
        $chunk = fread($read_handle, $chunk_size);

        // Write that chunk to its own file.
        fwrite($temp_handle, $chunk);

        ////////////////////////////////////////////
        ////                                    ////
        ////        NOW WHAT?? HOW DO I         ////
        ////        WRITE / APPEND THAT         ////
        ////       CHUNK TO $destination?       ////
        ////                                    ////
        ////////////////////////////////////////////
    }
}

fclose($read_handle);
fclose($temp_handle);
ftp_close($connection);
?>

First off, I don't think you need to create the temp file. You can append to the destination file either by opening it in "append mode" ('a'), as described in the manual, or by using file_put_contents with the FILE_APPEND flag.
file_put_contents takes a filename as its parameter, not a file handle, so it essentially does the fopen for you. That means that if you write frequently it will fopen frequently, which is less efficient than keeping a file handle open and using fwrite.
<?php
// The URI of the remote file to be written to
$write = 'ftp://username1:password1@domain1.com/path/to/writeme.txt';

// The URI of the remote file to be read
$read = 'ftp://username2:password2@domain2.com/path/to/readme.txt';

if (file_exists($read)) // this will work over ftp too
{
    $chunk_size = 4;
    $read_handle = fopen($read, 'r');
    $write_handle = fopen($write, 'a');

    while (!feof($read_handle))
    {
        $chunk = fread($read_handle, $chunk_size);
        fwrite($write_handle, $chunk);
    }
}
fclose($read_handle);
fclose($write_handle);
?>
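For comparison, here is a minimal sketch of the file_put_contents / FILE_APPEND variant mentioned above (hypothetical credentials and paths; as noted, it reopens the target on every call, so the file-handle version is preferable when writing many small chunks):
<?php
// Local file to copy and remote file to append to (hypothetical)
$read  = 'readme.txt';
$write = 'ftp://username:password@domain.com/path/to/writeme.txt';

$read_handle = fopen($read, 'r');
while (!feof($read_handle))
{
    $chunk = fread($read_handle, 4);
    // FILE_APPEND makes file_put_contents add to the end instead of overwriting.
    file_put_contents($write, $chunk, FILE_APPEND);
}
fclose($read_handle);
?>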
As a side note: PHP has stream wrappers that make ftp access a breeze without using the ftp functions themselves.
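Conversely, if you would rather stick with the ftp_* functions from the question, PHP 7.2 added ftp_append(), which appends a local file to a remote file. A rough sketch along the lines of the question's chunk/temp-file approach (hypothetical credentials; requires PHP >= 7.2):
<?php
$connection = ftp_connect('ftp.example.com');
ftp_login($connection, 'username', 'password');

$read = 'readme.txt';        // local source file
$destination = 'final.txt';  // remote target file
$temp = 'temp.tmp';          // holds the current chunk, as in the question

$read_handle = fopen($read, 'r');
while (!feof($read_handle))
{
    // Write the current chunk to the temp file...
    file_put_contents($temp, fread($read_handle, 4));
    // ...then append that temp file to the remote file.
    ftp_append($connection, $destination, $temp, FTP_BINARY);
}
fclose($read_handle);
ftp_close($connection);
?>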

Related

MAMP strange behaviour: PHP reading an external file over http:// is very slow, but over https:// is quick

I have a simple PHP script that reads a remote file line by line and then JSON-decodes it. On the production server everything works fine, but on my local machine (MAMP stack, OS X) PHP hangs. It is very slow, and takes more than 2 minutes to produce the JSON file. I think it's json_decode() that is freezing. Why only on MAMP?
I think it's stuck in the while loop, because I can't output the final $str variable that holds the result of all the lines.
In case you are wondering why I need to read the file line by line: in the real scenario the remote JSON file is a 40MB text file, and this is the approach that has given me the best performance so far. Any better suggestions?
Is there a configuration in php.ini to help solve this?
// The path to the JSON file
$fileName = 'http://www.xxxx.xxx/response-single.json';

// Open the file in "reading only" mode.
$fileHandle = fopen($fileName, "r");

// If we failed to get a file handle, throw an Exception.
if ($fileHandle === false) {
    error_log("erro handle");
    throw new Exception('Could not get file handle for: ' . $fileName);
}

// While we haven't reached the end of the file.
$str = "";
while (!feof($fileHandle)) {
    // Read the current line in.
    $line = fgets($fileHandle);
    $str .= $line;
}

// Finally, close the file handle.
fclose($fileHandle);

$json = json_decode($str, true); // decode the JSON into an associative array
Thanks for your time.
I found the cause: it is the protocol in the path.
With
$filename = 'http://www.yyy/response.json';
It freezes the server for 1 to 2 minutes.
I moved the file to another server that uses the https protocol, and used
$filename = 'https://www.yyy/response.json';
and it works.
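Regarding the php.ini question above: how long a blocking socket read can hang is governed by default_socket_timeout (60 seconds by default), and you can also set a per-request timeout through a stream context so a stalled http:// read fails fast instead of blocking for minutes. A minimal sketch (hypothetical URL):
<?php
$fileName = 'http://www.example.com/response-single.json';

// 10-second timeout for this request only (overrides default_socket_timeout).
$context = stream_context_create(array(
    'http' => array('timeout' => 10),
));

$str = file_get_contents($fileName, false, $context);
if ($str === false) {
    throw new Exception('Could not read: ' . $fileName);
}
$json = json_decode($str, true);
The same context can also be passed as the fourth argument to fopen() if you prefer to keep the line-by-line loop.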

PHP Persist variable across all requests

In some languages (C# or .NET) this would be a static variable, but in PHP the memory is cleared after each request. I want the value to persist across all requests. I don't want $_SESSION, because that is different for each user.
To help explain, here is an example:
I want to have a script like this that will count up, no matter which user/browser opens the URL.
<?php
function getServerVar($name){
    ...
}

function setServerVar($name, $val){
    ...
}

$count = getServerVar("count");
$count++;
setServerVar("count", $count);

echo $count;
I want the value stored in memory. It does not need to persist when Apache restarts, and the data is not so important that it needs to be thread safe.
UPDATE: I'm fine if it holds different values per server in a load-balanced environment. Static variables in C# or Java would not be in sync across servers either.
You would typically use a database to store the count.
However as an alternative you could do so using a file:
<?php
$file = 'count.txt';
if (!file_exists($file)) {
    touch($file);
}

// Open the file stream
$handle = fopen($file, "r+");

// Lock file, error if unable to lock
if (flock($handle, LOCK_EX)) {
    $size = filesize($file);
    $count = $size === 0 ? 0 : fread($handle, $size); // Get current hit count
    $count = $count + 1;       // Increment hit count by 1
    echo $count;
    ftruncate($handle, 0);     // Truncate the file to 0
    rewind($handle);           // Set write pointer to beginning of file
    fwrite($handle, $count);   // Write the new hit count
    flock($handle, LOCK_UN);   // Unlock file
} else {
    echo "Could not Lock File!";
}

// Close stream
fclose($handle);
In PHP you're going to have to use an external store that all servers share. The most commonly used tool is memcached, but SQL and Redis both work fine for this use case.
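A minimal sketch of the memcached option (assuming the Memcached extension and a memcached server on localhost; the key name is arbitrary):
<?php
$m = new Memcached();
$m->addServer('127.0.0.1', 11211);

// Atomically bump the shared counter; seed it on the first request
// (increment() returns false when the key does not exist yet).
$count = $m->increment('count');
if ($count === false) {
    $m->add('count', 1);
    $count = 1;
}
echo $count;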
The only way to do that is, as bspates said, to use a tool that does not depend on any in-process resource on your server. If you have multiple servers, you cannot rely on memory-based mechanisms on each machine.
You have to store this number outside the servers, because otherwise each server will keep its own value in its own file or memory.
File writing, like $_SESSION, will work if you have only one server receiving your requests. For more than one server, you need some kind of database that all your servers can communicate with.

String to Zipped Stream in php

I have a processing server with my database and a serving database to serve up files with a low bandwidth cost. On the processing server, php is not able to create files so everything must be done with streams and/or stay in memory before being sent over to another server for download. A few days ago I found out about the stream abstraction with 'php://memory' and that I can do something like
$fp=fopen('php://memory','w+');
fwrite($fp,"Hello world");
fseek($fp,0,SEEK_SET);
//make a ftp connection here with $conn_id
$upload = ftp_fput($conn_id,"targetpath/helloworld.txt",$fp,FTP_BINARY);
to make the file in memory and then allow me to ftp it over to my other server. This is exactly what I want, except I also want to zip the data before sending it -- preferably using only native parts of php like ziparchive and not additional custom classes for special stream manipulation. I know that I am very close with the following...
$zip = new ZipArchive();
if ($zip->open('php://memory', ZIPARCHIVE::CREATE)) {
    $zip->addFromString('testtext.txt', 'Hello World!');
    $fp = $zip->getStream('test');
    if (!$fp) print "no filepointer";
    // make a ftp connection here with $conn_id
    $upload = ftp_fput($conn_id, "targetpath/helloworld.zip", $fp, FTP_BINARY);
} else print "couldn't open a zip like that";
The point at which this fails is the call to getStream (which always returns false although I think I am using correctly). It appears that the zip is fine making the file in 'php://memory' but for some reason getStream still fails although perhaps I don't sufficiently understand how ZipArchive makes zips...
How can I go from the string to the zipped filepointer so that I can ftp the zip over to my other server? Remember I can't make any files or else I would just make the zip file then ftp it over.
EDIT: based on skinnynerd's suggestions below I tried the following
$zip = new ZipArchive();
if ($zip->open('php://memory', ZIPARCHIVE::CREATE)) {
    $zip->addFromString('testtext.txt', 'Hello World!');
    $zip->close();
    $fp = fopen('php://memory', 'r+');
    fseek($fp, 0, SEEK_SET);
    // connect to ftp
    $upload = ftp_fput($conn_id, "upload/transfer/helloworld.zip", $fp, FTP_BINARY);
}
This does make a zip and send it over, but the zip is 0 bytes large, so I don't think that 'php://memory' works the way I thought... it actually fails at the close step -- $zip->close() returns false, which makes me wonder if I can open zips into 'php://memory' at all. Does anyone know what I can try along these lines to get the zip?
$zip->getStream('test') is getting a stream to extract the file 'test' from the archive. Since there's no file 'test' in the archive, this fails. This is not the function you want to use.
As you said, what you want to do is send the finished archive to the ftp server. In this case, you would want to close the zip archive, and then reopen php://memory as a normal file (using fopen) to send it.
I don't know, but you may also be able to use $zip as a resource directly, without having to close and reopen the file.
And I think you can try creating a stream directly to the FTP server:
<?php
$zip = new ZipArchive();
if ($zip->open('ftp://user:password@ftp.host.com/upload/transfer/helloworld.zip', ZipArchive::CREATE))
{
    $zip->addFromString('testtext.txt', 'Hello World!');
    $zip->close();
}
else
    print "couldn't open zip file on remote ftp host.";
Does it have to be a Zip archive? Since you're trying to save bandwidth, it could be a gzip too.
<?php
$ftp_credentials = "ftp://USER:PASSWORD@HOST/helloworld.gz";

$gz = gzencode("Hello World!", 9);

$options = array('ftp' => array('overwrite' => true));
$stream_context = stream_context_create($options);

file_put_contents($ftp_credentials, $gz, 0, $stream_context);
?>
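On the other end, the gzip can be unpacked again with gzdecode(), or read back as plain text through the compress.zlib:// stream wrapper, e.g. (hypothetical path to wherever helloworld.gz ended up):
<?php
echo file_get_contents('compress.zlib:///path/to/helloworld.gz');
?>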

Detect lost file handle in PHP when following log files

I am trying to build a small daemon in PHP that analyzes the log files on a Linux system (e.g. follow the syslog).
I have managed to open the file via fopen and continuously read it with stream_get_line. My problem starts when the monitored file is deleted and recreated (e.g. when rotating logs). The program then does not read anything anymore, even if the file grows larger than it was before.
Is there an elegant solution for this? stream_get_meta_data does not help and using tail -f on the command line shows the same problem.
EDIT, added sample code
I tried to boil down the code to a minimum to illustrate what I am looking for
<?php
$break = FALSE;
$handle = fopen('./testlog.txt', 'r');
do {
    $line = stream_get_line($handle, 100, "\n");
    if (!empty($line)) {
        // do something
        echo $line;
    }
    while (feof($handle)) {
        sleep(5);
        $line = stream_get_line($handle, 100, "\n");
        if (!empty($line)) {
            // do something
            echo $line;
        }
        // a comment on php.net indicated it is possible
        // with tcp streams to distinguish empty and lost
        // does NOT work here --> need somefunction($handle)
        if ($line !== FALSE && $line === '') $break = TRUE;
    }
} while (!$break);
fclose($handle);
?>
When log files are rotated, the original file is copied, then deleted, and a new file with the same name is created. It may have the same name as the original file, but it has a different inode. Inodes (dumbed-down description follows) are like hidden incremental index numbers for your files: you can rename or move a file and it keeps its inode. Once that original log file is deleted, your existing handle still points at the old inode, so you can't pick up the new file with the same name through it. Your best bet is to detect the failure and attempt to open the new file.
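Following that explanation, here is a minimal sketch of such a check: compare the inode of the handle you have open (fstat) with the inode the path currently points to (stat), and reopen when they differ. This is an illustration rather than drop-in code; the path and sleep interval are taken from the question.
<?php
$path   = './testlog.txt';
$handle = fopen($path, 'r');

while (true) {
    $line = stream_get_line($handle, 100, "\n");
    if ($line !== false && $line !== '') {
        echo $line, "\n";   // do something with the line
        continue;
    }

    sleep(5);

    // Has the file behind $path been rotated out from under us?
    clearstatcache(true, $path);
    $current = @stat($path);
    $open    = fstat($handle);
    if ($current === false || $current['ino'] !== $open['ino']) {
        fclose($handle);
        // Wait for the new log file to appear, then follow it instead.
        while (($handle = @fopen($path, 'r')) === false) {
            sleep(5);
        }
    }
}
?>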

Using fgetcsv on a file on an FTP server

I need to read a list of CSV files from an FTP and delete them after I successfully read them.
Until now, I opened the CSV file using fopen to get a resource and then used fgetcsv to read the CSV lines from it.
$res = fopen($url, 'r');
while ($csv_row = fgetcsv($res, null, self::DELIMITER)) {
    .....
}
The problem is that I need to read a list of CSV files and delete them too. The ftp_get function saves the file into a local file, which I'd rather avoid. Is there any way I can keep using the fgetcsv function together with the ftp_nlist and ftp_connect functions?
You can save the csv file to a temporary file stream using ftp_fget(). This allows you to avoid the "create-read-delete" cycle. Once you close the file stream it's like it magically never existed :)
$ftp_handle = ftp_connect($ftp_server);
$remote_path = "/path/to/file.csv";
$tmp_handle = fopen('php://temp', 'r+');

if (ftp_fget($ftp_handle, $tmp_handle, $remote_path, FTP_ASCII)) {
    rewind($tmp_handle);
    while ($csv_row = fgetcsv($tmp_handle)) {
        // do stuff
    }
}
fclose($tmp_handle);
If you wanted to loop over a directory of files just get the list of files and then put the above code in a loop.
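A minimal sketch of that loop, including the delete-after-read step from the question (hypothetical host, credentials and directory; note that ftp_nlist() may return bare file names or full paths depending on the server):
<?php
$ftp_handle = ftp_connect('ftp.example.com');
ftp_login($ftp_handle, 'user', 'password');

foreach ((ftp_nlist($ftp_handle, '/path/to/csv-dir') ?: array()) as $remote_path) {
    $tmp_handle = fopen('php://temp', 'r+');
    if (ftp_fget($ftp_handle, $tmp_handle, $remote_path, FTP_ASCII)) {
        rewind($tmp_handle);
        while (($csv_row = fgetcsv($tmp_handle)) !== false) {
            // do stuff with $csv_row
        }
        // Only delete the remote file once it has been read successfully.
        ftp_delete($ftp_handle, $remote_path);
    }
    fclose($tmp_handle);
}
ftp_close($ftp_handle);
?>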
