Stream FTP upload in chunks with PHP?

Is it possible to stream an FTP upload with PHP? I have files I need to upload to another server, and I can only access that server through FTP. Unfortunately, I can't increase the timeout on this server. Is it at all possible to do this?
Basically, if there is a way to write part of a file and then append the next part (and repeat) instead of uploading the whole thing at once, that would save me. However, my Googling hasn't turned up an answer.
Is this achievable?

OK then... this might be what you're looking for. Are you familiar with cURL?
cURL supports appending for FTP:
curl_setopt($ch, CURLOPT_FTPAPPEND, TRUE); // APPEND FLAG
The other option is to use the ftp:// / ftps:// stream wrappers; since PHP 5 they allow appending. See the ftp:// and ftps:// wrapper docs. They might be easier to work with.
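For illustration, here is a minimal sketch of appending one chunk with cURL. The URL, credentials, and the $chunk variable are placeholders, not part of the original answer:
$ch = curl_init('ftp://username:password@hostname/path/to/file');
$fp = fopen('php://temp', 'r+');
fwrite($fp, $chunk);
rewind($fp);
curl_setopt($ch, CURLOPT_UPLOAD, true);           // we are uploading
curl_setopt($ch, CURLOPT_FTPAPPEND, true);        // APPE instead of STOR
curl_setopt($ch, CURLOPT_INFILE, $fp);
curl_setopt($ch, CURLOPT_INFILESIZE, strlen($chunk));
if (curl_exec($ch) === false) {
    echo 'cURL error: ' . curl_error($ch);
}
curl_close($ch);
fclose($fp);
Run that once per chunk; because each transfer appends, no single FTP request has to outlive the timeout.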

The easiest way to append a chunk to the end of a remote file is to use file_put_contents with the FILE_APPEND flag:
file_put_contents('ftp://username:password@hostname/path/to/file', $chunk, FILE_APPEND);
If it does not work, it's probably because you do not have URL wrappers enabled in PHP.
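To illustrate the chunked idea from the question, here is a sketch using this wrapper (the local path, chunk size, and credentials are placeholder assumptions, and the remote file is assumed not to exist yet):
$remote = 'ftp://username:password@hostname/path/to/file';
$local = fopen('/path/to/large/local/file', 'rb');

$first = true;
while (!feof($local)) {
    $chunk = fread($local, 1024 * 1024); // 1 MiB per transfer
    // first call creates the remote file, later calls append to it
    file_put_contents($remote, $chunk, $first ? 0 : FILE_APPEND);
    $first = false;
}
fclose($local);
Each call is a separate, short FTP transfer, so the server's timeout only has to cover one chunk at a time.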
If you need greater control over the writing (transfer mode, passive mode, etc.), or you cannot use file_put_contents, use ftp_fput with a handle to the php://temp (or php://memory) stream:
// connect and log in
$conn_id = ftp_connect('hostname');
ftp_login($conn_id, 'username', 'password');
ftp_pasv($conn_id, true);

// stage the chunk in a temporary in-memory stream
$h = fopen('php://temp', 'r+');
fwrite($h, $chunk);
rewind($h);

// prevent ftp_fput from seeking the local "file" ($h)
ftp_set_option($conn_id, FTP_AUTOSEEK, false);

// append by starting the upload at the current remote file size
$remote_path = '/path/to/file';
$size = ftp_size($conn_id, $remote_path);
$r = ftp_fput($conn_id, $remote_path, $h, FTP_BINARY, $size);

fclose($h);
ftp_close($conn_id);
(add error handling)
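For instance, a minimal sketch of that error handling, reusing the variables from the snippet above (the checks and messages are illustrative):
if (($conn_id = ftp_connect('hostname')) === false) {
    die('Could not connect to FTP server');
}
if (!ftp_login($conn_id, 'username', 'password')) {
    die('FTP login failed');
}
ftp_pasv($conn_id, true);
// ftp_size returns -1 if the remote file does not exist yet
$size = ftp_size($conn_id, $remote_path);
if ($size < 0) {
    $size = 0;
}
if (!ftp_fput($conn_id, $remote_path, $h, FTP_BINARY, $size)) {
    die('Append upload failed');
}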

Related

Why does the file not upload? PHP and FTP [duplicate]

I have some JSON data that I encoded with PHP's json_encode(); it looks like this:
{
    "site": "site1",
    "nbrSicEnt": 85
}
What I want to do is to write the data directly as a file onto an FTP server.
For security reasons, I don't want the file to be created locally first before sending it to the FTP server, I want it to be created on the fly. So without using tmpfile() for example.
When I read the PHP documentation for ftp_put:
bool ftp_put ( resource $ftp_stream , string $remote_file ,
string $local_file , int $mode [, int $startpos = 0 ] )
One needs to create a local file (string $local_file) before writing it to the remote file.
I am looking for a way to directly write into the remote_file. How can I do that using PHP?
file_put_contents is the easiest solution:
file_put_contents('ftp://username:password@hostname/path/to/file', $contents);
If it does not work, it's probably because you do not have URL wrappers enabled in PHP.
If you need greater control over the writing (transfer mode, passive mode, offset, reading limit, etc.), use ftp_fput with a handle to the php://temp (or php://memory) stream:
$conn_id = ftp_connect('hostname');
ftp_login($conn_id, 'username', 'password');
ftp_pasv($conn_id, true);
$h = fopen('php://temp', 'r+');
fwrite($h, $contents);
rewind($h);
ftp_fput($conn_id, '/path/to/file', $h, FTP_BINARY, 0);
fclose($h);
ftp_close($conn_id);
(add error handling)
Or you can open/create the file directly on the FTP server. That's particularly useful if the file is large, as you won't have to keep the whole contents in memory.
See Generate CSV file on an external FTP server in PHP.
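A minimal sketch of that approach, assuming the URL wrapper is enabled (the credentials and path are placeholders):
// open the remote file for writing; nothing is stored locally
$h = fopen('ftp://username:password@hostname/path/to/file.json', 'w');
if ($h === false) {
    die('Cannot open remote file');
}
fwrite($h, json_encode($object));
fclose($h);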
According to Can you append lines to a remote file using ftp_put() or something similar? and Stream FTP Upload with PHP?, you should be able to do this with either cURL or PHP's FTP wrappers via file_put_contents().
$data = json_encode($object);
file_put_contents("ftp://user:pass#host/dir/file.ext", $data, FILE_APPEND);

Using file_get_contents vs curl for file size

I have a file uploading script running on my server which also features remote uploads. Everything works fine, but I am wondering what the best way is to upload via URL. Right now I am using fopen to get the file from the remote URL pasted into the text box named "from". I have heard that fopen isn't the best way to do it. Why is that?
Also, I am using file_get_contents to get the file size of the file from the URL. I have heard that curl is better for that part. Why is that, and how can I apply these changes to this script?
<?php
$from = htmlspecialchars(trim($_POST['from']));
if ($from != "") {
    $file = file_get_contents($from);
    $filesize = strlen($file);
    while (!feof($file)) {
        $move = "./uploads/" . $rand2;
        move_upload($_FILES['from']['tmp_name'], $move);
        $newfile = fopen("./uploads/" . $rand2, "wb");
        file_put_contents($newfile, $file);
    }
}
?>
You can use filesize to get the file size of a file on disk.
file_get_contents actually reads the file into memory, so $filesize = strlen(file_get_contents($from)); already fetches the file; you just don't do anything with it other than find its size. You can substitute file_put_contents for your fwrite call.
See: file_get_contents and file_put_contents.
curl is used when you need more access to the HTTP protocol. There are many questions and examples on StackOverflow using curl in PHP.
So we can first download the file (in this example I will use file_get_contents), get its size, then put the file in the directory on your local disk.
$tmpFile = file_get_contents($from);
$fileSize = strlen($tmpFile);
// you could do a check for file size here
$newFileName = "./uploads/$rand2";
file_put_contents($newFileName, $tmpFile);
In your code you have move_upload($_FILES['from']['tmp_name'], $move); but $_FILES is only applicable when you have a <input type="file"> element, which it doesn't seem you have.
P.S. You should probably white-list the characters that you allow in a filename, for instance $goodFilename = preg_replace("/[^a-zA-Z0-9]+/", "-", $filename). This is often easier to read and safer.
Replace:
while (!feof($file)) {
$move = "./uploads/" . $rand2;
move_upload($_FILES['from']['tmp_name'], $move);
$newfile = fopen("./uploads/" . $rand2, "wb");
file_put_contents($newfile, $file);
}
With:
$newFile = "./uploads/" . $rand2;
file_put_contents($newFile, $file);
The whole file is read in by file_get_contents, and the whole file is written by file_put_contents.
As far as I understand your question: you want to get the file size of a remote file given by a URL, and you're not sure which solution is best/fastest.
First off, the biggest difference between cURL, file_get_contents() and fopen()/fread() in this context is that cURL and file_get_contents() put the whole thing into memory, while fopen()/fread() gives you more control over which parts of the file you want to read. I think fopen() and file_get_contents() are nearly equivalent in your case, because you're dealing with small files and you actually want to get the whole file. So it doesn't make any difference in terms of memory usage.
cURL is just the big brother of file_get_contents(). It is actually a complete HTTP client rather than just a wrapper for simple functions.
And talking about HTTP: don't forget there's more to HTTP than GET and POST. Why don't you just use the resource's metadata to check its size before you even fetch it? That's one thing the HTTP method HEAD is meant for. PHP even comes with a built-in function for getting the headers: get_headers(). It has some flaws, though: it still sends a GET request, which probably makes it a little slower, and it follows redirects, which may cause security issues. But you can fix this pretty easily by adjusting the default context:
$opts = array(
    'http' => array(
        'method'        => 'HEAD',
        'max_redirects' => 1,
        'ignore_errors' => true
    )
);
stream_context_set_default($opts);
Done. Now you can simply get the headers:
$headers = get_headers('http://example.com/pic.png', 1);
//set the keys to lowercase so we don't have to deal with lower- and upper case
$lowerCaseHeaders = array_change_key_case($headers);
// 'content-length' is the header we're interested in:
$filesize = $lowerCaseHeaders['content-length'];
NOTE: filesize() will not work on a http / https stream wrapper, because stat() is not supported (http://php.net/manual/en/wrappers.http.php).
And that's pretty much it. Of course you can achieve the same with cURL just as easily if you like it better. The approach would be the same (reading the headers).
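For example, a minimal cURL HEAD request might look like this (the URL is a placeholder):
$ch = curl_init('http://example.com/pic.png');
curl_setopt($ch, CURLOPT_NOBODY, true);          // HEAD request: headers only
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, false); // don't follow redirects
curl_exec($ch);
// reported Content-Length; -1 if the server didn't send it
$filesize = curl_getinfo($ch, CURLINFO_CONTENT_LENGTH_DOWNLOAD);
curl_close($ch);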
And here's how you get the file and its size (after downloading) with cURL:
// Create a CURL handle
$ch = curl_init();
// Set all the options on this handle
// find a full list on
// http://au2.php.net/manual/en/curl.constants.php
// http://us2.php.net/manual/en/function.curl-setopt.php (for actual usage)
curl_setopt($ch, CURLOPT_URL, 'http://example.com/pic.png');
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, false);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// Send the request and store what is returned to a variable
// This actually contains the raw image data now, you could
// pass it to e.g. file_put_contents();
$data = curl_exec($ch);
// get the required info about the request
// find a full list on
// http://us2.php.net/manual/en/function.curl-getinfo.php
$filesize = curl_getinfo($ch, CURLINFO_SIZE_DOWNLOAD);
// close the handle after you're done
curl_close($ch);
Pure PHP approach: http://codepad.viper-7.com/p8mlOt
Using CURL: http://codepad.viper-7.com/uWmsYB
For a nicely formatted and human-readable output of the file size, I've learned this amazing function from Laravel:
function get_file_size($size)
{
    $units = array('Bytes', 'KiB', 'MiB', 'GiB', 'TiB', 'PiB', 'EiB');
    return @round($size / pow(1024, ($i = floor(log($size, 1024)))), 2) . ' ' . $units[$i];
}
If you don't want to deal with all this, you should check out Guzzle. It's a very powerful and extremely easy-to-use library for any kind of HTTP work.
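A quick sketch of the same HEAD lookup with Guzzle (assumes guzzlehttp/guzzle is installed via Composer; the URL is a placeholder):
require 'vendor/autoload.php';

$client = new \GuzzleHttp\Client();
// HEAD request: only headers come back, no body
$response = $client->head('http://example.com/pic.png');
$filesize = (int) $response->getHeaderLine('Content-Length');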

Save large files from php stdin

Please advise me on the best way to save large files from PHP stdin.
An iOS developer sends large video content to my server and I have to store it in files.
I read the stdin stream with the video data and write it to a file. For example, this way:
$handle = fopen("php://input", "rb");
$http_raw_post_data = '';
while (!feof($handle)) {
    $http_raw_post_data .= fread($handle, 8192);
}
fclose($handle);
What function is better to use? file_get_contents or fread or something else?
I agree with @hek2mgl that treating this as a multipart form upload would make the most sense, but if you can't alter your input interface, then you can use file_put_contents() on a stream instead of looping through the file yourself.
$handle = fopen("php://input", "rb");
if (false === file_put_contents("outputfile.dat", $handle))
{
    // handle error
}
fclose($handle);
It's cleaner than iterating through the file, and it might be faster (I haven't tested).
Don't use file_get_contents because it would attempt to load all the content of the file into a string
FROM PHP DOC
file_get_contents() is the preferred way to read the contents of a file into a string. It will use memory mapping techniques if supported by your OS to enhance performance.
I'm sure you just want to create the movie file on your server. This is a more efficient way:
$in = fopen("php://input", "rb");
$out = fopen('largefile.dat', 'wb'); // binary mode for video data
while (!feof($in)) {
    fwrite($out, fread($in, 8192));
}
fclose($in);
fclose($out);
If you use nginx as your web server, I recommend the nginx upload module, which supports resumable uploads.

How can I unzip a .gz file with PHP?

I'm using CodeIgniter and I can't figure out how to unzip files!
PHP itself has a number of functions for dealing with gzip files.
If you want to create a new, uncompressed file, it would be something like this.
Note: This doesn't check if the target file exists first, doesn't delete the input file, or do any error checking. You really should fix those before using this in production code.
// This input should be from somewhere else, hard-coded in this example
$file_name = 'file.txt.gz';
// Raising this value may increase performance
$buffer_size = 4096; // read 4kb at a time
$out_file_name = str_replace('.gz', '', $file_name);
// Open our files (in binary mode)
$file = gzopen($file_name, 'rb');
$out_file = fopen($out_file_name, 'wb');
// Keep repeating until the end of the input file
while(!gzeof($file)) {
// Read buffer-size bytes
// Both fwrite and gzread are binary-safe
fwrite($out_file, gzread($file, $buffer_size));
}
// Files are done, close files
fclose($out_file);
gzclose($file);
Note: This deals with gzip only. It doesn't deal with tar.
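If the input is actually a .tar.gz archive, PHP's bundled PharData class can handle the tar part too. A hedged sketch (the paths are placeholders):
$gz = new PharData('archive.tar.gz');
$gz->decompress();                  // writes archive.tar next to the input
$tar = new PharData('archive.tar');
$tar->extractTo('/output/dir');     // unpack the tar contents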
gzopen is way too much work. This is more intuitive:
$zipped = file_get_contents("foo.gz");
$unzipped = gzdecode($zipped);
This also works on HTTP responses when the server is sending gzipped data.
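A sketch of that case (the URL is a placeholder, and it assumes the server really does return a gzip-compressed body):
$context = stream_context_create(array(
    'http' => array('header' => "Accept-Encoding: gzip\r\n")
));
$raw = file_get_contents('http://example.com/page.html', false, $context);
$html = gzdecode($raw);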
If you have access to system():
system("gunzip file.sql.gz");
Use the functions implemented by the Zlib compression extension.
This snippet shows how to use some of the functions the extension makes available:
// open file for reading
$zp = gzopen($filename, "r");
// read 3 characters
echo gzread($zp, 3);
// output the rest of the file and close it
gzpassthru($zp);
gzclose($zp);
Download the Unzip library, then include or autoload it:
$this->load->library('unzip');
