Copying a file uploaded via HTTP GET in PHP 4 - php

I have been working on adding functionality to a site originally written in PHP 4.4.9. It's not in their budget to port the site to PHP5, so don't even suggest it. (Although it needs it badly). The problem I am facing is how to copy binary data from a GET request to a file location on the server. The code that is currently written to support this method is as follows:
function save($path) {
    $input = fopen("php://input", "r");
    $temp = tmpfile();
    $realSize = stream_copy_to_stream($input, $temp);
    fclose($input);
    if ($realSize != $this->getSize()){
        return false;
    }
    $target = fopen($path, "w");
    fseek($temp, 0, SEEK_SET);
    stream_copy_to_stream($temp, $target);
    fclose($target);
}
The problem I am having with this is that the function stream_copy_to_stream is only supported in PHP 5. Here is what I have so far, but it seems to only create files that are 8K in size, and I'm not sure why. It should, in theory, allow for up to 20M.
function save($path) {
    $input = fopen("php://input", "rb");
    $temp = tmpfile();
    fwrite($temp, fread($input, 20971520));
    fclose($input);
    $target = fopen($path, "w");
    fseek($temp, 0, SEEK_SET);
    #stream_copy_to_stream($temp, $target);
    fwrite($target, fread($temp, 20971520));
    fclose($target);
    echo $path;
    return true;
}
I removed the size check because I couldn't figure out a way to get the file size while reading. Any help is greatly appreciated. I have been racking my brain for hours, and I know there is someone out there, most likely on Stack Overflow, who can probably answer my question very easily.
Thanks for all the help in advance!
Also, I am submitting data via GET to allow multiple file uploads with progress bars, etc.

I came across this thread looking for an answer to the exact same problem. I know the post is old, but I'm putting an answer here for anyone else looking.
You were close: fread only returns up to 8192 bytes from a stream per call, so you have to loop until you reach the end of the file.
function save($path) {
    $input = fopen("php://input", "rb");
    $temp = tmpfile();
    while (!feof($input)) {
        fwrite($temp, fread($input, 8192));
    }
    //fwrite($temp, fread($input, 20971520));
    fclose($input);
    $target = fopen($path, "w");
    fseek($temp, 0, SEEK_SET);
    #stream_copy_to_stream($temp, $target);
    while (!feof($temp)) {
        fwrite($target, fread($temp, 8192));
    }
    fclose($target);
    echo $path;
    return true;
}
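If you still want the size check from the original code, you can count the bytes as you copy instead of relying on stream_copy_to_stream's return value. A rough sketch, assuming the same getSize() method the original class provides (everything used here exists in PHP 4):
function save($path) {
    $input = fopen("php://input", "rb");
    $temp = tmpfile();
    $realSize = 0;
    // copy the request body in 8K chunks, counting bytes as we go
    while (!feof($input)) {
        $chunk = fread($input, 8192);
        $realSize += strlen($chunk);
        fwrite($temp, $chunk);
    }
    fclose($input);
    // compare what we actually received against the expected size
    if ($realSize != $this->getSize()) {
        return false;
    }
    $target = fopen($path, "wb");
    fseek($temp, 0, SEEK_SET);
    while (!feof($temp)) {
        fwrite($target, fread($temp, 8192));
    }
    fclose($target);
    return true;
}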

Related

Why can't I read php://input if a file is sent in Firefox?

I'm using the Valums File Uploader plugin. The plugin works in Chrome, but I have a problem with Firefox. I've isolated the problem, but I can't work out what's happening...
The problem is here:
function save($path) {
    $input = fopen("php://input", "r");
    $temp = tmpfile();
    $realSize = stream_copy_to_stream($input, $temp);
    fclose($input);
    if ($realSize != $this->getSize()){
        return false;
    }
    $target = fopen($path, "w");
    fseek($temp, 0, SEEK_SET);
    stream_copy_to_stream($temp, $target);
    fclose($target);
    return true;
}
This piece of PHP is part of the server side, the function that should take the uploaded file and move it to $path. The problem is that $input contains no data; it's empty. Even if I put this in the first line of my PHP handler file:
$postdata = file_get_contents("php://input");
the $postdata string is empty. Only in Firefox, not in Chrome. On the client side, Firebug shows that Firefox is sending the file.
But the PHP documentation states that php://input "is not available with enctype='multipart/form-data'", so I don't know if the problem is Firefox, php://input, or sending a file with multipart/form-data (I don't know how to send a file without it). What's the problem?
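For what it's worth, upload handlers that accept both kinds of request usually branch on how the browser sent the file: a raw XHR body can be read from php://input, while a multipart/form-data post (which is what Firefox appears to be doing here) is parsed by PHP and lands in $_FILES, leaving php://input empty. A minimal sketch of that branch; the 'qqfile' parameter name and $path are assumptions, not necessarily what your plugin uses:
if (isset($_FILES['qqfile'])) {
    // multipart/form-data post: PHP has already parsed the body,
    // so php://input is empty and the file is in $_FILES
    move_uploaded_file($_FILES['qqfile']['tmp_name'], $path);
} else {
    // raw XHR upload: the file arrives as the request body itself
    $input = fopen("php://input", "rb");
    $target = fopen($path, "wb");
    while (!feof($input)) {
        fwrite($target, fread($input, 8192));
    }
    fclose($input);
    fclose($target);
}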

Alternative to stream_copy_to_stream() in PHP

I am working on a file-sharing site and I've run into a small problem. I am using the upload script Uploadify, which works perfectly, but I want the uploaded file to be encrypted if the user chooses. I have working code that does this (shown below), but my server only has 1GB of memory, and stream_copy_to_stream seems to use roughly the size of the actual file in memory. My max upload size is 256MB, so I know something bad is going to happen when the site goes live and multiple people upload large files at once. Based on my code below, is there any alternative that uses little or no memory? I wouldn't even care if it takes longer; I just need this to work. I have the download version of this working because the file is decrypted and passed straight through to the browser, so it decrypts as it downloads, which I thought was pretty efficient, but this upload problem doesn't look too good. Any help is appreciated.
$temp_file = $_FILES['Filedata']['tmp_name'];
$ext = pathinfo($_FILES['Filedata']['name'], PATHINFO_EXTENSION);
$new_file_name = md5(uniqid(rand(), true));
$target_file = rtrim(enc_target_path, '/') . '/' . $new_file_name . '.enc.' . $ext;
$iv_size = mcrypt_get_iv_size(MCRYPT_RIJNDAEL_128, MCRYPT_MODE_CBC);
$iv = mcrypt_create_iv($iv_size, MCRYPT_RAND);
$key = substr(md5('some_salt' . $password, true) . md5($password . 'more_salt', true), 0, 24);
$opts = array('iv' => $iv, 'key' => $key);
$my_file = fopen($temp_file, 'rb');
$encrypted_file_name = $target_file;
$encrypted_file = fopen($encrypted_file_name, 'wb');
stream_filter_append($encrypted_file, 'mcrypt.rijndael_128', STREAM_FILTER_WRITE, $opts);
stream_copy_to_stream($my_file, $encrypted_file);
fclose($encrypted_file);
fclose($my_file);
unlink($temp_file);
$temp_file is the first instance of the uploaded file that I can see.
Do you have better results if you try reading the file in chunks, like this?
$my_file = fopen($temp_file, 'rb');
$encrypted_file_name = $target_file;
$encrypted_file = fopen($encrypted_file_name, 'wb');
stream_filter_append($encrypted_file, 'mcrypt.rijndael_128', STREAM_FILTER_WRITE, $opts);
//stream_copy_to_stream($my_file, $encrypted_file);
rewind($my_file);
while (!feof($my_file)) {
    fwrite($encrypted_file, fread($my_file, 4096));
}
You might also try calling stream_set_chunk_size prior to calling stream_copy_to_stream to set the size of the buffer it uses to read from the source stream when copying to the destination.
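For example, a rough sketch of that idea (stream_set_chunk_size requires PHP 5.4 or later, and whether it keeps memory flat in your setup is something you'd want to verify):
$my_file = fopen($temp_file, 'rb');
$encrypted_file = fopen($target_file, 'wb');
stream_filter_append($encrypted_file, 'mcrypt.rijndael_128', STREAM_FILTER_WRITE, $opts);
// ask PHP to read the source in 64K chunks rather than its default buffer
stream_set_chunk_size($my_file, 65536);
stream_copy_to_stream($my_file, $encrypted_file);
fclose($encrypted_file);
fclose($my_file);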
Hope that helps.
EDIT: I tested with this code, and when uploading a 700MB movie file the peak memory usage of PHP was 524,288 bytes. It looks like stream_copy_to_stream will try to read the entire source file into memory unless you read it in chunks, passing the length and offset arguments.
$encrypted_file_name = $target_file;
$encrypted_file = fopen($encrypted_file_name, 'wb');
stream_filter_append($encrypted_file, 'mcrypt.rijndael_128', STREAM_FILTER_WRITE, $opts);
$size = 16777216; // buffer size of copy
$pos = 0; // initial file position
fseek($my_file, 0, SEEK_END);
$length = ftell($my_file); // get file size
while ($pos < $length) {
    $writ = stream_copy_to_stream($my_file, $encrypted_file, $size, $pos);
    $pos += $writ;
}
fclose($encrypted_file);
fclose($my_file);
unlink($temp_file);

How to read the contents of a PUT request and grab the file contents in a variable in PHP

Below is the code I want to modify:
$input = fopen("php://input", "r");
$temp = tmpfile();
$realSize = stream_copy_to_stream($input, $temp);
fclose($input);
if ($realSize != $this->getSize()){
    return false;
}
$target = fopen($path, "w");
fseek($temp, 0, SEEK_SET);
stream_copy_to_stream($temp, $target);
fclose($target);
I want to save the contents into memory and transfer them across to another server without saving them on the Apache server.
When I try to output the contents, I only see Resource id #5. Any suggestions or comments are highly appreciated. Thanks.
The code you have opens file handles, which in themselves are not the content. To get the content into a variable, just read it like any other file:
$put = file_get_contents('php://input');
To get the contents of the stream:
rewind($temp); // rewind the stream to the beginning
$contents = stream_get_contents($temp);
var_dump($contents);
Or use file_get_contents, as @deceze mentions.
UPDATE
I noticed you're also opening a temp file on disk. You might want to consider simplifying your code like so:
$put = stream_get_contents(fopen('php://input', 'r')); // read the raw request body
if ($put) {
    $target = fopen('/storage/put.txt', "w");
    fwrite($target, $put);
    fclose($target);
}
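If the goal really is to hand the body straight to another server without writing it to disk, one option is to post it on with a stream context. A rough sketch; the target URL and content type are placeholder assumptions:
$body = file_get_contents('php://input'); // raw request body, held in memory
$context = stream_context_create(array(
    'http' => array(
        'method'  => 'PUT',
        'header'  => 'Content-Type: application/octet-stream',
        'content' => $body,
    ),
));
// hypothetical endpoint on the other server
$response = file_get_contents('http://other-server.example/receive.php', false, $context);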

PHP input stream size limitations

I am trying to read a raw input stream in PHP using php://input. This works for most files; however, files over 4MB are being ignored in the upload. I have set post_max_size and upload_max_size to 20M each, thinking it would solve my problem, but it didn't. Is there another php.ini setting that needs to be configured, or do I need to do chunking of some sort? If so, how would I go about doing that? Here is the upload.php code:
$fileName = $_SERVER['HTTP_X_FILE_NAME'];
$contentLength = $_SERVER['CONTENT_LENGTH'];
file_put_contents('uploads/' . $fileName, file_get_contents("php://input"));
Try stream_copy_to_stream, which directly pumps the content of the input into the file without copying it all into memory first:
$input = fopen('php://input', 'rb');
$file = fopen($filename, 'wb');
stream_copy_to_stream($input, $file);
fclose($input);
fclose($file);
Alternative:
$input = fopen('php://input', 'rb');
$file = fopen($filename, 'wb');
while (!feof($input)) {
    fwrite($file, fread($input, 102400));
}
fclose($input);
fclose($file);
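If the body is still being cut off, it can also be worth confirming which limits PHP is actually running with, since file_get_contents("php://input") additionally needs enough memory_limit headroom to hold the whole file. A quick check:
// print the settings that commonly affect large raw uploads
var_dump(
    ini_get('post_max_size'),
    ini_get('upload_max_filesize'),
    ini_get('memory_limit'),
    ini_get('max_input_time')
);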

Using php://input and file_put_contents

I'm receiving files (images) uploaded with Ajax into my PHP script and have got it to work using this:
$input = fopen("php://input", "r");
file_put_contents('image.jpg', $input);
Obviously I will sanitize input before this operation.
One thing I wanted to check was the file size prior to creating the new file, as follows:
$input = fopen("php://input", "r");
$temp = tmpfile();
$realsize = stream_copy_to_stream($input, $temp);
if ($realsize === $_SERVER["CONTENT_LENGTH"]) {
    file_put_contents('image.jpg', $temp);
}
And that doesn't work. The file is created, but it has a size of 0 bytes, so the content isn't being put into the file. I'm not awfully familiar with using streams, but I don't see why that shouldn't work, so I'm turning to you for help. Thanks in advance!
The solution was deceptively simple:
$input = fopen("php://input", "r");
file_put_contents($path, $input);
You are using file resources as if they were strings. Instead you could again use stream_copy_to_stream:
stream_copy_to_stream($temp, fopen('image.jpg', 'w'));
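Putting that together with two small details the snippet above misses, namely that $temp needs rewinding after the first copy and that $_SERVER["CONTENT_LENGTH"] is a string so the strict comparison always fails, a size-checked version might look roughly like this:
$input = fopen("php://input", "r");
$temp = tmpfile();
$realsize = stream_copy_to_stream($input, $temp);
fclose($input);
// CONTENT_LENGTH arrives as a string, so cast it before a strict comparison
if ($realsize === (int) $_SERVER["CONTENT_LENGTH"]) {
    rewind($temp); // the pointer is at the end of $temp after the first copy
    stream_copy_to_stream($temp, fopen('image.jpg', 'w'));
}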
