Alternative to stream_copy_to_stream() in PHP

I am working on a file sharing site and I've run into a small problem. I am using the upload script Uploadify, which works perfectly, but I want the uploaded file to be encrypted if the user chooses. I have working code that does this (shown below), but my server only has 1GB of memory, and stream_copy_to_stream seems to use roughly as much memory as the size of the actual file. My max upload size is 256MB, so I know something bad is going to happen when the site goes live and multiple people upload large files at once. Based on my code below, is there an alternative that uses very little memory, or none at all? I wouldn't even care if it takes longer; I just need this to work. I have the download version of this working, because the file is decrypted and passed straight through to the browser, so it decrypts as it downloads, which I thought was pretty efficient, but this upload problem doesn't look too good. Any help is appreciated.
$temp_file = $_FILES['Filedata']['tmp_name'];
$ext = pathinfo($_FILES['Filedata']['name'], PATHINFO_EXTENSION);
$new_file_name = md5(uniqid(rand(), true));
$target_file = rtrim(enc_target_path, '/') . '/' . $new_file_name . '.enc.' . $ext;
$iv_size = mcrypt_get_iv_size(MCRYPT_RIJNDAEL_128, MCRYPT_MODE_CBC);
$iv = mcrypt_create_iv($iv_size, MCRYPT_RAND);
$key = substr(md5('some_salt' . $password, true) . md5($password . 'more_salt', true), 0, 24);
$opts = array('iv' => $iv, 'key' => $key);
$my_file = fopen($temp_file, 'rb');
$encrypted_file_name = $target_file;
$encrypted_file = fopen($encrypted_file_name, 'wb');
stream_filter_append($encrypted_file, 'mcrypt.rijndael_128', STREAM_FILTER_WRITE, $opts);
stream_copy_to_stream($my_file, $encrypted_file);
fclose($encrypted_file);
fclose($my_file);
unlink($temp_file);
$temp_file is the first place I can see the uploaded file.
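(For context, the decrypt-on-download path mentioned in the question might look roughly like the sketch below; this is only an illustration using the matching mdecrypt read filter, not the asker's actual code.)
// Sketch only: stream the encrypted file back, decrypting as it is read,
// so the whole file never has to sit in memory. $opts holds the same iv/key pair.
$encrypted = fopen($target_file, 'rb');
stream_filter_append($encrypted, 'mdecrypt.rijndael_128', STREAM_FILTER_READ, $opts);
header('Content-Type: application/octet-stream');
fpassthru($encrypted); // pass the decrypted bytes straight through to the browser
fclose($encrypted);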

Do you have better results if you try reading the file in chunks, like this?
$my_file = fopen($temp_file, 'rb');
$encrypted_file_name = $target_file;
$encrypted_file = fopen($encrypted_file_name, 'wb');
stream_filter_append($encrypted_file, 'mcrypt.rijndael_128', STREAM_FILTER_WRITE, $opts);
//stream_copy_to_stream($my_file, $encrypted_file);
rewind($my_file);
// read and write in 4 KB chunks so only a small buffer is held in memory at any time
while (!feof($my_file)) {
    fwrite($encrypted_file, fread($my_file, 4096));
}
fclose($encrypted_file);
fclose($my_file);
You might also try calling stream_set_chunk_size prior to calling stream_copy_to_stream to set the size of the buffer it uses to read from the source stream when copying to the destination.
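For example (a rough sketch, assuming PHP 5.4+ where stream_set_chunk_size() is available):
$my_file = fopen($temp_file, 'rb');
$encrypted_file = fopen($target_file, 'wb');
stream_filter_append($encrypted_file, 'mcrypt.rijndael_128', STREAM_FILTER_WRITE, $opts);
// set the size of the read buffer for this stream before the copy
stream_set_chunk_size($my_file, 65536); // 64 KB
stream_copy_to_stream($my_file, $encrypted_file);
fclose($encrypted_file);
fclose($my_file);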
Hope that helps.
EDIT: I tested with this code and when uploading a 700MB movie file, the peak memory usage of PHP is 524,288 bytes. It looks like stream_copy_to_stream will try to read the entire source file into memory unless you read it in chunks passing the length and offset arguments.
$my_file = fopen($temp_file, 'rb'); // reopen the uploaded temp file for reading
$encrypted_file_name = $target_file;
$encrypted_file = fopen($encrypted_file_name, 'wb');
stream_filter_append($encrypted_file, 'mcrypt.rijndael_128', STREAM_FILTER_WRITE, $opts);
$size = 16777216; // buffer size of each copy (16 MB)
$pos = 0;         // initial file position
fseek($my_file, 0, SEEK_END);
$length = ftell($my_file); // get file size
while ($pos < $length) {
    $writ = stream_copy_to_stream($my_file, $encrypted_file, $size, $pos);
    $pos += $writ;
}
fclose($encrypted_file);
fclose($my_file);
unlink($temp_file);

Related

Reading large portions of files for hashing/checksums

If I have three GET parameters:
$filename = $_GET['filename'];
$start = $_GET['start'];
$size = $_GET['size'];
And I am reading a chunk of the file like so:
$handle = fopen($basepath . $filename, "rb");
fseek($handle, $start);
$contents = fread($handle, $size);
echo md5($contents);
How can I read large portions of a file (anywhere from 1MB to 1GB) and create a hash or checksum of its contents without needing to allocate enough memory for the entire read?
At the moment, if I try to hash too large a part of the file, I get a memory error, since PHP cannot allocate enough memory (roughly 400MB).
Is there a hashing function in which I can digest parts of the file at a time rather than the entire contents at once (for example, starting at $start, read 100KB blocks and feed them to the function until $size is met)? And how would I read the file in chunks so that I start at $start and read $size bytes?
If there is no such hashing or checksum function that supports being fed chunks of the data at a time, would file_get_contents() fix the issue of allocating memory for large reads? I am not entirely sure how that function works.
Thanks.
http://php.net/manual/en/function.hash-update.php
<?php
define('CHUNK', 65536);
//$file = 'alargefile.img';
//$start = 256 * 1024 * 1024;
//$size = 512 * 1024 * 1024;
$fp = fopen($file, "rb");
fseek($fp, $start);
// incremental hashing: feed the region to the hash context one chunk at a time
$ctx = hash_init('md5');
while ($size > 0) {
    $buffer = fread($fp, min($size, CHUNK));
    hash_update($ctx, $buffer);
    $size -= CHUNK;
}
$hash = hash_final($ctx);
fclose($fp);
print $hash;
?>

getimagesize() limiting file size for remote URL

I could use getimagesize() to validate an image, but the problem is: what if a mischievous user puts a link to a 10GB random file? It would whack my production server's bandwidth. How do I limit the file size getimagesize() is getting? (e.g. 5MB max image size)
PS: I did research before asking.
You can download the file separately, imposing a maximum size you wish to download:
function mygetimagesize($url, $max_size = -1)
{
    // create temporary file to store data from $url
    if (false === ($tmpfname = tempnam(sys_get_temp_dir(), uniqid('mgis')))) {
        return false;
    }
    // open input and output
    if (false === ($in = fopen($url, 'rb')) || false === ($out = fopen($tmpfname, 'wb'))) {
        unlink($tmpfname);
        return false;
    }
    // copy at most $max_size bytes
    stream_copy_to_stream($in, $out, $max_size);
    // close input and output file
    fclose($in); fclose($out);
    // retrieve image information
    $info = getimagesize($tmpfname);
    // get rid of temporary file
    unlink($tmpfname);
    return $info;
}
You don't want to do something like getimagesize('http://example.com') to begin with, since this will download the image once, check the size, then discard the downloaded image data. That's a real waste of bandwidth.
So, separate the download process from the checking of the image size. For example, use fopen to open the image URL, read little by little and write it to a temporary file, keeping count of how much you have read. Once you cross 5MB and are still not finished reading, you stop and reject the image.
You could try to read the HTTP Content-Length header before starting the actual download to weed out obviously large files, but you cannot rely on it, since it can be spoofed or omitted.
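A rough sketch of that read-and-count approach (the 5MB limit and the helper name are just for illustration):
// Download at most $maxBytes of $url into a temp file, then validate it.
// Returns the getimagesize() result, or false if the limit was exceeded.
function fetch_and_check_image($url, $maxBytes = 5242880) // 5 MB
{
    $tmp = tempnam(sys_get_temp_dir(), 'img');
    $in  = fopen($url, 'rb');
    $out = fopen($tmp, 'wb');
    if (!$in || !$out) {
        unlink($tmp);
        return false;
    }
    $read = 0;
    while (!feof($in)) {
        $chunk = fread($in, 8192);
        $read += strlen($chunk);
        if ($read > $maxBytes) { // crossed the limit: stop and reject
            fclose($in);
            fclose($out);
            unlink($tmp);
            return false;
        }
        fwrite($out, $chunk);
    }
    fclose($in);
    fclose($out);
    $info = getimagesize($tmp);
    unlink($tmp);
    return $info;
}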
Here is an example; you will need to make some changes to fit your requirements.
function getimagesize_limit($url, $limit)
{
    global $phpbb_root_path; // $phpbb_root_path and unique_id() come from the author's phpBB environment
    $tmpfilename = tempnam($phpbb_root_path . 'store/', unique_id() . '-');
    $fp = fopen($url, 'r');
    if (!$fp) return false;
    $tmpfile = fopen($tmpfilename, 'w');
    $size = 0;
    while (!feof($fp) && $size < $limit)
    {
        $content = fread($fp, 8192);
        $size += strlen($content); // count the bytes actually read, not the requested chunk size
        fwrite($tmpfile, $content);
    }
    fclose($fp);
    fclose($tmpfile);
    $is = getimagesize($tmpfilename);
    unlink($tmpfilename);
    return $is;
}

getimagesize() on stream instead of string

I'm using Valum's file uploader to upload images with AJAX. This script submits the file to my server in a way that I don't fully understand, so it's probably best to explain by showing my server-side code:
$pathToFile = $path . $filename;
//Here I get a file not found error, because the file is not yet at this address
getimagesize($pathToFile);
$input = fopen('php://input', 'r');
$temp = tmpfile();
$realSize = stream_copy_to_stream($input, $temp);
//Here I get a string expected, resource given error
getimagesize($input);
fclose($input);
$target = fopen($pathToFile, 'w');
fseek($temp, 0, SEEK_SET);
//Here I get a file not found error, because the image is not at the $target yet
getimagesize($pathToFile);
stream_copy_to_stream($temp, $target);
fclose($target);
//Here it works, because the image is at the desired location, so I'm able to access it with $pathToFile. However, the (potentially) malicious file is already on my server.
getimagesize($pathToFile);
The problem is that I want to perform some file validation here, using getimagesize(). getimagesize() only supports a string, and I only have resources available, which results in the error: getimagesize expects a string, resource given.
It does work when I perform getimagesize($pathToFile) at the end of the script, but then the image is already uploaded and the damage could already have been done. Doing this, performing the check afterwards, and then maybe deleting the file seems like bad practice to me.
The only thing that's in $_REQUEST is the filename, which I use for the variable $pathToFile. $_FILES is empty.
How can I perform file validation on streams?
EDIT:
The solution is to first place the file in a temporary directory and perform the validation on the temporary file before copying it to the destination directory.
// Store the file in tmp dir, to validate it before storing it in destination dir
$input = fopen('php://input', 'r');
$tmpPath = tempnam(sys_get_temp_dir(), 'upl'); // upl is 3-letter prefix for upload
$tmpStream = fopen($tmpPath, 'w'); // For writing it to tmp dir
stream_copy_to_stream($input, $tmpStream);
fclose($input);
fclose($tmpStream);
// Store the file in destination dir, after validation
$pathToFile = $path . $filename;
$destination = fopen($pathToFile, 'w');
$tmpStream = fopen($tmpPath, 'r'); // For reading it from tmp dir
stream_copy_to_stream($tmpStream, $destination);
fclose($destination);
fclose($tmpStream);
PHP 5.4 now supports getimagesizefromstring
See the docs:
http://php.net/manual/pt_BR/function.getimagesizefromstring.php
You could try:
$input = fopen('php://input', 'r');
$string = stream_get_contents($input);
fclose($input);
getimagesizefromstring($string);
Instead of using tmpfile() you could make use of tempnam() and sys_get_temp_dir() to create a temporary path.
Then use fopen() to get a handle to it and copy over the stream.
Then you've got a path string and a handle for the operations you need to do.
//Copy PHP's input stream data into a temporary file
$inputStream = fopen('php://input', 'r');
$tempDir = sys_get_temp_dir();
$tempPrefix = 'upload'; // tempnam()'s second argument is a filename prefix, not an extension
$tempFile = tempnam($tempDir, $tempPrefix);
$tempStream = fopen($tempFile, "w");
$realSize = stream_copy_to_stream($inputStream, $tempStream);
fclose($tempStream);
getimagesize($tempFile);

php how to get web image size in kb?

PHP: how can I get a web image's size in KB?
getimagesize() only gets the width and height,
and filesize() causes a warning:
$imgsize=filesize("http://static.adzerk.net/Advertisers/2564.jpg");
echo $imgsize;
Warning: filesize() [function.filesize]: stat failed for http://static.adzerk.net/Advertisers/2564.jpg
Is there any other way to get a web image's size in KB?
Short of doing a complete HTTP request, there is no easy way:
$img = get_headers("http://static.adzerk.net/Advertisers/2564.jpg", 1);
print $img["Content-Length"];
You can likely utilize cURL, however, to send a lighter HEAD request instead.
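A minimal sketch of that HEAD-request approach (using the URL from the question; as noted above, servers may omit or misreport Content-Length):
$ch = curl_init("http://static.adzerk.net/Advertisers/2564.jpg");
curl_setopt($ch, CURLOPT_NOBODY, true);          // send a HEAD request, no body transfer
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // don't echo the response
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);  // follow redirects
curl_exec($ch);
$bytes = curl_getinfo($ch, CURLINFO_CONTENT_LENGTH_DOWNLOAD); // -1 if unknown
curl_close($ch);
echo round($bytes / 1024, 2) . " KB";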
<?php
$file_size = filesize($_SERVER['DOCUMENT_ROOT']."/Advertisers/2564.jpg"); // Get file size in bytes
$file_size = $file_size / 1024; // Get file size in KB
echo $file_size; // Echo file size
?>
Not sure about using filesize() for remote files, but there are good snippets on php.net about using cURL.
http://www.php.net/manual/en/function.filesize.php#92462
That sounds like a permissions issue because filesize() should work just fine.
Here is an example:
php > echo filesize("./9832712.jpg");
1433719
Make sure the permissions are set correctly on the image and that the path is also correct. You will need to apply some math to convert from bytes to KB but after doing that you should be in good shape!
Here is a good link regarding filesize()
You cannot use filesize() to retrieve remote file information. It must first be downloaded, or its size determined by another method.
Using Curl here is a good method:
Tutorial
You can also use this function:
<?php
$filesize = file_get_size($dir . '/' . $ff);
$filesize = $filesize / 1024; // convert to KB
echo $filesize;

function file_get_size($file) {
    //open file
    $fh = fopen($file, "r");
    //declare some variables
    $size = "0";
    $char = "";
    //set file pointer to 0; I'm a little bit paranoid, you can remove this
    fseek($fh, 0, SEEK_SET);
    //set multiplier to zero
    $count = 0;
    while (true) {
        //jump 1 MB forward in file
        fseek($fh, 1048576, SEEK_CUR);
        //check if we actually left the file
        if (($char = fgetc($fh)) !== false) {
            //if not, go on
            $count++;
        } else {
            //else jump back where we were before leaving and exit loop
            fseek($fh, -1048576, SEEK_CUR);
            break;
        }
    }
    //we could make $count jumps, so the file is at least $count * 1048577 bytes large
    //1048577 because we jump 1 MB and fgetc goes 1 B forward too
    $size = bcmul("1048577", $count);
    //now count the last few bytes; they're always less than 1048576 so it's quite fast
    $fine = 0;
    while (false !== ($char = fgetc($fh))) {
        $fine++;
    }
    //and add them
    $size = bcadd($size, $fine);
    fclose($fh);
    return $size;
}
?>
You can get the file size by using the get_headers() function. Use the code below:
$image = get_headers($url, 1);
$bytes = $image["Content-Length"];
$mb = $bytes/(1024 * 1024);
echo number_format($mb,2) . " MB";

Copying a file uploaded via PHP's HTTP GET in PHP 4

I have been working on adding functionality to a site originally written in PHP 4.4.9. It's not in their budget to port the site to PHP 5, so don't even suggest it (although it needs it badly). The problem I am facing is how to copy binary data from a GET request to a file location on the server. The code currently written to support this is as follows:
function save($path) {
    $input = fopen("php://input", "r");
    $temp = tmpfile();
    $realSize = stream_copy_to_stream($input, $temp);
    fclose($input);
    if ($realSize != $this->getSize()) {
        return false;
    }
    $target = fopen($path, "w");
    fseek($temp, 0, SEEK_SET);
    stream_copy_to_stream($temp, $target);
    fclose($target);
}
The problem I am having is that the function stream_copy_to_stream is only supported in PHP 5. Here is what I have so far, but it seems to only create files that are 8K in size, and I'm not sure why. It should, in theory, allow files up to 20MB.
function save($path) {
    $input = fopen("php://input", "rb");
    $temp = tmpfile();
    fwrite($temp, fread($input, 20971520));
    fclose($input);
    $target = fopen($path, "w");
    fseek($temp, 0, SEEK_SET);
    #stream_copy_to_stream($temp, $target);
    fwrite($target, fread($temp, 20971520));
    fclose($target);
    echo $path;
    return true;
}
I removed the size check because I couldn't figure out a way to get the file size while reading. Any help is greatly appreciated. I have been racking my brain for hours, and I know there is someone out there, most likely on Stack Overflow, who can probably answer my question very easily.
Thanks for all the help in advance!
Also, I am submitting data via GET to allow multiple file uploads with progress bars, etc.
I came across this thread looking for an answer to exactly the same problem.
I know this post is old, but I'm putting an answer here for anyone else looking.
You were close.
fread only reads up to 8192 bytes from a stream at a time, so you have to loop until it reaches the end of the file.
function save($path) {
    $input = fopen("php://input", "rb");
    $temp = tmpfile();
    while (!feof($input)) {
        fwrite($temp, fread($input, 8192));
    }
    //fwrite($temp, fread($input, 20971520));
    fclose($input);
    $target = fopen($path, "w");
    fseek($temp, 0, SEEK_SET);
    #stream_copy_to_stream($temp, $target);
    while (!feof($temp)) {
        fwrite($target, fread($temp, 8192));
    }
    fclose($target);
    echo $path;
    return true;
}
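If the size check from the original save() is still wanted, one way to get the byte count back (a sketch only; it assumes the same getSize() method from the asker's original class) is to accumulate the return value of fwrite() while copying:
function save($path) {
    $input = fopen("php://input", "rb");
    $temp = tmpfile();
    $realSize = 0;
    while (!feof($input)) {
        // fwrite() returns the number of bytes written, so we can total them up
        $realSize += fwrite($temp, fread($input, 8192));
    }
    fclose($input);
    if ($realSize != $this->getSize()) { // getSize() as in the asker's original class
        return false;
    }
    $target = fopen($path, "w");
    fseek($temp, 0, SEEK_SET);
    while (!feof($temp)) {
        fwrite($target, fread($temp, 8192));
    }
    fclose($target);
    return true;
}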
