If I have three GET parameters:
$filename = $_GET['filename'];
$start = $_GET['start'];
$size = $_GET['size'];
And I am reading a chunk of the file like so:
$handle = fopen($basepath . $filename, "rb");
fseek($handle, $start);
$contents = fread($handle, $size);
echo md5($contents);
How may I read large portions of a file (anywhere from 1 MB to 1 GB) and create a hash or checksum of its contents without needing to allocate enough memory for the entire read?
At the moment, if I try to hash too large a part of the file I get a memory error, since PHP cannot allocate enough memory (roughly 400 MB).
Is there a hashing function to which I can feed parts of the file at a time rather than the entire contents at once (for example, starting at $start, read 100 KB blocks and feed them to the function until $size is met)? And how would I read the file in chunks so that I start at $start and read $size bytes?
If there is no such hashing or checksum function that supports being fed chunks of the data at a time, would file_get_contents() fix the issue of allocating memory for large reads? I am not entirely sure how that function works.
Thanks.
http://php.net/manual/en/function.hash-update.php
<?php
define('CHUNK', 65536);

//$file = 'alargefile.img';
//$start = 256 * 1024 * 1024;
//$size = 512 * 1024 * 1024;

$fp = fopen($file, 'rb');
fseek($fp, $start);
$ctx = hash_init('md5');
while ($size > 0) {
    $buffer = fread($fp, min($size, CHUNK));
    if ($buffer === false || $buffer === '') {
        break; // read error or EOF before $size bytes were consumed
    }
    hash_update($ctx, $buffer);
    $size -= strlen($buffer);
}
$hash = hash_final($ctx);
fclose($fp);
print $hash;
?>
Related
Can we access some of the system's heap memory with PHP, like in C/C++? Actually, I keep getting Fatal error: Allowed memory size of 134217728 bytes exhausted while trying to do some big file operations in PHP.
I know we can tweak this limit in the Apache2 config. But for processing large files and data of unknown size, can we have some kind of access to the heap to process and save the file? And if so, is there a mechanism to clear the memory after usage?
Sample code
<?php
$filename = "a.csv";
$handle = fopen($filename, "r");
$contents = fread($handle, filesize($filename));
echo $contents;
fclose($handle);
?>
Here, a.csv is a 80mb file. Can there be a heap operation using some sort of pointer?
Have you tried reading the file in chunks, e.g.:
<?php
$chunk = 256 * 256; // set chunk size to your liking
$filename = "a.csv";
$handle = fopen($filename, 'rb');
while (!feof($handle)) {
    $data = fread($handle, $chunk);
    echo $data;
    ob_flush();
    flush();
}
fclose($handle);
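As a rough sanity check (a sketch, not part of the original answer), memory_get_peak_usage() can confirm that a chunked read keeps peak memory near the chunk size rather than the file size; the file name and sizes here are made up for the demonstration:

```php
<?php
// Build a ~5 MB test file in small writes so the setup itself
// doesn't inflate peak memory ('big.tmp' is a placeholder name).
$fh = fopen('big.tmp', 'wb');
for ($i = 0; $i < 80; $i++) {
    fwrite($fh, str_repeat('x', 65536)); // 80 x 64 KB = 5 MB
}
fclose($fh);

// Read it back in 64 KB chunks, discarding each chunk after use.
$handle = fopen('big.tmp', 'rb');
$total = 0;
while (!feof($handle)) {
    $data = fread($handle, 65536);
    $total += strlen($data); // "process" the chunk, then let it go
}
fclose($handle);

echo $total . "\n"; // 5242880
// Peak usage stays near the interpreter baseline plus one chunk,
// nowhere near the 5 MB file size.
echo memory_get_peak_usage() . "\n";
unlink('big.tmp');
```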
I have got a large file in PHP of which I would like to replace the first 512 bytes with some other 512 bytes. Is there any PHP function that helps me with that?
If you want to optionally create a file and read and write to it (without truncating it), you need to open the file with the fopen() function in 'c+' mode:
$handle = fopen($filename, 'c+');
PHP then has the stream_get_contents() function, which allows reading a chunk of bytes of a specific length (and from a specific offset in the file) into a string variable:
$buffer = stream_get_contents($handle, $length = 512, $offset = 0);
However, there is no stream_put_contents() function to write the string buffer back to the stream at a specific position/offset. A related function is file_put_contents(), but it does not allow writing to a file-handle resource at a specific offset. fseek() and fwrite() can do that, though:
$bytes_written = false;
if (0 === fseek($handle, $offset)) {
    $bytes_written = fwrite($handle, $buffer, $length);
}
Here is the full picture:
$handle = fopen($filename, 'c+');
$buffer = stream_get_contents($handle, $length = 512, $offset = 0);
// ... change $buffer ...
$bytes_written = false;
if (0 === fseek($handle, $offset)) {
    $bytes_written = fwrite($handle, $buffer, $length);
}
fclose($handle);
If the length of $buffer is not fixed, this will not work properly. In that case it's better to work with two files and use stream_copy_to_stream(), as outlined in How to update csv column names with database table header. If the file is not large, it is also possible to do it in memory:
$buffer = file_get_contents($filename);
// ... change $buffer ...
file_put_contents($filename, $buffer);
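For the variable-length case, the two-file approach mentioned above can be sketched like this (a hedged illustration, not from the original answer; file names and sample data are placeholders): write the new header to a temporary file, append everything after the original 512-byte header with stream_copy_to_stream(), then swap the files.

```php
<?php
// Create a sample file: a 512-byte header followed by a payload.
file_put_contents('data.bin', str_repeat('A', 512) . 'payload');

$newHeader = str_repeat('B', 600); // replacement header, different length

$src = fopen('data.bin', 'rb');
$tmp = fopen('data.bin.tmp', 'wb');

fwrite($tmp, $newHeader);          // write the new header first
fseek($src, 512);                  // skip the old 512-byte header
stream_copy_to_stream($src, $tmp); // copy the remainder unchanged

fclose($src);
fclose($tmp);
rename('data.bin.tmp', 'data.bin'); // replace the original in one step
```

This never holds more than the copy buffer in memory, so it also works for large files.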
I am working on a file-sharing site right now and I've run into a small problem. I am using the upload script Uploadify, which works perfectly, but I want the uploaded file to be encrypted if the user chooses. I have working code that does this (shown below), but my server only has 1 GB of memory, and stream_copy_to_stream seems to use as much memory as the size of the actual file. My max upload size is 256, so I know for a fact that something bad is going to happen when the site goes live and multiple people upload large files at once. Based on my code below, is there any alternative that uses little or no memory? I wouldn't even care if it takes longer; I just need this to work. I have the download side of this working, because the file is decrypted and immediately passed through to the browser, so it decrypts as it downloads, which I thought was pretty efficient. But this upload problem doesn't look as good. Any help is appreciated.
$temp_file = $_FILES['Filedata']['tmp_name'];
$ext = pathinfo($_FILES['Filedata']['name'], PATHINFO_EXTENSION);
$new_file_name = md5(uniqid(rand(), true));
$target_file = rtrim(enc_target_path, '/') . '/' . $new_file_name . '.enc.' . $ext;
$iv_size = mcrypt_get_iv_size(MCRYPT_RIJNDAEL_128, MCRYPT_MODE_CBC);
$iv = mcrypt_create_iv($iv_size, MCRYPT_RAND);
$key = substr(md5('some_salt' . $password, true) . md5($password . 'more_salt', true), 0, 24);
$opts = array('iv' => $iv, 'key' => $key);
$my_file = fopen($temp_file, 'rb');
$encrypted_file_name = $target_file;
$encrypted_file = fopen($encrypted_file_name, 'wb');
stream_filter_append($encrypted_file, 'mcrypt.rijndael_128', STREAM_FILTER_WRITE, $opts);
stream_copy_to_stream($my_file, $encrypted_file);
fclose($encrypted_file);
fclose($my_file);
unlink($temp_file);
$temp_file is the first place I can see the uploaded file.
Do you have better results if you try reading the file in chunks like this?:
$my_file = fopen($temp_file, 'rb');
$encrypted_file_name = $target_file;
$encrypted_file = fopen($encrypted_file_name, 'wb');
stream_filter_append($encrypted_file, 'mcrypt.rijndael_128', STREAM_FILTER_WRITE, $opts);
//stream_copy_to_stream($my_file, $encrypted_file);
rewind($my_file);
while (!feof($my_file)) {
    fwrite($encrypted_file, fread($my_file, 4096));
}
You might also try calling stream_set_chunk_size prior to calling stream_copy_to_stream to set the size of the buffer it uses to read from the source stream when copying to the destination.
Hope that helps.
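A minimal sketch of the stream_set_chunk_size() suggestion (file names are placeholders; the small source file is generated just so the example runs end to end):

```php
<?php
// Create a small stand-in source file.
file_put_contents('source.bin', str_repeat('x', 300000));

$src = fopen('source.bin', 'rb');
$dst = fopen('copy.bin', 'wb');

// Ask the source stream to use a 64 KB internal buffer; this is the
// read size stream_copy_to_stream() works with (PHP 5.4+).
stream_set_chunk_size($src, 65536);

$copied = stream_copy_to_stream($src, $dst);

fclose($src);
fclose($dst);
echo $copied; // 300000
```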
EDIT: I tested with this code and when uploading a 700MB movie file, the peak memory usage of PHP is 524,288 bytes. It looks like stream_copy_to_stream will try to read the entire source file into memory unless you read it in chunks passing the length and offset arguments.
$encrypted_file_name = $target_file;
$encrypted_file = fopen($encrypted_file_name, 'wb');
stream_filter_append($encrypted_file, 'mcrypt.rijndael_128', STREAM_FILTER_WRITE, $opts);
$size = 16777216; // buffer size of copy
$pos = 0;         // initial file position
fseek($my_file, 0, SEEK_END);
$length = ftell($my_file); // get file size
while ($pos < $length) {
    $writ = stream_copy_to_stream($my_file, $encrypted_file, $size, $pos);
    $pos += $writ;
}
fclose($encrypted_file);
fclose($my_file);
unlink($temp_file);
How can I get a web image's size in KB in PHP?
getimagesize() only returns the width and height, and filesize() triggers a warning:
$imgsize = filesize("http://static.adzerk.net/Advertisers/2564.jpg");
echo $imgsize;
Warning: filesize() [function.filesize]: stat failed for http://static.adzerk.net/Advertisers/2564.jpg
Is there any other way to get a web image's size in KB?
Short of doing a complete HTTP request, there is no easy way:
$img = get_headers("http://static.adzerk.net/Advertisers/2564.jpg", 1);
print $img["Content-Length"];
You can likely use cURL, however, to send a lighter HEAD request instead.
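A sketch of that HEAD-request approach with cURL (assuming the cURL extension is available; the URL is the one from the question and may no longer resolve, in which case cURL reports the length as -1):

```php
<?php
$url = 'http://static.adzerk.net/Advertisers/2564.jpg';

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_NOBODY, true);         // HEAD request: headers only, no body
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // don't print the response
curl_exec($ch);
$bytes = curl_getinfo($ch, CURLINFO_CONTENT_LENGTH_DOWNLOAD);
curl_close($ch);

echo ($bytes / 1024) . " KB";
```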
<?php
$file_size = filesize($_SERVER['DOCUMENT_ROOT']."/Advertisers/2564.jpg"); // Get file size in bytes
$file_size = $file_size / 1024; // Get file size in KB
echo $file_size; // Echo file size
?>
Not sure about using filesize() for remote files, but there are good snippets on php.net about using cURL:
http://www.php.net/manual/en/function.filesize.php#92462
That sounds like a permissions issue because filesize() should work just fine.
Here is an example:
php > echo filesize("./9832712.jpg");
1433719
Make sure the permissions are set correctly on the image and that the path is also correct. You will need to apply some math to convert from bytes to KB, but after doing that you should be in good shape!
Here is a good link regarding filesize()
You cannot use filesize() to retrieve remote file information; the file must first be downloaded or its size determined by another method.
Using Curl here is a good method:
Tutorial
You can also use this function:
<?php
$filesize = file_get_size($dir . '/' . $ff);
$filesize = $filesize / 1024; // to convert to KB
echo $filesize;

function file_get_size($file) {
    // open file
    $fh = fopen($file, "r");
    // declare some variables
    $size = "0";
    $char = "";
    // set file pointer to 0; I'm a little bit paranoid, you can remove this
    fseek($fh, 0, SEEK_SET);
    // set multiplicator to zero
    $count = 0;
    while (true) {
        // jump 1 MB forward in file
        fseek($fh, 1048576, SEEK_CUR);
        // check if we actually left the file
        if (($char = fgetc($fh)) !== false) {
            // if not, go on
            $count++;
        } else {
            // else jump back where we were before leaving and exit loop
            fseek($fh, -1048576, SEEK_CUR);
            break;
        }
    }
    // we could make $count jumps, so the file is at least $count * 1.000001 MB large
    // 1048577 because we jump 1 MB and fgetc goes 1 B forward too
    $size = bcmul("1048577", $count);
    // now count the last few bytes; they're always less than 1048576 so it's quite fast
    $fine = 0;
    while (false !== ($char = fgetc($fh))) {
        $fine++;
    }
    // and add them
    $size = bcadd($size, $fine);
    fclose($fh);
    return $size;
}
?>
You can get the file size by using the get_headers() function. Use the code below:
$image = get_headers($url, 1);
$bytes = $image["Content-Length"];
$mb = $bytes / (1024 * 1024);
echo number_format($mb, 2) . " MB";
How can I create a file with a given size in PHP (the content doesn't matter)?
I have to create a file bigger than 1 GB, around 4-10 GB maximum.
You can make use of fopen() and fseek():
define('SIZE', 100);              // size of the file to be created
$fp = fopen('somefile.txt', 'w'); // open in write mode
fseek($fp, SIZE - 1, SEEK_CUR);   // seek to the last position, SIZE-1
fwrite($fp, 'a');                 // write a dummy char there
fclose($fp);                      // close the file
On execution:
$ php a.php
$ wc somefile.txt
0 1 100 somefile.txt
$
If the content of the file is irrelevant then just pad it - but do make sure you don't generate a variable too large to hold in memory:
<?php
$fh = fopen("somefile", 'w');
$size = 1024 * 1024 * 10; // 10mb
$chunk = 1024;
while ($size > 0) {
    fputs($fh, str_pad('', min($chunk, $size)));
    $size -= $chunk;
}
fclose($fh);
If the file has to be readable by something else - then how you do it depends on the other thing which needs to read it.
C.
Late, but it's really easier than the other answers.
$size = 100;
$fp = fopen('foo.dat',"w+");
fwrite($fp,str_repeat(' ',$size),$size);
The w+ will either create the file or overwrite it if it already exists.
For a really big file I usually cheat:
`truncate -s 10g foo.dat`;
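Within PHP itself, ftruncate() can do the same trick as truncate: extend a file to an arbitrary length in one call. On most filesystems this creates a sparse file, so it is fast even for multi-gigabyte sizes. A small sketch with a placeholder name and size:

```php
<?php
$fp = fopen('large.dat', 'w');
ftruncate($fp, 10 * 1024 * 1024); // extend to 10 MB; use e.g. 10 * 1024 ** 3 for 10 GB
fclose($fp);

clearstatcache();
echo filesize('large.dat'); // 10485760
```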
Look at my code below. Its best feature is that it writes zero bytes to create a file of up to 4 GB:
function CreatFileDummy($file_name, $size) {
    // 32-bit max size: 4 294 967 296 bytes
    $f = fopen($file_name, 'wb');
    if ($size >= 1000000000) {
        $z = ($size / 1000000000);
        if (is_float($z)) {
            $z = round($z, 0);
            fseek($f, ($size - ($z * 1000000000) - 1), SEEK_END);
            fwrite($f, "\0");
        }
        while (--$z > -1) {
            fseek($f, 999999999, SEEK_END);
            fwrite($f, "\0");
        }
    } else {
        fseek($f, $size - 1, SEEK_END);
        fwrite($f, "\0");
    }
    fclose($f);
    return true;
}
Test it. The max on 32-bit PHP is 4 294 967 296:
CreatFileDummy('mydummyfile.iso', 4294967296);
If you want to write, read, and create dummy files, my code is here:
https://github.com/Darksynx/php
Algorithm to Generate a LARGE Dummy File