Algorithm to Generate a LARGE Dummy File - php

Has anyone written a fast algorithm that generates a LARGE dummy file in PHP, say 500 MB to 2 GB?

If you don't care about the file contents at all, you can just seek to any position and write something:
$f = fopen('largefile', 'wb');
fseek($f, 2 * 1000 * 1000 * 1000, SEEK_SET);
fwrite($f, 'after 2 GB');
fclose($f);
If the OS and filesystem support sparse files, the file will appear really big, but won't actually occupy more than a few blocks of disk space.
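To see the effect, you can compare the file's apparent size with the blocks actually allocated on disk. A quick check (note that the 'blocks' entry returned by stat() counts 512-byte blocks, is POSIX-specific, and is reported as -1 on platforms that don't support it):
$st = stat('largefile');
echo 'apparent size: ' . $st['size'] . " bytes\n";           // ~2 GB
echo 'on-disk usage: ' . ($st['blocks'] * 512) . " bytes\n"; // only a few KB if the file is sparse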

/* This takes far too long for file creation; do not use:
$f = fopen('largefile', 'wb');
fseek($f, 2 * 1000 * 1000 * 1000, SEEK_SET);
fwrite($f, 'after 2 GB');
fclose($f); */
Here is what I think is the best approach: seek and write null bytes to create a file of up to 4 GB.
function CreatFileDummy($file_name, $size) {
    // 32-bit PHP: 4294967296 bytes max size
    $f = fopen($file_name, 'wb');
    if ($size >= 1000000000) {
        // Seek in chunks of less than 1 GB so the offset never exceeds PHP_INT_MAX on 32-bit builds.
        $z = floor($size / 1000000000);       // number of whole 1 GB chunks
        $remainder = $size - $z * 1000000000; // leftover bytes below 1 GB
        if ($remainder > 0) {
            fseek($f, $remainder - 1, SEEK_END);
            fwrite($f, "\0");
        }
        while (--$z > -1) {
            fseek($f, 999999999, SEEK_END);
            fwrite($f, "\0");
        }
    } else {
        fseek($f, $size - 1, SEEK_END);
        fwrite($f, "\0");
    }
    fclose($f);
    return true;
}
Test it (the maximum on 32-bit PHP is 4294967296 bytes):
CreatFileDummy('mydummyfile.iso', 4294967296);
If you want code to write, read, and create dummy files, my code is here:
https://github.com/Darksynx/php

Related

Reading large portions of files for hashing/checksums

If I have three GET parameters:
$filename = $_GET['filename'];
$start = $_GET['start'];
$size = $_GET['size'];
And I am reading a chunk of the file like so:
$handle = fopen($basepath . $filename, "rb");
fseek($handle, $start);
$contents = fread($handle, $size);
echo md5($contents);
How may I read large portions of a file (anywhere from 1 MB to 1 GB) and create a hash or checksum of its contents without needing to allocate enough memory for the entire read?
At the moment, if I try to hash too large a part of the file, I get a memory error, since PHP cannot allocate enough memory (roughly 400 MB).
Is there a hashing function to which I can feed parts of the file at a time rather than the entire contents at once (for example, starting at $start, read 100 KB blocks and feed them to the function until $size is met)? And how would I read the file in chunks so that I start at $start and read $size bytes?
If there is no such hashing or checksum function that supports being fed chunks of the data at a time, would file_get_contents() fix the issue of allocating memory for large reads? I am not entirely sure how that function works.
Thanks.
http://php.net/manual/en/function.hash-update.php
<?php
define('CHUNK', 65536);

//$file  = 'alargefile.img';
//$start = 256 * 1024 * 1024;
//$size  = 512 * 1024 * 1024;

$fp = fopen($file, "rb");
fseek($fp, $start);

$ctx = hash_init('md5');
while ($size > 0) {
    // Read at most CHUNK bytes per iteration, and never past the requested range.
    $buffer = fread($fp, min($size, CHUNK));
    hash_update($ctx, $buffer);
    $size -= CHUNK;
}
$hash = hash_final($ctx);
fclose($fp);

print $hash;
?>
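For what it's worth, hash_update_stream() can replace the manual fread() loop; a minimal sketch, assuming the same $file, $start, and $size variables as above:
<?php
$fp = fopen($file, 'rb');
fseek($fp, $start);

$ctx = hash_init('md5');
hash_update_stream($ctx, $fp, $size); // reads up to $size bytes from the stream into the hash context

fclose($fp);
print hash_final($ctx);
?>
As to the question's last point: file_get_contents() does accept an offset and a length, e.g. file_get_contents($file, false, null, $start, $size), but it returns the whole range as a single in-memory string, so it would not avoid the large allocation.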

Delete last line from text file PHP using ftruncate

I am trying to delete the last line from my text file in PHP using ftruncate, but the problem is that the length of the last line keeps varying. And I don't want to use file_put_contents to write the whole file again, because the file could be megabytes in size.
Any suggestions?
To give you an idea of what I mean, something like this:
$fp = fopen('file.txt', 'r+');
$pos = filesize('file.txt');
while ($pos > 0) {
    $pos = max($pos - 1024, 0);
    fseek($fp, $pos);
    $tmp = fread($fp, 1024);
    $tmppos = strrpos($tmp, "\n");
    if ($tmppos !== false) {
        ftruncate($fp, $pos + $tmppos);
        break;
    }
}
It reads the last 1024 bytes and finds the last newline in that buffer, if there is one. If there is, it truncates the file to that position. If there isn't, it reads the previous 1024 bytes and checks those, and so on.
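One caveat: if the file ends with a trailing newline, the backward scan finds that newline first and strips only it, not the whole last line. A sketch of the same approach wrapped in a function (deleteLastLine is a name made up here) that skips a trailing newline before scanning:
function deleteLastLine($path) {
    $fp  = fopen($path, 'r+');
    $pos = filesize($path);

    // Skip a trailing newline so we remove the last real line, not just "\n".
    if ($pos > 0) {
        fseek($fp, $pos - 1);
        if (fread($fp, 1) === "\n") {
            $pos--;
        }
    }

    while ($pos > 0) {
        $chunk = min(1024, $pos);
        $pos  -= $chunk;
        fseek($fp, $pos);
        $buf = fread($fp, $chunk);
        $nl  = strrpos($buf, "\n");
        if ($nl !== false) {
            ftruncate($fp, $pos + $nl + 1); // keep the newline ending the previous line
            fclose($fp);
            return;
        }
    }

    ftruncate($fp, 0); // no newline found: the file was a single line
    fclose($fp);
}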

Streaming PHP file always gets cut

I tried to stream an audio file using this PHP code, but it always gets cut off around 3/4 of the file size, especially for iPad and Android user agents.
ob_clean();
flush();
set_time_limit(0);

$size = intval(sprintf("%u", filesize($filename)));
$chunksize = 0.5 * (1024 * 1024);

if ($size > $chunksize) {
    $handle = fopen($filename, 'rb');
    $buffer = '';
    while (!feof($handle)) {
        $buffer = fread($handle, $chunksize);
        echo $buffer;
        ob_flush();
        flush();
    }
    fclose($handle);
} else {
    readfile($filename);
}
Is there a better way to do this? I have also tried using readfile() alone; the result is even worse. Thanks in advance.
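One thing worth checking, as a guess at the cause: if the response never announces its size, some mobile clients give up partway through. A minimal sketch that sends a Content-Length header before streaming (assuming no output, not even whitespace, has been sent yet, and with the MIME type adjusted to your format):
set_time_limit(0);
$size = filesize($filename);
header('Content-Type: audio/mpeg'); // adjust to the actual audio format
header('Content-Length: ' . $size); // tells the client where the stream ends

$handle = fopen($filename, 'rb');
while (!feof($handle)) {
    echo fread($handle, 512 * 1024);
    flush();
}
fclose($handle);
Mobile players, iOS in particular, also tend to issue HTTP Range requests for seeking; handling those (or letting the web server serve the file directly) may be necessary for fully reliable playback.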

How to change the first 512 bytes of a file?

I have a large file in PHP whose first 512 bytes I would like to replace with some other 512 bytes. Is there any PHP function that helps me with that?
If you want to optionally create a file and read and write to it (without truncating it), you need to open the file with the fopen() function in 'c+' mode:
$handle = fopen($filename, 'c+');
PHP then has the stream_get_contents() function, which allows reading a chunk of bytes of a specific length (and from a specific offset in the file) into a string variable:
$buffer = stream_get_contents($handle, $length = 512, $offset = 0);
However, there is no stream_put_contents() function to write the string buffer back to the stream at a specific position/offset. A related function is file_put_contents(), but it does not allow writing to a file-handle resource at a specific offset. However, fseek() and fwrite() can do that:
$bytes_written = false;
if (0 === fseek($handle, $offset)) {
$bytes_written = fwrite($handle, $buffer, $length);
}
Here is the full picture:
$handle = fopen($filename, 'c+');
$buffer = stream_get_contents($handle, $length = 512, $offset = 0);
// ... change $buffer ...
$bytes_written = false;
if (0 === fseek($handle, $offset)) {
$bytes_written = fwrite($handle, $buffer, $length);
}
fclose($handle);
If the length of $buffer is not fixed, this will not work properly. In that case it's better to work with two files and use stream_copy_to_stream(), as outlined in How to update csv column names with database table header. If the file is not large, it is also possible to do it in memory:
$buffer = file_get_contents($filename);
// ... change $buffer ...
file_put_contents($filename, $buffer);
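For the fixed-length case, the "... change $buffer ..." step could, for example, splice in the replacement bytes with substr_replace() ($new_bytes here is a hypothetical string of exactly 512 bytes):
$buffer = substr_replace($buffer, $new_bytes, 0, 512);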

php create file with given size

How can I create a file with a given size in PHP (the content doesn't matter)?
I have to create a file bigger than 1 GB, around 4-10 GB maximum.
You can make use of fopen() and fseek():
define('SIZE', 100);              // size of the file to be created
$fp = fopen('somefile.txt', 'w'); // open in write mode
fseek($fp, SIZE - 1, SEEK_CUR);   // seek to SIZE-1
fwrite($fp, 'a');                 // write a dummy char at the SIZE position
fclose($fp);                      // close the file
On execution:
$ php a.php
$ wc somefile.txt
0 1 100 somefile.txt
$
If the content of the file is irrelevant, then just pad it out; but do make sure you don't generate a variable too large to hold in memory:
<?php
$fh = fopen("somefile", 'w');
$size = 1024 * 1024 * 10; // 10 MB
$chunk = 1024;
while ($size > 0) {
    fputs($fh, str_pad('', min($chunk, $size)));
    $size -= $chunk;
}
fclose($fh);
If the file has to be readable by something else - then how you do it depends on the other thing which needs to read it.
C.
Late, but it's really easier than the other answers.
$size = 100;
$fp = fopen('foo.dat', "w+");
fwrite($fp, str_repeat(' ', $size), $size);
The w+ will either create the file or overwrite it if it already exists.
For a really big file I usually cheat:
`truncate -s 10g foo.dat`;
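Staying in PHP, ftruncate() can do much the same without shelling out: per the manual, if the given size is larger than the current file, the file is extended with null bytes (which many filesystems store sparsely). A sketch for a 64-bit PHP build:
$fp = fopen('foo.dat', 'wb');
ftruncate($fp, 10 * 1024 * 1024 * 1024); // extend to 10 GB
fclose($fp);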