How can I create a file with a given size in PHP (the content doesn't matter)?
I have to create a file bigger than 1 GB, around 4-10 GB maximum.
You can make use of fopen() and fseek():
define('SIZE', 100);               // size of the file to be created
$fp = fopen('somefile.txt', 'w');  // open in write mode
fseek($fp, SIZE - 1, SEEK_CUR);    // seek to position SIZE-1
fwrite($fp, 'a');                  // write one dummy char so the file becomes SIZE bytes
fclose($fp);                       // close the file
On execution:
$ php a.php
$ wc somefile.txt
0 1 100 somefile.txt
$
If the content of the file is irrelevant then just pad it out - but do make sure you don't generate a string too large to hold in memory:
<?php
$fh = fopen("somefile", 'w');
$size = 1024 * 1024 * 10; // 10 MB
$chunk = 1024;
while ($size > 0) {
    // str_pad('') yields a string of spaces of the requested length
    fputs($fh, str_pad('', min($chunk, $size)));
    $size -= $chunk;
}
fclose($fh);
If the file has to be readable by something else, then how you do it depends on the thing that needs to read it.
C.
Late, but it's really easier than the other answers:
$size = 100;
$fp = fopen('foo.dat', "w+");
fwrite($fp, str_repeat(' ', $size), $size);
The w+ mode will either create the file or truncate it to zero length if it already exists. Note that str_repeat() builds the whole string in memory first, so for the multi-gigabyte sizes in the question, prefer one of the chunked or seek-based approaches above.
For a really big file I usually cheat and let the shell do it, via PHP's backtick execution operator:
`truncate -s 10g foo.dat`;
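A sketch of the same cheat with a variable size, assuming shell_exec() is allowed and the system ships the truncate utility:
$size = '10G'; // hypothetical size argument passed to truncate
shell_exec('truncate -s ' . escapeshellarg($size) . ' foo.dat');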
If I have three GET parameters:
$filename = $_GET['filename'];
$start = $_GET['start'];
$size = $_GET['size'];
And I am reading a chunk of the file like so:
$handle = fopen($basepath . $filename, "rb");
fseek($handle, $start);
$contents = fread($handle, $size);
echo md5($contents);
How may I read large portions of a file (anywhere from 1 MB to 1 GB) and create a hash or checksum of its contents, without needing to allocate enough memory for the entire read?
At the moment, if I try to hash too large a part of the file, I get a memory error, since PHP cannot allocate enough memory (roughly 400 MB).
Is there a hashing function in which I can digest parts of the file at a time rather than the entire contents at once (for example, starting at $start, read 100 KB blocks and feed them to the function until $size is met)? And how would I read the file in chunks so that I would start at $start and read $size bytes?
If there is no such hashing or checksum function that supports being fed chunks of the data at a time, would file_get_contents() fix the issue of allocating memory for large reads? I am not entirely sure how that function works.
Thanks.
Use the incremental hashing API: hash_init(), hash_update() and hash_final(). See:
http://php.net/manual/en/function.hash-update.php
<?php
define('CHUNK', 65536);
//$file = 'alargefile.img';
//$start = 256 * 1024 * 1024;
//$size = 512 * 1024 * 1024;
$fp = fopen($file, "rb");
fseek($fp, $start);
$ctx = hash_init('md5');
while ($size > 0) {
    $buffer = fread($fp, min($size, CHUNK));
    hash_update($ctx, $buffer);
    $size -= CHUNK;
}
$hash = hash_final($ctx);
fclose($fp);
print $hash;
?>
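As an aside, if you ever need the hash of the whole file rather than a slice of it, hash_file() streams the file internally, so it does not load it into memory either:
echo hash_file('md5', $file); // streams the whole file; no large allocation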
I am trying to delete the last line from my text file in PHP using ftruncate, but the problem is that the length of the line keeps varying. And I don't want to use file_put_contents and rewrite the whole file, because the file could be megabytes in size.
Any suggestions?
To give you an idea of what I meant, something like this:
$fp = fopen('file.txt', 'r+');
$pos = filesize('file.txt');
// skip a trailing newline, otherwise only the final "\n" would be removed
if ($pos > 0) {
    fseek($fp, $pos - 1);
    if (fread($fp, 1) === "\n") {
        $pos--;
    }
}
while ($pos > 0) {
    $chunk = min(1024, $pos);
    $pos -= $chunk;
    fseek($fp, $pos);
    $tmp = fread($fp, $chunk);
    $tmppos = strrpos($tmp, "\n");
    if ($tmppos !== false) {
        // truncate just after the newline that ends the previous line
        ftruncate($fp, $pos + $tmppos + 1);
        break;
    }
}
fclose($fp);
It reads the file backwards in 1024-byte chunks, looking for the last newline in each buffer. A trailing newline is skipped first, so the whole last line is removed rather than just its line break. If a newline is found, the file is truncated just after it; if not, it reads the previous 1024 bytes and checks them, and so on.
I have got a large file in PHP of which I would like to replace the first 512 bytes with some other 512 bytes. Is there any PHP function that helps me with that?
If you want to optionally create a file and read and write to it (without truncating it), you need to open the file with the fopen() function in 'c+' mode:
$handle = fopen($filename, 'c+');
PHP then has the stream_get_contents() function, which allows reading a chunk of bytes of a specific length (and from a specific offset in the file) into a string variable:
$buffer = stream_get_contents($handle, $length = 512, $offset = 0);
However, there is no stream_put_contents() function to write the string buffer back to the stream at a specific position/offset. A related function is file_put_contents(), but it does not allow writing to a file-handle resource at a specific offset. fseek() and fwrite() do that instead:
$bytes_written = false;
if (0 === fseek($handle, $offset)) {
$bytes_written = fwrite($handle, $buffer, $length);
}
Here is the full picture:
$handle = fopen($filename, 'c+');
$buffer = stream_get_contents($handle, $length = 512, $offset = 0);
// ... change $buffer ...
$bytes_written = false;
if (0 === fseek($handle, $offset)) {
$bytes_written = fwrite($handle, $buffer, $length);
}
fclose($handle);
If the length of $buffer is not fixed, this will not work properly. In that case it's better to work with two files and use stream_copy_to_stream(), as outlined in How to update csv column names with database table header (a sketch follows the next snippet), or, if the file is not large, it is also possible to do it in memory:
$buffer = file_get_contents($filename);
// ... change $buffer ...
file_put_contents($filename, $buffer);
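A minimal sketch of the two-file approach mentioned above, reusing $filename and the new $buffer from the example (the replacement may be any length here); the temp-file name is a made-up convention:
$src = fopen($filename, 'rb');
fseek($src, 512);                   // skip the 512 bytes being replaced
$tmpname = $filename . '.tmp';      // hypothetical temp-file name
$dst = fopen($tmpname, 'wb');
fwrite($dst, $buffer);              // write the replacement bytes first
stream_copy_to_stream($src, $dst);  // stream the rest without loading it into memory
fclose($src);
fclose($dst);
rename($tmpname, $filename);        // swap the new file into place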
Has anyone written a fast algorithm that generates a LARGE dummy file in PHP, say 500 MB-2 GB?
If you don't care about the file contents at all, you can just seek to any position and write something:
$f = fopen('largefile', 'wb');
fseek($f, 2 * 1000 * 1000 * 1000, SEEK_SET);
fwrite($f, 'after 2 GB');
fclose($f);
If the OS and filesystem support sparse files, the file will be really big, but won't actually occupy more than a few kilobytes of disk space.
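A quick way to see the sparseness on Linux, assuming the du utility and shell execution via backticks are available:
$apparent = filesize('largefile');   // apparent size: ~2 GB
$onDisk   = trim(`du -k largefile`); // actual disk usage; du prints "KB<TAB>filename"
echo "apparent: $apparent bytes, on disk: $onDisk\n";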
/* takes far too long for file creation, do not use:
$f = fopen('largefile', 'wb');
fseek($f, 2 * 1000 * 1000 * 1000, SEEK_SET);
fwrite($f, 'after 2 GB');
fclose($f); */
^^ The best version is here, writing 0s to create a file of up to 4 GB ^^
function CreatFileDummy($file_name, $size) {
    // 4 294 967 296 bytes is the max size on 32-bit PHP
    $f = fopen($file_name, 'wb');
    if ($size >= 1000000000) {
        $z = $size / 1000000000;
        if (is_float($z)) {
            // write the remainder (size modulo 1 GB) first;
            // floor, not round, or the seek below can go negative
            $z = floor($z);
            fseek($f, ($size - ($z * 1000000000) - 1), SEEK_END);
            fwrite($f, "\0");
        }
        // then extend the file by 1 GB at a time
        while (--$z > -1) {
            fseek($f, 999999999, SEEK_END);
            fwrite($f, "\0");
        }
    } else {
        fseek($f, $size - 1, SEEK_END);
        fwrite($f, "\0");
    }
    fclose($f);
    return true;
}
Test it ^^ (max on 32-bit PHP is 4 294 967 296 bytes):
CreatFileDummy('mydummyfile.iso', 4294967296);
If you want code to write, read and create dummy files, my code is here ^^:
https://github.com/Darksynx/php
PHP: how to get a web image size in KB?
getimagesize only gets the width and height, and filesize causes a warning:
$imgsize=filesize("http://static.adzerk.net/Advertisers/2564.jpg");
echo $imgsize;
Warning: filesize() [function.filesize]: stat failed for http://static.adzerk.net/Advertisers/2564.jpg
Is there any other way to get a web image size in KB?
Short of doing a complete HTTP request, there is no easy way:
$img = get_headers("http://static.adzerk.net/Advertisers/2564.jpg", 1);
print $img["Content-Length"];
You can likely utilize cURL, however, to send a lighter HEAD request instead; a minimal sketch follows.
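A sketch of that HEAD request, assuming the server reports a Content-Length header:
$ch = curl_init("http://static.adzerk.net/Advertisers/2564.jpg");
curl_setopt($ch, CURLOPT_NOBODY, true);          // HEAD request: headers only, no body
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // don't echo the (empty) response
curl_exec($ch);
$bytes = curl_getinfo($ch, CURLINFO_CONTENT_LENGTH_DOWNLOAD);
curl_close($ch);
echo round($bytes / 1024, 2) . " KB";            // $bytes is -1 if the server didn't report it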
<?php
$file_size = filesize($_SERVER['DOCUMENT_ROOT']."/Advertisers/2564.jpg"); // Get file size in bytes
$file_size = $file_size / 1024; // Get file size in KB
echo $file_size; // Echo file size
?>
Not sure about using filesize() for remote files, but there are good snippets on php.net about using cURL:
http://www.php.net/manual/en/function.filesize.php#92462
That sounds like a permissions issue because filesize() should work just fine.
Here is an example:
php > echo filesize("./9832712.jpg");
1433719
Make sure the permissions are set correctly on the image and that the path is also correct. You will need to apply some math to convert from bytes to KB but after doing that you should be in good shape!
Here is a good link regarding filesize()
You cannot use filesize() to retrieve remote file information; the file must first be downloaded or its size determined by another method.
Using cURL is a good method (a rough sketch follows the link):
Tutorial
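For instance, a sketch that downloads the file with cURL and measures what actually arrived, which works even when the server sends no Content-Length header:
$ch = curl_init("http://static.adzerk.net/Advertisers/2564.jpg");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
$data = curl_exec($ch);
curl_close($ch);
echo strlen($data) / 1024 . " KB";              // size of the downloaded body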
You can also use this function:
<?php
$filesize = file_get_size($dir.'/'.$ff);
$filesize = $filesize / 1024; // convert to KB
echo $filesize;

// requires the bcmath extension for bcmul()/bcadd()
function file_get_size($file) {
    // open file
    $fh = fopen($file, "r");
    // declare some variables
    $size = "0";
    $char = "";
    // set file pointer to 0; I'm a little bit paranoid, you can remove this
    fseek($fh, 0, SEEK_SET);
    // set multiplicator to zero
    $count = 0;
    while (true) {
        // jump 1 MB forward in file
        fseek($fh, 1048576, SEEK_CUR);
        // check if we actually left the file
        if (($char = fgetc($fh)) !== false) {
            // if not, go on
            $count++;
        } else {
            // else jump back to where we were before leaving, and exit loop
            fseek($fh, -1048576, SEEK_CUR);
            break;
        }
    }
    // we could make $count jumps, so the file is at least $count * 1048577 bytes large
    // (1048577 because we jump 1 MB and fgetc moves 1 B forward too)
    $size = bcmul("1048577", $count);
    // now count the last few bytes; they're always less than 1048576, so it's quite fast
    $fine = 0;
    while (false !== ($char = fgetc($fh))) {
        $fine++;
    }
    // and add them
    $size = bcadd($size, $fine);
    fclose($fh);
    return $size;
}
?>
You can get the file size by using the get_headers() function. Use the code below:
$image = get_headers($url, 1);
$bytes = $image["Content-Length"];
$mb = $bytes / (1024 * 1024);
echo number_format($mb, 2) . " MB";