My PHP web application has an API that can receive reasonably large files (up to 32 MB) which are base64 encoded. The goal is to write these files somewhere on my filesystem, decoded of course. What would be the least resource-intensive way of doing this?
Edit: Receiving the files through an API means that I have a 32 MB string in my PHP app, not a 32 MB source file somewhere on disk. I need to get that string decoded and onto the filesystem.
Using PHP's own base64_decode() isn't cutting it because it uses a lot of memory, so I keep running into PHP's memory limit (I know, I could raise that limit, but I don't feel good about allowing PHP to use 256 MB or so per process).
Any other options? Could I do it manually? Or write the file to disk encoded and call some external command? Any thoughts?
Even though this has an accepted answer, I have a different suggestion.
If you are pulling the data from an API, you should not store the entire payload in a variable. Using curl or another HTTP fetcher, you can stream the data straight into a file.
Assuming you are fetching the data through a simple GET URL:
$url = 'http://www.example.com/myfile.base64';
$target = 'localfile.data';
$rhandle = fopen($url,'r');
stream_filter_append($rhandle, 'convert.base64-decode');
$whandle = fopen($target,'w');
stream_copy_to_stream($rhandle,$whandle);
fclose($rhandle);
fclose($whandle);
Benefits:
Should be faster (less copying of huge variables)
Very little memory overhead
If you must grab the data from a temporary variable, I can suggest this approach:
$data = 'your base64 data';
$target = 'localfile.data';
$whandle = fopen($target,'w');
stream_filter_append($whandle, 'convert.base64-decode',STREAM_FILTER_WRITE);
fwrite($whandle,$data);
fclose($whandle);
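If the string is already sitting in a variable (as in the original question), you can also push it through the filter in slices rather than in a single fwrite() call. This is only a sketch, assuming the base64 string contains no line breaks; it mainly keeps the filter's working buffer small:
$data   = 'your base64 data';
$target = 'localfile.data';

$whandle = fopen($target, 'w');
stream_filter_append($whandle, 'convert.base64-decode', STREAM_FILTER_WRITE);

$sliceSize = 8192; // multiple of 4, so each slice holds only complete base64 quads
for ($offset = 0, $len = strlen($data); $offset < $len; $offset += $sliceSize) {
    fwrite($whandle, substr($data, $offset, $sliceSize));
}
fclose($whandle);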
Decode the data in smaller chunks. Four characters of Base64 data equal three bytes of “Base256” data.
So you could group each 1024 characters and decode them to 768 octets of binary data:
$chunkSize = 1024; // multiple of 4, so only whole base64 quads are decoded per chunk (assumes no line breaks in the data)
$src = fopen('base64.data', 'rb');
$dst = fopen('binary.data', 'wb');
while (!feof($src)) {
fwrite($dst, base64_decode(fread($src, $chunkSize)));
}
fclose($dst);
fclose($src);
It's not a good idea to transfer a 32 MB string in one piece, but I have a solution for my task that can accept files of any size from the browser to the app server. Algorithm:
Client
JavaScript: read the file from the INPUT element with FileReader and readAsDataURL() into a FILE variable.
Cut everything in FILE from the start up to the first "," position, then split the rest into chunks no larger than the max_upload_size/max_post_size PHP settings.
Send each chunk with a UID, the chunk number and the total number of chunks, wait for the response, then send the next chunk, one by one.
Server Side
Write each chunk until the last one arrives (a sketch of this step follows after the decode snippet below). Then decode the base64 with streams:
$src = fopen($source, 'r');
$trg = fopen($target, 'w');
stream_filter_append($src, 'convert.base64-decode');
stream_copy_to_stream($src, $trg);
fclose($src);
fclose($trg);
... now you have your base64-decoded file at the $target local path. Note! You cannot read and write the same file, so $source and $target must be different.
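The chunk-assembly step isn't shown above, so here is a rough sketch of what the receiving endpoint could look like; the field names uid, chunk, chunks and data are assumptions, not part of the original answer:
// Hypothetical receiving endpoint (field names are assumptions): each request
// appends one base64 chunk; when the last chunk arrives, run the stream decode above.
$uid    = basename($_POST['uid']);   // upload identifier chosen by the client
$chunk  = (int) $_POST['chunk'];     // current chunk number (1-based)
$chunks = (int) $_POST['chunks'];    // total number of chunks

$source = sys_get_temp_dir() . '/' . $uid . '.b64';
file_put_contents($source, $_POST['data'], FILE_APPEND);

if ($chunk === $chunks) {
    // all chunks received: decode $source into a different $target file
    // with the stream_filter_append()/stream_copy_to_stream() snippet above
}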
Related
I have a base64-encoded string which my frontend team has provided me with. The string is a video which was encoded using base64. I want to convert that back into a video file using PHP.
I am currently just using the following to decode the string but I don't know how to proceed further.
$decoded = base64_decode ($encoded_string);
There seems to be a way to convert images from a string using the imagecreatefromstring() function, but I could not find a way to convert the string into a video.
Thank you
You should know the video file type; then you can decode it back to its original format:
$fp = file_put_contents('sample.mp4', base64_decode($encoded_string, true));
Video streams tend to be very large, so it isn't a good idea to convert them to plain text in the first place. We'd also need to know the exact mechanism (protocol, format...) used to deliver the base64 string. In any case, once the data is there you can do something like this (error checking omitted for brevity):
$chunk_size = 8192; // Bytes (must be multiple of 4)
$input = fopen('php://input', 'rb');
$output = fopen('/tmp/foo.avi', 'wb');
while ($chunk = fread($input, $chunk_size)) {
fwrite($output, base64_decode($chunk));
}
fclose($output);
fclose($input);
Smaller chunks reduce RAM usage and larger chunks improve I/O performance. You'll need to find a balance that works best for you.
My PHP script is receiving large data (100 - 500 MB) from a client. I want my PHP script run fast, without using too much memory.
To save traffic, I don't use Base64 or form data. I send binary data directly in a POST request.
The data consists of two parts: a 2000-byte header, and the rest, which has to be stored as a file on the server.
$fle = file_get_contents("php://input",FALSE,NULL,2000);
file_put_contents("file.bin", $fle);
The problem is, that file_get_contents ignores the offset parameter, and reads the data from byte 0. Is there any better way to do it?
I don't want to read the whole data and slice off the last N-2000 bytes, as I am afraid it would use too much memory.
Use the lower-level file IO functions and read/write a little bit at a time.
$bufsz = 4096;
$fi = fopen("php://input", "rb");
$fo = fopen("file.bin", "wb");
fseek($fi, 2000);
while( $buf = fread($fi, $bufsz) ) {
fwrite($fo, $buf);
}
fclose($fi);
fclose($fo);
This will read/write in 4kB chunks.
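Note that php://input is not always seekable, so if the fseek() call does not work in your environment, a variation (just a sketch) is to read and discard the first 2000 bytes instead:
$bufsz = 4096;
$fi = fopen("php://input", "rb");
$fo = fopen("file.bin", "wb");

// skip the 2000-byte header by reading and discarding it (or parse it here)
$skip = 2000;
while ($skip > 0 && ($hdr = fread($fi, min($skip, $bufsz))) !== false && $hdr !== '') {
    $skip -= strlen($hdr);
}

while ($buf = fread($fi, $bufsz)) {
    fwrite($fo, $buf);
}
fclose($fi);
fclose($fo);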
When I take the content of a picture and try to dump it like this:
$filename = '(900).jpg';
$im = file_get_contents($filename);
var_dump(serialize($im));
When the picture is under 1 MB everything works, but if it is more than 1 MB the browser crashes. Can you tell me why? Is that a browser issue or some limitation of the file_get_contents() function?
The only limitation of file_get_contents() might be the amount of memory PHP is allowed to use, and the default is about 128 MB.
It is a browser "issue" if you want to call it that. Outputting so much debug information to the browser is not a good idea as you can see. Additionally there is no benefit in viewing a binary file as text.
If you want to find out if the variable is set, you can use functions to check the size of the (binary) string e.g. mb_strlen().
A better way would be this:
$filename = '(900).jpg';
$im = file_get_contents($filename);
// check if the file could be loaded
if ($im !== false) {
// start your processing
}
But this does not check what kind of file you have loaded into the string. If you must store the file in the database - which is considered very evil - you can either store the binary string in a BLOB-type column or encode the binary string with base64_encode() and store it in a text type. Both of these solutions are also not recommended!
If you need to store image information in the database, you should think about using references to the files, e.g. the file path. Your primary objective is to ensure that the database information and the filesystem information are always synchronized.
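A minimal sketch of that approach, assuming a PDO connection in $pdo and an images table with a path column (both of which are assumptions here):
$filename   = '(900).jpg';
$storedPath = __DIR__ . '/uploads/' . basename($filename);

// keep the binary data on the filesystem...
if (!copy($filename, $storedPath)) {
    throw new RuntimeException('Could not store the image file');
}

// ...and store only the reference (the path) in the database
$stmt = $pdo->prepare('INSERT INTO images (path) VALUES (:path)');
$stmt->execute([':path' => $storedPath]);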
I created a web service for sending files that are then saved on the server. The file format is zip. The file size may vary according to the user's needs.
This is my code for before sending the data to the server:
$filename = $_SERVER['DOCUMENT_ROOT'].'/data.zip';
$data = file_get_contents($filename);
$new_data = base64_encode($data);
This new_data variable is sent to the server.
This worked for all small files, but base64_encode returns null when I use a 5 MB file or larger.
My problem is that base64_encode is not working on the large string I generated.
If anyone knows about this, please help me.
The answer is simple: file_get_contents() is returning false which, when passed to base64_encode(), results in an empty string.
Why is file_get_contents() failing? It could be any number of reasons:
You don't have read permissions on the file. chmod() may help, but it's likely something caused elsewhere.
Memory limits prevent file_get_contents() from loading 5 MB of text into memory (see the streaming sketch after this list).
Filesystem or disk errors. (Unlikely but strictly possible.)
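If the memory limit turns out to be the culprit, one way around it (a sketch, not part of the original code) is to stream the zip through PHP's convert.base64-encode filter into a file instead of building the encoded string in memory:
$filename    = $_SERVER['DOCUMENT_ROOT'] . '/data.zip';
$encodedFile = $filename . '.b64';

$src = fopen($filename, 'rb');
$dst = fopen($encodedFile, 'wb');
stream_filter_append($dst, 'convert.base64-encode', STREAM_FILTER_WRITE);
stream_copy_to_stream($src, $dst);   // encodes on the fly, chunk by chunk
fclose($src);
fclose($dst);

// $encodedFile can then be sent as a stream (e.g. curl with CURLOPT_INFILE)
// rather than as one huge string variable.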
I'm looking for the most efficient way to write the contents of the PHP input stream to disk, without using much of the memory that is granted to the PHP script. For example, if the max file size that can be uploaded is 1 GB but PHP only has 32 MB of memory.
define('MAX_FILE_LEN', 1073741824); // 1 GB in bytes
$hSource = fopen('php://input', 'r');
$hDest = fopen(UPLOADS_DIR.'/'.$MyTempName.'.tmp', 'w');
fwrite($hDest, fread($hSource, MAX_FILE_LEN));
fclose($hDest);
fclose($hSource);
Does fread inside an fwrite like the above code shows mean that the entire file will be loaded into memory?
For doing the opposite (writing a file to the output stream), PHP offers a function called fpassthru which I believe does not hold the contents of the file in the PHP script's memory.
I'm looking for something similar but in reverse (writing from input stream to file). Thank you for any assistance you can give.
Yep - fread used in that way would read up to 1 GB into a string first, and then write that back out via fwrite. PHP just isn't smart enough to create a memory-efficient pipe for you.
I would try something akin to the following:
$hSource = fopen('php://input', 'r');
$hDest = fopen(UPLOADS_DIR . '/' . $MyTempName . '.tmp', 'w');
while (!feof($hSource)) {
/*
* I'm going to read in 1K chunks. You could make this
* larger, but as a rule of thumb I'd keep it to 1/4 of
* your php memory_limit.
*/
$chunk = fread($hSource, 1024);
fwrite($hDest, $chunk);
}
fclose($hSource);
fclose($hDest);
If you wanted to be really picky, you could also unset($chunk); within the loop after fwrite to absolutely ensure that PHP frees up the memory - but that shouldn't be necessary, as the next iteration of the loop will overwrite whatever memory is being used by $chunk at that time.
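For what it's worth, stream_copy_to_stream() does essentially the same thing through an internal buffer, so the loop can be reduced to this (a sketch with the same assumptions about UPLOADS_DIR and $MyTempName as above):
$hSource = fopen('php://input', 'r');
$hDest   = fopen(UPLOADS_DIR . '/' . $MyTempName . '.tmp', 'w');
stream_copy_to_stream($hSource, $hDest);   // copies chunk by chunk, not all at once
fclose($hSource);
fclose($hDest);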