Send Video to Server [duplicate] - php

I am using file_put_contents to create a video file. The problem is the speed and performance: it takes an average of 30 to 60 minutes to create a single file of around 50 MB. I am decoding a byte array to create the file. How can I improve the speed and performance?
$json_str = file_get_contents('php://input');
$json_obj = json_decode($json_str);
$Video = $json_obj->Video;
$CAF = $json_obj->CAF;
$Date = $json_obj->Date;
$CafDate = date("Y-m-d", strtotime($Date));
$video_decode = base64_decode($Video);
$video_filename = __DIR__ . '/uploads/'. $CAF . '_'.$CafDate.'_VID.mp4';
$video_dbfilename = './uploads/'. $CAF . '_'.$CafDate.'_VID.mp4';
$save_video = file_put_contents($video_filename, $video_decode);

You should not load entire files into memory when you can't foresee their size or when they may be huge. It's better to read the file in chunks and process it chunk by chunk.
Here's a quick and dirty example of how to achieve it:
<?php
// open the input stream in binary-safe read mode
$handle = fopen("php://input", "rb");
// open a handle for saving the file, also binary-safe
$local_file = fopen("path/to/file", "wb");
// loop until the end of the input
while (!feof($handle)) {
    // read the next chunk (8 KB at a time)
    $chunk = fread($handle, 8192);
    // here you do whatever you want with $chunk
    // (i.e. append it to the local file)
    fwrite($local_file, $chunk);
}
// close both handles
fclose($handle);
fclose($local_file);
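If you don't need to inspect each chunk, the same copy can be done in a single call; a minimal sketch, using the same source and destination as above:
$handle = fopen("php://input", "rb");
$local_file = fopen("path/to/file", "wb");
// copies the stream chunk by chunk internally, never holding the whole file in memory
stream_copy_to_stream($handle, $local_file);
fclose($handle);
fclose($local_file);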

There are many problems here, with a compounding effect.
$json_str = file_get_contents('php://input');
This reads the entire input into memory, and blocks until it's done. Don't do this.
$json_obj = json_decode($json_str);
If you have 50MB of JSON, you're probably doing something wrong.
$Video = $json_obj->Video;
This is what you're doing wrong... storing a large binary blob in a small structured data serialization format. You realize that the JSON parser has to read and handle the entire thing? Don't send JSON for this!
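For example, a sketch only, assuming the client can be changed to send a regular multipart/form-data upload (the field names video, CAF and Date are made up here):
// hypothetical sketch: with a normal multipart upload PHP streams the file to a temp
// location itself, and nothing needs to be base64-encoded or JSON-parsed
if (isset($_FILES['video']) && $_FILES['video']['error'] === UPLOAD_ERR_OK) {
    $caf     = $_POST['CAF'] ?? '';                               // metadata as ordinary form fields
    $cafDate = date('Y-m-d', strtotime($_POST['Date'] ?? 'now'));
    // build the target name carefully -- see the note about filenames below
    move_uploaded_file($_FILES['video']['tmp_name'], __DIR__ . '/uploads/video.mp4');
}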
$video_decode = base64_decode($Video);
Don't use base-64 encoding unless absolutely necessary. It encodes every 3 bytes of input as 4 bytes of output, adding roughly 33% overhead to the size (a 50 MB video becomes about 67 MB on the wire), plus CPU time for encoding and decoding. This is a complete waste, and is totally unnecessary.
$video_filename = __DIR__ . '/uploads/'. $CAF . '_'.$CafDate.'_VID.mp4';
$video_dbfilename = './uploads/'. $CAF . '_'.$CafDate.'_VID.mp4';
You may have a serious security issue with those two lines. What if someone posts a path of ../../../etc/init.d/something-evil? Don't let the user dictate the filename on disk in any way.
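A hedged sketch of one way to handle that (the whitelist pattern is an assumption about what a valid CAF looks like):
// hypothetical sketch: whitelist the client-supplied identifier before it touches a path
$CAF = preg_replace('/[^A-Za-z0-9_-]/', '', (string) $json_obj->CAF);
if ($CAF === '') {
    http_response_code(400);
    exit('invalid CAF');
}
// basename() as a second line of defence against anything path-like
$video_filename = __DIR__ . '/uploads/' . basename($CAF . '_' . $CafDate . '_VID.mp4');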
$save_video = file_put_contents($video_filename, $video_decode);
The comment about file_get_contents() applies here as well: you should not be using this method. Stream the contents to disk instead.
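Putting the pieces together, a minimal sketch, assuming the metadata moves into the query string and the raw video bytes become the request body (the parameter names caf and date are made up):
// hypothetical sketch: POST /upload.php?caf=123&date=2018-01-01 with the raw video as the body
$caf     = preg_replace('/[^A-Za-z0-9_-]/', '', $_GET['caf'] ?? '');   // validate as discussed above
$cafDate = date('Y-m-d', strtotime($_GET['date'] ?? 'now'));

$in  = fopen('php://input', 'rb');
$out = fopen(__DIR__ . "/uploads/{$caf}_{$cafDate}_VID.mp4", 'wb');
stream_copy_to_stream($in, $out);   // streams to disk without ever buffering the whole file
fclose($in);
fclose($out);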

Related

file_put_contents too slow when handling large file or multiple files

weird PHP filesize behavior

In the code below I...
open a textfile, write four characters to it, and close it again,
re-open it, read the contents, and use filesize to report the size of the file. (It's 4, as it should be),
manipulate those contents and tack on four more characters as well. Then I write that new string to the textfile and close it again,
use filesize again to report how big the file is.
To my surprise, the answer it gives is still 4 even though the actual size of the file is 8! Examining the contents of the file proves that the write works and the length of the contents is 8.
What is going on??
By the way I have to use fread and fwrite instead of file_get_contents and file_put_contents. At least I think I do. This little program is a stepping stone to using "flock" so I can read the contents of a file and rewrite to it while making sure no other processes use the file in between. And AFAIK flock doesn't work with file_get_contents and file_put_contents.
Please help!
<?php
$filename = "blahdeeblah.txt";
// Write 4 characters
$fp = fopen($filename, "w");
fwrite($fp, "1234");
fclose($fp);
// read those characters, manipulate them, and write them back (also increasing filesize).
$fp = fopen($filename, "r+");
$size = filesize($filename);
echo "size before is: " . $size . "<br>";
$t = fread($fp, $size);
$t = $t[3] . $t[2] . $t[1] . $t[0] . "5678";
rewind($fp);
fwrite($fp, $t);
fclose($fp);
// "filesize" returns the same number as before even though the file is larger now.
$size = filesize($filename);
echo "size after is: " . $size . " ";
?>
From http://php.net/manual/en/function.filesize.php
Note: The results of this function are cached. See clearstatcache() for more details.
When you open a file with fopen(), you can obtain the correct size at any time using fstat():
$fstat = fstat($fp);
echo 'Size: ' . $fstat['size'];
Example:
$filename = 'blahdeeblah.txt';
$fp = fopen($filename, 'a');

$size = @filesize($filename);
echo 'Proper size (obtained by filesize): ' . $size . '<br>';
$fstat = fstat($fp);
echo 'Proper size (obtained by fstat): ' . $fstat['size'] . '<br><br>';

fwrite($fp, '1234');
echo 'Writing 4 bytes...<br><br>';

$fstat = fstat($fp);
echo 'Proper size (obtained by fstat): ' . $fstat['size'] . '<br>';
fclose($fp);

$size = @filesize($filename);
echo 'Wrong size (obtained by filesize): ' . $size;
Note that the cached value is only used within the current script. When you run the script again, filesize() reads the new (correct) size of the file.
Example:
$filename = 'blahdeeblah.txt';
$fp = fopen($filename, 'a');

$size = @filesize($filename);
echo 'Proper size: ' . $size . '<br>';

fwrite($fp, '1234');
fclose($fp);

$size = @filesize($filename);
echo 'Wrong size: ' . $size;
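Alternatively, following the manual's note quoted above, clearing the stat cache also makes filesize() report the updated size; a minimal sketch:
$filename = 'blahdeeblah.txt';
$fp = fopen($filename, 'a');
fwrite($fp, '1234');
fclose($fp);

clearstatcache();           // drop the cached stat information
echo filesize($filename);   // now reflects the bytes just written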

Out of memory decrypting large files in PHP with phpseclib

I'm trying to write a program that decrypts AES files on the fly with phpseclib.
Files are large, so I get an out of memory error if I use file_get_contents($f) or fread(filesize($f)) to read the input file.
For some reason, a loop like this is creating corrupted output files. WHY!? =(
For example, an input file of size 296,155,408 bytes comes out with 18,805,826 bytes. NOTE: It works if the entire file can be read in one iteration of the loop.
define('MY_BUFFER_SIZE', 128 * 1024);
$sessionKey = '....';
$filenameIn = $argv[1];
$fileIn = fopen($filenameIn, 'rb');
$filenameOut = dirname($argv[1]) . DIRECTORY_SEPARATOR . basename($argv[1], '.tar.gz') . '-decrypted.tar.gz';
$fileOut = fopen($filenameOut, 'wb');
// Setup cipher for continuous buffering and copy between files.
$aesCipher = new Crypt_AES(CRYPT_AES_MODE_CBC);
$aesCipher->setKey($sessionKey);
$aesCipher->setIV("\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00");
$aesCipher->enableContinuousBuffer();
while (!feof($fileIn)) {
$packet = fread($fileIn, MY_BUFFER_SIZE); // #TODO: Streaming not working.
fwrite($fileOut, $aesCipher->decrypt($packet));
}
fclose($fileIn);
fclose($fileOut);
Thanks to @neubert!
What was required was adding:
$aesCipher->disablePadding()
(presumably because, with padding enabled, phpseclib applies and strips padding on every individual decrypt() call, which corrupts output that is produced chunk by chunk).
This works:
// Setup cipher for continuous buffering and copy between files.
$aesCipher = new Crypt_AES(CRYPT_AES_MODE_CBC);
$aesCipher->setKey($sessionKey);
$aesCipher->setIV("\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00");
$aesCipher->enableContinuousBuffer();
$aesCipher->disablePadding();
while (!feof($fileIn)) {
fwrite($fileOut, $aesCipher->decrypt(fread($fileIn, MY_BUFFER_SIZE)));
}

Change pointer for file_put_contents()

$iplog = "$time EST - $userip - $location - $currentpage\n";
file_put_contents("iplog.txt", $iplog, FILE_APPEND);
I am trying to write this to the text file, but it puts it at the bottom and I would prefer if the new entries were at the top. How would I change the pointer for where it puts the text?
Prepending to the beginning of a file is very uncommon, as it requires all of the file's data to be copied. If the file is large, this can become unacceptably slow (especially for a log file, which is written to frequently). I would re-think whether you really want that.
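If the only goal is to show the newest entries first, one hypothetical alternative is to keep appending (which is cheap) and reverse the order when reading:
// keep appending as usual (LOCK_EX so concurrent requests don't interleave their writes)
file_put_contents("iplog.txt", $iplog, FILE_APPEND | LOCK_EX);

// later, when displaying the log, show the newest entries first
$lines = array_reverse(file("iplog.txt", FILE_IGNORE_NEW_LINES));
echo implode("\n", $lines);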
The simplest way to do this with PHP is something like this:
$iplog = "$time EST - $userip - $location - $currentpage\n";
file_put_contents("iplog.txt", $iplog . file_get_contents('iplog.txt'));
The file_get_contents solution doesn't have a flag for prepending content to a file, and it is not very efficient for big files, which log files usually are. The solution is to use fopen and fclose with a temporary buffer. You can then run into issues if different visitors update your log file at the same time, but that's another topic; you would need a locking mechanism (a minimal flock() sketch follows the function below).
<?php
function prepend($file, $data, $buffsize = 4096)
{
    $orig_len = filesize($file);
    $handle = fopen($file, 'r+');
    $buffsize = max($buffsize, strlen($data));

    // the handle has a single pointer, so track read and write positions separately
    $read_pos = 0;
    $write_pos = 0;

    // save the first chunk of the original file, then overwrite it with the new data
    $save = fread($handle, $buffsize);
    $read_pos = strlen($save);
    rewind($handle);
    fwrite($handle, $data);
    $write_pos = strlen($data);

    // leapfrog through the file: keep one chunk in memory while the previous
    // one is written back at the (shifted) write position
    while ($save !== '' && $save !== false) {
        $chunk = '';
        if ($read_pos < $orig_len) {
            fseek($handle, $read_pos);
            $chunk = fread($handle, min($buffsize, $orig_len - $read_pos));
            $read_pos += strlen($chunk);
        }
        fseek($handle, $write_pos);
        fwrite($handle, $save);
        $write_pos += strlen($save);
        $save = $chunk;
    }
    fclose($handle);
}
prepend("iplog.txt", "$time EST - $userip - $location - $currentpage\n");
?>
That should do it. Note that it requires an existing iplog.txt file (otherwise filesize() emits a warning).
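On the locking caveat mentioned earlier, a minimal sketch of the general pattern (independent of the function above):
// take an exclusive lock around the whole read-modify-write so that
// concurrent requests can't interleave their updates
$fp = fopen('iplog.txt', 'c+');
if (flock($fp, LOCK_EX)) {
    // ... read / prepend / write here ...
    fflush($fp);            // flush output before releasing the lock
    flock($fp, LOCK_UN);
}
fclose($fp);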

Most memory-efficient way to split a variable into chunks?

Is there a way to do something like an fread, but on a variable?
That is, I want to "read" another in-memory variable 1MB at a time.
That way I could have something like this:
$data = ... ; // 10MB of data
$handle = fopen($data, "rb"); // Need something instead of fopen here
while (!feof($handle))
{
$chunk = fread($handle, 1048576); // Want to read 1MB at a time
doSomethingWithChunk($chunk);
}
fclose($handle);
I have a large binary file loaded into memory, about 10MB. I'd like to split it into an array of 1MB chunks. I don't need all 1MB chunks in memory at once, so I think I could do something like the above more efficiently than just using PHP's built-in str_split function.
There's no way to sequentially "read" a string that's already loaded into memory, and splitting it up isn't really more efficient: the overhead of multiple variables will use more memory than a single one. Ideally you would load the data into a stream, but PHP doesn't really have a string stream.
If you just want to deal with the string in chunks, you can just loop over substrings of it:
// $data already holds the large binary string
$pointer = 0;
$size = strlen($data);
$chunkSize = 1048576; // 1 MB
while ($pointer < $size)
{
    $chunk = substr($data, $pointer, $chunkSize);
    doSomethingWithChunk($chunk);
    $pointer += $chunkSize;
}
I'm not sure how PHP handles large strings internally, but according to the string documentation, a string can only be "as large as up to 2GB (2147483647 bytes maximum)". If your file is about 10MB, it shouldn't be a problem for PHP.
Another option (probably the better option) is to load $data into a memory or temporary stream. If you want to avoid excessive memory use, you can use the php://temp stream wrapper, which keeps the data in memory and transparently spills to a temporary file once it exceeds 2MB (by default). Just load the string into the stream as soon as possible to conserve memory, and then you can use the file stream functions on it.
$dataStream = fopen("php://temp", "w+b");
fwrite($dataStream, funcThatGetsData()); // try not to put data into a variable to save memory
while (!feof($dataStream))
{
$chunk = fread($dataStream, 1048576); // want to read 1MB at a time
doSomethingWithChunk($chunk);
}
fclose($dataStream);
If you get $data from another function you could pass around $dataStream instead. If you must have $data in a string beforehand, be sure to call unset() on it to free the memory:
$data = getData(); // string from some other function
$dataStream = fopen("php://temp", "w+b");
fwrite($dataStream, $data);
unset($data); // free 10MB of memory!
...
If you want to keep it all in memory you can use php://memory, but you might as well just use a string in that case.
You can use something like this (note that fgets() reads line by line; for binary data you would use fread() as in the question):
$handle = @fopen("path_to_your_file", "r");
if ($handle) {
    while (($buffer = fgets($handle, 1024)) !== false) {
        doSomethingWithChunk($buffer);
    }
    fclose($handle);
}
