Streaming PHP file always gets cut - php

I tried to stream an audio file using this PHP code, but it always gets cut off around 3/4 of the file size, especially for iPad and Android user agents.
ob_clean();
flush();
set_time_limit(0);

$size = intval(sprintf("%u", filesize($filename)));
$chunksize = 0.5 * (1024 * 1024); // 512 KB per chunk

if ($size > $chunksize) {
    $handle = fopen($filename, 'rb');
    while (!feof($handle)) {
        $buffer = fread($handle, $chunksize);
        echo $buffer;
        ob_flush();
        flush();
    }
    fclose($handle);
} else {
    readfile($filename);
}
Is there a better way to do this? I have also tried using readfile() alone, and the result was even worse. Thanks in advance.
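One likely culprit, given that iPad and Android agents are the ones affected: mobile media players routinely probe files with HTTP Range requests, and a script that ignores the Range header (and sends no Content-Length) can see playback stop partway through. Below is a minimal sketch of honoring byte ranges; the audio/mpeg MIME type and the $filename variable are assumptions carried over from the snippet above.

$size  = filesize($filename);
$start = 0;
$end   = $size - 1;

header('Accept-Ranges: bytes');
if (isset($_SERVER['HTTP_RANGE'])
        && preg_match('/bytes=(\d+)-(\d*)/', $_SERVER['HTTP_RANGE'], $m)) {
    // The client asked for a specific byte range; answer with 206.
    $start = (int) $m[1];
    if ($m[2] !== '') {
        $end = (int) $m[2];
    }
    header('HTTP/1.1 206 Partial Content');
    header("Content-Range: bytes $start-$end/$size");
}
header('Content-Type: audio/mpeg'); // assumed; match your actual file
header('Content-Length: ' . ($end - $start + 1));

$handle = fopen($filename, 'rb');
fseek($handle, $start);
$remaining = $end - $start + 1;
while ($remaining > 0 && !feof($handle)) {
    $chunk = fread($handle, min(8192, $remaining));
    echo $chunk;
    $remaining -= strlen($chunk);
    flush();
}
fclose($handle);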

Related

WordPress: Securing/Escaping Output of file content - reading via @fread() in PHP

I am working on securing the content read from a file via the @fread() function.
private function readfile_chunked($file) {
    $chunksize = 1024 * 1024; // 1 MB per chunk

    // Open the resume file, suppressing warnings on failure.
    $handle = @fopen($file, 'r');
    if (false === $handle) {
        return false;
    }
    while (!@feof($handle)) {
        $content = @fread($handle, $chunksize);
        echo wp_kses_post($content);
        if (ob_get_length()) {
            ob_flush();
            flush();
        }
    }
    return @fclose($handle);
}
The wp_kses_post($content) call above was suggested by the WP plugin review team to secure the file content, but this solution is not working for me: it downloads the file in a loop. Any help would be appreciated on how to escape the output of @fread() in WordPress, or on any alternative approach that avoids echo.
Thanks
private function readfile_chunked($file) {
    $chunksize = 1024 * 1024; // 1 MB per chunk

    $handle = @fopen($file, 'r');
    if (false === $handle) {
        return false;
    }

    // Write straight to the output stream instead of echoing.
    $output_resource = fopen('php://output', 'w');
    while (!@feof($handle)) {
        $content = @fread($handle, $chunksize);
        fwrite($output_resource, $content);
        if (ob_get_length()) {
            ob_flush();
            flush();
        }
    }
    fclose($output_resource);
    return @fclose($handle);
}
You can write to the output stream directly instead of using echo.
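If per-chunk flushing is not actually needed, stream_copy_to_stream() collapses the loop into one call. A sketch of the same method body under the same assumptions (the readfile_copied name is hypothetical):

private function readfile_copied($file) {
    $handle = @fopen($file, 'r');
    if (false === $handle) {
        return false;
    }
    // Let PHP copy the stream in internal chunks; no echo, no escaping.
    $output_resource = fopen('php://output', 'w');
    stream_copy_to_stream($handle, $output_resource);
    fclose($output_resource);
    return @fclose($handle);
}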

Serve huge file via php, not located in public_html

I want to serve huge files from a folder above public_html.
Currently I do:
<?php
// Authenticate
if ($_GET['key'] !== "MY-API-KEY") {
    header('HTTP/1.0 403 Forbidden');
    echo "You are not authorized.";
    return;
}

define('CHUNK_SIZE', 1024*1024);
$PATH_ROOT_AUTOPILOT_ACTIVITY_STREAMS = "../../../data/csv/";

// Read a file and display its content chunk by chunk
function readfile_chunked($filename, $retbytes = true) {
    $buffer = '';
    $cnt = 0;
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        $buffer = fread($handle, CHUNK_SIZE);
        echo $buffer;
        ob_flush();
        flush();
        if ($retbytes) {
            $cnt += strlen($buffer);
        }
    }
    $status = fclose($handle);
    if ($retbytes && $status) {
        return $cnt; // return num. bytes delivered like readfile() does.
    }
    return $status;
}

// Get the file parameter
$file = basename(urldecode($_GET['file']));
$fileDir = $PATH_ROOT_AUTOPILOT_ACTIVITY_STREAMS;
$filePath = $fileDir . $file;

if (file_exists($filePath)) {
    // Get the file's mime type to send the correct content type header
    $finfo = finfo_open(FILEINFO_MIME_TYPE);
    $mime_type = finfo_file($finfo, $filePath);

    // Send the headers
    header("Content-Disposition: attachment; filename=$file.csv;");
    header("Content-Type: $mime_type");
    header('Content-Length: ' . filesize($filePath));

    // Stream the file
    readfile_chunked($filePath);
    exit;
}
?>
This currently fails for some reason I don't understand. curl outputs:
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  457M    0  457M    0     0  1228k      0 --:--:--  0:06:21 --:--:-- 1762k
curl: (92) HTTP/2 stream 1 was not closed cleanly: INTERNAL_ERROR (err 2)
Is there a better way to serve big files programmatically via PHP?
Currently about 2/3 of the file is served; the response is incomplete because something crashes, and there are no logs.
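No answer is recorded here, but a common remedy for exactly this symptom (a large transfer dying mid-stream with nothing in the logs) is to hand the transfer to the web server rather than pumping bytes through PHP. A sketch, assuming Apache with mod_xsendfile or an nginx internal location is available; the header values are illustrative:

// Apache with mod_xsendfile enabled: send headers, then let the
// server stream the file from outside public_html.
header("Content-Disposition: attachment; filename=$file;");
header('Content-Type: text/csv');
header('X-Sendfile: ' . realpath($filePath));

// nginx equivalent, where /protected-csv/ is an "internal" location
// mapped to the data directory:
// header('X-Accel-Redirect: /protected-csv/' . $file);
exit;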

Encode a big file using base64

I'm trying to encode/decode a file that is several MBs, or sometimes GBs, in base64, but some pieces of data get encoded/decoded the wrong way, which results in strange characters like � �̴.
I'm reading the file chunk by chunk, encoding each chunk and saving it individually (that is probably the problem, but I cannot figure out why).
Here is what I have tried so far:
<?php
function encode_file($Ifilename, $Efilename) {
    $handle = fopen($Ifilename, 'rb');
    $outHandle = fopen($Efilename, 'wb');
    // 8151 happens to be a multiple of 3, so each chunk encodes cleanly.
    $bufferSize = 8151;
    while (!feof($handle)) {
        $buffer = fread($handle, $bufferSize);
        $ebuffer = base64_encode($buffer);
        fwrite($outHandle, $ebuffer);
    }
    fclose($handle);
    fclose($outHandle);
}

function decode_file($Ifilename, $Efilename) {
    $handle = fopen($Ifilename, 'rb');
    $outHandle = fopen($Efilename, 'wb');
    // 8151 is NOT a multiple of 4, so each chunk ends mid-group and
    // base64_decode() mangles the bytes at every chunk boundary.
    $bufferSize = 8151;
    while (!feof($handle)) {
        $buffer = fread($handle, $bufferSize);
        $dbuffer = base64_decode($buffer);
        fwrite($outHandle, $dbuffer);
    }
    fclose($handle);
    fclose($outHandle);
}

encode_file('input.txt', 'out.bin'); // big text file, ~4 MB
decode_file('out.bin', 'out.txt');
After reading the Wikipedia article on Base64, I found that every 3 bytes of input encode to 4 base64 characters, and splitting the stream across those groups is what was corrupting the file.
The fix is simply to make the read buffer a multiple of 3 when encoding, and a multiple of 4 when decoding.
The working code:
<?php
function encode_file($Ifilename, $Efilename) {
    $handle = fopen($Ifilename, 'rb');
    $outHandle = fopen($Efilename, 'wb');
    $bufferSize = 3 * 256; // 3 input bytes encode to 4 bytes of base64
    while (!feof($handle)) {
        $buffer = fread($handle, $bufferSize);
        $ebuffer = base64_encode($buffer);
        fwrite($outHandle, $ebuffer);
    }
    fclose($handle);
    fclose($outHandle);
}

function decode_file($Ifilename, $Efilename) {
    $handle = fopen($Ifilename, 'rb');
    $outHandle = fopen($Efilename, 'wb');
    $bufferSize = 4 * 256; // 4 bytes of base64 decode to 3 output bytes
    while (!feof($handle)) {
        $buffer = fread($handle, $bufferSize);
        $dbuffer = base64_decode($buffer);
        fwrite($outHandle, $dbuffer);
    }
    fclose($handle);
    fclose($outHandle);
}

encode_file('input.txt', 'out.bin');
decode_file('out.bin', 'output.txt');
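For what it's worth, PHP's stream filters do the same job without any buffer-size arithmetic, since the filter handles partial groups across chunk boundaries itself. A sketch using the built-in convert.base64-encode filter (convert.base64-decode works the same way in reverse); the function name is illustrative:

function encode_file_filtered($Ifilename, $Efilename) {
    $in  = fopen($Ifilename, 'rb');
    $out = fopen($Efilename, 'wb');
    // Everything written to $out is base64-encoded on the fly.
    stream_filter_append($out, 'convert.base64-encode', STREAM_FILTER_WRITE);
    stream_copy_to_stream($in, $out);
    fclose($in);
    fclose($out);
}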

PHP sends 0-byte Excel file through echo

I am creating a very large table in a single string $output with the data from a MySQL database. At the end of the PHP script, I send two headers:
header("Content-type: application/msexcel");
header("Content-Disposition: attachment; filename=action.xls");
The problem occurs when the table has more than about 55000 rows. When this happens, the file sent is 0 bytes and the user opens an empty file. After the headers, I send the file with:
echo $output;
When the table does not have too many rows, the file is sent correctly. How can I send the file so that the size of the string $output doesn't matter?
header("Content-type: application/msexcel");
header("Content-Disposition: attachment; filename=action.xls");
Move these lines to the top of the page or script, before any HTML table output, and it will start working.
The solution found in another question worked here. The file is too large to send at once, so the fix is to send it to the user part by part. Instead of the readfile() function, use this customized readfile_chunked() function:
define('CHUNK_SIZE', 1024*1024); // size (in bytes) of each chunk

// Read a file and display its content chunk by chunk
function readfile_chunked($filename, $retbytes = true) {
    $buffer = '';
    $cnt = 0;
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        $buffer = fread($handle, CHUNK_SIZE);
        echo $buffer;
        ob_flush();
        flush();
        if ($retbytes) {
            $cnt += strlen($buffer);
        }
    }
    $status = fclose($handle);
    if ($retbytes && $status) {
        return $cnt; // return num. bytes delivered like readfile() does.
    }
    return $status;
}
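Since $output in this question is a string rather than a file on disk, it needs a stop on disk before readfile_chunked() can stream it. A sketch of the glue, using tempnam() to avoid inventing a path:

header("Content-type: application/msexcel");
header("Content-Disposition: attachment; filename=action.xls");

// Spill the generated table to a temporary file, stream it in
// 1 MB chunks, then clean up.
$tmp = tempnam(sys_get_temp_dir(), 'xls');
file_put_contents($tmp, $output);
readfile_chunked($tmp);
unlink($tmp);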

php: readfile_chunked not flushing last 2830 bytes of 29353KB file

I am using a variation of the familiar readfile_chunked() in an attempt to download larger files:
function readfile_chunked($filename)
{
    $chunk_size = 1 * (1024 * 1024); // how many bytes per chunk
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        $buffer = fread($handle, $chunk_size);
        print $buffer;
        ob_flush();
        flush();
        sleep(1); // throttle: one chunk per second
    }
    $status = fclose($handle);
    return $status;
}
Smaller files work fine, but this larger file is missing the last 2830 bytes.
I found the cause of this: in php.ini, make sure implicit_flush is set to On. I still keep the explicit flush code after each chunk of output, however.
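For reference, the same effect can be had at runtime without editing php.ini, assuming no output buffer needs to survive the download:

// Flush automatically after every output call, and drop any
// buffers the framework may have opened before streaming.
ob_implicit_flush(true);
while (ob_get_level() > 0) {
    ob_end_flush();
}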
