How to handle simultaneous reading and writing - PHP

I have a server that grabs MP3 audio buffers from a source and writes them to a file via PHP. It truncates the beginning of the file so the file size never exceeds 2 MB. At the same time, a client streams the MP3 by seeking to the end of the file and reading whenever there is new data. The problem is that when the file gets truncated, the position the client was reading at changes.
This is the client side that streams the audio:
$file_lock = 'cool.mp3.lock'; // the lock file exists while the stream is still going
$handle = fopen('cool.mp3', 'r');
$err = fseek($handle, 0, SEEK_END);
while (file_exists($file_lock)) {
    $data = fread($handle, 1024);
    echo $data;
    ob_flush();
    flush();
}
I use this on the server to write data as I get it:
$data = "audio frames....";
clearstatcache();
$file = 'cool.mp3';
if(filesize($file) > 1024*200){ //2 MB
ftruncatestart($file, 1024*25); //Trim Down By Deleting Front
}
file_put_contents($file, $data, FILE_APPEND);
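No answer is shown for this question here, but the failure mode is worth spelling out: the client keeps reading from a byte offset that no longer lines up once the front of the file is cut away. Below is a minimal sketch of one way the client could compensate, assuming the server trims the file in place (if the helper rewrites the file to a new inode, the handle would have to be re-opened instead); the offset-tracking variables are illustrative, not from the original code:

$file = 'cool.mp3';
$handle = fopen($file, 'r');
fseek($handle, 0, SEEK_END);
$offset = ftell($handle); // remember where we are reading from

while (file_exists($file . '.lock')) {
    clearstatcache(true, $file);
    $size = filesize($file);
    if ($size < $offset) {
        // the file shrank, so the front was truncated and our old offset
        // is past (or misaligned with) the new data; resync to the end
        $offset = $size;
        fseek($handle, $offset);
    }
    $data = fread($handle, 1024);
    $offset += strlen($data);
    echo $data;
    ob_flush();
    flush();
    usleep(100000); // avoid a busy loop between reads
}
fclose($handle);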

Related

PHP Memory access

Can we access some of the system's heap memory with PHP, like in C/C++? I keep getting Fatal error: Allowed memory size of 134217728 bytes exhausted while trying to do some big file operations in PHP.
I know we can tweak this limit in the Apache2 config. But for processing large files and data of unknown size, can we have some kind of access to the heap to process and save the file? And if so, is there a mechanism to clear the memory after use?
Sample code
<?php
$filename = "a.csv";
$handle = fopen($filename, "r");
$contents = fread($handle, filesize($filename));
echo $contents;
fclose($handle);
?>
Here, a.csv is an 80 MB file. Can there be a heap operation using some sort of pointer?
Have you tried reading the file in chunks, e.g.:
<?php
$chunk = 256 * 256; // set chunk size to your liking
$filename = "a.csv";
$handle = fopen($filename, 'rb');
while (!feof($handle))
{
$data = fread($handle, $chunk);
echo $data;
ob_flush();
flush();
}
fclose($handle);

How to stream video from a thread in PHP?

Hey, I am trying to write a web page that streams live video from my local server. I want to use threads, so I tried file_get_contents() and got gibberish rendered in my web page. This is the thread; does anybody know a different method for presenting video within the thread?
class stream extends Thread {
    public function run() {
        $file = "/var/www/html/movie.mp4"; // The media file's location
        $f = fopen($file, 'rb'); // Open the file in binary mode
        $chunkSize = 8192; // The size of each chunk to output
        // Start outputting the data
        while (true) {
            fpassthru($f);
            echo file_get_contents('/var/www/html/movie.mp4');
            //echo fread($f, $chunkSize);
            //$data = fread($f, $chunkSize);
            //echo $data;
            flush();
        }
    }
}
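For what it's worth, the gibberish is usually the raw MP4 bytes being rendered as text because no Content-Type header is sent before the output starts. A minimal sketch of a plain, non-threaded streaming endpoint under that assumption (same file path as above, pthreads left out):

<?php
$file = '/var/www/html/movie.mp4';
$f = fopen($file, 'rb');

// Tell the browser this is video data, not HTML
header('Content-Type: video/mp4');
header('Content-Length: ' . filesize($file));

// Stream the file in chunks instead of loading it all into memory
while (!feof($f)) {
    echo fread($f, 8192);
    flush();
}
fclose($f);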

Write and read from the same file - PHP

I am trying to write to a file and then read the data back from the same file. But sometimes the reading starts even before the writing has finished. How can I solve this issue? How can I make the writing finish before moving ahead?
// writing to the file
$string = <12 kb of specific data which I need>;
$filename .= "/ttc/";
$filename .= "datasave.html";
if ($fp = fopen($filename, 'w')) {
    fwrite($fp, $string);
    fclose($fp);
}
// reading from the file
$handle = fopen($filename, "r");
$datatnc = fread($handle, filesize($filename));
$datatnc = addslashes($datatnc);
fclose($handle);
The reason it does not work is that when you finish writing a string to the file, the file pointer points to the end of the file, so when you later try to read with the same file pointer, there is nothing left to read. All you have to do is rewind the pointer to the beginning of the file. Here is an example:
<?php
$fileName = 'test_file';
$savePath = "tmp/tests/" . $fileName;
//create file pointer handle
$fp = fopen($savePath, 'r+'); // note: 'r+' requires the file to exist already; use 'w+' to create it
fwrite($fp, "Writing and Reading with same fopen handle!");
//Now rewind file pointer to start reading
rewind($fp);
//this will output "Writing and Reading with same fopen handle!"
echo fread($fp, filesize($savePath));
fclose($fp);
?>
Here is more info on the rewind() function: http://php.net/manual/en/function.rewind.php
I have mentioned the URL where I found the solution, and I implemented the same. Here is the text from that link:
$file = fopen("test.txt", "w+");
// exclusive lock
if (flock($file, LOCK_EX)) {
    fwrite($file, "Write something");
    // release lock
    flock($file, LOCK_UN);
} else {
    echo "Error locking file!";
}
fclose($file);
Use fclose() after writing to close the file pointer, and then fopen() again to open it for reading.
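A minimal sketch of that suggestion, with illustrative names in place of the poster's full paths:

$string = '12 kb of data to save'; // placeholder for the real payload
$filename = 'datasave.html';

// write, then close so the data is flushed to disk
$fp = fopen($filename, 'w');
fwrite($fp, $string);
fclose($fp);

// re-open with a fresh handle to read the finished file
$handle = fopen($filename, 'r');
clearstatcache(); // filesize() results are cached, so clear the stat cache first
$datatnc = fread($handle, filesize($filename));
fclose($handle);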

getimagesize() limiting file size for remote URL

I could use getimagesize() to validate an image, but the problem is: what if a mischievous user puts a link to a 10 GB random file? It would whack my production server's bandwidth. How do I limit the file size getimagesize() is getting? (e.g. 5 MB max image size)
PS: I did research before asking.
You can download the file separately, imposing a maximum size you wish to download:
function mygetimagesize($url, $max_size = -1)
{
    // create temporary file to store data from $url
    if (false === ($tmpfname = tempnam(sys_get_temp_dir(), uniqid('mgis')))) {
        return false;
    }
    // open input and output
    if (false === ($in = fopen($url, 'rb')) || false === ($out = fopen($tmpfname, 'wb'))) {
        unlink($tmpfname);
        return false;
    }
    // copy at most $max_size bytes
    stream_copy_to_stream($in, $out, $max_size);
    // close input and output file
    fclose($in);
    fclose($out);
    // retrieve image information
    $info = getimagesize($tmpfname);
    // get rid of temporary file
    unlink($tmpfname);
    return $info;
}
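For example, to cap the download at 5 MB (the URL is just a placeholder):

$info = mygetimagesize('http://example.com/photo.jpg', 5 * 1024 * 1024);
if ($info === false) {
    echo 'Could not fetch or parse the image.';
} else {
    list($width, $height) = $info;
}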
You don't want to do something like getimagesize('http://example.com') to begin with, since this will download the image once, check the size, then discard the downloaded image data. That's a real waste of bandwidth.
So, separate the download process from the checking of the image size. For example, use fopen to open the image URL, read little by little, and write it to a temporary file, keeping count of how much you have read. Once you cross 5 MB and are still not finished reading, you stop and reject the image.
You could try to read the HTTP Content-Length header before starting the actual download to weed out obviously large files, but you cannot rely on it, since it can be spoofed or omitted.
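A sketch of that pre-check, assuming get_headers() issuing a HEAD request (treat it only as a first filter, for the reasons above):

$url = 'http://example.com/photo.jpg'; // placeholder URL

// ask for headers only; note that some servers mishandle HEAD requests
$context = stream_context_create(['http' => ['method' => 'HEAD']]);
$headers = get_headers($url, 1, $context);

$length = null;
if ($headers !== false && isset($headers['Content-Length'])) {
    $raw = $headers['Content-Length'];
    // after redirects, get_headers() may return an array of values
    $length = (int) (is_array($raw) ? end($raw) : $raw);
}

if ($length !== null && $length > 5 * 1024 * 1024) {
    exit('Advertised size exceeds the 5 MB limit.');
}
// header absent or plausible: proceed, but still enforce the limit
// while downloading, since the header can be spoofed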
Here is an example; you will need to make some changes to fit your requirements.
function getimagesize_limit($url, $limit)
{
    global $phpbb_root_path; // phpBB-specific; adjust the path for your environment
    $tmpfilename = tempnam($phpbb_root_path . 'store/', unique_id() . '-'); // unique_id() is a phpBB helper
    $fp = fopen($url, 'r');
    if (!$fp) return false;
    $tmpfile = fopen($tmpfilename, 'w');
    $size = 0;
    while (!feof($fp) && $size < $limit)
    {
        $content = fread($fp, 8192);
        $size += strlen($content); // count the bytes actually read, not the chunk size
        fwrite($tmpfile, $content);
    }
    fclose($fp);
    fclose($tmpfile);
    $is = getimagesize($tmpfilename);
    unlink($tmpfilename);
    return $is;
}

PHP: Uncompressing gzcompressed file with gzuncompress and fopen

I am trying to compress a binary file and then uncompress it to test whether the compression is working.
However, the 'uncompressed' file has the same data as the 'compressed' file, as though the uncompression never happened. I have listed the code below.
Thanks in advance:
//compressing
//read file
$filename = 'tocompress/tocompress'.$number_input.'.bin';
$fp = fopen($filename, 'rb');
$contents = fread($fp, filesize($filename));
fclose($fp);
//compress file
$compressing = gzcompress($contents, '0');
//write to file
$fp = fopen('compressed/compressed'.$number_input.'.bin', 'wb');
fwrite($fp, $compressing);
fclose($fp);
//uncompressing
//read file
$uncompfilename = 'compressed/compressed'.$number_input.'.bin';
$fp = fopen($uncompfilename, 'rb');
$uncompresscontents = fread($fp, filesize($uncompfilename));
fclose($fp);
//uncompress file
$uncompressing = gzuncompress($uncompresscontents);
//write to file
$fp = fopen('uncompressed/uncompressed'.$number_input.'.bin', 'wb');
fwrite($fp, $uncompresscontents);
fclose($fp);
gzcompress() takes an optional second argument which sets the compression level, from 0 to 9. You're setting it to 0 (no compression), and you should be using an int, not a string:
You have gzcompress($contents, '0');
You want gzcompress($contents, 9);
From php.net/gzcompress:
The level of compression. Can be given as 0 for no compression up to 9
for maximum compression.
Note also that the final fwrite() writes $uncompresscontents (the data read back from the compressed file) rather than $uncompressing, which by itself makes the 'uncompressed' file byte-identical to the 'compressed' one.
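Folding both fixes together, a minimal corrected round trip might look like this (paths shortened for the example):

// compress
$contents = file_get_contents('tocompress/input.bin');
$compressed = gzcompress($contents, 9); // int level, maximum compression
file_put_contents('compressed/output.bin', $compressed);

// uncompress
$compressed = file_get_contents('compressed/output.bin');
$uncompressed = gzuncompress($compressed);
file_put_contents('uncompressed/output.bin', $uncompressed); // write the uncompressed data, not the compressed input

// verify the round trip
var_dump($contents === $uncompressed); // bool(true)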
