Broadcast stream with PHP within localhost

Maybe I'm asking the impossible, but I want to clone a stream multiple times, a sort of multicast emulation. The idea is to write a 1300-byte buffer every 0.002 seconds into a .sock file (instead of IP:port, to avoid overhead) and then to read the same .sock file from several other scripts at once.
Doing it through a regular file doesn't work: it only works within the same script that generates the buffer file and then echoes it. The other scripts misread it badly.
This works perfectly with the script that generates the chunks:
$handle = @fopen($url, 'rb');
$buffer = 1300;
while (1) {
    $chunk = fread($handle, $buffer);
    $handle2 = fopen('/var/tmp/stream_chunck.tmp', 'w');
    fwrite($handle2, $chunk);
    fclose($handle2);
    readfile('/var/tmp/stream_chunck.tmp');
}
BUT the output of another script that reads the chunks:
while (1) {
    readfile('/var/tmp/stream_chunck.tmp');
}
is messy. I don't know how to synchronize the reading of the chunks, and I thought sockets might work a miracle.

It works only within the same script that generates the buffer file and then echos it. The other scripts will misread it badly
Using a single file without any sort of flow control shouldn't be a problem; tail -F does just that. The disadvantage is that the data will just accumulate indefinitely on the filesystem as long as a single client has an open file handle (even if you truncate the file).
But if you're writing chunks, then write each chunk to a different file (using an atomic write mechanism), and everyone can read the stream by polling for available files, as in the reader loop below (a writer-side sketch follows it):
do {
    // poll until the next chunk file appears
    while (!file_exists("$dir/$prefix.$current_chunk")) {
        clearstatcache();
        usleep(1000);
    }
    process(file_get_contents("$dir/$prefix.$current_chunk"));
    $current_chunk++;
} while (!$finished);
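For completeness, the writer side of that scheme might look like the hedged sketch below (purely illustrative; $url, $dir and $prefix are assumed to be defined as above, and rename() is atomic as long as the temporary file lives on the same filesystem as the destination):
$current_chunk = 0;
$handle = fopen($url, 'rb');         // the source stream, as in the question
while (!feof($handle)) {
    $data = fread($handle, 1300);
    $tmp = "$dir/$prefix.tmp";
    $final = "$dir/$prefix.$current_chunk";
    file_put_contents($tmp, $data);  // write the chunk somewhere private first...
    rename($tmp, $final);            // ...then publish it atomically for the readers
    $current_chunk++;
    usleep(2000);                    // roughly the 0.002 s interval from the question
}
fclose($handle);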
Equally, you could do this with a database, which should have slightly lower overhead for the polling and simplifies the garbage collection of old chunks.
But this is all about how to make your solution workable - it doesn't really address the problem you are trying to solve. If we knew what you were trying to achieve then we might be able to advise on a more appropriate solution - e.g. if it's a chat application, video broadcast, something else....
I suspect a more appropriate solution would be a multi-process, single-memory-model server, and when we're talking about PHP (which doesn't really do threading very well) that means an event-based/asynchronous server. There's a bit more involved than simply calling socket_select(), but there are some good scripts available which do most of the complicated stuff for you.
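To give a flavour of what that looks like, here is a minimal, hedged sketch (not production code) of a single-process broadcaster: it reads the source stream and forwards every chunk to all clients currently connected to a UNIX socket. $sourceUrl and the socket path are illustrative names, not from the question.
$sourceUrl = 'http://example.com/stream';
$server = stream_socket_server('unix:///var/tmp/stream.sock', $errno, $errstr);
$source = fopen($sourceUrl, 'rb');
$clients = array();
while (true) {
    $read = array_merge(array($server, $source), $clients);
    $write = null;
    $except = null;
    if (stream_select($read, $write, $except, 0, 200000) === false) {
        break;                                            // select failed, give up
    }
    foreach ($read as $stream) {
        if ($stream === $server) {
            $clients[] = stream_socket_accept($server);   // new subscriber
        } elseif ($stream === $source) {
            $chunk = fread($source, 1300);                // broadcast the next chunk
            foreach ($clients as $i => $client) {
                if (@fwrite($client, $chunk) === false) { // drop dead connections
                    fclose($client);
                    unset($clients[$i]);
                }
            }
        } elseif (feof($stream)) {                        // a client disconnected
            fclose($stream);
            $clients = array_values(array_filter($clients, function ($c) use ($stream) {
                return $c !== $stream;
            }));
        } else {
            fread($stream, 1024);                         // discard anything clients send
        }
    }
}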

Related

difference between readfile() and fopen()

These two pieces of code both read a file, so what's the main difference?
1. First code:
$handle = fopen($file, 'r');
$data = fread($handle, filesize($file));
2. Second code:
readfile($file);
There's a significant difference between fread() and readfile().
First, readfile() does a number of things that fread() does not. readfile() opens the file for reading, reads it, and then prints it to the output buffer all in one go. fread() only does one of those things: it reads bytes from a given file handle.
Additionally, readfile() has some benefits that fread() does not. For example, it can take advantage of memory-mapped I/O where available rather than slower disk reads. This significantly increases the performance of reading the file since it delegates the process away from PHP itself and more towards operating system calls.
Errata
I previously noted that readfile() could run without PHP (this is corrected below).
For truly large files (think several gigs like media files or large archive backups), you may want to consider delegating the reading of the file away from PHP entirely with X-Sendfile headers to your webserver instead (so that you don't keep your PHP worker tied up for the length of an upload that could potentially take hours).
So you could do something like this instead of readfile():
<?php
/* process some things in php here */
header("X-Sendfile: /path/to/file");
exit; // don't need to keep PHP busy for this
Reading the docs: readfile() reads the whole file and writes it straight to STDOUT, while fread, as in
$data = fread($handle, filesize($file));
puts the content into the variable $data.
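As a rough illustration of the difference, here is a hedged sketch of how you could rebuild readfile()'s behaviour out of fopen()/fread(), streaming the file in 8 KB pieces rather than pulling it all into one variable ($file is assumed to hold a readable path):
$handle = fopen($file, 'rb');
$bytes = 0;
while (!feof($handle)) {
    $piece = fread($handle, 8192);  // read a small block...
    echo $piece;                    // ...and print it straight to the output buffer
    $bytes += strlen($piece);
}
fclose($handle);
// $bytes now holds the number of bytes sent, which is what readfile() returns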

PHP fread by detail

Please look at the PHP code below. It is from a download script:
while (ob_get_level() > 0) {            // discard any active output buffers
    ob_end_clean();
}
set_time_limit(0);
ignore_user_abort(true);
$file = fopen(MATIN_FILE_PATH, "rb");   // the main file
$chunksize = 2 * 1024 * 1024;
while (!feof($file)) {
    echo @fread($file, $chunksize);
    flush();
    if (connection_status() == 1) {     // if the client aborted
        @fclose($file);
        exit;
    }
}
@fclose($file);
exit;
In this code, you see that I send 2 MB per chunk.
Imagine a client with a download speed of 100 kB/s.
After a lot of debugging, I found out that once the client has downloaded each 2 MB, the write happens and the while loop moves to the next iteration. So what is PHP doing in the meantime? Is it waiting for the user to finish downloading the 2 MB completely before it sends another 2 MB? If so, isn't it better to send 10 MB or 50 MB per chunk?
Thanks for any detailed guide.
Imagine you have 10 simultaneous client requests for downloading some file with this script and you have set a 50 MB chunk size. For each request a new PHP process will be invoked, each of them demanding 50 MB of your server's memory to process fread($file, 50*1024*1024). So you would have 500 MB of memory consumed.
If, as you suggested, a client's speed is 100 kB/s, then the probability of having 100 simultaneous connections is not so low, and 100 concurrent requests is already 5 GB of RAM. Do you have that much, or do you want to spend it all on this?
You cannot make the user download the file faster than their actual download speed, so the chunk size does not significantly matter here. Nor will reducing the number of loop iterations speed anything up: I have not tested this, but I think the loop executes far faster than the I/O operations to the remote client. So the only thing you should really be concerned about is making your server work reliably.
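To make the trade-off concrete, here is a hedged sketch of the same loop with a much smaller chunk ($path is an illustrative name): per-request memory stays bounded by the chunk size, while the transfer rate is still dictated by the client's connection:
$chunksize = 64 * 1024;              // 64 KB per fread() keeps peak memory per request small
$file = fopen($path, 'rb');
while (!feof($file)) {
    echo fread($file, $chunksize);
    flush();                         // hand the chunk over to the web server / client
    if (connection_aborted()) {      // stop early if the client went away
        break;
    }
}
fclose($file);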

PHP readfile on a file which is increasing in size

Is it possible to use PHP readfile function on a remote file whose size is unknown and is increasing in size? Here is the scenario:
I'm developing a script which downloads a video from a third-party website and simultaneously transcodes it into MP3 format. The MP3 is then transferred to the user via readfile.
The query used for the above process is like this:
wget -q -O- "VideoURLHere" | ffmpeg -i - "Output.mp3" > /dev/null 2>&1 &
So the file is fetched and encoded at the same time.
Now, while the above process is in progress, I begin sending the output MP3 to the user via readfile. The problem is that the encoding takes some time, so depending on the user's download speed readfile reaches an apparent EOF before the whole file has been encoded, and the user receives partial/incomplete content.
My first attempt at a fix was to apply a speed limit to the user's download, but this is not foolproof: the encoding time and speed vary with load, and it still led to partial downloads.
So is there a way to implement this system in such a way that I can serve the downloads simultaneously along with the encoding and also guarantee sending the complete file to the end user?
Any help is appreciated.
EDIT:
In response to Peter, I'm actually using fread (see readfile_chunked below):
<?php
function readfile_chunked($filename, $retbytes = true) {
    $chunksize = 1 * (1024 * 1024); // how many bytes per chunk
    $cnt = 0;
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        //usleep(120000); // used to impose an artificial speed limit
        $buffer = fread($handle, $chunksize);
        echo $buffer;
        ob_flush();
        flush();
        if ($retbytes) {
            $cnt += strlen($buffer);
        }
    }
    $status = fclose($handle);
    if ($retbytes && $status) {
        return $cnt; // return num. bytes delivered, like readfile() does
    }
    return $status;
}
readfile_chunked($linkToMp3);
?>
This still does not guarantee complete downloads, since depending on the user's download speed and the encoding speed, EOF may still be reached prematurely.
Also, in response to theJeztah's comment, I'm trying to achieve this without making the user wait, so that's not an option.
Since you are dealing with streams, you probably should use stream handling functions :). passthru comes to mind, although this will only work if the download | transcode command is started in your script.
If it is started externally, take a look at stream_get_contents.
Libevent, as mentioned by Evert, seems like the general solution where you have to use a file as a buffer. However, in your case you could do it all inline in your script without using a file as a buffer:
<?php
header("Content-Type: audio/mpeg");
passthru("wget -q -O- http://localhost/test.avi | ffmpeg -i - -f mp3 -");
?>
I don't think there's any way of being notified about new data, short of something like inotify.
I suggest that if you hit EOF, you start polling the modification time of the file (using clearstatcache() between calls) every 200 ms or so. When you find the file size has increased, you can reopen the file, seek to the last position and continue.
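A hedged sketch of that idea (names like $path and $idle_limit are illustrative, and deciding when the encoder has really finished is handled with a crude timeout):
$path = '/tmp/Output.mp3';
$pos = 0;                    // how many bytes we have already sent
$idle = 0;
$idle_limit = 50;            // give up after ~10 s of no growth (50 * 200 ms)
$handle = fopen($path, 'rb');
while (true) {
    $data = fread($handle, 8192);
    if ($data !== false && $data !== '') {
        $pos += strlen($data);
        echo $data;
        flush();
        $idle = 0;
        continue;
    }
    // Apparent EOF: poll the file size until the encoder appends more data.
    clearstatcache();
    if (filesize($path) > $pos) {
        fclose($handle);                  // reopen and seek to the last position
        $handle = fopen($path, 'rb');
        fseek($handle, $pos);
        $idle = 0;
    } elseif (++$idle > $idle_limit) {
        break;                            // assume encoding has finished
    } else {
        usleep(200000);                   // 200 ms between checks, as suggested
    }
}
fclose($handle);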
I can highly recommend using libevent for applications like this; it works very well for exactly this kind of case.
The PHP documentation for it is a bit sparse, but you should be able to find more solid examples around the web.

Is ini_set("memory_limit", "512M"); too much?

I am using two PHP libraries to serve files, one to unzip and one to zip: dUnzip2 and zip.lib (http://www.zend.com/codex.php?id=470&single=1).
They work fine on files under 10 MB, but with files over 10 MB I have to set the memory limit to 256M, and with files over 25 MB, to 512M. It seems kind of high... Is it?
I'm on a dedicated server (4 CPUs and 16 GB RAM), but we also have a lot of traffic and downloading, so I'm kind of wondering here.
Perhaps you're using PHP to load the whole file into memory before serving it to the user? I've used a function found in the comments at http://www.php.net/manual/en/function.readfile.php that serves the file in parts, keeping memory usage low. Copying from that post (because my version is changed):
<?php
function readfile_chunked($filename, $type = 'array') {
    $chunksize = 1 * (1024 * 1024); // how many bytes per chunk
    $lines = ($type == 'array') ? array() : '';
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        switch ($type) {
            case 'array':
                // returns an array of lines, like file()
                $lines[] = fgets($handle, $chunksize);
                break;
            case 'string':
                // returns one string, like file_get_contents()
                $lines .= fread($handle, $chunksize);
                break;
        }
    }
    fclose($handle);
    return $lines;
}
?>
What my script does is license the files before the user downloads them.
Generally you should avoid any web script that consumes that much memory or CPU time. Ideally you'd convert the task to run via a detached process. Setting a high limit isn't bad per se, but it makes it harder to detect poorly written scripts, and it makes it easier for heavy demand on the complex pages to hurt performance across even the simple pages.
For example, with gearman you can easily set up some license worker scripts that run on the CLI and communicate with them via the gearman API.
This way your web server can remain free to perform low CPU and memory tasks, and you can easily guarantee that you'll never run more than X licensing tasks at once (where X is based on how many worker scripts you let run).
Your front end script could just be an AJAX widget that polls the server, checking to see if the task has completed.
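A hedged sketch of what that could look like with the pecl gearman extension (license_file() is a hypothetical stand-in for the real licensing work; paths and names are illustrative):
// worker.php - run from the CLI, one process per concurrent licensing slot
$worker = new GearmanWorker();
$worker->addServer();                        // defaults to 127.0.0.1:4730
$worker->addFunction('license', function (GearmanJob $job) {
    license_file($job->workload());          // the heavy memory/CPU work happens here
    return 'done';
});
while ($worker->work()) {
    // keep handling jobs, one at a time
}

// front-end script - queue the job and return immediately
$client = new GearmanClient();
$client->addServer();
$jobHandle = $client->doBackground('license', '/path/to/file.zip');
// an AJAX poll can later call $client->jobStatus($jobHandle) to see if it's done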

is php file_get_contents() enough for downloading movies?

Is file_get_contents() enough for downloading remote movie files located on a server?
I just think that storing large movie files in a string might be harmful, according to the PHP docs.
Or do I need to use cURL? I don't know cURL.
UPDATE: these are big movie files, around 200 MB each.
file_get_contents() is a problem because it's going to load the entire file into memory in one go. If you have enough memory to support the operation (taking into account that if this is a web server, you may have multiple hits that generate this behavior simultaneously, and therefore each need that much memory), then file_get_contents() should be fine. However, it's not the right way to do it - you should use a library specifically intended for these sort of operations. As mentioned by others, cURL will do the trick, or wget. You might also have good luck using fopen('http://someurl', 'r') and reading blocks from the file and then dumping them straight to a local file that's been opened for write privileges.
As @mopoke suggested, it could depend on the size of the file. For a small movie it may suffice. In general I think cURL would be a better fit though. You have much more flexibility with it than with file_get_contents().
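If you do go the cURL route, here is a hedged sketch of downloading straight to disk so the 200 MB file never has to fit in memory ($src and $dst are illustrative names):
$src = 'http://example.com/movie.mp4';
$dst = '/tmp/movie.mp4';
$out = fopen($dst, 'wb');
$ch = curl_init($src);
curl_setopt($ch, CURLOPT_FILE, $out);            // write the response body directly to $out
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);  // follow redirects
curl_setopt($ch, CURLOPT_FAILONERROR, true);     // treat HTTP errors as failures
$ok = curl_exec($ch);
curl_close($ch);
fclose($out);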
For the best performance you may find it makes sense to just use a standard Unix utility like wget. You should be able to call it with system("wget ...") or exec().
http://www.php.net/manual/en/function.system.php
You can read a few bytes at a time using fread():
$src = "http://somewhere/test.avi";
$dst = "test.avi";
$f = fopen($src, 'rb');
$o = fopen($dst, 'wb');
while (!feof($f)) {
    if (fwrite($o, fread($f, 2048)) === FALSE) {
        return 1;
    }
}
fclose($f);
fclose($o);
