Where is the bottleneck in mjpeg broadcasting? - php

I'm streaming MJPEG with PHP like this:
<?php
// example: /cli/watch.php?i=0&j=200
function get_one_jpeg($i) {
    $path = "img";
    //$f = fopen("$path/$i.jpg", "rb");
    return file_get_contents("$path/$i.jpg");
}

ini_set('display_errors', 1);

# Used to separate the multipart parts
$boundary = "my_mjpeg";

# We start with the standard headers. PHP allows us this much
//header("Connection: close");
header("Cache-Control: no-store, no-cache, must-revalidate, pre-check=0, post-check=0, max-age=0");
header("Cache-Control: private");
header("Pragma: no-cache");
header("Expires: -1");
header("Content-type: multipart/x-mixed-replace; boundary=$boundary");
# From here on out, we no longer expect to be able to use the header() function
print "--$boundary\n";

# Set this so PHP doesn't time out during a long stream
set_time_limit(0);

# Disable Apache's and PHP's compression of output to the client
#apache_setenv('no-gzip', 1);
#ini_set('zlib.output_compression', 0);

# Set implicit flush, and flush all current buffers
#ini_set('implicit_flush', 1);
for ($i = 0; $i < ob_get_level(); $i++)
    ob_end_flush();
ob_implicit_flush(1);

# The loop, producing one JPEG frame per iteration
$i = $_GET['i'];
$j = $_GET['j'];
while ($i <= $j) {
    # Per-image header; note the two newlines
    print "Content-type: image/jpeg\n\n";
    # Your function to get one JPEG image
    print get_one_jpeg($i);
    # The separator
    print "--$boundary\n";
    # Sleep 0.1 seconds for 10 frames per second
    usleep(100000);
    $i++;
}
?>
But if I set a big range of images, for example from 0 to 300, the browser just stops showing frames after some indefinite time.
It's not tied to a specific frame or moment in time, and it happens in different browsers, so I think the cause is Apache.
I tried it under Apache 2.2.9 and 2.2.21 and got the same result. Under IIS Express it works even worse.
What could the problem be?

Based only on the info given:
10 frames per second can be a little aggressive for MJPEG if the frame size/resolution is large. Remember, this is not MPEG, where static parts of the frame don't get re-sent; here the entire frame/image is sent every time. I would first try lowering the frame rate to around 5. If the problem improves, then you know the issue is data rate, somewhere/somehow. You might still manage 10 fps if your code buffered some frames first and then read from the buffer; that way, if a frame is slow to arrive, your code and the browser don't choke (see the sketch below). I think you also need to limit how long your code waits for an image to show up before giving up and continuing with the next one. Hope this helps.
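A minimal sketch of that buffering idea, assuming the headers and $boundary setup from the question's script have already run (the frame paths, the 25-frame window, and the 5 fps rate are illustrative assumptions):
<?php
// Pre-load a window of frames into memory, then stream from memory so a
// slow disk read cannot stall the multipart stream mid-frame.
$boundary = "my_mjpeg";
$frames = array();
for ($i = 0; $i < 25; $i++) {
    $jpeg = @file_get_contents("img/$i.jpg"); // skip frames that fail to load
    if ($jpeg !== false) $frames[] = $jpeg;
}
foreach ($frames as $jpeg) {
    print "Content-type: image/jpeg\n\n";
    print $jpeg;
    print "--$boundary\n";
    usleep(200000); // 5 fps instead of 10
}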

I'm not sure if this question is still valid, but even if it's not, there is no direct answer yet.
I assume you are getting an "image corrupt or truncated" error. My code is almost identical, and I faced the same issue when using usleep().
The root cause is the placement of usleep(): it should be called before print("--$boundary\n"), not after. When you place it after the print, the browser thinks something is wrong, since it expects an image directly after the boundary section. In this code, what comes immediately after the boundary is usleep(), which holds the stream for 100 ms, and because of that the browser assumes the stream is broken.
Change this code:
print "--$boundary\n";
usleep(100000);
To this one:
usleep(100000);
print "--$boundary\n";
And everything will be working fine.

Related

Don't buffer response for event stream

I am using the CodeIgniter 3 PHP framework on an nginx/1.14.1 server with PHP/7.4.19.
I am trying to set up event streams. The controller method looks like so:
public function foo()
{
    // Try everything I can to disable output buffering
    ini_set('output_buffering', 'Off');
    ini_set('implicit_flush', 'On');
    ini_set('zlib.output_compression', 'Off');
    header('X-Accel-Buffering: no');
    header("X-Robots-Tag: noindex");
    header('Cache-Control: no-store, no-cache');
    header('Content-Type: text/event-stream');
    header('Content-Encoding: none');

    $i = 0;
    // Loop 10 times
    while ($i < 10)
    {
        // CodeIgniter built-in function to generate a random alphanumeric string of n characters
        $random_string = random_string('alnum', 10);
        $array = array(
            'foo' => $random_string
        );
        // Put together an event
        $output  = "event: ping\n";
        $output .= "id: $random_string\n";
        $output .= "data: ";
        $output .= json_encode($array);
        $output .= "\n\n";
        echo $output;
        $i++;
        ob_end_flush();
        flush();
        // Break the loop if the client closed the page
        if (connection_aborted())
        {
            break;
        }
        sleep(1);
    }
}
The expected behavior is for each event to come in one at a time, one per second.
However, it seems I am unable to get output buffering to be disabled. The full 10s passes before all 10 events load in full.
If I generate a large random string such as random_string('alnum', 500);, then the page loads in ~4K character chunks. In other words, the page flushes the buffer approximately every ~4K characters, and it takes about 3 of these chunks (with a 2-3s wait between each one, to accommodate 2-3 iterations of the while loop) before the page finishes loading.
I am using Cloudflare but I have a page rule in place to bypass caching on the page in question. It is the first page rule.
Response buffering is disabled in Cloudflare. By default, Cloudflare sends packets to the client as they receive them. Enterprise accounts (which we are not) can optionally enable response buffering.
Cloudflare Brotli compression is enabled. Temporarily disabling it did not fix the issue.
Obviously, it works as expected when the PHP script is run from the command line.
I am able to achieve the desired results in Chrome by adding the line $output .= str_pad('', 4096) . "\n\n"; just before echo'ing the output.
However, although this solution/hack "works", it does not seem to disable buffering, which is what my question was.
Update: I discovered this hack does not work in Safari.
I'm wondering if this is an Nginx issue? I had a similar problem when running Server-Sent Events behind Nginx. The fix was to add an 'X-Accel-Buffering' header with the value 'no' to the response.
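If the stream passes through nginx's FastCGI module, the same effect can be had in the server configuration instead of per response. A minimal sketch (the location path and PHP-FPM socket are assumptions for illustration):
# Hypothetical location block: disable FastCGI response buffering and gzip
# for the event-stream endpoint so events pass through as they are produced.
location /events {
    include fastcgi_params;
    fastcgi_pass unix:/run/php-fpm/www.sock;  # adjust to your PHP-FPM socket
    fastcgi_buffering off;  # config-level equivalent of X-Accel-Buffering: no
    gzip off;               # compression would re-buffer the output
}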

Apache does not work with HTTP_RANGE?

I wanted to create a .php file that streams a video!
Now, the problem is that it only works if I use a plain readfile(), but then you cannot seek back and forward in the video, so I searched on Google and found the code below.
(Basically, the HTTP_RANGE branch does not work, EVER, and I do not know why; when testing it, it always fires my die("lol?");, so the range request clearly isn't being recognized for some reason.)
(The die() function is left there on purpose; it will be taken out once it works.)
(Note that I changed $size = filesize($file); to $size = filesize(".".$file);, because someone mentioned that this is required, and filesize($file); doesn't work for me anyway; it always fires an error!)
(And the $file shows the actual path to my file, nothing replaced; it's exactly how it looks in my original PHP!)
<?php
// Clears the cache and prevents unwanted output
ob_clean();
ini_set('display_errors', 1);
ini_set('display_startup_errors', 1);
error_reporting(E_ALL);
apache_setenv('no-gzip', 1);
ini_set('zlib.output_compression', 'Off');

$file = "/cdn4-e663/zw4su8jiy8skgvizihsjehj/2038tkusi9u848sui7zh/2q3z6hjk97ujduz/a1-cdn/9zw35jbmhkk47wi63uu7.mp4"; // The media file's location
$mime = "application/octet-stream"; // The MIME type of the file; this should be replaced with your own
$size = filesize(".".$file); // The size of the file

// Send the content type header
header('Content-type: ' . $mime);

// Check if it's an HTTP range request
if (isset($_SERVER['HTTP_RANGE'])) {
    // Parse the range header to get the byte offset
    $ranges = array_map(
        'intval', // Parse the parts into integers
        explode(
            '-', // The range separator
            substr($_SERVER['HTTP_RANGE'], 6) // Skip the `bytes=` part of the header
        )
    );
    // If the last range param is empty, it means the EOF (End of File)
    if (!$ranges[1]) {
        $ranges[1] = $size - 1;
    }
    // Send the appropriate headers
    header('HTTP/1.1 206 Partial Content');
    header('Accept-Ranges: bytes');
    header('Content-Length: ' . ($ranges[1] - $ranges[0])); // The size of the range
    // Send the ranges we offered
    header(
        sprintf(
            'Content-Range: bytes %d-%d/%d', // The header format
            $ranges[0], // The start range
            $ranges[1], // The end range
            $size // Total size of the file
        )
    );
    // It's time to output the file
    $f = fopen($file, 'rb'); // Open the file in binary mode
    $chunkSize = 8192; // The size of each chunk to output
    // Seek to the requested start range
    fseek($f, $ranges[0]);
    die("working?");
    // Start outputting the data
    while (true) {
        // Check if we have output all the data requested
        if (ftell($f) >= $ranges[1]) {
            break;
        }
        // Output the data
        echo fread($f, $chunkSize);
        // Flush the buffer immediately
        ob_flush();
        flush();
    }
}
else {
    die("lol?");
    header('Content-Length: ' . $size);
    // Read the file
    readfile($file);
    // and flush the buffer
    ob_flush();
    flush();
}
?>
so, the die("lol?"); was added by me to see if the
if(isset($_SERVER['HTTP_RANGE'])){
/*function fires or not, and no, as it seems it returns FALSE every time..8/
}
so i wanted to ask you all, how can i fix this? i really want to use php to stream my video, because of security reasons, and because i like it, i already use this methode with images but its a different code(and working)!
I am using Apache 2.4 (Windows 10 - 64bit PC) with the latest version of PHP7, but it seems that apache does not support HTTP_RANGE? am i missing something, is there something i need to enable inside either the php.ini or the httpd.conf??
Thank you in advance, i hope someone can tells me what to do, because i really am stuck here, and i tried ALL examples of mp4 video streaming i could find on google, and none worked for me :/
There are 2 parts to this:
The request made by the browser/client. This must send appropriate request headers.
The response given by your server. This is done by your PHP script and must also send the appropriate response headers
When you try to stream your video (or whatever the content is), open the Network tab in your browser.
Look at the Request Headers (in Chrome these are under the Network tab). Note that the request must contain a Range: header. If it is not present in the request, you'll have problems: this header is what tells the PHP script on the server that you are making a range request in the first place. If the server does not see this header in the request, it will just bypass the if statement and go into the die().
Note that the Range: request header is not normally included in requests by default, so unless you are specifying it, this will never work. If you don't see it in the Request Headers on your Network tab, it is not present, and you need to fix that.
You may also want to examine the response headers, which are entirely different from the request headers. Again, these can be seen in the Network tab in your browser; verify that the 206 status, Accept-Ranges, and Content-Range headers from your script are actually being set.
Going back to the original question: none of it has anything to do with the response (which is what you were describing). The initial problem you are having is all to do with how you're making the request and the fact that it does not contain a Range: header, when it must.
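One quick way to check both halves is to send a Range header by hand with curl (the URL is a placeholder for your script):
curl -i -H "Range: bytes=0-1023" http://localhost/video.php
# A working script should answer "HTTP/1.1 206 Partial Content" with a
# Content-Range header; a plain 200 means the range branch never ran.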

PHP file_get_contents and exit before page loads [duplicate]

I'm attempting to do an AJAX call (via jQuery) that will initiate a fairly long process. I'd like the script to simply send a response indicating that the process has started, but jQuery won't return the response until the PHP script is done running.
I've tried this with a "close" header (below), and also with output buffering; neither seems to work. Any guesses? Or is this something I need to do in jQuery?
<?php
echo "We'll email you as soon as this is done.";
header("Connection: Close");
// do some stuff that will take a while
mail('dude@thatplace.com', "okay I'm done", 'Yup, all done.');
?>
The following PHP manual page (incl. user notes) suggests several ways to close the TCP connection to the browser without ending the PHP script:
Connection handling Docs
Supposedly it requires a bit more than sending a close header.
The OP then confirms: yup, this did the trick, pointing to user-note #71172 (Nov 2006), copied here:
Closing the user's browser connection while keeping your PHP script running has been an issue since [PHP] 4.1, when the behaviour of register_shutdown_function() was modified so that it would not automatically close the user's connection.
sts at mail dot xubion dot hu posted the original solution:
<?php
header("Connection: close");
ob_start();
phpinfo();
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush();
flush();
sleep(13);
error_log("do something in the background");
?>
Which works fine until you replace phpinfo() with echo('text I want user to see'); in which case the headers are never sent!
The solution is to explicitly turn off output buffering and clear the buffer prior to sending your header information. Example:
<?php
ob_end_clean();
header("Connection: close");
ignore_user_abort(true); // just to be safe
ob_start();
echo('Text the user will see');
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush(); // Strange behaviour, will not work
flush(); // Unless both are called !
// Do processing here
sleep(30);
echo('Text user will never see');
?>
Just spent 3 hours trying to figure this one out, hope it helps someone :)
Tested in:
IE 7.5730.11
Mozilla Firefox 1.81
Later on, in July 2010, in a related answer, Arctic Fire linked two further user-notes that were follow-ups to the one above:
Connection Handling user-note #89177 (Feb 2009)
Connection Handling user-note #93441 (Sep 2009)
It's necessary to send these 2 headers:
Connection: close
Content-Length: n (n = size of output in bytes )
Since you need to know the size of your output, you'll need to buffer your output and then flush it to the browser:
// buffer all upcoming output
ob_start();
echo 'We\'ll email you as soon as this is done.';
// get the size of the output
$size = ob_get_length();
// send headers to tell the browser to close the connection
header('Content-Length: '.$size);
header('Connection: close');
// flush all output
ob_end_flush();
ob_flush();
flush();
// if you're using sessions, this prevents subsequent requests
// from hanging while the background process executes
if (session_id()) {session_write_close();}
/******** background process starts here ********/
Also, if your web server is applying automatic gzip compression to the output (e.g. Apache with mod_deflate), this won't work, because the actual size of the output is changed and the Content-Length is no longer accurate. Disable gzip compression for the particular script, e.g. with apache_setenv('no-gzip', 1);.
For more details, visit http://www.zulius.com/how-to/close-browser-connection-continue-execution
You can use FastCGI with PHP-FPM and its fastcgi_finish_request() function. This way, you can continue to do some processing after the response has already been sent to the client.
example of how to use fastcgi_finish_request() (Nov 2010)
You can find this in the PHP manual here: FastCGI Process Manager (FPM). That specific function is not documented further in the manual, but here is the excerpt from the PHP-FPM: PHP FastCGI Process Manager Wiki:
fastcgi_finish_request()
Scope: php function
Category: Optimization
This feature allows you to speed up the handling of some PHP requests. Acceleration is possible when there are actions in the process of script execution that do not affect the server's response. For example, saving the session in memcached can occur after the page has been formed and passed to the web server. fastcgi_finish_request() is a PHP feature that stops the response output. The web server immediately starts to transfer the response "slowly and sadly" to the client, while PHP at the same time can do a lot of useful things in the context of the request, such as saving the session, converting an uploaded video, handling all kinds of statistics, etc.
fastcgi_finish_request() can trigger the execution of shutdown functions.
Note: fastcgi_finish_request() has a quirk where calls to flush, print, or echo will terminate the script early.
To avoid that issue, you can call ignore_user_abort(true) right before or after the fastcgi_finish_request call:
ignore_user_abort(true);
fastcgi_finish_request();
Complete version:
ignore_user_abort(true); // prevent Apache from killing the running PHP script
ob_start();              // start output buffering
echo "show something to user";
session_write_close();   // close the session file on the server side to avoid blocking other requests
header("Content-Encoding: none");             // so the browser doesn't treat the content as gzipped
header("Content-Length: " . ob_get_length()); // send the length header
header("Connection: close");                  // or redirect to some URL: header('Location: http://www.google.com');
ob_end_flush();
flush(); // really send the content; can't change the order: 1. ob buffer to normal buffer, 2. normal buffer to output
// continue doing something on the server side
ob_start();
sleep(5);          // the user won't wait for these 5 seconds
echo 'for diyism'; // the user can't see this
file_put_contents('/tmp/process.log', ob_get_contents());
ob_end_clean();
A better solution is to fork a background process. It is fairly straightforward on Unix/Linux:
<?php
echo "We'll email you as soon as this is done.";
system("php somestuff.php dude@thatplace.com >/dev/null &");
?>
You should look at this question for better examples:
PHP execute a background process
Assuming you have a Linux server and root access, try this. It is the simplest solution I have found.
Create a new directory for the following files and give it full permissions. (We can make it more secure later.)
mkdir test
chmod -R 777 test
cd test
Put this in a file called bgping.
echo starting bgping
ping -c 15 www.google.com > dump.txt &
echo ending bgping
Note the &. The ping command will run in the background while the current process moves on to the echo command.
It will ping www.google.com 15 times, which will take about 15 seconds.
Make it executable.
chmod 777 bgping
Put this in a file called bgtest.php.
<?php
echo "start bgtest.php\n";
exec('./bgping', $output, $result);
echo "output:" . print_r($output, true) . "\n";
echo "result:" . print_r($result, true) . "\n";
echo "end bgtest.php\n";
?>
When you request bgtest.php in your browser, you should get the following response quickly, without waiting about 15 seconds for the ping command to complete:
start bgtest.php
output:Array
(
[0] => starting bgping
[1] => ending bgping
)
result:0
end bgtest.php
The ping command should now be running on the server. Instead of the ping command, you could run a PHP script:
php -n -f largejob.php > dump.txt &
Hope this helps!
Here's a modification to Timbo's code that works with gzip compression.
// buffer all upcoming output
if (!ob_start("ob_gzhandler")) {
    define('NO_GZ_BUFFER', true);
    ob_start();
}
echo "We'll email you as soon as this is done.";
// Flush here before getting the content length if ob_gzhandler was used.
if (!defined('NO_GZ_BUFFER')) {
    ob_end_flush();
}
// get the size of the output
$size = ob_get_length();
// send headers to tell the browser to close the connection
header("Content-Length: $size");
header('Connection: close');
// flush all output
ob_end_flush();
ob_flush();
flush();
// if you're using sessions, this prevents subsequent requests
// from hanging while the background process executes
if (session_id()) session_write_close();
/******** background process starts here ********/
I'm on a shared host, and fastcgi_finish_request is set up to exit scripts completely. I don't like the Connection: close solution either: using it forces a separate connection for subsequent requests, costing additional server resources. I read the Wikipedia article on Transfer-Encoding: chunked and learned that 0\r\n\r\n terminates a response. I haven't thoroughly tested this across browser versions and devices, but it works on all 4 of my current browsers.
// Disable automatic compression
// #ini_set('zlib.output_compression', 'Off');
// #ini_set('output_buffering', 'Off');
// #ini_set('output_handler', '');
// #apache_setenv('no-gzip', 1);

// Chunked Transfer-Encoding & Gzip Content-Encoding
function ob_chunked_gzhandler($buffer, $phase) {
    if (!headers_sent()) header('Transfer-Encoding: chunked');
    $buffer = ob_gzhandler($buffer, $phase);
    return dechex(strlen($buffer)) . "\r\n$buffer\r\n";
}

ob_start('ob_chunked_gzhandler');

// First chunk
echo "Hello World";
ob_flush();

// Second chunk
echo ", Grand World";
ob_flush();

ob_end_clean();

// Terminating chunk
echo "\x30\r\n\r\n";
ob_flush();
flush();

// Post-processing should not be displayed
for ($i = 0; $i < 10; $i++) {
    print("Post-Processing");
    sleep(1);
}
TL;DR Answer:
ignore_user_abort(true); //Safety measure so that the user doesn't stop the script too early.
$content = 'Hello World!'; //The content that will be sent to the browser.
header('Content-Length: ' . strlen($content)); //The browser will close the connection when the size of the content reaches "Content-Length", in this case, immediately.
ob_start(); //Content past this point...
echo $content;
//...will be sent to the browser (the output buffer gets flushed) when this code executes.
ob_end_flush();
ob_flush();
flush();
if (session_id()) {
    session_write_close(); // Closes the session so later requests aren't blocked.
}
//Anything past this point will be run without involving the browser.
Function Answer:
ignore_user_abort(true);

function sendAndAbort($content)
{
    header('Content-Length: ' . strlen($content));
    ob_start();
    echo $content;
    ob_end_flush();
    ob_flush();
    flush();
}

sendAndAbort('Hello World!');
//Anything past this point will be run without involving the browser.
You could try multithreading.
You could whip up a script that makes a system call (using shell_exec) that invokes the PHP binary with the worker script as the parameter. But I don't think that is the most secure way. Maybe you can tighten things up by chrooting the PHP process, among other things.
Alternatively, there's a class at phpclasses that does this: http://www.phpclasses.org/browse/package/3953.html. But I don't know the specifics of the implementation.
Joeri Sebrechts' answer is close, but it destroys any existing content that may be buffered before you wish to disconnect, and it doesn't call ignore_user_abort properly, allowing the script to terminate prematurely. diyism's answer is good but is not generically applicable; e.g. you may have more or fewer output buffers than that answer handles, so it may simply not work in your situation and you won't know why.
This function allows you to disconnect at any time (as long as headers have not been sent yet) and retains the content you've generated so far. The extra processing time is unlimited by default.
function disconnect_continue_processing($time_limit = null) {
    ignore_user_abort(true);
    session_write_close();
    set_time_limit((int) $time_limit); // defaults to no limit
    while (ob_get_level() > 1) { // only keep the last buffer if nested
        ob_end_flush();
    }
    $last_buffer = ob_get_level();
    $length = $last_buffer ? ob_get_length() : 0;
    header("Content-Length: $length");
    header('Connection: close');
    if ($last_buffer) {
        ob_end_flush();
    }
    flush();
}
If you need extra memory, too, allocate it before calling this function.
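A hypothetical usage sketch (do_expensive_work() stands in for your actual background task):
// Respond immediately, then allow up to 5 minutes of background work
// after the client has disconnected.
echo json_encode(array('status' => 'queued'));
disconnect_continue_processing(300);
do_expensive_work(); // hypothetical placeholder for the real job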
Note for mod_fcgid users (please, use at your own risk).
Quick Solution
The accepted answer of Joeri Sebrechts is indeed functional. However, if you use mod_fcgid you may find that this solution does not work on its own. In other words, when the flush function is called the connection to the client does not get closed.
The FcgidOutputBufferSize configuration parameter of mod_fcgid may be to blame. I have found this tip in:
this reply by Travers Carter, and
this blog post by Seumas Mackinnon.
After reading the above, you may come to the conclusion that a quick solution would be to add the line (see "Example Virtual Host" at the end):
FcgidOutputBufferSize 0
in your Apache configuration file (e.g. httpd.conf), your FCGI configuration file (e.g. fcgid.conf), or your virtual hosts file (e.g. httpd-vhosts.conf).
In (1) above, a variable named "OutputBufferSize" is mentioned. This is the old name of the FcgidOutputBufferSize mentioned in (2) (see the upgrade notes in the Apache web page for mod_fcgid).
Details & A Second Solution
The above solution disables the buffering performed by mod_fcgid either for the whole server or for a specific virtual host. This might lead to a performance penalty for your web site. On the other hand, this may well not be the case since PHP performs buffering on its own.
In case you do not wish to disable mod_fcgid's buffering there is another solution... you can force this buffer to flush.
The code below does just that by building on the solution proposed by Joeri Sebrechts:
<?php
ob_end_clean();
header("Connection: close");
ignore_user_abort(true); // just to be safe
ob_start();
echo('Text the user will see');
echo(str_repeat(' ', 65537)); // [+] Line added: fill up mod_fcgid's buffer.
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush(); // Strange behaviour, will not work
flush(); // Unless both are called !
// Do processing here
sleep(30);
echo('Text user will never see');
?>
What the added line of code essentially does is fill up mod_fcgid's buffer, thus forcing it to flush. The number "65537" was chosen because the default value of the FcgidOutputBufferSize variable is "65536", as mentioned on the Apache web page for the corresponding directive. Hence, you may need to adjust this value accordingly if another value is set in your environment.
My Environment
WampServer 2.5
Apache 2.4.9
PHP 5.5.19 VC11, x86, Non Thread Safe
mod_fcgid/2.3.9
Windows 7 Professional x64
Example Virtual Host
<VirtualHost *:80>
    DocumentRoot "d:/wamp/www/example"
    ServerName example.local
    FcgidOutputBufferSize 0
    <Directory "d:/wamp/www/example">
        Require all granted
    </Directory>
</VirtualHost>
This worked for me:
// prevent Apache from killing the running PHP script
ignore_user_abort(true);
// start output buffering
ob_start();
echo "show something to user1";
// close the session file on the server side to avoid blocking other requests
session_write_close();
// send the length header
header("Content-Length: " . ob_get_length());
header("Connection: close");
// really send the content; can't change the order:
// 1. ob buffer to normal buffer,
// 2. normal buffer to output
ob_end_flush();
flush();
// continue doing something on the server side
ob_start();
// replace this with the background task
sleep(20);
OK, so basically, because of the way jQuery performs the XHR request, even the ob_flush method will not work, since you are unable to run a function on each onreadystatechange. jQuery checks the state, then chooses the proper action to take (complete, error, success, timeout). And although I was unable to find a reference, I recall hearing that this does not work with all XHR implementations.
A method that I believe should work for you is a cross between the ob_flush and forever-frame polling.
<?php
function wrap($str)
{
    return "<script>{$str}</script>";
}

ob_start(); // begin buffering output
echo wrap("console.log('test1');");
ob_flush(); // push the current buffer
flush();    // this flush actually pushes to the browser

$t = time();
while ($t > (time() - 3)) {} // busy-wait 3 seconds

echo wrap("console.log('test2');");
?>
<html>
<body>
<iframe src="ob.php"></iframe>
</body>
</html>
And because the scripts are executed inline as the buffers are flushed, you get execution. To make this useful, change the console.log to a callback method defined in your main script, set up to receive data and act on it (see the sketch below). Hope this helps. Cheers, Morgan.
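For instance, a hypothetical parent page could define the callback, and wrap() in ob.php would emit parent.receive(...) instead of console.log (all names here are illustrative):
<!-- Hypothetical parent page: receives data pushed through the hidden iframe -->
<html>
<body>
<script>
function receive(data) {       // invoked by each flushed <script> chunk
    console.log('got:', data); // replace with real handling logic
}
</script>
<iframe src="ob.php" style="display:none"></iframe>
</body>
</html>
In ob.php the chunks would then be emitted as echo wrap("parent.receive('test1');");.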
An alternative solution is to add the job to a queue and make a cron script which checks for new jobs and runs them, as sketched below.
I had to do it that way recently to circumvent limits imposed by a shared host: exec() et al. were disabled for PHP run by the web server, but could run in a shell script.
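A minimal sketch of that pattern, assuming a hypothetical jobs table and placeholder credentials (process_job() stands in for the actual work):
<?php
// worker.php - hypothetical cron worker, run e.g. every minute via:
//   * * * * * php /path/to/worker.php
$db = new mysqli('localhost', 'user', 'pass', 'mydb');
$jobs = $db->query("SELECT id, payload FROM jobs WHERE status = 'pending'");
while ($job = $jobs->fetch_assoc()) {
    $db->query("UPDATE jobs SET status = 'running' WHERE id = " . (int)$job['id']);
    process_job($job['payload']); // placeholder for the real job handler
    $db->query("UPDATE jobs SET status = 'done' WHERE id = " . (int)$job['id']);
}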
Your problem can be solved by doing some parallel programming in php. I asked a question about it a few weeks ago here: How can one use multi threading in PHP applications
And I got great answers. I particularly liked one of them, in which the writer referenced the Easy Parallel Processing in PHP (Sep 2008; by johnlim) tutorial, which can actually solve your problem very well; I have already used it to deal with a similar problem that came up a couple of days ago.
If the flush() function does not work, you must set the following options in php.ini:
output_buffering = Off
zlib.output_compression = Off
Latest Working Solution
// client can see outputs if any
ignore_user_abort(true);
ob_start();
echo "success";
$buffer_size = ob_get_length();
session_write_close();
header("Content-Encoding: none");
header("Content-Length: $buffer_size");
header("Connection: close");
ob_end_flush();
ob_flush();
flush();
sleep(2);
ob_start();
// client cannot see the result of code below
After trying many different solutions from this thread (none of which worked for me), I found the solution on the official PHP.net page:
function sendResponse($response) {
    ob_end_clean();
    header("Connection: close\r\n");
    header("Content-Encoding: none\r\n");
    ignore_user_abort(true);
    ob_start();
    echo $response; // Actual response that will be sent to the user
    $size = ob_get_length();
    header("Content-Length: $size");
    ob_end_flush();
    flush();
    if (ob_get_contents()) {
        ob_end_clean();
    }
}
Couldn't get any of the above to work with IIS, but:
With all its limitations, the built-in PHP -S web server comes to the rescue.
Caller script (IIS):
<?php
// a limit on the read length is required!
$s = file_get_contents('http://127.0.0.1:8080/test.php', false, null, 0, 10);
echo $s;
Worker script (on the built-in web server at port 8080; beware, it is single-threaded):
ob_end_clean();
header("Connection: close");
ignore_user_abort(true);
ob_start();
echo 'Text the user will see';
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush(); // All output buffers must be flushed here
flush(); // Force output to client
// Do processing here
sleep(5);
file_put_contents('ts.txt',date('H:i:s',time()));
//echo('Text user will never see');
Ugly enough? :)

Output line by line with PHP

In a small application I'm writing, during the installation phase the user must enter details about the MySQL connection in a form (server name, database name, username, ...).
I would like to display, line by line, the results of the tests and actions performed in a window, with a small pause between each test, like:
"Connection to the server ..."(2s) "ok" (2s)
"Connection to the database ..."(2s) "ok" (2s)
"Creation of tables ..."(2s) "ok"
....etc
I have tried ob_start(), flush(), and ob_flush(), but it doesn't work: my web server seems to buffer the PHP output itself, so the script takes a long time to run but everything is printed at the same time.
I have searched here and on Google without any result (maybe not with the correct keywords).
Can you point me to a solution?
With Ajax maybe?
ericc
There are multiple solutions to this, but disabling buffering altogether isn't easily accomplished, because besides the server-side buffering (which can be disabled), there is also a very large chance that the browser is buffering as well.
To fix this you have a few options:
Fake it: display a static JavaScript-based "animation" and redirect when done.
Use Ajax: let the installation script write its results to a text file / MySQL database, and use a different PHP script to load the text file / database at set intervals and display the new results (a sketch of this follows the code below).
Use the newest HTML5/JavaScript streaming APIs, which are made exactly for stuff like this; it is, however, quite hard to find good documentation on them.
To disable server side caching, I use this code:
@ini_set('zlib.output_compression', 0);
@ini_set('implicit_flush', 1);
for ($i = 0; $i < ob_get_level(); $i++) { ob_end_flush(); }
ob_implicit_flush(1);
header('Expires: Fri, 01 Jan 1990 00:00:00 GMT');
header('Cache-Control: no-cache, no-store, max-age=0, must-revalidate');
header('Pragma: no-cache');
header('Connection: close');
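For the Ajax option above, a minimal sketch of the polling endpoint, assuming the installer appends its progress to a hypothetical install.log file (all names are illustrative):
<?php
// progress.php - hypothetical polling endpoint: returns whatever the
// installer has written to the log since the offset the client last saw.
$log = 'install.log';
$offset = isset($_GET['offset']) ? (int)$_GET['offset'] : 0;
$content = file_exists($log) ? file_get_contents($log) : '';
header('Content-Type: application/json');
echo json_encode(array(
    'lines'  => (string)substr($content, $offset), // new output since last poll
    'offset' => strlen($content),                  // client sends this back next time
));
The front end would call this every second or two and append 'lines' to the page.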
Have you tried usleep() before each print-out?
Edit:
You can use usleep() to make your PHP script delay execution so that the texts can be seen one by one; it can substitute for the use of buffering IN YOUR CASE.
mysql_connect() succeeds -> print something -> usleep(300)
mysql_select_db() succeeds -> print something -> usleep(300)
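A minimal sketch of that sequence, using mysqli in place of the old mysql_* API (credentials are placeholders; note that usleep() counts microseconds, so visible pauses need values like 2000000):
<?php
// Flush each test result to the browser, then pause so the steps
// appear one at a time.
while (ob_get_level()) { ob_end_flush(); } // drop PHP's own output buffers
ob_implicit_flush(1);

echo "Connection to the server ... ";
$link = @mysqli_connect('localhost', 'user', 'pass'); // placeholder credentials
echo ($link ? "ok" : "failed") . "<br>\n";
usleep(2000000); // 2 s

echo "Connection to the database ... ";
echo ($link && mysqli_select_db($link, 'mydb') ? "ok" : "failed") . "<br>\n";
usleep(2000000);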

Is there a good implementation of partial file downloading in PHP?

I'm writing a PHP script that allows the user to download a file. Basically the idea is to prevent the file being downloaded more than X times, since it is paid content, and the link should not be spread around.
Since the files will be pretty large, it would be good to implement resuming. I've read the standard, but it's pretty long and allows for some flexibility. Since I need to get it done quickly, I'd prefer a stable, tested implementation of this feature.
Can anyone point me to such a script?
It seems that I found what I needed myself. So that others may benefit, here is the link: http://www.coneural.org/florian/papers/04_byteserving.php
And just in case the original page stops working (the script is pretty old already), here is a copy of it:
<?php
/*
The following byte serving code is (C) 2004 Razvan Florian. You may find the latest version at
http://www.coneural.org/florian/papers/04_byteserving.php
*/

function set_range($range, $filesize, &$first, &$last) {
    /*
    Sets the first and last bytes of a range, given a range expressed as a string
    and the size of the file.

    If the end of the range is not specified, or the end of the range is greater
    than the length of the file, $last is set as the end of the file.

    If the beginning of the range is not specified, the meaning of the value after
    the dash is "get the last n bytes of the file".

    If $first is greater than $last, the range is not satisfiable, and we should
    return a response with a status of 416 (Requested range not satisfiable).

    Examples:
    $range='0-499',    $filesize=1000 => $first=0,   $last=499 .
    $range='500-',     $filesize=1000 => $first=500, $last=999 .
    $range='500-1200', $filesize=1000 => $first=500, $last=999 .
    $range='-200',     $filesize=1000 => $first=800, $last=999 .
    */
    $dash = strpos($range, '-');
    $first = trim(substr($range, 0, $dash));
    $last = trim(substr($range, $dash + 1));
    if ($first == '') {
        // suffix byte range: gets last n bytes
        $suffix = $last;
        $last = $filesize - 1;
        $first = $filesize - $suffix;
        if ($first < 0) $first = 0;
    } else {
        if ($last == '' || $last > $filesize - 1) $last = $filesize - 1;
    }
    if ($first > $last) {
        // unsatisfiable range
        header("Status: 416 Requested range not satisfiable");
        header("Content-Range: */$filesize");
        exit;
    }
}

function buffered_read($file, $bytes, $buffer_size = 1024) {
    /*
    Outputs up to $bytes from the file $file to standard output, $buffer_size bytes at a time.
    */
    $bytes_left = $bytes;
    while ($bytes_left > 0 && !feof($file)) {
        if ($bytes_left > $buffer_size)
            $bytes_to_read = $buffer_size;
        else
            $bytes_to_read = $bytes_left;
        $bytes_left -= $bytes_to_read;
        $contents = fread($file, $bytes_to_read);
        echo $contents;
        flush();
    }
}

function byteserve($filename) {
    /*
    Byteserves the file $filename.

    When there is a request for a single range, the content is transmitted
    with a Content-Range header, and a Content-Length header showing the number
    of bytes actually transferred.

    When there is a request for multiple ranges, these are transmitted as a
    multipart message. The multipart media type used for this purpose is
    "multipart/byteranges".
    */
    $filesize = filesize($filename);
    $file = fopen($filename, "rb");
    $ranges = NULL;
    if ($_SERVER['REQUEST_METHOD'] == 'GET' && isset($_SERVER['HTTP_RANGE']) && $range = stristr(trim($_SERVER['HTTP_RANGE']), 'bytes=')) {
        $range = substr($range, 6);
        $boundary = 'g45d64df96bmdf4sdgh45hf5'; // set a random boundary
        $ranges = explode(',', $range);
    }
    if ($ranges && count($ranges)) {
        header("HTTP/1.1 206 Partial content");
        header("Accept-Ranges: bytes");
        if (count($ranges) > 1) {
            /*
            More than one range is requested.
            */
            // compute content length
            $content_length = 0;
            foreach ($ranges as $range) {
                set_range($range, $filesize, $first, $last);
                $content_length += strlen("\r\n--$boundary\r\n");
                $content_length += strlen("Content-type: application/pdf\r\n");
                $content_length += strlen("Content-range: bytes $first-$last/$filesize\r\n\r\n");
                $content_length += $last - $first + 1;
            }
            $content_length += strlen("\r\n--$boundary--\r\n");
            // output headers
            header("Content-Length: $content_length");
            // see http://httpd.apache.org/docs/misc/known_client_problems.html for a discussion of x-byteranges vs. byteranges
            header("Content-Type: multipart/x-byteranges; boundary=$boundary");
            // output the content
            foreach ($ranges as $range) {
                set_range($range, $filesize, $first, $last);
                echo "\r\n--$boundary\r\n";
                echo "Content-type: application/pdf\r\n";
                echo "Content-range: bytes $first-$last/$filesize\r\n\r\n";
                fseek($file, $first);
                buffered_read($file, $last - $first + 1);
            }
            echo "\r\n--$boundary--\r\n";
        } else {
            /*
            A single range is requested.
            */
            $range = $ranges[0];
            set_range($range, $filesize, $first, $last);
            header("Content-Length: " . ($last - $first + 1));
            header("Content-Range: bytes $first-$last/$filesize");
            header("Content-Type: application/pdf");
            fseek($file, $first);
            buffered_read($file, $last - $first + 1);
        }
    } else {
        // no byteserving
        header("Accept-Ranges: bytes");
        header("Content-Length: $filesize");
        header("Content-Type: application/pdf");
        readfile($filename);
    }
    fclose($file);
}

function serve($filename, $download = 0) {
    // Just serves the file without byteserving.
    // If $download is true, the save file dialog appears.
    $filesize = filesize($filename);
    header("Content-Length: $filesize");
    header("Content-Type: application/pdf");
    $filename_parts = pathinfo($filename);
    if ($download) header('Content-disposition: attachment; filename=' . $filename_parts['basename']);
    readfile($filename);
}

// unset magic quotes; otherwise, file contents will be modified
set_magic_quotes_runtime(0);
// do not send cache limiter header
ini_set('session.cache_limiter', 'none');

$filename = 'myfile.pdf'; // this is the PDF file that will be byteserved
byteserve($filename);     // byteserve it!
?>
You should be using PEAR HTTP_Download. It is pretty easy to use, and it allows download resuming just fine:
http://pear.php.net/manual/en/package.http.http-download.intro.php
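From memory of the PEAR manual, usage is along these lines (treat the method names as assumptions and check the HTTP_Download docs linked above):
<?php
// Sketch based on the PEAR HTTP_Download manual; the package handles
// Range/If-Range headers, and therefore resuming, internally.
require_once 'HTTP/Download.php';
$dl = new HTTP_Download();
$dl->setFile('/path/to/file.pdf'); // placeholder path
$dl->setContentType('application/pdf');
$dl->setContentDisposition(HTTP_DOWNLOAD_ATTACHMENT, 'file.pdf');
$dl->send();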
Based on this:
http://w-shadow.com/blog/2007/08/12/how-to-force-file-download-with-php/
(which you could also use),
I've made a small lib that does what the PECL http_send_file extension does:
http://php.net/manual/en/function.http-send-file.php
(which you could also use).
The lib resembles http_send_file, but if you don't have the option of installing the PECL extension, you can use the http-send-file lib:
https://github.com/diversen/http-send-file
See http://us3.php.net/manual/en/function.fread.php
An alternative is to let the web server handle HTTP by redirecting to the file in question.
A PHP script can do any checks needed (security, authentication, validating the file, incrementing the download count) and any other tasks before calling header("Location: $urltofile");.
I tested this with Apache. Interrupt/resume of downloads works. The server's MIME type configuration determines client behavior. For Apache, if the defaults in mime.types are not suitable, configuration directives for mod_mime can go in a .htaccess file in the directory of the file to download. If really necessary, these could even be written by the PHP script before it redirects.
Perhaps instead of implementing a web server in a web server (yo dawg!) you could use mod_trigger_b4_dl ("trigger before download") in lighttpd, or mod X-Sendfile, available for both lighttpd and Apache 2?
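With Apache's mod_xsendfile loaded, the hand-off looks roughly like this (the file path and the user_may_download() check are hypothetical):
<?php
// Sketch of the X-Sendfile approach: PHP does the bookkeeping (auth,
// download counting), then Apache serves the bytes and handles ranges.
if (!user_may_download()) { // hypothetical access/count check
    header('HTTP/1.1 403 Forbidden');
    exit;
}
header('X-Sendfile: /protected/files/video.mp4'); // absolute server-side path
header('Content-Type: video/mp4');
header('Content-Disposition: attachment; filename="video.mp4"');
// No readfile() here: the web server streams the file, Range requests included.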
