I have a PHP script which takes a while to process, and I wanted the site to display its progress while it is running.
I have been having problems displaying the contents of the output buffer while a PHP script is running. I have tried ob_implicit_flush(true), ob_end_flush() and ob_flush() with no success.
After looking through the PHP manual on php.net for a fair while, I came across a comment by Lee saying:
As of August 2012, all browsers seem to show an all-or-nothing
approach to buffering. In other words, while php is operating, no
content can be shown.
In particular this means that the following workarounds listed further
down here are ineffective:
1) ob_flush (), flush () in any combination with other output
buffering functions;
2) changes to php.ini involving setting output_buffer and/or
zlib.output_compression to 0 or Off;
3) setting Apache variables such as "no-gzip" either through
apache_setenv () or through entries in .htaccess.
So, until browsers begin to show buffered content again, the tips
listed here are moot.
see here
If he is correct, is there an alternative I can use to display the progress of the script running?
[edit]
Looks like Lee could be right. I have been doing some testing and here is an example of the problem
<?php
ob_implicit_flush(true);
ob_end_flush();
for ($i=0; $i<5; $i++) {
echo $i.'<br>';
sleep(1);
}
?>
This should display 0 to 4 with a second in between. What actually happens is that the site waits 5 seconds and then displays 0 to 4 all at once: it waits for the script to complete, then displays everything in one go.
I have had output_buffering = Off in my php.ini file, and I also tried setting it to 1 and to 4096 (4096 being 4 KB); it still does the same.
Check the buffer size and make an "empty" dummy of that size, for example...
$flushdummy="";
for ($i=0; $i<700; $i++) { $flushdummy=$flushdummy." "; }
$flushdummy=$flushdummy."\n";
then, when output is desired, add the dummy to the output, plus the flush:
echo "here I am<br />", $flushdummy;
flush();
#ob_flush();
I'm attempting to do an AJAX call (via jQuery) that will initiate a fairly long process. I'd like the script to simply send a response indicating that the process has started, but jQuery won't return the response until the PHP script is done running.
I've tried this with a "close" header (below) and also with output buffering; neither seems to work. Any guesses? Or is this something I need to do in jQuery?
<?php
echo( "We'll email you as soon as this is done." );
header( "Connection: Close" );
// do some stuff that will take a while
mail( 'dude#thatplace.com', "okay I'm done", 'Yup, all done.' );
?>
The following PHP manual page (incl. user notes) suggests several ways to close the TCP connection to the browser without ending the PHP script:
Connection handling Docs
Supposedly it requires a bit more than sending a Connection: close header.
The OP then confirmed that this did the trick, pointing to user-note #71172 (Nov 2006), copied here:
Closing the users browser connection whilst keeping your php script running has been an issue since [PHP] 4.1, when the behaviour of register_shutdown_function() was modified so that it would not automatically close the users connection.
sts at mail dot xubion dot hu posted the original solution:
<?php
header("Connection: close");
ob_start();
phpinfo();
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush();
flush();
sleep(13);
error_log("do something in the background");
?>
This works fine until you replace phpinfo() with echo('text I want user to see'); in which case the headers are never sent!
The solution is to explicitly turn off output buffering and clear the buffer prior to sending your header information. Example:
<?php
ob_end_clean();
header("Connection: close");
ignore_user_abort(true); // just to be safe
ob_start();
echo('Text the user will see');
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush(); // Strange behaviour, will not work
flush(); // Unless both are called !
// Do processing here
sleep(30);
echo('Text user will never see');
?>
Just spent 3 hours trying to figure this one out, hope it helps someone :)
Tested in:
IE 7.5730.11
Mozilla Firefox 1.81
Later on, in July 2010, in a related answer, Arctic Fire linked two further user-notes that were follow-ups to the one above:
Connection Handling user-note #89177 (Feb 2009)
Connection Handling user-note #93441 (Sep 2009)
It's necessary to send these 2 headers:
Connection: close
Content-Length: n (n = size of the output in bytes)
Since you need to know the size of your output, you'll need to buffer your output and then flush it to the browser:
// buffer all upcoming output
ob_start();
echo 'We\'ll email you as soon as this is done.';
// get the size of the output
$size = ob_get_length();
// send headers to tell the browser to close the connection
header('Content-Length: '.$size);
header('Connection: close');
// flush all output
ob_end_flush();
ob_flush();
flush();
// if you're using sessions, this prevents subsequent requests
// from hanging while the background process executes
if (session_id()) {session_write_close();}
/******** background process starts here ********/
Also, if your web server applies automatic gzip compression to the output (i.e. Apache with mod_deflate), this won't work because the actual size of the output is changed and the Content-Length is no longer accurate. Disable gzip compression for that particular script.
For more details, visit http://www.zulius.com/how-to/close-browser-connection-continue-execution
You can use FastCGI with PHP-FPM and its fastcgi_finish_request() function. This way, you can continue to do some processing after the response has already been sent to the client.
example of how to use fastcgi_finish_request() (Nov 2010)
You can find this in the PHP manual here: FastCGI Process Manager (FPM); but that specific function is not documented further in the manual. Here is the excerpt from the PHP-FPM: PHP FastCGI Process Manager wiki:
fastcgi_finish_request()
Scope: php function
Category: Optimization
This feature allows you to speed up implementation of some php queries. Acceleration is possible when there are actions in the process of script execution that do not affect server response. For example, saving the session in memcached can occur after the page has been formed and passed to a web server. fastcgi_finish_request() is a php feature, that stops the response output. Web server immediately starts to transfer response "slowly and sadly" to the client, and php at the same time can do a lot of useful things in the context of a query, such as saving the session, converting the downloaded video, handling all kinds of statistics, etc.
fastcgi_finish_request() can trigger the execution of shutdown functions.
Note: fastcgi_finish_request() has a quirk where calls to flush, print, or echo will terminate the script early.
To avoid that issue, you can call ignore_user_abort(true) right before or after the fastcgi_finish_request call:
ignore_user_abort(true);
fastcgi_finish_request();
Complete version:
ignore_user_abort(true);//prevent Apache from killing the running PHP script
ob_start();//start output buffering
echo "show something to user";
session_write_close();//close the session file on the server side to avoid blocking other requests
header("Content-Encoding: none");//send a header so the browser won't treat the content as gzipped
header("Content-Length: ".ob_get_length());//send the length header
header("Connection: close");//or redirect to some URL: header('Location: http://www.google.com');
ob_end_flush();flush();//really send the content; the order can't be changed: 1. ob buffer to normal buffer, 2. normal buffer to output
//continue doing something on the server side
ob_start();
sleep(5);//the user won't wait for these 5 seconds
echo 'for diyism';//the user can't see this
file_put_contents('/tmp/process.log', ob_get_contents());
ob_end_clean();
A better solution is to fork a background process. It is fairly straightforward on Unix/Linux:
<?php
echo "We'll email you as soon as this is done.";
system("php somestuff.php dude#thatplace.com >/dev/null &");
?>
You should look at this question for better examples:
PHP execute a background process
Assuming you have a Linux server and root access, try this. It is the simplest solution I have found.
Create a new directory for the following files and give it full permissions. (We can make it more secure later.)
mkdir test
chmod -R 777 test
cd test
Put this in a file called bgping.
#!/bin/sh
echo starting bgping
ping -c 15 www.google.com > dump.txt &
echo ending bgping
Note the &. The ping command will run in the background while the current process moves on to the echo command.
It will ping www.google.com 15 times, which will take about 15 seconds.
Make it executable.
chmod 777 bgping
Put this in a file called bgtest.php.
<?php
echo "start bgtest.php\n";
exec('./bgping', $output, $result);
echo "output:".print_r($output,true)."\n";
echo "result:".print_r($result,true)."\n";
echo "end bgtest.php\n";
?>
When you request bgtest.php in your browser, you should get the following response quickly, without waiting about 15 seconds for the ping command to complete.
start bgtest.php
output:Array
(
[0] => starting bgping
[1] => ending bgping
)
result:0
end bgtest.php
The ping command should now be running on the server. Instead of the ping command, you could run a PHP script:
php -n -f largejob.php > dump.txt &
Hope this helps!
Here's a modification to Timbo's code that works with gzip compression.
// buffer all upcoming output
if(!ob_start("ob_gzhandler")){
define('NO_GZ_BUFFER', true);
ob_start();
}
echo "We'll email you as soon as this is done.";
//Flush here before getting content length if ob_gzhandler was used.
if(!defined('NO_GZ_BUFFER')){
ob_end_flush();
}
// get the size of the output
$size = ob_get_length();
// send headers to tell the browser to close the connection
header("Content-Length: $size");
header('Connection: close');
// flush all output
ob_end_flush();
ob_flush();
flush();
// if you're using sessions, this prevents subsequent requests
// from hanging while the background process executes
if (session_id()) session_write_close();
/******** background process starts here ********/
I'm on a shared host and fastcgi_finish_request is set up to exit scripts completely. I don't like the Connection: close solution either; using it forces a separate connection for subsequent requests, costing additional server resources. I read the Transfer-Encoding: chunked Wikipedia article and learned that 0\r\n\r\n terminates a response. I haven't thoroughly tested this across browser versions and devices, but it works on all four of my current browsers.
// Disable automatic compression
// #ini_set('zlib.output_compression', 'Off');
// #ini_set('output_buffering', 'Off');
// #ini_set('output_handler', '');
// #apache_setenv('no-gzip', 1);
// Chunked Transfer-Encoding & Gzip Content-Encoding
function ob_chunked_gzhandler($buffer, $phase) {
if (!headers_sent()) header('Transfer-Encoding: chunked');
$buffer = ob_gzhandler($buffer, $phase);
return dechex(strlen($buffer))."\r\n$buffer\r\n";
}
ob_start('ob_chunked_gzhandler');
// First Chunk
echo "Hello World";
ob_flush();
// Second Chunk
echo ", Grand World";
ob_flush();
ob_end_clean();
// Terminating Chunk
echo "\x30\r\n\r\n";
ob_flush();
flush();
// Post Processing should not be displayed
for($i=0; $i<10; $i++) {
print("Post-Processing");
sleep(1);
}
TL;DR Answer:
ignore_user_abort(true); //Safety measure so that the user doesn't stop the script too early.
$content = 'Hello World!'; //The content that will be sent to the browser.
header('Content-Length: ' . strlen($content)); //The browser will close the connection when the size of the content reaches "Content-Length", in this case, immediately.
ob_start(); //Content past this point...
echo $content;
//...will be sent to the browser (the output buffer gets flushed) when this code executes.
ob_end_flush();
ob_flush();
flush();
if(session_id())
{
session_write_close(); //Closes the session so subsequent requests don't hang while this one keeps running.
}
//Anything past this point will be run without involving the browser.
Function Answer:
ignore_user_abort(true);
function sendAndAbort($content)
{
header('Content-Length: ' . strlen($content));
ob_start();
echo $content;
ob_end_flush();
ob_flush();
flush();
}
sendAndAbort('Hello World!');
//Anything past this point will be run without involving the browser.
You could try to do multithreading.
You could whip up a script that makes a system call (using shell_exec) that calls the PHP binary with the script that does your work as the parameter, as sketched below. But I don't think that is the most secure way. Maybe you can tighten things up by chrooting the PHP process, and other such measures.
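For illustration, here is a minimal sketch of that idea; the worker script path and its argument are hypothetical:
<?php
// Launch the PHP CLI binary in the background; redirecting output and
// appending & lets the web request return immediately.
$email = 'user@example.com'; // hypothetical argument for the worker
shell_exec('php /path/to/worker.php ' . escapeshellarg($email) . ' > /dev/null 2>&1 &');
echo "Job started.";
?>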
Alternatively, there's a class at PHPClasses that does that: http://www.phpclasses.org/browse/package/3953.html. But I don't know the specifics of the implementation.
Joeri Sebrechts' answer is close, but it destroys any existing content that may be buffered before you wish to disconnect. It doesn't call ignore_user_abort properly, allowing the script to terminate prematurely. diyism's answer is good, but it is not generically applicable; e.g. a person may have more or fewer output buffers than that answer handles, so it may simply not work in your situation and you won't know why.
This function allows you to disconnect any time (as long as headers have not been sent yet) and retains the content you've generated so far. The extra processing time is unlimited by default.
function disconnect_continue_processing($time_limit = null) {
ignore_user_abort(true);
session_write_close();
set_time_limit((int) $time_limit);//defaults to no limit
while (ob_get_level() > 1) {//only keep the last buffer if nested
ob_end_flush();
}
$last_buffer = ob_get_level();
$length = $last_buffer ? ob_get_length() : 0;
header("Content-Length: $length");
header('Connection: close');
if ($last_buffer) {
ob_end_flush();
}
flush();
}
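A possible usage sketch, assuming the function above is in scope (the 300-second limit and the work done afterwards are just placeholders):
<?php
echo 'Request accepted; processing continues in the background.';
// Disconnect from the client, allowing up to 300 extra seconds of work.
disconnect_continue_processing(300);
// From here on the browser is gone; do the slow work.
sleep(60);
error_log('background work finished');
?>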
If you need extra memory, too, allocate it before calling this function.
Note for mod_fcgid users (please, use at your own risk).
Quick Solution
The accepted answer of Joeri Sebrechts is indeed functional. However, if you use mod_fcgid you may find that this solution does not work on its own. In other words, when the flush function is called the connection to the client does not get closed.
The FcgidOutputBufferSize configuration parameter of mod_fcgid may be to blame. I have found this tip in:
this reply of Travers Carter and
this blog post of Seumas Mackinnon.
After reading the above, you may come to the conclusion that a quick solution would be to add the line (see "Example Virtual Host" at the end):
FcgidOutputBufferSize 0
in either your Apache configuration file (e.g, httpd.conf), your FCGI configuration file (e.g, fcgid.conf) or in your virtual hosts file (e.g., httpd-vhosts.conf).
In (1) above, a variable named "OutputBufferSize" is mentioned. This is the old name of the FcgidOutputBufferSize directive mentioned in (2) (see the upgrade notes on the Apache web page for mod_fcgid).
Details & A Second Solution
The above solution disables the buffering performed by mod_fcgid either for the whole server or for a specific virtual host. This might lead to a performance penalty for your web site. On the other hand, this may well not be the case since PHP performs buffering on its own.
In case you do not wish to disable mod_fcgid's buffering there is another solution... you can force this buffer to flush.
The code below does just that by building on the solution proposed by Joeri Sebrechts:
<?php
ob_end_clean();
header("Connection: close");
ignore_user_abort(true); // just to be safe
ob_start();
echo('Text the user will see');
echo(str_repeat(' ', 65537)); // [+] Line added: fill up mod_fcgid's buffer.
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush(); // Strange behaviour, will not work
flush(); // Unless both are called !
// Do processing here
sleep(30);
echo('Text user will never see');
?>
What the added line of code essentially does is fill up mod_fcgid's buffer, thus forcing it to flush. The number "65537" was chosen because the default value of the FcgidOutputBufferSize directive is "65536", as mentioned on the Apache web page for the corresponding directive. Hence, you may need to adjust this value accordingly if another value is set in your environment.
My Environment
WampServer 2.5
Apache 2.4.9
PHP 5.5.19 VC11, x86, Non Thread Safe
mod_fcgid/2.3.9
Windows 7 Professional x64
Example Virtual Host
<VirtualHost *:80>
DocumentRoot "d:/wamp/www/example"
ServerName example.local
FcgidOutputBufferSize 0
<Directory "d:/wamp/www/example">
Require all granted
</Directory>
</VirtualHost>
This worked for me:
//prevent Apache from killing the running PHP script
ignore_user_abort(true);
//start output buffering
ob_start();
echo "show something to user1";
//close the session file on the server side to avoid blocking other requests
session_write_close();
//send the length header
header("Content-Length: ".ob_get_length());
header("Connection: close");
//really send the content; the order can't be changed:
//1. ob buffer to normal buffer,
//2. normal buffer to output
ob_end_flush();
flush();
//continue doing something on the server side
ob_start();
//replace this with the background task
sleep(20);
OK, so basically, because of the way jQuery performs the XHR request, even the ob_flush method will not work, since you are unable to run a function on each onreadystatechange. jQuery checks the state, then chooses the proper action to take (complete, error, success, timeout). And although I was unable to find a reference, I recall hearing that this does not work with all XHR implementations.
A method that I believe should work for you is a cross between the ob_flush and forever-frame polling.
<?php
function wrap($str)
{
return "<script>{$str}</script>";
};
ob_start(); // begin buffering output
echo wrap("console.log('test1');");
ob_flush(); // push current buffer
flush(); // this flush actually pushed to the browser
$t = time();
while($t > (time() - 3)) {} // wait 3 seconds
echo wrap("console.log('test2');");
?>
<html>
<body>
<iframe src="ob.php"></iframe>
</body>
</html>
And because the scripts are executed inline as the buffers are flushed, you get execution. To make this useful, change the console.log to a callback method defined in your main script, set up to receive data and act on it, as sketched below. Hope this helps. Cheers, Morgan.
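To make the callback idea concrete, here is a hedged sketch; parent.onProgress is a hypothetical function that the page hosting the iframe would define:
<?php
// ob.php - emits <script> chunks that call a callback in the parent page.
function wrap($str)
{
    return "<script>{$str}</script>";
}
ob_start();
for ($i = 1; $i <= 5; $i++) {
    // parent.onProgress is assumed to be defined by the hosting page.
    echo wrap("parent.onProgress({$i});");
    ob_flush();
    flush();  // push this chunk to the browser
    sleep(1); // simulate a slow step
}
echo wrap("parent.onProgress('done');");
?>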
An alternative solution is to add the job to a queue and make a cron script which checks for new jobs and runs them.
I had to do it that way recently to circumvent limits imposed by a shared host: exec() et al. were disabled for PHP run by the web server, but could run in a shell script. A rough sketch of the queue approach follows.
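This is a file-based queue purely for illustration; the spool directory is a hypothetical path, and a database table works the same way:
<?php
// enqueue.php - the web request just records the job and returns at once.
$job = array('task' => 'send_mail', 'to' => 'user@example.com');
file_put_contents('/var/spool/myapp/jobs/' . uniqid('job', true) . '.json', json_encode($job));
echo "We'll email you as soon as this is done.";
?>
<?php
// worker.php - run from cron, e.g.: * * * * * php /path/to/worker.php
foreach (glob('/var/spool/myapp/jobs/*.json') as $file) {
    $job = json_decode(file_get_contents($file), true);
    // ... process $job here ...
    unlink($file); // remove the job once handled
}
?>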
Your problem can be solved by doing some parallel programming in PHP. I asked a question about it a few weeks ago here: How can one use multi threading in PHP applications
And got great answers. I liked one in particular very much. The writer referenced the Easy Parallel Processing in PHP tutorial (Sep 2008; by johnlim), which can actually solve your problem very well; I have already used it to deal with a similar problem that came up a couple of days ago.
If the flush() function does not work, you may need to set the following options in php.ini:
output_buffering = Off
zlib.output_compression = Off
Latest Working Solution
// client can see outputs if any
ignore_user_abort(true);
ob_start();
echo "success";
$buffer_size = ob_get_length();
session_write_close();
header("Content-Encoding: none");
header("Content-Length: $buffer_size");
header("Connection: close");
ob_end_flush();
ob_flush();
flush();
sleep(2);
ob_start();
// client cannot see the result of code below
After trying many different solutions from this thread (none of which worked for me), I found a solution on the official PHP.net page:
function sendResponse($response) {
ob_end_clean();
header("Connection: close\r\n");
header("Content-Encoding: none\r\n");
ignore_user_abort(true);
ob_start();
echo $response; // Actual response that will be sent to the user
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush();
flush();
if (ob_get_contents()) {
ob_end_clean();
}
}
I couldn't get any of the above to work with IIS, but:
With all its limitations, the built-in PHP -S web server comes to the rescue.
Caller script (IIS):
<?php
// note the length limit in the file_get_contents() call!
$s = file_get_contents('http://127.0.0.1:8080/test.php', false, null, 0, 10);
echo $s;
Worker script (on the built-in web server at port 8080; beware, it is single-threaded):
ob_end_clean();
header("Connection: close");
ignore_user_abort(true);
ob_start();
echo 'Text the user will see';
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush(); // All output buffers must be flushed here
flush(); // Force output to client
// Do processing here
sleep(5);
file_put_contents('ts.txt',date('H:i:s',time()));
//echo('Text user will never see');
Ugly enough? :)
I've been trying to complete a script that sends the proper notification to the user's browser to close the connection, but allows the server to keep processing the request. My code is based on what I've seen at:
http://www.php.net/manual/en/features.connection-handling.php#71172
and
close a connection early
ob_start();
echo ('Text the user will see');
$size = ob_get_length();
ignore_user_abort(); // optional
header('Content-Encoding: none');
header("Content-Length: $size");
header("Connection: close");
// flush all output
ob_end_flush();
ob_flush();
flush();
sleep(5);
//just a test to see if the script continues to run
file_put_contents("trash/".date('dmY-H_i_s_1').".txt", "Some text.");
file_put_contents("trash/".date('dmY-H_i_s_2').".txt", "Some text.");
file_put_contents("trash/".date('dmY-H_i_s_3').".txt", "Some text.");
When I go to run the script, sometimes it will create the first file but not write the text to it. Sometimes it doesn't create any files. If I run the script with the return-early code commented out, all three files are created just fine. Zlib compression is turned off. Any ideas?
The right way to do this is to use PHP in FastCGI mode; then you can use the function
fastcgi_finish_request();
It will do exactly what you need: it closes the connection to the browser, but the rest of the script continues to run to the end.
http://php-fpm.org/wiki/Features
It's much better than relying on output buffering.
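For illustration, a minimal sketch under PHP-FPM (the mail call just echoes the question's example):
<?php
echo "We'll email you as soon as this is done.";
// Flush the response and close the connection; the script keeps running.
fastcgi_finish_request();
// Long-running work happens after the client has been released.
sleep(30);
mail('dude@thatplace.com', "okay I'm done", 'Yup, all done.');
?>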
You can set your extra processing to take place with register_shutdown_function.
Basically, you register something to take place after your script finishes and sends all output to the user. That way the script can keep running for a while afterwards, doing whatever cleanup or the like is needed.
Update: My bad, that no longer applies after PHP 4.1.
The links you gave and the other material I found seem to indicate that you are doing it right. Are you sure there isn't an error or a script timeout going on? What does the PHP error log say?
ignore_user_abort() must be given a non-null parameter for it to actually ignore user aborts. Otherwise it just returns the current ignore setting without changing anything, so you'd need
ignore_user_abort(true);
I'm trying to find a way to have PHP indicate to the browser that all page output is complete. After the page is done, we run some statistics code that usually doesn't take too long, but in case it does I don't want the user's browser waiting for more data. This can't be done via JavaScript because it needs to work with mobile phones.
I'm already starting output buffering using
mb_http_output("UTF8");
ob_start("mb_output_handler");
to ensure I don't have issues with my site's multibyte text (Japanese). I was hoping ob_end_flush() would do the trick, but if I place sleep(10); after the ob_end_flush(), the browser waits an additional 10 seconds. Does anyone have any ideas about this?
UPDATE:
Using jitter's approach below, I added "ob_gzhandler" to get it working with gzip. Does anyone see any possible issues here?
//may be also add headers for cache-control and expires (IE)
header("Connection: close"); //tells browser that connection will be closed
ob_start();
ob_start("ob_gzhandler");
//page content
ob_end_flush();
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush();
flush();
UPDATE AGAIN:
Please take another look at the code above. You need to call ob_start(); before ob_start("ob_gzhandler"); and then call ob_end_flush(); prior to calling ob_get_length(), so that you get the correct gzip-compressed size.
Use something along these lines
//may be also add headers for cache-control and expires (IE)
header("Connection: close"); //tells browser that connection will be closed
ob_start();
//generate your output
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush();
flush();
//continue statistic processing
I don't think there is a way to notify the browser that the output is complete, at least from the script that sends the output. You might be able to do it with some other script that monitors the output of your first script, perhaps via an iframe, as sketched below.
The browser only knows the output is complete when it considers the page loaded; that is all the browser knows.
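One way to realize that monitoring idea, sketched under the assumption that the long-running script periodically writes its status to a shared file (progress.txt is a hypothetical path):
<?php
// progress.php - polled by the page (e.g. from an iframe or via AJAX)
// while the long-running script does file_put_contents($file, "$i of 5");
$file = __DIR__ . '/progress.txt';
echo is_readable($file) ? file_get_contents($file) : '0';
?>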
You could fork a new php process in the background and let that take care of the stats. Something like:
shell_exec('php stats.php &');
The & at the end makes sure that it's run in the background, so even if the stats.php takes 20 seconds, the visitor won't notice it.
You would probably need to pass data to the stats script, which you can do by passing in parameters, like this:
shell_exec('php stats.php -b '. escapeshellarg($_SERVER['HTTP_USER_AGENT']) .' &');
In stats.php, you'd use the $argv variable to get that data.
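For illustration, stats.php might read that argument roughly like this (the -b flag handling is a simplified assumption):
<?php
// stats.php - invoked as: php stats.php -b "<user agent>"
$options = getopt('b:'); // parse the -b flag
$userAgent = isset($options['b']) ? $options['b'] : 'unknown';
// ... record the user agent, e.g. append it to a log ...
file_put_contents('/tmp/stats.log', $userAgent . PHP_EOL, FILE_APPEND);
?>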
But I wouldn't do this if the statistics code doesn't take that long to run, since forking a new process for every page load like this has some overhead. I don't know what makes the stats code take so long, but another solution might be to insert the raw data into a database and let a background job work on that data to create usable statistics. That could be done either by a cron job, or by having a screen session run in an infinite loop that processes the queue.
Try moving your statistics code to a separate function and calling that function with an AJAX call in the dom.ready or onload event in the JavaScript code on your rendered page, as in this meta code:
<html>
<script type="text/javascript">
dom.onready = Ajax.call(location.href + '?do_stats');
</script>
<body>...
</html>
The dom.ready event can be provided by the jQuery or Prototype libraries. The downside is that it will only work with JS enabled.
Alternatively, you could just record all the information needed for the stats to a database, and dispatch a script that collects the queued data from there and works on it in the background, e.g. by using cron.
I am responsible for the backend portion of an API, written in PHP, which is primarily used by a Flash client. What happens right now is: the Flash client makes a call, the backend loads the necessary data, does any necessary processing and post-processing, logging and caching, and then returns the result to the client.
What I would like to have happen is return the data to the client as soon as possible, close the connection, and then do all the stuff that the client doesn't have to care about. This could make the API seem much more responsive. Following the suggestions here:
http://php.net/manual/en/features.connection-handling.php
actually works, except that I have to turn off gzip encoding to make it work, which isn't very practical. We use mod_deflate in Apache, so a solution that works with that would be ideal, but I would also consider a different method of gzipping our content if necessary.
It seems like there should be a way to let Apache know "I've sent you all the data I'm going to send," but I can't seem to find anything like that.
For those wondering, yes I can flush the results early, but the Flash client will not process them until the connection is closed.
You might try breaking it into two pages.
In the first page, do the necessary processing, then load the second page via curl, and die().
That would cause the first page to complete and close, independent of the second page's processing.
I.e.:
Page 1:
<?php
// Do stuff
// Post or get second page...
// Send Data to client
die();
?>
Page 2:
<?php
// Do other stuff....
?>
See http://www.php.net/curl
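For concreteness, here is a hedged sketch of that first page; the URL and the short timeout are assumptions, and the timeout is what lets page 1 fire the request without waiting for page 2 to finish:
<?php
// Page 1: kick off page2.php via curl, but don't wait for it to complete.
// page2.php should call ignore_user_abort(true) so the early disconnect
// doesn't kill it.
$ch = curl_init('http://example.com/page2.php');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT_MS, 100); // give up after 100 ms by design
curl_exec($ch); // times out quickly; page 2 keeps running
curl_close($ch);
// Send the real response to the client, then finish.
echo 'Started.';
die();
?>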
There's a kind of hack to do this: place the code you want to execute after the connection closes within a callback registered via register_shutdown_function().
@Theo.T: since the comment system mangled the crap out of my code, I'm posting it here:
No luck. The following prints out the extra crap and takes the full execution time to close the connection when using mod_deflate:
function sleepLongTime() {
print "you can't see this";
sleep(30);
}
ob_end_clean();
register_shutdown_function('sleepLongTime');
header("Connection: close\r\n");
ignore_user_abort(true);
ob_start();
echo ('Text user will see');
ob_end_flush();
flush();
ob_end_clean();
die();
set_time_limit(0);
header("Connection: close");
header("Content-Length: " .(strlen($stream)+256));
ignore_user_abort(true);
echo $stream;
echo(str_repeat(' ',256));
#ob_flush();
#flush();
#ob_end_flush();
your_long_long_long_long_function_here();
This will tell the browser to close the connection once all of $stream is received, but be careful not to echo anything before the header part, you know. :P
If you are sending binary data (SWF) you might need to remove the '+256' and the echo(str_repeat(' ',256)); but in that case the code 'might' fail if the data sent is less than 256 bytes.
I ran into this case today as well; after some testing, I found that the following approach works:
Two steps:
Make sure the PHP script's output is not gzip-encoded, for example by adding the following:
<IfModule mod_env.c>
SetEnvIfNoCase Request_URI "\.php$" no-gzip dont-vary
</IfModule>
Add the above to the .htaccess file under the project's web site, which stops Apache from gzipping the output automatically.
As some people noted at features.connection-handling.php:
set_time_limit(0);
ignore_user_abort(true);
// buffer all upcoming output - taking compression into account:
if(!ob_start("ob_gzhandler"))
ob_start();
echo $stringToOutput;
// get the size of the output
$size = ob_get_length();
// send headers to tell the browser to close the connection
header("Content-Length: $size");
header('Connection: close');
// flush all output
ob_end_flush();
ob_flush();
flush();
// close current session
if (session_id()) session_write_close(); //close connection
// here, do what you want.