I'm attempting to make an AJAX call (via jQuery) that will initiate a fairly long process. I'd like the script to simply send a response indicating that the process has started, but jQuery won't return the response until the PHP script is done running.
I've tried this with a "close" header (below), and also with output buffering; neither seems to work. Any guesses? Or is this something I need to do in jQuery?
<?php
echo( "We'll email you as soon as this is done." );
header( "Connection: Close" );
// do some stuff that will take a while
mail( 'dude@thatplace.com', "okay I'm done", 'Yup, all done.' );
?>
The following PHP manual page (incl. user notes) offers several approaches to closing the TCP connection to the browser without ending the PHP script:
Connection handling Docs
Supposedly it requires a bit more than sending a close header.
The OP then confirmed that this did the trick, pointing to user note #71172 (Nov 2006), copied here:
Closing the user's browser connection whilst keeping your PHP script running has been an issue since [PHP] 4.1, when the behaviour of register_shutdown_function() was modified so that it would not automatically close the user's connection.
sts at mail dot xubion dot hu posted the original solution:
<?php
header("Connection: close");
ob_start();
phpinfo();
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush();
flush();
sleep(13);
error_log("do something in the background");
?>
Which works fine until you replace phpinfo() with echo('text I want user to see'); in that case the headers are never sent!
The solution is to explicitly turn off output buffering and clear the buffer prior to sending your header information. Example:
<?php
ob_end_clean();
header("Connection: close");
ignore_user_abort(true); // just to be safe
ob_start();
echo('Text the user will see');
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush(); // Strange behaviour, will not work
flush(); // Unless both are called !
// Do processing here
sleep(30);
echo('Text user will never see');
?>
Just spent 3 hours trying to figure this one out, hope it helps someone :)
Tested in:
IE 7.5730.11
Mozilla Firefox 1.81
Later, in July 2010, in a related answer, Arctic Fire linked two further user notes that were follow-ups to the one above:
Connection Handling user-note #89177 (Feb 2009)
Connection Handling user-note #93441 (Sep 2009)
It's necessary to send these 2 headers:
Connection: close
Content-Length: n (n = size of output in bytes)
Since you need to know the size of your output, you'll need to buffer your output and then flush it to the browser:
// buffer all upcoming output
ob_start();
echo 'We\'ll email you as soon as this is done.';
// get the size of the output
$size = ob_get_length();
// send headers to tell the browser to close the connection
header('Content-Length: '.$size);
header('Connection: close');
// flush all output
ob_end_flush();
ob_flush();
flush();
// if you're using sessions, this prevents subsequent requests
// from hanging while the background process executes
if (session_id()) {session_write_close();}
/******** background process starts here ********/
Also, if your web server applies automatic gzip compression to the output (e.g. Apache with mod_deflate), this won't work because the actual size of the output is changed and the Content-Length is no longer accurate. Disable gzip compression for the particular script (a sketch follows the link below).
For more details, visit http://www.zulius.com/how-to/close-browser-connection-continue-execution
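As a sketch of disabling compression for just this script (assuming Apache with mod_deflate and/or PHP's zlib output compression; apache_setenv() is only available when PHP runs as an Apache module):
if (function_exists('apache_setenv')) {
    @apache_setenv('no-gzip', '1'); // tell mod_deflate not to compress this response
}
@ini_set('zlib.output_compression', 'Off'); // disable PHP-level compression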
You can use FastCGI with PHP-FPM and its fastcgi_finish_request() function. This way, you can continue to do some processing after the response has already been sent to the client.
example of how to use fastcgi_finish_request() (Nov 2010)
You'll find this in the PHP manual here: FastCGI Process Manager (FPM). That function specifically, however, is not documented further in the manual, so here is the excerpt from the PHP-FPM wiki:
fastcgi_finish_request()
Scope: php function
Category: Optimization
This feature allows you to speed up the handling of some PHP requests. Acceleration is possible when there are actions in the course of script execution that do not affect the server's response. For example, saving the session in memcached can happen after the page has been formed and passed to the web server. fastcgi_finish_request() is a PHP function that stops the response output. The web server immediately starts transferring the response "slowly and sadly" to the client, while PHP can simultaneously do a lot of useful things in the context of the request, such as saving the session, converting an uploaded video, handling all kinds of statistics, etc.
fastcgi_finish_request() can also trigger execution of shutdown functions.
Note: fastcgi_finish_request() has a quirk where calls to flush, print, or echo will terminate the script early.
To avoid that issue, you can call ignore_user_abort(true) right before or after the fastcgi_finish_request call:
ignore_user_abort(true);
fastcgi_finish_request();
Complete version:
ignore_user_abort(true); // prevent Apache from killing the running PHP process
ob_start(); // start output buffering
echo "show something to user";
session_write_close(); // close the session file server-side so other requests aren't blocked
header("Content-Encoding: none"); // stop the browser treating the content as gzipped
header("Content-Length: ".ob_get_length()); // send the length header
header("Connection: close"); // or redirect to some URL: header('Location: http://www.google.com');
ob_end_flush();flush(); // really send the content; the order can't change: 1. ob buffer to normal buffer, 2. normal buffer to output
//continue doing something on the server side
ob_start();
sleep(5);//the user won't wait for the 5 seconds
echo 'for diyism';//user can't see this
file_put_contents('/tmp/process.log', ob_get_contents());
ob_end_clean();
A better solution is to fork a background process. It is fairly straightforward on Unix/Linux:
<?php
echo "We'll email you as soon as this is done.";
system("php somestuff.php dude#thatplace.com >/dev/null &");
?>
You should look at this question for better examples:
PHP execute a background process
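If anything user-supplied ends up on that command line, escape it first; a minimal sketch (the script name and address are placeholders):
<?php
$email = 'dude@thatplace.com'; // placeholder
system('php somestuff.php ' . escapeshellarg($email) . ' > /dev/null 2>&1 &');
echo "We'll email you as soon as this is done.";
?>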
Assuming you have a Linux server and root access, try this. It is the simplest solution I have found.
Create a new directory for the following files and give it full permissions. (We can make it more secure later.)
mkdir test
chmod -R 777 test
cd test
Put this in a file called bgping.
#!/bin/sh
echo starting bgping
ping -c 15 www.google.com > dump.txt &
echo ending bgping
Note the &. The ping command will run in the background while the current process moves on to the echo command.
It will ping www.google.com 15 times, which will take about 15 seconds.
Make it executable.
chmod 777 bgping
Put this in a file called bgtest.php.
<?php
echo "start bgtest.php\n";
exec('./bgping', $output, $result);
echo "output:".print_r($output,true)."\n";
echo "result:".print_r($result,true)."\n";
echo "end bgtest.php\n";
?>
When you request bgtest.php in your browser, you should get the following response quickly, without waiting the roughly 15 seconds it takes the ping command to complete.
start bgtest.php
output:Array
(
[0] => starting bgping
[1] => ending bgping
)
result:0
end bgtest.php
The ping command should now be running on the server. Instead of the ping command, you could run a PHP script:
php -n -f largejob.php > dump.txt &
Hope this helps!
Here's a modification to Timbo's code that works with gzip compression.
// buffer all upcoming output
if(!ob_start("ob_gzhandler")){
define('NO_GZ_BUFFER', true);
ob_start();
}
echo "We'll email you as soon as this is done.";
//Flush here before getting content length if ob_gzhandler was used.
if(!defined('NO_GZ_BUFFER')){
ob_end_flush();
}
// get the size of the output
$size = ob_get_length();
// send headers to tell the browser to close the connection
header("Content-Length: $size");
header('Connection: close');
// flush all output
ob_end_flush();
ob_flush();
flush();
// if you're using sessions, this prevents subsequent requests
// from hanging while the background process executes
if (session_id()) session_write_close();
/******** background process starts here ********/
I'm on a shared host and fastcgi_finish_request is set up to exit scripts completely. I don't like the Connection: close solution either; using it forces a separate connection for subsequent requests, costing additional server resources. I read the Transfer-Encoding: chunked Wikipedia article and learned that 0\r\n\r\n terminates a response. I haven't thoroughly tested this across browser versions and devices, but it works on all 4 of my current browsers.
// Disable automatic compression
// #ini_set('zlib.output_compression', 'Off');
// #ini_set('output_buffering', 'Off');
// #ini_set('output_handler', '');
// #apache_setenv('no-gzip', 1);
// Chunked Transfer-Encoding & Gzip Content-Encoding
function ob_chunked_gzhandler($buffer, $phase) {
if (!headers_sent()) header('Transfer-Encoding: chunked');
$buffer = ob_gzhandler($buffer, $phase);
return dechex(strlen($buffer))."\r\n$buffer\r\n";
}
ob_start('ob_chunked_gzhandler');
// First Chunk
echo "Hello World";
ob_flush();
// Second Chunk
echo ", Grand World";
ob_flush();
ob_end_clean();
// Terminating Chunk
echo "\x30\r\n\r\n";
ob_flush();
flush();
// Post Processing should not be displayed
for($i=0; $i<10; $i++) {
print("Post-Processing");
sleep(1);
}
TL;DR Answer:
ignore_user_abort(true); //Safety measure so that the user doesn't stop the script too early.
$content = 'Hello World!'; //The content that will be sent to the browser.
header('Content-Length: ' . strlen($content)); //The browser will close the connection when the size of the content reaches "Content-Length", in this case, immediately.
ob_start(); //Content past this point...
echo $content;
//...will be sent to the browser (the output buffer gets flushed) when this code executes.
ob_end_flush();
ob_flush();
flush();
if(session_id())
{
session_write_close(); //Closes the session so subsequent requests don't block while this script keeps running.
}
//Anything past this point will be run without involving the browser.
Function Answer:
ignore_user_abort(true);
function sendAndAbort($content)
{
header('Content-Length: ' . strlen($content));
ob_start();
echo $content;
ob_end_flush();
ob_flush();
flush();
}
sendAndAbort('Hello World!');
//Anything past this point will be run without involving the browser.
You could try to do multithreading.
You could whip up a script that makes a system call (using shell_exec) that invokes the PHP binary with your worker script as the parameter. But I don't think that is the most secure way. Maybe you can tighten things up by chrooting the PHP process, among other things.
Alternatively, there's a class at PHPClasses that does that: http://www.phpclasses.org/browse/package/3953.html. But I don't know the specifics of the implementation.
Joeri Sebrechts' answer is close, but it destroys any existing content that may be buffered before you wish to disconnect, and it doesn't call ignore_user_abort properly, allowing the script to terminate prematurely. diyism's answer is good but is not generically applicable; e.g. you may have more or fewer output buffers than that answer handles, so it may simply not work in your situation and you won't know why.
This function allows you to disconnect any time (as long as headers have not been sent yet) and retains the content you've generated so far. The extra processing time is unlimited by default.
function disconnect_continue_processing($time_limit = null) {
ignore_user_abort(true);
session_write_close();
set_time_limit((int) $time_limit);//defaults to no limit
while (ob_get_level() > 1) {//only keep the last buffer if nested
ob_end_flush();
}
$last_buffer = ob_get_level();
$length = $last_buffer ? ob_get_length() : 0;
header("Content-Length: $length");
header('Connection: close');
if ($last_buffer) {
ob_end_flush();
}
flush();
}
If you need extra memory, too, allocate it before calling this function.
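Hypothetical usage: send a short acknowledgement, disconnect, then carry on:
echo json_encode(array('status' => 'started'));
disconnect_continue_processing(300); // allow up to 5 minutes of post-processing
// ... long-running work goes here; the client has already disconnected ...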
Note for mod_fcgid users (please, use at your own risk).
Quick Solution
The accepted answer of Joeri Sebrechts is indeed functional. However, if you use mod_fcgid you may find that this solution does not work on its own. In other words, when the flush function is called the connection to the client does not get closed.
The FcgidOutputBufferSize configuration parameter of mod_fcgid may be to blame. I have found this tip in:
this reply of Travers Carter and
this blog post of Seumas Mackinnon.
After reading the above, you may come to the conclusion that a quick solution would be to add the line (see "Example Virtual Host" at the end):
FcgidOutputBufferSize 0
in either your Apache configuration file (e.g., httpd.conf), your FCGI configuration file (e.g., fcgid.conf), or your virtual hosts file (e.g., httpd-vhosts.conf).
In (1) above, a variable named "OutputBufferSize" is mentioned. This is the old name of the FcgidOutputBufferSize mentioned in (2) (see the upgrade notes in the Apache web page for mod_fcgid).
Details & A Second Solution
The above solution disables the buffering performed by mod_fcgid either for the whole server or for a specific virtual host. This might lead to a performance penalty for your web site. On the other hand, this may well not be the case since PHP performs buffering on its own.
In case you do not wish to disable mod_fcgid's buffering there is another solution... you can force this buffer to flush.
The code below does just that by building on the solution proposed by Joeri Sebrechts:
<?php
ob_end_clean();
header("Connection: close");
ignore_user_abort(true); // just to be safe
ob_start();
echo('Text the user will see');
echo(str_repeat(' ', 65537)); // [+] Line added: fill up mod_fcgid's buffer.
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush(); // Strange behaviour, will not work
flush(); // Unless both are called !
// Do processing here
sleep(30);
echo('Text user will never see');
?>
What the added line of code essentially does is fill up mod_fcgid's buffer, thus forcing it to flush. The number "65537" was chosen because the default value of the FcgidOutputBufferSize directive is "65536", as mentioned on the Apache web page for the corresponding directive. Hence, you may need to adjust this value accordingly if another value is set in your environment.
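To avoid a magic number, you could derive the padding from a constant that mirrors your Apache setting (the constant name here is made up for illustration):
// Hypothetical constant; must match FcgidOutputBufferSize in your Apache configuration (default 65536).
define('FCGID_OUTPUT_BUFFER_SIZE', 65536);
echo str_repeat(' ', FCGID_OUTPUT_BUFFER_SIZE + 1); // overflow mod_fcgid's buffer so it flushes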
My Environment
WampServer 2.5
Apache 2.4.9
PHP 5.5.19 VC11, x86, Non Thread Safe
mod_fcgid/2.3.9
Windows 7 Professional x64
Example Virtual Host
<VirtualHost *:80>
DocumentRoot "d:/wamp/www/example"
ServerName example.local
FcgidOutputBufferSize 0
<Directory "d:/wamp/www/example">
Require all granted
</Directory>
</VirtualHost>
This worked for me:
// prevent Apache from killing the running PHP process
ignore_user_abort(true);
// start output buffering
ob_start();
echo "show something to user1";
//close session file on server side to avoid blocking other requests
session_write_close();
//send length header
header("Content-Length: ".ob_get_length());
header("Connection: close");
//really send content, can't change the order:
//1.ob buffer to normal buffer,
//2.normal buffer to output
ob_end_flush();
flush();
//continue doing something on the server side
ob_start();
//replace this with the background task
sleep(20);
OK, so basically, because of the way jQuery performs the XHR request, even the ob_flush method will not work: you are unable to run a function on each onreadystatechange. jQuery checks the state, then chooses the proper action to take (complete, error, success, timeout). And although I was unable to find a reference, I recall hearing that this does not work with all XHR implementations.
A method that I believe should work for you is a cross between the ob_flush and forever-frame polling.
<?php
function wrap($str)
{
return "<script>{$str}</script>";
}
ob_start(); // begin buffering output
echo wrap("console.log('test1');");
ob_flush(); // push current buffer
flush(); // this flush actually pushed to the browser
$t = time();
while($t > (time() - 3)) {} // wait 3 seconds
echo wrap("console.log('test2');");
?>
<html>
<body>
<iframe src="ob.php"></iframe>
</body>
</html>
And because the scripts are executed inline, as the buffers are flushed you get execution. To make this useful, change the console.log to a callback method defined in your main script, set up to receive data and act on it (a sketch follows). Hope this helps. Cheers, Morgan.
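As a sketch of that callback pattern, the parent page could define a hypothetical onChunk() receiver, and ob.php would emit calls to it as each buffer is flushed (all names here are made up for illustration):
<?php
// ob.php: each flushed chunk calls a handler on the parent page
function wrap($str)
{
    return "<script>parent.onChunk({$str});</script>";
}
ob_start();
echo wrap("'first result'");
ob_flush();
flush(); // the parent's onChunk() runs as soon as this chunk arrives
?>
<!-- parent page -->
<script>function onChunk(data) { console.log('received', data); }</script>
<iframe src="ob.php"></iframe>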
An alternative solution is to add the job to a queue and make a cron script which checks for new jobs and runs them.
I had to do it that way recently to circumvent limits imposed by a shared host - exec() et al. were disabled for PHP run by the web server, but worked in a shell script.
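A minimal sketch of the queue approach, assuming a file-based spool directory and a cron entry such as * * * * * php /path/to/worker.php (the paths and job format are hypothetical):
<?php
// enqueue.php - called by the web request; returns immediately
$job = array('task' => 'send_mail', 'to' => 'user@example.com');
file_put_contents('/var/spool/myapp/' . uniqid('job_', true) . '.json', json_encode($job));
echo "We'll email you as soon as this is done.";

// worker.php - run by cron; processes and removes pending jobs
foreach (glob('/var/spool/myapp/job_*.json') as $file) {
    $job = json_decode(file_get_contents($file), true);
    // ... perform $job['task'] here ...
    unlink($file);
}
?>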
Your problem can be solved by doing some parallel programming in php. I asked a question about it a few weeks ago here: How can one use multi threading in PHP applications
And got great answers. I liked one in particular very much. The writer referenced the Easy Parallel Processing in PHP tutorial (Sep 2008; by johnlim), which can actually solve your problem very well; I have already used it to deal with a similar problem that came up a couple of days ago.
If the flush() function does not work, you may need to set the following options in php.ini:
output_buffering = Off
zlib.output_compression = Off
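If you cannot edit php.ini, note that only some of these have runtime equivalents; output_buffering in particular is PHP_INI_PERDIR and usually cannot be changed from script code, so a common workaround is to unwind any buffers PHP has already opened (a sketch, not guaranteed on every host):
@ini_set('zlib.output_compression', 'Off'); // changeable at runtime
while (ob_get_level()) { ob_end_flush(); }  // unwind buffers opened by output_buffering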
Latest Working Solution
// client can see outputs if any
ignore_user_abort(true);
ob_start();
echo "success";
$buffer_size = ob_get_length();
session_write_close();
header("Content-Encoding: none");
header("Content-Length: $buffer_size");
header("Connection: close");
ob_end_flush();
ob_flush();
flush();
sleep(2);
ob_start();
// client cannot see the result of code below
After trying many different solutions from this thread (none of which worked for me), I found a solution on the official PHP.net page:
function sendResponse($response) {
ob_end_clean();
header("Connection: close\r\n");
header("Content-Encoding: none\r\n");
ignore_user_abort(true);
ob_start();
echo $response; // Actual response that will be sent to the user
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush();
flush();
if (ob_get_contents()) {
ob_end_clean();
}
}
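Hypothetical usage, echoing the original question's flow (the address is a placeholder):
sendResponse("We'll email you as soon as this is done.");
// The client already has its response; keep working server-side.
sleep(30);
mail('user@example.com', "Okay, I'm done", 'Yup, all done.');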
Couldn't get any of the above to work with IIS but:
With all its limitations, the built-in PHP -S web server comes to the rescue.
Caller script (IIS):
<?php
// a length limit is required!
$s = file_get_contents('http://127.0.0.1:8080/test.php', false, null, 0, 10);
echo $s;
Worker script (on the built-in web server at port 8080 - beware, it's single-threaded):
ob_end_clean();
header("Connection: close");
ignore_user_abort(true);
ob_start();
echo 'Text the user will see';
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush(); // All output buffers must be flushed here
flush(); // Force output to client
// Do processing here
sleep(5);
file_put_contents('ts.txt',date('H:i:s',time()));
//echo('Text user will never see');
Ugly enough? :)
Related
I'm setting up a new application in PHP that uses output buffering. The code works fine on my local server (WAMP), but when I upload it to the production server it doesn't work:
<?php
ob_end_clean();
header("Connection: close");
ignore_user_abort(true); // just to be safe
ob_start();
echo('Text the user will see ');
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush(); // Strange behaviour, will not work
flush(); // Unless both are called !
// Do processing here
sleep(6);
echo('Text user will never see');
?>
I expect the output to be 'Text the user will see', but the server doesn't close the connection, and the actual output is 'Text the user will see Text user will never see'.
Some servers use the FastCGI Process Manager. You may want to consider adding fastcgi_finish_request() to your code. From the documentation:
This function flushes all response data to the client and finishes the request. This allows for time consuming tasks to be performed without leaving the connection to the client open.
However, it's not available when the FastCGI Process Manager is not in use, which would result in errors, so you may want to call it conditionally:
if ( function_exists("fastcgi_finish_request") ) fastcgi_finish_request();
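A combined sketch that prefers fastcgi_finish_request() and falls back to the buffered Connection: close technique from earlier in this thread (the function name is made up; it assumes nothing has been output yet):
function finish_response_early($body) {
    ignore_user_abort(true);
    if (function_exists('fastcgi_finish_request')) {
        echo $body;
        fastcgi_finish_request(); // PHP-FPM: flushes the response and closes the request
        return;
    }
    // Fallback for other SAPIs: buffered Content-Length approach
    ob_start();
    echo $body;
    header('Content-Length: ' . ob_get_length());
    header('Connection: close');
    ob_end_flush();
    flush();
}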
OK, for the life of me I can't figure out why flush() isn't working. I've googled this issue EVERYWHERE with no success. At the moment I don't have nginx or FastCGI (which have been mentioned as having special needs). I have a php.ini override file in place to change these values:
output_buffering = Off
implicit_flush = 1
zlib.output_compression = 0
I've tried everything under the sun in the actual PHP, but this is what I currently have:
ignore_user_abort(true);
set_time_limit(0);
header('Content-Encoding: none;');
header('X-Accel-Buffering: no');
header("Connection: close");
echo 'success?';
ob_flush();
flush();
/******** background process starts here ********/
sleep(2);
echo 'dammit';
This is just test info, really. The end goal is to submit data via an AJAX POST using jQuery and send back a success while continuing to process the information. The user is NOT supposed to wait while the rest of the script runs. I can't even get this to work if I call the file directly! Both of the echoes spit out at the same time no matter how I work it!
Heavens forgive me for this code:
protected function detachBrowser()
{
ob_start();
// tell PHP to ignore if the browsers closes connection
@ignore_user_abort(true);
// check it worked
$defer = @ignore_user_abort();
// according to the docs, in some cases on IIS+CGI
// ignore_user_abort does not work
// If so, just abort.
if (!$defer)
{
throw new RuntimeException("Webserver does not support ignore_user_abort()");
}
// remove the buffer, even nested ones
while (ob_get_level()) ob_end_clean();
/* close the frigging connection with the browser, and help IE understand the message */
ob_start();
header('Content-Type: text/plain');
header('Content-Length: 0');
header("Content-Encoding: none\r\n");
header('Connection: close');
// we need all three, in this precise order
@flush();
@ob_end_flush();
@ob_flush();
}
Edit: I derived this code from DokuWiki, one of the first PHP projects to implement a web bug for an actually useful purpose (search indexing).
So this issue arises because "OutputBufferSize 0" needs to be set in the FastCGI configuration; otherwise mod_fcgid will do its own output buffering in addition to PHP's. This is something I didn't have control over on my hosting server. Hope this helps someone!
How do you close an AJAX request from PHP before the script ends? Example: the user requests php.php, which has the line echo "phpphp"; after this line the AJAX request finishes with the data "phpphp", but the PHP script keeps on running - without using processes or forking.
"How do I implement this scenario using PHP?" has the answer: set the Connection: close and Content-Length headers, then flush.
ob_start();
echo "111";
header("Content-Length: ".ob_get_length());
header("Connection: close");
ob_end_flush(); // flush the buffered output to the client before continuing
flush();
somescript();
Try streaming the PHP file, and once it hits a certain point, send a return to JS; that will close the JS connection and (perhaps?) allow this PHP file to continue.
Haven't tried it, but it's worth a shot.
// set as plain text
header('content-type:text/plain');
// let it stream
ob_implicit_flush(true);
ob_end_flush();
exit or any alias of exit will stop script execution.
Take a look at output buffering - using ob_start() with ob_flush() or simply flush() might do the trick.
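A minimal sketch of that combination:
ob_start();
echo "Working...\n";
ob_flush(); // move PHP's output buffer to the SAPI/web-server layer
flush();    // push the web server's buffer toward the client
// ... keep processing; the output above is already on its way ...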
I've been trying to complete a script that sends the proper notification to a user's browser to close the connection, but allows the server to keep processing the request. My code is based on what I've seen at:
http://www.php.net/manual/en/features.connection-handling.php#71172
and
close a connection early
ob_start();
echo ('Text the user will see');
$size = ob_get_length();
ignore_user_abort(); // optional
header('Content-Encoding: none');
header("Content-Length: $size");
header("Connection: close");
// flush all output
ob_end_flush();
ob_flush();
flush();
sleep(5);
//just a test to see if the script continues to run
file_put_contents("trash/".date('dmY-H_i_s_1').".txt", "Some text.");
file_put_contents("trash/".date('dmY-H_i_s_2').".txt", "Some text.");
file_put_contents("trash/".date('dmY-H_i_s_3').".txt", "Some text.");
When I go to run the script, sometimes it will create the first file but not write the text to it. Sometimes it doesn't create any files. If I run the script with the return-early code commented out, all three files are created just fine. Zlib compression is turned off. Any ideas?
The right way to do this is to use PHP in FastCGI mode; then you can use the function
fastcgi_finish_request();
it will do exactly what you need - it will close the connection to the browser but the rest of the script will continue to run to the end.
http://php-fpm.org/wiki/Features
It's much better than relying on output buffering.
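Applied to the original question, a sketch would look like this (the mail details are placeholders):
echo "We'll email you as soon as this is done.";
fastcgi_finish_request(); // the response is complete; the connection is released
// The script keeps running server-side:
mail('dude@thatplace.com', "Okay, I'm done", 'Yup, all done.');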
You can set your extra processing to take place with register_shutdown_function.
Basically, you register something to take place after your script finishes and sends all output to the user. That way the script can keep running for a while afterwards, doing whatever cleanup or other work is needed.
Update: My bad, that no longer applies after PHP 4.1.
The links you gave and the other stuff I found seem to state that you are doing it right. Are you sure there isn't an error or a script timeout going on? What does the PHP error log say?
ignore_user_abort() must have a non-null parameter for it to actually ignore user abort. Otherwise it just returns the current ignore setting without changing anything. So you'd need:
ignore_user_abort(true);
I am responsible for the backend portion of an API, written in PHP, which is primarily used by a Flash client. What happens right now is: the Flash client makes a call, the backend loads the necessary data, does any necessary processing and post processing, logging and caching and then returns the result to the client.
What I would like to have happen is return the data to the client as soon as possible, close the connection, and then do all the stuff that the client doesn't have to care about. This could make the API seem much more responsive. Following the suggestions here:
http://php.net/manual/en/features.connection-handling.php
actually works, except that I have to turn off gzip encoding in order to make it work, which isn't very practical. We use mod_deflate in Apache, so a solution that works with that would be ideal, but I would also consider a different method of gzipping our content if that is necessary.
It seems like there should be a way to let Apache know "I've sent you all the data I'm going to send," but I can't seem to find anything like that.
For those wondering, yes I can flush the results early, but the Flash client will not process them until the connection is closed.
You might try breaking it into two pages.
In the first page, do the necessary processing, then load the second page via curl, and die().
That would cause the first page to complete and close, independent of the second page processing.
ie:
Page 1:
<?php
// Do stuff
// Post or get second page...
// Send Data to client
die();
?>
Page 2:
<?php
// Do other stuff....
?>
See http://www.php.net/curl
There's a kind of hack to do this by placing the code you want to execute after the connection closes within a callback method registered via register_shutdown_function().
@Theo.T: since the comment system mangled the crap out of my code, I'm posting it here:
No luck. The following prints out the extra crap and takes the full execution time to close the connection when using mod_deflate:
function sleepLongTime() {
print "you can't see this";
sleep(30);
}
ob_end_clean();
register_shutdown_function('sleepLongTime');
header("Connection: close\r\n");
ignore_user_abort(true);
ob_start();
echo ('Text user will see');
ob_end_flush();
flush();
ob_end_clean();
die();
set_time_limit(0);
header("Connection: close");
header("Content-Length: " .(strlen($stream)+256));
ignore_user_abort(true);
echo $stream;
echo(str_repeat(' ',256));
#ob_flush();
#flush();
#ob_end_flush();
your_long_long_long_long_function_here();
This will tell the browser to close the connection once all of $stream has been received. But be careful not to echo anything before the headers, you know. :P
If you are sending binary data (e.g. SWF), you might need to remove the '+256' and the echo(str_repeat(' ',256)); line, but in that case the code 'might' fail if the data sent is less than 256 bytes.
Today I also ran into this case; after some testing, I found that this approach works:
Two steps:
Make sure the PHP script's output is not gzip-encoded. One way to do that:
<IfModule mod_env.c>
SetEnvIfNoCase Request_URI "\.php$" no-gzip dont-vary
</IfModule>
Add the above to the .htaccess file under the project's web site; this prevents Apache from gzipping the output automatically.
As some people said at features.connection-handling.php,
set_time_limit(0);
ignore_user_abort(true);
// buffer all upcoming output - make sure we care about compression:
if(!ob_start("ob_gzhandler"))
ob_start();
echo $stringToOutput;
// get the size of the output
$size = ob_get_length();
// send headers to tell the browser to close the connection
header("Content-Length: $size");
header('Connection: close');
// flush all output
ob_end_flush();
ob_flush();
flush();
// close current session
if (session_id()) session_write_close(); //close connection
// here, do what you want.