Can't get flush() working in Windows - php

We have an Ubuntu test server on which this simple script works fine, outputting a number to the browser each second:
<?
for ($i=0; $i<3; $i++)
{
    echo $i.'<br>';
    flush();
    sleep(1);
}
However, on my local development box (Windows environment) I cannot get it to work: it outputs everything after 3 seconds.
I am using the same PHP version (5.3.13) and Apache version (2.2.22) on my local box as on the test server.
I have mirrored most of the settings in both php.ini and httpd.conf.
Here are the settings I have set (identical to the test server):
output_buffering = 0
zlib.output_compression = Off
zlib.output_compression_level = -1
I don't have mod_gzip installed.
I have even tried doing a WAMP install on my box, and it didn't work with that either.
I can't imagine it being a client (browser -- Firefox 13) issue, as I use the exact same browser to view the script on the test server and it works fine.
I have tried enabling implicit flush, doing ob_flush, etc. No luck.
Anyone have any ideas on how to get this working?

Just keep in mind what @Wrikken said: this is not recommended.
Everything you're doing seems to be right. I copied your setup and tried it, and I faced the same problem. I executed the script from the command line, and it worked.
Then I ran the last test: Wireshark. From the first couple of packets it was clear that the server was sending everything correctly, so it had to be the browser's buffer.
So I tried sending some data before the loop, and guess what? It worked!
Here you go:
<?php
ini_set('output_buffering', 'off');
ini_set('zlib.output_compression', 0);
echo str_repeat(" ", 1024), "\n";
for ($i = 0; $i < 6; $i++) {
    echo $i."<br />\n";
    ob_flush();
    flush();
    sleep(1);
}
?>
I'm not sure about the application you have in mind, but this is clearly not a good idea, I highly recommend that you look into other options.
Update:
After a lengthy Google search I found this and this
Because a browser cannot correctly render a page without knowing how to construct the page's characters, most browsers buffer a certain number of bytes before executing any JavaScript or drawing the page.
And
To avoid these delays, you should always specify the character encoding as early as possible, for any HTML document larger than 1 KB (1024 bytes, to be precise, which is the maximum buffer limit used by any of the browsers we have tested).
So I tried sending the charset first instead of filling the buffer (and it worked):
<?php
ini_set('output_buffering', 'off');
ini_set('zlib.output_compression', 0);
//echo str_repeat(" ", 1024), "\n";
header('Content-Type: text/html; charset=iso-8859-1');
//Note that it shouldn't matter which charset you send
for ($i = 0; $i < 6; $i++) {
    echo $i."<br />\n";
    ob_flush();
    flush();
    sleep(1);
}
?>
So why was it working with the first server but not the second?
Most likely it's because your first server is sending the charset with the header while the second one isn't.
In your case, however, I'd make the following changes:
<?php
ini_set('output_buffering', 'off');
ini_set('zlib.output_compression', 0);
//Plain text MIME type since you'll use it for logging purposes;
//if you run it from the CLI, you can drop the whole header line
header('Content-Type: text/plain; charset=iso-8859-1');
for ($i = 0; $i < 6; $i++) {
    //No need to echo <br /> once you run it from the CLI
    echo $i."\n";
    ob_flush();
    flush();
    sleep(1);
}
?>

I have found a solution for my problem.
For some reason, adding the line
default_charset = "utf-8"
to my php.ini makes it work.
I confirmed by removing the line: flush didn't work. Added it back, and it worked.
Maybe it's because PHP then sends a charset in the headers, which is enough for the browser's buffer.
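If you'd rather not touch php.ini, a minimal per-script equivalent (a sketch based on the charset explanation above, not a guaranteed drop-in) is to send the charset in the Content-Type header before any output:
<?php
// Sketch: an explicit charset lets the browser start rendering early,
// which is what default_charset = "utf-8" achieves globally in php.ini.
header('Content-Type: text/html; charset=utf-8');
for ($i = 0; $i < 3; $i++) {
    echo $i.'<br>';
    flush();
    sleep(1);
}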

Related

Problems with PHP Output Buffer Flush

I have a PHP script which takes a while to process, and I wanted the site to display its progress while it is running.
I have been having problems displaying the contents of the output buffer while the PHP code is running. I have tried ob_implicit_flush(true), ob_end_flush() and ob_flush() with no success.
After looking through the PHP manual on php.net for a fair bit, I came across a comment by Lee saying:
As of August 2012, all browsers seem to show an all-or-nothing approach to buffering. In other words, while PHP is operating, no content can be shown.
In particular this means that the following workarounds listed further down here are ineffective:
1) ob_flush(), flush() in any combination with other output buffering functions;
2) changes to php.ini involving setting output_buffering and/or zlib.output_compression to 0 or Off;
3) setting Apache variables such as "no-gzip" either through apache_setenv() or through entries in .htaccess.
So, until browsers begin to show buffered content again, the tips listed here are moot.
see here
If he is correct, is there an alternative I can use to display the progress of the script running?
[edit]
Looks like Lee could be right. I have been doing some testing and here is an example of the problem
<?php
ob_implicit_flush(true);
ob_end_flush();
for ($i=0; $i<5; $i++) {
echo $i.'<br>';
sleep(1);
}
?>
This should display 0 to 4 with a second between each number. What actually happens is that the page waits 5 seconds and then displays 0 to 4 all at once; it waits for the script to complete, then displays everything in one go.
I have output_buffering = Off in my php.ini file, and I have tried setting it to 1 and 4096 (4096 being 4 KB); it still does the same.
Check the buffer size and make an "empty" dummy of that size, for example...
$flushdummy="";
for ($i=0; $i<700; $i++) { $flushdummy=$flushdummy." "; }
$flushdummy=$flushdummy."\n";
then, when output is desired, add the dummy to the output, plus the flush:
echo "here I am<br />", $flushdummy;
flush();
@ob_flush();

PHP file_get_contents and exit before page loads [duplicate]

I'm attempting to do an AJAX call (via jQuery) that will initiate a fairly long process. I'd like the script to simply send a response indicating that the process has started, but jQuery won't return the response until the PHP script is done running.
I've tried this with a "close" header (below), and also with output buffering; neither seems to work. Any guesses? Or is this something I need to do in jQuery?
<?php
echo( "We'll email you as soon as this is done." );
header( "Connection: Close" );
// do some stuff that will take a while
mail( 'dude@thatplace.com', "okay I'm done", 'Yup, all done.' );
?>
The following PHP manual page (incl. user-notes) suggests multiple instructions on how to close the TCP connection to the browser without ending the PHP script:
Connection handling Docs
Supposedly it requires a bit more than sending a close header.
The OP then confirmed: yup, this did the trick, pointing to user-note #71172 (Nov 2006), copied here:
Closing the user's browser connection whilst keeping your PHP script running has been an issue since [PHP] 4.1, when the behaviour of register_shutdown_function() was modified so that it would not automatically close the user's connection.
sts at mail dot xubion dot hu posted the original solution:
<?php
header("Connection: close");
ob_start();
phpinfo();
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush();
flush();
sleep(13);
error_log("do something in the background");
?>
Which works fine until you substitute phpinfo() for echo('text I want user to see'); in which case the headers are never sent!
The solution is to explicitly turn off output buffering and clear the buffer prior to sending your header information. Example:
<?php
ob_end_clean();
header("Connection: close");
ignore_user_abort(true); // just to be safe
ob_start();
echo('Text the user will see');
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush(); // Strange behaviour, will not work
flush(); // Unless both are called !
// Do processing here
sleep(30);
echo('Text user will never see');
?>
Just spent 3 hours trying to figure this one out, hope it helps someone :)
Tested in:
IE 7.5730.11
Mozilla Firefox 1.81
Later on, in July 2010, in a related answer Arctic Fire linked two further user-notes that were follow-ups to the one above:
Connection Handling user-note #89177 (Feb 2009)
Connection Handling user-note #93441 (Sep 2009)
It's necessary to send these 2 headers:
Connection: close
Content-Length: n (n = size of output in bytes )
Since you need to know the size of your output, you'll need to buffer your output, then flush it to the browser:
// buffer all upcoming output
ob_start();
echo 'We\'ll email you as soon as this is done.';
// get the size of the output
$size = ob_get_length();
// send headers to tell the browser to close the connection
header('Content-Length: '.$size);
header('Connection: close');
// flush all output
ob_end_flush();
ob_flush();
flush();
// if you're using sessions, this prevents subsequent requests
// from hanging while the background process executes
if (session_id()) {session_write_close();}
/******** background process starts here ********/
Also, if your web server is using automatic gzip compression on the output (i.e. Apache with mod_deflate), this won't work, because the actual size of the output is changed and the Content-Length is no longer accurate. Disable gzip compression for that particular script.
For more details, visit http://www.zulius.com/how-to/close-browser-connection-continue-execution
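For example, one way to disable compression just for that script (a sketch; apache_setenv() only exists when PHP runs as an Apache module, so treat that part as an assumption about your setup):
<?php
// Sketch: turn off per-request compression so the Content-Length stays accurate.
if (function_exists('apache_setenv')) {
    apache_setenv('no-gzip', '1'); // tell mod_deflate not to compress this response
}
ini_set('zlib.output_compression', 'Off'); // and disable PHP's own zlib compression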
You can use FastCGI with PHP-FPM and its fastcgi_finish_request() function. In this way, you can continue to do some processing while the response has already been sent to the client.
example of how to use fastcgi_finish_request() (Nov 2010)
You find this in the PHP manual here: FastCGI Process Manager (FPM). But that function specifically is not further documented in the manual. Here is the excerpt from the PHP-FPM: PHP FastCGI Process Manager wiki:
fastcgi_finish_request()
Scope: php function
Category: Optimization
This feature allows you to speed up the execution of some PHP requests. Acceleration is possible when there are actions in the process of script execution that do not affect the server response. For example, saving the session in memcached can occur after the page has been formed and passed to the web server. fastcgi_finish_request() is a PHP function that stops the response output. The web server immediately starts to transfer the response "slowly and sadly" to the client, while PHP at the same time can do a lot of useful things in the context of the request, such as saving the session, converting the downloaded video, handling all kinds of statistics, etc.
fastcgi_finish_request() can trigger the execution of shutdown functions.
Note: fastcgi_finish_request() has a quirk where calls to flush, print, or echo will terminate the script early.
To avoid that issue, you can call ignore_user_abort(true) right before or after the fastcgi_finish_request call:
ignore_user_abort(true);
fastcgi_finish_request();
Complete version:
ignore_user_abort(true); // prevent Apache from killing the running PHP process
ob_start(); // start output buffering
echo "show something to user";
session_write_close(); // close the session file on the server side to avoid blocking other requests
header("Content-Encoding: none"); // send header so the browser doesn't treat the content as gzipped
header("Content-Length: ".ob_get_length()); // send length header
header("Connection: close"); // or redirect to some url: header('Location: http://www.google.com');
ob_end_flush(); flush(); // really send the content; the order matters: 1. ob buffer to normal buffer, 2. normal buffer to output
// continue doing something on the server side
ob_start();
sleep(5); // the user won't wait for these 5 seconds
echo 'for diyism'; // the user can't see this
file_put_contents('/tmp/process.log', ob_get_contents());
ob_end_clean();
A better solution is to fork a background process. It is fairly straightforward on Unix/Linux:
<?php
echo "We'll email you as soon as this is done.";
system("php somestuff.php dude#thatplace.com >/dev/null &");
?>
You should look at this question for better examples:
PHP execute a background process
Assuming you have a Linux server and root access, try this. It is the simplest solution I have found.
Create a new directory for the following files and give it full permissions. (We can make it more secure later.)
mkdir test
chmod -R 777 test
cd test
Put this in a file called bgping.
echo starting bgping
ping -c 15 www.google.com > dump.txt &
echo ending bgping
Note the &. The ping command will run in the background while the current process moves on to the echo command.
It will ping www.google.com 15 times, which will take about 15 seconds.
Make it executable.
chmod 777 bgping
Put this in a file called bgtest.php.
<?php
echo "start bgtest.php\n";
exec('./bgping', $output, $result);
echo "output:".print_r($output,true)."\n";
echo "result:".print_r($result,true)."\n";
echo "end bgtest.php\n";
?>
When you request bgtest.php in your browser, you should get the following response quickly, without waiting about 15 seconds for the ping command to complete.
start bgtest.php
output:Array
(
[0] => starting bgping
[1] => ending bgping
)
result:0
end bgtest.php
The ping command should now be running on the server. Instead of the ping command, you could run a PHP script:
php -n -f largejob.php > dump.txt &
Hope this helps!
Here's a modification to Timbo's code that works with gzip compression.
// buffer all upcoming output
if (!ob_start("ob_gzhandler")) {
    define('NO_GZ_BUFFER', true);
    ob_start();
}
echo "We'll email you as soon as this is done.";
//Flush here before getting content length if ob_gzhandler was used.
if (!defined('NO_GZ_BUFFER')) {
    ob_end_flush();
}
// get the size of the output
$size = ob_get_length();
// send headers to tell the browser to close the connection
header("Content-Length: $size");
header('Connection: close');
// flush all output
ob_end_flush();
ob_flush();
flush();
// if you're using sessions, this prevents subsequent requests
// from hanging while the background process executes
if (session_id()) session_write_close();
/******** background process starts here ********/
I'm on a shared host and fastcgi_finish_request is set up to exit scripts completely. I don't like the Connection: close solution either; using it forces a separate connection for subsequent requests, costing additional server resources. I read the Transfer-Encoding: chunked Wikipedia article and learned that 0\r\n\r\n terminates a response. I haven't thoroughly tested this across browser versions and devices, but it works on all 4 of my current browsers.
// Disable automatic compression
// @ini_set('zlib.output_compression', 'Off');
// @ini_set('output_buffering', 'Off');
// @ini_set('output_handler', '');
// @apache_setenv('no-gzip', 1);
// Chunked Transfer-Encoding & Gzip Content-Encoding
function ob_chunked_gzhandler($buffer, $phase) {
    if (!headers_sent()) header('Transfer-Encoding: chunked');
    $buffer = ob_gzhandler($buffer, $phase);
    return dechex(strlen($buffer))."\r\n$buffer\r\n";
}

ob_start('ob_chunked_gzhandler');

// First Chunk
echo "Hello World";
ob_flush();

// Second Chunk
echo ", Grand World";
ob_flush();

ob_end_clean();

// Terminating Chunk
echo "\x30\r\n\r\n";
ob_flush();
flush();

// Post Processing should not be displayed
for ($i = 0; $i < 10; $i++) {
    print("Post-Processing");
    sleep(1);
}
TL;DR Answer:
ignore_user_abort(true); //Safety measure so that the user doesn't stop the script too early.
$content = 'Hello World!'; //The content that will be sent to the browser.
header('Content-Length: ' . strlen($content)); //The browser will close the connection when the size of the content reaches "Content-Length", in this case, immediately.
ob_start(); //Content past this point...
echo $content;
//...will be sent to the browser (the output buffer gets flushed) when this code executes.
ob_end_flush();
ob_flush();
flush();
if (session_id()) {
    session_write_close(); //Closes the session so subsequent requests aren't blocked.
}
//Anything past this point will be ran without involving the browser.
Function Answer:
ignore_user_abort(true);
function sendAndAbort($content)
{
header('Content-Length: ' . strlen($content));
ob_start();
echo $content;
ob_end_flush();
ob_flush();
flush();
}
sendAndAbort('Hello World!');
//Anything past this point will be ran without involving the browser.
You could try to do multithreading.
You could whip up a script that makes a system call (using shell_exec) that calls the PHP binary with the script to do your work as the parameter. But I don't think that is the most secure way. Maybe you can tighten things up by chrooting the PHP process, among other things.
Alternatively, there's a class at phpclasses that does that: http://www.phpclasses.org/browse/package/3953.html. But I don't know the specifics of the implementation.
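A minimal sketch of that idea (worker.php and the log path are hypothetical placeholders, not names from the original answer):
<?php
// Sketch: hand the slow work to a separate PHP process and return immediately.
$cmd = 'php ' . escapeshellarg(__DIR__ . '/worker.php')
     . ' > /tmp/worker.log 2>&1 &'; // redirect output and background the process
shell_exec($cmd);
echo "Job started; we'll email you when it's done.";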
Joeri Sebrechts' answer is close, but it destroys any existing content that may be buffered before you wish to disconnect. It doesn't call ignore_user_abort properly, allowing the script to terminate prematurely. diyism's answer is good but is not generically applicable; e.g. a person may have more or fewer output buffers than that answer handles, so it may simply not work in your situation and you won't know why.
This function allows you to disconnect any time (as long as headers have not been sent yet) and retains the content you've generated so far. The extra processing time is unlimited by default.
function disconnect_continue_processing($time_limit = null) {
    ignore_user_abort(true);
    session_write_close();
    set_time_limit((int) $time_limit); // defaults to no limit
    while (ob_get_level() > 1) { // only keep the last buffer if nested
        ob_end_flush();
    }
    $last_buffer = ob_get_level();
    $length = $last_buffer ? ob_get_length() : 0;
    header("Content-Length: $length");
    header('Connection: close');
    if ($last_buffer) {
        ob_end_flush();
    }
    flush();
}
If you need extra memory, too, allocate it before calling this function.
Note for mod_fcgid users (please, use at your own risk).
Quick Solution
The accepted answer of Joeri Sebrechts is indeed functional. However, if you use mod_fcgid you may find that this solution does not work on its own. In other words, when the flush function is called the connection to the client does not get closed.
The FcgidOutputBufferSize configuration parameter of mod_fcgid may be to blame. I have found this tip in:
this reply of Travers Carter and
this blog post of Seumas Mackinnon.
After reading the above, you may come to the conclusion that a quick solution would be to add the line (see "Example Virtual Host" at the end):
FcgidOutputBufferSize 0
in either your Apache configuration file (e.g., httpd.conf), your FCGI configuration file (e.g., fcgid.conf) or your virtual hosts file (e.g., httpd-vhosts.conf).
In (1) above, a variable named "OutputBufferSize" is mentioned. This is the old name of the FcgidOutputBufferSize mentioned in (2) (see the upgrade notes in the Apache web page for mod_fcgid).
Details & A Second Solution
The above solution disables the buffering performed by mod_fcgid either for the whole server or for a specific virtual host. This might lead to a performance penalty for your web site. On the other hand, this may well not be the case since PHP performs buffering on its own.
In case you do not wish to disable mod_fcgid's buffering there is another solution... you can force this buffer to flush.
The code below does just that by building on the solution proposed by Joeri Sebrechts:
<?php
ob_end_clean();
header("Connection: close");
ignore_user_abort(true); // just to be safe
ob_start();
echo('Text the user will see');
echo(str_repeat(' ', 65537)); // [+] Line added: Fill up mod_fcgid's buffer.
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush(); // Strange behaviour, will not work
flush(); // Unless both are called !
// Do processing here
sleep(30);
echo('Text user will never see');
?>
What the added line of code essentially does is fill up mod_fcgid's buffer, thus forcing it to flush. The number "65537" was chosen because the default value of the FcgidOutputBufferSize variable is "65536", as mentioned in the Apache web page for the corresponding directive. Hence, you may need to adjust this value accordingly if another value is set in your environment.
My Environment
WampServer 2.5
Apache 2.4.9
PHP 5.5.19 VC11, x86, Non Thread Safe
mod_fcgid/2.3.9
Windows 7 Professional x64
Example Virtual Host
<VirtualHost *:80>
    DocumentRoot "d:/wamp/www/example"
    ServerName example.local
    FcgidOutputBufferSize 0
    <Directory "d:/wamp/www/example">
        Require all granted
    </Directory>
</VirtualHost>
This worked for me:
//prevent Apache from killing the running PHP process
ignore_user_abort(true);
//start output buffering
ob_start();
echo "show something to user1";
//close the session file on the server side to avoid blocking other requests
session_write_close();
//send length header
header("Content-Length: ".ob_get_length());
header("Connection: close");
//really send the content; the order matters:
//1. ob buffer to normal buffer,
//2. normal buffer to output
ob_end_flush();
flush();
//continue doing something on the server side
ob_start();
//replace this with the background task
sleep(20);
OK, so basically, because of the way jQuery does the XHR request, even the ob_flush method will not work, since you are unable to run a function on each onreadystatechange. jQuery checks the state, then chooses the proper action to take (complete, error, success, timeout). And although I was unable to find a reference, I recall hearing that this does not work with all XHR implementations.
A method that I believe should work for you is a cross between the ob_flush and forever-frame polling.
<?php
function wrap($str)
{
return "<script>{$str}</script>";
};
ob_start(); // begin buffering output
echo wrap("console.log('test1');");
ob_flush(); // push current buffer
flush(); // this flush actually pushed to the browser
$t = time();
while($t > (time() - 3)) {} // wait 3 seconds
echo wrap("console.log('test2');");
?>
<html>
<body>
<iframe src="ob.php"></iframe>
</body>
</html>
And because the scripts are executed inline, as the buffers are flushed, you get execution. To make this useful, change the console.log to a callback method defined in your main script, set up to receive data and act on it. Hope this helps. Cheers, Morgan.
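For instance, a sketch of what that callback wiring might look like on the PHP side (parent.handleChunk is a hypothetical function you would declare in the page holding the iframe; ob.php is the script above):
<?php
// Sketch: instead of console.log, call a callback defined in the parent page.
function wrap($str)
{
    return "<script>{$str}</script>";
}
ob_start();
echo wrap("parent.handleChunk('step 1 done');");
ob_flush();
flush();  // push the first <script> block to the browser now
sleep(3); // stand-in for the long-running work
echo wrap("parent.handleChunk('step 2 done');");
ob_flush();
flush();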
An alternative solution is to add the job to a queue and make a cron script which checks for new jobs and runs them.
I had to do it that way recently to circumvent limits imposed by a shared host: exec() et al. were disabled for PHP run by the webserver, but they could run in a shell script.
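A rough sketch of that pattern, assuming a simple file-based queue (the directory, job format and worker name are made up for illustration):
<?php
// Web request: enqueue the job and return immediately.
$job = array('email' => 'user@example.com', 'task' => 'long_report');
file_put_contents(
    '/var/spool/myqueue/' . uniqid('job_', true) . '.json',
    json_encode($job)
);
echo "We'll email you as soon as this is done.";

// worker.php, run from cron every minute or so:
// foreach (glob('/var/spool/myqueue/*.json') as $file) {
//     $job = json_decode(file_get_contents($file), true);
//     // ... do the long-running work here, then:
//     unlink($file);
// }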
Your problem can be solved by doing some parallel programming in PHP. I asked a question about it a few weeks ago here: How can one use multi threading in PHP applications
I got great answers, and I liked one in particular very much. The writer referenced the Easy Parallel Processing in PHP tutorial (Sep 2008, by johnlim), which can actually solve your problem very well; I have already used it to deal with a similar problem that came up a couple of days ago.
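The general idea there, as far as I recall, is to fire an HTTP request at a helper script without waiting for it to finish. A rough sketch of that approach with cURL; the worker URL and timeout are assumptions, and the worker itself should call ignore_user_abort(true) so it keeps running after the caller disconnects:
<?php
// Sketch: fire-and-forget an HTTP request to a hypothetical worker script.
$ch = curl_init('http://localhost/worker.php?job=42');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 1); // give up after a second instead of waiting
curl_exec($ch);                       // the worker carries on server-side
curl_close($ch);
echo "Job 42 started.";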
If the flush() function does not work, you may need to set the following options in php.ini:
output_buffering = Off
zlib.output_compression = Off
Latest Working Solution
// client can see outputs if any
ignore_user_abort(true);
ob_start();
echo "success";
$buffer_size = ob_get_length();
session_write_close();
header("Content-Encoding: none");
header("Content-Length: $buffer_size");
header("Connection: close");
ob_end_flush();
ob_flush();
flush();
sleep(2);
ob_start();
// client cannot see the result of code below
After trying many different solutions from this thread (none of them worked for me), I found a solution on the official PHP.net page:
function sendResponse($response) {
    ob_end_clean();
    header("Connection: close\r\n");
    header("Content-Encoding: none\r\n");
    ignore_user_abort(true);
    ob_start();
    echo $response; // Actual response that will be sent to the user
    $size = ob_get_length();
    header("Content-Length: $size");
    ob_end_flush();
    flush();
    if (ob_get_contents()) {
        ob_end_clean();
    }
}
Couldn't get any of the above to work with IIS, but:
With all its limitations, the built-in PHP -S webserver comes to the rescue.
Caller script (IIS)
<?php
// Note: the length limit on file_get_contents is required!
$s = file_get_contents('http://127.0.0.1:8080/test.php', false, null, 0, 10);
echo $s;
Worker script (built-in webserver at port 8080 - beware, it's single-threaded):
ob_end_clean();
header("Connection: close");
ignore_user_abort(true);
ob_start();
echo 'Text the user will see';
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush(); // All output buffers must be flushed here
flush(); // Force output to client
// Do processing here
sleep(5);
file_put_contents('ts.txt',date('H:i:s',time()));
//echo('Text user will never see');
Ugly enough? :)

PHP flush and WAMP server

I can't for the life of me get the PHP flush function to work properly using WAMP. Here is some sample code; commented out are all of the different things I've tried:
//apache_setenv('no-gzip', 1); // returns error that apache_setenv does not exist
//ini_set('zlib.output_compression',0);
//ini_set('implicit_flush',1);
//ob_end_clean();
//for ($i = 0; $i < ob_get_level(); $i++) { ob_end_flush(); }
//ob_implicit_flush(1);
set_time_limit(0);
echo "<pre>";
for ($i = 0; $i < 100; ++$i) {
    echo $i.' '.time().str_repeat(' ',256)."\n";
    //ob_flush(); // returns error without output buffering enabled
    flush();
    usleep(100000);
}
It seems no matter what I do, I always get the results all together in one giant chunk.
Edit:
I've uploaded the exact same code to a server running on cPanel/Linux, and it works perfectly in all browsers. Why can't I get it to work properly on a localhost WAMP server?
flush() may not be able to override the buffering scheme of your web server and it has no effect on any client-side buffering in the browser. It also doesn't affect PHP's userspace output buffering mechanism. This means you will have to call both ob_flush() and flush() to flush the ob output buffers if you are using those.
Several servers, especially on Win32, will still buffer the output from your script until it terminates before transmitting the results to the browser.
Server modules for Apache like mod_gzip may do buffering of their own that will cause flush() to not result in data being sent immediately to the client.
Even the browser may buffer its input before displaying it. Netscape, for example, buffers text until it receives an end-of-line or the beginning of a tag, and it won't render tables until the closing </table> tag of the outermost table is seen.
Some versions of Microsoft Internet Explorer will only start to display the page after they have received 256 bytes of output, so you may need to send extra whitespace before flushing to get those browsers to display the page.
php.net
Try using ob_flush() before flush();
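Putting the pieces from this thread together, here is a sketch of what usually gets incremental output going on a default WAMP install (the 1024-byte padding figure is a browser-dependent assumption, not a WAMP requirement):
<?php
header('Content-Type: text/html; charset=utf-8'); // helps the browser start rendering early

// drop any userland output buffers so flush() reaches Apache directly
while (ob_get_level() > 0) {
    ob_end_flush();
}

echo str_repeat(' ', 1024); // padding for browsers that buffer the first KB

for ($i = 0; $i < 10; $i++) {
    echo $i.' '.time()."<br>\n";
    flush();
    sleep(1);
}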

flush() doesn't work in Firefox 4

I noticed that PHP's flush() doesn't work in Firefox 4 beta 7 the way it works in 3.6.12.
I recently installed Firefox 4 beta 7, and the contents are not being flushed immediately when flush() is called. It used to work fine in 3.6.12. Is there anything else that could provide me with the flushing functionality?
I've tried
flush();
@ob_flush();
I also tried the following code at the top of the page.
@apache_setenv('no-gzip', 1);
@ini_set('zlib.output_compression', 0);
@ini_set('implicit_flush', 1);
for ($i = 0; $i < ob_get_level(); $i++) { ob_end_flush(); }
ob_implicit_flush(1);
By the way, I use PHP on XAMPP/Apache. Thanks.
I found that setting the content type to text/plain works, but then it just outputs plain text and not HTML content.
You're not seeing ghosts - I've experienced the same difference between FF3.6 and FF4.
Here's a workaround: add an
echo str_repeat(" ", 1024);
before the output that needs to be flushed. You can put it for example in the <head>.
My theory is that FF4, like apparently IE and Safari, has a small buffer that needs to be filled before incremental rendering kicks in.
flush will function identically server-side regardless of the browser. If the client is displaying things differently, there's not a lot you can do server-side to fix it.

PHP immediate echo

I have quite a long data mining script, and in parts of it I echo some information to the page (during a foreach loop, actually).
However, I am noticing that the information is being sent to the browser not immediately as I had hoped, but in 'segments'.
Is there some function I can use after my echo to send all the data to the browser immediately?
Thanks.
You probably want flush(). However, PHP may be using output buffering. There are a few ways that this can change things, but in a nutshell, you can flush(), then ob_flush().
You can try using flush() after each echo, but even that won't guarantee a write to the client depending on the web server you're running.
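A minimal illustration of using both calls inside a loop (a sketch; most examples in this thread call ob_flush() and then flush()):
<?php
ob_start(); // assume output buffering is active, as it often is by default
foreach (range(1, 5) as $row) {
    echo "processed row {$row}<br>\n";
    ob_flush(); // move PHP's output buffer along
    flush();    // ask the web server to send it to the client now
    sleep(1);   // stand-in for the real per-row work
}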
Yes, padding your output to 1024 bytes will cause most browsers to start displaying the content.
But we also learn from @nobody's answer to the question "How to flush output after each `echo` call?" that the 1024-byte browser buffering effect only happens when the browser has to guess the character encoding of the page, which can be prevented by sending the proper Content-Type header (e.g. "Content-Type: text/html; charset=utf-8"), or by specifying the content charset through appropriate html meta tags. And it worked as well for me in all browsers.
So basically, all one need to do is:
header('Content-Type: text/html; charset=utf-8');
ob_implicit_flush(true);
With no requirement for extra padding or flushing, which is of great cosmetic benefit for the code! Of course, headers have to be sent before any content, and one also has to make sure no output buffering is going on.
Problem definitely solved for me! Please (+1) @nobody's answer on the other question as well if it works for you. If one still encounters problems, though, I suggest checking out the answers to that other question for other specific situations that might presumably prevent implicit flushing from working correctly.
Note also that some browsers won't start displaying anything until the body of the response contains a certain amount of data - like 256 or 1024 bytes. I have seen applications before that pad data with a 1024 character long comment near the top of the page, before they do a flush. It's a bit of a hack, but necessary.
This applies to Internet Explorer and Safari IIRC.
So, in short (see the sketch after this list):
1) If it is the first flush, make sure you have output at least 1024 bytes so far (not including HTTP headers).
2) Call flush().
3) If you can determine that there is output buffering in place, issue ob_flush().
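A sketch of those three steps together (the 1024-byte figure is the browser threshold mentioned above):
<?php
// 1) pad the very first output past the typical browser buffer threshold
echo str_repeat(' ', 1024);
echo "Working on it...<br>\n";
// 2) and 3): flush the userland buffer if one is in place, then flush to the client
if (ob_get_level() > 0) {
    ob_flush();
}
flush();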
I like to just use
while (ob_get_level()) ob_end_flush();
near the start of my script somewhere, and then just
flush();
whenever I want to flush. This assumes that you don't want any output buffering at all, even if it was set up before your script (such as in a PHP.ini or htaccess configuration).
You should be able to use something like this to force output to be sent immediately. Put it at the part of the code where you want the output to be sent.
flush();
ob_flush();
Phew! I finally found the answer to Google Chrome's buffer issue! Thanks to boysmakesh for the push in the right direction. Here's the function I use:
function buffer_flush(){
    echo str_pad('', 512);
    echo '<!-- -->';
    if (ob_get_length()) {
        @ob_flush();
        @flush();
        @ob_end_flush();
    }
    @ob_start();
}
And this is how I call it:
show_view('global', 'header'); // Echoes the <html><head>... tags and
// includes JS and CSS.
show_view('global', 'splash_screen'); // Shows a loading image telling
// the user that everything's okay.
buffer_flush(); // Pretty obvious. At this point the loading view shows
// up on every browser I've tested (Chrome, Firefox,
// IE 7 & 8)
show_view('global', 'main'); // Has a loop that echos "Test $i<br>" 5
// times and calls buffer_flush() each time.
show_view('global', 'footer'); // End the html page and use JQuery to
// fade out the loading view.
To make this work perfectly in Google Chrome,
try this:
$i = 0;
$padstr = str_pad("", 512, " ");
echo $padstr;
while ($i <= 4) {
    $padstr = str_pad("", 512, " ");
    echo $padstr;
    echo "boysmakesh <BR> ";
    flush();
    sleep(2);
    $i = $i + 1;
}
We are sending 512 bytes before EACH echo. Don't forget to put <BR> at the end of the content before you flush; otherwise it won't work in Chrome (though it does in IE).
The amount of padding needed is browser dependent: for some browsers 256 bytes is enough, but some need 1024 bytes. For Chrome it is 512.
ignore_user_abort(TRUE); // run script in background
set_time_limit(0); // run script forever
$interval=150000;
$i = 0;
if(
strpos($_SERVER["HTTP_USER_AGENT"], "Gecko") or
strpos($_SERVER["HTTP_USER_AGENT"], "WebKit")
){
# important to change browser into quirks mode
echo '<?xml version="1.0" encoding="iso-8859-1"?><!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">';
}
function buffer_flush(){
echo "\n\n<!-- Deal with browser-related buffering by sending some incompressible strings -->\n\n";
for ( $i = 0; $i < 5; $i++ )
echo "<!-- abcdefghijklmnopqrstuvwxyz1234567890aabbccddeeffgghhiijjkkllmmnnooppqqrrssttuuvvwwxxyyzz11223344556677889900abacbcbdcdcededfefegfgfhghgihihjijikjkjlklkmlmlnmnmononpopoqpqprqrqsrsrtstsubcbcdcdedefefgfabcadefbghicjkldmnoepqrfstugvwxhyz1i234j567k890laabmbccnddeoeffpgghqhiirjjksklltmmnunoovppqwqrrxsstytuuzvvw0wxx1yyz2z113223434455666777889890091abc2def3ghi4jkl5mno6pqr7stu8vwx9yz11aab2bcc3dd4ee5ff6gg7hh8ii9j0jk1kl2lmm3nnoo4p5pq6qrr7ss8tt9uuvv0wwx1x2yyzz13aba4cbcb5dcdc6dedfef8egf9gfh0ghg1ihi2hji3jik4jkj5lkl6kml7mln8mnm9ono -->\n\n";
while ( ob_get_level() )
ob_end_flush();
if(ob_get_length()){
@ob_flush();
@flush();
@ob_end_flush();
}
@ob_start();
}
ob_start();
do{
if($i<10){
buffer_flush();
echo ". ";
buffer_flush();
usleep($interval);
} else {
echo sprintf("<pre>%s</pre>", print_r($_SERVER,true));
break;
}
$i++;
}while(true);
Running PHP 5.5 on IIS 7 with IE 11 (Windows Server), I found this worked as the opening lines of the file. Note that putting the while statement before the header caused a "headers already sent" error.
header('Content-Type: text/html; charset=utf-8');
while (ob_get_level()) ob_end_flush();
ob_implicit_flush(true);
Further references to ob_flush() in the script caused a "buffer does not exist" error.
This worked fine when I was processing a file and sending SQL statements to the browser; however, when I hooked up the db (MS Server 2008), nothing was returned until the script had completed.
This combination finally worked for me, based on thomasrutter's answer:
while (ob_get_level()) ob_end_flush();
ob_implicit_flush(true);
