I have an infinite loop in which I want to sleep for a second each iteration, but the browser just hangs and nothing is echoed. Can someone tell me why?
while (1)
{
    sleep(1);
    echo 'test' . '<br>';
}
It doesn't output anything until the server stops processing.
You could try flushing the output buffer with flush(), but it's not guaranteed to work.
Try this function from PHP.net:
function flush_buffers() {
    ob_end_flush();
    ob_flush();
    flush();
    ob_start();
}
That's because PHP sends data to the browser in big chunks. It's all about optimization: sending small pieces of data is bad for performance, so you want the data sent in blocks large enough that the overhead of transferring them (both in speed and in actual traffic) stays relatively low. A couple of small strings just isn't big enough. Try adding a flush() right after the echo; it will force PHP to send that string to the browser.
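A minimal sketch of that fix, assuming no output compression and no extra buffering layers between PHP and the browser (the loop is the asker's; the buffer-draining preamble is an addition):

```php
<?php
// Drain any implicit output buffers (output_buffering in php.ini) first,
// so flush() actually reaches the web server.
while (ob_get_level() > 0) {
    ob_end_flush();
}

while (1) {
    echo 'test' . '<br>';
    flush();  // push PHP's write buffer out after every echo
    sleep(1);
}
```

Note that the web server itself (mod_deflate, FastCGI or proxy buffering) can still hold the output back.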
Because it's an infinite loop, it will never send the response or break out.
Related
On my website (running Drupal), the ob_flush() function takes a long time (between 10 and 100 seconds) to execute. How do I find out why? What can cause it to take this long?
Try this:
ob_start();
//Your code to generate the output
$result = ob_get_contents(); //save the contents of output buffer to a string
ob_end_clean();
echo $result;
It runs quickly for me.
[You may want to tag your question with Drupal, since this feels like it might be a Drupal issue. Specifically, I suspect that when you flush the buffer, you're writing to an outer buffer, which triggers a ton of hooks to be called to filter the data you've just written.]
I suspect that your problem is nested buffers. Drupal really likes buffers and buffers everything all over the place. Check the result of:
echo "<pre>\nBuffering level: "
    . ob_get_level()
    . "\nBuffer status:\n"
    . var_export(ob_get_status(TRUE), TRUE) // var_dump() prints directly and returns nothing
    . "\n</pre>";
If you've got nested buffers, then I suspect ob_flush() will do nothing for you: it just appends the contents of your inner buffer into the next outermost layer of buffering.
Nested buffers can come from Drupal itself (which the above will show), or from the settings for zlib-output-compression and output_buffering (try twiddling those, see if it changes anything).
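For reference, those two directives live in php.ini (these are the actual setting names; Off disables the extra buffering layer in each case):

```ini
; php.ini
zlib.output_compression = Off
output_buffering = Off
```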
If your buffers are not nested, and the above settings do not help, then you may also want to split the operation into pieces, and run the profiler there, to see which part is taking the time:
$data = ob_get_contents(); // Return the contents of the output buffer.
ob_end_clean();            // Erase the buffer and close it (there is no ob_end() function).
echo $data;                // Output our data - only works if there's no outer buffer!
ob_start();                // Start our buffer again.
The question then becomes, though, "what are you trying to accomplish?" What do you think ob_flush() is doing here? Because if the answer is "I want to push everything I've done so far to the browser"... then I'm afraid that ob_flush() just isn't the right way.
Set
output_buffering = Off
in php.ini, use
<?php ob_start(); ?>
at the beginning of the page and
<?php ob_flush(); ?>
at the end of the page, to solve this problem.
I just want to print a count from 1 to 10 with an interval of 10 seconds between each integer.
eg.
$i = 10; // time delay in seconds
for ($j = 1; $j < 11; $j++)
{
    echo $j;
    // do something to delay the execution by $i seconds
}
I have tried everything including flush(), ob_flush() and ob_implicit_flush(), but all I get is a frozen screen until the whole script has finished.
http://php.net/manual/en/function.sleep.php
The sleep function will pause execution of your script.
But have you considered using JavaScript for something like this? Your script may hit the maximum execution time, and it will be hogging resources on the server. Use the client's resources instead!
What you want is much more JavaScript-related than PHP. Because PHP is server-side, it is not designed for this kind of operation. You COULD get it to work, but it would not be very pretty.
In my opinion, counting from 1 to 10 should not involve the server at all. You can do this directly in the browser, hence: use JavaScript.
Do you want to print the countdown while your PHP script is running?
If yes, then try this non-recommended fragment:
ob_start();
for ($i = 0; $i < 10; $i++) {
    echo str_repeat(" ", 10000);
    echo 'printing...<br />';
    ob_flush();
    flush();
    sleep(1);
}
You see the strange line:
echo str_repeat(" ", 10000);
It seems that browsers need a certain amount of data before they decide to actually render what you flush.
Use JavaScript for real-time counters.
Use jQuery. In $(document).ready, add a delay of 10 seconds before showing a specific div that contains the info which should appear after 10 seconds.
For ready - http://api.jquery.com/ready/
For delay - http://api.jquery.com/delay/
Yes, use JavaScript, as it's not possible to accomplish this task reliably with PHP over HTTP because of output buffering.
I want to read everything from a text file and echo it. But more lines might be written to the file while I'm reading, so I don't want the script to exit when it reaches the end of the file; instead I want it to wait forever for more lines. Is this possible in PHP?
This is just a guess, but try passing through (passthru) the output of "tail -f".
You will need to find a way to flush() your buffer, though.
IMHO a much nicer solution would be to build an AJAX site.
Read the contents of the file into an array, store the number of lines in the session, and print the content of the file.
Then fire an AJAX request every x seconds to a script which checks the file; if the line count is greater than the count stored in the session, append the new lines to the page.
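A rough sketch of that polling endpoint (the file name, session key, and JSON response shape are all illustrative assumptions, not from the answer):

```php
<?php
// poll.php - hypothetical endpoint returning lines added since the last poll.
session_start();

$file  = 'text.txt';                                     // assumed log file
$lines = file($file, FILE_IGNORE_NEW_LINES) ?: array();
$seen  = isset($_SESSION['line_count']) ? $_SESSION['line_count'] : 0;

$new = array_slice($lines, $seen);                       // lines the client hasn't seen yet
$_SESSION['line_count'] = count($lines);

header('Content-Type: application/json');
echo json_encode($new);
```

The client-side request then just appends whatever array comes back, every x seconds.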
You could use popen() instead:
$f = popen("tail -f /where/ever/your/file/is 2>&1", 'r');
while (!feof($f)) {
    $buffer = fgets($f); // fgets() keeps the trailing newline, so no extra "\n" is needed
    echo $buffer;
    flush();
    sleep(1);
}
pclose($f);
The sleep is important; without it you will sit at 100% CPU the whole time.
In fact, when you echo something, it goes into the buffer. So what you want is to "append" the new content while the browser is still receiving output, and this is not possible (but there are some approaches to it).
I solved it.
The trick was to use fopen, and when EOF is reached, move the cursor back to the previous position and continue reading from there.
<?php
$handle = fopen('text.txt', 'r');
$lastpos = 0;
while (true) {
    if (!feof($handle)) {
        echo fread($handle, 8192);
        flush();
        $lastpos = ftell($handle);
    } else {
        fseek($handle, $lastpos); // clears the EOF flag so reading can resume
    }
}
?>
It still consumes quite a lot of CPU, though; I don't know how to solve that.
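One way to bring the CPU usage down (a sketch, not part of the original solution): sleep briefly whenever EOF is hit, and clear PHP's stat cache so the next check sees new data:

```php
<?php
$handle  = fopen('text.txt', 'r');
$lastpos = 0;
while (true) {
    if (!feof($handle)) {
        echo fread($handle, 8192);
        flush();
        $lastpos = ftell($handle);
    } else {
        clearstatcache();         // PHP caches stat() results between checks
        usleep(200000);           // wait 200 ms instead of busy-spinning
        fseek($handle, $lastpos); // clears the EOF flag so reading can resume
    }
}
```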
You may also use filemtime(): store the latest modification timestamp, send the output, and at the end compare the stored filemtime with the current one.
Anyway, if you want the script to keep pace with the browser (or client), you should send the output in chunks (fread, flush), then check for changes at the end. If there are any, re-open the file and read from the latest position (which you can get with ftell() outside the while(!feof()) loop).
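The filemtime() comparison might look roughly like this (a sketch; clearstatcache() is needed because PHP caches the result of filemtime() within a request):

```php
<?php
$file = 'text.txt';
$sent = filemtime($file);      // timestamp at the moment we sent the output

// ... send the file in chunks with fread() + flush() ...

clearstatcache();              // otherwise filemtime() returns the cached value
if (filemtime($file) > $sent) {
    // The file changed while we were sending: re-open it and continue
    // reading from the last position recorded with ftell().
}
```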
I always use an output variable in PHP where I gather all the content before echoing it. Then I read somewhere (I don't remember where, though) that you get the best performance if you split the output variable into packets and echo each packet instead of the whole variable at once.
Is that really the case?
If you are outputting really big strings with echo, it is better to use multiple echo statements.
This is because of the way Nagle's algorithm causes data to be buffered over TCP/IP.
Found a note on PHP bugs about it:
http://bugs.php.net/bug.php?id=18029
This will automatically break up big strings into smaller chunks and echo them out:
function echobig($string, $bufferSize = 8192) {
$splitString = str_split($string, $bufferSize);
foreach($splitString as $chunk) {
echo $chunk;
}
}
Source: http://wonko.com/post/seeing_poor_performance_using_phps_echo_statement_heres_why
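For example, wrapping the output in an output buffer shows that the chunked echo produces byte-identical output (the small buffer size here is just to make the chunking visible):

```php
<?php
function echobig($string, $bufferSize = 8192) {
    foreach (str_split($string, $bufferSize) as $chunk) {
        echo $chunk;
    }
}

ob_start();
echobig(str_repeat('ab', 10), 4);         // 20 bytes, echoed in 5 chunks of 4
$out = ob_get_clean();
var_dump($out === str_repeat('ab', 10));  // bool(true): output is unchanged
```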
I think a better solution is presented in this comment on the same post:
http://wonko.com/post/seeing_poor_performance_using_phps_echo_statement_heres_why#comment-5606
Guys, I think I narrowed it down even further!
As previously said, PHP buffering will let PHP race to the end of your script, but after that it will still “hang” while trying to pass all that data to Apache.
Now I was able, not only to measure this (see previous comment) but to actually eliminate the waiting period inside of PHP. I did that by increasing Apache’s SendBuffer with the SendBufferSize directive.
This pushes the data out of PHP faster. I guess the next step would be to get it out of Apache faster but I’m not sure if there is actually another configurable layer between Apache and the raw network bandwidth.
This is my version of the solution: it echoes only if the connection has not been aborted. If the user disconnects, the function exits.
<?php
function _echo() {
    $args = func_get_args();
    foreach ($args as $arg) {
        $strs = str_split($arg, 8192);
        foreach ($strs as $str) {
            if (connection_aborted()) {
                break 2;
            }
            echo $str;
        }
    }
}
_echo('this','and that','and more','and even more');
_echo('or just a big long text here, make it as long as you want');
How do I go about reading data that is being sent almost constantly to my server?
The protocol is UDP. If I try to read in a while(1) loop, I don't get anything; it seems like nothing is echoed until all the reading is done, so it waits until the loop finishes reading, which it never will. I want socket_read() to echo immediately when it gets the data. Here is the code that doesn't work. Thanks in advance.
<?php
$sock = socket_create(AF_INET, SOCK_DGRAM, SOL_UDP);
socket_bind($sock, $local, $port) or die('Could not bind to address');

// this is where the reading loop should go
while (1)
{
    echo socket_read($sock, 1024);
}
socket_close($sock);
?>
Try calling flush() immediately after that echo statement.
Something like this might help:
do {
    echo socket_read($handle, 1024);
    $status = socket_get_status($handle);
} while ($status['unread_bytes']);
OR
while ($buffer = @socket_read($sock, 512, PHP_NORMAL_READ))
    echo $buffer;
The PHP manual entry on socket_read() is a little vague about how much (if any) internal buffering it does. Given that you are passing 1024 as the length, it should return after receiving no more than 1024 bytes of data.
Disclaimer: the following is just speculation, as I have no knowledge of the internal implementation of socket_read().
If the socket_read() function is using its length parameter as a hint for an internal buffer size, you might see bad performance with small UDP packets. For example, if socket_read() waits for 1024 bytes of data regardless of the size of the packets, if you are constantly receiving 60 byte UDP packets it'll take a while for the buffer to fill and the function to return.
(Note: after looking up the "unread_bytes" field mentioned by Tim, it looks like PHP does keep internal buffers, but it makes no mention of how large or small those might be.)
In this case, socket_read() will return larger chunks of data once its buffers fill, reducing processing resource consumption at the expense of higher latency. If you need the packets as fast as possible, perhaps setting a lower length would work: that would force socket_read() to return sooner, albeit at the expense of executing your loop more often. Also, if you set the length too low, your socket_read() calls might start returning incomplete packets, so you'll have to account for that in your code. (If that matters for your application, of course.)
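If per-datagram latency is the concern, socket_recvfrom() may be a better fit for UDP than socket_read(): it returns as soon as one datagram arrives, regardless of the length argument (a sketch; the address and port are placeholders):

```php
<?php
$sock = socket_create(AF_INET, SOCK_DGRAM, SOL_UDP);
socket_bind($sock, '0.0.0.0', 9999) or die('Could not bind to address');

while (true) {
    // Blocks until a single datagram arrives; $buf gets at most 1024 bytes of it.
    socket_recvfrom($sock, $buf, 1024, 0, $fromIp, $fromPort);
    echo "$fromIp:$fromPort -> $buf\n";
    flush();
}
```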
I needed to call ob_flush(). I had never even heard of it before. It turns out my problem wasn't the loop, but the fact that PHP naturally waits until the script is done before actually sending its internal buffer to the web browser. Calling flush() followed by ob_flush() will force PHP to send whatever buffer it has stored to the browser immediately. This is needed for scripts that never stop (infinite loops) and want to echo data to the browser. Sometimes flush() alone doesn't work, as it didn't in this case. Hope that helps.
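Applied to the loop from the question, the fix looks like this (whether the browser actually renders it incrementally still depends on server-side buffering and compression):

```php
<?php
ob_start();
while (1) {
    echo 'test' . '<br>';
    ob_flush();  // move PHP's output buffer to the write buffer
    flush();     // then push the write buffer out to the web server
    sleep(1);
}
```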