Run process with realtime output in PHP

I am trying to run a process on a web page that returns its output in realtime. For example, if I run the 'ping' command, the page should update every time it prints a new line (right now, when I use exec($command, $output), I am forced to use the -c option and wait until the process finishes before I see any output on my web page). Is it possible to do this in PHP?
I am also wondering what the correct way is to kill this kind of process when someone leaves the page. In the case of 'ping', I can still see the process running in the system monitor (which makes sense).

This worked for me:
$cmd = "ping 127.0.0.1";
$descriptorspec = array(
0 => array("pipe", "r"), // stdin is a pipe that the child will read from
1 => array("pipe", "w"), // stdout is a pipe that the child will write to
2 => array("pipe", "w") // stderr is a pipe that the child will write to
);
flush();
$process = proc_open($cmd, $descriptorspec, $pipes, realpath('./'), array());
echo "<pre>";
if (is_resource($process)) {
while ($s = fgets($pipes[1])) {
print $s;
flush();
}
}
echo "</pre>";

This is a nice way to show real time output of your shell commands:
<?php
header("Content-type: text/plain");
// tell php to automatically flush after every output
// including lines of output produced by shell commands
disable_ob();
$command = 'rsync -avz /your/directory1 /your/directory2';
system($command);
You will need this function to prevent output buffering:
function disable_ob() {
    // Turn off output buffering
    ini_set('output_buffering', 'off');
    // Turn off PHP output compression
    ini_set('zlib.output_compression', false);
    // Implicitly flush the buffer(s)
    ini_set('implicit_flush', true);
    ob_implicit_flush(true);
    // Clear, and turn off output buffering
    while (ob_get_level() > 0) {
        // Get the current level
        $level = ob_get_level();
        // End the buffering
        ob_end_clean();
        // If the current level has not changed, abort
        if (ob_get_level() == $level) break;
    }
    // Disable Apache output buffering/compression
    if (function_exists('apache_setenv')) {
        apache_setenv('no-gzip', '1');
        apache_setenv('dont-vary', '1');
    }
}
It doesn't work on every server I have tried it on, though. I wish I could offer advice on what to look for in your PHP configuration to determine whether or not you should pull your hair out trying to get this type of behavior to work on your server! Does anyone else know?
Here's a dummy example in plain PHP:
<?php
header("Content-type: text/plain");
disable_ob();
for ($i = 0; $i < 10; $i++) {
    echo $i . "\n";
    usleep(300000);
}
I hope this helps others who have googled their way here.

I checked all the answers, but nothing worked...
I found a solution here.
It works on Windows (I think this answer is helpful for users looking for a Windows-friendly approach):
<?php
$a = popen('ping www.google.com', 'r');
while ($b = fgets($a, 2048)) {
    echo $b . "<br>\n";
    ob_flush();
    flush();
}
pclose($a);
?>

A better solution to this old problem using modern HTML5 Server-Sent Events (SSE) is described here:
http://www.w3schools.com/html/html5_serversentevents.asp
Example:
http://sink.agiletoolkit.org/realtime/console
Code: https://github.com/atk4/sink/blob/master/admin/page/realtime/console.php#L40
(Implemented as a module in Agile Toolkit framework)
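For reference, a minimal sketch of what such an SSE endpoint could look like in plain PHP (the command, file name and event name here are my own placeholders, not taken from the linked code):

<?php
// sse.php -- consumed in the browser with: new EventSource('sse.php')
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');
$handle = popen('ping -c 5 127.0.0.1', 'r');      // any long-running command
while (($line = fgets($handle)) !== false) {
    echo "data: " . rtrim($line) . "\n\n";        // SSE frame: "data: ..." plus a blank line
    @ob_flush();
    flush();
}
pclose($handle);
echo "event: done\ndata: finished\n\n";           // tell the client we're finished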

For command-line usage:
function execute($cmd) {
    $proc = proc_open($cmd, [['pipe','r'], ['pipe','w'], ['pipe','w']], $pipes);
    while (($line = fgets($pipes[1])) !== false) {
        fwrite(STDOUT, $line);
    }
    while (($line = fgets($pipes[2])) !== false) {
        fwrite(STDERR, $line);
    }
    fclose($pipes[0]);
    fclose($pipes[1]);
    fclose($pipes[2]);
    return proc_close($proc);
}
If you're trying to run a file, you may need to give it execute permissions first:
chmod('/path/to/script',0755);

Try this (tested on a Windows machine + WAMP server):
header('Content-Encoding: none;');
set_time_limit(0);
$handle = popen("<<< Your Shell Command >>>", "r");
if (ob_get_level() == 0)
    ob_start();
while (!feof($handle)) {
    $buffer = fgets($handle);
    $buffer = trim(htmlspecialchars($buffer));
    echo $buffer . "<br />";
    echo str_pad('', 4096);
    ob_flush();
    flush();
    sleep(1);
}
pclose($handle);
ob_end_flush();

I've tried various PHP execution commands on Windows and found that they differ quite a lot.
Don't work for streaming: shell_exec, exec, passthru
Kind of works: proc_open, popen -- "kind of" because you cannot pass arguments to your command (i.e. it won't work with my.exe --something, but will work with _my_something.bat).
The best (easiest) approach is:
You must make sure your exe flushes its output (see the printf flushing problem). Without this you will most likely receive batches of about 4096 bytes of text whatever you do.
If you can, use header('Content-Type: text/event-stream'); (instead of header('Content-Type: text/plain; charset=...');). This will not work in all browsers/clients though! Streaming will work without it, but at least the first lines will be buffered by the browser.
You also might want to disable caching with header('Cache-Control: no-cache');.
Turn off output buffering (either in php.ini or with ini_set('output_buffering', 'off');). This might also have to be done in Apache/Nginx/whatever server you use in front.
Turn off compression (either in php.ini or with ini_set('zlib.output_compression', false);). This might also have to be done in Apache/Nginx/whatever server you use in front.
So in your C++ program you do something like (again, for other solutions see printf flushing problem):
Logger::log(...) {
    printf(text);
    fflush(stdout);
}
In PHP you do something like:
function setupStreaming() {
    // Turn off output buffering
    ini_set('output_buffering', 'off');
    // Turn off PHP output compression
    ini_set('zlib.output_compression', false);
    // Disable Apache output buffering/compression
    if (function_exists('apache_setenv')) {
        apache_setenv('no-gzip', '1');
        apache_setenv('dont-vary', '1');
    }
}

function runStreamingCommand($cmd) {
    echo "\nrunning $cmd\n";
    system($cmd);
}
...
setupStreaming();
runStreamingCommand($cmd);

First check whether flush() works for you. If it does, good; if it doesn't, it probably means the web server is buffering for some reason, for example because mod_gzip is enabled.
For something like ping, the easiest technique is to loop within PHP, running "ping -c 1" multiple times, and calling flush() after each output. Assuming PHP is configured to abort when the HTTP connection is closed by the user (which is usually the default, or you can call ignore_user_abort(false) to make sure), then you don't need to worry about run-away ping processes either.
If it's really necessary that you only run the child process once and display its output continuously, that may be more difficult -- you'd probably have to run it in the background, redirect output to a stream, and then have PHP echo that stream back to the user, interspersed with regular flush() calls.
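A rough illustration of that ping-in-a-loop technique (host name and iteration count are arbitrary placeholders):

<?php
header('Content-Type: text/plain; charset=utf-8');
ignore_user_abort(false);              // stop as soon as the client disconnects
for ($i = 0; $i < 20; $i++) {
    system('ping -c 1 example.com');   // system() echoes the command output as it runs
    flush();
    sleep(1);
}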

If you're looking to run system commands via PHP, look into the exec documentation.
I wouldn't recommend doing this on a high-traffic site though; spawning a process for each request is quite heavy. Some programs provide the option of writing their process id to a file so that you could check for, and terminate, the process at will, but for commands like ping I'm not sure that's possible; check the man pages.
You may be better served by simply opening a socket on the port you expect to be listening (i.e. port 80 for HTTP) on the remote host; that way you know everything is going well in userland, as well as on the network.
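For example, a hedged sketch of that socket check (host, port and timeout are placeholders):

$fp = @fsockopen('example.com', 80, $errno, $errstr, 3);  // 3-second timeout
if ($fp) {
    echo "Host is reachable on port 80\n";
    fclose($fp);
} else {
    echo "Connection failed: $errstr ($errno)\n";
}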
If you're attempting to output binary data, look into PHP's header function, and ensure you set the proper Content-Type and Content-Disposition. Review the documentation for more information on using/disabling the output buffer.

Try setting "output_buffering = Off" in your php.ini file. You should then be able to get real-time output on the page.
Use the system command instead of exec; system flushes the output as it goes.

Why not just pipe the output into a log file and then use that file to return content to the client? Not quite real time, but perhaps good enough (see the sketch below).
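A sketch of that approach, assuming a world-writable log path and a client that polls with an offset (both are assumptions, not a hardened design):

// Start the command detached, with all output redirected to a log file.
exec('ping -c 20 127.0.0.1 > /tmp/ping.log 2>&1 &');

// Later, e.g. from a polling AJAX request, send whatever has been written so far.
$offset = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;
$chunk  = (string) @file_get_contents('/tmp/ping.log', false, null, $offset);
header('Content-Type: text/plain; charset=utf-8');
echo $chunk;   // the client adds strlen($chunk) to the offset it sends next time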

I had the same problem and could only solve it using the Symfony Process component ( https://symfony.com/doc/current/components/process.html ).
Quick example:
<?php
use Symfony\Component\Process\Process;
$process = new Process(['ls', '-lsa']);
$process->run(function ($type, $buffer) {
    if (Process::ERR === $type) {
        echo 'ERR > ' . $buffer;
    } else {
        echo 'OUT > ' . $buffer;
    }
});
?>

Related

How to transfer control from one non-web-based script to another in PHP and not return to the first program? [duplicate]

How can a PHP script start another PHP script, and then exit, leaving the other script running?
Also, is there any way for the 2nd script to inform the PHP script when it reaches a particular line?
Here's how to do it. You tell the browser to read in the first N characters of output and then close the connection, while your script keeps running until it's done.
<?php
ob_end_clean();
header("Connection: close");
ignore_user_abort(); // optional
ob_start();
echo ('Text the user will see');
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush(); // Will not work
flush(); // Unless both are called !
// At this point, the browser has closed connection to the web server
// Do processing here
include('other_script.php');
echo('Text user will never see');
?>
You can effectively achieve this by forking and then calling include or require.
parent.php:
<?php
$pid = pcntl_fork();
if ($pid == -1) {
    die("couldn't fork");
} else if ($pid) {
    // parent script
    echo "Parent waiting at " . date("H:i:s") . "\n";
    pcntl_wait($status);
    echo "Parent done at " . date("H:i:s") . "\n";
} else {
    // child script
    echo "Sleeper started at " . date("H:i:s") . "\n";
    include('sleeper.php');
    echo "Sleeper done at " . date("H:i:s") . "\n";
}
?>
sleeper.php:
<?php
sleep(3);
?>
Output:
$ php parent.php
Sleeper started at 01:22:02
Parent waiting at 01:22:02
Sleeper done at 01:22:05
Parent done at 01:22:05
However, forking does not inherently allow any inter-process communication, so you'd have to find some other way to inform the parent that the child has reached the specific line, like you asked in the question.
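One option for that signalling, offered as an assumption rather than part of the answer above, is to create a socket pair before forking; both processes inherit it and the child can write a short message when it reaches the line in question:

<?php
$pair = stream_socket_pair(STREAM_PF_UNIX, STREAM_SOCK_STREAM, STREAM_IPPROTO_IP);
$pid = pcntl_fork();
if ($pid == -1) {
    die("couldn't fork");
} else if ($pid) {                          // parent script
    fclose($pair[0]);
    echo "Parent got: " . fgets($pair[1]);  // blocks until the child writes
    pcntl_wait($status);
} else {                                    // child script
    fclose($pair[1]);
    sleep(3);                               // ...work until the "particular line"...
    fwrite($pair[0], "reached the line\n");
    exit(0);
}
?>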
Here's a shot in the dark: you could try using PHP's OS execution functions with &.
exec("./somescript.php &");
Additionally, if that doesn't work, you can try
exec("nohup ./somescript.php &");
Edit: nohup is a POSIX command to ignore the HUP (hangup) signal, enabling the command to keep running after the user who issued it has logged out. The HUP (hangup) signal is, by convention, the way a terminal warns dependent processes of a logout.
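One caveat worth adding (my own observation, so treat it as a hint rather than gospel): exec() keeps waiting as long as the command's output stream stays attached to it, so in practice the & trick usually also needs the output redirected somewhere, e.g.:

exec("nohup ./somescript.php > /dev/null 2>&1 &");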
Would pcntl_fork() do something similar to what you're ultimately trying to accomplish? http://www.php.net/manual/en/function.pcntl-fork.php
If you don't want to build the pcntl extension, then a good alternative is to use proc_open().
http://www.php.net/manual/en/function.proc-open.php
Use that together with stream_select() so your PHP process can sleep until something happens with the child process you created.
That will effectively create a process in the background without blocking the parent PHP process. Your PHP script can read from and write to the child's STDIN, STDOUT and STDERR.
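A rough sketch of that proc_open() + stream_select() combination (the command and the one-second timeout are placeholders):

$spec = [0 => ['pipe', 'r'], 1 => ['pipe', 'w'], 2 => ['pipe', 'w']];
$proc = proc_open('ping -c 5 127.0.0.1', $spec, $pipes);
stream_set_blocking($pipes[1], false);
stream_set_blocking($pipes[2], false);
do {
    $read = [$pipes[1], $pipes[2]];
    $write = $except = null;
    // Sleep here until the child writes something, or one second passes.
    if (stream_select($read, $write, $except, 1) > 0) {
        foreach ($read as $stream) {
            echo stream_get_contents($stream);
        }
    }
    $status = proc_get_status($proc);
} while ($status['running']);
echo stream_get_contents($pipes[1]);   // drain anything written just before exit
fclose($pipes[0]);
fclose($pipes[1]);
fclose($pipes[2]);
proc_close($proc);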
To make the browser complete loading (i.e. stop the load progress indicator), you can use what Milan Babuškov mentioned.
The key to making the browser think the HTTP request is complete, is to send it the content length. To do this you can start buffering the request, then flush it after you send the Content-Length header.
eg:
<?php
ob_start();
// render the HTML page and/or process stuff
header('Content-Length: '.ob_get_length());
ob_flush();
flush();
// can do more processing
?>
You can create a request and close the connection right after it is done being written to.
Check out the code in http://drupal.org/project/httprl, as it can do this (non-blocking requests). I plan on pushing this lib to GitHub once I get it more polished, as something that can be run outside of Drupal. This should do what you're looking for.

Equivalent of /dev/null for writing garbage test data?

I need to perform a series of tests to pick the fastest branch of code for a set of functions I designed. As these functions output some text/HTML content, I would like to measure the speed without filling the browser with garbage data.
Is there an equivalent to /dev/null in PHP? The closest equivalents for writing temporary data I've found are php://temp and php://memory, but those two I/O streams store the garbage data, and I want every piece of data to be written in a 'fake' fashion.
I could always write all the garbage data to a variable à la $tmp .= <function return value goes here>, but I'm sure there must be a more elegant or better way to accomplish this WITHOUT resorting to functions like shell_exec(), exec(), proc_open() and similar approaches (the production server where I'm going to test the final code won't have any of those functions available).
Is there an equivalent?
// For what it's worth, this works on CentOS 6.5, PHP 5.3.3.
$fname = "/dev/null";
if (file_exists($fname)) print "*** /dev/null exists ***\n";
if (is_readable($fname)) print "*** /dev/null readable ***\n";
if (is_writable($fname)) print "*** /dev/null writable ***\n";
if (($fileDesc = fopen($fname, "r")) == TRUE) {
    print "*** I opened /dev/null for reading ***\n";
    $x = fgetc($fileDesc);
    fclose($fileDesc);
}
if (($fileDesc = fopen($fname, "w")) == TRUE) {
    print "*** I opened /dev/null for writing ***\n";
    $x = fwrite($fileDesc, 'X');
    fclose($fileDesc);
}
if (($fileDesc = fopen($fname, "w+")) == TRUE) {
    print "*** I opened /dev/null for append ***\n";
    $x = fwrite($fileDesc, 'X');
    fclose($fileDesc);
}
I think your best bet would be a stream wrapper that profiles your output on write with microtime(), which you can then register with stream_wrapper_register. The example in the manual is pretty good.
If your code is not that complicated, or you feel this would be overkill, you can just use an ob_start() callback handler.
Hope this helps.
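If you go the stream wrapper route, a bare-bones sketch could look like this; the devnull:// protocol name is made up for the example, and the wrapper simply counts and discards whatever is written to it:

class NullProfilingStream {
    public $context;
    public static $bytes = 0;
    public static $start = 0.0;

    public function stream_open($path, $mode, $options, &$opened_path) {
        self::$start = microtime(true);
        return true;
    }
    public function stream_write($data) {
        self::$bytes += strlen($data);   // discard the data, keep the count
        return strlen($data);
    }
    public function stream_close() {}
}

stream_wrapper_register('devnull', 'NullProfilingStream');
$sink = fopen('devnull://sink', 'w');
fwrite($sink, 'lots of generated HTML ...');
fclose($sink);
echo NullProfilingStream::$bytes . " bytes written in " .
     (microtime(true) - NullProfilingStream::$start) . " s\n";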

shell_exec() in PHP

<?php
// Execute a shell script
$dump = shell_exec('bigfile.sh'); // This script takes some 10s to complete execution
print_r($dump); // Dump log to screen
?>
When the script above is executed from the browser, it loads for 10 s and then dumps the output of the script to the screen. This is, of course, normal. But if I want the data written to STDOUT by the shell script to be displayed on the screen in real time, is there some way I could do it?
I would add proc_open(), which gives you much more control over command execution if you need it; if not, try passthru() or popen() as mentioned before.
Try this:
$handle = proc_open('bigfile.sh', array(0 => STDIN, 1 => STDOUT, 2 => STDERR), $pipes);
$status = proc_close($handle);
It works great for me.
Try passthru() or popen()
The code will look something like this:
<?php
$fp = popen("bigfile.sh", "r");
while (!feof($fp)) {
    $result = fgets($fp, 256);
    echo $result;
    flush();
}
?>
As @wik suggests below, you can also try proc_open instead of popen; it should work in a similar fashion.

How to flush output after each `echo` call?

I have a PHP script that only produces log output for the client.
When I echo something, I want it to be transferred to the client on the fly.
(Because while the script is processing, the page is blank.)
I've already played around with ob_start() and ob_flush(), but they didn't work.
What's the best solution?
PS: it is a little dirty to put a flush at the end of each echo call...
EDIT: Neither of the answers worked; is this a PHP or an Apache fault?
I've had the same issue and one of the examples posted in the manual worked. A character set must be specified, as one of the posters here already mentioned. http://www.php.net/manual/en/function.ob-flush.php#109314
header( 'Content-type: text/html; charset=utf-8' );
echo 'Begin ...<br />';
for ($i = 0; $i < 10; $i++) {
    echo $i . '<br />';
    ob_flush();
    flush();
    sleep(1);
}
echo 'End ...<br />';
Edit:
I was reading the comments on the manual page and came across a bug that states that ob_implicit_flush does not work and the following is a workaround for it:
ob_end_flush();
# CODE THAT NEEDS IMMEDIATE FLUSHING
ob_start();
If this does not work then what may even be happening is that the client does not receive the packet from the server until the server has built up enough characters to send what it considers a packet worth sending.
Old Answer:
You could use ob_implicit_flush which will tell output buffering to turn off buffering for a while:
ob_implicit_flush(true);
# CODE THAT NEEDS IMMEDIATE FLUSHING
ob_implicit_flush(false);
So here's what I found out.
Flush would not work under Apache's mod_gzip or Nginx's gzip because, logically, the content is being gzipped, and to do that the server must buffer content before compressing it. Any sort of web server gzipping would affect this. In short, on the server side we need to disable gzip and decrease the FastCGI buffer size. So:
In php.ini:
output_buffering = Off
zlib.output_compression = Off
In nginx.conf:
gzip off;
proxy_buffering off;
Also have these lines at hand, especially if you don't have access to php.ini:
@ini_set('zlib.output_compression', 0);
@ini_set('implicit_flush', 1);
@ob_end_clean();
set_time_limit(0);
Last, if you have it, comment out the code below:
ob_start('ob_gzhandler');
ob_flush();
PHP test code:
ob_implicit_flush(1);
for ($i = 0; $i < 10; $i++) {
    echo $i;
    // this is to make the buffer reach the minimum size needed to flush data
    echo str_repeat(' ', 1024 * 64);
    sleep(1);
}
For those coming here in 2018:
The only solution that worked for me:
<?php
if (ob_get_level() == 0) ob_start();
for ($i = 0; $i < 10; $i++) {
    echo "<br> Line to show.";
    echo str_pad('', 4096) . "\n";
    ob_flush();
    flush();
    sleep(2);
}
echo "Done.";
ob_end_flush();
?>
And it's very important to keep the "4096" part, because it seems that it "fills" the buffer...
Flushing seemingly failing to work is a side effect of automatic character set detection.
The browser will not display anything until it knows the character set to display it in, and if you don't specify the character set, it has to guess. The problem is that it can't make a good guess without enough data, which is why browsers seem to have this 1024-byte (or similar) buffer they need filled before displaying anything.
The solution is therefore to make sure the browser doesn't have to guess the character set.
If you're sending text, add a '; charset=utf-8' to its content type, and if it's HTML, add the character set to the appropriate meta tag.
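Concretely, that means something like:

header('Content-Type: text/plain; charset=utf-8');   // for plain text output
// or, for HTML output:
// header('Content-Type: text/html; charset=utf-8');
// plus <meta charset="utf-8"> in the document head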
What you want is the flush() function.
example:
echo "log to client";
flush();
Why not make a function to echo, like this:
function fecho($string) {
    echo $string;
    ob_flush();
    flush();   // flush() is usually needed as well to push the data to the client
}
One thing that is not often mentioned is gzip compression that stays turned on because of details in various hosting environments.
Here is a modern approach, working with PHP-FPM as FastCGI, which does not need an .htaccess rewrite rule or environment variables:
In php.ini or .user.ini :
output_buffering = 0
zlib.output_compression = 0
implicit_flush = true
output_handler =
In PHP script :
header('Content-Encoding: none'); // Disable gzip compression
ob_end_flush(); // Stop buffer
ob_implicit_flush(1); // Implicit flush at each output command
See this comment on the official PHP docs for why ob_end_flush() is needed.
I had a similar thing to do. Using
// ini_set("output_buffering", 0); // off
ini_set("zlib.output_compression", 0); // off
ini_set("implicit_flush", 1); // on
did make the output flush frequently in my case.
But I had to flush the output right at a particular point (in a loop that I run), so using both
ob_flush();
flush();
together worked for me.
I wasn't able to turn off "output_buffering" with ini_set(...); I had to turn it off directly in php.ini. phpinfo() shows its setting as "no value" when turned off; is that normal?
header( 'X-Accel-Buffering: no' );
header( 'Content-Type: text/html; charset=utf-8' );
echo 'text to display';
echo '<span style="display: none;">' . str_repeat ( ' ', 4096 ) . '</span>';
flush();
usleep( 10 );
The correct function to use is flush().
<html>
<body>
<p>
Hello! I am waiting for the next message...<br />
<?php flush(); sleep(5); ?>
I am the next message!<br />
<?php flush(); sleep(5); ?>
And I am the last message. Good bye.
</p>
</body>
</html>
Please note that there is a "problem" with IE, which only outputs the flushed content when it is at least 256 bytes, so the first part of your page needs to be at least 256 bytes.
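If you have to support that IE behaviour, a simple (if ugly) workaround is to pad the very first flush, for example:

echo str_repeat(' ', 256);   // padding so old IE versions render the first flushed chunk
flush();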
This works fine for me (Apache 2.4/PHP 7.0):
@ob_end_clean();
echo "lorem ipsum...";
flush();
sleep(5);
echo "<br>dolor...";
flush();
sleep(5);
echo "<br>sit amet";
Anti-virus software may also be interfering with output flushing. In my case, Kaspersky Anti-Virus 2013 was holding data chunks before sending it to the browser, even though I was using an accepted solution.
Sometimes, the problem comes from the Apache settings. Apache can be set to gzip the output.
In the file .htaccess you can add for instance :
SetEnv no-gzip 1
Try this:
while (@ob_end_flush());
ob_implicit_flush(true);
echo "first line visible to the browser";
echo "<br />";
sleep(5);
echo "second line visible to the browser after 5 secs";
Just notice that this way you're actually disabling the output buffer for your current script. I guess you can re-enable it with ob_start() (I'm not sure).
The important thing is that by disabling your output buffer like above, you will no longer be able to redirect your PHP script using the header() function, because PHP can send HTTP headers only once per script execution.
You can however redirect using javascript. Just let your php script echo following lines when it comes to that:
echo '<script type="text/javascript">';
echo 'window.location.href="'.$url.'";';
echo '</script>';
echo '<noscript>';
echo '<meta http-equiv="refresh" content="0;url='.$url.'" />';
echo '</noscript>';
exit;
Note if you are on certain shared hosting sites like Dreamhost you can't disable PHP output buffering at all without going through different routes:
Changing the output buffer cache: If you are using PHP FastCGI, the PHP functions flush(), ob_flush(), and ob_implicit_flush() will not function as expected. By default, output is buffered at a higher level than PHP (specifically, by the Apache module mod_deflate which is similar in form/function to mod_gzip).
If you need unbuffered output, you must either use CGI (instead of FastCGI) or contact support to request that mod_deflate is disabled for your site.
https://help.dreamhost.com/hc/en-us/articles/214202188-PHP-overview
I'm late to the discussion, but I read that many people are saying that appending flush(); after every piece of output looks dirty, and they are right.
Best solution is to disable deflate, gzip and all buffering from Apache, intermediate handlers and PHP. Then in your php.ini you should have:
output_buffering = Off
zlib.output_compression = Off
implicit_flush = Off
A temporary solution is to have this in your php.ini IF flush(); solves your problem but you think it is dirty and ugly to put it everywhere:
implicit_flush = On
With just that line in your php.ini, you don't need to put flush(); in your code anymore.
This is my code (works for PHP 7):
private function closeConnection()
{
    #apache_setenv('no-gzip', 1);
    #ini_set('zlib.output_compression', 0);
    #ini_set('implicit_flush', 1);
    ignore_user_abort(true);
    set_time_limit(0);
    ob_start();
    // do initial processing here
    echo json_encode(['ans' => true]);
    header('Connection: close');
    header('Content-Length: ' . ob_get_length());
    ob_end_flush();
    ob_flush();
    flush();
}

how to redirect STDOUT to a file in PHP?

The code below almost works, but it's not what I really meant:
ob_start();
echo 'xxx';
$contents = ob_get_contents();
ob_end_clean();
file_put_contents($file,$contents);
Is there a more natural way?
It is possible to write STDOUT directly to a file in PHP, which is much easier and more straightforward than using output buffering.
Do this in the very beginning of your script:
fclose(STDIN);
fclose(STDOUT);
fclose(STDERR);
$STDIN = fopen('/dev/null', 'r');
$STDOUT = fopen('application.log', 'wb');
$STDERR = fopen('error.log', 'wb');
Why at the very beginning you may ask? No file descriptors should be opened yet, because when you close the standard input, output and error file descriptors, the first three new descriptors will become the NEW standard input, output and error file descriptors.
In my example here I redirected standard input to /dev/null and the output and error file descriptors to log files. This is common practice when making a daemon script in PHP.
To write to the application.log file, this would suffice:
echo "Hello world\n";
To write to the error.log, one would have to do:
fwrite($STDERR, "Something went wrong\n");
Please note that when you change the input, output and error descriptors, the built-in PHP constants STDIN, STDOUT and STDERR will be rendered unusable. PHP will not update these constants to the new descriptors, and it is not allowed to redefine these constants (they are called constants for a reason, after all).
Here's a way to divert OUTPUT, which appears to be the original problem:
$ob_file = fopen('test.txt', 'w');

function ob_file_callback($buffer)
{
    global $ob_file;
    fwrite($ob_file, $buffer);
}

ob_start('ob_file_callback');
more info here:
http://my.opera.com/zomg/blog/2007/10/03/how-to-easily-redirect-php-output-to-a-file
None of the answers worked for my particular case, where I needed a cross-platform way of redirecting the output as soon as it was echoed, so that I could follow the logs with tail -f log.txt or another log-viewing app.
I came up with the following solution:
$logFp = fopen('log.txt', 'w');
ob_start(function ($buffer) use ($logFp) {
    fwrite($logFp, $buffer);
}, 1); // notice the use of chunk_size == 1
echo "first output\n";
sleep(10);
echo "second output\n";
ob_end_clean();
I haven't noticed any performance issues but if you do, you can change chunk_size to greater values.
Now just tail -f the log file:
tail -f log.txt
No, output buffering is as good as it gets. Though it's slightly nicer to just do
ob_start();
echo 'xxx';
$contents = ob_get_clean();
file_put_contents($file,$contents);
Using the eio PECL module is very easy; you can also capture PHP internal errors, var_dump, echo, etc. In this code, you can find some examples of different situations.
$fdout = fopen('/tmp/stdout.log', 'wb');
$fderr = fopen('/tmp/stderr.log', 'wb');
eio_dup2($fdout, STDOUT);
eio_dup2($fderr, STDERR);
eio_event_loop();
fclose($fdout);
fclose($fderr);
// output examples
echo "message to stdout\n";
$v2dump = array(10, "graphinux");
var_dump($v2dump);
// php internal error/warning
$div0 = 10/0;
// user errors messages
fwrite(STDERR, "user controlled error\n");
The call to eio_event_loop() is used to be sure that previous eio requests have been processed. If you need to append to the log, use mode 'ab' instead of 'wb' in the fopen call.
Installing the eio module is very easy (http://php.net/manual/es/eio.installation.php). I tested this example with version 1.2.6 of the eio module.
You can install Eio extension
pecl install eio
and duplicate a file descriptor
$temp = fopen('/tmp/my_stdout', 'a');
$my_data = 'my something';
$foo = eio_dup2($temp, STDOUT, EIO_PRI_MAX, function ($data, $result, $request) {
    var_dump($data, $result, $request);
    var_dump(eio_get_last_error($request));
}, $my_data);
eio_event_loop();
echo "something to stdout\n";
fclose($temp);
This creates a new file descriptor and rewrites the target stream of STDOUT. This can be done with STDERR as well, and the constants STD[OUT|ERR] are still usable.
I understand that this question is ancient, but people trying to do what this question asks will likely end up here... Both of you.
If you are running under a particular environment...
Running under Linux (probably most other Unix like operating systems, untested)
Running via CLI (Untested on web servers)
You can actually close all of your file descriptors (yes, all of them, which means it's probably best to do this at the very beginning of execution, for example just after a pcntl_fork() call to background the process in a daemon, which seems like the most common need for something like this):
fclose( STDIN );  // fd 0
fclose( STDOUT ); // fd 1
fclose( STDERR ); // fd 2
And then re-open the file descriptors, assigning them to a variable that will not fall out of scope and thus be garbage collected. Linux will predictably reuse the lowest free descriptor numbers, so they are assigned in the order you open them:
$kept_in_scope_variable_stdin  = fopen( '/dev/null', 'r' ); // becomes fd 0 (STDIN)
$kept_in_scope_variable_stdout = fopen(...);                // becomes fd 1 (STDOUT)
$kept_in_scope_variable_stderr = fopen(...);                // becomes fd 2 (STDERR)
You can use whatever files or devices you want for this. I gave /dev/null as the example for STDIN (fd 0) because that's probably the most common case for this kind of code.
Once this is done you should be able to do normal things like echo, print_r, var_dump, etc without specifically needing to write to a file with a function. Which is useful when you're trying to background code that you do not want to, or aren't able to, rewrite to be file-pointer-output-friendly.
YMMV for other environments and things like having other FD's open, etc. My advice is to start with a small test script to prove that it works, or doesn't, in your environment and then move on to integration from there.
Good luck.
Here is an ugly solution that was useful for a problem I had (need to debug).
if (file_get_contents("out.txt") != "in progress")
{
    file_put_contents("out.txt", "in progress");
    $content = file_get_contents('http://' . $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI']);
    file_put_contents("out.txt", $content);
}
The main drawback of this is that you had better not use the $_POST variables.
But you don't have to put it at the very beginning.
