PHP SSE (EventSource) timeout every 2 minutes

I'm using this Mozilla SSE example, and inside its loop I added a sample PHP proc_open example.
Run from the browser, everything works fine.
The only problem is that proc_open() executes a command that can take more than 2 minutes to finish, which makes the browser time out after only 2 minutes. And our server uses non-threaded PHP.
Question:
How can I make the PHP script send something to the browser while waiting for proc_open() to finish, in a non-threaded PHP script?
Code:
date_default_timezone_set("America/New_York");
header("Cache-Control: no-store");
header("Content-Type: text/event-stream");

$counter = rand(1, 10);
while (true) {
    // Run a local command
    $descriptorspec = array(
        0 => array("pipe", "r"),                         // stdin is a pipe that the child will read from
        1 => array("pipe", "w"),                         // stdout is a pipe that the child will write to
        2 => array("file", "/tmp/error-output.txt", "a") // stderr is a file to write to
    );
    $cwd = '/tmp';
    $env = array('some_option' => 'aeiou');
    $process = proc_open('HelloWorldProgram', $descriptorspec, $pipes, $cwd, $env);
    if (is_resource($process)) {
        // $pipes now looks like this:
        // 0 => writeable handle connected to child stdin
        // 1 => readable handle connected to child stdout
        // Any error output will be appended to /tmp/error-output.txt
        fwrite($pipes[0], '<?php print_r($_ENV); ?>');
        fclose($pipes[0]);

        echo stream_get_contents($pipes[1]);
        fclose($pipes[1]);

        // It is important that you close any pipes before calling
        // proc_close in order to avoid a deadlock
        $return_value = proc_close($process);
        echo "command returned $return_value\n";
    }

    // Every second, send a "ping" event.
    echo "event: ping\n";
    $curDate = date(DATE_ISO8601);
    echo 'data: {"time": "' . $curDate . '"}';
    echo "\n\n";

    // Send a simple message at random intervals.
    $counter--;
    if (!$counter) {
        echo 'data: This is a message at time ' . $curDate . "\n\n";
        $counter = rand(1, 10);
    }

    // Flush the output to the client; only end output buffers that are
    // actually open (ob_end_flush() raises a notice once no buffer is left).
    while (ob_get_level() > 0) {
        ob_end_flush();
    }
    flush();

    // Break the loop if the client aborted the connection (closed the page)
    if (connection_aborted()) break;

    sleep(1);
}

I had a problem like this in one of my projects, so add set_time_limit(0); at the start of your script, like this:
date_default_timezone_set("America/New_York");
header("Cache-Control: no-store");
header("Content-Type: text/event-stream");
set_time_limit(0); // this prevents PHP from stopping the script when max_execution_time is reached

Workaround
If someone has the same issue: I use this workaround, which sends a "Ping" to the browser while your real command is running and waiting to finish.
MyScript.sh:
#!/bin/bash
for task in "$@"; do
    $task &
done
while true; do
    echo "Ping"
    sleep 30
done
Use:
$ sh MyScript.sh "Your Command Here With All Arguments"
Ping
Ping
Ping
If you set up the SSE correctly, your browser will receive "Ping" every 30 seconds while your command is running, so the browser will never time out.
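A PHP-only variation of the same idea is also possible. The following is just a sketch (the command, paths, and the 15-second interval are placeholders, not part of the original example): set the child's stdout to non-blocking, wait on it with stream_select(), and emit an SSE ping whenever the wait times out while the command is still running.
header("Cache-Control: no-store");
header("Content-Type: text/event-stream");
set_time_limit(0);
while (ob_get_level() > 0) {
    ob_end_flush();
}

$descriptorspec = array(
    0 => array("pipe", "r"),
    1 => array("pipe", "w"),
    2 => array("file", "/tmp/error-output.txt", "a")
);
$process = proc_open('HelloWorldProgram', $descriptorspec, $pipes, '/tmp');

if (is_resource($process)) {
    fclose($pipes[0]);
    stream_set_blocking($pipes[1], false);
    $output = '';

    while (true) {
        $read = array($pipes[1]);
        $write = $except = array();
        // Wait up to 15 seconds for output from the command.
        if (stream_select($read, $write, $except, 15)) {
            $output .= stream_get_contents($pipes[1]);
        }

        $status = proc_get_status($process);
        if (!$status['running']) {
            break; // the command finished
        }

        // Still running: keep the SSE connection alive with a ping event.
        echo "event: ping\n";
        echo 'data: {"time": "' . date(DATE_ISO8601) . '"}' . "\n\n";
        flush();

        if (connection_aborted()) {
            proc_terminate($process);
            break;
        }
    }

    // Send whatever the command printed as a final SSE message.
    echo "data: " . json_encode($output) . "\n\n";
    flush();

    fclose($pipes[1]);
    proc_close($process);
}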

Related

Refresh output of long process in PHP page

I have a "long" script I want to execute in a PHP page, and I want its output to be 'refreshed' as soon as the script outputs something.
I've read plenty of solutions like questions 4706525, 9182094, 8882383, PHP Flush Manual but it's not working as expected in my case!
My test script:
#!/bin/bash
echo "This is a test script"
echo "Sleeping"
sleep 30
echo "Done"
Executable permission is set for www-data.
My PHP page:
<?php
#apache_setenv('no-gzip', 1);
#ini_set('zlib.output_compression', 0);
#ini_set('implicit_flush', 1);
#ini_set('output_buffering', 0);
#apache_setenv('output_buffering', 0);
echo "Here<br>";
flush();

$cmd = "../test.sh";
$pipes = array();
$descriptors = array(
    0 => array("pipe", "r"),
    1 => array("pipe", "w"),
    2 => array("pipe", "w"),
);

echo "Starting process<br>";
flush();

$process = proc_open($cmd, $descriptors, $pipes, realpath('./'), array());
echo "<pre>";
if (is_resource($process)) {
    while ($s = fgets($pipes[1])) {
        print $s;
        flush();
    }
} else {
    print "Cannot create process\n";
}
echo "</pre>";

fclose($pipes[0]);
fclose($pipes[1]);
fclose($pipes[2]);
proc_close($process);
?>
NB. My test script, test.sh, is in a directory above the PHP page, thus ../test.sh. Not that that changes anything. But it's not a typo.
My php.ini has these settings (I wasn't keen on changing them server-wide, but I wanted to test whether that was the issue):
zlib.output_compression = Off
output_buffering = Off
I use LAMPP.
If I run the PHP page in a terminal,
$ php test.php
It works fine: I immediately get "This is a test script" and "Sleeping", and after a while, "Done".
If I load the page in my browser, it does not work: it waits until test.sh has completed before outputting anything.
Edited: If I add echo str_pad('',4096)."\n" in the loop, then it works. However, this fix suggests that for a reason I do not understand, output buffering is still set to its default value (4096) and not off as I tried to configure.
while ($s = fgets($pipes[1])) {
    print $s;
    echo str_pad('', 4096) . "\n";
    flush();
}
Furthermore, this solution is not perfect because, in reality, it adds spaces to the output.
I am looking for a solution that
refreshes output of the PHP page
does not modify php.ini
does not modify the output
Thanks!
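One direction worth trying here, as a sketch only (whether it helps depends on what sits in front of PHP, e.g. Apache's mod_deflate or an nginx proxy buffer, none of which are described above), is to drop PHP's buffers per request instead of in php.ini:
<?php
// Per-request attempt to disable buffering; these are assumptions about a
// typical Apache + mod_php setup, not a guaranteed fix.
header('X-Accel-Buffering: no');          // tell nginx (if it proxies this) not to buffer
ini_set('zlib.output_compression', '0');
if (function_exists('apache_setenv')) {
    apache_setenv('no-gzip', '1');        // ask mod_deflate to skip this response
}
while (ob_get_level() > 0) {
    ob_end_flush();                       // close every buffer PHP or a framework opened
}
ob_implicit_flush(true);                  // flush after every output call from now on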

PHP script stuck on exec even after running command in background on Windows

I want a php script from which I can execute a program, and terminate it if it doesn't complete execution in 2 seconds. I am using Windows. I have tried the following code:
exec("start /B program.exe");
sleep(2);
exec('taskkill /F /IM "program.exe"');
This doesn't seem to work: the script is stuck on the first exec statement for as long as program.exe has not finished executing. I can't figure out how to fix this issue.
Are you doing this with the PHP CLI (command line)? Open a command prompt as administrator.
To avoid being blocked waiting for the program, close the handle you used to open it:
php myscript.php
pclose(popen("start /B program.exe", "r"));
sleep(2);
exec('taskkill /F /IM program.exe');
exit(0);
It would also be fine to put the exec start into a separate script and fire that script using exec.
Right, exec() will block until execution completes. This question has great answers for how to do an exec() with a timeout. I think this will probably work best for you. I'll post the code here for completeness (but I can't take any credit!):
/**
 * Execute a command and return its output. Either wait until the command exits or the timeout has expired.
 *
 * @param string $cmd Command to execute.
 * @param number $timeout Timeout in seconds.
 * @return string Output of the command.
 * @throws \Exception
 */
function exec_timeout($cmd, $timeout) {
    // File descriptors passed to the process.
    $descriptors = array(
        0 => array('pipe', 'r'), // stdin
        1 => array('pipe', 'w'), // stdout
        2 => array('pipe', 'w')  // stderr
    );

    // Start the process.
    $process = proc_open('exec ' . $cmd, $descriptors, $pipes);
    if (!is_resource($process)) {
        throw new \Exception('Could not execute process');
    }

    // Set the stdout stream to non-blocking.
    stream_set_blocking($pipes[1], 0);

    // Turn the timeout into microseconds.
    $timeout = $timeout * 1000000;

    // Output buffer.
    $buffer = '';

    // While we have time to wait.
    while ($timeout > 0) {
        $start = microtime(true);

        // Wait until we have output or the timer expired.
        $read = array($pipes[1]);
        $other = array();
        stream_select($read, $other, $other, 0, $timeout);

        // Get the status of the process.
        // Do this before we read from the stream,
        // this way we can't lose the last bit of output if the process dies between these functions.
        $status = proc_get_status($process);

        // Read the contents from the buffer.
        // This function will always return immediately as the stream is non-blocking.
        $buffer .= stream_get_contents($pipes[1]);

        if (!$status['running']) {
            // Break from this loop if the process exited before the timeout.
            break;
        }

        // Subtract the number of microseconds that we waited.
        $timeout -= (microtime(true) - $start) * 1000000;
    }

    // Check if there were any errors.
    $errors = stream_get_contents($pipes[2]);
    if (!empty($errors)) {
        throw new \Exception($errors);
    }

    // Kill the process in case the timeout expired and it's still running.
    // If the process already exited this won't do anything.
    proc_terminate($process, 9);

    // Close all streams.
    fclose($pipes[0]);
    fclose($pipes[1]);
    fclose($pipes[2]);
    proc_close($process);

    return $buffer;
}
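A quick usage sketch for the question's scenario (assuming program.exe resolves on the PATH; the function above kills the process if it is still running after the timeout):
try {
    // Give the program 2 seconds to finish.
    $output = exec_timeout('program.exe', 2);
    echo $output;
} catch (\Exception $e) {
    echo 'Command failed: ' . $e->getMessage();
}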
Edit
The 'exec' part of the proc_open() probably won't work on Windows, but it's probably unnecessary.
The first comment of the exec() manual page shows a very simple example.

PHP: Need to close STDIN in order to read STDOUT?

I recently tried to communicate with a binary on my Ubuntu webserver [1] using the PHP function proc_open. I can establish a connection and define the pipes STDIN, STDOUT, and STDERR. Nice.
Now, the binary I am talking to is an interactive computer algebra system, so I would like to keep both STDOUT and STDIN alive after the first command, such that I can still use the application a few lines later in an interactive manner (direct user input from a web front-end).
However, as it turns out, the PHP functions for reading the binary's STDOUT (either stream_get_contents or fgets) need a closed STDIN before they can work. Otherwise the program deadlocks.
This is a severe drawback, since I cannot just reopen STDIN after closing it. So my question is: why does my script deadlock when I want to read STDOUT while my STDIN is still alive?
Thanks
Jens
[1] proc_open returns false but does not write in error file - permissions issue?
My source:
$descriptorspec = array(
    0 => array("pipe", "r"),
    1 => array("pipe", "w"),
    2 => array("file", "./error.log", "a")
);
// define current working directory where files would be stored
$cwd = './';
// open reduce
$process = proc_open('./reduce/reduce', $descriptorspec, $pipes, $cwd);
if (is_resource($process)) {
    // some valid Reduce commands
    fwrite($pipes[0], 'load excalc; operator x; x(0) := t; x(1) := r;');
    // if the following line is removed, the script deadlocks
    fclose($pipes[0]);
    echo "output: " . stream_get_contents($pipes[1]);
    // close the remaining pipe & close the process
    // (stdin is already closed above, and stderr is a file, so there is no $pipes[2])
    fclose($pipes[1]);
    proc_close($process);
}
EDIT:
This code kind of works. "Kind of" because it uses usleep() calls to wait for the non-blocking STDOUT to be filled with data. How do I do that more elegantly?
@Elias: By polling the $status['running'] entry you can only determine whether the overall process is still running, not whether the process is busy or idling... That is why I have to include these usleep() calls.
define('TIMEOUT_IN_MS', 100);
define('TIMEOUT_STEPS', 100);

function getOutput($pipes) {
    $result = "";
    $stage = 0;
    $buffer = 0;
    do {
        $char = fgets($pipes[1], 4096);
        if ($char != null) {
            $buffer = 0;
            $stage = 1;
            $result .= $char;
        } else if ($stage == 1) {
            usleep((TIMEOUT_IN_MS / TIMEOUT_STEPS) * 1000); // step length in ms -> microseconds
            $buffer++;
            if ($buffer > TIMEOUT_STEPS) {
                $stage++;
            }
        }
    } while ($stage < 2);
    return $result;
}

$descriptorspec = array(0 => array("pipe", "r"), 1 => array("pipe", "w"));
// define current working directory where files would be stored
$cwd = './';
// open reduce
$process = proc_open('./reduce/reduce', $descriptorspec, $pipes, $cwd);
if (is_resource($process)) {
    stream_set_blocking($pipes[1], 0);
    echo "startup output:<br><pre>" . getOutput($pipes) . "</pre>";
    fwrite($pipes[0], 'on output; load excalc; operator x; x(0) := t; x(1) := r;' . PHP_EOL);
    echo "output 1:<br><pre>" . getOutput($pipes) . "</pre>";
    fwrite($pipes[0], 'coframe o(t) = sqrt(1-2m/r) * d t, o(r) = 1/sqrt(1-2m/r) * d r with metric g = -o(t)*o(t) + o(r)*o(r); displayframe;' . PHP_EOL);
    echo "output 2:<br><pre>" . getOutput($pipes) . "</pre>";
    // close pipes & close process (no $pipes[2] here, since stderr was not redirected to a pipe)
    fclose($pipes[0]);
    fclose($pipes[1]);
    proc_close($process);
}
This reminds me of a script I wrote a while back. While it might serve as inspiration to you (or others), it doesn't do what you need. What it does contain is an example of how you can read the output of a stream, without having to close any of the streams.
Perhaps you can apply the same logic to your situation:
$allInput = array(
    'load excalc; operator x; x(0) := t; x(1) := r;'
); // array with strings to pass to proc

if (is_resource($process)) {
    $output = '';
    $input = array_shift($allInput);
    do {
        usleep(200); // make sure the running process is ready
        fwrite(
            $pipes[0],
            $input . PHP_EOL // add EOL
        );
        fflush($pipes[0]); // flush buffered data, write to stream
        usleep(200);
        $status = proc_get_status($process);
        while (($out = fread($pipes[1], 1024)) && !feof($pipes[1])) {
            $output .= $out;
        }
    } while ($status['running'] && ($input = array_shift($allInput)));
    // proc_close & fclose calls here
}
Now, seeing as I don't know what it is exactly you are trying to do, this code will need to be tweaked quite a bit. You may, for example, find yourself having to set the STDIN and STDOUT pipes as non-blocking.
It's a simple matter of adding this, right after calling proc_open, though:
stream_set_blocking($pipes[0], 0);
stream_set_blocking($pipes[1], 0);
Play around, have fun, and perhaps let me know if this answer was helpful in any way...
My guess would be that you're doing everything correctly, except that the binary is never notified that it has received all the input and can start to work. By closing STDIN, you're kicking off the work process, because it's clear that there will be no more input. If you're not closing STDIN, the binary is waiting for more input, while your side is waiting for its output.
You probably need to end your input with a newline or whatever other protocol action is expected of you. Or perhaps closing STDIN is the action that's expected of you. Unless the process is specifically created to stay open and continue to stream input, you can't make it do it. If the process reads all input, processes it, returns output and then quits, there's no way you can make it stay alive to process more input later. If the process explicitly supports that behaviour, there should be a definition on how you need to delimit your input.

proc_open leaves zombie process

The following script monitors /dev/shm/test for new files and outputs info about them in real time.
The problem is that when the user closes the browser, an inotifywait process remains running, and so on.
Is there any way to avoid this?
<?php
$descriptorspec = array(
    0 => array("pipe", "r"), // stdin is a pipe that the child will read from
    1 => array("pipe", "w"), // stdout is a pipe that the child will write to
    2 => array("pipe", "w")  // stderr is a pipe that the child will write to
);
$process = proc_open('inotifywait -mc -e create /dev/shm/test/', $descriptorspec, $pipes);
if (is_resource($process)) {
    header("Content-type: text/html;charset=utf-8;");
    ob_end_flush(); // ends the automatic ob started by PHP
    while ($s = fgets($pipes[1])) {
        print $s;
        flush();
    }
    fclose($pipes[1]);
    fclose($pipes[0]);
    fclose($pipes[2]);
    // It is important that you close any pipes before calling
    // proc_close in order to avoid a deadlock
    $return_value = proc_close($process);
    echo "command returned $return_value\n";
}
?>
That's because inotifywait waits until changes happen in /dev/shm/test/, then outputs diagnostic information on standard error and event information on standard output. Meanwhile, fgets() blocks until it can read a line: reading ends when $length - 1 bytes (the 2nd parameter) have been read, on a newline (which is included in the return value), or on EOF, whichever comes first. If no length is specified, it keeps reading from the stream until it reaches the end of the line.
So basically, you should read data from the child process's stdout pipe in non-blocking mode with stream_set_blocking($pipes[1], 0), or check manually whether there is data on that pipe with stream_select().
Also, you need to ignore user aborts with ignore_user_abort(true).
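A sketch of that combination (the 1-second timeout, the keep-alive write, and the final cleanup are illustrative assumptions, not part of the original answer):
ignore_user_abort(true);
stream_set_blocking($pipes[1], 0);

while (true) {
    $read = array($pipes[1]);
    $write = $except = array();
    // Wait at most 1 second for new inotifywait output.
    if (stream_select($read, $write, $except, 1)) {
        echo stream_get_contents($pipes[1]);
    }

    // connection_aborted() is only updated when output is actually pushed to the
    // client, so send something trivial on every pass before checking it.
    echo "\n";
    flush();
    if (connection_aborted()) {
        proc_terminate($process);
        break;
    }
}

fclose($pipes[0]);
fclose($pipes[1]);
fclose($pipes[2]);
proc_close($process);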
As inotifywait runs as its own process that basically never ends, you need to send it a KILL signal. If you run the script on the CLI, the Ctrl+C signal is sent to the inotifywait process too, but you don't have that when running under the webserver.
You can send the signal from a function registered with register_shutdown_function, or from __destruct in a class.
This simple wrapper around proc_open could help:
class Proc
{
    private $_process;
    private $_pipes;

    public function __construct($cmd, $descriptorspec, $cwd = null, $env = null)
    {
        $this->_process = proc_open($cmd, $descriptorspec, $this->_pipes, $cwd, $env);
        if (!is_resource($this->_process)) {
            throw new Exception("Command failed: $cmd");
        }
    }

    public function __destruct()
    {
        if ($this->isRunning()) {
            $this->terminate();
        }
    }

    public function pipe($nr)
    {
        return $this->_pipes[$nr];
    }

    public function terminate($signal = 15)
    {
        $ret = proc_terminate($this->_process, $signal);
        if (!$ret) {
            throw new Exception("terminate failed");
        }
    }

    public function close()
    {
        return proc_close($this->_process);
    }

    public function getStatus()
    {
        return proc_get_status($this->_process);
    }

    public function isRunning()
    {
        $st = $this->getStatus();
        return $st['running'];
    }
}
$descriptorspec = array(
    0 => array("pipe", "r"), // stdin is a pipe that the child will read from
    1 => array("pipe", "w"), // stdout is a pipe that the child will write to
    2 => array("pipe", "w")  // stderr is a pipe that the child will write to
);
$proc = new Proc('inotifywait -mc -e create /dev/shm/test/', $descriptorspec);

header("Content-type: text/html;charset=utf-8;");
ob_end_flush(); // ends the automatic ob started by PHP

$pipe = $proc->pipe(1);
while ($s = fgets($pipe)) {
    print $s;
    flush();
}
fclose($pipe);

$return_value = $proc->close();
echo "command returned $return_value\n";
Or you could use the Symfony Process Component which does exactly the same (plus other useful things)
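For example, a minimal sketch with Symfony Process (assuming a recent version of the component is installed via Composer; the command and path are taken from the question, and the polling interval is an arbitrary choice):
use Symfony\Component\Process\Process;

$process = new Process(['inotifywait', '-mc', '-e', 'create', '/dev/shm/test/']);
$process->setTimeout(null); // inotifywait runs until we stop it
$process->start();

while ($process->isRunning()) {
    echo $process->getIncrementalOutput();
    flush();
    if (connection_aborted()) {
        $process->stop(); // sends SIGTERM, then SIGKILL if needed
        break;
    }
    usleep(200000);
}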
You can use ignore_user_abort to specify that the script should not stop executing when the user closes the browser window. That will solve half of the problem, so you also need to check if the window was closed inside your loop with connection_aborted to determine when you need to shut down everything in an orderly manner:
header("Content-type: text/html;charset=utf-8;");
ignore_user_abort(true);
ob_end_flush(); //ends the automatic ob started by PHP
while ($s = fgets($pipes[1])) {
print $s;
flush();
if (connection_aborted()) {
proc_terminate($process);
break;
}
}
Does this help?
$proc_info = proc_get_status($process);
pcntl_waitpid($proc_info['pid'], $status);

Open a terminal session from PHP

I'm trying to write a PHP page that calls a server program such as
gdb
The problem is that if I do
<?php
exec(" gdb code", $out);
?>
PHP runs the command and exits.
BUT what I want is to open something like a "terminal" session
where the user enters commands into that program, like
gdb code
..
break main
..
run
and after each command I give them the output and they give me the next command.
It does not work if I do it like this:
<?php
exec(" gdb code", $out);
exec(" break", $out);
exec(" run", $out);
?>
The PHP page can be run from a browser.
I also tried it with proc_open:
<?php
$descriptorspec = array(
    0 => array("pipe", "r"),
    1 => array("pipe", "w"),
    2 => array("file", "/var/www/err.log", "a")
);
$cwd = '/var/www';
$env = array('some_option' => 'aeiou');
$StdErr = '';
$process = proc_open('/bin/bash', $descriptorspec, $pipes, $cwd, $env);
if (is_resource($process)) {
    fwrite($pipes[0], "gcc code ");
    fwrite($pipes[0], " break main");
    fflush($pipes[0]);
    fclose($pipes[0]);
    while (!feof($pipes[1])) {
        echo fgets($pipes[1], 1024);
    }
    echo $StdErr;
    fclose($pipes[1]);
    $return_value = proc_close($process);
    echo "command returned : $return_value\n";
}
Thank you.
Edit: I just saw that you do try it from a browser. There is absolutely no simple way to do this. If you want an interactive session from the browser, you must run a separate daemon process and forward commands to it from PHP (and return the output).
This is not simple at all, so if you still feel like doing this, I would recommend starting with how to create a daemon, and then writing a TCP socket server (or other IPC). A rough sketch of that architecture is below.
Excuse the crappy grammar
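A very rough sketch of that daemon idea, purely for illustration (the port number, the one-command-per-connection protocol, and the fixed usleep() are all assumptions; a real setup would likely use gdb's machine interface or a pty rather than plain pipes):
<?php
// gdb_daemon.php -- started once from the CLI; keeps a single gdb session open
// and relays one command per TCP connection to it.
$descriptorspec = array(
    0 => array("pipe", "r"),
    1 => array("pipe", "w"),
    2 => array("pipe", "w"),
);
$gdb = proc_open('gdb code', $descriptorspec, $pipes);
if (!is_resource($gdb)) {
    exit(1);
}
stream_set_blocking($pipes[1], false);

$server = stream_socket_server('tcp://127.0.0.1:8642', $errno, $errstr);

while (true) {
    $client = @stream_socket_accept($server, 300); // wait up to 5 minutes per accept
    if ($client === false) {
        continue;                                  // nobody connected yet, keep waiting
    }
    $command = fgets($client);                     // one gdb command per connection
    if ($command !== false) {
        fwrite($pipes[0], rtrim($command) . "\n");
        fflush($pipes[0]);
        usleep(300000);                            // crude: give gdb a moment to answer
        while (($line = fgets($pipes[1])) !== false) {
            fwrite($client, $line);                // send gdb's output back
        }
    }
    fclose($client);                               // closing tells the client we are done
}
The PHP page would then just connect and forward one command at a time:
$sock = stream_socket_client('tcp://127.0.0.1:8642');
fwrite($sock, "break main\n");
echo stream_get_contents($sock); // reads until the daemon closes the connection
fclose($sock);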
