How to pipe to a PHP process using the proc_open() function?

In this case there are two PHP files, stdin.php (the child process component) and proc_open.php (the parent process component), both stored in the same folder under the public_html of a domain. There is also Program X, which pipes data into stdin.php.
stdin.php (this component works)
It is a process that should not be executed through a browser, because it is destined to receive input from Program X and back it all up in a file (named stdin_backup). This process is working: every time Program X pipes input, the process backs it up entirely. If this process is executed without input being passed (as is the case when executing it from a browser), the process creates a file (named stdin_error) with the text "ERROR".
Below, part of the code is omitted because the process works (as mentioned above). The code shown is just to illustrate:
#!/usr/bin/php -q
<?php
// Get the input
$fh_stdin = fopen('php://stdin', 'rb');
if (is_resource($fh_stdin)) {
    // ... Code that backs up STDIN in a file ...
} else {
    // ... Code that logs "ERROR" in a file ...
}
?>
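For illustration only, a hypothetical sketch of what those omitted branches might contain, assuming the stdin_backup and stdin_error filenames from the description (the real code is not shown in the question):
// Hypothetical sketch; the actual backup/error code is omitted in the question.
if (is_resource($fh_stdin)) {
    // Back up everything received on STDIN (filename taken from the description)
    file_put_contents(__DIR__ . '/stdin_backup', stream_get_contents($fh_stdin));
} else {
    // Log the failure (filename taken from the description)
    file_put_contents(__DIR__ . '/stdin_error', 'ERROR');
}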
proc_open.php (this component isn't working)
It is a process that must be executed through a browser, and is destined to pass input to stdin.php, just as Program X does. This process is failing: every time it is executed, there is no sign of stdin.php being executed: no stdin_error file, no stdin_backup file, not even a PHP error_log file.
Code:
// Execute a PHP process and pipe input to it
// - Specify process command
$process_path = __DIR__ . '/stdin.php';
$process_execution_command = 'php ' . $process_path;
// - Specify process descriptor
$process_descriptor = array(
    0 => array('pipe', 'r') // To child's STDIN
);
// - Specify pipes container
$pipes = [];
// - Open process
$process = proc_open($process_execution_command, $process_descriptor, $pipes);
if (is_resource($process)) {
    // - Send data to the process STDIN
    $process_STDIN = 'Data to be received by the process STDIN';
    fwrite($pipes[0], $process_STDIN);
    fclose($pipes[0]);
    // - Close process
    $process_termination_status = proc_close($process);
}
I am not sure the command passed to proc_open() is correct, because I have not found examples for this case, and as mentioned above, this script is failing. I don't know what else could be incorrect in proc_open.php.
Also, when I execute the proc_open.php process, an infinite loop occurs, printing the following string over and over:
X-Powered-By: PHP/5.5.20 Content-type: text/html; charset=UTF-8
I tried popen('php ' . __DIR__ . '/stdin.php', 'w') instead, and got exactly the same result: the infinite loop printing the same string as above, no errors, no logs, and no sign of stdin.php executing.

If I understand your question correctly, you want to open a process and write data into that process' STDIN stream. You can use the proc_open function for that:
$descriptors = array(
    0 => array("pipe", "r"), // STDIN
    1 => array("pipe", "w"), // STDOUT
    2 => array("pipe", "w")  // STDERR
);
$proc = proc_open("php script2.php", $descriptors, $pipes);
fwrite($pipes[0], "Your data here...");
fclose($pipes[0]);
$stdout = stream_get_contents($pipes[1]);
$stderr = stream_get_contents($pipes[2]);
fclose($pipes[1]);
fclose($pipes[2]);
$exitCode = proc_close($proc);
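Applied to the stdin.php from your question, reusing the $descriptors array above, a sketch might look like this (the escapeshellarg() call is my addition, not something from your original code):
$script = __DIR__ . '/stdin.php';
$proc = proc_open('php ' . escapeshellarg($script), $descriptors, $pipes);
if (is_resource($proc)) {
    fwrite($pipes[0], 'Data to be received by the process STDIN');
    fclose($pipes[0]);
    $stdout = stream_get_contents($pipes[1]); // capture the child's output instead of inheriting it
    $stderr = stream_get_contents($pipes[2]);
    fclose($pipes[1]);
    fclose($pipes[2]);
    $exitCode = proc_close($proc);
}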
If you simply want to test your 2nd script, it would probably be easier to use a shell command:
$ echo "your data" | php script2.php
Or alternatively,
$ php script2.php < datafile.txt
Update, accounting for your edit to the question
When using the popen function, you can open the process either for reading or for writing. That allows you to read the process' STDOUT stream or write into its STDIN stream (but not both; if that's a requirement, you'll need to use proc_open). If you want to write into the STDIN stream, pass "w" as the 2nd parameter to popen to open the process for writing:
$fh_pipe = popen(
    'php script1.php',
    'w' // <- "w", not "r"!
);
fwrite($fh_pipe, 'EMAIL TEXT');
pclose($fh_pipe);
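For contrast, a minimal sketch of the read direction ("r"), where you consume the process' STDOUT instead (script1.php here stands in for whatever command you run):
$fh_pipe = popen('php script1.php', 'r'); // "r": read the process' STDOUT
$output = '';
while (!feof($fh_pipe)) {
    $output .= fread($fh_pipe, 8192); // accumulate the child's output
}
pclose($fh_pipe);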

Related

PHP SSH exec() failure: evidently dependent on exported variable size

I'm sending a rather large string (stored in variable "$data") to a remote server with an SSH command in the format as follows:
$command= "ssh -l user subdomain.domain.com bin/prog <<<'$data'";
exec($command, $response, $status);
If the data is less than 130KB, the execution behaves normally, and I get the expected response from my server. However, if $data is greater than about 130KB, the execution of the command immediately returns a NULL array as a response.
I'm not too sure what's happening here. At first I suspected a time-out issue, but the command fails immediately once $data passes a certain size; the size of $data appears to be the only determining factor. It takes about 30-35 seconds to execute the command when the data is less than 130KB.
Any input is appreciated. If more information is required, please let me know.
The core issue here had to do with the exec() function buffer.
The command we were executing via exec() to send data over SSH was essentially one incredibly long string. PHP has a string size limit of around 2GB. While SSH itself appears to be able to transmit an arbitrary amount of data, our exec()'s buffer appears to fail around 130KB. Under these assumptions, we switched to the PHP function proc_open():
$descriptorspec = array(
    0 => array("pipe", "r"), // stdin is a pipe that the child will read from
    1 => array("pipe", "w"), // stdout is a pipe that the child will write to
    2 => array("file", "/tmp/error-output.txt", "a") // stderr is a file to write to
);
$cwd = '/tmp';
// $env = array('some_option' => 'aeiou'); // Our environment variables are loaded in the shell on login.
$env = NULL;
$process = proc_open('ssh -l user subdomain.domain.com bin/prog', $descriptorspec, $pipes, $cwd, $env);
if (is_resource($process)) {
    // $pipes now looks like this:
    // 0 => writeable handle connected to child stdin
    // 1 => readable handle connected to child stdout
    // Any error output will be appended to /tmp/error-output.txt
    foreach ($dataset as $dataline) {
        fwrite($pipes[0], $dataline . "\n"); // Feed data down the pipe
    }
    fclose($pipes[0]);
    echo stream_get_contents($pipes[1]); // echo out the reply from bin/prog
    fclose($pipes[1]);
    // It is important that you close any pipes before calling
    // proc_close in order to avoid a deadlock
    $return_value = proc_close($process);
    echo "command returned $return_value\n";
}
This procedure is straight out of the manual.

proc_open hangs when trying to read from a stream

I've encountered an issue with proc_open on Windows when trying to convert a wmv file (to flv) using ffmpeg; however, I suspect I'll encounter the same scenario whenever certain conditions occur.
Basically my code is as follows:
$descriptorspec = array(
    array("pipe", "r"),
    array("pipe", "w"),
    array("pipe", "w")
);
$pipes = array();
$procedure = proc_open('cd "C:/Program Files/ffmpeg/bin" && "ffmpeg.exe" -i "C:/wamp/www/project/Wildlife.wmv" -deinterlace -qdiff 2 -ar 22050 "C:/wamp/www/project/Wildlife.flv"', $descriptorspec, $pipes);
var_dump(stream_get_contents($pipes[1]));
Now, this code will cause PHP to hang indefinitely (it doesn't matter if I use fgets or stream_select instead of stream_get_contents; the behavior is consistent).
The reason for it (I suspect) is that, while the STDOUT stream is opened successfully, the process doesn't write anything to it (even though running the same command in cmd displays output). As such, trying to read from such a stream causes the same issue as described here: PHP waits for the stream to have anything in it, and the process never writes anything to it.
However (additional fun), setting stream_set_timeout or stream_set_blocking doesn't have any effect.
As such, can somebody confirm or deny what is going on and, if possible, show how I can cater for such a situation? I've looked at the PHP bug tracker, and all the proc_open hang reports seem to be fixed.
For time being I've implemented such solution:
$timeout = 60;
while (true) {
    sleep(1);
    $status = proc_get_status($procedure);
    if (!$status['running'] || $timeout == 0) break; // process finished, or we gave up after ~60s
    $timeout--;
}
However, I'd really rather not rely on something like this, as:
- I will have processes that run for longer than a minute; such processes will be falsely reported as being of the above-mentioned type.
- I want to know when ffmpeg has finished converting the video; currently I'll only know that the process is still running after a minute, and I can't really do anything to check whether there's any output (as that would hang PHP).
Also, I don't really want to wait a full minute for the process to be checked (for example, converting the given video from the command line takes <10s), and I'll have videos that take more time to convert.
Per a comment from @Sjon, here's the stream_select I was using, which blocks due to the same issue, STDOUT not being written to:
$descriptorspec = array(
    array("pipe", "r"),
    array("pipe", "w"),
    array("pipe", "w")
);
$pipes = array();
$procedure = proc_open('cd "C:/Program Files/ffmpeg/bin" && "ffmpeg.exe" -i "C:/wamp/www/sandbox/Wildlife.wmv" -deinterlace -qdiff 2 -ar 22050 "C:/wamp/www/sandbox/Wildlife.flv"', $descriptorspec, $pipes);
$read = array($pipes[0]);
$write = array($pipes[1], $pipes[2]);
$except = array();
while (true) {
    if (($num_changed_streams = stream_select($read, $write, $except, 10)) !== false) {
        foreach ($write as $stream) {
            var_dump(stream_get_contents($stream));
        }
        exit;
    } else {
        break;
    }
}
Per conversation with @Sjon, reading from buffered streams on Windows is broken. The solution in the end is to use stream redirection via the shell, and then read the created files, as such:
$descriptorspec = array(
    array("pipe", "r"),
    array("pipe", "w"),
    array("pipe", "w")
);
$pipes = array();
$procedure = proc_open('cd "C:/Program Files/ffmpeg/bin" && "ffmpeg.exe" -i "C:/wamp/www/sandbox/Wildlife.mp4" -deinterlace -qdiff 2 -ar 22050 "C:/wamp/www/sandbox/Wildlife.flv" > C:/stdout.log 2> C:/stderr.log', $descriptorspec, $pipes);
proc_close($procedure);
$output = file_get_contents("C:/stdout.log");
$error = file_get_contents("C:/stderr.log");
unlink("C:/stdout.log");
unlink("C:/stderr.log");
As the pipe is buffered, redirecting to a file gets us the unbuffered output (something I was after as well). And we don't need to check whether the file changes, because the result from the shell is unbuffered and synchronous.
This took some time to reproduce, but I found your problem. The command you run outputs some diagnostics when you run it, but it writes them to stderr, not stdout. The reason for this is explained in man stderr:
Under normal circumstances every UNIX program has three streams opened for it when it starts up, one for input, one for output, and one for printing diagnostic or error messages
If you were reading the streams properly, this wouldn't be an issue; but you call stream_get_contents($pipes[1]) instead. This results in PHP waiting for output on stdout, which never arrives. The fix is simple: read from stderr with stream_get_contents($pipes[2]) instead, and the script will quit immediately after the process ends.
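A minimal sketch of that fix, reusing the descriptor spec and ffmpeg command from the question:
$procedure = proc_open($command, $descriptorspec, $pipes); // same command and spec as above
fclose($pipes[0]);                                         // nothing to send on stdin
$diagnostics = stream_get_contents($pipes[2]);             // ffmpeg reports its progress on stderr
fclose($pipes[1]);
fclose($pipes[2]);
proc_close($procedure);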
To expand on your addition of stream_select to the question: stream_select is not implemented for this case on Windows in PHP; it says so in the manual:
Use of stream_select() on file descriptors returned by proc_open() will fail and return FALSE under Windows.
So if the code posted above doesn't work, I'm not sure what will. Have you considered abandoning your streams solution and reverting to a simple exec() call instead? If you append >%TEMP%/out.log 2>%TEMP%/err.log to your command, you can still read the output from the process, and it might finish quicker (without waiting for the unmodifiable timeout).
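A sketch of that exec() variant (the input/output paths are placeholders of mine; %TEMP% is expanded by the Windows shell):
$cmd = '"C:/Program Files/ffmpeg/bin/ffmpeg.exe" -i "in.wmv" -deinterlace -qdiff 2 -ar 22050 "out.flv"'
     . ' > %TEMP%/out.log 2> %TEMP%/err.log';
exec($cmd, $lines, $exitCode);                            // blocks until ffmpeg finishes
$stdout = file_get_contents(getenv('TEMP') . '/out.log'); // whatever went to stdout
$stderr = file_get_contents(getenv('TEMP') . '/err.log'); // ffmpeg's diagnostics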

Get PHP proc_open() to read a PhantomJS stream for as long as png is created

I had a PHP script that relied on shell_exec() and (as a result) worked 99% of the time.
The script executed a PhantomJS script that produced an image file.
Then using more PHP that image file was processed in a certain way.
The problem was that on occasion shell_exec() would hang and cause usability issues.
Reading this
https://github.com/ariya/phantomjs/issues/11463
I learnt that shell_exec() was the problem and that switching to proc_open() would solve the hanging.
The problem is that while shell_exec() waits for the executed command to finish, proc_open() doesn't, so the PHP commands that follow it and work on the generated image fail, as the image is still being produced.
I'm working on Windows so pcntl_waitpid is not an option.
What I'm trying to do is have PhantomJS continuously output something (anything), so that proc_open() reads it via its stdout pipe; that way I can time the image-processing PHP functions to start working as soon as the target image file is ready.
here is my phantomJS script:
var interval = setInterval(function() {
    console.log("x");
}, 250);

var page = require('webpage').create();
var args = require('system').args;

page.open('http://www.cnn.com', function () {
    page.render('test.png');
    phantom.exit();
});
And my PHP code:
ob_implicit_flush(true);

$descriptorspec = array(
    0 => array("pipe", "r"), // stdin
    1 => array("pipe", "w"), // stdout
    2 => array("pipe", "w")  // stderr
);

$process = proc_open("c:\phantomjs\phantomjs.exe /test.js", $descriptorspec, $pipes);

if (is_resource($process)) {
    while (!feof($pipes[1])) {
        $return_message = fgets($pipes[1], 1024);
        if (strlen($return_message) == 0) break;
        echo $return_message . '<br />';
        ob_flush();
        flush();
    }
}
The test.png is generated, but I am not getting a single $return_message. What am I doing wrong?
As Bill Shander suggested in the github issue you linked, you can use:
Proc_Close(Proc_Open("phantomjs test.js &", Array (), $foo));
to run your phantomjs script (which is based on this answer). It seems that you only need the image, so the pipes are not necessary in this case.
The complete script for reference is here and works on Windows as-is.
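Since nothing is read from the backgrounded process, one way to time the image-processing step (an untested sketch of mine with an arbitrary timeout, not from the original answer) is to wait for test.png to appear:
proc_close(proc_open("phantomjs test.js &", array(), $foo));

$png = 'test.png';       // path rendered by the PhantomJS script
$deadline = time() + 60; // arbitrary timeout
while (!file_exists($png) && time() < $deadline) {
    usleep(250000);      // poll every 250 ms
    clearstatcache();    // file_exists() results are cached
}
if (file_exists($png)) {
    // ... process the image ...
}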

call python script from PHP script

I need to call a Python script from a PHP script and return the result back to PHP.
I am playing with the proc_open() function, but it does not work. Do you know why?
This is the PHP script:
$msg = "this is a new message \n bla ble !##$%^&*%(*))(_+=-";
$descriptorspec = array(
    0 => array("pipe", "r"),
    1 => array("pipe", "w"),
    2 => array("file", "./error.log", "a")
);

$cwd = './';
$command = 'python ./upper.py';
$proc = proc_open($command, $descriptorspec, $pipes, $cwd);

if (is_resource($proc)) {
    fwrite($pipes[0], $msg);
    fclose($pipes[0]);
    fclose($pipes[1]);
    proc_close($proc);
    echo "proc is closed\n";
} else {
    echo 'proc is not a resource';
}
The Python upper.py script:
import sys
print 'in python script'
data = sys.stdin.readlines()
print data
Output is:
in php script
proc is closed
I have this error in error.log:
close failed in file object destructor:
sys.excepthook is missing
lost sys.stderr
Just in case the answer still matters to anyone:
The error occurs because you're closing $pipes[1] immediately, before the python script has a chance to write to it, and quite likely even before it has started running. The error on my system (a Mac) is:
close failed: [Errno 32] Broken pipe
(Just out of curiosity, what type of system are you running on that's giving you that strange message?)
Anyway, you can avoid the error by reading the pipe before closing it, e.g.,
stream_get_contents($pipes[1]);
which guarantees that the python script will get a chance to write to the pipe (at least once) while the pipe is still open.
If you are the author of the python script, there's not much point in having it write output unless your PHP script is going to read what it writes.
On the other hand, if the python script isn't of your making, and it writes output that you don't care about, you might be able to avoid reading the output by playing some games with pcntl_wait() or pcntl_waitpid() -- though I wouldn't bet on it. A safer option is probably to just read the output streams and throw them on the floor.
(Kind of raises an interesting question, whether or not you care about the output: if you don't know a priori what the end of the output will look like, and the subprocess won't exit until you close the pipes and call proc_close(), how will you know when the subprocess has actually finished doing what you called it to do?)
Another option (untested by me so far) might be to connect any output streams you don't care about to a descriptor like:
array("file", "/dev/null", "w")
or whatever the equivalent of /dev/null is on your system.
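Putting that together, a sketch of the corrected sequence for the script in the question, reusing its variables and reading the output before closing the pipe:
$proc = proc_open($command, $descriptorspec, $pipes, $cwd);
if (is_resource($proc)) {
    fwrite($pipes[0], $msg);
    fclose($pipes[0]);
    $output = stream_get_contents($pipes[1]); // let upper.py write before we close
    fclose($pipes[1]);
    proc_close($proc);
    echo $output;
}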

PHP - Blocking File Read

I have a file that is being appended to remotely (file.txt). From SSH, I can call tail -f file.txt, which will display the updated contents of the file. I'd like to be able to make a blocking call to this file that returns the last appended line. A polling loop simply isn't an option. Here's what I'd like:
$cmd = "tail -f file.txt";
$str = exec($cmd);
The problem with this code is that tail will never return. Is there any kind of wrapper function for tail that will kill it once it has returned content? Is there a better way to do this in a low-overhead way?
The only solution I've found is somewhat dirty:
<?php
$descriptorspec = array(
    0 => array("pipe", "r"), // stdin
    1 => array("pipe", "w"), // stdout
    2 => array("pipe", "w")  // stderr
);

$process = proc_open('tail -f -n 0 /tmp/file.txt', $descriptorspec, $pipes);
fclose($pipes[0]);
stream_set_blocking($pipes[1], 1);
$read = fgets($pipes[1]);
fclose($pipes[1]);
fclose($pipes[2]);

// If I try to call proc_close($process); here, it fails / hangs until a second
// line is passed to the file. Hence the inelegant kill in the next two lines:
$status = proc_get_status($process);
exec('kill ' . $status['pid']);
proc_close($process);

echo $read;
tail -n 1 file.txt will always return the last line in the file, but I'm almost sure that what you want instead is for PHP to know when file.txt has a new line and to display it, all without polling in a loop.
You will need a long-running process anyway if it is to check for new content, be it a polling loop that checks the file modification time and compares it to the last modification time saved somewhere else, or any other way.
You can even have PHP run via cron to do the check if you don't want it running in a PHP loop (probably best), or via a shell script that does the loop and calls the PHP file, if you need runs more frequent than cron's one-minute limit.
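For illustration, a hypothetical sketch of such a cron-driven check (the state-file path is my own placeholder): it remembers how much of the file it has already seen and prints anything appended since:
$file  = '/tmp/file.txt';
$state = '/tmp/file.txt.offset'; // hypothetical state file holding the last-read offset
$offset = is_file($state) ? (int) file_get_contents($state) : 0;

clearstatcache();
$size = filesize($file);
if ($size > $offset) {
    $fh = fopen($file, 'rb');
    fseek($fh, $offset);
    echo stream_get_contents($fh); // only the newly appended content
    fclose($fh);
    file_put_contents($state, (string) $size);
}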
Another idea, though I haven't tried it, would be to open the file in a non-blocking stream and then use the quite efficient stream_select on it to have the system poll for changes.
