So I open a process with $process = proc_open("my_process", $descriptors, $pipes);
Then I write to the stdin of the process using fwrite($pipes[0], "some_command");
Then I have to close the pipe using fclose($pipes[0]); before I can read from the process's stdout using $output = stream_get_contents($pipes[1]);. If I don't close the pipe, my PHP script hangs on this call.
But once I have received the output from stdout, what if I want to send another command to the process? The stdin pipe is closed, so I have no way to send it. So is it possible to somehow send another command to the process?
It sounds like the other process is blocking waiting for EOL or EOF on STDIN. What are you trying to execute?
Regardless, there's a pretty good chance this will sort it out: Just append \n to the command you are sending to the other process.
E.g.
$process = proc_open("my_process", $descriptors, $pipes);
$command = "some_command";
fwrite($pipes[0], $command."\n");
// Fetch the contents of STDOUT
Now, one issue that you may also be running into is to do with the fact that you are using stream_get_contents() - which will wait for EOF before it returns. You may have to be a bit more intelligent about how you retrieve the data from $pipes[1], using fgets() and looking for a specific number of lines or a string that indicates the end of the output.
If you tell us what you are executing, I may be able to give you a more specific answer.
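For example, an interactive exchange might look roughly like this. This is only a sketch: "my_process", "another_command" and the "READY" terminator line are placeholders for whatever your program actually expects and prints at the end of each reply.
<?php
// Keep stdin open and read replies line by line instead of waiting for EOF.
$descriptors = array(
    0 => array("pipe", "r"),  // child's stdin
    1 => array("pipe", "w"),  // child's stdout
);
$process = proc_open("my_process", $descriptors, $pipes);

function send_command($pipes, $command) {
    // Terminate the command with a newline so the child sees a complete line.
    fwrite($pipes[0], $command . "\n");

    // Read the reply line by line until the (hypothetical) terminator appears.
    $reply = '';
    while (($line = fgets($pipes[1])) !== false) {
        if (rtrim($line) === "READY") {
            break;
        }
        $reply .= $line;
    }
    return $reply;
}

$first  = send_command($pipes, "some_command");
$second = send_command($pipes, "another_command"); // stdin is still open

fclose($pipes[0]);
fclose($pipes[1]);
proc_close($process);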
Related
I am a PHP beginner. I want to invoke an external Unix command, pipe some stuff into it (e.g., string and files), and have the result appear in my output buffer (the browser).
Consider the following:
echo '<h1>stuff for my browser output window</h1>';
$fnmphp= '/tmp/someinputfile';
$sendtoprogram = "myheader: $fnmphp\n\n".file_get_contents($fnmphp);
popen2outputbuf("unixprogram < $sendtoprogram");
echo '<p>done</p>';
An even better solution would let PHP write myheader (into Unix program), then pipe the file $fnmphp (into Unix program); and the output of unixprogram would immediately go to my browser output buffer.
I don't think PHP exposes stdout in a way that would make my Unix program's STDOUT output end up in the browser. Otherwise, this would happen by default if I used system(). I can only think of solutions that require writing tempfiles.
I think I am standing on the line here (German idiom; wires crossed)--- this probably has an obvious solution.
Update:
Here is the entirely inelegant but pretty precise solution that I want to replace:
function pipe2eqb( $text ) {
    $whatever = '/tmp/whatever-'.time().'-'.$_SESSION['uid'];
    $inf  = "$whatever.in";
    $outf = "$whatever.out";
    assert(!file_exists($inf));
    assert(!file_exists($outf));
    file_put_contents($inf, $text);
    assert(file_exists($inf));
    system("unixprog < $inf > $outf");
    $fo = file_get_contents($outf);
    unlink($inf);
    unlink($outf);
    return $fo;
}
It is easy to replace either the input or the output, but I want to replace both. I will post a solution when I figure it out.
The best way to do this is the proc_open family of functions:
<?php
$descriptorspec = array(
    0 => array("pipe", "r"), // stdin
    1 => array("pipe", "w"), // stdout
    2 => array("pipe", "w")  // stderr
);
$cwd = NULL; // '/tmp';
$env = NULL; // array();
$cmd = 'unixprog ';
$process = proc_open($cmd, $descriptorspec, $pipes, $cwd, $env);
assert(false !== $process);
Now, to give arguments to unixprog, do something like:
$cmd = 'unixprog --arg1='.escapeshellarg($arg1).' --arg2='.escapeshellarg($arg2);
To talk to the program's stdin, do something like:
assert(strlen($stdinmessage) === fwrite($pipes[0], $stdinmessage));
To read from the process's stdout, do something like:
$stdout = stream_get_contents($pipes[1]);
To read from the process's stderr, do something like:
$stderr = stream_get_contents($pipes[2]);
To check if the program has finished, do something like:
$status = proc_get_status($process);
if ($status['running']) { echo 'child process is still running.'; }
To check the return code of the process when it has finished:
echo 'return code from child process: '.$status['exitcode'];
To wait for the child process to finish, you CAN do:
while (proc_get_status($process)['running']) { sleep(1); }
This is a quick and easy way to do it, but it is not optimal. tl;dr: it may be slow or waste CPU. Long version:
There is some nigh-optimal, event-driven way to do this, but I'm not sure how to do it. Imagine having to run a program 10 times, but the program executes in 100 milliseconds: this code would use 10 seconds, while optimal code would use only 1 second. You can use usleep() for microseconds, but it's still not optimal: imagine you're checking every 100 microseconds, but the program takes 10 seconds to execute; you would waste CPU, checking the status 100,000 times, while optimal code would only check it once. I'm sure there is a fancy way to let PHP sleep until the process finishes with some callback/signal, perhaps with stream_select, but I've yet to solve it. (If anybody has the solution, please let me know!)
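For what it's worth, here is a rough, untested sketch of how the stream_select idea might look: block on the child's stdout until data arrives or the pipe closes, instead of polling proc_get_status in a sleep loop. It assumes the $process and $pipes created above.
// Wait on the child's stdout instead of sleeping blindly between status checks.
while (true) {
    $status = proc_get_status($process);
    if (!$status['running']) {
        break; // the child has exited; $status['exitcode'] is now meaningful
    }
    $read   = array($pipes[1]);
    $write  = NULL;
    $except = NULL;
    // Sleep inside the kernel until stdout has data (1-second timeout so the
    // process status is re-checked periodically).
    if (stream_select($read, $write, $except, 1) > 0) {
        $chunk = fread($pipes[1], 8192); // drain output so the child never blocks on a full pipe
        // ... buffer or process $chunk here ...
    }
}
// Collect anything still buffered after the child exits.
$rest = stream_get_contents($pipes[1]);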
-- read more at http://php.net/manual/en/book.exec.php
I'm thinking about making a PHP script which opens the Stockfish chess engine CLI, sends a few commands and gets back the output.
I think I can achieve this by using proc_open with a pipes array, but I can't figure out how to wait for the whole output... If there's a better solution, it's appreciated!
Here's my code:
// ok, define the pipes
$descr = array(
    0 => array("pipe", "r"),
    1 => array("pipe", "w"),
    2 => array("pipe", "w")
);
$pipes = array();
// open the process with those pipes
$process = proc_open("./stockfish", $descr, $pipes);
// check if it's running
if (is_resource($process)) {
    // send first universal chess interface command
    fwrite($pipes[0], "uci");
    // send analysis (5 seconds) command
    fwrite($pipes[0], "go movetime 5000");
    // close read pipe or STDOUTPUT can't be read
    fclose($pipes[0]);
    // read and print all output comes from the pipe
    while (!feof($pipes[1])) {
        echo fgets($pipes[1]);
    }
    // close the last opened pipe
    fclose($pipes[1]);
    // at the end, close the process
    proc_close($process);
}
The process seems to start, but the second command I write to its STDIN doesn't seem to wait until the analysis finishes: it should produce analysis lines for about 5 seconds, yet the result is printed immediately.
How can I sort this out?
CLI link, CLI documentation link
Please, ask me for more information about this engine if you need.
Thank you!
fwrite($pipes[0], "uci/n");
fwrite($pipes[0], "go movetime 5000/n");
Without /n Stockfish see this as one command (ucigo movetime 5000) and don't recognise it.
Actually, your code works. $pipes[1] contained all the output from stockfish...
You might need to change the line
go movetime 5000
to a different number, as the 5000 means 5000 ms = 5 seconds, i.e. the time when the engine stops. Try 10000 and the engine stops after 10 seconds, etc.
You need to remove fclose($pipes[0]); from that spot, check for bestmove in the while cycle, break the cycle when bestmove is found, and only then call fclose($pipes[0]);. That worked for me. Also add the \n separator at the end of the commands.
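For example, a rough sketch of that order of operations, reusing the $descr array from the question:
$process = proc_open("./stockfish", $descr, $pipes);
if (is_resource($process)) {
    // commands are newline-terminated so the engine sees them as separate lines
    fwrite($pipes[0], "uci\n");
    fwrite($pipes[0], "go movetime 5000\n");
    // keep stdin open and read until the engine reports its best move
    while (($line = fgets($pipes[1])) !== false) {
        echo $line;
        if (strpos($line, "bestmove") === 0) {
            break; // analysis finished
        }
    }
    fclose($pipes[0]); // only close stdin once no more commands will be sent
    fclose($pipes[1]);
    proc_close($process);
}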
Thanks for the code!
I'm talking to a process requiring user interaction using the following (PHP 5.3/ Ubuntu 12.04),
$pdes = array(
    0 => array('pipe', 'r'), // child's stdin
    1 => array('pipe', 'w'), // child's stdout
);
$process = proc_open($cmd, $pdes, $pipes);
sleep(1);
if (is_resource($process)) {
    while ($iter-- > 0) {
        $r = array($pipes[1]);
        $w = array($pipes[0]);
        $e = array();
        if (0 < ($streams = stream_select($r, $w, $e, 2))) {
            if ($streams) {
                if ($r) {
                    echo "reading\n";
                    $rbuf .= fread($pipes[1], $rlen); // reading rlen bytes from pipe
                } else {
                    echo "writing\n";
                    fwrite($pipes[0], $wbuf."\n"); // writing to pipe
                    fflush($pipes[0]);
                }
            }
        }
    }
    fclose($pipes[0]);
    fclose($pipes[1]);
    echo "exitcode: ".proc_close($process)."\n";
}
And this is my test program in C,
#include <stdio.h>

int main() {
    char buf[512];
    printf("before input\n");
    scanf("%s", buf);
    printf("after input\n");
    return 0;
}
Now, the problem is that $r is always empty after stream_select, even if $pipes[1] is set to non-blocking, whereas the write to $pipes[0] never blocks. However, things work fine without stream_select, i.e. if I match reads and writes to the test program,
echo fread($pipes[1],$rlen); //matching printf before input
fwrite($pipes[0],$wbuf."\n"); //matching scanf
fflush($pipes[0]);
echo fread($pipes[1],$rlen); //matching printf after input
I couldn't figure out what's happening here. I'm trying to build something like a web-based terminal emulator. Any suggestions on how to do this are welcome :)
Sorry guys for wasting your time. I figured out the problem a while later (sorry for the late update): the read was blocking due to a race condition.
I write to the process and immediately check the streams for availability. The write never blocks, but the data is not ready for reading yet (somehow it took about 600 ms for even 1 byte of data to become available). So I was able to fix the problem by adding sleep(1) at the end of the write block.
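In code, that fix amounts to something like this in the write branch of the loop above (sketch only):
echo "writing\n";
fwrite($pipes[0], $wbuf."\n"); // writing to pipe
fflush($pipes[0]);
sleep(1); // give the child time to produce output before the next stream_select()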
I have a variable that contains a long string. (specifically it contains a few kilobytes of javascript-code)
I want to pass this string trough an external command, in this case a javascript-compressor, and capture the output of the external command (the compressed javascript) in php, assigning it to a variable.
I'm aware that there's classes for compressing javascript in php, but this is merely one example of a general problem.
originally we used:
$newvar = passthru("echo $oldvar | compressor");
This works for small strings, but is insecure. (If $oldvar contains characters with special meaning to the shell, then anything could happen.)
Escaping with escapeshellarg fixes that, but the solution breaks for longer strings, because of OS limitations on the maximum allowable argument length.
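For reference, the escaped variant looked roughly like this (using shell_exec here so the output actually lands in the variable); it still hits the argument-length limit for large inputs:
$newvar = shell_exec('echo ' . escapeshellarg($oldvar) . ' | compressor');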
I tried using popen("command", "w") and writing to the command - this works, but the output from the command silently disappears into the void.
Conceptually, I just want to do the equivalent of:
$newvar = external_command($oldvar);
Using the proc_open() function you can get handles to both the stdout and the stdin of the process, and thus write your data to it and read the result.
Using rumpel's suggestion, I was able to devise the following solution, which seems to work well. Posting it here for the benefit of anyone else interested in the question.
public static function extFilter($command, $content) {
    $fds = array(
        0 => array("pipe", "r"), // stdin is a pipe that the child will read from
        1 => array("pipe", "w"), // stdout is a pipe that the child will write to
        2 => array("pipe", "w")  // stderr is a pipe that the child will write to
    );
    $stdout = NULL;
    $process = proc_open($command, $fds, $pipes, NULL, NULL);
    if (is_resource($process)) {
        fwrite($pipes[0], $content);
        fclose($pipes[0]);
        $stdout = stream_get_contents($pipes[1]);
        fclose($pipes[1]);
        $stderr = stream_get_contents($pipes[2]);
        fclose($pipes[2]);
        $return_value = proc_close($process);
        // Do whatever you want to do with $stderr and the command's exit code.
    } else {
        // Do whatever you want to do if the command fails to start.
    }
    return $stdout;
}
There may be deadlock issues: if the data you send is larger than the combined sizes of the pipes, then the external command will block, waiting for someone to read from its stdout, while PHP is blocked waiting for its stdin to be read from to make room for more input.
Possibly PHP takes care of this issue somehow, but it's worth testing if you plan to send (or receive) more data than fits in the pipes.
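If that does turn out to be a problem, one possible (untested) variation is to write the input in chunks and drain stdout/stderr as you go, so neither side ever fills its pipe. A sketch:
// Interleave writing $content with reading the output so neither pipe fills up.
public static function extFilterChunked($command, $content) {
    $fds = array(
        0 => array("pipe", "r"),
        1 => array("pipe", "w"),
        2 => array("pipe", "w")
    );
    $process = proc_open($command, $fds, $pipes);
    if (!is_resource($process)) {
        return NULL;
    }
    // Non-blocking reads let us grab whatever is available without waiting for EOF.
    stream_set_blocking($pipes[1], false);
    stream_set_blocking($pipes[2], false);

    $stdout = '';
    $stderr = '';
    $offset = 0;
    $length = strlen($content);
    while ($offset < $length) {
        $written = fwrite($pipes[0], substr($content, $offset, 8192));
        if ($written === false) {
            break;
        }
        $offset += $written;
        // Drain whatever the child has produced so far.
        $stdout .= stream_get_contents($pipes[1]);
        $stderr .= stream_get_contents($pipes[2]);
    }
    fclose($pipes[0]);

    // Collect the rest of the output once stdin is closed.
    stream_set_blocking($pipes[1], true);
    stream_set_blocking($pipes[2], true);
    $stdout .= stream_get_contents($pipes[1]);
    $stderr .= stream_get_contents($pipes[2]);
    fclose($pipes[1]);
    fclose($pipes[2]);
    proc_close($process);
    return $stdout;
}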
I need to turn HTML into equivalent Markdown-structured text.
Note: quick and clear way of doing this with PHP & Python.
As I am programming in PHP, some people pointed to Markdownify to do the job, but unfortunately the code is not being updated and in fact it is not working. At sourceforge.net/projects/markdownify there is a "NOTE: unsupported - do you want to maintain this project? contact me! Markdownify is a HTML to Markdown converter written in PHP. See it as the successor to html2text.php since it has better design, better performance and less corner cases."
From what I could discover, I have only two good choices:
Python: Aaron Swartz's html2text.py
Ruby: Singpolyma's html2markdown.rb, based on Nokogiri
So, from PHP, I need to pass the HTML code, call the Ruby/Python Script and receive the output back.
(By the way, someone asked a similar question here ("how to call ruby script from php?") but with no practical information for my case.)
Following the Tin Man's tip (below), I got to this:
PHP code:
$t = '<p><b>Hello</b><i>world!</i></p>';
$scaped = preg_quote($t, "/");
$program = 'python html2md.py';
//exec($program.' '.$scaped, $n); print_r($n); exit; //Works!!!
$input = $t;
$descriptorspec = array(
    array('pipe', 'r'), // stdin is a pipe that the child will read from
    array('pipe', 'w'), // stdout is a pipe that the child will write to
    array('file', './error-output.txt', 'a') // stderr is a file to write to
);
$process = proc_open($program, $descriptorspec, $pipes);
if (is_resource($process)) {
    fwrite($pipes[0], $input);
    fclose($pipes[0]);
    $r = stream_get_contents($pipes[1]);
    fclose($pipes[1]);
    $return_value = proc_close($process);
    echo "command returned $return_value\n";
    print_r($pipes);
    print_r($r);
}
Python code:
#! /usr/bin/env python
import html2text
import sys
print html2text.html2text(sys.argv[1])
#print "Hi!" #works!!!
With the above I am getting this:
command returned 1
Array
(
    [0] => Resource id #17
    [1] => Resource id #18
)
And the "error-output.txt" file says:
Traceback (most recent call last):
File "html2md.py", line 5, in
print html2text.html2text(sys.argv1)
IndexError: list index out of range
Any ideas???
Ruby code (still being analysed)
#!/usr/bin/env ruby
require_relative 'html2markdown'
puts HTML2Markdown.new("<h1>#{ ARGF.read }</h1>").to_s
Just for the record, I tried before to use PHP's simplest exec(), but I got some problems with special characters that are very common in HTML.
PHP code:
echo exec('./hi.rb');
echo exec('./hi.py');
Ruby code:
#!/usr/bin/ruby
puts "Hello World!"
Python code:
#!/usr/bin/python
import sys
print sys.argv[1]
Both working fine. But when the string is a bit more complicated:
$h='<p><b>Hello</b><i>world!</i></p>';
echo exec("python hi.py $h");
It did not work at all.
That's because the HTML string needed to have its special characters escaped. I got it working using this:
$t='<p><b>Hello</b><i>world!</i></p>';
$scaped=preg_quote($t,"/");
Now it works like I said here.
I am running:
Fedora 14
ruby 1.8.7
Python 2.7
perl 5.12.2
PHP 5.3.4
nginx 0.8.53
Have PHP open the Ruby or Python script via proc_open, piping the HTML into STDIN in the script. The Ruby/Python script reads and processes the data and returns it via STDOUT back to the PHP script, then exits. This is a common way of doing things via popen-like functionality in Perl, Ruby or Python and is nice because it gives you access to STDERR in case something blows chunks and doesn't require temp files, but it's a bit more complex.
Alternate ways of doing it could be writing the data from PHP to a temporary file, then using system, exec, or something similar to call the Ruby/Python script to open and process it, and print the output using their STDOUT.
EDIT:
See #Jonke's answer for "Best practices with STDIN in Ruby?" for examples of how simple it is to read STDIN and write to STDOUT with Ruby. "How do you read from stdin in python" has some good samples for that language.
This is a simple example showing how to call a Ruby script, passing a string to it via PHP's STDIN pipe, and reading the Ruby script's STDOUT:
Save this as "test.php":
<?php
$descriptorspec = array(
    0 => array("pipe", "r"),  // stdin is a pipe that the child will read from
    1 => array("pipe", "w"),  // stdout is a pipe that the child will write to
    2 => array("file", "./error-output.txt", "a") // stderr is a file to write to
);
$process = proc_open('ruby ./test.rb', $descriptorspec, $pipes);
if (is_resource($process)) {
    // $pipes now looks like this:
    // 0 => writeable handle connected to child stdin
    // 1 => readable handle connected to child stdout
    // Any error output will be appended to ./error-output.txt
    fwrite($pipes[0], 'hello world');
    fclose($pipes[0]);
    echo stream_get_contents($pipes[1]);
    fclose($pipes[1]);
    // It is important that you close any pipes before calling
    // proc_close in order to avoid a deadlock
    $return_value = proc_close($process);
    echo "command returned $return_value\n";
}
?>
Save this as "test.rb":
#!/usr/bin/env ruby
puts "<b>#{ ARGF.read }</b>"
Running the PHP script gives:
Greg:Desktop greg$ php test.php
<b>hello world</b>
command returned 0
The PHP script opens the Ruby interpreter, which opens the Ruby script. PHP then sends "hello world" to it. Ruby wraps the received text in bold tags and outputs it, which is captured by PHP and then output. There are no temp files, nothing is passed on the command line, you could pass a LOT of data if need be, and it would be pretty fast. Python or Perl could easily be used instead of Ruby.
EDIT:
If you have:
HTML2Markdown.new('<h1>HTMLcode</h1>').to_s
as sample code, then you could begin developing a Ruby solution with:
#!/usr/bin/env ruby
require_relative 'html2markdown'
puts HTML2Markdown.new("<h1>#{ ARGF.read }</h1>").to_s
assuming you've already downloaded the HTML2Markdown code and have it in the current directory and are running Ruby 1.9.2.
In Python, have PHP pass the var as a command line argument, get it from sys.argv (the list of command line arguments passed to Python), and then have Python print the output, which PHP then echoes. Example:
#!/usr/bin/python
import sys
print "Hello ", sys.argv[1] # 2nd element, since the first is the script name
PHP:
<?php
echo exec('python script.py Rafe');
?>
The procedure should be basically the same in Ruby.
Use a variable in the Ruby code, and pass it in as an argument to the Ruby script from the PHP code. Then, have the Ruby script return the processed code into stdout which PHP can read.
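On the PHP side that could be as simple as something like this (html2md.rb is a hypothetical script name):
// Pass the HTML as a shell-escaped argument and capture the script's stdout.
$markdown = shell_exec('ruby html2md.rb ' . escapeshellarg($html));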
I think your question is wrong. Your problem is how to convert from HTML to Markdown. Am I right?
Try this http://milianw.de/projects/markdownify/ I think it could help you =)
Another, very weird approach is the one I used:
PHP file -> writes output.txt
Ruby file -> reads from output.txt
Ruby file -> writes result.txt
PHP file -> reads from result.txt
Simply add exec('rubyfile.rb');
Not recommended, but this will work for sure.
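For the record, the PHP side of that hand-off could look roughly like this (file names and rubyfile.rb as above, purely illustrative):
// Temp-file hand-off: write the input, run the Ruby script, read the result back.
file_put_contents('output.txt', $html);     // 1. PHP writes the input
exec('ruby rubyfile.rb');                   // 2. the Ruby script reads output.txt and writes result.txt
$result = file_get_contents('result.txt');  // 3. PHP reads the result back
echo $result;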