Running a process with proc_open - PHP

I have a PHP script and I want to call a process written in C from it. There are many ways to do this (system, exec...), but I chose the function proc_open. With it I can open pipes to the C process's stdin and stdout, but I don't know how to read the data from stdin in the C process. Can anyone help me with an example? Thank you.

In C, stdin, stdout and stderr are predefined FILE pointers declared in <stdio.h>. For example, to read from stdin:
#include <stdio.h>

int main(void) {
    int ch = fgetc(stdin); // read one character from stdin
    fputc(ch, stdout);     // write it back to stdout
    //...
    return 0;
}
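On the PHP side, a minimal proc_open sketch for driving a program like the one above could look as follows; "./reader" is only a placeholder name for however you compiled the C example:

$descriptorspec = array(
    0 => array("pipe", "r"), // child's stdin, PHP writes to it
    1 => array("pipe", "w"), // child's stdout, PHP reads from it
);

$process = proc_open('./reader', $descriptorspec, $pipes);
if (is_resource($process)) {
    fwrite($pipes[0], "A");              // the character fgetc() will receive
    fclose($pipes[0]);
    echo stream_get_contents($pipes[1]); // whatever fputc() wrote back
    fclose($pipes[1]);
    proc_close($process);
}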

Related

Control a C program using PHP, because my intention is to run a C program from a web app

#include <stdio.h>

int main(void){
    int numb1;
    int numb2;
    printf("Pick a number for numb1:");
    scanf("%d", &numb1);
    printf("Pick a number for numb2:");
    scanf("%d", &numb2);
    int result = numb1 + numb2;
    printf("Result of numb1 + numb2 is: %d\n", result);
    return 0;
}
I have the following C program. I have also set up a local server to run a PHP file that will run the C program. My problem is writing into STDIN using PHP so that the C program can accept the value and store it in the variable numb1. Do you know how I can write into STDIN from PHP so that it is then read and processed by the C program?
You should compile your C program and then execute it from PHP.
Here is one way you can do it:
<?php
exec("testone.exe 125 70", $out);
print_r($out);
?>
And your C code:
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    if (argc > 1)
        printf("First arg %d\n", atoi(argv[1]));
    if (argc > 2)
        printf("Second arg %d", atoi(argv[2]));
    return 0;
}
You build a .exe from your C code, then run it and pass the data to it on the command line as in this example.
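If you specifically want to feed numb1 and numb2 through STDIN, so the scanf calls in the original program work unchanged, a rough sketch using proc_open instead of command-line arguments might look like this (testone.exe is the name used in the answer above):

$descriptorspec = array(
    0 => array("pipe", "r"), // child's stdin
    1 => array("pipe", "w"), // child's stdout
);

$process = proc_open("testone.exe", $descriptorspec, $pipes);
if (is_resource($process)) {
    // scanf("%d", ...) stops at whitespace, so a newline after each number is enough
    fwrite($pipes[0], "125\n70\n");
    fclose($pipes[0]);                   // signals end of input to the child
    echo stream_get_contents($pipes[1]); // the prompts plus the result line
    fclose($pipes[1]);
    proc_close($process);
}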

Communication between PHP and C++

I want PHP and C++ code to communicate.
I need to pass a big JSON between them.
The problem is that I am currently using "passthru", but for a reason I don't know, the C++ code does not receive the entire parameter; it gets cut off at 528 characters when the JSON is 3156.
By running tests, I have verified that the "passthru" call itself supports as many as 3156 characters, but I do not know whether there is a maximum argument size in C++.
The PHP application is as follows:
passthru('programc++.exe '.$bigJSON, $returnVal);
The C++ application:
int main(int argc, char* argv[]){
    char *json = argv[1];
}
Is there any way to fix the problem? I have read about PHP extensions and IPC protocols, but the problem is that I have to make a cross-platform program (I need a version for Windows, another for Linux and one for Mac), and from what I have read, PHP extensions and IPC protocols complicate things quite a bit.
Solution:
The solution is to use "proc_open" with the stdin and stdout pipes. In my case, I use the rapidjson library. I add double quotes in PHP so that rapidjson can parse the JSON.
PHP:
$exe_command = 'program.exe';
$descriptorspec = array(
    0 => array("pipe", "r"), // stdin
    1 => array("pipe", "w"), // stdout -> we use this
    2 => array("pipe", "w")  // stderr
);

$process = proc_open($exe_command, $descriptorspec, $pipes);

$returnValue = null;
if (is_resource($process)){
    fwrite($pipes[0], $bigJSON);
    fclose($pipes[0]);
    $returnValue = stream_get_contents($pipes[1]);
    fclose($pipes[1]);
}
C++:
#include <iostream>
#include <string>

int main(int argc, char* argv[]){
    std::string json;
    std::getline(std::cin, json);
    std::cout << json << std::endl; // The JSON
    return 0;
}
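One small addition to the PHP snippet above (not part of the original answer): once the JSON has been read back, the remaining pipe and the process handle should be released as well, roughly:

// Continuing from the PHP snippet above: collect stderr (if any) and reap the child.
$stderr   = stream_get_contents($pipes[2]);
fclose($pipes[2]);
$exitCode = proc_close($process);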

Passing output from C++ to PHP

I am creating a PHP file to pass values to a C++ .exe, which will then calculate an output and return it. However, I cannot seem to get the output from the .exe back into the PHP file.
PHP Code:
$path = 'C:\Users\sumit.exe';
$handle = popen($path,'w');
$write = fwrite($handle,"37");
pclose($handle);
C++ Code:
#include "stdafx.h"
#include <iostream>
using namespace std;
// Declaration of input variables:
int main()
{
    int num;
    cin >> num;
    std::cout << num + 5;
    return 0;
}
I'd advise neither system nor popen but the proc_open command (documented on php.net).
Call it like this:
$descriptorspec = array(
    0 => array("pipe", "r"), // stdin is a pipe that the child will read from
    1 => array("pipe", "w"), // stdout is a pipe that the child will write to
    2 => array("pipe", "w")  // stderr, also a pipe the child will write to
);
$process = proc_open('C:\Users\sumit.exe', $descriptorspec, $pipes);
After that you'll have $pipes filled with handles: [0] to send data to the program and [1] to receive data from it. There is also [2], which you can use to get stderr from the program (or just close it if you don't use stderr).
Don't forget to close the process handle with proc_close() and the pipe handles with fclose(). Note that your program will not know its input is complete until you close the $pipes[0] handle or write some whitespace character. I advise closing the pipe.
Using command line arguments in system() or popen() is valid, though if you intend to send large amounts of data and/or raw data, you will have trouble with command line length limits and with escaping special chars.
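Putting that together for this case, a sketch could look like the following (assuming the same sumit.exe path as in the question):

$descriptorspec = array(
    0 => array("pipe", "r"), // child's stdin
    1 => array("pipe", "w"), // child's stdout
    2 => array("pipe", "w")  // child's stderr
);

$process = proc_open('C:\Users\sumit.exe', $descriptorspec, $pipes);
if (is_resource($process)) {
    fwrite($pipes[0], "37");
    fclose($pipes[0]);                        // cin >> num now sees end of input
    $result = stream_get_contents($pipes[1]); // "42", printed by the program
    fclose($pipes[1]);
    fclose($pipes[2]);
    proc_close($process);
    echo $result;
}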
In your C++ code I don't see anything for receiving arguments; you need
int main(int argc, char* argv[])
instead of
int main()
Remember argc is the argument count and it includes the path to the executable, so your arguments begin at index 1, with each argv entry being a C string holding that argument. If you need a decimal, atof is your friend, or atoi for an integer.
Then there is your use of popen. The PHP documentation says it can only be used for reading or writing; it is not bi-directional. You want to use proc_open to have bi-directional support.
Anyway, this is how I would write your C++ code:
#include "stdafx.h"
#include <iostream>
// Declaation of Input Variables:
int main(int arc, char* argv[])
{
int num;
num = atoi(argv[1]);
std::cout<<num+5;
return 0;
}
Note: I removed using namespace std because I noticed you were still qualifying names in the main function (i.e. std::cout), and it is better to keep the namespace out of the global scope anyway.
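On the PHP side, this argv-based build could then be invoked with exec() and the argument escaped, for example (a sketch; the path is assumed to be the one from the question):

$exe   = 'C:\Users\sumit.exe';
$input = "37";

// escapeshellarg() keeps characters that are special to the shell
// from being interpreted when the command line is built.
exec(escapeshellarg($exe) . ' ' . escapeshellarg($input), $out);

echo $out[0]; // "42" - exec() returns the program's output one line per array entry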
You are writing into the exe's stdin there; you should pass your argument on the command line instead, like this:
system("C:\Users\sumit.exe 37");

Always blocked on reading from a pipe opened through PHP's proc_open when used with stream_select

I'm talking to a process requiring user interaction using the following (PHP 5.3/ Ubuntu 12.04),
$pdes = array(
    0 => array('pipe', 'r'), // child's stdin
    1 => array('pipe', 'w'), // child's stdout
);
$process = proc_open($cmd, $pdes, $pipes);
sleep(1);
if (is_resource($process)) {
    while ($iter-- > 0) {
        $r = array($pipes[1]);
        $w = array($pipes[0]);
        $e = array();
        if (0 < ($streams = stream_select($r, $w, $e, 2))) {
            if ($r) {
                echo "reading\n";
                $rbuf .= fread($pipes[1], $rlen); // read $rlen bytes from the pipe
            } else {
                echo "writing\n";
                fwrite($pipes[0], $wbuf."\n"); // write to the pipe
                fflush($pipes[0]);
            }
        }
    }
    fclose($pipes[0]);
    fclose($pipes[1]);
    echo "exitcode: ".proc_close($process)."\n";
}
And this is my test program in C,
#include <stdio.h>

int main(){
    char buf[512];
    printf("before input\n");
    scanf("%s", buf);
    printf("after input\n");
    return 0;
}
Now, the problem is that $r is always empty after stream_select, even if $pipes[1] is set to non-blocking, whereas writing to $pipes[0] never blocks. However, things work fine without stream_select, i.e. if I match reads and writes to the test program:
echo fread($pipes[1],$rlen); //matching printf before input
fwrite($pipes[0],$wbuf."\n"); //matching scanf
fflush($pipes[0]);
echo fread($pipes[1],$rlen); //matching printf after input
I couldn't figure out what's happening here. I'm trying to build something like a web-based terminal emulator. Any suggestions on how to do this are welcome :)
Sorry guys for wasting your time. I figured out the problem a while later (sorry for the late update). The read was blocking due to a race condition.
I write to the process and immediately check the streams for availability. The write never blocks, and the data is not ready for reading yet (somehow it took about 600 ms for even 1 byte of data to become available). So I was able to fix the problem by adding sleep(1) at the end of the write block.
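For reference, a minimal sketch of the adjusted loop body, assuming the same $pipes, $r, $w, $e, $rbuf, $wbuf and $rlen variables as in the code above; the only change is the sleep(1) after writing, which gives the child time to produce output before the next stream_select:

if (0 < stream_select($r, $w, $e, 2)) {
    if ($r) {
        // The child has produced data; drain it.
        $rbuf .= fread($pipes[1], $rlen);
    } else {
        // Send the next line of input, then give the child a moment
        // to respond so the following stream_select sees readable data.
        fwrite($pipes[0], $wbuf."\n");
        fflush($pipes[0]);
        sleep(1);
    }
}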

How to pass the content of a variable through an external command in PHP?

I have a variable that contains a long string (specifically, a few kilobytes of JavaScript code).
I want to pass this string through an external command, in this case a JavaScript compressor, and capture the output of the external command (the compressed JavaScript) in PHP, assigning it to a variable.
I'm aware that there are classes for compressing JavaScript in PHP, but this is merely one example of a general problem.
Originally we used:
$newvar = passthru("echo $oldvar | compressor");
This works for small strings, but it is insecure (if $oldvar contains characters with special meaning to the shell, anything could happen).
Escaping with escapeshellarg fixes that, but the solution breaks for longer strings because of OS limits on the maximum argument length.
I tried using popen("command", "w") and writing to the command. This works, but the output from the command silently disappears into the void.
Conceptually, I just want to do the equivalent of:
$newvar = external_command($oldvar);
Using the proc_open function you can get handles to both the stdin and stdout of the process, and thus write your data to it and read the result.
Using rumpel's suggestion, I was able to devise the following solution, which seems to work well. Posting it here for the benefit of anyone else interested in the question.
public static function extFilter($command, $content){
    $fds = array(
        0 => array("pipe", "r"), // stdin is a pipe that the child will read from
        1 => array("pipe", "w"), // stdout is a pipe that the child will write to
        2 => array("pipe", "w")  // stderr is a pipe that the child will write to
    );

    $process = proc_open($command, $fds, $pipes, NULL, NULL);

    if (is_resource($process)) {
        fwrite($pipes[0], $content);
        fclose($pipes[0]);

        $stdout = stream_get_contents($pipes[1]);
        fclose($pipes[1]);

        $stderr = stream_get_contents($pipes[2]);
        fclose($pipes[2]);

        $return_value = proc_close($process);
        // Do whatever you want to do with $stderr and the command's exit code.
    } else {
        // Do whatever you want to do if the command fails to start.
    }
    return $stdout;
}
There may be deadlock issues: if the data you send is larger than the combined sizes of the pipes, then the external command will block waiting for someone to read from its stdout, while PHP is blocked waiting for the child to read from its stdin to make room for more input.
Possibly PHP takes care of this issue somehow, but it's worth testing if you plan to send (or receive) more data than fits in the pipes.
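A rough sketch of one way to reduce that risk (not from the original answer): write the input in chunks and drain the child's output between writes instead of writing everything before reading. It assumes the same $command and $content parameters as extFilter() above; for commands that produce far more output than they consume, a fully robust version would also need stream_select, so treat this only as a starting point.

$fds = array(
    0 => array("pipe", "r"), // child's stdin
    1 => array("pipe", "w"), // child's stdout
    2 => array("pipe", "w")  // child's stderr
);

$process = proc_open($command, $fds, $pipes);
if (is_resource($process)) {
    // Non-blocking reads so draining never stalls the write loop.
    stream_set_blocking($pipes[1], false);
    stream_set_blocking($pipes[2], false);

    $stdout = $stderr = '';
    $offset = 0;
    $length = strlen($content);

    while ($offset < $length) {
        // Feed the child at most 8 KiB at a time...
        $offset += fwrite($pipes[0], substr($content, $offset, 8192));
        // ...and drain whatever it has produced so far, so its stdout
        // pipe does not fill up while it still waits for more input.
        $stdout .= stream_get_contents($pipes[1]);
        $stderr .= stream_get_contents($pipes[2]);
    }
    fclose($pipes[0]); // end of input

    // Switch back to blocking and collect the rest until EOF.
    stream_set_blocking($pipes[1], true);
    stream_set_blocking($pipes[2], true);
    $stdout .= stream_get_contents($pipes[1]);
    $stderr .= stream_get_contents($pipes[2]);
    fclose($pipes[1]);
    fclose($pipes[2]);
    $exitCode = proc_close($process);
}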
