Pngquant's documentation gives the following example for PHP:
// '-' makes it use stdout, required to save to $compressed_png_content variable
// '<' makes it read from the given file path
// escapeshellarg() makes this safe to use with any path
$compressed_png_content = shell_exec("pngquant --quality=$min_quality-$max_quality - < ".escapeshellarg( $path_to_png_file));
I want to replace $path_to_png_file with the actual PNG content, which I already have in a variable.
This would avoid wasting I/O when converting a file from another format to PNG and then optimizing it.
What would the shell_exec() command look like in that situation?
I am no PHP expert, but I believe you are looking for a two-way pipe (write and read) to another process, so that you can write data to its stdin and read data from its stdout. That means you need proc_open(), which is described in the PHP manual.
It will look something like this (untested):
$cmd = 'pngquant --quality ... -';
$spec = array(
    0 => array("pipe", "r"),  // stdin
    1 => array("pipe", "w"),  // stdout
    2 => array("pipe", "w")   // stderr
);
$process = proc_open($cmd, $spec, $pipes);
if (is_resource($process)) {
    // write your data to $pipes[0] so that "pngquant" gets it
    fclose($pipes[0]);
    $result = stream_get_contents($pipes[1]);
    fclose($pipes[1]);
    proc_close($process);
}
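Filled in, the skeleton might look like this (untested against pngquant itself; the pngquant command line is the one from the question, and `cat` stands in for it below so the round trip can be verified):

```php
<?php
// Run $cmd, feed $input to its stdin, and return its stdout.
function pipe_through($cmd, $input) {
    $spec = array(
        0 => array("pipe", "r"), // child reads its stdin from us
        1 => array("pipe", "w"), // we read the child's stdout
        2 => array("pipe", "w"), // capture stderr too
    );
    $process = proc_open($cmd, $spec, $pipes);
    if (!is_resource($process)) {
        return false;
    }
    fwrite($pipes[0], $input);
    fclose($pipes[0]); // send EOF so the child can finish
    $output = stream_get_contents($pipes[1]);
    fclose($pipes[1]);
    fclose($pipes[2]);
    proc_close($process);
    return $output;
}

// For the original question (assuming $png_content already holds the
// raw PNG bytes and pngquant is on the PATH):
// $compressed = pipe_through("pngquant --quality=$min_quality-$max_quality -", $png_content);

// Demonstrated here with `cat`, which simply echoes its input:
echo pipe_through('cat', "hello\n"); // prints "hello" and a newline
```

Note that this write-everything-then-read pattern assumes the input fits in the OS pipe buffer; for larger payloads it can deadlock, as discussed at the end of this page.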
I have a Python script that may ask for input while running (I must run it from PHP and cannot modify the Python script). My question is: can I interact with that script from PHP? For example, run "python script.py" and then, if the last message is "String:", insert input.
>> python script.py
>> Log 1 ...
>> Log 2 ...
>> Enter value A:
>> 2
>> Enter value B:
>> 4
I want to do this (pseudocode):
if (cmdlogtext == 'Enter value A:')
    enter the number 2
To interact with a running script you need to use proc_open (see the official docs).
Below is a simple usage example.
$descriptorspec = array(
    0 => array("pipe", "r"),  // stdin is a pipe that the child will read from
    1 => array("pipe", "w"),  // stdout is a pipe that the child will write to
    2 => array("file", "error-output.txt", "a") // stderr is a file to write to
);
$process = proc_open('python script.py', $descriptorspec, $pipes);
if (is_resource($process)) {
    // $pipes now looks like this:
    // 0 => writable handle connected to child stdin
    // 1 => readable handle connected to child stdout
    // Any error output will be appended to error-output.txt
    fwrite($pipes[0], 'Input 1' . PHP_EOL);
    fwrite($pipes[0], 'Input 2' . PHP_EOL);
    fclose($pipes[0]); // close stdin first so the child sees EOF and can exit
    echo stream_get_contents($pipes[1]);
    fclose($pipes[1]);
    // It is important that you close any pipes before calling
    // proc_close in order to avoid a deadlock
    $return_value = proc_close($process);
}
Keep in mind:
To send input to the running script, you must fwrite to $pipes[0].
Every input needs to be terminated with PHP_EOL, or the script will not see it as a complete line.
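If the script genuinely prompts and waits, as in the question's "Enter value A:" example, a fixed write-then-read sequence won't adapt to what the child prints. Here is a sketch of a prompt-driven loop; the prompt strings are the ones from the question and are assumptions about what script.py actually prints:

```php
<?php
// Run $cmd and answer its prompts as they appear. $responses maps a
// substring of a prompt line to the reply to write to the child's
// stdin. Returns everything the child printed to stdout.
function interact($cmd, array $responses) {
    $spec = array(
        0 => array("pipe", "r"), // child stdin
        1 => array("pipe", "w"), // child stdout
        2 => array("pipe", "w"), // child stderr (ignored here)
    );
    $process = proc_open($cmd, $spec, $pipes);
    if (!is_resource($process)) {
        return false;
    }
    $log = '';
    // Read one line at a time: stream_get_contents() would block
    // until the child exits, which never happens while the child
    // is itself waiting for input.
    while (($line = fgets($pipes[1])) !== false) {
        $log .= $line;
        foreach ($responses as $prompt => $reply) {
            if (strpos($line, $prompt) !== false) {
                fwrite($pipes[0], $reply . PHP_EOL);
            }
        }
    }
    fclose($pipes[0]);
    fclose($pipes[1]);
    fclose($pipes[2]);
    proc_close($process);
    return $log;
}

// For the question (assuming those exact prompt strings):
// echo interact('python script.py',
//               array('Enter value A:' => '2', 'Enter value B:' => '4'));
```

One caveat: fgets() waits for a newline, so a prompt printed without a trailing newline (or held back by the child's output buffering) will not be seen; in that case read with fread() on a non-blocking stream instead.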
You can use the exec function and then inspect the output:
exec("python script.py", $output, $returnValue);
Edit: I re-read your question and I'm not sure about providing input interactively, but you may be able to modify the exec() call to pass arguments to the Python script, assuming the script accepts them.
I want to make PHP and C++ code communicate.
I need to pass a big JSON document between them.
The problem is that I am currently using "passthru", and for a reason I do not know, the C++ code does not receive the entire parameter: it is cut off at 528 characters when the JSON is 3156 characters long.
By running tests, I have verified that the "passthru" command itself supports all 3156 characters, but I do not know whether there is a maximum parameter size on the C++ side.
The PHP application is as follows:
passthru('programc++.exe '.$bigJSON, $returnVal);
The C++ application:
int main(int argc, char* argv[]){
char *json = argv[1];
}
Is there any way to fix the problem? I have read about PHP extensions and IPC protocols, but I have to write a cross-platform program (there must be a version for Windows, one for Linux, and one for Mac), and from what I could read, PHP extensions and IPC protocols complicate things quite a bit.
Solution:
The solution is to use "proc_open" with the stdin and stdout pipes. In my case I use the rapidjson library; I add double quotes in PHP so that rapidJSON can process the JSON.
PHP:
$exe_command = 'program.exe';
$descriptorspec = array(
    0 => array("pipe", "r"), // stdin
    1 => array("pipe", "w"), // stdout -> we use this
    2 => array("pipe", "w")  // stderr
);
$process = proc_open($exe_command, $descriptorspec, $pipes);
$returnValue = null;
if (is_resource($process)) {
    fwrite($pipes[0], $bigJSON);
    fclose($pipes[0]);
    $returnValue = stream_get_contents($pipes[1]);
    fclose($pipes[1]);
    proc_close($process); // reap the child process
}
C++:
#include <iostream>
#include <string>

int main(int argc, char* argv[]) {
    std::string json;
    std::getline(std::cin, json);
    std::cout << json << std::endl; // the JSON
    return 0;
}
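One detail worth spelling out (my observation, not part of the original solution): std::getline stops at the first newline, so the PHP side must send the JSON as a single line. json_encode() already guarantees this, because raw newlines inside strings are emitted as the two-character escape sequence \n:

```php
<?php
// json_encode() never emits a raw newline: line breaks inside
// strings become the two-character sequence \n, so the whole
// document arrives as the single line that std::getline() reads.
$payload = array('name' => "multi\nline", 'values' => range(1, 3));
$bigJSON = json_encode($payload);
var_dump(strpos($bigJSON, "\n")); // bool(false) - no raw newline anywhere
// fwrite($pipes[0], $bigJSON . "\n"); // then close stdin as above
```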
I am a PHP beginner. I want to invoke an external Unix command, pipe some stuff into it (e.g., a string and files), and have the result appear in my output buffer (the browser).
Consider the following:
echo '<h1>stuff for my browser output window</h1>';
$fnmphp= '/tmp/someinputfile';
$sendtoprogram = "myheader: $fnmphp\n\n".file_get_contents($fnmphp);
popen2outputbuf("unixprogram < $sendtoprogram");
echo '<p>done</p>';
An even better solution would let PHP write myheader into the Unix program, then pipe the file $fnmphp into it; the output of unixprogram would immediately go to my browser output buffer.
I don't think PHP exposes its stdout this way, so my Unix program's STDOUT output would not make it to the browser; otherwise, this would happen by default if I used system(). I can only think of solutions that require writing temp files.
I think I am standing on the hose here (German idiom for having one's wires crossed); this probably has an obvious solution.
update:
here is the entirely inelegant but pretty precise solution that I want to replace:
function pipe2eqb($text) {
    $whatever = '/tmp/whatever-' . time() . '-' . $_SESSION['uid'];
    $inf = "$whatever.in";
    $outf = "$whatever.out";
    assert(!file_exists($inf));
    assert(!file_exists($outf));
    file_put_contents($inf, $text);
    assert(file_exists($inf));
    system("unixprog < $inf > $outf");
    $fo = file_get_contents($outf);
    unlink($inf);
    unlink($outf);
    return $fo;
}
It is easy to replace either the input or the output, but I want to replace both. I will post a solution when I figure it out.
The best way to do this is with the proc_open family of functions:
<?php
$descriptorspec = array(
    0 => array("pipe", "r"), // stdin
    1 => array("pipe", "w"), // stdout
    2 => array("pipe", "w")  // stderr
);
$cwd = NULL; // or '/tmp'
$env = NULL; // or array()
$cmd = 'unixprog';
$process = proc_open($cmd, $descriptorspec, $pipes, $cwd, $env);
assert(false !== $process);
Now, to give arguments to unixprog, do like:
$cmd = 'unixprog --arg1=' . escapeshellarg($arg1) . ' --arg2=' . escapeshellarg($arg2);
To talk to the program's stdin, do like:
assert(strlen($stdinmessage) === fwrite($pipes[0], $stdinmessage));
To read from the process's stdout, do like:
$stdout = stream_get_contents($pipes[1]);
To read from the process's stderr, do like:
$stderr = stream_get_contents($pipes[2]);
(Note that file_get_contents() takes a filename, not a pipe handle; stream_get_contents() is the function that works here.)
To check if the program has finished, do like:
$status = proc_get_status($process);
if ($status['running']) { echo 'child process is still running.'; }
To check the return code of the process when it has finished:
echo 'return code from child process: ' . $status['exitcode'];
To wait for the child process to finish, you CAN do:
while (proc_get_status($process)['running']) { sleep(1); }
This is a quick and easy way to do it, but it is not optimal. tl;dr: it may be slow or waste CPU. Long version:
There is some nigh-optimal, event-driven way to do this, but I'm not sure how. Imagine having to run a program 10 times, where each run takes 100 milliseconds: this code would take 10 seconds, while optimal code would take only 1 second. You can use usleep() for microseconds, but it's still not optimal: imagine checking every 100 microseconds while the program takes 10 seconds to execute; you would waste CPU checking the status 100,000 times, while optimal code would check only once. I'm sure there is a fancy way to let PHP sleep until the process finishes with some callback/signal, perhaps with stream_select, but I have yet to solve it. (If anybody has the solution, please let me know!)
-- read more at http://php.net/manual/en/book.exec.php
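For what it's worth, here is a sketch of the event-driven waiting the previous answer asks about, using stream_select() to sleep in the kernel until the child produces output or exits (my own sketch, with `unixprog` still a placeholder):

```php
<?php
// Instead of polling proc_get_status() in a sleep loop, block in
// stream_select() until the child writes something or closes its
// pipes. The kernel wakes us exactly when something happens, so no
// CPU is burned and no latency is added.
function run_and_wait($cmd) {
    $spec = array(
        0 => array("pipe", "r"),
        1 => array("pipe", "w"),
        2 => array("pipe", "w"),
    );
    $process = proc_open($cmd, $spec, $pipes);
    if (!is_resource($process)) {
        return false;
    }
    fclose($pipes[0]); // nothing to send in this sketch
    $out = array(1 => '', 2 => '');
    $open = array(1 => $pipes[1], 2 => $pipes[2]);
    while ($open) {
        $read = array_values($open);
        $write = $except = array();
        // NULL timeout: sleep until at least one pipe has data or EOF.
        if (stream_select($read, $write, $except, NULL) === false) {
            break;
        }
        foreach ($read as $stream) {
            $chunk = fread($stream, 8192);
            foreach ($open as $fd => $s) {
                if ($s === $stream) {
                    if ($chunk === '' || $chunk === false) {
                        fclose($stream); // EOF on this pipe
                        unset($open[$fd]);
                    } else {
                        $out[$fd] .= $chunk;
                    }
                }
            }
        }
    }
    $exitcode = proc_close($process); // child is gone; reap it
    return array($out[1], $out[2], $exitcode);
}

// list($stdout, $stderr, $code) = run_and_wait('unixprog');
```

Both pipes reaching EOF is taken here as the sign that the child is done; proc_close() then blocks only for the final reap and returns the exit code.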
I'm thinking about making a PHP script which opens the Stockfish chess engine CLI, sends a few commands, and gets back the output.
I think I can achieve this with proc_open and a pipes array, but I can't figure out how to wait for the whole output. If there's a better solution, it's appreciated!
Here's my code:
// ok, define the pipes
$descr = array(
    0 => array("pipe", "r"),
    1 => array("pipe", "w"),
    2 => array("pipe", "w")
);
$pipes = array();
// open the process with those pipes
$process = proc_open("./stockfish", $descr, $pipes);
// check if it's running
if (is_resource($process)) {
    // send first universal chess interface command
    fwrite($pipes[0], "uci");
    // send analysis (5 seconds) command
    fwrite($pipes[0], "go movetime 5000");
    // close the write pipe, or STDOUT can't be read
    fclose($pipes[0]);
    // read and print all output that comes from the pipe
    while (!feof($pipes[1])) {
        echo fgets($pipes[1]);
    }
    // close the last opened pipe
    fclose($pipes[1]);
    // at the end, close the process
    proc_close($process);
}
The process seems to start, but the second command I send doesn't behave as expected: it should produce analysis lines for about 5 seconds, yet the result prints immediately.
How can I get this working?
Please, ask me for more information about this engine if you need.
Thank you!
fwrite($pipes[0], "uci\n");
fwrite($pipes[0], "go movetime 5000\n");
Without the "\n", Stockfish sees this as one command (ucigo movetime 5000) and doesn't recognise it.
Actually, your code works; $pipes[1] contains all the output from Stockfish.
You might need to change the 5000 in
go movetime 5000
to a different number: 5000 means 5000 ms = 5 seconds, i.e. the time at which the engine stops. Try 10000 and the engine stops after 10 seconds, etc.
You need to remove fclose($pipes[0]) from before the read loop, check for bestmove inside the while loop, break the loop when bestmove is found, and only then call fclose($pipes[0]). That worked for me. Also add a \n separator at the end of each command.
Thanks for the code!
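Putting those fixes together, the loop might look like this (untested against Stockfish itself; the check for a line starting with `bestmove` follows the UCI convention that the engine ends an analysis with a `bestmove ...` line):

```php
<?php
// Send UCI commands (each terminated by "\n") and read engine
// output until the final "bestmove ..." line of the analysis.
function uci_analyze($engine_cmd) {
    $descr = array(
        0 => array("pipe", "r"),
        1 => array("pipe", "w"),
        2 => array("pipe", "w"),
    );
    $process = proc_open($engine_cmd, $descr, $pipes);
    if (!is_resource($process)) {
        return false;
    }
    // Without the trailing "\n" the engine would receive
    // "ucigo movetime 5000" as one unknown command.
    fwrite($pipes[0], "uci\n");
    fwrite($pipes[0], "go movetime 5000\n");
    $log = '';
    // Keep stdin open while reading: the analysis lines arrive over
    // ~5 seconds and end with a line starting with "bestmove".
    while (($line = fgets($pipes[1])) !== false) {
        $log .= $line;
        if (strpos($line, 'bestmove') === 0) {
            break;
        }
    }
    fwrite($pipes[0], "quit\n");
    fclose($pipes[0]);
    fclose($pipes[1]);
    fclose($pipes[2]);
    proc_close($process);
    return $log;
}

// echo uci_analyze('./stockfish');
```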
I have a variable that contains a long string (specifically, a few kilobytes of JavaScript code).
I want to pass this string through an external command, in this case a JavaScript compressor, and capture the output of the external command (the compressed JavaScript) in PHP, assigning it to a variable.
I'm aware that there are classes for compressing JavaScript in PHP, but this is merely one example of a general problem.
Originally we used:
$newvar = passthru("echo $oldvar | compressor");
This works for small strings, but is insecure (if $oldvar contains characters with special meaning to the shell, anything could happen).
Escaping with escapeshellarg fixes that, but the solution breaks for longer strings because of OS limitations on the maximum argument length.
I tried using popen("command", "w") and writing to the command. This works, but the output from the command silently disappears into the void.
Conceptually, I just want to do the equivalent of:
$newvar = external_command($oldvar);
Using the proc_open function you can get handles to both the stdout and stdin of the process, and thus write your data to its stdin and read the result from its stdout.
Using rumpel's suggestion, I was able to devise the following solution, which seems to work well. I'm posting it here for the benefit of anyone else interested in the question.
public static function extFilter($command, $content) {
    $fds = array(
        0 => array("pipe", "r"), // stdin is a pipe that the child will read from
        1 => array("pipe", "w"), // stdout is a pipe that the child will write to
        2 => array("pipe", "w")  // stderr is a pipe that the child will write to
    );
    $stdout = null;
    $process = proc_open($command, $fds, $pipes, NULL, NULL);
    if (is_resource($process)) {
        fwrite($pipes[0], $content);
        fclose($pipes[0]);
        $stdout = stream_get_contents($pipes[1]);
        fclose($pipes[1]);
        $stderr = stream_get_contents($pipes[2]);
        fclose($pipes[2]);
        $return_value = proc_close($process);
        // Do whatever you want with $stderr and the command's exit code.
    } else {
        // Do whatever you want if the command fails to start.
        // ($stdout stays null, so the caller can tell it failed.)
    }
    return $stdout;
}
There may be deadlock issues: if the data you send is larger than the combined sizes of the pipe buffers, the external command will block waiting for someone to read from its stdout, while PHP is blocked waiting for its stdin to be read so there is room for more input.
Possibly PHP takes care of this issue somehow, but it's worth testing if you plan to send (or receive) more data than fits in the pipes.
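To make the pattern safe for payloads larger than the pipe buffers, one option (my own sketch, not part of the answer above) is to interleave writing with reading via stream_select(), so neither side ever blocks on a full pipe:

```php
<?php
// Like extFilter(), but drains the child's stdout/stderr while
// still writing, so a payload larger than the pipe buffers cannot
// deadlock either side.
function ext_filter_safe($command, $content) {
    $fds = array(
        0 => array("pipe", "r"),
        1 => array("pipe", "w"),
        2 => array("pipe", "w"),
    );
    $process = proc_open($command, $fds, $pipes);
    if (!is_resource($process)) {
        return false;
    }
    stream_set_blocking($pipes[0], false);
    stream_set_blocking($pipes[1], false);
    stream_set_blocking($pipes[2], false);
    $stdout = $stderr = '';
    $pos = 0;
    $len = strlen($content);
    while ($pos < $len) {
        $read = array($pipes[1], $pipes[2]);
        $write = array($pipes[0]);
        $except = array();
        if (stream_select($read, $write, $except, NULL) === false) {
            break;
        }
        foreach ($read as $r) { // drain whatever the child produced
            $chunk = fread($r, 8192);
            if ($r === $pipes[1]) { $stdout .= $chunk; }
            else                  { $stderr .= $chunk; }
        }
        if ($write) { // the stdin pipe has room: push the next slice
            $n = fwrite($pipes[0], substr($content, $pos, 8192));
            if ($n === false) {
                break; // child went away; stop writing
            }
            $pos += $n; // fwrite() may accept only part of the slice
        }
    }
    fclose($pipes[0]); // EOF: let the child finish
    stream_set_blocking($pipes[1], true);
    stream_set_blocking($pipes[2], true);
    $stdout .= stream_get_contents($pipes[1]);
    $stderr .= stream_get_contents($pipes[2]);
    fclose($pipes[1]);
    fclose($pipes[2]);
    proc_close($process);
    return $stdout;
}
```

For example, ext_filter_safe('cat', str_repeat('x', 1000000)) round-trips a megabyte, well beyond a typical 64 KiB pipe buffer, without deadlocking.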