I'm building a Rakefile and I want to display its output on a PHP-generated page as it gets executed.
I tried using system() since the PHP docs mention this:
The system() call also tries to automatically flush the web server's output buffer after each line of output if PHP is running as a server module.
This seems to work with multiple shell commands, but when I execute rake I only get the first line:
(in /Users/path/to/proj)
Any ideas?
Cheers!
Try using the exec() function:
exec($command, $output);
$output is an array
// collect the retrieved data
for ($out = '', $x = 0, $len = count($output); $x < $len; $x++) {
    $out .= $output[$x] . "\r\n";
}
or simply:
$out = join("\r\n", $output);
The system() call also tries to automatically flush the web server's output buffer after each line of output if PHP is running as a server module.
This means the return value only gives you the last line of output. The example on the system() manual page shows that, and it suggests using passthru() to get the raw output. I usually use exec() though.
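If the goal is to see the rake output as it's produced rather than after it finishes, something along these lines might work. This is only a rough sketch: the project path and the rake task name are placeholders, and whether lines actually reach the browser immediately still depends on the web server's own buffering.
// Turn off PHP-level output buffering so each line is sent as soon as it's printed
while (ob_get_level() > 0) {
    ob_end_flush();
}
ob_implicit_flush(true);

// passthru() streams the raw output; 2>&1 also captures rake's stderr
passthru('cd /Users/path/to/proj && rake default 2>&1', $exitCode);

if ($exitCode !== 0) {
    echo "rake exited with code $exitCode";
}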
Turns out both system() and exec() actually work. The rake output generated with --verbose isn't captured, though, which is why I was confused. If anyone has more extensive knowledge on the distinction, do share :)
Related
I have a PHP file that runs a node script using exec() to gather the output, like so:
$test = exec("/usr/local/bin/node /home/user/www/bin/start.js --url=https://www.example.com/");
echo $test;
It outputs a JSON string of data for the website given in the --url parameter. It works great, but sometimes the output string is cut short.
When I run the same command directly in the shell, I get the full output, as expected.
Why would this be? I've also tried running shell_exec() instead, but the same thing happens: the output gets cut short.
Is there a setting in php.ini or somewhere else to increase the size of output strings?
It appears the only way to get this working is by redirecting the exec() output to a temp file, like this:
exec("/usr/local/bin/node /home/user/www/bin/start.js --url=https://www.example.com/ > /home/user/www/uploads/json.txt");
$json = file_get_contents('/home/user/www/uploads/json.txt');
echo $json;
I would prefer to have the direct output and tried increasing output_buffering in php.ini with no change (output still gets cut off).
Definitely open to other ideas to avoid the temp file, but could also live with this and just unlink() the file on each run.
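For reference, a minimal sketch of that temp-file approach with the cleanup mentioned above (same node command and file path as in the question):
$tmp = '/home/user/www/uploads/json.txt';
exec("/usr/local/bin/node /home/user/www/bin/start.js --url=https://www.example.com/ > " . escapeshellarg($tmp));

$json = file_get_contents($tmp);
unlink($tmp); // remove the temp file on each run

echo $json;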
exec() only returns the last line of the output of the command you pass to it. Per the section marked Return Value of the following documentation:
The last line from the result of the command. If you need to execute a command and have all the data from the command passed directly back without any interference, use the passthru() function.
To get the output of the executed command, be sure to set and use the output parameter.
https://www.php.net/manual/en/function.exec.php
To do what you are trying to do, you need to pass the function an array to store the output, like so:
exec("/usr/local/bin/node /home/user/www/bin/start.js --url=https://www.example.com/", $output);
echo implode("\n", $output);
This is my code for executing a command from PHP:
$execQuery = sprintf("/usr/local/bin/binary -mode M \"%s\" %u %s -pathJson \"/home/ec2/fashion/jsonS/\" -pathJson2 \"/home/ec2/fashion/jsonS2/\"", $path, $pieces, $type);
exec($execQuery, $output, $return);
The $return value is always 0 but $output is empty. $output should contain JSON.
If I execute the same thing but remove one letter from the binary name (for example /usr/local/bin/binar) I correctly get $return = 127.
If I pass other parameters (like -mode R, which doesn't exist) I get errors from the console (which are correct as well).
If I run the exact $execQuery (which I print beforehand to check the quotation marks) in the console, it executes correctly. It's only on the PHP side that I get the error.
What can be wrong?
Thank you in advance.
Well, a couple of things might be happening...
The binary you're running writes to something other than STDOUT (for instance, STDERR).
The env vars available to the PHP user differ from those available to the user running the console (and those vars are required).
The PHP user does not have permission to access some of the files involved.
In order to debug, it might be better to use proc_open instead of exec, and check the STDOUT and STDERR. This might give you additional information regarding what's happening.
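A rough sketch of what that could look like, using the $execQuery command string from the question. This is only a debugging aid, not a drop-in replacement:
$descriptors = [
    1 => ['pipe', 'w'], // child's STDOUT
    2 => ['pipe', 'w'], // child's STDERR
];

$process = proc_open($execQuery, $descriptors, $pipes);

if (is_resource($process)) {
    $stdout = stream_get_contents($pipes[1]);
    $stderr = stream_get_contents($pipes[2]);
    fclose($pipes[1]);
    fclose($pipes[2]);

    $exitCode = proc_close($process);

    echo "exit code: $exitCode\n";
    echo "stdout: $stdout\n";
    echo "stderr: $stderr\n";
}
If STDERR turns out to contain an error message, that should point at which of the causes above applies.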
Suggestion (and shameless advertising)
I wrote a small utility library for PHP that executes external programs in a safer way and provides additional debug information. It might help you to at least pinpoint the issue.
I'm trying to read Python output from a PHP web app.
I'm using $out = shell_exec("./mytest") in the PHP code to launch the application, and sys.exit("returnvalue") in the Python application to return the value.
The problem is that $out doesn't contain my return value.
However, if I try $out = shell_exec("ls"), the $out variable contains the output of the ls command.
If I run ./mytest from the terminal it works and I can see the output.
sys.exit("returnvalue")
Passing a string to sys.exit indicates an error value: the string is written to stderr, not stdout, and shell_exec() only captures stdout by default.
You probably want to use this in your Python code:
print("returnvalue")
sys.exit(0)
Alternatively, you could also use this in your PHP code to redirect stderr to stdout.
$out = shell_exec("./mytest 2>&1");
(In fact, doing both is probably best, since having stderr disappear can be quite confusing if something unexpected happens).
I'm running a simple command in a loop
The command itself is ffmpeg, but I don't believe it's related to the issue.
So, I have:
exec($exec . ' 2>&1', $output, $return);
if ($return)
{
    foreach ($output as $line)
    {
        file_put_contents($log_file, $line, FILE_APPEND);
    }
}
This way, if anything goes wrong with the command I can read the output in the log. It works, but $output contains the entire accumulated output of the command. To clarify: every time an error occurs, all of the output generated by that command so far (including hundreds of successful executions from throughout the day) is dumped to the file. What should be a 5-line error is instead the entire 1000+ line history. I used the exact same code on CentOS and it gave me the expected result: only the output from the most recently executed instance.
From the documentation:
Note that if the array already contains some elements, exec() will append to the end of the array. If you do not want the function to append elements, call unset() on the array before passing it to exec().
I can't explain why it worked differently on CentOS.
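In any case, a minimal sketch of the documented fix inside a loop (the $commands list is hypothetical; $exec, $output, $return and $log_file are the names from the question):
foreach ($commands as $exec) {
    $output = [];          // reset so exec() can't append to the previous run's output
    exec($exec . ' 2>&1', $output, $return);

    if ($return) {
        file_put_contents($log_file, implode(PHP_EOL, $output) . PHP_EOL, FILE_APPEND);
    }
}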
I'm using the PHP system() function to run some curl commands, like this: system("curl command here", $output); but it displays the results on screen. Is there any way to avoid this output?
You're using the wrong function for that. According to the docs:
system() is just like the C version of the function in that it executes the given command and outputs the result.
So it always outputs. Use exec() instead, which returns (rather than outputs) the program's output:
$last = exec("curl command here", $output, $status);
$output = implode("\n", $output);
Or (just for completeness) use output buffering:
ob_start();
system("curl command here", $status);
$output = ob_get_clean();
You could try using output buffering.
ob_start();
system("curl command here",$output);
$result = ob_get_contents();
ob_end_clean();
You could either modify the command string and append " 1>/dev/null 2>&1" or, more elegantly, execute the process with a pipe (see example #2).
For a more refined control over the process' file handles, you can also use proc_open().
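A minimal sketch of the redirect variant (the curl invocation is a placeholder):
// The shell discards both stdout and stderr, so nothing reaches the page
system("curl https://example.com/ 1>/dev/null 2>&1", $status);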
The system function displays the output from your command, so you're out of luck there.
What you want is to change system() to exec(). That function will not display the command's output.
No, you should use the PHP cURL library.
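For example, a small sketch with the cURL extension (the URL is a placeholder):
$ch = curl_init('https://example.com/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the response instead of printing it
$response = curl_exec($ch);

if ($response === false) {
    echo 'cURL error: ' . curl_error($ch);
}
curl_close($ch);
This avoids spawning a shell entirely and gives you the response body in $response.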