Executing Linux commands with PHP

I'm trying to execute a Linux command through a PHP command-line script, which is no problem using the exec() function.
The problem is, the command I am executing (mysqldump) outputs an error message if something is wrong (for example, the user/password is incorrect). I can't seem to capture this error in order to log it. It just prints the error to the screen.
How do I stop this error from being printed to the screen and instead put it in a variable for use in my script?
Thanks!

Use popen() to run the process. Example #2 on the PHP manual's popen() page shows exactly what you're looking for:
<?php
error_reporting(E_ALL);
/* Add redirection so we can get stderr. */
$handle = popen('/path/to/spooge 2>&1', 'r');
echo "'$handle'; " . gettype($handle) . "\n";
$read = fread($handle, 2096);
echo $read;
pclose($handle);
?>

exec("mysqldump -u user -p passwod database > outputfile.sql 2> error.log");

You need to redirect stderr to stdout so you can capture it. This example routes stdout to /dev/null (thus ignoring it) and routes stderr into the $output array:
exec('ls * 2>&1 1>/dev/null', $output);

I'm not too hot in Unix (New Year's Resolution...) but these functions look helpful:
shell_exec - returns the command's output as a string.
passthru - sends the raw output straight to the browser; you can call it as passthru('command', $result); and $result gets the command's return status.
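For the error-capturing case in the question, a minimal sketch using shell_exec() (the credentials are placeholders; appending 2>&1 merges stderr into the returned string):
<?php
// Sketch: shell_exec() returns the command's stdout as a string;
// appending 2>&1 makes any error messages part of that string as well.
$result = shell_exec('mysqldump -u dbuser -pdbpass mydb 2>&1');
echo $result;
?>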

Have you tried using backticks?
$var = `command`;

The following will route stderr messages to the same place as the normal output:
exec("mysqldump blah 2>&1", $output, $return_val);
if ($return_val !== 0) {
    echo "there was an error";
}
2>&1 means re-route stderr messages to the same place as stdout, so they will be loaded into the $output array.
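Applied to the original mysqldump question, a sketch along those lines (credentials, database name, and file paths are placeholders):
<?php
// Sketch: send the dump itself to a file and capture only stderr in $output.
// Redirections are processed left to right: 2>&1 first points stderr at the
// stream exec() captures, then 1>/tmp/dump.sql sends the dump to a file.
exec('mysqldump -u dbuser -pdbpass mydb 2>&1 1>/tmp/dump.sql', $output, $return_val);
if ($return_val !== 0) {
    // $output now holds mysqldump's error messages, ready to be logged
    file_put_contents('/tmp/mysqldump-error.log', implode("\n", $output) . "\n", FILE_APPEND);
}
?>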

Have you looked at the system() command? It's been a while since I did any PHP, but that rings a bell.

Related

php live output for bash script if output is filtered [duplicate]

I'm just experimenting with PHP and shell_exec on my Linux server. It's a really cool function to use and I am really enjoying it so far. Is there a way to view the live output that is going on while the command is running?
For example, if ping stackoverflow.com was run, while it is pinging the target address, every time it pings, show the results with PHP? Is that possible?
I would love to see the live update of the buffer as it's running. Maybe it's not possible but it sure would be nice.
This is the code I am trying and every way I have tried it always displays the results after the command is finished.
<?php
$cmd = 'ping -c 10 127.0.0.1';
$output = shell_exec($cmd);
echo "<pre>$output</pre>";
?>
I've tried putting the echo part in a loop but still no luck. Anyone have any suggestions on making it show the live output to the screen instead of waiting until the command is complete?
I've tried exec, shell_exec, system, and passthru. Every one of them displays the content after it's finished, unless I'm using the wrong syntax or I'm not setting up the loop correctly.
To read the output of a process, popen() is the way to go. Your script will run in parallel with the program and you can interact with it by reading and writing its output/input as if it were a file.
But if you just want to dump its result straight to the user, you can cut to the chase and use passthru():
echo '<pre>';
passthru($cmd);
echo '</pre>';
If you want to display the output at run time as the program goes, you can do this:
while (@ob_end_flush()); // end all output buffers if any
$proc = popen($cmd, 'r');
echo '<pre>';
while (!feof($proc)) {
    echo fread($proc, 4096);
    @flush();
}
echo '</pre>';
pclose($proc);
This code should run the command and push the output straight to the end user at run time.
More useful information
Note that if you are using sessions, having one of these scripts running will prevent the same user from loading other pages, because PHP locks the session file and concurrent requests block on it. To prevent this from being a problem, call session_write_close() before the loop.
If your server is behind an nginx gateway, nginx's buffering may be disruptive to the desired behavior. Send the header header('X-Accel-Buffering: no'); to hint to nginx that it shouldn't buffer the response. As headers are sent first, this has to be called at the beginning of the script, before any data is sent.
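Put together, the top of such a streaming script might look like this (a sketch based on the notes above; session_start() is only relevant if your application uses sessions):
<?php
// Sketch: preamble for the streaming loop shown earlier.
header('X-Accel-Buffering: no'); // hint nginx not to buffer; must go out before any output
session_start();                 // only if your app uses sessions...
session_write_close();           // ...then release the lock so other requests aren't blocked
while (@ob_end_flush());         // end all output buffers if any
// ... now run the popen()/fread()/flush() loop from above
?>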
First of all, thanks Havenard for your snippet - it helped a lot!
A slightly modified version of Havenard's code which I found useful.
<?php
/**
 * Execute the given command, displaying console output live to the user.
 * @param  string $cmd command to be executed
 * @return array  exit_status: exit status of the executed command
 *                output: console output of the executed command
 */
function liveExecuteCommand($cmd)
{
    while (@ob_end_flush()); // end all output buffers if any

    $proc = popen("$cmd 2>&1 ; echo Exit status : $?", 'r');

    $live_output     = "";
    $complete_output = "";

    while (!feof($proc)) {
        $live_output = fread($proc, 4096);
        $complete_output = $complete_output . $live_output;
        echo $live_output;
        @flush();
    }

    pclose($proc);

    // get exit status
    preg_match('/[0-9]+$/', $complete_output, $matches);

    // return exit status and intended output
    return array(
        'exit_status' => intval($matches[0]),
        'output'      => str_replace("Exit status : " . $matches[0], '', $complete_output),
    );
}
?>
Sample usage:
$result = liveExecuteCommand('ls -la');
if($result['exit_status'] === 0){
// do something if command execution succeeds
} else {
// do something on failure
}
If you're willing to download a dependency, Symfony's Process component does this. I found its interface cleaner to work with than reinventing anything myself with popen() or passthru().
This was provided by the Symfony documentation:
You can also use the Process class with the foreach construct to get
the output while it is generated. By default, the loop waits for new
output before going to the next iteration:
$process = new Process('ls -lsa');
$process->start();
foreach ($process as $type => $data) {
if ($process::OUT === $type) {
echo "\nRead from stdout: ".$data;
} else { // $process::ERR === $type
echo "\nRead from stderr: ".$data;
}
}
As a warning, I've run into some problems with PHP and Nginx trying to buffer the output before sending it to the browser. You can disable output buffering in PHP by turning it off in php.ini: output_buffering = off. There's apparently a way to disable it in Nginx too, but I ended up using the PHP built-in server for my testing to avoid the hassle.
I put up a full example of this on Gitlab: https://gitlab.com/hpierce1102/web-shell-output-streaming
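For completeness, a self-contained sketch of that approach (it assumes symfony/process is installed via Composer; recent versions of the component take the command as an array):
<?php
// Sketch: stream a command's output as it is produced, using symfony/process.
use Symfony\Component\Process\Process;

require __DIR__ . '/vendor/autoload.php';

$process = new Process(['ping', '-c', '10', '127.0.0.1']);
$process->start();

foreach ($process as $type => $data) {
    if ($process::OUT === $type) {
        echo "Read from stdout: " . $data;
    } else { // $process::ERR === $type
        echo "Read from stderr: " . $data;
    }
    @flush(); // push each chunk to the browser as it arrives
}
?>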

Calling a PHP script from within PHP

I have a PHP script that I'd like to run both via the browser and from the command line. When I run the script from the command line, it executes without a problem. However, when I call it via a function such as exec or passthru, it doesn't work. I'm not getting any output from the call and I see no errors in the logs. I'm very confused...
echo exec('php /usr/share/nginx/www/function.php arg1');
Any ideas?
When you use exec(), the output is captured into the array you pass as the second parameter; it is not returned. So do something like this:
exec('php /usr/share/nginx/www/function.php arg1', $output);
print_r($output);
@David,
A few things to check.
Create a phpinfo() page and see whether exec or passthru is enabled. Some hosts disable them for security.
Try:
<?php
$exe = exec('php /usr/share/nginx/www/function.php arg1');
var_dump($exe);
?>
You could even do:
<?php
if(function_exists('exec')) {
echo "exec is enabled";
}
?>
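To check programmatically instead of scanning a phpinfo() page by hand, a sketch that inspects the disable_functions directive (this only reports what the ini says; some hosts restrict commands in other ways too):
<?php
// Sketch: report whether exec() has been disabled via disable_functions.
$disabled = array_map('trim', explode(',', (string) ini_get('disable_functions')));
if (!function_exists('exec') || in_array('exec', $disabled, true)) {
    echo "exec is disabled on this host\n";
} else {
    echo "exec is available\n";
}
?>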

How to hide system() output

I am working on Windows XP. I can successfully run a system() command through my browser by calling a Tcl script that automates an SSH session. I also return a value from the script. However, my problem is that the script dumps the entire SSH session into the browser.
My PHP script looks like:
$lastline=system('"C:\tcl\bin\tclsh.exe" \path to file\filename.tcl '.$username.' '.$pass,$val);
filename.tcl:
spawn plink -ssh $user@$host
expect "password:"
send "$pass\r"
expect "\prompt:/->"
set return_value [string compare /..string../ $expect_out(buffer)]
/...some code...this runs fine/
exit $return_value
Everything runs fine and I get $return_value back correctly, but the PHP file prints the result of the entire SSH session in my browser, which looks like:
Using username "admin". admin@10.135.25.150's password: === /*some text*/ === \prompt:/->.../some text/
I want to prevent the system() function from printing this in my browser.
I have used the shell_exec() function, but it returns the entire SSH session result (which I have already parsed in the Tcl script to get a precise value to return to the PHP script).
Is there a way I can do this without using shell_exec(), using system() instead?
Thanks in advance.
The documentation for system() specifically says:
Execute an external program and display the output
That page lists alternatives. If you use the exec() function instead, it will execute the command without displaying any output.
Example:
<?php
echo "Hello, ";
system("ls -l");
echo "world!\n";
?>
will display the output of system:
$ php -q foo.php
Hello, total 1
-rw-r--r-- 1 bar domain users 59 Jul 15 16:10 foo.php
world!
while using exec will not display any output:
<?php
echo "Hello, ";
exec("ls -l");
echo "world!\n";
?>
$ php -q foo.php
Hello, world!
Use ob_start(); before and ob_clean(); after calling it:
http://sandbox.phpcode.eu/g/850a3.php
<?php
ob_start();
echo '<pre>';
$last_line = system('ls');
ob_clean();
echo 'nothing returned!';
?>
In general, if you want to prevent anything from being output to the browser, you can use ob_start() before your system() call and ob_end_clean() after it. See http://php.net/manual/en/function.ob-start.php
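Applied to the command from the question (the script path and credentials are the question's own placeholders), a sketch that keeps the return value in $val but captures the transcript instead of printing it:
<?php
// Sketch: buffer everything system() would print so the SSH transcript
// never reaches the browser; the path and credentials come from the question.
ob_start();
$lastline = system('"C:\tcl\bin\tclsh.exe" \path to file\filename.tcl ' . $username . ' ' . $pass, $val);
$transcript = ob_get_clean(); // full session output, available for logging
// $val holds the Tcl script's exit status, $lastline the last line it printed
?>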

PHP exec() return value for background process (linux)

Using PHP on Linux, I'd like to determine whether a shell command run using exec() was successfully executed. I'm using the return_var parameter to check for a successful return value of 0. This works fine until I need to do the same thing for a process that has to run in the background. For example, in the following command $result returns 0:
exec('badcommand > /dev/null 2>&1 &', $output, $result);
I have put the redirect in there on purpose, I do not want to capture any output. I just want to know that the command was executed successfully. Is that possible to do?
Thanks, Brian
My guess is that what you are trying to do is not directly possible. By backgrounding the process, you are letting your PHP script continue (and potentially exit) before a result exists.
A work around is to have a second PHP (or Bash/etc) script that just does the command execution and writes the result to a temp file.
The main script would be something like:
$resultFile = '/tmp/result001';
touch($resultFile);
exec('php command_runner.php '.escapeshellarg($resultFile).' > /dev/null 2>&1 &');
// do other stuff...
// Sometime later when you want to check the result...
while (!strlen(file_get_contents($resultFile))) {
sleep(5);
}
$result = intval(file_get_contents($resultFile));
unlink($resultFile);
And the command_runner.php would look like:
$outputFile = $argv[1];
exec('badcommand > /dev/null 2>&1', $output, $result);
file_put_contents($outputFile, $result);
It's not pretty, and there is certainly room for adding robustness and handling concurrent executions, but the general idea should work.
Not with exec() alone. When you send a process to the background, the shell returns 0 to the exec() call immediately and PHP continues execution; there's no way to retrieve the final result that way.
pcntl_fork() however will fork your application, so you can run exec() in the child process and leave it waiting until it finishes. Then exit() with the status the exec call returned.
In the parent process you can access that return code with pcntl_waitpid()
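A sketch of that approach (CLI only; it requires the pcntl extension, and 'badcommand' stands in for the real command):
<?php
// Sketch: fork, run the command in the child, collect its exit status in the parent.
$pid = pcntl_fork();
if ($pid === -1) {
    die("fork failed\n");
} elseif ($pid === 0) {
    // Child: run the command and exit with its status code.
    exec('badcommand > /dev/null 2>&1', $output, $status);
    exit($status);
} else {
    // Parent: free to do other work here...
    // ...then block until the child finishes (or poll with the WNOHANG flag).
    pcntl_waitpid($pid, $wstatus);
    echo "command exited with " . pcntl_wexitstatus($wstatus) . "\n";
}
?>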
Just my 2 cents: how about using the && and || shell operators?
exec('ls && touch /tmp/res_ok || touch /tmp/res_bad');
And then check for file existence.
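A sketch of that check (the marker file names match the exec() line above; if neither file exists yet, the command is still running):
<?php
// Sketch: see which marker file the backgrounded command left behind.
if (file_exists('/tmp/res_ok')) {
    echo "command succeeded\n";
} elseif (file_exists('/tmp/res_bad')) {
    echo "command failed\n";
} else {
    echo "still running (or never started)\n";
}
?>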
