I'm attempting to query an Exchange server using a PowerShell command executed by PHP, and it doesn't seem to be working. The command I'm attempting to run is:
powershell "Add-PSSnapin Microsoft.Exchange.Management.PowerShell.E2010";"Get-CASMailbox -Identity user@example.com | fl ActiveSyncEnabled"
If I type that directly into the command prompt, it executes correctly and returns:
ActiveSyncEnabled : True
When I try it in PHP:
$output=shell_exec('powershell "Add-PSSnapin Microsoft.Exchange.Management.PowerShell.E2010";"Get-CASMailbox -Identity user@example.org | fl ActiveSyncEnabled"');
I get nothing. $output is set but blank. If I add "> output.txt" to the end, I get a blank text file. I'm able to run other, simpler PowerShell commands successfully via PHP, but not this one. The Apache service is running as the same user I was logged in as when I successfully executed the script from the command prompt.
Any ideas?
I'm running:
Windows Server 2008 R2 SP1 64-bit
Apache 2.2.22 32-bit
PHP 5.2.17 VC6 32-bit
Exchange Management Console 2010
You may try adding 2>&1 at the end of the command, e.g.:
$output=shell_exec('powershell "Add-PSSnapin Microsoft.Exchange.Management.PowerShell.E2010";"Get-CASMailbox -Identity user@example.org | fl ActiveSyncEnabled" 2>&1');
Sometimes a line break at the beginning of the PowerShell output seems to prevent shell_exec() from returning anything else.
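As a sanity check of what 2>&1 actually does, here is a minimal sketch in plain sh (not PowerShell, but the redirection semantics are the same): without it, command substitution captures only stdout, so anything the command writes to stderr never reaches the caller.

```shell
# Without 2>&1, only stdout is captured; the stderr line is discarded.
only_stdout=$(sh -c 'echo "out"; echo "err" >&2' 2>/dev/null)
# With 2>&1, stderr is merged into stdout and captured as well.
both=$(sh -c 'echo "out"; echo "err" >&2' 2>&1)
echo "captured without: $only_stdout"
echo "captured with: $both"
```

This is why adding 2>&1 can turn a mysteriously blank $output into a readable error message.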
I have MATLAB R2013b, IIS 7.5, and PHP 5.6 on Windows Server 2008, and I'm trying to do the following:
<?php
...
chdir($matlabScriptsDir);
exec("matlab -r test_func(args) -logfile $logfile", $output);
$output shows that the process exited with code 0; in Task Manager I see MATLAB.exe running as user IUSR; $logfile is created and locked by this process, but the process does nothing, it just hangs (there are no issues with running the same command from cmd.exe).
Where is the problem?
Using the -r option is like entering commands at MATLAB's command line, so after the command is executed you still have MATLAB running, waiting for your next command. Try instead:
matlab -r "test_func(args), exit"
so that the MATLAB process ends. I suggest going one step further and wrapping the call in a try-catch (followed by exit), so that the process will not hang if an error occurs.
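The try-catch suggestion might look something like this (a sketch; test_func and args stand in for your real function and arguments, and getReport is MATLAB's standard way of formatting an exception):

```
matlab -r "try, test_func(args), catch err, disp(getReport(err)), end, exit"
```

This way the error text lands in the log file instead of an interactive prompt, and the process exits either way.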
I'm trying to schedule a one-time job with the 'at' command. The script contains the following code:
$cmd = 'echo "/usr/bin/php '.$script_dir.$script_name.' '.$args.'"|/usr/bin/at "'.$time.'" 2>&1';
exec($cmd, $output , $exit_code);
When I run this command from the script, it adds the job to the schedule; I can see this from the line job 103 at Thu Sep 3 15:08:00 2015 in the logs ($output contains the same text). But then nothing happens at the specified time, as if at ignores the job, and there are no error messages in the logs.
When I run the same command with the same arguments from the command line on the server, it schedules the job and then runs it at the specified time.
I found out that when I schedule a job via the PHP script, it runs under the apache user. I tried running the following on the server's command line:
sudo -u apache echo "/usr/bin/php /var/www/pant/data/www/pant.com/scripts/Run.php firstarg secondarg "|/usr/bin/at "16:00 03.09.2015"
That works correctly too. I checked sudoers and added the apache user with NOPASSWD privileges. The script Run.php has execute rights.
at.deny is empty. at.allow does not exist.
So the question is: why does 'at' not run the command when it is scheduled via the PHP script (exec), but run the same command when it is scheduled from the command line? How can I make it run?
Thanks to all.
I found the answer by chance at stackexchange.com:
The "problem" is typically that PHP is intended to run as a module in a webserver. You may need to install the command-line version of PHP before you can run PHP scripts from the command line.
I noticed this problem after a PHP script run from cron started to time out, though it was not an issue when run manually from the command line. (PHP's max_execution_time defaults to 0 for the CLI.)
So I tried a simple cron entry such as:
50 8 * * * php -q /tmp/phpinfo.php > /tmp/phpinfo
The script would just call phpinfo().
Surprisingly, it wrote out the phpinfo in HTML format, which suggested that it was not run as CLI, and max_execution_time was 30 in the output.
Running the script manually from the command line, such as
php -q /tmp/phpinfo.php | less
wrote out the php info in text format and max_execution_time was 0 in the output.
I know there must be a configuration issue somewhere, but I just could not find where the problem is. This is happening on a production server, which I have a complete control of. Running the same script from cron on my development machine worked fine.
Here is the summary of the difference
function            | CLI                    | cron
php_sapi_name       | cli                    | cgi-fcgi
php_ini_loaded_file | /usr/local/lib/php.ini | /usr/local/lib/php.ini
I suspect your problem lies in a missing environment variable, specifically the all-important $PATH. When you run this:
php -q /tmp/phpinfo.php
the system must work out what program you mean by php. It does this by looking, in order, through the directories in the current $PATH environment variable.
Executed from a normal shell, your environment is set up in such a way that it finds the CLI version of PHP, as you expect.
However, when cron executes a command, it does so without all the environment variables that your interactive shell would set up. Since there will probably be other executables called php on your system, for different "SAPIs", it may pick the "wrong" one - in your case, the cgi-fcgi executable, according to the output you report from php_sapi_name().
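The effect is easy to reproduce: 'env -' starts a child process with an empty environment, roughly approximating what cron does, so you can see how $PATH differs (a sketch; the exact paths will vary per system):

```shell
# In a normal interactive shell, $PATH is typically long and 'php' may
# resolve to the CLI binary you expect.
echo "interactive PATH: $PATH"
# 'env -' clears the environment; the child shell falls back to a small
# built-in default PATH, much like a bare cron job would see.
env - /bin/sh -c 'echo "cron-like PATH: $PATH"'
```

Under the stripped-down PATH, a bare name like php can resolve to a different binary, or to nothing at all.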
To fix this, first find the path to the correct php executable in a normal shell by typing this:
which php
This should give you a path like /usr/bin/php. You can go one step further and check whether this is actually a "symbolic link" pointing at a different filename:
ls -l $(which php)
(you'll see an arrow in the output if it is, like /usr/bin/php -> /usr/bin/php5-cli)
Then take this full path to the PHP executable and use that in your crontab entry, so it looks something like this:
50 8 * * * /usr/bin/php5-cli -q /tmp/phpinfo.php > /tmp/phpinfo
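Alternatively, you can pin down the lookup for every job at once by setting PATH at the top of the crontab itself (a sketch; the directories are placeholders for wherever your CLI php actually lives):

```
PATH=/usr/local/bin:/usr/bin:/bin
50 8 * * * php -q /tmp/phpinfo.php > /tmp/phpinfo
```

Then a bare php in any entry resolves the same way it does in your interactive shell.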
I know that it is quite easy to figure out a terminal's size parameters via the stty -a command. When using local CLI PHP scripts, there is no problem at all on grabbing that output via system() or so.
But I am trying the same thing via a php script started from an ssh command. Sadly, all that stty ever returns is:
stty: standard input: Invalid argument.
The calling code is:
exec('stty -a | head -n 1', $query);
echo $query[0];
So the question is: if I can output to the terminal and read input from it (e.g. I can fread() from STDIN and fwrite() to STDOUT in PHP), shouldn't stty also have a valid STDIN and STDOUT?
Use ssh -t:
% php ~/src/termtest.php
speed 9600 baud; 39 rows; 127 columns;
% ssh localhost php ~/src/termtest.php
stty: stdin isn't a terminal
% ssh -t localhost php ~/src/termtest.php
speed 9600 baud; 39 rows; 127 columns;Connection to localhost closed.
SSH does not allocate a fully functional terminal by default. Shells and ncurses applications seem to be able to get one somehow, but to launch something that needs one directly from SSH you have to pass -t.
For the same reason you can e.g. launch tmux (or screen) by ssh'ing to a server and then typing tmux at the prompt, or through ssh -t _server_ tmux, but not through ssh _server_ tmux.
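You can check the same condition from the shell: '[ -t 0 ]' is true only when stdin is a terminal, which is essentially the test stty fails when no TTY has been allocated (a sketch; run it with and without a pipe to see the difference):

```shell
# Report whether stdin is attached to a terminal.
check() { if [ -t 0 ]; then echo "stdin is a TTY"; else echo "stdin is not a TTY"; fi; }
check
# Piping anything in replaces the TTY with a pipe, just as plain
# 'ssh host cmd' (without -t) gives the remote command no TTY.
echo | check
```

The second call always reports "stdin is not a TTY", which is the situation your PHP script finds itself in over a plain ssh invocation.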
I am trying to run a PHP CLI script in the background and it just won't run - it gets a status of Stopped (SIGTTOU) (trying to write output). Here are the details:
Mac OS X Lion 10.7.2
PHP 5.3.6 with Suhosin-Patch (cli) (built: Sep 8 2011 19:34:00)
I created a basic script test.php
<?php echo 'Hello world'.PHP_EOL; ?>
Here are the results of various tests:
php -f test.php (Hello world gets displayed)
php -f test.php >test.log 2>&1 (Hello world gets put into test.log)
php -f test.php >test.log 2>&1 & --- I get [1]+ Stopped(SIGTTOU) php -f test.php > test.log 2>&1, and the job just sits there doing nothing; nothing gets logged, although lsof shows the log file is open
It is something to do with PHP? A similar shell script gets executed no problems in the background.
If readline is enabled in your build of php, simply pass /dev/null as the input.
In your example above, it would be:
php -f test.php </dev/null >test.log 2>&1
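The effect of the </dev/null redirection can be sketched with plain cat standing in for readline's attempt to read the terminal: fed /dev/null, the read returns EOF immediately, so the backgrounded job runs to completion instead of stopping.

```shell
# The subshell first reads stdin (as readline would), then writes a line.
# With stdin redirected from /dev/null, 'cat' sees EOF at once and the
# job finishes in the background without being stopped by the shell.
(cat; echo "done") </dev/null >/tmp/bg_demo.log 2>&1 &
wait
cat /tmp/bg_demo.log
```

Without the </dev/null, a backgrounded job that touches the controlling terminal is what earns the Stopped status seen above.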
This is resolved now -- thanks to all who responded. The problem was that Apple provides PHP pre-built with the OS, and the CLI version was built with readline included - http://www.php.net/manual/en/intro.readline.php ... this prevents any background running of scripts, because readline automatically starts I/O with the TTY ...
My problem was that I couldn't build my own version of PHP because of this -> http://forums.macrumors.com/showthread.php?t=1284479 - once I got that resolved, my background script issue was gone :)
Well, a PHP script stops when it's done executing, i.e. a simple echo "Hello World" is done executing as soon as it has output the string; I would guess that has something to do with it ;-)