Invoking "php" command from a PHP script causing strange process behavior - php

I just moved a site from one host to another. The server environment is very similar (LAMP stack) and all the code worked when it got transferred, except one line. I've mutated it a bit for testing and am still getting very odd results:
<?php
$out = `php ../test/test.php 123 abc`;
?>
When running php ../test/test.php 123 abc from the command line in SSH, it works fine, as expected. And when I run: php testrunner.php (the file which has only the "$out" line above in it) in SSH, it also works as expected.
But once I load testrunner.php from the browser, it just hangs. Using ps aux | grep php to monitor the processes, I can see processes spawn and die off (output truncated for brevity):
myuser 12790 0.0 0.3 259016 45284 . . . 0:00 php ../test/test.php 123 abc
If I modify the "$out" line to be:
<?php
$out = `php ../test/test.php 123 abc &`;
?>
then I cause that script to run in the background. Surprisingly, a few seconds later when I run ps aux | grep php again, it shows the same stuff but with a new PID. I keep running ps aux and keep seeing it with a different PID. This continues for quite some time (several seconds, maybe even a minute).
This is very odd to me, since test.php only has a line to echo some text for testing purposes.
Works fine from the terminal. Hangs and has other weird behavior when invoked from the web. Am I missing something?
(I have evidence by redirecting output to a log file that, when run from the web browser, the PHP script seems to invoke ITSELF instead of the other script, test.php. And when it behaves like this, it doesn't receive any $argv parameters... but when I run it from the command line, all is well! Strange?)
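For reference, a minimal sketch of the kind of logging used for that check (the log path and echoed text are placeholders of mine, not the original code):
<?php
// Hypothetical debugging version of test.php: record which file actually ran
// and which arguments it received, to confirm the self-invocation suspicion.
$args = isset($argv) ? implode(' ', $argv) : '(no $argv)';
file_put_contents('/tmp/test_php_debug.log',
    date('c') . '  file=' . __FILE__ . '  args=' . $args . "\n",
    FILE_APPEND);
echo "test output\n";
?>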
UPDATE: Geez... I was just watching the server processes, and the PHP processes for test.php started spawning out of control. They multiplied into the hundreds, maybe thousands, of processes; the server was brought down for a minute, SSH and everything. It's back up now, but I can't explain what's going on. There are no loops in the code, and both files involved are super simple, isolated for testing purposes...
I'm working with my host as they respond to my support ticket, to see if this is environment-related or what... what could cause this to be happening, simply by changing the server environment?

My host, A Small Orange, has been helpful, but in the end, all I or they can figure is (from my support ticket):
... that SuPHP or some other security-based software we have running as part of our stack is preventing your processes from spawning new processes (because that behavior can be insecure for obvious reasons) ...
In any case, the scripts work fine on my MacBook (a very different configuration, with nginx) and on my old host's LAMP stack, which is similar to ASO's setup.
Perhaps I will ask about spawning long-running processes without invoking the command line so that the calling script isn't blocked in another question.
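For what it's worth, the usual shell-level way to launch a long-running worker without blocking the caller is to background it and redirect its output, so the backtick/shell_exec() call isn't left waiting on the child's stdout pipe. A minimal sketch, with placeholder paths:
<?php
// Sketch only: send the worker's output to a log file and background it,
// so the calling script returns immediately instead of blocking on the pipe.
$cmd = 'php ' . escapeshellarg('../test/test.php') . ' 123 abc'
     . ' > /tmp/test_php.log 2>&1 &';
shell_exec($cmd);
?>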

Remove the spaces and use underscores:
$out = `php ../test/test.php_123_abc`;

Related

How to use PHP to execute AutoHotKey script on Windows Server 2016?

I have a Windows Server 2016 VPS with Plesk and PHP 7.1x.
I am trying to execute a simple AutoHotKey script from PHP using the following command:
<?php shell_exec('start /B "C:\Program Files\AutoHotkey\AutoHotkey.exe" C:\inetpub\vhosts\mydomain.com\App_Data\myahkscript.ahk'); ?>
This is the only line on the page. I have tried different ahk scripts, the current one simply creates a MsgBox.
When I execute my php page, on VPS Task Manager I see three processes created with the expected USR: cmd.exe, conhost.exe and php-cgi.exe. However, my PHP page just sits waiting on the server and nothing actually happens on the server.
I have also tried the same line except replacing shell_exec with exec. This seems to make no difference. I have tried without start /b with both commands. In that case the PHP page completes but no new processes are started.
I cannot find any errors in any logs: Mod_Security, Plesk Firewall, IIS.
Any ideas?
EDIT:
I tried my command from the VPS command prompt and was immediately slapped in the face with the obvious issue: the space in 'Program Files'. I quoted the string as shown above and the command works. This eliminated the hang when running from PHP. However, the command still does nothing when executed from the web page.
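For reference, a hedged sketch of how the fully quoted call might look from PHP (same paths as above). Note that cmd.exe's start treats the first quoted argument as a window title, so an empty "" title is supplied before the quoted program path:
<?php
// Sketch only: quote the space-containing paths and give start an explicit
// empty window title so it does not treat the program path as the title.
$cmd = 'start "" /B "C:\Program Files\AutoHotkey\AutoHotkey.exe" '
     . '"C:\inetpub\vhosts\mydomain.com\App_Data\myahkscript.ahk"';
shell_exec($cmd);
?>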
EDIT:
Based on suggestions from the referenced post 'debugging exec()':
var_dump: string(0)""
$output: Array()
$return_val: 1
One point was that I would probably not be able to invoke GUI applications. That puts a damper on the idea.
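A minimal sketch of the exec() debugging referenced above: capture the last output line, the full output array, and the exit code, and merge stderr into stdout so error text from cscript/cmd.exe is not silently dropped (the script path is a placeholder):
<?php
// Sketch only: capture everything exec() can tell us about the command.
$cmd = 'cscript.exe //Nologo C:\path\to\script.vbs 2>&1';
$last = exec($cmd, $output, $return_val);
var_dump($last, $output, $return_val);
?>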

php freezes when executing an external sh script

I'll try to explain my problem as a timeline:
I'm trying to run several external scripts from PHP and return their exit codes to the client via an AJAX call.
A single call should start or stop a service on that machine. That works fine on my development machine:
OS: Raspbian
Web server: nginx 1.2.1
PHP: 5.4.3.6
However, I've moved the code to a larger machine with much more power, and everything seemed to work fine except for one thing:
A single call causes php-fpm to freeze and never come back. On closer examination I found that the call created a zombie process I cannot terminate (even with sudo).
OS: Ubuntu
Web server: nginx 1.6.2
PHP: 5.5.9
The only solution seemed to be to stop the php-fpm process and then restart it. After that, everything works fine again until I call that script again.
The calling PHP line:
exec("sudo ".$script, $output, $return_var);
(All variables are plain strings with no special characters.)
Start script
#!/bin/sh
service radicale start 2>&1
The service does start, by the way, but every time the web server freezes and I have to restart PHP manually, which is not acceptable (even for a web server). And it happens only for that single script, only for that service (radicale), and only with that one command (start).
Searching Google pointed me to a conflict between the PHP functions exec() and session_start().
Links:
https://bugs.php.net/bug.php?id=44942
https://bugs.php.net/bug.php?id=44994
Their conclusion was that the bug can be worked around with a construct like this:
...
session_write_close();
exec("sudo ".$script, $output, $return_var);
session_start();
...
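Spelled out as a self-contained sketch (assuming $script holds the command to run, as in the snippet above):
<?php
session_start();

$script = '/path/to/start-radicale.sh'; // placeholder path

// Close (and unlock) the session before the potentially long exec() call,
// as the linked bug reports suggest.
session_write_close();
exec("sudo ".$script, $output, $return_var);

// Re-open the session if later code still needs to write to $_SESSION.
session_start();
?>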
But in my opinion that is not debugging, just a helpless workaround, because you lose the ability to let the user know that his action fully succeeded, and instead risk making him believe an error has occurred. Even more confusing is the fact that it runs fine on the Raspberry Pi, but not on a 64-bit machine with a much bigger CPU and 8 GB of RAM.
So is there a real solution anywhere, or is this workaround the only way to solve the problem? I've read an article about PHP having some problems with exec()/shell_exec() and recognizing the return value. How can that get lost? Does anyone have a guess?
Thanks for reading my long, awful English; I'm not a native speaker and wasn't the most attentive student in my lessons.
It is likely that the new machine simply is not set up the way the Raspberry Pi was.
You need to do a few things in your shell before this will work on your larger machine:
1). Allow php to use sudo.
sudo usermod -G sudo -a your-php-user
Note that to get the username for your-php-user, you can just run a script that says:
<?php echo get_current_user(); ?> - or alternatively:
<?php echo exec('whoami'); ?> -
2). Allow that user to use sudo without a password
sudo visudo - this command will open /etc/sudoers with a failsafe to keep you from botching anything.
Add this line to the very end:
your-php-user ALL=(ALL) NOPASSWD: /path/to/your/script,/path/to/other/script
You can put as many scripts there, separated by commas, as you need.
Now, your script should work just fine.
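With both steps in place, a minimal sketch of the calling side (the script path is the placeholder from the sudoers line above):
<?php
// Sketch only: /path/to/your/script must match one of the NOPASSWD entries
// added to /etc/sudoers, and the PHP user must be allowed to use sudo.
exec('sudo /path/to/your/script', $output, $return_var);
echo $return_var === 0 ? "started\n" : "failed with exit code $return_var\n";
?>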
AGAIN, please note that you need to change your-php-user to whatever your php user is.
Hope this helps!
This is not a real solution, but it's a better solution than none.
Calling a bash script with
<?php
...
exec("sudo ".$script, $output, $return_var);
...
?>
ends, in this special case only, in a zombie process. As php-fpm waits for a result, it holds the line, never giving up and never timing out, for as long as its thread lives. So every other request to the PHP server stays queued and is never processed. That might be acceptable for some long-living worker threads, but my request was finished within milliseconds.
I did not find the cause of this. As far as my debugging could tell, the triggered Radicale process was not at fault; it always returned a clean 0. It seemed that the PHP process simply never got a return line from it, and so it waits and waits.
With no time left, I changed the malfunctioning script from
#!/bin/sh
service radicale start 2>&1
to
#!/bin/sh
service radicale start > /dev/null 2>&1 &
... thereby sending every returned line to nirvana and detaching all subprocesses. For now the server no longer hangs and works as desired. But the feeling that this may be a major bug in PHP stays in the back of my head, with the hope that someday someone may defeat it.

Issues executing vbscript through PHP on WAMP stack

I am having issues executing a VBScript through Apache (WAMP) on Windows Server 2012. I am attempting to convert a Docx to PDF, and the script runs perfectly from the command line, but fails when running through PHP. Rather than posting the vbscript, I will provide a link to it: http://bit.ly/1gngYAn
When executed through PHP as follows, WINWORD.exe starts, as does the VBScript, and it hangs there and nothing happens. No PDF is generated (and I never see the ~temporary.docx hidden file pop in the directory).
I have tried just about every iteration of exec, system, passthru and COM ( 'WScript.Shell' ), and all have the same outcome.
To avoid "escaping" issues, I also tried executing the script though a .bat file so no arguments needed to be passed, and the outcome was the same.
Here is my current php code (convert.vbs is the code from the link above):
$obj = new COM ( 'WScript.Shell' );
$obj->Run ( 'cmd /C wscript.exe //B C:\Users\Administrator\Desktop\convert.vbs c:\wamp\www\fileconv\temp_store\52fa8272bf84f.docx', 1, false );
//I have tried different "window styles" too, and it doesn't make a difference
I also tried modifying the apache service user to run as administrator (this is not a production server), and enabled "Allow service to interact with the desktop", and it had the same outcome.
I have also made sure the directories had "full control" by everyone (reading, writing, executing, etc).
It runs perfectly if I run from the command line or with my ".bat" file.
Since it hangs (the script and word, not apache), I have looked at the event viewer in the control panel, but there are no events that pertain.
My question is, firstly, why is this happening, and secondly, if the first cannot be answered, is there a way I can get a more in-depth look at what is happening when the process is executed, so as to troubleshoot it further? As of now, I have no data or output to review to help me troubleshoot.
Please feel free to ask for any details. I have tried many, many iterations to try to get this to work, searched high and low, and can't seem to come up with any answers.
I appreciate your assistance,
Louis
It took me a couple of days, but here is the solution I found:
I used PsExec - http://technet.microsoft.com/en-us/sysinternals/bb897553.aspx
The following flags are required: -h -i -accepteula -u -p
(I tried without the -h, -accepteula and -i, but no dice. This is running on Windows Server 2012 under WAMP)
Here is an example:
exec('c:\psexec\PsExec -h -i -accepteula -u Administrator -p '.$password.' C:\Windows\System32\CScript.exe //Nologo //B c:\wamp\www\fileconv\convert.vbs '.$filename);
Now it executes properly and as intended.
I hope this helps someone in the same situation!
PS The WScript.Shell method of execution I used in my question works just as well as exec(), except exec() waits until the process exits.
You should use the exec() function.
Here is the URL: http://php.net/manual/fr/function.exec.php

suPHP and Lazarus console application running into weird shell malfunctions

I do apologize for the title, but I couldn't find any other explanation. My company is running a development server with the latest LTS Ubuntu + Apache2 + suPHP. To manage it, I am writing a Zend2 and Lazarus application. The web part with Zend runs well.
The problem is the console application written in Lazarus. It runs a couple of classes to create databases and users, to download frameworks, and so on. It should also run a couple of shell commands for administration purposes (with root permissions). To acquire the rights, I am using a pretty ugly solution: echo mymagicpassword | sudo -S mymagiccommand.
Here's a snippet:
constructor TRootProcess.Create(AOwner: TComponent);
begin
  inherited Create(AOwner);
  Options := [poUsePipes, poWaitOnExit];
  Executable := '/bin/sh';
  Parameters.Add('-c');
  Parameters.Add('echo %pwd% | sudo -S ');
end;

function TRootProcess.ExecuteCommand(command: String): String;
var
  str: TStringList;
begin
  str := TStringList.Create;
  command := Copy(Parameters.GetText, 0, Length(Parameters.GetText) - 1) + command;
  command := StringReplace(command, '%pwd%', 'mymagicpassword', [rfReplaceAll]);
  Parameters.SetText(PChar(command));
  Execute;
  str.Clear;
  str.LoadFromStream(Output);
  Result := str.Text;
end;
If I run this application by hand, everything runs well. But if I run it from the PHP application using shell_exec(), the whole application runs (right down to the very last log entries) except for starting the other shell applications (ls, cp, mkdir, useradd, chmod and so on).
I actually have no idea what the problem is anymore.
I don't get any errors in stdout/stderr, suPHP log or even Apache2 log.
Also, running from PHP went well for about a week and then apparently stopped working.
Thanks in advance
The problem is not really well described. At the very least, the line with Copy( is wrong, since strings start with index 1, not 0.
The LoadFromStream is also not safe. Especially with larger outputs it might not complete. See "TProcess large I/O" in the Lazarus/FPC wiki.
Finally, you spawn new shells. After the command is done, the shell will be destroyed, and the next command will have yet another new shell. So doing "cd" is pretty pointless that way.

PHP shell_exec() issue

I am having an issue using the PHP function shell_exec().
I have an application which I can run from the Linux command line perfectly fine. The application takes several hours to run, so I am trying to spawn a new instance using shell_exec() to manage it better. However, when I run the exact same command (which works on the command line) through shell_exec(), it returns an empty string, and it doesn't look like any new processes were started. Plus it completes almost instantly. shell_exec() is supposed to wait until the command has finished, correct?
I have also tried variations of exec() with the same outcome.
Does anyone have any idea what could be going on here?
There are no symbolic links or anything funky in the command: just the path to the application and a few command line parameters.
It could be something with your env.
Compare the output of env from the CLI (command line interface) and from the PHP script.
Also check which shell interpreter is being used.
And do the script and the CLI application run as the same user?
If so, see the safe_mode option.
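A minimal sketch of that comparison (assuming shell_exec() is not disabled for the web user):
<?php
// Sketch only: dump what the web-invoked PHP actually sees, then compare it
// with `env`, `which php` and `whoami` run from an interactive SSH shell.
echo shell_exec('env');
echo shell_exec('which php');
echo shell_exec('whoami');
?>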
Make sure the user Apache runs as (probably www-data) has access to the files and that they are executable (check with ls -la). A simple chmod 777 [filename] would fix that.
By default PHP will timeout after 30 sec. You can disable the limit like this:
<?php
set_time_limit(0);
?>
Edit:
Also consider this: http://www.rabbitmq.com/
