php freezes when executing an external sh script

I'll try to explain my problem as a timeline:
I'm running several external scripts from PHP and returning the exit code to the browser via another AJAX call.
A single call should start or stop a service on that machine. That works fine on the development machine:
OS: Raspbian
Webserver: nginx 1.2.1
PHP: 5.4.3.6
However, I moved the code to a larger machine with much more power, and everything seemed to work fine except one thing:
A single call causes php-fpm to freeze and never come back. On closer examination I found that the call creates a zombie process that I cannot terminate (not even with sudo).
OS: Ubuntu
Webserver: nginx 1.6.2
PHP: 5.5.9
The only remedy seems to be to stop the php-fpm process and then restart it. After that, everything works fine again until I call that script once more.
The PHP call:
exec("sudo ".$script, $output, $return_var);
(All variables are plain strings with no special characters.)
Start script
#!/bin/sh
service radicale start 2>&1
The service does start, by the way, but every time the web server freezes and I have to restart PHP manually, which is not acceptable (even for a web server). And it happens only for that single script, only for that service (radicale), and only with that one command (start).
Searching Google brought me to the point that there is a conflict between the PHP commands exec() and session_start().
Links:
https://bugs.php.net/bug.php?id=44942
https://bugs.php.net/bug.php?id=44994
Their conclusion was that the bug could be worked around with a construct like this:
...
session_write_close();
exec("sudo ".$script, $output, $return_var);
session_start();
...
But in my opinion that is no fix, just a helpless workaround, because you lose the ability to let the user know that his action fully succeeded, and instead lead him to believe an error has occurred. Even more confusing is the fact that it runs fine on the Raspberry Pi A, but not on a 64-bit machine with a much larger CPU and 8 GB of RAM.
So is there a real solution anywhere, or is this workaround the only way around the problem? I've also read an article about PHP having some problems with exec()/shell_exec() and the recognition of the return value. How can that get lost? Does anyone have a guess?
Thanks for reading this long, awful English; I'm not a native speaker and wasn't the most attentive student in my lessons.

It is likely that the new machine simply is not set up the way the Raspberry Pi was. You need to do a few things in your shell before this will work on your larger machine:
1) Allow PHP to use sudo.
sudo usermod -G sudo -a your-php-user
Note that to get the username for your-php-user, you can just run a script that says:
<?php echo get_current_user(); ?>
or alternatively:
<?php echo exec('whoami'); ?>
2) Allow that user to use sudo without a password.
sudo visudo (this command opens /etc/sudoers with a failsafe to keep you from botching anything).
Add this line at the very end:
your-php-user ALL=(ALL) NOPASSWD: /path/to/your/script,/path/to/other/script
You can put as many scripts there, separated by commas, as you need.
Now, your script should work just fine.
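To sanity-check the setup from PHP, here is a minimal sketch (the script path is a placeholder): sudo's -n flag makes it fail immediately instead of prompting for a password, so a misconfigured sudoers shows up as a non-zero exit code rather than a hang.
<?php
// Hypothetical check: -n (non-interactive) makes sudo error out instead of
// waiting forever for a password the web user can never type.
exec('sudo -n /path/to/your/script 2>&1', $output, $return_var);
echo $return_var === 0
    ? "sudo works\n"
    : "sudo failed: " . implode("\n", $output) . "\n";
?>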
AGAIN, please note that you need to change your-php-user to whatever your PHP user is.
Hope this helps!

This is not a real solution, but it's better than none.
Calling a shell script with
<?php
...
exec("sudo ".$script, $output, $return_var);
...
?>
ends, in this special case only, in a zombie process. Since php-fpm keeps waiting for a result, it holds the line, never giving up or timing out while the rest of its thread stays alive. So every other request to the PHP server sits in the queue and is never processed. That may be okay for some long-lived working threads, but my request was done in a few milliseconds.
I could not find the cause of this. As far as my debugging went, the triggered Radicale process was not at fault; it returned a clean and brave 0 every time. It seemed that the PHP process simply could not get a return line from it, and so it waited and waited.
With no time left, I changed the malfunctioning script from
#!/bin/sh
service radicale start 2>&1
to
#!/bin/sh
service radicale start > /dev/null 2>&1 &
... thereby sending every returned line to nirvana and detaching all subprocesses. For now the server does not hang itself up and works as desired. But the feeling that this may be a major bug in PHP stays in the back of my head, with the hope that someday someone may defeat that bug.
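If some feedback for the AJAX caller is still wanted after detaching, one option might be to ask the service itself afterwards. A sketch, assuming the PHP user may also run "service radicale status" via sudo (the sleep length is a guess):
<?php
// The detached script returns immediately, so $return_var no longer reflects
// whether the service actually came up; query its status instead.
exec("sudo " . $script, $output, $return_var);
usleep(500000); // give the service half a second to start (assumption)
exec("sudo service radicale status > /dev/null 2>&1", $ignored, $status);
echo json_encode(array('started' => $status === 0));
?>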

Related

Async linux shell call from php not working

I'm stuck with a problem with an async shell call (from PHP) that simply isn't async, even though it should be.
exec("nohup sudo -H -u ubuntu /usr/local/bin/purge.sh $purgeString &> /dev/null &");
The script above is hit about 17 times in this one request type, so parallel operation is a must here!
WITHOUT the above operation (commented out) the request takes about 4 seconds. WITH the above operation, it takes 15+ seconds, which definitely means that the async part isn't working.
purge.sh writes its progress to a log file, and I can tail the log and see the entries come slowly and sequentially from each of the 17 operations.
I've tried all the solutions in Asynchronous shell exec in PHP and have had zero success.
[MORE INFO]
I can run the command manually in the shell, as another user, and it works as expected. The file and folder permissions also seem to check out.
Anyone have any insights in what could be wrong?
OS: Ubuntu 14.04, PHP: 7.08, Webserver: NGINX 1.11.2.
EDIT
We scrapped this approach. But if anyone answers, I'll still mark the correct answer.
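One detail that may be worth checking in setups like this (an observation, not the poster's confirmed fix): exec() hands the command to /bin/sh, which on Ubuntu is dash, and "&> /dev/null" is a bashism that dash does not treat as "redirect stdout and stderr". The backgrounded process can then keep the pipe back to PHP open, and exec() blocks until it exits. Spelling the redirection out in POSIX syntax avoids that:
<?php
// Redirect stdout AND stderr the POSIX way, then background; the shell exits
// immediately and nothing keeps the pipe back to PHP open.
// escapeshellarg() is an addition here, assuming $purgeString is one argument.
exec('nohup sudo -H -u ubuntu /usr/local/bin/purge.sh '
    . escapeshellarg($purgeString)
    . ' > /dev/null 2>&1 &');
?>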

Issues executing vbscript through PHP on WAMP stack

I am having issues executing a VBScript through Apache (WAMP) on Windows Server 2012. I am attempting to convert a Docx to PDF, and the script runs perfectly from the command line, but fails when running through PHP. Rather than posting the vbscript, I will provide a link to it: http://bit.ly/1gngYAn
When executed through PHP as follows, WINWORD.exe starts, as does the VBScript, and it hangs there and nothing happens. No PDF is generated (and I never see the ~temporary.docx hidden file pop in the directory).
I have tried just about every iteration of exec, system, passthru and COM ( 'WScript.Shell' ), and all have the same outcome.
To avoid "escaping" issues, I also tried executing the script though a .bat file so no arguments needed to be passed, and the outcome was the same.
Here is my current php code (convert.vbs is the code from the link above):
$obj = new COM ( 'WScript.Shell' );
$obj->Run ( 'cmd /C wscript.exe //B C:\Users\Administrator\Desktop\convert.vbs c:\wamp\www\fileconv\temp_store\52fa8272bf84f.docx', 1, false );
//I have tried different "window styles" too, and it doesn't make a difference
I also tried modifying the apache service user to run as administrator (this is not a production server), and enabled "Allow service to interact with the desktop", and it had the same outcome.
I have also made sure the directories had "full control" by everyone (reading, writing, executing, etc).
It runs perfectly if I run from the command line or with my ".bat" file.
Since it hangs (the script and word, not apache), I have looked at the event viewer in the control panel, but there are no events that pertain.
My questions are: firstly, why is this happening? And secondly, if the first cannot be answered, is there a way I can get a more in-depth look at what is happening when the process executes, so as to troubleshoot further? As of now, I have no data or output to review to help me troubleshoot.
Please feel free to ask for any details. I have tried many, many iterations to try to get this to work, searched high and low, and can't seem to come up with any answers.
I appreciate your assistance,
Louis
It took me a couple of days, but here is the solution I found:
I used PsExec - http://technet.microsoft.com/en-us/sysinternals/bb897553.aspx
The following flags are required: -h -i -accepteula -u -p
(I tried without the -h, -accepteula and -i, but no dice. This is running on Windows Server 2012 under WAMP)
Here is an example:
exec('c:\psexec\PsExec -h -i -accepteula -u Administrator -p '.$password.' C:\Windows\System32\CScript.exe //Nologo //B c:\wamp\www\fileconv\convert.vbs '.$filename)
Now it executes properly and as intended.
I hope this helps someone in the same situation!
PS The WScript.Shell method of execution I used in my question works just as well as exec(), except exec() waits until the process exits.
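As a side note on robustness (not part of the poster's solution): quoting the interpolated values guards against spaces or shell metacharacters in the password or filename. A sketch using PHP's escapeshellarg():
exec('c:\psexec\PsExec -h -i -accepteula -u Administrator -p ' . escapeshellarg($password)
    . ' C:\Windows\System32\CScript.exe //Nologo //B c:\wamp\www\fileconv\convert.vbs '
    . escapeshellarg($filename));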
You should use the exec() function. Documentation: http://php.net/manual/fr/function.exec.php

suPHP and Lazarus console application running into weird shell malfunctions

I do apologize for the title, but I couldn't find any other explanation. My company is running a development server with the latest LTS Ubuntu + Apache2 + suPHP. To manage it, I am writing a Zend2 and Lazarus application. The web part with Zend runs well.
The problem is the console application written in Lazarus. It runs a couple of classes to create databases and users, to download frameworks, and so on. It also has to run a couple of shell commands for administration purposes (with root permissions). To acquire the rights, I am using a pretty ugly solution: echo mymagicpassword | sudo -S mymagiccommand.
Here's a snippet:
constructor TRootProcess.Create(AOwner: TComponent);
begin
  inherited Create(AOwner);
  Options := [poUsePipes, poWaitOnExit];
  Executable := '/bin/sh';
  Parameters.Add('-c');
  Parameters.Add('echo %pwd% | sudo -S ');
end;

function TRootProcess.ExecuteCommand(command: String): String;
var
  str: TStringList;
begin
  str := TStringList.Create;
  command := Copy(Parameters.GetText, 0, Length(Parameters.GetText) - 1) + command;
  command := StringReplace(command, '%pwd%', 'mymagicpassword', [rfReplaceAll]);
  Parameters.SetText(PChar(command));
  Execute;
  str.Clear;
  str.LoadFromStream(Output);
  Result := str.Text;
end;
If I run this application by hand, everything runs well. But if I run it from the PHP application using shell_exec, the whole application runs through (even the very last log entries) except for starting other shell applications (ls, cp, mkdir, useradd, chmod and so on).
I actually have no idea anymore what the problem is.
I don't get any errors in stdout/stderr, the suPHP log, or even the Apache2 log.
Also, running from PHP went well for about a week and then apparently stopped working.
Thanks in advance
The problem is not really well described. At the very least, the line with Copy( is wrong, since Pascal strings start at index 1, not 0.
The LoadFromStream is also not safe; especially with larger outputs it might not complete. See "TProcess large I/O" in the Lazarus/FPC wiki.
Finally, you spawn new shells. After each command is done, its shell is destroyed, and the next command gets yet another new shell. So doing "cd" is pretty pointless that way.

Invoking "php" command from a PHP script causing strange process behavior

I just moved a site from one host to another. The server environment is very similar (LAMP stack) and all the code worked when it got transferred, except one line. I've mutated it a bit for testing and am still getting very odd results:
<?php
$out = `php ../test/test.php 123 abc`;
?>
When running php ../test/test.php 123 abc from the command line in SSH, it works fine, as expected. And when I run: php testrunner.php (the file which has only the "$out" line above in it) in SSH, it also works as expected.
But once I load testrunner.php from the browser, it just hangs. Using ps aux | grep php to monitor the processes, processes seem to spawn up and die down (truncated for brevity):
myuser 12790 0.0 0.3 259016 45284 . . . 0:00 php ../test/test.php 123 abc
If I modify the "$out" line to be:
<?php
$out = `php ../test/test.php 123 abc &`;
?>
then I cause that script to run in the background. Surprisingly, a few seconds later when I run ps aux | grep php again, it shows the same stuff but with a new PID. I keep running ps aux and keep seeing it with a different PID. This continues for quite some time (several seconds, maybe even a minute).
This is very odd to me, since test.php only has a line to echo some text for testing purposes.
Works fine from the terminal. Hangs and has other weird behavior when invoked from the web. Am I missing something?
(I have evidence by redirecting output to a log file that, when run from the web browser, the PHP script seems to invoke ITSELF instead of the other script, test.php. And when it behaves like this, it doesn't receive any $argv parameters... but when I run it from the command line, all is well! Strange?)
UPDATE: Geez... I was just watching the server processes and the PHP ones of test.php started spawning out of control. They multiplied into the hundreds, maybe thousands, of processes: the server was brought down for a minute, SSH and everything. It's back up now, but I can't explain what's going on. There's no loops in the code and both the files involved are super-simple, isolated for testing purposes...
I'm working with my host as they respond to my support ticket, to see if this is environment-related or what... what could cause this to be happening, simply by changing the server environment?
My host, A Small Orange, has been helpful, but in the end, all I or they can figure is (from my support ticket):
... that SuPHP or some other security-based software we have running as part of our stack is preventing your processes from spawning new processes (because that behavior can be insecure for obvious reasons) ...
In any case, the scripts work fine on my MacBook (a very different configuration, with nginx) and on my old host's LAMP stack, which is similar to ASO's setup.
Perhaps I will ask about spawning long-running processes without invoking the command line so that the calling script isn't blocked in another question.
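For anyone hitting something similar, one thing that might be worth ruling out (an assumption, not a confirmed diagnosis of ASO's stack): if the web user's PATH resolves "php" to a CGI/FPM binary rather than the CLI one, odd re-entrant behavior like a script invoking itself can result. Calling the CLI binary by its absolute path sidesteps that; /usr/bin/php is a placeholder here, check with "which php" over SSH:
<?php
// Invoke the CLI binary explicitly and quote the script path.
$out = shell_exec('/usr/bin/php ' . escapeshellarg(__DIR__ . '/../test/test.php') . ' 123 abc');
?>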
Remove the spaces and put underscores:
$out = `php ../test/test.php_123_abc`;

php exec(svn commit) hangs

I know there's been similar questions but they don't solve my problem...
After checking out the folders from the repo (which works fine).
A method is called from jQuery to execute the following in PHP.
exec ('svn cleanup '.$checkout_dir);
session_write_close(); //Some suggestion that was supposed to help but doesn't
exec ('svn commit -m "SAVE DITAMAP" '.$file);
These would output the following:
svn cleanup USER_WORKSPACE/0A8288
svn commit -m "SAVE DITAMAP" USER_WORKSPACE/0A8288/map.ditamap
1) The first line (the exec('svn cleanup '...)) executes fine.
2) As soon as I call svn commit, my server hangs and everything goes to hell.
The apache error logs show this error:
[notice] Child 3424: Waiting 240 more seconds for 4 worker threads to finish.
I'm not using the php_svn module because I couldn't get it to compile on Windows.
Does anyone know what is going on here? I can execute the exact same command from the terminal window and it executes just fine.
Since I cannot find any documentation on jQuery exec(), I assume this is calling PHP. I copied this from the documentation page:
When calling exec() from within an Apache PHP script, make sure to take care of stdout, stderr and stdin (as in the example below). If you forget this and your shell command produces output, the sh and Apache daemons may never return (they will normally time out after a few minutes). From the calling web page the script may seem to not return any data.
If you want to start a PHP process that continues to run independently from Apache (with a different parent pid), use nohup. Example:
exec('nohup php process.php > process.out 2> process.err < /dev/null &');
hope it helps
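To make the "take care of stdout, stderr and stdin" advice concrete, here is a minimal proc_open() sketch (not the poster's code) that reads both streams to EOF, so neither the shell nor the Apache worker can block on a full pipe buffer:
<?php
$spec = array(
    0 => array('pipe', 'r'), // stdin
    1 => array('pipe', 'w'), // stdout
    2 => array('pipe', 'w'), // stderr
);
$proc = proc_open('svn commit -m "SAVE DITAMAP" ' . escapeshellarg($file), $spec, $pipes);
fclose($pipes[0]);                        // nothing to feed on stdin
$stdout = stream_get_contents($pipes[1]); // drain stdout completely
$stderr = stream_get_contents($pipes[2]); // drain stderr completely
fclose($pipes[1]);
fclose($pipes[2]);
$exit = proc_close($proc);                // the command's exit code
?>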
Okay, I've found the problem.
It actually didn't have anything to do with the exec running in the background, especially because a one-file commit doesn't take a lot of time.
The problem was that the commit expected a --username and --password that never showed up, which just caused Apache to hang.
To solve this, I edited the svnserve.conf in the folder where I installed svn to allow unauthenticated users write access.
I don't think you'd normally want to do this, but my site already authenticates the user name and pass upon logging in.
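An alternative that avoids opening anonymous write access might be to pass the credentials explicitly and tell svn never to prompt; --non-interactive makes a missing credential fail fast instead of hanging. A sketch with hypothetical $svnUser / $svnPass variables:
exec('svn commit --non-interactive --username ' . escapeshellarg($svnUser)
    . ' --password ' . escapeshellarg($svnPass)
    . ' -m "SAVE DITAMAP" ' . escapeshellarg($file) . ' 2>&1', $output, $ret);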
