I have tried and failed to understand why my command line program does not work in an Apache environment when called through PHP's exec() function. Here is the scenario:
Installed Apache 2.x and PHP 5.3.3 on CentOS 32-bit
Created a PHP file called myScript.php that contains a simple call to exec(), such as:
exec("./imageManipuator testImage.jpg 512 512 >& output.log &");
The exec() call should run my program with the given parameters, redirect all of its output to the file "output.log", and do all of this in a background process (the trailing '&').
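For clarity, the redirection I intend is equivalent to the more explicit sh form below (same program and arguments; only the redirection syntax differs):
// stdout and stderr both go to output.log, and the trailing & backgrounds the command
exec("./imageManipuator testImage.jpg 512 512 > output.log 2>&1 &");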
I check the log and the program gets about a quarter of the way through before terminating, with no clue as to why.
I tried executing the PHP file directly from the command line with:
$ php myScript.php
The program runs all the way to completion! That leads me to ask:
What is wrong with my Apache/PHP configuration that prevents my program from executing all the way?
Perhaps a permissions issue? I tried changing the logged-in user to the Administrator user, but it did not change anything.
Perhaps a memory issue? Shouldn't there be a warning or notification if a memory limit has been reached? I have not verified whether this is the issue.
That is my issue. I know I am using exec() correctly to execute the program in the background, because the same script works when I run the same file from the command line. What could be wrong in the Apache/PHP configuration that prevents my program from fully executing?
Any suggestions are welcome, as I have exhausted my ideas.
Related
I know there have been similar questions, but they don't solve my problem...
After checking out the folders from the repo (which works fine), a method is called from jQuery to execute the following in PHP:
exec('svn cleanup ' . $checkout_dir);
session_write_close(); // some suggestion that was supposed to help, but doesn't
exec('svn commit -m "SAVE DITAMAP" ' . $file);
The resulting commands look like this:
svn cleanup USER_WORKSPACE/0A8288
svn commit -m "SAVE DITAMAP" USER_WORKSPACE/0A8288/map.ditamap
1) The first line (the exec('svn cleanup') call) executes fine.
2) As soon as I call svn commit, my server hangs and everything goes to hell.
The apache error logs show this error:
[notice] Child 3424: Waiting 240 more seconds for 4 worker threads to finish.
I'm not using the php_svn module because I couldn't get it to compile on Windows.
Does anyone know what is going on here? I can execute the exact same command from a terminal window and it runs just fine.
Since I cannot find any documentation on a jQuery exec(), I assume this is calling PHP. I copied this from the documentation page:
When calling exec() from within an Apache PHP script, make sure to take care of stdout, stderr and stdin (as in the example below). If you forget this and your shell command produces output, the sh and Apache daemons may never return (they will normally time out after a few minutes). From the calling web page the script may seem not to return any data.
If you want to start a PHP process that continues to run independently from Apache (with a different parent pid), use nohup. Example:
exec('nohup php process.php > process.out 2> process.err < /dev/null &');
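Applied to the commit from the question, a minimal sketch of the first point (taking care of stdout, stderr and stdin) could look like this; the log path is just an illustration:
// send stdout and stderr to a log file, read stdin from /dev/null,
// and put the command in the background so Apache doesn't wait on it
exec('svn commit -m "SAVE DITAMAP" ' . $file . ' > /tmp/svn_commit.log 2>&1 < /dev/null &');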
Hope it helps.
Okay, I've found the problem.
It actually didn't have anything to do with the exec running in the background, especially because a one-file commit doesn't take a lot of time.
The problem was that the commit was expecting a --username and --password that never showed up, which just caused Apache to hang.
To solve this, I edited the svnserve.conf in the folder where I installed svn to allow non-authenticated users write access.
I don't think you'd normally want to do this, but my site already authenticates the username and password upon logging in.
Alternatively, you could pass the --username and --password options to the svn commit command itself.
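As a rough sketch (the credential variables here are placeholders, not something from my actual code):
// pass the credentials explicitly and disable interactive prompting,
// so the commit can never hang waiting for input
exec('svn commit -m "SAVE DITAMAP" --username ' . escapeshellarg($svn_user) .
     ' --password ' . escapeshellarg($svn_pass) .
     ' --non-interactive ' . $file);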
I am having an issue using the PHP function shell_exec().
I have an application which I can run from the Linux command line perfectly fine. The application takes several hours to run, so I am trying to spawn a new instance using shell_exec() to manage it better. However, when I run the exact same command (which works on the command line) through shell_exec(), it returns an empty string, and it doesn't look like any new process was started. Plus, it completes almost instantly. shell_exec() is supposed to wait until the command has finished, correct?
I have also tried variations of exec() with the same outcome.
Does anyone have any idea what could be going on here?
There are no symbolic links or anything funky in the command: just the path to the application and a few command line parameters.
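For reference, the call has this general shape (the application path and arguments are placeholders for the real ones); dumping the result is how I see the empty string:
// run the application, capturing stderr along with stdout so any error message is visible
$output = shell_exec('/path/to/myapp input.dat 42 2>&1');
var_dump($output);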
Something with your env.
Compare the output of env from the CLI (command line interface) and from the PHP script.
Also check which shell interpreter is being used.
And do the script and the CLI application run as the same user?
If so, also look at the safe_mode option.
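For example, something like this on the PHP side, compared against running env and echo $SHELL at the command line (just a sketch):
<?php
// dump the environment, shell and user that the web server's PHP actually sees
echo shell_exec('env');
echo shell_exec('echo $SHELL');
echo shell_exec('id');
?>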
Make sure the user Apache runs as (probably www-data) has access to the files and that they are executable (check with ls -la). A simple chmod 777 [filename] would fix that.
By default, PHP will time out after 30 seconds. You can disable the limit like this:
<?php
set_time_limit(0);
?>
Edit:
Also consider this: http://www.rabbitmq.com/
It's a bit complicated. :-) I have a website written in PHP that needs to call an external program written in .NET, running under Mono. It calls the program with:
/path/mono /path/executable arguments...
as per usual (I also tried to put this into a bash script and call the script itself from PHP).
As far as I can tell, everything is set up as required: Mono itself runs all right, the executable (or script) has execute rights, and the owners are all right. The correct setup can also be ascertained from the fact that if I issue the command line above (or call the script mentioned) from a command line on the server (Debian Lenny), everything works perfectly and my executable runs without error.
But if PHP calls the same command (using exec(), system() or any other variant), I immediately get an exit code of 6 from Mono and my executable is not run at all. (To test this, I use a "Hello World" executable that does nothing but emit a single output line and return 0, compiled without any additional dependencies; a genuine "Hello World".) It is absolutely certain that this program doesn't return exit code 6; it must come from Mono, but I couldn't find it documented anywhere.
I can call anything else from PHP, so it's not safe mode or any similar restriction that would keep me from executing external programs from PHP.
Thanks for any idea...
Try executing Mono from the command line, but as the user running the web server (www-data or apache), e.g.:
#su apache
$/path/to/mono /path/to/program.exe
The common problems I have had using exec() are differing environment variables (check whether Apache is using bash or sh) and permissions for the Apache user/group, etc.
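It may also help to capture the exit code and any error output from within PHP itself, roughly like this (using the paths from the question):
<?php
// run the command, collecting stdout+stderr and the exit code for inspection
exec('/path/mono /path/executable 2>&1', $output, $exitCode);
echo "exit code: $exitCode\n";
print_r($output);
?>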
I need to run a Python script in the background after being called from a PHP file. The PHP file should continue to run independently of the Python script (i.e. it shouldn't hang waiting for the Python script to finish processing, but should instead carry on processing itself).
The Python script takes one argument and produces no output (it merely processes some data in the background), then exits. I'm running Python 2.6, PHP 5.2.6, and Ubuntu 9.04.
You could use exec() to kick off the Python interpreter and have it send its output either to a file or to /dev/null via redirection. Using the & operator in the exec call will cause the command to be started in the background, and PHP will continue without waiting for a result.
http://www.developertutorials.com/tutorials/php/running-background-processes-in-php-349/ goes into more detail.
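A minimal sketch of that approach (the script path and argument are placeholders):
<?php
// launch the Python script in the background, throw away its output,
// and let PHP carry on immediately without waiting for it
$arg = 'some-value';
exec('python /path/to/script.py ' . escapeshellarg($arg) . ' > /dev/null 2>&1 &');
?>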
PHP Process Control can be used for this. The proc_open() function can be used to start a process; you can later check up on it, read its output, etc.
See the manual entry: http://www.php.net/manual/en/function.proc-open.php and search around Google for "PHP Process Control".
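A minimal proc_open() sketch along those lines (the script path and argument are placeholders):
<?php
// start the Python script with explicit handles on its standard streams
$descriptors = array(
    0 => array('pipe', 'r'),                     // child's stdin
    1 => array('pipe', 'w'),                     // child's stdout
    2 => array('file', '/tmp/script.err', 'a'),  // child's stderr appended to a log
);
$process = proc_open('python /path/to/script.py some-arg', $descriptors, $pipes);
if (is_resource($process)) {
    fclose($pipes[0]);                   // nothing to send on stdin
    $status = proc_get_status($process); // check up on it whenever you like
    echo $status['running'] ? "still running\n" : "already finished\n";
    fclose($pipes[1]);
    proc_close($process);                // note: proc_close() waits for the process to exit
}
?>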
I'm guessing the PHP file is called via Apache, in which case you won't be able to fork(). You should make your Python script daemonize. Check out python-daemon.
You could use:
<?php
shell_exec('./test.sh &');
?>
where ./test.sh is the command line that runs your script.
I am trying to run a php script on my remote Virtual Private Server through the command line. The process I follow is:
Log into the server using PuTTY
At the command prompt, type: php myScript.php
The script runs just fine. BUT THE PROBLEM is that the script stops running as soon as I close the PuTTY console window.
I need the script to keep on running endlessly. How can I do that? I am running Debian on the server.
Thanks in advance.
I believe that Ben has the correct answer, namely to use the nohup command. nohup stands for "no hangup" and means that your program should ignore the hangup signal, which is generated when your PuTTY session is disconnected, either because you log out or because you have been timed out.
You need to be aware that the output of your command will be appended to a file in the current directory named nohup.out (or $HOME/nohup.out if permissions prevent you from creating nohup.out in the current directory). If your program generates a lot of output, this file can get very large; alternatively, you can use shell redirection to send the output of the script to another file.
nohup php myscript.php >myscript.output 2>&1 &
This command will run your script and send all output (both standard and error) to the file myscript.output which will be created anew each time you run the program.
The final & causes the script to run in the background, so you can do other things whilst it is running, or log out.
An easy way is to run it though nohup:
nohup php myScript.php &
If you run the php command inside a screen session and then detach the session, the script won't terminate when you close your console.
Screen is a terminal multiplexer that allows you to manage many processes through one physical terminal. Each process gets its own virtual window, and you can bounce between virtual windows interacting with each process. The processes managed by screen continue to run when their window is not active.
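A typical session looks roughly like this:
screen -S myscript       # start a new named screen session
php myScript.php         # run the script inside it
# press Ctrl-a then d to detach; the script keeps running in the background
screen -r myscript       # reattach later to check on it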