I need to thumbnail images in a separate process while the server sends the HTTP response, so I'm exec'ing a PHP CLI script. When the script is run directly from the CLI, it works fine; but when I exec it, Imagick forces the exit status to 11 despite my exit(0). The latest point at which I can exit to prevent the 11 status is just before flattenImages is called.
PHP CLI source: http://codepad.org/WTHOiWw0 (designed for execution either as ordinary PHP or via CLI)
example CLI invocation: php -f lib/php/thumb_test.php -- img=om3e2a
issue history: https://stackoverflow.com/questions/5255
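For context, the hand-off itself is roughly this (a minimal sketch of what my server-side code does, using the invocation above; the redirection and trailing & are what let exec() return before the thumbnailer finishes):
<?php
// Launch the thumbnailer in the background; redirecting all output and
// appending & lets exec() return immediately, so the HTTP response is
// not held up by the thumbnailing work.
$img = escapeshellarg('om3e2a');
exec("php -f lib/php/thumb_test.php -- img=$img > /dev/null 2>&1 &");
?>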
I tried to minimize that test case by taking out all the validation and database interaction, but even then the 11 exit status remained.
I finally thought to check Apache's error.log, and the 11 status was accompanied by this:
PHP Warning: Module 'imagick' already loaded in Unknown on line 0
I found the solution here:
http://www.somacon.com/p520.php
Apparently I had accidentally put an extra extension="imagick.so" line in php.ini. Removing it allowed the CLI script to return status 0.
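For reference, the offending php.ini fragment looked roughly like this (layout illustrative; the fix is deleting one of the two lines):
extension="imagick.so"
; ... other extensions ...
extension="imagick.so"   ; duplicate - loading imagick twice produced the warning above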
<?php
exec("ffmpeg.exe -i something.mp4 -ss 1 -t 1 -r 1 -s 320x240 -y something.jpg");
?>
Calling this script results in a server error 500 every other request.
PHP 7.0.1, IIS 10.
I have already ruled out that the problem is related to the specific ffmpeg parameters of my call.
The execution takes less than one second, so it can't be a PHP or IIS script execution timeout.
No matter how much time passes between one "call" to the script and the next, the odd-numbered calls result in error 500 and the even-numbered calls are just fine.
Note that when I say "call" I actually mean requesting the script (i.e. http://server/script.php), whereas if I put 2, 3, or 100 calls to exec() within the same script, they will all succeed.
Edit: Quite randomly, I tried to trigger a timeout by calling the same exec("ffmpeg ...") line 100 times in a loop. To my surprise, the error 500 disappeared. So I removed the loop and added a similar pause by calling sleep(10): the error 500 returned, and it is instant - like the server fails to run the script even before parsing it. Now I am totally lost..
Any hint?
Well, it seems that changing the FastCGI protocol for PHP from Named Pipe to TCP fixed the problem.
It would still be interesting, though, to understand what makes Named Pipes fail instantly every other time. Setting Named Pipe Flushing didn't help.
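For anyone hunting for that switch: it is the protocol attribute of the FastCGI application entry in applicationHost.config, roughly like this (the php-cgi.exe path is illustrative; flushNamedPipe is the Named Pipe Flushing setting I mentioned):
<fastCgi>
    <!-- was protocol="NamedPipe"; flushNamedPipe="true" did not help either -->
    <application fullPath="C:\php\php-cgi.exe" protocol="Tcp" />
</fastCgi>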
I am trying to write a web app to run my R script.
I plan to use php shell_exec() to make this happen.
I wrote a test script like this:
<?php echo shell_exec("Rscript /var/www/html/demo/MyRScript.R"); ?>
I ran it from the command line (php test.php) and it gives the right output; I also observed through htop that the process was using 100% CPU for a couple of minutes.
However, when I tried to run it by visiting http://myURL/demo/test.php in the browser, it didn't run properly. It only printed the first few lines of my R script's output:
onbrief      Off   On
  FALSE      405    0
  TRUE         0  405
petitioner     P    R
  FALSE        0  396
  TRUE       414    0
and then stopped immediately. I cannot find the process through htop either.
I have tried other simple commands like ls and rm; they all work properly both on the command line and through the web app.
I have no idea what is wrong. Could it be that the process takes too much CPU, so some mechanism terminates it when it is called from the web?
Or is there a debug tool or method I can use to help me find the problem?
This is my first question on Stack Overflow; I don't know if my information is enough. Please tell me if more information is needed to tackle the problem.
Thank you so much.
It turned out to be a file permission problem.
In my R script, there is a line that tries to access a file which the web server user www-data did not have permission to read. After granting the permission, everything works fine.
Adding 2>&1 at the end of my command was very useful: it let me see the error message in my browser.
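In test.php that looks like this (the same command as above, with stderr merged into stdout so the browser shows the error):
<?php echo shell_exec("Rscript /var/www/html/demo/MyRScript.R 2>&1"); ?>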
I wrote a simple PHP script to execute a console program:
<?php
$cmd = escapeshellcmd('progName.exe arg1 arg2 arg3');
exec($cmd);
?>
If I run the command directly in the console on the server, it works. However, when I run the PHP script from the browser, it doesn't work. The process progName.exe is running (checked using Task Manager on the server), but it never finishes. This program is supposed to compute some parameters from the arguments and write the result to a binary file, and also produce a .WAV file. Here is the error message I get in the browser:
Error Summary
HTTP Error 500.0 - Internal Server Error
C:\php\php-cgi.exe - The FastCGI process exceeded configured activity timeout
Detailed Error Information
Module: FastCgiModule
Notification: ExecuteRequestHandler
Handler: PHP
Error Code: 0x80070102
Then I wrote a simple console program that writes a sentence to a text file (writeTxt.exe hello.txt). Using the same PHP script, I ran it from the browser and it works.
I already tried to increase the timeout on the server, but I still get the same error.
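For reference, the timeout in that error is the FastCGI activityTimeout; this is roughly how I raised it via appcmd (the php-cgi.exe path is the one from the error above):
C:\Windows\System32\inetsrv\appcmd set config -section:system.webServer/fastCgi /[fullPath='C:\php\php-cgi.exe'].activityTimeout:600 /commit:apphost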
What could cause this problem?
When you execute a program in PHP using the exec function (e.g. exec('dir')), PHP waits until it has finished, unless you send it to the background, in which case PHP returns immediately (see the documentation, especially the comments).
According to your posted PHP source ($cmd = escapeshellcmd('progName.exe arg1 arg2 arg3');), the program is not sent to the background by PHP, so what remains is that progName.exe...
...sends itself or a fork to the background (unlikely, but look into the sources of progName.exe)
...is waiting for input (<-- this is my favorite)
...or I missed something ;-)
As I said, I bet it is the second option; a sketch of both work-arounds follows. Hope that helps a bit.
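Untested sketch (Windows syntax, reusing the command from your question; use whichever line matches the diagnosis):
<?php
$cmd = escapeshellcmd('progName.exe arg1 arg2 arg3');
// 1) rule out blocking on stdin by feeding the program from NUL:
exec($cmd . ' < NUL', $output, $status);
// 2) or detach it entirely so PHP does not wait for it to finish
//    (the start /B trick comes from the manual comments mentioned above):
pclose(popen('start /B ' . $cmd, 'r'));
?>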
I have tried and failed to understand why my command-line program does not work in the Apache environment when called via PHP's exec() function. Here is the scenario:
Installed Apache 2.x and PHP 5.3.3 on CentOS 32-bit
Hard-coded a php file called myScript.php that contains a simple call to exec() such as:
exec("./imageManipuator testImage.jpg 512 512 >& output.log &");
The exec() call should run my program with the given parameters, redirect its output to the file "output.log", and do it all in a background process thanks to the trailing '&'.
I check the log and the program executes about a quarter of the way through, then terminates without a clue why.
I tried executing the PHP file from the shell via:
$ php myScript.php
The execution of the program completely finishes!!! That brings me to ask:
What is wrong with my Apache/PHP configuration that prevents my program from executing all the way?
Perhaps a permission issue? I tried changing the logged-in user to the Administrator user, but it did not change anything.
Perhaps a memory issue? Shouldn't there be a warning or notification if a memory limit has been reached? I have not verified that this is the issue.
That is my issue. I know I am using exec() correctly to execute the program in the background, because the same script works when calling the same file from the shell. What could be wrong in the Apache/PHP configuration that prevents my program from fully executing?
Any suggestion is a bonus, as I have exhausted my ideas.
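In case it helps anyone reproduce this, the foreground variant I use for debugging captures the exit status and stderr instead of silently backgrounding:
<?php
// Diagnostic variant of the call from myScript.php: run in the
// foreground, merge stderr into stdout, and report the exit status.
exec("./imageManipuator testImage.jpg 512 512 2>&1", $output, $status);
echo "exit status: $status\n" . implode("\n", $output) . "\n";
?>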
When I call /usr/local/bin/pdftk from PHP in Apache (via shell_exec(), exec(), system(), etc.), it returns the SYNOPSIS message as expected.
When I call /usr/local/bin/pdftk input.pdf fill_form input.fdf output output.pdf flatten via shell_exec(), nothing returns.
When I copy and paste the exact same string to the same path in the shell (as the apache user), the output.pdf file is generated as expected.
Moving the pdftk command into a PHP shell script (shebang is #!/usr/bin/php) and executing it with php script.php works perfectly.
Calling that shell script (with its stderr redirected to stdout) from PHP in Apache (via shell_exec('script.php');) results in this line:
sh: line 1: 32547 Segmentation fault /usr/local/bin/pdftk input.pdf fill_form input.fdf output output.pdf flatten 2>&1
Whenever I run the script from the command line (via PHP or directly), it works fine. Whenever I run the script through PHP via Apache, it either fails without any notification or gives the SegFault listed above.
It's PHP 4.3.9 on RHEL4. Please don't shoot me. I've set memory to 512M with ini_set() and made sure that the apache user had read/write to all paths (with fopen()) and by logging in as apache ...
Just went and checked /var/log/messages to find this:
Oct 4 21:17:58 discovery kernel: audit(1286241478.692:1764638):
avc: denied { read } for pid=32627 comm="pdftk" name="zero"
dev=tmpfs ino=2161 scontext=root:system_r:httpd_sys_script_t
tcontext=system_u:object_r:zero_device_t tclass=chr_file
NOTE: Disabling SELinux "fixed" the problem. Has this moved into a ServerFault question? Can anybody give me the 30-second SELinux access controls primer here?
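From what I gather, the usual alternative to disabling SELinux outright is to turn the logged denials into a local policy module, something like this (audit2allow ships with the policycoreutils tooling; the log path varies by setup, and on this box the denials landed in /var/log/messages):
grep pdftk /var/log/messages | audit2allow -M pdftklocal
semodule -i pdftklocal.pp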
php-cli and php-cgi (or the module, depending on what your server uses) are different binaries. They don't even have to share the same version to live happily side by side on your server, and they may not share the same configuration. Increasing memory usually does nothing to help segfaults. Points to check:
Are they the same version?
Do they have the same settings? Consult the *.ini locations loaded in the phpinfo() output (and possibly the whole output itself); if not, try what happens when you align the webserver's configuration with the CLI's as far as possible (a quick check script follows this list).
Segfaults occur more often in extensions than in the core afaik, and sometimes in seemingly unrelated ones. Try disabling unneeded extensions one by one to see if the problem goes away.
Still no success? You may want to run Apache under gdb. I have no experience with that, but it might tell you something.
No luck? Recompile either the module or the CGI binary your webserver uses.
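A quick check script for the first two points (run it once from the shell and once through the webserver; php_ini_loaded_file() needs PHP >= 5.2.4, on older setups read the "Loaded Configuration File" line from phpinfo() instead):
<?php
// Prints which binary (SAPI), version and ini file answered; differences
// between the CLI run and the web run confirm the split described above.
echo 'PHP ' . PHP_VERSION . ' via ' . PHP_SAPI . "\n";
echo 'ini: ' . php_ini_loaded_file() . "\n";
?>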
It's PHP 4.3.9 on RHEL4. Please don't shoot me.
I feel more sadness for you than anger; we're beyond the 5.3 mark, come over, it's a lot happier here.