It seems PHP exec is not waiting for execution

Here's the thing:
$file = "myjpg.jpg";
$runme = "/var/www/html/facedetect/facedetect " . $file;
$output = shell_exec($runme);
var_dump($output);
This returns:
NULL
But the exact same command run over SSH takes a while to complete, about 15 seconds, and it does return output.
So I'm thinking that PHP isn't waiting and exits before the command completes.
What can be done to fix this? Is there a special setting in php.ini or something?
Or is there a misconfiguration somewhere that might keep shell_exec() from passing myjpg.jpg as the argument to the executable? I'm just clueless about this.

shell_exec() returns NULL when the command fails, i.e. when it exits with a non-zero status.
That is most probably the case here.
To debug, try exec(), which always returns a value; it will show you the error.
My gut feeling is that /var/www/html/facedetect/facedetect or $file is not accessible to the user that runs PHP (probably www-data).
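A minimal debugging sketch along those lines, reusing the paths from the question; the escapeshellarg() call and the 2>&1 redirect are additions here so that error messages from the binary end up in $output:
$file = "myjpg.jpg";
$cmd = "/var/www/html/facedetect/facedetect " . escapeshellarg($file) . " 2>&1";
exec($cmd, $output, $exitCode);
echo "exit code: " . $exitCode . "\n";   // non-zero usually means the command failed
print_r($output);                        // stdout and stderr lines, one array element per line
A "permission denied" or "no such file or directory" line here would confirm the access problem suggested above.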

Related

PHP exec doesn't get stdout after long execution

I have a program written in C and compiled with gcc, which I've tested for memory leaks etc. on both Linux with Valgrind and Windows with Dr. Memory, using the relevant problematic argv arguments. I've found absolutely no problem doing so, and I get the output I expect. The only libraries included, and barely used, are stdio.h, stdlib.h, string.h and sys/time.h. The only output is printf and fopen to a specific file, which of course is closed after use; no messing around with stdout or stderr or anything like that.
I call this program from PHP with exec(), and it seems to work perfectly almost all the time, except for the few cases where the executable takes 30+ seconds or so. The problem is that I then don't get the stdout that should be assigned to the $output argument of the PHP exec() call, nor does exec() return anything (the return value should be the last line the program prints). The return-value argument gets set to 0, which really is the only value the main function in the C program returns; there's no exit() or anything else. I've of course set the time limit of the PHP script to 0, and it can run for hours just fine. It should also be noted that I run this PHP script from cmd.exe on Windows 7 with administrator privileges.
I've been debugging this so much now, and before I decided to post here, I wanted to see if the program actually reaches the printf where the output I want is printed, so I made it write the exact same string to a file right after the printf. Guess what? It wrote the expected result to that file...
From this I take it that PHP exec somehow doesn't get the stdout output after a certain amount of time. Anyone got any ideas on how to fix this?
I guess a possible fix is to make the PHP script check the created file for the output instead of getting it through the $output argument that should get it from stdout. Perhaps do this when it gets nothing from $output, but this seems like a very shabby workaround.
Is there some kind of setting I've missed?
I'm running PHP 5.6.6.
EDIT:
I've also already made sure to set this at the start of the PHP script before I even started doing any of the tests etc:
error_reporting(E_ALL);
set_time_limit(0);
Did you try to extend the maximum execution time in your server configuration (or by using set_time_limit())? The default is 30 seconds, if I remember correctly. Maybe this is causing the problem.
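If raising the limits doesn't help, the file-based fallback the asker mentions can be sketched roughly like this; the program name and output file name below are assumptions, not taken from the original post:
// Hypothetical sketch: run the binary, then read the file it writes
// whenever exec()'s $output comes back empty.
exec("myprogram.exe some-arg", $output, $exitCode);
if (empty($output) && is_readable("result.txt")) {
    // fall back to the file the C program wrote right after its printf
    $output = file("result.txt", FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
}
print_r($output);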

popen and pclose under Zend Server

I've stumbled upon a phenomenon that I can't explain myself.
I'm using popen() to launch php.exe and run a PHP script that way, and pclose() to close it.
So far so good. I ran into quite severe trouble: the script where I used this didn't execute, and after trying it 3 times in a row I crashed the Zend Server (no page would open any more). I found out that the reason for this was that I used a wrong directory for php.exe. Example:
if (pclose(popen("C:\wrongDir\php\php.exe C:\Zend\Apache2\htdocs\myApp\public\mytest.php 57 > C:\Logs\1\0\jobOut.log 2> C:\Logs\1\0\jobErr.log", "r")) > -1)
{
.....
}
Aside from "wrongDir", all other directories were correct... popen() even created the jobOut and jobErr files (which were empty). (Remark: php.exe is not in the search path, which is why it wasn't found without the correct path.)
Even though I have now solved the problem, I'm wondering whether this is normal behaviour, or whether I have done something wrong (maybe even in the server settings). From what I read in the manual about both functions, it sounded to me that in my case I should have got a return value of either -1 or 0, not the problem I ran into with the process and then the server hanging.
Thanks.
It appears that pclose() doesn't return the exit status of the process, but rather the status of its ability to close the process.
To get the process termination code, use pcntl_wifexited() and pcntl_wexitstatus().
http://php.net/manual/en/function.pclose.php
http://php.net/manual/en/function.pcntl-wexitstatus.php
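A rough sketch of that suggestion on a POSIX system (the pcntl extension is not available on Windows, so this is illustrative only; the paths are hypothetical):
$handle = popen('/usr/bin/php /path/to/mytest.php 57 2>&1', 'r');
$out = stream_get_contents($handle);   // read the child's output before closing
$status = pclose($handle);

if (pcntl_wifexited($status)) {
    echo 'exit code: ' . pcntl_wexitstatus($status) . "\n";
} else {
    echo "process did not terminate normally\n";
}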

How do I know when the task from command line is finished?

This is really important, as I could not find what I am looking for on Google.
How do I know when the application (or is it more appropriate to call it a task?) executed from the command line is done? How does PHP know if the task of copying several files is done if I do this:
exec("cp -R /test/ /var/test/test");
Does the PHP script continue to the next line of code while the command is still running in the background making copies? Or does the PHP script wait until the copy is finished? And how does a command-line application notify the script when it's done (if it does)? There must be some kind of interaction going on.
PHP's exec() returns a string, so yes, your web page will block until the command is done.
For example, take this simple code:
<?php
echo exec("sleep 5; echo HI;");
?>
When executed, the page will appear to be loading for 5 seconds, then it will display:
HI
How does PHP know if the task of copying several files is done if I do this?
PHP does not know; it simply runs the command, does not care whether it worked or not, and returns the string produced by that command. That's why it's better to use PHP's copy() function, which returns TRUE/FALSE depending on success. Or, if you are going the shell route, create a bash/sh script that returns 0/FALSE or 1/TRUE to indicate whether the command succeeded. Then in PHP you can do:
<?php
$answer = exec("yourScript folder folder2");
if ($answer == "1") {
    // Plan A worked
} else {
    // Plan A failed, try Plan B
}
?>
It waits until the exec() call returns, whatever it returns.
However, it might be that the exec() call returns even though the command it started has not yet finished. That can be the case if you detach the process, for example by explicitly putting a "&" at the end of the command.
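To make that concrete with the cp example from the question, a small sketch; the 2>&1 redirect and the exit-status check are additions, not part of the original answer:
exec("cp -R /test/ /var/test/test 2>&1", $output, $status);  // blocks here until cp exits
if ($status === 0) {
    // cp finished and reported success
} else {
    // non-zero exit code: $output contains cp's error message
}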

php shell_exec() help needed for running a script in the background

My project calls for 3 PHP scripts that are run with if-else conditions. The first script is loaded on the index page of the site to check if a condition is set, and if it is, it calls the second script. The second script checks whether other conditions are set and finally calls the last script if everything is good.
Now I could do this by just including the scripts in the if statement, but since the final result is a resource-hogging MySQL dump, I need it to run independently of the original trigger page.
Also, those scripts should continue doing their thing once triggered, regardless of the user's actions on the index page.
One last thing: it should be able to run on both Windows and *nix.
How would you do this?
Does the following code make any sense?
if ($blah != $blah_size) {
    shell_exec('php first-script.php > /dev/null 2>/dev/null &');
} else {
    // If the size matches, die
}
Thanks a million in advance.
UPDATE: just in case someone else is going through the same deal.
There seems to be a bug in PHP when running scripts as CGI, but the command-line (CLI) binary works under Apache with all the versions I've tested.
See the bug https://bugs.php.net/bug.php?id=11430
So instead I call the script like this:
exec("php-cli mybigfile.php > /dev/null 2>/dev/null &");
Or you could call it via the shell. It works on *nix systems, but my local Windows setup is hopeless, so if anyone runs it on Windows and it works, please update this.
I would not do this by shell exec because you'd have no control over how many of these resource-hogging processes would be running at any one time. Thus, a user could go click-click-click-click and essentially halt your machine.
Instead, I'd build a work queue. Instead of running the dump directly, the script would submit a record to some sort of FIFO queue (could be a database table or a text file in a dir somewhere) and then immediately return. Next you'd have a cron script that runs at regular intervals and checks the queue to see if there's any work to do. If so, it picks the oldest thing, and runs it. This way, you're assured that you're only ever running one dump at a time.
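A minimal sketch of that queue idea, using a directory of job files as the FIFO; the spool directory, file layout, and script names here are assumptions, not part of the answer:
// submit.php - called by the web request; just enqueue the job and return immediately
file_put_contents('/var/spool/myapp/' . microtime(true) . '.job', 'dump');

// worker.php - run from cron, e.g. every minute; process the oldest job, one at a time
$jobs = glob('/var/spool/myapp/*.job');
sort($jobs);                            // oldest first (timestamped file names)
if ($jobs) {
    $job = array_shift($jobs);
    unlink($job);                       // claim it before the next cron run starts
    shell_exec('php dump-script.php');  // the resource-hogging work happens here
}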
The easiest way I can think of is to do
exec("screen -d -m php long-running-script.php");
and then it will return immediately and run in the background. screen will allow you to connect to it and see what's happening.
You can also do what you're doing with 'nohup php long-running-script.php', or by writing a simple C app that does daemonize() and then execs your script.
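For the Windows side of the question, which the answers above don't cover, a commonly used pattern is to launch the script detached via cmd.exe's start /B; treat this as a sketch, not something verified on the asker's setup:
// Windows sketch: launch the script in the background and return immediately.
pclose(popen('start /B php first-script.php', 'r'));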

PHP @exec is failing silently

This is driving me crazy. I'm trying to execute a command-line statement on a Windows box for my PHP web app. It's running on Windows XP, IIS 5.1. The web app is running fine, but I cannot get @exec() to work with a specific concatenated variable. My command construction looks like this:
$cmd = ($config->svn." cat ".$this->repConfig->svnParams().quote($path).' -r '.$rev.' > '.quote($filename));
This command does not work as constructed above, which generates the following string:
svn --non-interactive --config-dir /tmp cat "file:///c:/temp/test/acccount/dbo_sproctest.sql" -r 1 > "C:\Inetpub\sites\websvn\temp\wsv5B45.tmp"
If I copy/paste this to my own command line, it works fine.
If I hard code that very same path instead of building it with the variable, it works! I've tried with and without quotes around the file name. I've tried with and without quotes around the entire command. I've tried other directories. I've tried passing an output parameter to exec(), and it comes back empty (Array ()). I've tried redirecting the error stream of the command to a file, and that error output file never gets created.
The only thing I can possibly conceive of is that exec() is failing silently. What on earth am I doing wrong here? If I hard code the file path, using the same dir structure and filename, it works fine. If I don't, it doesn't.
Maybe the backslashes (\) in the file path aren't being escaped properly, but when I build the string manually with single quotes they are not treated as escape sequences??
UPDATE:
I took the @ off of exec(), and I'm still not seeing any errors.
I gave the full path to SVN; still no luck. It should be noted that the command worked fine before with the non-full SVN path, as long as I manually specified the file destination for cat.
Update 2: RE: Kieth
I'm calling exec by trying both:
exec($cmd);
or
exec($cmd, $out);
My php.ini already had safe_mode = 0.
I added error_reporting(E_ALL); and didn't see anything new
If I echo (or print_r) what my exec call returns, I am not actually seeing anything
If I echo (or print_r) my exec call when it includes an output var, I get an empty array
Update 3
I tried both escapeshellcmd and escapeshellarg to no avail (good idea though).
I should add that the file is being created by invoking
tempnam("temp", "wbsn");
The fact that it works just fine if I manually specify the string instead of letting it be generated by tempnam() seems to suggest that this is the source of the problem, but I can't figure out how. I did a comparison of the manual string with the generated one, and it came back as a match.
@exec will always fail silently, because @ is PHP's error suppression operator.
Not sure if this will help you since you're on Windows and I'm on Linux, but I ran into this same problem of silent errors from PHP exec(). I figured out that the command I was attempting to issue (nconvert) sends its error messages to the standard error stream, not standard out. So I added
2>&1
at the end of the command line to redirect stderr into the standard output stream. Then I could see that nconvert was giving me a permission-denied error.
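Applied to the command from the question, that could look something like this for debugging. Note the original $cmd redirects stdout into a file, so for this test the trailing file redirect is dropped and everything is captured in $output; the $status variable is an addition:
$debugCmd = $config->svn . " cat " . $this->repConfig->svnParams() . quote($path) . ' -r ' . $rev . ' 2>&1';
exec($debugCmd, $output, $status);  // stderr is folded into stdout, no file redirect
echo "exit status: $status\n";
print_r($output);                   // any permission-denied message from svn shows up here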
It could be that the PATH isn't the same for your PHP script vs. your user account. Try removing the @ and see if it's trying to throw an error.
In addition, you may want to try putting the full filesystem path to the SVN executable.
Well, if removing the @ isn't working, try including this at the start of the piece of code you're running:
error_reporting(E_ALL);
That'll turn on full error reporting, just in case you have it turned down or disabled in your php.ini file.
I don't suppose you could show us how exactly you're calling exec()? Could you also check to make sure you're not accidentally running the script in safe mode? Are you echoing out what exec() is returning, and if so, what is it returning?
I'm not a big fan of adding another response, but if I just edit my previous response, you may not see it.
PHP has some special escaping functions for shell commands: escapeshellcmd() and escapeshellarg().
I think you should be able to use escapeshellcmd around your entire $cmd, but I'm not sure.
Have you tried echo exec("dir") or something simple to see if exec() is working at all?
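A sketch of how escapeshellarg() could be applied when building the command from the question; this reuses the question's variables, and whether it resolves the actual issue is untested:
$cmd = $config->svn . ' cat '
     . $this->repConfig->svnParams()
     . escapeshellarg($path)
     . ' -r ' . (int)$rev
     . ' > ' . escapeshellarg($filename);
exec($cmd, $out, $status);   // $status should be 0 if svn succeeded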
I don't know what's going on, but I at least have a workaround.
This works:
$tmp = tempnam("./", "wbsn");
$filename = dirname($tmp).'\\temp\\'.basename($tmp);
This, however, does not, but I would have expected it to generate the same path (diff file name since it's a new tempnam()).
$tmp = tempnam("temp", "wbsn");
Also, this does not work, which I also would expect to generate the same thing:
$tmp = tempnam("temp", "wbsn");
$filename = dirname($tmp).'\\'.basename($tmp);
All 3 of these solutions appear to generate the same file paths, but only the first one actually works when used in my exec. I have no clue why it does not.
Visual inspection (echo) of all 3 of these appear to generate the same paths (with the exception of filenames differing, of course). A string comparison of dirname() of each of these 3 shows as a match. I have no clue what the deal is, but the first one is a workaround.
A very useful trick when debugging shell exec problems is to place an "echo" at the beginning of the command. Make sure that you can view the standard output somewhere.
This will let you examine the command for any obvious problems. Perhaps it has a wildcard that is expanding unexpectedly, or perhaps the shell quoting is not exactly right. The echo will let you see this, and you can cut and paste the echoed command into another shell to see if it is working properly.
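As a sketch, assuming $cmd holds the command string built earlier:
exec('echo ' . $cmd, $echoed);  // prepend "echo" so the shell prints the command instead of running it
print_r($echoed);               // note: if $cmd contains a > redirect, the echoed text lands in that file instead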
My advice would be to switch from Windows/IIS to Linux, but as an alternative you can try this:
function exec_alt($cmd) {
    exec($cmd, $output);
    if (!$output) {
        /**
         * FIXME: for some reason exec() returns an empty output array on my machine.
         * Somehow the proc_open() approach (below) works, but it doesn't work on the
         * test machines - same empty output with both pipes and temporary
         * files (note we bypass the shell wrapper). So use it as a fallback.
         */
        $output = array();
        $handle = proc_open($cmd, array(1 => array('pipe', 'w')), $pipes, null, null, array('bypass_shell' => true));
        if (is_resource($handle)) {
            $output = explode("\n", stream_get_contents($pipes[1]));
            fclose($pipes[1]);
            proc_close($handle);
        }
    }
    return $output;
}
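A hypothetical usage of the sketch above, just to show the calling convention:
$lines = exec_alt('svn --version');  // hypothetical command; returns an array of output lines
print_r($lines);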
