PHP and shell_exec - php

I have a PHP website and I would like to execute a very long-running Python script in the background (about 300 MB of memory and 100 seconds). The processes communicate via the database: when the Python script finishes its job, it updates a field in the database, and then the website renders some graphics based on the results of the Python script.
I can execute "manually" the Python script from bash (any current directory) and it works. I would like to integrate it in PHP and I tried the function shell_exec:
shell_exec("python /full/path/to/my/script") but it's not working (I don't see any output)
Do you have any ideas or suggestions? It worths to mention that the python script is a wrapper over other polyglot tools (Java mixed with C++).
Thanks!

shell_exec returns the command's output as a string; calling it on its own won't print anything, so you can write:
$output = shell_exec(...);
print $output;

First off, set_time_limit(0); will let your script run forever, so the PHP timeout shouldn't be an issue. Second, the *exec calls in PHP do NOT necessarily use your PATH (this may depend on configuration), so the command can fail without giving any information about the problem, and it quite often turns out that it simply can't find the program, in this case python. So change it to:
shell_exec("/full/path/to/python /full/path/to/my/script");
If your Python script runs on its own without problems, then it's very likely this is the issue. As for the memory, I'm pretty sure PHP won't use the same memory Python is using, so if the script uses 300 MB, PHP should stay at its default (say 1 MB) and just wait for shell_exec to finish.
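Putting both points together, a minimal test could look like this (a sketch; the /usr/bin/python path is just an assumption, adjust it to where your python actually lives):
<?php
// Minimal sketch: absolute interpreter path, merge stderr into the output, print whatever comes back.
set_time_limit(0);                                        // don't let PHP's time limit kill the wait
$cmd = '/usr/bin/python /full/path/to/my/script 2>&1';    // assumed python path; 2>&1 captures errors too
$output = shell_exec($cmd);
print $output === null ? "shell_exec returned nothing (error or no output)\n" : $output;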

A problem could be that your script takes longer than the waiting time defined for a request on the server (this can be set in php.ini or httpd.conf).
Another issue could be that the server's account does not have the rights to execute or access the code or files your script needs to run.

I found this a while ago and it helped me solve my background-execution problem:
function background_exec($command)
{
    if (substr(php_uname(), 0, 7) == 'Windows')
    {
        // On Windows, "start" launches the command in a new process so PHP does not block.
        pclose(popen('start "background_exec" ' . $command, 'r'));
    }
    else
    {
        // On other systems, discard output and append & so the shell returns immediately.
        exec($command . ' > /dev/null &');
    }
}
Source:
http://www.warpturn.com/execute-a-background-process-on-windows-and-linux-with-php/
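For the script from the question, a call could then look something like this (the python path is an assumption):
background_exec('/usr/bin/python /full/path/to/my/script');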

Thanks for your answers, but none of them worked :(. I decided to implement it in a dirty way, using busy waiting, instead of triggering an event when a record is inserted.
I wrote a background process that runs forever and, at each iteration, checks whether there is something new in the database. When it finds a record, it executes the script and everything is fine. The idea is that I launch this background process from the shell.
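A minimal sketch of what that worker might look like (the jobs table, its columns and the connection details here are made up for illustration, not part of my actual setup):
<?php
// worker.php - busy-waiting loop that polls the database and runs pending commands.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
while (true) {
    $stmt = $pdo->query("SELECT id, command FROM jobs WHERE status = 'pending' ORDER BY id LIMIT 1");
    $job  = $stmt->fetch(PDO::FETCH_ASSOC);
    if ($job) {
        $pdo->prepare("UPDATE jobs SET status = 'running' WHERE id = ?")->execute(array($job['id']));
        shell_exec($job['command']);                 // e.g. the Python wrapper script
        $pdo->prepare("UPDATE jobs SET status = 'done' WHERE id = ?")->execute(array($job['id']));
    } else {
        sleep(5);                                    // nothing to do yet, poll again in a few seconds
    }
}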

In my case the issue was simply that I had not compiled the source on the server I was running it on. Compiling on your local machine and then uploading the binary to your server can leave it corrupted in some way; shell_exec() should work if you compile the source you are trying to run on the same server that runs the script.

Related

Asynchronous popen exec in PHP on Windows

I am having a strange issue. I recently set up a process on our test servers to call a script asynchronously from another script. It has been working during testing, up until recently. Let me give some of the technical details before proceeding. The call from the originating script looks like this (I have updated the logic that makes the call):
exec('cmd /S /C "'.$command.' 1> nul 2>&1"');
Where $command is created using the following logic:
$args = array(
    'php',
    '"'.getcwd().'{absolute path to a php script}"',
    escapeshellarg($job_id),
    escapeshellarg($vhost),
    escapeshellarg($debug),
);
$command = implode(' ', $args);
PHP version is 5.3.10 and the server is a hosted box running Windows Server 2008 R2 Standard.
I have verified that php is still in the system path.
I have used debug output to see that the async call is being made, and what args it is being made with - everything still seems to be correct.
If I call the command contained in $command manually from the Windows terminal, it runs as expected.
If I run $command the way exec would run it (with 'cmd /S /C'), it fails, complaining that it cannot open the php script.
What really boggles my mind is that this was working this time last week. I made a few (seemingly unrelated) changes to the code, and even after reverting them to see if that was the problem, this process is still broken.
At this point I am mystified and would welcome any insight, ideas or help.
Eventually I was able to get around this problem by not using popen or exec at all; it seems those may be best avoided when using PHP on a Windows platform, at least if you are trying to make an asynchronous script call. Instead I resorted to the Windows COM object support in PHP.
$handle = new COM('WScript.Shell');
$handle->Run($command, 0, false);   // window style 0 = hidden, false = do not wait for the process to finish
This starts the process in the background (hides window/does not open a new window) and executes it asynchronously - i.e. does not wait for it to finish.
Included here is the documentation page for WScript.Shell, which gives the official description of the parameters passed. Although I did not stop to benchmark it, I will say that, anecdotally, this process feels like it now runs more quickly than it did when I was trying to use exec/popen/etc.
Rather than redirecting the output to nul, try redirecting everything to a file. That should tell you if there's a PHP error somewhere:
$outFilePath = "/path/to/file.log";
pclose(popen('start /B cmd /S /C "' . $command . ' > ' . $outFilePath . ' 2>&1"', 'r'));
Also, is there any reason you are using pclose/popen rather than something simpler like exec?

Waiting for file to be created

I'm trying to compile an executable via PHP with MSBuild, which compiles my C# source. The majority of the script relies on the executable being created, so it must wait for MSBuild to finish compiling.
If I don't put in any sort of while loop, it compiles fine and the executable is created, but the problem is that the rest of the script executes too fast and the end result isn't correct.
So at the moment I'm using this:
exec('C:\Windows\Microsoft.NET\Framework\v3.5\MSBuild.exe C:\Users\Administrator\AppData\Roaming\Compile\Myprogram\Myprogram.sln /p:Configuration=Release');
while (!file_exists('C:\Users\Administrator\AppData\Roaming\Compile\Myprogram\bin\Release\Myprogram.exe')) sleep(1);
However, in this scenario it's almost as if the exec command never gets run at all. It gets stuck in an infinite loop and eventually times out, resulting in the exe never being compiled.
Any suggestions on the proper way to go about this?
Try running it as follows:
$output = array();
$cmd = 'C:\Windows\Microsoft.NET\Framework\v3.5\MSBuild.exe C:\Users\Administrator\AppData\Roaming\Compile\Myprogram\Myprogram.sln /p:Configuration=Release && exit';
exec($cmd, $output);
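If the exe still never appears, also capture the return code so you can see whether MSBuild actually ran; a sketch using the same paths as above (the log path is just an example):
$output = array();
$return_code = null;
$cmd = 'C:\Windows\Microsoft.NET\Framework\v3.5\MSBuild.exe C:\Users\Administrator\AppData\Roaming\Compile\Myprogram\Myprogram.sln /p:Configuration=Release';
exec($cmd . ' 2>&1', $output, $return_code);   // merge stderr so build errors end up in $output
file_put_contents('C:\Users\Administrator\AppData\Roaming\Compile\msbuild.log', implode(PHP_EOL, $output));
if ($return_code !== 0) {
    die('MSBuild failed with exit code ' . $return_code);
}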

exec causes perpetual load

I noticed that exec and shell_exec are causing perpetual loading.
Basically, I'm trying to do something as simple as running a PHP script in the background. When I try to do that, the page just loads and loads.
My code is as follows
exec('php test.php -- '.escapeshellarg($param1).' > /dev/null ');
I first thought it was my other script, so I pointed it to a file with just:
echo $argv[1];
But it still loads perpetually.
Don't wait for the process to exit
exec() waits for the process to give an exit code. The link I provided above may help you.
Oh, and since you tagged Linux for whatever reason, I assume you're on a Linux distro.
You could consider this as well:
http://ca1.php.net/pcntl_fork
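On Linux, the usual way to make exec() return immediately is to redirect all output and background the command; building on your call, something like this (untested sketch):
// Redirect stdout and stderr and append & so exec() does not wait for the script.
exec('php test.php -- ' . escapeshellarg($param1) . ' > /dev/null 2>&1 &');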

how to use php exec() to run another script in the background, and not wait for the script to finish

I am wanting to execute a large, database intensive script, but do not need to wait for the process to finish. I would simply like to call the script, let it run in the background and then redirect to another page.
EDIT:
I am working on a local Zend community server, on Windows 7.
I have access to remote Linux servers where the project also resides, so I can do this on Linux or Windows.
I have this:
public function createInstanceAction()
{
    // calls a separate php process which creates the instance
    exec('php -f /path/to/file/createInstance.php');
    Mage::getSingleton('adminhtml/session')->addSuccess(Mage::helper('adminhtml')->__('Instance creation process started. This may take up to a few minutes.'));
    $this->_redirect('instances/adminhtml_instances/');
    return;
}
This works perfectly, but the Magento application hangs around for the process to finish. The script does everything I expect, logging to file from time to time, and I'm happy with how it's running. Now all I would like is for the script to start while the controller action does not hang around, but instead redirects and that's that. From what I have learnt about exec(), you can do that by changing the exec() call above to:
exec('php -f /path/to/file/createInstance.php > /dev/null 2>&1 &');
which I took from here.
If I add "> /dev/null 2>&1 &" to the exec call, it doesn't wait around, as expected, but it no longer executes the script. Could someone tell me why, and if so, how I can get this to work?
Could this be a permission-related issue?
thanks
EDIT: I'm assuming it would be an issue to have any output logged to file if I call the exec function with (/dev/null 2>&1 &), as that would discard the output. Is that correct?
After taking time to fully understand my own question and the way it could be answered, I have prepared my solution.
Thanks to all for your suggestions, and for excusing my casual unpreparedness when asking the question.
The answer to the above question depends on a number of things, such as the operating system you are referring to, which PHP modules you are running, and even what web server you are running. So if I had to ask the question again, the first thing I would do is state what my setup is.
I wanted to achieve this on two environments :
1.) Windows 7 running Zend server community edition.
2.) Linux (my OS is Linux odysseus 2.6.32-5-xen-amd64 #1 SMP Fri Sep 9 22:23:19 UTC 2011 x86_64)
To get this right, I wanted it to work either way when deploying to Windows or Linux, so I used PHP to determine what the operating system was.
public function createInstanceAction()
{
    // determine what operating system is being used
    if (strtoupper(substr(PHP_OS, 0, 3)) === 'WIN')
    {
        // This is a Windows server:
        // call a separate php process to run independently from the browser action
        pclose(popen("start php /path/to/script/script.php", "r"));
    }
    else
    {
        // assuming it's Linux, but in fact it simply means it's not Windows;
        // to check for Linux specifically use (strtoupper(substr(PHP_OS, 0, 3)) === 'LIN')
        exec('php -f /path/to/file/script.php >/dev/null 2>&1 &');
    }
    // the browser will not hang around for this process to complete, and you can continue with whatever actions you want;
    // my script logs any output so I can capture info as it runs
}
In short, ask questions once you understand them. There are many ways to achieve the above, and this is just one solution that works for my development and production environments.
Thanks for the help, all.
PHP popen
From the docs (this should help you do other stuff while that process is working; I'm not sure if closing the current PHP process will kill the opened process):
/* Add redirection so we can get stderr. */
$handle = popen('/path/to/executable 2>&1', 'r');
echo "'$handle'; " . gettype($handle) . "\n";
$read = fread($handle, 2096);
echo $read;
pclose($handle);
Solution 2:
Trick the browser into closing the connection (assuming there is a browser involved):
ob_start();
?><html><!--example html body--></html><?php
$strContents=ob_get_clean();
header("Connection: Close");
header("Content-encoding: none");//doesn't work without this, I don't know why:(
ignore_user_abort(true);
header("Content-type: text/html");
header("Content-Length: ".strlen($strContents));
echo $strContents;
flush();
//at this point a real browser would close the connection and finish rendering;
//crappy http clients like some curl implementations (and not only) would wait for the server to close the connection, then finish rendering/serving results...:(
//TODO: add long running operations here, exec, or whatever you have.
You could write a wrapper script, say createInstance.sh, like:
#! /bin/bash
trap "" SIGHUP
php -f "$1" > logfile.txt 2>&1 &
Then you call the script from within PHP:
exec('bash "/path/to/file/createInstance.sh"');
which should detach the new php process almost instantly from the script. If that doesn't help, you might try SIGABRT, SIGTERM or SIGINT instead of SIGHUP; I don't know exactly which signal is sent.
I've been able to use:
shell_exec("nohup $command > /dev/null & echo $!")
Where $command is for example:
php script.php --parameter 1
I've noticed some strange behavior with this. For example, running the mysql command line doesn't work; only php scripts seem to work.
Also, running cd /path/to/dir && php nohup $command ... doesn't work either; I had to chdir() within the PHP script and then run the command for it to work.
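In other words, something along these lines worked for me (a sketch; the directory and script name are placeholders):
<?php
chdir('/path/to/dir');                                       // change directory in PHP instead of cd in the shell command
$command = 'php script.php --parameter 1';
$pid = shell_exec("nohup $command > /dev/null & echo $!");   // echo $! returns the background process id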
The PHP executable included with Zend Server seems to be what causes attempts to run a script in the background (using the ampersand & operator in the exec) to fail.
We tested this using our standard PHP executable and it worked fine. It's something to do with the version shipped with Zend Server, though our limited attempts to figure out what was going on have not turned anything up.

php shell_exec() help needed for running a script in the background

My project calls for 3 PHP scripts that are run with if-else conditions. The first script is loaded on the index page of the site to check if a condition is set, and if it is, it calls the second script. The second script checks whether other conditions are set and finally calls the last script if everything is good.
Now I could do this by just including the scripts in the if statement, but since the final result is a resource-hogging MySQL dump, I need it to run independently of the original trigger page.
Also those scripts should continue doing their things once triggered, regardless of the user actions on the index page.
One last thing: it should be able to run on win and nix.
How would you do this?
Does the following code make any sense?
if ($blah != $blah_size) {
    shell_exec('php first-script.php > /dev/null 2>/dev/null &');
} else {
    // If the size matches, die
}
Thanks a million in advance.
UPDATE: just in case someone else is going through the same deal.
There seems to be a bug in PHP when running scripts as CGI, but calling the command-line binary works under Apache with all the versions I've tested.
See the bug https://bugs.php.net/bug.php?id=11430
So instead I call the script like this:
exec("php-cli mybigfile.php > /dev/null 2>/dev/null &");
Or you could call it via the shell. It works on nix systems, but my local Windows is hopeless, so if anyone runs it on Windows and it works, please update this.
I would not do this by shell exec because you'd have no control over how many of these resource-hogging processes would be running at any one time. Thus, a user could go click-click-click-click and essentially halt your machine.
Instead, I'd build a work queue. Instead of running the dump directly, the script would submit a record to some sort of FIFO queue (could be a database table or a text file in a dir somewhere) and then immediately return. Next you'd have a cron script that runs at regular intervals and checks the queue to see if there's any work to do. If so, it picks the oldest thing, and runs it. This way, you're assured that you're only ever running one dump at a time.
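A rough sketch of that idea, assuming a hypothetical dump_queue table and a cron entry that runs worker.php every few minutes (the table, columns and connection details are made up for illustration):
<?php
// enqueue.php - called from the trigger page; just records the request and returns immediately.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$pdo->exec("INSERT INTO dump_queue (requested_at, status) VALUES (NOW(), 'pending')");

// worker.php - run from cron; processes at most one pending job per invocation.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$job = $pdo->query("SELECT id FROM dump_queue WHERE status = 'pending' ORDER BY id LIMIT 1")->fetch(PDO::FETCH_ASSOC);
if ($job) {
    $pdo->prepare("UPDATE dump_queue SET status = 'running' WHERE id = ?")->execute(array($job['id']));
    shell_exec('php first-script.php');            // the resource-hogging dump from the question
    $pdo->prepare("UPDATE dump_queue SET status = 'done' WHERE id = ?")->execute(array($job['id']));
}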
The easiest way I can think of is to do:
exec("screen -d -m php long-running-script.php");
and then it will return immediately and run in the background. screen will allow you to connect to it and see what's happening.
You can also do what you're doing with 'nohup php long-running-script.php', or by writing a simple C app that does daemonize() and then execs your script.
