exec causes perpetual load - php

I noticed that exec and shell_exec are causing perpetual loading.
Basically, I'm trying to do something as simple as loading a PHP script in the background. When I try to do that, it just loads and loads.
My code is as follows:
exec('php test.php -- '.escapeshellarg($param1).' > /dev/null ');
I first thought it was my other script, so I pointed it to a file with just:
echo $argv[1];
But it still loads perpetually.

Don't wait for the process to exit
exec() waits for the process to finish and return an exit code, so the page keeps loading until your script is done. Redirect the output and background the command so PHP doesn't have to wait.
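For example, sending stderr along with stdout to /dev/null and appending & should make exec() return immediately (a sketch based on the command from the question):
exec('php test.php -- '.escapeshellarg($param1).' > /dev/null 2>&1 &');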
Oh, and since you tagged Linux for whatever reason, I assume you're on a Linux distro.
You could consider this as well:
http://ca1.php.net/pcntl_fork

Related

php page starts bash script to run in the background and not timeout

I have seen this question on here before, so I am sorry for the repetition, but I have still not found an answer to my problem.
I have a bash script that takes a while to run. It needs to be passed variables set by a user on a webpage (don't worry, there will be plenty of validation for security, etc.).
I can get the bash file to start, but it dies after maybe 20 seconds when run from the webpage.
When run from the terminal, it runs absolutely fine.
Ok so I have the following:
$bashFile = shell_exec('./CoinCreationBashFile.sh "'.$coinName.'" "'.$coinNameAbreviation.'" "'.$blockReward.'" "'.$blockSpacing.'" "'.$targetTimespan.'" "'.$totalCoins.'" "'.$firstBitAddy.'" "'.$seedNode.'" "'.$seedName.'" "'.$headline.'" ');
Now this executes the bash file, but I read up on shell_exec in PHP with nohup and came up with the following:
$bashFile = shell_exec('nohup ./CoinCreationBashFile.sh "'.$coinName.'" "'.$coinNameAbreviation.'" "'.$blockReward.'" "'.$blockSpacing.'" "'.$targetTimespan.'" "'.$totalCoins.'" "'.$firstBitAddy.'" "'.$seedNode.'" "'.$seedName.'" "'.$headline.'" >/dev/null 2>&1 &');
But this still died after a short time :(
So I read up on set_time_limit and max_execution_time and set these to something like 10000000 in the php.ini file... yet still no joy :(
I just want to run a bash script without it timing out and exiting. I don't really want to have to put an intermediate step in there, but someone suggested I look at ZeroMQ to "detach worker from process", so I may have to go that route.
many thanks in advance
Don't try running a script via the browser if it takes more than 60 seconds; instead, run it over SSH or as a cron job.
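For example, a crontab entry like this would run the script every minute, entirely outside the web server (the paths are hypothetical, and the web page would need to stash the user's parameters somewhere the job can read them, e.g. a file or a database row):
* * * * * /bin/bash /path/to/CoinCreationBashFile.sh >> /var/log/coincreation.log 2>&1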

php shell_exec() help needed for running a script in the background

My project calls for 3 PHP scripts that are run with if-else conditions. The first script is loaded on the index page of the site to check if a condition is set, and if it is, it calls the second script. The second script checks to see if other conditions are set, and it finally calls the last script if everything is good.
Now I could do this by just including the scripts in the if statement, but since the final result is a resource-hogging MySQL dump, I need it to run independently of the original trigger page.
Also those scripts should continue doing their things once triggered, regardless of the user actions on the index page.
One last thing: it should be able to run on win and nix.
How would you do this?
Does the following code make any sense?
if ($blah != $blah_size) {
    shell_exec('php first-script.php > /dev/null 2>/dev/null &');
}
// If the size matches, die
else {
}
Thanks a million in advance.
UPDATE: just in case someone else is going through the same deal.
There seems to be a bug in PHP when running scripts as CGI, but the command-line binary works under Apache with all the versions I've tested.
See the bug https://bugs.php.net/bug.php?id=11430
So instead I call the script like this:
exec("php-cli mybigfile.php > /dev/null 2>/dev/null &");
Or you could call it as a shell script. It works on *nix systems, but my local Windows box is hopeless, so if anyone runs it on Windows and it works, please update this.
I would not do this via shell_exec because you'd have no control over how many of these resource-hogging processes would be running at any one time. Thus, a user could go click-click-click-click and essentially halt your machine.
Instead, I'd build a work queue. Instead of running the dump directly, the script would submit a record to some sort of FIFO queue (could be a database table or a text file in a dir somewhere) and then immediately return. Next you'd have a cron script that runs at regular intervals and checks the queue to see if there's any work to do. If so, it picks the oldest thing, and runs it. This way, you're assured that you're only ever running one dump at a time.
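A minimal sketch of that idea, assuming a PDO connection and a jobs table with id, payload, and status columns (all the names here are hypothetical):
// Hypothetical connection; adjust the DSN and credentials.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// Web side: enqueue the work and return immediately.
$pdo->prepare("INSERT INTO jobs (payload, status) VALUES (?, 'pending')")
    ->execute(['mysql-dump']);

// Cron-run worker: take the oldest pending job and run it, one at a time.
$job = $pdo->query("SELECT * FROM jobs WHERE status = 'pending' ORDER BY id LIMIT 1")->fetch();
if ($job) {
    $pdo->prepare("UPDATE jobs SET status = 'running' WHERE id = ?")->execute([$job['id']]);
    shell_exec('php first-script.php'); // the resource-hogging dump from the question
    $pdo->prepare("UPDATE jobs SET status = 'done' WHERE id = ?")->execute([$job['id']]);
}
Because the worker only picks one job per run, a click-happy user can only queue up work, never multiply the running dumps.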
The easiest way I can think of is to do
exec("screen -d -m php long-running-script.php");
and then it will return immediately and run in the background. screen will allow you to connect to it and see what's happening.
You can also do what you're doing with 'nohup php long-running-script.php', or by writing a simple C app that does daemonize() and then execs your script.

php exec(svn commit) hangs

I know there have been similar questions, but they don't solve my problem...
After checking out the folders from the repo (which works fine), a method is called from jQuery to execute the following in PHP:
exec('svn cleanup '.$checkout_dir);
session_write_close(); // Some suggestion that was supposed to help but doesn't
exec('svn commit -m "SAVE DITAMAP" '.$file);
These would output the following:
svn cleanup USER_WORKSPACE/0A8288
svn commit -m "SAVE DITAMAP" USER_WORKSPACE/0A8288/map.ditamap
1) The first line (exec('svn cleanup')...) executes fine.
2) As soon as I call svn commit, my server hangs and everything goes to hell.
The apache error logs show this error:
[notice] Child 3424: Waiting 240 more seconds for 4 worker threads to finish.
I'm not using the php_svn module because I couldn't get it to compile on windows.
Does anyone know what is going on here? I can execute the exact same command from the terminal window and it executes just fine.
Since I cannot find any documentation on jQuery exec(), I assume this is calling PHP. I copied this from the documentation page:
When calling exec() from within an Apache PHP script, make sure to take care of stdout, stderr and stdin (as in the example below). If you forget this and your shell command produces output, the sh and apache daemons may never return (they will normally time out after a few minutes). From the calling web page the script may seem to not return any data.
If you want to start a php process that continues to run independently from apache (with a different parent pid), use nohup. Example:
exec('nohup php process.php > process.out 2> process.err < /dev/null &');
hope it helps
Okay, I've found the problem.
It actually didn't have anything to do with the exec running in the background, especially because a one-file commit doesn't take a lot of time.
The problem was that the commit was expecting a --username and --password that didn't show up, and just caused apache to hang.
To solve this, I edited the svnserve.conf in the folder where I installed svn to allow non-authenticated users write access.
I don't think you'd normally want to do this, but my site already authenticates the user name and pass upon logging in.
Alternatively, you could pass the credentials on the svn command line.
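A sketch with placeholder credentials; --non-interactive stops svn from prompting and hanging when no terminal is attached:
exec('svn commit -m "SAVE DITAMAP" --username someuser --password somepass --non-interactive '.$file);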

PHP and shell_exec

I have a PHP website and I would like to execute a very long Python script in the background (300 MB of memory and 100 seconds). The process communication is done via the database: when the Python script finishes its job, it updates a field in the database, and then the website renders some graphics based on the results of the Python script.
I can execute the Python script "manually" from bash (from any current directory) and it works. I would like to integrate it into PHP, and I tried the shell_exec function:
shell_exec("python /full/path/to/my/script") but it's not working (I don't see any output)
Do you have any ideas or suggestions? It's worth mentioning that the Python script is a wrapper over other polyglot tools (Java mixed with C++).
Thanks!
shell_exec returns a string; if you run it alone, it won't print any output, so you can write:
$output = shell_exec(...);
print $output;
First off, set_time_limit(0); will make your script run forever, so the timeout shouldn't be an issue. Second, the *exec calls in PHP do NOT necessarily use the PATH (this might depend on configuration), so your script can exit without giving any info on the problem; quite often it ends up being that it can't find the program, in this case python. So change it to:
shell_exec("/full/path/to/python /full/path/to/my/script");
If your Python script runs on its own without problems, then it's very likely this is the issue. As for the memory, I'm pretty sure PHP won't use the same memory Python is using, so if Python uses 300 MB, PHP should stay at its default (say 1 MB) and just wait for shell_exec to finish.
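To see why it fails, it can also help to capture stderr along with stdout (a sketch assuming the interpreter lives at /usr/bin/python):
$output = shell_exec('/usr/bin/python /full/path/to/my/script 2>&1');
var_dump($output);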
A problem could be that your script takes longer than the waiting time the server defines for a request (which can be set in php.ini or httpd.conf).
Another issue could be that the server's account does not have the rights to execute or access the code or files needed for your script to run.
Found this before and helped me solve my background execution problem:
function background_exec($command)
{
    if (substr(php_uname(), 0, 7) == 'Windows')
    {
        // Windows: "start" launches the command in a detached process,
        // and pclose(popen(...)) returns without waiting for it.
        pclose(popen('start "background_exec" ' . $command, 'r'));
    }
    else
    {
        // *nix: discard output and append & so exec() returns immediately.
        exec($command . ' > /dev/null &');
    }
}
Source:
http://www.warpturn.com/execute-a-background-process-on-windows-and-linux-with-php/
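Usage would look something like this (the script path is hypothetical):
background_exec('php /full/path/to/long-running-script.php');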
Thanks for your answers, but none of them worked :(. I decided to implement it in a dirty way, using busy waiting instead of triggering an event when a record is inserted.
I wrote a background process that runs forever and on each iteration checks whether there is something new in the database. When it finds a record, it executes the script and everything is fine. The idea is that I launch the background process from the shell.
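A minimal sketch of such a polling loop; $pdo is assumed to be a connected PDO instance, and the tasks table and its columns are hypothetical:
// Started once from the shell; busy-waits on the database.
while (true) {
    $row = $pdo->query("SELECT id FROM tasks WHERE processed = 0 ORDER BY id LIMIT 1")->fetch();
    if ($row) {
        shell_exec('python /full/path/to/my/script');
        $pdo->prepare("UPDATE tasks SET processed = 1 WHERE id = ?")->execute([$row['id']]);
    } else {
        sleep(5); // nothing to do yet; poll again in a few seconds
    }
}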
I found that the issue when I tried this was simply that I did not compile the source on the server I was running it on. If you compile on your local machine and then upload it to your server, the binary can end up incompatible in some way. shell_exec() should work if you compile the source you are trying to run on the same server where you run the script.

Running PHP after request

I would like to be able to start a second script (either PHP or Python) when a page is loaded, and have it continue to run after the user cancels/navigates away. Is this possible?
You can send a Connection: close header, which finishes the page for your user but lets you keep executing things after the page has loaded.
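A common pattern for this looks roughly like the following (a sketch; it assumes no extra output buffering at the web-server level, and long_running_task() is a hypothetical stand-in for the real work):
ignore_user_abort(true);  // keep running even if the user navigates away
ob_start();
echo 'Page content for the user';
header('Connection: close');
header('Content-Length: ' . ob_get_length());
ob_end_flush();
flush();                  // the browser considers the page finished here
long_running_task();      // hypothetical; runs after the response is sent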
There is a simple way to ignore a user abort (see the PHP manual too):
ignore_user_abort(true);
Use process forking with pcntl.
It only works under Unix operating systems, however.
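A minimal pcntl_fork() sketch (the pcntl extension is typically only available in CLI PHP, not under Apache; long_running_task() is again a hypothetical stand-in):
$pid = pcntl_fork();
if ($pid == -1) {
    die('could not fork');
} elseif ($pid == 0) {
    // Child process: do the long-running work, then exit.
    long_running_task();
    exit(0);
}
// Parent process: returns to the user immediately.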
You can also do something like this:
exec("/usr/bin/php ./child_script.php > /dev/null 2>&1 &");
You can read more about the above example here.
for keeping the current script:
ignore_user_abort(true);
set_time_limit(0);
for running another script:
see $sock = fsockopen('localhost', 80, $errno, $errstr, 3600); plus stream_set_timeout($sock, 3600); (note that fsockopen() takes a bare hostname and port, not a URL; see the sketch after this list)
see exec('php path_to_script'); - this will cause your script to run from the CLI, so you'd have to install php-cli on that server.
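For the fsockopen() route, a fire-and-forget HTTP request could look like this (a sketch; the path is hypothetical, and the target script should call ignore_user_abort(true) so that closing the connection doesn't kill it):
$sock = fsockopen('localhost', 80, $errno, $errstr, 30);
if ($sock) {
    fwrite($sock, "GET /path_to_script.php HTTP/1.1\r\n"
                . "Host: localhost\r\n"
                . "Connection: Close\r\n\r\n");
    fclose($sock); // don't wait for the response
}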
Another approach, if you can't use the others, is to include an img tag at the bottom of your output that requests a PHP page that does whatever it is you want to do.
It will still show the loading animation though, so I think Karsten's suggestion is probably better (I'll try that next time I need to do this type of thing I think).
