I have a PHP script that needs to run some command-line calls in the background to create some files. I do not want the user to have to wait for the command-line work to finish before the page loads. I have tried the answers from other threads on SO, but it is not working for me.
Here is what I have tried via exec() and shell_exec() to get it to run in the background:
{ [multiple commands here] } &> /dev/null &
If it is helpful, this is the server info:
uname -or
2.6.39-400.17.1.el6uek.x86_64 GNU/Linux
lsb_release -irc
Distributor ID: OracleServer
Release: 6.4
Codename: n/a
Maybe this example will help you out. Replace the commands with your commands.
<?php
// Group the commands, silence all output, and background the group so
// shell_exec() returns immediately instead of waiting out the sleep.
shell_exec("{ sleep 10; touch /tmp/asdf; } > /dev/null 2>&1 &");
Related
My issue seems to have been asked before, but hold on, this one is a bit different.
I have 2 PHP files, and I run the following commands:
nohup php file_1.php >/dev/null 2>&1 &
nohup php file_2.php >/dev/null 2>&1 &
This basically creates 2 PHP processes.
My problem is that sometimes one of the files, or even both, is killed by the server for an unknown reason, and I have to enter the commands all over again. I tried 'forever' but it doesn't help.
If the server is rebooted, I will have to enter those 2 commands again. I thought about a cron job, but I'm not sure whether it would launch the scripts twice, which would create more confusion.
My question is: how do I automatically restart the scripts if one or both of them get killed? What is the best way to check that file_1.php or file_2.php is indeed running?
There are a couple of ways you can do this. As #Chris Haas mentioned in the comments, supervisord can do this for you, or you can run a watchdog shell script from cron at regular intervals that checks whether your script is running and starts it if not. Here's one I use.
#!/bin/bash
# $8 is the command column in `ps -ef` output; skip the grep itself.
FIND_PROC=`ps -ef | grep "php queue_agent.php --env prod" | awk '{if ($8 !~ /grep/) print $2}'`
# if FIND_PROC is empty, the process has died; restart it
if [ -z "${FIND_PROC}" ]; then
    DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
    echo queue_agent.php failed at `date`
    cd ${DIR}
    nohup nice -n 10 php queue_agent.php --env prod -v > ../../sandbox/logs/queue_agent.log &
fi
exit 0
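To run the watchdog from cron, an entry along these lines checks every five minutes (the paths are placeholders for wherever you keep the script and its log):

*/5 * * * * /path/to/watchdog.sh >> /path/to/watchdog.log 2>&1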
I think you can try to figure out why these two PHP scripts shut down as a first step. To help with that, you can use this PHP function:
void register_shutdown_function(callable $callback [, mixed $parameter [, mixed $... ]])
which registers a callback to be executed after script execution finishes or exit() is called. So you can log some info when the PHP files get shut down, like this:
function myLogFunction() {
    // log some info here
}
register_shutdown_function('myLogFunction'); // pass the callback name as a string
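One caveat: shutdown functions fire on normal termination and on fatal PHP errors, but they will not run if the process is killed outright (for example by SIGKILL from the OOM killer). For fatal errors, pairing the callback with error_get_last() captures more detail. A minimal sketch, with /tmp/shutdown.log as an assumed log path:

<?php
// Minimal sketch: append the last PHP error (if any) on shutdown.
// The log path /tmp/shutdown.log is an assumption; adjust as needed.
function myLogFunction() {
    $error = error_get_last();
    file_put_contents(
        '/tmp/shutdown.log',
        date('c') . ' shutdown; last error: ' . var_export($error, true) . PHP_EOL,
        FILE_APPEND
    );
}
register_shutdown_function('myLogFunction');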
Instead of sending the standard output and error output to /dev/null, you can send them to a log file (since we may get some helpful info from the output). So instead of using:
nohup php file_1.php >/dev/null 2>&1 &
nohup php file_2.php >/dev/null 2>&1 &
try:
nohup php file_1.php > file_1.log 2>&1 &
nohup php file_2.php > file_2.log 2>&1 &
(Use a separate log per script so the two processes don't overwrite each other's output.)
If you want to auto-run these two PHP files when the server boots, try editing /etc/rc.local (which runs automatically when the OS starts up). Add your PHP CLI command lines to this file.
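For example, the lines added near the end of /etc/rc.local might look like this (the paths are placeholders):

nohup php /path/to/file_1.php > /path/to/file_1.log 2>&1 &
nohup php /path/to/file_2.php > /path/to/file_2.log 2>&1 &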
If you can't figure out why the PHP processes get shut down, try supervisor as #Chris Haas mentioned.
I am attempting to launch sar and have it run forever via a PHP script, but for whatever reason it never actually launches. I have tried the following:
exec('sar -u 1 > /home/foo/foo.txt &');
exec('sar -o /home/foo/foo -u 1 > /dev/null 2>&1 &');
However, it never launches sar. If I just use:
exec('sar -u 1')
it works, but it hangs the PHP script. My understanding is that if a program is started with the exec function, then for it to continue running in the background, the output of the program must be redirected to a file or another output stream.
I will assume you're running this on a *nix platform. To get PHP to run something in the background and not wait for the process to finish, I would recommend two things: first, use nohup, and also redirect the output of the command to /dev/null (the trash).
Example:
<?php
exec('nohup sar -u 1 > /dev/null 2>/dev/null &');
nohup means we do not send the "hang up" signal (which kills the process) when the terminal running the command closes.
> /dev/null 2>/dev/null & redirects the "normal" and "error" outputs to the black hole that is /dev/null. This means PHP does not have to wait for the output of the command being called.
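To confirm the detached process actually started, you can check for it right after launching. A small sketch (it assumes pgrep is available on the system):

<?php
// Launch sar detached, then verify it is running.
exec('nohup sar -u 1 > /dev/null 2>/dev/null &');
sleep(1);                    // give the shell a moment to start it
exec('pgrep -x sar', $pids); // collect the PIDs of any running sar
var_dump($pids);             // a non-empty array means sar is running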
On another note, if you are using PHP just to call a shell command, you may want to consider other options that have no PHP component, like Ubuntu's Upstart (if you are using Ubuntu, that is).
I have looked at other answers, but they don't fit this case.
I am using the full path to the file. The code I copied is simplified.
run.php contains:
shell_exec("php /var/www/html/sync/chourly.php $position $quotientx > /dev/null 2>/dev/null &");
If I run php run.php manually, it works great.
here is the line on crontab -e :
05 * * * * /usr/bin/wget -O /dev/null http://sync.eeeww.com/run.php
Again, run.php starts BUT chourly.php doesn't start. I am using CentOS 6.
Any suggestions, please?
Addition: I checked the permissions. I am using ec2-user to run php run.php, and crontab runs under the same user. It is able to run the file, but shell_exec is where the issue occurs.
Is /var/www/html/sync/chourly.php using $_SERVER['DOCUMENT_ROOT']? Since you're explicitly calling the php interpreter (not mod_php), a $_SERVER['DOCUMENT_ROOT'] call will not work as you expect.
Try manually running the cron from shell to see where it's failing.
cd /
su - your_httpd_username -c "/usr/bin/wget -O /dev/null http://sync.eeeww.com/run.php"
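If chourly.php does rely on $_SERVER['DOCUMENT_ROOT'] to locate files, one fix is to build paths from __DIR__ instead, since DOCUMENT_ROOT is empty when the script runs under the CLI. A sketch of the substitution (the data/ subdirectory is a made-up example):

<?php
// Inside chourly.php: avoid $_SERVER['DOCUMENT_ROOT'], which is empty
// under the CLI; __DIR__ always points at this script's own directory.
// $dataDir = $_SERVER['DOCUMENT_ROOT'] . '/sync/data'; // breaks under CLI
$dataDir = __DIR__ . '/data'; // works under CLI and mod_php alike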
I have a PDF file that makes the pdf2swf tool run forever. When running the following command manually:
pdf2swf -Q 10 test.pdf
the script aborts after 10 seconds because of the -Q 10 flag, but when running the same command through PHP, the script runs forever. I've tried shell_exec(), exec(), and passthru(), and all of them ignored the -Q flag.
Has anyone encountered anything like this with the pdf2swf tool, or with any other PHP exec?
EDIT
When I run it manually:
php -r "exec('pdf2swf -Q 10 test.pdf');"
it aborts after 10 seconds, but when it runs as a daemon, again, it won't abort.
I tried the same command in my test project and it worked fine, with no issues. Please find my code below:
$out = exec("pdf2swf font_example.pdf -o varun.swf", $rest); // $rest collects each output line
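If the -Q flag really is being ignored when the process has no controlling terminal, a hedged workaround is to enforce the cap from the shell with GNU coreutils timeout instead (test.swf is an assumed output name):

<?php
// Enforce the 10-second cap externally with `timeout`; an exit
// status of 124 means the command was killed by the timeout.
exec('timeout 10 pdf2swf -Q 10 test.pdf -o test.swf 2>&1', $output, $status);
if ($status === 124) {
    // pdf2swf exceeded 10 seconds and was terminated
}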
I'm trying to trigger a PHP script to run in the background using the exec() function, but I cannot get it to work. I've read countless posts on Stack Overflow and other forums and tried many variations, to no avail.
Server Info:
Operating System: Linux
PHP: 5.2.17
Apache Version: 2.2.23
Home Directory: /home1/username
I'm currently using the code:
exec("/home1/username/php /home1/username/public_html/myscript.php > /dev/null &");
When I run the above script I get no error_log and no error in my cPanel error log; however, the script definitely doesn't execute. When I browse to http://www.mydomain.com/myscript.php it runs and e-mails me instantly. Any idea why this isn't working, or how I can find out what error is being produced?
Update: cPanel Process Manager output
exec("php /home1/username/php /home1/username/public_html/myscript.php > /dev/null &");
Produces:
27183 php /home1/username/php /home1/username/public_html/myscript.php
27221 [sh]
27207 php /home1/username/php /home1/username/public_html/myscript.php
27219 php /home1/username/php /home1/username/public_html/myscript.php
27222 php /home1/username/php /home1/username/public_html/myscript.php
27224 php /home1/username/php /home1/username/public_html/myscript.php
27249 sh -c php /home1/username/php /home1/username/public_html/myscript.php > /dev/null &
Is that normal? The script appears to hang around for a long time even though it should execute very quickly.
I couldn't get the exec working with PHP. Even when I got shell access to the server, the command just hung. I decided to use wget instead, which accomplishes the same thing. Works great :)
exec("wget http://www.mydomain.com/myscript.php > /dev/null &");
Have you tried invoking the php CLI directly?
exec("php /home1/username/php /home1/username/public_html/myscript.php > /dev/null &");
You will not need the #!, which would output to the browser if called through Apache.
EDIT.
It looks like your exec() call is working, but the PHP script executing in the background is hanging (not exiting). Try this variation:
exec("php /home1/username/php /home1/username/public_html/myscript.php > /dev/null 2>&1 &");
What does "> /dev/null 2>&1" mean?
Since you want to run myscript from your command line, why not do this:
exec('(/home1/username/public_html/myscript.php) > /dev/null &',$r,$s);
And write this as the first line in myscript.php:
#!/home1/username/php -n
<?php
//script goes here
?>
That should work. The hashbang tells the system what program to use to run the script that follows, so you don't need to add that to your exec call. Also, it's safer (and therefore better) to put brackets around the full script call, so the shell knows exactly what output has to be redirected to what stream, avoiding any issues that might occur. Especially when libs or packages like PHP-GTK are installed on the server (hence the -n option).
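Note that for the system to honour the hashbang when the script is invoked directly, the file must be executable:

chmod +x /home1/username/public_html/myscript.php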