running php exec command in background - php

I am attempting to launch sar and have it run forever via a php script. But for whatever reason it never actually launches. I have tried the following:
exec('sar -u 1 > /home/foo/foo.txt &');
exec('sar -o /home/foo/foo -u 1 > /dev/null 2>&1 &');
However, it never launches sar. If I just use:
exec('sar -u 1')
It works, but it hangs the PHP script. My understanding is that if a program is started with the exec function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream.

I will assume you're running this on a *nix platform. To get PHP to run something in the background and not wait for the process to finish, I would recommend two things: first use nohup, and also redirect the output of the command to /dev/null (trash).
Example:
<?php
exec('nohup sar -u 1 > /dev/null 2>/dev/null &');
nohup means we do not send the "hang up" signal (which kills the process) when the terminal running the command closes.
> /dev/null 2>/dev/null & redirects the "normal" and "error" outputs to the black hole /dev/null location. This means PHP does not have to wait for the output of the command being called.
On another note, if you are using PHP just to call a shell command, you may want to consider other options like Ubuntu's Upstart, with no PHP component at all (if you are using Ubuntu, that is).
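To make this reusable, here is a minimal sketch of a helper that wraps the same nohup-plus-redirect pattern. The function name and the PID capture via "echo $!" are my own additions, not part of the answer above, and it assumes a *nix shell:
<?php
// Launch a command detached from PHP and return its PID.
// Sketch only: assumes a *nix host where nohup and $! are available.
function runInBackground($command)
{
    // nohup survives the parent terminal/request, the redirects stop PHP
    // from blocking on output, and "echo $!" prints the background PID.
    $wrapped = sprintf('nohup %s > /dev/null 2>&1 & echo $!', $command);
    return (int) shell_exec($wrapped);
}

$pid = runInBackground('sar -u 1 -o /home/foo/foo');
echo "sar started with PID {$pid}\n";
?>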

Related

Why doesn't nohup.out update when running nohup via exec in PHP

I'm running a PHP socket script. I start the program through nohup, and it runs properly when started as root from the shell. My problem is running the program via the exec() function in PHP. When I run the command that way, the program runs correctly, but its output is not written to nohup.out.
My command over SSH:
nohup php my_path/example.php & # this works
My command from PHP:
exec('nohup php my_path/example.php >/dev/null 2>&1 &', $output); # does not update nohup.out
Please guide me...
From PHP docs on exec:
If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
From man nohup:
If standard input is a terminal, redirect it from /dev/null. If standard output is a terminal, append output to 'nohup.out' if possible, '$HOME/nohup.out' otherwise. If standard error is a terminal, redirect it to standard output. To save output to FILE, use 'nohup COMMAND > FILE'.
To satisfy both - redirect manually to nohup.out:
exec('nohup php my_path/example.php >>nohup.out 2>&1 &', $output);
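As a quick sanity check (sketch only; the sleep and tail calls are my additions, and the paths come from the question), you can start the worker and then confirm that nohup.out is actually growing:
<?php
// Start the worker with both stdout and stderr appended to nohup.out.
exec('nohup php my_path/example.php >> nohup.out 2>&1 &', $output);
// Give it a moment to produce some output, then show the last lines.
sleep(2);
echo shell_exec('tail -n 5 nohup.out');
?>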

Attempting to launch a php script in the background from a php script, using the exec() function, on a Raspberry Pi

I want to have a chat bot working on my Raspberry Pi.
In the command line, I am entering this command:
nohup php /path/to/script.php >/dev/null 2>&1 &
And this works. When I decide I need the bot to be down, I enter this command:
killall php
This works, but I want to be able to start and stop the script from a browser.
So, in my dashboard php script....
//This command works.
$runningProc = exec("pgrep -a php");
//This command does not. The process starts, but immediately ends
$startCommand = "nohup php /path/to/script.php >/dev/null 2>&1 &";
exec($startCommand);
//This command does not seem to work for a nohup started from the command line.
exec("killall php");
Okay, so I figured out what was going on, once I found some relevant logs.
In my dashboard file I changed the exec() argument to:
$startCommand = "nohup php script.php >/dev/null 2> nohup.out &";
exec($startCommand);
This way, I could review the 'nohup.out' file for any errors.
The first require() call triggered a "No such file" error. Changing that call to use the full file path made everything work as expected.
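For anyone hitting the same thing, here is a sketch of the kind of change that fixes it (the config.php name is just an example, not from the original script): build the path with __DIR__ so the require() no longer depends on whatever working directory exec() happens to start the process in.
<?php
// Inside script.php (the bot): use an absolute path built from the
// script's own directory instead of a bare relative path.
require __DIR__ . '/config.php';   // example include; the name is hypothetical

// ... rest of the bot code ...
?>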

Run multiple php scripts in sequence from a single php script or batch file in windows

I have 5 PHP scripts that need to be executed one after the other. How can I create a single PHP/batch script which executes all the PHP scripts in sequence?
I want to schedule a cron job which runs that file.
#!/path/to/your/php
<?php
include("script1.php");
include("script2.php");
include("script3.php");
//...
?>
Alternative
#!/bin/bash
/path/to/your/php /path/to/your/script1.php
/path/to/your/php /path/to/your/script2.php
/path/to/your/php /path/to/your/script3.php
# ...
If your scripts need to be accessed via http:
#!/bin/bash
wget --quiet -O /dev/null http://path/to/your/script1.php
wget --quiet -O /dev/null http://path/to/your/script2.php
wget --quiet -O /dev/null http://path/to/your/script3.php
# ...
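If you would rather keep the scripts in separate PHP processes (so a fatal error or shared globals in one cannot affect the next) but still want a single file for cron to call, a small PHP wrapper works as well. This is only a sketch; the interpreter path and script names are the same placeholders used above:
<?php
// Run each script as its own PHP process, stopping at the first failure.
$scripts = array('script1.php', 'script2.php', 'script3.php');
foreach ($scripts as $script) {
    passthru('/path/to/your/php ' . escapeshellarg($script), $exitCode);
    if ($exitCode !== 0) {
        fwrite(STDERR, "{$script} failed with exit code {$exitCode}\n");
        exit($exitCode);
    }
}
?>
Each script then runs in a fresh PHP process, just like the bash version, but the error handling stays in PHP.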
I did some testing using the "wait" command; it seems I just put wait between the calls to each script. As a little test I created databases in PHP scripts, returned some records from a bash script, updated them with another PHP script, then ran another bash script to return the updated results, and it all seemed to work fine.
From what I have read, as long as a child script doesn't itself call another script, the master script will wait if the "wait" command is used between the script calls.
Code is as below:
#!/bin/sh
/usr/bin/php test.php
wait
/bin/bash test.sh
wait
/usr/bin/php test2.php
wait
/bin/bash test2.sh
wait
echo all done
Hopefully this executes all the PHP scripts in sequence.
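For what it's worth, wait only has an effect on jobs that were put in the background; in the purely sequential script above each command already blocks until it finishes. A sketch of where wait genuinely matters (same placeholder script names as above): run two scripts in parallel, then wait for both before continuing.
#!/bin/bash
# Run the first two scripts in parallel...
/usr/bin/php test.php &
/bin/bash test.sh &
# ...and block here until both background jobs have exited.
wait
/usr/bin/php test2.php
echo all done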

Php Exec timeout

I have the following exec() command with an & sign at the end so the script runs in the background. However, the script is not running in the background. It's timing out in the browser after exactly 5.6 minutes. Also, if I close the browser the script doesn't keep running.
exec("/usr/local/bin/php -q /home/user/somefile.php &")
If I run the script via the command line, it does not time out. My question is: how do I prevent the timeout? How do I run the script in the background using exec so it's not browser dependent? What am I doing wrong and what should I look at?
The exec() function handles the output of the executed program, so I suggest you redirect the output to /dev/null (a virtual writable file that silently discards everything written to it).
Try running:
exec("/usr/local/bin/php -q /home/gooffers/somefile.php > /dev/null 2>&1 &");
Note: 2>&1 redirects error output to standard output, and > /dev/null redirects standard output to that virtual file.
If you still have difficulties, you can create a script that just executes other scripts. exec() blocks while the program it launched is running, but returns as soon as the task is finished. If the executed script simply launches another one, the task is very quick and exec() returns just as quickly.
Let's see an implementation. Create a exec.php that contains :
<?php
// exec.php: relay script. It launches the real script in the background
// and exits immediately, so the exec() that called it returns right away.
if (count($argv) == 1)
{
    die('You must give a script to exec...');
}

// Drop this script's own name; what remains is the target script
// plus any arguments for it.
array_shift($argv);

$cmd = '/usr/local/bin/php -q';
foreach ($argv as $arg)
{
    $cmd .= " " . escapeshellarg($arg);
}

// Launch in the background with all output discarded.
exec("{$cmd} > /dev/null 2>&1 &");
?>
Now, run the following command:
exec("/usr/local/bin/php -q exec.php /home/gooffers/somefile.php > /dev/null 2>&1 &");
If you have arguments, you can pass them too:
exec("/usr/local/bin/php -q exec.php /home/gooffers/somefile.php x y z > /dev/null 2>&1 &");
You'll need to use shell_exec() instead:
shell_exec("/usr/local/bin/php -q /home/gooffers/somefile.php &");
That being said, if you have shell access, why don't you install this as a cronjob? I'm not sure why a PHP script is invoking another to run like this.
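For example, a crontab entry along these lines would run the script every five minutes with no browser involved at all (the schedule is illustrative, and the paths reuse the ones from the question):
*/5 * * * * /usr/local/bin/php -q /home/user/somefile.php > /dev/null 2>&1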

Run Script in Background Linux Server w/ PHP exec()

I'm trying to trigger a PHP script to run in the background using the exec() function, but I cannot get it to work. I've read countless posts on Stack Overflow and other forums and tried many variations, to no avail.
Server Info:
Operating System: Linux
PHP: 5.2.17
Apache Version: 2.2.23
Home Directory: /home1/username
I'm currently using the code:
exec("/home1/username/php /home1/username/public_html/myscript.php > /dev/null &");
When I run the above script I get no error_log and no error in my cPanel error log, however the script definitely doesn't execute. When I browse to http://www.mydomain.com/myscript.php it runs and e-mails me instantly. Any idea why this isn't working / how I can find out what error is being produced?
Update: cPanel Process Manager Output
exec("php /home1/username/php /home1/username/public_html/myscript.php > /dev/null &");
Produces:
27183 php /home1/username/php /home1/username/public_html/myscript.php
27221 [sh]
27207 php /home1/username/php /home1/username/public_html/myscript.php
27219 php /home1/username/php /home1/username/public_html/myscript.php
27222 php /home1/username/php /home1/username/public_html/myscript.php
27224 php /home1/username/php /home1/username/public_html/myscript.php
27249 sh -c php /home1/username/php /home1/username/public_html/myscript.php > /dev/null &
Is that normal? The script appears to hang around for a long time even though it should execute very quickly.
I couldn't get the exec working with PHP. Even when I got shell access to the server, the command just hung. I decided to use wget instead, which accomplishes the same thing. Works great :)
exec("wget http://www.mydomain.com/myscript.php > /dev/null &");
Have you tried invoking the php CLI directly?
exec("php /home1/username/php /home1/username/public_html/myscript.php > /dev/null &");
You will not need the #!, which would output to the browser if called through Apache.
EDIT.
It looks like your script is working, but your PHP script executing in the background is hanging (not exiting). Try this variation:
exec("php /home1/username/php /home1/username/public_html/myscript.php > /dev/null 2>&1 &");
What does "> /dev/null 2>&1" mean?
Since you want to run myscript from your command line, why not do this:
exec('(/home1/username/public_html/myscript.php) > /dev/null &',$r,$s);
And write this as the first line of myscript.php:
#!/home1/username/php -n
<?php
//script goes here
?>
That should work. The hashbang tells the system what program to use to run the script that follows, so you don't need to add that to your exec call (the script file does need to be executable, e.g. via chmod +x, for the hashbang to be honoured). Also, it's safer (and therefore better) to put brackets around the full script call, so the shell knows exactly which output has to be redirected to which stream, to avoid any issues that might occur. Especially when libs or packages like PHP-GTK are installed on the server (hence the -n option).
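Putting that together, a sketch of the full setup this answer describes (paths reused from the question; the chmod step is my addition, since the hashbang is only honoured when the script file is executable):
# one-time, from a shell:
chmod +x /home1/username/public_html/myscript.php
<?php
// then from the web-facing PHP:
exec('(/home1/username/public_html/myscript.php) > /dev/null 2>&1 &', $r, $s);
?>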
