PHP exec and header redirect - php

I have a long-running PHP script that I want to execute in the background on the server after a user action, and the user should be redirected to another page while the command runs in the background.
Below is the code:
$command = exec('php -q /mylongrunningscript.php');
header("Location: /main.php?action=welcome");
The above script runs fine, but the page does not redirect until $command = exec('php -q /mylongrunningscript.php'); has finished.
I want the user to be redirected to the welcome page immediately.
Is there any other way to achieve this?
The other idea was to run $command = exec('php -q /mylongrunningscript.php'); on the welcome page instead, but then the welcome page HTML is only shown after the command has finished. The command takes about 5-6 minutes, and during this time the page does not load.
I am on CentOS Linux with PHP 5.3.

Can you try this instead:
$result = shell_exec('php -q /mylongrunningscript.php > /dev/null 2>&1 &');
PS: Note that this redirects stdout and stderr to /dev/null. If you want to capture the output instead, use:
$result = shell_exec('php -q /mylongrunningscript.php > /tmp/script.out 2>&1 &');
Alternatively, use this PHP function to run any Unix command in the background:
// Run a Linux command in the background and return the PID created by the OS
function run_in_background($Command, $Priority = 0) {
    if ($Priority) {
        $PID = shell_exec("nohup nice -n $Priority $Command > /dev/null & echo $!");
    } else {
        $PID = shell_exec("nohup $Command > /dev/null & echo $!");
    }
    return $PID;
}
Courtesy: A comment posted on http://php.net/manual/en/function.shell-exec.php
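For the original question, a minimal usage sketch (reusing the script path and redirect from the question) could look like this:
// Launch the long-running script detached, then redirect the user right away.
$pid = run_in_background('php -q /mylongrunningscript.php');
header("Location: /main.php?action=welcome");
exit;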

As noted on the PHP manual page for exec():
If a program is started with this function, in order for it to
continue running in the background, the output of the program must be
redirected to a file or another output stream. Failing to do so will
cause PHP to hang until the execution of the program ends.
So let's do that, using 2>&1 (basically, 2 is stderr and 1 is stdout, so this means "redirect all stderr messages to stdout"), plus a target for the output and a trailing & so the command is backgrounded:
shell_exec('php -q /mylongrunningscript.php > /dev/null 2>&1 &');
or, if you want to know what it outputs:
shell_exec('php -q /mylongrunningscript.php > output.log 2>&1 &');

Send the script output to /dev/null, add a trailing & so the command is backgrounded, and the exec function will return immediately:
$command = exec('php -q /mylongrunningscript.php > /dev/null 2>&1 &');

Related

Is timeout possible in exec function in php?

I want exec() in PHP to stop execution after a given time.
For example:
exec("/usr/local/bin/wrun 'uptime;ps -elf|grep httpd|wc -l;free -m;mpstat'", $admin_uptime);
should stop executing after 20 seconds.
Is it possible?
<?php
exec("/usr/local/bin/wrun 'uptime;ps -elf|grep httpd|wc -l;free -m;mpstat'",$admin_uptime);
?>
In Unix, every process is bound to its parent process. If the parent dies, the child processes die as well. So if you use set_time_limit(20);, your script and the child processes it executed will be killed.
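As an aside that none of the answers here mention: if the goal is strictly to cap the external command at 20 seconds, one option is to wrap it with the coreutils timeout utility, roughly like this:
// Let coreutils' timeout kill the command after 20 seconds.
exec("timeout 20 /usr/local/bin/wrun 'uptime;ps -elf|grep httpd|wc -l;free -m;mpstat'", $admin_uptime, $status);
// $status is 124 when the command was killed because it ran too long.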
The exec() function handles the output of the program it runs, so I suggest you redirect the output to /dev/null (a virtual writable file that silently discards everything written to it).
Try running:
exec("/usr/local/bin/php -q /home/gooffers/somefile.php > /dev/null 2>&1 &");
Note: 2>&1 redirects error output to standard output, and > /dev/null redirects standard output to that virtual file.
If you still have difficulties, you can create a script whose only job is to execute other scripts. exec() follows a process while it is doing its task, but releases it as soon as the task is finished. If the executed script merely launches another one, that task is very quick, and exec() is released just as quickly.
Let's see an implementation. Create an exec.php that contains:
<?php
if (count($argv) == 1) {
    die('You must give a script to exec...');
}
array_shift($argv);
$cmd = '/usr/local/bin/php -q';
foreach ($argv as $arg) {
    $cmd .= " " . escapeshellarg($arg);
}
exec("{$cmd} > /dev/null 2>&1 &");
?>
Now, run the following command:
exec("/usr/local/bin/php -q exec.php /home/gooffers/somefile.php > /dev/null 2>&1 &");
If you have arguments, you can pass them too:
exec("/usr/local/bin/php -q exec.php /home/gooffers/somefile.php x y z > /dev/null 2>&1 &");

PHP exec timeout

I have the following exec() call with an & sign at the end so the script runs in the background. However, the script is not running in the background: it times out in the browser after exactly 5.6 minutes. Also, if I close the browser, the script doesn't keep running.
exec("/usr/local/bin/php -q /home/user/somefile.php &");
If I run the script via the command line, it does not time out. My question is: how do I prevent the timeout? How do I run the script in the background using exec() so it's not browser dependent? What am I doing wrong and what should I look at?
The exec() function handles the output of the program it runs, so I suggest you redirect the output to /dev/null (a virtual writable file that silently discards everything written to it).
Try running:
exec("/usr/local/bin/php -q /home/gooffers/somefile.php > /dev/null 2>&1 &");
Note: 2>&1 redirects error output to standard output, and > /dev/null redirects standard output to that virtual file.
If you still have difficulties, you can create a script whose only job is to execute other scripts. exec() follows a process while it is doing its task, but releases it as soon as the task is finished. If the executed script merely launches another one, that task is very quick, and exec() is released just as quickly.
Let's see an implementation. Create an exec.php that contains:
<?php
if (count($argv) == 1) {
    die('You must give a script to exec...');
}
array_shift($argv);
$cmd = '/usr/local/bin/php -q';
foreach ($argv as $arg) {
    $cmd .= " " . escapeshellarg($arg);
}
exec("{$cmd} > /dev/null 2>&1 &");
?>
Now, run the following command:
exec("/usr/local/bin/php -q exec.php /home/gooffers/somefile.php > /dev/null 2>&1 &");
If you have arguments, you can pass them too:
exec("/usr/local/bin/php -q exec.php /home/gooffers/somefile.php x y z > /dev/null 2>&1 &");
You'll need to use shell_exec() instead:
shell_exec("/usr/local/bin/php -q /home/gooffers/somefile.php &");
That being said, if you have shell access, why don't you install this as a cronjob? I'm not sure why a PHP script is invoking another to run like this.

PHP: exec() doesn't run in the background even with ">/dev/null 2>&1 &"

I'm calling this in my php script:
exec("gutschein.php >/dev/null 2>&1 &");
Calling the script (it generates a PDF and sends it off by e-mail) works, but the process does not run in the background (I checked this with a sleep statement inside gutschein.php). The browser hangs until the execution of gutschein.php is finished.
I also checked out the following:
exec("/usr/bin/php gutschein.php >/dev/null 2>&1 &");
or
shell_exec("/usr/bin/php gutschein.php >/dev/null 2>&1 &");
It doesn't change anything. The script is running on a Linux server. Does anybody have an idea what I'm doing wrong?
Try system() and/or passthru(). I've had issues with exec() before because it halts, trying to fill the return array with data, until the process has finished.
These will both echo raw output, so even if they work you may need to handle that with a discarded output buffer.
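A minimal sketch of the discarded-output-buffer idea (illustrative only, reusing the script from the question):
// Capture and throw away anything the command echoes to the browser.
// Note: this alone does not background the process; passthru() still waits for it.
ob_start();
passthru('/usr/bin/php gutschein.php 2>&1');
ob_end_clean();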
/**
 * @author Micheal Mouner
 * @param  String $commandJob
 * @return Integer $pid
 */
public function PsExec($commandJob)
{
    $command = $commandJob . ' > /dev/null 2>&1 & echo $!';
    exec($command, $op);
    $pid = (int) $op[0];
    if ($pid != "")
        return $pid;
    return false;
}
This worked for me; check it out. It also returns the process ID of the background process.
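Assuming PsExec() sits in a class (or is adapted to a plain function), usage could look roughly like this; $worker is a hypothetical instance:
// Start the script detached and remember its PID for later checks.
$pid = $worker->PsExec('/usr/bin/php gutschein.php');
if ($pid !== false) {
    // e.g. posix_kill($pid, 0) can later tell you whether it is still running.
}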
All output must be redirected, else the script will hang as long as gutschein.php executes. Try
exec('/usr/bin/php gutschein.php &> /dev/null &');
Can you try one of the following 2 commands to run background jobs from PHP:
$out = shell_exec('nohup /usr/bin/php /path/to/gutschein.php >/dev/null 2>&1 &');
OR
$pid = pclose(popen('/usr/bin/php gutschein.php &', 'r'));
Both will execute the command in the background. Note that pclose() returns the command's exit status rather than a real PID; to capture the actual PID, append & echo $! to the command as shown in other answers here.
You can use screen: screen -d -m /usr/bin/php gutschein.php
Here is screen's manual if you need more info on options.
The approach that works for me:
php -q /path/to/my/test/script/cli_test.php argument1 < /dev/null &
In PHP, that looks like:
exec('php -q /path/to/my/test/script/cli_test.php argument1 < /dev/null &');
The browser might be waiting for a response, so let your script produce some output and terminate the "main process". A simple
die('ok');
should do the job.
By the way, forking a new process by calling exec() might not be the best solution, since the new process isn't a real child process, which means you can't control it. You might consider using pcntl (http://php.net/manual/de/book.pcntl.php) for this purpose. But beware: this extension also has its pitfalls, especially in the context of a web server.
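A minimal pcntl_fork() sketch (illustrative only; as noted above, pcntl is really meant for CLI scripts, not a web server SAPI):
// Fork: the parent answers the request, the child does the long-running work.
$pid = pcntl_fork();
if ($pid == -1) {
    die('could not fork');
} elseif ($pid) {
    // Parent process: respond immediately.
    echo 'ok';
} else {
    // Child process: the long-running work goes here.
    sleep(300);
    exit(0);
}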

Shell_exec php with nohup

I think there are tons of similar posts but I haven't yet found a solution after searching around.
Basically, I'm trying to run two scripts in the background. When I run them on the command line, this is what I see after calling my first script:
/usr/bin/nohup php script.php > nohupoutput.log & echo $!
I've tried ...script.php > /dev/null & with the same result. I get:
/usr/bin/nohup: ignoring input and redirecting stderr to stdout
which I ignore, and then I run the second one. I noticed that it seemed to hang there, and pressing Enter brought me back to machine:~folder>
/usr/bin/nohup php script2.php > nohupoutput.log & echo $!
Both scripts work. I then tried to convert this to a shell_exec() call, and nothing seems to work. I suspect that the "ignoring input" bit is causing difficulties, but I'm not sure. Regardless, the following does not work; it just hangs in the browser:
$output = shell_exec('/usr/bin/nohup php script.php > /dev/null &');
$output = shell_exec('/usr/bin/nohup php script2.php > /dev/null &');
Try:
$output = shell_exec('/usr/bin/nohup php script.php >/dev/null 2>&1 &');
Or:
exec('/usr/bin/nohup php script.php >/dev/null 2>&1 &');
This should work:
shell_exec('nohup /usr/bin/php path/to/script.php > output.txt &');
<?php
function execInBackground($cmd) {
    if (substr(php_uname(), 0, 7) == "Windows") {
        pclose(popen("start /B " . $cmd, "r"));
    } else {
        exec($cmd . " > /dev/null &");
    }
}
// take note: to get your PHP path, try looking at your phpinfo() :)
echo execInBackground("/usr/local/php53/bin/php 'example2.php'");
?>
First put your PHP command in a shell script file, e.g. myscript.sh:
#!/bin/bash
# myscript.sh file
php script.php
Run nohup with myscript.sh:
sudo nohup ./myscript.sh &
Verify with ps:
ps aux | grep myscript.sh

php exec command (or similar) to not wait for result

I have a command I want to run, but I do not want PHP to sit and wait for the result.
<?php
echo "Starting Script";
exec('run_baby_run');
echo "Thanks, Script is running in background";
?>
Is it possible to have PHP not wait for the result, i.e. just kick it off and move along to the next command?
I can't find anything, and I'm not sure it's even possible. The best I could find was someone setting up a cron job to start it in a minute.
From the documentation:
In order to execute a command and have it not hang your PHP script while
it runs, the program you run must not output back to PHP. To do this,
redirect both stdout and stderr to /dev/null, then background it.
> /dev/null 2>&1 &
In order to execute a command and have it spawned off as another process that is not dependent on the Apache thread to keep running (it will not die if somebody cancels the page), run this:
exec('bash -c "exec nohup setsid your_command > /dev/null 2>&1 &"');
You can run the command in the background by adding an & at the end of it:
exec('run_baby_run &');
But doing this alone will hang your script because:
If a program is started with exec function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
So you can redirect the command's stdout to a file if you want to see it later, or to /dev/null if you want to discard it:
exec('run_baby_run > /dev/null &');
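For example, to keep the output around for later inspection (the log path here is just illustrative):
exec('run_baby_run > /tmp/run_baby_run.log 2>&1 &');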
This uses wget to notify a URL of something without waiting.
$command = 'wget -qO- http://test.com/data=data';
exec('nohup ' . $command . ' >> /dev/null 2>&1 & echo $!', $pid);
This uses ls to update a log without waiting.
$command = 'ls -la > content.log';
exec('nohup ' . $command . ' >> /dev/null 2>&1 & echo $!', $pid);
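In both cases, because of the trailing echo $!, the PID of the detached process ends up in $pid[0], so you can read it back if you need it later:
// PID of the background process, e.g. for checking on it later.
$background_pid = (int) $pid[0];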
I know this question has been answered, but the answers I found here didn't work for my scenario (or for Windows).
I am using a Windows 10 laptop with PHP 7.2 in XAMPP v3.2.4.
$command = 'php Cron.php send_email "' . $id . '"';
if (substr(php_uname(), 0, 7) == "Windows") {
    // Windows
    pclose(popen("start /B " . $command . " 1> temp/update_log 2>&1 &", "r"));
} else {
    // Linux
    shell_exec($command . " > /dev/null 2>&1 &");
}
This worked perfectly for me.
I hope it will help someone on Windows. Cheers.
There are two possible ways to implement this.
The easiest way is to redirect the output to /dev/null:
exec("run_baby_run > /dev/null 2>&1 &");
But in case you have other operations to perform afterwards, you may consider ignore_user_abort.
In that case the script will keep running even after you close the connection.
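A rough sketch of the ignore_user_abort() approach (illustrative only; this connection-close trick works under mod_php, while PHP-FPM setups would typically call fastcgi_finish_request() instead):
// Keep running after the client disconnects, with no time limit.
ignore_user_abort(true);
set_time_limit(0);

// Send a small response and tell the client the connection is finished.
ob_start();
echo 'ok';
header('Connection: close');
header('Content-Length: ' . ob_get_length());
ob_end_flush();
flush();

// The browser has its answer; the long-running work continues here.
exec('run_baby_run');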
"exec nohup setsid your_command"
the nohup allows your_command to continue even though the process that launched may terminate first. If it does, the the SIGNUP signal will be sent to your_command causing it to terminate (unless it catches that signal and ignores it).
On Windows, you may use the COM object:
if (class_exists('COM')) {
    $shell = new COM('WScript.Shell');
    $shell->Run($cmd, 1, false);
} else {
    exec('nohup ' . $cmd . ' > /dev/null 2>&1 &');
}
