This question already has answers here: php exec command (or similar) to not wait for result
I have a PHP script that queries a database for a list of jobs to be done and fires off other PHP scripts based on what it finds in the database (basically a process queue).
Some of the scripts that the queue runner script executes may take 30 seconds or so to finish running (generating PDFs, resizing images, etc).
The problem is that shell_exec() in the queue runner script calls the processing scripts, but then doesn't wait for them to finish, resulting in the queue not being completed.
Queue runner script:
#!/usr/bin/php
<?php
// Loop through database and find jobs to be done
shell_exec(sprintf("/root/scripts/%s.php", $row['jobName']));
?>
Job script:
#!/usr/bin/php
<?php
shell_exec("/usr/bin/htmldoc -t pdf --webpage test.html > test.pdf");
// Update database to mark job as completed
?>
Running the job script directly from the command line works and the PDF is created.
Any ideas on how to fix this? Or a better way to run a process queue?
Try this:
shell_exec("nohup /usr/bin/htmldoc -t pdf --webpage test.html > test.pdf 2>&1 &");
Related
When executing multiple scripts within PHP using the exec command, is each script run one at a time, one after the other, or are they run simultaneously?
exec('/usr/bin/php -q process-duplicates.php');
exec('/usr/bin/php -q process-images.php');
exec('/usr/bin/php -q process-sitemaps.php');
Just want to make sure they run one after the other before attempting to rewrite my crontabs.
Sure, the only way to run a command in the background is to add & to the command line, which puts the exec()'d process into the background:
exec("php test.php &");
So you are right, they run one after the other.
NOTE: In your case you shouldn't use &, as it will force all the scripts to run simultaneously.
exec() waits for the script to return; see php.net:
If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
But as a devops, please please please do not run your cron jobs like this! Create entries in the crontab for each, or put them in a shell script and have cron run the script.
This question already has answers here: php execute a background process
If I run this Unix command directly in the shell:
$ sleep 100 &
sleep runs in the background as expected, and I can continue working in the command line.
But trying the same thing with shell_exec() and PHP, I get different results.
<?php
$sleep = $argv[1];
$shell = "sleep " . $sleep . " &";
shell_exec($shell);
?>
When executing php sleep.php 100, the command line hangs and won't accept any more commands until sleep finishes. I am not sure whether this is a nuance I am missing with shell_exec()/$argv in PHP or with the Unix shell.
Thanks.
The shell_exec() function is trying to capture the output of the command, which it can't do while simultaneously continuing processing. In fact, if you look at the PHP source code, shell_exec() does a popen() C call, which ends in a wait syscall on the command; wait guarantees that the call doesn't return until the child has exited.
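For completeness, the usual workaround from the other threads applies here as well; a minimal sketch, assuming you don't actually need sleep's output back in PHP:
<?php
// Redirect stdout/stderr and background the command: shell_exec() then has
// nothing to capture and returns immediately.
$sleep = (int) $argv[1];
shell_exec("sleep " . $sleep . " > /dev/null 2>&1 &");
?>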
This question already has answers here: Make PHP wait for Matlab script to finish executing
Okay, starting from php execute a background process, running a background process works great. The problem is, I also need the return value of that process. The obvious solution to me is:
$cmd = "($cmd > $outputfile 2>&1 || echo $? > $returnfile) & echo $! > $pidfile";
exec($cmd);
When I run the generated command on the command line, it goes to the background and the files are filled out as expected. The problem is that when PHP's exec() runs it, the command doesn't go to the background (at least, exec() doesn't return until the command finishes). I tried variations with nohup and wait $pid, but still no solution.
Any thoughts?
This is tricky: you could potentially fork the process to do something else, leaving the original process in place.
http://php.net/manual/en/function.pcntl-fork.php
However, if this is a web application, there's no built-in way to retrieve the return code or STDOUT back into the parent process since it's technically async (your request-response cycle will likely end before a result can be produced).
You could store the return code and/or STDOUT to files to check later, though.
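A rough sketch of that idea, assuming the pcntl extension is available (CLI PHP, not a typical web SAPI); the command and file paths are made up for the example:
<?php
$pid = pcntl_fork();
if ($pid === -1) {
    die("fork failed\n");
} elseif ($pid === 0) {
    // Child: run the slow command, then store its exit code for later inspection.
    exec('/usr/bin/some-slow-command > /tmp/job.out 2>&1', $unused, $code);
    file_put_contents('/tmp/job.exit', (string) $code);
    exit(0);
}
// Parent: continues immediately; check /tmp/job.exit and /tmp/job.out later.
echo "started child $pid\n";
?>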
I have this set in my PHP script so it can supposedly run as long as it needs to, parsing, doing MySQL queries, and fetching images for over 100,000 rows.
ignore_user_abort(true);
set_time_limit(0);
#begin logging output
error_reporting(E_ALL);
ini_set('memory_limit', '512M');
I run the command like this in shell:
nohup php myscript.php > output.txt
After running for about 8 to 10 hours, this script will still be running but execution just stops... no more output. It's not a zombie process; I checked top. It hasn't hit the memory limit either, and if it had, wouldn't it exit?
What is going on? It's a real pain to babysit this script and write custom code to nudge it along. I read up on Unix possibly needing zombies cleaned up, but it's not a zombie. I know it's not PHP settings, and it's not running through a webserver; it's run from the command line only, so what gives?
It looks like you haven't detached your process correctly. Currently, if your process's parent dies, your process will die too. If you place your process in the background (create a real daemon), you won't run into such trouble.
You can execute your PHP script this way to really detach it:
php myscript.php > output.txt 2>&1 &
For your information:
> output.txt
will redirect standard output (i.e. your echo, print, etc.) to the output.txt file
2>&1
will redirect error output to standard output, writing it to the same output.txt file
&
is the most important thing in your case: it will detach your process and create a real daemon.
Edit: if you're having trouble when disconnecting from your shell, the simplest thing is to put your command in a bash script, for example run.sh:
#!/bin/bash
php myscript.php > output.txt 2>&1 &
And you'll run your script this way:
bash run.sh &
In that case, your shell will "think" your program has ended when the shell script ends, not when the PHP daemon ends.
Long-running PHP scripts shouldn't die or hang without reason. I've had scripts that run continuously for 6+ months. There must be something else going on inside your script body.
I know I should use a comment to answer this, but I don't have enough reputation to do it...
Maybe your process is consuming 100% of the CPU; I had an issue with a while loop that didn't call sleep() or usleep() at the end of each iteration.
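For illustration, a minimal sketch of that pattern; fetch_next_job() and process_job() are hypothetical stand-ins for whatever the script actually does per row:
<?php
while (true) {
    $job = fetch_next_job();   // hypothetical: returns null when there is nothing to do
    if ($job === null) {
        usleep(250000);        // yield for 250 ms instead of spinning at 100% CPU
        continue;
    }
    process_job($job);         // hypothetical worker
}
?>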
I made this script to test the execution of PHP as a background process
foreach ($tests as $test) {
    exec("php test.php " . $test["id"]);
}
as suggested in php process background
and How to add large number of event notification reminder via Google Calendar API using PHP? and php execute a background process
But the script does not run any faster than it did when everything was in one script, before splitting out test.php.
What am I doing wrong?
Thanks in advance!
exec() will block until the process you're exec'ing has completed; in other words, you're basically running your test.php as a subroutine. At a bare minimum you need to add an & to the command line, which puts that exec()'d process into the background:
exec("php test.php {$test['id']} &");