Parallelizing PHP processes with a Bash Script? - php

I want to launch ~10 php processes from a bash script. When one of them finishes, I'd like the bash script to launch another php process, and continue on indefinitely, always having ~10 php processes running.
What is the simplest way to do this?
The php file launched will be the same every time, but each process will pull new values from the database, so it's processing new data on every run. The file I need to launch and all its classes are already written in php.

Seems like a good fit for supervisord. The following configuration will make sure that 10 processes are always running, and it also handles log rotation, which is handy. All output, including stderr, will be written to /var/log/worker.log. With autorestart=true, supervisord replaces a child process as soon as it exits.
[program:worker]
command=php /path/to/worker.php
process_name=%(program_name)s_%(process_num)d
stdout_logfile=/var/log/%(program_name)s.log
redirect_stderr=true
stdout_logfile_maxbytes=512MB
stdout_logfile_backups=3
numprocs=10
numprocs_start=0
autostart=true
autorestart=true
Once you have the supervisor config in place (usually /etc/supervisord/conf.d), you can use supervisorctl as a convenient way to start and stop the process group.
$ supervisorctl start worker:*
...
$ supervisorctl stop worker:*
...
$ supervisorctl status
worker:worker_0 RUNNING pid 8985, uptime 0:09:24
worker:worker_1 RUNNING pid 10157, uptime 0:08:52
...
worker:worker_9 RUNNING pid 12459, uptime 0:08:31
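If supervisord was already running when you added the config file, make it pick up the new program group first; reread and update are standard supervisorctl subcommands:
$ supervisorctl reread
$ supervisorctl update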

You could use GNU Parallel, piping the list of jobs to manage into parallel.
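Since each worker pulls its own work from the database, parallel only needs to keep ~10 copies of one command alive. A minimal sketch (the worker path is a placeholder; -n0 makes parallel read a line of input but pass no arguments through):
yes | parallel -j10 -n0 php /path/to/worker.php
parallel starts 10 jobs and launches a replacement each time one exits, for as long as stdin keeps producing lines.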

You could use something like this. Use one file to launch 10 workers (only run it once), and the bottom of each worker file can relaunch itself when it finishes (see the sketch below the function).
/**
* Asynchronously execute/include a PHP file. Does not record the output of the file anywhere.
*
* @param string $filename file to execute, relative to calling script (or root?)
* @param string $options (optional) arguments to pass to file via the command line
*/
function asyncInclude($filename, $options = '') {
exec("/path/to/php -f {$filename} {$options} >> /dev/null &");
}
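A minimal sketch of that pattern, assuming asyncInclude() is defined in (or included by) both files, and with placeholder paths:
// launch.php - run this once to start ~10 workers
for ($i = 0; $i < 10; $i++) {
    asyncInclude('/path/to/worker.php');
}
// very last line of worker.php - respawn a replacement before exiting
asyncInclude(__FILE__);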

jcomeau@intrepid:/tmp$ cat test.sh
#!/bin/bash
set -m # monitor mode
task="php-cgi /tmp/sleep.php"
function do_task {
$task >/dev/null &
echo -n spawned $! ' ' >&2
}
trap do_task SIGCHLD
for i in $(seq 1 10); do
do_task
done
while true; do
wait
done
jcomeau@intrepid:/tmp$ cat sleep.php
<?php sleep(3); ?>
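On bash 4.3 or newer, wait -n (wait for any single job to finish) gives a simpler version of the same idea without the SIGCHLD trap; a sketch using the same sleep.php:
#!/bin/bash
for i in $(seq 1 10); do
    php /tmp/sleep.php >/dev/null &
done
while true; do
    wait -n                        # returns as soon as any one background job exits
    php /tmp/sleep.php >/dev/null &
done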

Related

PHP only able to kill process when run in cli

I have this php script "start_stream.php", which starts a process (ffmpeg) using proc_open() and then stays in a loop checking for a keep-alive signal.
Once the keep-alive signal is too old, the php script will terminate the process by its pID with this command:
$output = system('taskkill /F /PID '.$this->pID);
echo 'TASKKILL OUTPUT: ';
var_dump($output);
The output from this is always like this:
TASKKILL OUTPUT: string(55) "SUCCESS: The process with PID 2272 has been terminated."
But under some circumstances the process does not terminate and keeps running, even though I always get the SUCCESS message.
If I run the start_stream.php script from cmd (windows) with this command:
php start_stream.php --stream [STREAM-NAME] --url [STREAM-URL]
Then the process will terminate properly as expected.
But since this is a web-application, I need to call the start_stream.php script from another php script. I do it like this:
//Set the command for streaming
$command = 'php start_stream.php --stream '.$stream.' --url '.$url;
//Add the environment variables for shell execution
putenv('PATH=' . $_SERVER['PATH']);
//Set the working directory to streaming directory
chdir("..\\streaming");
//Start the streaming process
define('STDIN',fopen("php://stdin","r"));
define('STDOUT',fopen("stdout-".$stream,"w"));
$descriptorspec = [STDIN, STDOUT, STDOUT];
$process = proc_open($command, $descriptorspec, $pipes);
When I do this, I still get the SUCCESS output, stating that the process has terminated, but the process keeps running in the background.
It works to kill the process with this:
shell_exec('taskkill /IM "ffmpeg.exe" /F');
But I cannot use that, since it will kill all ffmpeg processes, and I need to run concurrent processes, killing only the relevant ones.
As I said earlier, the script runs and terminates as intended if I use the following command on the command line:
php start_stream.php --stream [STREAM-NAME] --url [STREAM-URL]
But if I use php.exe instead of php like so:
php.exe start_stream.php --stream [STREAM-NAME] --url [STREAM-URL]
I will have the same problem with the ffmpeg process not terminating, but still giving a success message.
I am also using proc_terminate() in the start_stream.php script like so:
proc_terminate($this->process);
But this also gives me a "process terminated" status response while the process does not stop.
The pID for the process is the pID of the cmd instance running the process, not of the ffmpeg process itself. When I terminate that pID, the cmd process is killed in every case.
When calling the php process from cmd without using php.exe, the ffmpeg process is killed along with the cmd process. When using php.exe, only the cmd process is killed, but the ffmpeg process keeps running.
Is there a way to either ensure that the ffmpeg process is killed along with the cmd process, or maybe some way to get the pID of the ffmpeg process itself in the context that I need?
Although I was able to fix this, I am still not exactly sure why and how the execution context changes between calling start_stream.php from cmd and calling it from another php script.
When running a command using php's proc_open(), the pID returned will by default be that of the shell executing the command. So if we start a process via the shell, we get the pID of the shell wrapping the process, not the pID of the child process itself.
In some contexts, terminating the shell will also terminate the child process, but in other cases the shell is terminated while the child process keeps running.
If we pass bypass_shell => TRUE in the other_options array to proc_open() (a Windows-only option), php will not create an intermediate cmd.exe process, and the pID returned is that of the child process itself.
The following example will return the destination process pID:
$command = "ffmpeg do something";
$descriptorspec = [STDIN, STDOUT, STDOUT];
$otherOptions = ['bypass_shell' => TRUE];
$this->process = proc_open($command, $descriptorspec, $pipes, NULL, NULL, $otherOptions); // $pipes must be a variable, since it is passed by reference
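If you also want to be sure any children of the target die with it, you can read the pID back from proc_get_status() and pass taskkill's /T (terminate child processes) flag; a sketch:
$status = proc_get_status($this->process);
$this->pID = $status['pid']; // with bypass_shell this is the child process itself
$output = system('taskkill /F /T /PID ' . $this->pID);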

End Process Created by PHP exec in Ubuntu using Apache

I have an Ubuntu VM running in VirtualBox that hosts a server using Apache. The concept of the server is to accept HTTP POST requests, store them in a MySQL database and then execute a Python script with the relevant POST data to be displayed in a Discord channel.
The process itself works, but each time the PHP script calls the Python script, a new process is created that never actually ends. After a few hours of receiving live data, the server runs out of available memory due to the number of lingering processes. The PHP script has the following exec call as its last line of code:
exec("python3 main.py $DATA");
I would like to find a way to actually kill the processes created by this exec command (run as user www-data), either in the Python file after the script executes or automatically with an Apache setting that I probably just do not know about.
When running the following command in a terminal, I can see the different processes:
ps -o pid,user,%mem,command ax | sort -b -k3 -r
There are 3 separate processes that show up, one referencing the actual python3 command as written in the PHP exec call:
9903 www-data 0.4 python3 main.py DATADATADATADATADATADATA
Then another process showing the more common -k start command:
9907 www-data 0.1 /usr/sbin/apache2 -k start
And lastly another process very similar to the PHP exec command:
9902 www-data 0.0 sh -c python3 main.py DATADATADATADATADATADATA
How can I ensure Apache cleans these processes up - OR what do I need to add into my Python or PHP code to appropriately exec a Python script without leaving behind processes?
I didn't realize that the exec command in php would wait indefinitely for the command's output. I added this to the end of the string I was using in my exec call: > /dev/null &
i.e.: exec("python3 main.py $DATA > /dev/null &");
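It may be safer to redirect stderr as well, and to escape the data; a sketch using PHP's standard escapeshellarg():
exec("python3 main.py " . escapeshellarg($DATA) . " > /dev/null 2>&1 &");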

PHP `exec()` doesn't work if run by a process initiated from Cron

I have a Beanstalk MQ where I put tasks to create APKs, and a consumer named AppBuilder.php, which reads messages from the Beanstalk MQ and then execs the command which creates the app (an Android app).
AppBuilder.php is run from the crontab. The process is:
Crontab runs a health-check.sh shell script
health-check.sh runs AppBuilder.php in background
AppBuilder.php calls exec to create the process
Following is the relevant code snippet(s) from each file:
Root crontab is like so:
* * * * * /opt/cron/health-check/health-check.sh
health-check.sh is like this:
#!/bin/bash
PATH=$PATH:/sbin/
#HEALTH CHECK FOR AppBuilder Daemon
if (( $(ps -ef | grep "[A]ppBuilder" | wc -l) > 0 ))
then
echo "AppBuilder is Running"
else
echo "AppBuilder is Stopped, attempting to restart"
$PHP_CMD /opt/appbuilder/AppBuilder.php &
if pgrep "AppBuilder" > /dev/null
then
echo "AppBuilder is now Running"
else
echo "AppBuilder is still not Running"
fi
fi
AppBuilder.php has following exec command:
exec('sudo sh /var/www/cgi-bin/appbuilder/oneClickApk.sh &', $output, $resultCode);
If I run the AppBuilder.php directly, things work fine. However, from cron, it does not.
I've followed this SO Post, and modified the exec command to the following:
exec('/usr/bin/nohup /usr/bin/sudo /usr/bin/env TERM=xterm /bin/sh /var/www/cgi-bin/appbuilder/oneClickApk.sh &', $output, $resultCode);
However, it still fails. Any clues where this may be going wrong? I've spent a lot of time digging through the forums, none of it helping. Please help!
EDIT 1:
The crontab runs and AppBuilder.php gets initialized, but after the exec command I cannot see oneClickApk.sh in the process list.
EDIT 2:
I changed the crontab from root to ec2-user, as suggested in comments: Still the process does not run.
Your first approach (below) should work; the problem is most likely the cron environment. To fix the issue, do the following:
Check whether your crontab entry is set up properly.
In the crontab, provide the full path to your executable.
Check whether the crontab user account has the proper settings to execute sudo.
Check whether the crontab account has proper permission to execute both your PHP application and your shell script. If not, add the account to the proper group so that it can run both.
exec('sudo sh /var/www/cgi-bin/appbuilder/oneClickApk.sh &', $output, $resultCode);
The above command seems OK as it is.
In your shell script, put a wait loop and check whether it works. First add just the lines below, without any if condition. If that works, then check your if conditions to see why they are not being satisfied as you expect.
$PHP_CMD /opt/appbuilder/AppBuilder.php &
while (( $(ps -ef | grep "[h]ealth-check.sh" | wc -l) <= 0 ))
do
sleep 5
done
sleep 30
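While debugging cron issues like this, it also helps to capture the job's output somewhere instead of discarding it; a sketch (the log path is arbitrary):
* * * * * /opt/cron/health-check/health-check.sh >> /tmp/health-check.log 2>&1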

Run multiple php scripts in sequence from a single php script or batch file in windows

I have 5 php scripts that need to be executed one after the other. How can I create a single php/batch script which executes all the php scripts in sequence?
I want to schedule a cron job which runs that php file.
#!/path/to/your/php
<?php
include("script1.php");
include("script2.php");
include("script3.php");
//...
?>
Alternative
#!/bin/bash
/path/to/your/php /path/to/your/script1.php
/path/to/your/php /path/to/your/script2.php
/path/to/your/php /path/to/your/script3.php
# ...
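If a later script must not run when an earlier one fails, abort on the first non-zero exit status; a sketch with placeholder paths:
#!/bin/bash
set -e  # stop the sequence as soon as any script fails
/path/to/your/php /path/to/your/script1.php
/path/to/your/php /path/to/your/script2.php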
If your scripts need to be accessed via http:
#!/bin/bash
wget --quiet -O /dev/null http://path/to/your/script1.php
wget --quiet -O /dev/null http://path/to/your/script2.php
wget --quiet -O /dev/null http://path/to/your/script3.php
# ...
I did something like this for testing, using the wait command; it seems I just put wait between the calls to each script. As a little test I created databases in php scripts, returned some records in a bash script, updated them with another php script, then used another bash script to return the updated results, and it seemed to work fine.
From what I have read, as long as the subscript doesn't call another subscript, the master script will wait at each point where the wait command is used between script calls.
Code is as below:
#!/bin/sh
/usr/bin/php test.php
wait
/bin/bash test.sh
wait
/usr/bin/php test2.php
wait
/bin/bash test2.sh
wait
echo all done
Hopefully this executes all the php scripts in sequence.

How to check if a php script is still running

I have a PHP script that listens on a queue. Theoretically, it's never supposed to die. Is there something to check if it's still running? Something like Ruby's God ( http://god.rubyforge.org/ ) for PHP?
God is language agnostic but it would be nice to have a solution that works on windows as well.
I had the same issue of wanting to check whether a script is running, so I came up with this and run it as a cron job. It grabs the running processes as an array and cycles through each line, checking for the file name. It seems to work fine. Replace #user# with your script user.
exec("ps -U #user# -u #user# u", $output, $result);
foreach ($output as $line) if (strpos($line, "test.php") !== false) echo "found";
In linux run ps as follows:
ps -C php -f
You could then do in a php script:
$output = shell_exec('ps -C php -f');
if (strpos($output, "php my_script.php")===false) {
shell_exec('php my_script.php > /dev/null 2>&1 &');
}
The above code lists all running php processes in full, then checks whether "my_script.php" is in the list; if not, it starts the process in the background without waiting for it to terminate, and carries on.
Just append a second command after the script. When/if it stops, the second command is invoked. Eg.:
php daemon.php 2>&1 | mail -s "Daemon stopped" you@example.org
Edit:
Technically, this starts the mailer right away, but the mail is only sent once the php script ends and the pipe closes. Doing this captures the output of the php script and includes it in the mail body, which can be useful for debugging what caused the script to halt.
Simple bash script
#!/bin/bash
while true; do
if ! pidof -x script.php > /dev/null; then
php script.php &
fi
sleep 5   # poll every few seconds instead of busy-looping
done
Not for windows, but...
I've got a couple of long-running PHP scripts, each with a shell script wrapping it. You can optionally return a value from the PHP script, which the shell script checks to decide whether to exit, restart immediately, or sleep for a few seconds and then restart.
Here's a simple one that just keeps running the PHP script till it's manually stopped.
#!/bin/bash
clear
date
php -f cli-SCRIPT.php
echo "wait a little while ..."; sleep 10
exec $0
The "exec $0" restarts the script, without creating a sub-process that will have to unravel later (and take up resources in the meantime). This bash script wraps a mail-sender, so it's not a problem if it exits and pauses for a moment.
Here is what I did to combat a similar issue. This helps if anyone else has a parameterized php script that cron executes frequently, where only one execution should run at any time. Add this to the top of your php script, or create a common method.
$runningScripts = shell_exec('ps -ef |grep '.strtolower($parameter).' |grep '.dirname(__FILE__).' |grep '.basename(__FILE__).' |grep -v grep |wc -l');
if($runningScripts > 1){
die();
}
You can write in your crontab something like this:
0 3 * * * /usr/bin/php -f /home/test/test.php my_special_cron
Your test.php file should look like this:
<?php
php_sapi_name() == 'cli' || exit;
if (isset($argv[1])) {
substr_count(shell_exec('ps -ax'), $argv[1]) < 3 || exit;
}
// your code here
That way you will have only one active instance of the cron job, with my_special_cron as the process key. So you can add more jobs within the same php file:
test.php system_send_emails sendEmails
test.php system_create_orders orderExport
Inspired by Justin Levene's answer and improved on it, as ps -C doesn't work on Mac, which I needed in my case. You can use this in a php script (maybe just before you need the daemon alive); tested on both Mac OS X 10.11.4 and Ubuntu 14.04:
$daemonPath = "FULL_PATH_TO_DAEMON";
$runningPhpProcessesOfDaemon = (int) shell_exec("ps aux | grep -c '[p]hp ".$daemonPath."'");
if ($runningPhpProcessesOfDaemon === 0) {
shell_exec('php ' . $daemonPath . ' > /dev/null 2>&1 &');
}
Small but useful detail: why grep -c '[p]hp ...' instead of grep -c 'php ...'?
Because the grep process itself would match the plain 'php ...' pattern and be counted as one of the processes. Bracketing the first letter ([p]hp) still matches "php" in the ps output, but the grep command line itself no longer matches the pattern it searches for.
One possible solution is to have it listen on a port using the socket functions. You can check that the socket is still listening with a simple script. Even a monitoring service like Pingdom could monitor its status. If the script dies, the socket is no longer listening.
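A minimal sketch of such a check, assuming the daemon listens on 127.0.0.1:9000 (both are placeholders):
// probe the daemon's socket with a 2-second timeout
$fp = @fsockopen('127.0.0.1', 9000, $errno, $errstr, 2);
if ($fp === false) {
    // nothing is listening: the daemon has died, so restart it
    shell_exec('php /path/to/daemon.php > /dev/null 2>&1 &');
} else {
    fclose($fp);
}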
Plenty of solutions.. Good luck.
If you control the script, you can have it write a timestamp to the database every so often, and then let a cron job check whether that value is recent.
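A minimal sketch of that heartbeat, assuming a single-row MySQL table named heartbeat and a 60-second staleness threshold (both placeholders):
// in the worker: touch the heartbeat row on every loop iteration
$pdo->exec("UPDATE heartbeat SET last_seen = NOW()");
// in a cron-run checker: act if the heartbeat is stale
$age = $pdo->query("SELECT TIMESTAMPDIFF(SECOND, last_seen, NOW()) FROM heartbeat")->fetchColumn();
if ($age > 60) {
    // worker presumed dead: alert or restart it here
}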
troelskn wrote:
Just append a second command after the script. When/if it stops, the second command is invoked. Eg.:
php daemon.php | mail -s "Daemon stopped" you@example.org
This will call mail each time a line is printed in daemon.php (which should be never, but still.)
Instead, use the double ampersand operator to separate the commands, i.e.
php daemon.php && mail -s "Daemon stopped" you@example.org
If you're having trouble checking for the PHP script directly, you can make a trivial wrapper and check for that. I'm not sufficiently familiar with Windows scripting to put how it's done here, but in Bash, it'd look like...
wrapper_for_test_php.sh
#!/bin/bash
php test.php
Then you'd just check for the wrapper like you'd check for any other bash script: pidof -x wrapper_for_test_php.sh
I have used cmder on Windows, and based on this script I came up with the one below, which I later managed to deploy on Linux.
#!/bin/bash
clear
date
while true
do
php -f processEmails.php
echo "wait a little while for 5 secobds...";
sleep 5
done
