shell script to monitor php process

How can I automatically restart a process when it dies?
I am currently doing it like this:
#!/bin/bash
while true; do
    # is a redis_subscribe worker running? (grep -v grep filters out the grep itself)
    ps -aux 2>/dev/null | grep redis_subscribe | grep -v grep >/dev/null
    if [ $? -ne 0 ]; then
        # not running: restart it in the background, appending stdout and stderr to the log
        php /data/www/wwwroot/app.eclicks.cn/oil/index.php public/redis_subscribe subscribe >>/data/cilogs/manitor/image_upload.log 2>&1 &
    fi
    sleep 10
done

In the script above, 2>/dev/null redirects the "Warning: bad syntax, perhaps a bogus" message from ps to /dev/null,
and the >/dev/null at the end of the pipeline discards grep's output the same way; only the exit status is used.
Hope this helps!
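For what it's worth, the same check can be written more compactly with pgrep -f, which matches against the full command line and avoids the grep -v grep dance. A minimal sketch, assuming pgrep is available (and that the pattern does not also match the monitor script's own name):

#!/bin/bash
# same behaviour as the loop above, using pgrep instead of ps | grep
while true; do
    if ! pgrep -f redis_subscribe >/dev/null; then
        php /data/www/wwwroot/app.eclicks.cn/oil/index.php public/redis_subscribe subscribe >>/data/cilogs/manitor/image_upload.log 2>&1 &
    fi
    sleep 10
done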

Related

Is there another way to separate a process from PHP exec?

I've been at this for a few days now, on and off, while working on other sections of my project:
echo "playing";
header("HTTP/1.1 200 OK");
exec('./omx-start.sh "' . $full . '" > /dev/null 2>&1 &');
die();
I've also put the exec inside my die, like this:
die(exec('nohup ./omx-start.sh "' . $full . '" > /dev/null 2>&1 &'));
I've also tried adding nohup (as above).
Content of omx-start.sh:
# kill any omxplayer instance that is already running
ps cax | grep "omxplayer" > /dev/null
if [ $? -eq 0 ]; then
    sudo killall omxplayer && sudo killall omxplayer.bin
fi
echo $1
# recreate the "playing" fifo
if [ -e "playing" ]; then
    rm "playing"
fi
mkfifo "playing"
# start the player detached, discarding all output
nohup omxplayer -b -o hdmi "$1" > /dev/null 2>&1 &
I've also added nohup and & as the control operator, as above; it SHOULD fork off into a subshell.
I can do this easily with Python, or with any other language, actually.
Am I really going to have to make my PHP script call a Python script that runs omx-start.sh? Or is there actually a good way to fork PHP scripts, or to force them to stop loading?
My die(); SOMETIMES triggers as well: if I do die("test"); I can sometimes see it, and the page STILL hangs (keeps loading) even though the PHP process is freed up to take other requests at that time. But the page.. still.. hangs.. what?
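One pattern worth trying here (a hedged sketch, not a confirmed diagnosis): give the child its own session with setsid and detach stdin as well, so nothing in the request's pipe chain is left open for the browser to wait on. Shown as the shell line you would hand to exec(); the script path is a placeholder:

# hypothetical command string for PHP's exec():
# setsid puts the child in its own session; < /dev/null detaches stdin,
# since an inherited request pipe held open by the child can keep the
# page "loading" even after PHP itself has finished
setsid /path/to/omx-start.sh "$1" < /dev/null > /dev/null 2>&1 &

Also note that opening a FIFO (the "playing" file created with mkfifo) blocks until both a reader and a writer have it open, which is another classic source of mysterious hangs.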

One command - two cron jobs

I need to execute two PHP files in one command; the second file needs to run right after the first one finishes.
This is what I did, not sure if it's OK:
/usr/bin/wget -O /dev/null -o /dev/null https://example.com/scripts/cron.php; sleep 2; wget -q -O - https://example.com/cron2.php > /dev/null 2>&1
I added a sleep between the commands; will that work?
You can use && for sequential execution of commands;
see https://www.steveroot.co.uk/2010/07/05/cron-multiple/ and "What is the purpose of "&&" in a shell command?".
In your case you can try:
01 00 * * * /usr/bin/wget -O /dev/null -o /dev/null https://example.com/scripts/cron.php && wget -q -O - https://example.com/cron2.php > /dev/null 2>&1
Hope this helps.
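For what it's worth, the difference between ; and && is not cosmetic: ; runs the second command unconditionally, while && runs it only if the first exits with status 0. A quick illustration:

# ';' runs the second command no matter what:
false; echo "runs anyway"          # prints
# '&&' short-circuits when the first command fails:
false && echo "never runs"         # prints nothing
true && echo "runs on success"     # prints

The sleep 2 in the original line is therefore unnecessary for ordering; the shell already waits for the first wget to finish before starting the second.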

Getting a process to fork and be independent of parent in Apache + PHP + Bash

I have a bash script that calls liquidsoap like so:
/bin/sh -c "echo \$\$ > \"${sdir}/pid/${sfile}.pid\" && exec liquidsoap \"${sdir}/liq/${sfile}.liq\" >/dev/null 2>&1 || rm \"${sdir}/pid/${sfile}.pid\"" &
(For readability, this is what it looks like with the variables filled in:)
/bin/sh -c "echo \$\$ > \"/radio/pid/station.pid\" && exec liquidsoap \"/radio/liq/station.liq\" >/dev/null 2>&1 || rm \"/radio/pid/station.pid\"" &
In PHP, the script is called with
return shell_exec("{$this->streamBase}/scripts/{$this->streamName} start config {$stationConfig}");
My problem is: I just had to restart Apache, and when I did, it also killed the liquidsoap instances. I would like them to run fully independently of Apache, so that I could restart Apache and they would keep running.
I'm not sure how I can achieve that.
EDIT:
I've tried changing
/bin/sh -c "echo \$\$ > \"${sdir}/pid/${sfile}.pid\" && exec liquidsoap \"${sdir}/liq/${sfile}.liq\" >/dev/null 2>&1 || rm \"${sdir}/pid/${sfile}.pid\"" &
to
(/bin/sh -c "echo \$\$ > \"${sdir}/pid/${sfile}.pid\" && exec liquidsoap \"${sdir}/liq/${sfile}.liq\" >/dev/null 2>&1 || rm \"${sdir}/pid/${sfile}.pid\"" & ) &
and
nohup /bin/sh -c "echo \$\$ > \"${sdir}/pid/${sfile}.pid\" && exec liquidsoap \"${sdir}/liq/${sfile}.liq\" >/dev/null 2>&1 || rm \"${sdir}/pid/${sfile}.pid\"" &
Neither keeps liquidsoap running if I restart (or stop/start) Apache. When Apache stops, so do those processes.
For an exit code to be propagated up the chain, the parents and grandparents must exist; if you kill the grandparent, aka Apache, then yes, you kill the children and grandchildren, unless they leave the family and become daemons.
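A hedged sketch of the "become a daemon" idea, reusing the pid/liq layout from the question: start the child under setsid so it gets its own session (and thus leaves Apache's process group), and detach all three standard streams:

# hypothetical variant of the start line: setsid detaches the child from
# Apache's session and process group, so stopping Apache no longer kills it
setsid /bin/sh -c "echo \$\$ > \"${sdir}/pid/${sfile}.pid\" && exec liquidsoap \"${sdir}/liq/${sfile}.liq\" || rm \"${sdir}/pid/${sfile}.pid\"" < /dev/null > /dev/null 2>&1 &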

PHP: cannot exec a script in the background

I have a main PHP script on a web server (nginx + php-fpm) and I try to run another PHP script in the background via a GET request from the web browser. The line in main.php that calls detect.php:
exec("/usr/bin/php -f /var/www/detect.php 6 > /dev/null 2>&1 &");
detect.php does not start, and I don't get any errors.
If I remove the "&":
exec("/usr/bin/php -f /var/www/detect.php 6 > /dev/null 2>&1 ");
detect.php starts successfully.
From a bash shell, with "&":
sudo -u www-data /usr/bin/php -f /var/www/detect.php 6 > /dev/null 2>&1 &
Script detect.php starts successfully.
Try this, and make sure your PHP path is correct:
$dir       = dirname(__FILE__);   // run from the directory of the calling script
$file_name = "detect.php";
$php_path  = "/usr/bin/php";
$args      = array(6);
$command1  = "cd $dir";
$command2  = "$php_path $file_name " . implode(" ", $args) . " > /dev/null 2>/dev/null &";
$final_command = $command1 . "; " . $command2;
shell_exec($final_command);
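If it still will not start, a useful first step is to stop discarding the output and reproduce the call as the php-fpm user, so whatever error the child prints actually becomes visible (the output file path here is just a placeholder):

# run the exact same command as the pool user, but keep the output this time
sudo -u www-data /bin/sh -c '/usr/bin/php -f /var/www/detect.php 6 > /tmp/detect.out 2>&1 &'
cat /tmp/detect.out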

Wget download queue script

The idea is that while wget is running and downloading something, I can just add another URL that will be downloaded once the current one finishes. I only want to download one file at a time. I wrote this script:
#!/bin/bash
test=/tmp/wget-download-link.txt
echo -n "$test" | while IFS= read -N 1 a; do
    wget -o /tmp/wget.log -P /mnt/usb -i /tmp/wget-download-link.txt
    if [[ "$a" == $'\n' ]]; then
        wget -nc -o /tmp/wget.log -P /mnt/usb -i /tmp/wget-download-link.txt
    fi
    #printf "$a"
    echo download finished
done
The script checks for any new lines containing URLs and, if there are any, reruns wget. The problem is that this script just keeps looping: wget downloads the same files continuously and simply renames them when they already exist. How do I make wget re-run when there are new URLs in the wget-download-link.txt file, but stop re-downloading files that already exist?
@msturdy I ran your script, but wget re-downloads and renames files that already exist. My script:
#!/bin/bash
test=/tmp/wget-download-link.txt
l=$(wc -l $test)
tail -n $l -f $test | while read url; do
    wget -o /tmp/wget.log -P /mnt/usb -i /tmp/wget-download-link.txt
done
my wget-download-link.txt file:
http://media2.giga.de/2014/11/angel-beats-kanade.jpg
http://juanestebanrojas.com/wp-content/uploads/2014/06/angel-beats-wallpapers-4.jpg
http://images5.fanpop.com/image/photos/30100000/Angel-Beats-new-life-angel-beats-30142329-2560-909.jpg
http://kristenhazelkannon.files.wordpress.com/2013/06/angelbeats2.jpg
Downloaded files:
angel-beats-wallpapers-4.jpg
angel-beats-wallpapers-4.jpg.1
Angel-Beats-new-life-angel-beats-30142329-2560-909.jpg.1
Angel-Beats-new-life-angel-beats-30142329-2560-909.jpg
angel-beats-kanade.jpg.2
angel-beats-kanade.jpg.1
angel-beats-kanade.jpg
angelbeats2.jpg
The script keeps running and just renames files to .1, .2, .3, and so on.
SOLVED WITH THIS
while true; do
    # wait until the queue file exists
    urlfile=$( ls /root/wget/wget-download-link.txt 2>/dev/null | head -n 1 )
    dir=$( cat /root/wget/wget-dir.txt )
    if [ "$urlfile" = "" ]; then
        sleep 180
        continue
    fi
    # take the first URL in the queue
    url=$( head -n 1 $urlfile )
    if [ "$url" = "" ]; then
        mv $urlfile $urlfile.invalid
        continue
    fi
    # mark the queue busy while downloading, done when finished
    mv $urlfile $urlfile.busy
    wget $url -P $dir -o /www/wget.log -c -t 100 -nc
    mv $urlfile.busy $urlfile.done
done
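A note on the design: renaming the queue file to .busy and then .done doubles as a crude lock and a progress marker, which is why the loop above behaves. If you would rather keep one persistent queue file and consume it line by line, a minimal sketch (paths are placeholders; -nc makes wget skip files that already exist instead of renaming them):

#!/bin/bash
# hypothetical single-queue variant: take the top URL, download it,
# then delete that line so the queue shrinks as it is processed
queue=/root/wget/wget-download-link.txt
while true; do
    url=$(head -n 1 "$queue" 2>/dev/null)
    if [ -z "$url" ]; then
        sleep 60          # queue empty or missing; check again later
        continue
    fi
    wget -nc -P /mnt/usb "$url" -o /tmp/wget.log
    sed -i '1d' "$queue"  # drop the URL we just handled
done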
