I'm executing a shell script as a cron job every minute like so:
* * * * * /bin/bash /var/www/html/stream.sh
The script contains the following code:
#!/bin/sh
if ps -ef | grep -v grep | grep get_tweets.php ; then
    exit 0
else
    nohup php /var/www/html/streaming/db/get_tweets.php > /dev/null &
    exit 0
fi
I'm also running another shell script as a cron job; the only difference between the two is that "get_tweets" is replaced with "parse_tweets_keyword", and that cron entry is set up like so:
* * * * * /bin/bash /var/www/html/process-check.sh
The problem is, whilst the latter cron job works perfectly fine, the first doesn't seem to run the script successfully. However, when I run the command:
nohup php /var/www/html/streaming/db/get_tweets.php > /dev/null &
The script runs perfectly, so I'm not entirely sure what the problem is. The permissions on all the files are correct and they are executable, I'm running the crons as root under crontab, and both of the PHP scripts that I'm attempting to execute run as background processes.
If anyone can help or knows the issue, it would be greatly appreciated. I'm also open to better ways of running the PHP script, perhaps as a single line in crontab rather than using a shell script to run the PHP script via cron.
So it seems this was the answer to my question:
#!/bin/sh
cd /var/www/html/streaming/db
if ps ax | grep -v grep | grep get_tweets.php ; then
    exit 0
else
    nohup php ./get_tweets.php > /dev/null &
    exit 0
fi
It would appear that, even running as root, get_tweets.php would not execute without first changing to its directory before actually running the command, as per the second line:
cd /var/www/html/streaming/db
Although I'm not entirely sure why it wouldn't run, as parse_tweets_keyword.php ran perfectly using the first bash script posted in my question.
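For anyone who prefers the single-crontab-line route mentioned in the question, one sketch (assuming flock from util-linux is installed) keeps the cd that turned out to matter and lets a lock file, rather than a grep through ps, guarantee that only one copy runs:

* * * * * cd /var/www/html/streaming/db && flock -n /tmp/get_tweets.lock php ./get_tweets.php > /dev/null 2>&1

flock -n holds /tmp/get_tweets.lock for as long as the PHP process runs and exits immediately if a previous run still holds it, so no separate shell script or process check is needed.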
My issue seems to have been asked before, but hold on, this one is a bit different.
I have 2 PHP files, and I run the following commands:
nohup php file_1.php >/dev/null 2>&1 &
nohup php file_2.php >/dev/null 2>&1 &
This basically creates 2 PHP processes.
My problem is that sometimes one of the files, or even both of them, gets killed by the server for an unknown reason and I have to re-enter the commands all over again. I tried 'forever' but it doesn't help.
If the server is rebooted I will have to enter those 2 commands again. I thought about a cron job, but I'm not sure whether it would launch them twice, which would create more confusion.
My question is: how do I start the files automatically if one or both of them get killed? What is the best way to achieve this, one that would check that file_1.php or file_2.php specifically is indeed running?
There are a couple of ways you can do this. As #Chris Haas mentioned in the comments, supervisord can do this for you, or you can run a watchdog shell script from cron at regular intervals that checks to see if your script is running, and if not, start it. Here's one I use.
#!/bin/bash
# grab the PID of the running "php queue_agent.php --env prod" process, ignoring the grep itself
FIND_PROC=`ps -ef | grep "php queue_agent.php --env prod" | awk '{if ($8 !~ /grep/) print $2}'`
# if FIND_PROC is empty, the process has died; restart it
if [ -z "${FIND_PROC}" ]; then
    DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
    echo queue_agent.php failed at `date`
    cd ${DIR}
    nohup nice -n 10 php queue_agent.php --env prod -v > ../../sandbox/logs/queue_agent.log &
fi
exit 0
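If this is saved as, say, /usr/local/bin/queue_agent_watchdog.sh (the name and path are only examples), a crontab entry to run it every minute would be:

* * * * * /bin/bash /usr/local/bin/queue_agent_watchdog.sh > /dev/null 2>&1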
I think you can try to figure out why these two PHP scripts shut down as a first step. To do that, you can use this PHP function:
void register_shutdown_function ( callable $callback [, mixed $parameter [, mixed $... ]] )
which registers a callback to be executed after script execution finishes or exit() is called. So you can log some info when the PHP files shut down, like this:
function myLogFunction() {
    // log some info here
}
register_shutdown_function('myLogFunction');
Instead of sending the standard output and error output to /dev/null, you can put them into a log file (since we might get some helpful info from the output). So instead of using:
nohup php file_1.php >/dev/null 2>&1 &
nohup php file_2.php >/dev/null 2>&1 &
try:
nohup php file_1.php > file_1.log 2>&1 &
nohup php file_2.php > file_2.log 2>&1 &
If you want these two PHP files to run automatically when the server boots, try editing /etc/rc.local (which runs automatically when the OS starts up). Add your PHP CLI command lines to this file.
If you can't figure out why the PHP processes get shut down, try supervisor as #Chris Haas mentioned.
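For reference, a minimal supervisor program section for one of the scripts might look roughly like this (a sketch; the paths and program name are assumptions):

[program:file_1]
command=/usr/bin/php /path/to/file_1.php
autostart=true
autorestart=true
redirect_stderr=true
stdout_logfile=/var/log/file_1.log

With autorestart=true supervisord restarts the process whenever it dies, which covers both the random kills and the reboot case (provided supervisord itself is started at boot).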
I need to execute a PHP script once a day.
I followed these steps to make the script run.
crontab -e
30 17 * * * /usr/bin/php /var/www/html/file.php >> /var/log/1.log >2&1
It's not working. There is nothing inside 1.log (1.log is the file which I have created).
I checked the log file using
grep CRON /var/log/syslog
CMD redirected to /usr/bin/php /var/www/html/file.php >> /var/log/1.log >2&1
MTA was not installed, discarded the output.
But I do not want mail to be sent, so I do not want to install an MTA. I just want the cron job to run.
Any help would be appreciated
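One thing worth checking: the >2&1 at the end of the crontab line is not the 2>&1 stderr redirection it is presumably meant to be. With the conventional spelling, both streams are appended to the log and cron has nothing left to mail, so no MTA is needed:

30 17 * * * /usr/bin/php /var/www/html/file.php >> /var/log/1.log 2>&1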
I have a Beanstalk MQ where I put tasks to create APKs, and a consumer named AppBuilder.php, which reads messages from the Beanstalk MQ and then execs the command that creates the app (an Android app).
AppBuilder.php is run from the crontab. The process is:
Crontab runs a health-check.sh shell script
health-check.sh runs AppBuilder.php in background
AppBuilder.php calls exec to create the process
Following is the relevant code snippet(s) from each file:
Root crontab is like so:
* * * * * /opt/cron/health-check/health-check.sh
health-check.sh is like this:
#!/bin/bash
PATH=$PATH:/sbin/

# HEALTH CHECK FOR AppBuilder Daemon
if (( $(ps -ef | grep "[A]ppBuilder" | wc -l) > 0 ))
then
    echo "AppBuilder is Running"
else
    echo "AppBuilder is Stopped, attempting to restart"
    $PHP_CMD /opt/appbuilder/AppBuilder.php &
    if pgrep "AppBuilder" > /dev/null
    then
        echo "AppBuilder is now Running"
    else
        echo "AppBuilder is still not Running"
    fi
fi
AppBuilder.php has the following exec command:
exec('sudo sh /var/www/cgi-bin/appbuilder/oneClickApk.sh &', $output, $resultCode);
If I run the AppBuilder.php directly, things work fine. However, from cron, it does not.
I've followed this SO Post, and modified the exec command to the following:
exec('/usr/bin/nohup /usr/bin/sudo /usr/bin/env TERM=xterm /bin/sh /var/www/cgi-bin/appbuilder/oneClickApk.sh &', $output, $resultCode);
However, things still fail. Any clues as to where this may be going wrong? I've spent a lot of time digging through the forums, none of which helped. Please help!
EDIT 1:
The crontab runs and AppBuilder.php gets initialized, but after the exec command I cannot see oneClickApk.sh in the process list.
EDIT 2:
I changed the crontab from root to ec2-user, as suggested in the comments: still the process does not run.
I would just follow your first approach, as below. But to fix the issue you have to check the following:
Check whether your crontab entry is set up properly.
In the crontab, provide the full path to your executable.
Check whether the crontab user account is actually allowed to run sudo (see the sudoers sketch below).
Check whether the crontab account has the proper permissions to execute your PHP application as well as your shell script. If not, add the account to the proper group so that it can run both the PHP and the shell script.
exec('sudo sh /var/www/cgi-bin/appbuilder/oneClickApk.sh &', $output, $resultCode);
Using the above command seems OK.
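On the sudo point from the checklist above: a sketch of sudoers entries (edited with visudo) that let the cron user run the build script non-interactively; the user name and script path are taken from the question and may differ on your box:

ec2-user ALL=(ALL) NOPASSWD: /bin/sh /var/www/cgi-bin/appbuilder/oneClickApk.sh
Defaults:ec2-user !requiretty

Some distributions (older Amazon Linux and RHEL, for example) enable requiretty by default, which makes sudo fail from cron because no terminal is attached.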
In your shell script, put a wait loop and check whether it is working or not. First, just add the lines below on their own, without any if condition. If that works, then you need to check your if conditions and why they are not satisfied as you expect.
$PHP_CMD /opt/appbuilder/AppBuilder.php &
while (( $(ps -ef | grep "[h]ealth-check.sh" | wc -l) <= 0 ))
do
    sleep 5
done
sleep 30
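One concrete consequence of the "provide the full path" point above: cron runs with a minimal environment, so $PHP_CMD and anything normally found via PATH should be set explicitly at the top of health-check.sh. A sketch (the php location is an assumption; check it with which php from a shell):

#!/bin/bash
# cron's PATH is minimal, so set it explicitly and use full paths
PATH=/usr/local/bin:/usr/bin:/bin:/sbin
PHP_CMD=/usr/bin/php    # assumed location of the php binary

"$PHP_CMD" /opt/appbuilder/AppBuilder.php &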
I need to make a background script that is spawned by a PHP command line script and that echoes to the SSH session. Essentially, I need to do this Linux command:
script/path 2>&1 &
If I just run this command in linux, it works great. Output is still displayed to the screen, but I can still use the same session for other commands. However, when I do this in PHP, it doesn't work the same way.
I've tried:
`script/path 2>&1 &`;
exec("script/path 2>&1 &");
system("script/path 2>&1 &")
...And none of these work. I need it to spawn the process, and then kill itself so that I can free up the session, but I still want the output from the child process to print to the screen.
(please comment if this is unclear... I had a hard time putting this into words :P)
I came up with a solution that works in this case.
I created a wrapper bash script that starts up the PHP script, which in turn spawns a child script that has its output redirected to a file, which the bash script wrapper tails.
Here is the bash script I came up with:
#!/bin/bash
# run the PHP script, passing any arguments through
php php_script.php "$@"
# kill any leftover tail processes from previous runs
ps -ef | grep php_script.log | grep -v grep | awk '{print $2}' | xargs kill > /dev/null 2>&1
if [ "$1" != "-stop" ]
then
    tail -f php_script.log -n 0 &
fi
(it also cleans up "tail" processes that are still running so that you don't get a gazillion processes when you run this bash script multiple times)
And then in your PHP script, you call the external child script like this:
exec("php php_script.php >> php_script.log &");
This way the parent PHP script exits without killing the child script, you still get the output from the child script, and your command prompt is still available for other commands.
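A usage sketch, assuming the wrapper above is saved as wrapper.sh and made executable:

./wrapper.sh            # runs php_script.php, then tails php_script.log in the background
./wrapper.sh -stop      # passes -stop through to the PHP script and skips starting a new tail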
I have a bash program that picks data from a file and delivers that data (if it meets a threshold) to another file.
It is a PHP script inside the bash script, which is named smaoutput-analyse.sh.
When executed from the shell it works perfectly.
When executed as a cron job as root it is executed correctly, but there is no output.
Here is the output from grep -i cron /var/log/syslog
Aug 14 16:06:01 raspberrypi CRON[6705]: (root) CMD (/home/pi/scripts/SBFspot.sh > /home/pi/test/smaoutput.txt 2>&1 )
Aug 14 16:06:01 raspberrypi CRON[6706]: (root) CMD (/home/pi/test/smaoutput-analyse.sh > /dev/null 2>&1)
The information is (as mentioned before) correctly added when running from the shell.
#!/usr/bin/php
<?php
echo " Program to read smaoutput.txt", PHP_EOL;
// etc etc
if (!file_put_contents("sma_saved_data.txt", $sma_saved_data_string, FILE_APPEND)) {
    // failure
    echo "error opening the file sma_saved_data.txt for writing", PHP_EOL;
}
// etc etc
?>
Here are the crontab lines:
# Every minute result of SMA
*/1 8-22 * * * /home/pi/scripts/SBFspot.sh > /home/pi/test/smaoutput.txt 2>&1
# afterwards read and save in file
*/1 10-20 * * * /home/pi/test/smaoutput-analyse.sh > /dev/null 2>&1
I think I have set the file permissions correctly: +rw for the files and +rwx for the bash script.
What did I miss?
You should check the following issues:
Are all the environment variables the same? Call printenv from bash and create a cron job */1 8-22 * * * printenv > /tmp/printenv.txt --> Compare the output in /tmp/printenv.txt with printenv from bash.
Are you executing with the same user and the same permissions? Execute echo "$USER" and create a cron job */1 8-22 * * * echo "$USER" > /tmp/user.txt --> Compare the output in /tmp/user.txt with echo "$USER" from bash.
Check the path from which you are executing the script. Call pwd from bash and create a cron job */1 8-22 * * * pwd > /tmp/pwd.txt --> Compare the output in /tmp/pwd.txt with the output of pwd from bash.
Really interesting.
This was my first post and I received quite a swift response.
Thank you!
The solution was simple: provide the full path to where you want to keep the file.
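In crontab terms, the same fix can also be expressed by changing to the working directory in the entry itself (a sketch reusing the cd idea from the first question above), so relative file names inside the PHP script keep working:

*/1 10-20 * * * cd /home/pi/test && ./smaoutput-analyse.sh > /dev/null 2>&1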
Remark: To be sure that everything will be executed I always put cron jobs in a root crontab and not in a user crontab.
Maybe that is "smart thinking" but not so smart acting.
I would appreciate some comments on this root cron job idea.
The post about executing printenv, echo "$USER" and pwd both from bash and from cron was interesting.
The printenv from bash gives a lot of information, among which SHELL=/bin/bash, SSH_CLIENT, SSH_TTY, MAIL, PATH, SSH_CONNECTION and lots more starting with LS_COLORS; the printenv from cron is just 6 lines: HOME=/root, LOGNAME=root, PATH=/usr/bin, LANG=en_GB.UTF-8, SHELL=/bin/sh and PWD=/root.
The echo "$USER" from bash gives pi, whilst from cron it gives a blank file.
The pwd from bash gives /home/pi/test and from cron /root
These results are understandable.
Can I learn from it that I should create cronjobs as user pi and not as user root?