One command - two cron jobs - PHP

I need to execute two PHP files with one command; the second file needs to run right after the first one finishes.
This is what I did, but I'm not sure if it's correct:
/usr/bin/wget -O /dev/null -o /dev/null https://example.com/scripts/cron.php; sleep 2; wget -q -O - https://example.com/cron2.php > /dev/null 2>&1
I added a sleep between the commands. Will it work?

You can use && for sequential execution: the second command runs only if the first one exits successfully. A plain ; also runs them one after the other, but it ignores whether the first succeeded, and the sleep only adds a fixed delay.
Check https://www.steveroot.co.uk/2010/07/05/cron-multiple/ and the question "What is the purpose of "&&" in a shell command?"
In your case you can try:
01 00 * * * /usr/bin/wget -O /dev/null -o /dev/null https://example.com/scripts/cron.php && wget -q -O - https://example.com/cron2.php > /dev/null 2>&1
Hope this helps.
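As a quick illustration of the difference, using true and false as stand-ins for the two wget calls (demonstration only):

```shell
#!/bin/sh
# ';' separates commands unconditionally; '&&' short-circuits: the
# second command runs only if the first exits with status 0.
out1=$(false; echo "ran anyway")              # ';' ignores the failure
out2=$(false && echo "never printed" || true) # '&&' skips after a failure
out3=$(true && echo "ran after success")      # '&&' proceeds on success
echo "semicolon: $out1"
echo "and-fail:  [$out2]"
echo "and-ok:    $out3"
```

Since wget exits non-zero when the request fails, && also gives you the bonus that cron2.php is skipped if cron.php could not be reached.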

Related

Cron Job with PHP and an increment variable argument

I would like to know if it is possible to execute a cron job every 30 seconds and increment an argument from one value (1, for example) to another (50000), like:
wget https://mon-url.com/file/cron.php?id=1 >/dev/null 2>&1
wget https://mon-url.com/file/cron.php?id=2 >/dev/null 2>&1
wget https://mon-url.com/file/cron.php?id=3 >/dev/null 2>&1
wget https://mon-url.com/file/cron.php?id=4 >/dev/null 2>&1
....
wget https://mon-url.com/file/cron.php?id=50000 >/dev/null 2>&1
Is there any command to do that programmatically?
Thanks.
As suggested before, I'd rather go with a bash script like this (or similar):
#!/bin/bash
i=1
while [ $i -le 5 ]
do
    wget "https://mon-url.com/file/cron.php?id=$i" >/dev/null 2>&1
    i=$(($i + 1))
    sleep 30
done
Regards.
PS: change the 5 after -le to whatever you need.
Here is what you want:
$x = 1;      // your minimum value
$y = 50000;  // your maximum value
while ($x <= $y) {
    echo "wget https://mon-url.com/file/cron.php?id=$x >/dev/null 2>&1\n";
    $x++;
}
You can use this in your script to generate the cron job lines. Good luck!
Perhaps you can just export your counter, incrementing it by one each time:
COUNTER=0
*/30 * * * * wget https://mon-url.com/file/cron.php?id="$COUNTER" >/dev/null 2>&1 && export COUNTER=$((COUNTER+1))
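Note, though, that an environment variable exported by one cron run is not visible to the next, because cron starts each job in a fresh environment. A small state file is the usual workaround; this is a sketch (the path /var/tmp/cron_counter is an assumption):

```shell
#!/bin/sh
# Persist the counter across cron runs in a state file, since exported
# variables do not survive between separate cron invocations.
STATE=/var/tmp/cron_counter                    # assumed location
COUNTER=$(cat "$STATE" 2>/dev/null || echo 1)  # start at 1 on first run
wget "https://mon-url.com/file/cron.php?id=$COUNTER" >/dev/null 2>&1 || true
echo $((COUNTER + 1)) > "$STATE"               # next run sees the new value
```

Point the crontab entry at this script instead of calling wget directly.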

Cron Job Scheduled did not run on specified time

I scheduled a cron job to hit a page on the 1st of every month at 12:00 AM, but the cron didn't run for some reason.
Below is the cron entry I used:
0 0 1 * * /usr/bin/php /var/www/html/cronleave.php >/dev/null 2>&1
Any help will be appreciated.
Try using wget instead:
0 0 1 * * wget -O /dev/null -o /dev/null http://www.domain.com/cronleave.php >/dev/null 2>&1
and check the server's time.
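When a job never seems to fire, it's worth checking what time the machine itself thinks it is, and whether cron attempted the job at all. A quick check (the syslog path is the Debian/Ubuntu default and may differ on your distro):

```shell
#!/bin/sh
# Show the server's current time and timezone; a cron entry for 0 0
# fires at midnight *server* time, not necessarily your local time.
date
# See whether the cron daemon logged an attempt (log path varies by distro).
grep CRON /var/log/syslog 2>/dev/null | tail -n 5
```

If an attempt is logged but the page isn't hit, the problem is in the command; if nothing is logged, the schedule or the cron daemon itself is the issue.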

Getting a process to fork and be independent of parent in Apache + PHP + Bash

I have a bash script that calls liquidsoap like so
/bin/sh -c "echo \$\$ > \"${sdir}/pid/${sfile}.pid\" && exec liquidsoap \"${sdir}/liq/${sfile}.liq\" >/dev/null 2>&1 || rm \"${sdir}/pid/${sfile}.pid\"" &
(For readability, it might look like this with variables filled in)
/bin/sh -c "echo \$\$ > \"/radio/pid/station.pid\" && exec liquidsoap \"/radio/liq/station.liq\" >/dev/null 2>&1 || rm \"/radio/pid/station.pid\"" &
In PHP, the script is called with
return shell_exec("{$this->streamBase}/scripts/{$this->streamName} start config {$stationConfig}");
My problem is, I just had to restart Apache, and when I did, it also killed the liquidsoap instances. I would like them to run fully independently of Apache, so that I could restart Apache and they would keep running.
I'm not sure how I can achieve that.
EDIT:
I've tried changing
/bin/sh -c "echo \$\$ > \"${sdir}/pid/${sfile}.pid\" && exec liquidsoap \"${sdir}/liq/${sfile}.liq\" >/dev/null 2>&1 || rm \"${sdir}/pid/${sfile}.pid\"" &
to
(/bin/sh -c "echo \$\$ > \"${sdir}/pid/${sfile}.pid\" && exec liquidsoap \"${sdir}/liq/${sfile}.liq\" >/dev/null 2>&1 || rm \"${sdir}/pid/${sfile}.pid\"" & ) &
and
nohup /bin/sh -c "echo \$\$ > \"${sdir}/pid/${sfile}.pid\" && exec liquidsoap \"${sdir}/liq/${sfile}.liq\" >/dev/null 2>&1 || rm \"${sdir}/pid/${sfile}.pid\"" &
Neither keeps liquidsoap running if I restart (or stop/start) Apache. When Apache stops, so do those processes.
For an exit code to be propagated up the chain, the parents and grandparents must exist; if you kill the grandparent (i.e. Apache), then yes, you kill the children and grandchildren too, unless they leave the family and become daemons.
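Building on that, one way to let the children "leave the family" is to start them in a new session with setsid and detach all three standard streams. A sketch against the paths from the question (adjust as needed; requires util-linux setsid):

```shell
#!/bin/sh
# Start liquidsoap in its own session so it no longer belongs to
# Apache's process group; redirecting stdin/stdout/stderr completes the
# detachment. Paths are the example values from the question.
sdir=/radio
sfile=station
setsid /bin/sh -c "echo \$\$ > \"${sdir}/pid/${sfile}.pid\" && exec liquidsoap \"${sdir}/liq/${sfile}.liq\"" \
    < /dev/null > /dev/null 2>&1 &
```

Because the process is in a new session, signals sent to Apache's process group at restart can no longer reach it.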

PHP: can not exec a script in the background

I have a main PHP script on a web server (nginx + php-fpm) and I'm trying to run another PHP script in the background via a GET request from the browser. The line in main.php that calls detect.php:
exec("/usr/bin/php -f /var/www/detect.php 6 > /dev/null 2>&1 &");
detect.php does not start, and I get no errors.
If I remove the "&":
exec("/usr/bin/php -f /var/www/detect.php 6 > /dev/null 2>&1 ");
detect.php starts successfully.
From a bash shell, with "&":
sudo -u www-data /usr/bin/php -f /var/www/detect.php 6 > /dev/null 2>&1 &
Script detect.php starts successfully.
Try this, and make sure your PHP path is correct:
$dir = dirname(__FILE__);
$file_name = "detect.php";
$php_path = "/usr/bin/php";
$args = array(6);
$command1 = "cd $dir";
$command2 = "$php_path $file_name " . implode(" ", $args) . " > /dev/null 2>/dev/null &";
$final_command = $command1 . "; " . $command2;
shell_exec($final_command);
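Another thing worth trying: under php-fpm, the worker's file descriptors are closed when the request ends, which can take a backgrounded child down with it. Redirecting stdin as well, and shielding the child with nohup, is often what makes the difference; a sketch of the command string to pass to exec() (not necessarily the only fix):

```shell
#!/bin/sh
# Redirect stdin in addition to stdout/stderr, and use nohup, so the
# child no longer depends on the php-fpm worker's descriptors or signals.
nohup /usr/bin/php -f /var/www/detect.php 6 < /dev/null > /dev/null 2>&1 &
```

This mirrors what an interactive bash session gives you for free, which is why the same command already works from the console.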

Using wget with shell_exec and at command?

I've been struggling with the shell_exec PHP function and the Linux at command for two days.
To make it short, this works:
shell_exec('/usr/bin/at 09:32 <<EOF
touch /var/www/website/hello.txt
EOF'
);
but this doesn't:
shell_exec('/usr/bin/at 09:32 <<EOF
wget -O - -q -t 1 "http://192.168.56.101/website/test.php?param=hello" >/dev/null 2>&1
EOF'
);
Why?
(Note: the code above does work in the console.)
Thanks in advance.
OK, I've got it at last!
For those who are interested, the problem is that the wget command also needs to be invoked with its full path (i.e. /usr/bin/wget).
What misled me is that the touch command doesn't need it. It's odd, but anyway, here's the working code:
shell_exec('/usr/bin/at 09:32 <<EOF
/usr/bin/wget -O - -q -t 1 "http://192.168.56.101/website/test.php?param=hello" >/dev/null 2>&1
EOF'
);
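The likely reason behind this: the PHP process (and therefore the shell feeding at) usually runs with a much smaller PATH than an interactive console, so wget isn't found while /bin/touch sits on every default PATH. Rather than hard-coding the path, you can resolve it up front (a sketch; command -v is POSIX, and the /usr/bin/wget fallback is an assumption):

```shell
#!/bin/sh
# Resolve wget's absolute path once, then embed that in the at job, so
# the job no longer depends on PATH at execution time.
WGET=$(command -v wget || echo /usr/bin/wget)  # fallback is an assumption
echo "use this in the heredoc: $WGET"
```

The same trick applies to any command that works in your console but not from shell_exec.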
