Running cron job from browser - php

I have several cron jobs that run automatically, and I've been asked to add a 'run now' button in the browser... Is this possible? Some of these cron jobs need to be executed from the command line, as they take around 15 minutes... Is it possible to execute them from the browser, not as a normal PHP function, but by somehow triggering an external PHP script from the browser?

You're looking for the exec() function.
If it's a 15-minute task, you have to redirect its output and execute it in the background. Normally, exec() waits for the command to finish.
Example: exec("somecommand > /dev/null 2>/dev/null &");
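For the 'run now' button, a minimal sketch of the endpoint (the job path is hypothetical; adjust it to wherever your cron scripts live):
<?php
// run_now.php - triggers a long-running cron script from the browser.
$job = '/path/to/job.php'; // hypothetical path to the existing cron script
// Redirect stdout and stderr and append & so exec() returns immediately
// instead of blocking for the ~15 minutes the job takes.
exec('php ' . escapeshellarg($job) . ' > /dev/null 2>&1 &');
echo 'Job started in the background.';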

Related

Are multiple PHP scripts executed one at a time or simultaneously?

When executing multiple scripts from PHP using the exec command, is each script run one at a time, one after the other, or do they run simultaneously?
exec('/usr/bin/php -q process-duplicates.php');
exec('/usr/bin/php -q process-images.php');
exec('/usr/bin/php -q process-sitemaps.php');
Just want to make sure they run one after the other before attempting to rewrite my crontabs.
Sure. The only way to run in the background is to add & to the command line, which puts the exec()'d process into the background:
exec("php test.php &");
So you are right, they run one after the other.
NOTE: In your case you shouldn't use &, as it would force all the scripts to run simultaneously.
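A quick way to verify the sequential behavior (a minimal sketch, using sleep as a stand-in for the real scripts):
<?php
// Without a trailing &, each exec() blocks until its command exits.
$start = microtime(true);
exec('sleep 2'); // blocks for ~2 seconds
exec('sleep 2'); // blocks for another ~2 seconds
printf("Elapsed: %.1f s\n", microtime(true) - $start); // ~4.0, not ~2.0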
exec() waits for the script to return; see php.net:
If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
But as a devops person, please, please, please do not run your cron jobs like this! Create crontab entries for each, or put them in a shell script and have cron run the script.
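A sketch of that recommendation (paths and schedule are hypothetical): wrap the three commands in one shell script so they still run strictly in sequence, then give cron a single entry such as 30 2 * * * /path/to/process-all.sh:
#!/bin/sh
# process-all.sh - runs the three processing scripts one after the other
/usr/bin/php -q /path/to/process-duplicates.php
/usr/bin/php -q /path/to/process-images.php
/usr/bin/php -q /path/to/process-sitemaps.php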

Activate Cron job from a PHP page

I googled but didn't find any solution.
I have a PHP page that takes 45 minutes to execute.
What I am trying to achieve is:
1. Whenever I run a URL like abc.com/test.php, the script should trigger the cron job (run myscript.php).
2. The triggered script should then run until it's complete.
So the PHP should call the cron job and execute it ONLY once, when it is requested. Is cron the right approach?
I do not want the approach below, which I tried; it adds a cron entry that runs every day at 9:30 am:
exec('echo -e "`crontab -l`\n30 9 * * * /path/to/script" | crontab -');
Why set up a new cron job if you only want to execute it once?
exec('php -f /path/to/script >> /dev/null 2>&1 &');
This will run the script, discard all its output, and background it with &, so it runs detached and your request won't wait for it to return.
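A minimal sketch of the test.php side under this approach (the script path is hypothetical):
<?php
// test.php - starts myscript.php once, in the background, per request.
// Discarding the output lets exec() return immediately.
exec('php -f /path/to/myscript.php > /dev/null 2>&1 &');
echo 'myscript.php is now running in the background.';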

Timeout issue when running background PHP script as web

I've got a PHP script that does some heavy lifting that I'm trying to fire off as a background script using the following code:
exec("script.php > /dev/null 2> /dev/null &");
What happens:
When I run the code above as part of my web app, the script quits after about a minute.
When I run the code as part of my web app without the final ampersand, the job runs fine, but exec() waits for the script to complete before loading the next page, defeating the purpose as the user stares at an unresponsive page.
When I run the shell command script.php > /dev/null 2> /dev/null & as myself from the console with the final ampersand, the job runs fine.
When I run the shell command from the console as web, the job stops running after about a minute.
I've tried redirecting my output to log files: script.php > /home/public/tmp/output.txt 2> /home/public/tmp/errors.txt &. The output looks normal and I don't get any errors; the script just stops working.
The basic rule seems to be: if it runs as a foreground process, as web or as me, it'll complete; if it runs as a background process as web, it stops working after about a minute.
I'm not running this as a cron job because my host (NearlyFreeSpeech) can only run cron jobs once an hour, which is longer than I want to make users wait when the job only takes a couple of minutes; it might as well fire when users initiate it.
The child script starts with set_time_limit(60 * 60 * 4);, so this shouldn't be a matter of PHP timing out.
set_time_limit() does not include shell-execution time: http://php.net/manual/en/function.set-time-limit.php
Try one of the code examples in the comments on that page.
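One pattern worth trying (an assumption here, since the exact reason the host kills the process is unknown) is detaching the child from the web request entirely with nohup:
<?php
// nohup makes the child ignore the hangup signal sent when the parent
// request's process group is cleaned up, which may be what is killing
// the backgrounded script after about a minute.
exec('nohup php script.php > /dev/null 2>&1 &');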

Executing multiple Linux commands using && in the background from a PHP script

OK, so I'm running Ubuntu with LAMPP. I want to execute a command in the background that will take at least 10-15 minutes, and I want to launch it from a PHP script in the web interface, not from the CLI.
First Instance:
When I do this, the PHP script successfully runs the command in the background:
command1 &> /dev/null &
Second Instance:
But when I do this:
command1 &> /dev/null && command2 &
then the command does not run in the background; the PHP script pauses until the command completes. I want command2 to execute after command1 has completed, so that I (or my PHP script) can know command1 has finished, but it should still run in the background or my PHP script won't return on time.
When I run the second form from the command line it runs in the background, but when I run it from the PHP script it does not.
I am using PHP's exec function to run these commands.
<?php
exec('command1 &> /dev/null && command2 &',$out,$ret);
?>
Any help would be appreciated.
Try this:
<?php
exec('(command1 && command2) > /dev/null 2>&1 &', $out, $ret);
What it does is launch your commands in a subshell, so that command1 runs first and command2 only runs after command1 completes successfully; it then redirects all the output to /dev/null and runs the whole thing in the background.
If you want to run command2 regardless of the exit code of command1 use ; instead of &&.
The computer is doing what you're instructing it to do. The && list operator tells the shell to only run the second command if the first command completes successfully. Instead, you should run two independent commands; if there really is a dependency between the two commands, then you may need to reconsider your approach.
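If the dependency is really just "let me know when command1 has finished", one alternative sketch (the marker path is hypothetical) is to have the backgrounded subshell drop a flag file on success:
<?php
// Background the whole subshell; touch a marker once command1 succeeds.
exec('(command1 && touch /tmp/command1.done && command2) > /dev/null 2>&1 &');
// Later (e.g. on a subsequent request), check for the marker:
if (file_exists('/tmp/command1.done')) {
    echo 'command1 has completed.';
}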

PHP script makes a duplicate

I have a long-running PHP script with set_time_limit(0) set. It works fine for 15 minutes (900 seconds), but then something very strange happens: a second process with the same parameters starts! I can see this because I start a new log file at the beginning of the script, and there are two log files processing the same data!
BTW, the script is launched in the background from PHP with
exec('wget http://example.com/script.php?id=NNN > /dev/null &');
This instruction normally runs only once, and I cannot work out what runs it a second time after exactly 900 seconds.
This is because wget has a default read timeout of 900 seconds. When it is reached, wget retries the download from the start.
You can raise the limit with the --timeout=seconds or --read-timeout=seconds argument.
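For example, keeping the original wget launcher but raising the timeout and disabling retries so a slow script can't trigger a second request (the values are illustrative):
<?php
// --timeout=3600 raises wget's timeouts (including read) to an hour;
// --tries=1 prevents wget from re-requesting the URL if it still gives up.
exec('wget --timeout=3600 --tries=1 -q -O /dev/null "http://example.com/script.php?id=NNN" > /dev/null 2>&1 &');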
Or, you can start the script directly from the shell instead (this way is much better). Here is a link: wget download options
Here is the shell code (for Linux):
exec('php yourscript.php > /dev/null 2>&1 &');
