PHP script with crontab loop

I have a PHP script with a long execution time.
I've set
set_time_limit(0);
ini_set('max_execution_time', 36000);
and when I run it from the browser, the script completes fine.
But when I call it from crontab like
0 3 * * * wget -q -O - "http://mysite.it/script.php"
I can see that the script starts and finishes (I have a logger implemented), but the log shows that the script is continuously invoked, as if cron did not recognize the end of the script and started it again every time.
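If the repeated invocations come from wget retrying the request (see the "PHP script makes a duplicate" question below, where wget's default 900-second read timeout causes exactly this kind of restart), a sketch of the crontab entry with retries disabled and the timeout raised to match max_execution_time would be:
0 3 * * * wget -q -t 1 --timeout=36000 -O - "http://mysite.it/script.php"
Here -t 1 (--tries=1) limits wget to a single attempt, so a timeout can never re-invoke the script.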

Related

php cron timeout - run script in background

I have a PHP script, extract_data.php, and it takes 20 minutes to run.
I activated cron to run the script, but the cron job has a time limit of only 30 seconds (this limit cannot be increased).
The problem is that I always get the timeout error.
I would like it to display "file loading..." while the script is running.
I tested
exec("extract_data.php" . " > /dev/null &");
but it does not work.
It seems that your configuration has a default timeout.
Please take a look here: https://www.php.net/manual/en/function.set-time-limit.php
set_time_limit(0); // To run without timeout.
Or
set_time_limit(20*60); // to allow it to run for 20 minutes
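Note also that the exec() call as written fails because the .php file is not invoked through the PHP interpreter. A minimal sketch that backgrounds the job via the PHP CLI (the path and binary name are assumptions) would be:
exec('php /path/to/extract_data.php > /dev/null 2>&1 &');
echo 'file loading...'; // respond immediately while the job runs in the background
That way the page can print "file loading..." right away while the 20-minute job runs detached from the 30-second request limit.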

Activate Cron job from a PHP page

I googled but didn't find any solution.
I have a PHP page that takes 45 minutes to execute.
What I am trying to achieve is:
1. Whenever I run the URL abc.com/test.php, the script should check the cron job and activate it (run myscript.php).
2. It should keep executing the page until it completes.
So the PHP should call a cron job and execute it ONLY once when it is requested. Is cron the right approach?
I do not want the script below, which I tried; it adds a cron entry that runs every day at 9:30 am.
exec('echo -e "`crontab -l`\n30 9 * * * /path/to/script" | crontab -');
Why set up a new cron job if you only want to execute it once?
exec('php -f /path/to/script >> /dev/null 2>&1 &');
This will run the script, discard all the output, and use the trailing & to fork it into the background, so your request won't wait for it to return.
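Putting that answer together, a minimal sketch of test.php (the path to myscript.php is an assumption):
<?php
// test.php: fire myscript.php in the background, once per request.
// Output goes to /dev/null and the trailing & detaches the process,
// so this request returns immediately instead of waiting 45 minutes.
exec('php -f /path/to/myscript.php >> /dev/null 2>&1 &');
echo 'myscript.php started';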

Timeout issue when running background PHP script as web

I've got a PHP script that does some heavy lifting that I'm trying to fire off as a background script using the following code:
exec("script.php > /dev/null 2> /dev/null &");
What happens:
- When I run the code above as part of my web app, the script quits after about a minute.
- When I run the code as part of my web app without the final ampersand, the job runs fine, but exec() waits for the script to complete before loading the next page, defeating the purpose as the user stares at an unresponsive page.
- When I run the shell command script.php > /dev/null 2> /dev/null & as myself from the console with the final ampersand, the job runs fine.
- When I run the shell command from the console as web, the job stops running after about a minute.
I've tried piping my output to logfiles: script.php > /home/public/tmp/output.txt 2> /home/public/tmp/errors.txt &. Output looks normal; I don't get any errors. The script just stops working.
The basic rule seems to be: if run as a foreground process as web or as me, it'll complete. If run as a background process as web, it stops working after about a minute.
I'm not running this as a cron job because my host (NearlyFreeSpeech) can only run cron jobs once an hour, which is longer than I want to make users wait when the job only takes a couple of minutes; it might as well fire when users initiate it.
The subscript starts with set_time_limit(60 * 60 * 4); so this shouldn't be a matter of PHP timing out.
set_time_limit() does not include shell-execution time.
http://php.net/manual/en/function.set-time-limit.php
Try using one of the code examples in the comments on that page.
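A common workaround for background children of the web user being killed after about a minute, not part of the original answer and hedged accordingly, is to detach the child completely with nohup:
exec('nohup php /home/public/script.php > /dev/null 2>&1 &');
nohup makes the process ignore the hangup signal sent when the parent's session ends, which matches the "dies only as a background process under web" symptom. The path is an assumption based on the logfile paths above.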

PHP script makes a duplicate

I have a long-running PHP script with set_time_limit(0) set. It works very well for 15 minutes (900 seconds), but then something very strange happens: a second process with the same parameters starts! I can see it because I start a new log file at the beginning of the script, and there are two log files processing the same data!
By the way, the script runs in the background from PHP with
exec('wget http://example.com/script.php?id=NNN > /dev/null &');
This instruction normally runs only once, and I cannot figure out what runs it a second time after exactly 900 seconds.
This is because wget has a default read timeout of 900 seconds. Once it is reached, wget retries and the download restarts.
You can set the timeout higher with the --timeout=seconds or the --read-timeout=seconds argument.
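For example, keeping the wget approach but raising the read timeout and disabling retries (the 7200-second value is an arbitrary assumption; pick anything longer than the script's runtime):
exec('wget --tries=1 --read-timeout=7200 http://example.com/script.php?id=NNN > /dev/null &');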
Or, you can start it directly from the shell (this way is much better).
Here is a link: wget download options
Here is the shell code (for Linux):
exec('php yourscript.php > /dev/null 2>&1 &');

php running in background through shell

I created a cron job which executes every six hours; each time it runs, it creates a shell script that passes a different input to store.php.
* */6 * * * /usr/bin/php /var/www/execute.php > /dev/null &
execute.php creates the test.sh file with the following commands and then executes test.sh:
/usr/bin/php /var/www/store.php 'x' > /dev/null &
/usr/bin/php /var/www/store.php 'y' > /dev/null &
It executes successfully and the logic is done, but when I check through the top command in the shell
$ top
I see all the PHP processes running continuously, even after the execution is completed.
I want it to run the PHP code only once, but it is still running in the background. How should I stop it, and how do I make it run only once? I even added exit at the end of the PHP file (store.php).
If you want to end a script, you can use a return statement at the end of the file, or you can simply call exit(). However, the script SHOULD end itself once it gets to the last line. This suggests that the script is never getting to the end of the file. I would recommend putting in some echo statements at various parts of the file and making sure that it actually gets to the end of the file.
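One more thing worth checking, an observation beyond the original answer: the schedule * */6 * * * fires every minute during hours 0, 6, 12, and 18, because the minute field is still *. To really run once every six hours, the minute field must be pinned:
0 */6 * * * /usr/bin/php /var/www/execute.php > /dev/null 2>&1
Running execute.php every minute of those hours would, on its own, explain seeing many store.php processes stacked up in top.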
