I'm using https://cron-job.org/ for a cron job, but I have a problem. When I run the script manually (for example, a script that gets data from a CSV file), it works, but when I run it through the cron job, it fails, I guess because of the 30-second max_execution_time.
But the problem is that in my script I'm already using:
ini_set('max_execution_time', 300);
It should be 5 minutes instead of 30 seconds before the cron job fails. What am I doing wrong?
Here is an image with that cron job history:
cron-job.org has some limits:
How and how long does cron-job.org visit my URLs?
cron-job.org visits your URLs at the configured dates/intervals and waits for the URL/script to finish execution. If your URL/script does not finish after 30 seconds, it will timeout and we will close the connection to prevent delays in the execution of the jobs of other users. Our system reads up to 1024 bytes of the output of your URLs/scripts. In case your script sends more data, job execution will be aborted. (Please also see the next question.)
You can read more here: FAQ.
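Since the 30-second limit is enforced on cron-job.org's side (they close the connection), raising PHP's max_execution_time cannot help. A common workaround is to have the URL that cron-job.org hits do nothing but launch the real work as a detached background process and return immediately. A minimal sketch of the pattern, with `sleep 5` standing in for the long-running CSV import (in real use you would run something like `php import_csv.php`, a hypothetical script name):

```shell
# The endpoint only *starts* the job and returns at once; the detached
# process keeps running after the HTTP request has finished.
# `sleep 5` is a stand-in for the real command, e.g. `php import_csv.php`.
nohup sleep 5 > /tmp/csv_import.log 2>&1 &
echo "import started"
```

cron-job.org then sees a fast, small response, while the actual import runs as long as it needs.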
Related
I have a PHP script, extract_data.php, and it takes 20 minutes to run.
I activated cron to run the script, but the cron has a time limit of only 30 seconds (this time cannot be increased).
The problem is that I always get the timeout error.
I would like the page to display "file loading..." while the script is running.
I tried exec("extract_data.php"." > /dev/null &"); but it does not work.
It seems that you have a default configuration for the timeout.
Please take a look here: https://www.php.net/manual/en/function.set-time-limit.php
set_time_limit(0); // run with no time limit
Or
set_time_limit(20*60); // allow the script to run for up to 20 minutes
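Note that set_time_limit() only helps if PHP itself is imposing the limit. Since the 30-second cap here comes from the external cron service, another route is to schedule the 20-minute job from a machine you control, where the PHP CLI has no execution limit by default. A hypothetical crontab entry (the paths and schedule are assumptions):

```shell
# Run the long import daily at 02:00 with the PHP CLI and append all
# output to a log; the CLI defaults to max_execution_time = 0.
0 2 * * * php /path/to/extract_data.php >> /var/log/extract_data.log 2>&1
```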
PHP has a function called set_time_limit() (and a config option max_execution_time) which lets you specify a time limit after which the script exits. However, this does not take into account time spent on external calls, which means that if (for example) the PHP script makes a database call that does something dumb and runs for hours, the time limit will not trigger.
If I want to guarantee that my PHP script's execution time will not exceed a specific time, is there some way to do this? I'm running the script directly in PHP (rather than via apache or similar), so I'm considering writing a bash script to monitor ps and kill it after a certain time. It occurs to me that I can't be the first person with this problem though, so is there some already-built solution available for this?
Note: This question is NOT about the PHP error from exceeding its max execution time. This is about the case where PHP exceeds a time limit without triggering that error. This question is NOT just about limiting the time of a PHP script; it is about limiting the time allowed to a PHP script that makes external calls.
When running PHP from the command line the default setting "max_execution_time" is 0.
You can try something similar to solve your problem.
$cmd = "(sleep 10 && kill -9 " . getmypid() . ") > /dev/null &";
exec($cmd); // issue a shell command that force-kills this process after 10 seconds

$i = 0;
while (1) {
    echo $i++ . "\n";
    sleep(1); // simulating a query
}
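As for an already-built solution: if GNU coreutils is available, `timeout` gives the same guarantee without the self-kill trick. It enforces a wall-clock limit on the whole process, no matter where the script spends its time (database queries included), and exits with status 124 when the limit was hit:

```shell
# `timeout` kills the command after the given wall-clock duration;
# `sleep 5` stands in for the PHP script being limited.
timeout 2 sleep 5 || echo "timed out with exit status $?"
```

To cap a script at five minutes you would run something like `timeout 300 php my_script.php` (script name assumed).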
I have a rails app which initiates a PHP script using:
php -f aa.php
The PHP script has at the beginning:
set_time_limit(60); // 1 minute
ini_set('max_execution_time',60);
But the php script does not die even after two minutes.
I verified using ps -ef | grep php and the process is still running.
Does it not die because the parent (the rails app) is alive?
I have verified that safe_mode is off for PHP.
Okay, see this note from php.net: the script won't stop because of set_time_limit() as long as it is busy with, for example, a DB query.
NOTE:
The set_time_limit() function and the configuration directive max_execution_time only affect the execution time of the script itself. Any time spent on activity that happens outside the execution of the script such as system calls using system(), stream operations, database queries, etc. is not included when determining the maximum time that the script has been running. This is not true on Windows where the measured time is real.
You might set a timeout function in your loop so that it stops after a minute or whatever time you wish
I have an issue where my (slow) background process stops. I start it using a PHP page with a button, as follows:
<form id="trial" method="post" action="">
    <input name="trial" value="Start!" type="submit">
</form>
<?php
set_time_limit(0);
if (isset($_POST['trial'])) {
system("/srv/www/cgi-bin/myscript.sh");
}
?>
At some point, after about 1.5 days, the process stops. I have modified php.ini and the Apache config file, inserting a very high number in the timeout directives, but it does not seem to work, or there is some other process that is stopping myscript.sh. Do you have any suggestions?
thanks!
I'm assuming you have access to the server via SSH based on your post.
If the real goal is to get your script to run continuously, why not log in and run:
nohup /srv/www/cgi-bin/myscript.sh &
As long as your script behaves, it will continue to run for as long as it needs to after you close the terminal.
Check the Logs
To determine why your script is failing, you'll definitely want to check /var/log/kern.log and /var/log/syslog. Look for any entries containing your script or any of its children. Your script may be getting killed off by the kernel (exceeding limits) or erroring out at runtime.
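A sketch of that log check (the log paths are the Debian/Ubuntu defaults, and the script name is the one from the question; adjust both for your system):

```shell
# Scan the kernel and system logs for the script itself and for signs
# of the OOM killer; missing or unreadable logs are simply skipped.
for log in /var/log/syslog /var/log/kern.log; do
    if [ -r "$log" ]; then
        grep -iE "myscript|killed process|out of memory" "$log" || true
    fi
done
echo "log scan complete"
```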
Executing the script continuously can cause problems, so set up a cron job to run it every 30 minutes instead:
set_time_limit(30);
system("/srv/www/cgi-bin/myscript.sh");
Cron setup:
*/30 * * * * php /path/to/your/php/file.php
This question already has answers here:
Limit execution time of an function or command PHP
(7 answers)
Closed 9 years ago.
I have a script that sends batches of 300-500 e-mails per hour. That means this script will be fired once an hour using cron or a similar feature.
The server has a max execution limit of 30secs and it's not configurable.
I've been thinking if the pseudo-code below should work:
$time = time();
$count = 0;
while (condition) {
    $count++;
    send(email);
    $now = time();
    if ($now - $time >= 29) { break; } // 1 sec margin
}
echo "$count e-mails sent";
Opinions?
If your script is launched by cron, it means you're using PHP-CLI, the "PHP Command Line Interface".
As mentioned in the PHP documentation, you have no time limit while using the CLI.
So you don't have to worry about that: max_execution_time is set to 0 (unlimited) by default.
Just in case you really can't raise the execution time, here are two suggestions.
You could simply call set_time_limit() before sending an e-mail. According to the PHP docs:
When called, set_time_limit() restarts the timeout counter from zero. In other words, if the timeout is the default 30 seconds, and 25 seconds into script execution a call such as set_time_limit(20) is made, the script will run for a total of 45 seconds before timing out.
For instance:
foreach ($emails as $email) {
set_time_limit(30);
send($email, ...);
}
Another option is via cron. Since you are running PHP from a cron job, you can specify your own php.ini. You could execute your script as follows:
php -c /custom/directory/my_php.ini my_script.php
Where my_php.ini may specify:
max_execution_time = 0 ; (unlimited)
Break the task up into smaller chunks. Use the database to keep the "state" of the job between executions.
This approach has the advantage of being scalable: you will probably end up having to send more e-mails as you grow, won't you?
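A minimal sketch of that idea, with a flat file standing in for the database row that records the last e-mail id processed (in production this would be a SELECT/UPDATE on a job table; the path and batch size are assumptions):

```shell
# Each run picks up where the previous one stopped: read the last
# processed id, "send" the next batch of 300, and persist the new id.
STATE=/tmp/email_job.state
last=$(cat "$STATE" 2>/dev/null || echo 0)
next=$((last + 300))
echo "sending e-mails $((last + 1)) through $next"
echo "$next" > "$STATE"
```

Run from cron once an hour, each invocation stays well under the 30-second cap while the whole mailing eventually completes.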