Maximum execution time of 30 seconds exceeded every time - php

Every time it runs, my PHP script shows "Maximum execution time of 30 seconds exceeded". Is there any way to show a customized error to the visitor instead of this big error message? I mean, can we use our own error message in place of it?

You probably have an endless loop somewhere. You could debug the PHP on your own machine (adding debug printing inside, etc.).

Add this at the top of your PHP script:
set_time_limit(0);
ini_set('max_execution_time', 0);
This gives your script unlimited time to execute. You can also limit it to x seconds by replacing the 0 with a number of seconds, such as 500. But if you have a bug like an endless loop, the script will never stop, and your server could die if hundreds of visitors hit the script.

Pin down your endless loop by inserting
die('got here...');
at the top and moving it downwards through your script; you'll find the problematic code quite quickly. Real debugging is smoother but needs some setting up.
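A minimal sketch of that bisecting technique (the step functions are stand-ins for chunks of your real script, not anything from the original post):

<?php
function do_step_one() { /* first chunk of the real script */ }
function do_step_two() { /* second chunk - suppose the endless loop hides here */ }

do_step_one();
die('got here...'); // if this prints, everything above is fine; keep moving the
                    // marker down until the message stops appearing - the hang
                    // is in the code just below its last working position
do_step_two();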
A functional PHP script should normally take less than a second to execute.

Related

prevent php timeout in block of code

I wanted to know if there was a way to prevent a PHP timeout from occurring once a part of the code has started being processed.
Let me explain:
I have a script that takes way too long to execute, too long even to use
ini_set('max_execution_time', 0);
set_time_limit(0);
The code is built to allow it to time out and restart where it was, but I have two lines of code that need to be executed together for that to happen:
$email->save();
$this->_setDoneWebsiteId($user['id'], $websiteId);
Is there a way in PHP to tell it that it has to finish executing them both even if the timeout is hit?
I got an idea as I was writing this: I could use a timeout of 120 seconds, start a timer, and stop if there are fewer than 20 seconds left before the timeout. I just wanted to know if I was missing something.
Thank you for your inputs.
If your code is not synchronous and some task takes more than 100 seconds, you'll not be able to check the execution time.
I see only one true HACK (be careful, and test it with php -f in the console so you are able to kill the process):
<?php
// Any preparations here
register_shutdown_function(function () {
    if (error_get_last()) { // there was a "maximum execution time exceeded" error
        // Call the rest of your system
        // But note: you have no stack, no context, no valid previous frame - nothing!
    }
});
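One way to exercise that hack from the console, as the answer suggests. The tiny limit and busy loop exist only to force the timeout quickly (on the CLI the default limit is 0, so the script sets one explicitly); the log message is a stand-in for the real follow-up work:

<?php
set_time_limit(2); // tiny limit so the timeout fires fast
register_shutdown_function(function () {
    $err = error_get_last();
    if ($err !== null && strpos($err['message'], 'Maximum execution time') === 0) {
        error_log('timed out - resuming cleanup here'); // stand-in for the real follow-up
    }
});
while (true) {
    // busy-wait to burn CPU time until the limit kills the script
}

Run it with php -f timeout_test.php (the file name is illustrative).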
One thing you could do is use the date/time functions to monitor your average execution time (it builds up with each iteration, assuming you have a loop).
If the average time per iteration is longer than however much time you have left (counting how much time has been taken already against your maximum execution time), you would trigger a restart and let it pick up from where it left off, as in the sketch below.
However, if you are experiencing timeouts, you might want to look at ways to make your code more efficient.
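A minimal sketch of that averaging idea, assuming a 120-second limit (the task list and the per-item work are placeholders):

$tasks = range(1, 1000);    // stand-in for the real work items
$limit = 120;               // assumed server limit in seconds
$start = microtime(true);
$count = 0;
foreach ($tasks as $task) {
    $elapsed = microtime(true) - $start;
    if ($count > 0 && $elapsed + ($elapsed / $count) >= $limit) {
        // not enough budget left for another round of average length:
        // persist progress here, trigger the restart, then stop
        exit;
    }
    usleep(50000);          // placeholder for the real per-item work
    $count++;
}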
No, you can't abort the timeout handler, but I'd say that 20 seconds is quite a long time if you're not parsing something huge. However, you can do the following:
Get the time of the execution start ($start = $_SERVER['REQUEST_TIME'], or just $start = microtime(true); at the beginning of your controller).
Assert that the execution time is less than 100 seconds before running $email->save(), or halt/skip the code if necessary. This is as easy as if (microtime(true) - $start < 100) { $email->save()... }. You would probably like more abstraction on this, however; see the sketch below.
(If necessary) check the execution time after both methods have run and halt execution if it has timed out.
This will require time_limit set to a big value, or even turned off, plus tedious work to prevent overly long execution. In most cases the timeout is your friend: it just tells you that your work is taking too much time and you should rethink your architecture and inner processes. If you're running out of 120 seconds, you'll probably want to hand that work to a daemon.
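One possible abstraction of that check, meant to live inside the question's controller (the helper name and the 100-second budget are illustrative; the two guarded calls are the ones from the question):

function withinBudget(float $start, float $budgetSeconds): bool
{
    return (microtime(true) - $start) < $budgetSeconds;
}

$start = microtime(true);
// ... long-running gathering work here ...
if (withinBudget($start, 100)) { // enough time left to run both lines together
    $email->save();
    $this->_setDoneWebsiteId($user['id'], $websiteId);
}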
Thank you for your input; as I thought, the timer solution is the best way.
What I ended up doing was the following. This is not the actual code, as it's too long to make a good answer, but it's the general idea:
ini_set('max_execution_time', 180);
set_time_limit(180);
$startTime = date('U'); // gives me the current timestamp
while (true) {
    // gather all the data I need for the email
    // I know it's overkill to break the loop with 1 min remaining,
    // but I really don't want to take chances
    if (date('U') < ($startTime + 120)) {
        $email->save();
        $this->_setDoneWebsiteId($user['id'], $websiteId);
    } else {
        return false;
    }
}
I could not use the idea of measuring the average time of each cycle, as it varies too much.
I could have made the code more efficient, but the number of cycles is based on the number of users and websites in the framework. It should grow big enough to need multiple runs to complete anyway.
I'll have to do some research to understand register_shutdown_function, but I will look into it.
Again, thank you!

PHP Cron job execution time limits fail

I've got a cron job scheduled to run every 7 minutes. The script loops over a number of users and sends email via SMTP or does some API calls via curl.
Thus, most of the execution time is apparently spent outside the realm tracked by max_execution_time on Linux. I was experiencing hangs in my script that were always fixed by restarting it (I'm still looking for the cause of the hangs, but haven't been successful so far).
Because the script still sometimes ran 30 minutes even with set_time_limit set to 6 minutes, I now check microtime(true) after each round of the loop and break out of it if it has been running for more than 6 minutes.
Still, the script sometimes runs for 37 minutes (even though I can see that the emails, each of which maps to one round of the loop, still go off).
The only trick I have left is pcntl_fork. I am reluctant to employ it because of platform dependence, and because I figured a loop plus microtime(true) should track time spent outside the process too; I'd like to understand why that isn't the case here.
max_execution_time limits script execution time, but it does not count time spent in system calls. See the manpage.
You can also use set_time_limit() to properly set the maximum amount of time a script can run; this only changes the time limit for the script in scope. Have you tried this as well?
Another question that has been asked may help slightly; see this link.
I ended up going outside PHP to get the time. This is simple, works, and doesn't require me to learn to deal with the pcntl functions.

$time_comp = microtime(true); // save start time

function check_time_against_mysql($dbh, $time_comp) // run this function
{                                                   // to see how much time has passed
    $time = $dbh->prepare("SELECT UNIX_TIMESTAMP() - :created AS time_in_seconds");
    $time->bindValue(":created", $time_comp);
    $time->execute() or die("fail time");
    $time_passed = $time->fetch(PDO::FETCH_ASSOC);
    return floatval($time_passed['time_in_seconds']);
}
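A hedged usage sketch inside the cron loop, assuming $dbh is the open PDO connection and a 6-minute budget (the 360-second figure matches the question's 6-minute cutoff; the per-user work is a placeholder):

foreach ($users as $user) {
    if (check_time_against_mysql($dbh, $time_comp) > 360) { // more than 6 minutes gone
        break; // stop cleanly before starting another long SMTP/curl call
    }
    // ... send the email / make the API call for $user here ...
}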

PHP Refresh Page

Currently I have a problem with GoDaddy and its hosting. The maximum amount of time allowed for a PHP script to run is set to 120 seconds (2 minutes).
Sometimes my script takes more than 120 seconds to run, and sometimes it takes 30 seconds.
When it takes over 120 seconds, I get an Internal Server Error (500).
My question is: is it possible to detect that the script's run time has reached 110 seconds and then refresh the page so that the Internal Server Error does not occur?
If whatever is causing your script to run long is some sort of self-contained blocking function (like a database query), then you are kind of screwed.
If not, say you are in a very large loop, then just do the math yourself, as sketched below.
At the very start of your script, save the current time: $t_start = time();
Then at the top of your loop, do $runtime = time() - $t_start; now you know how long you have been running, and you can use that to break your loop gracefully when nearing your 120-second limit.
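A minimal sketch of that pattern. The 110-second cutoff and the Refresh header are assumptions about how you would trigger the reload (a meta refresh would also work), and the header must go out before any other output:

$t_start = time();
foreach ($items as $item) {
    $runtime = time() - $t_start;
    if ($runtime > 110) {     // bail out before GoDaddy's 120-second limit
        header('Refresh: 0'); // ask the browser to reload so the work can resume
        exit;
    }
    // ... do one unit of the real work for $item here ...
}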

PHP while loop issue

I have a PHP script containing a long loop that lists about 60,000 email addresses from MySQL, but after 30,000 my script stops working and breaks, and sometimes the whole PHP script is printed on a white page (my page has an image background). I increased the PHP memory limit to unlimited, but it didn't help.
What's the problem?
The default execution time limit is 30 seconds, or the max_execution_time value from php.ini. Is your script taking longer than that? If so, you can increase the limit with set_time_limit.
The better question is: what are you doing with 60,000 email addresses that takes so long? Chances are there is a much better way to do whatever is taking too long.
There are two things you can try here. Firstly, set_time_limit(0). Setting it to zero means there is essentially no time limit.
Secondly, you might want to use ignore_user_abort(true). This sets whether a client disconnect should abort script execution; with true, the script keeps running even if the visitor disconnects.
Both will ensure that your script keeps running for as long as it needs... or until your server packs up :) See the sketch below.
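A short sketch combining both suggestions, assuming $result is a PDOStatement for your 60,000-row query (the output line is a placeholder for whatever you do per address):

set_time_limit(0);        // remove the execution time limit entirely
ignore_user_abort(true);  // keep running even if the visitor closes the page
while ($row = $result->fetch(PDO::FETCH_ASSOC)) {
    echo $row['email'], "<br>\n"; // placeholder for the real per-address output
}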
The problem is not in memory, I think; it's in the execution time.
set_time_limit(VERY_BIG_NUMBER);
Update:
Add
ini_set("display_errors", "on");
error_reporting(E_ALL);
and inspect the errors, if any. Also check max_execution_time.
How long does the script run before breaking? 30 seconds?
Use set_time_limit() and increase the max execution time for the script.

Repeat php script after time out

My PHP script creates thumbnails of images. Sometimes when it handles a lot of images, the script has to run a long time and ends after 60 seconds because of the time limit on my server.
Can I tell the script to time out after 59 seconds and then repeat itself?
I need some ideas.
Edit:
I don't think my web hosting allows me to change max_execution_time
I can't believe this is my answer...
loopMe.php:
<meta http-equiv="refresh" content="65;url=loopMe.php">
<?php
/* code that is timing out here */
?>
Your best bet though may be to look at set_time_limit() like others have suggested.
Important note: this loop will never end on its own. You could, however, have a session variable, say $_SESSION['stopLoop'], to flag whenever the script successfully finishes, and check its value before printing the meta-refresh line, as sketched below.
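A hedged sketch of that session-flag idea for loopMe.php (the flag name follows the suggestion above; the thumbnail code itself is represented by comments):

<?php
session_start(); // must run before any output
if (empty($_SESSION['stopLoop'])) {
    // keep refreshing only while the work is unfinished
    echo '<meta http-equiv="refresh" content="65;url=loopMe.php">';
}
// ... the thumbnail code that was timing out goes here ...
// and once the last image is done:
// $_SESSION['stopLoop'] = true; // the next load will skip the refresh tag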
If you don't want to modify your code, use set_time_limit(0), which sets no time limit at all.
set_time_limit(0)
http://php.net/manual/en/function.set-time-limit.php
I wouldn't recommend using it, though; as your system grows, this would take a very long time to process. It is better to recode in such a way that you run the job only when your server is on low traffic, and to control how much time it spends processing data.
For instance, you could store the data you need to process in a queue and process as much as you can in a time window once per night.
set_time_limit(0);
Yeah, as suggested above, use set_time_limit. It will increase the script execution timeout.
If you are in a loop processing multiple files, call set_time_limit each time before you process a file, setting it to the expected duration. I.e., if a file is assumed to take at most 60s to process, use set_time_limit(60) before processing it, as in the sketch below.
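A minimal sketch of that per-file reset. Each call to set_time_limit restarts the timer from zero, so every file gets its own 60-second budget (the loop variable and processing step are illustrative):

foreach ($images as $image) {
    set_time_limit(60); // restarts the timer: a fresh 60-second budget per file
    // ... create the thumbnail for $image here ...
}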
If your server is not running PHP in safe mode, you can use set_time_limit($time);, where $time is the time you need to execute the script. For example, set_time_limit(240); sets the timeout to 4 minutes.
Or you can find out in your script how much time has passed and, before the timeout expires, add some seconds (set_time_limit(10);) until the script finishes (each call to set_time_limit() restarts the counter, so repeated calls effectively extend the total time).
Using this approach you can calculate the elapsed time for each image and compare it against the server timeout.
You can also modify your script to log the images that have already been processed and, in case of a timeout, carry on with the task from the same point, as sketched below.
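A hedged sketch of that "log what's done, resume later" idea (the log file name is an assumption; the thumbnail step is a comment placeholder):

$logFile = 'processed.log'; // assumed location for the progress log
$done = file_exists($logFile) ? file($logFile, FILE_IGNORE_NEW_LINES) : array();
foreach ($images as $image) {
    if (in_array($image, $done, true)) {
        continue; // this one was finished in an earlier run
    }
    set_time_limit(10); // top up the budget before each image
    // ... create the thumbnail for $image here ...
    file_put_contents($logFile, $image . PHP_EOL, FILE_APPEND);
}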
You can run your script from a cron job on your server (if your host allows it) to make it run periodically. Take a look at this article.
