PHP Refresh Page

Currently I have a problem with GoDaddy and its hosting. The maximum amount of time allowed for a PHP script to run is set to 120 seconds = 2 mins.
Sometimes when I run my script, it takes more than 120 seconds and sometimes it takes 30 seconds.
When it takes over 120 seconds, I get an Internal Server Error (500).
The question I have is: is it possible to detect that the script's run time has reached, say, 110 seconds and then refresh the page so that the Internal Server Error does not occur?

If whatever is causing your script to run long is some sort of self-contained blocking call (like a database query), then you are kind of screwed.
If not, say you are in a very large loop, then just do the math yourself.
At the very start of your script, save the current time: $t_start = time();
Then at the top of your loop, do $runtime = time() - $t_start; now you know how long you have been running, and you can use that to break out of your loop gracefully as you near the 120-second limit. A sketch is below.
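A minimal sketch of that approach, assuming the work happens in a loop ($items and do_work() are hypothetical placeholders, not anything from the question):
<?php
$t_start = time();            // wall-clock start of the script
$limit   = 110;               // stop well before the 120-second cap

foreach ($items as $item) {   // $items / do_work() stand in for your actual work
    do_work($item);
    if (time() - $t_start >= $limit) {
        break;                // stop gracefully; you could then redirect or emit
    }                         // a meta refresh so a fresh request resumes the job
}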

Related

Falsely getting "The request was aborted because it exceeded the maximum execution time"

I say "falsely" because:
this script runs comparatively faster than other similar scripts in the same application and it uses an extremely similar structure (it takes about 5 seconds with breakpoints if I click continue when debugging)
none of my other scripts time-out
almost all of my scripts contain something like this:
//give script enough time to run
$rowcount = count($rows);
set_time_limit(3 * $rowcount);
the bit that is timing out is an execute() call in a PDO wrapper class that I use on every page of my application
So my question is: what can cause this error if the maximum execution time is not being exceeded?
The answer is in my own mess.
I said "it takes about 5 seconds...", but I've only given the script 3 seconds per record.
With 1 record, the script times out because it only has 3 seconds... (a safer version of that call is sketched below).
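A defensive variant of that snippet (just a sketch, not part of the original post) gives the limit a floor so a small row count can never starve the script:
// Give the script enough time to run, but never less than 30 seconds.
// The 3-seconds-per-row estimate is the original poster's own figure.
$rowcount = count($rows);
set_time_limit(max(30, 3 * $rowcount));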

PHP / Apache2 page taking too long to load

I have some PHP code that executes some server-intensive work with a lot of calculations and loops. It typically takes 2 to 10 seconds to run. But as the code is written, it should always finish (no infinite loops) and it should never take more than 10 seconds. However, I randomly get these situations where it takes around 60 seconds for the page to load in my browser, yet the PHP microtime function tells me it only took 2 seconds.
$start_time = microtime(TRUE);
// All my various loops and calculations
$end_time = microtime(TRUE);
$g = number_format(($end_time - $start_time), 4);
echo "<br>Total Time " . $g;
I run my own Ubuntu 12 instance, so I have complete control over server processes, though I'm not a very experienced server admin. My php.ini max_execution_time is set to 30, yet when I have this problem, the page takes well over 30 seconds to finally appear in my browser (and then tells me it ran in 2 seconds). What I'm wondering is whether this could simply be a network issue. If so, how do I know whether it is my ISP or the data center side? Could it also be something at the Apache service level, like some buffer filling up and delaying the sending of the page?
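One way to narrow this down (a sketch, not part of the original question) is to compare the in-script timer with the time since PHP started handling the request, using $_SERVER['REQUEST_TIME_FLOAT'] (available since PHP 5.4; on an older PHP you could record microtime(true) in an auto_prepend_file instead). If both numbers are small but the browser still waits, the delay is happening after PHP finishes, i.e. in Apache, the network, or the client:
$start_time = microtime(true);
// ... loops and calculations ...
$end_time = microtime(true);

$script_time  = $end_time - $start_time;                    // time inside the measured block
$request_time = $end_time - $_SERVER['REQUEST_TIME_FLOAT']; // time since PHP began the request

echo "<br>Script block: " . number_format($script_time, 4) . " s";
echo "<br>Whole request so far: " . number_format($request_time, 4) . " s";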

PHP Cron job execution time limits fail

I've got a cron job scheduled to run every 7 minutes. The script runs a loop over a number of users and sends email via SMTP or does some API calls via curl.
Thus, most of the execution time is apparently spent outside the realm tracked by max_execution_time on Linux. I was experiencing hangs in my script that were always fixed by restarting it (I'm also still looking for the cause of the hangs, but haven't found it so far), so I wanted to cap its runtime.
Because the script still sometimes ran for 30 minutes with set_time_limit set to 6 minutes, I now check microtime(true) after each round of the loop and break out of it if it has been running for more than 6 minutes.
Still, the script sometimes runs for 37 minutes (even though I can see that the emails, which map to one round of the loop, still go off).
The only trick I have left is pcntl_fork. I am reluctant to employ it because of platform dependence, and because I figured that a loop with microtime(true) should track time spent outside the process too, and I'd like to understand why that isn't the case here.
max_execution_time is used to limit script execution time. However, this does not affect system calls. See the manual page.
You can also use set_time_limit() to properly set the maximum amount of time a script can run. This only changes the time limit for the script in scope. Have you tried this as well?
Another question that has already been asked may help slightly; see this link.
I ended up going outside PHP to get the time. This is simple, it works, and it doesn't require me to learn to deal with the pcntl functions.
$time_comp = microtime(true); // save start time

// Ask MySQL how many seconds have passed since $time_comp.
function check_time_against_mysql($dbh, $time_comp)
{
    $time = $dbh->prepare("SELECT UNIX_TIMESTAMP() - :created AS time_in_seconds");
    $time->bindValue(":created", $time_comp);
    $time->execute() or die("fail time");
    $time_passed = $time->fetch(PDO::FETCH_ASSOC);
    return floatval($time_passed['time_in_seconds']);
}
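A usage sketch for the cron loop (the $users array, $dbh connection and send_one_email() are hypothetical stand-ins for the poster's actual code):
foreach ($users as $user) {
    send_one_email($user); // placeholder for the SMTP / curl work

    // Bail out once roughly 6 minutes of wall-clock time have passed,
    // as measured by the database clock rather than by PHP itself.
    if (check_time_against_mysql($dbh, $time_comp) > 360) {
        break; // stop gracefully; the next cron run picks up where this one left off
    }
}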

500 internal server error when executing a simple PHP loop program?

<?php
set_time_limit(600);
while (1) {
    usleep(30000000); // 30,000,000 microseconds = 30 seconds
}
?>
The PHP program above causes a 500 internal server error. The problem is, I have set the time limit to 600 seconds = 10 minutes, and I am expecting it to run for ten minutes, executing one iteration of the while loop every 30 seconds and keeping the activity up for 10 minutes. However, it returns a 500 internal server error at around 2 minutes. What is the root cause, and how do I fix the problem? Thanks.
I spun your code up on my own Linux-based web-server. I used a Linux server rather than IIS simply because I thought it would give me more debugging information.
Your code works just fine for me. It doesn't time out. I would then have to say that there is something specific to IIS. I would check the Event Log; a 500 error should have something interesting in there. To get to the Event Log (http://msdn.microsoft.com/en-us/library/ms524984(v=vs.90).aspx):
From the Start menu, click Run. In the Open box, type eventvwr. Click OK.
Event Viewer is listed under the System Tools node.
If you need this program to wait for 10 minutes, I would suggest either going into your php.ini file (if that's the problem) and raising max_execution_time by a lot, or, as an alternative, letting the page sit for 10 seconds and then reloading the same page with header(). The advantage of this approach is that you aren't burning extra CPU cycles by just spinning and locking everything up; you are instead just re-querying your server and using a persistent cookie:
<?php
// Persistent cookie that tracks how much time has elapsed across reloads.
$slept = isset($_COOKIE['total_time_slept']) ? (int) $_COOKIE['total_time_slept'] : 0;

if ($slept < 600) // if less than 10 minutes
{
    $page = $_SERVER['PHP_SELF'];
    $sec  = 10;
    setcookie('total_time_slept', $slept + $sec); // add this round to the counter
    header("Refresh: $sec; url=$page");           // reload this page every 10 seconds
}

echo "Time Slept: " . $slept; // output must come after the cookie/header calls
?>
I had a similar problem with usleep on a LAMP installation where PHP ran in safe mode.
So my suggestion is to check whether your PHP runs in safe mode. If that's the case, try using the sleep() function, as in the sketch below (although it won't help if you want to loop every 0.001 s, for instance).
Note also that usleep didn't work on PHP Windows platforms until PHP 5.
I know this thread is old but I hope this check will help the others.
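As a sketch of that suggestion, here is the same loop as in the question with sleep() swapped in; sleep(30) pauses 30 seconds per iteration and avoids the large-microsecond-argument caveat of usleep():
<?php
set_time_limit(600);  // allow up to 10 minutes
while (1) {
    sleep(30);        // 30 seconds per iteration instead of usleep(30000000)
}
?>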

Maximum execution time of 30 seconds exceeded every time

Every time I run it, my PHP script sends me "Maximum execution time of 30 seconds exceeded". Is there any way to show a customized error to the visitor instead of this big error message? I mean, can we use our own error message in place of this one?
You probably have an endless loop somewhere. You could debug the PHP on your own machine (adding debug printing inside, etc.)
Add this at the top of your PHP script:
set_time_limit(0);
ini_set('max_execution_time', 0);
This gives your script unlimited time to execute, but you can limit it to x seconds by replacing the 0 with a number of seconds, such as 500. However, if you have a bug like an endless loop, the script would never stop, and your server could die if hundreds of visitors hit the script.
Pin down your endless loop by inserting
die('got here...');
at the top and moving it downwards through your script; you'll find your problematic code quite quickly (see the sketch below). Real debugging is smoother but needs some setting up.
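For instance (a sketch; $rows and process_row() are hypothetical stand-ins for the script's own code):
foreach ($rows as $row) { // loop under suspicion
    process_row($row);    // placeholder for the real work
}
die('got here...');       // if this never prints and the page times out instead,
                          // the endless loop is in the code above the marker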
A functional PHP script should normally take less than a second to execute.
