PHP / Apache2 page taking too long to load

I have some PHP code that performs a lot of server-intensive calculations and loops. It typically takes 2 to 10 seconds to run, and as the code is written it should always finish (no infinite loops) and should never take more than 10 seconds. However, I randomly get situations where the page takes around 60 seconds to load in my browser, yet PHP's microtime function tells me it only took 2 seconds.
$start_time = microtime(TRUE);
// All my various loops and calculations
$end_time = microtime(TRUE);
$g = number_format(($end_time - $start_time), 4);
echo "<br>Total Time " . $g;
I run my own Ubuntu 12 instance, so I have complete control over server processes, though I'm not a very experienced server admin. My php.ini max_execution_time is set to 30, yet when this problem occurs the page takes well over 30 seconds to finally appear in my browser (and then tells me it ran in 2 seconds). What I'm wondering is whether this could simply be a network issue; if so, how do I tell whether it is on my ISP's side or the data center's? Could it also be something at the Apache service level, like some buffer filling up and delaying the sending of the page?
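One quick check, as a minimal sketch: flush PHP's output buffers explicitly as soon as the work is done. If the browser still waits long after "Total Time" is printed, the delay is downstream of PHP (Apache or the network), not in the script itself.

$start_time = microtime(TRUE);
// ... all the loops and calculations ...
$end_time = microtime(TRUE);
echo "<br>Total Time " . number_format(($end_time - $start_time), 4);

// Flush every PHP output buffer, then ask the SAPI to push to the client.
while (ob_get_level() > 0) {
    ob_end_flush();
}
flush();

If the delay persists even after flushing, look at Apache-side buffering (mod_deflate, for example, compresses and buffers output) or at the network path, since the PHP timer already shows the script itself finishing quickly.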

Related

CPU timer and php microtime

I am interested to know if there is a better way to time operations. Relying on a timer, it seems microtime, or any other method that reads the OS clock, will clearly not give as good an estimate.
I am also interested to know, in general, what the precision of timing operations is nowadays on a Linux-based operating system, say Red Hat Linux.
Now, to be more specific: I recently wrote an algorithm and tried to measure the time it took. My PHP code worked like this:
$start = microtime(true);
$result = myTimeConsumingProcess();
$end= microtime(true);
$timeInMilliSec = 1000.0 * ($end - $start);
It turns out that sometimes the process took 0 ms, and on a different execution, with precisely the same data, it took 9.9xxx milliseconds.
The explanation, as far as I can imagine, is that time is measured using timer interrupts: if the execution starts and finishes before the next timer interrupt updates the OS time, we are bound to get 0 as the difference.
Back in my early DOS days, the timer interrupt fired about 18.2 times per second (roughly every 55 ms). That no longer appears to be the case; with better machines and upgraded OSes, we are now running timers faster than that.
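On current PHP (7.3 or later), a better tool than microtime for this kind of measurement is hrtime, which reads a monotonic, nanosecond-resolution counter, so it is immune to NTP adjustments and coarse timer ticks. A minimal sketch, assuming myTimeConsumingProcess() is the function from the question:

$start = hrtime(true);  // integer nanoseconds from a monotonic clock
$result = myTimeConsumingProcess();
$end = hrtime(true);

$timeInMilliSec = ($end - $start) / 1e6;  // nanoseconds to milliseconds
echo "Took {$timeInMilliSec} ms\n";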

Only one call from concurrent request with 60 sec timeout

I have a function callUpdate() that needs to be executed after every update in the webpage admin.
callUpdate() does some caching (and can take up to 30 seconds), so it does not need to run immediately, just within a reasonable amount of time after the last update, say 60 seconds.
The goal is to skip processing if the user (or users) makes several consecutive changes in a short amount of time.
Here is my current code:
// This is in a separate stand-alone script that is called asynchronously,
// so hanging for 1 minute does not block the app.
function afterUpdate() {
    $time = time();
    file_put_contents('timer.txt', $time);
    sleep(60);
    // Only proceed if no newer save has overwritten the timestamp.
    if (file_get_contents('timer.txt') == $time) {
        callUpdate();
    }
}
My concerns here are about the sleep() call and whether it takes too many resources
(if I make 10 quick saves, this will start 10 PHP processes, each running for almost 60 seconds),
and about what happens if two users call file_put_contents() on the same file simultaneously.
Please tell me if there is a better approach and whether there are major issues in mine (a sketch of one variation follows the note below).
NOTE: data between sessions can be stored only in a file,
and I have limited access to the server setup (APC settings and such).
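A minimal sketch of one variation, staying within the file-only constraint: passing LOCK_EX to file_put_contents makes each write atomic with respect to other writers, which addresses the concurrent-write concern; the last-writer check after sleep() stays the same.

function afterUpdate() {
    $time = time();
    // LOCK_EX takes an exclusive lock for the write, so two simultaneous
    // saves cannot interleave their bytes in timer.txt.
    file_put_contents('timer.txt', $time, LOCK_EX);
    sleep(60); // debounce window; each save still costs one process
    // Only the process that wrote the most recent timestamp proceeds.
    if ((int) file_get_contents('timer.txt') === $time) {
        callUpdate();
    }
}

The sleeping processes themselves are cheap (they spend the minute blocked, not burning CPU), but if ten of them bother you, a cron job that runs every minute and compares timer.txt against the current time would avoid spawning them entirely.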

PHP Cron job execution time limits fail

I've got a cron job scheduled to run every 7 minutes. The script loops over a number of users and sends email via SMTP or makes API calls via curl.
Thus, most of the execution time is apparently spent outside the realm tracked by max_execution_time on Linux. I was experiencing hangs in my script that were always fixed by restarting it (I'm still looking for the cause of the hangs, without success so far).
Because the script still sometimes ran for 30 minutes even with set_time_limit set to 6 minutes, I now check microtime(true) after each round of the loop and break out of it if it has been running for more than 6 minutes.
Still, the script sometimes runs for 37 minutes (even though I can see that emails, which map to one round of the loop, still go out).
The only trick I have left is pcntl_fork. I am reluctant to employ it because of platform dependence, and because I figured a loop with microtime(true) should track time spent outside the process too, and I'd like to understand why that isn't the case here.
max_execution_time limits script execution time, but it does not count time spent in system calls, stream operations, and the like. See the manual page.
You can also use set_time_limit() to set the maximum amount of time a script may run; it only changes the time limit for the script currently in scope. Have you tried this as well?
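As a small illustrative sketch (not something the answer above spells out): set_time_limit() restarts the timeout counter from zero each time it is called, so one pattern for long cron loops is to re-arm it per iteration. Keep in mind, per the point above, that time spent in system calls and stream operations (SMTP, curl) is not counted toward this limit on Linux anyway.

set_time_limit(60); // budget for setup work

foreach ($users as $user) {  // $users stands in for the question's user list
    set_time_limit(60);      // re-arm: allow 60 s of script time per user
    process_user($user);     // hypothetical per-user email/curl work
}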
Another question that has been asked previously may also help slightly.
I ended up going outside PHP to get the time, this is simple, works and doesn't require me to learn to deal with pcntl functions.
$time_comp = microtime(true); // save start time

// Ask MySQL, rather than PHP, how many seconds have passed since $time_comp.
function check_time_against_mysql($dbh, $time_comp)
{
    $time = $dbh->prepare("SELECT UNIX_TIMESTAMP() - :created AS time_in_seconds");
    $time->bindValue(":created", $time_comp);
    $time->execute() or die("fail time");
    $time_passed = $time->fetch(PDO::FETCH_ASSOC);
    return floatval($time_passed['time_in_seconds']);
}
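A hypothetical usage sketch inside the cron loop, where the 360-second cutoff mirrors the 6-minute budget mentioned above and send_to_user() stands in for the real SMTP/curl work:

$time_comp = microtime(true);
foreach ($users as $user) {
    send_to_user($user);
    if (check_time_against_mysql($dbh, $time_comp) >= 360) {
        break; // stop before the next round; the next cron run picks up again
    }
}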

PHP Refresh Page

Currently I have a problem with GoDaddy and its hosting: the maximum amount of time a PHP script is allowed to run is set to 120 seconds (2 minutes).
Sometimes my script takes more than 120 seconds to run, and sometimes it takes 30 seconds.
When it takes over 120 seconds, I get an Internal Server Error (500).
My question is: is it possible to detect that the script's run time has reached, say, 110 seconds, and then refresh the page so that the Internal Server Error does not occur?
If whatever is causing your script to run long is some sort of self-contained blocking call (like a database query), then you are kind of screwed.
If not, say you are in a very large loop, then just do the math yourself.
At the very start of your script, save the current time: $t_start = time();
Then at the top of your loop, do $runtime = time() - $t_start; now you know how long you have been running, and you can use that to break out of your loop gracefully when nearing the 120-second limit.
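A minimal sketch of that pattern, assuming the long job is a resumable loop; $total and do_one_unit_of_work() are hypothetical stand-ins for the real work:

$offset  = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;
$t_start = time();

for ($i = $offset; $i < $total; $i++) {
    do_one_unit_of_work($i);

    if (time() - $t_start >= 110) { // bail out well before the 120 s limit
        // Hand the remaining work to a fresh request, resuming where we left off.
        header('Refresh: 0; url=' . $_SERVER['PHP_SELF'] . '?offset=' . ($i + 1));
        exit;
    }
}

Note that header() only works if nothing has been sent to the browser yet, so any progress output has to wait until the loop finishes or be buffered.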

500 internal server error when execute a simple PHP loop program?

<?php
set_time_limit(600);
while (1) {
    usleep(30000000); // 30,000,000 microseconds = 30 seconds per iteration
}
?>
The PHP program above causes a 500 Internal Server Error. The problem is, I have set the time limit to 600, i.e. 10 minutes, and I am expecting it to run for ten minutes, sleeping 30 seconds per pass through the while loop, keeping the activity going for 10 minutes. However, it returns a 500 Internal Server Error at around 2 minutes. What is the root cause, and how do I fix the problem? Thanks.
I spun your code up on my own Linux-based web server. I used a Linux server rather than IIS simply because I thought it would give me more debugging information.
Your code works just fine for me; it doesn't time out. I would then have to say that there is something specific to IIS. I would check the Event Log; a 500 error should have left something interesting in there. To get to the Event Log (http://msdn.microsoft.com/en-us/library/ms524984(v=vs.90).aspx):
From the Start menu, click Run. In the Open box, type eventvwr. Click OK.
Event Viewer is listed under the System Tools node.
I would suggest, if you need this program to wait for 10 minutes, either going into your php.ini file (if that's the problem) and raising max_execution_time by A LOT, or, as an alternative, letting the page sit for 10 seconds and then reloading the same page with header(). The advantage of this approach is that you aren't burning extra CPU cycles by just spinning and locking everything up; you are instead simply re-querying your server and using a persistent cookie:
<?php
// Read the persistent counter, defaulting to 0 on the first request.
$elapsed = isset($_COOKIE['total_time_slept']) ? (int) $_COOKIE['total_time_slept'] : 0;

if ($elapsed < 600) // if less than 10 minutes (600 seconds)
{
    $page = $_SERVER['PHP_SELF'];
    $sec = 10;
    // Cookies and headers must go out before any output is echoed.
    setcookie('total_time_slept', $elapsed + $sec); // add this round to the counter
    header("Refresh: $sec; url=$page"); // reload this page every 10 seconds
}

echo "Time Slept: " . $elapsed; // how much time has elapsed so far
?>
I had a similar problem with usleep on a LAMP installation where PHP ran in safe mode.
So my suggestion is to check whether your PHP runs in safe mode. If that's the case, try using the sleep function instead (although it won't help if you want to loop every 0.001 s, for instance).
Note also that usleep didn't work on Windows platforms until PHP 5.
I know this thread is old, but I hope this check helps others.
