<?php
set_time_limit(600);      // allow the script to run for up to 600 seconds (10 minutes)
while (1) {
    usleep(30000000);     // sleep 30 seconds (30,000,000 microseconds) per iteration
}
?>
The PHP program above causes a 500 Internal Server Error. The problem is, I have set the time limit to 600, i.e. 10 minutes, so I expect it to run for ten minutes, executing one iteration of the while loop every 30 seconds and keeping the activity alive for the whole 10 minutes. However, it returns a 500 Internal Server Error after around 2 minutes. What is the root cause, and how can I fix the problem? Thanks.
I spun your code up on my own Linux-based web-server. I used a Linux server rather than IIS simply because I thought it would give me more debugging information.
Your code works just fine for me; it doesn't time out. I would therefore say there is something specific to IIS. Check the Event Log: a 500 error should leave something interesting in there. To get to the Event Log (http://msdn.microsoft.com/en-us/library/ms524984(v=vs.90).aspx):
From the Start menu, click Run. In the Open box, type eventvwr. Click OK.
Event Viewer is listed under the System Tools node.
If you need this program to wait for 10 minutes, I would suggest either going into your php.ini file (if that's where the problem is) and raising max_execution_time by a lot, or, as an alternative, letting the page sit for 10 seconds and then reloading the same page with header(). The advantage of this approach is that you aren't wasting extra CPU cycles by spinning and locking everything up; you are instead just re-querying your server and using a persistent cookie:
<?php
// Persistent counter (stored in a cookie) measuring how much time has elapsed so far
$slept = isset($_COOKIE['total_time_slept']) ? (int)$_COOKIE['total_time_slept'] : 0;

if ($slept < 600) { // less than 10 minutes
    $page = $_SERVER['PHP_SELF'];
    $sec  = 10;
    setcookie('total_time_slept', $slept + $sec); // add this wait to the counter
    header("Refresh: $sec; url=$page");           // reload this page every 10 seconds
}

echo "Time Slept: " . $slept; // echo after the cookie/header calls so headers aren't already sent
?>
I had a similar problem with usleep on a LAMP installation where PHP ran in safe mode.
So my suggestion is to check whether your PHP runs in safe mode. If that's the case, try using the sleep function instead (although that won't help if you want to loop every 0.001 s, for instance).
Note also that usleep doesn't work on Windows until PHP 5.
I know this thread is old but I hope this check will help the others.
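As a quick illustration, here is a minimal sketch of that check (an assumption on my part; safe mode only exists up to PHP 5.3, so on later versions ini_get() simply returns an empty value):
<?php
// Sketch: check whether safe mode is enabled and pick a sleep function accordingly.
if (ini_get('safe_mode')) {
    sleep(30);          // whole-second fallback when safe mode is on
} else {
    usleep(30000000);   // 30 seconds expressed in microseconds
}
?>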
Related
I'm running a cron job once an hour that can initiate multiple "jobs" to finish.
The cron job points to a php file which starts a "background script" that repeats itself multiple times when necessary.
Each "job" takes an unknown amount of time, but always finishes in less than 20 minutes with some margin. Therefore the server's "max_execution_time" is set to allow the script to run for 20 minutes to prevent a timeout.
If the script has 4 jobs where each job takes 10 minutes, the total execution time would exceed 20 minutes.
My thought on how to resolve this was to use header("Location: $url") after one "job" finished, to restart the script for the next "job" and get a fresh 20 minutes to work with.
As long as a "job" only takes a couple of minutes, everything works fine. But when it reaches a "job" that takes 12 minutes, the header("Location: $url") fails with a 500 Internal Server Error
Followed by:
Additionally, a 500 Internal Server Error
error was encountered while trying to use an ErrorDocument to handle the request.
This happens even if the total execution time including previous "jobs" is less than 20 minutes.
I don't exceed the allowed number of redirects and I don't exceed 20 minutes of execution time (including all current "jobs").
So even though my idea was to use header("Location: $url") to keep the total execution time from ever exceeding 20 minutes, the failure already occurs before reaching either the redirect limit or the execution time limit.
What's currently failing is a run of 6 "jobs": the total execution time of the first 4 is 2.5 minutes, and the 5th job takes 12 minutes. The location redirect to the last, 6th job does not work, as header("Location: $url") fails after the 5th job has been running for 12 minutes. Total execution time for all "jobs" up to this point is only 14.5 minutes, and the redirect count is 5 (60 redirects are allowed).
If I swap the 12-minute job for 6 other shorter jobs, giving a total of 10 shorter jobs, everything works fine.
So there seems to be something connected to the run time of the script and how header("Location:") works?
I know this solution is not the best (or any good) way to handle this kind of task. But the road led me to this point during development, and finding out why the 500 Internal Server Error occurs would make this solution work while I rewrite a new "job" handler.
Therefore any help would be welcome :)
if ($job != "available") { // this illustrates a check in database if there is any jobs "available" or if all is finished
return "all jobs finished";
}
$repeating = true;
while ( $repeating == true ) {
// has different stuff that can take some time to finish
if($something == true) {
// run task
continue;
}
if($something_else == true) {
// run task
continue;
}
// above currently takes between 1 to 12 minutes
// when finished set $repeating to false to end while loop
$repeating = false;
}
// here is a log written to file (which never fails)
write_string_to_file($string);
// current job finished, reload this page again, check if there is any jobs left. If jobs left, this initiates new 20 minutes before exectution time timeout?
// ($url is pointing to this page again)
header("Location: $url") // this fails if above while loop takes 12 minutes, which currently is the longest running "job"
Note: I made a custom 500 error page which pointed me to the header("Location") call as the issue, as it gave me the following info on what was requested:
Requested URL: "the $url im actually reloading was presented here"
Redirect Status Code: 500
Worth mentioning: the error also said:
Additionally, a 500 Internal Server Error
error was encountered while trying to use an ErrorDocument to handle the request.
So no error log is available at the moment
I suspect that the Error 500 you're getting is actually related to the script finding its environment different from what it expected.
For example, some session or DB connection expired.
I'd suggest you set error_reporting to maximum and add suitable logging for all error levels (including startup errors).
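For example, a minimal sketch of that configuration (the log path is just a placeholder you would adapt):
<?php
// Sketch: report everything and log errors to a file instead of relying on screen output.
error_reporting(E_ALL);
ini_set('display_errors', '1');
ini_set('log_errors', '1');
ini_set('error_log', '/tmp/job_errors.log'); // placeholder path, adjust to your setup
// note: startup errors are only fully captured when these settings live in php.ini
?>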
Without knowing in detail what the jobs do and how they work, it's not likely that the precise cause of error can be guessed. The redirect works, as your custom handler demonstrated. The problem is that the new script instance is failing.
Another possibility is to redirect to a diagnostic script that simply dumps the contents of $_SERVER, $_SESSION, etc. into a file. Then start adding the content of the job manager script to the diagnostic script until it breaks with a 500 error again.
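A minimal sketch of such a diagnostic script (the file name and dump path are assumptions):
<?php
// diag.php - sketch: dump the request environment to a file so you can compare
// a working invocation with the one that ends in a 500 error.
session_start();

$dump = date('c') . "\n"
      . "SERVER:  " . var_export($_SERVER, true) . "\n"
      . "SESSION: " . var_export($_SESSION, true) . "\n\n";

file_put_contents('/tmp/diag.log', $dump, FILE_APPEND); // placeholder path
echo "environment dumped";
?>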
First you should check the error-logs to get the exact error-message.
Without the exact message we can't provide detailed answers.
Also, how about storing the last job somewhere (text file/database), so you can continue with the next job?
For example:
<?php
$id_job = 0; // id of the last finished job, loaded from a text file or database

switch ($id_job + 10) {    // add 10 to resume at the next job
    case 10:
        job1();
        // no break: intentionally fall through so the remaining jobs also run
    case 20:
        job2();
}
After each job, just set $id_job to the id of the job that finished, and add 10 before entering the switch/case.
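For instance, the last finished job id could be persisted in a small text file between runs (a sketch; the file name and the job1()/job2() functions are placeholders):
<?php
// Sketch: persist the id of the last finished job so the next run can resume.
$state_file = __DIR__ . '/last_job_id.txt';  // assumed location
$id_job = is_file($state_file) ? (int)file_get_contents($state_file) : 0;

switch ($id_job + 10) {
    case 10:
        job1();
        file_put_contents($state_file, 10);  // record that job 1 is done
        // fall through to the next job
    case 20:
        job2();
        file_put_contents($state_file, 20);  // record that job 2 is done
}
?>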
I have some PHP code that executes server-intensive work with a lot of calculations and loops. It typically takes 2 to 10 seconds to run. As the code is written, it should always finish (no infinite loops) and should never take more than 10 seconds. However, I randomly get situations where it takes around 60 seconds for the page to load in my browser, yet PHP's microtime function tells me it only took 2 seconds.
$start_time = microtime(TRUE);
// All my various loops and calculations
$end_time = microtime(TRUE);
$g = number_format(($end_time - $start_time), 4);
echo "<br>Total Time " . $g;
I run my own Ubuntu 12 instance, so I have complete control over server processes, though I'm not a very experienced server admin. My php.ini max_execution_time is set to 30, yet when I have this problem it takes well over 30 seconds for the page to finally appear in my browser (and then it tells me it ran in 2 seconds). What I'm wondering is whether this could simply be a network issue. If so, how do I know whether it is on my ISP's side or the data center's side? Could it also be something at the Apache level, like some buffer filling up and delaying the sending of the page?
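Here is roughly what I could add to check whether the delay happens inside PHP at all (a sketch; it assumes PHP 5.4+ for $_SERVER['REQUEST_TIME_FLOAT'], and the log path is a placeholder). If this also reads ~2 seconds while the browser waits 60, the delay must be happening after PHP, in Apache or the network:
<?php
// Sketch: log total PHP-side time (request start to script shutdown) to a file,
// so it can be compared against the in-script microtime() measurement.
register_shutdown_function(function () {
    $total = microtime(true) - $_SERVER['REQUEST_TIME_FLOAT'];
    error_log(sprintf("total php time: %.4fs\n", $total), 3, '/tmp/timing.log'); // placeholder path
});

// ... all the various loops and calculations ...
?>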
I've got a CronJob scheduled to run every 7 minutes. The script runs a loop over a number of users and sends email via SMTP or does some API calls via curl.
Thus, most of the execution time is apparently spent outside what max_execution_time tracks on Linux. I was experiencing hangs in the script that were always fixed by restarting it (I'm also looking for the cause of the hangs, but haven't found it so far).
Because the script still sometimes ran for 30 minutes even with set_time_limit set to 6 minutes, I now check microtime(true) after each round of the loop and break out of it if it has been running for more than 6 minutes.
Still, the script sometimes runs for 37 minutes (even though I can see that the emails, which map to one round of the loop each, still go out).
The only trick I have left is pcntl_fork. I am reluctant to employ it because of platform dependence, and because I figured that a loop plus microtime(true) should track time spent outside the process too, and I'd like to understand why that isn't the case here.
max_execution_time limits the script's execution time; however, it does not affect time spent in system calls. See the manual page.
You can also use set_time_limit() to set the maximum amount of time a script may run. This only changes the time limit for the script in scope. Have you tried this as well?
You can find another question that has been asked may help slightly at this link.
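Since the time spent in external calls (SMTP, curl) isn't counted by max_execution_time, one option is to combine a wall-clock budget check per round with per-request timeouts on the curl calls themselves. This is only a sketch: the 6-minute budget comes from the question, and $users and the api_url field are assumed placeholders.
<?php
$budget  = 6 * 60;              // overall wall-clock budget in seconds (assumption)
$started = microtime(true);

foreach ($users as $user) {
    if (microtime(true) - $started > $budget) {
        break;                  // stop before the next round if the budget is used up
    }

    $ch = curl_init($user['api_url']);            // hypothetical per-user API endpoint
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10); // don't hang on connect
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);        // hard cap per request
    $response = curl_exec($ch);
    curl_close($ch);
}
?>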
I ended up going outside PHP to get the time. This is simple, works, and doesn't require me to learn to deal with the pcntl functions.
$time_comp = microtime(true); // save start time

// run this function to see how much time has passed
function check_time_against_mysql($dbh, $time_comp)
{
    $time = $dbh->prepare("SELECT UNIX_TIMESTAMP() - :created AS time_in_seconds");
    $time->bindValue(":created", $time_comp);
    $time->execute() or die("fail time");
    $time_passed = $time->fetch(PDO::FETCH_ASSOC);
    return floatval($time_passed['time_in_seconds']);
}
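Used inside the loop it looks roughly like this (a sketch; $dbh is assumed to be an existing PDO connection, $users is a placeholder, and 360 seconds is the 6-minute budget from the question):
<?php
$time_comp = microtime(true); // save start time

foreach ($users as $user) {   // hypothetical loop over users
    if (check_time_against_mysql($dbh, $time_comp) > 360) {
        break;                // budget exceeded, stop before the next round
    }
    // ... send email / call API for $user ...
}
?>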
Currently I have a problem with GoDaddy and its hosting. The maximum amount of time allowed for a PHP script to run is set to 120 seconds = 2 mins.
Sometimes when I run my script, it takes more than 120 seconds and sometimes it takes 30 seconds.
When it takes over 120 seconds, I get an Internal Server Error (500).
The question I have is: is it possible to detect that the script's run time has reached, say, 110 seconds and then refresh the page so that the Internal Server Error does not occur?
If whatever is causing your script to run long is some sort of self-contained blocking function (like a database query), then you are kind of screwed.
If not, say you are in a very large loop, then just do the math yourself.
At the very start of your script, save the current time: $t_start = time();
Then at the top of your loop, do $runtime = time() - $t_start; you then know how long you have been running, and can use that to break out of your loop gracefully when you are nearing your 120-second limit.
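Put together, the idea looks roughly like this (a sketch; the 110-second cutoff and the Refresh header match the question, while work_remaining() and do_next_chunk_of_work() are hypothetical placeholders):
<?php
$t_start = time(); // save start time

while (work_remaining()) {          // hypothetical loop condition
    $runtime = time() - $t_start;
    if ($runtime >= 110) {          // stop well before the 120-second hard limit
        // requires that nothing has been output yet
        header("Refresh: 1; url=" . $_SERVER['PHP_SELF']); // reload to continue the work
        exit;
    }
    do_next_chunk_of_work();        // hypothetical unit of work
}
?>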
I need to create a script in PHP which performs permutations of numbers, but PHP has an execution time limit set to 60 seconds. How can I run the script so that, if it needs more than 60 seconds, it is not interrupted by the server? I know I can change the maximum execution time limit in PHP, but I want to hear another approach that does not require knowing the execution time of the script in advance.
A friend suggested that I sign in and log out frequently from the server, but I have no idea how to do this.
Any advice is welcome. An example code would be useful.
Thanks.
First I need to enter a number, let's say 25. After this the script is launched and it needs to do the following: for every number <= 25 it will create a file with the numbers generated at the current stage; for the next number it will open the previously created file and create another file based on the lines of the opened file, and so on. Because this takes too long, I need to avoid the script being interrupted by the server.
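A rough sketch of one such stage, as I imagine it (the file names are placeholders, and I'm assuming each stage is built by inserting the new number into every position of every line of the previous file):
<?php
// Sketch: build the file for stage $n from the file for stage $n - 1.
// stage_1.txt is assumed to contain a single line: "1".
function build_next_stage($n)
{
    $prev = file("stage_" . ($n - 1) . ".txt", FILE_IGNORE_NEW_LINES); // placeholder names
    $out  = fopen("stage_" . $n . ".txt", "w");

    foreach ($prev as $line) {
        $items = explode(" ", $line);
        for ($pos = 0; $pos <= count($items); $pos++) {
            $perm = $items;
            array_splice($perm, $pos, 0, $n); // insert the new number at position $pos
            fwrite($out, implode(" ", $perm) . "\n");
        }
    }
    fclose($out);
}
?>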
@emanuel:
I guess when your friend suggested that you "sign in and log out frequently from the server", he or she must have meant "split your script's computation into x pieces of work and run them separately".
For example, with this script you can let it execute itself 150 times to compute 150! (a factorial) and show the result:
// script name: calc.php
<?php
session_start();

if (!isset($_SESSION['times'])) {
    $_SESSION['times']  = 1;
    $_SESSION['result'] = 1;                  // start at 1, otherwise the product stays 0
    header('Location: calc.php');             // kick off the redirect chain
    exit;
} elseif ($_SESSION['times'] < 150) {
    $_SESSION['times']++;
    $_SESSION['result'] = $_SESSION['result'] * $_SESSION['times'];
    header('Location: calc.php');             // each request does one step, then reloads
    exit;
} elseif ($_SESSION['times'] == 150) {
    echo "The Result is: " . $_SESSION['result']; // note: the float result is only approximate for large factorials
    die();
}
?>
BTW (@Davmuz), you can only use the set_time_limit() function on Apache servers, it's not a valid function on Microsoft IIS servers.
set_time_limit(0)
You could try to put the calls you want to make in a queue, which you serialize to a file (or memory cache?) when an operation is done. Then you could use a cron daemon to execute this queue every sixty seconds, so it keeps doing the work and eventually finishes the task.
The drawbacks of this approach are problems with adding to the queue, file locking, and the like, and if you need the results immediately this can prove troublesome. If you are adding stuff to a DB, it might work out. Also, it is not very efficient.
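A very rough sketch of that idea (the file name, the task format, and process_task() are assumptions; a real implementation would also need locking with flock()):
<?php
// queue_worker.php - sketch: run from cron every minute, process a few queued tasks,
// and write the remaining queue back to disk.
$queue_file = '/tmp/task_queue.ser';                    // assumed location
$queue = is_file($queue_file) ? unserialize(file_get_contents($queue_file)) : array();

$deadline = time() + 50;                                // stay under the 60-second cron interval
while ($queue && time() < $deadline) {
    $task = array_shift($queue);
    process_task($task);                                // hypothetical task handler
}

file_put_contents($queue_file, serialize($queue));      // persist whatever is left for the next run
?>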
Use set_time_limit(0), but you have to disable safe_mode:
http://php.net/manual/en/function.set-time-limit.php
I suggest using a fixed limit instead (e.g. set_time_limit(300)), so that a problem in the script (an endless loop or a memory leak) cannot run unchecked forever.
The web server, such as Apache, also has its own maximum time limit (often 300 seconds), so you may have to change that as well. If you want to build a Comet application, it may be better to choose a web server other than Apache that copes well with long-running requests.
If you need a long execution time for a heavy algorithm, you can also implement parallel processing: http://www.google.com/#q=php+parallel+processing
Or store the input data and compute it with another external script run from cron or something similar.