PHP's set_time_limit(0) hangs and throws HTTP 500

This simple test, which calls set_time_limit() 10 times:
<?php
echo("<html><head><title>set_time test</title></head><body><h1>set_time test</h1>");
for ($i = 0; $i < 10; $i++) {
    error_log("Round $i");
    set_time_limit(0); echo("<p>$i</p>");
}
echo("</body></html>");
Hangs forever with error:
// Fatal error: Maximum execution time of 30 seconds exceeded in .../_set_time.php on line 4
// Call Stack:
// 0.0007 231768 1. {main}() .../_set_time.php:0
// 0.0007 232000 2. set_time_limit() .../_set_time.php:4
30 seconds is the value of max_execution_time in php.ini. If I raise it to 3000, the script just keeps spinning. I have AllowOverride All for the document root in Apache's config (assuming it matters; it shouldn't, but still).
A similar machine (same OS and software versions, config very close) runs the script flawlessly every time. Log files show nothing. Google searches are mostly silent or irrelevant. It seems like some weird system/configuration quirk.
Ideas?
OS: CentOS 6.9
apache 2.2.15
php 5.6 (webtatic RPM:php56w-5.6.30-1.w6.x86_64)
Edit: Submitted to the xdebug project as bug: https://bugs.xdebug.org/view.php?id=1457

OK, so the culprit is the xdebug extension (installed on one machine but not on the other). By removing it (php56w-pecl-xdebug-2.5.3-1.w6.x86_64) the unwanted behavior disappears. A bit tricky: it took more than two days to pinpoint, between narrowing it down to set_time_limit() and finding the responsible extension.
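If you need to confirm whether a given machine actually has the extension loaded before removing the RPM, a minimal sketch (nothing here is specific to this setup) is:
<?php
// Quick check: is xdebug loaded in this SAPI, and which version?
if (extension_loaded('xdebug')) {
    echo 'xdebug ' . phpversion('xdebug') . " is loaded\n";
} else {
    echo "xdebug is not loaded\n";
}
Running php -m from the shell and looking for xdebug in the module list works just as well for the CLI SAPI.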

Related

503 timeout on PHP 5.6.31

I have two or more servers running PHP 5.4.45 with the same scripts, and none of them has any issue when calling a script that requires more than 60 seconds to finish.
However, a new server running PHP 5.6.31 returns a 503 timeout when I run the same script. I tried everything I found on the internet, from KeepAlive to Timeout in httpd.conf and in php.ini. I already have
ini_set('memory_limit', '-1');
ini_set('max_execution_time', 5);
inside the code, and all that. Same script, same everything, just a different PHP version: on 5.4.45 it works perfectly, while on 5.6.31 I always get a timeout unless the full script executes in less than 60 seconds (I don't know where that limit comes from, even though I changed everything related to 60 in httpd.conf and php.ini).
Can you kindly help me troubleshoot?

How do I handle Fatal Error in PHP? [duplicate]

I am inserting huge data values from a CSV file via an HTML form.
I have used set_time_limit(0); so that the script runs until the whole operation is completed.
Fatal error: Maximum execution time of 300 seconds exceeded in
C:\xampp\htdocs\clytics\include\clytics.database.php on line 135
Now, I am trying to catch this fatal error.
I have used set_time_limit(0); so that the script runs until the whole operation is completed.
It probably needs more memory as well. Basically, your code is just a resource hog and you need to tame the way it is gobbling up resources.
But that is just a tangent on the overall architecture issues you might be facing.
Specific to this issue: is there something in your code that would override that value of set_time_limit(0)?
Also, are you running this script via the command line or via PHP in Apache? Because the CLI php.ini config is 100% different from the Apache module's php.ini config.
For example, on Ubuntu the Apache PHP php.ini is here:
/etc/php5/apache2/php.ini
But the command line (CLI) php.ini is here:
/etc/php5/cli/php.ini
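A quick way to confirm which php.ini a given SAPI actually loaded is a small sketch like this (php_ini_loaded_file() exists since PHP 5.2.4; run it both from the CLI and through Apache to compare):
<?php
// Report the SAPI, the php.ini that was loaded, and the effective time limit.
echo 'SAPI: ' . php_sapi_name() . "\n";
echo 'Loaded php.ini: ' . (php_ini_loaded_file() ?: 'none') . "\n";
echo 'max_execution_time: ' . ini_get('max_execution_time') . "\n";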
And if you want to brute-force your script to run regardless of your config settings, you can add this to the top of your PHP file:
ini_set('max_execution_time', -1);
If one reads up more on set_time_limit this comes up:
Set the number of seconds a script is allowed to run. If this is
reached, the script returns a fatal error. The default limit is 30
seconds or, if it exists, the max_execution_time value defined in the
php.ini.
Then reading up on max_execution_time this comes up:
This sets the maximum time in seconds a script is allowed to run
before it is terminated by the parser. This helps prevent poorly
written scripts from tying up the server. The default setting is 30.
When running PHP from the command line the default setting is 0.
But then, the magic 300 number shows up:
Your web server can have other timeout configurations that may also
interrupt PHP execution. Apache has a Timeout directive and IIS has a
CGI timeout function. Both default to 300 seconds. See your web server
documentation for specific details.
So now you know where the 300 comes from. But doing ini_set('max_execution_time', -1); should let you run the script without a timeout.
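If it is Apache's Timeout directive that is cutting the request off at 300 seconds, that has to be raised in httpd.conf rather than php.ini; a hedged example, where 600 is an arbitrary value:
Timeout 600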
And a final bit of info if none of that somehow works: look into max_input_time:
This sets the maximum time in seconds a script is allowed to parse
input data, like POST and GET. Timing begins at the moment PHP is
invoked at the server and ends when execution begins.
While max_input_time might not seem to be related, in some versions of PHP, there is a bug where max_input_time and max_execution_time are directly connected.
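A hedged way to see which of these limits is actually in effect for a web request is to drop a tiny script into the affected document root and load it in the browser:
<?php
// Print the relevant limits exactly as PHP sees them for this request.
foreach (array('max_execution_time', 'max_input_time') as $directive) {
    echo $directive . ' = ' . ini_get($directive) . "\n";
}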

PHP Fatal error: Max execution time exceeded - EVEN after set_time_limit(0)

I keep running into the following PHP error when running my script
Fatal error: Maximum execution time of 30 seconds exceeded in C:\wamp\apps\sqlbuddy1.3.3\functions.php on line 22
I already put this in my PHP file, and I STILL get this error message.
#set_time_limit(0);
Am I missing something?
Edit: This error only shows up after SEVERAL minutes, not after 30 seconds. Could something also be delaying its appearance?
set_time_limit() has no effect when running in safe_mode:
This function has no effect when PHP is running in safe mode. There is no workaround other than turning off safe mode or changing the time limit in the php.ini.
You can check the value of safe_mode and max_execution_time with phpinfo().
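If you prefer not to dig through the full phpinfo() page, a minimal sketch that checks just those two settings is:
<?php
// An empty or "0" value for safe_mode means safe mode is off.
var_dump(ini_get('safe_mode'));
// Current limit in seconds; 0 means unlimited.
var_dump(ini_get('max_execution_time'));
(On PHP 5.4 and later, safe mode no longer exists, so ini_get('safe_mode') simply returns false.)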
Given that you are using Windows and are experiencing the timeout later than after 30 seconds, you might have a reset of the timeout (set_time_limit(30)) somewhere else in your code:
The set_time_limit() function [..] only affect the execution time of the script itself. Any time spent on activity that happens outside the execution of the script such as system calls using system(), stream operations, database queries, etc. is not included when determining the maximum time that the script has been running. This is not true on Windows where the measured time is real.
Search your code for:
ini_set('max_execution_time', 30)
set_time_limit(30)
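On a WAMP box, a command like this run from the project directory would surface any such resets (a sketch assuming your sources are plain .php files; /s searches subfolders, /n shows line numbers):
findstr /s /n "set_time_limit max_execution_time" *.php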
Rather than relying on the PHP file to change the php.ini settings, you should do it yourself. Find where the php.ini is kept for WAMP, and change/add this line:
max_execution_time = 500
There is a PHP configuration setting that disallows the script from changing the time limit.
You can change this PHP behavior in the php.ini file.

Plesk 9.5 IIS 7 Fast CGI timeout error 500

I'm running the following setup:
- Windows 2008 web edition
- IIS 7
- Plesk 9.5
- FastCGI
PROBLEM DESCRIPTION
When running a script that takes longer than 30 seconds, I get the 500 internal server error message and not the "normal" response ("max execution time of 30 seconds is reached"). This message always appears after about 40 seconds. Also, after putting set_time_limit(3600); into the code, the same thing happens.
ACTUAL RESULT
Both with and without the set_time_limit code:
After about 40 seconds a 500 internal server error appears
EXPECTED RESULT
Without set_time_limit:
After 30 seconds a message appears saying the 30-second max execution limit has been reached.
With set_time_limit:
The full script runs, limited only by the number of seconds set in set_time_limit().
ADDITIONAL NOTE
The problem is solved when running CGI instead of FastCGI.
Can anybody help me?
I found some links that could help you. The problem is the activityTimeout of your FastCGI module.
Increase fastCgi / PHP activityTimeout in IIS7
FastCGI timeout value change
For me, in summary, this is what worked.
Open the DOS console ("cmd") and go to
c:\windows\system32\inetsrv\
Then execute this command, changing the path to your php-cgi:
appcmd set config -section:system.webServer/fastCgi "-[fullPath='C:\php\php-cgi.exe'].activityTimeout:3600"
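To double-check that the change took effect, listing the same section should show the new activityTimeout (assuming you are still in the same inetsrv directory):
appcmd list config -section:system.webServer/fastCgi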
I hope this helps you!

PHP Scripts Unresponsive After 120 Seconds

This one really has me stumped. I have not run across this problem on any other server I have worked on.
This is on an Ubuntu 10.04.1 LTS server with PHP 5.3.2-1ubuntu4.5.
When I have a PHP script that does not produce any output for over 120 seconds, the script will not show any subsequent output; however, any non-output code will still be executed. This happens for both php5-cgi and php5 (CLI). For example:
1. $iSleep = 120;
2. echo 'Now: '.date('H:i:s')."\n";
3. echo 'Sleeping for: '.$iSleep."\n";
4. echo 'Will wake up at: '.date('H:i:s', (time()+$iSleep))."\n";
5. sleep($iSleep);
6. echo 'Woke up at: '.date('H:i:s')."\n";
7. mail('test@example.com', 'Subject', 'Message');
I will get all the output back to the screen through line 4. Line 6 will never appear on the screen, but I will get an email from line 7. If I change line 1 to be 119 or less, the code will execute fully as expected. Please let me know if there are any other settings (php.ini) or version numbers that you want to know. Thanks in advance for your time.
My answer is also mostly a guess, but check the normal max_execution_time variable. By default this is 30, as per the documentation. But there is one caveat it mentions:
The maximum execution time is not affected by system calls, stream operations etc. Please see the set_time_limit() function for more details.
I am positive mail() is a system call, so you want to use set_time_limit() as described.
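A hedged sketch of that suggestion, lifting the limit at the top of the script before the long-running part:
<?php
// 0 means no execution time limit for this request.
set_time_limit(0);
$iSleep = 120;
echo 'Now: ' . date('H:i:s') . "\n";
sleep($iSleep);
echo 'Woke up at: ' . date('H:i:s') . "\n";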
Hopefully this solves your issue.
PHP appears to respond properly when I connect from other clients. I need to figure out what makes the client I am connecting from different.
