In PHP, when a script consumes more than the memory_limit value, it stops with a fatal error. How can I add a warning level: if my script consumes more than 90 MB, I get a warning in the log file, but the script goes on (and still crashes if it consumes more than 128 MB)?
I know nothing about PHP extensions or PHP C code, but since we already build PHP ourselves, we could even patch the code.
In Zend/zend_alloc.c I can see this:
if (segment_size < true_size || heap->real_size + segment_size > heap->limit) {
It would be really easy to add a line before this that compares the used memory to another limit and issues a warning.
Can I do this in an extension, or by patching the PHP code? Why doesn't this already exist? Is it a bad idea? Does it already exist somewhere?
Adding the same warning for max_execution_time seems more difficult, as I still don't understand how the timer is handled.
Here are some interesting questions / articles I have found for you:
This code shows a PHP way to catch a fatal error.
Safely catch a 'Allowed memory size exhausted' error in PHP
Basically you can use register_shutdown_function to run a function when the script exits or stops, and error_get_last() returns information about the last error, which would have been the fatal one:
ini_set('display_errors', false);
error_reporting(-1);

set_error_handler(function($code, $string, $file, $line){
    // turn PHP errors into exceptions; the severity belongs in the third
    // argument, and the exception code (second argument) should be an int
    throw new ErrorException($string, 0, $code, $file, $line);
});

register_shutdown_function(function(){
    $error = error_get_last();

    if(null !== $error)
    {
        echo 'Caught at shutdown';
    }
});

try
{
    $data = ''; // initialize so the concatenation below doesn't raise a notice

    while(true)
    {
        $data .= str_repeat('#', PHP_INT_MAX);
    }
}
catch(\Exception $exception)
{
    echo 'Caught in try/catch';
}
I wouldn't recommend that you just edit the PHP C code. If you don't want to do this in PHP, then you should really build an extension.
You could do it inside your PHP script using memory_get_usage(). It's not really at the system level, and you'd have to call it several times while the script executes to catch the moment you use too much.
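For illustration, here is a minimal sketch of that approach, assuming the 90 MB soft limit from the question (the helper name and the choice of error_log() are mine, not a standard API):

define('MEMORY_SOFT_LIMIT', 90 * 1024 * 1024); // 90 MB; memory_limit still provides the hard stop

function check_memory_soft_limit()
{
    static $warned = false; // warn only once per request, to avoid flooding the log

    if (!$warned && memory_get_usage(true) > MEMORY_SOFT_LIMIT) {
        $warned = true;
        error_log('Warning: memory usage passed the soft limit: ' . memory_get_usage(true) . ' bytes');
    }
}

// Call it at strategic points, e.g. inside long-running loops:
for ($i = 0; $i < 1000000; $i++) {
    // ... do work ...
    check_memory_soft_limit();
}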
Related
I read that file_get_contents is synchronous, but after trying the code below, I don't think it is:
$url = "http://foo.com";
$a = array("file11.php", "file2.php", "file3.php");
foreach ($a as $file)
{
$final = $url . "/" . $file;
print "Calling $final ...";
$res = file_get_contents($final);
if ($res)
print "OK";
else
print "ERR!";
print "<br>";
}
Each file executes some complex tasks, so I know the minimum execution time of each script, but this code runs very fast and doesn't seem to wait for each request! How can I wait for each file request?
Thanks :)
The above code is definitely synchronous. So if the code exits after a few seconds when it should run a lot longer, you probably have a problem with the code.
Try wrapping the code in a try/catch and printing the error to see what it says:
try { /* code here */ } catch (Exception $e) { /* print the error */ }
Also, the default max_execution_time setting in php.ini is usually 30 seconds. After that the script will exit with a fatal timeout error too. Check the setting in your php.ini and adjust it to your needs.
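For example, you can check and raise the limit from the script itself (the values here are just examples):

echo ini_get('max_execution_time'); // e.g. "30"
set_time_limit(120);                // allow up to 120 seconds from this point on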
Edit:
Gathering from your comments, I now assume you are trying to execute the PHP files you are referring to. This makes your question very confusing and the tags just wrong.
The code you use in your example only reads the contents of the file, so it's not executing anything. Which explains why it returns so fast, while you expect it to take a while.
If you want to execute the referred php files, approach it like this:
include_once($final);
instead of just reading the contents.
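Adapted to the loop from the question, that might look like this (a sketch that assumes the files live on the local filesystem, since including them over HTTP would require allow_url_include):

$a = array("file11.php", "file2.php", "file3.php");

foreach ($a as $file)
{
    print "Running $file ...";
    include_once($file); // executes the script in the current process and waits for it
    print "<br>";
}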
I've read this thread: php: catch exception and continue execution, is it possible?
Every answer suggests that a try catch will continue executing the script. Here is an example where it doesn't:
try { $load = @sys_getloadavg(); }
catch (Exception $e) { echo 'Couldn\'t find load average.<br>'; return false; }
I'm running it on XAMPP on Windows, which could be why it errors (it gives a Call to undefined function sys_getloadavg() error when the @ is removed), but that isn't the issue in question. It could be any function that doesn't exist, isn't supported or fails; I cannot get the script to continue executing.
Another example is if there is a syntax error in the try, say I'm including an external file and parsing it as an array. This also produces an error and stops executing.
Is there any brute force way to continue the script running, regardless of what fails in the try?
Unlike in other languages, there's a difference in PHP between exceptions and errors. A fatal error is more like a compile error in languages that require declaration files. You can't catch or ignore fatal errors like calling a function that doesn't exist, but you can test for existence before calling it:
if (function_exists('sys_getloadavg')) {
    try { $load = @sys_getloadavg(); }
    catch (Exception $e) { echo 'Couldn\'t find load average.<br>'; return false; }
}
I call a PHP program via a cron job at different times. The program includes many PHP files, and each file sends data to or gets data from partners.
How can I handle errors in one of the included programs?
At the moment, when one FTP connection in an included file fails, the whole script crashes.
How can I handle this?
You should wrap code that might crash in a try/catch construct. The exception will still be thrown, but the script will continue to run.
We'd need to know more about your code to give you a definite answer.
In general, PHP errors aren't catchable unless you define your own error handler from which you throw exceptions yourself. The code below makes most runtime errors catchable (as long as they aren't considered fatal):
error_reporting(E_ALL);

set_error_handler(function($errno, $errstr, $errfile, $errline) {
    // ignore strict and deprecation notices
    if ($errno == E_STRICT || $errno == E_DEPRECATED) {
        return true;
    }

    throw new RuntimeException('Triggered error (code ' . $errno . ') with message "' . $errstr . '"');
});
Btw, you could also define your own exception handler with set_exception_handler() to display triggered errors with a full stack trace when an exception isn't caught.
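A minimal sketch of such an exception handler (illustrative, not part of the handler code above):

set_exception_handler(function($e) {
    // last-resort handler for exceptions nothing else caught
    echo 'Uncaught ' . get_class($e) . ': ' . $e->getMessage() . "\n";
    echo $e->getTraceAsString() . "\n";
});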
Notice! I would not suggest that you add any of this error-handling code to a production website without rigorous testing first, making sure everything still works as expected.
Edit:
I have no idea what your code looks like, but I guess you can do something like:
require 'error-handler.php'; // where you have your error handler (the code seen above)

$files_to_include = array(
    'some-file.php',
    'some-other-file.php',
    // ...
);

foreach($files_to_include as $file) {
    try {
        include $file;
    }
    catch(Exception $e) {
        echo "$file failed\nMessage: ".$e->getMessage()."\nTrace:\n".$e->getTraceAsString();
    }
}
I've been playing around with a system I'm developing and managed to get it to cause this:
Fatal error: Maximum execution time of 30 seconds exceeded
It happened when I was doing something unrealistic, but nevertheless it could happen with a user.
Does anyone know if there is a way to catch this exception? I've read around but everyone seems to suggest upping the time allowed.
How about trying what the PHP documentation (well... at least one of its readers) says:
<?php
function shutdown()
{
    $a = error_get_last();

    if ($a == null) {
        echo "No errors";
    } else {
        print_r($a);
    }
}

register_shutdown_function('shutdown');
ini_set('max_execution_time', 1);
sleep(3);
?>
Have a look at the following links:
http://www.php.net/manual/en/function.set-error-handler.php#106061
http://www.php.net/manual/en/function.register-shutdown-function.php
Your only options are to increase the allowed execution time of the script (setting it to 0 makes it infinite, but that is not recommended) or to spawn a new thread and hope for the best.
The reason that this isn't catchable is that it isn't really thrown. No one line of the code actually triggered the error, rather PHP said, "Nope, sorry, this is too long. Time to shut down now." And that makes sense. Imagine having a script with a max execution time of 30 seconds catching that error and taking another 30 seconds... in a poorly designed program, that opens up some rather nasty opportunities to exploit. At a minimum, it will create opportunities for DOS attacks.
This isn't an exception, it's an error. There are important differences between exceptions and errors, first and foremost errors can't be caught with try/catch semantics.
PHP scripts are built around a paradigm of short execution times, so PHP is configured by default to assume that if a script has been running for longer than 30 seconds it must be caught in an infinite loop and therefore should be terminated. This is to prevent an errant PHP script causing a denial of service, either by accident or by malicious intent.
However, scripts do sometimes need more running time than they are allocated by default.
You can try changing the maximum execution time, either by using set_time_limit() or by altering the value of max_execution_time in the php.ini file to raise the limit. You can also remove the limit entirely by setting the execution time to 0, though this isn't recommended.
set_time_limit() may be disabled by mechanisms such as disable_functions so it might not be available to you, likewise you might not have access to php.ini. If both of these are the case then you should contact your host for help.
One exception is PHP scripts run from the command line. Under these running conditions, PHP scripts may be interactive and need to spend a long time processing data or waiting for input. For this reason there isn't a max_execution_time limit on scripts run from the command line by default.
EDIT TO ADD: PHP 7's error handling had a major overhaul. Errors and exceptions are now both kinds of Throwable (Error and Exception both implement the Throwable interface). This may make the above no longer relevant for PHP 7+, though I'll have to look more closely into the specifics of how error handling works now to be sure.
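As a quick illustration of the PHP 7+ behaviour: a call to an undefined function now throws an Error that can be caught, though the maximum-execution-time fatal error itself still cannot be caught this way.

try {
    this_function_does_not_exist();
} catch (\Error $e) {
    // only reached on PHP 7 or later; PHP 5 dies with a fatal error here
    echo 'Caught: ' . $e->getMessage();
}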
There is nothing you can do about it, but you can have a graceful shutdown using register_shutdown_function:
<?php
ini_set('display_errors', '0');
ini_set('max_execution_time', 15); // use this if you know your script should not take longer than 15 seconds to finish
register_shutdown_function('shutdown');

function shutdown()
{
    $error = error_get_last();

    if ($error !== null && $error['type'] === E_ERROR) {
        // do your shutdown stuff here.
        // Be careful not to call any other function from within the shutdown
        // function, as PHP may not wait until that function finishes.
        // It's strange behavior: during testing I realized that if a function
        // is called from here, it may or may not finish, and code below the
        // function call may or may not get executed. Every time I had a
        // different result. e.g.
        other_function();
        // code below this function call may not get executed
    }
}

while (true)
{
}

function other_function()
{
    // code in this function may not get executed if this function
    // is called from the shutdown function
}
?>
Yeah, I tested the solution by TheJanOnline. sleep() does not count toward PHP execution time, so here is a WORKING version with an infinite loop:
<?php
function shutdown()
{
    $a = error_get_last();

    if ($a == null) {
        echo "No errors";
    } else {
        print_r($a);
    }
}

register_shutdown_function('shutdown');
ini_set('max_execution_time', 1);
while (1) { /* nothing */ }
// will die after 1 sec and print the error
?>
There is a slightly tricky way to handle "Fatal error: Maximum execution time of 30 seconds exceeded" as an exception in certain cases:
function time_sublimit($k = 0.8) {
    $limit = (int) ini_get('max_execution_time'); // reflects changes made via set_time_limit()
    $sub_limit = (int) round($limit * $k); // cast to int so the strict comparison below works

    if ($sub_limit === 0) {
        $sub_limit = INF; // max_execution_time of 0 means no limit
    }

    return $sub_limit;
}
In your code you must measure the execution time and throw an exception before the fatal timeout error can be triggered. $k = 0.8 means 80% of the allowed execution time, so you have 20% of the time left to handle the exception.
try {
    $t1 = time(); // start measuring time

    while (true) { // put your long-running loop condition here
        $time_spent = time() - $t1;

        if ($time_spent >= time_sublimit()) {
            throw new Exception('Time sublimit reached');
        }

        // do work here
    }
} catch (Exception $e) {
    // handle the exception here
}
I came up with this based on the answer @pinkal-vansia gave, so I'm not claiming an original answer, just one with a practical application. I needed a way for the page to refresh itself in the event of a timeout. I have observed enough timeouts of my cURL script to know the code is working, but sometimes, for whatever reason, it fails to connect to the remote server or to read the served HTML fully, and upon refresh the problem goes away. So I am OK with the script refreshing itself to "cure" a maximum execution timeout error.
<?php // script name: scrape_script.php
ini_set('max_execution_time', 300);
register_shutdown_function('shutdown');

function shutdown()
{
    // only refresh when the script died on an error such as the
    // execution timeout, not on a normal exit
    $error = error_get_last();

    if ($error !== null && $error['type'] === E_ERROR) {
        ?><meta http-equiv="refresh" content="0; url=scrape_script.php"><?php
        // just do a meta refresh. Haven't tested with header location, but
        // this works fine.
    }
}
FYI, 300 seconds is not too long for the scraping script I'm running, which takes just a little less than that to extract the data from the kinds of pages I'm scraping. Sometimes it goes over by just a few seconds due to connection irregularities. Knowing that it's the connections that sometimes fail, rather than the script's processing, it's better not to increase the timeout, but rather just automatically refresh the page and try again.
I faced a similar problem and here was how I solved it:
<?php
function shutdown() {
    if (!is_null($error = error_get_last())) {
        if (strpos($error['message'], 'Maximum execution time') === false) {
            echo 'Other error: ' . print_r($error, true);
        } else {
            echo "Timeout!\n";
        }
    }
}

ini_set('display_errors', 0);
register_shutdown_function('shutdown');
set_time_limit(1);

echo "Starting...\n";
$i = 0;

while (++$i < 100000001) {
    if ($i % 100000 == 0) {
        echo ($i / 100000), "\n";
    }
}

echo "done.\n";
?>
This script, as is, is going to print Timeout! at the end.
You can modify the line $i = 0; to $i = 1 / 0; and it is going to print:
Other error: Array
(
    [type] => 2
    [message] => Division by zero
    [file] => /home/user/test.php
    [line] => 17
)
References:
PHP: register_shutdown_function - Manual
PHP: set_time_limit - Manual
PHP: error_get_last - Manual
Greetings,
I am writing some code inside a framework for PHP 5.3, and I am trying to catch all errors in a way that will allow me to fail gracefully on the client side and add a log entry at the same time. To be sure to also catch parse errors, I am using register_shutdown_function.
Here is the function that I register:
static function shutdown()
{
    if (is_null($e = error_get_last()) === FALSE) {
        if ($e["type"] == E_PARSE) {
            self::error($e["type"], $e["message"], $e["file"], $e["line"], array(self::$url));
        }
    }
}
The error method does two things:
It adds an error entry to a log file, using fopen in append mode.
It executes an error display: it explicitly sets the HTTP status code to 500 and displays a custom 500 error page. Some includes (which I do within a wrapper class, though it is only a plain include for now) are required from there.
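Roughly sketched (simplified; the log format and template path here are illustrative, built from the Logs::append and include wrapper shown further down), the error method does something like:

static function error($type, $message, $file, $line, $context = array())
{
    // 1. append an entry to the log file
    Logs::append("errors.log", "[" . date('Y-m-d H:i:s') . "] Error (Code " . $type . ") ~ " . $message . " {" . $file . ":" . $line . "}");

    // 2. explicitly set the HTTP code to 500 and display the custom error page
    header('HTTP/1.1 500 Internal Server Error');
    self::file("errors/500.php"); // hypothetical template path; self::file() is the include wrapper below
}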
For some reason, I can fopen my log file and append, but cannot do a simple include; it just silently dies from there.
Here is what the log outputs if I add a log entry for each include:
static public function file($file)
{
    if (class_exists("Logs")) {
        Logs::append("errors.log", $file . ":" . ((include $file) ? 1 : 0));
    } else {
        include $file;
    }
}

// Inside the Logs class...
static public function append($file, $message)
{
    if (!is_scalar($message)) {
        $message = print_r($message, true);
    }

    $fh = fopen(Config::getPath("LOGS") . "/" . $file, 'a');
    fwrite($fh, $message . "\r\n");
    fclose($fh);
}
Here is what the log file gives me:
/Users/mt/Documents/workspace/core/language.php:1
...
/Users/mt/Documents/workspace/index.php:1
/Users/mt/Documents/workspace/core/view.php:1
[2010-01-31 08:16:31] Parsing Error (Code 4) ~ syntax error, unexpected T_VARIABLE {/Users/mt/Documents/workspace/controllers/controller1.php:13}
After the parse error is hit, it does run the registered function, but as soon as it hits a new include file, it dies a silent death... Is there a way to circumvent this? Why would I be able to open a file for reading and writing, but not for inclusion?
Try to use Lagger
It would seem that it is related either to something in the config or to some build specifics.
I initially ran the code on Mac OS X, where it failed as described, but it runs fine on a compiled version of PHP under Ubuntu.
Which is kinda fine for me, but it still makes me wonder why it fails under OS X (XAMPP, to be more precise).