Fatal error in PHP

Is there a way to make the code continue (not exit) when you get a fatal error in PHP?
For example, I get a timeout fatal error, and whenever it happens I want to skip that task and continue with the others.
In this case the script exits.

There is a hack using output buffering that will let you log certain fatal errors, but there's no way to continue a script after a fatal error occurs - that's what makes it fatal!
If your script is timing out you can use set_time_limit() to give it more time to execute.
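For instance, a minimal sketch of that suggestion (the 300-second value, the $tasks array and the process() function are just placeholders, not anything from the question):
<?php
set_time_limit(300);            // allow up to 5 minutes for the whole request
// ... long-running work ...

// Or reset the counter inside a loop so each task gets a fresh budget;
// set_time_limit() restarts the timeout counter from zero each time it is called.
foreach ($tasks as $task) {     // $tasks and process() are hypothetical placeholders
    set_time_limit(60);
    process($task);
}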

"Fatal Error", as it's name indicates, is Fatal : it stop the execution of the script / program.
If you are using PHP to generate web pages and get a Fatal error related to max_execution_time which, by defaults, equals 30 seconds, you are certainly doing something that really takes too mych time : users won't probably wait for so long to get the page.
If you are using PHP to do some heavy calculations, not in a webpage (but via CLI, or a cron, or stuff like that), you can set another (greater) value for max_execution_time.
You have two ways of doing that:
The first is to modify php.ini to set this value (it's already in the file; just edit the property's value). The problem is that this also changes it for the web server, which is bad (this limit is a security measure, after all).
The better way is to create a copy of php.ini called, for instance, phpcli.ini, and modify that file. Then use it when invoking php:
php -c phpcli.ini myscript.php
This works great if you have many properties to configure for CLI execution (like memory_limit, which often has to be set to a higher value for long-running batches).
The other way is to define a different value for max_execution_time when you invoke php, like this:
php -d max_execution_time=60 myscript.php
This is great if you launch this via the crontab, for instance.

It depends on the exact error type. You can catch errors by creating your own error handler. See the documentation on set_error_handler(), but not all types of errors can be caught. Look at the timeout error you get and see what type it is. If it is one of E_ERROR, E_PARSE, E_CORE_ERROR, E_CORE_WARNING, E_COMPILE_ERROR or E_COMPILE_WARNING then you cannot catch it with an error handler. If it is another type then you can: catch it with the error handler and simply return.
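A minimal sketch of that approach (the file path is just an example): install a handler for the catchable types, log the error and return true so the script keeps going; truly fatal types such as E_ERROR never reach it.
<?php
set_error_handler(function ($errno, $errstr, $errfile, $errline) {
    error_log("Caught error [$errno] $errstr in $errfile on line $errline");
    return true;                      // handled: PHP's internal handler is skipped
});

fopen('/no/such/file', 'r');          // raises an E_WARNING, which goes to the handler
echo "still running\n";               // execution simply continues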

If you have a suitable PHP version (PHP>=5.2 for error_get_last) you can try the technique described here which uses register_shutdown_function and error_get_last.
This won't allow you to "continue" when you get a fatal error, but it at least allows you to log the error (and perhaps send a warning email) before displaying a custom error page to the user.
It works something like this:
function fatalErrorHandler()
{
    $lastError = error_get_last();
    if (isset($lastError["type"]) && $lastError["type"] == E_ERROR) {
        // do something with the fatal error
    }
}
// ...
register_shutdown_function('fatalErrorHandler');
A few points:
you can use ob_clean() to remove any content that was generated prior to the fatal error.
it's a really bad idea to do anything too intensive in the shutdown handler; this technique is about graceful failure rather than recovery.
whatever you do, don't try to log the error to a database ... what if it was a database timeout that caused the fatal error?
for some reason I've had problems getting this technique to work 100% of the time when developing in Windows using WAMP.

The simplest answer I can give you is this function: http://php.net/manual/en/function.pcntl-fork.php
In more detail, what you can do is:
Fork the part of the process you think might or might not cause a fatal error (i.e. the bulk of your code)
The script that forks the process should be a very simple script.
For example this is something that I would do with a job queue that I have:
<?php
// ... load stuff regarding the job queue
while ($job = $queue->getJob()) {
    $pid = pcntl_fork();
    switch ($pid) {
        case -1:
            echo "Fork failed";
            break;
        case 0:
            // do your stuff here
            echo "Child finished working";
            exit(0); // the child must exit here, or it would keep looping and forking itself
        default:
            echo "Waiting for child...";
            pcntl_wait($status);
            // check the status using other pcntl_* functions if you want
            break;
    }
}

Is there a way then to limit the execution time of a function but not the whole script?
For example
function blabla()
{
    return "yes";
}
to make it so that if it does not finish within 25 seconds it returns "no" instead?
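One rough way to do that (not an answer given in this thread) is to run the function in a forked child process and kill the child when the deadline passes. The runWithTimeout() helper below is made up for illustration and needs the pcntl and posix extensions, so it only works on the CLI:
<?php
function runWithTimeout(callable $fn, int $seconds)
{
    $pid = pcntl_fork();
    if ($pid === -1) {
        return "no";                         // fork failed, treat as failure
    }
    if ($pid === 0) {                        // child: do the work, then exit
        $fn();
        exit(0);
    }
    $start = time();
    do {                                     // parent: poll the child until the deadline
        if (pcntl_waitpid($pid, $status, WNOHANG) > 0) {
            return "yes";                    // child finished in time
        }
        usleep(100000);                      // wait 0.1 s between checks
    } while (time() - $start < $seconds);

    posix_kill($pid, SIGKILL);               // deadline passed: kill the child
    pcntl_waitpid($pid, $status);            // reap the killed child
    return "no";
}

echo runWithTimeout(function () { sleep(30); }, 25);   // prints "no"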


Throwing exception from custom error handler in PHP

I was working on resource streams and more specifically on fopen().
This function emits a warning when it fails, in addition to returning false instead of a resource. The unwanted warnings being a problem, I decided to suppress them.
Two possibilities came to mind: using the error-suppression operator @ or using set_error_handler(). Having been told that @ performed poorly and often caused more problems than it solved, I ran a quick benchmark to see how set_error_handler() fared against it.
And here comes the problematic code:
<?php
error_reporting(E_ALL);

function errorHandler(int $errorNumber, string $errorMessage)
{
    throw new \Exception();
}

$previousHandler = set_error_handler("errorHandler");

$operations = 10000;
for ($i = 0; $i < $operations; $i++) {
    try {
        $inexistant[0];
    } catch (\Exception $e) {}
}

set_error_handler($previousHandler);
echo 'ok';
Running this simple code will crash the apache server with this message:
[mpm_winnt:notice] [pid 6000:tid 244] AH00428: Parent: child process 3904 exited with status 3221225725 -- Restarting.
After searching, this message means that the server had an access violation error, mainly in the case of reaching the stack size limit. This however should not be the case, as this code should not increase the stack size (and in fact, it doesn't increase the PHP stack frames).
I also tested whether the timing was important, but even with a sleep of 3ms between each iteration the crash happens after more or less the same number of iterations. This number is around 700 but fluctuates very slightly, sometimes running fine at 704 and sometimes not.
Also, searching the PHP bug tracker doesn't show anything relevant, except maybe for this bug entry which talks about the fact that there is handling around the call to the handler function. This could mean there is a possibility the exception bypasses some handling on the exit of the function, but as I don't know a thing about the PHP source code, this is pure speculation.
As I would like to propagate the error message properly, the way using set_error_handler() would be the most legible, but I know I can use error_get_last() and the @ operator to achieve the same goal with a lot more code (as there are multiple functions like fopen() called one after another in the real project).
So here are the questions: Is this a bug of PHP? Is there a way to circumvent this problem while keeping clear code?
Thanks.
PS: I know the reason for the benchmark is... dubious at best and I should take whatever code is the most legible as long as the performances are viable, but it still made me discover this interesting point of code.
Edit: I forgot to put the versions I tested this against:
Windows 7, Apache 2.4.38, PHP 7.3.2 via XAMPP
Windows 7, Apache 2.4.29, PHP 7.2.2 via XAMPP
Ubuntu Server 18.04, Apache/2.4.29 (Ubuntu), PHP 7.2.15-0ubuntu0.18.04.1
Why go through all this hassle? It would be a lot better to just handle the return value of false, since the docs state: Returns a file pointer resource on success, or FALSE on error. Therefore it should be sufficient to just check if the return value of fopen is false and then continue operation based on that (or throw your own error if necessary).
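A minimal sketch of that return-value check, with a hypothetical file name and without any custom error handler:
<?php
$handle = @fopen('data.txt', 'r');      // suppress the warning, keep the return value
if ($handle === false) {
    // react to the failure yourself: log it, throw, or fall back
    throw new RuntimeException('Could not open data.txt for reading');
}
// ... work with $handle ...
fclose($handle);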
As this has been confirmed as a PHP bug and the exact reason has been found, I'll post both the answer and a way to work around it until the bug is fixed.
First, here is the bug report: https://bugs.php.net/bug.php?id=77693.
It is a stack overflow caused by the capture of the exception context, which in this case includes the function calling the error handler, since capturing the parent context is part of its behavior. This parent context includes the previously caught exception, which is added to the context of the new exception being thrown, and so on ad vitam aeternam until a crash.
As the reason is clearly determined, the solution is simple: Just add an unset() to the end of the catch block, like so:
$operations = 10000;
for ($i = 0; $i < $operations; $i++) {
    try {
        $inexistant[0];
    } catch (\Exception $e) {
        unset($e);
    }
}
Then there is no more problem.
To address the second question, asking for a clean alternative in the case of fopen() and other functions like it, here is a solution:
function throwLastError() {
    $context = error_get_last();
    error_clear_last();
    throw new ErrorException($context["message"],
                             0,
                             $context["type"],
                             $context["file"],
                             $context["line"]);
}

// Wrong call to fopen
if (!@fopen("", "a"))
    throwLastError();
Both approaches have around the same performance, the @ method being 10% slower when all the error parameters are used in both.

How to stop PHP code execution?

Is there a way to immediately stop PHP code execution?
I am aware of exit but it clearly states:
Terminates execution of the script. Shutdown functions and object destructors will always be executed even if exit is called.
So what I want to achieve is to stop the PHP code execution exactly when I call exit or whatever.
Any help?
Edit: After Jenson's answer
Trial 1:
function newExit() {
    __halt_compiler();
}
echo "start";
newExit();
echo "you should not see this";
Shows Fatal error: __HALT_COMPILER() can only be used from the outermost scope, which was pretty expected.
Trial 2:
function newExit() {
    include 'e.php';
}
echo "start";
newExit();
echo "you should not see this";
e.php just contains __halt_compiler();
This shows startyou should not see this
Edit: Why I want to do this?
I am working on an application that includes a proprietary library (required through a virtual host config file to which I don't have access) that comes as encrypted code. It is a sort of monitoring library for security purposes. One of its behaviours is that it registers some shutdown functions that log the instance status (it saves stats to a database).
What I want to do is disable this logging under some specific conditions based on the remote IP.
Please see the following information from user Pekka 웃
According to the manual, destructors are executed even if the script gets terminated using die() or exit():
The destructor will be called even if script execution is stopped using exit(). Calling exit() in a destructor will prevent the remaining shutdown routines from executing.
According to this PHP: destructor vs register_shutdown_function, the destructor does not get executed when PHP's execution time limit is reached (Confirmed on Apache 2, PHP 5.2 on Windows 7).
The destructor also does not get executed when the script terminates because the memory limit was reached. (Just tested)
The destructor does get executed on fatal errors (Just tested) Update: The OP can't confirm this - there seem to be fatal errors where things are different
It does not get executed on parse errors (because the whole script won't be interpreted)
The destructor will certainly not be executed if the server process crashes or some other exception out of PHP's control occurs.
Referenced in this question
Are there any instances when the destructor in PHP is NOT called?
What's wrong with return?
echo "you will see this";
return;
echo "you will not see this";
You can use the __halt_compiler() function, which will halt the compiler execution:
http://www.php.net/manual/en/function.halt-compiler.php
You could try to kill the PHP process:
exec('kill -9 ' . getmypid());
Apart from the obvious die() and exit(), this also works:
<?php
echo "start";
__halt_compiler();
echo "you should not see this";
?>
I'm not sure you understand what "exit" states
Terminates execution of the script. Shutdown functions and object destructors will always be executed even if exit is called.
It's normal for it to do that: it must clear the memory of all the variables and functions you called before. Not doing this would mean that memory would remain stuck and occupied in your RAM, and if this happened several times you would need to reboot and flush your RAM in order to have any left.
or try
trigger_error('Die', E_USER_ERROR);

PHP how to trigger user error with trigger_error in an object destructor while the script shuts down?

While implementing some class I've run into a little problem:
When the script ends and destructors are called because the script has finished, I wanted to occasionally trigger an error.
I thought the trigger_error() function would be of use. However, even with error_reporting(-1), the triggered error is no longer sent to STDOUT or STDERR - while it is expected to be (e.g. outside the __destruct/termination phase of the script, trigger_error() works as expected).
If I echo some message out, it will be sent to STDOUT (CLI mode).
I now wonder
how I can trigger an error in this phase of the application?
and/or alternatively how can I detect that currently the script is shutting down because it has successfully ended?
Note: I tested connection_status() but it's useless in my case, as it's about connection handling only and mostly unrelated. I wonder if there is some function that does the same for the script's execution status (starting, running, exiting).
Simplified Example Code
This is some very reduced example code to illustrate the issue. Naturally, the error is only triggered if it makes sense for the object:
<?php
class TriggerTest
{
    public function __destruct()
    {
        trigger_error('You should have missed something.');
    }
}

$obj = new TriggerTest;
exit();
The problem is that trigger_error() gets executed but the error does not appear anywhere.
How about if you force the error reporting to be a certain setting, trigger the error and then set the error reporting back to its normal setting?
Answer: Just do it. I had some misconfiguration for the error handler and therefore it did not work. My fault.
However, it would still be interesting to know whether there is any function or similar mechanism to determine the execution state on shutdown.
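For reference, a rough sketch of that force-and-restore idea applied to the example class above (the E_USER_WARNING level and the restore checks are my own choices, not from the original answer):
<?php
class TriggerTest
{
    public function __destruct()
    {
        $oldLevel   = error_reporting(E_ALL);          // force full reporting
        $oldDisplay = ini_set('display_errors', '1');  // make sure it is displayed
        trigger_error('You should have missed something.', E_USER_WARNING);
        if ($oldDisplay !== false) {
            ini_set('display_errors', $oldDisplay);    // restore the previous values
        }
        error_reporting($oldLevel);
    }
}

$obj = new TriggerTest;
exit();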

eval() and PHP errors

I have an eval() call like this:
if (FALSE === @eval($code)) echo 'your code has php errors';
So if the code has syntax errors it will return that message.
The problem is that if within the code you have something like:
require_once('missing_file.php');
it will just break the page, without my nice error message :(
Is there any workaround for this?
Well, first I hope that $code comes from a trusted source and that you're not executing arbitrary code sent by the users.
Second, the only way I see you can workaround that is to save $code into a file, run it with the command line PHP interpreter, and check the exit value. Note that passing this test doesn't make $code fatal error free, it just so happened that this particular execution of the script did not throw any fatal error; there may be other code paths that trigger such an error.
This is because once eval triggers a fatal error, it can't be recovered and the script dies. eval only returns FALSE if there is a parsing error.
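A rough sketch of that save-and-run workaround (the temp-file handling and the 2>&1 redirect are my own choices; note that PHP_BINARY only points at the CLI executable when the checking script itself runs on the CLI):
<?php
$code = 'require_once("missing_file.php");';       // the snippet that would kill eval()

$file = tempnam(sys_get_temp_dir(), 'evalcheck');
file_put_contents($file, "<?php\n" . $code);       // eval() code has no opening tag

exec(PHP_BINARY . ' ' . escapeshellarg($file) . ' 2>&1', $output, $exitCode);
unlink($file);

if ($exitCode !== 0) {
    echo 'your code has php errors';               // parse error or fatal error in $code
}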

Displaying custom error page in PHP for errors which can't be caught by set_error_handler

I would like to be able to discard a partially rendered page and show an error page in PHP.
I already know about set_error_handler(), but it can only trap certain types of errors. I would like to know how to show an error page when an error type which can't be trapped by set_error_handler() is raised.
Unfortunately, it seems that the following code, when run with PHP 5.3.2 on Apache 2.2, doesn't do what I would expect it to do:
<?php
// Start the output buffer
ob_start();
// Output something into the buffer.
// I only want this to be displayed if I call one of the
// ob_flush functions or echo the buffer myself later.
echo "yep";
// Call a function I know not to exist in order to raise
// an error which cannot be trapped by set_error_handler()
// and would, if display_errors was On, output "Fatal
// error: Call to undefined function fwee()..."
function_which_does_not_exist();
// This will never be executed.
$out = ob_get_clean();
The output of the script is:
yep
Whereas I would expect it to output nothing (or spew error info and only error info if display_errors is on).
I have confirmed using LiveHTTPHeaders that PHP 5.3.2 does send a 500 error to the browser when display_errors is off (and a 200 when it's on) using the version of apache supplied by MacPorts, but it only ever spits 200s when using PHP 5.3.1 on XAMPP.
I tried setting ErrorDocument 500 "test" in the apache configuration (confirmed to be working by doing the same for 404) but PHP never shows the custom error, even when the entire contents of the script is just header('HTTP/1.1 500 Internal Server Error');
I'm not sure what else to do to make sure a partially rendered page is replaced with a simple error.
I can also confirm that this happens in the Yii framework. If I edit the view for the "about" page in the blog demo to have a line which reads <?php echo function_which_does_not_exist() ?>, I get a partially rendered page.
You could pass ob_start() the name of a callback function that is executed before the output is flushed by ob_get_clean().
This callback function seems to be executed even if an error occurred on the page.
This way you could do something like this:
<?php
$endReached = 0;

function outpu_cb($buffer) {
    global $endReached;
    if ($endReached) return $buffer;
    else return 'Your error message';
}

// Start the output buffer
ob_start('outpu_cb');

// Output something into the buffer.
// I only want this to be displayed if I call one of the
// ob_flush functions or echo the buffer myself later.
echo "yep";

// Call a function I know not to exist in order to raise
// an error which cannot be trapped by set_error_handler()
// and would, if display_errors was On, output "Fatal
// error: Call to undefined function fwee()..."
function_which_does_not_exist();

// This will never be executed.
$endReached = 1;
echo ob_get_clean();
?>
I think the only right way to do this is by using correct output buffering; then you don't have to rely on specific webserver or browser behaviour.
Best you'd use an MVC framework to handle this for you. All output is buffered until all systems are go, so when an error occurs you can take another route, clear the current buffer and display some nice error message.
You can also use the ob_*() family of functions.
You have to call ob_start() as the very first thing in your script (well, before any output is generated)
Install an error handler to catch errors
When an error occurs, clean the buffer and re-route your app logic to display some nice user-friendly error message (a rough sketch follows below)
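A compact sketch of those three steps (none of the names below come from a framework; catchable errors go through the handler, while the shutdown function picks up the fatal ones the handler never sees):
<?php
ob_start();

set_error_handler(function ($errno, $errstr) {
    ob_clean();                               // drop the half-rendered page
    echo 'Sorry, something went wrong.';      // user-friendly message instead
    exit;                                     // in real code you would filter by $errno first
});

register_shutdown_function(function () {
    $e = error_get_last();
    if ($e !== null && $e['type'] === E_ERROR) {   // a fatal the handler never saw
        ob_clean();
        echo 'Sorry, something went wrong.';
    }
    // PHP flushes the remaining buffer after the shutdown functions have run
});

echo "yep";
function_which_does_not_exist();              // the fatal error from the question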
If you're talking about E_FATAL or other such errors, yes, you can catch them with a custom error handler using set_error_handler().
All you need to add is a shutdown function.
// Set the error handler
set_error_handler(array('error', 'handler'));

// Catch E_FATAL errors too!
register_shutdown_function(array('error', 'catch_fatal'));

// Set the exception handler
set_exception_handler(array('error', 'exception'));

// Manually return a new exception
function catch_fatal()
{
    if ($e = error_get_last()) {
        Error::exception(new ErrorException($e['message'], $e['type'], 0, $e['file'], $e['line']));
    }
}
Take a look at http://micromvc.com or http://kohanaphp.com/ to see how it's done.
An old question, but for the record I'd suggest avoiding this issue rather than handling it.
My own approach is to build the response in a response object rather than echoing it as you go along, and only echo the output once the full response has been processed without error. This requires a template system that parses your template and builds your response into a string, in contrast to a classic PHP template which echoes output from your placeholders.
This way you entirely avoid PHP's crufty management of the output buffer in error states.
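As a toy illustration of that idea on PHP 7+ (simplified to a plain string rather than a full response object; the renderPage() function is made up), where an undefined-function call surfaces as a catchable \Error:
<?php
function renderPage(): string
{
    $body  = "<h1>About</h1>";
    $body .= function_which_does_not_exist();   // throws \Error in PHP 7+
    return $body;
}

try {
    $response = renderPage();                   // nothing has been echoed yet
} catch (\Throwable $e) {
    $response = 'Sorry, something went wrong.'; // swap in the error page instead
}

echo $response;                                 // the only echo in the whole script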
