PDO Error Handling, is this ok? - php

I use PDO with try/catch so I can route errors into a user-defined function. I want to know if this method is acceptable.
The try/catch:
try {
    // ...
} catch (Exception $e) {
    // Process the error
    $msg = $e->getMessage();
    $timestamp = date("Y-m-d H:i:s");
    $line = $e->getLine();
    $code = $e->getCode();
    handle_error($msg, $timestamp, $line, $code);
    die("Oops! It looks like we got an error here. Try again!");
}
The error handling function:
function handle_error($msg, $timestamp, $line, $code) {
    $file = 'errorlog.txt';
    $data = "$timestamp // Error Message: $msg | Error Code: $code | Error Line: $line\n";
    file_put_contents($file, $data, FILE_APPEND);
}
Thanks

You could simplify the log writing with something like this:
catch (PDOException $e) {
    $my_error = "Failed: " . $e; // the exception converts to a full trace string
    error_log($my_error);
}
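One thing worth checking in either version: PDO only throws a PDOException for failed queries when its error mode is set to exceptions (the constructor itself always throws on a failed connection, and PHP 8 made exceptions the default). A minimal sketch, with placeholder DSN and credentials:

```php
<?php
// Placeholder DSN/credentials -- adjust for your environment.
try {
    $pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass', [
        // Without this (the pre-PHP 8 default is ERRMODE_SILENT), failed
        // queries return false instead of throwing, and the catch block
        // above is never reached.
        PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
    ]);
} catch (PDOException $e) {
    error_log('Connection failed: ' . $e->getMessage());
    die("Oops! It looks like we got an error here. Try again!");
}
```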

The only thing I can think of that could go wrong: if a user finds a way to trigger PDO errors on demand, they can effectively DoS the system by triggering them over and over, filling your hard drive with the same error. Not to mention it can use up all your disk I/O bandwidth while being exploited.
Then again, Apache does similar logging and I haven't heard of hard drives filling up even under DDoS attacks.
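If log flooding is a concern, one cheap mitigation is to stop appending once the log passes a size cap. A sketch; the function name and cap value here are arbitrary, and real deployments usually rotate logs (logrotate, or error_log() into the system log) rather than capping:

```php
<?php
// Hypothetical guard: skip the write once the log reaches $max_bytes,
// so a repeatable error can't fill the disk.
function handle_error_capped(string $msg, string $file = 'errorlog.txt',
                             int $max_bytes = 10485760): void {
    clearstatcache(true, $file); // filesize() results are cached per path
    if (file_exists($file) && filesize($file) >= $max_bytes) {
        return; // log is full; drop the entry (or rotate instead)
    }
    file_put_contents($file, date('Y-m-d H:i:s') . " // $msg\n", FILE_APPEND);
}
```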


Unable to catch PHP file_get_contents error using try catch block [duplicate]

I am trying to fetch an image using the file_get_contents function, but it gives an error. To handle the error I am using a try/catch block, but the error is not caught and the script fails.
My code:
try {
    $url = 'http://wxdex.ocm/pdd.jpg'; // dummy URL
    $file_content = file_get_contents($url);
} catch (Exception $e) {
    echo 'Error Caught';
}
Error:
Warning: file_get_contents(): php_network_getaddresses: getaddrinfo failed: No such host is known
Warning: file_get_contents(http://wxdex.ocm/pdd.jpg): failed to open stream: php_network_getaddresses: getaddrinfo failed: No such host is known.
NOTE: I am able to fetch any other valid remote image URL.
A try/catch doesn't work here because a warning is not an exception. With the following code you can catch warnings as well:
// Set your own error handler before the call.
// (The fifth $err_context parameter was removed in PHP 8, so it is omitted here.)
set_error_handler(function ($err_severity, $err_msg, $err_file, $err_line) {
    throw new ErrorException($err_msg, 0, $err_severity, $err_file, $err_line);
}, E_WARNING);
try {
    $url = 'http://wxdex.ocm/pdd.jpg';
    $file_content = file_get_contents($url);
} catch (Exception $e) {
    echo 'Error Caught';
}
// Restore the previous error handler.
restore_error_handler();
The following is an alternative: just check whether you got data back and throw the exception yourself if not. It is safer than installing a new error handler.
try {
    $url = 'http://wxdex.ocm/pdd.jpg';
    // @ suppresses the warning; on failure file_get_contents returns false
    $file_content = @file_get_contents($url);
    if ($file_content === false) {
        throw new Exception("failed to open stream", 1);
    } else {
        echo "File is loaded and content is there";
    }
} catch (Exception $e) {
    echo 'Error Caught';
}
Check that the URL exists first, using the get_headers function:
$url = 'http://wxdex.ocm/pdd.jpg';
$file_headers = @get_headers($url);
if (!$file_headers || $file_headers[0] == 'HTTP/1.1 404 Not Found' || trim($file_headers[0]) == 'HTTP/1.1 403 Forbidden') {
    $exists = false;
} else {
    $exists = true;
}
if ($exists === true) {
    $file_content = file_get_contents($url);
}

fgets() halting without error in Swiftmailer

I'm attempting to generate a test email in Laravel 4.2 (Windows 7, IIS 6.1), and I've encountered a silent termination - it just fails, doesn't return my view, and doesn't return an error or Exception. I've managed to brute force my way through the Laravel codebase and located the termination within Swift\Transport\AbstractSmtpTransport::_getFullResponse(), specifically the line $line = $this->_buffer->readLine($seq);:
protected function _getFullResponse($seq)
{
    $response = '';
    try {
        do {
            $line = $this->_buffer->readLine($seq);
            $response .= $line;
        } while (null !== $line && false !== $line && ' ' != $line{3});
    } catch (Swift_IoException $e) {
        $this->_throwException(
            new Swift_TransportException($e->getMessage())
        );
    } catch (Swift_TransportException $e) {
        $this->_throwException($e);
    }
    return $response;
}
That do loop executes twice. The first time $line is assigned the value * OK The Microsoft Exchange IMAP4 service is ready., which is great, as obviously I'm getting to the server. Unfortunately, the second iteration fails in Swift\Transport\StreamBuffer::readLine() at the line $line = fgets($this->_out); :
public function readLine($sequence)
{
    if (isset($this->_out) && !feof($this->_out)) {
        $line = fgets($this->_out);
        if (strlen($line) == 0) {
            $metas = stream_get_meta_data($this->_out);
            if ($metas['timed_out']) {
                throw new Swift_IoException(
                    'Connection to ' .
                    $this->_getReadConnectionDescription() .
                    ' Timed Out'
                );
            }
        }
        return $line;
    }
}
I've tried wrapping that line in a try/catch, and nothing happens, the code just halts with no information on the second iteration of the do loop. So, any advice as to a) how to squeeze more information out of the halt or b) what could cause fgets() to halt this way?
Ok, progress: after broadening my search to Swiftmailer in general, I was able to find mention of a timeout occurring at that particular line in Swiftmailer. By extending the max_execution_time in php.ini I was able to get an actual Exception:
Expected response code 220 but got code "", with message "* OK The Microsoft Exchange IMAP4 service is ready. * BYE Connection is closed. 13 "
I think under the circumstances I'll close this question and move onto figuring out why I'm not getting a 220.

get Fatal Error on eval()

I'm making a project that will execute PHP code contained in a string. Since I'm not using a separate PHP compiler or interpreter, I decided to use the eval() function. I'm using the CodeIgniter framework.
MY CODE
CONTROLLER
$code = $this->input->post('code');
$functionCall = $this->input->post('funCall');
$expOut = $this->input->post('expectedOutput');
$integrate = $code . " " . $functionCall;
ob_start();
eval($integrate);
$ret = ob_get_contents();
ob_end_clean();
if ($ret === $expOut) {
    // all work fine
} else {
    redirect('main/wrongCode');
}
}
Everything works fine, but when the output is a fatal error it never executes redirect('main/wrongCode').
Is there a way to catch the fatal error, so I can branch on it?
I think you could use PHP exception handling:
try {
    ob_start();
    eval($integrate);
    $ret = ob_get_contents();
    ob_end_clean();
} catch (Exception $e) {
    echo 'Caught exception: ', $e->getMessage(), "\n";
}
if ($ret === $expOut) {
    // Do sth
} else {
    // Do sth else
}
Have a look at PHP exceptions.
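One caveat with that answer: in PHP 5 a fatal error aborts the script and cannot be caught as an Exception at all. Since PHP 7, fatal and parse errors raised inside eval() are thrown as \Error and \ParseError, so catching \Throwable does work. A sketch (the invalid code string is just an example):

```php
<?php
// PHP 7+: fatal and parse errors inside eval() become throwables.
$integrate = 'function broken( {'; // deliberately invalid code

ob_start();
try {
    eval($integrate);
    $ret = ob_get_clean();
    // ...compare $ret with the expected output as before...
} catch (\Throwable $t) {           // catches both Exception and Error
    ob_end_clean();
    echo 'Caught: ' . get_class($t) . "\n"; // e.g. ParseError
}
```

On PHP 5 the usual workaround is register_shutdown_function() plus error_get_last() to detect that a fatal occurred.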

Stream FTP download to output

I am trying to stream/pipe a file to the user's browser through HTTP from FTP. That is, I am trying to print the contents of a file on an FTP server.
This is what I have so far:
public function echo_contents() {
    $file = fopen('php://output', 'w+');
    if (!$file) {
        throw new Exception('Unable to open output');
    }
    try {
        $this->ftp->get($this->path, $file);
    } catch (Exception $e) {
        fclose($file); // wtb finally
        throw $e;
    }
    fclose($file);
}
$this->ftp->get looks like this:
public function get($path, $stream) {
    ftp_fget($this->ftp, $stream, $path, FTP_BINARY); // Line 200
}
With this approach, I am only able to send small files to the user's browser. For larger files, nothing gets printed and I get a fatal error (readable from Apache logs):
PHP Fatal error: Allowed memory size of 16777216 bytes exhausted (tried to allocate 15994881 bytes) in /xxx/ftpconnection.php on line 200
I tried replacing php://output with php://stdout without success (nothing seems to be sent to the browser).
How can I efficiently download from FTP while sending that data to the browser at the same time?
Note: I would rather not use file_get_contents('ftp://user:pass@host:port/path/to/file'); or similar.
Found a solution!
Create a socket pair (anonymous pipe?). Use the non-blocking ftp_nb_fget function to write to one end of the pipe, and echo the other end of the pipe.
Tested to be fast (easily 10MB/s on a 100Mbps connection) so there's not much I/O overhead.
Be sure to clear any output buffers. Frameworks commonly buffer your output.
public function echo_contents() {
    /* FTP writes to [0]. Data passed through from [1]. */
    $sockets = stream_socket_pair(STREAM_PF_UNIX, STREAM_SOCK_STREAM, STREAM_IPPROTO_IP);
    if ($sockets === FALSE) {
        throw new Exception('Unable to create socket pair');
    }
    stream_set_write_buffer($sockets[0], 0);
    stream_set_timeout($sockets[1], 0);
    try {
        // $this->ftp is an FtpConnection
        $get = $this->ftp->get_non_blocking($this->path, $sockets[0]);
        while (!$get->is_finished()) {
            $contents = stream_get_contents($sockets[1]);
            if ($contents !== false) {
                echo $contents;
                flush();
            }
            $get->resume();
        }
        $contents = stream_get_contents($sockets[1]);
        if ($contents !== false) {
            echo $contents;
            flush();
        }
    } catch (Exception $e) {
        fclose($sockets[0]); // wtb finally
        fclose($sockets[1]);
        throw $e;
    }
    fclose($sockets[0]);
    fclose($sockets[1]);
}
// class FtpConnection
public function get_non_blocking($path, $stream) {
    // $this->ftp is the FTP resource returned by ftp_connect
    return new FtpNonBlockingRequest($this->ftp, $path, $stream);
}
/* TODO Error handling. */
class FtpNonBlockingRequest {
    protected $ftp = NULL;
    protected $status = NULL;

    public function __construct($ftp, $path, $stream) {
        $this->ftp = $ftp;
        $this->status = ftp_nb_fget($this->ftp, $stream, $path, FTP_BINARY);
    }

    public function is_finished() {
        return $this->status !== FTP_MOREDATA;
    }

    public function resume() {
        if ($this->is_finished()) {
            throw new BadMethodCallException('Cannot continue download; already finished');
        }
        $this->status = ftp_nb_continue($this->ftp);
    }
}
Try:
@readfile('ftp://username:password@host/path/file');
I find with a lot of file operations it's worthwhile letting the underlying OS functionality take care of it for you.
Sounds like you need to turn off output buffering for that page; otherwise PHP will try to fit it all in memory.
An easy way to do this is something like:
while (ob_end_clean()) {
    ; # do nothing
}
Put that ahead of your call to ->get(), and I think that will resolve your issue.
I know this is old, but some may still think it's useful.
I've tried your solution on a Windows environment, and it worked almost perfectly:
$conn_id = ftp_connect($host);
ftp_login($conn_id, $user, $pass) or die();
$sockets = stream_socket_pair(STREAM_PF_INET, STREAM_SOCK_STREAM, STREAM_IPPROTO_IP) or die();
stream_set_write_buffer($sockets[0], 0);
stream_set_timeout($sockets[1], 0);
set_time_limit(0);
$status = ftp_nb_fget($conn_id, $sockets[0], $filename, FTP_BINARY);
while ($status === FTP_MOREDATA) {
    echo stream_get_contents($sockets[1]);
    flush();
    $status = ftp_nb_continue($conn_id);
}
echo stream_get_contents($sockets[1]);
flush();
fclose($sockets[0]);
fclose($sockets[1]);
I used STREAM_PF_INET instead of STREAM_PF_UNIX because of Windows, and it worked flawlessly... until the last chunk, which was false for no apparent reason, and I couldn't understand why. So the output was missing the last part.
So I decided to use another approach:
$ctx = stream_context_create();
stream_context_set_params($ctx, array('notification' =>
    function($code, $sev, $message, $msgcode, $bytes, $length) {
        switch ($code) {
            case STREAM_NOTIFY_CONNECT:
                // Connection established
                break;
            case STREAM_NOTIFY_FILE_SIZE_IS:
                // Getting file size
                break;
            case STREAM_NOTIFY_PROGRESS:
                // Some bytes were transferred
                break;
            default: break;
        }
    }));
@readfile("ftp://$user:$pass@$host/$filename", false, $ctx);
This worked like a charm with PHP 5.4.5. The bad part is that you can't catch the transferred data, only the chunk size.
A quick search brought up PHP's flush().
this article might also be of interest: http://www.net2ftp.org/forums/viewtopic.php?id=3774
I've never hit this problem myself, so this is just a wild guess, but maybe changing the size of the output buffer for the "file" you are writing to could help? For that, see stream_set_write_buffer.
For instance:
$fp = fopen('php://output', 'w+');
stream_set_write_buffer($fp, 0);
With this, your code should use a non-buffered stream -- this might help...

Ignoring PHP error, while still printing a custom error message

So:
@fopen($file, 'r');
Ignores any errors and continues
fopen($file, 'r') or die("Unable to retrieve file");
Ignores error, kills program and prints a custom message
Is there an easy way to ignore errors from a function, print a custom error message and not kill the program?
Typically:
if (!($fp = @fopen($file, 'r'))) echo "Unable to retrieve file";
or using your way (which discards the file handle):
@fopen($file, 'r') or printf("Unable to retrieve file");
Use exceptions:
try {
    fopen($file, 'r');
} catch (Exception $e) {
    /* whatever you want to do in case of an error */
}
More information at http://php.net/manual/language.exceptions.php
slosd's way won't work: fopen doesn't throw an exception, so you have to throw it manually.
I'll modify your second example and combine it with slosd's:
try {
    if (!$f = fopen(...)) throw new Exception('Error opening file!');
} catch (Exception $e) {
    echo $e->getMessage() . ' ' . $e->getFile() . ' at line ' . $e->getLine();
}
echo ' ... and the code continues ...';
Here's my own solution. Note that it needs either a script-level global or a static variable of a class for easy reference. I wrote it class-style for reference, but so long as it can find the array it's fine.
class Controller {
    static $errors = array();
}

$handle = fopen($file, 'r') or array_push(Controller::$errors,
    "File \"{$file}\" could not be opened.");
// ...print the errors in your view
Instead of dying you could throw an exception and do the error handling centrally in whichever fashion you see fit :-)
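A minimal sketch of that idea (the path, helper name, and exception type here are just examples): check the call's return value, throw, and let a single catch block print the custom message while the script keeps running.

```php
<?php
// Convert fopen's warning into an exception we can handle centrally.
function open_or_fail(string $file) {
    $fp = @fopen($file, 'r'); // @ silences the warning; we throw our own error
    if ($fp === false) {
        throw new RuntimeException("Unable to retrieve file: $file");
    }
    return $fp;
}

try {
    $fp = open_or_fail('/no/such/file.txt'); // placeholder path
} catch (RuntimeException $e) {
    echo $e->getMessage(), "\n"; // custom message, program keeps running
}
echo "still running\n";
```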
