This might be a silly question. I have two separate databases being used with Laravel 4. One of them can only be accessed from a certain IP address (for security reasons), while the other is accessible from anywhere. I have two different MySQL connections. I have seen the database connection tested like this:
if(DB::connection('mysql')->getDatabaseName()){ }
To test what can and can't be reached, I gave the mysql connection a false password. I get a nice ugly error about how it can't connect. Is there a way to simply ignore a database if it cannot be reached? There's only one PHP class that calls the secure-only database on page load, but the above check doesn't seem to be working.
Going through the core code of Laravel, no specific exception is thrown when a database connection fails.
The solution, therefore, is:
try {
    // a non-empty database name casts to boolean true
    $dbConnected = (bool) DB::connection('mysql')->getDatabaseName();
} catch (Exception $e) {
    $dbConnected = false;
}
Then branch your code on the $dbConnected variable.
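For example, something along these lines, where some_table stands in for whatever the secure-only class actually queries:

if ($dbConnected) {
    // the restricted database is reachable, so query it as usual
    $rows = DB::connection('mysql')->table('some_table')->get();
} else {
    // skip the secure-only lookup and carry on rendering the page
    $rows = array();
}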
Related
We're developing a PHP application that connects to both a PostgreSQL server and an IBM i server running DB2. While the PDO connection to PostgreSQL works just fine, the connection to DB2 can only fetch from tables; trying to INSERT or DELETE results in the following error:
SQLSTATE[HY000]: General error: -7008 (SQLExecute[4294960288] at /build/php7.0-ltLrbJ/php7.0-7.0.33/ext/pdo_odbc/odbc_stmt.c:260)
This error happens in both our development and production environments. Both servers are Ubuntu (different versions, but not by much); I'm using the ODBC driver for PDO.
We tried connecting to other IBM i servers and with different users, but the exact same problem arises: we can SELECT, but not INSERT. Googling the error code doesn't yield any useful results, and as you can see, the error message itself is as unhelpful as can be. The code in the SQLExecute in particular doesn't appear anywhere, not even in a single result (there is one result from an IBM page, but it's actually for a different error code).
The code is pretty simple, but perhaps there is some obvious and glaring error there.
The test script:
include("DB2.php");
$oDAO = new DAO();
$res = $oDAO->ejecuta("INSERT INTO <Library>.<File> VALUES (1,0,1)");
The DAO:
class DAO {
    private $link;

    public function __construct() {
        // constructor: opens the connection
        $this->link = new PDO("odbc:DRIVER={IBM i Access ODBC Driver};SYSTEM=<System>;PROTOCOL=TCPIP",
            '<user>', '<pass>');
        $this->link->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
        $this->link->setAttribute(PDO::ATTR_EMULATE_PREPARES, false);
    }

    private function begin()    { $this->link->beginTransaction(); }
    private function rollback() { $this->link->rollBack(); }
    private function commit()   { $this->link->commit(); }

    public function ejecuta($query) {
        try {
            $this->begin();
            $oResult = $this->link->query($query);
            if ($oResult) {
                $bResult = true;
                $this->commit();
            } else {
                $bResult = false;
                $this->rollback();
            }
        } catch (Exception $e) {
            echo $e->getMessage();
            $bResult = false;
            $this->rollback();
        }
        return $bResult;
    }
}
Frankly, we're out of options, and I've already wasted two weeks on this. We just need to insert and delete records, so any help is welcome.
The symptoms you describe are consistent with attempting to modify the database under commitment control, but without journaling enabled.
There are three common ways to deal with this:
Turn journaling on. This is pretty extreme, since the folks who administer the database would have to do this, and if they've got journaling turned off, it's likely they either don't really know how to deal with journals, or don't want to. But it's the only practical way to have full commitment control on Db2 for i.
Connect with autocommit on. This will add an implicit commit to any database-modifying SQL statements executed with this connection. In my experience, this is the most common and convenient way to handle the situation.
Add WITH NC to each relevant SQL statement. In principle, this gives you statement-by-statement control over whether to suspend commitment control. In practice, if you are considering this in the first place, you probably don't have journaling enabled, and so you will have to add it to each and every database-modifying SQL statement. This is why most people gravitate toward option 2. A sketch of both approaches follows this list.
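As a rough illustration of options 2 and 3 against the question's DAO (CMT is, as far as I recall, the IBM i Access ODBC connection keyword for commitment control, with 0 meaning *NONE, i.e. commit immediately; double-check this against your driver's documentation):

// Option 2: request commit-immediate (*NONE) in the connection string.
// CMT=0 is the IBM i Access ODBC keyword for this; verify it before relying on it.
$link = new PDO("odbc:DRIVER={IBM i Access ODBC Driver};SYSTEM=<System>;PROTOCOL=TCPIP;CMT=0",
    '<user>', '<pass>');

// Option 3: suspend commitment control for a single statement with WITH NC.
$link->exec("INSERT INTO <Library>.<File> VALUES (1,0,1) WITH NC");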
I have a question regarding the way PHP handles exceptions, in the context of Laravel 5.
Here's a snippet of what I'm trying to accomplish:
try {
    // set up the web services
    saveData();
} catch (Exception $e) {
    Log::error('stuff went bananas');
}

function saveData() {
    // record stuff in the DB through Eloquent
    // in case some data is missing, connect to a second DB
    try {
        // connect to the second database
        // query it to get the missing data
    } catch (Exception $e) {
        // log the error again
    }
}
In other words, I'm calling web services that return data, and I'm then inserting that data into my own DB inside the model.
At this stage I check whether I already know the user ID I just received. If I don't, I have to connect to a second database to request that same user.
There are two issues, though:
the second database I'm trying to connect to isn't likely to be up
it will often not hold the information I need (I have to try again the next day)
There's nothing I can do about these issues. What I want is for my code to try to connect and, if that fails, log the error and keep inserting the data.
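Roughly the kind of thing I have in mind, as a sketch (the second_db connection name and users table are made up for illustration):

try {
    // the second database is often down or missing the record
    $user = DB::connection('second_db')
        ->table('users')
        ->where('id', $userId)
        ->first();
} catch (\Exception $e) {
    // log it and carry on inserting the rest of the data
    Log::warning('Second database unreachable: ' . $e->getMessage());
    $user = null;
}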
Thanks in advance.
During my PHP learning I have been trying to read up on best practices for error reporting and handling, but the advice varies from person to person, and I have struggled to come up with a clear, concise way of handling errors in my applications. I use exceptions for things that could go wrong, but for the most part it is hard for me to tell whether an exception should kill the application and display an error page, or just be caught and silently dealt with.
Something that seems to elude me is: is there such a thing as too much reporting? Every single time you call a function, something could go horribly wrong, meaning that if you were to check every single function call you would fill pages with if statements and have to work out what effect one failure has on the rest. Is there a concise document or philosophy for error reporting that could clear this up for me? Are there best practices? What are the best examples of good error handling?
Currently I do the following:
Add important event results to an array to be logged and emailed to me if a fatal error occurs
Display abstract/generic error messages for fatal errors.
Use exceptions for cases that are likely to fail
Turn on error reporting in a development environment and off for live environment
Validate all user input data
Sanitize invalid user input
Display concise, informative error messages to users without providing a platform for exploitation.
Exceptions are the only thing you haven't quite understood, IMHO. Exceptions are meant to cover things outside your control, and they are meant to be caught and dealt with from outside the scope they are thrown in. The try block has a specific scope: it should contain related actions. For example, take a database try/catch block:
$array = array();
try {
    // connect (throws an exception on failure)
    // query (throws an exception on failure)
    // fetch the results into $array
} catch (Exception $e) {
    // fall back to sensible defaults
    $array[0]['default'] = 'me';
    $array[0]['default2'] = '...';
}
As you can see, I put every database-related call inside the try block. If the connection fails, the query and the fetch are not performed, because they would make no sense without a connection. If the query fails, the fetch is skipped, because there are no results to fetch. And if anything goes wrong, I have an empty $array to deal with, so I can, for example, populate it with default data.
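A fleshed-out version of the same idea using PDO might look like this (the DSN, credentials, and table are placeholders):

$array = array();
try {
    // connecting throws a PDOException on failure
    $pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
    $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

    // so does querying
    $stmt = $pdo->query('SELECT name, email FROM users');

    // fetch the results into $array
    $array = $stmt->fetchAll(PDO::FETCH_ASSOC);
} catch (PDOException $e) {
    // anything above failed: fall back to default data
    $array[] = array('name' => 'me', 'email' => 'me@example.com');
}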
Using exceptions like:
$array = array();
try {
    if (!file_exists('file.php')) throw new Exception('file does not exist');
    include('file.php');
} catch (Exception $e) {
    trigger_error($e->getMessage());
}
makes no sense. It's just a longer version of:
if (!file_exists('file.php')) trigger_error('file does not exist');
include('file.php');
I have an intermittent bug that I'm trying to track down, and I'd like to capture only those MySQL queries that fail and result in a rollback. I don't want a full general query log or binary log, because there would be millions of entries in the haystack to sort through.
Something like this solution except for MySQL would be perfect.
TIA,
JD
Not a direct answer to your question, but the utility mysqlbinlog can extract data from the binary log.
See the user comments on this page: http://dev.mysql.com/doc/refman/5.5/en/binary-log.html
And this page: http://ronaldbradford.com/blog/mysql-dml-stats-per-table-2009-09-09/
Here's the official documentation for mysqlbinlog, which might help you get the info you need.
In MySQL this is very difficult (maybe impossible), but you can do it in PHP. If you don't use low-level functions like mysql_query() and instead use higher-level methods like ->query(), you can add the logic there: if a query fails (returns false, for example), add it to a log.
A note for Zend_Db:
class My_DB extends Zend_Db_Table_Abstract {
    public function insert(array $data) {
        try {
            return parent::insert($data);
        } catch (Exception $e) {
            // put $e->getMessage() into your log
        }
    }
}
You can override other methods, such as update, query, and so on, in the same way.
I've been developing a web application with PHP and MySQL. The other day, I made some changes to the database and adapted one page to the new table layout but not another page. I didn't test well enough, so the site went online with the error still in the other page. It was a simple MySQL error, and once one of my co-workers spotted it, it was a simple fix.
Now that it's happened, I'd like to know how I can catch other MySQL errors. I'm thinking about some sort of notification system that would send me an email when a mysql_query() call fails.
I understand, of course, that I wouldn't be notified until after the error occurred, but at least I would be notified immediately, rather than having my co-worker tell me after who-knows-how-many other people had run into the same fatal error.
Is there some sort of way to put in a hook, so that PHP automatically runs a function when an error happens? Or do I need to go through my code and add this functionality to every location where I use mysql_query()?
If the latter, do you have any recommendations on how to prevent something like this in the future? If that's the case, I'll probably create a class to abstract SQL operations. I know I should have been doing this the whole time... but I did organize my sets of functions into different include files, so at least I'm doing most things right. Right?
You could use a wrapper function like this:
function mysql_query_wrapper($query, $link = null)
{
    if (is_null($link)) {
        $result = mysql_query($query);
    } else {
        $result = mysql_query($query, $link);
    }
    if ($result === false) {
        // error occurred: mysql_error() holds the message for the last call,
        // so log it or email it to yourself here
        $error = is_null($link) ? mysql_error() : mysql_error($link);
    }
    return $result;
}
Then you just need to replace each mysql_query call with mysql_query_wrapper.
You can register a custom error-handling function using set_error_handler().
However, mysql_query() won't trigger an error; it will just return false. The errors only turn up afterwards, when you try to work with the results. In this case it might be better to define a custom wrapper function that calls mysql_query() and reports any errors using mysql_error(). That way, you can immediately halt your application on an error if you want to.
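A minimal sketch of such a wrapper (whether you halt, log, or email the message is up to you; the address is a placeholder):

function db_query($query)
{
    $result = mysql_query($query);
    if ($result === false) {
        // surface the problem right away instead of failing later on
        $message = 'MySQL error: ' . mysql_error() . ' in query: ' . $query;
        mail('you@example.com', 'MySQL error', $message); // or write it to a log
        trigger_error($message, E_USER_ERROR);            // halts execution
    }
    return $result;
}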