Store database (PDO) connection - PHP

I'm currently writing a PHP application and I noticed that my page loads kind of slow. It takes about 2 seconds (2.0515811443329 to be exact).
I've tracked down the bottleneck, and it's the part where I'm creating a PDO connection to my MySQL database.
My 'connect()' method doesn't do any exotic stuff. It simply looks like this:
public function connect ( $database, $host, $username, $password )
{
    try
    {
        $this->db = new \PDO("mysql:dbname=".$database.";host=".$host, $username, $password);
        if ( !$this->db )
        {
            throw new \Exception('Failed to connect to the database!');
        }
        $this->db->setAttribute(\PDO::ATTR_ERRMODE, \PDO::ERRMODE_EXCEPTION);
    }
    catch ( \Exception $e )
    {
        echo '<strong>Exception: </strong>'.$e->getMessage();
        return false;
    }
    return true;
}
So when I comment out the call to the 'connect()' method, my page loads in 0.035506010055542 seconds.
That's a huge difference. I can imagine that creating a connection to a database takes some time, but more than 1.5 seconds... I'm not sure if this is normal?
If it is normal for it to take that amount of time, is there a way to store the database connection? Like putting it in a session? Actually, as far as I know, storing it in a session isn't possible, but it would be the ideal solution: storing the connection somewhere until the user closes their browser.
Either way, is there a problem with my PDO / MySQL setup? And can I simply store the connection resource somehow, so that I don't have to reconnect to my database every time for every new page?
PS: I'm doing all of this on localhost (Windows).

You're probably making the connection with 'localhost' as the host. Try changing that to '127.0.0.1'. On Windows, 'localhost' is often resolved to the IPv6 address ::1 first, and if MySQL isn't listening on IPv6 the fallback to IPv4 can add a second or more to every connection. That should fix the problem.
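If that is the cause, only the host part of the DSN needs to change. A minimal sketch, with placeholder credentials and database name:
<?php
// Placeholder credentials for illustration only.
$username = 'user';
$password = 'password';

// Instead of 'localhost' (which Windows may resolve to ::1 first) ...
// $db = new \PDO("mysql:dbname=mydb;host=localhost", $username, $password);

// ... point the DSN at the IPv4 loopback address directly:
$db = new \PDO("mysql:dbname=mydb;host=127.0.0.1", $username, $password);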

You can create a persistent connection to the database using PDO. From the manual:
Many web applications will benefit from making persistent connections to database servers. Persistent connections are not closed at the end of the script, but are cached and re-used when another script requests a connection using the same credentials. The persistent connection cache allows you to avoid the overhead of establishing a new connection every time a script needs to talk to a database, resulting in a faster web application.
An example:
<?php
$dbh = new PDO('mysql:host=localhost;dbname=test', $user, $pass, array(
    PDO::ATTR_PERSISTENT => true
));
?>

Related

Force quicker timeout for remote MySQL connections?

Had an issue recently where a major fire at a datacentre destroyed one of our machines. While that server is now offline, all of the other servers on our network are still functional, but this has brought to light a problem that I'm hoping to find the answer to.
Is it possible to reduce the amount of time PDO will wait before reporting that the connection failed?
In my particular scenario there are other servers that rely on remote MySQL connections to retrieve data from the destroyed server in order to display certain things. I have measures in place to handle the situation when the requested data isn't supplied, but I have never had to deal with a machine being unreachable until now.
So right now I'm in the situation where my online servers are hanging on page load because the destroyed server is offline.
Is there any preference or setting that can be provided when making the initial remote connection that says, for example, 'return false without throwing a script-breaking error after 5 seconds'?
I call my connections through a db class:
protected static function getDB()
{
    static $db = null;

    if ($db === null) {
        $dbhost = DB_HOST;
        $dbuser = DB_USER;
        $dbpass = DB_PASS;
        $dbname = DB_NAME;

        try {
            $db = new PDO("mysql:host=$dbhost;dbname=$dbname;charset=utf8mb4",
                $dbuser, $dbpass);
            $db->setAttribute(PDO::ATTR_EMULATE_PREPARES, false);
            $db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
        } catch (PDOException $e) {
            echo $e->getMessage();
        }
    }

    return $db;
}
Like this...
$db = static::getDB();
I'm presuming that I can simply return false in the catch block instead of echoing the error message, which answers half of my question if true, but is it possible to speed up the time it waits before throwing the exception when a machine is unreachable?
EDIT
I found this and tried it...
$db->setAttribute(PDO::ATTR_TIMEOUT, '5');
... but it makes no difference to the timeout whatsoever. Does this setting only relate to local connections, maybe?
Just found the answer myself...
At first I found this and added it, but it made NO difference whatsoever to the time taken before bombing out in my situation; maybe it works for others though?
$db->setAttribute(PDO::ATTR_TIMEOUT, '5');
But the answer that DID work for me was dropping this in right before the database connection:
ini_set("default_socket_timeout", 5);
Hope it helps somebody else.
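Combining that with the getDB() method from the question, a rough sketch could look like this; the 5-second value and returning false from the catch block are just examples, and the same DB_* constants are assumed:
protected static function getDB()
{
    static $db = null;

    if ($db === null) {
        // Cap how long PHP waits on an unreachable host before giving up.
        ini_set("default_socket_timeout", 5);

        try {
            $db = new PDO("mysql:host=" . DB_HOST . ";dbname=" . DB_NAME . ";charset=utf8mb4",
                DB_USER, DB_PASS);
            $db->setAttribute(PDO::ATTR_EMULATE_PREPARES, false);
            $db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
        } catch (PDOException $e) {
            // Return false instead of echoing, so callers can degrade gracefully.
            return false;
        }
    }

    return $db;
}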

"Could not connect. Too many connections" Error in MySQLi

I have the following code
function openDBConn($params){
    $conn_mode = $params['conn_mode'];
    $db_conn = $params['db_conn'];

    //create connections
    if(empty($db_conn->info)) {
        $db_conn = new mysqli("localhost", $user, $password, "database");
        $db_conn->set_charset('UTF8');
        $mysqli_error = $db_conn->connect_error;
    }

    if($mysqli_error !== NULL){
        die('Could not connect <br/>'. $mysqli_error);
    }else{
        return $db_conn;
    }
}

//close db connection
function closeDBConn( $params ){
    $db_conn = $params['db_conn'];
    $db_conn->close;
}
//used as below
$db_conn = openDBConn();
save_new_post( $post_txt, $db_conn );
closeDBConn( array('db_conn'=>$db_conn));
From time to time, I get the "Could not connect. Too many connections" error.
This tends to happen when Googlebot is scanning my website.
This problem seems to have started ever since upgrading from MySQL to MySQLi.
Is there any advice on how to ensure all connections are closed?
Thanks
You need to increase the number of connections your MySQL server allows (the default is only 100, and typically each page load consumes one connection).
Edit /etc/my.cnf
max_connections = 250
Then restart MySQL
service mysqld restart
http://major.io/2007/01/24/increase-mysql-connection-limit/
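If you want to see the configured limit and the current usage before changing anything, a quick sketch like this (placeholder credentials, no error handling) can be run against the server:
<?php
// Compare the configured limit with the number of currently open connections.
$conn = new mysqli("localhost", "user", "password", "database");

$limit = $conn->query("SHOW VARIABLES LIKE 'max_connections'")->fetch_assoc();
$used  = $conn->query("SHOW STATUS LIKE 'Threads_connected'")->fetch_assoc();

echo "max_connections: " . $limit['Value'] . ", currently connected: " . $used['Value'];
$conn->close();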
Some hosts have a hard limit on how many open database connections you are allowed to have. You may want to contact your host to find out how many you are allowed to open. For websites with high traffic, more connections can be helpful.
Do you have access to the server directly or is it a hosted solution?
If you have direct access, you can check the MySQL config files to see how many connections are allowed and increase the limit.
If you don't, you might want to contact your web host about increasing the limit and see if they will comply.

Avoiding multiple attempts to mysql_connect

File: Config.php
<?php
require 'inc.database.php';

// Check if there is already a connection. If not, then connect to the database.
if(!$IsConnected){
    $Database = new Database();
    $Database->connect("localhost", "aih786_raheel", "raheel786", "aih786_basicblog");
    $IsConnected = TRUE;
}
?>
I'm using my config file on every page because on every page I need to have my database object. The thing I want to clear up is whether this approach lets me avoid multiple attempts to connect to the database, as it is not good practice to make the same connection again and again.
Let's say I have a login page, which is the first page of my CMS. The connection will be opened on the login page, and when I move to the dashboard.php page I require the config.php file in that page too... so this way it won't create the connection and object again.
Please tell me whether this is the right approach to achieve my goal, and also whether it will give me access to the $Database object. I'm not sure if we can use the object on different pages once it has been created on the first page.
A very rudimentary approach would be to define a function that returns a database connection on-demand, e.g.:
function getDefaultDatabaseConnection()
{
    $db = new Database;
    $db->connect(...);
    return $db;
}
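To avoid reconnecting when the function is called more than once during the same page load, the connection can be cached in a static variable. A sketch along those lines, reusing the Database class from the question with placeholder credentials:
require 'inc.database.php';   // defines the Database class, as in the question

function getDefaultDatabaseConnection()
{
    static $db = null;   // kept only for the rest of the current request

    if ($db === null) {
        $db = new Database();
        $db->connect("localhost", "user", "password", "database_name");   // placeholder credentials
    }

    return $db;
}
Note that this only avoids duplicate connections within a single page load; each new request still opens its own connection, because PHP objects do not survive between requests.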
Usually, I try to fire up one connection per page load that needs it.
If I have the proper variable already stored in SESSION variables, then oftentimes it is not necessary to fire one up.
Given that, I do consider it proper form to drop the connection object at the end of the script that called it.
And Jack is right, I use a function to fire up the connection.
function dbConnect_readOnly() {
    $host = "127.0.0.1";
    $user = "*********";
    $password = "********";
    $dbname = "**********";

    try {
        $DBH = new PDO("mysql:host=$host;dbname=$dbname", $user, $password);
        $DBH->setAttribute( PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION );
    }
    catch(PDOException $e) {
        echo "Unable to connect to database.";
        file_put_contents('PDOErrors.txt', $e->getMessage(), FILE_APPEND);
    }

    return $DBH;
}
and to close:
function dbClose_connection($DBH) {
    $DBH = null;
}
Include the script at the top of every page that needs connectivity, just after you check for session variables.
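As a usage sketch, a page might then look something like this (the include file name, table, and session variable are made up for illustration):
<?php
session_start();                      // check session variables first
require_once 'db_functions.php';      // hypothetical file holding the two functions above

$DBH = dbConnect_readOnly();

$stmt = $DBH->prepare("SELECT id, title FROM posts WHERE author_id = ?");
$stmt->execute(array($_SESSION['user_id']));
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);

// ... render the page ...

dbClose_connection($DBH);
$DBH = null;                          // also drop the caller's reference so the connection is released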

PHP - Best way to connect to database when there are multiple connections

I have just recently acquired the service side of a medium-sized project. The former developer has all of his functions as separate PHP scripts instead of classes (func1.php, func2.php, etc.). All these 'functions' reference mysqli_connect via the actual 'databaseconnection.php' file. This creates a new connection every time any of the scripts runs (every time I have to call a function), and I don't want to do that. I was thinking about having a persistent connection, but I'm worried about it getting out of hand as the project grows more and more every day. So, has anyone ever encountered a similar situation? What is the best way to handle my connection to the database? Any suggestions would be greatly appreciated.
From the docs for mysql_connect: "If a second call is made to mysql_connect() with the same arguments, no new link will be established, but instead, the link identifier of the already opened link will be returned."
EDIT: I'm sorry I thought you wanted connectivity help. There is no way except to move all those "functions" into one file where the connection is for them only.
I create a con.php file where my PDO connection is established, then include that file anywhere you wish to use a connection. Here is the base for a PDO connection:
$PDO = new PDO("mysql:host=localhost;dbname=dbname", "user_name", "password");
Here are my notes on using the PDO object to make prepared queries. There is more below than you need, but good luck.
Within your PHP file that needs a connection:
include('con.php');

// prepare() creates and returns a PDOStatement object ( print_r($datas); ) which contains an execute() method.
// A PDOStatement object has its own methods too, e.g. rowCount().
$datas = $PDO->prepare("SELECT * FROM table WHERE title LIKE :searchquery");

// Optional - manually bind the value instead. See http://php.net/manual/en/pdostatement.bindparam.php
// $datas->bindValue(':searchquery', $searchquery . '%');

// Pass in the values that need to be bound AND EXECUTE.
$datas->execute( array(':searchquery' => $searchquery . '%') );

// There are 17 ways to "fetch" data with the PDO object; one of them:
$datas->fetchAll(PDO::FETCH_OBJ);
Close a PDO connection by nulling the handle:
$PDO = null;
I think you'll be much better off using PDO as opposed to the old MySQL functions (e.g. mysql_connect); it's a much more robust interface.
Below is the basic code to do this:
$db_handle = new PDO("mysql:host=".$db_host.";dbname=".$db_name.";port=".$db_port."", $db_username, $db_password, $connect_options);
where $db_handle is the PDO object representing the database connection, $db_host is your hostname [usually localhost], $db_name is the name of your database, $db_port is the database port number [usually 3306], $db_username and $db_password are your database user access credentials, and $connect_options are optional driver-specific connection options.
To enable persistent connections you need to set the driver-specific connection option for it before opening the connection: $connect_options = array(PDO::ATTR_PERSISTENT => true); then execute the earlier database connection code.
You can get more information on this from the PHP Docs here: http://www.php.net/manual/en/pdo.construct.php and http://php.net/manual/en/pdo.connections.php.
Regarding creating persistent connections, I would suggest that you close every database connection you open at the end of your script (after all your database operations of course) by nullifying your database handle: $db_handle = NULL;. You should do this whether you opened a persistent connection or not. It sounds counter-intuitive, but I believe you should free up any database resources when your script is done.
The performance disadvantages of doing this [from my experience] are negligible for most applications. This is obviously an arguable assertion, and you may also find the following link helpful in further clarifying your strategy in this regard:
Persistent DB Connections - Yea or Nay?
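Putting the pieces above together, a minimal sketch with placeholder connection details might be:
<?php
// Placeholder connection details for illustration.
$db_host = "localhost";
$db_name = "mydatabase";
$db_port = 3306;
$db_username = "user";
$db_password = "password";

// Driver-specific option requesting a persistent connection.
$connect_options = array(PDO::ATTR_PERSISTENT => true);

$db_handle = new PDO("mysql:host=$db_host;dbname=$db_name;port=$db_port",
    $db_username, $db_password, $connect_options);

// ... run your queries ...

// Release the handle when the script is done, persistent or not.
$db_handle = NULL;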
Happy coding!
If you have a very complex project, redesigning it would need a big budget, and you prefer a very simple alteration, then:
1) stay with mysqli_connect.
2) move the database connection to the header of your script.
3) remove the database close() calls from those functions.
4) remove the connection link variables; they won't be needed for a single database.
5) close the database connection at the end of the footer.
This way, the database connection is established when your script starts and, after all queries have run, it is closed in the footer. Your server can handle the connections without closing and re-opening them by using keep-alive; the default keep-alive value is typically 30 to 90 seconds.
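A rough sketch of that layout, with placeholder credentials and a made-up query, might look like this:
<?php
// header.php - open the connection once at the top of every page.
$link = mysqli_connect("localhost", "user", "password", "database");
if (!$link) {
    die("Could not connect: " . mysqli_connect_error());
}

// individual page code - every query reuses the single shared $link.
$result = mysqli_query($link, "SELECT id, title FROM posts");

// footer.php - close the connection once, after the last query.
mysqli_close($link);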

How can I get php pdo code to keep retrying to connect if there are too many open connections?

I have an issue that has only cropped up now. I am on a shared web hosting plan that has a maximum of 10 concurrent database connections. The web app has dozens of queries, some PDO, some mysql_*.
Loading one page in particular peaks at 5-6 concurrent connections, meaning it takes a minimum of 2 users loading it at the same time to spit an error at one or both of them.
I know this is inefficient and I'm sure I can cut that down quite a bit, but my idea at the moment is to move the PDO code into a function and just pass in a query string and an array of variables, then have it return an array (partly to tidy my code).
THE ACTUAL QUESTION:
How can I get this function to keep retrying until it manages to execute, and hold up the script that called it (and any script that might have called that one) until it manages to execute and return its data? I don't want things executing out of order; I am happy with code being delayed for a second or so during peak times.
Since someone will ask for code, here's what I do at the moment. I have this in a file on its own so I have a central place to change connection parameters. The if statement is merely to remove the need to continuously change the parameters when I switch between my test server and the live server.
$dbtype = "mysql";
$server_addr = $_SERVER['SERVER_ADDR'];
if ($server_addr == '192.168.1.10') {
    $dbhost = "localhost";
} else {
    $dbhost = "xxxxx.xxxxx.xxxxx.co.nz";
}
$dbname = "mydatabase";
$dbuser = "user";
$dbpass = "supersecretpassword";
I 'include' that file at the top of a function
include 'db_connection_params.php';
$pdo_conn = new PDO("mysql:host=$dbhost;dbname=$dbname", $dbuser, $dbpass);
then run commands like this, all on the one connection:
$sql = "select * from tbl_sub_cargo_cap where sub_model_sk = ?";
$capq = $pdo_conn->prepare($sql);
$capq->execute(array($sk_to_load));
while ($caprow = $capq->fetch(PDO::FETCH_ASSOC)) {
    //stuff
}
You shouldn't need 5-6 concurrent connections for a single page; each page should really only ever use 1 connection. I'd try to re-architect whatever part of your application is causing multiple connections on a single page.
However, you should be able to catch a PDOException when the connection fails (see the documentation on connection management) and then retry some number of times.
A quick example:
<?php
$retries = 3;
while ($retries > 0)
{
    try
    {
        $dbh = new PDO("mysql:host=localhost;dbname=blahblah", $user, $pass);
        // Do query, etc.
        $retries = 0;
    }
    catch (PDOException $e)
    {
        // Should probably check $e is a connection error, could be a query error!
        echo "Something went wrong, retrying...";
        $retries--;
        usleep(500000); // Wait 0.5s between retries (usleep() takes microseconds).
    }
}
10 concurrent connections is A LOT. It can serve 10-15 online users easily.
It takes heavy effort to exhaust them.
So there is something wrong with your code.
There are 2 main reasons for it:
slow queries take too much time, and thus serving one hit holds one MySQL connection for too long.
multiple connections opened from every script.
The former has to be investigated, but the latter is simple:
Do not mix mysql_* and PDO in one script: you are opening 2 connections at a time.
When using PDO, open the connection only once and then use it throughout your code.
Reducing the number of connections in one script is the only way to go.
If you have multiple instances of the PDO class in your code, you will need to add the timeout-handling code you want to every call, so heavy code rewriting is required anyway.
Replace those new instances with a single global $pdo; instead (see the sketch at the end of this answer). It will take the same amount of time, but it will be a permanent solution, not a temporary patch like the one you are asking for.
Please be sensible.
PHP automatically closes all the connections at the end of the script; you don't have to care about closing them manually.
Having only one connection throughout one script is a common practice. It is used by ALL the developers around the world. You can use it without any doubts. Just use it.
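As an illustration only (the settings table and helper function are made up), a single shared PDO instance reused through global might look like:
<?php
// Created once, near the top of the request (e.g. in a shared include).
$pdo = new PDO("mysql:host=localhost;dbname=mydatabase;charset=utf8mb4", "user", "password");
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Any function that needs the database reuses the same connection.
function get_setting($name)
{
    global $pdo;
    $stmt = $pdo->prepare("SELECT value FROM settings WHERE name = ?");
    $stmt->execute(array($name));
    return $stmt->fetchColumn();
}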
If you have a transaction open and want to log something in the database, you sometimes need 2 connections in one script.
