Best practice for making many connections with mysqli - PHP

I have been receiving a max_user_connections error these days, and I was wondering if I am doing something wrong.
I have a config.php file with the mysqli connection script:
$mysqli = new mysqli('localhost', 'my_user', 'my_password', 'my_db');
On pages where I need to get something with mysqli, I include config.php. Here is an example:
example.php
<?php
include_once("config.php");
$stmt = $mysqli->prepare("select...");
$stmt->execute();
$stmt->bind_result(...,...);
while($stmt->fetch()) {
...
}
$stmt->close();
?>
some html <p> <img>...
<?php
$stmt = $mysqli->prepare("select...");
$stmt->execute();
$stmt->bind_result(...,...);
while($stmt->fetch()) {
...
}
$stmt->close();
?>
some html <p> <img>...
<?php
$stmt = $mysqli->prepare("select...");
$stmt->execute();
$stmt->bind_result(...,...);
while($stmt->fetch()) {
...
}
$stmt->close();
?>
So, my question is: is it best practice to do selects like this? Should I close the mysqli connection after each select and open it again? Or should I do the selects together at the top, rather than separating them with HTML in the middle?

is it best practice to do selects like this?
I hate it when people use the term "best practice"; it's usually a good indicator that they don't know what they are talking about.
The advantage of your approach is that it's nice and simple. But as you've discovered, it does not scale well.
In practice it's quite hard to measure, but in most applications the MySQL connection sits unused for most of the script's lifecycle. Hence there are usually big wins to be made by delaying the opening of the connection and closing it as soon as possible.
The former can be done by decorating the mysqli class: the connect method just stores the credentials, while anything that needs to talk to the database calls the wrapped connect method when it actually needs access. However, the yield of this approach is typically low unless your code creates a lot of database connections which are never used (in which case a cheaper solution would be to increase the connection limit).
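A minimal sketch of such a lazy-connecting decorator (the LazyMysqli class and its method names are made up for illustration, not a standard API):
<?php
// Hypothetical lazy wrapper: the real mysqli connection is only opened the
// first time something actually needs the database.
class LazyMysqli
{
    private $mysqli = null;
    private $host;
    private $user;
    private $pass;
    private $db;

    public function __construct($host, $user, $pass, $db)
    {
        // Just store the credentials; no connection is made yet.
        $this->host = $host;
        $this->user = $user;
        $this->pass = $pass;
        $this->db = $db;
    }

    private function connect()
    {
        if ($this->mysqli === null) {
            $this->mysqli = new mysqli($this->host, $this->user, $this->pass, $this->db);
        }
        return $this->mysqli;
    }

    public function prepare($sql)
    {
        // Connects on demand, then delegates to the real mysqli object.
        return $this->connect()->prepare($sql);
    }

    public function close()
    {
        if ($this->mysqli !== null) {
            $this->mysqli->close();
            $this->mysqli = null;
        }
    }
}
?>
A page that only renders static HTML then never opens a connection at all, while pages that do query pay the connection cost only when the first prepare() call runs.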
It can take a long time after the last query is run before the connection is closed down. Explicitly closing the connection addresses this, but requires a lot of code changes.
Do not open and close a connection for each query. Although it will reduce the number of concurrent connections to the database, the performance will suck.
The biggest win usually comes from optimizing your queries - reduced concurrency and a better user experience.

Related

Why do we have to close the MySQL database after a query command?

I'm a beginner.
I want to know what will happen if we don't close the MySQL connection.
1- Is it possible to open more than one database if we don't close them? I mean, can we open more than one database at the same time?
2- Does closing the database increase speed?
3- Is it necessary to close the database, or is it optional?
Look at this code. I don't use "mysql_close()", so I don't close the database after each request. There are a lot of requests for this PHP page, maybe 50000 per minute. I want to know whether closing the database is necessary for this code or not.
<?php
//Include the file that lets us connect to the database.
include("database/connection.php");
//Call "connect" function to connect to the database.
connect("database", "localhost", "root", "", "user");
//The GPRS module sends a string to this page via the GET method, using a parameter named "variable".
$received_string = $_GET["variable"];
//Separate the data into an array.
$array_GPRS_data = explode(",", $received_string);
//we need to remove the first letter.
$array_GPRS_data[9] = substr($array_GPRS_data[9], 1);
$array_GPRS_data[13] = substr($array_GPRS_data[13], 4, 2).substr($array_GPRS_data[13], 2, 2).substr($array_GPRS_data[13], 0, 2);
//Query statement.
$query = "INSERT INTO $array_GPRS_data[17](signal_quality, balance, satellite_derived_time, satellite_fix_status, latitude_decimal_degrees,
latitude_hemisphere, longitude_decimal_degrees, longitude_hemisphere, speed, bearing, UTCdate, theChecksum)
VALUES('$array_GPRS_data[0]', '$array_GPRS_data[1]', '$array_GPRS_data[5]', '$array_GPRS_data[6]', '$array_GPRS_data[7]',
'$array_GPRS_data[8]', '$array_GPRS_data[9]', '$array_GPRS_data[10]', '$array_GPRS_data[11]', '$array_GPRS_data[12]', '$array_GPRS_data[13]',
'$array_GPRS_data[16]')";
//Run query.
$result = mysqli_query($query);
//Check whether the data was inserted into the database correctly.
if($result)
{
    echo("*#01");
}
else
{
    echo("Error: 001");
    echo(mysqli_error());
}
?>
Yes, you can have multiple database connections. You are not opening a database, you are opening a database connection. The database is 'open' (i.e. running) all of the time, generally speaking, whether you are connected to it or not.
Depends... if you only have one open connection on a page, then you don't need to close it because it will automatically close when PHP is done. If you have many, then you could potentially make the database server slower, or make the database server run out of available connections (it can only have a certain number of connections open at the same time). That said, most modern database servers can handle hundreds of concurrent connections.
Optional, but recommended. It's not a big deal for small-medium projects (i.e. if you have less than 100 concurrent visitors at any given time, you probably won't have any issues regardless). Since you have many thousand visitors per minute, you should actively close the database connection as soon as you are done with it, to free it up as soon as possible.
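For illustration, a rough sketch of closing early with plain mysqli (the connection parameters and the table name here are made-up placeholders, not the connect() helper from the question):
<?php
// Open the connection, do the one piece of work this request needs,
// then release the connection immediately rather than waiting for the
// script to end.
$link = mysqli_connect("localhost", "db_user", "db_password", "database");

$result = mysqli_query($link, "INSERT INTO gprs_log (speed) VALUES ('42')");

mysqli_close($link); // frees the slot for one of the other ~50000 requests per minute
?>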
Once you connect to the database it is not necessary to close it, as non-persistent connections are automatically closed at the end of script execution.

PHP - Best way to connect to database when there are multiple connections

I have just recently acquired the service side of a medium-size project. The former developer has all of his functions as separate PHP scripts instead of classes (func1.php, func2.php, etc.)... All these 'functions' reference mysqli_connect via the actual
'databaseconnection.php' file. This creates a new connection every time any of the scripts runs (every time I have to call a function), and I don't want to do that. I was thinking about having a persistent connection, but I'm worried about it getting out of hand as the project grows more and more every day. So, has anyone ever encountered a similar situation? What is the best way to handle my connection to the database? Any suggestions would be greatly appreciated.
From the docs for mysql_connect: "If a second call is made to mysql_connect() with the same arguments, no new link will be established, but instead, the link identifier of the already opened link will be returned."
EDIT: I'm sorry, I thought you wanted connectivity help. There is no way around it except to move all those "functions" into one file where the connection is shared by them only.
I create a con.php file where my PDO connection is established, then include that file anywhere you wish to use a connection. Here is the base for a PDO connection:
$PDO = new PDO("mysql:host=localhost;dbname=dbname", "user_name", "password");
Here are my notes on using the PDO object to make prepared queries. There is more than you need below, but good luck.
Within your PHP file that needs a connection:
1: include('con.php');
2: $datas = $PDO->prepare('SELECT * FROM table WHERE title LIKE :searchquery');
// The prepare method creates and returns a PDOStatement object ( print_r($datas); ) which contains an execute() method.
// The PDOStatement object has its own methods, e.g. rowCount().
// $datas->bindValue(':searchquery', '%' . $search . '%', PDO::PARAM_STR);
// Optional - manually bind a value. See http://php.net/manual/en/pdostatement.bindparam.php
3: $datas->execute( array(':searchquery' => $searchquery . '%') );
// Pass in the values that need to be bound AND execute.
// There are 17 ways to "fetch" data with the PDO object.
4: $datas->fetchAll(PDO::FETCH_OBJ);
Close a PDO connection by nullifying the handle:
$PDO = null;
I think you'll be much better off using PDO as opposed to the old MySQL functions, e.g. mysql_connect. It's a much more robust interface.
Below is the basic code to do this:
$db_handle = new PDO("mysql:host=".$db_host.";dbname=".$db_name.";port=".$db_port."", $db_username, $db_password, $connect_options);
where $db_handle is the PDO object representing the database connection, $db_host is your hostname [usually localhost], $db_name is the name of your database, $db_port is the database port number [usually 3306], $db_username and $db_password are your database user access credentials, and $connect_options are optional driver-specific connection options.
To enable persistent connections you need to set the driver-specific connection option for it before opening the connection: $connect_options = array(PDO::ATTR_PERSISTENT => true); then execute the earlier database connection code.
You can get more information on this from the PHP Docs here: http://www.php.net/manual/en/pdo.construct.php and http://php.net/manual/en/pdo.connections.php.
Regarding creating persistent connections, I would suggest that you close every database connection you open at the end of your script (after all your database operations of course) by nullifying your database handle: $db_handle = NULL;. You should do this whether you opened a persistent connection or not. It sounds counter-intuitive, but I believe you should free up any database resources when your script is done.
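Putting the pieces above together, a minimal sketch (host, database name, and credentials are placeholders):
<?php
// Request a persistent connection via the driver option, use it, then
// release the handle explicitly when the script is done with the database.
$connect_options = array(PDO::ATTR_PERSISTENT => true);

$db_handle = new PDO(
    "mysql:host=localhost;dbname=my_db;port=3306",
    "db_username",
    "db_password",
    $connect_options
);

$stmt = $db_handle->query("SELECT 1"); // whatever queries the page needs
var_dump($stmt->fetchColumn());

$db_handle = NULL; // free the database resources as soon as you are done
?>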
The performance disadvantages of doing this [from my experience] are negligible for most applications. This is obviously an arguable assertion, and you may also find the following link helpful in further clarifying your strategy in this regard:
Persistent DB Connections - Yea or Nay?
Happy coding!
If you have a very complex project that would need a big budget to re-design, and you prefer a very simple alteration, then:
1) stay with mysqli_connect
2) move the database connection to the header of your script.
3) remove the database close() calls from those functions.
4) remove the connection link variables; they won't be needed for a single database.
5) close the database at the end of the footer.
This way, the database connection is established when your script starts, and after all queries it is closed in the footer. Your server can handle the connections without closing/re-opening by using the keepalive mechanism; the default keepalive value is typically 30 to 90 seconds.
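A rough sketch of that layout, assuming header.php and footer.php are the files included on every page (the file names and credentials are just examples):
<?php
// header.php - included at the top of every page: open the connection once.
$mysqli = mysqli_connect("localhost", "my_user", "my_password", "my_db");
?>

<?php
// ... the page scripts run all their queries against $mysqli here ...
?>

<?php
// footer.php - included at the bottom of every page: close the connection once.
mysqli_close($mysqli);
?>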

How can I get php pdo code to keep retrying to connect if there are too many open connections?

I have an issue, it has only cropped up now. I am on a shared web hosting plan that has a maximum of 10 concurrent database connections. The web app has dozens of queries, some pdo, some mysql_*.
Loading one page in particular peaks at 5-6 concurrent connections, meaning it takes only 2 users loading it at the same time to produce an error for one or both of them.
I know this is inefficient, and I'm sure I can cut that down quite a bit, but my idea at the moment is to move the PDO code into a function and just pass in a query string and an array of variables, then have it return an array (partly to tidy my code).
THE ACTUAL QUESTION:
How can I get this function to keep retrying until it manages to execute, and hold up the script that called it (and any script that might have called that one) until it manages to execute and return its data? I don't want things executing out of order; I am happy with code being delayed for a second or so during peak times.
Since someone will ask for code, here's what I do at the moment. I have this in a file on its own so I have a central place to change connection parameters. The if statement is merely to remove the need to continuously change the parameters when I switch from my test server to the live server.
$dbtype = "mysql";
$server_addr = $_SERVER['SERVER_ADDR'];
if ($server_addr == '192.168.1.10') {
    $dbhost = "localhost";
} else {
    $dbhost = "xxxxx.xxxxx.xxxxx.co.nz";
}
$dbname = "mydatabase";
$dbuser = "user";
$dbpass = "supersecretpassword";
I 'include' that file at the top of a function
include 'db_connection_params.php';
$pdo_conn = new PDO("mysql:host=$dbhost;dbname=$dbname", $dbuser, $dbpass);
then run commands like this all on the one connection
$sql = "select * from tbl_sub_cargo_cap where sub_model_sk = ?";
$capq = $pdo_conn->prepare($sql);
$capq->execute(array($sk_to_load));
while ($caprow = $capq->fetch(PDO::FETCH_ASSOC)) {
    //stuff
}
You shouldn't need 5-6 concurrent connections for a single page, each page should only really ever use 1 connection. I'd try to re-architect whatever part of your application is causing multiple connections on a single page.
However, you should be able to catch a PDOException when the connection fails (documentation on connection management), and then retry some number of times.
A quick example,
<?php
$retries = 3;
while ($retries > 0)
{
    try
    {
        $dbh = new PDO("mysql:host=localhost;dbname=blahblah", $user, $pass);
        // Do query, etc.
        $retries = 0;
    }
    catch (PDOException $e)
    {
        // Should probably check $e is a connection error, could be a query error!
        echo "Something went wrong, retrying...";
        $retries--;
        usleep(500000); // Wait 0.5s between retries (usleep takes microseconds).
    }
}
10 concurrent connections is A LOT. They can serve 10-15 online users easily.
It takes heavy effort to exhaust them.
So there is something wrong with your code.
There are 2 main reasons for it:
slow queries take too much time and thus serving one hit uses one mysql connection for too long.
multiple connections opened from every script.
The former has to be investigated, but the latter is simple:
Do not mix mysql_ and PDO in one script: you are opening 2 connections at a time.
When using PDO, open connection only once and then use it throughout your code.
Reducing the number of connections in one script is the only way to go.
If you have multiple instances of the PDO class in your code, you will need to add the timeout-handling code you want to every call. So heavy code rewriting is required anyway.
Replace these new instances with global $pdo; instead. It will take the same amount of effort, but it will be a permanent solution, not a temporary patch like the one you want.
Please be sensible.
PHP automatically closes all the connections at the end of the script; you don't have to care about closing them manually.
Having only one connection throughout one script is a common practice. It is used by ALL the developers around the world. You can use it without any doubts. Just use it.
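As an illustration of that single-connection pattern, a sketch that reuses the query from the question (the db.php file name and the loadCargoCap() function are made up for the example):
<?php
// db.php - open the one and only connection for this request.
include 'db_connection_params.php';
$pdo = new PDO("mysql:host=$dbhost;dbname=$dbname", $dbuser, $dbpass);
?>

<?php
// Anywhere else in the code, reuse that connection instead of creating a new PDO.
function loadCargoCap($sk_to_load)
{
    global $pdo; // the single shared connection
    $capq = $pdo->prepare("select * from tbl_sub_cargo_cap where sub_model_sk = ?");
    $capq->execute(array($sk_to_load));
    return $capq->fetchAll(PDO::FETCH_ASSOC);
}
?>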
If you have a transaction and want to log something in the database, you sometimes need 2 connections in one script.

Database locked while trying to access from PHP script

I am writing an Android app which communicates with a PHP backend. The backend db is SQLite 3. The problem is, I am getting this error intermittently PHP Warning: SQLite3::prepare(): Unable to prepare statement: 5, database is locked. I am opening a connection to the database in each PHP file and closing it when the script finishes. I think the problem is that one script locked the database file while writing to it and the second script was trying to access it, which failed. One way of avoiding this would be to share a connection between all of the php scripts. I was wondering if there is any other way of avoiding this?
Edit:
This is the first file:
<?php
$first = SQLite3::escapeString($_GET['first']);
$last = SQLite3::escapeString($_GET['last']);
$user = SQLite3::escapeString($_GET['user']);
$db = new SQLite3("database.db");
$insert = $db->prepare('INSERT INTO users VALUES(NULL,:user,:first,:last, 0 ,datetime())');
$insert->bindParam(':user', $user, SQLITE3_TEXT);
$insert->bindParam(':first', $first, SQLITE3_TEXT);
$insert->bindParam(':last', $last, SQLITE3_TEXT);
$insert->execute();
?>
Here is the second file:
<?php
$user = SQLite3::escapeString($_GET['user']);
$db = new SQLite3("database.db");
$checkquery = $db->prepare('SELECT allowed FROM users WHERE username=:user');
$checkquery->bindParam(':user', $user, SQLITE3_TEXT);
$results = $checkquery->execute();
$row = $results->fetchArray(SQLITE3_ASSOC);
print(json_encode($row['allowed']));
?>
First, when you are done with a resource you should always close it. In theory it will be closed when it is garbage-collected, but you can't depend on PHP doing that right away. I've seen a few databases (and other kinds of libraries for that matter) get locked up because I didn't explicitly release resources.
$db->close();
unset($db);
Second, Sqlite3 gives you a busy timeout. I'm not sure what the default is, but if you're willing to wait a few seconds for the lock to clear when you execute queries, you can say so. The timeout is in milliseconds.
$db->busyTimeout(5000);
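Put together with the second file from the question, a minimal sketch looks roughly like this:
<?php
$user = SQLite3::escapeString($_GET['user']);
$db = new SQLite3("database.db");
$db->busyTimeout(5000); // wait up to 5 seconds for a competing lock to clear

$checkquery = $db->prepare('SELECT allowed FROM users WHERE username=:user');
$checkquery->bindParam(':user', $user, SQLITE3_TEXT);
$results = $checkquery->execute();
$row = $results->fetchArray(SQLITE3_ASSOC);
print(json_encode($row['allowed']));

$db->close(); // release the database file as soon as the work is done
unset($db);
?>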
I was getting "database locked" all the time until I found out that some features of SQLite3 must be set with special SQL instructions (i.e. using the PRAGMA keyword). For instance, what apparently solved my problem with "database locked" was setting journal_mode to 'wal' (it defaults to 'delete', as stated here: https://www.sqlite.org/wal.html, see Activating And Configuring WAL Mode).
So basically what I had to do was create a connection to the database and set journal_mode with an SQL statement. Example:
<?php
$db = new SQLite3('/my/sqlite/file.sqlite3');
$db->busyTimeout(5000);
// WAL mode has better control over concurrency.
// Source: https://www.sqlite.org/wal.html
$db->exec('PRAGMA journal_mode = wal;');
?>
Hope that helps.

PDO let database stay open, or open and close when needed?

I have just discovered PDO and I'm very excited about it, but I have read a few tutorials on how to implement it, and they show me different ways of doing it.
So now I'm confused which way is the best.
Example 1: open the database once.
include("host.php"); //including the database connection
//random PDO mysql stuff here
Example 2: open and close the database when needed:
try {
    $dbh = new PDO(mysql stuff);
    $sql = "mysql stuff";
    foreach ($dbh->query($sql) as $row)
    {
        echo $row['something'];
    }
    /*** close the database connection ***/
    $dbh = null;
}
catch(PDOException $e)
{
    echo $e->getMessage();
}
Which is best? I would think example 2 is best, but there's much more code than in example 1.
Usually, there is significant time spent/lost when connecting, and you want to do it only once. Do not go closing a connection you need later on; it will only slow things down. You may consider closing a connection sooner if you are reaching the maximum connections limit, but that's more a hint that you should scale up than a permanent solution, IMHO.
