Should my database driver classes support replication (PHP)?

I'm currently writing a PHP application and the drivers (classes) for the database engines. I was wondering: do I need to write replication support (master-slave)? I'm a bit new to this, so what kinds of things should my project or classes worry about if I want to support load balancing/replication? Oh, and this is about MySQL.

The way we use our master-slave DB is to use the master for all "active usage" and the slave for all reporting (where it doesn't matter if the data is still "catching up" slightly). Depending on your needs, you could have all data manipulation occur on the master and all data reading occur on the slave. This especially helps when you have blocking inserts or updates. (Note: also consider MySQL's INSERT DELAYED syntax where possible, which helps avoid blocking too.)
As far as the PHP support for this goes, all you really need is clean handling of multiple (two) database connections, and to use the master (read/write) or slave (read-only) connection as appropriate.
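A minimal sketch of that two-connection setup with PDO (host names, credentials and the table are placeholders, not from the question):

$master = new PDO('mysql:host=master.db.internal;dbname=app', 'user', 'secret'); // read/write
$slave  = new PDO('mysql:host=slave.db.internal;dbname=app',  'user', 'secret'); // read-only use

// Writes (and anything that must be fully up to date) go to the master...
$stmt = $master->prepare('INSERT INTO orders (customer_id, total) VALUES (?, ?)');
$stmt->execute(array(42, '19.99'));

// ...while reporting queries, which may lag slightly behind, go to the slave.
$report = $slave->query('SELECT customer_id, SUM(total) AS spent FROM orders GROUP BY customer_id')
                ->fetchAll(PDO::FETCH_ASSOC);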

If you think you will use the slaves to read and the master to write, then your class needs to support at least two connections at once.
I'll show you the API I used; if you choose to go that way, I can send you the class.
ShusterDb::getInstance('read')->select($sql); //makes sure this is a SELECT in the method.
ShusterDb::getInstance('write')->scalar($sql);
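For what it's worth, a class exposing that getInstance('read') / getInstance('write') style of API could look roughly like the sketch below. This is only my guess at the shape, not the actual ShusterDb code; the DSNs and credentials are placeholders.

class ReplicatedDb {  // hypothetical stand-in for ShusterDb
    private static $instances = array();
    private $pdo;

    private function __construct($dsn, $user, $pass) {
        $this->pdo = new PDO($dsn, $user, $pass);
    }

    public static function getInstance($role) {
        if (!isset(self::$instances[$role])) {
            // 'write' -> master, 'read' -> slave; DSNs are placeholders.
            $conf = ($role === 'write')
                ? array('mysql:host=master;dbname=app', 'user', 'secret')
                : array('mysql:host=slave;dbname=app',  'user', 'secret');
            self::$instances[$role] = new self($conf[0], $conf[1], $conf[2]);
        }
        return self::$instances[$role];
    }

    public function select($sql) {
        // Enforces that only SELECTs run on this handle, as the comment above describes.
        if (stripos(ltrim($sql), 'SELECT') !== 0) {
            throw new InvalidArgumentException('select() only accepts SELECT statements');
        }
        return $this->pdo->query($sql)->fetchAll(PDO::FETCH_ASSOC);
    }

    public function scalar($sql) {
        // Returns a single value (first column of the first row).
        return $this->pdo->query($sql)->fetchColumn();
    }
}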

Itay, if you are open to sending your class, I would be interested in seeing / possibly using it.

PDO SELECT from SLAVE and INSERT into MASTER

Is there any way to configure PDO so that SELECTs are executed on the SLAVE DB server and INSERT, UPDATE and DELETE are executed on the MASTER DB server, or do I need to create a PHP handler to do that?
Situation:
We have Master - Master replication for MySQL. We are going to add two new servers, so it will be Master/Slave - Master/Slave.
I want to create some handling for SELECT queries: I want to execute SELECT queries on the SLAVE instead of the MASTER, and all UPDATE, INSERT and DELETE queries will be executed on the MASTER. Is this possible with some setting?
Thanks!
No, you can't configure PDO or any of PHP's database extensions to do this. That is simply because each PDO (or MySQLi, etc.) instance represents a single connection, to a single server.
So yes, you'll need a handler that is aware of multiple connections to do that. Some popular ORMs and other database-abstraction layers do provide such functionality.
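To illustrate, a thin wrapper that owns both connections and routes by statement type might look like this sketch (DSNs and credentials are placeholders, and the SELECT detection is deliberately naive):

class ReadWriteRouter {
    private $master;
    private $slave;

    public function __construct(PDO $master, PDO $slave) {
        $this->master = $master;
        $this->slave  = $slave;
    }

    public function query($sql, array $params = array()) {
        // Naive routing: statements starting with SELECT go to the slave, the rest to the master.
        $target = (stripos(ltrim($sql), 'SELECT') === 0) ? $this->slave : $this->master;
        $stmt = $target->prepare($sql);
        $stmt->execute($params);
        return $stmt;
    }
}

$router = new ReadWriteRouter(
    new PDO('mysql:host=master;dbname=app', 'user', 'secret'),
    new PDO('mysql:host=slave;dbname=app',  'user', 'secret')
);
$router->query('UPDATE posts SET views = views + 1 WHERE id = ?', array(42));       // master
$rows = $router->query('SELECT * FROM posts WHERE id = ?', array(42))->fetchAll();  // slave

See the next answer, though, for why blindly routing every SELECT to the slave can bite you.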
I recommend not doing it even if you could. Replication is "asynchronous". That is, when you insert into the Master, there is no assurance that it will arrive at the Slave before you try to read it. Nor even any guarantee that it will arrive today!
If your user posts a comment on a blog and then goes to a page that shows the comment, they will be annoyed if the comment does not show. They may assume the comment was lost and repost it, which causes you grief when users complain about double-posting.
This is called a "critical read". The simple way to avoid the mess is to be careful about what you send to the Slaves -- namely, nothing that would lead to "disappearing" posts.
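One simple way to act on that advice (my sketch, not the answer's code): remember when the current session has just written, and send such "critical reads" to the master instead of the slave. Assume $master and $slave are PDO connections as in the router sketch above; the session-flag name and the values are made up.

session_start();

// The write itself always goes to the master.
$stmt = $master->prepare('INSERT INTO comments (post_id, body) VALUES (?, ?)');
$stmt->execute(array(42, 'Nice post!'));
$_SESSION['just_wrote'] = true;   // this session should read its own data from the master

// "Critical read": the user's own data comes from the master, everything else from the slave.
$db = !empty($_SESSION['just_wrote']) ? $master : $slave;
$comments = $db->query('SELECT body FROM comments WHERE post_id = 42')
               ->fetchAll(PDO::FETCH_ASSOC);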
There are various "proxy" packages that allow for the read-write split you describe; some try to avoid the "critical read" problem, but I don't trust them.
A Galera Cluster (see PXC, MariaDB) does synchronous replication, so it can avoid the critical-read problem. (There is, however, a setting you need to apply.)

Simple logging to database class/toolkit for php?

We have a PHP/MySQL application and I want to set up a logging mechanism to log all finance-related actions into a table.
I was thinking of a simple file-based mechanism, but due to the risk of two instances conflicting with each other, I suppose a database-based mechanism would be better.
Can anyone recommend a class/toolkit that encapsulates simple logging mechanisms, but with a database backend?
Try Log4PHP; it's quite flexible and supports a wide range of storage backends for the log!
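If a full framework is more than you need, a minimal hand-rolled database logger is only a few lines. A sketch, assuming a log table you create yourself (the table and column names here are made up):

class DbLogger {
    private $stmt;

    public function __construct(PDO $pdo) {
        // Assumed table: financial_log(id, logged_at, level, message, context)
        $this->stmt = $pdo->prepare(
            'INSERT INTO financial_log (logged_at, level, message, context) VALUES (NOW(), ?, ?, ?)'
        );
    }

    public function log($level, $message, array $context = array()) {
        $this->stmt->execute(array($level, $message, json_encode($context)));
    }
}

$logger = new DbLogger(new PDO('mysql:host=localhost;dbname=app', 'user', 'secret'));
$logger->log('INFO', 'Invoice paid', array('invoice_id' => 1234, 'amount' => '99.00'));

Because every write is a single INSERT, two application instances logging at the same time won't conflict the way two processes appending to the same file can.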
I'm guessing that you want to log changes to data which is stored in a database.
Why not just use the DBMS's audit-tracking facilities?
(With MySQL, just set up a dummy slave to consume the replication logs.)

Accessing MySQL from PHP and another process at the same time

I'm writing a program that runs (24/7) on a Linux server and adds entries to a MySQL database.
The contents of the database are presented on a web interface with PHP and the user should be able to delete entries using the web interface.
Is it possible to access the database from multiple processes at the same time?
Yes, databases are designed for this purpose quite well. You'll want to keep a few things in mind in your designs:
Concurrency and race conditions on database writes.
Performance.
Separate database permissions for separate applications.
Unless you're doing something like accessing the DB using a singleton, the max number of simultaneous mysql connections php will use is limited in your php.ini. I believe it defaults to 100.
Yes, multiple users can access the database at the same time.
You should, however, take care that the data stays consistent.
If you create or edit an entry with many small SQL statements and someone uses the web interface in the meantime, this may lead to errors.
If you have a simple DB this should not be a problem; otherwise you should consider using transactions.
http://dev.mysql.com/doc/refman/5.0/en/ansi-diff-transactions.html
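For example, wrapping a multi-statement change in a transaction means the web interface can never see (or delete) a half-written entry. A sketch using PDO and InnoDB tables (table and column names are placeholders):

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$pdo->beginTransaction();
try {
    // Both statements succeed or neither does.
    $pdo->exec("INSERT INTO entries (title, created_at) VALUES ('example', NOW())");
    $entryId = $pdo->lastInsertId();
    $stmt = $pdo->prepare('INSERT INTO entry_details (entry_id, detail) VALUES (?, ?)');
    $stmt->execute(array($entryId, 'some detail'));
    $pdo->commit();
} catch (Exception $e) {
    $pdo->rollBack();
    throw $e;
}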
Yes, and there will not be any problems trying to delete records while that automated 24/7 program is running, provided you are using the InnoDB engine. This is because InnoDB transactions are isolated from one another, so the database is in a consistent state every time.
This answer How to implement the ACID model for a database has many relevant points.
Read about the ACID properties of a database. A MySQL database with the InnoDB engine will take care of all these things for you, so you need not worry about them.

In PHP/MySQL should I open multiple database connections or share 1?

I want to hear what others think about this. Currently, I make a MySQL database connection inside a header-type file that is then included at the top of every page of my site. I can then run as many queries as I want on that one open connection. If the page is built from 6 included files and there are 15 different MySQL queries, they all run on this one connection.
Now, sometimes I see classes that make multiple connections, e.g. one for each query.
Is there any benefit to using one method over the other? I think one connection is better than multiple, but I could be wrong.
Creating connections can be expensive (I didn't have a reference for this statement at first -- edit: aha, here it is), so the consensus seems to be to use fewer connections. Using a single connection for all queries on a single page seems a better choice than multiple connections.
In PHP+MySQL there is usually not much sense in using multiple connections per page (it is just slower and consumes a little more RAM).
The only case where it might be useful is when you alter connection parameters that could interfere with other pages (like the collation). But well-written PHP programs usually never do that kind of thing.
Also, it is a good idea to enable persistent connections, so that one MySQL connection is reused across multiple page executions.
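With PDO, a persistent connection is just a constructor option (mysqli has a similar "p:" host prefix); whether it actually helps depends on your server setup, so treat this as something to measure:

$pdo = new PDO(
    'mysql:host=localhost;dbname=app',        // placeholder DSN
    'user',
    'secret',
    array(PDO::ATTR_PERSISTENT => true)       // reuse the connection across requests where possible
);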
It really depends on the level of activity you expect the site to generate: if it's a high-traffic web site, you'll soon run out of connections (unless you raise MySQL's max connections to a stupidly high level, but that will eventually grind the server to a halt).
I'd generally recommend that the front end of a web site use a shared database object (a singleton is your friend), as it doesn't require a great deal of discipline to write with this in mind and you won't waste time making connections. If you require additional concurrent queries on the back end, it shouldn't be much of a problem, as that isn't likely to be a highly trafficked area.
It's not recommended to execute multiple small queries where the work can be done with just one: a single query can get data from multiple tables and even multiple databases. See the link below:
http://www.x-developer.com/php-scripts/sql-connecting-multiple-databases-in-a-single-query
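For instance, one connection and one query can join tables from two databases on the same MySQL server (the database, table and column names below are made up):

$pdo = new PDO('mysql:host=localhost', 'user', 'secret');
$rows = $pdo->query(
    'SELECT u.id, u.name, o.total
       FROM shop_users.users AS u
       JOIN shop_orders.orders AS o ON o.user_id = u.id'
)->fetchAll(PDO::FETCH_ASSOC);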
I don't see any benefit in using multiple connections; I'd rather say it is a sign of bad structure. These are the reasons I can think of against multiple connections:
You have to initialize the database connection multiple times. Setting connection properties on connection establishment (like SET NAMES utf8) would have to be done for every connection.
It is definitely slower than a single connection.
A non-technical reason: someone working with your code will most probably not expect it and might spend hours debugging connection properties they set on another connection.
Having a global connection object (or a class providing one) is the much better approach in PHP.
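A lazy, shared connection object along those lines might look like this sketch (class name, DSN and credentials are placeholders):

class Db {
    private static $pdo = null;

    public static function get() {
        if (self::$pdo === null) {
            // Connect once per request; every later call reuses the same handle,
            // and connection-level setup (error mode, SET NAMES) lives in one place.
            self::$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');
            self::$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
            self::$pdo->exec('SET NAMES utf8');
        }
        return self::$pdo;
    }
}

// Anywhere in the page:
$users = Db::get()->query('SELECT id, name FROM users')->fetchAll(PDO::FETCH_ASSOC);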
Are you sure the classes that make multiple connections aren't just returning a reference to the already-open connection when one exists? I've seen a lot of code structured that way. Performance-wise, it really is better to use only one connection per page.
