One user database, two applications on different servers - PHP

I currently run a Rails app on a single VPS with its own MySQL database. This database also contains the users table. We are currently building a second application which should share the same users: if someone registers for either application, they get an account for both.
However, the problem is that this second (PHP) application must be hosted at another location. What are the best practices here? Can I connect directly to the external database without introducing a big delay? Should I sync the databases? Build an API on the first server?
I'm looking for the most maintainable approach possible. Thank you.

You can allow the second host access to the first MySQL server. Best practice (as far as I'm aware) is to create a new user account, give it only the required privileges on the users table, allow access only from the IP or domain of the second host, and use a secure password. You would run this query on the MySQL server:
GRANT ALL ON mydatabase.users TO 'mynewuser'@'123.123.123.123'
IDENTIFIED BY 'mysecurepassword';
Needless to say, you would replace mydatabase.users with your database and table name, 'mynewuser' with the username you want to have access (you can put anything here; the user is created automatically), '123.123.123.123' with the IP or domain name of your second server, and 'mysecurepassword' with a good, long password, preferably randomly generated.
Your second server can now connect to the MySQL server with ALL privileges (change that to whatever privileges it actually needs) on the mydatabase.users table.
MySQL 5.6 GRANT Syntax
Greensql.com on MySQL remote access best practices
To minimize the small performance penalty, avoid creating multiple MySQL connections; use as few as possible, preferably just one. I'm not 100% sure on the following, but I'm fairly confident it also helps to reduce the number of separate queries you execute. Instead of running 10 separate inserts, run one insert with multiple VALUES groups, e.g.
INSERT INTO mytable (id, name) VALUES
(0, 'mia'),
(1, 'tom'),
(2, 'carl');
I'm sure MySQL prepared statements would also help considerably in reducing the speed penalty.
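To illustrate both ideas, here is a minimal PHP sketch: one shared PDO connection and one multi-row insert built as a prepared statement. The host, credentials and table names are just the placeholders from the examples above, not a prescription.
<?php
// Sketch: one shared PDO connection to the remote MySQL server.
$pdo = new PDO(
    'mysql:host=123.123.123.123;dbname=mydatabase',
    'mynewuser',
    'mysecurepassword',
    [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]
);

// One prepared statement with multiple VALUES groups instead of three inserts.
$rows = [[0, 'mia'], [1, 'tom'], [2, 'carl']];
$placeholders = implode(', ', array_fill(0, count($rows), '(?, ?)'));
$stmt = $pdo->prepare("INSERT INTO mytable (id, name) VALUES $placeholders");
$stmt->execute(array_merge(...$rows));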

You can connect to the database from the second application, but the connection delay will always be there.
You also have to consider security, since the database will be accessible over the network; you can restrict connections so that only the second web server is allowed in.
To reduce the delay, you have options like memcached or other caches so that you don't have to hit the database again and again.
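As a rough sketch of the caching idea, assuming the PHP Memcached extension and an existing $pdo connection (the key name and TTL are arbitrary choices):
<?php
// Sketch: cache a user row so repeated lookups skip the remote database.
$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);

$key = 'user:' . $userId;
$user = $mc->get($key);
if ($user === false) {                       // cache miss: hit the database once
    $stmt = $pdo->prepare('SELECT * FROM users WHERE id = ?');
    $stmt->execute([$userId]);
    $user = $stmt->fetch(PDO::FETCH_ASSOC);
    $mc->set($key, $user, 300);              // keep it for five minutes
}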
You can also set up MySQL replication: keep the main database as the master, install another MySQL instance on the second application server, and make it a slave.
The second server's MySQL syncs with the master, and all write queries (UPDATE, INSERT, DELETE etc.) should go to the master. This should also reduce the database load.
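A minimal my.cnf sketch of that setup; the server IDs and log name are assumptions, and the slave still needs a replication user and a CHANGE MASTER TO statement:
# master (first VPS)
[mysqld]
server-id = 1
log-bin   = mysql-bin

# slave (second application server)
[mysqld]
server-id = 2
read-only = 1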


Backup & Security Considerations for MySQL/PHP app in production

Some specific questions beyond this:
1) What are your most critical security considerations? I feel like I've secured the data in transit with encryption/https, secured data at rest by encrypting sensitive data that I don't need access to, set up a firewall in front of phpMyAdmin, changed root passwords, etc., but I'm not confident that I've 'checked all the boxes', so to speak. Is there a robust guide to securing MySQL/PHP applications out there somewhere? Perhaps a pen test is the only way to gain some degree of confidence?
2) What are your backup considerations? I've got a master/slave relationship set up for the MySQL database in two different datacenters, and weekly backups of the production server itself. The code is all in source control, but I have some uploaded documents that I'd lose if the whole thing crashed on day 6 after a backup. Any ideas on that one? I'm considering moving the document storage to a different server and backing it up nightly, or asynchronously saving each document to two separate servers on initial upload. They are not large docs and volume isn't high yet, but is that scalable?
I feel like I've secured the data in transit with encryption/https
It's not really clear from this how far you have gone, so forgive me if you have already addressed the following.
Obviously https secures the transmission of data between the client and your web server, but it should not be confused with an encrypted connection between your application and the database. Admittedly the risk of data being intercepted there is much lower, but it depends how sensitive the data is. Information on setting up encrypted database connections can be found here. As you are replicating between data centres, you should consider setting up replication to use encrypted connections as well.
Replication and backing up are not the same thing: once you have replicated the data from the master to the slave, you still need to back up the slave. You can use mysqldump, but the caveat is that mysqldump creates a logical backup, not a physical one, so restores are slow and it doesn't scale well. There's a good post here with solutions for making a physical backup instead.
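For example, a nightly dump taken against the slave might look like this (the user and database names are placeholders; --single-transaction avoids locking InnoDB tables while the dump runs):
mysqldump --single-transaction --quick -u backupuser -p mydatabase > nightly_backup.sql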
Apart from that there are the usual kind of security measures you should implement with any system:
Create and assign separate user accounts for each process, with the minimum level of access permissions needed to perform their function.
Create an admin account with a non-generic admin-type username.
Remove the root account and any others that are likely to get you pwned, like admin etc.
Only encrypt data if you need to be able to decrypt it again. Salt and hash sensitive data which is only needed for validation (passwords etc.) and save the resulting value in the database; validate user input against the hash (a short PHP sketch follows below).
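A minimal PHP sketch of the salt-and-hash point, assuming PHP 5.5+ where password_hash() and password_verify() are available:
<?php
// Store only the salted hash, never the plain password.
$hash = password_hash($plainPassword, PASSWORD_DEFAULT); // salts automatically
// ... save $hash in the users table ...

// Later, at login, validate the input against the stored hash:
if (password_verify($inputPassword, $hash)) {
    // credentials are valid
}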
Depending upon your use case you may also consider:
In /etc/mysql/my.cnf, within the mysqld section, you can disallow connections from anywhere except the local machine by adding bind-address = 127.0.0.1
Disable loading files into MySQL from the local filesystem for users without file-level privileges by adding local-infile=0
Implement stored procedures for common database tasks. These have several benefits (as well as some drawbacks) but are good from a security point of view, as you can grant user accounts permissions on stored procedures without granting any permissions on the tables which they utilise.
Use database views to restrict access to some columns within a table while still allowing access to other columns within the same table. As with stored procedures, there are pros and cons to views (a short SQL sketch of both ideas follows).
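To sketch the stored procedure and view points in MySQL terms (all object and user names here are hypothetical):
-- The account can call the procedure and query the view,
-- but has no privileges on the underlying tables themselves.
GRANT EXECUTE ON PROCEDURE mydb.get_user_email TO 'appuser'@'10.0.0.2';
GRANT SELECT ON mydb.users_public_view TO 'appuser'@'10.0.0.2';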
There are probably a million other things which someone who understands MySQL much better than I do could reel off to you without even thinking about it, but that's all I've got.

Provide a simple MySQL table administration interface in a hosting environment

I have a web portal, consisting of various projects, based on a MySQL database. The project administrators need to administer their tables. I would like to provide them with some already existing, free, simple interface with as cheap a setup as possible (like 10-15 minutes max of my time per new project - IMPORTANT! The number of project administrators and their requests keeps growing...). It should support:
required: table export, import, insert new row, modify existing rows
not needed, but a plus: foreign keys replaced with some other field value from the foreign table (usually a "dictionary"), displaying only certain columns, etc.
The problem is that it is a hosting environment, so I have no superuser permissions on the MySQL database. I don't have the GRANT permission, so I must ask my hosting provider to run every GRANT command, and I want to minimize these requests since it is an above-standard service (they do it as a favour to me). And these requests would have to be made quite often.
What database administration tools/solutions can I use?
My ideas:
1) MySQL ODBC Connector + MS Access as a client. MS Access would connect via the ODBC connector to the MySQL server. I can prepare a small MS Access file that contains links to the desired tables, and also quickly generated forms!
This is cool; however, I would need to contact my provider every time to create a DB user with the desired permissions, to prevent users from changing the table structure or destroying other tables...
2) Client -> Proxy -> MySQL server. Like 1), but with some proxy in between. I'm only theorizing here, but Access could use another protocol (e.g. HTTP) to connect to a proxy that would handle the permissions and pass requests on to the MySQL server. Does something like that exist?
3) phpMyAdmin. The problem from point 1) remains. However, the permission checking could theoretically be implemented at the PHP level here, so there would be no need to change any MySQL permissions! Is phpMyAdmin capable of that out of the box? Can I simply configure a new user which can only see tables A & B and only modify column C, etc.?
However, the import is not very user friendly (no XLS, only CSV, no delimiter autodetection, etc.), and neither is inserting new records...
4) There are plenty of modern web tools with a spreadsheet-like look, like Google Docs. Could these be used for the task? Again, in theory the permission checking could be done at the web-server (not database) layer... and set up easily... (?)
I'm sure many people have had to solve the same issue, so I'm looking forward to your experiences and ideas!
My final solution was a deal with the hosting provider: I asked them to create 5 dummy database users for future use, and also to grant me the GRANT OPTION privilege. Now I can configure the privileges of those users without having to ask the hosting provider! I didn't know of this possibility at the time of asking.
I then use MS Access with the MySQL ODBC Connector as a front-end to the MySQL database. Great!
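For anyone in the same situation, the delegation then looks something like this (the database, table and user names here are placeholders):
-- Run as your own account, which now holds GRANT OPTION:
GRANT SELECT, INSERT, UPDATE ON project_a.* TO 'dummyuser1'@'%';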

MySQL table replicated between servers

This is the problem I'm facing:
Given three servers, each one offers some web services and has a login table like:
[id, username, password, email, ...]
My target is to give users on each server access to the others, while keeping the servers independent of each other. The desired behavior isn't complex:
When a user registers on one of the servers, that user should be added to the other servers without taking too long.
When a user changes his password on one server, the other servers must reflect that change too.
If two changes collide, keep only the newest change.
I have been asked to do this without spending much time, so I wonder if there is a standard, easy-to-implement solution to this problem.
All the servers use REST web services with PHP and MySQL.
It is shared hosting, so I can't perform admin actions like configuring the MySQL server.
You can replicate data between databases using MySQL replication.
Usually it is used to replicate a whole DB, but you can use do/ignore and rewrite rules to specify which tables to replicate.
replication filtering rules
replication logging
I have never used MySQL replication this way, so I can't help further than this, but I know it is possible (a config sketch follows).
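As a sketch, the filtering on the slave side could be a single line in my.cnf (the database and table names are assumptions):
# replicate only the shared login table, ignore everything else
replicate-do-table = mydb.login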
You can create two MySQL users:
The first user is given write privileges and points to the master.
The second user is given read-only privileges and can be load-balanced between the three servers.
Change your application so that when it needs to write, it connects to MySQL as the first user; when it only needs to read, it uses the second user (see the sketch below).
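A minimal PHP sketch of that read/write split; the hosts, user names and passwords are placeholders, and the random pick is only a crude load balancer:
<?php
// Return a connection: writes go to the master, reads to any server.
function db($write) {
    if ($write) {
        return new PDO('mysql:host=master.example.com;dbname=app', 'writer', 'writerpass');
    }
    $hosts = ['server1.example.com', 'server2.example.com', 'server3.example.com'];
    $host  = $hosts[array_rand($hosts)];
    return new PDO("mysql:host=$host;dbname=app", 'reader', 'readerpass');
}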
I don't think shared hosting is a problem:
pay more money and ask the hosting company to do the necessary configuration (that's obvious),
or look for another hosting company that allows administrator access, such as AWS.

Connect a database with external tables

I have never done anything like this before.
I have a system and I need to relate my data with external data (in another database).
My preference is to pull the data in and create my own tables, but then whenever the other databases are updated, my own tables become stale.
So basically I need to synchronize my tables with the external tables, or just read the external data values directly.
I don't have any idea how to connect and relate data from ten external databases.
Basically, I need to check whether a user is registered on other websites.
Any help?
I am currently doing something similar.
The easiest way I found is to pull the data in. I do bi-directional synchronisation in my project, but since you haven't mentioned that, I imagine a data pull is what you are aiming for.
You need to have user accounts on the other servers, and each account needs to be created with an IP instead of 'localhost'. You will then connect from your end through the MySQL client using the IP of the distant host instead of the usual localhost.
see this page for a bit more info.
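A minimal sketch of that pull in PHP, assuming PDO; the IP, credentials and table/column names are placeholders. This checks whether a user is registered on one remote site:
<?php
// Connect by IP to the remote server's MySQL (not localhost).
$remote = new PDO('mysql:host=203.0.113.10;dbname=their_db', 'remote_user', 'secret');

// Check registration against the remote users table.
$stmt = $remote->prepare('SELECT 1 FROM users WHERE email = ? LIMIT 1');
$stmt->execute([$email]);
$isRegistered = (bool) $stmt->fetchColumn();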
If, like me, you have to interface with different DB server types, I recommend using a database abstraction library to ease managing data in a seamless way across different SQL servers. I chose the Zend_Db components, used standalone with Zend_Config, as they support both MySQL and MSSQL.
UPDATE - Using a proxy DB to access mission critical data
Depending on the scope of your project, if the data is not accessible straight from the remote database, there are different possibilities. To answer your comment, I will tell you how we resolved the same issues on the current project I am tied to. The client has a big MSSQL database behind his business-critical application: accounting, invoicing, inventory, everything is handled by one big app tied to MSSQL. My mandate is to install a CRM (running on MySQL, by the way) and synchronise the customers of his mission-critical MSSQL app into it.
I did not want to access this data straight from my CRM; the CRM should never touch their main MSSQL DB, and I am certainly not willing to take responsibility for something going wrong down the line. Even though in theory this should not happen, in practice theory is often worthless. The recommendation I gave (and which was implemented) was to set up a proxy database on their end. A task on the same MSSQL instance copies the data into this second database nightly, and that one I am free to access remotely. A user was created on MSSQL with access to just the proxy, and connections are accepted from just one IP.
My script has to sync both ways, so in my case I do a nightly 'push' of the modified records from MSSQL to the CRM and a 'pull' of the added CRM records into the proxy DB. The intern gets notified by email of new records in the proxy to enter into their MSSQL app. I hope this was clear enough; I realize it's hard to convey in a few lines. If you have other questions, feel free to ask.
Good luck!
You have to download a backup (gzip, zip) of the wanted part (or all) of the database and upload it to the other database.
By the way, cron jobs won't help you at this point, because you can't get access to any DB from outside.
Does the other website have an API for accessing such information? Are they capable of building one? If so, that would be the best way.
Otherwise, I presume your way of getting data from their database is by querying it directly. That can work too: just mysql_connect to their location and query it as if it were your own database. Note: their DB will have to be set up to accept outside connections for this method to work.

Locking a SQL Server Database with PHP

I want extra security for a particular point in my web app, so I want to lock the database (SQL Server 2005). Any suggestions, or is this even necessary with SQL Server?
Edit on question:
The query is failing silently with no error messages logged, and does not occur inside a transaction.
Final Solution:
I never was able to solve the problem; what I wound up doing was switching to MySQL and using a transaction-level query here. This was not the main or even a primary reason to switch: I had been having problems with SQL Server, and the move allowed me to have our CMS and various other tools all running on the same database. Previously we had both a SQL Server and a MySQL database running our site. The port was a bit time consuming, but in the long run I feel it will work much better for the site and the business.
I suppose you have three options.
Set user permissions so that user x can only read from the database.
Set the database into single-user mode so only one connection can access it:
EXEC sp_dboption 'myDataBaseName', 'single user', 'true';
Set the database to read-only:
EXEC sp_dboption 'myDataBaseName', 'read only', 'true';
