That's the problem I'm facing:
Given three servers, each offering some web services and having a login table like:
[id,username, password, email,...]
My goal is to give the users of each server access to the others while keeping the servers independent. The desired behavior isn't complex:
When a user registers on one of the servers, that user should be added to the other servers without much delay.
When a user changes his password on one server, the other servers must reflect that change too.
If two changes collide, keep only the newer one.
I have been asked to do this without spending much time, so I wonder if there is any standard, easy-to-implement solution to this problem.
All the servers use REST web services with PHP and MySQL.
It is on shared hosting, so I can't perform admin actions like configuring the MySQL server.
You can replicate data between databases using MySQL replication.
Usually it is used to replicate a whole DB, but you can use do/ignore and rewrite rules to specify which tables to replicate.
replication filtering rules
replication logging
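As a sketch, a replica could be limited to just the login table with filter rules like these in its my.cnf (the schema name `myapp` and table name `login` are placeholders):

```ini
[mysqld]
# Only the login table is replicated; everything else is ignored.
replicate-do-table = myapp.login
# If the schema is named differently on the master, rewrite it first:
replicate-rewrite-db = "master_app->myapp"
```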
I have never used MySQL replication this way, so I can't help further than this, but I know it is possible.
You can create two MySQL users.
The first user is given write privileges and points to the master.
The second user is given read-only privileges and can be load-balanced across the three servers.
Change your application so that when it requires a write, it connects to MySQL as the first user.
When it requires read-only access, it uses the second user.
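A minimal sketch of those two accounts, assuming a database named `myapp` and that you are allowed to run these statements (user names, host patterns, and passwords are placeholders):

```sql
-- Writer account: the application uses it only against the master.
CREATE USER 'app_writer'@'%' IDENTIFIED BY 'a-strong-password';
GRANT SELECT, INSERT, UPDATE, DELETE ON myapp.* TO 'app_writer'@'%';

-- Reader account: safe to point at any of the three servers.
CREATE USER 'app_reader'@'%' IDENTIFIED BY 'another-strong-password';
GRANT SELECT ON myapp.* TO 'app_reader'@'%';
```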
I don't think shared hosting is a problem:
pay more and ask the hosting company to do the necessary configuration (that's the obvious route),
or look for a hosting company that allows administrator access, such as AWS.
I am deploying a small PHP + MySQL service to my client and would like to know what is the proper way to set up the database:
Should I use the hosting provider control panel and create the database schema?
Or should I put SQL CREATE scripts in my PHP to run during the "init phase"? Do hosting providers even allow PHP to create tables?
It's a really small site, one tiny info page and one web service page for fetching data from the database.
I usually offload all deployment tasks into an install script. This way you can deploy in a matter of seconds, and can repeat if necessary. I don't know of a way to restrict scripts from modifying the database (other than MySQL user permissions, which will typically be defined by you).
It may depend what your hosting provider offers - personally I would use the control panel which should at least provide phpMyAdmin. You can then export your schema from your development database and import it to the live version.
Depending on your hosting provider you get a number of databases. The worst case is one database with a fixed name; most providers give you five or more, with the ability to choose your own database name, often with a prefix.
I would go for the hoster's panel, although you can issue any SQL statement through PHP.
Why add the complication of PHP for the installation?
Just use raw SQL. Simpler. Fire that into the database.
Use PHP for the interface. Creating tables/stored procedures/triggers etc. is a one-off event.
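For example, the whole one-off setup can live in a plain SQL file that you fire at the database once via phpMyAdmin or the mysql client (the table and columns are just an illustration):

```sql
-- schema.sql: run once against the live database.
CREATE TABLE IF NOT EXISTS login (
    id       INT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
    username VARCHAR(64)  NOT NULL UNIQUE,
    password VARCHAR(255) NOT NULL,
    email    VARCHAR(255) NOT NULL
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4;
```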
I have a web portal, consisting of various projects, based on a MySQL database. The project administrators need to administer their tables. I would like to provide them with some existing, free, simple interface that is as cheap as possible to set up (like 10-15 minutes of my time per new project at most - IMPORTANT! The number of project administrators and their requests keeps growing...). It should support:
required: table export, import, insert new row, modify existing rows
not needed, but a plus: foreign keys replaced with some other field value from the foreign table (usually a "dictionary"), displaying only certain columns, etc.
The problem is that it is a hosting environment, so I have no superuser permissions on the MySQL database. I don't have the GRANT permission, so I must ask my hosting provider to run every GRANT command. I want to minimize these requests, as it is an above-standard service (they do it as a favor), and the requests would have to be made quite often.
What database administration tools/solutions can I use?
My ideas:
1) MySQL ODBC Connector + MS Access as a client. MS Access would connect via the ODBC connector to the MySQL server. I can prepare a small MS Access file containing links to the desired tables, and also quickly generated forms!
This is cool; however, I would need to contact my provider every time to create a DB user with the desired permissions... to prevent users from changing table structure or destroying other tables...
2) Client -> Proxy -> MySQL server. Like 1), but with some proxy. I'm only theorizing now, but Access could use another protocol (e.g. HTTP) to connect to some proxy that would handle the permissions and pass the queries on to the MySQL server. Does something like that exist?
3) phpMyAdmin. The problem from point 1) remains. However, the permission checking could theoretically be implemented at the PHP level here, so there would be no need to change any MySQL permissions! Is phpMyAdmin capable of that out of the box? Can I simply configure a new user who can only see tables A & B and only modify column C, etc.?
However, the import is not very user-friendly (no XLS, only CSV, no delimiter autodetection, etc.), and neither is inserting new records...
4) There are plenty of modern web tools with a spreadsheet-like look, like Google Docs. Could these be used for the task? Again, in theory the permission checking could be done at the web-server (not database) layer... and set up easily... (?)
I'm sure many people have had to solve the same issue, so I'm looking forward to your experiences and ideas!
My final solution was a deal with the hosting provider - I asked them to create 5 dummy database users for future use and also to grant me the GRANT OPTION privilege. So I can configure the privileges of those users without having to ask the hosting provider! I didn't know of this possibility at the time of asking.
And then I use MS Access with the MySQL ODBC Connector as a front-end to the MySQL database. Great!
I currently run a Rails app on a single VPS with its own MySQL database. This database also contains the users table. We are currently building a second application which should have the same users: if someone registers for either one, he has an account for both.
However, the problem is that this second (PHP) application must be hosted at another location. What are the best practices here? Can I connect directly to the external database without causing a big delay? Should I sync them? Create an API on the first?
I'm searching for the most maintainable method possible. Thank you.
You can allow the second host access to the first MySQL server. The best practice (as far as I'm aware) is to create a new user account, give it the required privileges on the users table, allow access only from the IP or domain of the second host, and use a secure password. You would run this query on the MySQL server:
GRANT ALL ON mydatabase.users TO 'mynewuser'@'123.123.123.123'
IDENTIFIED BY 'mysecurepassword';
Needless to say you would replace 'mydatabase.users' with your database and table name, 'mynewuser' with the username of the user you want to have access (you can put anything here, the user will be automatically created), '123.123.123.123' with the IP or domain name of your second server and 'mysecurepassword' with a good and long password, preferably randomly generated.
Your second server would now be able to connect to the MySQL server and have ALL privileges (you can change that to what ever privileges it needs) on the mydatabase.users table.
MySQL 5.6 GRANT Syntax
Greensql.com on MySQL remote access best practices
To minimize the small performance penalty, I would refrain from creating multiple MySQL connections (use as few as possible, preferably just one). I'm not 100% sure on the following, but I'm pretty sure it would be a good idea to reduce the number of separate queries you execute as well. Instead of running 10 separate inserts, run one insert with multiple VALUES segments, e.g.
INSERT INTO mytable(id, name) VALUES
(0, 'mia'),
(1, 'tom'),
(2, 'carl');
I'm sure MySQL prepared statements would also help considerably in reducing the speed penalty.
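As a sketch of combining the two ideas, a small helper can build the multi-row statement and a prepared statement can execute it; the connection details and table below are placeholders, and only the string-building part is shown as working code:

```php
<?php
// Build "INSERT INTO t (a, b) VALUES (?, ?), (?, ?), ..." for a
// multi-row prepared insert. Pure string logic, no DB required.
function multiInsertSql(string $table, array $columns, int $rowCount): string
{
    $row = '(' . implode(', ', array_fill(0, count($columns), '?')) . ')';
    return sprintf(
        'INSERT INTO %s (%s) VALUES %s',
        $table,
        implode(', ', $columns),
        implode(', ', array_fill(0, $rowCount, $row))
    );
}

// Usage sketch (host, credentials, and table are assumptions):
// $pdo  = new PDO('mysql:host=localhost;dbname=myapp', 'user', 'pass');
// $rows = [[0, 'mia'], [1, 'tom'], [2, 'carl']];
// $stmt = $pdo->prepare(multiInsertSql('mytable', ['id', 'name'], count($rows)));
// $stmt->execute(array_merge(...$rows));
```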
You can connect to the database from the second application,
but there will always be some delay on the connection.
Another thing to consider is security, as the database will be accessible to the world. You can restrict connections to the second web server only.
To reduce the delay, you have options like memcached or other caches, so that you don't have to hit the database again and again.
You can set up MySQL replication with the main database as the master, and set up another MySQL instance on the second application server as a slave.
The second server's MySQL syncs with the master, and all write queries (UPDATE, INSERT, DELETE, etc.) should go to the master. This should reduce the database load.
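A rough sketch of that setup, assuming you have admin access on both machines (server IDs, host, credentials, and log coordinates are placeholders):

```sql
-- In my.cnf on the master: server-id = 1, log-bin = mysql-bin
-- In my.cnf on the slave:  server-id = 2

-- Then, on the slave:
CHANGE MASTER TO
    MASTER_HOST = 'master.example.com',
    MASTER_USER = 'repl',
    MASTER_PASSWORD = 'repl-password',
    MASTER_LOG_FILE = 'mysql-bin.000001',
    MASTER_LOG_POS = 4;
START SLAVE;
```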
I have two shared hosting accounts, each with a database of its own (and its own cPanel login). Assuming these two databases have the same structure, what do I need to do in order to synchronize them?
The synchronization script would be on a domain connected to one of the hosting accounts. I know the MySQL/PHP for synchronizing databases on the same account is fairly simple, but what's confusing me here is how to access the database that is on the other hosting account.
This isn't a one-time thing, I need to be able to do this by clicking a button/link.
The only thing that comes to mind is having the remote database export everything to .csv files on a regular basis and having the script on the domain connected to the first database import everything, but there has to be a better way?
In case this whole question is confusing, the gist of the problem is - is there a way to have a script on a domain access a database on a completely different shared hosting account?
In short: no, there's no direct way.
Usually, hosting providers allow DB access only to localhost users, meaning that a script from another machine can't access it.
Also, what kind of synchronization is it? One-way or two-way? (but, I guess, this is out of scope here)
The only viable solution that comes to mind is some kind of dump/restore procedure.
Example:
webserver A (the source of the data) defines a URL; by requesting it you can get a dump of the DB's content
webserver B (destination of data) defines a page with button 'Sync'.
upon clicking the 'Sync' button, server B will fetch that URL from server A, receive A's data and merge it with its own.
NOTE
It is important to secure the data-export URL. In that script you can check, for example, the IP of the incoming request, or the presence and correctness of an "access_token", or whatever you like.
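A minimal sketch of such a token check on server A's export script, assuming the two servers share a secret (the token value here is a placeholder):

```php
<?php
// Compare the caller-supplied token against the shared secret in
// constant time, so the export URL can't be used by strangers.
function tokenIsValid(string $given, string $expected): bool
{
    return hash_equals($expected, $given);
}

// At the top of the export script on server A:
// if (!tokenIsValid($_GET['access_token'] ?? '', 'shared-secret-here')) {
//     http_response_code(403);
//     exit('Forbidden');
// }
// echo json_encode($rows);  // then dump the data for server B to merge
```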
Can you connect to the database via SSL/SSH or a PHP tunnel? If so, try the Data Comparison tool in dbForge Studio for MySQL.
The Data Comparison tool will allow you to compare data between different databases. You can test it with the trial version.
I have never done anything like this before.
I have a system and I need to relate my data with external data (in another database).
My preference is to fetch that data and create my own tables, but in that case, when the other DBs are updated, my own tables will become obsolete.
So basically I need to synchronize my tables with the external tables, or just read the external data values.
I don't have any idea how to connect to and relate data from ten external databases.
Basically, I need to check whether a user is registered on other websites.
Any help?
I am currently doing something similar.
The easiest way I found is to pull the data in. I do bi-directional synchronisation in my project, but you haven't mentioned that, so I imagine a data pull is what you're aiming for.
You need to have user accounts on the other servers, and each account needs to be created with an IP instead of 'localhost'. You will connect from your end through the MySQL client using the IP of the distant host instead of the usual localhost.
See this page for a bit more info.
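Creating such an account on one of the remote servers would look roughly like this (user, IP, schema, and password are placeholders):

```sql
-- Allow read-only access from your server's IP instead of localhost.
CREATE USER 'sync_reader'@'203.0.113.10' IDENTIFIED BY 'a-strong-password';
GRANT SELECT ON theirdb.users TO 'sync_reader'@'203.0.113.10';
```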
If, like me, you have to interface with different DB server types, I recommend using a database abstraction library to ease managing data in a seamless way across different SQL servers. I chose Zend_Db components, used standalone with Zend_Config, as they support MySQL and MSSQL.
UPDATE - Using a proxy DB to access mission critical data
Depending on the scope of your project, if the data is not accessible straight from the remote database, there are different possibilities. To answer your comment, I will tell you how we resolved the same issues on the current project I am tied to. The client has a big MSSQL database that is his business-critical application: accounting, invoicing, inventory, everything is handled by one big app tied to MSSQL. My mandate is to install a CRM and synchronise the customers of his mission-critical MSSQL app into the CRM, which runs on MySQL, by the way.
I did not want to access this data straight from my CRM; the CRM should never touch their main MSSQL DB. I am certainly not willing to take responsibility for something going wrong down the line: even though in theory this should not happen, in practice theory is often worthless. The recommendation I gave (and which was implemented) was to set up a proxy database on their end. A task on the same MSSQL instance copies the data into this second database nightly. That one I am free to access remotely. A user was created on MSSQL with access to just the proxy, and connections are accepted from just one IP.
My script has to sync both ways, so in my case I do a nightly 'push' of the modified records from MSSQL to the CRM and a 'pull' of the added CRM records into the proxy DB. The intern gets notified by email of new records in the proxy to copy into their MSSQL app. I hope this was clear enough; I realize it's hard to convey in a few lines. If you have other questions, feel free to ask.
Good-luck!
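For what it's worth, the nightly copy into the proxy database can be as simple as a scheduled statement like this (database, table, and column names are assumptions; on MSSQL it would run as a SQL Server Agent job):

```sql
-- Refresh the proxy copy of the customers table once a night.
TRUNCATE TABLE proxydb.customers;
INSERT INTO proxydb.customers (id, name, email)
SELECT id, name, email FROM maindb.customers;
```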
You have to download a backup (gzip, zip) of the wanted part (or all) of the database and upload it to the other database.
By the way, cron jobs won't help you at this point, because you can't have access to any DB from outside.
Does the other website have an API for accessing such information? Are they capable of constructing one? If so, that would be the best way.
Otherwise, I presume your way of getting data from their database is by querying it directly. That can work too; just do a mysql_connect to their location and query it as if it were your own database. Note: their DB will have to be set up to accept outside connections for this method to work.