I have two shared hosting accounts, each with its own database (and its own cPanel login). Assuming these two databases have the same structure, what do I need to do in order to synchronize them?
The synchronization script would be on a domain connected with one of the hosting accounts. I know the MySQL/PHP for synchronizing databases on the same account is fairly simple, but what's confusing me here is how to access a database that's on a different hosting account.
This isn't a one-time thing, I need to be able to do this by clicking a button/link.
The only thing that comes to mind is having the remote database export everything to .csv files on a regular basis and have the script on the domain connected to the first database import everything, but there's gotta be a better way?
In case this whole question is confusing, the gist of the problem is: is there a way to have a script on one domain access a database on a completely different shared hosting account?
In short, no, there's no way.
Usually, hosting providers allow DB access only to localhost users, meaning that a script on another machine can't access it.
Also, what kind of synchronization is it? One-way or two-way? (but, I guess, this is out of scope here)
The only viable solution that comes to mind is some kind of dump/restore procedure.
Example:
webserver A (the data source) exposes a URL that, when requested, returns a dump of the DB's content
webserver B (the data destination) serves a page with a 'Sync' button
upon clicking the 'Sync' button, server B fetches that URL from server A, receives A's data and merges it with its own.
NOTE
It is important to secure the data export URL. In that script you can check, for example, the IP of the incoming request, or the presence and correctness of an "access_token", or whatever you like.
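To make the idea concrete, here is a minimal sketch in PHP - the file names, table, columns and token are all invented for illustration:

```php
<?php
// export.php on server A: returns the table contents as JSON,
// but only to callers that present the shared secret token.
$expected = 'long-random-secret'; // agreed between servers A and B
if (!isset($_GET['access_token']) || !hash_equals($expected, $_GET['access_token'])) {
    http_response_code(403);
    exit('Forbidden');
}

$db   = new PDO('mysql:host=localhost;dbname=app', 'dbuser', 'dbpass');
$rows = $db->query('SELECT id, title FROM items')->fetchAll(PDO::FETCH_ASSOC);

header('Content-Type: application/json');
echo json_encode($rows);
```

```php
<?php
// sync.php on server B: invoked by the 'Sync' button; pulls A's dump
// and merges it into the local table (one-way sync).
$json = file_get_contents(
    'https://server-a.example.com/export.php?access_token=long-random-secret'
);
$db   = new PDO('mysql:host=localhost;dbname=app', 'dbuser', 'dbpass');
$stmt = $db->prepare(
    'INSERT INTO items (id, title) VALUES (:id, :title)
     ON DUPLICATE KEY UPDATE title = VALUES(title)'
);
foreach (json_decode($json, true) as $row) {
    $stmt->execute([':id' => $row['id'], ':title' => $row['title']]);
}
```

For a two-way sync you would repeat the same in the other direction, with a timestamp column to decide which side wins.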
Can you connect to the database via SSL/SSH or a PHP tunnel? If so, try the Data Comparison tool in dbForge Studio for MySQL.
The Data Comparison tool allows you to compare data between different databases. You can test it with the trial version.
Related
I am deploying a small PHP + MySQL service to my client and would like to know what is the proper way to set up the database:
Should I use the hosting provider control panel and create the database schema?
Or should I put SQL CREATE scripts in my PHP to run during the "init phase"? Do hosting providers even allow PHP to create tables?
It's a really small site, one tiny info page and one web service page for fetching data from the database.
I usually offload all deployment tasks into an install script. This way you can deploy in a matter of seconds, and can repeat the process if necessary. I do not know of a way to restrict scripts from making database modifications (other than MySQL user permissions, which will typically be defined by you).
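A bare-bones version of such an install script might look like this (table name and credentials are placeholders):

```php
<?php
// install.php - run once after uploading the site.
// Hosting providers generally do allow DDL from PHP, as long as
// the MySQL user has the CREATE privilege.
$db = new PDO('mysql:host=localhost;dbname=mysite', 'dbuser', 'dbpass');

$db->exec("
    CREATE TABLE IF NOT EXISTS entries (
        id         INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
        title      VARCHAR(255) NOT NULL,
        body       TEXT,
        created_at DATETIME NOT NULL
    ) ENGINE=InnoDB DEFAULT CHARSET=utf8
");

echo "Schema installed.\n";
```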
It may depend on what your hosting provider offers - personally I would use the control panel, which should at least provide phpMyAdmin. You can then export your schema from your development database and import it into the live version.
Depending on your hosting provider you get a certain number of databases. The worst case is one database with a fixed name; most offer five or more, with the ability to choose your own database name, often with a prefix.
I would go for the hoster's panel, although you can issue any SQL statement through PHP.
Why add the complication of PHP for the installation?
Just use raw SQL. Simpler. Fire that into the database.
Use PHP for the interface. Creating tables/stored procedures/triggers etc. is a one-off event.
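For instance, a plain schema.sql fired into the database once - via phpMyAdmin's import, or the command line if the host allows it (the table is just an illustration):

```sql
-- schema.sql: run once, e.g.  mysql -u dbuser -p mydb < schema.sql
CREATE TABLE IF NOT EXISTS pages (
    id    INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    title VARCHAR(255) NOT NULL,
    body  TEXT
) ENGINE=InnoDB DEFAULT CHARSET=utf8;
```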
I have a web portal, consisting of various projects, based on a MySQL database. The project administrators need to administrate their tables. I would like to provide them with some existing, free, simple interface with as cheap a setup as possible (like 10-15 minutes max of my time per new project - IMPORTANT! The number of project administrators and their requests keeps growing...). It should support:
required: table export, import, insert new row, modify existing rows
not needed, but a plus: foreign keys replaced with some other field value from the foreign table (usually a "dictionary"), display only certain columns, etc.
The problem is that it is a hosting environment, so I have no superuser permissions on the MySQL database. I don't have the GRANT permission, so I must ask my hosting provider to run every GRANT command, and I want to minimize these requests as it is an above-standard service (a favor on their part). And these requests would have to be made quite often.
What database administration tools/solutions can I use?
My ideas:
1) MySQL ODBC connector + MS Access as a client. MS Access would connect via the ODBC connector to the MySQL server. I can prepare a small MS Access file containing links to the desired tables, and also some quickly generated forms!
This is cool; however, I would need to contact my provider every time to create a DB user with the desired permissions... to prevent users from changing the table structure or destroying other tables...
2) Client -> Proxy -> MySQL server. Like 1), but with some proxy in between. I'm theorizing now, but Access could also use another protocol (e.g. HTTP) to connect to some proxy that handles the permissions and then passes requests on to the MySQL server. Does something like that exist?
3) phpMyAdmin. The problem from point 1) remains. However, the permission checking could theoretically be implemented at the PHP level here, so there would be no need to change any MySQL permissions! Is phpMyAdmin capable of that out of the box? Can I simply configure a new user who can only see tables A & B and only modify column C, etc.?
However, the import is not very user-friendly (no XLS, only CSV, no delimiter autodetection, etc.), and neither is inserting new records...
4) There are plenty of modern web tools with a spreadsheet-like look, like Google Docs. Could these be used for the task? Again, in theory the permission checking could be done at the web-server (not database) layer... and set up easily... (?)
I'm sure many people have had to solve the same issue, so I'm looking forward to your experiences and ideas!
My final solution was a deal with the hosting provider - I asked them to create 5 dummy database users for future use and also to grant me the GRANT OPTION privilege. So now I can configure the privileges of those users myself, without having to ask the hosting provider! I didn't know of this possibility at the time of asking.
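To illustrate what this enables (user, database and table names here are just examples): after the provider's one-time grant, I can manage the dummy users' privileges myself, down to column level:

```sql
-- Run once by the hosting provider:
GRANT ALL ON `project_db`.* TO 'me'@'%' WITH GRANT OPTION;

-- Now I can run things like this myself for each project administrator,
-- e.g. read tables A and B but modify only column C of table A:
GRANT SELECT ON `project_db`.`tableA` TO 'dummy1'@'%';
GRANT SELECT ON `project_db`.`tableB` TO 'dummy1'@'%';
GRANT UPDATE (colC) ON `project_db`.`tableA` TO 'dummy1'@'%';
```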
And then I use MS Access with the MySQL ODBC Connector as a front-end to the MySQL database. Great!
I need to create a webapp to show and allow editing of a set of data.
This data is contained in an Access database file used by another application (a desktop application).
I'm evaluating the best way to carry out this job.
Unfortunately, my proposal to migrate to another database solution (an RDBMS such as MySQL or Postgres) was rejected by the customer.
The issue here is how to keep the data intact and synchronized between the server and the desktop machine that runs the application which also uses this data.
All I need to do is read data, store edited or new data, give authorized users an interface to review this newly inserted data (thus validating it), and import it into the original Access database.
I've found the following possible solutions (to update the desktop mdb copy), but each of them has pros and cons:
remote access to the Windows machine
exposes the machine to unauthorized access
use rsync to keep the files synchronized (once a day)
if the mdb on the client has been edited with the desktop application there will be data loss
can be updated only when all data has been validated
the data won't be truly synchronized (until rsync runs)
client-server application
can use secure layers to protect data against attackers
a 3rd application (on the desktop) is required
synchronization requires authorized users to use this 3rd application to import data (it will query the remote DB and update the local mdb)
Do you know some other way that could help me get this done?
I'm leaning toward the client-server model, even if it would be more expensive, but it's the only way I see to make this work.
Do you see any other pros/cons of the proposed solution?
I haven't chosen the language to develop this in yet, but I was thinking of using PHP and/or Python.
The remote environment (for the server) can either be Windows or *nix (preferred).
Thanks.
The first idea:
exposes the machine to unauthorized access
This is not really a valid argument. Everything you put on the Internet is exposed. And it is not like it cannot be further protected via SSL/TLS. Even RDP can be secured via an SSH tunnel, for example.
To my mind, the easiest and most elegant way to do this is by using web services (SOAP). Write the server code that does the inserts/updates on the Access database in something like Python or Java. Generate a WSDL from the working code. From the WSDL you can generate a client for PHP/Python. Now all you have to do is write the web interface that uses the PHP/Python client.
For security, SSL and Basic authentication should be enough (supported by SOAPpy in the case of Python, for example).
You can use pyodbc to connect to the Access database.
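To sketch the PHP end with the built-in SoapClient (the WSDL URL and method names are invented; the real ones come from whatever WSDL you generate):

```php
<?php
// The web interface consuming the generated WSDL over SSL with Basic auth.
$client = new SoapClient('https://desktop-host.example.com/sync?wsdl', [
    'login'    => 'webapp',  // HTTP Basic authentication credentials,
    'password' => 'secret',  // sent over the SSL connection
]);

// Method names are defined by your WSDL; these are placeholders.
$pending = $client->getPendingRecords();
foreach ($pending as $record) {
    // ...show the record to an authorized reviewer...
}
$client->approveRecord(42); // the server side then imports it into the mdb
```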
Well, you can use two DBs and synchronize changes with a sort of web service between them,
separating the web server DB (for which you could use a modern MySQL or whatever) from the current Access DB.
You could build a sort of REST API returning new or changed records for the GET method, deleting for the DELETE method, etc., using a timestamp in the HTTP request.
Then each side could query the other with a scheduled job for new records (transferring them as JSON), keeping the records relatively in sync.
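As a rough illustration of one end of such an API, in PHP since the asker listed it as an option (endpoint, table and column names are made up; the Python frameworks mentioned below would give the same shape):

```php
<?php
// api.php?since=2012-01-01T00:00:00
// Returns, as JSON, the records changed since the given timestamp;
// the scheduled job on the other side calls this with its last sync time.
$since = isset($_GET['since']) ? $_GET['since'] : '1970-01-01T00:00:00';

$db   = new PDO('mysql:host=localhost;dbname=webapp', 'dbuser', 'dbpass');
$stmt = $db->prepare('SELECT * FROM records WHERE updated_at > :since');
$stmt->execute([':since' => $since]);

header('Content-Type: application/json');
echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));
```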
You could take care of security by exposing the application DB only on a certain port and only to HTTP queries coming from the webapp server's IP address, and also by using HTTP auth, hashes, etc.
If this isn't a heavy-load, high-concurrency app (which I guess it isn't, since you use Access as a DB), this should work.
You could build this kind of mini-API with any Python web framework like TurboGears 2.1 or Django, or with mini frameworks like Bottle or Flask.
P.S. If you prefer Python (and why wouldn't you?), don't use pyodbc directly; work with Python's beautiful ORM - SQLAlchemy is much better.
I think how this works really depends on the authentication issue and the number of users that need to review the data.
The reason I ask?
You can consider using Access 2010 and Office 365. This allows you to have linked tables to the cloud, but in fact the tables are also cached locally on your Access desktop. This means real-time replication/sync of data is used, and this is automatic for Access 2010 (so you don't have to write any code).
What this means is that while running the Access desktop application, you can pull the plug on the network and it will continue to run. The instant you have wifi or a connection again, local changes are synced up to Office 365. Even better, you can now build web forms in Access.
Data touched or edited (or new records on either side) will come down the pipe to your local computer. So when you add records in the Access client, the web users will ALSO see these new records.
So Access 2010 now has web publishing, and this works with the new Office 365. The price starts at $6 per month. And if it's just for a few users, then have them all log on using the same account! This means you can have this all up and running in less time than it took to make this post, and for less than $10 per month!
For those not aware, Access 2010 has web publishing. When you publish the Access forms, they are converted to .NET (XAML, pronounced "zammel") forms, and the code is converted to JavaScript. So form code actually runs browser-side.
Since the system runs on Office 365, you are using some heavy-duty iron and you can in theory scale out to millions of users with this setup. When you publish the Access application to Office 365, on the server side you are not using mdb or Access files anymore, but what is called Access Web Services. The tables in fact become the equivalent of SharePoint lists. And new for SP 2010, those lists now have relational features like cascade delete.
The real beauty of this system is that you can write, create and do everything inside of Access without having to learn or touch ANY KIND of server-side technology. Here is a short video of mine; at the halfway point I run the Access application with nothing more than a web browser.
http://www.youtube.com/watch?v=AU4mH0jPntI
There is no ActiveX or even Silverlight required. In fact my Access applications run fine on an iPad using the Safari web browser.
So you could consider continuing to use Access, and just publish your application to the web with the new Access 2010 features.
That's the problem I'm facing:
Given three servers, each one offers some web services and has a login table like:
[id, username, password, email, ...]
My goal is to allow users of each server to access the others, while keeping the servers independent of each other. The desired behavior isn't complex:
When a user is registered on one of the servers, that user should be added to the other servers without much delay.
When a user changes his password on one server, the other servers must reflect that change too.
If two changes collide, then keep only the newest change.
I have been asked to do this without spending much time, so I wonder if there is any standard, easy-to-implement solution to this problem.
All the servers use REST web services with PHP and MySQL.
It is shared hosting, so I can't perform admin actions like configuring the MySQL server.
You can replicate data between databases using MySQL replication.
Usually it is used to replicate a whole DB, but you can use do/ignore and rewrite rules to specify which tables to replicate.
replication filtering rules
replication logging
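For example, the filter rules would go in the replica's my.cnf; a sketch, assuming the table to replicate is mydb.login:

```ini
# my.cnf on the replica: replicate only the login table
server-id          = 2
replicate-do-table = mydb.login
# or rewrite the database name between master and replica:
# replicate-rewrite-db = masterdb->mydb
```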
I have never used MySQL replication this way so I can't help further than this, but I know it is possible.
You can create two MySQL users:
the first user will be given write privileges and will point to the master
the second user will be given read-only privileges and can load-balance between the three servers
Change your application so that when it requires writes, it connects to MySQL using the first user.
If it requires read-only access, it uses the second user.
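A minimal sketch of that split in PHP (hostnames and credentials are hypothetical):

```php
<?php
// Returns a PDO connection: writes go to the master via the write user,
// reads are naively load-balanced across the three servers via the read user.
function db($needsWrite)
{
    if ($needsWrite) {
        return new PDO('mysql:host=master.example.com;dbname=app', 'app_rw', 'secret');
    }
    $hosts = ['server1.example.com', 'server2.example.com', 'server3.example.com'];
    $host  = $hosts[array_rand($hosts)];
    return new PDO("mysql:host=$host;dbname=app", 'app_ro', 'secret');
}

db(true)->exec("UPDATE login SET email = 'new@example.com' WHERE id = 1");
$users = db(false)->query('SELECT id, username FROM login')->fetchAll();
```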
I don't think shared hosting is a problem:
pay more money and ask the hosting company to do the necessary configuration (that's obvious),
or look for another hosting company that allows administrator access, such as AWS.
I have never done something similar.
I have a system and I need to relate my data to external data (in another database).
My preference is to fetch this data and create my own tables, but in that case when the other DBs are updated my own tables will become obsolete.
So basically I need to synchronize my tables with the external tables, or just read the external data values.
I don't have any idea how I can connect to and relate data from ten external databases.
Basically, I need to check whether a user is registered on other websites.
Any help?
I am currently doing something similar.
The easiest way I found is to pull the data in. I do bi-directional synchronization in my project, but you haven't mentioned that, so I imagine a data pull is what you are aiming for.
You need user accounts on the other servers, and each account needs to be created with an IP address instead of 'localhost'. You will connect from your end through the MySQL client using the IP of the distant host instead of the usual localhost.
see this page for a bit more info.
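In code, the pull might look like this (host, credentials and table are hypothetical; the CREATE USER part is what the other side has to run):

```php
<?php
// On the remote server, an account bound to this server's IP is needed, e.g.:
//   CREATE USER 'sync_user'@'198.51.100.10' IDENTIFIED BY 'secret';
//   GRANT SELECT ON remote_db.* TO 'sync_user'@'198.51.100.10';

// Connect to the distant host by IP instead of the usual localhost.
$remote = new mysqli('203.0.113.5', 'sync_user', 'secret', 'remote_db');

$result = $remote->query('SELECT id, username, email FROM users');
while ($row = $result->fetch_assoc()) {
    // ...insert/update the row in the local database...
}
```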
If, like me, you have to interface with different DB server types, I recommend using a database abstraction library to manage data seamlessly across different SQL servers. I chose the Zend_Db components, used standalone with Zend_Config, as they support MySQL and MSSQL.
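A rough sketch of what that looks like with Zend_Db used standalone (adapter names as in the ZF1 docs; hosts and credentials are made up):

```php
<?php
require_once 'Zend/Db.php';

// Same fetch/insert API regardless of the underlying server.
$mysql = Zend_Db::factory('Pdo_Mysql', array(
    'host' => 'localhost', 'username' => 'crm', 'password' => 'secret', 'dbname' => 'crm',
));
$mssql = Zend_Db::factory('Sqlsrv', array(
    'host' => '203.0.113.7', 'username' => 'sync', 'password' => 'secret', 'dbname' => 'proxy',
));

$changed = $mssql->fetchAll('SELECT * FROM customers WHERE modified > ?', array('2012-01-01'));
foreach ($changed as $row) {
    $mysql->insert('customers', $row); // simplistic: real code would upsert
}
```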
UPDATE - Using a proxy DB to access mission-critical data
Depending on the scope of your project, if the data is not accessible straight from the remote database, there are different possibilities. To answer your comment I will tell you how we resolved the same issues on the current project I am tied to. The client has a big MSSQL database that is his business-critical application: accounting, invoicing, inventory, everything is handled by one big app tied to MSSQL. My mandate is to install a CRM and synchronize the customers of his mission-critical MSSQL app into the CRM, which runs on MySQL by the way.
I did not want to access this data straight from my CRM; the CRM should not ever touch their main MSSQL DB. I am certainly not willing to take responsibility for something going wrong down the line - even though in theory this should not happen, in practice theory is often worthless. The recommendation I gave (and which was implemented) was to set up a proxy database on their end. That database, located on the same MSSQL instance, has a task that copies the data into a second database nightly. That one I am free to access remotely. A user was created on MSSQL with access to just the proxy, and connections are accepted from just one IP.
My script has to sync both ways, so in my case I do a nightly 'push' of the modified records from MSSQL to the CRM and a 'pull' of the added CRM records into the proxy DB. The intern gets notified by email of new records in the proxy to enter into their MSSQL app. I hope this was clear enough; I realize it's hard to convey in a few lines. If you have other questions feel free to ask.
Good luck!
You have to download a backup (gzip, zip) of the wanted part (or all) of the database and upload it to the other database.
Btw, cron jobs won't help you at this point, because you can't access any DB from outside.
Does the other website have an API for accessing such information? Are they capable of constructing one? If so, that would be the best way.
Otherwise, I presume your way of getting data from their database is by directly querying it. That can work too: just make a mysql_connect to their location and query it as if it were your own database. Note: their DB will have to be set up to accept outside connections for this method to work.