I am developing an application whose database needs to be replicated in both directions across a number of local, offline clients.
Here is the general workflow:
1. The user installs the client software on a personal computer.
2. The user syncs data from the remote database server into the local database.
3. The user can then work on this local database, performing inserts, updates, and deletes.
4. At the same time, other people can also insert, update, and delete on the remote database.
5. The user then connects to the remote database and commits their local changes to it.
6. The user retrieves the remote changes (made by other people) into their local database.
My current idea is to track the changes made in both the remote and local databases and to create a web service that replicates the changes between them. But I keep running into a problem with primary keys: the same key can be generated independently in the local and remote databases. This approach is also very involved, and I doubt it will work in anything close to real time.
My question is: is there any technology or tool in MySQL Server that can help me achieve this without creating web services? I have read about MySQL replication, but it only works in one direction (master to slave), and I need two-way synchronization.
You can use replication between a master and a slave database. These articles explain how to set it up:
http://learnmysql.blogspot.in/2010/10/setup-and-test-mysql-replication-in-20.html
http://www.percona.com/doc/percona-xtrabackup/2.1/howtos/setting_up_replication.html
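As a taste of what those guides walk through, pointing a slave at a master comes down to a couple of statements once binary logging and a replication user are configured on the master. A minimal sketch; the host, credentials, and binlog coordinates below are placeholders:

    -- On the slave, after enabling binary logging on the master
    -- and creating a replication user there:
    CHANGE MASTER TO
        MASTER_HOST = 'master.example.com',
        MASTER_USER = 'repl',
        MASTER_PASSWORD = 'repl_password',
        MASTER_LOG_FILE = 'mysql-bin.000001',
        MASTER_LOG_POS  = 4;
    START SLAVE;

    -- Verify that both replication threads are running:
    SHOW SLAVE STATUS\G

For two-way synchronization, the same mechanism can be configured in both directions (master-master); in that setup, MySQL's auto_increment_increment and auto_increment_offset settings are the usual answer to the duplicate-primary-key problem you mention.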
Related
I'm using cURL to send information between local and remote MySQL databases in order to keep them synchronised.
Each row that is sent keeps a record of whether or not it has been updated in the central database, along with the central database ID for its transaction.
Is there a better way to keep many local databases synchronised with a central remote server?
PS I cannot use database replication because of shared hosting server restrictions.
You have not really answered Alfallouji's question, nor have you said anything about the constraints on your approach.
You said you want to replicate the data somewhere else, while the primary database is the hosted one. What access is there to 'somewhere else' from the hosted server? Does 'somewhere else' have a static IP address? Is it always up?
Can you access the binary log on the primary server?
How much replication lag can you accommodate?
There are lots of solutions: MySQL Proxy, MySQL's built-in asynchronous replication (optionally with manual copying of replication logs), multi-master replication, replication at the device level with DRBD, or writing a custom dump tool and rsyncing the results.
I have a simple MySQL database (one table with 12 rows of data and 5 columns) sitting on the web-server of my host provider.
On my home PC I create the data programmatically and store it in a free version of SQL Server (on my home PC). I would like to "upload" this data to the MySQL db in real time (or as close as I can get) over the internet (I'm assuming this is the only way to connect the pipes).
I know that opening up a MySQL database to remote internet connections is probably not a "secure" thing to do, but the resulting data table will be publicly available anyway via an "app", so I'm not too worried about that. (I suppose an industrious and inclined hacker could overwrite my data with their own, but the risk/reward is so small it's not a major concern.)
Anyway, what is the easiest way to do this with some semblance of security? I only know how to program in VB (I did a little HTML and ASP back in the day, but that was a long time ago). I could learn a few lines of code in another language if need be.
I do not have a static IP, and I've never actually interacted with a MySQL database before (only SQL Server, so my MySQL knowledge/familiarity is zero... but a db is a db, so how hard can it be?). Because of my home network firewall, I can't allow connections in; I will have to make the connection out, from my home PC to the hosted database.
OK, this problem is not actually super simple.
What you will find is most shared hosting providers do not allow just any IP to access their databases.
Solution? Whitelist your computer's IP, of course! But you are probably on a home internet connection, so your IP address can change (if you have a static IP, you are a lucky person!).
So the best way is to create a mini-API!
Basically, you want to post your data to a script (with some security, of course) that then inserts this data into the database.
It is a lot of work, but having done all this before, it seems to be the only way unless you have a dedicated server or advanced access privileges!
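A rough sketch of the receiving script, assuming a PDO-capable PHP host; the file name, table, columns, and the shared-secret check are all illustrative, not from any particular framework:

    <?php
    // sync_receiver.php -- hypothetical endpoint on the shared host.
    // The home PC POSTs rows to this URL; the script inserts them into MySQL.

    // Naive shared-secret check (use HTTPS as well in practice).
    if (!isset($_POST['token']) || $_POST['token'] !== 'replace-with-a-long-random-secret') {
        http_response_code(403);
        exit('Forbidden');
    }

    // Connect to the hosted MySQL database (credentials are placeholders).
    $pdo = new PDO(
        'mysql:host=localhost;dbname=mydb;charset=utf8',
        'dbuser', 'dbpass',
        array(PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION)
    );

    // Expect a JSON array of rows, e.g. [{"id":1,"name":"widget"}, ...]
    $rows = json_decode($_POST['rows'], true);

    // Upsert each row so re-sending the same data is harmless.
    $stmt = $pdo->prepare(
        'INSERT INTO items (id, name) VALUES (:id, :name)
         ON DUPLICATE KEY UPDATE name = VALUES(name)'
    );
    foreach ($rows as $row) {
        $stmt->execute(array(':id' => $row['id'], ':name' => $row['name']));
    }

    echo 'OK';

From the VB side, this is then just an HTTP POST of two form fields (token and rows), which works fine through a home firewall because the connection goes out, not in.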
You could take a look at WAMP for your home PC. It's simple to use.
Then you should take a look at MySQL remote connections (some details here).
I would try this:
On your local computer, install MySQL Server; there's a free Community Edition available for download. Try the web installer, since it's more lightweight. Use the custom installation and make sure MySQL Workbench is selected too.
Workbench has a migration tool for the most common databases. Try it locally first, so you can verify that all your data is correctly migrated from your local SQL Server to a MySQL database with no data loss in the process.
Then you could probably connect through Workbench to your online MySQL database and migrate your data to it directly from the local database you just created. If you cannot connect, make a backup of your local database and send the files to your server by FTP or a similar process, then simply restore the database from the backup file on your online server.
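For that backup-and-restore route, the stock command-line tools work over plain files; a minimal sketch, with database and user names as placeholders:

    # On your PC: dump the freshly migrated local database to a file.
    mysqldump -u root -p local_db > local_db.sql

    # After uploading local_db.sql to the server (FTP, SCP, ...):
    mysql -u dbuser -p online_db < local_db.sql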
Hope this helps!
I have a PHP + MySQL application and would like to know the best way to keep my app working when the internet connection is lost.
I thought of having two identical databases: one at my internet host and another on my localhost. When there is no connection, I would store all the data in the local one.
My question is: how can I transfer the data from my localhost to the database at my internet host?
You need to use replication - this link will tell you how to set it up so that your local machine is a replica of the remote machine.
Are you just collecting data, like subscriptions? If so, you could queue up new records temporarily when you detect that your main db is down, then use ordinary inserts and deletes to transfer the unprocessed records once both databases are available again and any unprocessed records exist locally.
If you need to query and maintain relational data between the two databases, then you'll need a more robust and complex replication strategy.
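For the simple data-collection case, here is a minimal sketch of the queue-and-replay idea; all table and column names are invented for illustration:

    <?php
    // Hypothetical sketch: try the remote database first; if it is down,
    // queue the record in a local table and replay it once the remote is back.

    function connect_to($host) {
        try {
            return new PDO("mysql:host=$host;dbname=app", 'user', 'pass',
                array(PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
                      PDO::ATTR_TIMEOUT  => 3));    // fail fast when offline
        } catch (PDOException $e) {
            return null;                            // server unreachable
        }
    }

    function save_subscription($email) {
        $remote = connect_to('remote.example.com');
        if ($remote !== null) {
            $remote->prepare('INSERT INTO subscriptions (email) VALUES (?)')
                   ->execute(array($email));
        } else {
            // Remote is down: park the record in a local queue table.
            $local = connect_to('localhost');
            $local->prepare('INSERT INTO pending_subscriptions (email) VALUES (?)')
                  ->execute(array($email));
        }
    }

    // Run periodically (e.g. via cron): replay anything queued while offline.
    function flush_pending() {
        $remote = connect_to('remote.example.com');
        if ($remote === null) return;               // still offline, try later
        $local = connect_to('localhost');
        foreach ($local->query('SELECT id, email FROM pending_subscriptions') as $row) {
            $remote->prepare('INSERT INTO subscriptions (email) VALUES (?)')
                   ->execute(array($row['email']));
            $local->prepare('DELETE FROM pending_subscriptions WHERE id = ?')
                  ->execute(array($row['id']));
        }
    }

A cron job calling flush_pending() every few minutes would handle the replay without any manual intervention.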
If you have phpMyAdmin on both servers you can export the localhost database and then import it on the hosted database.
I have been looking into creating a custom MySQL point-of-sale system so that there is one centralised database for inventory levels across multiple stores, online sales, etc.
The biggest problem I see is the (unlikely) event that the internet drops out in the bricks-and-mortar stores. If this were to happen, could it be set up so that the POS system runs off a local MySQL database on that computer (using MAMP or something similar) and then, once the internet is available again, automatically syncs the databases to update sales and inventory levels?
As to how the actual POS system would be accessed without internet: I was thinking that the POS system would run from the server when the internet is available, and then when the net drops out it would run from files stored on the machine, pointing to the local database on the machine.
Yes; a minimal, viable solution would be to have all of the POS data entered locally as well as in the remote database, so the local copy serves as a sort of backup in case anything happens to the central DB.
As far as automating the 'fix' of the central DB after an outage, maybe the best way is to have the central system request sales data from the local DBs of each store. If the workflow is set up like this, then you don't really have to do anything 'special' about internet outages.
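A sketch of that pull, with invented table and column names: the central system remembers when it last pulled from each store and asks only for newer rows.

    -- Run by the central system against each store's local database;
    -- ? is the timestamp recorded after the previous successful pull.
    SELECT sale_id, product_id, quantity, sold_at
    FROM   sales
    WHERE  sold_at > ?
    ORDER  BY sold_at;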
The problem here is obviously writes. You can use replication to always have a local readable copy of the database, but it's tricky to have multiple masters when using replication. I haven't used MySQL Cluster, but it may be what you need.
But since the problem is writes you can possibly implement the writing part of the POS system as a service you send messages to. When the network is down, queue the messages and send them when online.
An easier solution may actually be to always ensure network stability. Set up some mobile (GSM/3G) connection for failover and possibly even a standard POTS telephone line as well.
MySQL Master/Slave Replication would seem the logical approach.
Your MAMP code works directly against the local (slave) database; and when the MAMP server has access to the master database, the databases stay synchronised. If the slave loses access to the Internet, it's still working locally; and when the internet connection is restored it will synchronise again with the master.
With a little care in the database design (particularly auto-increment primary keys, as sketched below), you can run multiple local/slave servers, each with its own local store data, while the master is a repository of the data across all stores.
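For the auto-increment care mentioned above, MySQL's auto_increment_increment and auto_increment_offset settings make each server hand out keys from a disjoint sequence, so generated IDs never collide. A minimal sketch, assuming a master and two stores:

    # my.cnf on the master -- generates ids 1, 4, 7, ...
    [mysqld]
    auto_increment_increment = 3
    auto_increment_offset    = 1

    # my.cnf at store 1 -- generates ids 2, 5, 8, ...
    [mysqld]
    auto_increment_increment = 3
    auto_increment_offset    = 2

    # my.cnf at store 2 -- generates ids 3, 6, 9, ...
    [mysqld]
    auto_increment_increment = 3
    auto_increment_offset    = 3

With N writers, set auto_increment_increment to at least N and give each server a distinct offset; that leaves room to add servers later if you pick a larger increment up front.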
I have a local intranet application which runs off a basic WAMP server in our offices. Every morning, one of our team members manually syncs our internal MySQL db with our external MySQL db (where our online enrollments occur). If a change is made during the day in the intranet application, it is not reflected in the external db until the following day.
I am wondering whether it is possible to (essentially) tunnel to an external MySQL connection from, say, a WAMP or XAMPP server within our offices and work in real time.
Anybody had any luck or advice?
Yes
Replication enables data from one MySQL database server (the master) to be replicated to one or more MySQL database servers (the slaves). Replication is asynchronous: slaves do not need to be connected permanently to receive updates from the master. This means that updates can occur over long-distance connections and even over temporary or intermittent connections such as a dial-up service. Depending on the configuration, you can replicate all databases, selected databases, or even selected tables within a database.
If you use the external server directly, performance is likely to suffer. A Gigabit LAN might be a thousand times faster than your Internet connection - particularly the upload speed of an ADSL connection.
Just make your internal application use the database from the external one. You may need to add permission on the external server to allow connections from your internal server's IP, but otherwise this is just like having a web server and a separate db server that need to access each other.
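The permission part is usually just a statement or two on the external MySQL server; the user name, password, database name, and office IP below are placeholders:

    -- On the external MySQL server: let the office's public IP connect.
    CREATE USER 'intranet'@'203.0.113.5' IDENTIFIED BY 'a-strong-password';
    GRANT SELECT, INSERT, UPDATE, DELETE
        ON enrollments.* TO 'intranet'@'203.0.113.5';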
Beyond that, I can't really tell you how to do this here; it all depends on your specific configuration, which I would think is a little too complicated (and too specialized) to figure out on SO.