I'm using cURL to send information between local and remote MySQL databases in order to keep them synchronised.
Each row that is sent keeps a record of whether or not it has been updated in the central database, and a record of the central database id for its transaction.
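For illustration, the bookkeeping described above might look something like this hypothetical table (the names are my own, not from the setup described):

```sql
-- Hypothetical sketch of per-row sync bookkeeping.
-- "central_id" stores the id the central database assigned to the row;
-- "updated" flags whether the row has been confirmed in the central DB.
CREATE TABLE transactions (
    local_id   INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    payload    TEXT NOT NULL,               -- the data being synchronised
    central_id INT UNSIGNED NULL,           -- NULL until the central DB assigns one
    updated    TINYINT(1) NOT NULL DEFAULT 0  -- 0 = pending, 1 = synced centrally
);
```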
Is there a better way to keep many local databases synchronised with a central remote server?
PS I cannot use database replication because of shared hosting server restrictions.
You have not really answered Alfallouji's question. Nor have you said anything about any of the constraints on your approach.
You said you want to replicate the data somewhere else, when the primary database is the hosted solution. What access is there to 'somewhere else' from the hosted server? Does somewhere else have a static IP address? Is it always up?
Can you access the binary log on the primary server?
How much replication lag can you accommodate?
There are lots of solutions: mysql_proxy, MySQL's built-in asynchronous replication (optionally with manual copying of the replication logs), multi-master replication, replication at the device level with DRBD, or a custom dump tool whose results you rsync.
Situation:
PHP application with a MySQL database running on 2 sites:
online: static IP X.X.X.X
localhost (offline most of the time, with a dynamic IP)
Application traffic is usually low (<10 users).
What I need is that whenever a change is made to the online database, it is pushed to localhost (if it is online, or whenever it becomes available), and vice versa: any change made locally is uploaded to the online database whenever a connection exists.
Is it possible to set up such replication with MySQL, or do I need to write a custom PHP script that pings the master server and synchronises once it is available?
Thanks very much :)
Yes, you can do this with replication. Just pick which server you want to be the master and have the second one send all of its changes to the main one; then the main one can send its changes back.
Replication can be a bit daunting to set up, but once it's up and running it's great. http://dev.mysql.com/doc/refman/5.0/en/replication-howto.html
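In rough outline (host, user, and log coordinates below are placeholders; the real file and position come from SHOW MASTER STATUS on the master), the setup looks like:

```sql
-- On the master: create an account the slave can replicate with.
CREATE USER 'repl'@'%' IDENTIFIED BY 'repl_password';
GRANT REPLICATION SLAVE ON *.* TO 'repl'@'%';

-- On the slave: point it at the master and start replicating.
CHANGE MASTER TO
    MASTER_HOST = 'master.example.com',
    MASTER_USER = 'repl',
    MASTER_PASSWORD = 'repl_password',
    MASTER_LOG_FILE = 'mysql-bin.000001',
    MASTER_LOG_POS = 4;
START SLAVE;
```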
Let's first analyze your question:
The problem of accessing MySQL with a dynamic IP.
This is very easy. Once you have installed MySQL on a server with an ever-changing IP, go to No-IP, DynDNS or any other dynamic DNS service and register with them for free. Once you've registered, you get a client for your operating system. Install it, and you can then access your MySQL server using a domain name.
Example:
Instead of having to access your server by its (ever-changing) IP address, you can access it as mysql-server.easynet.net etc.
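For instance, a replica can then point at the name rather than a fixed IP (a sketch with placeholder credentials, reusing the made-up host name from above):

```sql
-- MySQL resolves the dynamic-DNS name on (re)connect, so the IP can
-- change without the configuration having to change.
CHANGE MASTER TO
    MASTER_HOST = 'mysql-server.easynet.net',
    MASTER_USER = 'repl',
    MASTER_PASSWORD = 'repl_password';
```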
Now the second, and admittedly more complex, part of your question: how to do available and lazy replication.
This is relatively more complex than the previous step, but it comes down to choosing a replication scheme. What you are looking for here is MASTER-MASTER replication, since changes can happen at both MySQL servers; the updates therefore need to be bi-directional, which is exactly what this scheme provides. How to do it? I am providing the links I've found easiest to follow, plus a short configuration sketch after them:
Master-Master Replication
Step-by-step MySQL Master Replication
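In outline, master-master is just two master-slave setups pointing at each other. A minimal sketch, assuming placeholder host names, credentials, and log coordinates (take the real ones from SHOW MASTER STATUS on each server; each server also needs a distinct server-id in my.cnf):

```sql
-- On server A: replicate from server B.
CHANGE MASTER TO
    MASTER_HOST = 'server-b.example.com',
    MASTER_USER = 'repl',
    MASTER_PASSWORD = 'repl_password',
    MASTER_LOG_FILE = 'mysql-bin.000001',
    MASTER_LOG_POS = 4;
START SLAVE;

-- On server B: the mirror image, replicating from server A.
CHANGE MASTER TO
    MASTER_HOST = 'server-a.example.com',
    MASTER_USER = 'repl',
    MASTER_PASSWORD = 'repl_password',
    MASTER_LOG_FILE = 'mysql-bin.000001',
    MASTER_LOG_POS = 4;
START SLAVE;
```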
I hope that would ease your plight and answer your question!
Cheers!
Sure, you can.
You need to set up both MySQL servers as master and slave at the same time.
Configure the online server as master and the localhost server as slave, and wait until replication is OK.
Then configure localhost as master and the online server as slave.
I have already done this on two servers.
As for the dynamic IP on the local host, you can simply use a dynamic DNS service such as No-IP and use the DNS name instead of the IP.
Here's a post I've written (in French, but you can get the configuration snippets from it) on setting up MASTER-MASTER replication with a load balancer (mysql-proxy) for balancing SQL queries between both nodes.
I am creating software for my client in PHP and MySQL. The software will run on the local network, but all of the data should also be viewable online. I would like to know: is there a way to update the online MySQL database from the local one whenever the internet is connected? If the internet is not connected, all data will be stored in the local MySQL server. They won't be adding any data on the online server; they will only view the contents. Kindly help me with this.
It sounds like you are looking for a way to have a read-only, "online" (assuming WAN) MySQL server which is updated from a read-write, "offline" (LAN) MySQL Server which is updated by your users.
If that's the case, you may want to consider a Master/Slave MySQL Replication configuration:
http://www.rackspace.com/knowledge_center/article/mysql-replication-masterslave
You can run a cron job on your local server which monitors connectivity and uploads data to the online server when the internet is available. You need to track (set a status on) which data has been uploaded and which has not, for synchronisation.
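The SQL side of such a job might look like this sketch (table and column names are invented for illustration):

```sql
-- Pull the rows that have not been uploaded yet.
SELECT id, data
FROM local_orders
WHERE uploaded = 0;

-- After the cron job has successfully pushed a row to the online
-- server, mark it as done so it is not sent again.
UPDATE local_orders
SET uploaded = 1
WHERE id = 42;  -- id of the row that was just uploaded
```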
Set up replication between your MySQL databases. The local network database would be the slave and the web server would be the master.
See http://dev.mysql.com/doc/refman/5.5/en/replication-howto.html for setting up replication.
I would suggest you set up master/slave replication.
I would make the local server the master and the internet server the slave.
However, if you are using a common shared hosting service, I am not too sure whether you have permission to set up such a DB replication structure.
Try https://www.digitalocean.com/community/articles/how-to-set-up-master-slave-replication-in-mysql. However, with MySQL versions before 5.6 (I think) you can't delay replication by a predefined interval. That said, http://alexalexander.blogspot.com/2013/03/mysql-slave-delay-how-to.html suggests using something called the Percona Toolkit. I haven't used it, though.
I am developing an application whose database needs to be replicated in both directions across a number of local offline clients.
Here is the general idea:
The user installs the client software on a personal computer.
The user syncs data from the remote database server onto the local database.
The user can now work on this local database, performing inserts, updates, and deletes on it.
At the same time, other people can also insert, update, and delete on the remote database.
The user then connects to the remote database and commits their local changes to it.
The user retrieves the remote changes (made by other people in the remote database) onto their local database.
Right now my idea is to track the changes made in the remote and local databases and create a web service to replicate those changes, but I am running into a problem with primary keys, which might be generated identically in the local and remote databases. This approach is also very lengthy, and I doubt whether it will work in real time.
My question is: is there any technology/tool in MySQL Server that helps me achieve this without creating web services? I have read about MySQL replication, but it works only for one-sided replication, i.e. master-slave, and I need two-sided synchronisation.
You may use "Replication concept" between Master and Slave database.
You can read and get some idea about it.
http://learnmysql.blogspot.in/2010/10/setup-and-test-mysql-replication-in-20.html
http://www.percona.com/doc/percona-xtrabackup/2.1/howtos/setting_up_replication.html
I have been looking into creating a custom MySQL point-of-sale system so that there is one centralised database for inventory levels across multiple stores, online sales, etc.
The biggest problem I see is the unlikely event that the internet drops out in the bricks-and-mortar stores. If this were to happen, could it be set up so that the POS system runs off a local MySQL database on that computer (using MAMP or something similar) and then, once the internet is available again, automatically syncs the databases to update sales and inventory levels?
As for how the actual POS system would be accessed without internet: I was thinking the POS system would run on the server when the internet is available, and when the connection drops it would run from files stored on the machine, pointing to the local database on that machine.
Yes, a minimal & viable solution would just be to have all of the POS data entered locally as well as on the remote database, then it serves as a sort of backup in case anything happens to the central DB.
As far as automating the 'fix' of the central DB after an outage, maybe the best way is to have the central system request sales data from the local DBs of each store. If the workflow is setup like this, then you don't really have to do anything 'special' about internet outages.
The problem here is obviously writes. You can use replication to always have a local readable copy of the database, but it's tricky to have multiple masters when using replication. I haven't used MySQL Cluster, but it may be what you need.
But since the problem is writes you can possibly implement the writing part of the POS system as a service you send messages to. When the network is down, queue the messages and send them when online.
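A minimal sketch of such a queue, with an assumed table layout (names invented for illustration):

```sql
-- Queue of pending writes, filled while the store is offline.
CREATE TABLE pos_write_queue (
    id         INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    message    TEXT NOT NULL,           -- serialised write (e.g. a sale record)
    created_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
    sent       TINYINT(1) NOT NULL DEFAULT 0
);

-- When the network is back, a worker replays unsent messages in order
-- and marks each one off after a successful send.
SELECT id, message FROM pos_write_queue WHERE sent = 0 ORDER BY id;
UPDATE pos_write_queue SET sent = 1 WHERE id = 1;
```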
An easier solution may actually be to always ensure network stability. Set up some mobile (GSM/3G) connection for failover and possibly even a standard POTS telephone line as well.
MySQL Master/Slave Replication would seem the logical approach.
Your MAMP code works directly against the local (slave) database; and when the MAMP server has access to the master database, the databases stay synchronised. If the slave loses access to the Internet, it's still working locally; and when the internet connection is restored it will synchronise again with the master.
With a little care in the database design (particularly auto-increment primary keys), you can run multiple local/slave servers, each with its own local store data, while the master is a repository of the data across all stores.
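For the auto-increment point, one common sketch (the numbers are illustrative; the increment should equal the number of writing servers) interleaves the key ranges so two stores can never generate the same id:

```sql
-- Set in my.cnf, or at runtime as below. With two writing servers,
-- server 1 generates ids 1, 3, 5, ... and server 2 generates 2, 4, 6, ...

-- On server 1:
SET GLOBAL auto_increment_increment = 2;  -- step = number of servers
SET GLOBAL auto_increment_offset = 1;     -- this server's slot

-- On server 2:
SET GLOBAL auto_increment_increment = 2;
SET GLOBAL auto_increment_offset = 2;
```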
I have a local intranet application which runs off a basic WAMP server in our offices. Every morning, one of our team members manually syncs our internal MySQL db with our external MySQL db (where our online enrollments occur). If a change is made during the day in the intranet application, it is not reflected in the external db until the following day.
I am wondering if it is possible to (essentially) tunnel to an external MySQL server from, say, a WAMP or XAMPP server within our offices and work in 'real time'.
Anybody had any luck or advice?
Yes
Replication enables data from one MySQL database server (the master) to be replicated to one or more MySQL database servers (the slaves). Replication is asynchronous: slaves need not be connected permanently to receive updates from the master. This means that updates can occur over long-distance connections and even over temporary or intermittent connections such as a dial-up service. Depending on the configuration, you can replicate all databases, selected databases, or even selected tables within a database.
If you use the external server directly, performance is likely to suffer. A Gigabit LAN might be a thousand times faster than your Internet connection - particularly the upload speed of an ADSL connection.
Just make your internal application use the database on the external server. You may need to add permission on the external server to allow connections from your internal server's IP, but otherwise this is just like having a web server and a separate db server that need to access each other.
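The permissions side might look something like this (user name, database, and IP are placeholders):

```sql
-- On the external server: let the office's public IP connect
-- with just the privileges the intranet application needs.
CREATE USER 'intranet_app'@'203.0.113.10' IDENTIFIED BY 'app_password';
GRANT SELECT, INSERT, UPDATE, DELETE
    ON enrollments.* TO 'intranet_app'@'203.0.113.10';
```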
Can't really tell you how to do this here - it all depends on your specific configuration, something that I would think is a little complicated (and too specialized) to figure out on SO.