Synchronizing a central database with multiple local databases - PHP

I have to develop a project with PHP and MySQL for a sales management system. There are many
outlets. I want to keep one database centrally, and every outlet has its own local database. Users
enter data into the local database; after a while the local data is uploaded to the central database.
Local data will go to the central database, but central data will not go to the local databases.
What would be the procedure for that (e.g. synchronization, replication)?

I wouldn't use synchronisation or replication. I would use an import/export mechanism.
Write a little tool which exports the last day/week/month of data and then sends it over a secure connection to your main database for import.
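A minimal sketch of that idea, assuming each outlet's sales table has a created_at column and the central server exposes an import script at https://central.example.com/import.php (both assumptions, not part of the original answer):

    <?php
    // Export yesterday's sales from the outlet's local database.
    $local = new PDO('mysql:host=localhost;dbname=outlet_db', 'outlet_user', 'secret');
    $stmt  = $local->prepare(
        'SELECT * FROM sales
         WHERE created_at >= CURDATE() - INTERVAL 1 DAY AND created_at < CURDATE()'
    );
    $stmt->execute();
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);

    // Send the batch over HTTPS to the central server's (hypothetical) import script.
    $ch = curl_init('https://central.example.com/import.php');
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, json_encode(array('outlet' => 'outlet-42', 'sales' => $rows)));
    curl_setopt($ch, CURLOPT_HTTPHEADER, array('Content-Type: application/json'));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $response = curl_exec($ch);
    curl_close($ch);

On the central side, import.php would json_decode the payload and INSERT the rows, which keeps the data flow strictly one-way (local to central) as required.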

Depending on the specs of your project (size of data, longevity of data, frequency of sync, etc.) you might have to implement a one-way synchronization, i.e. your clients upload data incrementally, so that only new changes are sent to the server on each sync (no need to re-send all information every time).
You can achieve this in various ways. The simplest is to upload your data to the server and remove it from local storage. If your clients need to keep the uploaded data, then introduce an additional field, "Dirty", in your tables on the client side and use it to mark new changes.
Recently I blogged about a bi-directional sync algorithm which includes an upload-changes step based on a dirty field; it might be helpful to you.
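A hedged sketch of that dirty-flag upload, assuming a client-side sales table with id and dirty columns and a send_to_server() helper that POSTs the batch to the central server (all names are illustrative, not from the post):

    <?php
    // Upload only the rows marked dirty, then clear the flag once the server accepts them.
    $local = new PDO('mysql:host=localhost;dbname=client_db', 'client_user', 'secret');

    $rows = $local->query('SELECT * FROM sales WHERE dirty = 1')->fetchAll(PDO::FETCH_ASSOC);
    if ($rows && send_to_server($rows)) {      // send_to_server(): hypothetical upload helper
        $ids = array();
        foreach ($rows as $row) {
            $ids[] = (int) $row['id'];
        }
        $local->exec('UPDATE sales SET dirty = 0 WHERE id IN (' . implode(',', $ids) . ')');
    }

Clearing the flag only after a successful upload means a failed sync is simply retried on the next run.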

Maybe SymmetricDS (http://www.symmetricds.org) can solve your problem. We're having a similar problem and we've decided to use it.

Related

Import data from an external SQL Server database into a local database using PHP

I have a website where I query one of our external SQL Server databases and insert the records into the website's local database.
I'm simply connecting to the external database, querying the table, truncating the local table, and running a foreach loop to insert the data into the local table.
The process works fine; the problem is that it takes a long time.
I just want to see if you guys could give me some hints on how to speed up the process. If there is a better way to do this, please let me know.
There are many factors that determine the best approach. Is the database you are copying to supposed to always be the same as the source, or will it have entries that are not in the source? If you just want them to be identical, and the website database is simply a read-only clone, SQL Server gives you several options: replication, log shipping, mirroring, SSIS packages. It all depends on how frequently you want to synchronize the databases and a lot of other factors.
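If the copy stays as a PHP script, one common speedup is to batch rows into multi-row INSERT statements instead of issuing one INSERT per iteration of the foreach. A rough sketch, assuming a PDO connection in $local, the fetched rows in $externalRows, and a three-column products table (all assumptions for illustration):

    <?php
    // Insert in batches of 500 rows instead of one statement per row.
    $batchSize = 500;
    $buffer    = array();
    foreach ($externalRows as $row) {
        $buffer[] = sprintf('(%s, %s, %s)',
            $local->quote($row['id']),
            $local->quote($row['name']),
            $local->quote($row['price']));
        if (count($buffer) >= $batchSize) {
            $local->exec('INSERT INTO products (id, name, price) VALUES ' . implode(',', $buffer));
            $buffer = array();
        }
    }
    if ($buffer) {
        $local->exec('INSERT INTO products (id, name, price) VALUES ' . implode(',', $buffer));
    }

Wrapping the whole import in a transaction usually helps as well, since each individual INSERT no longer has to be committed separately.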

Accessing a database located on another computer using PHP

My client has an offline product database for a high street shop that they update fairly frequently for their own purposes. They are now creating an online store for which they want to use the product information from this database.
Migrating the database to a hosted server and abandoning the offline database is not an option due to their current legacy software setup.
So my question is: how can I get the information from their offline database into an online database? Their local server is always connected to the internet, so is it possible to create a script on the website that somehow grabs the data from their server and imports it into the online server? If this ran every 24 hours it would be perfect. But is it even possible? And if so, how would I do it?
The only other option I can think of is to manually upload the database after every update, but that isn't really viable.
I did something like this with QuickBooks using an ODBC connection, and used that to sync data to MySQL. This synchronization, however, was one-way only. Unless you have keys in the data that indicate when something was changed (an updated date), you will end up syncing a lot of extra data.
Using SQLyog, I set up a scheduled job that connected to the ODBC data source and pushed the changes since the last sync to the MySQL database I was using to generate reports. If you can get the data replicated into MySQL, it should be easy at that point to make use of it in your online store.
The downside is that it won't be real-time. Inventory could become a problem.
In an ideal world I would look at creating a RESTful API that would run on the same server, or at least on the same network, as your offline database. This RESTful API would run as a web server over HTTP and return JSON or even XML structures of data from the offline database. Clients running on the internet would be able to connect and fetch any data they need, at any time. A RESTful API like this has a number of advantages.
Firstly, it's secure. You don't have to open up an attack vector to the public by exposing connections to your offline database. The only thing you have to do is enable public access to your RESTful API. In your API's logic you might not even include functionality to write to the database, so even if your API's security is compromised, at the very worst attackers can only read your data, not corrupt it.
Having a RESTful API in this situation also represents a good separation of concerns. Your client code should not know anything about the database, nor about any internal systems the offline database uses. What happens when your clients want to update their offline system, or even change it? In that situation all you would have to do is update the RESTful API. The client that consumes the data no longer cares about anything but the API, so changing databases would be easy.
Another reason to consider an API is concurrency. I hinted at this before, but having an API would be great if you ever need more than one client accessing the offline database's data. In a web server setup where the API sits waiting for requests, there is no reason why you could not have more than one client connecting to the API at the same time. HTTP is really good at this!
You talked about having to place old data in a new database. Something like this could be done easily with a RESTful API, as you would just have to map the endpoints of your API to tables in the new database and run that when you need to. You could even forgo the new database and use the API as your backend. That solution would require some caching, but it would cut down on duplicating a database if you don't feel it's needed.
The drawback to all of this is that writing an API rather than a script is more complex. So in this situation I believe in horses for courses. If this database is the backbone of a long-term project that will be expanding in the future, an API is the way to go. If it's a small part of your project, then maybe you can swing it with a script that runs every 24 hours; however, I have done this before, and the second I have to change/edit the solution things start getting a little "hairy". Hope this helps and good luck with it.
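As a rough illustration of the read-only endpoint idea (table, columns and credentials are made up for the example, and a real API would also need authentication):

    <?php
    // products.php - minimal read-only endpoint exposing the offline product table as JSON.
    header('Content-Type: application/json');

    $db = new PDO('mysql:host=localhost;dbname=shop_offline', 'api_reader', 'secret');

    // Only SELECTs are issued here, so even a compromised endpoint cannot modify the offline data.
    $stmt = $db->prepare('SELECT sku, name, price, stock FROM products WHERE updated_at >= :since');
    $stmt->execute(array(':since' => isset($_GET['since']) ? $_GET['since'] : '1970-01-01'));

    echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));

The online store could then fetch products.php?since=<last import time> once every 24 hours and only process the products that changed.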

Phonegap: populate local database with data from external server

I am writing an Android application that needs to work both offline and online. In order to do that I need to get a MySQL database from an external server and save that data into the phone's local database so it can be read from there when the user has no connection. Is it possible, and if so, how can I do it?
Refer to the following thread, in which I have mentioned JavaScript frameworks that can be used to store data locally. The frameworks use the standard local storage mechanisms, but they make it easier to store and retrieve data and also make it portable across many platforms (this again depends on the choice of framework).
What to Use for PhoneGap Database Storage
You will still need to load the data from the server, though.
This sounds like you need to write an API which does CRUD operations on a database.
There is no trivial way to simply do this directly with MySQL - you'll have to write one.
There are frameworks designed specifically for this, for example FRAPI: http://getfrapi.com/

Connect a database with external tables

I have never done anything like this before.
I have a system and I need to relate my data with external data (in another database).
My preference is to pull that data in and create my own tables, but in that case my own tables will become obsolete whenever the other databases are updated.
So, basically, I need to synchronize my tables with the external tables, or just read the external data values.
I don't have any idea how I can connect to and relate data from ten external databases.
Basically, I need to check whether a user is registered on other websites.
Any help?
I am currently doing something similar.
The easiest way I found is to pull the data in. I do bi-directional synchronisation in my project, but you haven't mentioned that, so I imagine a data pull is what you are aiming for.
You need user accounts on the other servers, and each account needs to be created with your IP instead of 'localhost'. You will then connect from your end through a MySQL client using the IP of the distant host instead of the usual localhost.
See this page for a bit more info.
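A minimal sketch of such a pull, assuming the remote admin has created an account scoped to your server's IP (the GRANT in the comment is what they would run on their side; hosts, credentials, table and column names are placeholders):

    <?php
    // On the remote server, an account restricted to your IP would be created with something like:
    //   GRANT SELECT ON their_db.* TO 'sync_user'@'203.0.113.10' IDENTIFIED BY 'secret';
    // You then connect to the remote host's IP instead of localhost.
    $remote = new mysqli('198.51.100.20', 'sync_user', 'secret', 'their_db');
    if ($remote->connect_errno) {
        die('Connection failed: ' . $remote->connect_error);
    }

    // Example check: is this email registered on the other site?
    $email = 'someone@example.com';
    $stmt  = $remote->prepare('SELECT 1 FROM users WHERE email = ? LIMIT 1');
    $stmt->bind_param('s', $email);
    $stmt->execute();
    $stmt->store_result();
    $isRegistered = ($stmt->num_rows > 0);

The same connection can be repeated for each of the ten external databases, or wrapped behind an abstraction layer as suggested below.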
If, like me, you have to interface with different database server types, I recommend using a database abstraction library to manage the data in a seamless way across different SQL servers. I chose the Zend_Db components, used standalone with Zend_Config, as they support MySQL and MSSQL.
UPDATE - Using a proxy DB to access mission critical data
Depending on the scope of your project, if the data is not accessible straight from the remote database, there are different possibilities. To answer your comment, I will tell you how we resolved the same issues on the current project I am tied to. The client has a big MSSQL database behind his business-critical application; accounting, invoicing, inventory, everything is handled by one big app tied to MSSQL. My mandate is to install a CRM, running on MySQL by the way, and synchronise the customers of his mission-critical MSSQL app into it.
I did not want to access this data straight from my CRM; the CRM should never touch their main MSSQL DB. I am certainly not willing to take responsibility for something going wrong down the line: even though in theory that should not happen, in practice theory is often worthless. The recommendation I gave (and which was implemented) was to set up a proxy database on their end. A task on the same MSSQL instance copies the data into that second database nightly, and that one I am free to access remotely. A user was created on MSSQL with access to just the proxy, and connections are accepted from just one IP.
My script has to sync both ways, so in my case I do a nightly 'push' of the modified records from MSSQL to the CRM and a 'pull' of the added CRM records into the proxy DB. The intern gets notified by email of new records in the proxy to enter into their MSSQL app. I hope this was clear enough; I realize it's hard to convey in a few lines. If you have other questions, feel free to ask.
Good luck!
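To make the nightly pull a bit more concrete, here is a rough sketch of copying only the records modified since the last run, assuming a modified_at column in the proxy, a stored last-run timestamp, the pdo_sqlsrv driver on the CRM box, and a UNIQUE key on customers.external_id (all assumptions for illustration):

    <?php
    // Nightly pull: copy records changed in the proxy DB since the last run into the CRM.
    $proxy = new PDO('sqlsrv:Server=proxy.example.com;Database=proxy_db', 'crm_sync', 'secret');
    $crm   = new PDO('mysql:host=localhost;dbname=crm', 'crm_user', 'secret');

    $lastRun = trim(file_get_contents('/var/lib/crm-sync/last_run.txt')); // e.g. '2013-05-01 02:00:00'
    $now     = date('Y-m-d H:i:s'); // taken before the query so changes made during the run are not missed

    $stmt = $proxy->prepare('SELECT customer_id, name, email FROM customers WHERE modified_at > ?');
    $stmt->execute(array($lastRun));

    $upsert = $crm->prepare(
        'INSERT INTO customers (external_id, name, email) VALUES (?, ?, ?)
         ON DUPLICATE KEY UPDATE name = VALUES(name), email = VALUES(email)'
    );
    foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $row) {
        $upsert->execute(array($row['customer_id'], $row['name'], $row['email']));
    }

    file_put_contents('/var/lib/crm-sync/last_run.txt', $now);

The 'push' direction works the same way in reverse, reading new CRM records and inserting them into the proxy for the intern to process.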
You have to download a backup (gzip, zip) of the wanted part (or all) of the database and upload it to the other database.
By the way, cron jobs won't help you at this point, because you can't get access to any DB from outside.
Does the other website have an API for accessing such information? Are they capable of constructing one? If so, that would be the best way.
Otherwise, I presume your way of getting data from their database is by querying it directly. That can work too: just make a mysql_connect to their location and query it as if it were your own database. Note: their DB will have to be set up to accept outside connections for this method to work.

Sync large local DB with server DB (MySQL)

I need to sync a large (3 GB+ / 40+ tables) local MySQL database to a server database weekly.
The two databases are exactly the same. The local DB is constantly updated, and every week or so the server DB needs to be updated with the local data. You could call it a 'mirrored DB' or 'master/master', but I'm not sure if that is correct.
Right now the DB only exists locally. So:
1) First I need to copy the DB from local to server. Export/import with phpMyAdmin is impossible because of the DB size and phpMyAdmin's limits. Exporting the DB to a gzipped file and uploading it through FTP will probably break in the middle of the transfer because of connection problems or the server's file size limit. Exporting each table separately will be a pain, and each table will also be very big. So, what is the better solution for this?
2) After the local DB is fully uploaded to the server, I need to update the server DB weekly. What is the better way of doing that?
I have never worked with this kind of scenario, I don't know the different ways of achieving it, and I'm not particularly strong with SQL, so please explain as clearly as possible.
Thank you very much.
This article should get you started.
Basically, get Maatkit and use the sync tools in there to perform a master-master synchronization:
mk-table-sync --synctomaster h=serverName,D=databaseName,t=tableName
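For the initial full copy in point 1), one common approach (a hedged sketch, assuming shell access on both machines; database names, user and paths are placeholders) is to dump and compress the database in one step, transfer the file with a resumable tool such as rsync, and load it on the server:

    mysqldump --single-transaction --routines local_db | gzip > local_db.sql.gz
    rsync --partial --progress local_db.sql.gz user@server:/tmp/
    gunzip -c /tmp/local_db.sql.gz | mysql -u dbuser -p server_db

The first two commands run locally and the last one on the server; rsync's --partial flag lets an interrupted transfer resume instead of starting over.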
You can use a data comparison tool for MySQL.
Set up a synchronization template that specifies which tables and which data to synchronize.
Schedule a weekly run of the template.
I have two servers synchronized daily with dbForge Data Compare via the command line.
