I am making an inventory system, and I want it to have an online and an offline mode. I want the offline mode so that I can insert, delete, and edit data even when there's no internet connection.
Moreover, I want to merge the data from the local database with the data from the online database once there is an internet connection. How can I do this with PHP and MySQL?
Can anyone help me solve this? Do you have an idea of how to do this stuff?
You can add a uuid (GUID), a datetime, and an operation (insert, update, delete) column to every table. Each system must also save the datetime of every successful sync somewhere.
When there is a connection, you can then sync all table rows changed after the last successful sync. The weak point of this method is two different systems changing the same row; maybe you can design the system around inserts and deletes instead of updates.
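A minimal PHP sketch of this idea, assuming a hypothetical items table carrying the extra sync columns and a one-row sync_state table holding the last successful sync time (all names and connection details here are placeholders):

<?php
// Pull every row that changed since the last successful sync.
$pdo = new PDO('mysql:host=localhost;dbname=inventory', 'user', 'pass');

$lastSync = $pdo->query("SELECT last_sync FROM sync_state")->fetchColumn();
$stmt = $pdo->prepare(
    "SELECT uuid, name, qty, operation, modified_at
       FROM items
      WHERE modified_at > ?"
);
$stmt->execute([$lastSync]);
$changes = $stmt->fetchAll(PDO::FETCH_ASSOC);

// ... apply $changes to the other database (insert/update/delete by uuid) ...

// Only after every row has been applied successfully, record the sync time.
$pdo->exec("UPDATE sync_state SET last_sync = NOW()");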
Related
I'm trying to develop an Android app that provides information on a chosen topic. All the information is stored in a MySQL database, with one table per topic. What I want to achieve is that when the user chooses a topic, the corresponding table is downloaded to SQLite so that it can be used offline. Also, any changes to that particular table in MySQL should be synced to the SQLite db automatically the next time the phone connects to the Internet.
I have understood how to achieve the connection using PHP and HTTP requests. What I want to know is the best logic to sync any entries in a particular table in the MySQL database to the one in SQLite. I read about using various sync services, but I don't understand how to use them. All my tables have exactly the same schema, so is there an efficient way to achieve the sync?
I have decent knowledge of SQL, but I'm fairly new to Android.
I am executing a SQL query using transaction.executeSql() against a SQLite db with JavaScript (Cordova). I need to store the last executed query in a table for future use (sync). I didn't find anything built in for that. Are there any suggestions?
db.transaction(function(tx) {
    tx.executeSql("INSERT INTO tablename (id, name) VALUES (?, ?);", values, function(tx, results) {
        // I need the last executed query here.
    });
});
The data is updated in SQLite while offline. When the user clicks the "Sync" button, I have to push all of this user's updates to the server. So I thought I would store all queries executed by this user in a table; when he/she clicks the sync button, I would just execute those queries on the live MySQL database.
Edit 1:
This is not for backup purposes only. A user can add new items and update them, and all of the other users should get these changes in their db after a sync. Notably, communication will be bidirectional: first the SQLite db will be updated from the live server, then I will execute all of the saved queries on the live server. That is the plan.
Is there any better approach?
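For the server side of this plan, a minimal PHP sketch could accept the queued statements as JSON and replay them in one transaction. The payload shape ({sql, params}) and connection details are assumptions, and in practice you would want to whitelist what a client may send rather than execute arbitrary SQL:

<?php
// Replays a JSON array of queued statements from the app against live MySQL.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass',
               [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]);

$queue = json_decode(file_get_contents('php://input'), true);

$pdo->beginTransaction();
try {
    foreach ($queue as $job) {
        // $job = ['sql' => 'INSERT INTO ... VALUES (?, ?)', 'params' => [...]]
        $pdo->prepare($job['sql'])->execute($job['params']);
    }
    $pdo->commit();
    echo json_encode(['status' => 'ok']);
} catch (Exception $e) {
    $pdo->rollBack();
    echo json_encode(['status' => 'error', 'message' => $e->getMessage()]);
}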
If the size of your database is reasonable, you can compress it and synchronize the ".db.gz" file. If you store each SQL query and sync it, you might end up with something like this:
DELETE FROM x WHERE a=1
UPDATE x SET a=2, b=3
INSERT INTO x SET c=2
But if you sync the ".db" file, it would ONLY contain INSERT statements, along with the schema definition. On the server side, however, you would need to import this into MySQL efficiently.
If it's only for backup purposes, and you don't need to generate any reports or merge all that user data, you can get by with just storing the SQLite file on the server. But I don't know what exactly your project is about.
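If you do need the data inside MySQL, a rough sketch of the server-side import in PHP could look like the following (assuming the pdo_sqlite extension is available; the upload field, file paths, and the items table are placeholders):

<?php
// Decompress the uploaded .db.gz to a plain SQLite file.
$gz  = gzopen($_FILES['db']['tmp_name'], 'rb');
$out = fopen('/tmp/client.db', 'wb');
while (!gzeof($gz)) {
    fwrite($out, gzread($gz, 8192));
}
gzclose($gz);
fclose($out);

// Copy one table's rows from the SQLite file into MySQL.
$sqlite = new PDO('sqlite:/tmp/client.db');
$mysql  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

$insert = $mysql->prepare("REPLACE INTO items (id, name, qty) VALUES (?, ?, ?)");
foreach ($sqlite->query("SELECT id, name, qty FROM items") as $row) {
    $insert->execute([$row['id'], $row['name'], $row['qty']]);
}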
I'm using sync in several enterprise apps with backends; this is my approach:
Use SSL if at all possible.
In my workflow, the devices are registered by scanning a barcode on the backend website. As a fallback, the registration can be done in a form on the device.
I encrypt the data in transit. Every device has its own key, which gets to the device via the barcode/form.
I have the following client tables in the backend:
tblClients: deviceUUID,deviceType, owner, encryptKey, createdDateTime, …
tblClientLogs: Logs all connections between device and backend
tblClientDelete: deviceUUID, tableName, rowUUID
Every table which has to be synced has a created and a modified column. Both are datetimes and are set by triggers.
On a sync task, the device sends its UUID to the server; the server checks for a valid registration, then collects the rows created since the last sync, the rows modified since the last sync, and the row-delete jobs.
The sync datetime is saved in tblClientLogs.
The encrypted data the backend sends to the client are objects that look like:
tblWhatEver -> create -> {key:value, key:value, …}
or
tblWhatEver -> modify -> rowUUID {key:value, key:value, …}
or
tblWhatEver -> delete -> rowUUID
How the delete jobs work:
If some data is deleted in the backend, I look up all deviceUUIDs and save a row in tblClientDelete for each:
deviceUUID, tableName, rowUUID
After a sync task, the device's delete jobs are removed from tblClientDelete.
After the device receives the data, it decrypts it and creates a SQL statement for every record. The SQL jobs are executed in a loop.
Once all jobs are done on the device, it sends a small report to the backend to check that everything went well.
In every device table I use UUIDs instead of integers to avoid collisions. The devices can also send data to the server; it works similarly to the «downloads».
The timezone of my apps and the servers is the same, independent of the device location.
In the backend and the apps I have included a forced sync from the beginning, in case something goes wrong.
In some apps I have more options; there I can change the database and table structure via the backend.
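To make the flow concrete, here is a condensed, hypothetical PHP sketch of such a sync endpoint, reusing the table names above (tblClientLogs, tblClientDelete, tblWhatEver); the syncedAt column and all connection details are assumptions, and the registration check and encryption are left out:

<?php
$pdo = new PDO('mysql:host=localhost;dbname=backend', 'user', 'pass');

$deviceUUID = $_POST['deviceUUID'];

// 1. Find this device's last successful sync.
$stmt = $pdo->prepare("SELECT MAX(syncedAt) FROM tblClientLogs WHERE deviceUUID = ?");
$stmt->execute([$deviceUUID]);
$lastSync = $stmt->fetchColumn() ?: '1970-01-01 00:00:00';

// 2. Rows created or modified since then (shown for one synced table).
$stmt = $pdo->prepare("SELECT * FROM tblWhatEver WHERE created > ? OR modified > ?");
$stmt->execute([$lastSync, $lastSync]);
$changes = $stmt->fetchAll(PDO::FETCH_ASSOC);

// 3. Pending row-delete jobs for this device.
$stmt = $pdo->prepare("SELECT tableName, rowUUID FROM tblClientDelete WHERE deviceUUID = ?");
$stmt->execute([$deviceUUID]);
$deletes = $stmt->fetchAll(PDO::FETCH_ASSOC);

// 4. Log the sync and hand everything to the device.
$pdo->prepare("INSERT INTO tblClientLogs (deviceUUID, syncedAt) VALUES (?, NOW())")
    ->execute([$deviceUUID]);
echo json_encode(['changes' => $changes, 'deletes' => $deletes]);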
I have an online SQL database with a few tables for users, matches, and bets. When I update the result of any game, I need the status of all bets containing that game in the bets table to be updated. So, for example, if I update game 8 with the result "home win", I need all bets which include game 8 to be updated as either lost, won, or still open.
The way I currently do this: when the user opens my Android app, I retrieve all the information about the games and all the information about the user's bets using AsyncTasks. I then do some string comparisons in the app and update the data in my database using another AsyncTask. The issue is that this wastes a lot of computation time and makes my app's UI laggy.
As someone with minimal experience with PHP and online databases, I'd like to ask: is there a way to carry out these operations in the database itself, either periodically (every 3 hours, for example) or whenever the data in the games table changes, for example via a PHP file that is run automatically?
I tried looking for some kind of onDataChanged function but couldn't find anything. I'm also not sure how to make a PHP file run and update data without the app being involved.
Another idea I had was to create a very simple app which I wouldn't distribute to anyone but would just keep on my phone, with an update button I could press to trigger a PHP file that carries out these operations for all users in my database.
I would appreciate some advice on this from someone who has experience.
Thanks :).
You can easily execute a PHP script periodically if your hosting provider supports a scheduler like cron.
About updating the game status in multiple tables: first check the tables' engine. If you are using an engine like InnoDB, you can create foreign-key relationships between those tables, so that a change to one row cascades to the rows connected to it.
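For the periodic route, a small PHP script run by cron could settle the bets entirely on the server. This is only a sketch; the games and bets tables and their columns (result, pick, status) are assumptions:

<?php
// settle_bets.php: mark bets won/lost once the game they reference has a result.
$pdo = new PDO('mysql:host=localhost;dbname=betting', 'user', 'pass');

$pdo->exec(
    "UPDATE bets b
       JOIN games g ON g.id = b.game_id
        SET b.status = IF(b.pick = g.result, 'won', 'lost')
      WHERE g.result IS NOT NULL
        AND b.status = 'open'"
);

A crontab entry such as 0 */3 * * * php /path/to/settle_bets.php would run it every 3 hours without involving the app at all.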
I have a CSV file with information about our inventory that gets changed locally and then uploaded to my web server at night. The website also has a copy of the inventory information in its MySQL database, which might also have changed.
What I want to accomplish is a two-way sync between the inventory information in the database and the uploaded CSV file. Parsing the CSV and extracting the info from the database isn't a problem, but now that I have the two sets of data, I'm struggling to figure out how to sync them.
If a record differs between the CSV and the database, how do I know which one to use? I really don't want to resort to having my users timestamp every change they make in the CSV. Is there some way I can tell which information is more current?
Any help is greatly appreciated.
P.S. Just in case you're wondering, I tagged this question PHP because that's the language I'll be using to accomplish the syncing.
You should create a timestamp field and have the application update the timestamp every time the record changes.
I have built a similar app before, where multiple sites sync records up and down based on three timestamps: one to track when the record was last updated, one to track when the record was deleted, and one to track when the changes were copied to this PC.
Then, on every PC, I also track the last time the records were synchronized with each other PC.
This way, the latest record can always be propagated to all the PCs.
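As a sketch of that merge rule in PHP (assuming both sides are keyed by a shared ID and carry a hypothetical updated_at field):

<?php
// Given CSV rows and database rows keyed by ID, the newer updated_at wins.
function mergeRecords(array $csvRows, array $dbRows): array
{
    $merged = $dbRows;
    foreach ($csvRows as $id => $csvRow) {
        if (!isset($merged[$id]) ||
            strtotime($csvRow['updated_at']) > strtotime($merged[$id]['updated_at'])) {
            $merged[$id] = $csvRow; // the CSV copy is newer (or new)
        }
    }
    return $merged;
}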
This is more of a versioning issue. A simple solution would be to compare all 'lines' or 'records' (if you have unique identifiers) and ask the user to pick the right values.
I am building a web-based ERP application for the retail industry using PHP and MySQL. I am going to have several local databases and one on the server (same structure). What I plan to do is run this app on localhost in the different stores and, at the end of the day, update the database on the server from the different localhosts.
Remember, I would like to update the database on the server based on the sequence in which the queries were run in the different databases.
Can anyone please help me with this?
Thank you.
Perhaps link to your main database from the localhost sites to begin with? Then there's no need to update at the end of the day; every change is made directly to the server database with no "middle men", so to speak. If you need the local databases to stay separate, run the queries on both at once?
Note: I'm unfamiliar with how an ERP application works, so forgive me if I'm way off base here.
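The "run the queries on both at once" idea could look roughly like this in PHP; hosts and table names are placeholders, and note that this alone gives you no offline fallback if the link to the server is down:

<?php
// Execute every write against the local store database and the central
// server in step.
$local  = new PDO('mysql:host=localhost;dbname=store', 'user', 'pass');
$remote = new PDO('mysql:host=central.example.com;dbname=erp', 'user', 'pass');

function dualWrite(PDO $local, PDO $remote, string $sql, array $params): void
{
    foreach ([$local, $remote] as $pdo) {
        $pdo->prepare($sql)->execute($params);
    }
}

dualWrite($local, $remote,
    "INSERT INTO sales (uuid, item, qty) VALUES (?, ?, ?)",
    ['0f8fad5b-d9cb-469f-a165-70867728950e', 'SKU-123', 2]);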
You may have to log every INSERT/UPDATE/DELETE SQL request in a daily file on the local databases, with a timestamp for each request.
Example:
2012-03-13 09:15:00 INSERT INTO...
2012-03-13 09:15:02 UPDATE MYTABLE SET...
2012-03-13 09:15:02 DELETE FROM...
...
Then send your log files to the main server daily, merge all the files, sort the lines to preserve execution order, and read the resulting file to execute the requests on the main database (see the sketch below).
However, it's a curious way to do things in an ERP application. Product stock information can't be merged; it's shared information, so be careful with this kind of data.
You can't use auto-increment keys with this process; it would cause duplicate keys on some requests, or updates applied to the wrong records.
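A rough PHP sketch of that nightly merge-and-replay step (log paths are placeholders, and as noted above it assumes UUID keys rather than auto-increment IDs):

<?php
// Read each store's daily log (lines like "2012-03-13 09:15:00 INSERT INTO ..."),
// sort all lines by their leading timestamp, and replay them on the main db.
$lines = [];
foreach (glob('/logs/store_*.log') as $file) {
    $lines = array_merge($lines, file($file, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES));
}
sort($lines); // "YYYY-mm-dd HH:ii:ss ..." sorts chronologically as a string

$pdo = new PDO('mysql:host=localhost;dbname=erp_main', 'user', 'pass');
foreach ($lines as $line) {
    $sql = substr($line, 20); // strip the 19-char timestamp plus the space
    $pdo->exec($sql);
}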