Two-Way Sync Logic - PHP

I have a CSV file with information about our inventory that gets changed locally and then uploaded to my web server at night. The website also has a copy of the inventory information in its MySQL database that might have also changed.
What I want to accomplish is a two-way sync between the inventory information in the database and the CSV file that's uploaded. Parsing the CSV and extracting the info from the database isn't a problem, but now that I have the two sets of data, I'm struggling to figure out how to sync them.
If a record is different between the CSV and database, how do I know which one to use? I really don't want to resort to having my users time stamp every change they make on the CSV. Is there some way I can tell which information is more current?
Any help is greatly appreciated.
P.S. Just in case you're wondering, I tagged this question PHP because that's the language I'll be using to accomplish the syncing.

You should create a timestamp field, and have the application update the timestamp every time the record changes.
I have built a similar app before, where multiple sites synced records up and down based on three timestamps: one to track when the record was last updated, one to track when the record was deleted, and one to track when the changes were copied to this PC.
Then on every PC, I also track the last time the records were synchronized with each other PC.
This way, the latest record can always be propagated to all the PCs.
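
As a rough sketch of the comparison once each record carries such an updated-timestamp (the field names and CSV layout here are illustrative, not from the question):

    // Minimal sketch: last-write-wins merge for one record, assuming both
    // the CSV row and the DB row carry an 'updated_at' column.
    function mergeRecord(array $csvRow, array $dbRow): array
    {
        $csvTime = strtotime($csvRow['updated_at']);
        $dbTime  = strtotime($dbRow['updated_at']);

        // Keep whichever copy changed more recently; on a tie, keep the DB copy.
        return ($csvTime > $dbTime) ? $csvRow : $dbRow;
    }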

This is more of a versioning issue. A simple solution would be to compare all 'lines' or 'records' (if you have unique identifiers) and ask the user to pick the right values.
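
For instance, a sketch of that comparison step, assuming both data sets have been loaded into arrays keyed by a unique identifier (all names here are illustrative):

    $conflicts = [];
    foreach ($csvRecords as $id => $csvRow) {
        // A record that exists on both sides but differs is a conflict
        // for the user to resolve.
        if (isset($dbRecords[$id]) && $dbRecords[$id] != $csvRow) {
            $conflicts[$id] = ['csv' => $csvRow, 'db' => $dbRecords[$id]];
        }
    }
    // Present $conflicts to the user and apply whichever values they pick.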


I'm hitting a race condition in my Laravel application when trying to conditionally INSERT or UPDATE, any suggestions...?

My users need to be able to upload files to my site, so I've implemented a file uploader widget on the frontend. It allows for multiple uploads at once, and each upload triggers code one file at a time to save the file to the DB.
The problem is that files need to be stored as an array in a single row in the database (I know, I know... legacy reasons).
In English pseudocode, here's what's happening:
Laravel sees a new file has been uploaded
Laravel checks whether or not any files (at all) have been uploaded to this entity
No files have been uploaded yet? Create a new record to store that file.
There are already files for this entity? Update the existing record to add this file to the array.
The problem is that when multiple files are uploaded at once in quick succession for the first time, Laravel enters the first file in the database moments after the second file has conducted its check to see if any files already exist. So we end up with duplicate rows, rather than everything being merged into a single record.
If I upload 5 files at once, typically I'll get 4 rows in the database: 3 single entries and one double entry that managed to catch up in time.
Any practical ways to get around this problem? I know I should be using a many-to-one database schema here, but I've greatly simplified an already complex situation for brevity!
This is Laravel 5.2 using a MySQL InnoDB database.
Plan A: When you see one new file, wait a little while. Then look for 'all' the files and deal with them.
Plan B: Store a timestamp with the record. When another file is noticed, see if there is an existing record with a 'recent' timestamp. If so, assume it is part of the same 'group'.
Both Plans will occasionally have hiccups -- mostly because of the vague definition of "at once".
Perhaps you currently have another problem: A file is half uploaded when you grab it, and you get an incomplete file?
The 'real' answer is to have some 'message' uploaded when they are finished. Then, don't start the processing until you see it. (I realize this is probably not practical.)
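
For the conditional INSERT-or-UPDATE itself, one atomic alternative to the plans above (a sketch, not from this answer: it assumes MySQL 5.7+, a hypothetical uploads table with a JSON files column, and a unique index on entity_id) is to let MySQL resolve the merge in a single statement:

    // One round trip; the server decides insert-vs-update atomically, so
    // two concurrent uploads can no longer interleave check and write.
    DB::statement(
        "INSERT INTO uploads (entity_id, files)
         VALUES (?, JSON_ARRAY(?))
         ON DUPLICATE KEY UPDATE files = JSON_ARRAY_APPEND(files, '\$', ?)",
        [$entityId, $fileName, $fileName]
    );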

Merge local and online database using PHP

I am making an inventory system, and I want it to have both an online and an offline mode. The offline mode should let me insert, delete, and edit data even when there's no internet connection.
Moreover, I want to merge the data from the local database with the data from the online database when there is an internet connection. How do I do this with PHP and MySQL?
Can anyone help me solve this? Do you have an idea of how to do this stuff?
You can add UUID (GUID), datetime, and operation (insert, update, delete) columns to every table. Each system must also save the datetime of every successful sync somewhere.
When there is a connection, you can sync all rows changed since the last successful sync. The weak point of this methodology is two different systems changing the same row; maybe you can design the system to use insert and delete instead of update.
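
A minimal sketch of the pull step under that design (table, column, and helper names are illustrative; $lastSync holds the stored datetime of the previous successful sync):

    // Pull every row changed on the remote side since the last good sync.
    $stmt = $remotePdo->prepare(
        'SELECT uuid, operation, payload, changed_at
           FROM inventory_changes
          WHERE changed_at > :last_sync'
    );
    $stmt->execute([':last_sync' => $lastSync]);

    foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $change) {
        // Apply each remote change locally, keyed by UUID so rows created
        // offline on either side never collide on auto-increment ids.
        applyChange($localPdo, $change); // hypothetical helper
    }

    // Record the new sync time only after everything has applied cleanly.
    saveLastSyncTime($localPdo, date('Y-m-d H:i:s')); // hypothetical helper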

Retrieving timestamp of MySQL row creation from metadata?

I have been maintaining a MySQL table of rows of data inserted from PHP code for about a year and a half. Really stupid, I know: I didn't include an insertion timestamp for these rows.
Is there any possible way that I can retrieve the timestamp of creation of these rows through some metadata or some other way? (Of MySQL or PHPMyAdmin or some other possible ways?)
Unfortunately, there's nothing you can do in this case. If MySQL had a secret timestamp field, the general size of tables would increase by 4 bytes per row.
The only way you can get those timestamps is if they were saved somewhere on one of your servers. You have a web server, for which you may keep an archive of logs, or some other place where there is a timestamped record of the PHP script's requests to the database.
Say you have web server logs with an entry for each (or most) of the PHP script's activity; then, potentially, you can parse those logs, get the timestamps, and map them to the rows in your database. As you can see, this is quite laborious, but not utterly impossible.
As for MySQL (or any other database), it does not normally keep a big archive of past information. The main reason is that it is up to the developer or designer of the application to decide what information should be kept; the database stores only the data needed for all its parts to run healthily.
One more idea: if you have a transaction log archive (which I really doubt), you could replay it against a backup of the database, and the transaction logs may contain the timestamp of each row being added or changed.
If you are lucky, you have records in other tables that depend on the record you are interested in. These records may have a timestamp when they were created.
So you have at least a ballpark when the record you care about may have been created.
Other than that, the rate at which the primary key usually grows may provide another estimate of when your record was created.
But yes, these are just estimates. Other approaches are mentioned in the other existing answers.
Here is a way to set this up for rows going forward:
https://www.marcus-povey.co.uk/2013/03/11/automatic-create-and-modified-timestamps-in-mysql/
I hope this can help.
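
The gist of that link, as a sketch (it assumes MySQL 5.6+, which allows two auto-initialized TIMESTAMP columns per table, and a hypothetical inventory table); note this only helps rows inserted from now on:

    $pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'password');

    // MySQL fills these in automatically on INSERT and UPDATE respectively.
    $pdo->exec('
        ALTER TABLE inventory
          ADD COLUMN created_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
          ADD COLUMN updated_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP
              ON UPDATE CURRENT_TIMESTAMP
    ');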

Should I use a txt file or the database for a "who is online" module?

I have a project which must check who is online.
I'm thinking of two ways to store data:
1. Save each user's last access time in the database, and get the online users from a query.
2. Save each user's last access time and other info in a file; every time we want the online list, read the file and delete timed-out users.
I wonder which way is better? Or is there another, better solution? Any suggestions?
Go with the database. Why would you subject yourself to all of the hardship and inefficiencies of coming up with your own file format, then reading it and parsing it every time you wanted to check something?
With most database managers, running a query is faster than reading and processing a file yourself.
If you're really concerned about performance, you can create another table where you save the user ID as a foreign key and their last login time. And when you do a query, you can also delete the users in that table who haven't logged in within a certain amount of time.
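For example (a sketch; the table name, the assumption that user_id is the table's primary key, and the 5-minute threshold are all illustrative):

    // Touch the user's row on every request.
    $pdo->prepare('REPLACE INTO online_users (user_id, last_seen) VALUES (?, NOW())')
        ->execute([$userId]);

    // Who is online: everyone seen within the last 5 minutes.
    $online = $pdo->query(
        'SELECT user_id FROM online_users
          WHERE last_seen > NOW() - INTERVAL 5 MINUTE'
    )->fetchAll(PDO::FETCH_COLUMN);

    // Prune stale rows while we are here.
    $pdo->exec('DELETE FROM online_users WHERE last_seen <= NOW() - INTERVAL 5 MINUTE');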
Use Memcached with the TTL set to your timeout threshold; then you don't need to worry about deleting entries manually.
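For example (a sketch assuming the PECL Memcached extension and a 5-minute threshold):

    $mc = new Memcached();
    $mc->addServer('127.0.0.1', 11211);

    // On each page hit, refresh the user's key; it expires by itself after 300s.
    $mc->set('online_' . $userId, time(), 300);

    // A user counts as online while their key still exists.
    $isOnline = ($mc->get('online_' . $userId) !== false);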

Ajax/JQuery database access/mutex

I have created an office scheduling program that uses jQuery to post to a PHP file, which then inserts an appointment into a pgSQL database. This has not happened yet, but I can foresee the problem in the future: two office workers try to schedule an appointment in the same slot at the same time, creating a race condition, and one set of customer data would be lost, or at least I'd have to dig it out of a log. I was wondering whether there is a flag I could set in the database, whether I need to create some kind of gatekeeper program to control server connections, or whether there is some kind of mutex/lock/semaphore I can use with JavaScript/PHP/SQL to keep this race condition from occurring.
You can lock it with a database flag, but a better strategy is to detect collisions, since they only happen in rare cases.
To detect the problem, you can save a timestamp from the database containing the last updated time. Send this along with the form, and compare the timestamp before you update the record. If the timestamp has changed, then present the user with all the data and ask them what they want to do. This offers a way for the second saving user to modify their changes based on the previously saved data if they wish.
There are other ways to solve this problem, and the proper solution depends on the nature of the specific problem.
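
A sketch of the timestamp check described above (assuming a hypothetical appointments table with an updated_at column, and that the form round-trips the timestamp it was rendered with):

    // The UPDATE only matches if nobody has saved since this form was loaded.
    $stmt = $pdo->prepare(
        'UPDATE appointments
            SET customer = :customer, updated_at = NOW()
          WHERE id = :id AND updated_at = :loaded_at'
    );
    $stmt->execute([
        ':customer'  => $customer,
        ':id'        => $id,
        ':loaded_at' => $loadedAt, // the timestamp sent along with the form
    ]);

    if ($stmt->rowCount() === 0) {
        // Collision: re-read the row and show both versions so the second
        // user can decide what to keep.
    }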
