I am building a web-based ERP application for the retail industry using PHP and MySQL. I am going to have different local databases and one on the server (same structure). What I plan to do is run this app on localhost in the different stores and, at the end of the day, update the database on the server from the different localhosts in the different stores.
Remember, I would like to update the database on the server based on the sequence in which the queries were run in the different databases.
Can anyone please help me with this?
Thank you.
Perhaps link to your main database from the localhost sites to begin with? Then there's no need to update at the end of the day: every change is made directly to the main database with no "middle men", so to speak. If you need the local databases to stay separate, run the queries on both at once?
Note: I'm unfamiliar with how an ERP application works, so forgive me if I'm way off base here.
You may have to log every INSERT/UPDATE/DELETE SQL request in a daily file on the local databases, with a timestamp for each request.
Example:
2012-03-13 09:15:00 INSERT INTO...
2012-03-13 09:15:02 UPDATE MYTABLE SET...
2012-03-13 09:15:02 DELETE FROM...
...
Then send the log files to the main server daily, merge all the files, sort them to keep the execution order, and read the merged file to execute the requests on the main database.
However, it's a curious way to do things in an ERP application. Product stock information can't simply be merged; it is shared data, so be careful with this kind of information.
You can't use auto-increment with this process; it will cause duplicate keys on some requests, or update requests hitting the wrong records.
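A minimal sketch of the logging side, assuming PDO and a hypothetical log_query() helper (the file layout, table and UUID-style keys are illustrative only, not part of the original answer):

<?php
// Append each write query to a daily log file with a timestamp,
// then run it against the local database as usual (assumes a logs/ directory exists).
function log_query(PDO $localDb, $sql, array $params = array())
{
    $logFile = __DIR__ . '/logs/' . date('Y-m-d') . '.log';
    $entry   = date('Y-m-d H:i:s') . "\t" . json_encode(array('sql' => $sql, 'params' => $params)) . "\n";
    file_put_contents($logFile, $entry, FILE_APPEND | LOCK_EX);

    $stmt = $localDb->prepare($sql);
    $stmt->execute($params);
}

// Example usage in a store: use a UUID-style key instead of auto-increment
// so rows from different stores can't collide when replayed on the main server.
$localDb = new PDO('mysql:host=localhost;dbname=erp_local', 'user', 'pass');
log_query(
    $localDb,
    'INSERT INTO sales (id, store_id, product_id, qty, sold_at) VALUES (?, ?, ?, ?, NOW())',
    array(uniqid('store01_', true), 'store_01', 'prod_42', 2)
);

On the main server, a matching script would merge the daily files, sort the lines by timestamp, and replay each query against the central database.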
Maybe this is an obvious question, but it's just something I'm unsure of. If I have two standalone PHP applications running on one LAMP server, and the two PHP applications share the same MySQL database, do I need to worry about data integrity during concurrent database transactions, or is this something that MySQL just takes care of "natively"?
What happens if the two PHP applications both try to update the same record at the same time? What happens if they try to update the same table at the same time? What happens if they both try to read data from the database at the same time? Or if one application tries to read a record at the same time as the other application is updating that record?
This depends on several factors:
the DB engine you are using
the locking policy / transaction isolation level you have set for your environment, or for your query
https://dev.mysql.com/doc/refman/8.0/en/innodb-locking-reads.html
https://dev.mysql.com/doc/refman/8.0/en/innodb-locks-set.html
the code you are using: you could use SELECT ... FOR UPDATE to lock only the rows you want to modify (see the sketch below)
https://dev.mysql.com/doc/refman/8.0/en/update.html
and how you manage transactions
https://dev.mysql.com/doc/refman/8.0/en/commit.html
This is just a brief suggestion.
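For illustration, a rough sketch of locking a single row with SELECT ... FOR UPDATE inside a transaction using PDO (the table and column names are hypothetical, and this assumes an engine with row locking such as InnoDB):

<?php
$pdo = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$pdo->beginTransaction();
try {
    // Lock only the row we are about to modify; a second application
    // running the same code blocks here until we commit or roll back.
    $stmt = $pdo->prepare('SELECT stock FROM products WHERE id = ? FOR UPDATE');
    $stmt->execute(array(42));
    $stock = (int) $stmt->fetchColumn();

    if ($stock > 0) {
        $pdo->prepare('UPDATE products SET stock = stock - 1 WHERE id = ?')->execute(array(42));
    }
    $pdo->commit();
} catch (Exception $e) {
    $pdo->rollBack();
    throw $e;
}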
I am making a system, an inventory system, and I want it to have an online and an offline mode. I want it to work offline so that I can insert, delete and edit data even when there's no internet connection.
Moreover, I want to merge the data from the local database with the data from the online database when there is an internet connection. How can I do this with PHP and MySQL?
Can anyone help me solve this? Do you have an idea of how to do this stuff?
You can add uuid (guid), datetime and operation (insert, update, delete) columns to every table. Each system must also save the date and time of its last successful sync somewhere.
When there is a connection, you can sync all the table rows changed after the last successful sync. But two different systems changing the same row is a problem with this methodology; maybe you can design the system to use insert & delete instead of update.
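A minimal sketch of that idea, assuming PDO and hypothetical table/column names (an items table with uuid, op and changed_at columns, and a sync_state table holding the last successful sync time):

<?php
// Push local changes made since the last successful sync to the online database.
$local  = new PDO('mysql:host=localhost;dbname=inventory', 'user', 'pass');
$remote = new PDO('mysql:host=online.example.com;dbname=inventory', 'user', 'pass');

$lastSync = $local->query('SELECT last_sync FROM sync_state LIMIT 1')->fetchColumn();

$stmt = $local->prepare('SELECT uuid, name, qty, op, changed_at FROM items WHERE changed_at > ?');
$stmt->execute(array($lastSync));

foreach ($stmt as $row) {
    if ($row['op'] === 'delete') {
        $remote->prepare('DELETE FROM items WHERE uuid = ?')->execute(array($row['uuid']));
    } else {
        // Insert-or-update keyed on the uuid, so re-sending a row is harmless.
        $remote->prepare(
            'INSERT INTO items (uuid, name, qty, op, changed_at) VALUES (?, ?, ?, ?, ?)
             ON DUPLICATE KEY UPDATE name = VALUES(name), qty = VALUES(qty),
                                     op = VALUES(op), changed_at = VALUES(changed_at)'
        )->execute(array($row['uuid'], $row['name'], $row['qty'], $row['op'], $row['changed_at']));
    }
}

$local->exec('UPDATE sync_state SET last_sync = NOW()');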
I have an online SQL database with a few tables for users, matches and bets. When I update the result of any game, I need the status of all bets containing that game in the bets table to be updated. So, for example, if I update game 8 with the result "home win", I need all bets which have game 8 in them to be updated as either lost, won or still open.
The way I do this currently is that when the user opens my Android app, I retrieve all the information about the games and all the information about the user's bets using AsyncTasks. I then do some string comparisons in my app and then I update the data in my database using another AsyncTask. The issue is that this wastes a lot of computation time and makes my app UI laggy.
As someone with minimal experience with PHP and online databases, I'd like to ask: is there a way to carry out these things in the database itself, either periodically (every 3 hours, for example) or whenever the data in the games table is changed, using a PHP file which is run automatically?
I tried looking for some kind of onDataChanged function but couldn't find anything. I'm also not sure how to make a php file run and update data without getting the app involved.
Another idea I had was to create a very simple app which I wouldn't distribute to anyone but just keep on my phone with an update button which I could press and trigger a php file to carry out these operations for all users in my database.
I would appreciate some advice on this from someone who has experience.
Thanks :).
You can easily execute a PHP script periodically if your hosting provider supports a scheduler like cron.
About updating game status in multiple places, first check the tables' engine. If you are using an engine like InnoDB you can create relationships between those tables, so updating the status of one row can affect all the rows connected to it.
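For illustration, a rough sketch of a cron-driven PHP script that settles bets whenever it runs (table and column names are hypothetical, and it assumes for simplicity one game per bet):

<?php
// settle_bets.php: mark open bets as won or lost once their game has a result.
$pdo = new PDO('mysql:host=localhost;dbname=betting', 'user', 'pass');

$pdo->exec(
    "UPDATE bets b
     JOIN games g ON g.id = b.game_id
     SET b.status = IF(b.prediction = g.result, 'won', 'lost')
     WHERE b.status = 'open' AND g.result IS NOT NULL"
);

A crontab entry such as 0 */3 * * * php /path/to/settle_bets.php (hypothetical path) would then run it every 3 hours without involving the app at all; bets spanning several games would need a slightly more involved query.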
We have decided that we are going to move from a single database to a replicated database in a master-slave architecture and are going to get all of our reads to go to the slave and writes to the master.
The reason we are going down this route is an addition to our product means that we are getting a large increase in database connections which leads to performance problems with our reporting suite.
We are using MySQL(5.1.55) and the application is developed in PHP.
A couple of general queries on this:
How would you tell the application which DB to read from? Would you do it within the PHP? Or use something like mysqlnd_ms or MySQL Proxy?
Where would ajax requests read from? We have a page which allows users to flag a record. This is then saved in the database and users can see which records have been flagged.
Thanks for any advice.
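For what it's worth, a rough sketch of the "do it within the PHP" option, assuming PDO and hypothetical hostnames: hold two connections and pick one per query.

<?php
// One connection to the master for writes, one to the slave for reads.
$master = new PDO('mysql:host=db-master.example.com;dbname=app', 'user', 'pass');
$slave  = new PDO('mysql:host=db-slave.example.com;dbname=app', 'user', 'pass');

function db_for($sql, PDO $master, PDO $slave)
{
    // Route SELECTs to the slave, everything else to the master.
    return stripos(ltrim($sql), 'SELECT') === 0 ? $slave : $master;
}

// An AJAX "flag this record" request is a write, so it goes to the master...
$sql = 'UPDATE records SET flagged = 1 WHERE id = ?';
db_for($sql, $master, $slave)->prepare($sql)->execute(array(123));

// ...while the list of flagged records is read from the slave. Note that a read
// immediately after a write may see stale data until replication catches up,
// so read-after-write paths may need to use the master as well.
$sql = 'SELECT id FROM records WHERE flagged = 1';
$flagged = db_for($sql, $master, $slave)->query($sql)->fetchAll();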
I have 2 websites that offer exactly the same content, just with different layouts. I'm currently updating both sites daily, putting the same content on both. I want to make a PHP script or something similar and run it on a cron to automatically copy the DB from one server to the other server, but I don't want it to duplicate the content that's already there. The database has a unique field which it can check against.
Thanks
You should have only the 1 database, with 2+ templates (skins) for displaying the content. Duplicating the same data in production, when there are no differences between the data sets, is somewhat pointless.
--EDIT-- May 15, 2012 @ 7:06 PM ET
If you REALLY want to maintain duplicate production db's, I would suggest a web service that sends the data from one site to another. You can also think about using a trigger in your db. It really does depend on your setup, where the web server(s) are located, the DB server(s), etc.
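If the duplicate DBs stay, a rough sketch of the cron-driven copy using the unique field mentioned in the question (hostnames, table and column names are hypothetical):

<?php
// copy_content.php: run from cron; copies rows from site A's DB to site B's,
// skipping anything the unique field shows is already there.
$source = new PDO('mysql:host=site-a.example.com;dbname=content', 'user', 'pass');
$target = new PDO('mysql:host=site-b.example.com;dbname=content', 'user', 'pass');

$insert = $target->prepare(
    // INSERT IGNORE skips rows whose unique field already exists on the target.
    'INSERT IGNORE INTO articles (unique_ref, title, body, published_at)
     VALUES (?, ?, ?, ?)'
);

foreach ($source->query('SELECT unique_ref, title, body, published_at FROM articles') as $row) {
    $insert->execute(array($row['unique_ref'], $row['title'], $row['body'], $row['published_at']));
}

A crontab line such as 0 2 * * * php /path/to/copy_content.php (hypothetical path) would run it nightly.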