I am currently helping my company redesign its CRM.
The plan is to extend the CRM to read data from another database, which is used by a different piece of software.
All of the databases are MSSQL.
Our company has three brands, and each brand's data is stored in a separate MSSQL database:
Brand1:
Customer
Invoice
Payment
Brand2:
Customer
Invoice
Payment
Brand3:
Customer
Invoice
Payment
In my new database schema design, a Customer has several Invoices, and each Invoice receives several Payments.
Ideally, all the data would be stored in my newly designed DB, because then I could extract the MOST UPDATED payments like this:
SELECT [Everything I want]
FROM Customer
INNER JOIN Invoice ON Invoice.customer_id = Customer.id
INNER JOIN Payment ON Payment.invoice_id = Invoice.id
But for now I need to CONNECT TO all THREE databases, fetch data from each, and COMBINE the results to generate a data structure like this:
{
  "customers": [
    {
      "customer_name": "cato",
      "invoices": [
        {
          "invoice_id": 1,
          "payments": ["bla", "bla", "bla"]
        }
      ]
    }
  ]
}
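A minimal sketch of the COMBINE step, assuming the flat rows have already been fetched from the three brand databases (e.g. via pyodbc); the sample rows at the bottom are invented stand-ins for those result sets:

```python
from collections import defaultdict

def combine(customers, invoices, payments):
    """Nest flat row lists into the {customers: [{..., invoices: [...]}]} shape."""
    payments_by_invoice = defaultdict(list)
    for p in payments:
        payments_by_invoice[p["invoice_id"]].append(p["amount"])

    invoices_by_customer = defaultdict(list)
    for inv in invoices:
        invoices_by_customer[inv["customer_id"]].append({
            "invoice_id": inv["id"],
            "payments": payments_by_invoice[inv["id"]],
        })

    return {"customers": [
        {"customer_name": c["name"], "invoices": invoices_by_customer[c["id"]]}
        for c in customers
    ]}

# Invented sample rows standing in for the three brand DBs' query results.
customers = [{"id": 1, "name": "cato"}]
invoices = [{"id": 1, "customer_id": 1}]
payments = [{"invoice_id": 1, "amount": 10.0}, {"invoice_id": 1, "amount": 5.0}]

result = combine(customers, invoices, payments)
```

The merge itself is cheap; the cost is the three round trips, which is why pulling the data into one DB first (as the answer below suggests) is attractive.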
My company has thought of using triggers, but those are hard to maintain. Are there any better options that can do the job?
Yes, SQL Server has a good solution for this: what you need is replication.
Replication gives you the ability to 'copy' tables from a remote DB to your local one. You first copy the table to your local DB and then set up the replication; from then on, the remote DB forwards all of its INSERT, UPDATE, etc. commands to your DB, so your local replicated table is always synchronized with the remote one.
Take a look at it and follow a tutorial to set up the replication.
I am building a SaaS application that has a set of requirements. One of these requirements is that the application has to be multi-tenant, with a separate schema for every tenant. But there also has to be a single instance (an admin interface) that can access all of those schemas for data validation.
Example:
We have x customers who will access their application through customerx.our-company.com. Every customer gets its own schema with multiple tables. These tables contain users, invoices, contracts, timesheets,... The customer has to be able to view/modify these records.
We, as the company providing the service, have to be able to get all the timesheets from all those customers in order to verify them. Once verified, these records will be updated: 'verified' => true.
My first idea was to store a tenant_id in a customers table in the admin database, so that I could switch the database connection in Eloquent and merge the records before returning them. This is doable, but I have some concerns about performance: what if we grow to 1000+ customers? Getting all those records could take a long time.
Dumping all the records into a single schema, without a separate schema per customer, is not an option because of GDPR: each customer's data has to be kept separate.
What would be the best approach for this, design-wise? Are there any good articles about this subject?
What I began to think while writing this question: it may be possible to leverage C++. I could create a custom PHP extension that takes the connection info for multiple tenant databases, fetches the records via async tasks (std::async), merges those records, and returns them. std::async runs the tasks on a pool of threads, and I can wait until they are finished. The only big complication will be the relations that need to be returned.
Updating those records is a simpler task, as we know where each one is stored (customer_id), so we can change the DB connection on the fly while updating.
EDIT: I created a simple C++ program that opens async MySQL connections. I've tested it with 10 schemas at once, retrieving 1000 records from each. This looks promising; I will keep this updated.
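The same fan-out-and-merge idea can be sketched without a custom extension, using a thread pool; fetch_timesheets below is a hypothetical stand-in for the per-schema query:

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_timesheets(tenant_id):
    # Hypothetical stand-in: in production this would open a connection to
    # the tenant's schema and run the timesheet query there.
    return [{"tenant_id": tenant_id, "hours": 8}]

def fetch_all(tenant_ids, max_workers=10):
    # Fan out one task per tenant schema; map() preserves input order,
    # so the merged list comes back grouped by tenant.
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        results = pool.map(fetch_timesheets, tenant_ids)
        return [row for rows in results for row in rows]

merged = fetch_all(range(10))
```

Since the per-tenant work is I/O-bound (waiting on the database), threads give most of the benefit of the std::async approach with far less machinery.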
I was looking around and couldn't find an answer to this question. If, however, it is a duplicate in some way, I apologise.
My problem is this: I have a physical store, which has an accounting program with a database where my products are stored. I have now created a store with PrestaShop and want to synchronise the two.
I want to be able to add a REF to my PrestaShop store and pull every field from my LOCAL DB (except images and descriptions); I also want to synchronise the STOCKS in "real time".
My idea is to have a "middle DB" which takes requests and updates the other two DBs, like this:
Prestashop DB <-> MIDDLE DB <-> Local DB
My middle DB would hold the REFs and quantities for all the products, and check whether the quantity field has changed in either DB, say every 2 minutes, then update the database that did NOT perform the change.
How difficult is this to do? How can I connect to both databases, using Python, C# or C++, and perform such tasks?
I'm using PrestaShop 1.6 and I have access to my database via phpMyAdmin.
Thank you!
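One pass of the 2-minute reconciliation loop described above can be sketched as follows, with dicts standing in for the PrestaShop, local and middle databases (in production these lookups would be queries over real MySQL connections):

```python
def reconcile(presta, local, middle):
    """One sync pass: for each REF, compare both stores against the middle
    DB's last-known quantity to find which side changed, then push that
    quantity to the side that did NOT change."""
    for ref, last_qty in middle.items():
        if presta.get(ref) != last_qty and local.get(ref) == last_qty:
            # PrestaShop changed since the last pass -> update the local DB.
            local[ref] = middle[ref] = presta[ref]
        elif local.get(ref) != last_qty and presta.get(ref) == last_qty:
            # The local DB changed -> update PrestaShop.
            presta[ref] = middle[ref] = local[ref]

# Invented sample stock levels.
presta = {"REF1": 5, "REF2": 3}
local  = {"REF1": 5, "REF2": 7}   # REF2 changed in the local accounting DB
middle = {"REF1": 5, "REF2": 3}   # last synchronized snapshot

reconcile(presta, local, middle)
```

Note the unresolved case this sketch ignores: if both sides change the same REF within one interval, you need a conflict rule (e.g. the local accounting DB always wins).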
One of my clients has an ecommerce site with a database server catering to it.
Now we are going to integrate POS logic (selling in-store through a sales associate) with the eCommerce site.
His raw requirements mention using a separate physical database server for the POS sale/return transactions in the cash register, to serve POS (and sales-associate) specific reports.
He also wants the business logic to record the same transactions on the existing eCommerce server database. Why? He answers this as follows:
The POS SERVER will keep track of all sales transactions made in the store. Some of the reasons for doing this are:
Offload some retail store reporting to the POS SERVER
Sales associates records are stored on the POS SERVER not the e-commerce server and we need to produce sales associate reports.
We need to provide a failover function in case of network interruption so we need a sales journal to post transactions to on the POS SERVER.
The following database tables store the sales:
sales_transactions_journal
sales_transactions_journal_lines
sales_transactions_journal_sales_associate_chain
My gut feeling is that executing the same database transaction twice, once on each of two separate physical database servers, is not a good idea. I want to oppose this design because I feel that:
if we run the same query twice, once per DB server, we will need to write fault-tolerance and synchronization code.
customer + store specific reports (e.g. which customer prefers to shop at which store) will require storing customer information on the store-specific server as well. Hence we would not only be copying the sale transaction but also the customer info for whom the sale took place at the POS.
I would like you to add more solid, concrete points so that I can convince my project manager and client that this is not a good approach to reporting + load balancing, and that other techniques such as data replication, sharding, etc. would be better.
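The fault-tolerance point above can be made concrete with a toy sketch: when the application writes the same transaction to two servers and the second write fails, the stores silently diverge unless you write retry/compensation code yourself, which is exactly the work replication does for you. The two lists stand in for the POS and eCommerce databases:

```python
pos_db, ecom_db = [], []

def write_sale(sale, fail_second=False):
    """Naive dual write: no single transaction spans both servers."""
    pos_db.append(sale)           # commit on the POS server succeeds
    if fail_second:
        raise ConnectionError("eCommerce server unreachable")
    ecom_db.append(sale)          # commit on the eCommerce server

write_sale({"id": 1, "total": 20})
try:
    write_sale({"id": 2, "total": 35}, fail_second=True)
except ConnectionError:
    pass  # without compensation/retry logic, the two databases are now out of sync

diverged = len(pos_db) != len(ecom_db)
```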
I've been working on a web app for a few months now. It's a PHP and MySQL driven database app which relates objects between each other.
I'd like to add functionality so that someone could register to use it and set up a monthly subscription. When they log in the app would simply populate with data from their own database.
I've done some scouring online but I'm struggling to find a starting point for adding this sort of feature.
The app relies on half a dozen tables within a database. So I'm also not sure if creating an individual database per user is practical.
Creating a DB per user is very rarely the way to go - it's complicated and has very few benefits, with lots of drawbacks (e.g. maintaining multiple simultaneous DB connections, as most libraries only connect to a single DB). What you really need to do is create a user table and then tag the various records with a UserId.
When a user registers, create a new User record, then create their new entries in the other tables, all referencing that Id. (Actually, only the "root" of each relational object needs a UserId: e.g. if you have Order and OrderItems, only the Order table needs a UserId.)
Without more detail, it's impossible to help further.
Incidentally, the only time you should consider DB-per-user is if each user requires unique tables (in which case your app would have to be very complex to actually use the data), or if there are security concerns about storing the data together (more likely with multiple clients than with multiple users).
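A minimal sketch of the tagging approach, using SQLite for brevity (the same schema works in MySQL): only the root Order table carries the user_id, and OrderItems are scoped to a user through their order.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE users  (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY,
                         user_id INTEGER REFERENCES users(id));
    -- No user_id here: items reach the user via their order.
    CREATE TABLE order_items (id INTEGER PRIMARY KEY,
                              order_id INTEGER REFERENCES orders(id),
                              sku TEXT);
""")
conn.execute("INSERT INTO users VALUES (1, 'alice'), (2, 'bob')")
conn.execute("INSERT INTO orders VALUES (10, 1), (11, 2)")
conn.execute("INSERT INTO order_items VALUES (100, 10, 'SKU-A'), (101, 11, 'SKU-B')")

# Every tenant-scoped query just filters on the root's user_id.
rows = conn.execute("""
    SELECT oi.sku FROM order_items oi
    JOIN orders o ON o.id = oi.order_id
    WHERE o.user_id = ?
""", (1,)).fetchall()
```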
I'm trying to develop a web application for work, using PHP, MySQL and jQuery.
The main aim of the application is to store information about the work we do for our customers, so that we can give them detailed reports at the end of the year.
An example of what I'm trying to accomplish:
a report would be created, using a customer's details
entries would be added to this report, documenting the work that has been carried out
then both the report and its entries would be inserted into their actual tables
I want to use temp tables to store the information so that I can use it between pages, but I'm looking to see whether this is the best way of accomplishing this.
Thanks
D
Temporary tables are deleted automatically when the connection to the database is closed. In PHP, this is usually done after each page request. Use normal (i.e. non-temporary) tables to gather report data for all customers. Use foreign keys to link reports to customers and report entries to reports.
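A minimal sketch of this in SQLite (the behaviour is the same in MySQL): the draft report lives in ordinary tables, so a second connection (standing in for a later page request) still sees it, whereas a TEMP table would have vanished when the first connection closed.

```python
import os
import sqlite3
import tempfile

path = os.path.join(tempfile.mkdtemp(), "app.db")

# Page request 1: create the draft report in ordinary (non-TEMP) tables.
conn1 = sqlite3.connect(path)
conn1.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE reports (id INTEGER PRIMARY KEY,
                          customer_id INTEGER REFERENCES customers(id),
                          finalised INTEGER DEFAULT 0);
    CREATE TABLE report_entries (id INTEGER PRIMARY KEY,
                                 report_id INTEGER REFERENCES reports(id),
                                 work TEXT);
""")
conn1.execute("INSERT INTO customers VALUES (1, 'ACME')")
conn1.execute("INSERT INTO reports (id, customer_id) VALUES (1, 1)")
conn1.execute("INSERT INTO report_entries (report_id, work) VALUES (1, 'server patching')")
conn1.commit()
conn1.close()  # end of the first page request

# Page request 2: a fresh connection still sees the draft entries.
conn2 = sqlite3.connect(path)
entries = conn2.execute(
    "SELECT work FROM report_entries WHERE report_id = 1").fetchall()
```

Marking the report as finalised later is then just an UPDATE on the same row, rather than copying data out of a temp table.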