Scenario: I have a project that reads data from several databases on different hosts, with read-only access. All of them store the same type of information (for this demo: Users).
Every database stores the information in its own way (different table names, column names, types, etc.).
Now I want to get every User from every database and insert them into my own table 'User'.
Currently I am doing this with 2 cronjobs - 1 imports Users and 1 updates Users (select from the read-only DB and insert into my own DB in the format I want).
Now I don't really like the idea of having multiple cronjobs running just to import/update Users - plus I want to have the newest data in my database at all times, which means I need to run the cronjobs at least every minute.
With multiple tables, that adds up to way too many per-minute cronjobs, IMO.
Question:
Is there an easier way to achieve the same thing, or is this already the correct approach?
The next issue is that in most cases there is no information about which data has been updated, meaning I need to import the same data over and over just to have the newest.
Any help is greatly appreciated.
Using PHP 7+ and MySQL with Symfony & Doctrine - if that helps.
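For what it's worth, the import and update jobs can usually collapse into a single cronjob by using MySQL's INSERT ... ON DUPLICATE KEY UPDATE, so one statement handles both new and existing Users. A minimal sketch with PDO, assuming a hypothetical target users table with a UNIQUE KEY on (source, source_id); all connection details, table names, and column mappings are illustrative:

<?php
// One cronjob handles both import and update for every source database.
// Each source maps its own schema to a common shape via its SELECT.
$sources = [
    'siteA' => [
        'dsn' => 'mysql:host=host-a;dbname=app_a;charset=utf8mb4',
        'sql' => 'SELECT member_id AS id, user_name AS name, mail AS email FROM members',
    ],
    'siteB' => [
        'dsn' => 'mysql:host=host-b;dbname=app_b;charset=utf8mb4',
        'sql' => 'SELECT uid AS id, full_name AS name, email FROM accounts',
    ],
];

$target = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8mb4', 'writer', 'secret');
$upsert = $target->prepare(
    'INSERT INTO users (source, source_id, name, email)
     VALUES (:source, :id, :name, :email)
     ON DUPLICATE KEY UPDATE name = VALUES(name), email = VALUES(email)'
);

foreach ($sources as $sourceName => $cfg) {
    $readOnly = new PDO($cfg['dsn'], 'readonly', 'secret');
    foreach ($readOnly->query($cfg['sql'], PDO::FETCH_ASSOC) as $row) {
        $upsert->execute([
            'source' => $sourceName,
            'id'     => $row['id'],
            'name'   => $row['name'],
            'email'  => $row['email'],
        ]);
    }
}

MySQL skips the physical write when the new values are identical to the current ones, so re-importing unchanged rows is cheaper than it looks - though without change markers on the source side, you still pay for reading everything each run.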
Related
I've got 2 frameworks (Laravel - web, CodeIgniter - API) and 2 different databases. I've built a function (on the API) which detects changes in one database (in 2 tables) and applies the changes to the other database.
Note: there is no way to run both web and API on the same database - so that's why I'm doing this.
Anyway, it is important that every little change is recognized. If the case is a new record or a deleted record - it's simple and no problem at all. But if the record exists in both databases, I need to compare the values to detect changes, and this part becomes challenging.
I know how to do this in the slowest and heaviest way (pick each record and compare it).
My question is - how would you suggest making this work in a smart and fast way?
Thanks a lot.
As long as the MySQL user has SELECT rights on both databases, you can qualify the database in the query like so:
SELECT * FROM `db1`.`table1`;
SELECT * FROM `db2`.`table1`;
It doesn't matter which database was selected when you connected from PHP. The correct database will be used in the query.
The backticks are optional when the database/table name is purely alphanumeric and not an SQL keyword.
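If both databases happen to live on the same MySQL server, the same qualification trick also helps with the change-detection part of your question: the server can return only the rows that differ, instead of PHP pulling and comparing each record. A rough sketch, assuming both sides have a users table keyed by id (table and column names are illustrative):

SELECT a.id
FROM `db1`.`users` AS a
JOIN `db2`.`users` AS b ON b.id = a.id
WHERE NOT (a.name <=> b.name)
   OR NOT (a.email <=> b.email);

The NULL-safe comparison operator <=> is used so that a change to or from NULL is also caught; a plain <> would silently miss those rows.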
Depending on the response time of the 'slave' database, there are two options which don't increase the overhead too much:
If you can combine both databases within the same database by prefixing one or both of the tables, you can use FOREIGN KEYS to let the database do the tough work for you.
Use a TIMESTAMP field which you can set to be updated by the DB itself whenever the row gets updated (see the sketch below).
Option 1 would be my best guess, but it might mean a physical change to the running system, and if FOREIGN KEYS are new to you, you might want to test first, since they can be a real PITA (IMHO).
Option 2 is easier to implement, but you still have to detect deleted rows manually.
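For reference, option 2 is just a column definition plus a filtered query in the sync job; a sketch, with table and column names as placeholders:

CREATE TABLE users (
    id INT UNSIGNED NOT NULL PRIMARY KEY,
    name VARCHAR(100) NOT NULL,
    updated_at TIMESTAMP NOT NULL
        DEFAULT CURRENT_TIMESTAMP
        ON UPDATE CURRENT_TIMESTAMP
);

-- the sync job then only fetches rows changed since its last run
-- (replace the literal with the timestamp of the last successful run):
SELECT id, name, updated_at FROM users WHERE updated_at > '2024-01-01 00:00:00';

As said above, this still won't surface deletions; those need a soft-delete flag or a separate comparison.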
I have an online SQL database with a few tables for users, matches, and bets. When I update the result of any game, I need the status of all bets containing that game in the bets table to be updated. So, for example, if I update game 8 with the result 'home win', I need all bets which include game 8 to be updated as either lost, won, or still open.
The way I do this currently is that when the user opens my Android app, I retrieve all the information about the games and all the information about the user's bets using AsyncTasks. I then do some string comparisons in the app and update the data in my database using another AsyncTask. The issue is that this wastes a lot of computation time and makes my app's UI laggy.
As someone with minimal experience with PHP and online databases, I'd like to ask: is there a way to carry out these operations in the database itself, either periodically (every 3 hours, for example) or whenever the data in the games table is changed - for example via a PHP file which is run automatically?
I tried looking for some kind of onDataChanged function but couldn't find anything. I'm also not sure how to make a PHP file run and update data without getting the app involved.
Another idea I had was to create a very simple app which I wouldn't distribute to anyone but just keep on my phone, with an update button I could press to trigger a PHP file that carries out these operations for all users in my database.
I would appreciate some advice on this from someone who has experience.
Thanks :).
You can easily execute a PHP script periodically if your hosting provider supports a scheduler like cron.
About updating the bet statuses when a game changes: first check the table engine. With an engine like InnoDB you can create relationships (foreign keys) between the tables, but note that cascades only propagate key changes - they won't recompute a bet's status. For that, a trigger that fires when a game row is updated is the closer fit, and for the periodic variant, MySQL's event scheduler can run the update for you.
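A rough sketch of both mechanisms, assuming for simplicity hypothetical games and bets tables where each bet row references a single game and stores its predicted result (all names and the status logic are illustrative):

-- fires automatically whenever a game's result changes
DELIMITER //
CREATE TRIGGER after_game_result_update
AFTER UPDATE ON games
FOR EACH ROW
BEGIN
    IF NOT (NEW.result <=> OLD.result) AND NEW.result IS NOT NULL THEN
        UPDATE bets
        SET status = IF(predicted_result = NEW.result, 'won', 'lost')
        WHERE game_id = NEW.id;
    END IF;
END//
DELIMITER ;

-- or, for the periodic variant, MySQL's built-in event scheduler:
CREATE EVENT settle_open_bets
ON SCHEDULE EVERY 3 HOUR
DO
    UPDATE bets b
    JOIN games g ON g.id = b.game_id
    SET b.status = IF(b.predicted_result = g.result, 'won', 'lost')
    WHERE b.status = 'open' AND g.result IS NOT NULL;

The event scheduler has to be enabled (SET GLOBAL event_scheduler = ON), and some shared hosts don't allow it, in which case the cron approach above is the fallback.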
I am currently working on a large application that will be used by many people, each having a large amount of data. I thought to manage these multiple users through a single database, but I am asked to create a separate database for each new user that is registered. Now what I am wondering is: is it a good idea to do so, i.e. having the app create a separate database for each new user that gets registered and manage its data through it? What will be the performance issues, if any?
A separate DB for each newly registered user might be a bad idea. You can do it like this instead:
Put 100 users in each separate DB: users 1-100 => DB1, 101-200 => DB2, ..., n-(n+100) => DBn.
You can keep a table recording which ID interval is connected to which DB. By doing this, you can lower the load on each DB. 100 users per DB is just an example; you only need such a structure for a system that has lots of users.
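A minimal sketch of that routing idea in PHP; the interval size and connection details are placeholder assumptions:

<?php
// Map a user ID to its shard database, 100 users per shard as in the example above.
function shardDsnForUser(int $userId, int $usersPerShard = 100): string
{
    $shard = intdiv($userId - 1, $usersPerShard) + 1;  // users 1-100 => 1, 101-200 => 2, ...
    return sprintf('mysql:host=localhost;dbname=app_shard_%d', $shard);
}

$pdo = new PDO(shardDsnForUser(150), 'app_user', 'secret');  // connects to app_shard_2

With a fixed interval the mapping is pure arithmetic; the lookup table only becomes necessary once intervals need to move between servers.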
My answer is that creating a separate database for each new user is a wonderful idea. You must set appropriate indexes on the tables and you will get good performance.
Note: my question is in the last paragraph.
I have multiple sources of files that get inserted into a database (call it process/database A). These files contain the same type of information but in different formats (i.e. different column headers, orders, numbers of columns, etc.), but when process A puts them into a unified table, it is nice and neat. I need this data from multiple sources inserted into another database as well (process/database B), but I'm not sure of the best way to do it. DB B is part of a software package we use. It is not open source, but a DB connection can be made.
Process A has been up and running for a while. Process B is something new, to improve the physical workflow at the warehouse. Since the data is already unified in process A, it seems to me that I should pull this unified data and insert it into B, which would save me the repetitive work of remapping everything for process B.
My question is: if I want to "sync" these two databases, what would be the optimal approach? It's not exactly "syncing," I suppose, because the two tables (I only need to reference one table in each DB) have different columns. I see these approaches:
1. Check the entire DBs and pull new data from DB A to insert into DB B. However, DB B has over 50K rows, while DB A is much smaller and growing slowly.
2. Have the user input a date from which to look for new data rows to insert from A to B.
3. Check the latest date (data rows are dated) in DB B, and insert accordingly.
Do you guys have any input? I'm not too familiar with MySQL processing speed, so I'm not sure if approach (1) is a good option. I'm also not sure what the conventions (if any) are for these types of tasks; I imagine it isn't too uncommon a task. But (1) seems to be the most complete way of doing things. Any comments or alternative options are appreciated. I'd like to keep things in PHP, as this will be a feature of a web application. TIA!
Use MySQL clustering.
Check it out: http://en.wikipedia.org/wiki/MySQL_Cluster
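Alternatively, approach (3) from the question - using the newest date already present in DB B as a high-water mark - is cheap and easy to keep in PHP. A sketch, with all table and column names as placeholder assumptions:

<?php
$dbA = new PDO('mysql:host=host-a;dbname=db_a', 'reader', 'secret');
$dbB = new PDO('mysql:host=host-b;dbname=db_b', 'writer', 'secret');

// High-water mark: the latest row date already present in B.
$since = $dbB->query("SELECT COALESCE(MAX(row_date), '1970-01-01') FROM items")->fetchColumn();

// Pull only newer rows from A's unified table, remapping columns to B's layout.
$select = $dbA->prepare('SELECT sku, qty, row_date FROM unified_items WHERE row_date > ?');
$select->execute([$since]);

$insert = $dbB->prepare('INSERT INTO items (item_code, quantity, row_date) VALUES (?, ?, ?)');
foreach ($select as $row) {
    $insert->execute([$row['sku'], $row['qty'], $row['row_date']]);
}

One caveat: the strict > can drop rows that share the newest date but arrive after a run; a strictly increasing ID column as the high-water mark avoids that edge case.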
I'm looking for guidance. Here is what I'm doing.
I'm creating an advertising/publishing script using PHP and MySQL. At this time, the current database contains 41 million rows (7.5 GB).
I'm thinking about storing real-time statistics and other data (users, ads, places, daily/monthly stats) in two MySQL databases, and then updating the data (from the real-time DB to the second DB, which shows users their statistics) with a cron job three times a day.
So, will this be an 'UP' or a 'DOWN' (Good or Bad)?
Thanks,
pnm123
I'd probably run a master-slave setup, and then use the slave as the source for creating the second database you talk about. That should allow you to aggregate results, etc., without impacting your main application.
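A rough sketch of what that cron job could look like, reading aggregates from the slave and upserting them into the statistics database; host names, table names, and the aggregate itself are all placeholder assumptions:

<?php
// Runs three times a day: aggregate recent activity on the slave, write to the stats DB.
$slave = new PDO('mysql:host=slave-host;dbname=realtime', 'reader', 'secret');
$stats = new PDO('mysql:host=stats-host;dbname=stats', 'writer', 'secret');

$rows = $slave->query(
    'SELECT ad_id, DATE(clicked_at) AS day, COUNT(*) AS clicks
     FROM clicks
     WHERE clicked_at >= CURDATE() - INTERVAL 2 DAY
     GROUP BY ad_id, DATE(clicked_at)'
);

$upsert = $stats->prepare(
    'INSERT INTO daily_ad_stats (ad_id, day, clicks) VALUES (?, ?, ?)
     ON DUPLICATE KEY UPDATE clicks = VALUES(clicks)'
);
foreach ($rows as $r) {
    $upsert->execute([$r['ad_id'], $r['day'], $r['clicks']]);
}

Restricting the aggregation window keeps the query from scanning all 41 million rows on every run, and the ON DUPLICATE KEY UPDATE lets the same day be recomputed safely across the three daily runs (assuming a unique key on (ad_id, day)).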