I am trying to figure out if it's possible to have the EVENT SCHEDULER take data from one database and update it in another. I checked the MySQL site and Google but there isn't any info on this. I know how to create events and understand the process. I am thinking of using PHP and MySQL but am not sure whether that would work.
Why I need this:
Database A - gets a lot of traffic
Database B - gets little traffic
I want to run certain queries (counts, mostly) once a day against Database A, then store the results in Database B. This way, when someone loads a page that needs a count, it won't run the count query but will instead just SELECT from Database B, making the page load faster.
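For illustration, this is the kind of thing I have in mind (schema, table and column names are made up), assuming both databases are schemas on the same MySQL server, since an event can't reach a remote server on its own:

    -- daily_counts needs a unique key on `metric` for the upsert to work.
    SET GLOBAL event_scheduler = ON;

    CREATE EVENT db_b.refresh_daily_counts
    ON SCHEDULE EVERY 1 DAY
    DO
      INSERT INTO db_b.daily_counts (metric, cnt, computed_at)
      SELECT 'total_orders', COUNT(*), NOW()
      FROM db_a.orders
      ON DUPLICATE KEY UPDATE cnt = VALUES(cnt), computed_at = VALUES(computed_at);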
I think what you need to do is to replicate your databases!
For example, you can designate Database 'A' as your publisher and Database 'B' as your subscriber. There are lots of articles on the internet about replicating MySQL databases; Google and find out. Anyway, have a look at this question: MySQL Event Scheduler on a specific time everyday. It might give you an idea of what to achieve.
Scenario: I have a project which reads data from multiple different databases on different hosts with read-only access. All of them store the same type of information (for demo: Users).
Every database stores the information in their own way (different table names, column names, types etc.)
Now I want to get every User from every database and insert them into my own table 'User'.
Currently I am doing this with two cronjobs - one imports users and one updates them (SELECT from the read-only DB and INSERT into my own DB the way I want).
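For reference, a stripped-down sketch of what these jobs boil down to (hosts, credentials, table and column names are placeholders):

    <?php
    // Read users from one read-only source and upsert them into my own DB.
    $source = new PDO('mysql:host=source-host;dbname=legacy', 'ro_user', 'secret');
    $own    = new PDO('mysql:host=localhost;dbname=mine', 'user', 'secret');
    $own->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

    // ON DUPLICATE KEY UPDATE handles both new and changed users in one
    // pass (assumes a unique key on source_id in my own user table).
    $upsert = $own->prepare(
        'INSERT INTO user (source_id, name, email)
         VALUES (:id, :name, :email)
         ON DUPLICATE KEY UPDATE name = VALUES(name), email = VALUES(email)'
    );

    foreach ($source->query('SELECT uid, username, mail FROM members') as $row) {
        $upsert->execute([
            ':id'    => $row['uid'],
            ':name'  => $row['username'],
            ':email' => $row['mail'],
        ]);
    }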
Now I don't really like the idea of having multiple cronjobs running just to import/update users - plus I want to have the newest data in my database at all times, which means I would need to run the cronjobs at least every minute.
With multiple tables, that adds up to far too many per-minute cronjobs, IMO.
Question:
Is there any way to achieve the same thing easier? Or is this already the correct way to do so?
The next issue is that in most cases there is no indication of which data has changed, meaning I need to import the same data over and over just to stay current.
Any help will be gladly appreciated.
Using PHP 7+ and MySQL with Symfony & Doctrine, if that helps.
I have an online SQL database with a few tables for users, matches and bets. When I update the result of a game, I need the status of all bets containing that game in the bets table to be updated. So, for example, if I update game 8 with the result 'home win', I need all bets which include game 8 to be updated as either lost, won or still open.
The way I do this currently is that when the user opens my Android app, I retrieve all the information about the games and all the information about the user's bets using AsyncTasks. I then do some string comparisons in my app and update the data in my database using another AsyncTask. The issue is that this wastes a lot of computation time and makes my app's UI laggy.
As someone with minimal experience with PHP and online databases, I'd like to ask: is there a way to carry out these operations in the database itself, either periodically (every 3 hours, for example) or whenever the data in the games table changes - for example via a PHP file that runs automatically?
I tried looking for some kind of onDataChanged function but couldn't find anything. I'm also not sure how to make a PHP file run and update data without involving the app.
Another idea I had was to create a very simple app which I wouldn't distribute to anyone but would just keep on my phone, with an update button I could press to trigger a PHP file that carries out these operations for all users in my database.
I would appreciate some advice on this from someone who has experience.
Thanks :).
You can easily execute a PHP script periodically if your hosting provider supports a scheduler such as cron.
As for updating the game status in multiple bets: first check your tables' storage engine. If you are using an engine like InnoDB, you can create foreign-key relationships between those tables, and you can attach a trigger to the games table so that updating the result of one game row automatically updates every bet connected to it.
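A minimal sketch of such a trigger, assuming hypothetical games and bets tables where bets.game_id references games.id and bets.pick stores the predicted result:

    -- Settle all bets on a game as soon as its result is set or changed.
    DELIMITER //
    CREATE TRIGGER settle_bets_after_game_update
    AFTER UPDATE ON games
    FOR EACH ROW
    BEGIN
      -- Only act when the result actually changed and is now known;
      -- bets on games without a result stay open.
      IF NEW.result IS NOT NULL AND NOT (NEW.result <=> OLD.result) THEN
        UPDATE bets
        SET status = CASE WHEN pick = NEW.result THEN 'won' ELSE 'lost' END
        WHERE game_id = NEW.id;
      END IF;
    END //
    DELIMITER ;

This runs entirely inside MySQL, so the app never has to do the string comparisons itself.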
I am building a web-based ERP application for the retail industry using PHP and MySQL. I am going to have different local databases and one on the server (same structure). What I plan to do is run this app on localhost in each store and, at the end of the day, update the database on the server from the different localhosts in the different stores.
Remember, I would like to update the database on the server in the same sequence in which the queries ran in the different local databases.
Can anyone please help me with this?
Thank you.
Perhaps link to your main database from the localhost sites to begin with? Then there's no need to update at the end of the day: every change is made directly to the main database with no "middle men", so to speak. If you need the local databases kept separate, run the queries on both at once?
Note: I'm unfamiliar with how an ERP application works, so forgive me if I'm way off base here.
You may have to log every INSERT/UPDATE/DELETE SQL request in a daily file, with a timestamp for each request, on the local databases.
Example :
2012-03-13 09:15:00 INSERT INTO...
2012-03-13 09:15:02 UPDATE MYTABLE SET...
2012-03-13 09:15:02 DELETE FROM...
...
Then send your log files to the main server daily, merge all the files, sort them to preserve execution order, and replay the resulting file against the main database.
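A minimal replay sketch in PHP, assuming a merged log in the format shown above (connection details and file names are placeholders):

    <?php
    // Replay a merged, timestamp-sorted SQL log against the main database.
    $pdo = new PDO('mysql:host=main-server;dbname=erp', 'user', 'secret');
    $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

    $lines = file('merged.log', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    sort($lines); // lines start with 'YYYY-mm-dd HH:ii:ss', so a plain sort restores execution order

    foreach ($lines as $line) {
        $sql = substr($line, 20); // strip the 20-character timestamp prefix
        $pdo->exec($sql);
    }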
However, it's a risky way to do things in an ERP application. Product stock levels, for instance, can't simply be merged - they are shared information, so be careful with that kind of data.
You also can't use AUTO_INCREMENT keys with this process: replaying logs from several stores will cause duplicate-key errors on some requests, or updates that hit the wrong records.
I'm looking for guidance. Here is what I'm doing.
I'm creating an advertising/publishing script using PHP and MySQL. At this time, the current program contains 41 million rows (7.5GB).
I'm thinking about storing real-time statistics and other data (users, ads, places, daily/monthly stats) in two MySQL databases, and then updating the data (from the real-time DB to the second DB, which shows users their statistics) using a cron job three times a day.
So, will this be an 'UP' or a 'DOWN' (Good or Bad)?
Thanks,
pnm123
I'd probably run a master-slave setup, and then use the slave as the source for building the second database you mention. That should let you aggregate results etc. without impacting your main application.
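A minimal sketch of the aggregation step, run by cron against the slave (it assumes the reporting schema lives on the same server as the replicated data; all names are made up):

    -- Summarise today's raw events into the reporting table the UI reads.
    INSERT INTO stats_db.daily_ad_stats (ad_id, stat_date, impressions, clicks)
    SELECT ad_id, DATE(created_at), COUNT(*), SUM(clicked)
    FROM live_db.ad_events
    WHERE created_at >= CURDATE()
    GROUP BY ad_id, DATE(created_at)
    ON DUPLICATE KEY UPDATE
      impressions = VALUES(impressions),
      clicks = VALUES(clicks);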
I am wondering if it is possible, automatically or at the press of a button, to move MySQL table information from one table to another - deleting it from the first table and putting it in the second - using PHP.
My MySQL table is big, and the page that adds information to it runs 70 queries, which slows page refresh times. I need to move information from the first table to the second at a certain time of day, every day, so that those queries don't have to look through my entire giant 27k-row table.
Is this possible?
Also if someone could help me with my comment on this page I would be grateful.
link text
PHP doesn't have a constantly running process of its own that you can schedule background tasks with.
If you have access to the server, you can set up a cron job (or a Scheduled Task under Windows) to run the PHP script for you.
Or (and this isn't as nice) you can put the script on the webserver and call it manually at the appropriate time by entering the URL in your browser.
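For example, a crontab entry like this runs a script every day at 03:00 (the paths are placeholders):

    # m h dom mon dow  command
    0 3 * * * /usr/bin/php /var/www/scripts/move_rows.php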
A 27k row table is small by SQL standards, as long as it is properly indexed.
For instance, if you don't care about data from before yesterday, you can add an indexed date column and filter with WHERE myDate > NOW() - INTERVAL 1 DAY; MySQL will then restrict the query to the rows younger than 24 hours.
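A minimal sketch of that approach (table and column names are made up):

    -- One-time setup: index the date column.
    ALTER TABLE mytable ADD INDEX idx_mydate (myDate);

    -- Queries then only touch the last 24 hours of rows:
    SELECT * FROM mytable WHERE myDate > NOW() - INTERVAL 1 DAY;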
I am wondering if it is possible, automatically or at the press of a button, to move MySQL table information from one table to another - deleting it from the first table and putting it in the second - using PHP.
You can initiate it from PHP, but what you ask is effectively MySQL's domain.
It can be accomplished in two statements:
Use an INSERT INTO ... SELECT statement to copy the rows from the old table to the new one
Use a DELETE statement to remove the copied rows from the old table
My preference would be for this to happen in a stored procedure, for the sake of a transaction and ease of execution (in case you want it initiated by cron, etc.), because it's easier to call one thing than several; see the sketch below.
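A minimal sketch of such a procedure (table names are made up; it assumes both tables share the same column layout and rows carry a created_at timestamp):

    -- Moves all rows older than today into the archive table inside one
    -- transaction, so a failure leaves both tables untouched.
    DELIMITER //
    CREATE PROCEDURE archive_old_rows()
    BEGIN
      START TRANSACTION;
      INSERT INTO archive_table
        SELECT * FROM live_table WHERE created_at < CURDATE();
      DELETE FROM live_table WHERE created_at < CURDATE();
      COMMIT;
    END //
    DELIMITER ;

Cron can then call the one thing: mysql mydb -e "CALL archive_old_rows();"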
27k rows is not a very big table and MySQL should work fine with that. Do you have all the required indexes? Have you used EXPLAIN on your slow queries?
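For example (the query is a placeholder):

    -- EXPLAIN shows whether MySQL uses an index or scans the whole table.
    EXPLAIN SELECT * FROM mytable WHERE some_column = 'some value';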
As for the question about moving data from one table to another: create a PHP script that will be run by cron and will move the rows one by one. What's the problem here?