Best way to update data in mysql online database - php

I have an online SQL database with a few tables for users, matches, and bets. When I update the result of a game, I need the status of every bet containing that game to be updated in the bets table. For example, if I update game 8 with the result "home win", I need all bets which include game 8 to be updated as lost, won, or still open.
The way I do this currently is that when the user opens my Android app, I retrieve all the information about the games and all the information about the user's bets using AsyncTasks. I then do some string comparisons in the app and update the data in my database using another AsyncTask. The issue is that this wastes a lot of computation time and makes the app's UI laggy.
As someone with minimal experience with PHP and online databases, I'd like to ask: is there a way to carry this out in the database itself, either periodically (every 3 hours, for example) or whenever the data in the games table changes, using, say, a PHP file that runs automatically?
I tried looking for some kind of onDataChanged function but couldn't find anything. I'm also not sure how to make a PHP file run and update data without involving the app.
Another idea I had was to create a very simple app which I wouldn't distribute to anyone but just keep on my phone, with an update button that triggers a PHP file to carry out these operations for all users in my database.
I would appreciate some advice on this from someone who has experience.
Thanks :).

You can easily execute a PHP script periodically if your hosting provider supports a scheduler like cron.
As for updating the bet status in several places at once, first check your table engine. With an engine like InnoDB you can create foreign-key relationships between the tables, so a change to one row can cascade to the rows connected to it.
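If your host gives you full MySQL access, a trigger can also react whenever a row in the games table changes, with no app or cron involvement. A minimal sketch, assuming (for simplicity) hypothetical `games` and `bets` tables where each bet covers a single game via `bets.game_id`, and `pick`/`result` share the same codes:

```sql
-- Hypothetical schema: games(id, result), bets(id, game_id, pick, status).
-- Fires automatically after every UPDATE on games.
DELIMITER $$
CREATE TRIGGER after_game_result
AFTER UPDATE ON games
FOR EACH ROW
BEGIN
  IF NEW.result IS NOT NULL THEN
    UPDATE bets
    SET status = CASE
                   WHEN pick = NEW.result THEN 'won'
                   ELSE 'lost'
                 END
    WHERE game_id = NEW.id;
  END IF;
END$$
DELIMITER ;
```

Bets that span several games (accumulators) would need extra logic, e.g. marking a bet "won" only once every game in it is settled correctly, and leaving it "open" otherwise.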

Related

PHP process all inputs one by one

I'm developing an online judge as part of my university project. I'm done setting up the compiler. The problem is that I want to insert all the user submissions into one queue, then compile and execute them one by one. This question may seem funny to you, but as a noob I don't even know what to type into Google. I've tried many ways to search, as elaborate as I could make them. Even Facebook groups were no help at all.
I repeat my problem: suppose there are 100 users submitting 100 programs at the same time. I need to put all of them in a single queue and then process them one by one.
If you just tell me the name of this procedure, that'll be very helpful. I'll study the rest.
Thank you, and please don't be offended by such a silly question.
Regards!
This pattern is usually called a job queue, processed by a worker. Create a database (MySQL or whatever) and a table that matches the data you need to store.
For each form submission, store the data in the table.
Run a cron job that fetches rows from the database. Mark each row as "processing" as soon as you fetch it and as "completed" once you have processed it.
Voila!
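The steps above can be sketched in SQL. Table and column names here are illustrative, not part of any existing schema:

```sql
-- Illustrative queue table:
-- CREATE TABLE submissions (
--   id INT AUTO_INCREMENT PRIMARY KEY,
--   source_code TEXT,
--   status ENUM('queued','processing','completed') DEFAULT 'queued'
-- );

-- Worker run by cron: claim the oldest queued row, locking it so that
-- two workers can never grab the same submission.
START TRANSACTION;
SELECT @id := id FROM submissions
WHERE status = 'queued'
ORDER BY id
LIMIT 1
FOR UPDATE;

UPDATE submissions SET status = 'processing' WHERE id = @id;
COMMIT;

-- ...compile and execute the claimed submission, then:
UPDATE submissions SET status = 'completed' WHERE id = @id;
```

Submissions are processed in FIFO order because the worker always selects the lowest pending `id`.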

What should I use to retrieve data: PHP/MySQL or a JSON file?

I am developing an online exam system for MCQ-based exams in PHP and MySQL.
My database structure is as follows :
tests table -> stores the data created by the admin
questions table -> stores questions with 5 options and the answer
testtaken table -> stores data about users taking exams (1 row inserted each time a user starts a new test)
ans table -> stores the answers given by the user
At the user's end, I am currently querying MySQL directly and displaying 200 questions using jQuery UI tabs.
Over the past few weeks I have analysed many approaches to online exams from here and there.
I have come to the conclusion that for the questions, which will number around 200 or more, I will create a .json file from the database once. Then, every time users take an exam, the app will read the data from the JSON file instead of going to the database.
So is my decision good, or should I use direct MySQL queries every time?
I am doing this with the future in mind, because MySQL will slow down when many users take tests at the same time.
Thanks in advance
EDIT:
- I can't use PHP pagination with AJAX, because there is a timer on the exam page, so users could lose time while questions load through pagination (a slow internet connection could cost users a lot of time).
Improve your DB design and you will feel better. For example, you could store just pass/fail per user, or keep only the wrong answers in a separate table keyed by user_id. Besides, even if you create a file, it is going to be unsafe. So again, just send each answer to the server, compare it there, and return a true/false value.
When your data does not change that often in the database, there is no drawback to storing a JSON file and sending it to the user every time. It is definitely faster.
But from a design standpoint it wouldn't be the best solution. What I suggest is to store the images in the file system, store only their URLs in the database, and query the database each time a user starts an exam. That way you have more freedom over the data you provide to the user.
For example, if in the future your question bank has 1,000 questions and you want to select 200 of them randomly, this design would be more flexible.
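If the asker does go the cached-file route, the JSON should be regenerated only when the admin edits questions, and the correct answers should be left out of it (they stay server-side, per the safety point above). A rough PHP sketch; the connection details, table, and column names are assumptions, not the asker's actual schema:

```php
<?php
// Illustrative cache-regeneration script: run it whenever the admin edits
// questions (or from cron), not on every exam attempt.
$pdo = new PDO('mysql:host=localhost;dbname=exam', 'user', 'pass');

// Deliberately exclude the answer column so the file is safe to serve.
$stmt = $pdo->query(
    'SELECT id, question, opt1, opt2, opt3, opt4, opt5 FROM questions'
);
$questions = $stmt->fetchAll(PDO::FETCH_ASSOC);

// Write atomically so a reader never sees a half-written file.
$tmp = 'questions.json.tmp';
file_put_contents($tmp, json_encode($questions));
rename($tmp, 'questions.json');
```

Exam pages then fetch the static `questions.json`, which the web server can serve without touching MySQL at all.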

Event Scheduler to multiple MySQL databases

I am trying to figure out if it's possible to have the EVENT SCHEDULER take data from one database and update it in another. I checked the MySQL site and Google but there isn't any info on this. I know how to create events and understand the process. I am thinking of using PHP and MySQL but am not sure if that would work.
Why I need this:
Database A - gets a lot of traffic
Database B - gets little traffic
I want certain queries (mostly counts) to run once a day against Database A and store the results in Database B. That way, when someone loads a page that needs a count, it won't run the count query but will instead just SELECT from Database B, making the page load faster.
I think what you need to do is replicate your databases!
For example, you can designate Database A as your publisher and Database B as your subscriber. There are a lot of articles on the internet about replicating MySQL databases; Google and find out. Anyway, have a look at this question: MySQL Event Scheduler on a specific time everyday. It might give you an idea of how to achieve this.
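If "Database A" and "Database B" are actually two schemas on the same MySQL server, an event alone can do this, since a single statement can reference both with `db.table` syntax. A sketch with illustrative names (if they are on different servers, you would need replication or a PHP cron script instead):

```sql
-- The scheduler must be on for events to fire.
SET GLOBAL event_scheduler = ON;

-- Once a day, recompute the counts from busy db_a and store them in db_b.
-- Assumes db_b.page_counts has a primary key on page, so REPLACE upserts.
CREATE EVENT daily_counts
ON SCHEDULE EVERY 1 DAY
STARTS CURRENT_DATE + INTERVAL 1 DAY + INTERVAL 3 HOUR
DO
  REPLACE INTO db_b.page_counts (page, total)
  SELECT page, COUNT(*)
  FROM db_a.hits
  GROUP BY page;
```

Pages then read the precomputed totals from `db_b.page_counts` instead of counting rows in the high-traffic schema.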

Update a database on server from multiple local databases

I am building a web-based ERP application for the retail industry using PHP and MySQL. I am going to have several local databases and one on the server (same structure). What I plan to do is run this app on localhost in different stores and, at the end of the day, update the database on the server from the different localhosts in the different stores.
Remember, I would like to update the database on the server based on the sequence of queries run in the different databases.
Can anyone please help me with this?
Thank you.
Perhaps link to your main database from the localhost sites to begin with? Then there's no need to update at the end of the day: every change is made directly to the main database with no "middle men", so to speak. If you need the local databases to stay separate, run the queries on both at once?
Note: I'm unfamiliar with how an ERP application works, so forgive me if I'm way off base here.
You may have to log every INSERT/UPDATE/DELETE SQL request on the local databases in a daily file, with a timestamp for each request.
Example :
2012-03-13 09:15:00 INSERT INTO...
2012-03-13 09:15:02 UPDATE MYTABLE SET...
2012-03-13 09:15:02 DELETE FROM...
...
Then send your log files to the main server daily, merge all the files, sort them to keep the execution order, and read the resulting file to execute the requests on the main database.
However, it's a curious way to do things in an ERP application. Product stock information, for instance, can't simply be merged; it's shared information, so be careful with this kind of data.
You can't use auto-increment keys with this process: it will cause duplicate-key errors on some requests, or updates applied to the wrong records.
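One common way around the duplicate-key problem is to give each store's MySQL server a distinct auto-increment offset, so locally generated IDs never collide when the logs are merged. A sketch assuming three stores (these are real MySQL server variables; the store count is illustrative):

```sql
-- Store 1 (set in my.cnf or at runtime): generates IDs 1, 4, 7, ...
SET GLOBAL auto_increment_increment = 3;
SET GLOBAL auto_increment_offset   = 1;

-- Store 2 uses the same increment with offset 2: IDs 2, 5, 8, ...
-- Store 3 uses offset 3: IDs 3, 6, 9, ...
```

The increment must be at least the number of stores, so leave headroom if more stores might be added later.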

Using one MySQL database or a few: what suits me?

I'm looking for guidance. Here is what I'm doing.
I'm creating an advertising/publishing script using PHP and MySQL. At this time, the current system contains 41 million rows (7.5 GB).
I'm thinking about splitting the real-time statistics and the other data (users, ads, places, daily/monthly stats) across two MySQL databases, and then updating the second database (which shows users their statistics) from the real-time one using a cron job three times a day.
So, will this be an 'UP' or a 'DOWN' (Good or Bad)?
Thanks,
pnm123
I'd probably run a master-slave setup, and then use the slave as the source for building the second database you talk about. That should let you aggregate results, etc., without impacting your main application.
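A minimal version of that setup, for orientation only; server IDs, hostnames, and credentials are placeholders, and the binlog coordinates must come from `SHOW MASTER STATUS` on your own master:

```sql
-- On the master (my.cnf needs: server-id=1, log-bin=mysql-bin),
-- create an account the slave will replicate through.
CREATE USER 'repl'@'%' IDENTIFIED BY 'secret';
GRANT REPLICATION SLAVE ON *.* TO 'repl'@'%';

-- On the slave (my.cnf needs: server-id=2), point it at the master.
CHANGE MASTER TO
  MASTER_HOST = 'master.example.com',
  MASTER_USER = 'repl',
  MASTER_PASSWORD = 'secret',
  MASTER_LOG_FILE = 'mysql-bin.000001',  -- from SHOW MASTER STATUS
  MASTER_LOG_POS  = 4;                   -- from SHOW MASTER STATUS
START SLAVE;
```

The heavy aggregation queries (and the cron job that fills the second database) then run against the slave, so the 41-million-row master only handles live traffic.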