Weekly data updates between JSON and MySQL - PHP

I'm looking to develop an application that, at the end of each week, synchronises the web application's database with data retrieved from a JSON feed.
I understand how to pull data from JSON and how to put that data into the SQL database; what I'm curious about is the most efficient or effective way of performing this synchronisation.
For example, say table one stores 3 customer records in the database, but at the end of the week the JSON request shows that 1 customer has been deleted; that customer will then need to be deleted from the database.
Is it suitable to clear all the database entries and then re-insert them with the updated fields? Or is there another way to achieve this effectively with PHP/MySQL (e.g. placing two tables side by side somehow and comparing the two)?
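
One common pattern avoids the clear-and-reload entirely: upsert every record from the feed by its unique key, then delete the rows whose keys no longer appear. Below is a minimal PDO sketch; the customers table, its customer_id/name columns, and the feed URL are all illustrative assumptions, not taken from the question.

    <?php
    // Sketch: sync a MySQL table against a weekly JSON feed.
    // Assumes a UNIQUE index on customers.customer_id.
    $pdo = new PDO('mysql:host=localhost;dbname=app;charset=utf8mb4', 'user', 'pass');
    $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

    $records = json_decode(file_get_contents('https://example.com/customers.json'), true);

    $pdo->beginTransaction();

    $upsert = $pdo->prepare(
        'INSERT INTO customers (customer_id, name)
         VALUES (:id, :name)
         ON DUPLICATE KEY UPDATE name = VALUES(name)'
    );

    $seenIds = [];
    foreach ($records as $row) {
        $upsert->execute([':id' => $row['customer_id'], ':name' => $row['name']]);
        $seenIds[] = (int) $row['customer_id'];
    }

    // Delete customers that no longer exist in the feed.
    if ($seenIds) {
        $in = implode(',', array_fill(0, count($seenIds), '?'));
        $pdo->prepare("DELETE FROM customers WHERE customer_id NOT IN ($in)")
            ->execute($seenIds);
    }

    $pdo->commit();

Compared with clearing and re-inserting everything, this touches only the rows that actually changed, and the transaction means no one ever reads a half-synchronised table.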

Related

React, MySQL, PHP fetching and storing large data

I am creating an application with React as the frontend and PHP as the backend. MySQL is used as the database.
My application is for viewing ECG data, with markers where the R-peaks are. Some of the data that users will upload can be up to 1 GB.
The data needs to be displayed all at once; I can't fetch only half the data and show that, because the user might need to see the other half.
What would be the best way to get this data using axios? I have thought about chunking the data on the server side, but I'm not sure how I would implement that.
How should the thousands of data points be stored on the frontend?
The data structure is as follows:
    {
        "ECG_Y": [236.1541928, 234.5303428, 241.9536573, 262.3677722, 278.6062727],
        "ECG_X": [3306.672, 3306.6722, 3306.6724, 3306.6726]
    }
ECG_X is the time axis, and ECG_Y is the value for that specific time.
Is there a better way I can store this in MySQL? I currently plan on storing it as a plain JSON object.
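
Rather than one monolithic JSON blob, one option is to store the samples one row per point and page them out of MySQL, letting the client pull consecutive slices until it has the whole recording. Here is a rough sketch of such a paging endpoint; the ecg_samples(recording_id, x, y) table and the parameter names are hypothetical:

    <?php
    // Sketch: GET /ecg.php?recording=42&offset=0&limit=100000
    // The client calls this repeatedly, increasing `offset`,
    // until fewer than `limit` points come back.
    $pdo = new PDO('mysql:host=localhost;dbname=ecg;charset=utf8mb4', 'user', 'pass');
    $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

    $recording = (int) ($_GET['recording'] ?? 0);
    $offset    = (int) ($_GET['offset'] ?? 0);
    $limit     = min((int) ($_GET['limit'] ?? 100000), 100000);

    $stmt = $pdo->prepare(
        'SELECT x, y FROM ecg_samples
         WHERE recording_id = ?
         ORDER BY x
         LIMIT ? OFFSET ?'
    );
    $stmt->bindValue(1, $recording, PDO::PARAM_INT);
    $stmt->bindValue(2, $limit, PDO::PARAM_INT);
    $stmt->bindValue(3, $offset, PDO::PARAM_INT);
    $stmt->execute();

    header('Content-Type: application/json');
    echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));

On the React side, axios can call this in a loop and append each slice into typed arrays (e.g. Float64Array) rather than plain JS arrays; the user still ends up with all the data, it just arrives in pieces instead of one 1 GB response.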

Comparing JSON Data to MySQL Table, Updating Rows and Inserting New Rows

I'm trying to wrap my head around a problem and would appreciate some advice.
I've built a PHP script that pulls a large amount of JSON data from a URL, then runs through that data and inserts it into a MySQL database.
I want to use a cron job that pulls the JSON data every 2 or 3 hours and, if anything has changed compared to the data in the MySQL table, updates it. If there are new records, it adds those.
The old system a friend was using would simply pull all of the data every 2-3 hours and overwrite the old data. This is fine for small amounts of data, but it seems super impractical to rewrite 10,000-20,000 rows every few hours.
Each JSON object has a unique identifier - so I was thinking of doing something like:
Pull the MySQL table data into an array;
Pull the JSON data into an array;
Use the unique identifier for each entry in the JSON data to search against the MySQL data: if the entries differ, update the MySQL table; if an entry doesn't exist, insert a new row.
I'm looking for tips on the most efficient and fastest way to do this. I've been told I'm super bad at explaining things, so let me know if I need to add any more detail.
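
Since every object carries a unique identifier, you can also let MySQL do the comparison itself: put a UNIQUE index on that identifier and batch the records into multi-row INSERT ... ON DUPLICATE KEY UPDATE statements. A sketch, with illustrative column names:

    <?php
    // Sketch: batched upsert of the JSON feed, 500 rows per statement.
    // Assumes a UNIQUE index on items.external_id.
    $pdo = new PDO('mysql:host=localhost;dbname=app;charset=utf8mb4', 'user', 'pass');
    $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

    $records = json_decode(file_get_contents('https://example.com/feed.json'), true);

    foreach (array_chunk($records, 500) as $batch) {
        $rows   = [];
        $params = [];
        foreach ($batch as $r) {
            $rows[]   = '(?, ?, ?)';
            $params[] = $r['id'];    // the unique identifier
            $params[] = $r['name'];
            $params[] = $r['price'];
        }
        $sql = 'INSERT INTO items (external_id, name, price) VALUES '
             . implode(',', $rows)
             . ' ON DUPLICATE KEY UPDATE name = VALUES(name), price = VALUES(price)';
        $pdo->prepare($sql)->execute($params);
    }

When the incoming values match what is already stored, MySQL skips the write and reports zero affected rows, so unchanged records cost almost nothing; only new or changed rows are actually written. That avoids both the full overwrite and the PHP-side array comparison.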

Optimized way to fetch data from multiple tables in mysqli PHP

Right now I am learning to write an Android API in PHP for some CRUD operations.
I am able to fetch data from the database without any issue.
Now I want to fetch 200, 500, and 600 records from three different tables.
At the end I am going to show them in the UI, grouped at their appropriate positions.
My question is: should I write multiple PHP files for fetching the records from each table and send them back to the user separately (obviously I would have to call the API repeatedly from the app, one call after another, after getting the response to each previous call)?
OR
Should I have only one PHP file, where I fetch all the records from the three tables and send them back to the user in one shot?
I am returning the data in JSON format.
Please help me figure out which of these methods I should use, and their advantages or disadvantages, if any.
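
For row counts this small (roughly 1,300 rows in total), one endpoint is usually the better trade: three extra HTTP round trips from a mobile app cost far more than three quick queries on the server. A rough mysqli sketch, with made-up table names:

    <?php
    // Sketch: one endpoint, three queries, one grouped JSON response.
    $db = new mysqli('localhost', 'user', 'pass', 'app');
    if ($db->connect_error) {
        http_response_code(500);
        exit;
    }

    $payload = [];
    foreach (['products', 'categories', 'orders'] as $table) {
        // Table names are hard-coded here, never taken from user input.
        $result = $db->query("SELECT * FROM `$table`");
        $payload[$table] = $result->fetch_all(MYSQLI_ASSOC); // needs mysqlnd
    }

    header('Content-Type: application/json');
    echo json_encode($payload);

The main downside of the one-shot approach is that the response is larger and the UI can't render anything until all three queries finish; with a few hundred rows per table, that is rarely noticeable.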

PHP: How bad of an idea is it to store shared user data in MySQL as a JSON string?

I am working on a simple multiplayer game that may have 2 to 6 people in a game at a time. The game is only slightly more complex than Five-Card Draw poker so it isn't very resource intensive.
In order to make the development fast and easy I am thinking about having a table like this: (simplified)
games
id (int) primary-key auto-inc > unique id
data (text) > for storing a JSON string
Here is what I'm thinking for the process:
AJAX post includes game_id field
Do some permissions checking and validation
I fetch the entry from games with the id equal to game_id
json_decode the data field
More permissions and validation stuff
If changes to the game data are made: json_encode the new data and update the DB
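
A minimal sketch of that read-modify-write cycle, with illustrative names; the SELECT ... FOR UPDATE inside a transaction is an added safeguard (not in the steps above) so two simultaneous AJAX posts can't overwrite each other's state:

    <?php
    // Sketch: load, mutate, and save a JSON game state.
    $pdo = new PDO('mysql:host=localhost;dbname=game;charset=utf8mb4', 'user', 'pass');
    $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

    $gameId = (int) $_POST['game_id'];

    $pdo->beginTransaction();

    // FOR UPDATE locks the row so concurrent moves are serialized.
    $stmt = $pdo->prepare('SELECT data FROM games WHERE id = ? FOR UPDATE');
    $stmt->execute([$gameId]);
    $state = json_decode($stmt->fetchColumn(), true);

    // ... permissions and validation, then mutate the decoded state ...
    $state['turn'] = ($state['turn'] ?? 0) + 1; // hypothetical field

    $pdo->prepare('UPDATE games SET data = ? WHERE id = ?')
        ->execute([json_encode($state), $gameId]);

    $pdo->commit();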
Question: What pitfalls may I encounter doing this?
Some people do that; here is what you are going to face:
(1) Every time you need to update the user data, you will have to decode the JSON, edit it, and then store it again (more data retrieved and more data stored means more processing).
(2) Forget about data aggregation using MySQL queries; you will have to do it manually in PHP or whatever language you use.
I can't say it is a bad or a good idea to store JSON in MySQL; that depends on the application domain and how you use it.

Creating a real-time news feed

I have a database containing many tables (followers, favorites, posts, etc.).
These tables define the different activities a user can perform: sending posts, adding other people to favorites, and following others.
What I want to do is extract data from these tables and build a real-time news feed.
I have two options:
1- Creating a separate table for the notifications (so that I won't have to pull data from multiple tables), then using a JavaScript timer to fetch new results every x seconds.
2- Using an XMPP server that sends (or pushes) notifications every x seconds, without the client sending any AJAX requests.
For either option, I don't know whether I should query the activity tables to build the news feed or create a separate table for notifications.
I've searched the subject but haven't found anything really helpful yet; any links will be appreciated.
If your data is normalized, you should be able to pull all of it with one query (using JOINs), or you could create a View if you want to query from a single table. It's always best to keep your data in the appropriate tables, to avoid having to duplicate it.
Push notifications are easier on the server, since it isn't fielding polling requests from every client; but depending on your load, you could probably get away with simple AJAX requests.
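
For the normalized route, the feed query might look roughly like this; the schema is only a guess based on the tables named in the question:

    <?php
    // Sketch: posts authored by people the current user follows, newest first.
    $pdo = new PDO('mysql:host=localhost;dbname=app;charset=utf8mb4', 'user', 'pass');

    $currentUserId = 123; // e.g. taken from the session

    $stmt = $pdo->prepare(
        'SELECT p.id, p.user_id, p.body, p.created_at
         FROM posts p
         JOIN followers f ON f.followed_id = p.user_id
         WHERE f.follower_id = ?
         ORDER BY p.created_at DESC
         LIMIT 50'
    );
    $stmt->execute([$currentUserId]);
    $feed = $stmt->fetchAll(PDO::FETCH_ASSOC);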
The news feed will be requested very frequently, so you must keep your code fast and make it consume as few resources (CPU time, database queries) as possible.
I suggest you take the first option. It meets your requirements and is simple enough.
Because you have many tables, all of them will grow bigger day by day. Joining across them to build the news feed will take longer and longer and increase the load on your database; the SQL will also be complex.
As #Curtis Mattoon said, avoid duplicating data; but sometimes we need to spend more space to save time.
So I suggest creating a new table to store the notification data. You can even delete old rows from this table periodically.
That way, your SQL and PHP code for the news feed stay simple and fast.
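
If you take the notifications-table route, each activity writes one denormalized row, and the endpoint the JavaScript timer polls becomes a single indexed query. A sketch with an illustrative schema:

    <?php
    // Illustrative table (created once):
    //   CREATE TABLE notifications (
    //       id BIGINT PRIMARY KEY AUTO_INCREMENT,
    //       user_id INT NOT NULL,       -- who should see it
    //       type VARCHAR(20) NOT NULL,  -- 'post', 'follow', 'favorite', ...
    //       payload TEXT NOT NULL,      -- denormalized display data (JSON)
    //       created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    //       INDEX (user_id, id)
    //   );
    $pdo = new PDO('mysql:host=localhost;dbname=app;charset=utf8mb4', 'user', 'pass');

    // The client remembers the highest id it has seen and polls
    // every x seconds, asking only for newer rows.
    $userId  = 123; // e.g. taken from the session
    $sinceId = (int) ($_GET['since_id'] ?? 0);

    $stmt = $pdo->prepare(
        'SELECT id, type, payload, created_at
         FROM notifications
         WHERE user_id = ? AND id > ?
         ORDER BY id
         LIMIT 100'
    );
    $stmt->execute([$userId, $sinceId]);

    header('Content-Type: application/json');
    echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));

Because each poll filters on (user_id, id), the query stays cheap even as the table grows, and old rows can be purged on a schedule without affecting the feed.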
