React, MySQL, PHP: fetching and storing large data

I am creating an application with React as the frontend and PHP as the backend. MySQL is used as the database.
My application is for viewing ECG data, with markers where the R-Peaks are. Some of the data that users will upload can be up to 1GB.
The data needs to be displayed all at once; I can't fetch only half the data and show that, because the user might need to see the other half.
What would be the best way to get this data using axios? I have thought about chunking the data on the server side, but I'm not sure how I would implement that.
How should the thousands of data points be stored on the frontend?
The data structure is as follows:
{
ECG_Y: [236.1541928, 234.5303428, 241.9536573, 262.3677722, 278.6062727],
ECG_X: [3306.672, 3306.6722, 3306.6724, 3306.6726]
}
ECG_X is the time axis, and ECG_Y is the value at that specific time.
Is there a better way I can store this in MySQL? I plan on storing it as a plain JSON object.
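A minimal sketch of the server-side chunking idea: a paged PHP endpoint that the React app calls repeatedly with axios (offset += limit) until every sample has arrived. It assumes the samples are stored one row per point in a hypothetical ecg_samples(recording_id, t, v) table rather than as one large JSON blob; all names below are assumptions, not a definitive implementation.

<?php
// Hypothetical paged endpoint; table and column names are assumptions.
$pdo = new PDO('mysql:host=localhost;dbname=ecg', 'user', 'pass');

$recording = (int)($_GET['recording'] ?? 0);
$offset    = (int)($_GET['offset'] ?? 0);
$limit     = 100000; // samples per chunk; tune against memory and latency

$stmt = $pdo->prepare(
    'SELECT t, v FROM ecg_samples
      WHERE recording_id = ?
      ORDER BY t
      LIMIT ? OFFSET ?'
);
$stmt->bindValue(1, $recording, PDO::PARAM_INT);
$stmt->bindValue(2, $limit, PDO::PARAM_INT);
$stmt->bindValue(3, $offset, PDO::PARAM_INT);
$stmt->execute();

$rows = $stmt->fetchAll(PDO::FETCH_NUM);

// The client keeps requesting with offset += limit until done is true,
// appending each chunk to its local buffer before rendering.
header('Content-Type: application/json');
echo json_encode(['samples' => $rows, 'done' => count($rows) < $limit]);

On the frontend, appending each chunk to a typed array (e.g. Float64Array) rather than a plain JavaScript array keeps memory overhead manageable for millions of points.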

Related

Weekly data updates between JSON and MySQL

I'm looking to develop an application which, at the end of the week, will synchronise the web application's database with the data retrieved from JSON.
I understand the process of pulling data from JSON, and I understand how to put that data into the SQL database; what I'm curious about is the most efficient or effective way of performing this process.
For example: table one stores 3 customer records in the database, but at the end of the week the JSON request shows that 1 customer has been deleted, so one customer will need to be deleted from the database.
Is it suitable to clear all the database entries and then re-insert them with the updated fields? Or is there another way to achieve this effectively using PHP/MySQL (i.e. placing two tables side by side somehow and comparing the two)?
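One pattern for the "compare rather than wipe" option: upsert every record from the feed, then delete rows whose keys no longer appear in it. A rough sketch, assuming a customers(id, name) table with id as the primary key and a local customers.json file; all of these names are assumptions.

<?php
// Hypothetical weekly sync; table, column, and file names are assumptions.
$pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$feed = json_decode(file_get_contents('customers.json'), true);

$upsert = $pdo->prepare(
    'INSERT INTO customers (id, name) VALUES (?, ?)
     ON DUPLICATE KEY UPDATE name = VALUES(name)'
);

$ids = [];
foreach ($feed as $customer) {
    $upsert->execute([$customer['id'], $customer['name']]);
    $ids[] = (int)$customer['id'];
}

// Remove customers that disappeared from the feed.
if ($ids) {
    $placeholders = implode(',', array_fill(0, count($ids), '?'));
    $pdo->prepare("DELETE FROM customers WHERE id NOT IN ($placeholders)")
        ->execute($ids);
}

Compared with clearing and re-inserting everything, this keeps existing rows (and anything referencing them) stable and only touches what actually changed.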

Reading data via AJAX call or MySQL DB

Context: I am creating a website where I will need to show statistics. The statistics are calculated in Python, and I need a place to store the calculated stats so they can be read and presented on the website. The statistics are calculated by going through around 70,000 JSON files, so the calculations are done beforehand. The data is not dynamic, so all I need to do is read it; there is no writing or changing the data.
Solutions:
MySQL approach: I put the statistics in the DB beforehand and use PHP to connect to the MySQL database, using SELECT statements to get the data and present it.
AJAX (JavaScript) approach: I put the statistics I need into a JSON file and put the file on my server. I use an AJAX call to get the JSON data, parse it, and show the statistics from JavaScript.
Question: Which would be the best approach to take?
If speed is the top priority, PHP/MySQL is definitely faster.
With AJAX, I assume that your 70,000 JSON files are split up and your AJAX call queries the "right one". Depending on your client, the user experience might be nicer, since you can get new data without doing a page refresh.
One "happy medium" solution would be to make an AJAX call to a query.php file that does the MySQL/PHP lookup but returns a JSON object, so you get the best of both worlds!
Use the PHP/MySQL approach. Why? It's faster, and it doesn't consume a lot of resources on the client side or slow the browser down.

Suggestion on copying data from server database to Android device SQLite

I'm developing an Android app for salesmen, so they can use their devices to save their orders. Specifically, every morning the salesman goes to the office and fetches the data for that day.
Currently, I get the data by sending a request to a PHP file and, as is common practice, insert that data into SQLite on the Android device so the app can work offline. However, with the current approach the device needs 6-8 seconds to get the data and insert it into SQLite. As the data grows bigger, I think it will only get slower. What I have found is that inserting the data into SQLite takes up most of that time.
So, I've been thinking about dumping all the data the salesman needs into a SQLite file, so I could send only that file, which I guess would be more efficient. Can you please lead me on how to do that? Or is there another, more efficient approach to this issue?
Note:
Server DB: MySQL
Server: PHP
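One way to build that SQLite file on the server, sketched with PHP's built-in SQLite3 class; the orders table and its columns are assumptions about the schema, not a definitive implementation.

<?php
// Hypothetical export: dump today's orders from MySQL into a SQLite
// file and send it to the device in one download.
$mysql = new PDO('mysql:host=localhost;dbname=sales', 'user', 'pass');
$file  = sys_get_temp_dir() . '/orders_' . date('Ymd') . '.sqlite';

$sqlite = new SQLite3($file);
$sqlite->exec('CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)');

$insert = $sqlite->prepare('INSERT INTO orders (id, customer, total) VALUES (:id, :customer, :total)');

$sqlite->exec('BEGIN'); // a single transaction makes bulk inserts much faster
foreach ($mysql->query('SELECT id, customer, total FROM orders WHERE order_date = CURDATE()') as $row) {
    $insert->bindValue(':id', $row['id'], SQLITE3_INTEGER);
    $insert->bindValue(':customer', $row['customer'], SQLITE3_TEXT);
    $insert->bindValue(':total', $row['total'], SQLITE3_FLOAT);
    $insert->execute();
    $insert->reset();
}
$sqlite->exec('COMMIT');
$sqlite->close();

// Ship the finished file; the app saves it and opens it directly.
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="orders.sqlite"');
readfile($file);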
There are a few different approaches you can take here to improve loading speed:
If your data can be pre-loaded with the APK, you can store it inside the .apk so that when the user downloads the app the data is already there; you then only need to fetch the remaining updated data.
If you need fresh data every time, you can fetch chunks of data from the server in multiple calls, storing each chunk in the database and updating the UI as it arrives.
If there is not too much data (say 200-300 records, which I would not consider much), you can do something simple:
When the network call fetches the data, pass the data objects to the database for storage and, at the same time (before storing them in the DB), return the entire list to the Activity/Fragment, so it already has the data to show the user while the database write happens in the background.
Also, you can take a NoSQL-style approach inside SQLite: store each record as one serialized object so you don't have to map columns to objects every time (which is costly compared to NoSQL), and whenever it's required, just fetch it from the DB and parse it as needed.
Thanks to @Skynet for mentioning transactions; they improve the process a lot, so I'll stay with this approach for now.
You can do something like so:
db.beginTransaction();
try {
    saveCustomer(); // the bulk insert work happens here
    db.setTransactionSuccessful();
} catch (Exception e) {
    // Error in the middle of the database transaction
} finally {
    db.endTransaction();
}
For more explanation, see Android Database Transaction.

Is it quicker to do 24 database queries or 1 database query and sort in PHP?

I have a site that accepts entries from users. I want to create a graph that displays the entries over time. Is it more efficient to make 24 calls to the database and have SQL return the number of entries per hour, or should I just do one call, return all the entries, and organize them in PHP?
Depends on the data, the database schema and the query.
Usually the less queries you can make, the better.
If it's still slow after optimising the query, cache the result in PHP?
I think it depends on your setup: the database, whether the database is on the same machine as the web server, traffic, and so on. Each call incurs some overhead on the database server, but you avoid sorting on the web server. I would suggest testing it with a loop.
Ok, let's compare:
1) Query the database for all the values. Return a large chunk of data across your network to the application server. Parse all of that data through the client's DB driver interface. Build your own data structures to store the data the way you want it. Write, document and maintain client code to loop over the detailed data, adding/updating your data structure with each row.
2) Query the database for the data you want, in the format you want it. Let the highly tuned database create the summary buckets. Return less data across the network. Parse less data on the app server. Use the data exactly as it was returned.
There's no "it depends". Use the database.

Save data from an API call in MySQL with PHP?

I have a database table with the names of businesses and want to take advantage of some of the APIs that are available. What I am wondering is how I could write a PHP script that would take my data (business name and maybe address), run it against the API, and store the resulting data in a MySQL database.
I would like one script to run all the business data in my database against an API, storing the results back in another table.
Thanks in advance for any and all help!
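A rough sketch of such a script, assuming a businesses(id, name, address) source table, a business_api_data target table, and a made-up API endpoint; every name here is an assumption.

<?php
// Hypothetical batch job: run each business through an external API
// and save the raw response in a second table.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

$insert = $pdo->prepare(
    'INSERT INTO business_api_data (business_id, api_response) VALUES (?, ?)'
);

foreach ($pdo->query('SELECT id, name, address FROM businesses') as $biz) {
    $url = 'https://api.example.com/lookup?' . http_build_query([
        'name'    => $biz['name'],
        'address' => $biz['address'],
    ]);

    $response = file_get_contents($url); // cURL would allow better error handling
    if ($response !== false) {
        $insert->execute([$biz['id'], $response]);
    }
}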
Creating a caching layer is a good idea to prevent unnecessary API calls, but I'd recommend keeping it out of the database, because then you'd be making a DB call, which trades overhead A for overhead B. I'd recommend using Memcached, or even saving the API data to a text file.
