I'm developing an Android app for salesmen, so they can use their devices to save their orders. Specifically, every morning each salesman goes to the office and fetches the data for that day.
Currently, I get the data by sending a request to a PHP file and, as is common practice, insert that data into SQLite on the Android device so the app can work offline. However, with the current approach the device needs 6-8 seconds to fetch the data and insert it into SQLite, and as the data grows I expect it to get slower. What I have found is that inserting the data into SQLite takes most of that time.
So I've been thinking about dumping all the data the salesman needs into a SQLite file on the server, so I could send only that file, which I guess would be more efficient. Can you please point me in the right direction for doing that? Or is there another, more efficient approach to this issue?
Note:
Server DB: MySQL
Server: PHP
You can take a few different approaches here to improve the loading speed:
If your data can be pre-loaded with the APK, you can simply ship it inside the .apk, so when the user installs the app the data is already there and you only need to fetch the remaining updated records.
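For the pre-packaged route, a common pattern is to ship the SQLite file in the APK's assets folder and copy it into the app's database directory on first launch. A minimal sketch, assuming the bundled file is called salesman.db (the asset name and class are made up for illustration):

import android.content.Context;
import java.io.*;

public class PrepackagedDbInstaller {
    private static final String DB_NAME = "salesman.db"; // assumed asset name

    /** Copies the bundled database from assets on first run; afterwards only deltas need fetching. */
    public static void installIfMissing(Context context) throws IOException {
        File target = context.getDatabasePath(DB_NAME);
        if (target.exists()) return;                      // already installed
        target.getParentFile().mkdirs();
        try (InputStream in = context.getAssets().open(DB_NAME);
             OutputStream out = new FileOutputStream(target)) {
            byte[] buffer = new byte[8192];
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
        }
    }
}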
If you need fresh data every time, you can fetch it from the server in multiple chunked calls, storing each chunk in the database and updating the UI as it arrives.
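A rough sketch of that chunked download, assuming a hypothetical PHP endpoint that accepts offset/limit parameters and returns a JSON array; each chunk is handed to a callback that stores it (ideally inside a transaction) and refreshes the UI:

import org.json.JSONArray;
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.function.Consumer;

public class ChunkedDownloader {
    private static final int PAGE_SIZE = 500; // tune to taste

    /** Fetches ?offset=..&limit=.. pages until the server returns a short page. */
    public static void fetchAll(String baseUrl, Consumer<JSONArray> onChunk) throws Exception {
        int offset = 0;
        while (true) {
            URL url = new URL(baseUrl + "?offset=" + offset + "&limit=" + PAGE_SIZE);
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            StringBuilder body = new StringBuilder();
            try (BufferedReader reader = new BufferedReader(
                    new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
                String line;
                while ((line = reader.readLine()) != null) body.append(line);
            } finally {
                conn.disconnect();
            }
            JSONArray chunk = new JSONArray(body.toString());
            onChunk.accept(chunk);                 // store this chunk and update the UI
            if (chunk.length() < PAGE_SIZE) break; // last page reached
            offset += PAGE_SIZE;
        }
    }
}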
If there is not too much data (say, 200-300 records, which I would not consider a lot), you can do something simpler:
When the network call returns, pass the parsed objects to the database for storing and, at the same time (before they are stored), return the full list to the Activity/Fragment, so the user sees the data immediately while it is persisted in the background.
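A minimal sketch of that idea: hand the freshly parsed list to the UI right away and let a background executor do the persisting (the callback names are illustrative, not from any particular library):

import java.util.List;
import java.util.concurrent.Executor;
import java.util.concurrent.Executors;
import java.util.function.Consumer;

public class FetchAndCache {
    private final Executor diskExecutor = Executors.newSingleThreadExecutor();

    /** Hands the freshly parsed rows to the UI right away and persists them in the background. */
    public <T> void deliver(List<T> rows, Consumer<List<T>> showOnUi, Consumer<List<T>> saveToDb) {
        showOnUi.accept(rows);                              // UI gets the data immediately
        diskExecutor.execute(() -> saveToDb.accept(rows));  // insert happens off the UI thread
    }
}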
You can also store data no-SQL style inside SQLite, so you don't need to map objects to columns every time (which is costly compared to no-SQL): store each entire object as a serialized blob and, whenever required, just fetch it from the DB and parse it as needed.
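A sketch of that blob-style storage, assuming a made-up table called order_blob with a single TEXT payload column:

import android.content.ContentValues;
import android.database.Cursor;
import android.database.sqlite.SQLiteDatabase;
import org.json.JSONObject;

public class JsonBlobStore {
    // Hypothetical table: one row per object, the whole payload kept as serialized JSON.
    public static final String CREATE_SQL =
            "CREATE TABLE IF NOT EXISTS order_blob (id TEXT PRIMARY KEY, payload TEXT NOT NULL)";

    public static void put(SQLiteDatabase db, String id, JSONObject payload) {
        ContentValues values = new ContentValues();
        values.put("id", id);
        values.put("payload", payload.toString());   // store the object as-is, no per-column mapping
        db.insertWithOnConflict("order_blob", null, values, SQLiteDatabase.CONFLICT_REPLACE);
    }

    public static JSONObject get(SQLiteDatabase db, String id) throws org.json.JSONException {
        try (Cursor c = db.query("order_blob", new String[]{"payload"},
                "id = ?", new String[]{id}, null, null, null)) {
            return c.moveToFirst() ? new JSONObject(c.getString(0)) : null;
        }
    }
}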
Thanks to @Skynet for mentioning transactions, it does improve the process a lot, so I'll stay with this approach for now.
You can do something like this:
db.beginTransaction();
try {
    saveCustomer();
    db.setTransactionSuccessful(); // marks the transaction to be committed
} catch (Exception e) {
    // Error in the middle of the database transaction; without setTransactionSuccessful() it rolls back
} finally {
    db.endTransaction();
}
For more explanation, see Android Database Transaction.
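The speed-up comes from committing many inserts in a single transaction; a pre-compiled statement helps a bit further. A sketch, with an assumed orders table and columns (adapt the SQL to your own schema):

import android.database.sqlite.SQLiteDatabase;
import android.database.sqlite.SQLiteStatement;
import java.util.List;

public class OrderBatchWriter {
    /** Inserts all rows in one transaction with a pre-compiled statement. */
    public static void insertAll(SQLiteDatabase db, List<String[]> rows) {
        SQLiteStatement stmt = db.compileStatement(
                "INSERT INTO orders (customer, product, qty) VALUES (?, ?, ?)");
        db.beginTransaction();
        try {
            for (String[] row : rows) {
                stmt.clearBindings();
                stmt.bindString(1, row[0]);
                stmt.bindString(2, row[1]);
                stmt.bindString(3, row[2]);
                stmt.executeInsert();
            }
            db.setTransactionSuccessful(); // commit all inserts at once
        } finally {
            db.endTransaction();
            stmt.close();
        }
    }
}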
Related
I am creating an application with React as the frontend and PHP as the backend. MySQL is used as the database.
My application is for viewing ECG data, with markers where the R-Peaks are. Some of the data that users will upload can be up to 1GB.
The data needs to be displayed all at once; I can't fetch only half the data and show that, because the user might need to see the other half.
What would be the best way to get this data using axios? I have thought about chunking the data on the server side but I'm not sure how I would implement that.
How should the thousands of data points be stored on the frontend?
The data structure is as follows:
{
ECG_Y: [236.1541928, 234.5303428, 241.9536573, 262.3677722, 278.6062727],
ECG_X: [3306.672, 3306.6722, 3306.6724, 3306.6726]
}
ECG_Y is the time axis, and ECG_X is the value for that specific time.
Is there a better way I can store this in MySQL? I plan on storing it as a plain JSON object.
First of all, sorry for my English, I'll try my best!
I have a web application where I show information about data stored in my MySQL database. I would like to make it more dynamic: if new information appears in my DB, the web content should update without a refresh. I was thinking about Ajax, sending an Ajax request every minute asking for new data... but that's not a good idea because it could put a lot of stress on my server.
What's the best way to do it?
Thanks a lot for your time!
You could use something like Firebase, which automatically pushes data to the client when changes occur in the backend database, but that means you would have to store all your data in Firebase instead of in your MySQL database.
Apart from a service like that, one option is to use an Ajax call to fetch new data, but only when new data is actually available. I don't know how your data is structured or what kind of data it is, but one way to minimize the load on your server is to keep a database table with a timestamp column that is updated every time relevant data is changed, added or deleted. Use Ajax to fetch only this timestamp every minute, and if the timestamp has changed since the last check, make a second Ajax call to fetch the actual data.
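In your stack the client side would be JavaScript/Ajax rather than Java, but the check-then-fetch logic is the same in any language. A sketch with hypothetical endpoint URLs (last_modified.php returning only the timestamp, data.php returning the payload; both names are made up):

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class ChangePoller {
    // Hypothetical endpoints: one returns only the latest change timestamp, the other the full payload.
    private static final String STAMP_URL = "https://example.com/last_modified.php";
    private static final String DATA_URL  = "https://example.com/data.php";

    private String lastSeenStamp = "";

    /** Call this once a minute; the heavy request only fires when the timestamp has moved. */
    public String pollOnce() throws Exception {
        String stamp = fetch(STAMP_URL);
        if (stamp.equals(lastSeenStamp)) return null;  // nothing changed, cheap round trip
        lastSeenStamp = stamp;
        return fetch(DATA_URL);                        // something changed, fetch the real data
    }

    private static String fetch(String endpoint) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(endpoint).openConnection();
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
            StringBuilder sb = new StringBuilder();
            String line;
            while ((line = reader.readLine()) != null) sb.append(line);
            return sb.toString().trim();
        } finally {
            conn.disconnect();
        }
    }
}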
We have a heart monitor hooked up to a TI MSP430 microcontroller with a Roving Networks WiFi module. I would like to send some kind of data stream to a web server so that someone could monitor the data offsite. We were thinking that every half second we could send a data point to the PHP/MySQL server with the current heart rate. My problem is storing all that data. If I get one data point every second and create a new table entry for each one, I will end up with a lot of rows that each contain very little data. I'm afraid this will slow things down significantly when we query the database and display the data, so our 'real time' data wouldn't be so 'real time' any more.
I was then thinking that every hour or so I could have the database batch up all the entries and combine them into a single entry. This seems like a bit of a hack, though, and I feel like there is a better way that I am missing.
Is there any way I might be able to open some type of connection between the microcontroller and the server to send the live data and continuously write it to a file or something? Like a data stream of some kind?
or
Can you keep session variables and the like when the microcontroller connects to the server? If so, we could save all the data in a session variable until it reaches a certain size, then write that chunk to the database as a single entry and reset the session variable.
In no way will one data point per second slow down the performance of your database, even if you are running on a very limited server. Handling large amounts of data is exactly what databases are built for. In the long run it will actually work out better than writing to a file, since it is easy to select the last data point by id for use in your 'real time' application.
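For illustration, a minimal JDBC sketch of that pattern, assuming a made-up readings table with an auto-increment id and the MySQL JDBC driver on the classpath; the 'real time' page only ever asks for the newest row:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class HeartRateStore {
    // Hypothetical table: CREATE TABLE readings (id BIGINT AUTO_INCREMENT PRIMARY KEY,
    //                                            bpm INT, recorded_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP)
    private final Connection conn;

    public HeartRateStore(String jdbcUrl, String user, String pass) throws Exception {
        this.conn = DriverManager.getConnection(jdbcUrl, user, pass);
    }

    /** One small insert per reading; this volume is trivial for MySQL. */
    public void record(int bpm) throws Exception {
        try (PreparedStatement ps = conn.prepareStatement("INSERT INTO readings (bpm) VALUES (?)")) {
            ps.setInt(1, bpm);
            ps.executeUpdate();
        }
    }

    /** The 'real time' display only needs the newest row. */
    public int latest() throws Exception {
        try (PreparedStatement ps = conn.prepareStatement(
                "SELECT bpm FROM readings ORDER BY id DESC LIMIT 1");
             ResultSet rs = ps.executeQuery()) {
            return rs.next() ? rs.getInt(1) : -1;
        }
    }
}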
I have a site that accepts entries from users. I want to create a graph that displays the entries over time. Is it more efficient to make 24 calls to the database and have SQL return the number of entries per hour, or should I just do one call, return all the entries, and organize them in PHP?
Depends on the data, the database schema and the query.
Usually the less queries you can make, the better.
If it's still slow after optimising the query, cache the result in PHP?
I think it depends on your setup: which database, whether the database is on the same machine as the web server, traffic, and so on. Each call adds some overhead on the database server, but you avoid sorting on the web server. I would suggest testing it with a loop.
Ok, let's compare:
1) Query the database for all the values. Return a large chunk of data across your network to the application server. Parse all of that data through the client's DBD interface. Build your own data structures to store the data the way you want it. Write, document and maintain client code to loop across the detailed data, adding/updating your data structure with each row.
2) Query the database for the data you want in the format you want it. Let the highly tuned database create the summary buckets. Return less data across the network. Parse less data on the app server. Use the data exactly as it was returned.
There's no "it depends". Use the database.
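For option 2, the "summary buckets" are just a GROUP BY on the hour, done entirely in MySQL. A sketch via JDBC, with assumed table and column names (entries, created_at):

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.util.LinkedHashMap;
import java.util.Map;

public class HourlyEntryCounts {
    /** Returns hour-of-day -> entry count for one day, computed entirely by MySQL. */
    public static Map<Integer, Integer> forDay(Connection conn, java.sql.Date day) throws Exception {
        String sql = "SELECT HOUR(created_at) AS h, COUNT(*) AS c "
                   + "FROM entries WHERE DATE(created_at) = ? "
                   + "GROUP BY HOUR(created_at) ORDER BY h";
        Map<Integer, Integer> counts = new LinkedHashMap<>();
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setDate(1, day);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) counts.put(rs.getInt("h"), rs.getInt("c"));
            }
        }
        return counts; // one query, one small result set across the network
    }
}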
I have a database table with the names of businesses and want to take advantage of some of the APIs that are available. What I am wondering is how I could write a PHP script that would take my data (business name and maybe address), run it against the API, and store the resulting data in a MySQL database.
In one script, I would like to run all the business data in my database against an API and store the results back in another table.
Thanks in advance for any and all help!
Creating a caching layer is a good idea to prevent unnecessary API calls, but I'd recommend keeping it out of the database, because otherwise you are just making a DB call instead, trading overhead A for overhead B. I'd recommend using Memcached, or even saving the API data to a plain text file.
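As an illustration of the text-file variant, a small disk cache keyed by business name with a simple age check (the directory layout and TTL policy are assumptions, not from your code):

import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class FileApiCache {
    private final Path dir;
    private final long maxAgeMillis;

    public FileApiCache(String directory, long maxAgeMillis) throws IOException {
        this.dir = Files.createDirectories(Paths.get(directory));
        this.maxAgeMillis = maxAgeMillis;
    }

    /** Returns the cached API response for a key, or null if missing or stale. */
    public String get(String key) throws IOException {
        Path file = dir.resolve(sanitize(key) + ".txt");
        if (!Files.exists(file)) return null;
        long age = System.currentTimeMillis() - Files.getLastModifiedTime(file).toMillis();
        if (age > maxAgeMillis) return null;          // stale entry, caller should re-fetch
        return new String(Files.readAllBytes(file), StandardCharsets.UTF_8);
    }

    /** Saves a fresh API response to disk so the next lookup skips the API call. */
    public void put(String key, String apiResponse) throws IOException {
        Files.write(dir.resolve(sanitize(key) + ".txt"),
                apiResponse.getBytes(StandardCharsets.UTF_8));
    }

    private static String sanitize(String key) {
        return key.replaceAll("[^A-Za-z0-9._-]", "_"); // keep file names safe
    }
}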