I am working on a project that I can't seem to get right.
Basically, the application makes a request to my PHP server and displays the data in a ListView.
Due to the large amount of data on my server and the length of time it takes to load it all into my ListView, I decided to implement an OnScrollListener on the Android side to detect when the last item is visible and then load more data. I achieved this by selecting all the IDs of the data I want to load when the initial request is made. The IDs are then sorted based on my requirements (time created, points, etc.), after which the first five IDs are used to select the initial data, which is returned to the Android app along with the full ID list. Then, when the last item becomes visible, I send the next five IDs from the list to a PHP function, which returns the data corresponding to those five IDs.
So far this approach works, but it is still unsatisfactory due to the large amount of data that needs to be processed during the initial request.
I need help with an alternative technique that achieves my objective with minimal delay on both the initial request and subsequent requests.
Any help will be much appreciated.
From what I read in your question, you are loading all the data in the initial request?
I suggest you implement pagination on the server side so you can minimize the amount of data, and request the next portion/page of data only when you need it (in this case you can trigger that from your OnScrollListener). There is a small sketch after the links below.
For more details about pagination
- http://www.phpfreaks.com/tutorial/basic-pagination
- http://www.phpeasystep.com/phptu/29.html
- http://code.tutsplus.com/tutorials/how-to-paginate-data-with-php--net-2928
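For example, here is a minimal sketch of such a paginated endpoint. The posts table, its columns, and the connection details are placeholders (not your schema), and it assumes PDO and a page number sent by the app:

<?php
// pagination.php - a minimal sketch; "posts" and its columns are
// placeholder names, not your actual schema.
$pdo = new PDO("mysql:host=localhost;dbname=mydb;charset=utf8", "user", "pass");
$pdo->setAttribute(PDO::ATTR_EMULATE_PREPARES, false); // so LIMIT params bind as ints

$pageSize = 5;
$page     = isset($_GET['page']) ? max(0, (int)$_GET['page']) : 0;

// Let MySQL do the sorting and slicing instead of shipping all IDs upfront.
$stmt = $pdo->prepare(
    "SELECT id, title, points, created_at
       FROM posts
      ORDER BY created_at DESC
      LIMIT :lim OFFSET :off"
);
$stmt->bindValue(':lim', $pageSize, PDO::PARAM_INT);
$stmt->bindValue(':off', $page * $pageSize, PDO::PARAM_INT);
$stmt->execute();

header('Content-Type: application/json');
echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));

Your OnScrollListener then only has to send the next page number instead of carrying the whole ID list around.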
I am using an API to fetch certain data, but due to its large amount I cannot fetch it each time a user accesses the website. Instead, I thought of fetching it once, storing it in the database, and querying whatever the user needs to display it. The catch is that I might need to send a request to the API some other time to fetch newly added data; for that, I thought of either scheduling a certain time at which the system fetches the data again from the API, or manually sending a request to obtain the new data.
To clarify my point, here's how I imagine the sequence:
Request the API for the first time to obtain the data
Save the data in my DB and query what the user needs, thus minimizing both the time needed to fetch the data and the number of requests sent to the server
Set automatically/manually a certain schedule at which I would resend a request to the API to find out whether new data has been added
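Roughly, I picture the refresh step looking something like this (just a sketch: fetchToursFromApi(), the tours table, and the connection details are hypothetical placeholders, and it assumes PDO):

<?php
// refresh_tours.php - sketch of the scheduled refresh step (run from cron
// or triggered manually). fetchToursFromApi() is a placeholder for the
// actual gAdventures API call.
$pdo = new PDO("mysql:host=localhost;dbname=mydb;charset=utf8", "user", "pass");

$tours = fetchToursFromApi(); // hypothetical: returns an array of tours

$stmt = $pdo->prepare(
    "INSERT INTO tours (id, name, details)
     VALUES (:id, :name, :details)
     ON DUPLICATE KEY UPDATE name = VALUES(name), details = VALUES(details)"
);

foreach ($tours as $tour) {
    // Upsert, so re-running the job only adds new rows or refreshes old ones.
    $stmt->execute([
        ':id'      => $tour['id'],
        ':name'    => $tour['name'],
        ':details' => json_encode($tour['details']),
    ]);
}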
My question: is my approach doable and efficient, or could all of that be done in a much easier way?
UPDATE:
I would be using the gAdventures API to list different tours along with their details.
The maximum number of tours returned per request is 100.
To fetch a tour's details, a separate request must be sent for each tour ID returned, so we are talking about thousands of trips. The exact size of the data is unknown for now.
I recommend using my Swiss Cache library. It supports file-based caching and APC, and even an image cache (WIP). If the amount of data is big, my library supports Gzip compression, enabled by default, so large amounts of data are automatically compressed.
Example code using my library:
// Your API request function...
require_once("swiss_cache.php");

$cache = new swiss_cache();
$data = $cache->get("gAdventures_response");
if (!$data)
{
    // Cache miss: call the service and store the result.
    $data = someFunctionThatCallsTheService();
    $cache->set("gAdventures_response", $data, 3600); // Save data for one hour.
}
Just make sure you set the cache path in swiss_cache.php! :) If you have any issues or questions about my library, just send me a message! The documentation is being worked on.
https://github.com/ajm113/swiss_cache
I have a client that needs to retrieve information from various tables in a MySQL database on a web server. I currently have one AJAX query set up on the client that posts a request to a PHP page, which queries the database and returns a JSON object. The client then iterates through the resulting object and inserts the necessary data into a similar table in a local database. However, this query only contains information from one table, and I am going to need to populate multiple tables in the client database. Eventually there could be a large number of requests hitting the web server at one time while populating these databases. The possible solutions I've come up with are as follows:
1. Design multiple AJAX queries on the client that each post to the same PHP page, with separate handler queries depending on the type of post received, so that different JSON objects are returned for the client to iterate over and insert into the local database.
(Many AJAX -> 1 PHP)
2. Design many AJAX queries that each post to a different PHP page, with a single resulting JSON for each AJAX/PHP pair, to reduce the traffic on any single PHP page.
(Many AJAX -> Many PHP)
3. Design one large AJAX query that makes a single request to the PHP page and returns all the necessary information from the database, and have the client insert the different pieces it needs into the local database.
(1 AJAX -> 1 PHP)
Do any of these ideas seem better than the others?
I see flaws in all of them so I'm wondering if there is already an ideal solution to minimize the amount of work done on the client as well as reduce the traffic/maintenance on the server that someone may know of. Any help/criticism is appreciated.
Options 1 and 3 are not mutually exclusive: your server-side code can return small (or partial, depending on how you see it) data sets as in option 1, and at the same time reply to a different type of request with the full data set as in option 3. The client-side code decides what query to make depending on the information it has about the operation being performed. For example, it might use one query if the requested operation was "update one record" and another if the operation was "update all records".
Broadly speaking, that's what I would do, so I recommend leaving yourself the flexibility to choose the appropriate query type down the road.
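For illustration, a rough sketch of such a single endpoint (the action names and the fetchTable() helper are hypothetical, assuming a JSON response):

<?php
// sync.php - one endpoint, several request types; a sketch, not a full design.
header('Content-Type: application/json');

$action = isset($_POST['action']) ? $_POST['action'] : '';

switch ($action) {
    case 'one_table':
        // Partial data set: just the table the client asked for.
        $payload = fetchTable($_POST['table']); // hypothetical helper (should whitelist names)
        break;
    case 'all_tables':
        // Full data set, e.g. for the initial population of the local DB.
        $payload = array(
            'users'  => fetchTable('users'),    // placeholder table names
            'orders' => fetchTable('orders'),
        );
        break;
    default:
        http_response_code(400);
        $payload = array('error' => 'unknown action');
}

echo json_encode($payload);

The client picks the action based on the operation it is performing, so you keep both granular and bulk queries without multiplying PHP pages.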
I'm rewriting a data visualisation web tool that was written 3 years ago. Since then, browser JavaScript engines have become much faster, so I was thinking of transferring part of the work from the server to the client.
On the page, data is visualised in a table and in a map (or chart). They use the same data but in different ways, so the two algorithms that prepare the data for display are different.
Previously, on every user interaction with the data dropdown selectors (3 main + 2 sub depending on the 3 main), 3 AJAX requests were sent, with PHP doing all the work and sending back only the necessary data (HTML for the table, XML for the chart): a very tiny response, no performance issue, and the JavaScript did little more than append the response and chase change events.
So performance was OK, but at every single change of criteria the user had to wait for an AJAX response :/
Now my idea is to send back a JSON object in one AJAX request, only on every change of the combination of the 3 main criteria, and then have JavaScript populate the data in the table and the chart/map on AJAX success, and again on every change of the 2 sub-criteria.
My hesitation concerns the structure of the JSON sent by the server: the balance of the payload.
Indeed, if only one algorithm were needed to turn the raw data into the JSON structure used for display, I would have PHP process the data into that object, ready for JavaScript to use without any additional treatment; but there are two.
So
if I make PHP process the data to create 2 objects (one for the table, one for the chart), I will double the size of the JSON response and increase memory usage on the client side. I don't like this approach because the two objects contain the same data, just structured differently, and redundancy is evil, isn't it?
if I send the raw object and let JavaScript work out what to display and where, I'm giving a lot of work to the client, and this at every sub-criteria change (or I could create all the JSON objects once on AJAX success, so they are ready in case of a sub-criteria change?). Here I'm a little worried about users with old browsers/small amounts of RAM...
(The raw, untreated JSON object varies between 3 kB and 12 kB depending on the criteria: between 500 and 2000 records.)
I'm failing to spot the best approach...
So, for this job of turning a single set of raw data into multiple structured objects, would you have PHP (increasing the response size and sending redundant data) or JavaScript (increasing the client-side workload) process the raw data?
Thanks a ton for your opinion
I found an appropriate solution, so I will answer my own question.
I have followed #Daverandom's advice:
PHP sends the raw data (along with a couple of parameters that depend on the combination of the main criteria)
JavaScript processes the raw data and renders it in the page
JavaScript reprocesses the raw data if the sub-criteria are changed; upon testing, the looping process turned out to be very fast and doesn't freeze the browser at all, so there is no need to keep the structured objects in scope
Aggressive caching headers are sent with the JSON AJAX response (that data never changes: only new records are added every year) in case the user re-consults data that has already been consulted, so raw data is not kept in the JavaScript scope when it is not being displayed
On top of that, the JSON strings echoed by PHP are cached on the server (again, because the data never changes), which reduces database queries and improves response time
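For reference, the caching part of the PHP side boils down to something like this (a sketch: buildJsonFromDatabase() and the cache path stand in for my actual code):

<?php
// data.php - sketch of the cached JSON endpoint described above.
$cacheFile = "cache/data_" . md5($_GET['criteria']) . ".json";

// Server-side cache: reuse the JSON string if it was already built.
if (!file_exists($cacheFile)) {
    $json = buildJsonFromDatabase($_GET['criteria']); // hypothetical helper
    file_put_contents($cacheFile, $json);
} else {
    $json = file_get_contents($cacheFile);
}

// Aggressive client-side caching: this data only changes once a year.
header('Content-Type: application/json');
header('Cache-Control: public, max-age=31536000');
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + 31536000) . ' GMT');

echo $json;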
The final code is neat, easy to maintain, and the application works flawlessly.
Thanks to #Daverandom for the help.
Disclaimer: I'm familiar with PHP, MySQL, jQuery, AJAX, but am by no means an expert in any of them.
I'm working on a web application that checks for updates to a MySQL database every two seconds. Currently there are 5 tables, and for the sake of discussion we can assume each has fewer than 50 rows. The design I inherited was to refresh 5 iframes every two seconds (roughly, each iframe corresponds to a table).
I've since attempted to improve upon this by replacing the iframes with divs, checking the UPDATE_TIME in INFORMATION_SCHEMA, and only updating the divs whose content has changed since the previously saved UPDATE_TIME. To do this, I use a jQuery AJAX call to get the new data from a PHP script. The problem with this strategy is that an external program updates the database asynchronously, so it can make multiple updates within a single second.
This question is very similar to other questions, except that the whole-second resolution provided by UPDATE_TIME is not enough in my case if I'm to base my updates solely on it:
Query to find tables modified in the last hour
Any solutions would be greatly appreciated!
I had a similar implementation. The tables I needed to fetch from had a CREATED_TIME column. This column can be filled with CURRENT_TIMESTAMP using the column's default value, or from the application.
Initially, once you load the contents into the div, keep a JavaScript variable for each div that stores CLIENT_MAX(CREATED_TIME). Each time you need to update the div with the latest rows, follow these steps (a sketch of the two server-side handlers follows the list):
Request the server with ajax: CheckIfTable1Updated?maxTime=CLIENT_MAX(CREATED_TIME).
The server fetches the SERVER_MAX(CREATED_TIME) value from Table1 and compares it with the value sent by the client.
If the max value in the table is greater than the value sent by the client, the response is the SERVER_MAX(CREATED_TIME); otherwise it is 0.
If the client receives 0, do nothing.
If the client receives a value greater than zero, i.e. the SERVER_MAX(CREATED_TIME), call the server with ajax: RetrieveTable1Updates?fromTime=CLIENT_MAX(CREATED_TIME)&toTime=SERVER_MAX_TIME_JUST_RECEIVED.
The server handles this and fetches the rows with the constraint BETWEEN CLIENT_MAX_TIME AND SERVER_MAX_TIME_ACCORDING_TO_CLIENT.
The server sends the HTML elements.
The client receives the data and appends it to the corresponding div.
Set CLIENT_MAX(CREATED_TIME) to the SERVER_MAX(CREATED_TIME) received in the first ajax call.
Note: this can be handled with the row_id too, which would be much easier than timestamp handling, as BETWEEN would need CLIENT_MAX_TIME + 1 to avoid fetching the boundary row twice.
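A sketch of the two handlers described above (connection details and markup are placeholders; it uses > / <= instead of BETWEEN to sidestep the duplication mentioned in the note):

<?php
// CheckIfTable1Updated.php - steps 1-3: compare the max times.
$pdo = new PDO("mysql:host=localhost;dbname=mydb;charset=utf8", "user", "pass");

$clientMax = $_GET['maxTime'];
$serverMax = $pdo->query("SELECT MAX(CREATED_TIME) FROM Table1")->fetchColumn();

// Reply with the newer SERVER_MAX(CREATED_TIME), or 0 if nothing changed.
echo ($serverMax > $clientMax) ? $serverMax : 0;

<?php
// RetrieveTable1Updates.php - steps 6-8: return only the new rows as HTML.
$pdo  = new PDO("mysql:host=localhost;dbname=mydb;charset=utf8", "user", "pass");
$stmt = $pdo->prepare(
    "SELECT * FROM Table1
      WHERE CREATED_TIME > :fromTime AND CREATED_TIME <= :toTime"
);
$stmt->execute([':fromTime' => $_GET['fromTime'], ':toTime' => $_GET['toTime']]);

// One div per row; the client appends these to the corresponding div.
foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $row) {
    echo "<div>" . htmlspecialchars(implode(' | ', $row)) . "</div>";
}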
Use HTML5 WebSockets along with jQuery to get the maximum benefit.
I have an area that gets populated with about 100 records from a PHP/MySQL query.
I need to add AJAX pagination, meaning it loads 10 records first, then on a user click on, let's say, a "+" character, it populates the next 20.
So instead of displaying everything at once, the area will display 20 records, then on click the next 20, and so on, with no refresh.
Should I dump the 100 records into a session variable?
Should I run the MySQL query every time the user clicks the next-page trigger?
I'm unsure what the best approach is, or how to go about this.
My concern is that the database will grow from 100 to 10,000 records.
Any help or direction is greatly appreciated.
If you have a large record set that will be viewed often (but not often updated), look at APC to cache the data and share it among sessions. You can also create a static file that is rewritten when the data changes.
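For example, with APC the read-through cache is only a few lines (a sketch: the key, TTL, and loadRecordsFromDatabase() are placeholders):

<?php
// Sketch of APC caching for a rarely-updated record set.
$records = apc_fetch('records_cache', $hit);

if (!$hit) {
    // Cache miss: query the DB once and share the result among sessions.
    $records = loadRecordsFromDatabase(); // hypothetical query function
    apc_store('records_cache', $records, 600); // cache for 10 minutes
}

echo json_encode($records);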
If the data needs to be sorted/manipulated on the page, you will want to limit the number of records loaded to keep the JavaScript from running too long. ExtJS has some nice widgets that do this; just provide them with JSON data (use PHP's json_encode() on your record set). We made one talk to Oracle and do the 20-record paging fairly easily.
If your large record set is frequently updated and the data must be "real time" accurate, you have some significant challenges ahead. I would look at Comet, APE, or Web Workers for a polling/push solution, and build your API to only deal in updates to the "core" data, which again would probably be cached on the server rather than pulled from the DB every time.
Your AJAX call should hit a page that pulls only the exact number of rows it needs from the database: select the first 20 rows for the first page, and so on. Your AJAX call can take a parameter called pagenum, and based on it you decide which records to actually pull from the database. There is no need for session variables.
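A minimal sketch of that page (the records table and credentials are placeholders, assuming mysqli with mysqlnd available):

<?php
// records.php - returns exactly one page of rows as JSON; a sketch only.
$db = new mysqli("localhost", "user", "pass", "mydb"); // placeholder credentials

$perPage = 20;
$pagenum = isset($_GET['pagenum']) ? max(0, (int)$_GET['pagenum']) : 0;
$offset  = $pagenum * $perPage;

// Pull only the requested page; the database does the slicing.
$stmt = $db->prepare("SELECT * FROM records ORDER BY id LIMIT ? OFFSET ?");
$stmt->bind_param("ii", $perPage, $offset);
$stmt->execute();

echo json_encode($stmt->get_result()->fetch_all(MYSQLI_ASSOC));

This scales the same whether the table holds 100 rows or 10,000, since only one page crosses the wire per click.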