Situation
I am working on a project (CodeIgniter - PHP, MySQL) that sources data from an Ad_API and displays those listings on my site. The API talks JSON, with a fair amount of data (~1 kB) about each single entity. I show around 20-30 such entities per page, so that's what I request from the server (roughly 20 kB of data). The server returns the data in a random order, and data cannot be requested again for a single entity by supplying an identifier.
Problem
I now have to show more results (200+) with pagination. If it were a MySQL database I was querying, things would be smooth as butter, but here I can't.
My Thoughts on Possible Solutions
jQuery pagination: Yes, that is an option, but again, I would have to load all 200 data entities into the user's browser at once and then paginate them with jQuery.
So does anyone have a better solution? Please read the situation carefully before answering, because this scenario is quite different from the ones we come across in daily life.
How about storing the API data in a MySQL table temporarily, just for the pagination purpose?
You may also be interested in MongoDB: store the JSON data in a MongoDB collection and paginate from it.
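A minimal sketch of the MySQL route, assuming you add a small cache table (the table name `api_listings` with an AUTO_INCREMENT id, its columns and the API URL are all placeholders here): pull the whole API result once, stash it, then paginate with LIMIT/OFFSET.

```php
<?php
// Sketch: cache one API pull in a MySQL table, then paginate it with LIMIT/OFFSET.
// Table name, columns and the API URL are placeholders.
$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8', 'user', 'pass');

// 1. Fetch the full result set from the ad API once.
$json     = file_get_contents('https://ads.example.com/api/listings');
$listings = json_decode($json, true);

// 2. Refresh the cache table.
$pdo->exec('TRUNCATE TABLE api_listings');
$insert = $pdo->prepare('INSERT INTO api_listings (title, price, payload) VALUES (?, ?, ?)');
foreach ($listings as $item) {
    $insert->execute(array($item['title'], $item['price'], json_encode($item)));
}

// 3. Paginate from MySQL as usual.
$perPage = 20;
$page    = isset($_GET['page']) ? max(1, (int) $_GET['page']) : 1;
$offset  = ($page - 1) * $perPage;

$rows = $pdo->query("SELECT * FROM api_listings ORDER BY id LIMIT $perPage OFFSET $offset")
            ->fetchAll(PDO::FETCH_ASSOC);
```

Since the API hands the data back in a random order with no per-entity lookup, a local copy like this also gives you a stable ordering across pages.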
Related
Right now I am learning to write an Android API in PHP for some CRUD operations.
I am able to fetch data from the database without any issue.
Now I want to fetch 200, 500 and 600 records from three different tables.
In the end I am going to show the records in the UI, grouped at their appropriate positions.
My question is: should I write multiple PHP files for fetching the records from each table and send them back to the user separately (obviously I would have to call the API one after the other from the app, after getting the response to each previous call),
OR
only one PHP file where I fetch all the records from the three tables and send them back to the user in one shot?
I am returning the data in json format.
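For reference, the one-shot version I have in mind is roughly this (connection details and table names are just placeholders):

```php
<?php
// Sketch of the "one shot" endpoint: fetch from the three tables and return
// everything as a single JSON payload. Connection details and table names
// (table_one/two/three) are placeholders.
header('Content-Type: application/json');

$mysqli = new mysqli('localhost', 'user', 'pass', 'mydb');

function fetchAll($mysqli, $sql) {
    $rows = array();
    $result = $mysqli->query($sql);
    while ($row = $result->fetch_assoc()) {
        $rows[] = $row;
    }
    return $rows;
}

echo json_encode(array(
    'table_one'   => fetchAll($mysqli, 'SELECT * FROM table_one'),   // ~200 records
    'table_two'   => fetchAll($mysqli, 'SELECT * FROM table_two'),   // ~500 records
    'table_three' => fetchAll($mysqli, 'SELECT * FROM table_three'), // ~600 records
));
```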
Please help me figure out which of the above methods I should use, and the advantages or disadvantages of each, if any.
I am working on this project that I can't seem to get right.
Basically the application makes a request to my PHP server and displays the data in a ListView.
Due to the large amount of data on my server, and the length of time it takes to get all of it into my ListView, I decided to implement an OnScrollListener on the Android side to determine whether the last item is visible and, if so, load more data. I achieved this by selecting all the IDs of the data I want to load when the initial request is made. The IDs are then sorted based on my requirements (time created, points, etc.), after which the first five IDs are used to select the initial data, which is returned to the Android app along with the full list of IDs. Then, when the last item is visible, I send the next five IDs from the list to a PHP function, which returns the data corresponding to those five IDs.
So far this approach works, but it is still unsatisfactory because of the large amount of data that needs to be processed during the initial request.
I need help with an alternative technique to achieve my objective with minimal delay during the initial request and subsequent requests.
Any help will be much appreciated.
From what I read in your question, you are loading all the data at the initial request?
I suggest you do the pagination on your server side so you can minimise the amount of data transferred, and request the next portion/page of data only when you actually need it (in this case you can trigger that from your OnScrollListener). There is a rough sketch after the links below.
For more details about pagination:
- http://www.phpfreaks.com/tutorial/basic-pagination
- http://www.phpeasystep.com/phptu/29.html
- http://code.tutsplus.com/tutorials/how-to-paginate-data-with-php--net-2928
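As a rough idea (connection details, table and column names are placeholders), a paginated endpoint for the app could look like this: the client sends the page and limit it wants, and only that slice is queried and returned.

```php
<?php
// Sketch of a paginated endpoint: the client sends ?page=N&limit=M and only
// that slice of rows is queried and returned.
// Connection details, table and column names are placeholders.
header('Content-Type: application/json');

$mysqli = new mysqli('localhost', 'user', 'pass', 'mydb');

$page   = isset($_GET['page'])  ? max(1, (int) $_GET['page'])  : 1;
$limit  = isset($_GET['limit']) ? max(1, (int) $_GET['limit']) : 5;
$offset = ($page - 1) * $limit;

// Sort on the server (e.g. newest first) so every page is consistent.
$result = $mysqli->query(
    "SELECT id, title, points, created_at
     FROM   items
     ORDER BY created_at DESC
     LIMIT  $limit OFFSET $offset"
);

$rows = array();
while ($row = $result->fetch_assoc()) {
    $rows[] = $row;
}

echo json_encode(array(
    'page'  => $page,
    'items' => $rows,
));
```

With this in place there is no need to compute and ship the full ID list on the initial request; the OnScrollListener just asks for the next page number.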
I have a table in FileMaker that has about 1 million+ rows and growing. It has about 30 columns. I need to display this in DataTables on my PHP page. My research online says FileMaker to PHP is super slow, so I am trying to get the data into a MongoDB collection and then feed it to DataTables.
Just wanted to know if that's a good architectural decision?
If yes, is there a good way to get the data from FileMaker to MongoDB?
If you are OK with not having "live access" to the data in FileMaker, then I'd periodically export the entire dataset and import it into MongoDB. Unfortunately, that would only give you read-only access to the data and is a very poor, half-baked solution.
If you want to keep FileMaker as your primary data storage, I'd work on making it play better with PHP rather than work around it by introducing another piece into your infrastructure.
Displaying one million rows on a webpage is going to be slow no matter what the backend is. Do you want to do batch fetching? Infinite scroll? You can fetch batches straight from FileMaker, 500 at a time, and it should perform all right.
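If I recall the FileMaker API for PHP correctly, fetching one batch of 500 records at a time would look roughly like this (database, layout, credentials and field names are placeholders):

```php
<?php
// Sketch: batch-fetch from FileMaker with the FileMaker API for PHP.
// Database, layout, credentials and field names are placeholders.
require_once 'FileMaker.php';

$fm = new FileMaker('MyDatabase', 'fm-host.example.com', 'webuser', 'secret');

$batchSize = 500;
$page      = isset($_GET['page']) ? max(1, (int) $_GET['page']) : 1;
$skip      = ($page - 1) * $batchSize;

$cmd = $fm->newFindAllCommand('MyLayout');
$cmd->setRange($skip, $batchSize);          // skip N records, return at most 500

$result = $cmd->execute();
if (FileMaker::isError($result)) {
    die('FileMaker error: ' . $result->getMessage());
}

$rows = array();
foreach ($result->getRecords() as $record) {
    $rows[] = array(
        'name'  => $record->getField('Name'),
        'email' => $record->getField('Email'),
    );
}

header('Content-Type: application/json');
echo json_encode($rows);
```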
We've been prototyping a search results system for a MySQL database with about 2 million names and addresses and 3 million associated subscription and conference attendance records.
At the moment the search is executed and all results are returned; for each result I then execute a second query to look up subscriptions/conferences by the person's unique ID. I've got indexes on all the important columns, and the individual queries execute quite quickly in phpMyAdmin (0.0xxx seconds), but feed this into a webpage for display (PHP, paged using DataTables) and the page takes seconds to render. We've tried porting the data to Lucene and it's like LIGHTNING, but the bottleneck still seems to be displaying the results rather than retrieving them.
I guess this is due to the overhead of building, serving and rendering the page in the browser. I think I can remove the subquery I mention above by using GROUP_CONCAT to get the subscription codes in the original query, but how can I speed up the display of the page with the results on it?
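For reference, the GROUP_CONCAT change I have in mind is roughly this, with table and column names simplified:

```php
<?php
// Sketch: fold the per-person subscription lookup into the main search query
// using GROUP_CONCAT, and only fetch one page of results at a time.
// Table and column names are simplified placeholders.
$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8', 'user', 'pass');

$stmt = $pdo->prepare("
    SELECT p.id,
           p.name,
           p.address,
           GROUP_CONCAT(DISTINCT s.subscription_code) AS subscription_codes
    FROM   people p
    LEFT JOIN subscriptions s ON s.person_id = p.id
    WHERE  p.name LIKE :search
    GROUP BY p.id
    LIMIT  50");
$stmt->execute(array(':search' => $_GET['q'] . '%'));
$results = $stmt->fetchAll(PDO::FETCH_ASSOC);
```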
I'm thinking that little-and-often querying with AJAX / server-side paging might be the way to go here (maybe get 50 results at a time; the query is smaller, the page is smaller and can be served more quickly), but I welcome any suggestions you might have.
Even if you are using pagination with DataTables, all the results are loaded into the page source at first unless you use its server-side processing feature.
Loading 2 million rows at once will always render slowly. You have to go for server-side pagination; it can be done via AJAX or via a normal PHP script.
You can also consider using a cache system to speed up loading the data from the server and to avoid calling the database when it is not needed. If your data can change over time, you can always use a function to check whether the data has changed since it was last cached and, if so, update the cached copy.
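As a rough illustration of the cache idea (file-based, with the path, lifetime and the loading helper chosen arbitrarily):

```php
<?php
// Sketch: a simple file cache with a time-to-live, so the database is only
// hit when the cached copy is missing or older than the TTL.
// Cache path, TTL and the loadResultsFromDatabase() helper are placeholders.
function getResults($cacheFile = '/tmp/results_cache.json', $ttl = 300) {
    if (file_exists($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
        // Cached copy is still fresh: no database call needed.
        return json_decode(file_get_contents($cacheFile), true);
    }

    $results = loadResultsFromDatabase();   // hypothetical: your existing query code
    file_put_contents($cacheFile, json_encode($results));
    return $results;
}
```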
OK, so I am new to turning a complex database structure into a page. I have a basic people table with a few categories: students, teachers, parents and mods. There are again separate tables for parents, students and teachers/mods. It's basically a school's website.
Now take, for example, a profile page where a student's info is shown to parents: info like who the teachers are, subjects, attendance, homework, etc. This will query a lot of tables. So what is my best bet here? I plan to do it in a web-app way. I was thinking maybe I can send JSON data to the page with AJAX and let JavaScript do the heavy lifting of the calculations. That way each table would only be queried once.
So is it even OK to do so? Or will I face some hidden problems once I have dug my feet too deep into it? Maybe I can go one level deeper and build a huge JSON with the entire database being sent to the user, which is then cached in the browser. I don't have to support really old browsers :)
Also, things like attendance and marks/results need to be calculated every time. Is there any way to cache the result per student? Like when a parent views a student's result, it is calculated once and then cached on the server for x number of days somehow.
EDIT:
Another, non-JSON approach I can think of is the traditional way: I do everything on the server side with PHP. This means I won't have to worry about browser compatibility. Maybe I can make a call to all the needed tables at the beginning and store the results in an array. That way each table only gets queried once, too.
EDIT 2
I read somewhere about the Battlefield 3 Battlelog website. It is also made the same way: all the data is pulled from the server as JSON and then calculated on the client side. If that helps put my idea in perspective.
This is more a lengthy comment than an answer, mostly to clear up some misconceptions:
Your database has tables and fields. That's perfectly valid. It's the job of a database server to store the data and handle the querying for you. Your own code will never be better at joining tables than the database's own code, because the database server is optimized for that task.
So I think the idea of querying all the data and pushing it into your web app via JSON is just a bad idea. Instead, contact your server when you need specific data; the server builds the related SQL query, fires it at the database server, gets the result back, converts the result into JSON, and sends it back to the browser.
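For instance, a small endpoint for one piece of the profile page might look like this (table, column and parameter names are made up for illustration):

```php
<?php
// Sketch: the server builds the query for one specific piece of data
// (here: a student's teachers and subjects) and returns it as JSON.
// Table, column and parameter names are made up for illustration.
header('Content-Type: application/json');

$pdo = new PDO('mysql:host=localhost;dbname=school;charset=utf8', 'user', 'pass');

$studentId = (int) $_GET['student_id'];

$stmt = $pdo->prepare("
    SELECT su.name AS subject,
           t.name  AS teacher
    FROM   student_subjects ss
    JOIN   subjects su ON su.id = ss.subject_id
    JOIN   teachers t  ON t.id  = ss.teacher_id
    WHERE  ss.student_id = :sid");
$stmt->execute(array(':sid' => $studentId));

echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));
```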
AJAX (Asynchronous JavaScript and XML) just allows you to fetch data on the fly after the main portion of the page is painted. All things being equal, it might be easier for you to design it as a standard (fetch data, then paint the page) layout. If you design the application correctly, the calls for the individual components (attendance, teachers, grades, etc.) can be gathered pre-page-render or post-page-render. There are obvious security concerns (URL hacking) with going the AJAX route, and I personally design non-AJAX as it's less flaky when things start going weird.