How to structure database calls in a page? - php

OK, so I am new to wiring a complex database structure into a page. I have a basic people table with a few categories: students, teachers, parents and mods. There are further tables, one each for parents, students and teachers/mods. It's basically a school's website.
Now take, for example, a profile page where a student's info is shown to parents: who the teachers are, subjects, attendance, homework, etc. This will query a lot of tables. So what is my best bet here? I plan to do it in a web-app way. I was thinking maybe I can send JSON data to the page with AJAX and let JavaScript do the heavy lifting of calculation. Each table would only be queried once.
So is it even OK to do so, or will I face some hidden problems once I have dug myself in too deep? Maybe I can go one level further and build one huge JSON object of the entire database, send it to the user and cache it in the browser. I don't have to support really old browsers :)
Also, things like attendance and marks/results will need to be calculated every time. Is there any way to cache the result per student? For example, when a parent views a student's result, it is calculated once and then cached on the server for x number of days somehow.
EDIT:
Another, non-JSON approach I can think of is the traditional way: I do everything on the server side with PHP. This means I won't have to worry about browser compatibility. Maybe I can query all the needed tables at the beginning and store the results in an array. That way each table only gets queried once too.
EDIT 2
I read somewhere that the Battlefield 3 Battlelog website is made the same way: all the data is pulled from the server as JSON and then processed on the client side. If that helps put my idea in perspective.

Probably just to clear up some misconceptions; this is more a lengthy comment than an answer:
Your database has tables and fields. That's perfectly valid. It's the job of a database server to store the data and handle the querying for you. Your own code will never be better at joining tables than the database's code, because the database server is optimized for exactly that task.
So I think the idea of querying all the data and pushing it into your web app via JSON is just a bad idea. Instead, contact your server when you need specific data; the server builds the related SQL query, fires it at the database server, gets the result back, converts the result into JSON and sends it back to the browser.
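To illustrate that flow, here is a minimal sketch of such an endpoint; the PDO credentials and the table/column names (people, subjects, student_subjects) are made up for the example and not part of the question's schema:

<?php
// student_profile.php - minimal sketch; credentials and table names are assumptions
$pdo = new PDO('mysql:host=localhost;dbname=school;charset=utf8mb4', 'user', 'pass');
$studentId = (int) ($_GET['student_id'] ?? 0);

// let MySQL do the joining instead of shipping whole tables to the browser
$stmt = $pdo->prepare(
    'SELECT p.name AS teacher, s.title AS subject
       FROM student_subjects ss
       JOIN subjects s ON s.id = ss.subject_id
       JOIN people   p ON p.id = s.teacher_id
      WHERE ss.student_id = ?'
);
$stmt->execute([$studentId]);

header('Content-Type: application/json');
echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));

The page (or an AJAX call) then asks for exactly the block it needs, and the browser never has to hold more data than that.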

AJAX (Asynchronous JavaScript and XML) just allows you to fetch data on the fly after the main portion of the page is painted. All things being equal, it might be easier for you to design it as a standard (fetch data, then paint the page) layout. If you design the application correctly, the calls for the individual components (attendance, teachers, grades, etc.) can be gathered pre-page-render or post-page-render. There are obvious security concerns (URL hacking) in going the AJAX route, and I personally design non-AJAX as it's less flaky when things start going weird.

Related

Question about storing/retrieving data for a complex page

My stack is PHP and MySQL.
I am trying to design a page to display details of a mutual fund.
Data for a single fund is distributed over 15-20 different tables.
Currently, my front end is a brute-force PHP page that queries/joins these tables using 8 different queries for a single scheme. It's messy and performs poorly.
I am considering alternatives. Good thing is that the data changes only once a day, so I can do some preprocessing.
An option that I am considering is to run these queries for every fund (about 2000 funds), create a complex JSON object for each of them, store it in MySQL indexed by the fund code, retrieve the JSON at run time and show the data. I am thinking of using the simple JSON_OBJECT() MySQL function to create the JSON, and json_decode in PHP to get the values for display. Is this a good approach?
I was tempted to store them in a separate MongoDB store - would that be overkill for this?
Any other suggestions?
Thanks much!
To meet your objective of quick pageviews, your overnight-run approach is very good. You could generate JSON objects with your distilled data, or even prerendered HTML pages, and store them.
You can certainly store JSON objects in MySQL columns. If you don't need the database server to search the objects, simply use TEXT (or LONGTEXT) data types to store them.
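For example, the overnight run could boil down to something like this rough sketch; the fund_json table (fund_code key, payload LONGTEXT) and the buildFundData() helper are made-up names standing in for your existing per-fund queries:

<?php
// nightly_build.php - rough sketch; fund_json and buildFundData() are assumptions
$pdo  = new PDO('mysql:host=localhost;dbname=funds;charset=utf8mb4', 'user', 'pass');
$save = $pdo->prepare('INSERT INTO fund_json (fund_code, payload) VALUES (?, ?)
                       ON DUPLICATE KEY UPDATE payload = VALUES(payload)');

foreach ($pdo->query('SELECT fund_code FROM funds') as $row) {
    $data = buildFundData($pdo, $row['fund_code']);   // your existing per-fund queries
    $save->execute([$row['fund_code'], json_encode($data)]);
}

// at page-view time: one indexed lookup, then json_decode for display
$get = $pdo->prepare('SELECT payload FROM fund_json WHERE fund_code = ?');
$get->execute([$fundCode]);                           // $fundCode comes from the request
$fund = json_decode($get->fetchColumn(), true);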
To my way of thinking, adding a new type of server (MongoDB) to your operations just to store a few thousand JSON objects does not seem worth the trouble. If you find it necessary to search the contents of your JSON objects, however, another type of server might be useful.
Other things to consider:
Optimize your SQL queries. Read up: https://use-the-index-luke.com and other sources of good info. Consider your queries one-by-one starting with the slowest one. Use the EXPLAIN or even the EXPLAIN ANALYZE command to get your MySQL server to tell you how it plans each query. And judiciously add indexes. Using the query-optimization tag here on StackOverflow, you can get help. Many queries can be optimized by adding indexes to MySQL without changing anything in your php code or your data. So this can be an ongoing project rather than a big new software release.
Consider measuring your query times. You can do this with MySQL's slow query log. The point of this is to identify your "dirty dozen" slowest queries in a particular time period. Then, see step one.
Make your pages fill up progressively, to keep your users busy reading while you get the data they need. Put the top-level stuff (fund name, etc.) in server-side HTML so search engines can see it. Use some sort of front-end tech (React, maybe, or DataTables fetching data via AJAX) to render your pages client-side, and provide REST endpoints on your server that return the data, in JSON format, for each data block in the page.
In your overnight run create a sitemap file along with your JSON data rows. That lets you control exactly how you want search engines to present your data.

MySQL or JSON for data retrieval

So, I have a situation and I need a second opinion. I have a database and it's working great with all the foreign keys, indexes and so on, but when I reach a certain number of visitors, around 700-800 concurrent visitors, my server hits a bottleneck and displays "Service temporarily unavailable." So I had an idea: what if I pull data from JSON instead of the database? I mean, I would still update the database, but on each update I would regenerate a JSON file and pull data from it to show on my homepage. That way I would not press my CPU too hard and I would be able to have some kind of cache on the user end.
What you are describing is caching.
Yes, it's a common optimization to avoid over-burdening your database with query load.
The idea is you store a copy of data you had fetched from the database, and you hold it in some form that is quick to access on the application end. You could store it in RAM, or in a JSON file. Some people operate a Memcached or Redis in-memory database as a shared resource, so your app can run many processes or threads that access the same copy of data in RAM.
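A minimal sketch of that shared-cache pattern with the Memcached extension; the cache key and the loadPostsFromDatabase() helper are assumptions:

<?php
// sketch of the shared-cache pattern using the Memcached extension;
// the cache key and the loadPostsFromDatabase() helper are assumptions
$cache = new Memcached();
$cache->addServer('127.0.0.1', 11211);

$posts = $cache->get('homepage_posts');
if ($posts === false) {                              // cache miss: query the DB once
    $posts = loadPostsFromDatabase();
    $cache->set('homepage_posts', $posts, 300);      // keep the copy for 5 minutes
}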
It's typical that your app reads some given data many times for every single time it updates the data. The greater this ratio of reads to writes, the better the savings in terms of lightening the load on your database.
It can be tricky, however, to keep the data in cache in sync with the most recent changes in the database. In other words, how do all the cache copies know when they should re-fetch the data from the database?
There's an old joke about this:
There are only two hard things in Computer Science: cache invalidation and naming things.
— Phil Karlton
So after another few days of exploring and trying to get the right answer, this is what I have done. I decided to create another table instead of a JSON file, and put all the data that was supposed to go into the JSON file into that table.
WHY?
Reason number one is that MySQL can lock tables while they're being updated; a JSON file cannot.
Number two is that I go from a few dozen queries down to a single, very simple query: SELECT * FROM table (sketched below).
Number three is that I have better control over the content this way.
Number four: while I was searching for an answer I found out that some people had issues with JSON file availability when a lot of concurrent connections requested the same JSON; with a table I would never have that availability problem.
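Roughly, what that amounts to is the following sketch; the homepage_cache table and the rebuildHomepageRows() helper are placeholder names:

<?php
// rough sketch of the summary-table approach; names are placeholders
function refreshHomepageCache(PDO $pdo) {
    $rows = rebuildHomepageRows($pdo);          // the former "few dozen queries"
    $pdo->beginTransaction();                   // readers never see a half-built table
    $pdo->exec('DELETE FROM homepage_cache');
    $ins = $pdo->prepare('INSERT INTO homepage_cache (id, title, body) VALUES (?, ?, ?)');
    foreach ($rows as $r) {
        $ins->execute([$r['id'], $r['title'], $r['body']]);
    }
    $pdo->commit();
}

// the homepage then needs only this (assuming an existing $pdo connection):
$posts = $pdo->query('SELECT * FROM homepage_cache')->fetchAll(PDO::FETCH_ASSOC);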

Should I use the API every time I need data, or store all data in a database and access it whenever needed?

This is a design question.
I am creating a web application with a page on which I need to display some sections, but not all, of a Wikipedia page.
Now let's consider this scenario,
I need to display 5 sections of a wikipedia page on my website.
How do I do it?
I would make 5 calls for different sections as below
http://en.wikipedia.org/w/api.php?format=xml&action=parse&page=xyz&prop=text&section=2
Or I would have a database with all 5 sections stored already and just make a single call to get all of them. (It would need a lot of DB space to keep the data stored in my DB.)
Any other suggestions you might have?
I am looking for a solution which can pull data with high speed.
Thanks.
As far as speed is concerned, go for local database storage, but if storage space is the concern, then go for the API call. In my opinion, you should go for the local database...
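One way to populate that local database is to hit the API only the first time a section is requested and keep a copy from then on; a rough sketch, where the wiki_sections table and its columns are assumptions:

<?php
// rough sketch: serve from the local table, fall back to the API once;
// the wiki_sections table and its columns are assumptions
function getSection(PDO $pdo, $page, $section) {
    $stmt = $pdo->prepare('SELECT html FROM wiki_sections WHERE page = ? AND section = ?');
    $stmt->execute([$page, $section]);
    $html = $stmt->fetchColumn();
    if ($html !== false) {
        return $html;                                // already stored locally
    }
    // first request for this section: fetch it from the MediaWiki API and keep it
    $url = 'https://en.wikipedia.org/w/api.php?format=json&action=parse'
         . '&page=' . urlencode($page) . '&prop=text&section=' . (int) $section;
    $data = json_decode(file_get_contents($url), true);
    $html = $data['parse']['text']['*'];
    $pdo->prepare('INSERT INTO wiki_sections (page, section, html) VALUES (?, ?, ?)')
        ->execute([$page, $section, $html]);
    return $html;
}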

Getting all data once for future use

Well, this is kind of a question about how to design a website that uses fewer resources than normal websites, and is mobile-optimized as well.
Here it goes: say I display an overview of, e.g., 5 posts (from a blog, for example). If I then click on the first post, I load that post in a new window. But instead of connecting to the database again and fetching that specific post by its ID, I'd just look up the post (in PHP) in the array of 5 posts that I created earlier, when the site was fetched the first time.
Would that save data/bandwidth? PHP works server-side as well, so that's why I'm not sure.
Ok, I'll explain again:
Method 1:
The user connects to my website
5 posts are displayed & saved to an array (with all their data)
The user clicks on the first post and expects more information about it.
My program looks up the post in my array and displays it.
Method 2:
The user connects to my website
5 posts are displayed
The user clicks on the first post and expects more information about it.
My program connects to MySQL again and fetches the post from the server.
First off, this sounds like a case of premature optimization. I would not start caching anything outside of the database until measurements prove that it's a wise thing to do. Caching takes your focus away from the core task at hand, and introduces complexity.
If you do want to keep DB results in memory, just using an array allocated in a PHP-processed HTTP request will not be sufficient. Once the page is processed, memory allocated at that scope is no longer available.
You could certainly put the results in SESSION scope. The advantage of saving some DB results in the SESSION is that you avoid DB round trips. Disadvantages include the increased complexity of programming the solution, the use of memory in the web server for data that may never be accessed, and increased initial load on the DB to retrieve the extra pages that may or may not ever be requested by the user.
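A minimal sketch of the SESSION variant, assuming an existing $pdo connection and a made-up loadOverviewPosts() helper that runs the initial 5-post query:

<?php
// minimal sketch of keeping the already-fetched posts in SESSION scope;
// $pdo and the loadOverviewPosts() helper are assumptions
session_start();

if (!isset($_SESSION['overview_posts'])) {
    $_SESSION['overview_posts'] = loadOverviewPosts($pdo);   // one DB round trip
}

$postId = (int) ($_GET['id'] ?? 0);
$post = $_SESSION['overview_posts'][$postId] ?? null;        // reused on later requests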
If DB performance, after measurement, really is causing you to miss your performance objectives you can use a well-proven caching system such as memcached to keep frequently accessed data in the web server's (or dedicated cache server's) memory.
Final note: You say
PHP works server-side as well
That's not accurate. PHP works server-side only.
Have you thought about saving the posts in hidden divs and only making them visible when the user clicks somewhere? Here is how to do that.
Put some sort of cache between your code and the database.
So your code will look like
// pseudocode: check the cache first, fall back to the database on a miss
if (isPostInCache($postId)) {
    $post = loadPostFromCache($postId);
} else {
    $post = loadPostFromDatabase($postId);
    // optionally store it in the cache here for the next request
}
Go for some caching system; the web is full of them. You can use memcached or static caching you make yourself (e.g. save posts in .txt files on the server).
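The do-it-yourself static cache could be as simple as this rough sketch; the cache/ directory (which must exist and be writable), the 5-minute TTL and the loadPostFromDatabase() helper are assumptions:

<?php
// rough sketch of a static file cache; names and TTL are assumptions
function getCachedPost($postId, $ttl = 300) {
    $file = __DIR__ . "/cache/post_{$postId}.txt";
    if (is_file($file) && (time() - filemtime($file)) < $ttl) {
        return unserialize(file_get_contents($file));   // still fresh: use the file
    }
    $post = loadPostFromDatabase($postId);
    file_put_contents($file, serialize($post));
    return $post;
}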
To me, this is a little less efficient than making a second call to the database, and here is why.
The first query should only pull the fields you want, like title, author and date. The content of the post may make for a heavy query, so I'd exclude it (you can pull a teaser if you'd like).
Then, if the user wants the details of the post, I would query for the content using an indexed key column.
That way you're not pulling content for 5 posts that may never be seen.
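To make that concrete, a minimal sketch of the two-step approach; the posts table and its columns are assumptions:

<?php
// sketch of the two-step approach; the posts table and columns are assumptions
// overview: light fields only, the heavy content column is excluded
$list = $pdo->query('SELECT id, title, author, published_at
                       FROM posts ORDER BY published_at DESC LIMIT 5')
            ->fetchAll(PDO::FETCH_ASSOC);

// detail view: fetch the content only when it is actually requested,
// looked up by the indexed primary key
$stmt = $pdo->prepare('SELECT id, title, author, published_at, content
                         FROM posts WHERE id = ?');
$stmt->execute([(int) $_GET['id']]);
$post = $stmt->fetch(PDO::FETCH_ASSOC);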
If your PHP code is constantly re-connecting to the database you've configured it wrong and aren't using connection pooling properly. The execution time of a query should be a few milliseconds at most if you've got your stack properly tuned. Do not cache unless you absolutely have to.
What you're advocating here is side-stepping a serious problem. Database queries should be effortless provided your database is properly configured. Fix that issue and you won't need to go down the caching road.
Saving data from one request to the other is a broken design and if not done perfectly could lead to embarrassing data bleed situations where one user is seeing content intended for another. This is why caching is an option usually pursued after all other avenues have been exhausted.
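On the reconnection point, a minimal sketch of a persistent connection with PDO (credentials are placeholders):

<?php
// minimal sketch: reuse connections instead of reconnecting on every request
// (PDO persistent connections; mysqli has an equivalent with the "p:" host prefix)
$pdo = new PDO(
    'mysql:host=localhost;dbname=blog;charset=utf8mb4',
    'user',
    'pass',
    [PDO::ATTR_PERSISTENT => true]   // keep the connection alive across requests
);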

dealing with large amounts of data

I'm building the front end of a website that'll be holding data for users. The type of data is name, email, ethnicity, income, pets, etc. Each person can have a partner (with the same questions) and an unlimited number of children (names, DOB, gender, etc.). The user can sign up and must then be able to log in in the future to update their details if necessary.
The problem I'm having is that things are just getting really messy. It all starts with validation: I have loops to check how many children there are, then redisplay the form and set up validators if there is an error. Inserting all the data is easy enough, but my insert_user function has 30 parameters so far.
Everything's getting annoying and frustrating. Is there an established way to deal with data like this? I'm thinking Propel or Doctrine may help, and I've had a play with PEAR's HTML_QuickForm with limited success (it can't handle things like "select your ethnicity" with an input for "other", or an unlimited number of children).
I'm sure I'm not the first to have this trouble, so what do others do?
Have a look at Symfony; it will make your life a lot easier. Your data model here is pretty simple, but be prepared to learn how Symfony works.
http://www.symfony-project.org/
This is the simplest tutorial I know of: http://articles.sitepoint.com/print/symfony-beginners-tutorial. It should get you up and running in a couple of hours.
When I deal with something like this, I start by trying to come up with a good model to make it less complex.
For example, if you have server functions in PHP, then you can use AJAX calls in JavaScript to deal with the display. You have separated concerns, so you can focus on having each side do what it does best.
If you want to keep everything in PHP and just use form submission, then again, split the two parts so that the code that deals with display is separate from the API that deals with the database.
This is basically just an MVC structure.
The best way to start is to go back to your design phase, decide which languages you want to use, and separate the work.
Either way you end up writing an API to get to the database, and the controller code (which handles requests and displays the results) doesn't care what type of database is used, whether there even is a database, or any of those details; it wants the children for a particular person, so that request goes to the API and an array comes back.
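As a rough sketch of that separation (the children table and its columns are assumptions):

<?php
// data-access API; the controller never sees SQL or storage details
function getChildrenForPerson(PDO $pdo, $personId) {
    $stmt = $pdo->prepare('SELECT name, dob, gender FROM children WHERE parent_id = ?');
    $stmt->execute([$personId]);
    return $stmt->fetchAll(PDO::FETCH_ASSOC);    // the controller only ever sees this array
}

// controller side: no SQL, no knowledge of how or where the data is stored
$children = getChildrenForPerson($pdo, $userId);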
