SQL queries - select all or multiple small queries - PHP

I have been running into hosting issues lately, with CPU usage being too high, resulting in the host disabling the website (it's free hosting, I'm poor :P). I programmed the site myself in PHP and it is a blog-style site. The host has suggested trying to reduce SQL calls. For the home page of my site there are 3 SQL queries made to the same table to obtain 7 specific rows each time. The table is for blog posts, so as of now it contains around 100 posts, but it needs to eventually handle a lot more.
Is it more efficient to do a SELECT * when the page loads and then use PHP to find the specific rows I want, or are 3 small SQL queries more efficient?

You shouldn't use * to return all columns in a table. You should only extract the data you need. Even if you require every field today, your tables will inevitably change.
If you want a certain set of rows for use throughout a page, instead of querying over and over again, could you save the output in an array (or other structure) and reference that each time, rather than grabbing the data in 3 separate calls? (See the sketch below.)
Hope I'm understanding your question.
Also, try this link to read about db call speeds: http://coding.smashingmagazine.com/2011/03/23/speeding-up-your-websites-database/
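To illustrate the array idea above, here is a minimal sketch. It assumes a PDO connection and a posts table; the column names are illustrative, not from the original site. One query fetches enough rows for all three sections, and PHP slices the result:
<?php
// One query instead of three; the column list is explicit rather than SELECT *.
$stmt = $pdo->query(
    'SELECT id, title, summary, category, posted_at
       FROM posts
      ORDER BY posted_at DESC
      LIMIT 21'
);
$posts = $stmt->fetchAll(PDO::FETCH_ASSOC);

// Reuse the array throughout the page instead of re-querying,
// e.g. pick out the 7 rows each section needs in PHP.
$news = array_filter($posts, function ($post) {
    return $post['category'] === 'news';
});
?>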

Related

PHP: Simultaneously Query Multiple Databases To Improve Load Time

All, I have a web page that loads data from 4-6 tables in 4 Oracle databases, for approximately 10 schemas each, on demand. What I have set up is a require at the top of the page, such as:
<?php
require_once('src/connects_client1.php');
require_once('src/connects_client2.php');
require_once('src/connects_client3.php');
require_once('src/connects_client4.php');
?>
Each of those files contains a connect string such as:
$connClient = oci_pconnect("Login", "Password", "//database:1521");
And a series of queries and executes such as:
$dailystatusClientSCHEMA = oci_parse($connClient, 'select * from OPS$SCHEMA.DAILY_STATUS order by table_name');
oci_execute($dailystatusClientSCHEMA);
This feeds a jQuery-based tree view with a series of unordered lists and a PHP foreach loop that prints each row of data appropriately for the query results.
Some of the result sets contain upwards of a thousand rows of data. This works, and displays the data in a usable format.
The problem is it takes 1 minute 52 seconds to load. The end-user for this in the company thinks this is too long, and frankly I think it is too long also. I am not a web developer, but it was most logical for this project to fall within our team so here I am.
Are there any ways that I can force all of the queries to run in a more simultaneous manner so that the data can be returned more quickly? This is a VERY fast Exadata environment, so I don't believe there is any bottleneck at the DB, and the web server has a light load at any given time, so I don't think there's a limit there - just in how I've built this page.
I really appreciate any solution anyone may provide.
Thanks.
There's a bottleneck on SELECT *; try changing that to the fields you are actually interested in.
The RDBMS has to work out what fields to return, what their names and datatypes are, and how to allocate memory. By using named fields, the RDBMS does not have to work out what fields are present in the tables you are querying.
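As a sketch of that change (the column names here are assumptions; the real DAILY_STATUS columns aren't shown in the question):
<?php
// Name only the columns the tree view actually displays.
$sql = 'select table_name, status, last_updated
          from OPS$SCHEMA.DAILY_STATUS
         order by table_name';
$stmt = oci_parse($connClient, $sql);
oci_execute($stmt);
while (($row = oci_fetch_assoc($stmt)) !== false) {
    // print the row into the unordered list here
}
?>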

Problems due to a large number of SQL queries in a PHP page

I am working on a project where I need to put a large number of SQL queries on a single page.
My question is: will I have any problems in the future if my site gets heavy traffic?
I do not want my site to slow down.
Please suggest some way so that the number of queries does not affect my site's performance.
I am working in PHP.
An SQL query may look like:
$selectcomments=mysql_query("select `comment`,`email`,`Date` from `fk_views` where (`onid`='$idselect_forcomments' and comment !='') order by Date asc");
Of course, if your site gets bigger, you will have problems putting everything on one page. That's only logical, and you can't change it.
Different solutions:
Pagination: you could create a pagination system (plenty of tutorials out there... http://net.tutsplus.com/tutorials/php/how-to-paginate-data-with-php/)
If it's possible, divide your pages. Don't put all the comments on one and only one page. Try to have different pages with different types of data, so the load is divided.
It's obvious that if your database gets too big, it'll be impossible to simply dump all the data on one page. Even the quickest browsers would crash.
One thing you can do is use Memcached. It will store those results in a cache, so the next visitor who clicks on the same page reads the cached objects instead of making SQL run the query again. (A sketch follows below.)
Another trick: ORDER BY `Date` ASC can really slow down the query on a huge result set if it forces a full table scan; in that case it can be faster to sort on the PHP side (or, better, index the column).
Also, as Yannik said: pagination (this is basic, of course) and dividing pages.
You can reduce the delay pagination adds by pre-executing SQL with Ajax: for example, fetch the total result count for the pager in advance.
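Here is a minimal sketch of the Memcached idea above, assuming the memcached PHP extension and a server on localhost; the key name, TTL and helper function are illustrative:
<?php
$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);

$key = 'comments_' . $idselect_forcomments; // one cache entry per post id

$comments = $mc->get($key);
if ($comments === false) {
    // Cache miss: run the query once, then keep the rows for 5 minutes
    // so later visitors read from the cache instead of hitting MySQL.
    $comments = fetchCommentsFromDb($idselect_forcomments); // hypothetical helper
    $mc->set($key, $comments, 300);
}
?>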
Yes, obviously if you have a lot of queries on a single page, then even moderate traffic can flood your database with queries.
A few tips:
1) Work on your database structure: how you have created tables, which table stores what, normalization, etc. Try to optimise storage and retrieval of information so that a single query fetches the maximum information. This will reduce the calls to the database. (A prepared, paginated query is sketched below.)
2) Never store and fetch redundant info (like age, which you can calculate from DOB) from the database.
3) Pagination (as pointed out earlier).
4) Caching.
5) If you are updating small portions of your page at a time, then instead of reloading the entire page, use AJAX to update just the necessary portions. It will also increase interactivity.
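Combining tips 1 and 3, here is a sketch of the fk_views query from the question, rewritten with PDO so the id is bound instead of interpolated and only one page of rows is fetched. The connection details and page size are illustrative:
<?php
$pdo = new PDO('mysql:host=localhost;dbname=mysite;charset=utf8', 'user', 'pass');

$perPage = 20;
$page    = max(1, (int) ($_GET['page'] ?? 1));
$offset  = ($page - 1) * $perPage;

// One prepared query per page of comments, with the id bound safely.
$stmt = $pdo->prepare(
    "SELECT `comment`, `email`, `Date`
       FROM `fk_views`
      WHERE `onid` = :onid AND `comment` != ''
      ORDER BY `Date` ASC
      LIMIT :limit OFFSET :offset"
);
$stmt->bindValue(':onid', $idselect_forcomments);
$stmt->bindValue(':limit', $perPage, PDO::PARAM_INT);
$stmt->bindValue(':offset', $offset, PDO::PARAM_INT);
$stmt->execute();
$comments = $stmt->fetchAll(PDO::FETCH_ASSOC);
?>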

Best way to speed up search result display

We've been prototyping a search results system for a mySQL database with about 2 million names and addresses and 3 million associated subscription and conference attendance records.
At the moment the search is executed and all results are returned; for each result I then execute a second query to look up subscriptions / conferences for the person's unique ID. I've got indexes on all the important columns and the individual queries execute quite quickly in phpMyAdmin (0.0xxx seconds), but feed this into a webpage to display (PHP, paged using DataTables) and the page takes seconds to render. We've tried porting the data to a Lucene index and it's like LIGHTNING, but the bottleneck still seems to be displaying the results rather than retrieving them.
I guess this is due to the overhead of building, serving and rendering the page in the browser. I think I can remove the subquery I mention above by using GROUP_CONCAT to get the subscription codes in the original query (sketched below), but how can I speed up the display of the page with the results on it?
I'm thinking little-and-often querying with AJAX / server-side paging might be the way to go here (maybe get 50 results: the query is smaller, the page is smaller and can be served more quickly), but I welcome any suggestions you guys might have.
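That GROUP_CONCAT idea might look roughly like this; the table and column names are assumptions, since the real schema isn't shown:
SELECT p.id, p.name, p.address,
       GROUP_CONCAT(s.sub_code ORDER BY s.sub_code SEPARATOR ',') AS sub_codes
  FROM people p
  LEFT JOIN subscriptions s ON s.person_id = p.id
 WHERE p.name LIKE 'smith%'
 GROUP BY p.id, p.name, p.address
 LIMIT 50;
One row now comes back per person, with the subscription codes folded into a single comma-separated column, so the per-result lookup query disappears.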
Even if you are paginating with DataTables, all the results are loaded into the page source up front unless you use its true server-side processing feature.
Loading 2 million rows at once will always render slowly. You have to go for server-side pagination, whether via AJAX or a normal PHP script.
You can also consider using a cache system to speed up loading data from the server and to avoid calling the database when it isn't needed. If your data can change at random times, you can use a function to check whether the data has changed since you last cached it and, if so, refresh the cached copy (a sketch follows).
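A sketch of that freshness check, assuming an updated_at column and reusing a Memcached-style cache; all the names here are illustrative:
<?php
// Compare the newest change stamp against the one stored with the cache.
$stamp  = $pdo->query('SELECT MAX(updated_at) FROM people')->fetchColumn();
$cached = $mc->get('search_' . md5($term));

if ($cached === false || $cached['stamp'] !== $stamp) {
    $rows = runSearch($pdo, $term); // hypothetical helper running the real query
    $mc->set('search_' . md5($term), ['stamp' => $stamp, 'rows' => $rows], 600);
} else {
    $rows = $cached['rows'];
}
?>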

MySQL Query Questions (number of queries sent to the server)?

The website I am currently developing is handled by a variety of classes that I have written to keep the site running in tip-top shape.
The only thing I can think of right now is: how many questions (queries) should a user be sending per page load?
This page does include the following:
Announcements - 5 PER PAGE
USER LOGIN/VERIFICATION - EACH PAGE
VIDEO BLOG - CHECKS IF IT'S ENABLED OR OFF
8 queries on average are sent, at a maximum of 3 KB per page. Should I be worried, or encouraged to keep going? :)
It's not about how many queries; it's about smart queries. You could select all records in table A and then use that result to get matching records from table B, or you could use a single query with a JOIN and get the answer you're looking for (see the sketch below).
The query count is not that important; it's all about how the queries help you get the right answer and how they are executed by the database server. Use EXPLAIN to see how a query is executed and whether something could be optimized.
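For example (the table names are made up for illustration): instead of one query per user to fetch announcements, a single JOIN answers both at once, and prefixing it with EXPLAIN shows how the server executes it:
SELECT u.username, a.title, a.posted_at
  FROM users u
  JOIN announcements a ON a.user_id = u.id
 WHERE u.id = 42;

-- Same statement, prefixed with EXPLAIN to see the execution plan:
EXPLAIN SELECT u.username, a.title, a.posted_at
  FROM users u
  JOIN announcements a ON a.user_id = u.id
 WHERE u.id = 42;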
I know a site (a webshop) that performs quite well while executing a minimum of 400 queries per page and taking over 2000 orders a day (let alone the page views).
Although I admit it is running on quite a heavy server and could still use some optimization.
But no, 8 queries is fine, and you could do with some more if you need to.

Catalog entries: Updated html files or on-the-fly from database?

I've got a database site that will serve approximately 1,200 primary entries at launch, with the prospect of adding ~100 new entries per year. Each entry would be composed of ~20 basic values from the database, as well as a calculated average rating, and a variable amount of user comments.
The rating and comments output would have to be generated upon page request, but the rest of the data would be static unless an error correction had been made by an admin.
Considering bandwidth, database load, and download times: would it be better to generate the entire page on the fly with a few queries after a GET, or to keep HTML files, append the ratings & comments to them, and write a refresher script that updates all the HTML records when run?
In my mind the pros & cons are:
On-the-fly
+ saves hosting space by building pages only when needed
- leaves nothing for a search engine to find?
- slightly slower due to extra queries
HTML & appends
+ search engine friendly
+ slightly faster due to less queries
- uses disk space
- slightly more complex code requirements
Neutral
= templating would be the same either way
Thoughts?
Do whatever's easiest to code: you'll find that in practice all your pros and cons are actually the same for both options.
Personally, given that you will have to go to the database anyway to deliver the comments, I'd go for a completely on-the-fly generated page.
Creating a web page dynamically from 1,200 database records -- in fact, frankly, from 1,200,000 database records -- is well within the capabilities of MySQL and PHP on even a moderately specified shared host. There are plenty of examples of sites that use this combination with millions of records, so you won't find performance to be an issue for a long time!
And as it happens you'll probably not save hosting space either, as the database records take up space on the host in much the same way that static files do.
A search engine replicates what a user's browser does: it issues a good ol' HTTP GET request to the root of your site, then analyses each of the links and requests them in turn until the spider has fetched every page it can. So to make sure that a database-driven site is indexed by a search engine, provide plain HTML anchor links in your page to each record.
Something as simple as an A-Z list of entries with a grid underneath would do - for example, a site I'm working on at the moment, http://arkive.org, does just that.
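A sketch of such a crawlable list, assuming an entries table with id and name columns and a PDO connection (both assumptions, not from the original post):
<?php
// Every record gets a plain anchor a spider can follow.
echo '<ul>';
foreach ($pdo->query('SELECT id, name FROM entries ORDER BY name') as $row) {
    printf(
        '<li><a href="/entry.php?id=%d">%s</a></li>',
        $row['id'],
        htmlspecialchars($row['name'])
    );
}
echo '</ul>';
?>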
