I am running a website on my server. That website has a search facility.
When I search for a category named "apple", it shows 10 results and the search result page loads fine and fast. [Records are paginated, 10 per page.]
Now, when I search for a category named "orange", I get an internal server error (500). This is because the search tries to pull 300,000 records from the database. [The error appears after about 2 minutes.]
How can I resolve this issue? I have checked the queries and they look fine. I need the records to load faster, without internal server errors.
Is there any way to resolve this?
Please help! Thank you....
You can store this information in another table or field and update it whenever items are inserted or deleted,
or,
if you don't want to change your code, you can set up a cron job that updates it at a regular interval.
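For instance, if the suggestion is read as caching a per-category result count in a summary table, a rough sketch could look like this (the category_counts table, the addItem() helper, and the PDO connection $pdo are all hypothetical names, not taken from your code):

<?php
// Hypothetical sketch: maintain a per-category count in a summary table so the
// search page never has to count 300,000 rows on the fly.
// Assumes a PDO connection in $pdo and a table
// category_counts(category VARCHAR PRIMARY KEY, total INT UNSIGNED).
function addItem(PDO $pdo, string $category /*, ...item data... */): void
{
    $pdo->beginTransaction();

    // ... insert the item itself here ...

    // Bump the cached count for this category, creating the row if it is missing.
    $stmt = $pdo->prepare(
        'INSERT INTO category_counts (category, total) VALUES (:cat, 1)
         ON DUPLICATE KEY UPDATE total = total + 1'
    );
    $stmt->execute([':cat' => $category]);

    $pdo->commit();
}

A delete would do the mirror image (total = total - 1), and the cron-job variant would simply rebuild the summary table from a GROUP BY count at a fixed interval.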
Use LIMIT in your database queries to get only the number of rows you need. For example, this will select the first 10 rows:
SELECT * FROM `table` WHERE something = 'something' LIMIT 10
You can also use an offset; this will select 10 rows starting from the 6th row (the first 5 are skipped):
SELECT * FROM `table` WHERE something = 'something' LIMIT 5, 10
More information is available in the official MySQL documentation.
If you need the total number of rows, you can use COUNT:
SELECT COUNT(*) FROM `table` WHERE something = 'something'
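Putting the two together, a rough sketch of a paginated search in PHP might look like this (the items table, the category column, the q and page URL parameters, and the PDO connection $pdo are all assumed placeholder names):

<?php
// Sketch of paginated fetching: one page of rows plus the total row count.
$perPage = 10;
$page    = max(1, (int)($_GET['page'] ?? 1));
$offset  = ($page - 1) * $perPage;

// Fetch only the rows for the current page. The LIMIT values are integers
// produced in PHP, so they are interpolated directly instead of bound.
$sql  = sprintf('SELECT * FROM items WHERE category = :cat LIMIT %d, %d', $offset, $perPage);
$stmt = $pdo->prepare($sql);
$stmt->execute([':cat' => $_GET['q'] ?? '']);
$rows = $stmt->fetchAll();

// Separate COUNT(*) query to know how many pages to offer.
$count = $pdo->prepare('SELECT COUNT(*) FROM items WHERE category = :cat');
$count->execute([':cat' => $_GET['q'] ?? '']);
$totalPages = (int)ceil($count->fetchColumn() / $perPage);

This way the query only ever returns 10 rows, whether the category has 10 matches or 300,000.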
I'm using this code to generate an HTML table from a MySQL table. The table has 200k rows.
$view->ach = $db->Query("SELECT from_unixtime(`date`), `aav_id`, `aav_domain`, `aav_word`, `aav_referer`, `aav_ip`, `aav_country`
FROM aav_views
where aav_user_id=$USER_ID
ORDER BY date DESC
");
But it's not working; the web browser says:
"The page isn’t working
www.mysite.com is currently unable to handle this request.
HTTP ERROR 500 "
(not the usual "500 Internal Server Error" page)
I added a LIMIT to the SQL query, like this:
$view->ach = $db->Query("SELECT from_unixtime(`date`), `aav_id`, `aav_domain`, `aav_word`, `aav_referer`, `aav_ip`, `aav_country`
FROM aav_views
where aav_user_id=$USER_ID
ORDER BY date DESC
LIMIT 1000
");
Now it works fine, but I need to run it without the limit; I need to query all 200k rows.
The way to handle such a large result set from MySQL is to use something called pagination. With pagination, you might only retrieve records 100 or 1000 at a time. This eliminates the problem of crashing your web server or web page with too much information.
MySQL has two keywords which are well suited to handle this problem. The first one, LIMIT, you already know, and it controls how many total records appear in the result set. The second one is OFFSET, and it specifies the position in the result set from which to begin taking records.
To give you an example, if you wanted to return the second 100 records from your table, you would issue the following query:
SELECT from_unixtime(date), aav_id, aav_domain, aav_word, aav_referer, aav_ip, aav_country
FROM aav_views
where aav_user_id=$USER_ID
ORDER BY date DESC
LIMIT 100 OFFSET 100
Typically, the user controls the offset by paging through a UI containing the results from the query.
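For example, a sketch of how the page number from the URL might drive the OFFSET in your existing call (the ?page= parameter is an assumption; $db->Query() is the method from your question):

<?php
// Sketch: derive LIMIT/OFFSET from a ?page= parameter that defaults to 1.
$perPage = 100;
$page    = max(1, (int)($_GET['page'] ?? 1));
$offset  = ($page - 1) * $perPage;
$userId  = (int)$USER_ID;   // cast to int so no raw input is interpolated into the SQL

$view->ach = $db->Query("SELECT from_unixtime(`date`), `aav_id`, `aav_domain`, `aav_word`, `aav_referer`, `aav_ip`, `aav_country`
    FROM aav_views
    where aav_user_id=$userId
    ORDER BY date DESC
    LIMIT $perPage OFFSET $offset
");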
My server is running slowly because I am trying to fetch 200 records from a MySQL database (using PHP). They are posts that I need to display, and I know this is the cause, because fetching 1 record is fast while fetching 200 slows things down tremendously.
Is it a known problem that fetching too many entries causes a slowdown?
Your PHP code is probably doing something expensive in a loop for every record, so it runs 200 times, and that slows down the page response time. Fetching 200 records in MySQL is not a problem at all; it will run almost instantly if you run it in the MySQL terminal.
There are three possibilities that might be slowing your server down on your side:
Your database is not optimized. Optimizing your database can give you a tremendous performance increase
Your query is doing something wrong. We need to see what query you are running to get the 200 rows.
You are running an individual query for each row in a loop.
What I would suggest, though, is basing your query on something like this:
SELECT fields FROM table WHERE condition = required condition LIMIT 200
Also, if that query runs slowly, run an EXPLAIN on it to see what indexes it is using:
EXPLAIN SELECT fields FROM table WHERE condition = required condition LIMIT 200
Fetching 200 rows should only take milliseconds.
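To illustrate the third point above, here is a rough sketch of the difference (PDO and the posts table/columns are assumed placeholders, not your actual code):

<?php
// Slow: one query per row, executed 200 times inside a loop.
// foreach ($postIds as $id) {
//     $post = $pdo->query("SELECT * FROM posts WHERE id = " . (int)$id)->fetch();
//     // render $post ...
// }

// Fast: a single query that returns all 200 rows at once.
$stmt = $pdo->query('SELECT id, title, body FROM posts ORDER BY id DESC LIMIT 200');
foreach ($stmt as $post) {
    // render $post ...
}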
You can fetch as many records as your table can store.
For an unsigned INT, the largest value is 4,294,967,295.
For an unsigned BIGINT, the largest value is 18,446,744,073,709,551,615.
To access records quickly, you need to define a LIMIT in the query.
You should fetch only the records needed for display, no more, no less. Do not fetch records just to perform (simple) calculations, as that can be done in the query.
I would say that displaying 50 to 100 records is the most a user's brain can scan while still taking in all the information in the records.
I am an exception; when seeing more than 15 records, my brain tilts :)
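As a small illustration of doing the calculation in the query rather than in PHP (the orders table, the amount column, and the $pdo connection are placeholder assumptions):

<?php
// Let MySQL do the arithmetic instead of fetching every row into PHP.
$total = $pdo->query('SELECT SUM(amount) FROM orders WHERE user_id = 42')->fetchColumn();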
I am making a simple message board in PHP with a MySQL database. I have limited messages to 20 per page with the LIMIT clause.
An example of my URL is: http://www.example.com/?page=1
If page is not specified, it defaults to 1. As I mentioned earlier, there is a limit of 20 per page; however, if there are only 30 matching rows in total and I view page 2, I only end up with 10 results. In this case, the LIMIT part of my query resembles LIMIT 20,40. How can I ensure that 20 rows are returned in this case?
I would prefer to try and keep this as much on the MySQL side as possible.
EDIT:
To clarify, if I am on page 2, I will be fetching rows 20-30; however, this is only 10 rows, so I wish to select rows 10-30 instead.
EDIT:
I am currently using the following queries:
SELECT MOD(COUNT(`ID`), 20) AS lmt FROM `msg_messages` WHERE `threadID`=2;
SELECT * FROM `msg_messages` WHERE `threadID`=2 LIMIT 20-(20-lmt) , 40-(20-lmt) ;
There are 30 records that match.
I'm not sure I really understand the question, but if I do, I think the best practice would be to prevent users from going to a page with no results. To do so, you can easily check how many rows you have in total, even while using the LIMIT clause, by using SQL_CALC_FOUND_ROWS.
For example you could do:
SELECT SQL_CALC_FOUND_ROWS * FROM blog_posts WHERE balblabla...
Then you have to run another query like this:
SELECT FOUND_ROWS() AS posts_count
It will return the total rows very quickly. With this result and knowing the current page, you can decide if you display next/prev links to the user.
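Put together in PHP, that pattern might look roughly like this (the blog_posts table, its columns, and the PDO connection $pdo are assumptions for the sake of the sketch):

<?php
// Sketch of the SQL_CALC_FOUND_ROWS pattern.
$perPage = 20;
$page    = max(1, (int)($_GET['page'] ?? 1));
$offset  = ($page - 1) * $perPage;

// First query: fetch one page and ask MySQL to also compute the unlimited total.
$rows = $pdo->query("SELECT SQL_CALC_FOUND_ROWS * FROM blog_posts ORDER BY created_at DESC LIMIT $offset, $perPage")->fetchAll();

// Second query: read back the total the previous query calculated.
$postsCount = (int)$pdo->query('SELECT FOUND_ROWS() AS posts_count')->fetchColumn();
$totalPages = (int)ceil($postsCount / $perPage);

// With $page and $totalPages you can decide whether to show prev/next links.

Note that SQL_CALC_FOUND_ROWS and FOUND_ROWS() are deprecated in recent MySQL versions, which recommend a separate COUNT(*) query instead.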
You could do:
SELECT COUNT(*)/20 AS pages FROM tbl;
...to get the maximum number of pages, then work out whether you're going to be left with a partial set and adjust your paging query accordingly.
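For example, a sketch of that adjustment (the PDO connection $pdo is assumed; the table and column names follow the question): if the last page would be partial, pull the offset back so a full 20 rows are still returned.

<?php
// Sketch: shift the offset back on a partial last page so 20 rows are returned.
$perPage   = 20;
$totalRows = (int)$pdo->query('SELECT COUNT(*) FROM msg_messages WHERE threadID = 2')->fetchColumn();

$page   = max(1, (int)($_GET['page'] ?? 1));
$offset = ($page - 1) * $perPage;

// With 30 matching rows and page 2, the naive offset of 20 leaves only 10 rows,
// so shift it back to 10 (overlapping page 1 by 10 rows) to get a full page.
if ($offset + $perPage > $totalRows) {
    $offset = max(0, $totalRows - $perPage);
}

$messages = $pdo->query("SELECT * FROM msg_messages WHERE threadID = 2 LIMIT $offset, $perPage")->fetchAll();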
What I am trying to do: I have a table that keeps user information (one row per user), and I run a PHP script daily to fill in information I get from users. For one column, say column A, if I find the information I fill it in; otherwise I don't touch it, so it remains NULL. The reason is to allow it to be updated on the next run, when the information might be available.
The problem is that I have too many rows to update. If I blindly SELECT all rows where column A is NULL, the result won't fit into memory. If I SELECT 5000 at a time, then in the next SELECT of 5000 I could get the same rows that didn't get updated last time, which would be an infinite loop...
Does anyone have an idea of how to do this? I don't have an ID column, so I can't just say SELECT ... WHERE ID > X. Is there a solution (on either the MySQL side or the PHP side) without modifying the table?
You'll want to use the LIMIT and OFFSET keywords.
SELECT [stuff] LIMIT 5000 OFFSET 5000;
LIMIT indicates the number of rows to return, and OFFSET indicates how far into the result set to start reading.
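In a PHP loop, that batching might look roughly like this (a sketch only; $pdo, the users table, and the colA column are placeholder assumptions):

<?php
// Read the NULL rows in fixed-size batches using LIMIT/OFFSET.
$batchSize = 5000;
$offset    = 0;

do {
    $rows = $pdo->query("SELECT * FROM users WHERE colA IS NULL LIMIT $batchSize OFFSET $offset")->fetchAll();

    foreach ($rows as $row) {
        // ... try to fill in colA for this row ...
    }

    // Advance past the batch just read; rows that are still NULL will simply be
    // visited again on the next daily run of the script.
    $offset += $batchSize;
} while (count($rows) === $batchSize);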
I have a MySQL table that's being updated very frequently. In essence, I'm trying to grab 500 rows with multiple PHP scripts at once, and I don't want the scripts to grab the same rows. I don't want to use ORDER BY RAND() because of the server load it causes with thousands of rows.
So, I thought of simply having each script set the status of every row it grabs to "1" (so it wouldn't be grabbed again). That is, I want to grab 500 rows where status = 0 (I use SELECT ... ORDER BY ... ASC), and then set those exact 500 rows to status "1" so that another script doesn't grab them.
Since the table is being updated all the time, I can't just SELECT 500 rows in ascending order and then UPDATE 500 rows in ascending order, because by the time the script has finished the SELECT, more rows might have been added.
Therefore, I need a way to SELECT 500 rows and then somehow "remember" which rows I selected so I can update them.
How can I do the SELECT and UPDATE quickly, as described?
Generate a unique ID for each script (just a random integer usually works fine for these purposes).
Run an UPDATE table SET status = <my random id> WHERE status = 0 LIMIT 500 query from each process.
Then have each process run a SELECT ... FROM table WHERE status = <my random id> to actually get the rows.
In essence, you "lock" the rows to your current script first, and then go retrieve the ones you successfully locked.
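A rough sketch of that claim-then-fetch pattern in PHP (the jobs table, its id column, and the PDO connection $pdo are placeholder assumptions):

<?php
// 1. A token that identifies this script run; any positive value that fits the
//    status column works, since 0 means "unclaimed".
$token = random_int(1, 2147483647);

// 2. Claim up to 500 unclaimed rows by stamping them with the token.
$claim = $pdo->prepare('UPDATE jobs SET status = :token WHERE status = 0 ORDER BY id ASC LIMIT 500');
$claim->execute([':token' => $token]);

// 3. Fetch exactly the rows this run claimed; other scripts used other tokens.
$stmt = $pdo->prepare('SELECT * FROM jobs WHERE status = :token');
$stmt->execute([':token' => $token]);
$myRows = $stmt->fetchAll();

Because each UPDATE only matches rows whose status is still 0 when it runs, two scripts cannot stamp the same row with different tokens, so each one only ever sees the rows it claimed.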