mysql_fetch_array() and output in chunks - php

So I have a table with 45 records (but can be dynamic) and I use mysql_fetch_array() to get the data from the database. What is the best way to output 5 records at a time? So I need to do record 1-5, then have a link for records 6-10, 11-15, and so on. I thought about doing something with array_chunk but not sure how to keep track of the record number. Thanks for hints.

To get the first 5 results from a table:
SELECT * FROM table ORDER BY table.column_name ASC LIMIT 0, 5
Selects from `table`
Ordered by the column name Ascending
Limit 0,5 selects the first 5 results, starting at 0.
Change LIMIT 0,5 to 5,5 to list results 6-10 (skip the first 5 records, then return the next 5).
Ordering is just good practice to ensure consistency. Under most circumstances set this to 'id' if you have an auto-increment 'id' column. If you want results sorted by date, order by a timestamp column. If you want data reversed, order by DESC.
You can keep track of where your queries are through PHP sessions, GET parameters, temporary database tables, and probably a few other methods I missed.
Other solution:
Use the array returned from mysql_fetch_array() and utilize PHP's array functions: http://php.net/manual/en/language.types.array.php
The obvious disadvantage to this approach is the fact that it fetches all rows in the table. This is okay if you'll NEVER have more than a manageable number of rows. In your case, 45 should be fine, assuming they're not gigantic rows. This approach may also be useful if you want data pre-loaded.
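A rough sketch of that approach, assuming the legacy mysql_* extension from the question; the `items` table name and the `page` GET parameter are placeholders, not from the original post:

<?php
// Fetch every row once, then slice the result into pages of 5 with array_chunk().
$result = mysql_query("SELECT * FROM items ORDER BY id ASC");
$rows = array();
while ($row = mysql_fetch_array($result, MYSQL_ASSOC)) {
    $rows[] = $row;
}
$pages = array_chunk($rows, 5);   // $pages[0] = records 1-5, $pages[1] = records 6-10, ...
$page  = isset($_GET['page']) ? max(0, (int)$_GET['page']) : 0;
if (isset($pages[$page])) {
    foreach ($pages[$page] as $row) {
        echo $row['id'] . "<br />\n";
    }
}
// Each link keeps track of the record range for you.
foreach ($pages as $i => $chunk) {
    $first = $i * 5 + 1;
    $last  = $first + count($chunk) - 1;
    echo '<a href="?page=' . $i . '">' . $first . '-' . $last . '</a> ';
}
?>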

I'd suggest using limits and incremental offsets in your query. Your first query would then be:
select * from TABLE limit 0,5;
Your link has a parameter referencing the next offset so the next query would be:
select * from TABLE limit 5,5;
And so on.
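A minimal sketch of that idea, again assuming the legacy mysql_* extension and a placeholder table `items`; the offset arrives as a GET parameter and is cast to an integer before it reaches the query:

<?php
$perPage = 5;
$offset  = isset($_GET['offset']) ? max(0, (int)$_GET['offset']) : 0;
// The offset is an integer at this point, so it is safe to interpolate into the query.
$result = mysql_query("SELECT * FROM items ORDER BY id ASC LIMIT $offset, $perPage");
while ($row = mysql_fetch_array($result, MYSQL_ASSOC)) {
    echo $row['id'] . "<br />\n";
}
// Link to the next chunk of 5 records.
echo '<a href="?offset=' . ($offset + $perPage) . '">Next ' . $perPage . '</a>';
?>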

You need LIMIT 0,5 in your query. Search the web for a PHP paginator.

Related

How to stop mysqli duplicate on random select [duplicate]

This is a problem with ordering search results on my website.
When a search is made, random results appear on the content page; this page includes pagination too. I use the following as my SQL query.
SELECT * FROM table ORDER BY RAND() LIMIT 0,10;
so my questions are
I need to make sure that every time the user visits the next page, results they have already seen do not appear again (exclude them in the next query, in a memory-efficient way, but still ORDER BY RAND())
every time the visitor goes to the 1st page there is a different set of results. Is it possible to use pagination with this, or will the ordering always be random?
I can use a seed in MySQL, however I am not sure how to use that practically.
Use RAND(SEED). Quoting docs: "If a constant integer argument N is specified, it is used as the seed value." (http://dev.mysql.com/doc/refman/5.0/en/mathematical-functions.html#function_rand).
In the example above the result order is random, but it is always the same. You can just change the seed to get a new order.
SELECT * FROM your_table ORDER BY RAND(351);
You can change the seed every time the user hits the first results page and store it in the user session.
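A sketch of that seed-in-session idea (variable and table names are placeholders; assumes PHP sessions are available):

<?php
session_start();
$page = isset($_GET['page']) ? max(1, (int)$_GET['page']) : 1;
// New seed whenever the visitor lands on page 1; reuse it for later pages
// so every page of the same "shuffle" stays consistent.
if ($page === 1 || !isset($_SESSION['rand_seed'])) {
    $_SESSION['rand_seed'] = mt_rand(1, 1000000);
}
$seed   = (int)$_SESSION['rand_seed'];
$offset = ($page - 1) * 10;
$sql = "SELECT * FROM your_table ORDER BY RAND($seed) LIMIT $offset, 10";
?>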
Random ordering in MySQL is as sticky a problem as they come. In the past, I've usually chosen to go around the problem whenever possible. Typically, a user won't ever come back to a set of pages like this more than once or twice. So this gives you the opportunity to avoid all of the various disgusting implementations of random order in favor of a couple simple, but not quite 100% random solutions.
Solution 1
Pick from a number of existing columns that are already indexed for sorting. These can include created/modified timestamps, or any other column you may sort by. When a user first comes to the site, have these handy in an array, pick one at random, and then randomly pick ASC or DESC.
In your case, every time a user comes back to page 1, pick something new and store it in the session. For every subsequent page, you can use that sort to generate a consistent set of pages.
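A rough sketch of Solution 1, with a placeholder list of indexed columns (adjust to your own schema):

<?php
session_start();
$columns    = array('id', 'created_at', 'modified_at');  // assumed indexed columns
$directions = array('ASC', 'DESC');
$page = isset($_GET['page']) ? max(1, (int)$_GET['page']) : 1;
// Pick a new pseudo-random sort when the user lands on page 1,
// then reuse it for every subsequent page.
if ($page === 1 || !isset($_SESSION['sort_col'])) {
    $_SESSION['sort_col'] = $columns[array_rand($columns)];
    $_SESSION['sort_dir'] = $directions[array_rand($directions)];
}
$offset = ($page - 1) * 10;
$sql = "SELECT * FROM your_table ORDER BY " . $_SESSION['sort_col'] . " "
     . $_SESSION['sort_dir'] . " LIMIT $offset, 10";
?>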
Solution 2
You could have an additional column that stores a random number for sorting. It should be indexed, obviously. Periodically, run the following query;
UPDATE table SET rand_col = RAND();
This may not work for your specs, as you seem to require every user to see something different every time they hit page 1.
First, you should stop using the ORDER BY RAND() syntax. It is bad for performance on large sets of rows.
You need to manually determine the LIMIT constraints. If you still want random results and you don't want users to see the same results on the next page, the only way is to save all the results for this search session in the database and consult that information when the user navigates to the next page.
The next thing in web design you should understand: using random blocks of data on your site is very, very bad for users' visual perception.
You have several problems to deal with! I recommend that you go step by step.
First issue: results they have already seen should not appear again
Store the id of every item returned in an array (assuming the id column from the example).
When the user goes to the next page, pass the NOT IN list to the query:
MySQL Query
SELECT * FROM table WHERE id NOT IN (1, 14, 25, 645) ORDER BY RAND() LIMIT 0,10;
What this does is match all ids that are not 1, 14, 25 or 645.
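A sketch of tracking the seen ids in the session and feeding them into NOT IN (placeholder names; the ids are cast to integers before being joined into the query):

<?php
session_start();
if (!isset($_SESSION['seen_ids'])) {
    $_SESSION['seen_ids'] = array();
}
$seen  = array_map('intval', $_SESSION['seen_ids']);
$where = $seen ? 'WHERE id NOT IN (' . implode(',', $seen) . ')' : '';
$result = mysql_query("SELECT * FROM your_table $where ORDER BY RAND() LIMIT 0, 10");
while ($row = mysql_fetch_array($result, MYSQL_ASSOC)) {
    $_SESSION['seen_ids'][] = (int)$row['id'];   // remember it for the next page
    echo $row['id'] . "<br />\n";
}
?>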
As far as the performance issue goes: in a memory efficient way
SELECT RAND() FROM table WHERE id NOT IN (1, 14, 25, 645) LIMIT 0, 10
Showing rows 0 - 9 (10 total, Query took 0.0004 sec)
AND
SELECT * FROM table WHERE id NOT IN (1, 14, 25, 645) ORDER BY RAND() LIMIT 0, 10
Showing rows 0 - 9 (10 total, Query took 0.0609 sec)
So, don't use ORDER BY RAND(), preferably use SELECT RAND().
I would have your PHP generate your random record numbers or rows to retrieve, pass those to your query, and save a cookie on the user's client indicating what records they've already seen.
There's no reason for that user specific data to live on the server (unless you're tracking it, but it's random anyway so who cares).
The combination of
1. random ordering
2. pagination
3. HTTP (stateless)
is as ugly as it comes: 1. and 2. together need some sort of "persistent randomness", while 3. makes this harder to achieve. On top of this 1. is not a job a RDBMS is optimized to do.
My suggestion depends on how big your dataset is:
Few rows (ca. <1K):
select all PK values in first query (first page)
shuffle these in PHP
store shuffled list in session
for each page call, select the data according to the stored PKs (a sketch follows this list)
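A minimal sketch of the small-table variant, reusing the foo/bar placeholder names from the larger-table example below and assuming a primary key column id:

<?php
session_start();
$perPage = 10;
$page    = isset($_GET['page']) ? max(1, (int)$_GET['page']) : 1;
// On page 1, fetch and shuffle all primary keys once, then keep the order in the session.
if ($page === 1 || !isset($_SESSION['shuffled_ids'])) {
    $ids = array();
    $res = mysql_query("SELECT id FROM bar");
    while ($row = mysql_fetch_array($res, MYSQL_ASSOC)) {
        $ids[] = (int)$row['id'];
    }
    shuffle($ids);
    $_SESSION['shuffled_ids'] = $ids;
}
// Slice out this page's ids and preserve their shuffled order in the query.
$pageIds = array_slice($_SESSION['shuffled_ids'], ($page - 1) * $perPage, $perPage);
if ($pageIds) {
    $in  = implode(',', $pageIds);
    $sql = "SELECT foo FROM bar WHERE id IN ($in) ORDER BY FIELD(id, $in)";
}
?>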
Many rows (10K+):
This assumes you have an AUTO_INCREMENT unique key called ID with a manageable number of holes. Use a maintenance script if needed (high delete ratio).
Use a shuffling function that is parameterized with e.g. the session ID to create a function rand_id(continuous_id)
If you need e.g. the records 100000 to 100009, calculate $a=array(rand_id(100000), rand_id(100001), ... rand_id(100009));
$a=implode(',',$a);
$sql="SELECT foo FROM bar WHERE ID IN($a) ORDER BY FIELD(ID,$a)";
To take care of the holes in your ID, select a few records too many (and throw away the excess), looping if too few records were selected.

paginating random list sql

I'm working on a basic pagination list where I need all the results to be retrieved randomly,
this is why I use
SELECT *
FROM table
WHERE 1
ORDER BY rand()
It works well until I need to paginate it...
SELECT *
FROM table
WHERE 1
ORDER BY rand()
LIMIT $offset, $recordsperPage
How do I retrieve the whole list in random order, but make sure that, when paginated, each page does not repeat the prior random words?
If you provide a consistent seed to the rand() call, you'll get the same sequence. You might use something derived from the date, for example, to give the same results for the day, or generate some other random seed from PHP and save it in the session, to give the same results only for a given visitor.
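For example, a sketch of both variants the answer mentions (placeholder names; pick one seed source):

<?php
session_start();
$recordsperPage = 10;
$offset = isset($_GET['offset']) ? max(0, (int)$_GET['offset']) : 0;

// Option 1: the same order for everyone for a whole day.
$seed = (int)date('Ymd');

// Option 2: a per-visitor order, generated once and kept in the session.
// if (!isset($_SESSION['rand_seed'])) {
//     $_SESSION['rand_seed'] = mt_rand(1, 1000000);
// }
// $seed = (int)$_SESSION['rand_seed'];

$sql = "SELECT * FROM table ORDER BY RAND($seed) LIMIT $offset, $recordsperPage";
?>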

Mysql count vs select performance in wordpress

I have a very large database (~10 million rows) and I want to list these things as fast as possible in a table. I have a few options:
I can limit the rows from MySQL - not preferred, as I want to count the rows with a specific type of data, say attachments
Fetch all rows and use a while loop to show 1000 records at a time - I think it would work, but pulling 10 million rows into memory looks insane and I am quite sure it would perform worse.
Count the total and then list using LIMIT - but MySQL COUNT is a deal breaker, since in spite of a unique and indexed id I have had bad experiences with MySQL COUNT.
What is the best way to do this?
If I just want to list the 10 million rows and use PHP to cut the output into chunks of 1000 rows each time, is that a bad idea?
There are some things to consider:
Is the database optimized? If yes, skip this step.
Index the columns you want to filter the search on.
Select only the columns you require (instead of SELECT *).
If you want to count the total and the id is sequential, you can select the latest row and estimate the count from its id if COUNT(*) is 'that slow'.
If you're looking at some sort of pagination, you can count the rows and select only a few records based on user input (SELECT with LIMIT 1000, skipping 1000 when it's page 2, etc.); a sketch follows below.
You wouldn't want 10 million rows in memory when you'd be using 0.1% of them, right?
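A sketch along those lines, selecting only the needed columns, paging with LIMIT, and estimating the total from MAX(id) instead of COUNT(*). It assumes WordPress's $wpdb and a hypothetical table/column layout:

<?php
global $wpdb;
$perPage = 1000;
$page    = isset($_GET['paged']) ? max(1, (int)$_GET['paged']) : 1;
$offset  = ($page - 1) * $perPage;

// Only the columns we actually display, limited to one page of 1000 rows.
$rows = $wpdb->get_results($wpdb->prepare(
    "SELECT id, title, type FROM my_big_table WHERE type = %s ORDER BY id LIMIT %d, %d",
    'attachment', $offset, $perPage
));

// Rough total if COUNT(*) is too slow and ids are sequential with few holes.
$approxTotal = (int) $wpdb->get_var("SELECT MAX(id) FROM my_big_table");
?>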

Pick random entry in MySQL database based on certain criteria

Alright, let's say my MySQL table is set up with entries similar to:
id code sold
1 JKDA983J1KZMN49 0
2 JZMA093KANZB481 1
3 KZLMMA98309Z874 0
I'd like it to pick a random ID within the ranges already in the database (or just go from 1-X) and then just assign it to a variable for my own action to be taken. So let's say we want to pick a code that isn't sold (marked as 0 and not 1), then we'd pick it.
Now, it doesn't have to be 100% random; it could check whether the first one is sold and, if not, keep going. But I'm not 100% sure how to go about this. Snippets would be appreciated because I can work things out easily on my own; I just need an example to see where I am going.
How about using a WHERE and ORDER BY RAND()
SELECT id, code
FROM tablename
WHERE sold = 0
ORDER BY RAND()
LIMIT 1
If you don't need the random, then don't use it. It can affect performance very negatively. Since you mentioned in your post that it wasn't necessary, I would recommend using Ezequiel's answer above and dropping the rand. See Most Efficient Way To Retrieve MYSQL data in random order PHP for more info.
It seems that your codes are already random, so why not just take the first unsold item? If you have many unsold records in your database, the typical ORDER BY RAND() will hurt database performance.
SELECT *
FROM codes
WHERE sold = 0
LIMIT 1
FOR UPDATE;
I've also added FOR UPDATE to avoid race conditions, assuming that you're using transactions, as you update the record later (to actually sell it).
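A sketch of that flow with PDO transactions (the codes/sold names match the example table above; the $pdo connection is assumed to exist):

<?php
// Pick an unsold code and mark it sold inside one transaction, so two
// concurrent requests cannot grab the same row (requires InnoDB).
$pdo->beginTransaction();
try {
    $stmt = $pdo->query("SELECT id, code FROM codes WHERE sold = 0 LIMIT 1 FOR UPDATE");
    $row  = $stmt->fetch(PDO::FETCH_ASSOC);
    if ($row) {
        $update = $pdo->prepare("UPDATE codes SET sold = 1 WHERE id = ?");
        $update->execute(array($row['id']));
        // ... hand $row['code'] to the buyer here ...
    }
    $pdo->commit();
} catch (Exception $e) {
    $pdo->rollBack();
    throw $e;
}
?>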
Have you tried
SELECT * FROM myTable WHERE sold = 0 ORDER BY RAND() LIMIT 1
Adding ORDER BY RAND() to the rest of your SELECT query is the most straightforward way to accomplish this.

