How to write pagination in PHP

I'm sorry, I am a newbie in PHP. I want to learn how to write pagination in PHP. I know there are many tutorials for this on the net; I have searched and referred to them. But when I try it by myself, without referring to other resources, I still don't know how to write pagination in PHP.
So I want to know: what are the basic and important things to understand when writing pagination in PHP? If anyone could give me more details, thank you.

For this example, I will use MySQL because it's a popular database to use with PHP. The general principle applies to other databases, but the precise way you use SQL to achieve the results is different.
The essence of pagination is that you have a number of records, say 105, and you only want to display a smaller, more manageable number of records at a time, say 10 of them. In order to do this, you need to know how to find out the total number of records, and you need to know how to select just a subset of the records.
First, find out how many records you have. For this, you use COUNT(*) to count the records in the table. If your table is called Users:
SELECT COUNT(*) FROM Users;
Since we said there are 105 records in the table, this will return the result 105. We said each page has 10 records, so figure out how many pages there are in total: 105 / 10 = 10.5; round up to 11 because the extra records after page 10 need their own page. So, there are 11 pages in total. Your web page will display controls that enable the user to select a page from 1 to 11.
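In PHP, the page count is just this division rounded up. A minimal sketch (the PDO connection $pdo is an assumption for illustration; the Users table is from the example above):

$pageSize   = 10;
$total      = (int) $pdo->query('SELECT COUNT(*) FROM Users')->fetchColumn();
$totalPages = (int) ceil($total / $pageSize); // 105 / 10 = 10.5, ceil() rounds up to 11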
Instead of fetching all the records, you will only fetch one page at a time. In MySQL you generally do this using the LIMIT keyword; this allows you to choose a range of records. LIMIT syntax works like this: LIMIT $SKIP, $COUNT, where $SKIP is the number of records in the result set to skip, and $COUNT is the number to return. (Actually, you will get up to $COUNT records, in case fewer are available.)
The user has requested page 6. This means that I need to skip 5 pages before the results I want to show. With pages of 10 records each, this means I will skip 5 * 10 records, i.e. 50 records. In other words, $SKIP = ($PAGE_NUMBER - 1) * $PAGE_SIZE. Your query would look like this:
SELECT User_ID, UserName, City, State from Users LIMIT 50, 10;
This gives you records 51-60 of the table.
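The same arithmetic in PHP, with the numbers bound through a prepared statement so they can safely come from user input (a sketch; $pdo is an assumed PDO connection):

$page     = 6;
$pageSize = 10;
$skip     = ($page - 1) * $pageSize; // (6 - 1) * 10 = 50
$stmt = $pdo->prepare('SELECT User_ID, UserName, City, State FROM Users LIMIT ?, ?');
$stmt->bindValue(1, $skip, PDO::PARAM_INT);
$stmt->bindValue(2, $pageSize, PDO::PARAM_INT);
$stmt->execute();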
See the documentation to learn more about how LIMIT works:
http://dev.mysql.com/doc/refman/5.5/en/select.html
Now you have a single page of records. However, the numbering of the records is dependent on their ordering. The query above will give you unpredictable results because the ordering is undefined. (A database can return records in any order it chooses unless you command it to use a certain ordering.) Thus, to do pagination, you need to define the order of the records. Use ORDER BY to define the ordering:
SELECT User_ID, UserName, City, State from Users ORDER BY UserName LIMIT 50, 10;
This allows you to show different pages of results, and always know that when you show page 6, it contains the records that come after the ones on page 5 and before the ones on page 7.
(If your records don't have any natural ordering, you will have to define an arbitrary one to make this work correctly.)
I have avoided mentioning PHP at all in this answer because I don't think it's necessary to do so to explain pagination. Pagination is entirely a matter of handling data. If you understand this, I think you can go read the tutorials and they will make sense to you and then you can figure how to write the PHP to do this.
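For when you do get to the PHP, here is a minimal end-to-end sketch tying the pieces together, assuming the Users table from above and a PDO connection $pdo (all other choices are illustrative, not a fixed recipe):

$pageSize = 10;

$total      = (int) $pdo->query('SELECT COUNT(*) FROM Users')->fetchColumn();
$totalPages = max(1, (int) ceil($total / $pageSize));

// Current page from the query string, clamped to a valid range.
$page = isset($_GET['page']) ? (int) $_GET['page'] : 1;
$page = min(max($page, 1), $totalPages);

// Fetch just this page, with a defined ordering.
$stmt = $pdo->prepare('SELECT User_ID, UserName, City, State FROM Users ORDER BY UserName LIMIT ?, ?');
$stmt->bindValue(1, ($page - 1) * $pageSize, PDO::PARAM_INT);
$stmt->bindValue(2, $pageSize, PDO::PARAM_INT);
$stmt->execute();

foreach ($stmt as $row) {
    echo htmlspecialchars($row['UserName']), "<br>\n";
}

// Controls to select a page from 1 to $totalPages.
for ($p = 1; $p <= $totalPages; $p++) {
    echo $p === $page ? "[$p] " : "<a href=\"?page=$p\">$p</a> ";
}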

Get the page number.
Know how many items to display on each page.
Using the MySQL LIMIT clause, compute the offset and select how many rows you want (a sketch follows below).
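Those three steps in code, as a rough sketch (the items table and the $pdo connection are assumptions):

$page    = max(1, (int) ($_GET['page'] ?? 1)); // 1. get the page number
$perPage = 20;                                 // 2. items per page
$offset  = ($page - 1) * $perPage;             // 3. offset for LIMIT

$stmt = $pdo->prepare('SELECT id, title FROM items LIMIT ?, ?');
$stmt->bindValue(1, $offset, PDO::PARAM_INT);
$stmt->bindValue(2, $perPage, PDO::PARAM_INT);
$stmt->execute();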
Want to use a class? http://net.tutsplus.com/tutorials/php/how-to-paginate-data-with-php/
Or just a plain script to modify? http://pastebin.com/J8nTk1q5
If you are new to PHP, then you need to learn more PHP or risk future confusion.

Related

How to stop mysqli duplicate on random select [duplicate]


MySQL Performance for Online Games Highscore Lists

I have a question about making "Highscore-Lists".
Let's say I have an online game with 1,000,000 active users. Each user has points from 0 to X. Now I want to show a ranking list. It would be insane to show all million entries on one page, so it is divided into Y pages (100 entries per page => 10,000 pages).
I am not really sure how to solve it.
1. The easiest way to do that would be to load all 1m entries in one SELECT, take the result, find the current user with a for loop, and show that specific page (even though the other 999,900 entries sit in RAM without ever being shown). For a page change I could just reuse the result data with no second database call. (So I don't care about point changes during that time.)
SELECT UserName, UserID, Points FROM UserAccount ORDER BY Points;
2. My second idea was to load each page individually, but then I do not know
2.1 whether that really performs better
2.2 how to find the right start page, because I only know the user's points, not his rank
So how could I solve this problem? I don't really know what MySQL can handle. Are many small calls better than one huge call?
Can I even keep such a huge result set around?
The second solution would pick up point changes with every page change, but I care more about performance than an always up-to-date list.
Thank you for your help!
Markus
Use pagination. In SQL it's a "limit" clause:
SELECT UserName, UserID, Points FROM UserAccount ORDER BY Points LIMIT 0, 20;
The above query will return only the first 20 rows of the original selection.
You can pass the page parameter via GET, like this: highscore.php?page=1 or ?page=2 and so on.
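A sketch of that in PHP (the column names come from the question; $pdo, the page size, and the DESC ordering so the best players come first are my assumptions):

$perPage = 100;                              // 100 entries per page, as in the question
$page    = max(1, (int) ($_GET['page'] ?? 1)); // highscore.php?page=2 etc.

$stmt = $pdo->prepare('SELECT UserName, UserID, Points FROM UserAccount ORDER BY Points DESC LIMIT ?, ?');
$stmt->bindValue(1, ($page - 1) * $perPage, PDO::PARAM_INT);
$stmt->bindValue(2, $perPage, PDO::PARAM_INT);
$stmt->execute();

// For question 2.2: the user's start page follows from how many users rank above him.
$rank = (int) $pdo->query('SELECT COUNT(*) FROM UserAccount WHERE Points > ' . (int) $userPoints)
             ->fetchColumn(); // $userPoints: the current user's score, assumed known
$userPage = (int) floor($rank / $perPage) + 1;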

Optimize Pagination and Filter Query Results with Count

I am having performance issues when dealing with pagination and filtering products as seen on many ecommerce sites, here is an example from Zappos
Kind of the standard:
Showing 1-10 of 132 results. [prev] 1 2 [3] 4 ... 13 [next]
[10] Results per page
To me it seems like a large part of the problem is that the query is run twice: once to count the number of results and again to actually populate the array. Below is the "filter" query:
SELECT product_id, product_title, orderable
FROM table_view
WHERE (family_title = 'Shirts' OR category_title = 'Shirts')
AND ((detail_value = 'Blue' AND detail_title = 'Color')
OR (detail_value = 'XL' AND detail_title = 'Size'))
GROUP BY product_id, product_title, orderable
HAVING COUNT(detail_title)=2
ORDER BY product_id
LIMIT 10 OFFSET 0
The query takes about 20ms to run by itself. It selects from a view which joins about five different tables. The parameters passed in by the user are the "detail_value" & "detail_title" (the filtering criteria) plus the "family" & "category"; the LIMIT is set by the "results per page", so if the user wants to view all results the limit is set to 2000. And every time they go to a new page via the pagination, the whole query is run again. Below is a snippet of the PHP: $products is an array of the query results, and $number_of_results is a count of the same query run with the maximum limit.
$products = filter($value, $category_title, $number_per_page, $subcategory, $start_number);
$number_of_results = count(filter($value, $category_title, 2000, $subcategory, 0));
$pages = ceil($number_of_results / $number_per_page);
When run on my local machine, the results page takes about 600-800ms to load; deployed to Heroku, it takes 13-16 seconds. I've left out a lot of the PHP code, but I'm using PHP's PDO class to turn the query results into objects for display. The tables being joined are the product table, category table, detail table, and the two tables linking them via foreign keys.
Google results show that this is a pretty common/complex problem, but I have yet to come across any real solution that works for me.
Many queries for pagination generally need to run several times: once to determine how many records would be shown, then again to grab a screen of records. Then subsequent queries grabbing the next screen of records, etc.
Two solutions to slow pagination queries are:
Use a cursor to pull n-records from the open query resultset
Speed up the queries
Solution 1 can be expensive memory-wise for the server's resources and might not scale well if you have many concurrent users generating queries like this. It might also be difficult to implement cursors with the PDO class you're using.
Solution 2 could be done via improving view queries, adding indexes, etc. However that may not be enough. If the tables are read much more often than they are written to, you might try using UPDATE/INSERT/DELETE trigger tricks. Rather than running the query against a VIEW, create a table with the same column structure and data as the VIEW. Any time that one of the underlying tables changes, manually modify this new table to follow the changes. This will slow down writes, but greatly improve reading.
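A sketch of the materialize-the-view idea (the product_search name and the index columns are assumptions; the trigger or write-path logic that keeps the copy in sync is left out here):

// One-time copy of the view into a real table that can be indexed.
$pdo->exec('CREATE TABLE product_search AS SELECT * FROM table_view');
$pdo->exec('ALTER TABLE product_search
    ADD INDEX idx_family (family_title, category_title),
    ADD INDEX idx_detail (detail_title, detail_value)');
// Point the filter query at product_search instead of table_view; reads get
// fast at the cost of keeping product_search up to date on every write.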

PHP MySQL pagination with random ordering

This is a problem with ordering search results on my website.
When a search is made, random results appear on the content page; this page includes pagination too. I use the following as my SQL query.
SELECT * FROM table ORDER BY RAND() LIMIT 0,10;
so my questions are:
1. I need to make sure that every time the user visits the next page, results they have already seen do not appear again (exclude them in the next query, in a memory-efficient way, but still ORDER BY RAND()).
2. Every time the visitor goes to the 1st page there should be a different set of results. Is it possible to use pagination with this, or will the ordering always be random?
I can use a seed in MySQL, however I am not sure how to use that practically.
Use RAND(SEED). Quoting docs: "If a constant integer argument N is specified, it is used as the seed value." (http://dev.mysql.com/doc/refman/5.0/en/mathematical-functions.html#function_rand).
In the example below, the result order is random but always the same for a given seed. You can just change the seed to get a new order.
SELECT * FROM your_table ORDER BY RAND(351);
You can change the seed every time the user hits the first results page and store it in the user session.
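A sketch of that seed-per-session idea (the session key and page size are assumptions):

session_start();
$page = max(1, (int) ($_GET['page'] ?? 1));

// Pick a fresh seed whenever the visitor (re)starts at page 1.
if ($page === 1 || !isset($_SESSION['rand_seed'])) {
    $_SESSION['rand_seed'] = mt_rand(1, 1000000);
}
$seed = (int) $_SESSION['rand_seed'];

// Same seed => same ordering, so LIMIT pages stay consistent across requests.
$stmt = $pdo->prepare("SELECT * FROM your_table ORDER BY RAND($seed) LIMIT ?, ?");
$stmt->bindValue(1, ($page - 1) * 10, PDO::PARAM_INT);
$stmt->bindValue(2, 10, PDO::PARAM_INT);
$stmt->execute();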
Random ordering in MySQL is as sticky a problem as they come. In the past, I've usually chosen to go around the problem whenever possible. Typically, a user won't ever come back to a set of pages like this more than once or twice. So this gives you the opportunity to avoid all of the various disgusting implementations of random order in favor of a couple simple, but not quite 100% random solutions.
Solution 1
Pick from a number of existing columns that are already indexed for sorting. These can include created/modified timestamps, or any other column you may sort by. When a user first comes to the site, have these handy in an array, pick one at random, and then randomly pick ASC or DESC.
In your case, every time a user comes back to page 1, pick something new, store it in session. Every subsequent page, you can use that sort to generate a consistent set of paging.
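For instance, a sketch of this approach (the column names in the whitelist are assumptions):

session_start();
$page = max(1, (int) ($_GET['page'] ?? 1));

if ($page === 1 || !isset($_SESSION['sort'])) {
    $columns = array('created_at', 'modified_at', 'title'); // indexed, sortable columns
    $_SESSION['sort'] = array(
        $columns[array_rand($columns)],
        mt_rand(0, 1) ? 'ASC' : 'DESC',
    );
}
list($col, $dir) = $_SESSION['sort'];

// $col and $dir come from our own whitelist, so interpolating them is safe.
$sql = "SELECT * FROM your_table ORDER BY $col $dir LIMIT ?, ?";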
Solution 2
You could have an additional column that stores a random number for sorting. It should be indexed, obviously. Periodically, run the following query:
UPDATE table SET rand_col = RAND();
This may not work for your specs, as you seem to require every user to see something different every time they hit page 1.
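Pagination then sorts on the precomputed column, which stays stable between reshuffles. A sketch (rand_col is the column from the answer above; $pdo is assumed):

$page = max(1, (int) ($_GET['page'] ?? 1));
$stmt = $pdo->prepare('SELECT * FROM your_table ORDER BY rand_col LIMIT ?, ?');
$stmt->bindValue(1, ($page - 1) * 10, PDO::PARAM_INT);
$stmt->bindValue(2, 10, PDO::PARAM_INT);
$stmt->execute();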
First, you should stop using the ORDER BY RAND() syntax. It is bad for performance on large sets of rows.
You need to determine the LIMIT constraints manually. If you still want random results and you don't want users to see the same results on the next page, the only way is to save all the results for this search session in the database and work from that saved set when the user navigates to the next page.
The next thing to understand about web design: random blocks of content on your site are very, very, very bad for the user's visual perception.
You have several problems to deal with! I recommend that you go step by step.
First issue: results they already seen not to appear again
Store the id of every item returned in an array (assuming the id index from the example).
When the user goes to the next page, add a NOT IN clause to the query:
MySQL Query
SELECT * FROM table WHERE id NOT IN (1, 14, 25, 645) ORDER BY RAND() LIMIT 0,10;
This matches all rows whose id is not 1, 14, 25 or 645.
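A sketch of that, with the seen ids kept in the session and bound as prepared-statement placeholders (the names are assumptions):

session_start();
$seen = isset($_SESSION['seen_ids']) ? $_SESSION['seen_ids'] : array();

$sql = 'SELECT * FROM your_table';
if ($seen) {
    $sql .= ' WHERE id NOT IN (' . implode(',', array_fill(0, count($seen), '?')) . ')';
}
$sql .= ' ORDER BY RAND() LIMIT 10';

$stmt = $pdo->prepare($sql);
$stmt->execute($seen);
$rows = $stmt->fetchAll();

// Remember what this page showed for the next request.
foreach ($rows as $row) {
    $seen[] = (int) $row['id'];
}
$_SESSION['seen_ids'] = $seen;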
As far as the performance issue goes ("in a memory efficient way"), compare these two queries:
SELECT RAND() FROM table WHERE id NOT IN (1, 14, 25, 645) LIMIT 0, 10;
Showing rows 0 - 9 (10 total, Query took 0.0004 sec)
versus:
SELECT * FROM table WHERE id NOT IN (1, 14, 25, 645) ORDER BY RAND() LIMIT 0, 10;
Showing rows 0 - 9 (10 total, Query took 0.0609 sec)
So, don't use ORDER BY RAND(), preferably use SELECT RAND().
I would have your PHP generate your random record numbers or rows to retrieve, pass those to your query, and save a cookie on the user's client indicating what records they've already seen.
There's no reason for that user specific data to live on the server (unless you're tracking it, but it's random anyway so who cares).
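A sketch of the cookie variant (the cookie name and 30-day lifetime are arbitrary choices; note that setcookie() must run before any output):

$seen = isset($_COOKIE['seen_ids'])
    ? array_map('intval', explode(',', $_COOKIE['seen_ids']))
    : array();

$sql = 'SELECT * FROM your_table'
    . ($seen ? ' WHERE id NOT IN (' . implode(',', $seen) . ')' : '')
    . ' ORDER BY RAND() LIMIT 10';
$rows = $pdo->query($sql)->fetchAll();

foreach ($rows as $row) {
    $seen[] = (int) $row['id'];
}
// The ids were cast to int above, so the cookie stays injection-safe;
// keep the list small or it will outgrow the ~4KB cookie limit.
setcookie('seen_ids', implode(',', $seen), time() + 86400 * 30);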
The combination of
1. random ordering
2. pagination
3. HTTP (stateless)
is as ugly as it comes: 1. and 2. together need some sort of "persistent randomness", while 3. makes this harder to achieve. On top of this, 1. is not a job an RDBMS is optimized to do.
My suggestion depends on how big your dataset is:
Few rows (ca. <1K):
select all PK values in first query (first page)
shuffle these in PHP
store shuffled list in session
for each page call select the data according to the stored PKs
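In code, that plan might look like this sketch (your_table and its id primary key are assumptions):

session_start();
$perPage = 10;
$page = max(1, (int) ($_GET['page'] ?? 1));

// First page: fetch all PKs, shuffle once, remember the order.
if ($page === 1 || !isset($_SESSION['shuffled_ids'])) {
    $ids = $pdo->query('SELECT id FROM your_table')->fetchAll(PDO::FETCH_COLUMN);
    shuffle($ids);
    $_SESSION['shuffled_ids'] = $ids;
}

// Every page: slice the stored order and fetch just those rows.
$pageIds = array_slice($_SESSION['shuffled_ids'], ($page - 1) * $perPage, $perPage);
if ($pageIds) {
    $in   = implode(',', array_map('intval', $pageIds));
    $rows = $pdo->query("SELECT * FROM your_table WHERE id IN ($in) ORDER BY FIELD(id, $in)")->fetchAll();
}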
Many rows (10K+):
This assumes you have an AUTO_INCREMENT unique key called ID with a manageable number of holes. Use a maintenance script if needed (e.g. after a high delete ratio).
Use a shuffling function that is parameterized with e.g. the session ID to create a function rand_id(continuous_id)
If you need e.g. the records 100,000 to 100,009, calculate $a = array(rand_id(100000), rand_id(100001), ... rand_id(100009));
$a=implode(',',$a);
$sql="SELECT foo FROM bar WHERE ID IN($a) ORDER BY FIELD(ID,$a)";
To take care of the holes in your IDs, select a few records too many (and throw away the excess), looping if too few records come back.

How to efficiently paginate large datasets with PHP and MySQL?

As some of you may know, use of the LIMIT keyword in MySQL does not preclude it from reading the preceding records.
For example:
SELECT * FROM my_table LIMIT 10000, 20;
Means that MySQL will still read the first 10,000 records and throw them away before producing the 20 we are after.
So, when paginating a large dataset, high page numbers mean long load times.
Does anyone know of any existing pagination class/technique/methodology that can paginate large datasets in a more efficient way i.e. that does not rely on the LIMIT MySQL keyword?
In PHP if possible as that is the weapon of choice at my company.
Cheers.
First of all, if you want to paginate, you absolutely have to have an ORDER BY clause. Then you simply have to use that clause to dig deeper in your data set. For example, consider this:
SELECT * FROM my_table ORDER BY id LIMIT 20
You'll have the first 20 records, let's say their id's are: 5,8,9,...,55,64. Your pagination link to page 2 will look like "list.php?page=2&id=64" and your query will be
SELECT * FROM my_table WHERE id > 64 ORDER BY id LIMIT 20
No offset, only 20 records read. It doesn't allow you to jump arbitrarily to any page, but most of the time people just browse the next/prev page. An index on "id" keeps this fast no matter how deep the user pages.
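A sketch of this keyset ("seek") pagination in PHP (list.php and the id column follow the answer; $pdo is an assumed PDO connection):

// The last id seen on the previous page, passed as list.php?page=2&id=64
$lastId = isset($_GET['id']) ? (int) $_GET['id'] : 0;

$stmt = $pdo->prepare('SELECT * FROM my_table WHERE id > ? ORDER BY id LIMIT 20');
$stmt->bindValue(1, $lastId, PDO::PARAM_INT);
$stmt->execute();
$rows = $stmt->fetchAll();

// Build the "next" link from the last row of this page.
if ($rows) {
    $last     = end($rows);
    $nextPage = (isset($_GET['page']) ? (int) $_GET['page'] : 1) + 1;
    echo '<a href="list.php?page=' . $nextPage . '&id=' . (int) $last['id'] . '">next</a>';
}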
A solution might be to not use the LIMIT clause and to use a join instead, joining on a table used as some kind of sequence.
For more information: on SO, I found this question / answer, which gives an example that might help you ;-)
There are basically 3 approaches to this, each of which have their own trade-offs:
Send all 10000 records to the client, and handle pagination client-side via Javascript or the like. Obvious benefit is that only a single query is necessary for all of the records; obvious downside is that if the record size is in any way significant, the size of the page sent to the browser will be of proportionate size - and the user might not actually care about the full record set.
Do what you're currently doing, namely SQL LIMIT and grab only the records you need with each request, completely stateless. Benefit in that it only sends the records for the page currently requested, so requests are small, downsides in that a) it requires a server request for each page, and b) it's slower as the number of records/pages increases for later pages in the result, as you mentioned. Using a JOIN or a WHERE clause on a monotonically increasing id field can sometimes help in this regard, specifically if you're requesting results from a static table as opposed to a dynamic query.
Maintain some sort of state object on the server which caches the query results and can be referenced in future requests for a limited period of time. Upside is that it has the best query speed, since the actual query only needs to run once; downside is having to manage/store/cleanup those state objects (especially nasty for high-traffic websites).
SELECT * FROM my_table LIMIT 10000, 20;
means: show 20 records starting from record #10000 in the result set. If you use primary keys in the WHERE clause, there will not be a heavy load on MySQL.
Any other method of pagination, such as using a JOIN, will put a much heavier load on the server.
I'm not aware of the performance decrease that you've mentioned, and I don't know of any other solution for pagination; however, an ORDER BY clause might help you reduce the load time.
The best way is to define an indexed sequence field in my_table and increment it for every newly inserted row. Then you can use WHERE your_index_field BETWEEN 10000 AND 10020.
It will be much faster.
Some other options:
Partition the table per page, so the LIMIT can be skipped entirely.
Store the results in a session (a good idea would be to create an md5 hash of the query, then use that to cache the same result set for multiple users).
