Make multiple pages out of a MySQL query - PHP

So, I have this database, right, with some fields called 'id', 'title' and 'message'. Now I've got about 700 messages in the database. All I want to do is set a limit of at most 50 message titles per page, and make multiple pages... How can I do that?
I only know how to get the first page, using LIMIT...

As you guessed, you have to use the LIMIT keyword.
It accepts two values (quoting the manual):
the offset of the first row to return
the maximum number of rows to return
In your case, you'll have to use something like this for the first page:
select * from your_table order by ... limit 0, 50
And, then, for the second page :
select * from your_table order by ... limit 50, 50
And for the third one :
select * from your_table order by ... limit 100, 50
And so on ;-)
Edit after the comment: to get the page number, you'll have to receive it from your URL, which would look like this:
http://www.example.com/page.php?pagenum=2
Then, you'll calculate the first value for the limit:
$offset = 50 * intval($_GET['pagenum']);
And inject it into your query:
select * from your_table order by ... limit $offset, 50
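Since the page number comes from the user, it's safest to bind it rather than interpolate it into the SQL. A minimal sketch, assuming a mysqli connection $mysqli and a messages table with id and title columns (names are illustrative, not the asker's actual schema):
$per_page = 50;
$pagenum  = isset($_GET['pagenum']) ? max(0, intval($_GET['pagenum'])) : 0;
$offset   = $per_page * $pagenum;

// Placeholders keep the user-supplied page number out of the SQL string.
$stmt = $mysqli->prepare('SELECT id, title FROM messages ORDER BY id LIMIT ?, ?');
$stmt->bind_param('ii', $offset, $per_page);
$stmt->execute();
foreach ($stmt->get_result() as $row) {
    echo htmlspecialchars($row['title']), "<br>\n";
}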
Constructing URLs to the different pages is now a matter of generating URLs such as these:
http://www.example.com/page.php?pagenum=0
http://www.example.com/page.php?pagenum=1
http://www.example.com/page.php?pagenum=2
...
If you know you have 700 elements, and 50 per page, you'll have 700/50 = 14 pages ;-)
So, something like this should do the trick:
for ($i = 0; $i < 700/50; $i++) {
    // Use http://www.example.com/page.php?pagenum=$i as the URL
}
Of course, 700 is a value that can probably change and should not be hard-coded: it should be determined from the database, using a count query:
select count(*) as total
from your_table
...
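To turn that count into page links, a small sketch (reusing the hypothetical $mysqli connection and messages table from above):
$per_page = 50;
$total = $mysqli->query('SELECT COUNT(*) AS total FROM messages')->fetch_assoc()['total'];
$pages = (int) ceil($total / $per_page); // 700 rows -> 14 pages; 701 -> 15

for ($i = 0; $i < $pages; $i++) {
    echo '<a href="page.php?pagenum=' . $i . '">' . ($i + 1) . '</a> ';
}
Using ceil() makes the partial last page explicit.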

Your PHP file may receive a GET argument containing the page number.
Then you do your query with LIMIT ($page_number * $messages_per_page), $messages_per_page (pseudo-code).
$messages_per_page = 50 in your case. $page_number is deduced from a GET argument, after sanitizing, with the first page being page number 0.

Related

Using count_all_results or get_compiled_select and $this->db->get('table') lists table twice in query?

How do I use get_compiled_select or count_all_results before running the query without getting the table name added twice? When I use $this->db->get('tblName') after either of those, I get the error:
Not unique table/alias: 'tblProgram'
SELECT * FROM (`tblProgram`, `tblProgram`) JOIN `tblPlots` ON `tblPlots`.`programID`=`tblProgram`.`pkProgramID` JOIN `tblTrees` ON `tblTrees`.`treePlotID`=`tblPlots`.`id` ORDER BY `tblTrees`.`id` ASC LIMIT 2000
If I don't use a table name in count_all_results or $this->db->get(), then I get an error that no table is used. How can I get it to set the table name just once?
public function get_download_tree_data($options = array(), $rand = ""){
    //join tables and order by tree id
    $this->db->reset_query();
    $this->db->join('tblPlots', 'tblPlots.programID=tblProgram.pkProgramID');
    $this->db->join('tblTrees', 'tblTrees.treePlotID=tblPlots.id');
    $this->db->order_by('tblTrees.id', 'ASC');
    //get number of results to return
    $allResults = $this->db->count_all_results('tblProgram', false);
    //chunk data and write to CSV to avoid reaching memory limit
    $offset = 0;
    $chunk = 2000;
    $treePath = $this->config->item('temp_path') . "$rand/trees.csv";
    $tree_handle = fopen($treePath, 'a');
    while ($offset < $allResults) {
        $this->db->limit($chunk, $offset);
        $result = $this->db->get('tblProgram')->result_array();
        foreach ($result as $row) {
            fputcsv($tree_handle, $row);
        }
        $offset = $offset + $chunk;
    }
    fclose($tree_handle);
    return array('resultCount' => $allResults);
}
To count how many rows would be returned by a query, essentially all the work must be performed. That is, it is impractical to get the count, then perform the query; you may as well just do the query.
If your goal is to "paginate" by getting some of the rows, plus the total count, that is essentially two separate actions (that may be combined to look like one.)
If the goal is to estimate the number of rows, then SHOW TABLE STATUS or SELECT Rows FROM information_schema.TABLES WHERE ... gives you an estimate.
If you want to see if there are, say "at least 100 rows", then this may be practical:
SELECT 1 FROM ... WHERE ... ORDER BY ... LIMIT 99,1
and see if you get a row back. However, this may or may not be efficient, depending on the indexes and the WHERE and the ORDER BY. (Show us the query and I can elaborate.)
Using OFFSET for chunking is grossly inefficient. If there is not a usable index, then it is performing essentially the entire query for each chunk. If there is a usable index, the chunks get slower and slower. Here is a discussion of why OFFSET is not good for "pagination", plus an efficient workaround: Pagination. It talks about how to "remember where you left off" as an efficient technique for chunking. Fetch between 100 and 1000 rows per chunk.
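A minimal sketch of that "remember where you left off" technique, assuming an indexed id column and a mysqli connection $mysqli (both illustrative, not the asker's actual schema):
$lastId = 0;
$chunk  = 1000;
do {
    // Seek past the last id seen instead of using OFFSET, so each
    // chunk is a short index range scan rather than a re-scan.
    $stmt = $mysqli->prepare('SELECT id, title FROM your_table WHERE id > ? ORDER BY id LIMIT ?');
    $stmt->bind_param('ii', $lastId, $chunk);
    $stmt->execute();
    $rows = $stmt->get_result()->fetch_all(MYSQLI_ASSOC);
    foreach ($rows as $row) {
        $lastId = $row['id'];
        // ... process the row ...
    }
} while (count($rows) === $chunk); // a short chunk means we reached the end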
The flaw in your code is that it aims to select a subset of the records and their total count in the same query. This is impossible in MySQL, so you cannot generate such a query; hence the error you mention. The problem is that if you do a
select ... from t where ... limit 0, 2000
then you get at most 2000 records, so if the number of records matching the criteria is greater than the limit, you will not get an accurate count from the query above; in that case you need a
select count(1) from t where ...
This means that you need to build your actual query (the code below your count_all_results call) and see whether the number of results reaches the limit. If it does not, then you do not need a separate query to get the count: you can compute it as $offset + $recordCount, since $offset already holds the number of rows fetched by the previous chunks. However, if the chunk comes back full, then you will need to build another query, without the order_by call (the count is independent of your sort), to get the count.
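In rough terms, the branching described above might look like this sketch ($limit, $offset and $result are the variables from the question's code; the count branch would need the same joins and filters rebuilt before counting):
$recordCount = count($result);
if ($recordCount < $limit) {
    // Last page: the total can be computed locally.
    $total = $offset + $recordCount;
} else {
    // Full page: rebuild the query (without order_by) and run a COUNT.
    $total = $this->db->count_all_results('tblProgram');
}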
$this->db->count_all_results()
Counting the number of returned results with count_all_results()
It's useful to count the number of results returned—often bugs can arise if a section of code which expects to have at least one row is passed zero rows. Without handling the eventuality of a zero result, an application may become unpredictably unstable and may give away hints to a malicious user about the architecture of the app. Ensuring correct handling of zero results is what we're going to focus on here.
Permits you to determine the number of rows in a particular Active Record query. Queries will accept Query Builder restrictors such as where(), or_where(), like(), or_like(), etc. Example:
echo $this->db->count_all_results('my_table'); // Produces an integer, like 25
$this->db->like('title', 'match');
$this->db->from('my_table');
echo $this->db->count_all_results(); // Produces an integer, like 17
However, this method also resets any field values that you may have passed to select(). If you need to keep them, you can pass FALSE as the second parameter:
echo $this->db->count_all_results('my_table', FALSE);
get_compiled_select()
The method $this->db->get_compiled_select() was introduced in CodeIgniter v3.0 and compiles the Active Record query without actually executing it. This is not a completely new method, though: in older versions of CI it existed as $this->db->_compile_select(), but that method was made protected in later versions, making it impossible to call directly.
// Note that the second parameter of the get_compiled_select method is FALSE
$sql = $this->db->select(array('field1','field2'))
               ->where('field3', 5)
               ->get_compiled_select('mytable', FALSE);
// ...
// Do something crazy with the SQL code... like add it to a cron script for
// later execution or something...
// ...
$data = $this->db->get()->result_array();
// Would execute and return an array of results of the following query:
// SELECT field1, field2 FROM mytable WHERE field3 = 5;
NOTE: Double calls to get_compiled_select() while you're using the Query Builder Caching functionality and NOT resetting your queries will result in the cache being merged twice. That in turn means that, if you're caching a select(), the same field will be selected twice.
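If two compiled selects are needed back to back, resetting between them avoids that merge. A sketch using the Query Builder calls from above (if start_cache() is active, flush_cache() clears the cached calls as well):
$sql1 = $this->db->select('field1')->get_compiled_select('mytable', FALSE);
$this->db->reset_query(); // clear the pending select() before reusing the builder
$sql2 = $this->db->select('field2')->get_compiled_select('mytable');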
Rick James got me on the right track. I ended up having to chunk the results using pagination AND a nested query. Using LIMIT on even 1 chunk of 2000 records was timing out. This is the code I ended up with, which uses get_compiled_select('tblProgram') and then get('tblTrees O1'). Since I didn't use FALSE as the second argument to get_compiled_select, the query was cleared before the get() was run.
//grab the data in chunks, write it to CSV chunk by chunk
$offset = 0;
$chunk = 2000;
$i = 10; //counter for the progress bar
$this->db->limit($chunk);
$this->db->select('tblTrees.id');
//nesting the limited query and then joining the other fields later improved performance significantly
$query1 = ' (' . $this->db->get_compiled_select('tblProgram') . ') AS O2';
$this->db->join($query1, 'O1.id=O2.id');
$result = $this->db->get('tblTrees O1')->result_array();
$allResults = count($result);
$putHeaders = 0;
$treePath = $this->config->item('temp_path') . "$rand/trees.csv";
$tree_handle = fopen($treePath, 'a');
//while the select returns a full chunk, there may be more rows to fetch
while (count($result) === $chunk) {
    $highestID = max(array_column($result, 'id'));
    //update progress bar with estimate
    if ($i < 90) {
        $this->set_runStatus($qcRunId, $status = "processing", $progress = $i);
        $i = $i + 1;
    }
    //only write the headers the first time
    foreach ($result as $row) {
        if ($offset === 0 && $putHeaders === 0) {
            fputcsv($tree_handle, array_keys($row));
            $putHeaders = 1;
        }
        fputcsv($tree_handle, $row);
    }
    //get the next chunk, resuming after the highest id already seen
    $offset = $offset + $chunk;
    $this->db->reset_query();
    $this->make_query($options);
    $this->db->order_by('tblTrees.id', 'ASC');
    $this->db->where('tblTrees.id >', $highestID);
    $this->db->limit($chunk);
    $this->db->select('tblTrees.id');
    $query1 = ' (' . $this->db->get_compiled_select('tblProgram') . ') AS O2';
    $this->db->join($query1, 'O1.id=O2.id');
    $result = $this->db->get('tblTrees O1')->result_array();
    $allResults = $allResults + count($result);
}
//write out the last (partial) chunk
foreach ($result as $row) {
    fputcsv($tree_handle, $row);
}
fclose($tree_handle);
return array('resultCount' => $allResults);

mySQL - FIND_IN_SET() doesn't find the value

I'm trying to get posts which include a specific tag.
The tag row content:
,iphone|1468338028,,android|1468338028,,blackberry|1468338028,
query
SELECT * FROM shares WHERE FIND_IN_SET(tag, 'iphone') > 0 ORDER BY DATE DESC limit 10
What is the correct way to do it ?
Your tag is iphone|1468338028 and you look for iphone. That does not match.
Replace the | with , to separate the values, and note that FIND_IN_SET(needle, list) expects the needle as its first argument:
SELECT * FROM shares
WHERE FIND_IN_SET('iphone', REPLACE(tag, '|', ',')) > 0
Another alternative is to use LIKE "%text%", if you're not required to use FIND_IN_SET().
SELECT * FROM shares
WHERE tag LIKE "%iphone%"
ORDER BY DATE DESC limit 10
The snippet above should achieve the same result, while avoiding the replace-and-trim issues.
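One caveat: LIKE "%iphone%" also matches rows whose tag contains a longer word such as iphone6. If that matters, the comma delimiters can be made part of the pattern (a sketch based on the tag format shown above):
SELECT * FROM shares
WHERE CONCAT(',', REPLACE(tag, '|', ','), ',') LIKE '%,iphone,%'
ORDER BY DATE DESC limit 10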

Limit results of all MySQL queries

I wrote a PHP/MySQLi frontend, in which the user can enter SQL queries, and the server then returns the results in a table (or prints OK on INSERTs and UPDATEs)
As printing the results can take a very long time (e.g. SELECT * FROM movies) in an IMDb extract with about 1.6M movies, 1.9M actors and 3.2M keywords, I limited the output to 50 rows by cancelling the printing for-loop after 50 iterations.
However, the queries themselves also take quite some time, so I hoped it might be possible to set a global maximum on the number of returned rows, regardless of whether the LIMIT keyword is used or not. I only intended to use the server for my own practice, but as some people in my class are struggling with the frontend provided by the teacher (a Windows EXE, while half of the class uses Mac/Linux), I decided to make it accessible to them too. But I want to keep my Debian VM from crashing because of - well, basically it would be a DDoS.
For clarification (examples with a global limit of 50):
SELECT * FROM movies;
> First 50 rows
SELECT * FROM movies LIMIT 10;
> First 10 rows
SELECT * FROM movies LIMIT 50,100;
> 50 rows (from 50 to 99)
Is there any possibility to limit the number of returned values using either PHP/MySQLi or the MySQL server itself? Or would I have to append/replace LIMIT to/in the queries?
You can take their queries and add "LIMIT 50" to them.
And if they added a LIMIT themselves, just filter it out with a regex and still add your LIMIT.
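A rough sketch of one way to do this in PHP: rather than stripping the user's LIMIT with a regex, wrap the statement in a subquery so an outer LIMIT caps the output while any inner LIMIT keeps its meaning, which matches the examples in the question (illustrative and SELECT-only; real SQL rewriting has edge cases such as trailing comments):
// Sketch: cap every SELECT at $max rows by wrapping it in a derived table.
function cap_query($sql, $max = 50) {
    $sql = rtrim(trim($sql), ';');
    return 'SELECT * FROM (' . $sql . ') AS capped LIMIT ' . $max;
}

echo cap_query('SELECT * FROM movies LIMIT 50,100');
// SELECT * FROM (SELECT * FROM movies LIMIT 50,100) AS capped LIMIT 50  -> 50 rows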
I believe you have to build yourself a paginator anyway; avoiding the LIMIT statement is not really possible, I believe.
Here is what I would suggest for a paginator:
if (empty($_REQUEST['page'])) {
    $page = 1;
} else {
    $page = max(1, intval($_REQUEST['page'])); // cast and floor at 1, so the offset can't go negative
}
$perpage = 50;
$start = ($page - 1) * $perpage;
$limit_string = " LIMIT " . $start . "," . $perpage;

$query = "SELECT * FROM movies";
$query .= $limit_string;
Hope that helps
You can create a function.
https://dev.mysql.com/doc/refman/5.0/en/create-function.html
Let us know if this helps.

How to make a custom column's value in mysql query

For example, I have 2 tables:
1) Post - and its columns:
pid
uid
title
content
created date
...
2) URL alias - and its columns:
aid
sou_url
des_url
The second table is used to store URL aliases for all pages on my site.
I created a function to get the alias, like this:
function get_alias_url($surl);
For example, get_alias_url("pid/50") = "this-is-my-first-post";
So now, I want to select all posts whose URL alias is longer than 50 characters.
I'm wondering, is there any way to make a query to do that?
It may look like:
Select * from post where length(get_alias_url("pid/nid")) > 50
Thanks
LENGTH() returns the length of the string measured in bytes.
CHAR_LENGTH() returns the length of the string measured in characters.
$query = 'SELECT * FROM post WHERE char_length("' . get_alias_url("pid/nid") . '") > 50';
Note that get_alias_url() runs in PHP before the query is sent, so its return value is embedded in the SQL as a quoted constant; MySQL does not evaluate it per row.
Hope this helps!
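If the goal is to check every post's alias, the lookup can instead be pushed entirely into SQL by joining the alias table (a sketch; url_alias is an illustrative name for the second table, and it assumes sou_url stores keys like 'pid/50' as in the question):
SELECT p.*
FROM post p
JOIN url_alias a ON a.sou_url = CONCAT('pid/', p.pid)
WHERE CHAR_LENGTH(a.des_url) > 50;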
You can use the MySQL CHAR_LENGTH() function in your query.
For a full list of functions related to strings in MySQL, go here

MySQL wildcard return only matches with hyphen and number

In my database I have the following slugs, which get an incrementing suffix when there's a duplicate:
foo
foo-1
foo-2
foo-3
f
f-1
f-2
f-3
bar
bar-1
I want to query the db and get the last f-#. I've tried using the LIKE operator below:
SELECT * FROM links WHERE slug LIKE '$slug%' ORDER BY timestamp DESC LIMIT 1
My Problem
If $slug == f it returns foo-3 rather than f-3. Is there a better way to use the % wildcard?
SELECT *
FROM links
WHERE slug LIKE '$slug-%' OR slug = '$slug'
ORDER BY timestamp DESC
LIMIT 1
The alternative, if you didn't want to do an OR, would be to use a REGEXP match. However, you might give up the ability to use an index for the query if you do that.
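For example, an anchored pattern like this would match f and f-3 but not foo-3 (illustrative; $slug would need escaping for both SQL and regex metacharacters before being interpolated):
SELECT *
FROM links
WHERE slug REGEXP '^f(-[0-9]+)?$'
ORDER BY timestamp DESC
LIMIT 1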
