Server-side rendering: MySQL limit N rows and paginate - PHP

I am using server-side rendering in DataTable.
I have a variable that determines the limit on the number of rows fetched from the database.
I am trying to fetch, say, 100 rows and, with a DataTable page length of 25, paginate them across up to 4 pages.
Also, if the limit variable is 4 or 10, or anything less than the page-length value, then all records should be displayed on a single page.
Here's my code:
# get DataTable params
$draw = $this->request->input('draw');
$start = $this->request->input('start');
$dt_limit = $this->request->input('length'); // datatable pagination limit
$limit_rows = $this->request->input('limit');
# query
$result_product_urls = DB::table('product_match_unmatches')
->select('r_product_url', 'r_image_url_main')
->selectRaw('count(distinct s_product_url, r_product_url) as frequency,
GROUP_CONCAT(CONCAT(s_image_url, "&s_product_url=", s_product_url) SEPARATOR " | ") as source_products')
->groupBy('r_product_url')
->where('request_id', '=', $reqId)
->orderBy('frequency', 'DESC')
->take($limit_rows);
$recordsTotal = $result_product_urls->get()->count();
$urls = $result_product_urls->offset($start)->limit($dt_limit)->get();
This will always fetch 25 rows from the database, since the DataTable page length is 25.
But I need to limit the rows based on the $limit_rows value, not $dt_limit.
If I only do $urls = $result_product_urls->get();, then if the $limit_rows value is 100, all 100 rows are displayed on the same page.
How can I fetch a limited number of rows from the database and then paginate through them?
I hope my post is clear.

I'll take a stab at this, but it's not entirely clear to me exactly what you are attempting to accomplish based on the code sample.
According to the Laravel docs, take and limit appear to be equivalent.
https://laravel.com/docs/5.8/queries#ordering-grouping-limit-and-offset (bottom of this section)
What it looks like the code is doing:
Using the $limit_rows value with the ->take() method on the $result_product_urls query builder object:
# query
$result_product_urls = DB::table('product_match_unmatches')
->select('r_product_url', 'r_image_url_main')
->selectRaw('count(distinct s_product_url, r_product_url) as frequency,
GROUP_CONCAT(CONCAT(s_image_url, "&s_product_url=", s_product_url) SEPARATOR " | ") as source_products')
->groupBy('r_product_url')
->where('request_id', '=', $reqId)
->orderBy('frequency', 'DESC')
->take($limit_rows); /* <------------- limit_rows HERE */
But then also using the $dt_limit value with the ->limit() method when setting the $urls variable:
$recordsTotal = $result_product_urls->get()->count();
$urls = $result_product_urls->offset($start)->limit($dt_limit)->get();/*<-- dt_limit HERE */
If you don't want to use $dt_limit as a limit, then don't use it...? I think maybe what you are trying to do is have a query with a limit on the result set, and then a sub-limit on that? I can't figure out a reason to want to do that, so I might be missing something. Note that ->take() is just an alias for ->limit(), so calling both on the same query doesn't stack the limits; the later call simply overwrites the earlier one.
Shouldn't this accomplish your goal?:
# query
$result_product_urls = DB::table('product_match_unmatches')
->select('r_product_url', 'r_image_url_main')
->selectRaw('count(distinct s_product_url, r_product_url) as frequency,
GROUP_CONCAT(CONCAT(s_image_url, "&s_product_url=", s_product_url) SEPARATOR " | ") as source_products')
->groupBy('r_product_url')
->where('request_id', '=', $reqId)
->orderBy('frequency', 'DESC');
/* ->take($limit_rows); remove this */
$recordsTotal = $result_product_urls->get()->count();
$urls = $result_product_urls->offset($start)
->limit($limit_rows) /* set limit to $limit_rows */
->get();
If not, then maybe you can clarify.
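If the goal really is to cap the whole result set at $limit_rows and still let DataTables page through it in $dt_limit-sized chunks, one possible approach is to clamp the per-page limit so it never reads past the cap. This is only a rough sketch reusing the request variables from the question, and it assumes the ->take($limit_rows) call has been removed from the builder as suggested above:
# cap the reported total at $limit_rows
$recordsTotal = min($result_product_urls->get()->count(), $limit_rows);
# rows still available inside the capped window for this page
$remaining = max($limit_rows - $start, 0);
$pageSize = min($dt_limit, $remaining);
# fetch only the current page, never crossing the $limit_rows boundary
$urls = $result_product_urls->offset($start)->limit($pageSize)->get();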

Use offset along with limit, i.e.
$users = DB::table('users')->skip(10)->take(5)->get();
or
$users = DB::table('users')->offset(10)->limit(5)->get();
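For DataTables server-side processing, the paged rows are then usually returned together with the draw counter and the record counts. A hedged sketch of that response shape, reusing $draw, $recordsTotal and $urls from the question (the field names follow the DataTables server-side protocol):
return response()->json([
    'draw' => intval($draw), // echo back the draw counter sent by DataTables
    'recordsTotal' => $recordsTotal, // total rows before paging
    'recordsFiltered' => $recordsTotal, // same here, since no extra search filter is applied
    'data' => $urls,
]);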

Related

Laravel paginate method takes too much time for 1 million records

So this is my case: I have a main table, payment_transactions, with almost 1 million records.
I am getting the data from this table with joins and where clauses, using Laravel's paginate method.
But this method takes too much time, and after investigation I found that its count query alone takes 4 to 5 seconds.
So how can I optimize and speed up this query? In particular, is there any way to improve the paginate method's speed?
Note: I can't use simplePaginate because there is a DataTable on the frontend and I need a total count for that.
For paginate, two queries run: one is the main query and the other is the count, and it is the count query that is taking most of the time.
Here is the count query from getQueryLog:
select count(*) as aggregate from `payment_transactions`
left join `users` as `U` on `U`.`id` = `payment_transactions`.`user_id`
left join `coupons` as `C`
on `C`.`id` = `payment_transactions`.`coupon_id`
where `payment_transactions`.`refund_status` = 'NO_REFUND'
and `payment_transactions`.`transaction_type`
in ('BOOKING','SB_ANPR','QUERCUS_ANPR','CANDID_ANPR','SB_TICKET',
'ORBILITY_TICKET','TOPUP,CREDIT','DEBIT','GIFT')
and `payment_transactions`.`status` != 'INITIATED'
Here is my code example:
//Get Transactions data
public function adminTransactions(Request $request)
{
$selectableFields = [
'payment_transactions.id', 'payment_transactions.transaction_id AS transaction_id',
'payment_transactions.refund_status',
'payment_transactions.created_at', 'payment_transactions.response_data', 'payment_transactions.status',
'payment_transactions.transaction_type', 'payment_transactions.payment_mode','payment_transactions.payment_source',
'payment_transactions.subtotal', 'payment_transactions.service_fees', 'C.coupon_code','C.amount AS coupon_value',
DB::raw("IF(payment_transactions.refund_remarks='NULL','-NA-',payment_transactions.refund_remarks) as refund_remarks"),
DB::raw("IF(payment_transactions.transaction_type='TOPUP' AND payment_transactions.coupon_id IS NOT NULL
AND payment_transactions.coupon_id!=0,
payment_transactions.amount + C.amount,payment_transactions.amount) as amount"),
DB::raw("CONCAT(U.first_name,' ',U.last_name) AS username"), 'U.id AS user_id',
DB::raw("JSON_UNQUOTE(json_extract(payment_transactions.response_data, '$.description')) AS description"),
DB::raw("payment_transactions.invoice_id"),
DB::raw("JSON_UNQUOTE(json_extract(payment_transactions.response_data, '$.Data.PaymentID')) AS upay_payment_id"),
];
return PaymentTransactions::select($selectableFields)
->with('homeScreenMessages:payment_transaction_id,from_name,message,amount')
->leftJoin('users AS U', 'U.id', '=', 'payment_transactions.user_id')
->leftJoin('coupons AS C', 'C.id', '=', 'payment_transactions.coupon_id')
->where(DB::raw("CONCAT(U.first_name,' ',U.last_name)"), 'like', "%{$request->input('query')}%")
->orWhere('U.id', $request->input('query'))
->orWhere("U.phone_number", "LIKE", "%" . $request->input('query') . "%")
->orWhere("U.email", "LIKE", "%" . $request->input('query') . "%")
->orWhere('payment_transactions.id', $request->input('query'))
->orWhere('payment_transactions.transaction_id', $request->input('query'));
}
//Paginate function
public function paginationCalculate($queryObject, $request) {
$draw = $request->get('draw');
$start = $request->get("start");
$rowperpage = $request->get("length"); // Rows display per page
$columnIndex_arr = $request->get('order');
$columnName_arr = $request->get('columns');
$order_arr = $request->get('order');
$columnIndex = $columnIndex_arr[0]['column']; // Column index
$columnName = $columnName_arr[$columnIndex]['name']; // Column name
$columnSortOrder = $order_arr[0]['dir']; // asc or desc
$pageNumber = ($start + $rowperpage) / $rowperpage;
if(!empty($columnName)) {
$queryObject->orderBy($columnName, $columnSortOrder);
}
$records = $queryObject->paginate($rowperpage, ['*'], 'page', $pageNumber)->toArray();
return array(
"draw" => intval($draw),
"recordsFiltered" => $records['total'],
"recordsTotal" => $records['total'],
"data" => $records['data']
);
}
I don't think you want "LEFT". With LEFT, the COUNT will count payment transactions even if there is no matching user and/or coupon.
Or maybe there are multiple users or coupons for each payment_transaction? In this case, the COUNT will be inflated. To fix this, simply remove the two extra tables.
Regardless of the situation, add this composite (and covering) index to payment_transactions:
INDEX(refund_status, transaction_type, status, user_id, coupon_id)
refund_status must be first since it is tested with '='.
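For reference, that index could also be added from a Laravel migration. This is only a sketch using the standard schema builder; the index name is made up:
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;
// inside a migration's up() method
Schema::table('payment_transactions', function (Blueprint $table) {
    // composite (and covering) index for the count query's WHERE clause
    $table->index(
        ['refund_status', 'transaction_type', 'status', 'user_id', 'coupon_id'],
        'pt_count_covering_idx' // hypothetical index name
    );
});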
If that is not "fast enough", then I must ask you: "Who cares what the exact number is when there are a million rows?" Do you ever see a search engine giving an exact number of hits? Does that imprecision bother you?
That is, rethink the requirement to calculate the exact number of rows. Or even an approximate number.
You mentioned "pagination". Please show us how that is done. OFFSET is an inefficient way, since it rescans the previously seen rows again and again. Instead, 'remember where you left off' (keyset pagination).
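A minimal sketch of that 'remember where you left off' idea with the Laravel query builder, assuming payment_transactions.id is the sort column and the client sends back a hypothetical last_id parameter ($rowperpage is the page size from the question):
use Illuminate\Support\Facades\DB;
$lastSeenId = (int) $request->input('last_id', 0); // hypothetical parameter returned by the client
$rows = DB::table('payment_transactions')
    ->where('id', '>', $lastSeenId) // continue right after the last row already shown
    ->orderBy('id')
    ->limit($rowperpage)
    ->get();
// Send the last row's id back with the response so the next request can pass it as last_id.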
You HAVE to add an INDEX to your MySQL database. It makes it lightning fast!
If you are using Navicat, click on the Index tab, add the column name, and give the index the same name. You WILL notice a huge difference.

whereBetween not working in Laravel 5.2

I am using the Laravel framework, but whereBetween is not working. I am using a price range where the price goes from 1 to 500. When I set the range to 1-100,
it gives me all the records between 1 and 100, i.e. 20, 40, 50, etc. When I change the range to 1-150, those results (20, 40, 50) are no longer returned; it gives me nothing. Can anyone help me? Here is my code:
$products = Product::with("productimages")
->whereBetween('price',[$data['from'], $data['to']])
->get();
Note: $data['from'] is the start value, i.e. 1, and $data['to'] is the end value, i.e. 150 or more.
I just had the same issue. When the column's data type is not numeric, the values are compared as strings (so '150' sorts before '20', for example), which behaves very unpredictably. So the solution is to either change the data type of the column or cast it while fetching, like this:
$from = $data['from'];
$to = $data['to'];
$products = Product::with("productimages")
->whereRaw("CAST(price AS UNSIGNED) BETWEEN ${from} AND ${to}")
->get();
I actually had this issue on the PostgreSQL jsonb data type, as it always returns nested values as strings, and between fetching doesn't work properly.
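A variant of the same cast that uses query bindings instead of interpolating the values into the raw SQL (just a sketch; it avoids putting request values directly into the query string):
$products = Product::with("productimages")
    ->whereRaw('CAST(price AS UNSIGNED) BETWEEN ? AND ?', [$data['from'], $data['to']])
    ->get();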
Using whereBetween:
$projects = Product::where('price', '>', $data['from'])
->where('price', '<', $data['to'])
->get();
OR
$current = Product::with("productimages")
->whereBetween('price', array($data['from'], $data['to']))->get();
OR
DB::table('Product')->whereBetween('price', array($data['from'], $data['to']))->get();

CodeIgniter Where_In select from different table?

I am trying to convert this query to Active Record:
SELECT
crm_clients.id,
crm_clients.`moved_date`,
crm_clients.`contractor_id`
FROM
dev_pfands.`crm_clients`
WHERE crm_clients.`contractor_id` = 11
AND (
crm_clients.`status` = 9
OR crm_clients.`status` = 8
OR crm_clients.`status` = 7
)
AND crm_clients.id IN
(SELECT
crm_client_cheques.`client_id`
FROM
dev_pfands.`crm_client_cheques`)
AND crm_clients.`moved_date` BETWEEN '2014-08-01'
AND '2014-11-29 '
AND crm_clients.`contractor_id`<>''
GROUP BY crm_clients.`id`
The section I'm having an issue with is:
AND crm_clients.id IN
(SELECT
crm_client_cheques.client_id
FROM
dev_pfands.crm_client_cheques)
I've tried the where_in method, but every time I try to include my attempt of $this->db_pfands->where('crm_client_cheques.client id', 'id'); I get hit with errors and have no idea how to get past this.
The original query should return 703 rows, and when I remove the part I'm stuck on it increases to 3045, so I need it to be included. Any help is appreciated.
First of all, you have an error in your code.
$this->db_pfands->where('crm_client_cheques.client id', 'id');
This will be
$this->db_pfands->where('crm_client_cheques.client_id', 'id');
You have to provide the right column name; as far as I know, database column names cannot contain spaces.
Now I think this Active Record query will help you get your result.
$query = $this->db->select('crm_clients.id, crm_clients.moved_date, crm_clients.contractor_id')
->from('crm_clients')
->join('crm_client_cheques', 'crm_client_cheques.client_id = crm_clients.id')
->where('crm_clients.contractor_id', 11) // contractor_id = 11, as in the original query
->where('crm_clients.moved_date BETWEEN "2014-08-01" AND "2014-11-29"')
->where_in('crm_clients.status', array(9, 8, 7))
->group_by('crm_clients.id')
->get();
$result = $query->result();
You may have to change a couple of names because they are in a different database, but I believe you can do it.
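If you would rather keep the IN (...) subquery instead of the join, CodeIgniter's query builder also accepts a raw WHERE string with escaping turned off. This is only a sketch against the CodeIgniter 2/3 query builder, mirroring the original SQL:
$this->db->select('crm_clients.id, crm_clients.moved_date, crm_clients.contractor_id')
    ->from('crm_clients')
    ->where('crm_clients.contractor_id', 11)
    ->where_in('crm_clients.status', array(9, 8, 7))
    ->where('crm_clients.moved_date BETWEEN "2014-08-01" AND "2014-11-29"', NULL, FALSE)
    // the third argument FALSE stops CodeIgniter from trying to escape the subquery
    ->where('crm_clients.id IN (SELECT client_id FROM crm_client_cheques)', NULL, FALSE)
    ->group_by('crm_clients.id');
$result = $this->db->get()->result();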

Get row count on previous Codeigniter AR query

I'm writing a web app with CodeIgniter that allows a user to enter query parameters and then spits out the results.
For pagination, I'm passing a limit and offset (pretty standard stuff) and can return results based on that; however, I'm having trouble passing back the TOTAL number of records that would have been returned without the LIMIT and OFFSET parameters.
My question: Is it possible to pass back the total row count that would have been returned by a previous query, using CodeIgniter's AR syntax?
I've tried a few variations of the below, but have (at best) been able to return a count of ALL records in the items table (using count_all_results). I feel like I'm missing something here.
if (isset($brand)) {$this->db->where('brand', $brand);}
if (isset($color)) {$this->db->where('color', $color);}
$items = $this->db->get('items', $page, $offset)->result_array();
$rowcount = $this->db->count_all_results('items');
return array('items' => $items, 'row_count' => $rowcount);
Yes,
if (isset($brand)) {$this->db->where('brand', $brand);}
if (isset($color)) {$this->db->where('color', $color);}
$query = $this->db->get('items', $page, $offset);
$items = $query->result_array();
$rowcount = $query->num_rows();
return array('items' => $items, 'row_count' => $rowcount);
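Note that num_rows() here counts only the rows on the current page, because of the limit and offset passed to get(). If what you need is the total the filters would match without LIMIT/OFFSET, a common pattern is to run count_all_results() first and then re-apply the where clauses for the paged query, since each query consumes them. A sketch along those lines:
// Count the filtered rows first (no limit/offset applied).
if (isset($brand)) {$this->db->where('brand', $brand);}
if (isset($color)) {$this->db->where('color', $color);}
$rowcount = $this->db->count_all_results('items');
// Re-apply the same filters for the paged query.
if (isset($brand)) {$this->db->where('brand', $brand);}
if (isset($color)) {$this->db->where('color', $color);}
$items = $this->db->get('items', $page, $offset)->result_array();
return array('items' => $items, 'row_count' => $rowcount);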

How to get a row count from a table in Doctrine

I have written a universal script to get all rows from Doctrine table models, but if the number of rows is too large, I get an exception:
Cannot define NULL as part of query when defining 'offset'.
Running script:
$table = new JV_Model_StoreOrder();
$this->data['list'] = $table->getTable()->findAll()->toArray();
I understand the error above is due to the large number of entries in the table (> 20,000). So I decided to make a paginator to break the records into pages of 100.
Could you help me? How can I do something like this:
...
$total_amount = $table->getTable()->count();
$this->data['list'] = $table->getTable()->offset(0)->limit(100)
->findAll()->toArray();
...
Doctrine has its own pagination component:
http://www.doctrine-project.org/documentation/manual/1_0/ru/utilities:pagination
Example from Doctrine manual:
// Defining initial variables
$currentPage = 1;
$resultsPerPage = 50;
// Creating pager object
$pager = new Doctrine_Pager(
Doctrine_Query::create()
->from( 'User u' )
->leftJoin( 'u.Group g' )
->orderby( 'u.username ASC' ),
$currentPage, // Current page of request
$resultsPerPage // (Optional) Number of results per page. Default is 25
);
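The pager object then produces both the current page of results and the totals needed for page links. A short sketch based on the Doctrine 1.x pager API (method names as given in that manual):
// Fetch the rows for the current page.
$users = $pager->execute();
// Totals for rendering the pagination links.
$totalResults = $pager->getNumResults();
$lastPage = $pager->getLastPage();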
