I have data in a MySQL table that I have to paginate through 4 rows at a time.
To do this I use the command below, where I increase X from 0 in steps of 4 until I reach the end of the data.
The command works for X=0 and X=4, but when I reach X=8 I get the error #1038 - Out of sort memory. I tried increasing the sort buffer to 256K but got the same result.
Does anybody know how to solve this? I'm using PHP.
SELECT DISTINCT * FROM ((SELECT DISTINCT * FROM table WHERE (scope=0) AND (id='6')) UNION (SELECT DISTINCT * FROM table WHERE (scope=0) AND (id<=1000))) as total ORDER BY timestamp DESC LIMIT X,4
Not an answer, but your query seems to be functionally identical to this:
SELECT columns
, you
, actually
, want
FROM table
WHERE scope = 0
AND id<=1000
ORDER
BY timestamp DESC
LIMIT X,4
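If it helps, here's a rough PHP/PDO sketch of paginating with that simplified query (the PDO connection details and the $page variable are assumptions, not from the question):

$pdo  = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
$page = isset($_GET['page']) ? (int) $_GET['page'] : 0;  // which page we're on
$offset = $page * 4;                                     // the X in the question

// $offset is an integer we computed ourselves, so it is safe to interpolate
$sql = "SELECT * FROM `table`
        WHERE scope = 0 AND id <= 1000
        ORDER BY timestamp DESC
        LIMIT $offset, 4";
$rows = $pdo->query($sql)->fetchAll(PDO::FETCH_ASSOC);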
I have a table in my database with approx. 400K rows, and I am executing the following statement (the query that Laravel is executing):
select * from `activities` where `device_id` = ? and `battery_level` is not null order by `created_at` desc limit 1
This takes less than 100ms when I execute it directly using a MySQL client, but Laravel takes 1.5 - 2 seconds.
My table looks like this:
And this is how I use the models (takes 1500 ms):
Activity::query()
->where('device_id', 288)
->whereNotNull('battery_level')
->orderByDesc('created_at')
->first();
And with the DB facade (also takes 1500 ms):
DB::table('activities')
->where('device_id', 288)
->whereNotNull('battery_level')
->orderByDesc('created_at')
->first();
I am looking to reduce this query to the same speed as the MySQL client, between 100 and 200 ms max. Removing the descending ordering reduces the query time, but I need to order descending to get the latest record.
Any idea what I am doing wrong here?
It may be helpful to provide your SHOW CREATE TABLE output.
Replacing INDEX(device_id) with
INDEX(device_id, battery_level, created_at)
may help.
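If you manage the schema through Laravel, a sketch of adding that composite index as a migration might look like this (the index name is arbitrary, and this assumes a recent Laravel version with anonymous-class migrations):

use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

return new class extends Migration {
    // Adds the composite index suggested above.
    public function up(): void
    {
        Schema::table('activities', function (Blueprint $table) {
            $table->index(['device_id', 'battery_level', 'created_at'], 'activities_device_battery_created_idx');
        });
    }

    public function down(): void
    {
        Schema::table('activities', function (Blueprint $table) {
            $table->dropIndex('activities_device_battery_created_idx');
        });
    }
};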
Try this (DB::raw() only builds an expression; DB::select() actually runs the SQL):
$record = DB::select("SELECT * FROM activities WHERE device_id = 1 AND battery_level IS NOT NULL ORDER BY created_at DESC LIMIT 1");
I'm trying to make a quiz using MySQL and PHP.
I want to fetch questions randomly from the database.
I have trouble with the RAND() function: sometimes it returns no value, and it also returns duplicates.
I tried to find a solution on the net but couldn't make it work.
This is the part of the code that generates the problem:
$link=mysqli_connect("localhost","root","","database");
$req="SELECT DISTINCT *
FROM `qst_s`
WHERE `id_qst` = ROUND( RAND()*49 ) + 1 AND `level` = '1' LIMIT 40";
$result=mysqli_query($link,$req);
$question=$result->fetch_assoc();
I have 50 level-1 questions in my database, by the way.
The simplest way of selecting random rows from the MySQL database is to use "ORDER BY RAND()" clause in the query.
SELECT * FROM `table` ORDER BY RAND() LIMIT 0,1;
The problem with this method is that it is very slow. The reason for it being so slow is that MySQL creates a temporary table with all the result rows and assigns each one of them a random sorting index. The results are then sorted and returned.
Your query would look like this:
$req="SELECT DISTINCT *
FROM `qst_s`
WHERE `level` = '1'
ORDER BY RAND()
LIMIT 40";
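For completeness, a sketch of running that query and collecting all 40 rows (the connection details are the ones from the question; looping with fetch_assoc is the part the original snippet was missing):

$link = mysqli_connect("localhost", "root", "", "database");

$req = "SELECT DISTINCT *
        FROM `qst_s`
        WHERE `level` = '1'
        ORDER BY RAND()
        LIMIT 40";

$result = mysqli_query($link, $req);

$questions = [];
while ($row = mysqli_fetch_assoc($result)) {
    $questions[] = $row;   // each $row is one random question
}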
I have a datetime column in my db table. How can I write a query to select all rows except one (the most recent)?
I would have used
ORDER BY col_name DESC LIMIT 1
if I were choosing only the most recent, but I actually require all but the most recent.
Thanks
Just select all rows but the first:
ORDER BY col_name DESC LIMIT 1, 18446744073709551615
See 13.2.9 SELECT Syntax, which explains the LIMIT clause.
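Put into a full statement, it could look roughly like this (the table name events, the column created_at, and the PDO connection are placeholders, not from the question):

$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
// Skip the single most recent row, keep everything after it.
$stmt = $pdo->query(
    "SELECT * FROM events
     ORDER BY created_at DESC
     LIMIT 1, 18446744073709551615"
);
$allButNewest = $stmt->fetchAll(PDO::FETCH_ASSOC);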
Title says:
MySQL Query to choose all but the most recent datetime
If you have duplicated dates you'll have to go for:
select * from t
where val != (select max(val) from t);
This is because if there are 2 max values then LIMIT will only filter the first one and you'll get the other max value in the result set.
I have a query that gets 10 random posts, and as you know this is a very slow and heavy query. Are there any alternatives that avoid the slowdown?
My current RAND() query:
SELECT * FROM posts ORDER BY RAND() LIMIT 10
From the MySQL documentation:
SELECT * FROM tablename ORDER BY RAND() LIMIT 1
works for small tables, but once the table grows larger than 300,000 records or so this will be very slow because MySQL will have to process ALL the entries from the table, order them randomly and then return the first row of the ordered result, and this sorting takes a long time. Instead you can do it like this (at least if you have an auto_increment PK):
SELECT MIN(id), MAX(id) FROM tablename;
Fetch the result into $a
//php code
$id=rand($a[0],$a[1]);
SELECT * FROM tablename WHERE id>='$id' LIMIT 1
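Glued together in PHP, that approach might look roughly like this (the connection details and the table name tablename are placeholders):

$link = mysqli_connect("localhost", "user", "pass", "database");

// 1. Get the id range.
$res = mysqli_query($link, "SELECT MIN(id), MAX(id) FROM tablename");
$a   = mysqli_fetch_row($res);

// 2. Pick a random id inside that range in PHP.
$id = rand((int) $a[0], (int) $a[1]);

// 3. Fetch the first row at or after that id (>= copes with gaps in the ids).
$res = mysqli_query($link, "SELECT * FROM tablename WHERE id >= $id LIMIT 1");
$row = mysqli_fetch_assoc($res);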
I would like to be able to pull back 15 or so records from a database. I've seen that using WHERE id = rand() can cause performance issues as my database gets larger. All solutions I've seen are geared towards selecting a single random record. I would like to get multiples.
Does anyone know of an efficient way to do this for large databases?
Further Edit and Testing:
I made a fairly simple table on a new database using MyISAM. I gave it 3 fields: autokey (an unsigned auto-increment key), bigdata (a large blob) and somemore (a medium int). I then applied random data to the table and ran a series of queries using Navicat. Here are the results:
Query 1: select * from test order by rand() limit 15
Query 2: select *
from
test
join
(select round(rand()*(select max(autokey) from test)) as val from test limit 15) as rnd
on
rnd.val=test.autokey;
(I tried both select and select distinct and it made no discernible difference)
and:
Query 3 (I only ran this on the second test):
SELECT *
FROM (
SELECT @cnt := COUNT(*) + 1,
@lim := 10
FROM test
) vars
STRAIGHT_JOIN
(
SELECT r.*,
@lim := @lim - 1
FROM test r
WHERE (@cnt := @cnt - 1)
AND RAND(20090301) < @lim / @cnt
) i
ROWS:       QUERY 1:   QUERY 2:   QUERY 3:
2,060,922   2.977s     0.002s     N/A
3,043,406   5.334s     0.001s     1.260s
I would like to do more rows so I can see how query 3 scales, but at the moment, it seems as though the clear winner is query 2.
Before I wrap up this testing and declare an answer, and while I have all this data and the test environment set up, can anyone recommend any further testing?
Try:
select * from table order by rand() limit 15
Another (and possibly more efficient) way would be to join against a set of random values. This should work if there's some contiguous integer key in the table. Here is how I would do it in Postgres (my MySQL is a bit rusty):
select * from table join
(select (random()*maxid)::integer as val from generate_series(1,15)) as rnd
on rnd.val=table.id;
where maxid is the highest id in the table. If id has an index, then this would mean only 15 index lookups, so it's very fast.
UPDATE:
Looks like there is no such thing as generate_series in MySQL. My fault. We don't need it actually:
select *
from
table
join
-- this just returns 15 random numbers.
-- I need `table` here only to produce rows for rand()
(select round(rand()*(select max(id) from table)) as val from table limit 15) as rnd
on
rnd.val=table.id;
P.S. If I don't want duplicates returned, I can use (select distinct [...]) in the random generator expression.
Update: Check out the accepted answer in this question. It's pure MySQL and even deals with even distribution.
The problem with id = rand() or anything comparable in PHP is that you can't be sure whether that particular ID still exists. Therefore, you need to work with LIMIT, and that can become slow for large amounts of data.
As an alternative to that, you could try using a loop in PHP.
What the loop does is
Create a random integer using rand(), in the range between 0 and the number of records in the database
Query the database whether a record with that ID exists
If it exists, add the number to an array
If it doesn't, go back to step 1
End the loop when the array of random numbers contains the desired number of elements
This method could cause a lot of queries in a fragmented table, but they should be pretty fast to execute. It may be faster than LIMIT rand() in certain situations.
The LIMIT method, as outlined by @Luther, is certainly the simplest code-wise.
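A minimal PHP sketch of that loop, assuming a posts table with an integer primary key id and a mysqli connection (all of those names are placeholders):

$link   = mysqli_connect("localhost", "user", "pass", "database");
$wanted = 15;
$ids    = [];

// Upper bound for the random ids (step 1's "number of records").
$row = mysqli_fetch_row(mysqli_query($link, "SELECT MAX(id) FROM posts"));
$max = (int) $row[0];

while (count($ids) < $wanted) {
    $candidate = rand(1, $max);

    // Step 2: does a row with that id exist (and haven't we picked it yet)?
    $res = mysqli_query($link, "SELECT id FROM posts WHERE id = $candidate");
    if (mysqli_num_rows($res) > 0 && !in_array($candidate, $ids, true)) {
        $ids[] = $candidate;      // step 3: keep it
    }
    // otherwise: back to step 1 with a new random number
}

// Fetch the chosen rows in one final query.
$res  = mysqli_query($link, "SELECT * FROM posts WHERE id IN (" . implode(",", $ids) . ")");
$rows = mysqli_fetch_all($res, MYSQLI_ASSOC);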
You could do a query for all the results (or however many, limited), then use mysqli_fetch_all followed by:
shuffle($a);
$a = array_slice($a, 0, 15);
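Spelled out, that could look roughly like this (the table name and connection details are placeholders; fetching everything only makes sense for reasonably small tables):

$link   = mysqli_connect("localhost", "user", "pass", "database");
$result = mysqli_query($link, "SELECT * FROM posts");

$a = mysqli_fetch_all($result, MYSQLI_ASSOC);  // all rows as assoc arrays
shuffle($a);                                   // randomise their order
$a = array_slice($a, 0, 15);                   // keep 15 of them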
For a large dataset doing
select * from table order by rand() limit 15
can be quite time- and memory-consuming.
If your data records happen to be numbered, you can put an index on the numbering column and do a
select * from table where no >= rand() limit 15
Or, even better, do the random number generation in your application and do
select * from table where no >= $rand and no <= $rand+15
If your data doesn't change too often, it might be worth adding such a numbering column to make the selection efficient.
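As a rough PHP sketch of that application-side variant (the numbering column no comes from the answer; the table name and the MAX(no) query are assumptions):

$link = mysqli_connect("localhost", "user", "pass", "database");

// Highest value of the numbering column.
$row = mysqli_fetch_row(mysqli_query($link, "SELECT MAX(no) FROM posts"));
$max = (int) $row[0];

// Generate the random starting point in the application...
$rand = rand(1, max(1, $max - 15));

// ...and take the next 15 consecutive rows by their number.
$res  = mysqli_query($link, "SELECT * FROM posts WHERE no >= $rand AND no <= $rand + 15");
$rows = mysqli_fetch_all($res, MYSQLI_ASSOC);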
Assuming MySQL supports nested queries and that operations on the primary key are fast, I'd try something like
select * from table where id in (select id from table order by rand() limit 15)