Selecting data from DB using LIMIT - php

I have a total of 5000 rows in my database. What I want to achieve is this: my supervisor script should select 1000 rows at a time, and a record that has already been selected should never be selected again. I could achieve this by toggling a status flag on each selected row, but is there a way to do it with LIMIT?
In the query below, I am selecting the data in batches of 1000. After the first 1000 rows are processed, the script must select the next 1000 unique rows. Is there a way to achieve that with LIMIT?
Index.php
$sql = "select * from `table` where `crude_at` between '".$date_from."'
and '".$date_to."' LIMIT 1000";
$res = $mysqli->query($sql);
while($row = mysqli_fetch_assoc($res)){
$no_of_counts = 0
//do something for the selected rows here
$no_of_counts++
if ($no_of_counts == '1000')
{
//fetch the next 1000 rows
}
}
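One way to do this with LIMIT alone is to page through the result set with an offset that advances by the batch size. A rough sketch, assuming the same $mysqli connection and date range, and assuming the table has a unique column such as `id` to order by (that column name is not in the original code):

$batch_size = 1000;
$offset = 0;

do {
    // ORDER BY a unique column so each batch is deterministic and non-overlapping
    $sql = "SELECT * FROM `table`
            WHERE `crude_at` BETWEEN '".$mysqli->real_escape_string($date_from)."'
                              AND '".$mysqli->real_escape_string($date_to)."'
            ORDER BY `id`
            LIMIT $batch_size OFFSET $offset";
    $res = $mysqli->query($sql);

    $rows_in_batch = 0;
    while ($row = $res->fetch_assoc()) {
        // do something for the selected row here
        $rows_in_batch++;
    }

    $offset += $batch_size;                  // advance to the next 1000 rows
} while ($rows_in_batch === $batch_size);    // a short batch means the end was reached

Note that a large OFFSET still makes MySQL scan and discard the skipped rows, so for very large tables keyset pagination (WHERE id > last_seen_id ... LIMIT 1000) scales better.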

Related

Flagging last overall row in MySQL query using LIMIT

I am retrieving results from a database in batches of 5, each time taking the last 'id' and returning it via jQuery so that only rows with a smaller 'id' are returned. I have a 'load more' button which should disappear after all rows are loaded, whether that means 5, 7, or 11 rows.
I have this query:
$query = $pdo->prepare("SELECT * FROM names WHERE id < ? ORDER BY id DESC LIMIT 5");
$query->execute([$_POST["id"]]);
while ($row = $query->fetch()) {
    echo "<div id='".$row["id"]."'>".$row["name"]."</div>";
}
The question is: is there an easy and performant way, in a single query, to add a column flagging whether each row is the last one? Considering I am using a LIMIT of 5, the query obviously also has to check for the potential existence of a sixth row.
Set the limit to 6 and check the number of returned rows.
For the while loop, use a counter:
$query = $pdo->prepare("SELECT * FROM names WHERE id < ? ORDER BY id DESC LIMIT 6");
$query->execute([$_POST["id"]]);
$n = 5;
// Parentheses around the assignment matter: && binds tighter than =,
// so without them $row would end up as a boolean.
while (($row = $query->fetch()) && $n-- > 0) {
    echo "<div id='".$row["id"]."'>".$row["name"]."</div>";
}
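To turn the sixth row into an explicit flag, another option is to fetch all six rows first and use the presence of the extra one to decide whether the 'load more' button stays. A minimal sketch along those lines; the $has_more variable and the data-last attribute are my own names, not part of the original code:

$query = $pdo->prepare("SELECT * FROM names WHERE id < ? ORDER BY id DESC LIMIT 6");
$query->execute([$_POST["id"]]);
$rows = $query->fetchAll(PDO::FETCH_ASSOC);

$has_more = count($rows) > 5;        // a 6th row exists, so this is not the last page
$rows = array_slice($rows, 0, 5);    // only ever display the first 5

foreach ($rows as $i => $row) {
    // flag the very last row overall (last page, last row on it)
    $is_last = !$has_more && $i === count($rows) - 1;
    echo "<div id='".$row["id"]."' data-last='".($is_last ? 1 : 0)."'>".$row["name"]."</div>";
}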

How do you do a count query in php with a where clause?

So I have this query:
$sql2 = "SELECT count(*) FROM comments WHERE YourUsername = '$MyUsername' AND PostId = '$PostId'";
if ($result2 = mysqli_query($conn, $sql2)) {
    // Return the number of rows in result set
    $rowcount = mysqli_num_rows($result2);
    echo $rowcount;
}
I have 2 rows in my database which meet the WHERE clause, but for some reason it keeps outputting 1. How do I make it display the actual count instead of 1, when in reality the count is 2 (and will grow as more rows are added)?
You're SELECTing the count of the rows in your first line, so the query returns a single result row that contains the count. mysqli_num_rows counts rows in the result set, and there is only one such row, which is why you always see 1. Fetch the value out of that row instead of using mysqli_num_rows.
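A minimal sketch of fetching that value, reusing $conn and the query from the question (the cnt alias is an addition):

$sql2 = "SELECT COUNT(*) AS cnt FROM comments WHERE YourUsername = '$MyUsername' AND PostId = '$PostId'";
if ($result2 = mysqli_query($conn, $sql2)) {
    $row = mysqli_fetch_assoc($result2);
    echo $row['cnt']; // the actual count, e.g. 2
}

As an aside, interpolating $MyUsername and $PostId straight into the SQL is open to injection; a prepared statement with bound parameters is safer.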

Querying a 100k record table, execution time exceeded

Every member has multiple records stored in the database. I need to sum one column for every user and get the highest/lowest value from another column. The table has over 100k records, and one user might have more than 2k records in it.
I tried this:
$query = $mysqli->query("SELECT DISTINCT `id` FROM `table`");
if($query){
$IDs = new SplFixedArray($query->num_rows);
$IDs = $query->fetch_all();
unset($query, $row);
set_time_limit(300);
foreach ($IDs as $key => $value) {
$query = $mysqli->query("SELECT SUM(price), dtime FROM `table` WHERE `id` = '".$value[0]."' ORDER BY dtime DESC");
if($query){
$row = $query->fetch_assoc();
print_r($row);
}
}
But is setting the time limit to 300 really the proper way of doing this? I also tried a prepared statement, only assigning the ID in the loop, and several other things. None of it works as I'd wish.
100k records really isn't that many; there should be no reason for this to take longer than 5 minutes.
Instead of getting a distinct list of IDs and iterating through them, querying these values for each ID, it would probably be better to do everything all at once, then iterate over your results to do what you need to.
select
`id`,
sum(`price`) as `sum_price`,
min(`dtime`) as `min_dtime`,
max(`dtime`) as `max_dtime`
from
`table`
group by
`id`
(this is assuming that the "other field" that you need to get the min and max of is dtime)
I'm not strong on the PHP side, but from a SQL perspective it's much, much faster to do things this way.
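For what it's worth, running that single grouped query from PHP could look roughly like this, reusing $mysqli from the question; the column aliases come from the query above:

$sql = "SELECT `id`,
               SUM(`price`) AS `sum_price`,
               MIN(`dtime`) AS `min_dtime`,
               MAX(`dtime`) AS `max_dtime`
        FROM `table`
        GROUP BY `id`";

$result = $mysqli->query($sql);
if ($result) {
    while ($row = $result->fetch_assoc()) {
        // one row per user: total price plus oldest and newest dtime
        print_r($row);
    }
    $result->free();
}

A single pass over the table replaces the per-ID queries, so the set_time_limit(300) call should no longer be necessary.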

PHP get max. 1000 entries from MySql

So my problem is the following:
I am developing a WordPress plugin, and I am getting values from a custom database table. Let's call the table devices. From the devices table I am currently getting all entries from the id column. But here is my problem: if this column has more than 1000 entries, I need to "split" them, meaning I first get the first 1000 entries, then the next 1000, and so on until I reach the end. Any ideas how to do this in PHP?
For example, this is how I get the ids now:
function hlp_getIds() {
    global $wpdb;
    $table_name = $wpdb->prefix.'devices';
    $devices = array();
    $sql = "SELECT id FROM $table_name";
    $res = $wpdb->get_results($sql);
    if ($res != false) {
        foreach ($res as $row) {
            array_push($devices, $row->id);
        }
    }
    return $devices;
}
So how can I change this so that it only fetches 1000 entries at a time instead of all the entries at once?
Use LIMIT:
select id
from device
order by id
limit 0, 1000
To get the next 1000, do:
select id
from device
order by id
limit 1000, 1000
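Applied to the plugin, hlp_getIds() could become a paged version along these lines; the function name hlp_getIdsBatch() and the $offset/$batch parameters are made up for the sketch:

function hlp_getIdsBatch($offset = 0, $batch = 1000) {
    global $wpdb;
    $table_name = $wpdb->prefix.'devices';
    $devices = array();
    // ORDER BY id keeps the batches stable; %d placeholders are filled in by $wpdb->prepare()
    $sql = $wpdb->prepare("SELECT id FROM $table_name ORDER BY id LIMIT %d OFFSET %d", $batch, $offset);
    $res = $wpdb->get_results($sql);
    if ($res) {
        foreach ($res as $row) {
            $devices[] = $row->id;
        }
    }
    return $devices;
}

// Usage: keep asking for the next 1000 ids until a batch comes back short
$offset = 0;
do {
    $ids = hlp_getIdsBatch($offset, 1000);
    // ... process $ids ...
    $offset += 1000;
} while (count($ids) === 1000);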

Delete old rows in table if maximum exceeded

I insert a row into a table every time a user opens any post on my site; this gives me a real-time 'what's happening on the site' feed.
mysql_query("INSERT INTO `just_watched` (`content_id`) VALUES ('{$id}')");
but now I have a problem, because I get over 100K hits every day, which means 100K new rows in this table every day. Is there any way to limit the table to a maximum of 100 rows, and if the maximum is exceeded, delete the oldest 90 and insert again, or something like that? I have no idea what the right way to do this is.
My table just_watched has two columns:
ID INT(11) AUTO_INCREMENT
content_id INT(11)
The easiest way that popped into my head would be to use PHP logic to delete and insert your information. Every time a user opens a new post you add a row to the database (this you are already doing).
The new part comes here:
Add a check before the insertion: before anything is inserted, first count all the rows. If the count does not exceed 100 rows, just add the new row.
If it does exceed 100 rows, first do a delete statement, THEN insert the new row.
Example (pseudocode):
$sql = "SELECT COUNT(*) FROM yourtable";
$count = $db -> prepare($sql);
$count -> execute();
if ($count -> fetchColumn() >= 100) { // If the count is over a 100
............... //Delete the first 90 leave 10 then insert a new row which will leave you at 11 after the delete.
} else {
.................. // Keep inserting until you have 100 then repeat the process
}
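A fuller sketch of that flow, assuming a PDO connection in $db and the just_watched table from the question (auto-increment ID plus content_id):

$count = $db->query("SELECT COUNT(*) FROM just_watched")->fetchColumn();

if ($count >= 100) {
    // Delete the 90 oldest rows (lowest IDs), keeping the newest 10
    $db->exec("DELETE FROM just_watched ORDER BY ID ASC LIMIT 90");
}

// Insert the new row either way
$insert = $db->prepare("INSERT INTO just_watched (content_id) VALUES (?)");
$insert->execute([$id]);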
More information on counting here. Then some more information on PDO here.
Hopefully this helps :)
Good luck.
Also information on how to set up PDO if you haven't already.
What I would do:
At 12:00 AM every night, run a cron job that deletes all rows from the past day. But that's just some advice. Have a good one.
Use this query to delete old rows while keeping only the newest 100:
DELETE FROM just_watched WHERE
ID NOT IN (SELECT ID FROM (SELECT ID FROM just_watched ORDER BY ID DESC LIMIT 100) AS newest)
(The extra derived table is needed because MySQL does not allow LIMIT directly inside an IN subquery, nor selecting from the table you are deleting from.)
You can run it via cron at whatever interval you like (every n hours, minutes, etc.).
$numRows = mysql_num_rows(mysql_query("SELECT ID FROM just_watched"));
if ($numRows > 100) {
    // Delete the 90 oldest rows (lowest IDs first)
    mysql_query("DELETE FROM just_watched ORDER BY ID ASC LIMIT 90");
}
mysql_query("INSERT INTO `just_watched` (`content_id`) VALUES ('{$id}')");
I guess this should work fine.
You can get the number of rows in your table with:
$size = mysql_num_rows($result);
With the size of the table you can check if it's getting too big, and then remove 90 rows:
// Get the 90 oldest rows
$query = "SELECT * FROM just_watched ORDER BY id ASC LIMIT 90";
$result = mysql_query($query);
// Go through them
while ($row = mysql_fetch_object($result)) {
    // Delete the row with that id
    $id = $row->id;
    mysql_query("DELETE FROM just_watched WHERE id = $id");
}
Another way would be to just delete an old row whenever you add a new row to the table. The only problem is that if something gets jammed, the table might get too big.
You may use a single statement that removes everything beyond the newest 100 rows. Note that MySQL's DELETE does not accept an offset in its LIMIT clause (so DELETE ... LIMIT 100, 9999999999999999 will not parse); a join against a derived table of the 100 newest IDs works instead:
DELETE jw FROM just_watched AS jw
LEFT JOIN (SELECT ID FROM just_watched ORDER BY ID DESC LIMIT 100) AS keep_rows
ON keep_rows.ID = jw.ID
WHERE keep_rows.ID IS NULL;
If you always run this query before you insert a new row, it'll do the job for you.
