Deleting one record from a table every 5 seconds - PHP

I want to delete one record every 5 seconds, and I have a cron job for it too. But once the cron starts, it deletes all the records at once.
Whether I use sleep(5) here or not makes no difference to the execution.
The code I'm using is below.
mysql_select_db($database_xm, $xm);
$query_ex = "SELECT * FROM table";
$ex = mysql_query($query_ex, $xm) or die(mysql_error());
$row_ex = mysql_fetch_assoc($ex);
$RecordCount = mysql_num_rows($ex);
for ($l = 0; $l < $RecordCount; $l++) {
    $query_ss = "delete from table2 limit 1";
    $ss = mysql_query($query_ss, $xm) or die(mysql_error());
    sleep(5);
    ob_flush();
}
How do I delete one record every 5 seconds?

Why don't you simply identify each record to delete from the information you already have in the result set? This way you can also control the order in which records are deleted. Note that mysql_fetch_assoc() returns one row per call, so fetch inside the loop:
while ($row = mysql_fetch_assoc($ex)) {
    $query_ss = "delete from table2 WHERE id = " . (int)$row['id']; // EXAMPLE
    $ss = mysql_query($query_ss, $xm) or die(mysql_error());
    sleep(5);
    ob_flush();
}

The minimum cron interval is 1 minute, so you have to do the looping inside the script itself, with a sleep(5) after every iteration.
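A minimal sketch of that approach, reusing the question's mysql_* connection ($xm): run the script from cron once per minute, and let twelve 5-second sleeps fill the minute.
// Invoked by cron every minute; 12 iterations x sleep(5) cover the interval.
for ($i = 0; $i < 12; $i++) {
    mysql_query("DELETE FROM table2 LIMIT 1", $xm) or die(mysql_error());
    sleep(5);
}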

SET ROWCOUNT 1 will limit the number of affected rows to 1.
For example:
-- Sets limit of rows to 1
SET ROWCOUNT 1
DELETE FROM table2
-- Sets limit back to default
SET ROWCOUNT 0
would do it. (Note that SET ROWCOUNT is SQL Server/Sybase syntax; in MySQL the equivalent is the DELETE FROM table2 LIMIT 1 the question already uses.)

Related

Selecting data from DB using LIMIT

I have a total of 5000 rows in my database. What I want to achieve is for my supervisor script to select 1000 rows at a time, and a record selected before shouldn't be selected again. I could achieve this by toggling a status flag on each selected row, but is there a way I can achieve this with LIMIT?
In the query below, I am selecting the data in batches of 1000. After the first 1000 rows are processed, the script must select another 1000 unique rows. Is there a way to achieve that with LIMIT?
Index.php
$sql = "select * from `table` where `crude_at` between '".$date_from."'
and '".$date_to."' LIMIT 1000";
$res = $mysqli->query($sql);
while($row = mysqli_fetch_assoc($res)){
$no_of_counts = 0
//do something for the selected rows here
$no_of_counts++
if ($no_of_counts == '1000')
{
//fetch the next 1000 rows
}
}
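One way to do this with LIMIT alone is to add an OFFSET and advance it by the batch size after each pass; a rough sketch along those lines, based on the question's query (an ORDER BY is needed for the paging to be deterministic, and any writes between passes would shift the offsets):
$offset = 0;
$batch  = 1000;
do {
    $sql = "SELECT * FROM `table` WHERE `crude_at` BETWEEN '" . $date_from . "'
            AND '" . $date_to . "' ORDER BY `id` LIMIT $batch OFFSET $offset";
    $res = $mysqli->query($sql);
    while ($row = $res->fetch_assoc()) {
        // do something for the selected rows here
    }
    $offset += $batch;                 // next pass starts where this one ended
} while ($res->num_rows == $batch);    // a short batch means we ran out of rows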

How to fix my code to reset rows when none are found

I grab random rows, 100 at a time, and once a row is used I mark its 'used' column with a 1. When there are no more rows marked 0 to pick from, I need to reset all the ones marked 1. Here is my code for this...
$result = mysqli_query($res, "SELECT id, unit_id, shown FROM units WHERE suspended = '0' AND used = '0' ORDER BY RAND() LIMIT 100");
while($myrow = mysqli_fetch_array($result)){
    if (!$result){
        $result2 = mysqli_query($res, "UPDATE units SET used='0' WHERE used = '1'");
    }
    else{
        // other code here
    }
}
$res->close();
$res->close();
I have also tried this:
if ($myrow[id] == ''){
But that does not work either. I just need to know when all of them have been marked used, so the script can reset 'used' back to 0 and start over again.
I have also tried moving the if statement outside the while and it still is not working.
It should be noted that this runs on a cron.
Set $i = 0; outside the while loop and add 1 to it inside. After the while loop, check $i: if it is less than 100, you have certainly reached the end of the list; but if $i is exactly 100, you need a further DB query to check, because there is a chance this result held the last 100 records and it is likewise the end of the list.
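A sketch of that suggestion applied to the question's code (the connection handle is the question's $res; row processing is elided):
$result = mysqli_query($res, "SELECT id, unit_id, shown FROM units
    WHERE suspended = '0' AND used = '0' ORDER BY RAND() LIMIT 100");
$i = 0;
while ($myrow = mysqli_fetch_array($result)) {
    // ... process the row, then mark it used = 1 ...
    $i++;
}
if ($i < 100) {
    // Fewer than 100 unused rows remained, so the pool is exhausted: reset it.
    mysqli_query($res, "UPDATE units SET used = '0' WHERE used = '1'");
}
// If $i == 100 exactly, a follow-up COUNT(*) WHERE used = '0' query is still
// needed to tell whether those were the last 100.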

Mysql query eating large amount of space using limit

I'm trying to query a very large table, some 35+ million rows, to process each row one by one, because I can't pull the full table into PHP at once (out of memory). I'm using LIMIT in a loop, but every time it tries to query at the 700K mark it throws an out-of-disk-space error (error 28):
select * from dbm_new order by id asc limit 700000,10000
I'm pulling 10K rows at a time into PHP, and even if I make it pull 100K rows, it still throws the same error when trying to start at row 700K. I can see it's eating a huge amount of disk space.
In PHP I'm freeing the result set after each loop iteration:
mysql_free_result ($res);
But it's not a PHP-related issue; I've run the query directly in MySQL and it gives the same error.
Why does starting the LIMIT at the 700K mark eat up so much disk space? I'm talking over 47 GB here; surely it doesn't need that much. What other options do I have?
Here's the code:
$start = 0;
$increment = 10000;
$hasResults = true;
while ($hasResults) {
    $sql = "select * from dbm_new order by id asc limit $start,$increment ";
    ....
}
You can use the PK instead of OFFSET to get chunks of data. With a large OFFSET, MySQL still has to produce and step over all the skipped rows (sorting and materialising that whole intermediate result is what's eating your disk), whereas a WHERE id > ... seek jumps straight to the next chunk through the index:
$start = 0;
while(1) {
    $sql = "SELECT * FROM table WHERE id > $start ORDER BY id ASC LIMIT 10000";
    //get records...
    if(empty($rows)) break;
    foreach($rows as $row) {
        //do stuff...
        $start = $row['id'];   // remember the last id seen; the next chunk starts after it
    }
}

How to only select row if table timestamp has been updated

Currently I'm selecting the most recently updated row using the following PHP block. I have an AJAX loop which runs this block every few seconds to return the feed. I want to echo false to the AJAX call when the latest timestamp hasn't changed, so that it doesn't duplicate results and fill my div (#feed) with identical content.
<?php
require_once 'db_conx.php';
$Result = mysql_query("SELECT * FROM profiles ORDER BY lastupdated desc limit 1") or die (mysql_error());
while($row = mysql_fetch_array($Result)){
echo $row['name'];
}
mysql_close($con);
?>
Somewhere in the session, or in the requestor, you need to store the last fetched time. It would be better to store it as a session variable (this, I presume, is client-specific, since different clients will have loaded at different times) and then fetch all records whose lastupdated time is greater than the last_fetched time.
Every time entries are fetched from the DB, just update last_fetched to the current timestamp.
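A rough sketch of that idea with PHP sessions (the session key name is an assumption; the mysql_* style matches the question):
<?php
session_start();
require_once 'db_conx.php';

// When this client last fetched; default to the epoch on the first visit.
$last = isset($_SESSION['last_fetched']) ? $_SESSION['last_fetched'] : '1970-01-01 00:00:00';

$Result = mysql_query("SELECT * FROM profiles
    WHERE lastupdated > '" . mysql_real_escape_string($last) . "'
    ORDER BY lastupdated ASC") or die(mysql_error());

if (mysql_num_rows($Result) == 0) {
    echo 'false';                        // nothing new: tell the AJAX loop to skip
} else {
    while ($row = mysql_fetch_array($Result)) {
        echo $row['name'];
        $_SESSION['last_fetched'] = $row['lastupdated']; // ASC order ends on the newest row
    }
}
?>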
If you are running this every 5 seconds, I would do something like
$Result = mysql_query("SELECT * FROM profiles WHERE lastupdated > ADDTIME( NOW( ) , '-00:00:05.00000' ) ORDER BY lastupdated desc limit 1") or die (mysql_error());
$num_rows = mysql_num_rows($Result);
if ($num_rows == 0) {
    return false;
} else {
    return true;
}
This will give you any rows that have been updated in the last 5 seconds, if it is older than this it should have been picked up in the last run.
Hope this helps
When you retrieve your results from the server I'm guessing you store the timestamp of the last query in some PHP variable in order to compare the returned results of the next query.
I would concatenate the timestamp into your query as a WHERE clause, so your SQL query would become something like (note the quotes around the timestamp value):
$Result = mysql_query("SELECT * FROM profiles WHERE lastupdated > '" . $timestamp . "' ORDER BY lastupdated desc limit 1") or die (mysql_error());
It's also worth noting that when this executes if there has been more than one profile updated since the last update cycle then it will only return the most recently updated row.
Does this answer your question?

Delete old rows in table if maximum exceeded

I do an insert into this table every time a user opens a post on my site; this is how I get a real-time 'What's happening on the site' feed:
mysql_query("INSERT INTO `just_watched` (`content_id`) VALUES ('{$id}')");
But now I have a problem: with over 100K hits every day, that is 100K new rows in this table every day. Is there any way to limit the table to a maximum of 100 rows, and if the maximum is exceeded, delete the oldest 90 and insert again, or something like that? I have no idea what the right way to do this is.
My table just_watched:
ID          INT(11) AUTO_INCREMENT
content_id  INT(11)
The easiest way that popped into my head would be to use PHP logic to delete and insert your information. Every time a user opens a post, you add a row to the database (this you are already doing).
The new stuff comes here:
Add a check before the insertion: before anything is inserted, first count all the rows. If the count does not exceed 100 rows, just add the new row.
If it does exceed 100 rows, first run a delete statement, THEN insert the new row.
Example (pseudo code):
$sql = "SELECT COUNT(*) FROM yourtable";
$count = $db -> prepare($sql);
$count -> execute();
if ($count -> fetchColumn() >= 100) { // If the count has reached 100
............... // Delete the oldest 90 (leaving 10), then insert the new row, ending at 11.
} else {
.................. // Under the limit: just insert; the delete branch kicks in at 100.
}
}
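Fleshed out, that pseudo code might look something like this (assuming $db is the PDO connection and $id is the content id from the question):
$count = $db->prepare("SELECT COUNT(*) FROM just_watched");
$count->execute();

if ($count->fetchColumn() >= 100) {
    // At the limit: drop the 90 oldest rows before inserting.
    $db->exec("DELETE FROM just_watched ORDER BY ID ASC LIMIT 90");
}

// Either way, insert the new row with a bound parameter.
$insert = $db->prepare("INSERT INTO just_watched (content_id) VALUES (?)");
$insert->execute(array($id));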
Hopefully this helps :)
Good luck.
What I would do: at 12:00 AM every night, run a cron job that deletes all rows from the past day. But that's just some advice. Have a good one.
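Note that just_watched as described only has ID and content_id, so this would mean adding a timestamp column first; a hypothetical sketch of the nightly query, assuming a created_at DATETIME column:
-- hypothetical: assumes a created_at DATETIME column was added to just_watched
DELETE FROM just_watched WHERE created_at < CURDATE();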
Use this query to delete the old rows while keeping the latest 100 (MySQL does not allow LIMIT inside an IN subquery, hence the extra derived table):
DELETE FROM just_watched
WHERE ID NOT IN (SELECT ID FROM (SELECT ID FROM just_watched ORDER BY ID DESC LIMIT 100) AS t)
You can run it via cron at whatever interval you like (hours, minutes, etc.).
$numRows = mysql_num_rows(mysql_query("SELECT ID FROM just_watched"));
if ($numRows > 100){
    mysql_query("DELETE FROM just_watched ORDER BY ID ASC LIMIT 90"); // delete the 90 oldest
}
}
mysql_query("INSERT INTO `just_watched` (`content_id`) VALUES ('{$id}')");
I guess this should work fine.
You can get the number of rows in your table with:
$size = mysql_num_rows($result);
With the size of the table, you can check if it's getting too big, and then remove 90 rows:
// Get the 90 oldest rows
$query = "SELECT * FROM just_watched ORDER BY id ASC LIMIT 90";
$result = mysql_query($query);
// Go through them
while($row = mysql_fetch_object($result)) {
    // Delete the row with this id (fetch_object returns an object, so use ->id)
    $id = $row->id;
    mysql_query("DELETE FROM just_watched WHERE id = $id");
}
Another way would be to just delete an old row whenever you add a new row to the table. The only problem is that if something gets jammed, the table might get too big.
You may be tempted to use
DELETE FROM just_watched ORDER BY id DESC LIMIT 100, 9999999999999999;
but MySQL's DELETE only accepts a plain row count in LIMIT, not an offset, so that fails with a syntax error. To delete all rows beyond the newest 100, anchor on the 100th-newest id instead:
DELETE FROM just_watched
WHERE id < (SELECT id FROM (SELECT id FROM just_watched ORDER BY id DESC LIMIT 99, 1) AS t);
If you always run this before you insert a new row, it'll do the job for you.
