I run a points system on my site, so I need to keep logs of my users' different actions in a database. The problem is that I have a lot of users, and keeping all the records permanently may overload the server... Is there a way to keep only 10 records per user and automatically delete older entries? Does MySQL have some function for this?
Thanks in advance
You can add a trigger that takes care of removing old entries.
For instance,
DELIMITER //
CREATE DEFINER='root'@'localhost' TRIGGER afterMytableInsert AFTER INSERT ON MyTable
FOR EACH ROW
BEGIN
  -- The extra derived table is needed because MySQL does not support
  -- LIMIT inside an IN (...) subquery.
  DELETE FROM MyTable WHERE user_id = NEW.user_id AND id NOT IN
    (SELECT id FROM
      (SELECT id FROM MyTable WHERE user_id = NEW.user_id
       ORDER BY action_time DESC LIMIT 10) AS keep_rows);
END//
DELIMITER ;
Caveat: MySQL will not let a trigger modify the table it fires on (error 1442), so if this errors at runtime, run the same DELETE from application code or a scheduled event instead.
Just run an hourly cron job that deletes the 11th through nth records for each user.
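A minimal sketch of such a pruning query, assuming MySQL 8+ (for window functions) and an action_time column; all names are illustrative:
DELETE t FROM MyTable t
JOIN (
  -- rank each user's rows, newest first
  SELECT id, ROW_NUMBER() OVER (PARTITION BY user_id ORDER BY action_time DESC) AS rn
  FROM MyTable
) ranked ON ranked.id = t.id
WHERE ranked.rn > 10;  -- everything past the 10 newest per user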
Before inserting a record, you could check how many rows the user already has. If they have >= 10, delete the oldest one, then insert the new one.
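A rough sketch of that flow in plain SQL (user_actions and its columns are assumed names; the count check happens in application code):
SELECT COUNT(*) FROM user_actions WHERE user_id = 42;
-- if the count came back >= 10, drop the oldest row first:
DELETE FROM user_actions
WHERE user_id = 42
ORDER BY action_time ASC
LIMIT 1;
-- then insert the new record as usual:
INSERT INTO user_actions (user_id, action, action_time)
VALUES (42, 'earned_points', NOW());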
If your goal is to have the database ensure that a given table never has more than N rows per a given subkey (user), then the correct way to solve this is one of the following:
Use stored procedures to manage inserts in the table.
Use a trigger to delete older rows after an insert.
If you're already using stored procedures for data access, then modifying the insert procedure makes the most sense (see the sketch below); otherwise a trigger is the easiest solution.
Alternatively, if your goal is to periodically remove old data, then a cron job that calls a stored procedure to prune old rows makes the most sense.
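A minimal sketch of the stored-procedure option, assuming a user_actions table with id, action, and action_time columns (all names here are illustrative, not from the question):
DELIMITER //
CREATE PROCEDURE add_action(IN p_user_id INT, IN p_action VARCHAR(64))
BEGIN
  INSERT INTO user_actions (user_id, action, action_time)
  VALUES (p_user_id, p_action, NOW());
  -- prune: keep only the 10 newest rows for this user; the derived table
  -- sidesteps MySQL error 1093 (a DELETE cannot otherwise subquery its
  -- own target table)
  DELETE FROM user_actions
  WHERE user_id = p_user_id
    AND id NOT IN
      (SELECT id FROM
        (SELECT id FROM user_actions
         WHERE user_id = p_user_id
         ORDER BY action_time DESC LIMIT 10) AS keep_rows);
END//
DELIMITER ;
The application then runs CALL add_action(42, 'earned_points'); instead of a raw INSERT.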
When you are inserting a new record for a user, just run a query like this first (don't forget the WHERE condition):
DELETE FROM tablename WHERE userID = 'currentUserId' AND id NOT IN
  (SELECT id FROM (SELECT id FROM tablename WHERE userID = 'currentUserId'
    ORDER BY created_at DESC LIMIT 9) AS keep_rows);
(MySQL's DELETE does not accept a LIMIT offset, so the nine newest rows are kept via a derived table instead; created_at is an assumed ordering column.)
After that you can insert the new data. This keeps each user at ten records at most.
INSERT INTO tablename VALUES(....)
(TOP is SQL Server syntax; in MySQL use LIMIT inside a derived table, which also lets the DELETE reference its own table.)
DELETE FROM Table WHERE ID NOT IN (SELECT ID FROM (SELECT ID FROM Table WHERE USER_ID = 1 ORDER BY ID DESC LIMIT 10) t) AND USER_ID = 1
Clearer Version
DELETE FROM Table
WHERE ID NOT IN
(
  SELECT ID FROM
  (
    SELECT ID FROM Table
    WHERE USER_ID = 1
    ORDER BY ID DESC
    LIMIT 10
  ) t
)
AND USER_ID = 1
I have this query in PHP. It's an INSERT ... SELECT copying from table2, but I need to get the IDs of the newly created rows and store them in an array. Here is my code:
$sql = "INSERT INTO table1 SELECT distinct * from table2";
$db->query($sql);
I could reverse the flow, starting with a SELECT on table2 and doing single-row inserts, but that would slow down the script on a big table. Ideas?
You could lock the table, insert the rows, get the ID of the last item inserted, and then unlock; that way you know the IDs will be contiguous, as no other concurrent user could have changed them. Locking and unlocking is something you want to use with caution, though.
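Roughly, assuming table1 has an auto-increment id column (a sketch, not drop-in code):
LOCK TABLES table1 WRITE, table2 READ;
INSERT INTO table1 SELECT DISTINCT * FROM table2;
-- LAST_INSERT_ID() is the first id generated by a multi-row insert and
-- ROW_COUNT() the number of rows inserted, so the new ids are that range
SELECT LAST_INSERT_ID() AS first_id, ROW_COUNT() AS inserted_rows;
UNLOCK TABLES;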
An alternative approach could be to use one of the columns in the table: either an 'updated' datetime column, or an insert-id column (into which you put a value that is the same across all of your inserted rows).
That way you can do a subsequent SELECT of the IDs back out of the database matching either the updated time or your chosen insert ID.
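For example, with an assumed extra batch_id column on table1 (the column list is illustrative):
SET @batch := UUID();
INSERT INTO table1 (col1, col2, batch_id)
SELECT DISTINCT col1, col2, @batch FROM table2;
-- fetch the ids of exactly the rows this run created
SELECT id FROM table1 WHERE batch_id = @batch;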
Need some help from you guys. I'm using MySQL. My application keeps user information in multiple databases, each containing many tables. I need to delete the information for a given user id from all the tables across the different databases.
I have written multiple delete queries in one file, connecting to all the databases (at the top of the file) at the same time. I need to delete 4000 records related to these users, but so far only 150 records have been deleted (the script has been running since yesterday night).
My requirement is to delete all the users belonging to one group. So what I have done is:
Get the group id
Check if this group id exists, then retrieve all the users under this group.
$sql = "select distinct user_id, fname, sname, email from ps.user_group_link where group_id in (SELECT group_id from ps.groups_main where tree_path like '".$path."%')";
Now I connect to the 3 databases:
$dbt = db_connect('dB1','w');
$dbm = db_connect('dB2','w');
$dbmp = db_connect('dB3','w');
Then, in a foreach loop, I run all the delete queries against the tables in the 3 databases, e.g. (there are 20-30 delete statements like the one below):
$delete_classroom_records_sql = "DELETE FROM `table1`.classroom_records WHERE user_id=". $user;
$delete_classroom_records_res = $dbt->query($delete_classroom_records_sql);
But it seems the queries are taking too long to delete. Is there anything I should take care of to make the queries run faster?
Thank you
Check that an index exists on user_id in each table (a foreign key would create one implicitly); without it, every DELETE has to scan the whole table.
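For instance, using the table from the question:
SHOW INDEX FROM `table1`.classroom_records;
-- if user_id is not listed, each DELETE scans the whole table; add an index:
ALTER TABLE `table1`.classroom_records ADD INDEX idx_user_id (user_id);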
So, I have a table A in which, each time a user sends an image, a record is created storing the time it was uploaded, the username of the user, and the image's number among all the images uploaded over time.
I need to make a second table B that will store the number of images uploaded per user together with the user name. I need this table B to be updated whenever a new entry is generated in A.
I found that a trigger can be created for this; nevertheless, I'm having a rough time finding an example that suits my needs. Does anyone know a way of doing what I want?
Just update table b with a SELECT COUNT of the rows inserted into a for the current user NEW.userid (userid is your column name, or whatever name you have there, and NEW is MySQL's fixed reference to the row being inserted):
CREATE TRIGGER img_sum AFTER INSERT ON a
FOR EACH ROW
  UPDATE b
  SET b.total = (SELECT COUNT(*) FROM a WHERE a.userid = NEW.userid)
  WHERE b.userid = NEW.userid;
From what you have described, I don't think you need a second table. You can just count the number of times a username occurs, and you will get the number of images that user has uploaded.
You can get the per-user counts with something like this:
SELECT username, COUNT(*) AS images_uploaded FROM table_name GROUP BY username;
If you still need to create 2 tables, you might want to take a look at procedures and how they work.
Let's say we have 3 tables for this case:
- users(id, username, email ....),
- user_images(id, userId, image_num, date_uploaded)
- user_images_count(id, user_name, images_count)
The user_images_count table is initially empty. We first have to populate it with a query like this:
INSERT INTO user_images_count (user_name, images_count)
SELECT (SELECT username FROM users WHERE users.id = ui.userId) AS username,
       COUNT(ui.userId) AS counter
FROM user_images ui
GROUP BY ui.userId;
Then we must immediately create the trigger that will process every INSERT into the user_images table:
DELIMITER //
CREATE TRIGGER `count_user_images` AFTER INSERT ON `user_images`
FOR EACH ROW BEGIN
  DECLARE u_name TINYTEXT DEFAULT '';
  SET u_name = (SELECT username FROM users WHERE id = NEW.userId LIMIT 1);
  IF (u_name != '') THEN
    UPDATE user_images_count SET images_count = images_count + 1 WHERE user_name = u_name;
  END IF;
END//
DELIMITER ;
These two steps (populating user_images_count and creating the trigger) must be performed back to back, ideally while writes to user_images are paused, so that no inserts slip in between.
I've created similar triggers on my local databases. They work pretty well. )))
My professor wants us to create a web-based comment system wherein a user can post up to 3 comments; if the user decides to create another one when he already has 3 comments under his name in the database, the program should delete the oldest one and save the new one.
What I thought about was to fetch the rows under the user's name and, if there are 3 or more, delete the oldest row where username = session user and insert the new record. Although this was just in theory, is this the best way to go about it? Do you guys have any other suggestions? How exactly do I pick the rows to remove? Do I base it off the highest comment_id?
Read: Leave only first 50 records in SQL database and delete the rest
So basically do this:
Create an auto-increment id column on the comments table.
Then:
DELETE FROM comments
WHERE username = 'sessionUser'  -- scope the cleanup to the current user
  AND id NOT IN (
    SELECT id FROM (
      SELECT id
      FROM comments
      WHERE username = 'sessionUser'
      ORDER BY date DESC
      LIMIT 3
    ) s
  )
Alternatively, do not delete the rows in the database at all; just put a LIMIT 0, 3 on your SELECT query, with an ORDER BY on a create date.
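Something like this (a sketch; the column names are assumed):
SELECT comment_id, comment_text
FROM comments
WHERE username = 'sessionUser'
ORDER BY created_at DESC
LIMIT 0, 3;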
I'm thinking of implementing a view history for my WordPress blog, where users can view their previously viewed articles as a list on their account page.
I would like to limit this to 24 unique page-history entries per user at any point in time, meaning that if the number of articles exceeds 24, the oldest article's row would be deleted and the new article added to the table.
I'm using PHP and MySQL.
Here are my current thoughts on implementation:
Create a table with user_id and post_id columns
When user views an article, insert new row into the table
Select the rows with the current user_id, and if the number of rows is more than 24,
Delete the oldest row
I'm not sure if this is the best method, since it's 3 additional database queries per page view, which is pretty heavy.
Is there a better way to do this?
The idea is good. Improve it by updating the oldest row instead of deleting it and then adding a new one. ;)
Also make a single read query and a single write query.
Make a query like this one:
SELECT (SELECT COUNT(*) FROM recentArticles WHERE userID = 23)
AS NoOfArticles, articleID, timestamp
FROM recentArticles
WHERE userID = 23
ORDER BY timestamp ASC
LIMIT 1;
If NoOfArticles < 24, then execute an INSERT query; otherwise execute an UPDATE against that articleID (which the ORDER BY makes the oldest entry).
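The write side might look like this (a sketch; post_id is an assumed column, and 789 stands for the articleID returned by the read query):
-- fewer than 24 entries: just record the view
INSERT INTO recentArticles (userID, post_id, timestamp)
VALUES (23, 456, NOW());
-- already at 24: recycle the oldest row instead
UPDATE recentArticles
SET post_id = 456, timestamp = NOW()
WHERE userID = 23 AND articleID = 789;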
You could always implement this using cookies, keying on which pages have been visited and keeping a running list on the user's PC. This would reduce traffic to the site, put the processing on the user's side, and could be easier to implement.
That being said, I agree with the other answer about updating the oldest entry if the table is full. This eliminates the need to delete and then add a row. Key off a time-date stamp to sort the entries when you're displaying them and to figure out which page is the oldest (and needs to be updated if there are already 24 pages).
You can merge the two queries into one; this way, you will save execution time in your script. So, basically, DELETE a row only if the post count is more than 24. You can modify this query to your exact needs, but this is one way to think about it.
DELETE FROM `table_name`
WHERE id = (
    SELECT last_id FROM (
        -- MIN(id) is the oldest row; NULL (so no match) when count <= 24.
        -- The derived table sidesteps MySQL error 1093 (a DELETE cannot
        -- otherwise reference its own target table in a subquery).
        SELECT (CASE WHEN COUNT(id) > 24 THEN MIN(id) END) AS last_id
        FROM `table_name`
        WHERE user_id = 'XX'
    ) t
);