I have a MySQL database, or more specifically, a MySQL table in which I store IP addresses.
This is because I limit the number of messages being sent from my website.
I simply check if the IP is in the table, and if it is, I tell the user to "slow down".
Is there any way to make this MySQL table only store a row (a record) for x minutes?
Other solutions are also appreciated...
No, but you can use a TIMESTAMP field to store when the row was inserted / modified and occasionally delete rows that are older than x minutes.
DELETE FROM your_table
WHERE your_timestamp < NOW() - INTERVAL 5 MINUTE;
To solve your actual problem, though, I'd suggest having a table with a row for each user and the last time they sent a message. Assuming it is indexed correctly and your queries are efficient, you probably won't ever need to delete any rows from this table, except perhaps if you use a foreign key to the user table and delete the corresponding user. When a user sends a message, insert a row if one doesn't exist yet, otherwise update the existing row (you can use, for example, the MySQL extension REPLACE for this if you wish).
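A minimal sketch of that per-user approach, assuming a hypothetical last_message table keyed by user_id (the table, the column names and the 5-minute window are illustrative, not from the question):

-- one row per user; the primary key lets REPLACE act as "insert or overwrite"
CREATE TABLE last_message (
    user_id INT NOT NULL PRIMARY KEY,
    sent_at DATETIME NOT NULL
);

-- before accepting a message: has this user posted in the last 5 minutes?
SELECT 1 FROM last_message
WHERE user_id = 42 AND sent_at > NOW() - INTERVAL 5 MINUTE;

-- after accepting a message: record the send time (insert or overwrite)
REPLACE INTO last_message (user_id, sent_at) VALUES (42, NOW());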
I would recommend adding a time condition to the WHERE clause of the SELECT you use to "simply check if the IP is in the table":
SELECT * FROM your_table WHERE ip = <whatever> AND timestamp > NOW() - INTERVAL 3 MINUTE
Then maybe empty out that table once every night.
I'd make a column that has the timestamp of the last sent message and another that has the total number of posts. Before updating the table check if at least X minutes has passed since the last post. If so, change the total number of posts to 1, otherwise increment the value by 1.
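One way to express that check-and-update in a single statement, offered only as a sketch (the rate_limit table with ip as its primary key and the 10-minute window are assumptions for illustration):

-- reset the counter when the window has passed, otherwise increment it;
-- post_count is assigned before last_post, so the IF() still sees the old last_post
INSERT INTO rate_limit (ip, last_post, post_count)
VALUES ('203.0.113.5', NOW(), 1)
ON DUPLICATE KEY UPDATE
    post_count = IF(last_post < NOW() - INTERVAL 10 MINUTE, 1, post_count + 1),
    last_post = NOW();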
One approach that doesn't involve deleting the IP addresses after a certain interval is to store the addresses as "temporal" data, i.e. records that are only valid for a certain period.
Simplest way to do that would be to add a timestamp column to the table and, when entering an IP, capture either the time it was entered into the table, or the time after which it is no longer being "limited".
The code that grabs IPs to be limited then checks the timestamp to see whether:
the IP was recorded longer ago than a certain threshold (e.g. if you recorded the IP more than an hour ago, ignore it), or
the expiry date stored there has already passed (e.g. if an IP's timestamp says 2010-11-23 23:59:59 and that is in the past, ignore it),
depending on what you choose the timestamp to represent.
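Either interpretation is a one-line check in SQL. A hedged sketch, assuming a limited_ips table with the illustrative column names below:

-- interpretation 1: the column records when the IP was added to the table
SELECT 1 FROM limited_ips
WHERE ip = '203.0.113.5' AND added_at > NOW() - INTERVAL 1 HOUR;

-- interpretation 2: the column records when the limit stops applying
SELECT 1 FROM limited_ips
WHERE ip = '203.0.113.5' AND limited_until > NOW();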
The other solutions here using a timestamp and a cron job are probably your best option, but if you insist on MySQL handling this itself, you could use Events. They're like cron jobs, except MySQL handles the scheduling itself. They require MySQL 5.1+, though.
I need some help with database management. I am trying to retrieve data from my database (filtering it by the created_at field).
There is no problem when I am retrieving data that was created today.
For example today is 4/17. When I run the insert function today, the value for created_at will be 4/17 as well. So when I go to my web page and display data for 4/17, the data will be right.
But let's say I forgot to fetch the data for 4/15, and I need to fetch it today. When I insert that data into my database now, created_at will be 4/17, even though the data itself is actually for 4/15.
Now, when I go to my web page and display data for 4/15, I will get nothing.
As a workaround, I added a date field to my table, which contains a specified date, unlike the created_at field that takes the server's date. I now use the date field to filter the data on my web page.
However, I think this is a somewhat redundant or inefficient approach. Does anyone have any suggestions?
The accepted answer solves the question as asked, but this is an XY problem: it's probably not the way to solve the actual problem.
There are lots of reasons for putting the current datetime into a database (rather than a datetime which is intrinsic in the data, such as an appointment or a date of birth). While this shouldn't be used for auditing purposes, it is handy for debugging and for dealing with optimistic locking.
But here, you seem to be looking for a transaction control mechanism - a way of identifying which records have been subjected to some action. If it is important to maintain a record of the sequence in which the records were processed, then a date, or even a datetime, or even a millisecond timestamp may not be adequate. What happens when you need to apply this operation more than once per day? What if it fails halfway through and you need to resume the operation? This mechanism also precludes the notion that there may be more than two states for a record.
Assuming the thing which is being done with the record is part of an ACID transaction, then there are 2 states to consider - before and after. Your data domain should explicitly describe those two states (using a null/non-null date value is merely implicit). If the transaction is not atomic then there will likely be more states to consider.
In the case of MySQL I would implement this as an enum datatype (with a non-null constraint). But more generally I seek to avoid a situation where data is being updated like this by using synchronous operations wrapped in transactions.
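For instance, a hedged sketch of that enum approach (the table name, the state names and the example UPDATE are hypothetical):

-- an explicit processing state instead of a null/non-null date
ALTER TABLE records
    ADD COLUMN processing_state ENUM('pending', 'processed') NOT NULL DEFAULT 'pending';

-- flip the state inside the same transaction that performs the work on the record
UPDATE records SET processing_state = 'processed' WHERE id = 123;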
Since you are using Laravel, you can simply override the created_at value when creating your model. So for example, you can do the following:
$myModel->created_at = Carbon::parse('2019-04-15');
$myModel->save();
This would set the created_at value to April 15th, not today. Hence you don't need a second date column in your table.
UPDATE
Nonetheless, if you need the time part to still reflect the current time, you can do the following:
$myModel->created_at = Carbon::now()->setYear(2019)->setMonth(4)->setDay(15);
$myModel->save();
So, I've done quite a bit of googling on this topic, and I just can't find an answer. Basically, I'm looking to make a small website that will pull information from an HTML form, send it to a database, and then, after two hours, the data will automatically delete itself. I have a basic theory on how this could work, but I can't figure out how to do it: I could take the current date and time, add two hours to it, then put that time into an "expires" column in the database. Once the current time passes the one in the expires column, the data would be removed from the database. Sorry if this is a very "noobish" question; I'm still a bit new to databases with PHP.
Anyway, any help would be much appreciated! Thanks!
You could add a new timestamp column to your table which will automatically record when the row was created, like so:
CREATE TABLE t1 (
    # your existing columns defined as before, plus this new column
    ts_created TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
Now every time you create a row on this table, MySQL does all the work of recording when it was created.
Assuming you may not be able to create a cron job on your host, you could then add the deletion code in the most obvious place in your existing site code to do the removal.
// remove old stale data
$sql = "DELETE FROM user
        WHERE ts_created < DATE_ADD(NOW(), INTERVAL -2 HOUR)";
if ( ! $mysqli->query($sql) ) {
    // log $mysqli->error somewhere
}
Although a cron job seems like a good idea at first sight, in order to make sure this table is always accurate you would have to run it every 30 seconds or maybe even more often. That would get in the way of other activity on this table: if the site was busy, that could be a problem; if it was not busy, you would just be running the cron unnecessarily most of the time.
If you add the deletion code just before you present this information to the user at least it would only be run when required and you would also ensure that the table was always accurate at the time the data was presented to the user.
You can ensure the scheduler starts when MySQL is launched with the command-line option --event-scheduler=ON or setting event_scheduler=ON in your MySQL configuration file (my.cnf or my.ini on Windows).
Run this statement in MySQL:
SET GLOBAL event_scheduler = ON;
Create a MySQL event using the following. It behaves like a cron job, but it is scheduled and executed by the MySQL server itself at the interval you specify.
CREATE EVENT e_hourly
ON SCHEDULE
    EVERY 1 HOUR
COMMENT 'Clears out sessions table each hour.'
DO
    DELETE FROM table_name
    WHERE UNIX_TIMESTAMP(NOW()) - UNIX_TIMESTAMP(remove_time) > 120;
http://dev.mysql.com/doc/refman/5.1/en/create-event.html
http://www.sitepoint.com/how-to-create-mysql-events/
Pardon my explanation - I have just implemented this myself and it worked.
Just add a column remove_time (DATETIME) and set it to the time you want the row to be deleted. Then use cron (configuration depends on the web hosting you have) to run this query (probably as part of a PHP script):
DELETE FROM your_table WHERE remove_time <= NOW()
You can configure cron to run every minute (or less/more, depending on your needs).
Try implementing a cron job that runs automatically at a specified interval to check for and delete the rows whose created_at is more than 2 hours older than the current time.
On how to implement cron, check Skilldrick's answer here
I'm trying to create a computer reservation system where a user chooses a computer and selects how long they will be using that PC. During that time, other people can't reserve the PC. I need to find a solution for how to automatically delete all rows containing reserved PCs after their time expires. Thank you for the advice.
The common way to handle this is to store an expires_at timestamp on the reservation row. Then your query to find any reservations that are still active would have WHERE expires_at > NOW() or something similar.
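A brief sketch of that approach, assuming a reservations table with pc_id and expires_at columns (all names here are illustrative):

-- is this PC currently reserved?
SELECT 1 FROM reservations
WHERE pc_id = 7 AND expires_at > NOW();

-- optional housekeeping: physically remove rows whose reservation has lapsed
DELETE FROM reservations WHERE expires_at <= NOW();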
This is an untested answer that may only be a suggestion - I just started looking at these, so I'm interested in feedback as well. I'm still working through the possibilities and drawbacks, but it might well suit your need.
Take a look at MySQL Events, an article about it is here, and official syntax at Mysql Docs.
Per the article:
An event is similar to a trigger. However, rather than running in response to a data change, events can be scheduled to run any number of times during a specific period. In effect, it's a database-only cron job.
Pondering this, I'd envision a procedure that deletes anything older than 1 hour (if that's the expiration). The procedure would be TRIGGERED on new inserts to get rid of anything expired at that moment, but would also run in an event every 15 minutes or so, so that the automatic deletes aren't dependent on somebody else adding a reservation to trigger that procedure.
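A minimal sketch of the procedure-plus-event half of that idea, assuming a reservations table with an expires_at column and a 15-minute schedule (all names are assumptions; note that a MySQL trigger on reservations cannot delete from that same table, so the insert-triggered part would need a different shape):

-- a stored procedure that removes expired reservations
DELIMITER //
CREATE PROCEDURE purge_expired_reservations()
BEGIN
    DELETE FROM reservations WHERE expires_at <= NOW();
END //
DELIMITER ;

-- an event that calls it every 15 minutes (requires the event scheduler to be on)
CREATE EVENT purge_expired_reservations_event
ON SCHEDULE EVERY 15 MINUTE
DO CALL purge_expired_reservations();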
If your server runs Linux, you can use a cron job to check all the reservation dates once a day. If a date has expired, update the reservation field so the PC is marked as available again.
Normally I would do it this way:
when storing a reservation, store date_from and date_to, both of datatype DATETIME
when checking whether a computer is free, check all computers and filter with WHERE '{$my_date}' > date_to OR '{$my_date}' < date_from - this way you should be able to get all the PCs that are not reserved at a certain time...
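One way to express that availability check, offered only as a sketch (the pcs and reservations tables, their columns, and binding the requested time as a parameter are all assumptions):

-- list every PC that has no reservation covering the requested moment
SELECT p.id
FROM pcs AS p
WHERE NOT EXISTS (
    SELECT 1
    FROM reservations AS r
    WHERE r.pc_id = p.id
      AND ? BETWEEN r.date_from AND r.date_to
);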
To make the solution complete, you need to run a cron job which calls a query to remove all reservations where reservation_time + (15 * 60) < unix_timestamp().
I am assuming you have a time that the reservation was placed or started and are using UNIX/Epoch Timestamps.
Instead of storing an expires_at, if you know it will always be a fixed interval, i.e. 15 minutes, you can do:
DELETE FROM reservations WHERE reservation_time + (15 * 60) < unix_timestamp()
Something you could look into is managing cron jobs from PHP: http://www.highonphp.com/cron-job-manager.
The script at that link will, when a reservation is created, insert an entry into /etc/cron.d/, and you could configure it to run at the expected reservation end time. Then inside the PHP file which would be executed, you could do:
DELETE FROM reservations WHERE id = :id
I am working on a PHP / MySQL stat logging program and am trying to find the best MySQL DB Structure for it.
There is a part where visitors will be able to see up-to-date stats (i.e. the latest 20 entries), but they will also be able to see today's overall, yesterday's overall, last 7 days' overall and last 30 days' overall stats.
From the data I'm pulling, the real-time stats will be updated every 60 seconds with at least 10 new entries per update.
Is my logic correct to set up two tables ... one to act as "today's" stats and another to act as the overall archive ... like:
todays_stats
    id
    from_url
    entry_date

overall_stats
    id
    from_url
    entry_date
Then double-insert each new entry (once into each table), but truncate todays_stats at midnight every night via a cron job?
Or is there a more efficient way of doing this?
It depends on your daily stat row count, whether you need to delete historical data, and how many indexes you have. We needed to delete historical data and had 7-8 indexes on a large amount of stat data, so we separated the data into daily tables and wrote stored procedures to fetch the data (last day, last 7 days, last 30 days, etc.). Dropping a table is much faster than DELETE FROM table WHERE index = 6-month-old-data.
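A rough illustration of that daily-table idea (the table names and the shared template are assumptions, not from the answer):

-- create each day's table from a shared template
CREATE TABLE stats_20240417 LIKE stats_template;

-- purging an old day is then a metadata operation, far cheaper than a large DELETE
DROP TABLE stats_20240117;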
I think the best way is to keep one table that holds the current data set, and then have a separate table for the overall stats. At midnight you insert all data from the current table into the overall table with the
INSERT INTO `overall` SELECT * FROM `current`
query. Then you truncate the current table after the data has been copied successfully.
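As a sketch, the nightly copy-and-truncate could also be handed to the MySQL event scheduler instead of cron (the table names come from the question; the event name, the column list and letting overall_stats assign its own ids are assumptions):

DELIMITER //
CREATE EVENT archive_todays_stats
ON SCHEDULE
    EVERY 1 DAY
    STARTS CURRENT_DATE + INTERVAL 1 DAY   -- first run at the coming midnight
DO
BEGIN
    -- copy the day's rows across, letting overall_stats generate fresh ids
    INSERT INTO overall_stats (from_url, entry_date)
        SELECT from_url, entry_date FROM todays_stats;
    -- then clear the working table for the new day
    TRUNCATE TABLE todays_stats;
END //
DELIMITER ;

Rows inserted between the INSERT and the TRUNCATE would be lost, so on a busy table a DELETE bounded by entry_date is the safer variant.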
I have a site that has a few request options (for example, add photo, add comment). I want to limit the number of requests a user can make in a certain time period, for example so they can't post more than 5 comments within an hour, can't add more than 5 photos per hour, etc.
My idea was to create/update a session variable every time a form action is sent, so it counts up to 5 (and if the session variable == 5 it would deny the action on every form). The idea seems good in my mind, but I just can't find a way to reset a given session variable 1 hour after its initiation. Looking forward to any ideas.
Do it from SQL: with simple SQL queries you can get the number of items created in the past hour, so there is no need to use session variables (which will die if a user resets their session).
Check the number of "posts" for a specific element in the current hour
SELECT COUNT(*)
FROM my_elements_table
WHERE HOUR(createdon) = HOUR(NOW())
  AND DATE(createdon) = CURDATE()
  AND createdby = the_user_you_are_checking
Check the number of "posts" for a specific element in the past hour
SELECT COUNT(*)
FROM my_elements_table
WHERE DATE_ADD(createdon, INTERVAL 1 HOUR) > NOW()
  AND createdby = the_user_you_are_checking
Obviously, adapt the SQL based on your database fields and tables but you should have a good starting point with that.
I guess you store data about the comments and the photos in a database - at least you have to for the comments, and I assume you do for the photos as well. In that case I would save a timestamp for when the comment/photo was created and the ID of the user who created it, along with the rest of the information.
When a user then tries to add another photo or comment, count the number of comments and photos in the db that were created by that particular user within the last 60 minutes. If that count has already reached five, discard the request; otherwise add the information.
Well, if you have users, you store them in a database, don't you? Then why not just store the last time they commented in the database and use that to check whether they can comment?
Make 5 variables in the session containing the times of the actions. Every time a user tries to post, first check whether all 5 have something recorded; if they all have data, check the recorded times. If all of the times are within one hour of the current time, deny access.
Another solution would be to query your database for comments posted within your specified time frame, and if the result count is higher than allowed, don't allow the new comment.
Something like: "SELECT created_on FROM tblComments WHERE user_id = x AND created_on > NOW() - INTERVAL 1 HOUR"
With this I am making an assumption that you are storing comments in a database and that you have a field containing the post time.