For our recent project we need to include a counter based on date.
For example, a page-view counter for answer.php is stored in a MySQL table called counter.
Daily access to answer.php is limited to 150 page views. The counter table stores each access, and once the daily allowance of 150 is used up, a warning that the limit has been exceeded is shown and the display is blocked.
But I am not able to figure out how this can be done on a daily basis, i.e. how the counter can be reset to 0 when the next day starts.
The CURDATE() function returns the current date, so something similar to this (note the column name must not be wrapped in single quotes, or it becomes a string literal that will never equal the date):
$sql = "SELECT COUNT(*) FROM logintable WHERE logindate = CURDATE()";
I realise your query would probably involve more tables and fields. This is a very simplistic, off-the-top-of-my-head, untested reply, and I'm not sure it even works. Just thinking out loud here.
The answer from #stefgosselin was the first thing you'd want to do; then if the count was < 150, insert a row. If the count was 150 or more, reject the query and tell them they had reached their threshold.
Is it 150 pages per user, or 150 in total no matter who? If it's per user, then it should be as simple as adding a single counter to that user's record, increasing it each time a countable "request" comes in, and checking that count as needed to restrict results.
Then, since a user has no way of back-dating his/her requests, at the beginning of each day (e.g. via a scheduled trigger) you can just update your table to reset all counts back to zero:
Update UserLoginTable set RequestCount = 0
Done.
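A minimal sketch of the per-day counting idea discussed above, using Python's sqlite3 as a stand-in for MySQL (the table and column names are invented for the demo). Because each hit is stored with its date, no explicit reset is needed: tomorrow's COUNT(*) for tomorrow's date simply starts at zero.

```python
import sqlite3
from datetime import date

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE logintable (user_id INTEGER, logindate TEXT)")

DAILY_LIMIT = 150

def record_view(conn, user_id):
    """Record a page view; return False once today's allowance is used up."""
    today = date.today().isoformat()
    (count,) = conn.execute(
        "SELECT COUNT(*) FROM logintable WHERE user_id = ? AND logindate = ?",
        (user_id, today)).fetchone()
    if count >= DAILY_LIMIT:
        return False          # limit exceeded: block the display
    conn.execute("INSERT INTO logintable VALUES (?, ?)", (user_id, today))
    return True
```

The 151st call for the same user on the same day returns False; a different user (or the next day) starts fresh.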
You create a table called loginlog (a log of all the logins)
CREATE TABLE loginlog (
  id         INTEGER AUTO_INCREMENT PRIMARY KEY,
  user_id    INTEGER,
  logindate  DATE,
  logincount INTEGER,
  UNIQUE KEY userdate (user_id, logindate)
);
Next you create a before update trigger on your table
DELIMITER $$
CREATE TRIGGER bu_loginlog_each BEFORE UPDATE ON loginlog FOR EACH ROW
BEGIN
/*Force an error by selecting from a non-existing table*/
/*this will prevent the update*/
IF new.logincount > 150 THEN
  SELECT * FROM create_error_only_150_logins_allowed;
END IF;
END $$
DELIMITER ;
In your php code do the following:
INSERT INTO loginlog (user_id, logindate, logincount)
VALUES ('$user_id',curdate(),1)
ON DUPLICATE KEY UPDATE logincount = logincount + 1
Test if the insert succeeded.
If the insert/update fails your user has logged in > 150 times and you can refuse the user.
This will only keep one record per user per day. You can even purge past days from the loginlog table after x number of days.
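The whole pattern can be sketched in Python with sqlite3 standing in for MySQL: a CHECK constraint plays the role of the select-from-a-missing-table trick, and SQLite's ON CONFLICT ... DO UPDATE is the equivalent of ON DUPLICATE KEY UPDATE.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# The CHECK constraint rejects any update pushing logincount past 150,
# like the error-forcing trigger does in the MySQL version.
conn.execute("""
    CREATE TABLE loginlog (
        user_id    INTEGER,
        logindate  TEXT,
        logincount INTEGER CHECK (logincount <= 150),
        UNIQUE (user_id, logindate)
    )
""")

def log_hit(conn, user_id, day):
    """One row per user per day; returns False once the 150-hit cap is reached."""
    try:
        conn.execute(
            "INSERT INTO loginlog (user_id, logindate, logincount) "
            "VALUES (?, ?, 1) "
            "ON CONFLICT (user_id, logindate) "
            "DO UPDATE SET logincount = logincount + 1",
            (user_id, day))
        return True
    except sqlite3.IntegrityError:
        return False
```

As in the answer above, only one row per user per day is kept, and the caller just tests whether the insert/update succeeded.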
You need a table which maps dates to pageviews
Hi, I have a bunch of unique codes in a database which should only be used once.
Two users hit a script which assigns them at the same time and got the same codes!
The script is in Magento and the user can order multiple codes. The issue is that if one customer orders 1000 codes, the script grabs the top 1000 codes from the DB into an array and then runs through them, setting them to "Used" and assigning them to an order. If a second user hits the same script at a similar time, that script then grabs the top 1000 codes in the DB at that point in time, which overlaps because the first script hasn't had a chance to finish assigning them.
This is unfortunate but has happened quite a few times!
My idea was to create a new table; once the user hits the script, a row is made with "order_id" and "code_type". Then in the same script a check is done: if a row is already in this new table and the "code_type" matches what the user is ordering, it will wait 60 seconds and check again until the previous codes are issued and the table is empty, at which point it will create its own row and off it goes.
I am not sure if this is the best way or if two users hit at the same second again whether two rows will just be inserted and off we go with the same problem!
Any advice is much appreciated!
The correct answer depends on the database you use.
For example, in MySQL with InnoDB a possible solution is a transaction with SELECT ... FOR UPDATE, which places an exclusive lock on the selected rows until the transaction commits, so a concurrent request cannot grab the same codes.
Schematically it works by firing the following queries:
START TRANSACTION;
SELECT * FROM codes WHERE used = 0 LIMIT 1000 FOR UPDATE;
// save ids
UPDATE codes SET used=1 WHERE id IN ( ...ids....);
COMMIT;
More information at http://dev.mysql.com/doc/refman/5.7/en/innodb-locking-reads.html
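SQLite has no FOR UPDATE, but the same read-then-mark pattern can be sketched with an immediate (write-locking) transaction; the table and column names below are invented for the demo.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.isolation_level = None   # autocommit; transactions are managed by hand
conn.execute(
    "CREATE TABLE codes (id INTEGER PRIMARY KEY, code TEXT, used INTEGER DEFAULT 0)")
conn.executemany("INSERT INTO codes (code) VALUES (?)",
                 [("CODE%03d" % i,) for i in range(10)])

def claim_codes(conn, n):
    """Atomically mark n unused codes as used and return their ids."""
    # BEGIN IMMEDIATE takes the write lock up front, so concurrent
    # claimers are serialized -- SQLite's stand-in for FOR UPDATE.
    conn.execute("BEGIN IMMEDIATE")
    try:
        ids = [r[0] for r in conn.execute(
            "SELECT id FROM codes WHERE used = 0 ORDER BY id LIMIT ?", (n,))]
        conn.executemany("UPDATE codes SET used = 1 WHERE id = ?",
                         [(i,) for i in ids])
        conn.execute("COMMIT")
    except Exception:
        conn.execute("ROLLBACK")
        raise
    return ids
```

Two successive claims can never return overlapping ids, because the select and the update happen under one lock.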
I am creating a job number system that a few users will be using at the same time. I have created a job number on the php page and then it saves the number to the job sheet and uses this to link other tables to the job.
I take the job number from a table called numbers which then should increment the number by 1 each time the job is submitted ready to create the next job.
But the numbers are not working correctly.
As an example I get 1, 2, 3, 4, 8, then 43, 44, 45, then 105.
I can't see why they would jump so much.
$job_number_query = "SELECT * FROM numbers";
$job_result = $mysqli->query($job_number_query);
$job_num = mysqli_fetch_assoc($job_result);
$increment_job_number = $job_num['job_number'];
$update_job_number_query = "UPDATE numbers SET job_number = $increment_job_number + 1";
$mysqli->query($update_job_number_query);
//echo ($customer_id);
Then I simply insert the $increment_job_number into the jobsheet table.
I am using int for the Job_number field in the table numbers
I can't think of a way to test the numbers. I guess one way is to look through the job sheets and work out the next number from there, but because more than one user might have a job that hasn't been submitted yet, that could also cause problems.
Just increase the value without the first SELECT:
UPDATE numbers SET job_number = job_number +1
You have no where clause on your update query, so you're incrementing the job_number field in ALL records in the table.
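In MySQL this increment-first idiom can be made race-free with LAST_INSERT_ID(expr): UPDATE numbers SET job_number = LAST_INSERT_ID(job_number + 1); followed by SELECT LAST_INSERT_ID();, so each connection reads back only the value it produced. A sketch of the same idea in Python with sqlite3, where a write-locking transaction stands in for MySQL's row lock:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.isolation_level = None   # autocommit; transactions are managed by hand
conn.execute("CREATE TABLE numbers (job_number INTEGER)")
conn.execute("INSERT INTO numbers VALUES (0)")

def next_job_number(conn):
    """Increment first, then read, inside one write-locking transaction."""
    conn.execute("BEGIN IMMEDIATE")   # serializes concurrent callers
    conn.execute("UPDATE numbers SET job_number = job_number + 1")
    (n,) = conn.execute("SELECT job_number FROM numbers").fetchone()
    conn.execute("COMMIT")
    return n
```

Because no caller ever reads the counter before incrementing it, two callers can never be handed the same job number.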
It was me that was the technical failure in the end. I had the incrementing code on the create page, but unfortunately I had also put it on the edit pages, so every time I edited a page I added 1 to the number field in the numbers table.
What is the appropriate and most efficient way to store a view count each time a database record is accessed?
I have table ITEMS containing the following fields: id, item_name
Each item has its own permalink: http://domain.com/item_name
I would like to be able to use this data and display a Views: 2,938 on the page. Which method is best?
Method A
Create an additional field view_count in the ITEMS table and update it to increment the view count:
view_count = view_count + 1;
Method B
Create a new table VIEWS containing the following fields:
id, item_id, timestamp, ip
And add a new record to the VIEWS table each time a page is viewed.
Method C
Is there another method altogether?
I have seen Method A used in a variety of PHP forum software, however my University instructors have informed me that Method B is better because add() requires less strain than update()
It's somewhat true that an INSERT operation will often consume fewer resources than an UPDATE statement; but a well-tuned UPDATE statement isn't necessarily a "strain".
There are other considerations. If the only known and foreseen requirement is for a "total" page count, then a single row will consume much less storage than storing individual rows for each "view" event. Also, the queries to retrieve the view count will be much more efficient.
(Where the "strain" is in the real world is in storage, not in terms of just disk space, but the number of tapes and the amount of clock time required for backups, time required for restore, etc.)
If you need to be able to report on views by hour, or by day, then having the more granular data will provide you the ability to do that, which you can't get from just a total.
You could actually do both.
If I needed a "summary" count to put on the page, I wouldn't want to have to run a query against a boatload of rows in the historical event record just to come up with a new value of 2,939 the next time the page is viewed. It would be much less of a "strain" on database resources to retrieve a single column value from a single row than it would be to churn through thousands of rows of event data to aggregate them into a count every time a page is viewed.
If I also needed to be able to administratively report "views" by country, or by IP, by hour, by day or week or month, I would also store the individual event rows, to have them available for slice and dice analytics.
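A sketch of doing both at once, using Python's sqlite3 for illustration (the schema names are invented): the UPDATE maintains the cheap display total (Method A), while the INSERT keeps the raw event row (Method B).

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE items (id INTEGER PRIMARY KEY, item_name TEXT,
                    view_count INTEGER DEFAULT 0);
CREATE TABLE views (id INTEGER PRIMARY KEY AUTOINCREMENT,
                    item_id INTEGER, ts TEXT, ip TEXT);
INSERT INTO items (id, item_name) VALUES (1, 'widget');
""")

def count_view(conn, item_id, ip):
    # Method A: cheap running total for the "Views: 2,938" display
    conn.execute("UPDATE items SET view_count = view_count + 1 WHERE id = ?",
                 (item_id,))
    # Method B: one event row per view for later slice-and-dice reporting
    conn.execute("INSERT INTO views (item_id, ts, ip) VALUES (?, datetime('now'), ?)",
                 (item_id, ip))
    conn.commit()
```

Rendering the page then reads a single row from items, while the views table stays available for per-hour, per-day, or per-IP reports.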
There is another way to get unique IPs; this may help you: http://codebase.eu/source/code-php/ip-counter/
But if you want to store it in a database, see if this works for you:
CREATE TABLE `counter` (
`file` VARCHAR(200),
`time` INT UNSIGNED NOT NULL DEFAULT 0,
PRIMARY KEY (`file`)
);
and create counter.php
<?php
//connect to database
mysql_connect("dbhost","dbuser","dbpass");
mysql_select_db("dbname");
//read the name of your page (escaped, since PHP_SELF can contain user input)
$file = mysql_real_escape_string($_SERVER['PHP_SELF']);
//database check
$sql = mysql_fetch_array(mysql_query("SELECT * FROM `counter` WHERE `file` = '$file'"));
if ($sql)
{
//if there is data - add one
$add_counter = mysql_query("UPDATE `counter` SET `time` = `time` + 1 WHERE `file` = '$file' LIMIT 1");
}
else
{
//if empty - create one
$add_data = mysql_query("INSERT INTO `counter` (`file`,`time`) VALUES ('$file','1')");
} ?>
then put this to show your counter
<?php
//connect database
mysql_connect("dbhost","dbuser","dbpass");
mysql_select_db("dbname");
//show array from database
$sql=mysql_query("SELECT * FROM `counter` ORDER BY `file` ASC");
while ($data=mysql_fetch_array($sql))
{
echo "This " . $data['file'] . " got " . $data['time'] . " views";
}
?>
What I try to do is that I have a table to keep user information (one row for each user), and I run a php script daily to fill in information I get from users. For one column say column A, if I find information I'll fill it in, otherwise I don't touch it so it remains NULL. The reason is to allow them to be updated in the next update when the information might possibly be available.
The problem is that I have too many rows to update, if I blindly SELECT all rows that's with column A as NULL then the result won't fit into memory. If I SELECT 5000 at a time, then in the next SELECT 5000 I could get the same rows that didn't get updated last time, which would be an infinite loop...
Does anyone have any idea of how to do this? I don't have ID columns so I can't just say SELECT WHERE ID > X... Is there a solution (either on the MySQL side or on the php side) without modifying the table?
You'll want to use the LIMIT and OFFSET keywords.
SELECT [stuff] LIMIT 5000 OFFSET 5000;
LIMIT indicates the number of rows to return, and OFFSET indicates how many rows to skip before the first returned row.
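A sketch of the batching loop in Python with sqlite3 (the rowid_ column is invented for the demo; with no ID column you would order by whatever stable key exists). One caveat: rows filled inside a window shrink the NULL set, so advancing the offset by a fixed step can skip some still-NULL rows in that pass; since the script runs daily, those are simply picked up on the next run.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (rowid_ INTEGER, a TEXT)")
conn.executemany("INSERT INTO users VALUES (?, NULL)", [(i,) for i in range(12)])
conn.commit()

BATCH = 5

def process_in_batches(conn, fill):
    """Walk the rows where A is still NULL, BATCH at a time."""
    offset = 0
    while True:
        rows = conn.execute(
            "SELECT rowid_ FROM users WHERE a IS NULL "
            "ORDER BY rowid_ LIMIT ? OFFSET ?", (BATCH, offset)).fetchall()
        if not rows:
            break
        for (rid,) in rows:
            value = fill(rid)        # may return None: the row stays NULL
            if value is not None:
                conn.execute("UPDATE users SET a = ? WHERE rowid_ = ?",
                             (value, rid))
        conn.commit()
        # Advance past this window even if some rows stayed NULL,
        # otherwise the same unfillable rows would be fetched forever.
        offset += BATCH
```

Only a BATCH-sized slice is ever held in memory, and the advancing offset guarantees the loop terminates even when most rows stay NULL.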
An interesting one, I think :) - Users in my app get points for actions they perform.
I wish to find the first user to reach a goal of 100 points.
One method to do that is to return all performed actions, with their date and number of points, and run through them in PHP to find the first user who reached the goal, but I was wondering if this can be performed in SQL.
I know how to return all users who reached the goal by a specific date, but any thoughts on how to find the first one who reached the goal?
Thanks in advance,
Dorian
Create a query that computes the cumulative points scores. You then just need the lowest date with a cumulative point score of at least 100.
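With a database that supports window functions (MySQL 8.0+, or SQLite as used in this Python sketch; the actions table and its contents are invented), the cumulative score and the first crossing can be computed in a single query:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE actions (user_id TEXT, ts TEXT, points INTEGER)")
conn.executemany("INSERT INTO actions VALUES (?, ?, ?)", [
    ("dor", "2024-01-01", 60),
    ("tim", "2024-01-02", 90),
    ("dor", "2024-01-03", 50),   # dor's running total reaches 110 here
    ("tim", "2024-01-04", 20),   # tim's reaches 110 a day later
])

# Compute each user's running total, keep the rows where it first meets
# the goal, and take the earliest such row overall.
row = conn.execute("""
    SELECT user_id, ts
    FROM (
        SELECT user_id, ts,
               SUM(points) OVER (PARTITION BY user_id ORDER BY ts) AS running
        FROM actions
    )
    WHERE running >= 100
    ORDER BY ts
    LIMIT 1
""").fetchone()
```

This avoids pulling every action into PHP just to walk the totals there.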
You could create a cache table (the data in this table is completely redundant, so it can be discarded and recreated as needed).
In this table store the user (id), the date of each "event" that added points to the user's balance, and the balance itself, like a bank balance for every user.
The table can be recreated from the stored events/actions, since it is a running sum grouped by user, or it can be updated with every new event/action.
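A sketch of that cache table in Python with sqlite3 (names are invented; it assumes points are only ever added, so a user's latest balance is also their maximum):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE balances (
        user_id TEXT,
        event_date TEXT,
        balance INTEGER,          -- running total after this event
        PRIMARY KEY (user_id, event_date)
    )
""")

def add_points(conn, user_id, event_date, points):
    """Append one cache row carrying the user's new running balance."""
    (prev,) = conn.execute(
        "SELECT COALESCE(MAX(balance), 0) FROM balances WHERE user_id = ?",
        (user_id,)).fetchone()
    conn.execute("INSERT INTO balances VALUES (?, ?, ?)",
                 (user_id, event_date, prev + points))

def first_to_reach(conn, goal):
    """Earliest cache row whose balance meets the goal."""
    return conn.execute(
        "SELECT user_id, event_date FROM balances "
        "WHERE balance >= ? ORDER BY event_date LIMIT 1", (goal,)).fetchone()
```

Finding the winner is then a cheap indexed lookup instead of a scan over the full action history.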
Create a column in the MySQL table, namely FIRST.
The first query that meets a certain condition, say points = 500, will set the 'FIRST' value in the table to 1. Every other query will have FIRST set to 0.
Example:
ID | USER | POINTS | FIRST
 1 | dor  |    480 |     0
 2 | tim  |    500 |     1
 3 | mit  |    200 |     0
then search for FIRST=1
$q = mysql_query("SELECT * FROM table WHERE FIRST='1'");