Storing view counts in database table - php

What is the appropriate and most efficient way to store a view count each time a database record is accessed?
I have table ITEMS containing the following fields: id, item_name
Each item has its own permalink: http://domain.com/item_name
I would like to be able to use this data and display something like "Views: 2,938" on the page. Which method is best?
Method A
Create an additional field view_count in the ITEMS table and update it to increment the view count:
view_count = view_count + 1;
Method B
Create a new table VIEWS containing the following fields:
id, item_id, timestamp, ip
And add a new record to the VIEWS table each time a page is viewed.
Method C
Is there another method altogether?
I have seen Method A used in a variety of PHP forum software, but my university instructors have told me that Method B is better because add() puts less strain on the database than update().

It's somewhat true that an INSERT operation will often consume fewer resources than an UPDATE statement, but a well-tuned UPDATE statement isn't necessarily a "strain".
There are other considerations. If the only known and foreseen requirement is for a "total" page count, then a single row will consume much less storage than storing individual rows for each "view" event. Also, the queries to retrieve the view count will be much more efficient.
(Where the "strain" is in the real world is in storage, not in terms of just disk space, but the number of tapes and the amount of clock time required for backups, time required for restore, etc.)
If you need to be able to report on views by hour, or by day, then having the more granular data will provide you the ability to do that, which you can't get from just a total.
You could actually do both.
If I needed a "summary" count to put on the page, I wouldn't want to have to run a query against a boatload of rows in the historical event record just to come up with a new value of 2,939 the next time the page is viewed. It would be much less of a "strain" on database resources to retrieve a single column value from a single row than it would be to churn through thousands of rows of event data to aggregate them into a count every time a page is viewed.
If I also needed to be able to administratively report "views" by country, or by IP, by hour, by day or week or month, I would also store the individual event rows, to have them available for slice and dice analytics.
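A rough sketch of that combined approach in MySQL (the schema names here are illustrative, not from the question):

-- Running total on the item row, cheap to read on every page view
ALTER TABLE items ADD COLUMN view_count INT UNSIGNED NOT NULL DEFAULT 0;

-- One row per view event, kept for slice-and-dice analytics
CREATE TABLE item_views (
    id INT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
    item_id INT UNSIGNED NOT NULL,
    viewed_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
    ip VARCHAR(45) NOT NULL,
    KEY idx_item_time (item_id, viewed_at)
);

-- On each page view, do both writes
UPDATE items SET view_count = view_count + 1 WHERE id = 123;
INSERT INTO item_views (item_id, ip) VALUES (123, '203.0.113.7');

The page then reads items.view_count for display, while item_views feeds the analytics.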

There is another way that lets you count unique IPs; this may help you: http://codebase.eu/source/code-php/ip-counter/
But if you want to store the count in a database, see if this works for you:
CREATE TABLE `counter` (
    `file` VARCHAR(200),
    `time` INT UNSIGNED NOT NULL DEFAULT 0, -- stored as a number so it can be incremented
    PRIMARY KEY (`file`)
);
and create counter.php
<?php
// Read the name of the current page
$file = $_SERVER['PHP_SELF'];

// Connect to the database
mysql_connect("dbhost", "dbuser", "dbpass");
mysql_select_db("dbname");

// Escape the page name before using it in a query
$file = mysql_real_escape_string($file);

// Check whether a row for this page already exists
$row = mysql_fetch_array(mysql_query("SELECT * FROM `counter` WHERE `file` = '$file'"));
if ($row) {
    // A row exists - increment its counter
    mysql_query("UPDATE `counter` SET `time` = `time` + 1 WHERE `file` = '$file' LIMIT 1");
} else {
    // No row yet - create one with a count of 1
    mysql_query("INSERT INTO `counter` (`file`, `time`) VALUES ('$file', 1)");
}
?>
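As an aside, the SELECT-then-UPDATE/INSERT pair above has a small race window between the check and the write. Because `file` is the primary key, the two statements can be collapsed into one atomic query (a sketch, assuming the numeric `time` column from the table definition above):

INSERT INTO `counter` (`file`, `time`) VALUES ('$file', 1)
ON DUPLICATE KEY UPDATE `time` = `time` + 1;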
Then use this to display your counter:
<?php
// Connect to the database
mysql_connect("dbhost", "dbuser", "dbpass");
mysql_select_db("dbname");

// Fetch the count for every page and print it
$sql = mysql_query("SELECT * FROM `counter` ORDER BY `file` ASC");
while ($data = mysql_fetch_array($sql)) {
    echo $data['file'] . " has " . $data['time'] . " views";
}
?>

Related

Is it possible to partially get/modify a field?

I'm setting up to gather long-term statistics. They will be recorded in little blocks that I'm planning to stick all into one TEXT field, latest first, sort of like this:
[date:03.01.2016,data][date:02.01.2016,data][date:01.01.2016,data]...
The recording will be more frequent than that (this is just a sample), but each block should remain small enough to keep recording for decades, yet big enough to make me want to optimize it.
I'm looking for two things:
Can you append to the front of a field in mysql?
Can you read the field partially, just the first 100 characters for example?
The blocks will be fixed length so I can accurately estimate how many characters I need to download to display statistics for X time period.
The answer to your two questions is "yes":
update t
set field = concat($newval, field)
where id = $id;
And:
select left(field, 100)
from t
where id = $id;
(These assume that you have multiple rows in the table.)
That said, your method of storing the data is absolutely not the right thing to do in a relational database.
Presumably, you want a table that looks something like this:
create table t (
    tId int auto_increment primary key,
    creationDate date,
    data <something>
);
(This may be more complicated if data should be multiple columns.)
Then you insert into the table:
insert into t(creationDate, data)
    select $date, $data;
And you can fetch the most recent row:
select t.*
from t
order by tId desc
limit 1;
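With one row per block, reading "the first 100 characters" also becomes an ordinary range query instead of string surgery (a sketch against the table above; the date literal is illustrative):

select data
from t
where creationDate >= '2016-01-01'
order by creationDate desc;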
All of these are just examples, because your question doesn't give a complete picture of the data.

Checking number of records in MySQL table

I am looking for a way to check whether there is a certain number of records in a MySQL table. For example: after a POST request, before putting data into the database, it should first check how many records there are. If, let's say, there are 24 records, then it should delete the record with the latest date (based on a timestamp) and then insert the new value from the POST request. Has anyone got an idea of how to do it? Looking forward to your answers. Below I attached the simple code I wrote to insert data from the POST request into the table.
<?php
include("connect.php");
$link = Connection();

// Escape the POSTed values before using them in the query
$temp1 = mysql_real_escape_string($_POST["temp1"], $link);
$hum1 = mysql_real_escape_string($_POST["hum1"], $link);

$query = "INSERT INTO `tempLog` (`temperature`, `humidity`)
          VALUES ('" . $temp1 . "', '" . $hum1 . "')";
mysql_query($query, $link);
mysql_close($link);
header("Location: index.php");
?>
When you say delete with the latest date, I have to assume you mean the oldest record? Your description doesn't give the name of your date field, so let's assume it's onDate. You also didn't mention what your primary key is, so let's assume it's just id. If you run the query below before inserting, it will purge all the oldest records, leaving only the newest 23 in the database.
delete from templog where id in (
    select id from (
        select @rownum := @rownum + 1 as rowid, t.id
        from templog t, (select @rownum := 0) r
        order by t.onDate desc
    ) v where v.rowid > 23
);
Of course you should test on data you don't mind losing.
It is best to do a cleanup purge each time instead of removing a single row before adding a new one, because in the event of exceptions the table would never clean itself down to the 24 rows you actually want.
I also want to note that you may want to reconsider this method altogether. Instead, leave the data there and only query the most recent 24 when displaying the log. Since you are going through the trouble of collecting the data, you might as well keep it for future reporting. Then, later down the road, if your table gets too large, run a simple daily purge query to delete anything older than a certain threshold.
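For example, the display query for the most recent readings might look like this (a sketch; onDate is the same assumed column name as above):

SELECT temperature, humidity, onDate
FROM tempLog
ORDER BY onDate DESC
LIMIT 24;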
Hope this helps.

PHP Score assigning

I have a web page where people are able to post a single number between 0 and 10.
There is a lotto-style single-number draw once daily. I want my PHP script to check the posted numbers of all the users and assign a score of +1 or -1 to the respective winners (or losers).
The problem is that once I query the DB for the list of the winning users, I want to update their "score" field (in "users" table). I was thinking of a loop like this (pseudocode)
foreach winner{
update score +1
}
but this would mean that if there are 100 winners, then there will be 100 queries. Is there a way to do some sort of batch update with one single query?
Thanks in advance.
I'll assume you are using a database with SQL, and suggest that you would probably want to do something like:
UPDATE `table` SET `score`=`score`+1 WHERE `number`=3;
and the corresponding -1 for losers (strange, I can't see a reason to -1 them).
Without more details though, I can't be of further help.
You didn't specify how the numbers were stored. If there is a huge number of people posting, a good option is to use a database to store their numbers.
You can have, for example, a table called lotto with three fields: posted_number, score and email. Create a (non-unique!) index on the posted_number field.
create table lotto (posted_number integer(1) unsigned, score integer, email varchar(255), index(posted_number));
To update their score you can execute two queries:
update lotto set score = score+1 where posted_number = <randomly drawn number here>
update lotto set score = score-1 where posted_number = <randomly drawn number here>
Let's just assume we have two database tables named posts and users.
Obviously, users contains the data of the gamblers (with a convenient id field and a points field for the number of points they have), and posts contains the post_id ID field for the row, user_id, which is the ID of the user, and value, the posted number itself.
Now you only need to implement the following SQL queries into your script:
UPDATE users INNER JOIN posts ON users.id = posts.user_id SET users.points = (users.points + 1)
WHERE posts.value = 0;
Where 0 at the end is to be replaced with the randomly drawn number.
What will this query do? With the INNER JOIN construct, it creates a link between the two tables. If posts.value matches our number, it automatically links posts.user_id to users.id, so we know which user has to get his/her points modified. If someone gambled 0 and his ID (posts.user_id) is 8170, the points field will be updated for the user having users.id = 8170.
If you alter the query to (users.points - 1) and WHERE posts.value != 0, the non-winners will each have one point deducted. It can be tweaked as much as you want.
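Spelled out, that variant for the losers would be:

UPDATE users INNER JOIN posts ON users.id = posts.user_id SET users.points = (users.points - 1)
WHERE posts.value != 0;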
Just be careful! After each daily draw, the posts table needs to be truncated or archived.
Another option would be storing the timestamp (time() in PHP) of the user betting the number, and, when executing, checking whether the stored timestamp falls between the beginning and the end of the current day.
Just a tip: you can use graphical database software (like Microsoft Access or LibreOffice Base) to have your JOINs and such simulated on a graphical display. It makes modelling such questions a lot easier for beginners. If you don't want desktop-installed software, trying out an installation of phpMyAdmin is another solution too.
Edit:
Non-relational databases
If you are to use non-relational databases, you will first need to fetch all the winner IDs with:
SELECT user_id FROM posts WHERE value=0;
This will give you a result of multiple rows. Now you will need to go through this result one by one, executing the following query:
UPDATE users SET points=(users.points + 1) WHERE id=1;
(0 is the drawn winning number; 1 is the id of the particular user being updated.)
Without using the relation capabilities of MySQL, but using a MySQL database, the script would look like this:
<?php
$number = 0; // This is the winning number we have drawn
$result = mysql_query("SELECT user_id FROM posts WHERE value=" . $number);
while ($row = mysql_fetch_assoc($result))
{
    // Increment the winner's points directly in SQL, so no
    // separate SELECT for the current value is needed
    mysql_query("UPDATE users SET points = points + 1 WHERE id=" . $row['user_id']);
}
?>
The while construct makes the loop run until every row of the result (the list of winners) has been updated.
Oh and: I know MySQL is a relational database, but it is just what it is: an example.

Setting up a counter on daily basis

For our recent project we need to include a counter based on date.
For example, a page view counter for answer.php is kept in a MySQL table called counter.
Daily access to answer.php is limited to 150 page views. The counter table will store each access, and when the daily allowance of 150 is used up, it should give a warning that the limit has been exceeded and block the display.
But I am not able to figure out how this can be done on a daily basis. I mean, when the next day starts, how can the counter be reset and start from 0?
The curdate function returns the current date, so something similar to this:
$sql = "SELECT count(*) from logintable where 'logindate' = CURDATE()";
I realise your query would probably involve more tables and fields. This is a very simplistic, off-the-top-of-my-head, untested reply; I'm not sure it even works. Just thinking out loud here.
The answer from @stefgosselin covers the first thing you'd want to do: if the count is < 150, insert a row. If the count is 150 or more, reject the request and tell them they have reached their threshold.
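A rough PHP sketch of that check-then-insert flow (mysql_* API to match the rest of the thread; the logintable name comes from the query above, and the user_id column is an assumption):

$res = mysql_query("SELECT COUNT(*) AS cnt FROM logintable
                    WHERE `logindate` = CURDATE() AND `user_id` = '$user_id'");
$row = mysql_fetch_assoc($res);
if ($row['cnt'] < 150) {
    // Still under the allowance: log this view
    mysql_query("INSERT INTO logintable (`user_id`, `logindate`) VALUES ('$user_id', CURDATE())");
} else {
    // Allowance used up: block the page
    die("You have exceeded your daily limit of 150 page views.");
}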
Is it 150 pages per user, or 150 total no matter who? If it's per user, then it should be as simple as adding a single counter to that user's record, increasing it each time a countable "request" comes in, and checking that count as needed to restrict results.
Then, since a user has no way of back-dating his/her requests, at the beginning of each day (even via some trigger) you can just update your table to reset all counts back to zero:
Update UserLoginTable set RequestCount = 0
Done.
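If that reset should happen automatically at midnight, MySQL's event scheduler (available since 5.1) can run it without an external cron job; a sketch, assuming the UserLoginTable name from above and that event_scheduler is enabled:

CREATE EVENT reset_daily_request_counts
    ON SCHEDULE EVERY 1 DAY
    STARTS CURRENT_DATE + INTERVAL 1 DAY
DO
    UPDATE UserLoginTable SET RequestCount = 0;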
You create a table called loginlog (a log of all the logins):
CREATE TABLE loginlog (
    id INTEGER AUTO_INCREMENT PRIMARY KEY,
    user_id INTEGER,
    logindate DATE,
    logincount INTEGER,
    UNIQUE KEY userdate (user_id, logindate)
);
Next you create a before update trigger on your table
DELIMITER $$
CREATE TRIGGER bu_loginlog_each BEFORE UPDATE ON loginlog FOR EACH ROW
BEGIN
    /* Force an error by selecting from a non-existing table;
       this will prevent the update */
    IF new.logincount > 150 THEN
        SELECT * FROM create_error_only_150_logins_allowed;
    END IF;
END $$
DELIMITER ;
In your PHP code, do the following:
INSERT INTO loginlog (user_id, logindate, logincount)
VALUES ('$user_id',curdate(),1)
ON DUPLICATE KEY UPDATE logincount = logincount + 1
Test if the insert succeeded.
If the insert/update fails your user has logged in > 150 times and you can refuse the user.
This will only keep one record per user per day. You can even purge past days from the loginlog table after x number of days.
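A minimal PHP sketch of the insert-and-test step (mysql_* API to match the thread; $link and $user_id are assumed to already exist):

$sql = "INSERT INTO loginlog (user_id, logindate, logincount)
        VALUES ('$user_id', CURDATE(), 1)
        ON DUPLICATE KEY UPDATE logincount = logincount + 1";
if (!mysql_query($sql, $link)) {
    // The trigger raised an error: more than 150 views today
    die("You have exceeded your daily limit of 150 page views.");
}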
You need a table which maps dates to pageviews
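As a bare-bones sketch of that idea (table and column names are illustrative):

CREATE TABLE daily_pageviews (
    page VARCHAR(200) NOT NULL,
    view_date DATE NOT NULL,
    views INT UNSIGNED NOT NULL DEFAULT 0,
    PRIMARY KEY (page, view_date)
);

-- One statement per page view; block the page once views passes 150
INSERT INTO daily_pageviews (page, view_date, views)
VALUES ('answer.php', CURDATE(), 1)
ON DUPLICATE KEY UPDATE views = views + 1;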

Best way to update user rankings without killing the server

I have a website where user ranking is a central part, but the user count has grown to over 50,000 and it is putting a strain on the server to loop through all of them to update the ranks every 5 minutes. Is there a better method that can update the ranks at least every 5 minutes? It doesn't have to be PHP; it could be something run as a Perl script or similar (though I'm not sure why that would be better, I'm just leaving my options open here).
This is what I currently do to update ranks:
$get_users = mysql_query("SELECT id FROM users WHERE status = '1' ORDER BY month_score DESC");
$i = 0;
while ($a = mysql_fetch_array($get_users)) {
    $i++;
    mysql_query("UPDATE users SET month_rank = '$i' WHERE id = '{$a['id']}'");
}
UPDATE (solution):
Here is the solution code, which takes less than half a second to execute and update all 50,000 rows (make rank the primary key, as suggested by Tom Haigh).
mysql_query("TRUNCATE TABLE userRanks");
mysql_query("INSERT INTO userRanks (userid) SELECT id FROM users WHERE status = '1' ORDER BY month_score DESC");
mysql_query("UPDATE users, userRanks SET users.month_rank = userRanks.rank WHERE users.id = userRanks.id");
Make userRanks.rank an autoincrementing primary key. If you then insert userids into userRanks in descending rank order it will increment the rank column on every row. This should be extremely fast.
TRUNCATE TABLE userRanks;
INSERT INTO userRanks (userid) SELECT id FROM users WHERE status = '1' ORDER BY month_score DESC;
UPDATE users, userRanks SET users.month_rank = userRanks.rank WHERE users.id = userRanks.userid;
My first question would be: why are you doing this polling-type operation every five minutes?
Surely rank changes will be in response to some event and you can localize the changes to a few rows in the database at the time when that event occurs. I'm pretty certain the entire user base of 50,000 doesn't change rankings every five minutes.
I'm assuming the "status = '1'" indicates that a user's rank has changed so, rather than setting this when the user triggers a rank change, why don't you calculate the rank at that time?
That would seem to be a better solution as the cost of re-ranking would be amortized over all the operations.
Now I may have misunderstood what you meant by ranking in which case feel free to set me straight.
A simple alternative for bulk update might be something like:
set @rnk = 0;
update users
set month_rank = (@rnk := @rnk + 1)
order by month_score DESC;
This code uses a MySQL user variable (@rnk) that is incremented on each update. Because the update is applied over the ordered list of rows, the month_rank column is set to the incremented value for each row.
Updating the users table row by row will be a time consuming task. It would be better if you could re-organise your query so that row by row updates are not required.
I'm not 100% sure of the syntax (as I've never used MySQL before) but here's a sample of the syntax used in MS SQL Server 2000
DECLARE @tmp TABLE
(
    -- IDENTITY mimics MySQL's auto_increment, so ranks are assigned automatically
    [MonthRank] [INT] IDENTITY(1,1) NOT NULL,
    [UserId] [INT] NOT NULL
)

INSERT INTO @tmp ([UserId])
SELECT [id]
FROM [users]
WHERE [status] = '1'
ORDER BY [month_score] DESC

UPDATE users
SET month_rank = [tmp].[MonthRank]
FROM @tmp AS [tmp], [users]
WHERE [users].[Id] = [tmp].[UserId]
In MS SQL Server 2005/2008 you would probably use a CTE.
Any time you have a loop of any significant size that executes queries inside, you've got a very likely antipattern. We could look at the schema and processing requirement with more info, and see if we can do the whole job without a loop.
How much time does it spend calculating the scores, compared with assigning the rankings?
Your problem can be handled in a number of ways. Honestly, more details from your server may point you in a totally different direction. But doing it that way, you are causing 50,000 little locks on a heavily read table. You might get better performance with a staging table and then some sort of transition. Inserts into a table no one is reading from will probably be better.
Consider
mysql_query("delete from month_rank_staging;");
while(bla){
mysql_query("insert into month_rank_staging values ('$id', '$i');");
}
mysql_query("update month_rank_staging src, users set users.month_rank=src.month_rank where src.id=users.id;");
That'll cause one (bigger) lock on the table, but it might improve your situation. Then again, that may be way off base depending on the true source of your performance problem. You should probably look deeper at your logs, MySQL config, database connections, etc.
Possibly you could use shards by time or other category. But read this carefully before...
You can split up the rank processing and the updating execution. So, run through all the data and process the query. Add each update statement to a cache. When the processing is complete, run the updates. You should have the WHERE portion of the UPDATE reference a primary key set to auto_increment, as mentioned in other posts. This will prevent the updates from interfering with the performance of the processing. It will also prevent users later in the processing queue from wrongfully taking advantage of the values from the users who were processed before them (if one user's rank affects that of another). It also prevents the database from clearing out its table caches from the SELECTS your processing code does.
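A rough sketch of that split, in the same mysql_* style as the rest of the thread (the statement cache is just a PHP array; names are assumptions):

// Phase 1: process the rankings, caching the UPDATE statements
$updates = array();
$i = 0;
$res = mysql_query("SELECT id FROM users WHERE status = '1' ORDER BY month_score DESC");
while ($row = mysql_fetch_assoc($res)) {
    $i++;
    $updates[] = "UPDATE users SET month_rank = $i WHERE id = {$row['id']}";
}

// Phase 2: run the cached updates in one burst, after processing is done
foreach ($updates as $sql) {
    mysql_query($sql);
}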
