I noticed a bug in my application: when a user logs into my site from multiple computers, they are displayed twice in the list of logged-in users. I managed to fix this by adding groupBy('user_id') to my query; however, I've now noticed something else wrong: the user's "last active" field only updates for one of them.
return $query->whereNotNull('user_id')->with('user')->groupBy('user_id')->get();
To make this a bit clearer: prior to adding groupBy, let's say a user Bob logged in at 2 locations. My logged-in users would look like this:
Bob  Last Active 1 hour ago
Bob  Last Active 6 minutes ago
However, with the groupBy, it might show this:
Bob Last Active 1 hour ago
Even though the latest activity is 6 minutes ago. I tried running sortBy('last_activity'), but both ascending and descending have failed to fix this issue. Does anyone have any ideas?
Thanks.
It's hard to say with the question as-is, because you don't explain what you're storing in that table or how you store it. I'm guessing you use the database session driver?
Anyhow, each user session is unique to a given location. This is why you get 2 entries when a user is logged in at multiple locations. A "location" here is considered different if it doesn't share cookies with another location (e.g. a private browsing window is enough). The session in Laravel is identified by the session cookie laravel_session.
What I suggest for your "last active" timestamp is to store it in your User model directly and update it when a user accesses your site. It makes much more sense to persist this uniquely with the user than trying to reconstitute the information from active sessions. Also, stale sessions are eventually removed and the information for a user would be lost.
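As a sketch of that approach (the column and table names here are assumptions, not taken from the question):

```sql
-- one-time schema change: keep the timestamp on the user itself
ALTER TABLE users ADD COLUMN last_active TIMESTAMP NULL;

-- run on each authenticated request, e.g. from a route filter/middleware
UPDATE users SET last_active = NOW() WHERE id = ?;
```

This way the value survives even after the session rows are garbage-collected.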
Update:
GROUP BY works with aggregate functions. What you want is something like this:
SELECT user_id, MAX(last_activity)
FROM session_table
WHERE user_id IS NOT NULL
GROUP BY user_id
With query builder:
\DB::table('session_table')
->select('user_id', \DB::raw('max(last_activity) as last_activity'))
->whereNotNull('user_id')
->groupBy('user_id')
->lists('last_activity', 'user_id');
should return something like this:
[
'user_id1' => 'last_activity1',
'user_id2' => 'last_activity2',
...
]
Related
I am curious what path I should take to accomplish the following. I want multiple computers at one location to be able to view and make changes to data inside a MySQL DB with a web browser. I don't have extensive knowledge in this area, but from what I remember, this was very difficult if not impossible.
Example: Let's say I have a record for John and I want 2 computers to be able to edit John's record. Please note that the computers will not be editing the same portion of John's record. Let's say one computer is changing a status from "needs to be called" to "called" and the other is changing a status from "needs to be ordered" to "ordered".
I want a solution that could natively handle this.
My current knowledge is building web interfaces with PHP and SQL. I would like to use these languages, as I have some prior knowledge of them.
So my question: is this possible? If so, exactly how would it work (flow of info)?
There are several ways you can accomplish this. There are already some great PHP database-editing packages out there (phpMyAdmin, for example).
To handle this in code, though, you can either use transactions (depending on what flavor of SQL you're using, this is done differently) or fake them with the trick below.
One of the easier ways to ensure that people's changes don't clash with one another is simply to add an additional WHERE clause to your statement.
Let's say you have a user record whose ID is 4, and you want to update the last name from Smith to Bill.
Instead of writing
UPDATE users SET lastName='Bill' WHERE id='4'
You would add in:
UPDATE users SET lastName='Bill' WHERE id='4' AND lastName='Smith'
That way, if someone else updates the last-name field while you're working on it, your query will match zero rows and you'll have to re-enter the data, thus faking a transaction.
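A runnable sketch of that "extra WHERE clause" trick, using PHP with an in-memory SQLite database purely for illustration (the original example targets MySQL, but the technique is identical):

```php
<?php
// Set up a throwaway database with the example record.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec("CREATE TABLE users (id INTEGER PRIMARY KEY, lastName TEXT)");
$pdo->exec("INSERT INTO users (id, lastName) VALUES (4, 'Smith')");

// First writer: succeeds because lastName is still 'Smith'.
$stmt = $pdo->prepare(
    "UPDATE users SET lastName = ? WHERE id = ? AND lastName = ?"
);
$stmt->execute(['Bill', 4, 'Smith']);
$affected1 = $stmt->rowCount();  // 1 row updated

// Second writer, still holding the stale value 'Smith': the guard
// matches no rows, so the application knows it lost the race.
$stmt->execute(['Jones', 4, 'Smith']);
$affected2 = $stmt->rowCount();  // 0 rows updated
```

Checking rowCount() after the UPDATE is what tells you whether your write won or lost the race.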
Use transactions. Updating a single record at the exact same time isn't really supported, but applying one transaction immediately followed by another certainly is. This is native to MySQL.
START TRANSACTION;
SELECT @A := SUM(salary) FROM table1 WHERE type = 1;
UPDATE table2 SET summary = @A WHERE type = 1;
COMMIT;
One other option is the old desktop approach, which is to almost manually control the flow of modifications. Let me show you:
Say that you have a client table with the fields id, firstname, lastname, and age. To control updates from multiple users, you add a version integer default 0 field to this table.
When you populate the form object for a user, you also store the version of the row the user selected at that moment.
So let's assume that your client table looks like this:
id  firstname  lastname  age  version
1   Tomas      Luv       20   0
2   Lucas      Duh       22   0
3   Christian  Bah       30   0
When the user selects the client with id=1, the version of this row is, at that moment, 0. The user then updates the lastname of this client to Bob and submits it.
Here comes the magic:
Create a BEFORE UPDATE trigger that compares the row's current version with the version the user originally selected, something like this (written from memory, so treat it as a sketch):

create trigger check_client_version before update on client
for each row
begin
    if new.version != old.version then
        signal sqlstate '45000'
            set message_text = 'Record was already modified by someone else';
    else
        set new.version = old.version + 1;
    end if;
end;
In the application, you check whether the UPDATE raised this error and inform the user that someone else made a change to the record he tried to change.
So with the given example, it would go like this:
1 - User A selects row 1 and starts editing it
2 - At the same time, user B selects row 1 and saves it before user A does
3 - User A tries to save his modifications and gets the error from the application
In this scenario, both user A and user B read the row when its version field was 0, but when user B saved the record its version became 1, so when user A tries to save, the check trigger makes his update fail.
The problem with this approach is that you have to add a BEFORE UPDATE trigger to every table in your model, or at least the ones you are concerned with.
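If triggers are not an option, the same version check can be moved into the application's own UPDATE statement (a sketch based on the client example above):

```sql
-- Only succeeds if nobody bumped the version since we read it;
-- check the affected-row count afterwards to detect a conflict.
UPDATE client
SET lastname = 'Bob', version = version + 1
WHERE id = 1 AND version = 0;
```

Zero affected rows means someone else saved first, and the user should be asked to reload.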
I'm not used to working with values that should decay after a time lapse, like user warnings: for example, a warning that persists for 30 days, where a user can accumulate a maximum of 3 warnings before getting banned.
I thought of designing a user table like the one below, but now that I have to work with it, I find it awkward for decrementing the values every 30 days:
table_user
- username
- email
- warnings (integer)
- last_warn (timestamp data type)
Should I use some PHP timer?
Is there any standard technique for user warnings?
You could create another table
User_warnings:
user_id
warn_timestamp
Whenever the user is warned, you first delete all entries older than 30 days, then check whether two or more warnings still exist; if so, ban the user.
If you want a history of all warnings, don't delete old ones; instead, just query for warnings within the last 30 days.
This way you don't have to decrement anything every day; you only have to check when another warning comes in.
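For the history-keeping variant, the 30-day check can be a single query (a sketch, assuming the User_warnings table above):

```sql
SELECT COUNT(*) AS recent_warnings
FROM user_warnings
WHERE user_id = ? AND warn_timestamp >= NOW() - INTERVAL 30 DAY;
```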
Normalize your tables by breaking the warnings out from the user, like:
Table: Users
UserID int auto generate PK
UserName
UserEmail
Table: UserWarnings
UserID
WarningDate
You can now write a query to determine whether there are three warnings in the last 30 days. Run this query when a warning happens, and if a row is returned, ban the user.
The query would look something like this:
SELECT
    COUNT(*)
FROM UserWarnings
WHERE UserID=...your user id... AND WarningDate >= NOW() - INTERVAL 30 DAY
HAVING COUNT(*) > 2
By making a warning table, you can keep a complete warning history, which may be useful.
There's really no standard design for user warning systems, I believe. "Three strikes and you're out" is a typical approach, but not always the best. For example, if I have N rules on my website, and K of those rules are serious offenses, then for the offenses that aren't so serious I would give three strikes, but maybe the serious offenses are an auto-ban, or give two strikes.
If I had to set up something like this, I would create a table that looked like this:
user_warnings:
- warning_id
- user_id
- created_at
- offense_level
And then have a query set up that finds any user whose summed offense level over the last T days is greater than or equal to the bannable offense level, and ban that user. I'd set the bannable level to something like 5 and have tiered levels of offenses.
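That check could be sketched as a single query (the table, column names, and the threshold of 5 all follow the assumptions above):

```sql
SELECT user_id, SUM(offense_level) AS total_level
FROM user_warnings
WHERE created_at >= NOW() - INTERVAL 30 DAY
GROUP BY user_id
HAVING SUM(offense_level) >= 5;
```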
Never delete past offenses, though, in my opinion. You never know when it will be important to remember what happened previously, and it's good to keep records. Just make sure the query only checks warnings less than 30 days old (or however many days you want warnings to last).
I need to implement a "viewed" counter system.
How can it be done so that pressing F5 does not increase the view count by more than 1 per user?
SO also has such a system.
Cookies, sessions, a DB? How is it usually done?
You will need a combination of technologies here. Each user needs to be identified uniquely (using sessions, cookies, whatever works best in your scenario). From there, you need to maintain a database of hits for an item, keyed by the user's unique key (stored in their cookie, session, or whatever).
When the user accesses the page, check the database to see whether that user's unique key already has a hit on that page; if not, add it. Either way, once done, pull the total number of hits the item has had from the database. Ta-dah.
Just store user_id and resource_id (and optionally a timestamp) in your database, and before you increase the viewed value, check that an SQL query like this:
SELECT COUNT(*) FROM ... WHERE user_id = ? AND resource_id = ? (AND timestamp > NOW() - INTERVAL 7 DAY, or similar)
returns 0.
This depends a lot on the situation. For example, if each user is logged in with a user ID, it would be very different than if you are doing a splash page where users are not expected to be logged in.
I will assume you are in the latter category and that users are not logged in to your page. In that case, I would recommend setting a cookie using the setcookie function, which could be accomplished like this:
if (empty($_COOKIE['hasViewed'])) {
    // increment the total number of views in the
    // database or wherever we are storing it
    $viewer->incrementViews();
}
// make sure they have a cookie for next time
// (note: setcookie() must be called before any output is sent)
setcookie("hasViewed", "1", time() + 60*60*24*30);
Note that in this example, the user would be able to cause your view count to increment again if they haven't seen the page in 30 days.
My site logs clicks to a database whenever a user views an article. This table is automatically cleared out every 3 days. We use the data to work out the most viewed pages over that 3 day period.
I'd also like to use this data for marketing purposes, e.g. to determine which users like which sections of the site.
I've written a script in php that does the following:
grab the user IDs of any logged-in member who viewed an article in the past 3 days
for each user, I query to count how many times they viewed articles within each section, e.g., Bobby viewed 10 pages in Food & Drink and 6 pages in Sport. I tried combining this step with the previous one but got weird results for the totals.
This gives me a nice array in this form:
[Bobby Jones] => Array
(
[Film] => 10
[Home Page] => 1
[Food & Drink] => 2
[Health & Beauty] => 1
[Gifts & Gadgets] => 3
)
What I want is to end up with a table that stores the data from the array above and gets incremented each time I run my queries.
Unfortunately, this adds a lot of overhead. When I have an array like the one above, I have to run another query to check whether that combination of user and category already exists in the database, and if it does, I have to increment it by that day's values. E.g., if Bobby viewed 10 Film pages last week and 6 this week, I need to UPDATE the row. If he's never viewed a Food page before, I need to INSERT instead.
This query returns 400 or so users who've interacted with the site in the last 3 days. That means that for each user I have to do one query to get their browsing totals, one query to see if they've already browsed that category before, and another query to update or insert, depending on whether they have. You can see how inefficient this is.
Can anyone suggest a better way of doing this? My ultimate goal is to end up with a table that shows me how frequently my users browse my categories, so I can say "show me all the users who like Food & Drink" etc.
Thanks,
Matt
You can accomplish the UPDATE/INSERT behavior using MySQL's INSERT...ON DUPLICATE KEY UPDATE.
You can combine the SELECT and INSERT queries using MySQL's INSERT...SELECT.
If you post more details (say, a schema, for example), I could show you a sample query combining those two techniques, though the MySQL Manual is pretty in-depth on both subjects.
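In the absence of the schema, here is only a sketch: assuming a summary table user_category_views(user_id, category, views) with a unique key on (user_id, category), the check-then-update-or-insert collapses into one statement:

```sql
INSERT INTO user_category_views (user_id, category, views)
VALUES (?, ?, ?)
ON DUPLICATE KEY UPDATE views = views + VALUES(views);
```

One such statement per (user, category) pair replaces the three queries per user described in the question.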
If you're using MySQL and the version is sufficiently high, look into INSERT ... ON DUPLICATE KEY UPDATE. That should cut down on a query.
Then make sure your tables are properly keyed, and those two queries should be a breeze.
In my research to find a way to make PHP tell me how many people are "online" on my site, I've discovered that there are ways to "estimate" this.
I've chosen to log everything that happens on the site, partly for error-management's sake, but now I'm stuck writing my SQL query.
Basically, I have a database with IP, userid and datetime columns, and I figured that a query like this would do the trick:
SELECT distinct(IP), datetime
FROM `bigBrother`
WHERE datetime BETWEEN DATE_SUB(NOW(), INTERVAL 3 MINUTE) AND NOW()
The problem is that my site is mostly viewed and used by students on the school network, and well... they all have the same IP.
So the question is: am I doing this right, and can I select two distinct rows from my database so that I can sort out the registered users (who will have a userid; others will have userid = 0)?
Just use the session id instead of the IP.
Use cookies instead of IP addresses.
PHP makes this very easy with its session mechanism. On each page you first call session_start(), and then you use the value returned by session_id() as an identifier of the visitor that you can put in your database.
I created a system for a school site that asked for this feature as well, and here's how I did it.
I had a table of users, and in that table there was a field called "online_time".
On every page, if the user was logged in, a function updated that user's "online_time" to the current time (a Unix timestamp).
Then I had a "Who is online" function that looked at "online_time" and displayed all the users whose online time was within the last 5 minutes.
EDIT, in response to the comment:
You could make the same function save the session ID (from session_id()) in another table along with the time it was saved. The session ID is unique to that browser session, so you could count the number of session IDs active within the last 5 minutes.
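The count itself could then look something like this (a sketch, assuming a table active_sessions(session_id, last_seen)):

```sql
SELECT COUNT(DISTINCT session_id) AS online_now
FROM active_sessions
WHERE last_seen >= NOW() - INTERVAL 5 MINUTE;
```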