I'm working on a campus project using PHP and CodeIgniter.
It's only an assignment project.
It turns out I have to limit the bandwidth used by each user (of the app, not a Linux user),
and I have no idea how to implement this.
I wonder if someone knows the logic or has worked on a similar task.
What I basically need to know is: how do I track a user's (app user, not Linux user) bandwidth?
Should I count every request and response for that user?
How do I count image and static-file downloads for a specific user of the system?
Any hints are greatly appreciated.
Thanks,
Ivan
One of the only ways I can think of (using PHP) is to parse the webserver's access.log and add up the bandwidth for each client.
The next time a page loads and the client has reached the set limit, you can then run whatever code you want.
Parsing the log on every page load does seem time-consuming, though.
That's how some website statistics programs get this information.
EDIT
Also, some log files get archived at certain points; mine gets a fresh start every Sunday at 6am, so if a user was browsing during that time, their access history would disappear after 6. Saving each client's bandwidth totals in a database is a way to keep that information permanently.
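A minimal sketch of the log-parsing idea, assuming an Apache combined-format log (the path, function name, and regex are assumptions; adjust them for your server). To attribute bandwidth to app users rather than IPs, you would need a custom log format that records a user identifier:

```php
<?php
// Sketch: sum the bytes served per client IP from an Apache combined-format
// access log. Assumed format per line:
//   IP - user [date] "request" status bytes "referer" "agent"

function bandwidthPerClient(array $logLines): array
{
    $totals = [];
    foreach ($logLines as $line) {
        // Capture the client IP (group 1) and the response size (group 2).
        if (preg_match('/^(\S+) \S+ \S+ \[[^\]]+\] "[^"]*" \d{3} (\d+|-)/', $line, $m)) {
            $bytes = ($m[2] === '-') ? 0 : (int) $m[2]; // "-" means no body sent
            $totals[$m[1]] = ($totals[$m[1]] ?? 0) + $bytes;
        }
    }
    return $totals; // e.g. ['203.0.113.9' => 152340, ...]
}

// Usage (hypothetical path):
// $totals = bandwidthPerClient(file('/var/log/apache2/access.log'));
```

As the answer notes, doing this on every page load is expensive; a cron job that periodically folds the totals into a database table would scale better.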
Related
I have a website set up to check if certain channels on ustream.com and livestream.com are live or not.
The way it currently works, it queries a database table of channels and then, for each channel, uses the ustream.com or livestream.com API to check whether it is live, and it does this every time someone visits the site.
The problem is that in just the first half-day of being live the site received over 350 visitors, and because people keep refreshing the page it took 15,000 hits. Which is great, except that it is overloading the database.
I am thinking I need to use a cron job and create a cached page that refreshes every few minutes so that it queries the database and the API's far fewer times per hour.
Can someone give me some pointers on how to go about doing that? I know how to set up a cron job, but how do I create a cached page that is constantly being updated?
Or if you have a better solution I'd like to hear it.
This isn't a paid job, I built it as a free service to help people know which livestreamers are currently live at any particular moment.
Here is a link to the site,
http://freedomfighterstreams.com/
I am using Codeigniter MVC framework.
I would recommend this caching library for CodeIgniter, since it's more customizable than CI's built-in one:
https://github.com/philsturgeon/codeigniter-cache/blob/master/README.md
You'll find useful examples there.
Your results will be cached as files.
(Posting this as an answer, not a comment, because of the privileges.)
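If you'd rather not pull in a library, the cron-plus-cached-page approach from the question can be sketched in plain PHP. The file path and the `buildStatusPage` callback are assumptions; a crontab entry like `*/5 * * * * php /path/to/rebuild_cache.php` keeps the cache fresh while visitors never touch the database or the APIs:

```php
<?php
// Sketch of a cron-driven page cache. CACHE_FILE and buildStatusPage are
// assumed names -- substitute your own paths and page-building code.

define('CACHE_FILE', sys_get_temp_dir() . '/live_channels.html');

// rebuild_cache.php -- run by cron every few minutes.
function rebuildCache(callable $buildStatusPage): void
{
    $html = $buildStatusPage(); // queries the DB + ustream/livestream APIs

    // Write to a temp file and rename, so visitors never see a half-written page.
    $tmp = CACHE_FILE . '.tmp';
    file_put_contents($tmp, $html);
    rename($tmp, CACHE_FILE);
}

// index.php -- what visitors hit; no DB or API calls at all.
function serveCachedPage(): string
{
    return is_readable(CACHE_FILE)
        ? file_get_contents(CACHE_FILE)
        : 'Status temporarily unavailable.';
}
```

The atomic rename matters: with 15,000 hits, some request will always arrive mid-rebuild, and `rename()` on the same filesystem swaps the file in one step.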
This issue has been quite the brain teaser for me for a little while. Apologies if I write quite a lot, I just want to be clear on what I've already tried etc.
I will explain the idea of my problem as simply as possible, as the complexities are pretty irrelevant.
We may have up to 80-90 users on the site at any one time. They will likely all be accessing the same page, that I will call result.php. They will be accessing different results however via a get variable for the ID (result.php?ID=456). It is likely that less than 3 or 4 users will be on an individual record at any one time, and there are upwards of 10000 records.
I need to know, with less than a 20-25 second margin of error (this is very important), who is on that particular ID on that page, and update the page accordingly, removing their name as soon as possible once they are no longer on the page.
At the moment, I am using a jQuery script which calls a php file, reading from a database of "Currently Accessing" usernames who are accessing this particular ID, and only if the date at which they accessed it is within the last 25 seconds. The file will also remove all entries older than 5 minutes, to keep the table tidy.
This was alright with 20 or 30 users, but now that load has more than doubled, I am noticing this is a particularly slow method.
What other methods are available to me? Has anyone had any experience in a similar situation?
Everything we use at the moment is coded in PHP with a little jQuery. We are running on a server managed offsite by a hosting company, if that matters.
I have come across something called Comet or a Comet Server which sounds like it could potentially be of assistance, but it also sounds extremely complicated for my purposes and far beyond my understanding at the moment.
Look into WebSockets for a real-time socket connection. You could use WebSockets to push updates out in real time (instead of polling) to ensure changes to the 'currently online users' list are sent within milliseconds.
What you want is an in-memory cache with a service layer that maintains the state of activity on the site. Using memcached might be a good starting point. Your pseudo-code would be something like:
On page access, make a call to CurrentUserService
CurrentUserService takes as a parameter the page you're accessing and who you are.
Each time you call it, it removes whatever you were accessing before from the cache.
Then it adds what you're currently accessing.
Then it compiles a list of who else is accessing the same thing based on the current state in the cache.
It returns this list, which your page processes and displays.
If you record when someone accesses a page, you can set a timeout for when the service stops 'counting' them as accessing the page.
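The pseudo-code above can be sketched as follows, with the cache shown as a plain array so the logic is self-contained; in production you would back it with Memcached (`get`/`set` on the same keys). The class name, key scheme, and 25-second timeout are all assumptions:

```php
<?php
// Sketch of the CurrentUserService described above. A real deployment would
// replace the array with Memcached and also expire stale entries.

class CurrentUserService
{
    private array $cache = []; // stand-in for Memcached
    private int $timeout;

    public function __construct(int $timeout = 25)
    {
        $this->timeout = $timeout;
    }

    /** Record that $user is now viewing $pageId; return who else is there. */
    public function touch(string $user, string $pageId, int $now): array
    {
        // Remove whatever this user was accessing before.
        $previous = $this->cache["user:$user"] ?? null;
        if ($previous !== null && $previous !== $pageId) {
            unset($this->cache["page:$previous"][$user]);
        }

        // Add the current access.
        $this->cache["user:$user"] = $pageId;
        $this->cache["page:$pageId"][$user] = $now;

        // Compile everyone else on the same page within the timeout window.
        $others = [];
        foreach ($this->cache["page:$pageId"] as $name => $seen) {
            if ($name !== $user && $now - $seen <= $this->timeout) {
                $others[] = $name;
            }
        }
        return $others;
    }
}
```

Because reads and writes hit memory rather than MySQL, the 3-second polling from 80-90 users becomes cheap.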
I want to have an online user counter, but something that performs in real time: when someone arrives the counter updates, and when someone leaves the site the counter decreases.
I can't find anything like this on the net. Is there any script for this?
You could keep a list of all sessions in a database and update the "online time" every time someone hits a page, then check how many sessions were updated in the last x minutes. However, this won't be very real-time: depending on the number of minutes you define, it will be a little bit off.
Even Google Analytics (the new real time version) gets it wrong sometimes. Don't worry too much if you can't get it right either. ;-)
You should have a look at WebSocket. There are a lot of demos out there, mostly real-time chat applications; you could hack something together from one of them :)
In my opinion, WebSocket is a bit of overkill in your case (you just want a number, not real two-way communication), but it's the right way to do "real-time" apps.
Here are some links:
Socket.IO (node.js backend)
WebSocket and Socket.IO
Introduction to Server-Sent Events (another technique)
phpwebsocket
As far as I know there is no way to track when a user leaves your site, short of a log-out button (which one can easily avoid by simply closing the window).
To expand on Tom's answer, you could create a table that tracks sessions in a database. At minimum the fields should be session_id, ip_address, activity_time. Name them whatever you'd like. You would need a function that executes on every page load that matches a record on session_id and ip_address. If a matching record does not exist, you create one; if a match is made, then update the time.
A few caveats:
1) Getting the right IP address can be tricky, especially with AOL and/or proxy users. You need to look for X-Forwarded-For headers: if one exists for a user, use that address; otherwise use $_SERVER['REMOTE_ADDR']. I would suggest reading up on X-Forwarded-For for your setup, because I'm not sure it's available everywhere.
1a) If you don't get the right IP address, some users will create a new entry on every page view.
2) You need a way to remove stale sessions. I suggest that the function that updates the activity time also checks for any activity_time older than 5 minutes (I use 15 minutes) and removes the corresponding records.
Then all you need is a simple COUNT on the table, and that will give you a reasonably accurate figure for the number of users currently online. With very little coding you could put this to a lot of uses. On a dating site I created, I added an extra column to the table and was able to display an "online" icon next to users that were logged in, and the same in search results, so users doing searches could see which users were currently online. With a bit of imagination it could be used in a few more scenarios.
Also, with a membership feature, when a user logs in you can update the session table to show that they are a member rather than a guest, and if the user logs out you can remove the session from the table. When a user logs out but stays on the site, it's best to generate them a new session for security purposes. This is a bit more than what you were asking for.
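A sketch of the tracking function above using PDO. The table and column names match the answer (session_id, ip_address, activity_time); the function names and the 5-minute window are assumptions. Note that some drivers report rowCount() as 0 when an UPDATE writes the same value, so in production you may prefer an upsert:

```php
<?php
// Sketch: run on every page load. Matches on session_id + ip_address,
// updates the timestamp if found, inserts a row otherwise, and purges
// sessions idle for more than 5 minutes.

function trackSession(PDO $db, string $sessionId, string $ip, int $now): void
{
    // Remove stale sessions (older than 5 minutes), as suggested above.
    $db->prepare('DELETE FROM sessions WHERE activity_time < ?')
       ->execute([$now - 300]);

    // Update the matching record, or create one if none exists.
    $upd = $db->prepare(
        'UPDATE sessions SET activity_time = ? WHERE session_id = ? AND ip_address = ?'
    );
    $upd->execute([$now, $sessionId, $ip]);
    if ($upd->rowCount() === 0) {
        $db->prepare('INSERT INTO sessions (session_id, ip_address, activity_time)
                      VALUES (?, ?, ?)')
           ->execute([$sessionId, $ip, $now]);
    }
}

// The "simple count" giving the number of users currently online.
function usersOnline(PDO $db): int
{
    return (int) $db->query('SELECT COUNT(*) FROM sessions')->fetchColumn();
}
```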
You have the browser leave an HTTP connection open to your server for some unused resource (like /usercounter) that your server never responds to. Then you count the number of open connections. You can have the request send a cookie associated with the user's session so you can know if the connections are all unique users. This solution is very difficult and you will likely not find any ready-to-go solutions for implementing this.
The solution above will get a count of users who have javascript enabled. For other users, you would have to have some sort of guesstimate of how long a user would be around and update that timer on each page load.
I'm building a ajax tic tac toe game in PHP/MySQL.
The premise of the game is to be able to share a url like mygame.com/123 with your friends and you play multiple simultaneous games.
The way I have it set up is that a file (reload.php) is called every 3 seconds while the user is viewing their game-board page. This reload.php builds their game boards, and the output (HTML) replaces their current game board (thus showing games in which it is their turn).
Initially I built it entirely with PHP/MySQL and had zero caching. A friend gave me a suggestion to try doing all of the temporary/quick read information through memcache (storing moves, and ID matchups) and then building the game boards from this information.
My issue is that, both solutions encounter a wall when there is roughly 30-40 active users with roughly 40-50 games running.
It is running on a VPS from VPS.net with 2 nodes. (Dedicated CPU: 1.2GHz, RAM: 752MB)
Each call to reload.php performs 3 SELECT and 2 INSERT queries. The size of the data being pulled is negligible. The same actions happen on index.php to build the boards for the initial visit.
Now that the backstory is done, my question is:
Would there be a bottleneck in that each user is polling the same file every 3 seconds to rebuild their game boards, while all users sit on index.php, from which the AJAX calls are made within the HTML?
If so, would it be possible to spread the users' calls out over a set of files designated for building the game boards (e.g. reload1.php, reload2.php, reload3.php, etc.) and direct users to the appropriate file? Would this relieve the pressure?
A long winded explanation; however, I didn't have anywhere else to ask.
Thanks very much for any insight.
Use a socket server to share active game information; PHP and MySQL really shouldn't be used to maintain active game sessions.
An open-source example of a socket server would be Red5; if you don't mind paying a bit, then I would recommend SmartFox.
It's also not too difficult to set up your own socket server, if you're only using it for basic communication between clients.
Each game gets its own file, e.g. 459675.html or .txt or .json or whatever.
This file can be an HTML page, or whatever you need to communicate the current state of the game.
The clients poll the webserver for the latest version of the file. The webserver acts like a good little webserver and serves this file extremely efficiently from disk, because it doesn't need to run any scripting-language code to process the request.
When a client makes a move, it sends a request to a script. The script rewrites the file... repeat.
Your webserver is probably already configured to send Last-Modified headers for static files, and your clients (browsers) already know how to do a conditional HTTP request when they're given a Last-Modified header. So you get a bonus efficiency boost for very little work.
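The move-script side of this can be sketched as below. The paths and field names are assumptions; the key points are that only writes go through PHP, and that the file is swapped in atomically so a polling client never reads a half-written state:

```php
<?php
// Sketch: the move handler rewrites one small JSON file per game; clients
// poll that file directly as a static resource (conditional GET / 304).

function gameStatePath(int $gameId): string
{
    // In production this would live under the web root, e.g. /games/<id>.json.
    return sys_get_temp_dir() . "/game_$gameId.json";
}

// Called from the move-handling script after validating the move.
function writeGameState(int $gameId, array $board, string $turn): void
{
    $tmp = gameStatePath($gameId) . '.tmp';
    file_put_contents($tmp, json_encode(['board' => $board, 'turn' => $turn]));
    rename($tmp, gameStatePath($gameId)); // atomic swap; no partial reads
}

// Reading the state back server-side (clients just fetch the URL).
function readGameState(int $gameId): array
{
    return json_decode(file_get_contents(gameStatePath($gameId)), true);
}
```

With this layout, the 3-second polling never runs PHP or touches MySQL at all; the database only sees the two writes per actual move.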
I am really close to finishing up on a project that I've been working on. I have done websites before, but never on my own and never a site that involved user generated data.
I have been reading up on things that should be considered before you go live and I have some questions.
1) Staging... (Deploying updates without affecting users.) I'm not really sure what this would entail, since I'm sure that any type of update would affect users in some way. Does this mean some type of temporary downtime for every update? Can somebody please explain this, and a solution to it as well?
2) Limits... I'm using the Kohana framework and I'm using the Auth module for logging users in. I was wondering if this already has some type of limit (on login attempts) built in, and if not, what would be the best way to implement this. (save attempts in database, cookie, etc.). If this is not whats meant by limits, can somebody elaborate.
Edit: I think a good way to do this would be to freeze logins for a period of time (say 15 minutes), or to display a captcha after a handful (10 or so) of unsuccessful login attempts.
3) Caching... Like I said, this is my first site built around user content. Considering that, should I cache it?
4) Back Ups... How often should I backup my (MySQL) database, and how should I back it up (MySQL export?).
The site is currently up, yet not finished, if anybody wants to look at it and see if something pops out to you that should be looked at/fixed. Clashing Thoughts.
If there is anything else I overlooked that's not already in the list linked to above, please let me know.
Edit: If anybody has any advice as to getting the word out (marketing), i'd appreciate that too.
Thanks.
EDIT: I've made the changes, and the site is now live.
1) Most sites that push frequent updates, or that have a massive update that will take some time, use a beta domain such as beta.example.com, restricted to staff until the update is released to the main site for the public.
2) If you use cookies then they can just disable cookies and have infinite login attempts, so your efforts will go to waste. So yeah, use the database instead. How you want it to keep track is up to you.
3) Depends on what type of content it is and how much there is. If you have a lot of different variables, you should only keep the key variables that recognize the data in the database and keep all the additional data in a cache so that database queries will run faster. You will be able to quickly find the results you want and then just open the cache file associated with them.
4) It's up to you, it really depends on traffic. If you're only getting 2 or 3 new pieces of data per day, you probably don't want to waste the time and space backing it up every day. P.S. MySQL exports work just fine, I find them much easier to import and work with.
1) You will want to keep taking your site down for updates to a minimum. I tend to let jobs build up, and then do a big update at the end of the month.
2) In terms of limiting login attempts: cookies would be simple to implement but are not fool-proof; they will stop the majority of your users but can be easily circumvented, so it would be best to choose another way. Using a database would be better, but is a bit more complicated to implement and could add more strain to the database.
3) Caching depends greatly on how often content is updated or changes. If content changes a lot it may not be worth caching, but if it's largely static then something like memcache or APC will be of use.
4) You should always make regular backups. I do one daily via a cron job to my home server although a weekly one would suffice.
Side notes: YSlow indicates that:
you are not serving up expires headers on your CSS or images (causes pages to load slower, and costs you more bandwidth)
you have CSS files that are not served up with gzip compression (same issues)
also consider moving your static content (CSS,Images,etc.) to a separate domain (CDN) for faster load times
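The first two YSlow findings can usually be fixed in the webserver config. A sketch of an .htaccess for Apache, assuming mod_expires and mod_deflate are enabled (lifetimes here are arbitrary; adjust to how often your assets change):

```apache
# Far-future expires headers for static assets (fixes the first finding).
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType image/png  "access plus 1 month"
    ExpiresByType image/jpeg "access plus 1 month"
    ExpiresByType text/css   "access plus 1 week"
</IfModule>

# Gzip-compress text responses, including CSS (fixes the second finding).
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
```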