I was wondering how I could check if someone has left the site/page and perform an action after they left. I was reading on here and I found this:
No, there isn't. The best you can do is send an AJAX request every X seconds (perhaps only if the user moves the mouse). If the server doesn't receive any requests for 2X seconds, assume that the user left.
That's what I had planned on, but how can you make the server do something (in my case, remove them from the DB) once they stop sending requests? An example I can think of is Facebook: when you go to the site you tell them you're there, which marks you as online in chat, but when you leave it marks you as offline. How is that possible?
Edit: After a while of using cron jobs I found out that web hosting sites don't like running cron jobs often enough to give your site a "live" feel. So instead I found node.js, and it works 1000x better and is much simpler. I'd recommend anyone with the same issue use it; it's very cheap to buy hosting for and it's simple to learn. If you know JavaScript you can build in it no problem. Just be sure to understand how async works.
A not uncommon approach is to run a cron job periodically that checks the list of users, and does XYZ if they have been inactive for X minutes.
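Roughly, the two halves could look something like this (just a sketch: the table, column and connection details are made up, and I'm assuming PDO with MySQL). The page pings heartbeat.php every ~30 seconds, and the cron script sweeps out anyone who hasn't pinged for a couple of minutes.
<?php
// heartbeat.php -- called from the page via AJAX every ~30 seconds.
session_start();
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'dbuser', 'dbpass');

// Record "I'm still here" for the logged-in user.
$stmt = $pdo->prepare(
    'INSERT INTO users_online (user_id, last_seen) VALUES (:id, NOW())
     ON DUPLICATE KEY UPDATE last_seen = NOW()'
);
$stmt->execute(array(':id' => $_SESSION['user_id']));

<?php
// cleanup.php -- run from cron, e.g. every minute:  * * * * * php /path/to/cleanup.php
// Anyone who has not pinged for 2 minutes is treated as having left and is removed.
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'dbuser', 'dbpass');
$pdo->exec('DELETE FROM users_online WHERE last_seen < NOW() - INTERVAL 2 MINUTE');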
Facebook uses the XMPP protocol via the Jabber service to keep a constant, real-time connection with the user. However, implementing one isn't an easy task at all. The simplest solution would be, as mentioned in the comments, to have the client make AJAX requests to the server every few seconds, so that the server can check whether the user is still viewing the site or not.
You might want to check out my question, which might be related to yours.
The only method I remember from my time developing is one that's not 100% reliable, since a number of factors can (and most likely will) cause it to either misfire or not run fully, up to and including someone disabling JavaScript. Granted, that isn't very likely with the way websites are put together today, but people do have the option to turn it off, and people who are acting maliciously tend to have it off as well.
Anyway, the method I have read about but never put much stock in is the onunload event, which you tie into something like the <body> tag.
Example:
<body onunload="myFunction()">
Where myFunction() is a bit of JavaScript that does whatever it is you're seeking to have done.
Again, not superbly reliable for many reasons, but I think, all in all, it's the best you have with almost all client-side languages.
Related
I'm running an enterprise-level PHP application. It's a browser game with thousands of users online, on infrastructure that my boss refuses to upgrade, and the machines are running at a system load of 2-3 (yep, Linux) at all times. Anyhow, that's not the real issue. The real issue is that some users wait until the server gets loaded (prime time), bring out their mouse clickers, and click the same submit button 10-20 times, sending 10-20 requests at the same time while the server is still processing the initial request and so hasn't yet updated the cache and the database.
Currently I have an output variable for each request, which is valid for 2 minutes, and I have a "mutex" lock, which is basically a flag inside memcache that, if found, blocks further execution of the script. But the mouse clicker fires so many requests at the same time that they run almost simultaneously, which is a big issue for me.
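Simplified, the lock is something along these lines (key names are made up); the problem is that the get and the set are two separate steps, so a burst of simultaneous requests can all pass the check before any of them has stored the flag:
<?php
// Simplified version of the current "mutex" -- key names are illustrative.
$memcache = new Memcache();
$memcache->connect('127.0.0.1', 11211);

$key = 'lock_' . $userId . '_' . $actionName;

if ($memcache->get($key)) {
    // Another request is already processing this action.
    exit('Please wait...');
}

// Race window: several simultaneous requests can all reach this point
// before any of them has stored the flag below.
$memcache->set($key, 1, 0, 120); // flag valid for 2 minutes

// ... heavy processing, cache + database update ...

$memcache->delete($key);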
How are you, the majority of Stack Overflow folks, dealing with this issue? I was thinking of flagging the cookie/session, but I think I would run into the same issue if the server gets overloaded. Further optimization is impossible: the source is 7 years old and already quite optimized, with no queries on most pages (they run off the cache) and database queries only on certain user input, like the one I'm trying to prevent.
Yep, it's procedural code with no real objects. The machines run PHP 5 but the code itself is more PHP 4 in style. I know, I know, it's old and all that, but we can't spare the resources to rewrite this whole mess, since most of the original developers who knew how everything is intertwined have left, and yeah, I'm basically patching old holes. But as far as I know this is a general issue on loaded PHP websites.
P.S: Disabling the button with JavaScript on submit is not an option. The real cheaters are advanced users. One of them had written a bot clicker and packed it as a Google Chrome extension. Don't ask how I dealt with that.
I would look for a solution outside your code.
I don't know which server you use, but Apache has some modules that can help, mod_evasive for example.
You can also limit connections per second from a single IP in your firewall.
I'm getting the feeling this is touching more on how to update a legacy code base than anything else. While implementing some type of concurrency would be nice, the old code base is your real problem.
I highly recommend this video which discusses Technical Debt.
Watch it, then if you haven't already, explain to your boss in business terms what technical debt is. He will likely understand this. Explain that because the code hasn't been managed well (debt paid down) there is a very high level of technical debt. Suggest to him/her how to address this by using small incremental iterations to improve things.
Limiting IP connections will only make your players angry.
I fixed and rewrote a lot of stuff in some well-known open-source game clones with old-style code:
Well, I must say that cheating can always be avoided by executing the right queries and logic.
For example, look here: http://www.xgproyect.net/2-9-x-fixes/9407-2-9-9-cheat-buildings-page.html
Anyway, about performance: keep in mind that while the session is open, it blocks that user's other requests until the current one closes it. So be careful about wrapping all of your code inside the session. Also, sessions should never contain heavy data.
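One concrete way to follow that advice (just a sketch): read what you need from the session, then release the lock with session_write_close() before doing the heavy work, so parallel requests aren't serialized behind it.
<?php
session_start();

// Read whatever you need from the session while the lock is held.
$playerId = $_SESSION['player_id'];

// Release the session lock; other requests from this same session can now proceed.
// (You can no longer write to $_SESSION after this point.)
session_write_close();

// ... do the heavy work (queries, cache rebuilds, etc.) without holding the lock ...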
About scripts: in my games I have a PHP module that automatically rewrites links, adding a random id saved in the database, as a sort of CSRF protection. Human users click the changed link, so they don't notice anything, but scripts keep requesting the old link, and after a few tries they are banned!
Other scripts use the DOM, so it's easy to throw them off by inserting some useless DIVs around the page.
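Very roughly, that kind of link rewriting can look like this (a heavily simplified sketch; the table and function names are just for illustration):
<?php
// When rendering a page: replace the real action URL with a tokenized one.
function rewriteLink(PDO $pdo, $userId, $realUrl)
{
    $token = md5(uniqid(mt_rand(), true));
    $stmt  = $pdo->prepare(
        'INSERT INTO link_tokens (token, user_id, real_url, created_at)
         VALUES (:t, :u, :url, NOW())'
    );
    $stmt->execute(array(':t' => $token, ':u' => $userId, ':url' => $realUrl));

    return 'action.php?t=' . $token;
}

// When handling a click: only tokens we actually handed out are accepted.
function resolveLink(PDO $pdo, $userId, $token)
{
    $stmt = $pdo->prepare(
        'SELECT real_url FROM link_tokens WHERE token = :t AND user_id = :u'
    );
    $stmt->execute(array(':t' => $token, ':u' => $userId));
    $url = $stmt->fetchColumn();

    if ($url === false) {
        // A bot replaying an old or hard-coded link ends up here; count the
        // failures per user and ban after a few of them.
        return null;
    }
    return $url;
}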
Edit: you can boost your app with HipHop for PHP: https://github.com/facebook/hiphop-php/wiki
I don't know if there's an implementation already out there, but I'm looking into writing a cache server which has responsibility for populating itself on cache misses. That approach could work well in this scenario.
Basically you need a mechanism to mark a cache slot as pending on a miss; a read that finds a pending value should cause the client to sleep for a small but random amount of time and retry; and, as in the traditional model, the data is populated by the client that encountered the miss (rather than one that found the pending marker).
In this context, the script is the client, not the browser.
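I haven't built it yet, but a rough sketch of the pending-slot idea with the Memcached extension would look something like this (key names, timings and the 'PENDING' marker are all made up; the important bit is that add() is atomic, so only one script wins the right to rebuild):
<?php
// Sketch of the "pending slot" pattern using the Memcached extension.
function getWithPending(Memcached $mc, $key, $ttl, $rebuild)
{
    for ($attempt = 0; $attempt < 10; $attempt++) {
        $value = $mc->get($key);
        if ($value !== false) {
            if ($value === 'PENDING') {
                // Someone else is rebuilding; back off a small random amount and retry.
                usleep(mt_rand(50000, 200000)); // 50-200 ms
                continue;
            }
            return $value; // cache hit
        }

        // Miss: try to claim the slot. add() is atomic, so only one request wins.
        if ($mc->add($key, 'PENDING', 30)) {
            $fresh = $rebuild();          // the expensive work happens exactly once
            $mc->set($key, $fresh, $ttl); // replace the pending marker with real data
            return $fresh;
        }
        // Lost the race; loop and wait for the winner to populate the slot.
    }
    // Give up waiting and compute it ourselves rather than error out.
    return $rebuild();
}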
First I'll outline my problem.
What I want to do is create a site where, when a client connects, he broadcasts a number every second or so.
This is done by everybody on the site.
So every second, every client receives every other client's number.
My Solution (that isn't currently making sense)
I thought of using XMPP and an OpenFire server to do this, but I can't seem to make it work with PHP.
Finally the question
Is there a better way to solve my problem than the one I outlined? Another protocol or something?
Is there something that'll play nicely with OpenFire?
I already looked at these
http://code.google.com/p/xmpphp/
https://github.com/tong/hxmpp/
And Happy New Year!
XMPP is the most common way of dealing with notification problems, but you can use a lighter technique to deal with yours: Pushlets (and the previous link is certainly not the only one). Pushlets are a servlet-based mechanism where data is pushed directly from the server side to (dynamic) HTML pages within the client's browser. This allows a web page to be periodically updated by the server.
And it's certainly much lighter than XMPP.
You can also use it with a Java server side, like here, which will give you some new ideas.
Anyway, if you have a web application with a lot of users, you have to think twice: XMPP gives you a lot of control over many requests, whereas a pushlet is good enough for simple broadcasting.
Hope that helps.
Read this http://belski.net/archives/37-Phurple-for-PHP-5.3-and-up.html
You can make it work with PHP+XMPP using the phurple extension. It is built on libpurple, which is the base of Pidgin. That will also let you work with many other protocols; XMPP alone already covers Facebook, Google and any other XMPP-based service.
PHP - Apache with Codeigniter
JS - typical with jQuery and in house lib
The Problem: Determining (without forcing a download) a user's PC capability and/or virus issues
The Why: We put out software that is mostly used in clinics but can be used from home. However, we need to know, before they go to our main site, whether their PC can handle the demands of our web-based, browser-served software.
Progress: So far, we've come up with a decent way to test download speed, but that's about it.
What we've done: In PHP we create about a 2.5Gb array of data to send to the user in a view; from there the view calculates the time it took to get the data and then subtracts the PHP benchmark from this time in order to get a point of reference for upload/download time. This is not enough.
Some of our (local) users have been found to have "crappy" PCs or to be virus-infected, and this can lead to 2 problems: (1) they crash in the middle of performing a task in our program, or (2) their viruses could be trying to inject into our JS, creating a bad experience that may make us look bad to the average (uneducated about how this stuff works) user, thus hurting "our" integrity.
I've done some googling around, but most plug-ins or advice forums/blogs I've found simply give ways to benchmark the speed of your JS, and that is simply not enough. I need a simple bit of code (with no visual interface included, another problem I found with one nice piece of JS lib that did this, but it would take days to remove all of the author's personal visual code) that will allow me to test the following 3 things:
The user's data transfer rate (I think we have this covered, but if a better method is presented I won't rule it out)
The user's processing speed; how fast the computer is in general
A possible test for infection by malware, adware, or whatever may be harmful to the user's experience
What we are not looking to do: repair their PC! We don't care if they have problems; we just don't want to lead them into our site if they have too many. If they can't do it from home, they will be recommended to go to their nearest local office to use this software "in house", so to speak.
Further Explanation
We know you can't test the user-side stuff with PHP; we're not that stupid. PHP is mentioned because it can still be useful, either in determining connection speed or in delivering a script that does what we want. Also, this is not software for just anyone on the net to sign up for and use; if you find it online, unless you are affiliated with a specific clinic and have a login name and so on, you're not meant to use the site, and if you get in otherwise, it's illegal. I can't really reveal a whole lot of information yet, as the site is not live yet. What I can say is that it's mostly used by clinics/offices for customers to perform a certain task. If they don't have the time/transport/or otherwise and need to do it from home, then the option is available. However, if their home PC is not "up to snuff" it will be nothing but a problem for them and turn the 2-hour task they are meant to perform into a 4-6 hour nightmare. Hence the reason I'm at one of my favorite question sites, asking if anyone has had experience with this before and may know a good way to test the user's PC, so they can have the best possible resolution: either do it from home (as their PC is suitable) or be told they need to go to their local office. Hopefully this clears things up enough that we can refrain from the "sillier" answers. I need a REAL viable solution and/or suggestions, please.
PHP has (virtually) no access to information about the client's computer. Data transfer can just as easily be limited by network speed as computer speed. Though if you don't care which is the limiter, it might work.
JavaScript can reliably check how quickly a set of operations are run, and send them back to the server... but that's about it. It has no access to the file system, for security reasons.
EDIT: Okay, with that revision, I think I can offer a real suggestion - basically, compromise. You are not going to be able to gather enough information to absolutely guarantee one way or another that the user's computer and connection are adequate, but you can get a general idea.
As someone suggested, use a 10MB-20MB file and several smaller ones to test actual transfer rate; this will give you a reasonable estimate. Then, use JavaScript to test their system speed. But don't just stick with one test, because that can be heavily dependent on browser. Do the research on what tests will best give an accurate representation of capability across browsers; things like looping over arrays, manipulating (invisible) elements, and complex math. If there is a significant discrepancy between browsers, then use different thresholds; PHP does know what browser they're using, so you can give the system different "good enough" ratings depending on that. Limiting by version (like, completely rejecting IE6) may help in that.
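For the server side of that, the gate could be as simple as the sketch below (the thresholds, field names and score units are entirely made up; the idea is just that the JavaScript tests POST their results and PHP applies a per-browser "good enough" ceiling):
<?php
// gate.php -- the JS tests POST their results here. All numbers and
// field names are invented for the sake of the example.
$browser  = $_SERVER['HTTP_USER_AGENT'];
$score    = (float) $_POST['benchmark_ms'];   // time to finish the JS tests; lower is better
$download = (float) $_POST['download_kbps'];  // measured transfer rate; higher is better

// Different ceilings per browser family, since the same machine scores
// very differently in different JS engines.
$maxMs = 800; // default ceiling
if (stripos($browser, 'MSIE') !== false)   { $maxMs = 1500; }
if (stripos($browser, 'Chrome') !== false) { $maxMs = 600;  }

header('Content-Type: application/json');
if ($score > $maxMs || $download < 1024) {
    echo json_encode(array('ok' => false, 'reason' => 'slow_machine_or_connection'));
} else {
    echo json_encode(array('ok' => true));
}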
Finally... inform the user. Gently. First let them know, "Hey, this is going to run a test to see if your network connection and computer are fast enough to use our system." And if it fails, tell them which part, and give them a warning. "Hey, this really isn't as fast as we recommend. You really ought to go down to the local clinic to perform this task; if you choose to proceed, it may take a lot longer than intended." Hopefully, at that point, the user will realize that any issues are on them, not on you.
What you've heard is correct: there's no way to effectively benchmark a machine from JavaScript - especially because the JavaScript engine mostly depends on the actual browser the user is using, amongst numerous other variables - and there are no file system permissions, etc. A computer is hardly going to let a browser's sub-process stress itself anyway; the browser would simply crash first. PHP is obviously out as it's server-side.
Sites like System Requirements Lab have the user download a Java applet that runs in its own scope.
I'm here to ask whether what I'm thinking of is the right way to go about coding this.
I have a site that receives private messages, and I want a flag to show up the moment a person receives a message. Should I check for new messages every 3 seconds and show the flag if there is a new one, or is there a better way?
If I did it in AJAX, I was thinking of checking every 3-5 seconds for new messages, and once there's a flag, stopping the checks.
My only concern is, if it checks every 3-5 seconds, will it cause any lag or glitchiness for the person when they're typing? Let's say they're typing out a paragraph somewhere; I don't want their writing to glitch while it checks at those 3-5 second intervals.
One of my coder friends mentioned there is a method with Ping(?) or something like that, where the person is always connected to the server and, when there's a change, it notifies the user. I'm totally unsure of how this works.
Anyone know how facebook does it? haha.
Thank you!
If you have done the AJAX well, it should not lag/glitch while typing. Something like 3-5 s is good, as it's fast enough but won't slow down the server/browser.
Did he mean "push"? With push, messages are pushed to the client in real time; the client doesn't ask whether there are new messages. This is most likely the method Facebook is using.
One of my coder friends mentioned there is a method with Ping(?) or something like that
To be honest, I really don't like periodic refresh (polling at intervals), because it has scaling problems (I got a notice from my hosting provider when using periodic refresh). You should use more efficient transports, for example:
WebSocket
Adobe® Flash® Socket
AJAX long polling (see the sketch after these lists)
AJAX multipart streaming
Forever Iframe
JSONP Polling (cross-domain)
To use these you could, for example, use:
the hosted solution Pusher, which has an API and a generous free plan. This gives you a max of 20 concurrent connections and 100,000 messages per day, but no SSL, so do not transmit sensitive information over the wire. There is also a third-party PHP client, available on GitHub, implementing the REST API.
socket.io (I like this a lot)
tornado
netty
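If you want to see what the AJAX long polling transport looks like in plain PHP, here is a minimal sketch (getNewMessages() is a hypothetical helper standing in for your storage layer; note that this still ties up a PHP worker per waiting client, which is why the dedicated servers above scale better):
<?php
// longpoll.php -- the client requests this and the request simply "hangs"
// until there is something to deliver, or until a timeout is reached.
session_start();
session_write_close();               // don't hold the session lock while waiting
set_time_limit(35);

$userId = (int) $_GET['user_id'];
$since  = (int) $_GET['since_id'];   // id of the last message the client has seen

$deadline = time() + 30;             // hold the connection open for up to 30 s
while (time() < $deadline) {
    $messages = getNewMessages($userId, $since);   // hypothetical helper
    if (!empty($messages)) {
        echo json_encode($messages);
        exit;
    }
    usleep(500000);                  // wait 0.5 s before checking again
}

echo json_encode(array());           // timeout: the client immediately reconnects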
Anyone know how facebook does it? haha.
For chat they use Erlang. They have also open-sourced Tornado (see the link above), which came from FriendFeed, which they acquired a while back. Facebook is a PHP shop, but they decided not to use PHP for this, because PHP cannot yet do it efficiently. Anyway, they are using one of the efficient transports above.
I wrote a PHP chat script with jQuery on the user end, pull-style. It reloaded a pull.php page once a second and retrieved only the new chat records from my SQL database since the last check.
However, I got an email from my host saying I was using too much bandwidth. Bleh. My chat was working just as I wanted and this happened. I'm scared to do a COMET PHP chat because I was told it uses a separate process or thread for each user.
I suppose I just need a strategy that's good, and more efficient than what I had.
Well you've got the premise down. 1 second recalls are far too frequent - can anyone bang out a response in a second and expect it to go through? It takes me at least 2 seconds to get my SO messages together / relatively typo free.
So part 1 - increase the recall time. Kick it up to 10 seconds and the app is still going to feel pretty speedy, but you're decreasing processing by a factor of 10 - that's huge. Remember that every message includes extra data (in the HTML and in all the network layers that support it), and every message received by the server requires processing to decide what to do with it.
Part 2 - You might want to introduce a sliding scale of interactivity speed. When the user isn't typing and hasn't received a message in a while the recheck rate could be decreased to perhaps 30 seconds. If the user is receiving more than one message in a "pull" or has received a message for the last 3 consecutive pulls then the speed should increase (perhaps even more frequent than 10 seconds). This will feel speedy to users during conversations, and not slow/laggy when there's nothing going on.
Part 3 - Sounds like you're already doing it, but your pull should be providing the minimum possible reply. That is just the HTML fragment to render the new messages, or perhaps even just a JSON structure that can be rendered by client-side JavaScript. If you're getting all the messages again (bad) or the whole page again (worse) you can seriously decrease your bandwidth this way.
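To make Part 3 concrete, the pull could be as small as this sketch (the table and column names are invented). The client remembers the id of the last message it rendered and sends it back, so the reply contains only what's new:
<?php
// pull.php?since_id=123 -- returns only messages newer than the last one
// the client has seen, as a small JSON array.
$pdo     = new PDO('mysql:host=localhost;dbname=chat', 'dbuser', 'dbpass');
$sinceId = (int) $_GET['since_id'];

$stmt = $pdo->prepare(
    'SELECT id, author, body FROM messages WHERE id > :since ORDER BY id LIMIT 50'
);
$stmt->execute(array(':since' => $sinceId));

header('Content-Type: application/json');
echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));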
Part 4 - You may want to look at alternate server software. Apache has some serious overhead... and that's for good reason: you've got all your plugins running, Apache config, .htaccess config, rewrites, the kitchen sink. You might be able to switch to a lighter-weight HTTP server - nginx, lighttpd - and realise a serious performance increase (strictly speaking it won't help much with bandwidth - but one wonders if that's what the host is really complaining about, or whether it's that your service consumes serious server resources).
Part 5 - You could look at switching host or package - a horrible thought, I'm sure, but there may be packages with more bandwidth/processor (the latter being more key, I suspect) available... you could even move to your own server, which allows you to do what you want without the service provider getting involved.
You don't need to check for a new message every second; kick it down to maybe 3 or 4. Make sure you are only using one query and that the file loaded is as small as possible. And I would always add a timeout... have it stop reloading after a minute or two of no activity.
Make these changes and see where that puts you.
Post some code if you want to see if we can help.
I believe that comet is the only solution.
Threads are much lighter than processes, so you would want to use a COMET-type technology with a light, threaded web server like lighttpd.
Take a look at Simple Chat. It's very tiny and extensible, I think.
All you need to do is "move ahead with one of the standard solutions available"....
Comet, WebSocket, and XMPP are a few terms you will hear every now and then; just pick one of them and it should be more efficient than a classic periodic-AJAX-call-based PHP chat.
If you want to go ahead with a PHP and jQuery based solution, try these two working examples of a browser-based chat system using XMPP: http://bit.ly/jaxlBoshChat (for a one-to-one chat example) and http://bit.ly/jaxlBoshMUC (for a multi-user chat example). Let me know if you face any problems setting them up.