I am working on updating an existing visitor tracking script on a high-traffic website. I noticed that the problem is not with the script itself, but with what happens when there are multiple concurrent requests. Say a user double-clicks a link to my site and two requests end up being made at effectively the same time. Request 1 gets processed and a session is created; the script then proceeds to add a visitor record to the database. At the same time, request 2 is being processed. It checks whether a session is set, and there isn't, so it does the same thing request 1 does. Now we have two different sessions and two records in the visitors table in the database, when there should really be one. The session id for the current session ends up being from whichever request finished last.
So, what I'm looking to do is prevent this from happening. Even if there are 100 concurrent requests from the same visitor, I want only one session id to be created and, above all, only one record (not 100) inserted into the visitors table. This means determining, within a few milliseconds, that another request has already been made. Any ideas?
You can force a specific session at the beginning of your script. http://www.php.net/manual/en/function.session-start.php
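A minimal sketch of that idea: call session_id() before session_start() with an id derived from something stable about the visitor, so all concurrent first requests attach to the same session. PHP's default file-based session handler locks the session file, which serializes those requests. The fingerprint recipe and the insertVisitorRecord() helper below are illustrative assumptions, not part of the linked manual page.

```php
<?php
// Force one session for all concurrent first requests from a visitor.
// The fingerprint (IP + user agent) is an illustrative assumption.
$fingerprint = md5($_SERVER['REMOTE_ADDR'] . ($_SERVER['HTTP_USER_AGENT'] ?? ''));

session_id($fingerprint);  // must be called before session_start()
session_start();           // the files handler locks here, serializing requests

if (empty($_SESSION['visitor_recorded'])) {
    // Only the first request reaches this branch: insert the single
    // visitors row here (insertVisitorRecord() is a hypothetical helper).
    insertVisitorRecord($_SERVER['REMOTE_ADDR']);
    $_SESSION['visitor_recorded'] = true;
}
```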
I have the following workaround in my Java application: when first accessing the PHP server, I send a blocking ping call that establishes the unique session id. Only afterwards do I start my concurrent requests to the PHP server. The same should be possible from a web page, for example in an init block.
I have PHP web applications, mostly with a reporting section. If a report takes a long time for one user, it stalls the whole server: all other clients or users wait for his request to finish, and only after that do their pages load completely.
What are possible ways to eliminate this in web (client-server) applications?
How can we do this specifically in WAMP/PHP? Something like restricting one user's use of the server so that it does not affect other users...
MORE SPECIFIC DETAILS:
I have a data entry application with about 2000-4000 records entered per 4 hours by multiple users.
As admin, I then need to check each record (after a minimum of about 10 minutes) against another application through a web service, fetch more information, and update that record's fields.
I noticed that if I run this re-checking in a loop, all other users complain they cannot use the application.
I have a kind of chat/forum application that checks for new messages using periodic polling (every 15 seconds) with jQuery Ajax. I was wondering if I can get around the issue of users who try to be 'funny' by loading several instances of the same browser, with lots of tabs, all pointing to the same application. Each tab sends an Ajax request, which could potentially overload the server if several users start doing the same thing.
I do store sessions in a table, along with the last access time and IP address, which works fine as long as users don't use the same browser. I could store a unique identifier that is sent with the Ajax POST or GET request, but that would cause problems when a regular (non-abusing) user refreshes his page, which would then create a new identifier.
This is not a real problem yet, but better to catch it before someone thinks of abusing the system like this :) Any idea how to do this?
One option could be to fetch data like so:
1. Your script prepares to poll data. Before executing the request, write (with localStorage) a value saying that you're going to fetch data: localStorage.setItem("last-request-timestamp", new Date().getTime());
2. Poll for data. When you get a result, write it to localStorage: localStorage.setItem("latest-messages", ajax_result);
3. Check whether a page is about to poll by testing if localStorage.getItem("last-request-timestamp") is more than 15 seconds ago. If so, go to step 1. If not, wait 15 seconds and check again.
4. Regardless of whether the current page polled for data or not, check the latest-messages value and update the page.
Other pages will of course share the localStorage data; they won't fetch data themselves while another page is fetching. If page #1 is closed, one of the other pages will take over and continue fetching data.
I haven't used localStorage before, but browser support seems decent enough. You should also be able to use it as a key-value array: localStorage["last-request-timestamp"].
You can only store strings in localStorage, but you can of course serialize other values to JSON.
Not sure if it is doable in JavaScript, but you could check whether the tab is active and only run the Ajax polling on the active tab.
I have a similar problem. Now I force all users to log in (which means I have their e-mails). I also set up a connection limit per account and a request limit per connection; after 5 overflows I ask the user to enter a captcha, then I block the account for 30 minutes and send an e-mail with a password recovery link. It's not a clean solution, but for now it works for me.
UPD:
The simplest way to do this is to use cookie or session storage. I use cookies. The algorithm is simple:
1. The user logs in on the web.
2. Check whether there is already an open session for this user. If there is, either delete the other session, trigger an exception, or switch to that session; you have to decide the desired behavior yourself.
3. Create a session id for the user and store it in the database.
4. Increase a sessions counter field for that user to detect open sessions, so it no longer matters whether one browser is in use or many.
5. Update the last access mark (I use microtime(true) + $delay, stored in a MySQL decimal(14,4) column).
6. Send the id to the client.
On each request:
1. Search for the session by the id passed in $_COOKIE.
2. Check the last access mark. If microtime(true) is still less than the mark, the client is sending requests too frequently, so decide yourself what to do: raise the mark further (for example to microtime(true) + $delay + $penalty), drop the whole session, or trigger an error. The behavior depends on your application.
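A minimal sketch of that per-request check, assuming a PDO handle in $pdo and a sessions table with a last_access_mark DECIMAL(14,4) column (the table and column names are hypothetical):

```php
<?php
// Throttle check: the stored mark is the earliest time the next request is allowed.
$delay   = 2.0;   // minimum seconds between requests
$penalty = 5.0;   // extra wait added when the client is too eager

$id   = $_COOKIE['session_id'] ?? '';
$stmt = $pdo->prepare('SELECT last_access_mark FROM sessions WHERE id = ?');
$stmt->execute([$id]);
$mark = $stmt->fetchColumn();

if ($mark === false) {
    http_response_code(403);   // unknown session
    exit;
}

$now = microtime(true);
if ($now < (float)$mark) {
    // Request arrived before the delay expired: push the mark further out.
    $pdo->prepare('UPDATE sessions SET last_access_mark = ? WHERE id = ?')
        ->execute([$now + $delay + $penalty, $id]);
    http_response_code(429);   // or drop the whole session / trigger an error
    exit;
}

// Normal request: store the next allowed time.
$pdo->prepare('UPDATE sessions SET last_access_mark = ? WHERE id = ?')
    ->execute([$now + $delay, $id]);
```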
Why not throw something like Memcached/Redis at the problem? Cache a response with a 10-15s lifetime and avoid as much processing as possible.
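A hedged sketch of that approach with the pecl Memcached extension: every poll request is answered from the cache, so a hundred tabs cost one database query per 15 seconds. getLatestMessagesFromDb() is a hypothetical stand-in for the expensive part.

```php
<?php
// Serve poll requests from a short-lived cache instead of hitting the database.
$cache = new Memcached();
$cache->addServer('127.0.0.1', 11211);

$messages = $cache->get('latest-messages');
if ($messages === false) {                          // cache miss or expired
    $messages = getLatestMessagesFromDb();          // hypothetical expensive query
    $cache->set('latest-messages', $messages, 15);  // 15-second lifetime
}

header('Content-Type: application/json');
echo json_encode($messages);
```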
I want to create a bidding system where users can see the current price of items, and if any other user in any other location places a bid before me, the bid should auto-update in my browser.
I have read about auto-updating JS+Ajax functions, but even if I set a 5-second timer to auto-update the content in the user's browser, won't making an Ajax call every 5 seconds put extra load on the server? It's a bidding system, so users need bids to update within 1-2 seconds, and an auto-update Ajax call every 1-2 seconds will put a lot of burden on the server.
So I am wondering: is there any better way to handle this type of thing? How do Twitter/Facebook update users' feeds?
AJAX or not, bidding systems always see high request volumes because people keep refreshing the page to check for the latest bid information.
You can take a look at and attempt long polling. Long polling is a method where you "push" data from the server to the browser in response to the browser's HTTP request, over a normal HTTP connection. This may reduce the number of requests sent from users to the server, but you will still have many open and active connections between your users and your server.
You will want to look at long polling. In essence, this is how it works (a rough sketch follows the steps):
On the server you need some sort of event mechanism (no problem with PHP).
The client (browser) starts an AJAX request referencing a bidding item.
The server checks for changes on the bid; if there is one, it answers the request.
If not, it waits for some time (in the minute range) on an event concerning this bid.
If such an event occurs, the server answers the request with the info; if not, it answers the request with "no bid" info.
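A rough PHP sketch of such an endpoint, with two stated assumptions: getCurrentBid() is a hypothetical helper that reads the latest bid, and the sleep-and-recheck loop stands in for a real event mechanism.

```php
<?php
session_write_close();             // release the session lock while we wait

$itemId  = (int)($_GET['item'] ?? 0);
$lastBid = (float)($_GET['last_bid'] ?? 0);
$timeout = time() + 30;            // give up after 30 seconds

do {
    $current = getCurrentBid($itemId);   // hypothetical helper
    if ($current != $lastBid) {          // a new bid arrived: answer immediately
        echo json_encode(['bid' => $current]);
        exit;
    }
    usleep(500000);                      // wait half a second between checks
} while (time() < $timeout);

echo json_encode(['bid' => null]);       // the "no bid" answer; the client reconnects
```

The session_write_close() call matters in PHP: without it, the waiting request holds the session lock and blocks every other request from the same user.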
You might be able to get away with a streaming model...
Each JS client connects to the server once and keeps the connection open. As new events arrive at the server, they are broadcast to all the open connections in real time.
This is similar to the mechanism Twitter uses to broadcast tweets.
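One way to sketch the streaming model in PHP is Server-Sent Events, which the browser consumes through a single long-lived EventSource connection. getNewEventsSince() is a hypothetical helper, and output buffering is assumed to be off so flush() actually reaches the client.

```php
<?php
// Stream events to the client as they arrive (Server-Sent Events).
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');
session_write_close();                    // don't hold the session lock open

$since = microtime(true);
while (!connection_aborted()) {
    foreach (getNewEventsSince($since) as $event) {  // hypothetical helper
        echo 'data: ' . json_encode($event) . "\n\n";
    }
    $since = microtime(true);
    flush();                              // push the chunk out immediately
    sleep(1);
}
```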
I'm working on a project that involves multiple users and the ability to alter data.
Basically, when a user lands on a page, he can enter information for a particular entry in a database. While he's on this page, no other user can access that particular entry; when he finishes, the entry becomes open again. Restricting access to the entry is easy: I set it up so that when a particular entry is selected, a value in the db marks it as "inactive" and no one else can get to that page. On the page itself there are "leave" and "submit" buttons, either of which sets the entry back to active.
The trouble I have is when the user clicks a different link, closes the tab, or otherwise navigates away. How can I structure it to restore the entry to an active state? I was looking into the "onunload" event and potentially using it to make an AJAX call. Is this the most logical route to take, or is there something I'm missing due to my limited knowledge? Thanks for all your help.
I wouldn't go the onunload way (at least not exclusively), as it's not reliable in case of a crash, power loss, etc. Chances are that entries could be locked "forever" in such cases.
I'd embed an Ajax method in the page, which periodically requests a PHP script to "confirm" that the page is still alive.
You can use such "confirmation" requests to build and update a table tracking the current locks, with something like a lock_id that uniquely identifies the entry being locked, a session_id that uniquely identifies the browser session holding that lock, and an expire_timestamp setting the time at which the entry should be unlocked again in case no more "confirmation" requests from that session_id come in to raise its expire_timestamp.
In combination with a cron job periodically deleting records that have exceeded their expire_timestamp, this should be a more reliable way to do what you are trying to achieve.
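A minimal sketch of the "confirmation" endpoint under those names (the locks table, its columns, and the $pdo connection are assumptions): each ping from the editing page pushes the lock's expiry 30 seconds into the future, and the cron job reduces to a single DELETE.

```php
<?php
// Heartbeat: extend this session's lock by 30 seconds on every ping.
session_start();

$lockId = (int)($_POST['lock_id'] ?? 0);
$pdo->prepare('UPDATE locks
               SET expire_timestamp = UNIX_TIMESTAMP() + 30
               WHERE lock_id = ? AND session_id = ?')
    ->execute([$lockId, session_id()]);

// Cron-job counterpart, run every minute or so:
// DELETE FROM locks WHERE expire_timestamp < UNIX_TIMESTAMP();
```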
I worked on a similar problem, and "onunload" is the right way to go. The downside is that if the browser crashes or is killed (from the task manager), the unload does not get invoked. So it is better to also have a job that sets entries back to the active state when the sessionId corresponding to an entry is no longer present. (You can store a combination of sessionId and an is-active lock to detect browser idleness as well.)
Okay so I'm running into a small problem.
Basically my whole website runs through the AJAX system: content is loaded in the middle of the page, and there are left and right menus which don't refresh.
Currently I'm trying to find a PHP->Ajax technique that refreshes the whole website in case a certain record changes in the MySQL table.
Okay so every user has a record called "State" which indicates the state of their account; this can be changed by anyone, for example when the account gets shot and killed by someone. How do I make it check what state you have and, if it changes from the "standard" state, perform a full page refresh?
I tried to find an answer for this everywhere but haven't been able to figure something out.
-----Edit-----
Okay so I'll also clarify: I kind of know how to perform a full page refresh, and I know how to retrieve data from the MySQL database; this isn't the problem.
I have a table with all the user accounts in it. One of the fields for every user is called "State"; everybody's state will be 1, which means alive. When it's 0, it means it's a dead account.
A part of my website has an auto-refresh which always fetches data from the database every 5 seconds, to check whether you're online, whether you have money, etc. It also checks what state you have.
The only thing I want to do is: when it sees your state is 0, perform a full page refresh. Since state 0 means death, you should be seeing a death screen, and I want a full page refresh because the menus have to disappear and it has to redirect you to the death page.
You need long polling / Comet: basically you keep a connection open between the client and the server, and when the state changes, the server sends the response to the client.
Basically, you'll open a long polling connection, sending the userid.
The server script receives the userid and starts monitoring for changes for that user. If such a change is detected, it sends the response.
If performance is a concern, you can use the Tornado web server. What's nice about it is that you can post from another application to the web server, and it can detect which client is affected by the change and send the response to that client.
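A minimal PHP sketch of that long-polling script, assuming a $pdo connection and the users.State column from the question; polling the database inside the loop is a simplification of a real change-notification mechanism.

```php
<?php
session_write_close();                   // don't block the user's other requests

$userId  = (int)($_GET['userid'] ?? 0);
$timeout = time() + 25;                  // answer before typical proxy timeouts

$stmt = $pdo->prepare('SELECT State FROM users WHERE id = ?');
do {
    $stmt->execute([$userId]);
    if ((int)$stmt->fetchColumn() === 0) {       // dead account
        echo json_encode(['refresh' => true]);   // client does the full page refresh
        exit;
    }
    sleep(1);
} while (time() < $timeout);

echo json_encode(['refresh' => false]);  // no change; the client reconnects
```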