Session-based Server Side Data Setting and Retrieval - php

I have a theoretical question about how to approach a current project. It is a fairly simple matching quiz using JS + PHP. I am handling the business logic on the server (answer checking, score updating) so as to roughly follow MVC conventions.
My current setup:
HTML + JS page to allow the user to drag and drop answers onto questions. On a successful drop, the question + answer combo is sent to the following:
A server-side PHP page to check the answer's correctness against an XML file. I return a few pieces of data as XML to the client, such as true/false and the number of attempts at a given question. In addition, if the answer is correct, I increment a session variable on the server to keep track of the user's score.
My question revolves around best practices for setting the above-mentioned session variable for tracking the score. I understand that a more persistent setup is most likely preferable, in case of computer shut-off, accidental browser closing, etc., but strictly based on this setup:
Is this a secure method for storing a score for a final insertion into the database?
I eventually will have to pull the score down from the server at the end of the game (or even mid game, for that matter), as well. Should I create a simple 'getter' PHP page to pull the score down, and just access the session variable and send it to the client?
Currently, the user actually has access to the server-side PHP page because it resides in the same folder as the actual quiz. This is most likely a no-no, but what is the common practice for hiding this server-only file from the user's prying eyes (without having to use authentication)?

Is this a secure method for storing a score for a final insertion into the database?
It is secure. But I don't see why you wouldn't update the score in the database as it changes; that way it will also be persistent.
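For illustration, here's a minimal sketch of what the answer-checking page could look like with both the session increment and the immediate database write. The table, column, session key, and connection details are all assumptions, not anything from your setup:

    <?php
    // check_answer.php -- sketch only; adapt names to your schema.
    session_start();

    // $isCorrect would come from checking the submitted answer against your XML file.
    $isCorrect = true;

    if ($isCorrect) {
        // Keep the running score in the session for cheap reads...
        $_SESSION['score'] = isset($_SESSION['score']) ? $_SESSION['score'] + 1 : 1;

        // ...but persist it right away, so nothing is lost if the session dies.
        $pdo = new PDO('mysql:host=localhost;dbname=quiz', 'user', 'pass'); // placeholder credentials
        $stmt = $pdo->prepare('UPDATE quiz_scores SET score = ? WHERE user_id = ?');
        $stmt->execute(array($_SESSION['score'], $_SESSION['user_id']));
    }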
I eventually will have to pull the score down from the server at the end of the game (or even mid game, for that matter), as well. Should I create a simple 'getter' PHP page to pull the score down, and just access the session variable and send it to the client?
Sounds like a plan.
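The getter page can be tiny. A sketch, assuming the same hypothetical 'score' session key as above (JSON shown, but XML works just as well if you want to match your other responses):

    <?php
    // get_score.php -- return the current score from the session.
    session_start();
    header('Content-Type: application/json');
    echo json_encode(array(
        'score' => isset($_SESSION['score']) ? $_SESSION['score'] : 0,
    ));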
Currently, the user actually has access to the server-side PHP page because it resides in the same folder as the actual quiz. This is most likely a no-no, but what is the common practice for hiding this server-only file from the user's prying eyes (without having to use authentication)?
As long as the files are .php files and are parsed by the webserver, the user can only make requests to them and see their output, never their source (if I understood the question correctly, that is).

Check out this article on server-side sessions; I hope it helps:
http://www.linuxforu.com/2009/01/server-side-sessions/

Related

Turn-based game score recording

I'm developing a very, very basic turn-based game using PHP and jQuery, and I'm looking at two different ways of keeping track of the current user's score:
1) Global JavaScript variables - for example, var currentScore at the beginning of the JS. The game action and turns are all controlled via AJAX, so I don't have to worry about a page refresh losing the variable data.
2) MySQL - create a row with currentScore, user, etc., and access/update it every turn.
I'm trying to balance a) load speed and b) making the score tamper-proof. I'm thinking that a local JavaScript variable would be fast and light on load time, but MySQL records would be more tamper-proof. Does anyone have any advice as to which is faster and which is more tamper-proof, or perhaps another way of accomplishing this that I didn't list above?
Run your game in PHP, not in JS.
What I mean to say is instead of allowing the player's computer to control the action, and send the results back to the server...
that allows for people to hijack and send messages to your PHP like auto-firing pistols...
...or headshot scripts ...or speed-hacking.
...or even worse -- sending in messages like: "I just scored 500 points on my turn", and having your PHP script go: "Okay!".
So instead, the core of the game engine should run in PHP, the client should just say: "My character wants to move X squares.", and then the server can say: "No, you're a cheating tool, you can only move 3 squares.", and then the client will have to adhere to those rules.
In this regard, PHP will be 100% in control of the score-keeping.
Any data that is not stored on the server can be tampered with, and any data sent to the server can be doctored. Not only should the server store all of the game data, it should also validate all incoming data from the client: for instance, do the rules actually allow this player to use the move they claim to be using? Otherwise, it will be fairly easy to cheat. Then again, your project may not require that amount of scrutiny.
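As a sketch of what that validation looks like in PHP (the rule, the MAX_MOVE constant, and the session fields are all made up for illustration):

    <?php
    // move.php -- the server, not the client, decides what a legal move is.
    session_start();

    define('MAX_MOVE', 3); // the rule lives on the server only

    $requested = isset($_POST['squares']) ? (int) $_POST['squares'] : 0;

    if ($requested < 1 || $requested > MAX_MOVE) {
        // Reject it outright; never trust the client's arithmetic.
        header('HTTP/1.1 400 Bad Request');
        echo json_encode(array('error' => 'Illegal move'));
        exit;
    }

    // Apply the move to the authoritative game state held server-side.
    if (!isset($_SESSION['position'])) {
        $_SESSION['position'] = 0;
    }
    $_SESSION['position'] += $requested;
    echo json_encode(array('position' => $_SESSION['position']));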
Both.
Never trust JavaScript in games. There will always be a clever player who will mess with it.
Use JavaScript for the GUI part and for controlling the game, but always check ALL results in PHP, especially player-specific values. Check for the right player!! Else some losers will mess with your game.
Don't worry about speed: just script your game (of course with speed and data in mind) and investigate when you hit performance problems. One issue is important from the beginning: think about your database tables and queries. They will most likely become your performance bottleneck, more than bad PHP scripting will.

Ways other than an iframe to pass a parameter from PHP to ASP.NET

I have a PHP application. Once the user has logged in to that application, I am trying to pass a parameter as a query string through an iframe to the ASP.NET page.
Is there any other way to implement other than using an iframe?
Instead of having the PHP application submit data to your ASP application, it would be better if they could natively and securely share some of the data.
How?
Well, your goal is having one script tell the other that the user has been logged in, right? In PHP, this is usually done by putting something in the $_SESSION. Your ASP application can't read $_SESSION, though. You'll need to use something else.
When the user logs in, create a unique value. Maybe the result of hash_hmac over some interesting data? Whatever it is, it should be unique every time it's created and unguessable, so don't base it on things like the user's IP address or the current time; those are too easy to guess.
Save the unique value to a database table that both applications can read. Also store other information that will help identify the user, such as her identifier (user_id or whatever you have on hand).
So, the PHP code that logs the user in has created this unique value and stuck it in a shared database table. Now, the PHP application should forward the user to your ASP application. Include the unique value in the request.
When the ASP application receives the request, it will look for the unique value. If it's found, it can look in the shared table. If the value is found in the table, it can then take whatever measures it needs to in order to mark the user as logged in.
Once the ASP application has logged the user in, then it should delete the unique value from the shared table. The user can be forwarded to wherever she was going in the first place.
By making the key usable only one time, and only after a successful login in the PHP application, you'll reduce the possibilities of abuse by malicious or curious users. All of the important information will be hidden in the shared database table.
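A rough sketch of the PHP half of that hand-off, assuming a shared sso_tokens table, a $secretKey of your own, and the logged-in user's $userId already in hand (every name here is hypothetical):

    <?php
    // After a successful PHP login: mint a one-time token and hand the user off.
    $token = hash_hmac('sha256', uniqid(mt_rand(), true), $secretKey);

    $pdo = new PDO('mysql:host=localhost;dbname=shared', 'user', 'pass'); // placeholders
    $stmt = $pdo->prepare(
        'INSERT INTO sso_tokens (token, user_id, created_at) VALUES (?, ?, NOW())'
    );
    $stmt->execute(array($token, $userId));

    // Forward the user to the ASP.NET side with the token in the request.
    header('Location: https://example.com/SsoLanding.aspx?token=' . urlencode($token));
    exit;

The ASP.NET page then looks the token up in the same table, signs the user in, and deletes the row.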
Be warned that this is an overly simplistic implementation of "single sign on" and is full of caveats and edge cases. While it might work for you, it might not be the best solution. Given your question history, it looks like you've been struggling with similar issues for quite some time. You might want to give some thought into using a slightly more "industry standard" SSO mechanism. SAML is the 800 pound gorilla of SSO standards. I normally wouldn't wish it on my worst enemy, but maybe it's the thing you're really looking for here.
Also, don't use iframes; they're cookie-eating disasters in some browsers.

PHP - What data should we include in the session?

This is a beginner question...
In a website, what type of data should or should not be included in the session? I understand that I should not include any info that needs to remain secure. I'm more interested in programming best practice. For example, it is possible to put into the session some data which would otherwise be passed from page to page by dependency injection. Wouldn't that amount to creating a global variable?
Generally speaking, what kind of data does or doesn't belong in the session?
Thanks,
JDelage
The minimum amount of information needed to maintain needed state information between requests.
You should treat your session as write-once, read-many storage, but one which is rather volatile: the state of your underlying application data should remain consistent (or recoverable) even if all the sessions suddenly disappeared.
There are some exceptions to this (normally the shopping basket would be stored in the session, but you might want to perform stock adjustments to 'reserve' items prior to checkout). Here items may be added/edited/changed multiple times, so it's not really write-once, but by pre-reserving stock items you maintain the recoverability of the database. An implication of this is that you should reverse the stock adjustments when the session expires without the purchase completing.
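In code, that split looks roughly like this, assuming $itemId and $qty came from the request (the stock table and connection details are invented):

    <?php
    session_start();

    // The basket itself is session data: per-user and volatile.
    $_SESSION['basket'][] = array('item_id' => $itemId, 'qty' => $qty);

    // The stock adjustment is application data: it must stay consistent,
    // so it goes to the database -- and gets reversed if the session
    // expires without the order completing.
    $pdo = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass'); // placeholders
    $stmt = $pdo->prepare('UPDATE stock SET reserved = reserved + ? WHERE item_id = ?');
    $stmt->execute(array($qty, $itemId));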
If you start trying to store information about the data relating to individual page turns, you're quickly going to get into problems when the user starts clicking on the forward/back buttons or opens a new window.
In general you can put anything you like in a session. It's bad practice to put information in a session that has to be present to make your page run without (technical) errors.
I suggest minimizing the amount of data in your session as much as possible.
Stuff you can save in the session so that you don't have to make another database query for info that isn't going to change: their username, address, phone number, account balance, security permissions on your site, etc.
(This is perhaps more than you're looking for, but might make for good additional information to add to the good answers already posted.)
Since you mention best practices, you may want to look into some projects/technologies which can be used to take the idea of session state a bit further. One common pitfall with horizontally scaling web applications across multiple servers is maintaining session state between them. (User A logs in to Server A which stores the user's session, but on the next request hits Server B which doesn't know about User A's session, etc.)
One of the things I always end up saying to myself and to colleagues is that session by itself isn't really the best place to store data, even if that data is highly transient in nature. A web server is a request/response system, not a data store. It's highly tuned to the former, but not always so great for the latter.
Thus, there are ways to externalize your application's session data (or any stateful data, which should really be kept to a design minimum given the RESTful, stateless nature of the web) from your web server onto another system. Memcached is a very common tool for this. There are also drop-in session replacements (or configurable session options for various frameworks/environments) which store the session in a database like SQL Server or MySQL.
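As a concrete example, PHP's native session handling can be pointed at memcached with two settings. This sketch uses the older memcache extension (the memcached extension spells the handler name and save path slightly differently), and the host/port are placeholders:

    <?php
    // Must run before session_start().
    ini_set('session.save_handler', 'memcache');
    ini_set('session.save_path', 'tcp://127.0.0.1:11211');
    session_start();

    // Session data now lives in memcached, so any web server in the
    // farm pointing at the same memcached instance sees the same session.
    $_SESSION['user_id'] = 42;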
One idea I've been toying with lately is to store session data (well, any transient data where it's OK to lose it in a catastrophe) in a NoSQL database. CouchDB and MongoDB are my current top choices for this, but there's no shortage of other options. CouchDB has excellent horizontal scaling, MongoDB is ridiculously fast when run entirely in-memory, etc.
One of the major benefits of something like this, at least for me, is that deployments can easily become non-events. The web services on any given server can be re-started and the applications therein re-initialized without losing stateful data. If the data is persisted to the disk (that is, not entirely run in-memory) then the server can even be rebooted without losing it. Servers/services can drop in and out of the farm and users would never know the difference.
Additionally, externalizing this data allows you to analyze the data in potentially useful ways. Query it, run metrics on it, interface with it via other web applications or entirely offline tools, etc. It really opens up the options as a project grows in complexity.
(Again, this isn't really intended to answer your question, but rather to just add information that you may find useful. It's something my colleagues and I have been tinkering with as of late and your question seemed like a good place to mention it.)

What is a best practice method to log visits per page / object

Take my profile for example, or the number of views on any question on this site. What is the process of logging the number of visits per page or object on a website? I presume it includes some of the following:
Counting registered users once (this must be reflected in the db, recording which pages / objects the user has visited); this will not include unregistered users.
IP: log the visit of each IP per page / object; this could be troublesome, as you might have 2 different people behind the same IP, or you may actually want to track repeat visitors.
Cookie: this will probably mean that people with multiple computers get counted twice.
other method goes here ....
The question is, what is the process and best practice to count user requests?
EDIT
I've added the computer languages to the list of tags as they are of interest to me. Feel free to include any libraries, modules, and/or extensions that achieve the task.
The question could be rephrased into:
How does someone go about measuring the number of impressions when a user visits a page? The question is not intended to be about what Google Analytics does; rather, it should be something similar to when you click on a Stack Overflow question or profile and see the number of views.
The "correct" answer varies according to the situation; primarily the most desired statistic and the availability of resources to gather and process them:
eg:
Server Side
Raw web server logs
All webservers have some facility to log requests. The trouble with them is that it takes a lot of processing to get meaningful data out and, for your example scenario, they won't record application-specific details, like whether or not the request was associated with a registered user.
This option won't work for what you're interested in.
File based application logs
The application programmer can add custom code to the application to record the stuff you're most interested in to a log file. This is similar to the webserver log, except that it can be application-aware and record things like the member making the request.
The programmers may also need to build scripts which extract the stuff you're most interested in from these logs. This option might suit a high-traffic site with lots of disk space and sysadmins who know how to ensure the logs get rotated and pruned from the production servers before bad things happen.
Database based application logs
The application programmer can write custom code for the application which records every request in a database. This makes it relatively easy to run reports and makes the data instantly accessible. This solution incurs more system overhead at the time of each request, so it is better suited to lower-traffic sites, or to scenarios where the data is highly valued.
Client Side
Javascript postback
This is a consideration on top of the above options. Google Analytics does this.
Each page includes some javascript code which tells the client to report back to the webserver that the page was viewed. The data might be recorded in a database, or written to file.
It has a strong advantage of improving accuracy in scenarios where impressions get lost due to heavy caching/proxying between the client and server.
Cookies
Every time a request is received from someone who doesn't present a cookie, you assume they're new, record that hit as 'anonymous', and return a uniquely identifying cookie once they log in. How accurate this proves depends on your application: some applications don't lend themselves to caching, so it will be quite accurate; others (high traffic) encourage caching, which will reduce the accuracy. Obviously it's not much use until they re-authenticate whenever they switch browsers/locations.
What's most interesting to you?
Then there's the question of what statistics are important to you. For example, in some situations you're keen to know:
how many times a page was viewed, period,
how many times a page was viewed, by a known user
how many of your known users have viewed a specific page
You then typically want to break these down into periods of time to see trends.
Respectively:
are we getting more views from random people?
are we getting more views from registered users?
or has pretty much every one who is going to see the page now seen it?
So back to your question: best practice for "the number of impressions when a user visits a page"?
It depends on your application.
My guess is that you're best off with a database backed application which records what is most interesting to your application and uses cookies to trace the member's sessions.
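In PHP terms that can be as small as one insert per request. A sketch, with the table, columns, and connection details invented for the example:

    <?php
    session_start(); // PHP's session cookie doubles as the visitor trace

    $pdo = new PDO('mysql:host=localhost;dbname=stats', 'user', 'pass'); // placeholders
    $stmt = $pdo->prepare(
        'INSERT INTO page_views (page, user_id, session_id, viewed_at)
         VALUES (?, ?, ?, NOW())'
    );
    $stmt->execute(array(
        $_SERVER['REQUEST_URI'],
        isset($_SESSION['user_id']) ? $_SESSION['user_id'] : null,
        session_id(),
    ));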
The best practice for a hit counter depends on how much traffic you expect your site to receive. As wybiral suggested, you can implement something that writes to a database after every request. This might include the IP address if you want to count unique visitors, or it could be a simple as just incrementing a running total for each page or for each (page, user) pair.
But that requires a database write for every request, even if you just want to serve a static page. Ideally speaking, a scalable web app should serve as much as possible from an in-memory cache. Database or disk I/O should be avoided as much as possible.
So the ideal set up would be to build up some representation of the server's activity in-memory and then occasionally (say every 15 minutes) write those events to the database. You could conceivably queue up thousands of requests and then store them with a single database write.
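The same buffering idea can be sketched in PHP with APCu: each request does a cheap in-memory increment, and a cron job flushes the totals every so often. Key and table names are illustrative, and note that APCu is per-server, so a multi-server farm would need to aggregate in the flush step:

    <?php
    // On each request: no database I/O, just a shared-memory counter.
    $key = 'hits:' . $_SERVER['REQUEST_URI'];
    if (!apcu_add($key, 1)) { // creates the counter if absent...
        apcu_inc($key);       // ...otherwise bumps it
    }

    // In a cron script (say, every 15 minutes): one DB write per page.
    $pdo = new PDO('mysql:host=localhost;dbname=stats', 'user', 'pass'); // placeholders
    foreach (new APCUIterator('/^hits:/') as $entry) {
        $stmt = $pdo->prepare('UPDATE page_totals SET views = views + ? WHERE page = ?');
        $stmt->execute(array($entry['value'], substr($entry['key'], 5)));
        apcu_delete($entry['key']); // small race window here; fine for rough counters
    }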
There's a tutorial describing how to do exactly this in Python using Celery and Carrot: http://packages.python.org/celery/tutorials/clickcounter.html. It also includes some examples of how to set up your database tables using Django models and what code to call whenever someone accesses a page.
This tutorial will certainly be helpful to you regardless of what you choose to implement, although this level of architecture might be overkill if you don't expect thousands of hits each hour.
Use a database to keep a record of the unique IPs (if the IP doesn't exist in the DB, create it; otherwise continue as planned) and then query the database for the number of those entries. Index this on IP and URL to store views for individual pages. You won't have to worry about tracking registered users this way; they will be included in the unique IP count. As far as multiple people from one IP goes, there's not much you can do there short of requiring an account and counting user-to-page views similarly.
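One way to express that with a database (MySQL flavour; table, columns, and connection details invented) is a unique index over (ip, url), so a repeat visit from the same address is simply ignored:

    <?php
    // Schema, run once:
    //   CREATE TABLE visits (ip VARCHAR(45), url VARCHAR(255),
    //                        UNIQUE KEY uniq_visit (ip, url));
    $pdo = new PDO('mysql:host=localhost;dbname=stats', 'user', 'pass'); // placeholders

    // INSERT IGNORE skips the row if this IP has already viewed this URL.
    $stmt = $pdo->prepare('INSERT IGNORE INTO visits (ip, url) VALUES (?, ?)');
    $stmt->execute(array($_SERVER['REMOTE_ADDR'], $_SERVER['REQUEST_URI']));

    // Unique view count for the current page:
    $stmt = $pdo->prepare('SELECT COUNT(*) FROM visits WHERE url = ?');
    $stmt->execute(array($_SERVER['REQUEST_URI']));
    $views = $stmt->fetchColumn();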
I would suggest using a persistent key/value store like Redis. If you use a list with the list key being the serialized identifier, you can store other serialized entries and use llen to find the list size.
Example (Python, using redis-py):

    import redis

    # decode_responses=True makes lrange return strings, so the membership
    # test below compares like with like.
    redisStore = redis.Redis(host='localhost', port=6379, decode_responses=True)

    def initializeAndPush(serializedKey, serializedValue):
        # lrange on a missing key returns [], so one check covers both cases.
        if serializedValue not in redisStore.lrange(serializedKey, 0, -1):
            redisStore.rpush(serializedKey, serializedValue)

    def getSizeOf(serializedKey):
        # llen returns 0 for keys that do not exist.
        return redisStore.llen(serializedKey)
Using this technique, you can use anything as serializedKey or serializedValue. If you want to store IPs with today's date or serialized login information, both are just as simple. Also, only unique serializedValues are stored since writes are locked on read (at least as I recall).
I would try to implement pixel tracking to track views on your page/object. This method is used by Google (Google Analytics) and other high-profile media companies.
Pixel tracking will work fine, since you can point the tracking pixel at an HttpHandler dedicated to that purpose. That way you can separate the load and even use some kind of queue for high-load scenarios.
Also, you can incorporate user-specific information in the tracking pixel, such as WHO has visited the page.
eg:
<img src="fakeimages/imba.gif?uid=123&info2=a&info3=b" style="height:1px;width:1px;" />
Then you handle any request going to fakeimages/*.gif with a specific HttpHandler / PHP controller (whatever language you're using) and process the information.
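In PHP, the handler behind the pixel might look like this sketch (the log destination is an example):

    <?php
    // pixel.php -- log the query-string payload, then serve a 1x1 transparent GIF.
    error_log(date('c') . ' ' . $_SERVER['QUERY_STRING'] . "\n", 3, '/var/log/pixel.log');

    header('Content-Type: image/gif');
    header('Cache-Control: no-cache, no-store, must-revalidate'); // every view must reach us
    echo base64_decode('R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7');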
regards

Database / PHP security question

We are developing a very simple first stage GUI for a company database.
At the moment our time to deliver is rather limited.
So we thought about using a simple SQL stored procedure to retrieve all the data.
The data the users are allowed to see is depending on security levels defined in the database and also in our Active Directory.
So after fetching all the data, the GUI displays only what the user has access to view / edit.
My question is whether there are any notable security issues with this approach. It should also be noted that both the web interface and the database are located on our intranet.
Our backend uses W2K3, IIS, PHP 5, SQL 2005
Any feedback would be greatly appreciated
Jonas
Considering the time to deliver (about one month), it should be rather OK.
First: since it is intranet-only, your site is already somewhat protected, because the outside world cannot reach it.
Secondly, XSS and cross-site request forgery should be guarded against no matter what.
Next, SQL injection.
With these few things in mind, the application should be basically secure.
Don't put an outward facing web server on your internal network. Seriously. Put it in a DMZ.
As far as your data is concerned, will you be filtering based on user access before or after the data hits the web front end? I'd suggest doing it in the proc.
Also, if you can, I'd suggest putting your DB on a separate box as well, for added security.
It is a sound enough approach. This way the data the user is not allowed to see remains in the database.
"So after fetching all the data, the GUI displays only what the user has access to view / edit."
A frequent mistake when dealing with access control on websites is implementing them for the data fetching scenario but not the data writing scenario. This is often the result of the assumption "the user will only send us editing requests on resources that we told her she could edit". Unfortunately...
As I couldn't spot this in your question's content, I'd just recommend making sure you deal with access control effectively not only when building the GUI but also when receiving data modification requests.
If we consider the following scenario:
The user fetches data she has legitimate access to.
The user requests to edit said data. Let's imagine an edit form is now displayed.
The user submits the form with the changes.
Before the request leaves her machine, the user intercepts it and replaces the identifier of the edited resource with another identifier, one she shouldn't have access to.
Does your model ensure that the access control rules are also applied when receiving the editing request? In SQL-like terms, this amounts to asking whether you're using a request template like the first one below or the second:
1) "UPDATE ... WHERE ID = x"
2) "UPDATE ... WHERE ID = x AND (SELECT ... FROM ... WHERE userID = y)"
If your model is more like the first, then you might have an authorization model issue. Otherwise, it should be okay.
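In PHP with prepared statements, the second form amounts to carrying the session's user id into the WHERE clause. A sketch, with the table and connection details invented for the example:

    <?php
    session_start();

    $pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass'); // placeholders

    // The resource id comes from the client; the user id does NOT --
    // it comes from the server-side session, out of the user's reach.
    $stmt = $pdo->prepare('UPDATE documents SET body = ? WHERE id = ? AND owner_id = ?');
    $stmt->execute(array($_POST['body'], (int) $_POST['id'], $_SESSION['user_id']));

    if ($stmt->rowCount() === 0) {
        // Either the row doesn't exist or it isn't hers: same response either way.
        header('HTTP/1.1 403 Forbidden');
    }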
Hope it helps.
sb.
