Implementation of AJAX status check - PHP

I am battling with race condition protection in PHP.
My application is written in Symfony 1.4, and PHP locks session data until a page completes processing. I have a long-running (~10 second) login script, and I want to display a progress bar showing the user what is being done while they wait. (I want to actually display what is being done and not use a standard [faux] loading bar.)
Whenever a script calls session_start(), PHP locks that user's session data until that script completes. This prevents my status check ajax calls from returning anything until the longer running script completes. (My question on why my ajax calls were not asynchronous is here.)
I have devised a way to do this, but I want to make sure it is secure enough for general purposes (i.e., this is not a banking application).
My idea is:
On authentication of username & password (before the long login script starts), a cookie is set on the client computer with a unique identifier.
This same unique identifier is written to a file on the server along with the client IP address.
While the long login script runs, it will update that file with the status of the login process.
The ajax status check will ping the server on a special page that does not use session_start(). This page will get the cookie value and the client IP and check the server side file for any status updates.
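A rough sketch of those steps, assuming a writable status directory; the file path, cookie name, and login_status.php page are illustrative, not something Symfony provides:

    // In the login action, after username/password are verified
    // but before the slow part begins:
    $token = md5(uniqid(mt_rand(), true));        // unique identifier
    setcookie('login_status_token', $token);
    $file = '/tmp/login_status/' . $token . '_' . md5($_SERVER['REMOTE_ADDR']);
    file_put_contents($file, 'authenticated');    // overwrite as each step completes

    // login_status.php -- the page the Ajax calls ping; note: no session_start().
    $raw   = isset($_COOKIE['login_status_token']) ? $_COOKIE['login_status_token'] : '';
    $token = preg_replace('/[^a-f0-9]/', '', $raw);  // never use raw cookie data in a path
    $file  = '/tmp/login_status/' . $token . '_' . md5($_SERVER['REMOTE_ADDR']);
    echo is_readable($file) ? file_get_contents($file) : 'unknown';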
Are there any glaringly obvious problems with this solution?
Again, from the security angle: even if someone hacked this, all they would get is a number representing the state of the login process.

I don't see anything inherently wrong with the approach that you are proposing.
But if your machine has APC installed, you can use apc_store and apc_fetch to keep your status in a sort of shared memory instead of writing to disk. Use something like apc_store(SID, 'login not started') to initialize and update the request state in memory, then apc_fetch(SID) to retrieve it on subsequent requests.
There are other shared-memory systems as well, and even a database connection might be simpler to set up.
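A sketch of the APC variant, keying on the session ID taken from the cookie so the status page can still skip session_start(); the key prefix and status strings are made up for illustration:

    // In the long login script (session already started):
    $key = 'login_status_' . session_id();
    apc_store($key, 'login not started', 300);   // 300 s TTL so stale entries expire
    // ...as the login progresses:
    apc_store($key, 'checking credentials', 300);
    apc_store($key, 'loading profile', 300);

    // status.php -- polling endpoint, no session_start():
    $sid    = isset($_COOKIE[session_name()]) ? $_COOKIE[session_name()] : '';
    $status = apc_fetch('login_status_' . basename($sid));
    echo ($status !== false) ? $status : 'unknown';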

I had the same problem, and I think the trick is session_write_close(), which frees the session file.
Please see my https://github.com/jlaso/MySession repository and check whether this can be applied to your particular case.
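The pattern itself is small; a minimal sketch (not taken from that repository), with do_login_step() standing in for each slow part of the login:

    session_start();                      // acquires the session lock
    $_SESSION['login_progress'] = 0;
    session_write_close();                // releases the lock -- status polls can run now

    for ($step = 1; $step <= 5; $step++) {
        do_login_step($step);             // hypothetical long-running work
        session_start();                  // briefly re-acquire the lock to update
        $_SESSION['login_progress'] = $step;
        session_write_close();
    }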

PHP while(true) loop for file updates

I've got the following problem at hand:
I'm having users on two separate pages, but saving page input to the same text file. While one user is editing, the other can't. I'm keeping track of this with sessions, writing changes and whose turn it is to edit into a file.
It works fine so far; the output in the end is quite similar to a chat. However, right now I'm having users manually refresh their page and reload the file. What I'd like to do is have the page execute a redirect when the file's timestamp changes (indicating that the last user has saved their edits and it's another user's turn). I've looked into JavaScript short polling a little, but then found the PHP filemtime function, and it looks much easier to use. Well, here's what I got:
while (true) {
    $oldtimestamp = filemtime("msks/" . $_SESSION['user']['kampfnr'] . ".txt");
    $waittimer = 2;
    $waittimer++;
    sleep($waittimer);
    clearstatcache();   // filemtime() results are cached; without this the
                        // second call returns the same value forever
    $newtimestamp = filemtime("msks/" . $_SESSION['user']['kampfnr'] . ".txt");
    if ($newtimestamp > $oldtimestamp) {
        addnav("", "kampf_ms.php?op=akt");
        redirect("kampf_ms.php?op=akt");
    }
}
In theory, while the user sees the output "it's ... turn to edit the file.", this should loop in the background, checking whether the file has been updated, and if so, redirect the user.
In practice, this heavily affects server performance (I'm on shared hosting) until it breaks with a memory-exceeded error message.
Is something wrong with the code? Or is it generally a bad idea to use a while loop in this case?
Thanks in advance!
PHP should only be used to generate web content: the client makes a request to the server, the server runs the required script and returns the response to the client.
Once the page is built and sent to the client, the connection is closed; the connection could drop entirely and the client would never be informed.
So with an infinite loop, not only can the client end up waiting for a response indefinitely, but the server may also be heavily impacted by the load. It really is a bad idea :)
PHP can't be used for bidirectional communication: it is just invoked to build the pages the client requests, so it can't do anything "in the background" (not directly; you can call an external script, but not to notify a client).
Also, for bidirectional communication, PHP and "regular" HTTP are not a good fit because of the client/server architecture (the server only answers client requests; it is passive).
I can suggest using the WebSocket protocol to build a chat application:
http://socket.io/
https://en.wikipedia.org/wiki/WebSocket
For that, though, you need an "active" server solution such as Node.js or Ruby (depending on your server's capabilities...).
The other way, if you want to stay in PHP, is for the client to make an Ajax request every 10 seconds, for example, to a PHP script that checks the file and sends back a message if it has been updated; but this comes at a heavy performance cost, so forget it immediately.
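If you do end up polling despite that warning, the server side can at least be kept tiny. A sketch (poll.php is an invented name; the file path comes from the question's code):

    // poll.php -- the client calls this every few seconds and reloads
    // once the reported mtime is newer than the one it saw last.
    session_start();
    $file = 'msks/' . $_SESSION['user']['kampfnr'] . '.txt';
    session_write_close();       // don't hold the session lock while answering
    clearstatcache();            // filemtime() results are cached per request
    header('Content-Type: application/json');
    echo json_encode(array('mtime' => file_exists($file) ? filemtime($file) : 0));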

How secure is it to use client side idle timer / how do big web app developers do it?

I'm building a PHP-based web app and have implemented a session-variable-based login method. Before anything is loaded, this method clears the "logged in" state of all users that have spent X amount of time on the site without loading a new page or reloading the current one. This is working great; however, it only performs the check when a page is loaded and doesn't look for mouse/keyboard activity (think of a long form being filled out).
A similar question has been asked here; however, the code's security vulnerability was never questioned. I also found Paul Irish's solution to this, but didn't find any reference to security there either.
Is it a realistic fear that the JavaScript code may be deactivated or intercepted, meaning I should keep an "absolute timeout" server-side? How do other big web app developers do it?
There are a number of interrelated issues here. My approach is that this can be done entirely server-side first, with a JavaScript improvement added on top, such that it doesn't matter if the client-side code fails to run.
The core of the solution is to set a last_seen time on the server when the user logs on. Whenever you detect any user activity (either a page rendering or an AJAX response), check whether this time is within acceptable bounds for you, and if it is, overwrite it with the current time. If it is outside those bounds, destroy the session on the server side and force a redirect to the login page, with a suitable message.
Once you have done that, there is no way that server-side value can be tinkered with, unless the user artificially refreshes their page pointlessly in order to avoid the timeout - but in that case, you would be unable to determine the difference between a "real" page request and one designed just to reset the session timer. If you are worried about malicious software on the user's computer, or an advanced man-in-the-middle attack on the user's connection, then you've got bigger problems than session timeout anyway.
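A minimal sketch of that server-side check, to be run at the top of every request; the 15-minute limit and login URL are placeholders:

    session_start();
    $timeout = 15 * 60;                               // e.g. 15 minutes

    if (isset($_SESSION['last_seen'])
            && time() - $_SESSION['last_seen'] > $timeout) {
        session_unset();
        session_destroy();                            // server-side state is gone
        header('Location: login.php?msg=timed-out');  // hypothetical login page
        exit;
    }
    $_SESSION['last_seen'] = time();                  // still within bounds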
Once this is set up, you may wish to use JavaScript to run a timer on each page that automatically shows the logged-off-automatically message. Just set this timer for the same length of time as your timeout, and of course restart it if an AJAX operation is triggered on the page. You could inject this message into the DOM, rather than redirecting, so if the user is filling out a form, they don't lose their work.
As you say, you could always detect form key-strokes to reset the timer. To do so, send an AJAX operation to the server that answers with an empty reply - that will send the cookie automatically - just ensure that your standard session code is run here too. Do use a JavaScript timer so that you don't send an AJAX op for every press - if it sometimes goes ten seconds later than the key press, then you'll not overwhelm the connection and you'll ensure your application remains speedy.

Can't have several threads per session

I am building a webapp and have implemented long-polling (and a command queue in my DB) so my server can send commands to my client asynchronously, etc. The commands are encoded into JSON and sent over Ajax calls in the client-to-server direction, and via long-polling in the server-to-client direction.
Everything was working just fine until I included my "Authentication module" in the ajax.php file. This module wraps the session handling and calls session_start().
The problem is that my long-polling routine can wait up to 21 seconds before coming back to the client. During this time, the server won't run anything else from the same session; other requests are executed only after the long-polling Ajax call has returned.
I understand there's probably a restriction of only one thread per session at a time, and that the requests are queued up.
Now here's the question: what is the best way to address this? Is there a setting that allows several threads per session (3 would be fine in my case)? Or should I just tell the client its session ID (I have a sessions table in my DB to track which user is connected to which session(s))? The client could then send it along with any Ajax calls so the authentication module could be bypassed.
With the latter option, I am afraid it opens up a bunch of security problems because of potential session spoofing. I would need to send a "random string" to each session to make sure you can't spoof too easily, but even then, it's not perfect...
Thanks for your answers :)
Nicolas Gauthier
It's a well known issue/fact that PHP locks session files for the duration of their usage in order to prevent race conditions.
If you take a look at the PHP source code, (ext/session/mod_files.c) you can see that the ps_files_open function locks the session file, and ps_files_close unlocks it.
If you call session_start() right at the beginning of your long-running script, and do not explicitly close the session file, it will be locked until the script terminates, where PHP will release all file locks during script shutdown.
While you are not using the session, you should call session_write_close to flush the session data to disk, and release the lock so that your other "threads" can read the data.
I'm sure you can imagine what would happen if the file was not locked.
T1: Open Session
T2: Open Session
...
T2: Write Data
T1: Write Data
The data written by thread 2 will be completely overwritten by thread 1, and at the same time, any data that thread 1 wanted to write out, was not available to thread 2.
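Applied to the long-polling endpoint from the question, the fix looks roughly like this; fetch_pending_commands() is a hypothetical stand-in for the command-queue query:

    // ajax.php -- long-polling endpoint
    session_start();                    // authenticate as usual
    $userId = $_SESSION['user_id'];
    session_write_close();              // release the lock BEFORE waiting, so other
                                        // requests from this session can proceed

    $deadline = time() + 21;            // the 21-second window from the question
    do {
        $commands = fetch_pending_commands($userId);  // hypothetical DB helper
        if ($commands) {
            echo json_encode($commands);
            exit;
        }
        sleep(1);
    } while (time() < $deadline);

    echo json_encode(array());          // nothing arrived; the client re-polls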

Should PHP session be created before login or after successful login

If the PHP session is created before login, one session file will be created for each request to the login page.
The problem is that if a user makes multiple requests to the server through a script, that many session files will be created.
If a user wants to attack the server, they can send an abnormally huge number of requests, creating so many session files that they eat up all the temporary space and make the service unavailable.
I am not sure if this kind of attack is really possible/feasible.
Please share your comments on this and the implications of creating the PHP session before vs. after a successful login.
I think you are misunderstanding session_start().
What happens with session_start() is, yes, it will create a file for the individual user. But the next time you call session_start(), it is going to use that same file for that same user, because the user has a cookie on their system that tells it which ID to use. In order to have the $_SESSION array available, you must call session_start() on every page.
It is very possible that someone could whip up a scenario like you just described.
In reality, yes, a hacker could always have a robot that clears its cookies after every attempt, makes 10,000 requests, and possibly creates a disk-writing problem, but I really wouldn't worry about it too much, because the files are tiny, much smaller than the script you are writing. You'd have to write a lot more files (on the order of millions or billions) to actually create a problem.
If you really want to see what the effect would be on your server, write a script that creates files in a directory with the equivalent of two paragraphs of text each, and put it in a loop for 10,000 files.
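For instance, something along these lines (the path is illustrative):

    // Rough disk-impact test: 10,000 files of about two paragraphs each.
    $junk = str_repeat('Lorem ipsum dolor sit amet. ', 40);   // ~1 KB of text
    @mkdir('/tmp/session_test');
    for ($i = 0; $i < 10000; $i++) {
        file_put_contents('/tmp/session_test/sess_' . $i, $junk);
    }
    echo "done\n";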
If you are then worried about the effects it would have, I suggest implementing a tracker that can spot a large number of hits coming to the site from a single IP address and then either temporarily ban the IP address, or do what Google does and just serve them a static captcha page that doesn't take many resources.
So, going back to the actual 'question':
I set a session for every single user that ever visits my site, because I use sessions for not only User Authentication, but for tracking other variables on my site. So, I believe that you should set it even if they aren't logged in.
If you're worried about a session fixation attack, consider using the session_regenerate_id() function.
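Used at login time, that looks something like this; credentials_are_valid() is a placeholder for your own check:

    session_start();
    if (credentials_are_valid($username, $password)) {
        session_regenerate_id(true);   // new ID; true deletes the old session file
        $_SESSION['user'] = $username;
    }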
Running session_start() is what creates the file.
What an attacker can do is build a robot that sends a different session ID in the cookie on every request; the ID just has to be supplied for PHP to treat it as an existing session and create a file for it.
It doesn't matter, really. Your server, as cheap as it might be, will have enough space to store millions of (almost empty) session files.
The worst it can do is slow down file access in the folder where session files are stored, but your server's disks should be monitored anyway, and a quickly filling /tmp partition should raise an alert at some point.

Dealing with long server-side operations using ajax?

I've got a particularly long operation that is going to run when a user presses a button on an interface, and I'm wondering what would be the best way to indicate this back to the client.
The operation is populating a fact table for a number of years' worth of data, which will take roughly 20 minutes, so I'm not intending the interface to be synchronous. Even though it is generating large quantities of data server-side, I'd still like everything to remain responsive, since the data for the month the user is currently viewing will be updated fairly quickly, which isn't a problem.
I thought about setting a session variable after the operation has completed and polling for that session variable. Is this a feasible way to do such a thing? However, I'm particularly concerned about the user navigating away or closing their browser, and then all status about the long-running job being lost.
Would it be better to insert a record somewhere logging the job when it starts and finishes, and then create some other sort of interface so the user (or users) can monitor the jobs that are currently executing/finished/failed?
Does anyone have any resources I could look at?
How would you do it?
The server side portion of code should spawn or communicate with a process that lives outside the web server. Using web page code to run tasks that should be handled by a daemon is just sloppy work.
You can't expect them to hang around for 20 minutes. Even the most cooperative users in the world are bound to go off and do something else, forget, and close the window. Allowing such long connection times screws up any chance of a sensible HTTP timeout and leaves you open to trivial DoS attacks too.
As Spencer suggests, use the first request to start a process which is independent of the http request, pass an id back in the AJAX response, store the id in the session or in a DB against that user, or whatever you want. The user can then do whatever they want and it won't interrupt the task. The id can be used to poll for status. If you save it to a DB, the user can log off, clear their cookies, and when they log back in you will still be able to retrieve the status of the task.
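A sketch of that flow; the jobs table, database credentials, and build_fact_table.php worker script are all illustrative:

    // start_job.php -- the Ajax endpoint behind the button
    session_start();
    $userId = $_SESSION['user_id'];
    $jobId  = md5(uniqid(mt_rand(), true));

    $db = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
    $db->prepare('INSERT INTO jobs (id, user_id, status) VALUES (?, ?, ?)')
       ->execute(array($jobId, $userId, 'running'));

    // Hand the 20-minute job to a process that outlives this request; the
    // (hypothetical) worker updates the row to 'done' or 'failed' when finished.
    exec('nohup php build_fact_table.php ' . escapeshellarg($jobId) . ' > /dev/null 2>&1 &');

    echo json_encode(array('jobId' => $jobId));

    // job_status.php then answers the client's polls with, in essence:
    //   SELECT status FROM jobs WHERE id = ? AND user_id = ?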
Sessions are not that reliable; I would probably design some sort of task list so I can keep records of tasks per user. With this design I would be able to show "done" tasks to keep the user aware.
I would also move the long operation out of the web server's worker process. This is necessary because web servers can be restarted.
And, yes, I would request the status from the server every few dozen seconds with Ajax calls.
You can have a JS timer that periodically pings your server to see if any jobs are done. If the user goes away and comes back, you restart the timer. When a job is done, you indicate that to the user so they can click the link and open the report (I would not recommend forcibly loading something, though it can be done).
From my experience, the best way to do this is to save on the server side which reports are running for each user, along with their statuses. The client then polls this status periodically.
Basically, instead of checkStatusOf(int session), have the client ask the server for getRunningJobsFor(int userId), returning all running jobs and their statuses.
