Use session data on websocket handshake - php

If a logged-in user navigates to a certain area of the site that uses WebSockets, how can I grab his session ID so I can identify him on the server?
My server is basically an endless while loop that holds information about all connected users, so I figured the only suitable moment to grab that ID is at the handshake. Unfortunately, the handshake's request headers contain no cookie data:
Request Headers
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Encoding: gzip, deflate
Accept-Language: en-US,en;q=0.5
Cache-Control: no-cache
Connection: keep-alive, Upgrade
DNT: 1
Host: 192.168.1.2:9300
Origin: http://localhost
Pragma: no-cache
Sec-WebSocket-Key: 5C7zarsxeh1kdcAIdjQezg==
Sec-WebSocket-Version: 13
Upgrade: websocket
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:27.0) Gecko/20100101 Firefox/27.0
So how can I really grab that ID? I thought I could somehow force JavaScript to send cookie data along with that request, but any self-respecting website in 2014 will have HttpOnly session cookies, so that won't work out. Any help is greatly appreciated!
Here's a link for the server I'm using: https://github.com/Flynsarmy/PHPWebSocket-Chat/blob/master/class.PHPWebSocket.php (thanks to accepted answer)

HttpOnly cookies, as well as Secure cookies, work fine with WebSockets.
Some WebSocket modules have chosen to ignore cookies in the request, so you need to read the docs of the module you use.
Try WebSocket-Node: https://github.com/Worlize/WebSocket-Node.
Make sure to use the secure WebSocket protocol, e.g. wss://xyz.com
Update:
Also, Chrome will not show the cookies in the "inspect element" Network tab.
In Node, try dumping the request, something like:
wsServer.on('request', function(request) {
    console.log(request);          // full handshake request
    console.log(request.cookies);  // works in WebSocket-Node
});
If you see the cookies somewhere in the log...you've got it.
If you're using secure-only cookies, you need to use secure WebSockets: wss://
Update2:
The cookies are passed in the initial request. Chrome does not always show them, as it sometimes shows provisional headers, which omit cookie information.
It is up to the WebSocket server to do 'something' with the cookies and attach them to each connection.
Looking at the code of your server: https://github.com/Flynsarmy/PHPWebSocket-Chat/blob/master/class.PHPWebSocket.php I do not see the word "cookie" anywhere, so the header is not being nicely packaged and attached to each WebSocket connection. I could be wrong, so you might want to contact the developer and ask whether the whole handshake header is attached to each connection and how to access it.
This I can say for certain: if you're using Secure cookies, then cookies will not be transmitted unless you use the secure WebSocket protocol wss://mysite.com. Plain ws://mysite.com will not work.
Also, cookies will only be transmitted in the request if the domain is the same as the webpage.
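For a PHP server like the one linked, here is a minimal sketch of what that 'something' could look like: pull the session ID out of the handshake's Cookie header and load the matching PHP session. getSessionIdFromHandshake and $rawHandshake are hypothetical names, and this assumes default file-based PHP sessions with the standard session cookie name:

function getSessionIdFromHandshake($rawHeaders)
{
    // Find the Cookie: header in the raw handshake request.
    if (!preg_match('/^Cookie:\s*([^\r\n]+)/mi', $rawHeaders, $m)) {
        return null;
    }
    // Walk the "name=value; name2=value2" pairs.
    foreach (explode(';', $m[1]) as $pair) {
        $parts = array_map('trim', explode('=', $pair, 2));
        if (count($parts) === 2 && $parts[0] === session_name()) { // "PHPSESSID" by default
            return $parts[1];
        }
    }
    return null;
}

// Inside the handshake handler:
$sessionId = getSessionIdFromHandshake($rawHandshake);
if ($sessionId !== null) {
    // In production, validate the ID format first (e.g. alphanumeric only).
    session_id($sessionId);
    session_start();        // loads $_SESSION for that user
    $userId = isset($_SESSION['user_id']) ? $_SESSION['user_id'] : null;
    session_write_close();  // release the session lock so normal page loads aren't blocked
}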

Related

Secure cookies available via HTTP

I noticed something earlier today while inspecting cookies on some of my subdomains that surprised me a little. Although I have set PHP to use only secure cookies, they are nonetheless available using HTTP.
My root domain and most of my subdomains are HTTPS. In fact, they are essentially HTTPS only. I haven't enabled HSTS because I have subdomains that are HTTP only and do not support HTTPS. (Essentially, my domains are either HTTPS only or HTTP only, not both.) While navigating to these subdomains, I noticed the cookies set by the root domain show up in the browser developer tools, and are seemingly sent on requests to the server.
I'd prefer this not happen because a) it's insecure and b) the server handling these subdomains does nothing with these cookies, so sending them at all is an unnecessary risk that could compromise the main applications.
The weird thing is my cookies are already secure. All cookies are set like so:
setcookie("my_cookie", $cookieValue, $expires, '/', $domain, true, true);
At the top of my session manager, I also have:
session_set_cookie_params(3600, '/', '.example.com', true, true);
The last two trues are secure and httpOnly. One would think this makes them HTTPS-only:
HTTPS: Cookie with "Secure" will be returned only on HTTPS connections
Reading cookies via HTTPS that were set using HTTP
To ensure that the session cookie is sent only on HTTPS connections, you can use the function session_set_cookie_params() before starting the session: https://stackoverflow.com/a/6531754/6110631
A cookie with the Secure attribute is sent to the server only with an
encrypted request over the HTTPS protocol, never with unsecured HTTP,
and therefore can't easily be accessed by a man-in-the-middle
attacker. Insecure sites (with http: in the URL) can't set cookies
with the Secure attribute. However, do not assume that Secure prevents
all access to sensitive information in cookies; for example, it can be
read by someone with access to the client's hard disk.
https://developer.mozilla.org/en-US/docs/Web/HTTP/Cookies#Secure_and_HttpOnly_cookies
But lo and behold, all cookies that I set on the root domain continue to be available on my HTTP-only subdomains. Using the developer tools, any changes with cookies in the root domain continue to be reflected on HTTP-only subdomains!
I intentionally set the cookie as .domain to make it available on all subdomains, since they all share session information and enable SSO (the HTTPS domains, that is). However, I would think that with the secure flag, this would still prevent the cookies from being available on HTTP-only subdomains. Does one of these parameters take precedence over another? (I would think secure would).
Why is this not working as intended? It seems that because the cookies are available, even though I have secure and httpOnly, the cookies could be stolen from an unencrypted HTTP connection. Is it that the cookies are not actually sent, but the browser (erroneously) displays them as present anyways, or is there a real security risk here?
The browser developer tools seem to make a distinction between displaying cookies that may be available on a domain and those that are actually sent in the request; that is to say, the developer tools will show cookies for a subdomain even if they are never sent on requests to the server.
Here's an example of a request to the root domain:
:authority: example.com
:method: GET
:path: /account
:scheme: https
accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8
accept-encoding: gzip, deflate, br
accept-language: en-US,en;q=0.9
cache-control: max-age=0
cookie: [cookies redacted]
dnt: 1
referer: [redacted]
upgrade-insecure-requests: 1
user-agent: [redacted]
Here's an example of a request to an HTTP-only subdomain:
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8
Accept-Encoding: gzip, deflate
Accept-Language: en-US,en;q=0.9
Cache-Control: max-age=0
Connection: keep-alive
DNT: 1
Host: subdomain.example.com
If-Modified-Since: Wed, 27 May 2020 20:51:35 GMT
If-None-Match: "6c5-5a6a760afe782-gzip"
Upgrade-Insecure-Requests: 1
User-Agent: [redacted]
As one can see, cookies are not sent with the second request. However, if you examine the cookies on both domains in the browser developer tools, the browser displays them regardless.
This can be a source of confusion, but examining the request headers shows that the browser does indeed refrain from sending the secure cookies on insecure requests.
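A quick server-side way to confirm this (a minimal sketch; the filename is illustrative) is to dump what actually arrives on the HTTP-only subdomain:

<?php
// cookie-check.php on the HTTP-only subdomain: Secure cookies set on
// the HTTPS root domain should not appear in this output.
header('Content-Type: text/plain');
var_export($_COOKIE);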

Mediawiki login cancelled to prevent session hijacking

I have just set up a MediaWiki 1.29.0 wiki on an AS400 IBM i machine, using MariaDB as the database and PHP 5.5.37.
Every time I try to log into an account, I get the error:
There seems to be a problem with your login session; this action has been canceled as a precaution against session hijacking. Go back to the previous page, reload that page and then try again.
Obviously, the behavior I'm looking for is to log in.
I've tried:
changing $wgMainCacheType and $wgSessionCacheType to various permutations of CACHE_NONE, CACHE_ACCEL, CACHE_DB, and CACHE_ANYTHING.
creating a tmp directory and setting its permissions.
rebuilding my LocalSettings.php file.
setting session.referer_check=off in php.ini
I've checked and I know my cookies are enabled (I'm able to call document.cookie; and get data back).
This question has been asked before here, and in the linked questions within, but no solutions fixed my problem. They also deal with an older version of MediaWiki, though I don't know if that makes a difference in this instance.
EDIT: I am also getting the same behavior when I try to create a new account. However, I am able to navigate the wiki, create pages, and edit pages without any sort of error.
Here is my response header:
Cache-Control: private, must-revalidate, max-age=0
Connection: close
Content-language: en
Content-Type: text/html; charset=UTF-8
Date: Thu, 10 Aug 2017 13:48:36 GMT
Expires: Thu, 01 Jan 1970 00:00:00 GMT
Link: </<path>/resources/assets/logo.png?88d75>;rel=preload;as=image
Server: Apache
Set-Cookie: ZDEDebuggerPresent=php,phtml,php3; path=/
Set-Cookie: <wikiname>_session=n7gs0ct99ck5i2juq0togto9q7bfou6u; path=/; secure; httponly
Transfer-Encoding: chunked
Vary: Accept-Encoding,Cookie
X-Content-Type-Options: nosniff
X-Frame-Options: DENY
X-Powered-By: PHP/5.5.37 ZendServer/8.5.5
X-UA-Compatible: IE=Edge
Here is my request header:
Accept:text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8
Accept-Encoding:gzip, deflate
Accept-Language:en-US,en;q=0.8
Connection:keep-alive
Cookie:ZDEDebuggerPresent=php,phtml,php3
Host:tdidev:10080
Referer:http://<wikiepath>/index.php?title=Special:UserLogin&retirnto=Main+Page
Upgrade-Insecure-Requests:1
User-Agent:Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.101 Safari/537.36
I've finally found the cause of my problem. By default, MediaWiki sets the <wikiname>_session cookie with the secure flag. Taken from OWASP:
The secure flag is an option that can be set by the application server when sending a new cookie to the user within an HTTP response. The purpose of the secure flag is to prevent cookies from being observed by unauthorized parties due to the transmission of the cookie in clear text.
To accomplish this goal, browsers which support the secure flag will only send cookies with the secure flag when the request is going to an HTTPS page. Said in another way, the browser will not send a cookie with the secure flag set over an unencrypted HTTP request.
So my MediaWiki installation correctly creates and caches a session token, and it even passes it through the response header. However, since my browser sees http instead of https, that's as far as the token gets. The Set-Cookie line is simply ignored.
There is a setting in php.ini called session.cookie_secure, but MediaWiki ignores this flag.
Instead, the solution was to add this line to the bottom of my LocalSettings.php file:
$wgCookieSecure = false;
I had something similar happen on a different application, when the sessionId was being updated out of sequence.
So normally you request a login form, and it creates a session with a sessionId, and stores this somewhere.
Then you submit the form; it ties that into the original sessionId, checks your authentication, and either logs in the original session or creates you a new one and updates yours (normally with an HTTP Set-Cookie header you can see in the Network log).
But you can follow everything by looking at the sessionId in your current cookies and any token on the form (to prevent replays), and checking it against either your /tmp/php-session-xxx file (maybe in /var/lib/php) or whatever database the session is being stored in.
What tipped me off to my problem was identifying that, by the time I was about to submit a form with a particular sessionId, that sessionId was no longer valid. Hence I failed the replay checks and got an error similar to yours. It turned out in my case it was to do with the databases replicating in a way that didn't match how they were being accessed downstream, so I could attempt to access a session that hadn't been created yet.
Looking at all your code, the sessionIds don't match. wpLoginToken starts with 510a85, but your wiki session in Set-Cookie starts with n7gs0c, and your log talks about 6ov933... So assuming you copied/pasted from different attempts, you need to run through it yourself from a clean state and check that everything looks like it's using the same session. If not, try to figure out what's happening to the session you have (if it's created/changed), or why you're not getting the right one if it's created but never passed out to you properly.
That said, I just took a look at the client side of logging into our own in-house version of MediaWiki, and wpLoginToken, wikidb_session and JSESSIONID don't match either (although I'd expect one of them to show up in the wiki log, which I don't have access to).
If you have to, grep the source for the error message you're getting, and insert
error_log(__FILE__.':'.__LINE__.' '.var_export(debug_backtrace(DEBUG_BACKTRACE_IGNORE_ARGS), true));
to work back up the stack and see what didn't match to generate the error.
This is an ongoing problem with MediaWiki, and it is the result of either your password being entered incorrectly, or MediaWiki failing to write SOMETHING during the login process (database, cookie, disk file, whatever). In my case, I was using the $wgReadOnly variable to make the wiki read-only. I found that I had to use $wgMainCacheType = CACHE_MEMCACHED for my system to work properly.
See: https://www.mediawiki.org/wiki/Manual:Memcached
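For reference, a minimal sketch of the relevant LocalSettings.php lines when backing the cache with memcached (the server address is illustrative):

$wgMainCacheType = CACHE_MEMCACHED;
$wgSessionCacheType = CACHE_MEMCACHED; // keep session data in memcached too
$wgMemCachedServers = array( '127.0.0.1:11211' );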

access denied to local intranet page for only 1 user

We have a PHP site hosted from an Apache server.
We have a .NET site hosted from a windows server.
Both are internal, and inside our domain.
When a user accesses the PHP site, it checks whether your username has been stored in the session. If not, it makes an Ajax GET request to the .NET "GetUsername.aspx" page. The GetUsername.aspx page simply outputs Request.ServerVariables("LOGON_USER").
The .NET site requires Windows authentication to be enabled in IE. All of our users use IE8.
In order for our PHP site to request data from the .NET site, a setting "Access data across domains" must be enabled... and it is.
In order for our .NET site to get your username, "Enable Integrated Windows Authentication" must be enabled... and it is.
Both the .NET site, and the PHP site are intranet sites. If you go to Internet Options -> Local Intranet -> Sites -> Advanced, both sites are in the list.
At this line: xmlhttp.open("GET","http://intranet.MySite.vmv/IS/GetUsername.aspx",false);
A JavaScript error occurs with the message "Error: Access is denied."
If I type that same .NET page into the browser URL... it loads just fine, and shows her username.
The confusing thing about all this is that our policy updates (pushed out on every login) set all of these settings. I've verified all the above settings are the same on my browser as they are on this user's. Both of us are within the same domain.
Any ideas of some other setting that could be causing this?
Thanks!
EDIT
Here are the results from Firebug... they don't seem to be very helpful; they just show what we already knew.
http://intranet.MyCompany.vmv/IS/GetUsername.aspx
401 Unauthorized
20ms
login (line 103)
Response Headers
Content-Length 1656
Content-Type text/html
Date Mon, 09 Apr 2012 16:15:33 GMT
Server Microsoft-IIS/6.0
WWW-Authenticate Negotiate NTLM
X-Powered-By ASP.NET
Request Headers
Accept text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Charset ISO-8859-1,utf-8;q=0.7,*;q=0.7
Accept-Encoding gzip, deflate
Accept-Language en-us,en;q=0.5
Connection keep-alive
Host intranet.MyCompany.vmv
Origin http://192.168.1.2:10078
Referer http://192.168.1.2:10078/login
User-Agent Mozilla/5.0 (Windows NT 5.1; rv:8.0.1) Gecko/20100101 Firefox/8.0.1
Well... the issue appears to have been with IE7. Somehow this user's version was not upgraded with everyone else's. Once we upgraded her to IE8, it worked as expected.

How does PHP access information about the client's browser?

How is it possible for client browser data to be saved in an array in PHP?
PHP runs on the server side, so I don't understand how it has access to information about the client's browser.
User agent data is usually sent with every HTTP request, in the User-Agent HTTP header field. You might want to read up on HTTP message formats in general. For example, this is part of the HTTP request that my browser sent to load jQuery on this very page:
GET http://ajax.googleapis.com/ajax/libs/jquery/1.5.2/jquery.min.js HTTP/1.1
Accept-Encoding: gzip,deflate,sdch
Accept-Language: en-US,en;q=0.8
Connection: keep-alive
If-Modified-Since: Fri, 01 Apr 2011 21:23:55 GMT
Accept-Charset: UTF-8,*;q=0.5
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.696.60 Safari/534.24
Accept: */*
PHP reads the client browser data that you're asking about from the User-Agent header field.
The client sends data to the server, which the server uses to build the array (I'm assuming you're talking about $_GET, $_POST, $_SERVER, etc.)
You will find it here:
$_SERVER['HTTP_USER_AGENT']
You may need to parse this with a regex to get the browser name and version separately.
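For example, a rough sketch of such a parse (real user-agent strings are messy, so treat the pattern as illustrative only):

$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
// Naive match: a browser token followed by a version number.
if (preg_match('#(Firefox|Chrome|Safari|MSIE)[/ ]([\d.]+)#', $ua, $m)) {
    $browser = $m[1]; // e.g. "Chrome"
    $version = $m[2]; // e.g. "11.0.696.60"
}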
$_REQUEST
An associative array that by default contains the contents of $_GET, $_POST and $_COOKIE.
The data is submitted by the browser when a new page is requested; PHP just puts it into an array for your convenience.
You should start by reading a bit about HTTP (GET and POST to begin with), and HTTP headers.
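For instance (a tiny sketch; the filename and parameter are illustrative), a request to info.php?lang=en would surface in those arrays like this:

// info.php
echo $_GET['lang'];                    // "en", from the query string
echo $_REQUEST['lang'];                // same value via the merged array
echo $_SERVER['HTTP_ACCEPT_LANGUAGE']; // the raw Accept-Language header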

Can a cURL based HTTP request imitate a browser based request completely?

This is a two part question.
Q1: Can a cURL-based request 100% imitate a browser-based request?
Q2: If yes, what options should all be set? If not, what extra does the browser do that cannot be imitated by cURL?
I have a website and I see thousands of requests being made from a single IP in a very short time. These requests harvest all my data. When I looked at the log to identify the agent used, it looks like a request from a browser, so I was curious to know whether it's a bot and not a user.
Thanks in advance
This page has all the answers to your questions; you can imitate most of it.
R1: I suppose that, if you set all the correct headers, yes, a cURL-based request can imitate a browser-based one: after all, both send an HTTP request, which is just a couple of lines of text following a specific convention (namely, the HTTP RFC).
R2: The best way to answer that question is to take a look at what your browser is sending; with Firefox, for instance, you can use either Firebug or LiveHTTPHeaders to get that.
For instance, to get this page, Firefox sent these request headers:
GET /questions/1926876/can-a-curl-based-http-request-imitate-a-browser-based-request-completely HTTP/1.1
Host: stackoverflow.com
User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; fr; rv:1.9.2b4) Gecko/20091124 Firefox/3.6b4
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: fr,fr-fr;q=0.8,en-us;q=0.5,en;q=0.3
Accept-Encoding: gzip,deflate
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
Keep-Alive: 115
Connection: keep-alive
Referer: http://stackoverflow.com/questions/1926876/can-a-curl-based-http-request-imitate-a-browser-based-request-completely/1926889
Cookie: .......
Cache-Control: max-age=0
(I just removed a couple of pieces of information -- but you get the idea ;-) )
Using cURL, you can work with curl_setopt to set the HTTP headers; here, you'd probably use a combination of CURLOPT_HTTPHEADER, CURLOPT_COOKIE, CURLOPT_USERAGENT, ...
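As a minimal sketch (the URL and header values are copied from the capture above; the cookie string is a placeholder), that could look like:

$ch = curl_init('http://stackoverflow.com/questions/1926876/can-a-curl-based-http-request-imitate-a-browser-based-request-completely');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// Pretend to be the same Firefox build as in the capture above.
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows; U; Windows NT 5.1; fr; rv:1.9.2b4) Gecko/20091124 Firefox/3.6b4');
curl_setopt($ch, CURLOPT_HTTPHEADER, array(
    'Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8',
    'Accept-Language: fr,fr-fr;q=0.8,en-us;q=0.5,en;q=0.3',
));
curl_setopt($ch, CURLOPT_ENCODING, '');   // advertise gzip/deflate like a browser
curl_setopt($ch, CURLOPT_COOKIE, 'name1=value1; name2=value2'); // placeholder cookies
$html = curl_exec($ch);
curl_close($ch);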
