PHP cURL: Mozilla LiveHTTPHeaders analysis needed

I was trying to log in to a site using the PHP cURL library. Even though I am successfully logged in, I can't seem to access any pages beyond the login page. I know there could be an issue with cookies, but trust me, I've tried every possible combination of COOKIEJAR and COOKIEFILE.
I need some help analyzing this set of LiveHTTPHeaders output. I'm worried about the POST fields, particularly Login.x and Login.y: they seem to change on every login. Could that be an issue? How do I figure out how a random integer is being assigned to these values? Also, is more than one cookie being added? If so, how do I incorporate that into cURL? Do I use one COOKIEJAR, several, or name a number of cookies in a single statement?
I've pasted the headers below:
http://amizone.net/Amizone/default.aspx
POST /Amizone/default.aspx HTTP/1.1
Host: amizone.net
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:6.0.2) Gecko/20100101 Firefox/6.0.2
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-us,en;q=0.5
Accept-Encoding: gzip, deflate
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
Connection: keep-alive
Referer: http://amizone.net/
Cookie: ASPSESSIONIDSSBCDQAQ=FJHPMILBALMDGIFEOOOBNFHI
Content-Type: application/x-www-form-urlencoded
Content-Length: 55
username=1596681&password=CENSORED&Login.x=14&Login.y=15
I will only post the cURL code if needed.
LiveHTTPHeaders info for HOME PAGE:
GET /amizone/default.aspx HTTP/1.1
Host: amizone.net
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:6.0.2) Gecko/20100101 Firefox/6.0.2
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-us,en;q=0.5
Accept-Encoding: gzip, deflate
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
Connection: keep-alive
LiveHTTPHeaders info for LOGIN: (shown at the top; no changes)
LiveHTTPHeaders info for ANY PAGE ACCESS AFTER LOGIN:
GET /amizone/WebForms/TimeTable/StudentTimeTableForWeek.aspx HTTP/1.1
Host: amizone.net
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:6.0.2) Gecko/20100101 Firefox/6.0.2
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-us,en;q=0.5
Accept-Encoding: gzip, deflate
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
Connection: keep-alive
Referer: http://amizone.net/amizone/WebForms/Home.aspx
Cookie: ASP.NET_SessionId=hn5mfsre0y3b1l45nxlgzr55; UserId=127953D3849DEF71FB6CF9F55DD3BBADE48E686D24ADC87923FB6C60077ECC0362AB0C5A9C4DF194461C348DBAE6FEC861827F886FE2C17EA79155500CA4FC04EE897B7658A59DA2F286F2436F6EDD07BE2DD7DD829798F4C81ABAEFEE400B3A71078A74BF1C169BF1DA2865CC9E5968FF26ED7D; countrytabs=0; countrytabsRT=0; countrytabsRB=0
*** Notice how multiple cookies are sent in this case (I think). How should my COOKIEJAR and COOKIEFILE options change?
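For what it's worth, Login.x and Login.y are the pixel coordinates of the click on an image-type submit button (input type="image" name="Login"), so values that change on every login are harmless; hard-coded small integers normally work. A single cookie jar file also holds any number of cookies, so one COOKIEJAR/COOKIEFILE pair is enough. A minimal sketch of the login POST, reusing the field names from the trace above:
<?php
// Minimal sketch: one cookie jar file stores every cookie the server
// sets; Login.x / Login.y are just image-button click coordinates.
$ch = curl_init('http://amizone.net/Amizone/default.aspx');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(array(
    'username' => '1596681',
    'password' => 'CENSORED',
    'Login.x'  => 14, // arbitrary click coordinates; any small integers work
    'Login.y'  => 15,
)));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_COOKIEFILE, '/tmp/amizone_cookies.txt'); // cookies sent with requests
curl_setopt($ch, CURLOPT_COOKIEJAR, '/tmp/amizone_cookies.txt');  // cookies saved on curl_close()
$html = curl_exec($ch);
curl_close($ch);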

When recording a session, it is important that you first flush all cookies and then note when cookies are set by the server.
Very often, the required cookies are set in the login page or another page that the browser loads first, and then when you POST to the particular URL the previously set cookies must be passed on.
So, the attached trace is insufficient.

This cURL code has been sufficient for me in the past to maintain login sessions by storing cookies:
$ch = curl_init('https://somesecureurl.com/login/account');
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US; rv:1.8.1.1) Gecko/20061204 Firefox/2.0.0.1');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_ENCODING, 'gzip, deflate');
curl_setopt($ch, CURLOPT_COOKIEFILE, '/tmp/hmcookies.txt'); // cookies in this file are sent by curl with the request
curl_setopt($ch, CURLOPT_COOKIEJAR, '/tmp/hmcookies.txt'); // upon completing request, curl saves/updates any cookies in this file
$data = curl_exec($ch);
Other things to ensure: the cookie jar file must be writable by the web server, or the web server must have permission to create it.
As also stated by Daniel, some sites may require that you first visit a login page to get some initial cookies set, and only then post the login form. So your requests may go (see the sketch after this list):
Request login page
Post to login form
Try to access protected page
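A rough sketch of that three-step flow with a single reused handle, so the in-memory cookies carry over between requests (all URLs and credentials are placeholders):
<?php
$jar = '/tmp/hmcookies.txt';
$ch  = curl_init();
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_COOKIEFILE, $jar);
curl_setopt($ch, CURLOPT_COOKIEJAR, $jar);

// 1. Request the login page so any initial cookies get set.
curl_setopt($ch, CURLOPT_URL, 'https://somesecureurl.com/login');
curl_exec($ch);

// 2. Post the login form; the cookies set above travel with it.
curl_setopt($ch, CURLOPT_URL, 'https://somesecureurl.com/login/account');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, 'username=me&password=secret');
curl_exec($ch);

// 3. Fetch a protected page with the now-authenticated session.
curl_setopt($ch, CURLOPT_HTTPGET, true); // switch back from POST to GET
curl_setopt($ch, CURLOPT_URL, 'https://somesecureurl.com/account/home');
$page = curl_exec($ch);
curl_close($ch); // flushes the cookies to $jar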

Related

Kanboard and reverse-proxy authentication

According to the documentation (https://docs.kanboard.org/en/latest/admin_guide/reverse_proxy_authentication.html), I am trying to configure simple authentication using the REVERSE_PROXY_USER_HEADER parameter.
To do that, I set the parameters in data/config.php like this:
define('REVERSE_PROXY_AUTH', true);
define('REVERSE_PROXY_USER_HEADER', 'REMOTE_USER');
define('REVERSE_PROXY_DEFAULT_ADMIN', 'admin');
define('REVERSE_PROXY_DEFAULT_DOMAIN', 'my-domain.fr');
I also changed the Apache2 log format to display my HTTP headers, and I can see my user name in the corresponding log.
My HTTP request is this one:
Host: kbsso.my-domain.fr
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:66.0) Gecko/20100101 Firefox/66.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: fr,fr-FR;q=0.8,en-US;q=0.5,en;q=0.3
Accept-Encoding: gzip, deflate
DNT: 1
Connection: keep-alive
Cookie: KB_SID=g6re22ivihojnle991idmo4vhh; KB_RM=4c14de632e17976bc894b9dd0f273c0415d0fef620b62b00bc8e782084775a97%7Cf47e8c6b8ab3721c60dd32fbc4ed129fedbeafc388ff66b3c9933e28c08d
Upgrade-Insecure-Requests: 1
REMOTE_USER: ex.valere.viandier
Pragma: no-cache
Cache-Control: no-cache
So, my problem is this:
When REVERSE_PROXY_AUTH is set to true, the controller shows me the login screen, but if I try to log in, I receive an "Access denied" HTML page.
When REVERSE_PROXY_AUTH is set to false, the controller shows me the login screen and, using the same user account, I can access Kanboard.
The documentation says:
Your reverse proxy authenticates the user and send the username through a HTTP header.
Kanboard retrieve the username from the request
The user is created automatically if necessary
Open a new Kanboard session without any prompt assuming it’s valid
I tried with both existing and non-existing users, but I get the same problem.
I tried with v1.2.5 and v1.2.8, but no luck.
Is this a known issue?
Thanks a lot,
Valère
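One thing worth checking, though this is an assumption on my part rather than something from the Kanboard docs: PHP exposes incoming HTTP headers in $_SERVER with an HTTP_ prefix, so a genuine "REMOTE_USER: ..." request header arrives as $_SERVER['HTTP_REMOTE_USER'], not $_SERVER['REMOTE_USER']. A sketch of the config pointing at the prefixed key:
<?php
// data/config.php - hypothesis: name the key PHP actually populates
// when the username arrives as an HTTP header rather than an Apache
// authentication variable.
define('REVERSE_PROXY_AUTH', true);
define('REVERSE_PROXY_USER_HEADER', 'HTTP_REMOTE_USER');
define('REVERSE_PROXY_DEFAULT_ADMIN', 'admin');
define('REVERSE_PROXY_DEFAULT_DOMAIN', 'my-domain.fr');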

Cookie values not being sent in POST request from SWF

I have an SWF file which allows a user to upload files on my website. That SWF sends a POST request to upload.php on my server, thus uploading the file.
But before it does so, in upload.php I want to make sure that the person doing this is signed in. Currently, I check the SID cookie value sent in the request headers and look it up against an entry in my database (I store sessions in the database). This works well in Chrome but fails in Firefox, as Firefox doesn't send any cookie headers when the request is generated by the SWF.
How should I go about this? Thanks for the help.
Headers sent in Chrome
Content-Length: 355894
Origin: http://localhost
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.17 (KHTML, like Gecko) Chrome/24.0.1312.52
Content-Type: multipart/form-data; boundary=----------ei4Ij5ae0ae0ei4Ef1Ef1KM7GI3Ef1
Accept: */*
Referer: http://localhost/home
Accept-Encoding: gzip,deflate,sdch
Accept-Language: en-US,en;q=0.8
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3
Cookie: PHPSSID = 238e320eewbjdbew923e092ejhwbjhwebd
Headers sent in FireFox
Accept: text/*
Content-Type: multipart/form-data; boundary=----------GI3Ef1Ef1ae0GI3cH2Ef1Ef1cH2KM7
User-Agent: Shockwave Flash
Host: localhost
Content-Length: 355894
Connection: Keep-Alive
Cache-Control: no-cache
It's a feature of Chrome, not a bug in Firefox: uploading SWFs must explicitly send the browser's cookies themselves. Look at FancyUpload, starting from version 3.0.
Also note that PHP with Suhosin can discard session variables if the User-Agent header does not match the browser's (a guard against session fixation). So uploading SWFs must also send the browser's User-Agent with their requests (or just disable the Suhosin patch).
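A common workaround for the Firefox case is to stop relying on the cookie entirely: have the page pass the session ID to the SWF (e.g. via flashvars), let the SWF post it back, and restore the session server-side. A sketch, assuming a hypothetical sid POST field and PHP's native sessions (with database-backed sessions the idea is the same: look up the posted ID instead of the cookie):
<?php
// upload.php - restore the session from an explicitly posted ID,
// since Firefox does not attach cookies to SWF-originated requests.
if (isset($_POST['sid']) && preg_match('/^[A-Za-z0-9,-]{20,64}$/', $_POST['sid'])) {
    session_id($_POST['sid']); // must be called before session_start()
}
session_start();

if (empty($_SESSION['user_id'])) {
    header('HTTP/1.1 403 Forbidden');
    exit('Not signed in');
}
// ...proceed with the upload in $_FILES...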

WordPress pages return 404

I have a very strange problem.
The blog has a permalink structure of /%category%/%title%/. Everything works fine for posts, but when accessing pages by directly typing their URL, I get a 404. I noticed this when checking Google Webmaster Tools. The pages open fine when clicking on their links within the site.
The last thing I should mention is that the titles contain some non-Latin characters, but they are properly encoded.
Does WordPress check the Referer header and make decisions based on its value?
Here are the headers that Firefox sends when accessing the page directly (which returns 404) and when clicking an internal link:
Direct access (returns 404):
Host localhost:8088
User-Agent Mozilla/5.0 (Windows NT 6.1; WOW64; rv:7.0.1) Gecko/20100101 Firefox/7.0.1
Accept text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language en-us,en;q=0.5
Accept-Encoding gzip, deflate
Accept-Charset ISO-8859-1,utf-8;q=0.7,*;q=0.7
Connection keep-alive

Via internal link:
Host localhost:8088
User-Agent Mozilla/5.0 (Windows NT 6.1; WOW64; rv:7.0.1) Gecko/20100101 Firefox/7.0.1
Accept text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language en-us,en;q=0.5
Accept-Encoding gzip, deflate
Accept-Charset ISO-8859-1,utf-8;q=0.7,*;q=0.7
Connection keep-alive
Referer http://localhost:8088/dani/
The only difference is the referer header. Very strange.
The problem turned out to be not the Referer header but the casing of the UTF-8 URL parts. The internal links are built with lower-case letters, as in:
http://localhost:8088/dani/%d0%b1%d0%bb%d0%be%d0%b3/
and the page titles have the same form. But when typing a URL into the browser, it encodes the string with upper-case letters, as in:
http://localhost:8088/dani/%D0%B1%D0%BB%D0%BE%D0%B3/
which doesn't match the title of the page.
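Since percent-encoded octets compare case-insensitively under RFC 3986, one general fix is to normalize the encoding case before matching. A standalone sketch (not a WordPress hook, just the idea):
<?php
// Lower-case every %XX triplet so /%D0%B1.../ and /%d0%b1.../ match.
function normalize_percent_encoding($uri) {
    return preg_replace_callback('/%[0-9A-Fa-f]{2}/', function ($m) {
        return strtolower($m[0]);
    }, $uri);
}

var_dump(normalize_percent_encoding('/dani/%D0%B1%D0%BB%D0%BE%D0%B3/')
     === normalize_percent_encoding('/dani/%d0%b1%d0%bb%d0%be%d0%b3/')); // bool(true)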

How can a program log in to a website that has an HTML form for username and password?

I need to scrape a website that requires authentication; that is, a user must log in by supplying their username and password in an HTML form. The connection is not secure. I plan on using the PEAR framework to format the request. I'm stuck at the first part, which is posting the login information. How can I submit form data? Thanks in advance for any help.
I've got Fiddler2 installed but am not sure what to look for when I do the login manually.
I'm using cURL and getting some success. I deleted all cookies in my web browser and went to the site to log in. In Fiddler it gave me the following request headers:
GET http://example.com/niceday/dirlogin.php HTTP/1.1
Accept: text/html, application/xhtml+xml, */*
Accept-Language: en-CA
User-Agent: Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0)
Accept-Encoding: gzip, deflate
Host: example.com
DNT: 1
Connection: Keep-Alive
Cookie: chatRoomUserID=119972; ASPSESSIONIDSSRQCCBR=26jqrt3f8dus2l2s42g4r9npp5
I'm confused about the last one: am I requesting a cookie?
In my script I faked the first request with:
curl_setopt($ch, CURLOPT_HTTPHEADER, array(
    // Note: the first entry below is a request line, not a header; cURL
    // builds the request line itself from CURLOPT_URL, so it can be dropped.
    'GET http://example.com/niceday/dirlogin.php HTTP/1.1',
    'Accept: text/html, application/xhtml+xml, */*',
    'Referer: http://r2sports.bkoehler.j43.ca/tourney/tourneyTop.php?TID=3206',
    'Accept-Language: en-CA',
    'User-Agent: Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0)',
    'Accept-Encoding: gzip, deflate',
    'Connection: Keep-Alive',
    'DNT: 1',
    'Host: r2sports.bkoehler.j43.ca',
    'Cookie: chatRoomUserID=995222; ASPSESSIONIDSSRQCCBR=26jqrt3f8dus2l2s42g4r9npp5',
));
You might consider using this: http://www.php.net/manual/en/book.curl.php#90821 and setting a cookie file location, so your script can remain authenticated. The next time it contacts that website, it will use the cookies it previously saved in the cookie file.
How can I submit form data?
Here is how:
<form action="./login.php" method="post">
<input type="text" name="username" />
<input type="password" name="password" />
<input type="submit" value="Login" />
</form>
Use login.php to process your $_POST data.
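To submit that same form from a script rather than a browser, which is what the question is after, here is a minimal cURL sketch (URL and credentials are placeholders):
<?php
// POST the form fields to login.php while persisting cookies, so the
// authenticated session survives for the scraping requests that follow.
$ch = curl_init('http://example.com/login.php');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(array(
    'username' => 'myuser',   // field names taken from the form above
    'password' => 'mypass',
)));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_COOKIEFILE, '/tmp/login_cookies.txt');
curl_setopt($ch, CURLOPT_COOKIEJAR, '/tmp/login_cookies.txt');
$response = curl_exec($ch);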

cURL gets fewer cookies than Firefox! How do I fix it?

How can I make cURL get all the cookies?
I thought maybe Firefox gets different cookies as the page loads, or it has some built-in JavaScript that sets cookies after the page is loaded, or maybe it redirects to other pages which set other cookies, but I don't know how to make cURL do the same thing. I set cURL to follow redirects, but still no success. cURL does set some cookies, but not all of them.
Following is the code I use in PHP:
$url = 'https://www.example.com';
$handle = curl_init($url);
curl_setopt($handle, CURLOPT_COOKIESESSION, true); // marks a new cookie session: old session cookies in the file are ignored
curl_setopt($handle, CURLOPT_RETURNTRANSFER, true);
curl_setopt($handle, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($handle, CURLOPT_COOKIEJAR, "cookies.txt");  // received cookies are written here when the handle is closed
curl_setopt($handle, CURLOPT_COOKIEFILE, "cookies.txt"); // cookies read from here are sent with the request
curl_setopt($handle, CURLOPT_AUTOREFERER, true);
curl_setopt($handle, CURLOPT_USERAGENT, 'Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; .NET CLR 1.0.3705; .NET CLR 1.1.4322)');
$htmlContent = curl_exec($handle);
Following is from Live HTTP Headers in Firefox:
https://www.example.com

GET /index.ext HTTP/1.1
Host: www.example.com
User-Agent: Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.6; en-US; rv:1.9.2.10) Gecko/20100914 Firefox/3.6.10
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-us,en;q=0.5
Accept-Encoding: gzip,deflate
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
Keep-Alive: 115
Connection: keep-alive
Cookie: JSESSIONID=3E85C5D0436D160D0623C085F68DC50E.catalog2; __utma=137925942.1883663033.1299196810.1299196810.1299198374.2; __utmz=137925942.1299196810.1.1.utmcsr=(direct)|utmccn=(direct)|utmcmd=(none); citrix_ns_id=0pQdumY48kxToPcBPS/QQC+w2vAA1; __utmc=137925942

HTTP/1.1 200 OK
Date: Fri, 04 Mar 2011 01:20:30 GMT
Server: Apache/2.2.15
Keep-Alive: timeout=5, max=100
Connection: Keep-Alive
Transfer-Encoding: chunked
Content-Type: text/html;charset=UTF-8
I only get JSESSIONID with cURL.
Please help!
Possibly the page you are loading has some other content that actually sets cookies, and since you are only reading one page you don't get them; or some cookies are set through JavaScript.
Try using a Firefox user agent in cURL and see if you get the same number of cookies. You should.
Use a network sniffer or a proxy to compare the requests and responses; there will be differences for sure. Post the requests and responses here if you still can't find them.
If faking the user agent on the cURL side does not work, try the opposite: install a Firefox extension that fakes the user agent and set it to the one used by cURL. If that works, it may be some passive browser fingerprinting (such as p0f by lcamtuf) relying on network timing, and you may have a hard time working around it. That would be extremely surprising, though!
I figured it out. It was actually JavaScript that set the cookies after the page was loaded. :)
Thanks everybody
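For anyone hitting the same wall: cURL never executes JavaScript, so cookies set client-side can only be added by hand. Extending the $handle from the code above (cookie values are purely illustrative, copied from the trace):
// Cookies set by the page's JavaScript never reach the cookie jar;
// send them explicitly on top of the jar-managed ones.
curl_setopt($handle, CURLOPT_COOKIE, '__utmc=137925942; citrix_ns_id=0pQdumY48kxToPcBPS/QQC+w2vAA1');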
