I am trying to better understand OAuth by experimenting with the twitteroauth PHP library.
It is my understanding that the way to authenticate over OAuth is to use an 'Authorization' header with cURL. However, examining the source of the twitteroauth library, I can see that for POST requests the header is set like so:
curl_setopt($ci, CURLOPT_HTTPHEADER, array('Expect:'));
And the parameters that should go in the 'Authorization' header are actually being set in the POST body instead, in this line:
curl_setopt($ci, CURLOPT_POSTFIELDS, $postfields);
What is the reason for doing it this way? The Twitter API guidelines specify the following implementation for the header:
POST /1/statuses/update.json?include_entities=true HTTP/1.1
Accept: */*
Connection: close
User-Agent: OAuth gem v0.4.4
Content-Type: application/x-www-form-urlencoded
Authorization:
OAuth oauth_consumer_key="xvz1evFS4wEEPTGEFPHBog",
oauth_nonce="kYjzVBB8Y0ZFabxSWbWovY3uYSQ2pTgmZeNu2VS4cg",
oauth_signature="tnnArxj06cWHq44gCs1OSKk%2FjLY%3D",
oauth_signature_method="HMAC-SHA1",
oauth_timestamp="1318622958",
oauth_token="370773112-GmHxMAgYyLbNEtIKZeRNFsMKPR9EyMZeS9weJAEb",
oauth_version="1.0"
Content-Length: 76
Host: api.twitter.com
status=Hello%20Ladies%20%2b%20Gentlemen%2c%20a%20signed%20OAuth%20request%21
A client may add the HTTP Expect header to tell the server "hey, I expect you to behave in a certain way." For some reason, Twitter's implementation wants you to expect nothing: passing an empty 'Expect:' header suppresses the 'Expect: 100-continue' that cURL would otherwise add to larger POST requests. I ran into this with my own home-grown implementation; I'm not sure why Twitter wants it this way.
You may present your credentials and signature in the POST variables, or in the header. Both will work as long as the correct variables are set (oauth_consumer_key, oauth_nonce, oauth_signature, oauth_signature_method, oauth_timestamp, and oauth_token).
I find setting the Authorization header to be cleaner because it does not depend upon the request method (GET, POST, PUT, etc). But Twitter handles both cases perfectly fine. If that's how they implemented it in their library, so be it.
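To illustrate the header-based approach described above, here is a minimal PHP cURL sketch. It assumes the oauth_signature has already been computed; the key, nonce, and token values are the example values from the Twitter documentation quoted in the question, and `buildOAuthHeader()` is an illustrative helper, not part of twitteroauth:

```php
<?php
// Sketch: sending OAuth 1.0a credentials in the Authorization header
// instead of the POST body. buildOAuthHeader() is an illustrative
// helper; it assumes $oauthParams already contains a valid signature.
function buildOAuthHeader(array $oauthParams): string
{
    ksort($oauthParams); // stable ordering, as in common implementations
    $pairs = [];
    foreach ($oauthParams as $key => $value) {
        $pairs[] = rawurlencode($key) . '="' . rawurlencode($value) . '"';
    }
    return 'OAuth ' . implode(', ', $pairs);
}

$oauthParams = [
    'oauth_consumer_key'     => 'xvz1evFS4wEEPTGEFPHBog',
    'oauth_nonce'            => 'kYjzVBB8Y0ZFabxSWbWovY3uYSQ2pTgmZeNu2VS4cg',
    'oauth_signature'        => 'tnnArxj06cWHq44gCs1OSKk/jLY=',
    'oauth_signature_method' => 'HMAC-SHA1',
    'oauth_timestamp'        => '1318622958',
    'oauth_token'            => '370773112-GmHxMAgYyLbNEtIKZeRNFsMKPR9EyMZeS9weJAEb',
    'oauth_version'          => '1.0',
];

$ch = curl_init('https://api.twitter.com/1/statuses/update.json');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_HTTPHEADER, [
    'Authorization: ' . buildOAuthHeader($oauthParams),
    'Expect:', // suppress "Expect: 100-continue", as twitteroauth does
]);
curl_setopt($ch, CURLOPT_POSTFIELDS,
    'status=' . rawurlencode('Hello Ladies + Gentlemen, a signed OAuth request!'));
// curl_exec($ch); // uncomment to actually send the request
curl_close($ch);
```

Sending the same parameters in the POST body via CURLOPT_POSTFIELDS, as twitteroauth does, is equally valid; Twitter accepts both.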
Related
Example Authorization header from Postman:
oauth_token="AQDSADSAAA****************",
oauth_client_id="5c9ASDASD********"
I need to send the same header with cURL in PHP. How can I do it?
PHP 7.4, Windows Server 2012, IIS8
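Assuming the header shown above is exactly what Postman sends, a minimal PHP cURL sketch would attach it verbatim. The endpoint URL below is a placeholder, and the token values are the redacted ones from the question:

```php
<?php
// Sketch: sending a Postman-style OAuth Authorization header with PHP cURL.
// The endpoint URL is hypothetical; the token values are the redacted
// placeholders from the question.
$authHeader = 'OAuth oauth_token="AQDSADSAAA****************", '
            . 'oauth_client_id="5c9ASDASD********"';

$ch = curl_init('https://api.example.com/resource'); // hypothetical endpoint
curl_setopt_array($ch, [
    CURLOPT_HTTPHEADER     => ['Authorization: ' . $authHeader],
    CURLOPT_RETURNTRANSFER => true,
]);
// $response = curl_exec($ch); // uncomment to actually send the request
curl_close($ch);
```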
I'm using the DocuSign PHP SDK method createEnvelope, which sends the envelope object to their server. The URL is
https://demo.docusign.net/restapi/v2.1/accounts/cheese-burger/envelopes
with these headers
X-DocuSign-SDK: PHP
Authorization: Bearer ee-i-ee-i-oh
Accept: application/json
Content-Type: application/json
I'm successfully using the JWT key for, say, getUserInfo, but when I submit the envelope I get
HTTP/1.0 301 Moved Permanently
back. As far as I can tell, the envelope is created correctly, I know the JWT key works, and account id is correct. Any insight as to why I'm getting a 301 redirect?
Matt Clark supplied the answer in the comments: the 301 response header helped the OP realize that he was missing the protocol section of the URL.
My organization is testing my Laravel app with IBM AppScan, and it revealed the following issue. I'm not sure of the best way to verify the referer. Details of the scan:
The following changes were applied to the original request:
- Set the 'Referer' header to 'http://bogus.referer.ibm.com'
Reasoning:
The test result seems to indicate a vulnerability because the Test
Response is identical to the Original Response, indicating that the
Cross-Site Request Forgery attempt was successful, even
though it included a fictive 'Referer' header.
Request/Response:
GET /data/1?page=3 HTTP/1.1
User-Agent: Mozilla/4.0 (compatible; MSIE 9.0; Win32)
Referer: http://bogus.referer.ibm.com
Host: xxxx.xxx.xxx.xxx
Accept: text/html,application/
Laravel relies on its CSRF token to protect the application from CSRF. Validating the Referer header only adds an extra layer of security; like any header, it can be forged.
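One way to add the extra check AppScan is looking for is to compare the Referer's host against an allowlist. A minimal sketch, where `refererIsTrusted()` is a hypothetical helper (in Laravel you would call it from a middleware and return a 403 on failure):

```php
<?php
// Sketch: host-based Referer validation. This supplements Laravel's CSRF
// token check; it does not replace it, since the header can be forged.
// refererIsTrusted() is an illustrative helper, not a Laravel API.
function refererIsTrusted(?string $referer, array $trustedHosts): bool
{
    if ($referer === null || $referer === '') {
        return false; // policy choice: reject requests without a Referer
    }
    $host = parse_url($referer, PHP_URL_HOST);
    return is_string($host) && in_array($host, $trustedHosts, true);
}

// Example: the AppScan probe's bogus referer is rejected.
refererIsTrusted('http://bogus.referer.ibm.com', ['myapp.example.com']); // false
```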
How can I view the full request headers, including post data, using libcurl in php?
I am trying to simulate the post of a page, which when done from a browser and viewed in Live HTTP Headers looks like this:
https://###.com
POST /###/### HTTP/1.1
Host: ###.###.com
...snipped normal looking headers...
Content-Type: multipart/form-data; boundary=---------------------------28001808731060
Content-Length: 697
-----------------------------28001808731060
Content-Disposition: form-data; name="file_data"; filename="stats.csv"
Content-Type: text/csv
id,stats_id,scope_id,stat_begin,stat_end,value
61281,1,4,2011-01-01 00:00:00,2011-12-31 23:59:59,0
-----------------------------28001808731060
Content-Disposition: form-data; name="-save"
Submit
-----------------------------28001808731060--
So we can see the file I'm uploading, its content, everything. But all my attempts at getting data out of cURL when I make the same POST from PHP (using CURLOPT_VERBOSE or CURLINFO_HEADER_OUT) show request headers that lack the POST data, like so:
POST /###/### HTTP/1.1
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:2.0.1) Gecko/20100101 Firefox/4.0.1
Host: ###.###.com
...snipped normal-looking headers...
Content-Length: 697
Content-Type: multipart/form-data; boundary=----------------------------e90de36c15f5
Based on the Content-Length here, it appears things are going well, but it would really help my debugging to see the complete request. I am also irked that this is so difficult; I should be able to see the whole thing, and I know I must be missing something.
--- EDIT ---
What I'm looking for is the equivalent of this:
curl --trace-ascii debugdump.txt http://www.example.com/
which seems to be available via the CURLOPT_DEBUGFUNCTION option in libcurl, but that option isn't exposed in PHP. Boo.
I needed to do precisely this, in my case to test communication with a bank.
It is extremely easy: use Fiddler2, enable HTTPS traffic decryption, and have cURL use Fiddler2 as a debugging proxy:
$proxy = '127.0.0.1:8888'; // Fiddler2 listens on port 8888 by default
curl_setopt($ch, CURLOPT_PROXY, $proxy);
curl_setopt($ch, CURLOPT_HTTPPROXYTUNNEL, 1); // tunnel HTTPS through the proxy
You are sending multipart/form-data, and cURL does show the HTTP headers completely. The "problem" is that multipart/form-data consists of multiple parts, and those parts are not first-level HTTP headers; they belong to the main HTTP body.
I don't know your environment, but you can also debug by monitoring the TCP traffic. For this you can use Wireshark or tcpdump; Wireshark can also open dump files created by tcpdump.
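As a workaround within PHP itself, the verbose log (which contains the full outgoing header block, though not the multipart body parts) can be captured to a stream via CURLOPT_STDERR. A sketch with a placeholder URL:

```php
<?php
// Sketch: capturing cURL's verbose log into a variable. The log shows the
// complete request/response headers, but not the multipart body parts.
$verboseLog = fopen('php://temp', 'w+');

$ch = curl_init('https://example.com/upload'); // placeholder URL
curl_setopt_array($ch, [
    CURLOPT_VERBOSE        => true,
    CURLOPT_STDERR         => $verboseLog, // redirect the verbose log here
    CURLOPT_RETURNTRANSFER => true,
]);
// curl_exec($ch); // perform the request, then read back the log:
rewind($verboseLog);
$log = stream_get_contents($verboseLog);
curl_close($ch);
```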
I'm using a PHP SOAP toolkit called NuSOAP (http://sourceforge.net/projects/nusoap), and I have a newbie question.
I am calling a WSDL service that runs over SSL (e.g. https://server.com/service/method.asmx?WSDL).
Using nusoap, I traced out the request. It looks like this:
POST /service/method.asmx HTTP/1.0
Host: server.com
User-Agent: NuSOAP/0.9.5 (1.123)
Content-Type: text/xml; charset=ISO-8859-1
SOAPAction: "http://software.com/service/Method"
Content-Length: 713
I'm wondering, though, is this being done over SSL, or no?
Use Wireshark to see the actual traffic. Then it'll be fairly obvious whether it's encrypted or not.
SOAPAction: "http://software.com/service/Method" suggests it's probably not using SSL; you'd expect https in that URI. I certainly wouldn't trust it. (Strictly speaking, though, SOAPAction is just an identifier; whether the transport is encrypted depends on the endpoint URL the request is actually sent to, not on this header.)