What Differences Are There Between Form Submissions and cURL Requests? - php

I'm trying to submit data to SalesForce.com, but I'd like to do it through AJAX. Since the same-origin policy blocks cross-domain requests, I'm having jQuery use AJAX to submit to a PHP page on my server and then having that page simply forward the form data it's passed along to the proper URL.
If I submit the form with JS turned off, everything goes through fine. With JS on, Salesforce confirms receipt of the data (in debug mode), but it's not showing up in my queue, or anywhere really, in SF. SF spits back all of the fields it was passed, and it's spitting back every field that I have in my form, properly filled out.
Are there any differences between submitting something through this method (jQuery's $.ajax() to PHP cURL) and through the native HTML Submit button? Something that could be causing SF to register the data, but register it differently? I've tried adding CURLOPT_HEADER/CURLOPT_HTTPHEADER information.

Well, the only thing that's different that you can't fake is the IP address of the request. Depending on how strict the protection Salesforce uses is, requests coming from a different IP address may be detected and denied.
Everything else should be 100% fakeable (headers, etc.). What I would suggest is that you get Firebug or TamperData and look at the raw headers your browser normally sends to Salesforce, then replicate that exact request from PHP. If you need other information, you can detect it in JS and pass it to PHP (cookie information, browser info, etc.)...
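A sketch of that replication in PHP cURL follows. The endpoint URL and the header values are placeholders, not Salesforce's real ones; copy the actual values from the request you captured in Firebug/TamperData.

```php
<?php
// Turn captured "Name => value" pairs from the browser into the
// "Name: value" lines that CURLOPT_HTTPHEADER expects.
function build_header_lines(array $captured): array {
    $lines = [];
    foreach ($captured as $name => $value) {
        $lines[] = "$name: $value";
    }
    return $lines;
}

// Forward the posted form fields to the target URL with the captured headers.
function forward_form(string $url, array $fields, array $headers) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($fields));
    curl_setopt($ch, CURLOPT_HTTPHEADER, $headers);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $body = curl_exec($ch);
    curl_close($ch);
    return $body; // false on failure, otherwise the response body
}

// Usage (placeholder endpoint and header values):
// $headers = build_header_lines([
//     'User-Agent' => 'Mozilla/5.0 ...',             // copied from the browser
//     'Referer'    => 'https://example.com/my-form', // page hosting the form
// ]);
// $response = forward_form('https://example.com/servlet', $_POST, $headers);
```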

$.ajax() transmits cookies from the client browser, and it also adds an "X-Requested-With: XMLHttpRequest" request header.

Maybe try adding the (external) IP address of the machine that is running the PHP code to the list of trusted networks in Salesforce. Log in to Salesforce, go to Setup -> Security Controls -> Network Access, and add the IP there.
I ran into a similar problem and had to add the IP address of the server that was running the Java app that connected to Salesforce; this fixed the problem for me.

Related

PHP curl and Infusionsoft Cookie

I have a simple web form that feeds into Infusionsoft. Not my call. I had been submitting it fine with cURL from an AJAX routine, to avoid sending the user on to Infusionsoft's domain and a thank-you page - instead just displaying a thank-you/error message based on the return value. All good so far. The problem I've run into is with the affiliate cookies. Apparently each time I set up an affiliate, it generates them a URL to my sign-up form that carries their affiliate data in the query string and sets a cookie from it (http://www.example.com?p=XXX&w=XXX).
That's getting stripped out by my cURL routine. How do I keep a 'URL cookie' alive when submitting through cURL? I apologize if my terminology is incorrect; this is beyond the scope of what I usually do and I would appreciate any correction.
I know it's a curl_setopt, but I don't understand the options well enough to make a qualified decision. I basically just need to keep alive the session that was started with the URL. I know this post is lacking, but unfortunately so is my understanding.
The affiliate cookie is tied to the infusionsoft.com domain and you can't access that unless your script is on the Infusionsoft domain. The only other way to do this is to use some hacked version of the instructions on this page - http://kb.infusionsoft.com/index.php?/article/AA-00878/0/How-can-I-track-affiliate-activity-if-I-capture-leads-or-process-orders-through-the-Infusionsoft-API.html
You can do it with modern browsers, because you can make a CORS AJAX request to the hosted version of the web form on Infusionsoft's site. You have to mimic all of the form fields and names, including the hidden ones. Use Firebug or Chrome's dev tools to watch the HTTP request when their hosted web form is submitted, then submit your own form, with the same field names, to wherever the action attribute of their hosted form points, via an AJAX request.
I have implemented this successfully using AJAX, it just doesn't work on IE 7, 8, and 9 because of CORS AJAX security issues. My workaround for those browsers is to use cURL as a proxy to submit the form. The only downside for those special case browsers is that they don't receive the cookie that shows which URLs they've visited in their web profile in InfusionSoft.
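A minimal sketch of such a cURL proxy for the browsers without CORS support might look like this. The action URL is made up; copy the real one from the hosted form's HTML, and post the same field names the hosted form uses, hidden ones included.

```php
<?php
// Hypothetical server-side fallback: forward the user's form fields to the
// hosted form's action URL when the browser can't make the CORS request itself.
function proxy_to_hosted_form(string $actionUrl, array $fields) {
    $ch = curl_init($actionUrl);
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($fields));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow any thank-you redirect
    $body = curl_exec($ch);
    curl_close($ch);
    return $body; // false on failure, otherwise the response body
}

// Usage (placeholder URL):
// $body = proxy_to_hosted_form('https://example.infusionsoft.com/app/form/process/abc123', $_POST);
```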

jQuery form shows a submission message on localhost but not on the web server

I am stuck on a strange problem. My form works fine on my local machine, but when I upload it to a web server it does not. Locally, when I add some values and click Submit, it shows a thank-you message. Once uploaded, the values are still entered into the database, but no thank-you message appears; the page just stays as it is. Please suggest a solution. What should I do to make it work online? Should I send the single HTML file that contains the form? Have a look at the form here: http://dl.dropbox.com/u/33855631/lon_dec/form.htm
I also tried uploading it to different servers, like Bluehost, but no luck.
You're indeed making a cross-domain request (to http://www.londondeclaration.com/wp-admin/admin-ajax.php), which your browser doesn't allow. Either host the front-end and back-end on the same domain, or (if that's not possible) host a proxy to the external source on your own domain.
As @PPvG already mentioned, this looks like cross-domain scripting. In general it is possible to perform cross-domain requests, but you must set the appropriate HTTP headers as specified here. That's what happens in detail:
the user accesses a web page on DomainA that includes some JavaScript (i.e. jQuery)
the user clicks the submit button and jQuery fires a request to your server on DomainB
the result is returned to the user's browser, but per the same-origin policy the client forbids the script from DomainA from examining the response retrieved from DomainB. It's important to understand that security is enforced on the client.
How to solve the problem: Your application on DomainB must set the correct HTTP response header, so that the browser allows your jQuery script from DomainA to work with the response from DomainB:
Access-Control-Allow-Origin: DomainA
This may still not work in all situations. E.g. Internet Explorer enforces fairly rigid rules when it comes to HTTPS, and if I remember correctly cookie management is a problem as well.
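On DomainB's side, emitting that response header from PHP is a one-liner; the sketch below wraps it in a small allow-list check. The origin names are placeholders for your real domains.

```php
<?php
// Sketch for the endpoint on DomainB: return the Access-Control-Allow-Origin
// header so the browser lets DomainA's script read the response.
function cors_header(string $requestOrigin, array $allowedOrigins): ?string {
    // Only acknowledge origins you explicitly trust.
    return in_array($requestOrigin, $allowedOrigins, true)
        ? 'Access-Control-Allow-Origin: ' . $requestOrigin
        : null;
}

$allowed = ['https://domaina.example']; // placeholder front-end origin
$origin  = $_SERVER['HTTP_ORIGIN'] ?? '';
if ($h = cors_header($origin, $allowed)) {
    header($h);
}
echo json_encode(['status' => 'ok']); // normal response body
```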
EDIT: In Google Chrome's developer console you can easily see that this is the problem.

Gmail/Facebook without username/password - PHP Login Header Problem

I want to create my own personal login gateway into Gmail/Facebook/any other site. In this gateway I enter my master username and password, then I can choose where to login (gmail/facebook/etc) without entering those usernames because they are stored on the server.
I tried to implement this by using cURL to send a POST request with the headers and post data that Firefox sends during a regular login. However, this works for neither Facebook nor Gmail.
Has anyone tried this or have an idea about why this doesn't work?
Thanks.
// Edited
I am thinking the problem is that the IP address of the PHP server that sends the cURL request to Gmail is different from my browser's, so when the response from the Gmail server is fed back to the browser, it still cannot authenticate.
Or is it that the cookie I sent to the Gmail server using cURL actually changes over time?
Based on your reply to my comment cURL is useless for your problem. You need to authenticate your browser with your services (gmail, facebook, ...), what you are doing now is authenticating your script (or your server).
You will have to use JavaScript to accomplish what you want. Store your credentials for the services on your server and send them back to the client once you successfully log in to your web page. Then you could create a hidden iframe with its "src" attribute set to the login page of the chosen service. Once the iframe loads, you can fill the login information (username/password) into the appropriate fields and submit the form. Once this is complete you should be logged in to your services.
There are probably some other techniques but this is the first that springs to mind ...
This is not necessarily feasible. Gmail and Facebook may be doing very simple checks to see what the referer is, and when it comes from your site rather than their own login page, refuse the login. These are basic security checks.
You would need to look at their API to see if you can do anything, or possibly you could use JavaScript and a Firefox plugin to write your username and password into the web form and then submit the form - a bit of a hack, but it might do what you want.
There is no reason why the cURL method you tried won't work with the correct headers. Playing around scraping sites like digg.com, I found I needed a valid User-Agent header and of course an appropriate Referer URL. Keep going with the cURL technique if that will work best for you overall. Use an HTTP header add-on for Firefox to see what headers you are sending to Gmail and then fake them completely.
Try using Firebug to find out what the response returned; it should always give you the best lead.
I see no reason why it won't work - I read my Gmail and Analytics with cURL.
Have you configured curl to accept and store cookies? Usually once you've been authenticated for an online service it will send you a security token in the form of a cookie that you can send back with every subsequent request to verify your authorisation.
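That cookie handling maps onto two curl_setopt options: CURLOPT_COOKIEJAR writes cookies the server sets to a file, and CURLOPT_COOKIEFILE sends them back on later requests. A sketch, with placeholder URLs and field names:

```php
<?php
// Persist cookies between cURL requests so the security token the service
// sets at login is sent back on every subsequent request.
function curl_with_cookies(string $url, array $post, string $jar) {
    $ch = curl_init($url);
    if ($post) {
        curl_setopt($ch, CURLOPT_POST, true);
        curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($post));
    }
    curl_setopt($ch, CURLOPT_COOKIEJAR, $jar);   // store cookies the server sets
    curl_setopt($ch, CURLOPT_COOKIEFILE, $jar);  // send stored cookies back
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $body = curl_exec($ch);
    curl_close($ch);
    return $body;
}

$jar = tempnam(sys_get_temp_dir(), 'cookies'); // shared cookie file

// Usage (placeholder URLs and fields):
// curl_with_cookies('https://example.com/login', ['user' => 'me', 'pass' => 'secret'], $jar);
// $page = curl_with_cookies('https://example.com/inbox', [], $jar);
```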

is it possible to tamper post data when using frames

I have a site that is using frames. Is it still possible from the browser for someone to craft post data for one of the frames using the address bar? 2 of the frames are static and the other frame has php pages that communicate using post. And it doesn't appear to be possible but I wanted to be sure.
No, it is not possible to POST data from the address bar. You can only initiate GET requests from there by adding params to the URL. The POST Body cannot be attached this way.
Regardless of this, it is very much possible to send POST requests to your webserver for the pages in a frame. HTTP is just the protocol with which your browser and webserver talk to each other. HTTP knows nothing about frames or HTML. The page in the frame has a URI, just like any other page. When you click a link, your browser asks the server if it has something for that URI. The server will check if it has something for that URI and respond accordingly. It does not know what it will return though.
With tools like TamperData for Firefox or Fiddler for IE, anyone can easily tinker with the HTTP requests sent to your server.
Any data in the $_REQUEST array should be considered equally armed and dangerous regardless of the source and/or environment. This includes $_GET, $_POST, and $_COOKIE.
POST data can not be added in the address bar.
You should always check & sanitize all data you get in your PHP code, because anyone could post data to all of your pages.
Don't trust data from outside of your page. Clean it & check it.
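A small sketch of that check-and-clean step; the helper names are made up, and the two-stage pattern (validate first, then escape for the output context) is the point:

```php
<?php
// Treat everything in $_GET/$_POST/$_COOKIE as untrusted, whichever frame
// it claims to come from: validate the value, then escape it for output.

function clean_int($value, int $default = 0): int {
    // Validate: accept only something that really is an integer.
    $v = filter_var($value, FILTER_VALIDATE_INT);
    return $v === false ? $default : $v;
}

function e(string $value): string {
    // Escape for HTML output so injected markup can't execute.
    return htmlspecialchars($value, ENT_QUOTES, 'UTF-8');
}

// Usage:
// $page = clean_int($_GET['page'] ?? null, 1);
// echo '<p>Hello, ' . e($_POST['name'] ?? 'guest') . '</p>';
```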
Maybe not from the browser, but they can still intercept the request (tinker with it) and forward it to the intended destination, with a tool like Burp Proxy.
To answer your question: No, it is not possible to send post data using the addressbar.
BUT it is possible to send POST data to any URL in a snap - for example using cURL, or a Firefox extension. So be sure to verify and sanitize all the data you receive, no matter whether it arrived via POST, GET, or any other method.
This is not iframe- or PHP-specific, so it should be considered in every web application. Never ever rely on data sent by anyone being correct, valid or secure - especially when sent by users.
Yes, they absolutely can, with tools like Firebug, and apparently more specialized tools like the ones listed by Gordon. Additionally, even if they couldn't do it in the browser from your site, they could always create their own form, or submit the post data through scripting or commandline tools.
You absolutely cannot rely on the client for security.

cakephp data validation before posting remote form

This one is really racking my brain:
I need to post non-sensitive data to a third-party payment gateway. I want to use the built-in CakePHP form validation, which is working fine.
What I am doing is submitting the form to a local action which validates the data and outputs any errors if there are any. Where I am stuck is trying to re-submit that POST data to a remote URL if there are no validation errors. The problem is that the browser must be redirected to the external URL with the POST data... I think I lost it about here; I know this is probably not possible...
My plan B is just using JavaScript for form validation and posting directly to the external URL. I looked into using cURL, but I need the browser to redirect to/open the external URL. Is there a way to get cURL to redirect the browser when posting to a URL?
There are several routes you can go down for this, depending on your abilities and other business-related decisions.
My recommendation would be for you to use the AJAX validation methods to validate your data. Your server would then be used for validation (and you could store relevant details like invoice number, customer information, etc.) Once it validates you can have the page submit the form data to the 3rd party site. Note that it's likely you will run into some security related issues depending on how your security certificates (for SSL) are setup.
Another choice (one that I would consider a bit more secure) would be for your site to accept the data. If it doesn't validate, request fixes from the client (pretty basic Cake stuff here). If it does validate, you can then use libcurl to send the data to your 3rd party processor, forming each variable properly in the POST data in your request.
You're not going to be able to redirect the client with a POST payload. Either of the two options above would help you get the job done. I would personally use the second method, for the following reasons: 1) easier auditing / debugging (it's already in your server environment, etc.) 2) more secure - you can lock down your server better than client systems, 3) it seems cleaner to me (the client doesn't see connections going all over the place, etc.) and 4) you can modify and track requests as they pass through your system (respond appropriately to clients when the processor reports an error, etc.)
All in all, this is a doable thing. Does your third party offer an API? You might look into that as well.
If an extra step is okay in your application flow, you can easily do it like this, no Javascript needed:
User fills in form as usual.
Form is submitted to Cake action as usual and validated.
If validation is successful, you display an intermediate page with all the values in hidden or read-only form elements.
Submit sends the hidden form to the external site.
To the user you can present that intermediate page à la "Please confirm your data one last time, click 'Back' to change data or hit 'Submit' to submit it to an external site."
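Step 3 above can be sketched like this. The gateway URL and field names are placeholders; the values come from the data that already passed validation.

```php
<?php
// Render validated values as hidden inputs inside a form whose action
// points at the external gateway, so the user's own browser makes the POST.
function hidden_fields(array $data): string {
    $html = '';
    foreach ($data as $name => $value) {
        $html .= sprintf(
            '<input type="hidden" name="%s" value="%s">',
            htmlspecialchars($name, ENT_QUOTES, 'UTF-8'),
            htmlspecialchars((string) $value, ENT_QUOTES, 'UTF-8')
        );
    }
    return $html;
}

// Usage (placeholder gateway URL and fields):
// echo '<form method="post" action="https://gateway.example/pay">';
// echo hidden_fields(['amount' => '19.99', 'invoice' => 'INV-42']);
// echo '<button type="submit">Confirm and submit</button></form>';
```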
The only option I know of is using JavaScript.
If you wanted to just send data, there would be a number of options. But redirecting the user is a different story.
Your best bet might be to rethink why you want to send the user to a different site.
It's a little complicated to use cURL to redirect, but it's possible, and not terribly hard. It's even easier if you know specifically which URL to redirect to, or if you can build that redirect URL. Here is an example cURL call:
/usr/bin/curl -D "/tmp/0001" -H "User-Agent: Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; .NET CLR 1.1.4322)" -H "Cookie: key=value; key2=value2;" -d "this=is&the=post&data=true" http://url.com/application.php
Let's walk through the command:
the -D will store any returned headers in the file "/tmp/0001". From there you can parse out any redirects you need to find - just run something like grep Location /tmp/0001 to get the line with the Location redirect header (that is, if the app itself redirects). This is also where you parse out any cookies.
the -H flags are the headers that you are sending to the server. You can pass a Cookie: header if the page requires cookies. (You don't need to set Content-Length yourself; curl computes it from the data you post.)
the -d is the data that you are posting to the application. This is what makes it an actual POST.
Once you get your initial post working, you simply call /usr/bin/curl again with the second URL. Build the headers the way they should look for that second page, along with any cookies it sent you, and it should post as if it were a browser.
The results of the curl command will be the actual page, that you can log to some database for verification purposes.
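The grep Location step above can also be done in PHP once the header dump exists; a small sketch (the dump path matches the -D example above):

```php
<?php
// Parse the Location header out of a header dump written by `curl -D`
// (or captured via CURLOPT_HEADER) to learn where the app redirects next.
function find_location(string $rawHeaders): ?string {
    foreach (preg_split('/\r?\n/', $rawHeaders) as $line) {
        if (stripos($line, 'Location:') === 0) {
            return trim(substr($line, strlen('Location:')));
        }
    }
    return null; // the response was not a redirect
}

// Usage:
// $next = find_location(file_get_contents('/tmp/0001'));
```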
I hope this helps.
