Security: Form Submission + JavaScript/jQuery - PHP

Question:
What is best practice for form submissions while keeping in mind security?
This may be a n00b question, but I'm concerned that people might be able to alter some data as it's being submitted. Take my example:
I have a form that has a hidden input that stores a user's unique Facebook ID. I take that Facebook ID and create a user account from it. If I use jQuery, won't some users be able to change the data being posted?

It is just as safe as a regular form post. Both methods can be hijacked and data injected. The key is how your server-side scripts validate the data, alongside authentication, sessions, anti-forgery tokens, etc.

Users will always be able to post whatever data they like to your server. You can't do anything to change that with JavaScript. With a decent browser it's easy to find hidden form fields, unhide them, and put whatever you want in them. A more skilled user can craft an HTTP POST by hand and send whatever they like. Security must be done on the server, not on the client.
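For illustration, here is a minimal plain-PHP sketch of that principle, assuming the Facebook ID was established server-side at login and kept in the session; the session key and field name are made up, and the hidden field is never treated as the source of truth.

// Minimal sketch (hypothetical session/field names): the Facebook ID recorded
// server-side at login is the source of truth, not the hidden form field.
session_start();

$trustedId = $_SESSION['facebook_id'] ?? null;   // set server-side during login
$postedId  = $_POST['facebook_id'] ?? '';

if ($trustedId === null || !is_string($postedId)
    || !hash_equals((string) $trustedId, $postedId)) {
    http_response_code(403);
    exit('Submitted Facebook ID does not match the authenticated session.');
}

// Create the account from $trustedId, never from the posted value.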

Related

Protect backend from multiple form submits in Laravel

I'm trying to protect my backend from multiple POST requests to avoid duplicate data in the database and server overload.
I've already handled the frontend by disabling the submit button after the first click, but that won't prevent some "smart user" from submitting my form from the console, or disabling JavaScript on the page and trying something.
So I want to know if Laravel has some solution for this case.
PS: I've already tried some solutions on the backend too; if you want, I can post them here.
As requested:
One of my alternatives is to check whether the incoming data is already in the database and deny the request if it is. That prevents the duplicate data, but not the server overload.
Another alternative is to create a single-use token in the session in the controller's Create() method, send the token to the view, and put it in a hidden field. I retrieve it from the POST request and compare the posted token with the session token. If the two tokens are the same, I unset it so other requests can't reuse it, and if they try, I deny the request.
If you know who your users are (because they have an ID) then this is somewhat easy to do. Possibly use some sort of quick-access store such as Redis to check in that a user is in a state of edit while the action is being carried out.
However, that creates complications of its own, and it doesn't work if you don't know who your users are.
The safer thing would be to make sure that your requests can handle potential problems. Consider using database transactions to ensure the integrity of the data.
It would depend really on what you’re trying to avoid, why you are, and what kind of data you’re worried about duplicating.
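As a rough illustration of the transaction suggestion above, here is a minimal Laravel sketch; the Submission model, its columns, the 'payload' field, and the authenticated user are all assumptions for illustration, and a unique database index on (user_id, payload_hash) is assumed as the real backstop.

namespace App\Http\Controllers;

use App\Models\Submission;               // hypothetical Eloquent model
use Illuminate\Http\Request;
use Illuminate\Support\Facades\DB;

class SubmissionController extends Controller
{
    public function store(Request $request)
    {
        return DB::transaction(function () use ($request) {
            // firstOrCreate only inserts when no matching row exists, so a
            // repeated POST with the same payload will not create a duplicate.
            Submission::firstOrCreate(
                [
                    'user_id'      => $request->user()->id,
                    'payload_hash' => sha1($request->input('payload', '')),
                ],
                ['payload' => $request->input('payload', '')]
            );

            return redirect()->back()->with('status', 'Saved');
        });
    }
}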
If protection against multiple submits matters that much to you, you can put a random string in a hidden input in your form, store the same string in a dedicated session value for each form, and check it on every submission. Then regenerate the session value each time you finish processing.
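Here is a rough plain-PHP sketch of that single-use token, with made-up field and session names; the token is consumed before any work is done, so a second submit of the same form is rejected.

session_start();

if ($_SERVER['REQUEST_METHOD'] === 'POST') {
    $posted = $_POST['form_token'] ?? '';
    $stored = $_SESSION['form_token'] ?? null;

    unset($_SESSION['form_token']);          // consume the token immediately

    if ($stored === null || !is_string($posted) || !hash_equals($stored, $posted)) {
        http_response_code(409);
        exit('Duplicate or invalid submission.');
    }
    // ... process the form exactly once ...
}

// When rendering the form, issue a fresh token:
$_SESSION['form_token'] = bin2hex(random_bytes(32));
echo '<input type="hidden" name="form_token" value="' . $_SESSION['form_token'] . '">';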

How much of an html form can be altered without triggering CSRF protection?

I implemented CSRF protection by including a token with PHP into a hidden input for every form. Each token can only be used once, of course.
However, there are tools, such as the browser's developer tools, which allow inputs to be changed. For example, I can change on-page form inputs: I can make disabled checkboxes enabled, and I can change input boxes to textarea boxes without reloading the page or anything like that. CSRF protection wouldn't catch such changes.
So, how much of a form do I need to validate to stay safe? Do I need to validate every single input to make sure it wasn't altered, including selects, checkboxes, hidden inputs, etc? Surely it can't be safe to assume that these haven't been altered?
You need to validate (on the server side) everything that needs to be validated. What exactly needs to be validated depends on many factors and personal choices. Some of it may be for safety, but only a bare minimum is needed for that in many cases. For the most part validation is to improve or create user experience.
For example you can check to see whether they have entered a valid email address. If they haven't, you can give them a message. If you don't do that nothing bad will happen to your application, but the user won't be able to receive email from you.
There is also an important distinction between validation and sanitation. Sanitation is done for security (e.g. to prevent injection). Validation is done to make sure that input meets requirements to work correctly with your application although incorrect input may be benign. It's also possible for sanitized malicious input to be valid.
All input must be sanitized. No input needs to be validated, so it's really up to you.
CSRF protection has nothing to do with validation. All it does is prevent a user from making a request using your form from an external source because the only way to generate and see the token is to make a request to your site first.
What we are trying to do using CSRF protection is to ensure that the request IS coming from a reliable source. For example, in your case what you need to do is ensure that the value in the hidden field is sane. And it can be sane (provided that your token is strong enough) only if it is the same as the one that was provided when the form was rendered by the server.
Now, whether fields in the form changed or not is just your application logic. It does not have anything to do with CSRF. If the token is sane, then the request came from the right source. Whether it was the same person who entered the values in the form, for example, is not within the scope of CSRF.
I think you are getting the wrong end of the stick here. The token is not a hash of the form when it was sent.
The way this works is to store your unique token in a hidden field on the form and in the session when you serve the original page.
When you get the page POSTed/GETed back from your user you check that the token on the page is the same as the token previously stored in the session.
Changing fields must still be allowed, or your user will not be able to enter any data on the form. You are just checking that you got the same form back that you sent, because the token is the same, rather than one from somewhere else, i.e. it's not a cross-site request forgery.
You still have to validate all the fields and do any data preparing before storing it in a database.
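A minimal plain-PHP sketch of that flow, with made-up field and session names; the token check only establishes where the form came from, and every field value still gets validated afterwards.

session_start();

if ($_SERVER['REQUEST_METHOD'] === 'POST') {
    $known  = $_SESSION['csrf_token'] ?? '';
    $posted = $_POST['csrf_token'] ?? '';
    if ($known === '' || !is_string($posted) || !hash_equals($known, $posted)) {
        http_response_code(403);
        exit('CSRF token mismatch.');
    }
    // Token matched: now validate and sanitize each field as usual.
} else {
    // Issue the token when serving the page and embed it as a hidden input.
    $_SESSION['csrf_token'] = $_SESSION['csrf_token'] ?? bin2hex(random_bytes(32));
    echo '<input type="hidden" name="csrf_token" value="' . $_SESSION['csrf_token'] . '">';
}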

Should I send POST data cross domains?

What is the best practice here? I have an iframe with a form, and when it submits it updates the parent page. Currently it sends the form contents via GET, so the parent page URL reflects this. I could do the same via POST, but I'm wondering if this is frowned upon and whether it is sometimes blocked/unusable.
Any help and advice is welcomed
There are no problems caused by using POST across domains (at least none that you wouldn't get from using POST on the same domain or GET on a different domain).
There is no problem submitting a form to a different domain, unless via javascript.
One concern to keep in mind is validation. You would want to take care with what happens when the form has errors, as you wouldn't want users to lose the information they've typed in if they missed something like 'name'. The server should be the final line of defense for validation, but you'll want to make sure that the client-side validations for your form match those of the server and appropriately notify the user of their (or the server's) mistake.
The other thing that could potentially block a form from submitting over POST would be a form that requires an authentication token. These are used by many different frameworks to prevent CSRF (cross-site request forgery) attacks and to ensure that the form was submitted from the same domain.

Secure way to pass data after form submission?

I'm writing a small app that gets information from the server, allows the user to manipulate it, then saves it back to the server. When getting the information from the server, the server also gives a password. When the information gets sent back to the server, the server looks for the password, as a safety precaution.
My question is, what is a safe way to pass the password after the form submits? I considered hidden fields, but that would make it possible to find the password.
Any ideas?
You can store it in the session under a key specifically for the password and destroy it after it has been used.
That is a much better approach.
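A minimal sketch of that approach with a hypothetical session key; the server-issued password never appears in the page at all, so there is nothing in the HTML to find or alter, and it can be destroyed as soon as the save goes through.

session_start();

// When the data is fetched for editing, the server stores its password here:
// $_SESSION['server_password'] = $passwordFromServer;

// When the edited data is posted back:
if ($_SERVER['REQUEST_METHOD'] === 'POST') {
    if (empty($_SESSION['server_password'])) {
        http_response_code(403);
        exit('No pending edit for this session.');
    }
    $password = $_SESSION['server_password'];
    // ... send $password along with the edited data to the backend ...
    unset($_SESSION['server_password']);   // destroy after use
}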
You could give the password to JavaScript/jQuery and then intercept the normal form submit, append the password to the post data, and then resubmit the form, but that may be a bit overkill...
As long as the password isn't being used for anything else (as in it's randomly generated) and it's single use, putting it in a hidden field shouldn't be much of an issue. Average users don't know how to view hidden fields. However, if you are connecting over HTTP and not HTTPS, your average hacker would be able to view the password coming over the unencrypted network and potentially use it before your user can.

php How do you ensure $_POST data is coming from your form and not an outside influence?

Is there a way to ensure the $_POST data my code received came from my form and not an outside influence. Basically I don't want someone to be able to spoof a $_POST to a universally available page such as account creation. The account creation page is accessible by any user, but I want to ensure only the data submitted by my account_creation form is what gets processed.
The only thing I could think of was initiating a $_SESSION, and then supplying the session_id to the form using a hidden input. Upon $_POST the value of the hidden input would then be matched against the current session_id.
Is there a better method to achieve this result? If there is, I look forward to hearing it.
You cannot ensure that data came from a form. A POST request is just a POST request, it can be generated in any number of ways. An HTML form is just one of those ways that's very user friendly. Your server needs to validate whether the data received via the POST request is valid or not and whether to act on it or not.
Having said that, there are things that can help you to restrict and validate the data that is being submitted. First of all, require that a user is logged in using (session) cookies. That eliminates random requests by anonymous users. Secondly, you can embed a token as a hidden field into the form which you also save into the user's session. The POST request needs to contain that token in order to be valid. The token is simply a pseudo-random string.
You can enhance this by preparing a hash of the form fields that you expect the user to submit. If the form value should be read-only, you can include the value into the hash as well. E.g.:
// When rendering the form: hash the expected field names together with a random salt.
$rand = md5(mt_rand());
$hash = sha1('lastname:firstname:email:' . $rand);
$_SESSION['rand'] = $rand;
$_SESSION['hash'] = $hash;

// On form submit: rebuild the hash from the submitted field names and compare.
$keys = array_keys($_POST);
$checkHash = sha1(join(':', $keys) . ':' . $_SESSION['rand']);
if ($checkHash !== $_SESSION['hash']) {
    die('Form submission failed token validation');
}
That's just a quick example; you'll probably want to sort the keys alphabetically to make sure you get the same hash, etc. It demonstrates the concept of the user needing a unique token for each request, though, which prevents tampering with forms and submitting more or fewer fields than wanted.
This still does not mean that a user actually used your form to submit the data though.
// Reject posts whose Referer header does not point back to the form's own page.
// Note: the Referer header is client-supplied, so it can be spoofed or missing.
$ref = $_SERVER['HTTP_REFERER'] ?? '';
if ($ref !== 'some site path/index.php') {
    die("Access Denied!");
}
This should prevent most people from posting data to your database from an outside influence.
Slightly better is to add additional validation such as user_agent, user_ip and some other $_SERVER vars - those are the two I use.
So, create the unique ID (or Session ID) as you describe, but add a little extra validation that the agent and ip also match. Not fool proof, but adds another little layer of security.
Edit: I should add that you don't send the user agent back; keep that server side and silently validate against the returned session id.
Also, if a submission fails validation, never reveal to the user why - that way a cheat doesn't know how you're tracking them. You can also add "5 invalids and you're out" tracking, but you need some sort of login for that.
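A rough sketch of those extra checks, with a made-up session key; the fingerprint stays server-side and is silently compared on submit, and the failure message deliberately gives nothing away.

session_start();

$fingerprint = hash(
    'sha256',
    ($_SERVER['HTTP_USER_AGENT'] ?? '') . '|' . ($_SERVER['REMOTE_ADDR'] ?? '')
);

if ($_SERVER['REQUEST_METHOD'] !== 'POST') {
    // Serving the form: remember who asked for it.
    $_SESSION['client_fingerprint'] = $fingerprint;
} elseif (!isset($_SESSION['client_fingerprint'])
    || !hash_equals($_SESSION['client_fingerprint'], $fingerprint)) {
    // Fail without explaining why, as suggested above.
    http_response_code(400);
    exit('Invalid submission.');
}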
Using the session ID is certainly one way of doing it. But there are other options most (if not all) of which involve adding some data as a hidden field.
Use a CAPTCHA. That will always be unique to each page load and therefore makes it mandatory to use your form.
Generate random data and store it in the DB (or just the $_SESSION variable) and check it once the form is submitted.
Option one is the one I recommend for a user creation form as it pulls double duty. It stops automated submission of your own form, while ensuring that the $_POST data is coming from your own form.
This is a standard pattern to prevent XSRF. Essentially it is similar to what you mentioned. The server creates a random token when the form is rendered for the user. It is tied to a browser cookie for the user. On form submission it is posted back to the server. The server then compares the token with what was issued, and the form action is performed only after a successful match.
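The token-tied-to-a-cookie idea is often implemented as the double-submit pattern; here is a minimal sketch with made-up cookie and field names. The same random value is set as a cookie and embedded in the form, and a forged cross-site request cannot read the cookie, so it cannot put a matching value in the hidden field.

if ($_SERVER['REQUEST_METHOD'] === 'POST') {
    $cookieToken = $_COOKIE['xsrf_token'] ?? '';
    $formToken   = $_POST['xsrf_token'] ?? '';
    if (!is_string($cookieToken) || $cookieToken === ''
        || !is_string($formToken) || !hash_equals($cookieToken, $formToken)) {
        http_response_code(403);
        exit('Cross-site request rejected.');
    }
    // ... perform the form action ...
} else {
    $token = bin2hex(random_bytes(32));
    setcookie('xsrf_token', $token, 0, '/');   // must be sent before any output
    echo '<form method="post">'
        . '<input type="hidden" name="xsrf_token" value="' . $token . '">'
        . '<button type="submit">Save</button>'
        . '</form>';
}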
There are a lot of good mentions here of putting a unique value in the form and matching it to the stored value in the server-side session. Do that, but also think about what happens when a user uses the back button and possibly tries to submit the form twice, or they open a second browser window (same session!), or they use multiple forms on your site.
Don't create crazy bugs by not thinking your system through.
