I have set up a rather complicated HTML form that uses the jQuery Validate plugin with several required fields and various rules. The form is working great. It POSTs to a separate PHP processor file that does a number of things, such as sending a couple of emails, and eventually sends the user to PayPal. (It is a club membership application.) It appears that it only took about a week for some type of "bot" to find the processor file and start running it directly, over and over. About 500 emails and applications were generated before I caught it and stopped it by renaming the files temporarily. At the time it was happening I wasn't quite sure exactly what was going on, but after evaluating it for most of the day I came to realize that it couldn't be the result of the main form being submitted; the bot was running the processor file directly.
So... my question is this: how can I keep this from happening? There must be some type of coding to include that will ensure the processor can't run unless the request is really coming from the real HTML form... or is there a better way? I followed all of the "examples" on the 'Net with regard to forms and POSTing, but nowhere did I see anything that relates to this type of problem.
Generally this can be reduced by adding a CSRF token to the form.
Set a random SHA/MD5 value in your session, and also set that value in the form as a hidden input. When a legitimate user sends the form, that value will be passed along too; validate by checking the passed value against the one in the session, and if all is good, process.
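A minimal sketch of that idea in plain PHP (file, field, and session-key names are illustrative; random_bytes() needs PHP 7 and hash_equals() needs PHP 5.6+, so use openssl_random_pseudo_bytes() or a constant-time compare on older versions):

    <?php
    // form.php -- create the token and embed it as a hidden input
    session_start();
    $_SESSION['csrf_token'] = bin2hex(random_bytes(32));
    ?>
    <form action="process.php" method="post">
        <input type="hidden" name="csrf_token"
               value="<?php echo htmlspecialchars($_SESSION['csrf_token']); ?>">
        <!-- ...your real fields... -->
        <input type="submit" value="Send">
    </form>

    <?php
    // process.php -- refuse to run unless the posted token matches the session
    session_start();
    if (empty($_POST['csrf_token']) || empty($_SESSION['csrf_token'])
        || !hash_equals($_SESSION['csrf_token'], $_POST['csrf_token'])) {
        http_response_code(403);
        exit('Invalid request.');
    }
    unset($_SESSION['csrf_token']); // single use: a replayed POST will fail
    // ...send your emails, redirect to PayPal, etc...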
If it's a bot, the bot would need to parse the form for the CSRF token first. Or you could step it up and make that security key an image the user has to type out (a CAPTCHA).
How to properly add CSRF token using PHP
It's something you should also add to your login forms etc., or else you'll have bots brute-forcing their way in.
Maybe you could set a $_SESSION[] variable on the form page, then check it on your processing page and unset it after execution. Sounds like the simplest way to me, but you should hear out what others suggest. You can find documentation on $_SESSION[] variables here: PHP $_SESSION
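A minimal sketch of that session-flag idea (the key name is illustrative):

    <?php
    // form.php -- set the flag when the form page is actually rendered
    session_start();
    $_SESSION['form_shown'] = true;

    <?php
    // registration-process.php -- refuse to run unless the form page ran first
    session_start();
    if (empty($_SESSION['form_shown'])) {
        http_response_code(403);
        exit('Please use the form.');
    }
    unset($_SESSION['form_shown']); // one submission per visit to the form page
    // ...process the POST data...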
Add a token to the form when generating the page, and save it into the session.
When you get the POST data, check the token against the one in the session.
And you probably want to use a CAPTCHA to protect yourself from the bots.
Related
I'm trying to protect my backend from multiple POST requests, to avoid duplicate data in the database and server overload.
I've already blocked this on the frontend by disabling the submit button after the first click, but that will not prevent some "smart user" from submitting my form from the console, or disabling JavaScript on the page and trying something.
So I want to know if Laravel has some solution for this case.
PS: I've already tried some solutions on the backend too; if you want, I can post them here.
As requested:
So one of my alternatives is to check whether the incoming data is already in the database and deny the request if it is. That will prevent the duplicate data, but not the server overload.
Another alternative is to create a single-use token in the session in the controller's create() method, send the token to the view, and put it in a hidden field. Then retrieve it from the POST request and check the posted token against the session token. If the two tokens are the same, unset the session token so other requests can't reuse it, and deny any request that tries.
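A rough sketch of that single-use-token approach in Laravel (view, field, and session-key names are hypothetical):

    use Illuminate\Http\Request;
    use Illuminate\Support\Str;

    public function create()
    {
        // issue a fresh single-use token and hand it to the view
        $token = Str::random(40);
        session(['form_token' => $token]);

        return view('items.create', ['formToken' => $token]);
        // in the Blade view:
        // <input type="hidden" name="form_token" value="{{ $formToken }}">
    }

    public function store(Request $request)
    {
        // pull() reads the token AND removes it, so a second submit can't reuse it
        $expected = session()->pull('form_token');

        if (!$expected || !hash_equals($expected, (string) $request->input('form_token'))) {
            return redirect()->back()->withErrors('Duplicate or invalid submission.');
        }

        // ...validate and save as usual...
    }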
If you know who your users are (because they have an ID) then this is somewhat easy to do. You could possibly use some sort of quick-access store such as Redis to check in that a user is in an editing state while the action is being carried out.
However, that creates complications of its own. It also doesn't work if you don't know who your users are.
The safer thing would be to make sure that your requests can handle potential problems. Consider using database transactions to ensure the integrity of the data.
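Building on the transaction suggestion, a rough Laravel sketch (table and column names are hypothetical; the real duplicate guard should be a unique index on the column):

    use Illuminate\Support\Facades\DB;

    DB::transaction(function () use ($request) {
        // lockForUpdate() serializes concurrent checks on the same email
        $exists = DB::table('registrations')
            ->where('email', $request->input('email'))
            ->lockForUpdate()
            ->exists();

        if (!$exists) {
            DB::table('registrations')->insert([
                'email'      => $request->input('email'),
                'created_at' => date('Y-m-d H:i:s'),
            ]);
        }
    });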
It really depends on what you're trying to avoid, why you're trying to avoid it, and what kind of data you're worried about duplicating.
If it's very important for you to protect against multiple submits, you can put a random string in a hidden input in your form, store it in a dedicated session value for each form you have, and check the two against each other every time. After that, change the session value each time your process finishes.
I am a little new to PHP, and I have gotten in the habit of creating a specific file that handles all the form processing.
For example, I have one PHP file that displays the actual form, let's call it "registration.php", and it specifies "registration-process.php" as its action. A user fills out the registration form on registration.php and hits submit, and the data is POSTed to registration-process.php because it was specified as the action file by the form.
Now my question is this: can't someone who knows what they are doing POST data to registration-process.php without going through registration.php? That would have the potential to lead to unexpected consequences.
Is there any way to ensure that registration-process.php will ONLY accept POSTed data from registration.php? Like maybe a hidden field with a value that gets encrypted via some PHP code, and that value gets checked by the registration-process.php file? I wouldn't know how to do that, however, or if that's even the best solution.
Yes, using a hidden "security token" field is a common way to verify a form's integrity. Many public forums use this method.
Try Googling "php form security token" or check out this site:
http://css-tricks.com/serious-form-security/
Can you accept POST data from only one location? Probably. Is it worth it? Probably not.
As long as you are validating your form fields correctly (making sure what you're getting is within the realm of what you're expecting), there won't be any negative consequences of leaving it so anything can POST to it.
Also, technically you can send POST data to any file on the web; whether it means anything just depends on what the file does with it.
Also, what Mario Werner is talking about is CSRF tokens. Those won't stop other things from POSTing to your site; they just add a level of security that makes sure the request came from the right place. For a detailed explanation, you can read this: http://en.wikipedia.org/wiki/Cross-site_request_forgery
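As a rough illustration of validating "within the realm of what you're expecting" (field names are hypothetical; the ?? operator assumes PHP 7+):

    <?php
    // registration-process.php -- server-side validation sketch
    $errors = [];

    // FILTER_VALIDATE_EMAIL returns false for malformed input, null if missing
    $email = filter_input(INPUT_POST, 'email', FILTER_VALIDATE_EMAIL);
    if (!$email) {
        $errors[] = 'Invalid email address.';
    }

    $name = trim($_POST['name'] ?? '');
    if ($name === '' || mb_strlen($name) > 100) {
        $errors[] = 'Name is required, 100 characters max.';
    }

    if ($errors) {
        http_response_code(422);
        exit(implode("\n", $errors));
    }
    // ...only now use $email and $name...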
I've got a submission form with 9 fields, 6 of which require validation, including an upload field with file size and file type validation.
Generating a random token, to prevent CSRF is working, but what is the correct way to validate when using a token?
If I do the validation within the same file, the token is regenerated when the page reloads for validation. (Can this be prevented? I've tried isset(), but it still regenerates.) However, using the same file prevents the user's name and email from being stored in a session.
Is it best to do the validation within a separate file, which then redirects back to the form with basic variables in the URL for each error, i.e. http://www.example.com/form?n=1
Using a separate file would also mean storing the form data within session, so the form can be repopulated if errors exist on the redirect.
Any help gratefully received.
From experience, CodeIgniter has a great CSRF implementation, among other security measures. I would suggest that you go over their code to gain a good understanding of the whole process. Also see this.
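For the regeneration problem from the question, the usual fix is to create the token only when the session doesn't already hold one, so a validation reload keeps the same value. A minimal plain-PHP sketch (the key name is illustrative):

    <?php
    session_start();

    // only generate a token if this session doesn't already have one;
    // reloading the form with validation errors then keeps the old token
    if (empty($_SESSION['csrf_token'])) {
        $_SESSION['csrf_token'] = bin2hex(random_bytes(32));
    }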
For a system I'm working on I've got a bit of a problem: I'm messing with one of the basic rules of HTTP and I'm allowing users to post data through a GET request.
Don't get mad at me yet: I've got a reason for this: Users arrive in my application from an external environment and I can't prompt them for any extra input (so all necessary data is in the GET query). They should be able to close the browser window right after it opens and the input should be saved. And no, I can't do this through AJAX, an API or other under-the-hood method.
These requirements kind of rule out captcha, calculations, forms etc. So I'm left with the problem that I really do want some type of verification to prevent bots/crawlers from "accidentally" submitting something.
One of the solutions I am looking into is making a very lightweight landing page that submits itself through JavaScript onload, but it would be the ugliest thing in my application, so I'm trying to avoid it. Another is to let the landing page not do any of the processing but instead use an AJAX call to do it. This would, however, mean that older browsers (and many mobile phones) would have to use another solution.
Background: Application written in PHP5.3, built on Yii Framework, 100% cross-browser compatible (this includes pretty much every mobile phone out there).
Some more background: The "external environments" I'm talking about vary from e-mail clients to websites. Manipulation of our content at runtime isn't possible.
Update: Here's what I'm going to do: I'm probably going to combine solutions posted here in a fallback mechanism so that a chain of verifications will be attempted:
1. Ajax verification
2. Non-Ajax javascript verification (automatic form submission)
3. Prompt for user input (user has to click a confirm button)
Besides this I'm going to implement a bot trap as described at http://www.kloth.net/internet/bottrap.php
After I'm done with building this I'll update the post if I did anything different.
It's hard to understand where your app is and where the external environments really are. But one simple bot-removal technique I use is to put a hidden field named 'login' or 'name' in the form and give it an empty value.
Humans will never fill in this hidden field, but spam bots almost always do. So you can discard any request where that field is not empty.
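A minimal sketch of that honeypot (the field name 'login' is just bait; hide the wrapper with CSS rather than type="hidden", so naive bots still fill it in):

    <!-- in the form -->
    <div style="display:none">
        <label for="login">Leave this field empty</label>
        <input type="text" name="login" id="login" value="">
    </div>

    <?php
    // in the processor: any value here means a bot filled the form
    if (!empty($_POST['login'])) {
        http_response_code(400);
        exit; // silently drop the request
    }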
Now you must prevent crawlers, and not only spam bots. I've never done it, but here are some thoughts. You could add a hidden 'human' input to the form on the first mouseMove event (but keyboard-only users -- and think about blind people -- would be treated as robots). So maybe, if this field is not there, you can launch a JavaScript confirm() asking "Confirm that you are a robot, or click cancel if you are human".
You can make your anchor link contain a default value that this hidden field's value will overwrite in JS. Most crawlers will not overwrite the values, especially if they must cancel a confirmation to get the right behavior (and you avoid the confirmation, via the mouseMove event, for most users).
If you are able to modify the place that your users are coming from, you could try including a checksum. Calculate some kind of checksum or hash of all the fields in the GET request and add it to the GET request itself (i.e. through JavaScript, but do it in the place your users are coming from, not where they are landing). Then, in your application, reject all hits with an incorrect checksum.
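A rough sketch of that checksum idea in PHP (the shared secret and parameter names are hypothetical; both ends must build the signed string identically, hence the ksort(); hash_equals() needs PHP 5.6+, so use a constant-time compare on older versions):

    <?php
    $secret = 'shared-secret-key'; // known to both ends, never sent over the wire

    // sender side: sign the parameters before building the link
    $params = ['user' => '123', 'action' => 'save'];
    ksort($params);
    $params['sig'] = hash_hmac('sha256', http_build_query($params), $secret);
    $url = 'https://example.com/landing?' . http_build_query($params);

    // receiver side: recompute and compare before trusting anything
    $incoming = $_GET;
    $sig = isset($incoming['sig']) ? $incoming['sig'] : '';
    unset($incoming['sig']);
    ksort($incoming);
    $expected = hash_hmac('sha256', http_build_query($incoming), $secret);

    if (!hash_equals($expected, $sig)) {
        http_response_code(403);
        exit('Bad signature.');
    }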
So I have a form that can create an account, and right now the account is created by calling a JavaScript REST API. I was thinking that it might be really easy to hack that programmatically, since all they would need to do is look at the JavaScript to find the URL to spam, and that it might be safer to do the processing in a PHP script. Then I thought, well, they could just look at the form to find the URL just as easily if I don't do it through JavaScript. The form is going to be processing only POST data, but I'm not sure if that is enough, or whether it matters if I process it through JavaScript or PHP.
What is the best way to prevent someone from spamming a form programmatically (i.e. prevent them from writing server-side code, like PHP, or client-side code, like JavaScript, that spams the processing script)?
One way is to use a CAPTCHA, such as reCAPTCHA, to filter the bots out, but it's not 100% protection.
Using a CAPTCHA is probably the first method:
Google's Version
Secondly, I would do data checking on the server side, and possibly email verification; I would have a cron job clean out the rows in your table that don't have e-mail verification.
With these two methods you should avoid a good majority of it.
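A sketch of that cron cleanup (table and column names are hypothetical):

    <?php
    // cleanup.php -- run from cron, e.g.  0 * * * *  php /path/to/cleanup.php
    $pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

    // drop rows that never verified their e-mail within 24 hours
    $pdo->exec(
        'DELETE FROM users
         WHERE email_verified = 0
           AND created_at < (NOW() - INTERVAL 24 HOUR)'
    );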
Go for reCAPTCHA. It's pretty easy.
You can obtain a key pair there by registering your website URL. Use that key to generate the reCAPTCHA image/textbox in your form. Your form's data will be posted and added to the database only if the entry matches the text displayed in the image, otherwise not (that's a server-side check that you have to keep). You'll find plenty of related code via Google :)
Another technique, which most websites follow nowadays, is to send an account activation link to the user via email. The account only becomes usable when that activation link is clicked. You can also set an expiration time (say, 24 hours) for this purpose.
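A rough sketch of such an activation flow (assumes an existing PDO connection in $pdo; table, column, and file names are hypothetical; random_bytes() needs PHP 7):

    <?php
    // on signup: store the account inactive, with a token and an expiry
    $token = bin2hex(random_bytes(32));
    $stmt  = $pdo->prepare(
        'INSERT INTO users (email, password_hash, activation_token, token_expires, active)
         VALUES (?, ?, ?, NOW() + INTERVAL 24 HOUR, 0)'
    );
    $stmt->execute([$email, $passwordHash, $token]);
    mail($email, 'Activate your account',
         'Click to activate: https://example.com/activate.php?token=' . $token);

    <?php
    // activate.php: activate only while the token is valid, then burn it
    $stmt = $pdo->prepare(
        'UPDATE users SET active = 1, activation_token = NULL
         WHERE activation_token = ? AND token_expires > NOW() AND active = 0'
    );
    $stmt->execute([isset($_GET['token']) ? $_GET['token'] : '']);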
Hope this helps.