How to stop a bot from submitting a registration form - PHP

A bot is registering dummy users on my website. Today I implemented a captcha that asks the user to add two numbers, storing the result in a SESSION variable, on the theory that only a human can answer it and proceed with registration. But to my confusion, the same bot is still entering the correct sum and registering dummy users on the website.
Is there any other feasible method to stop this bot from registering dummy users?
I'm using Zen Cart.

Try using reCAPTCHA. It is most likely a more advanced solution than a homemade arithmetic captcha.

Use reCAPTCHA or one of the other anti-bot solutions you can find via Google.

I once added some simple JavaScript that appends an additional parameter to the form, which only happens when the client has JavaScript enabled. Bots often don't have JS support.
Another solution is to record how many seconds the user was on the page before submitting the form; bots send it immediately after the first request, without any delay.
Another thing I did once was to check the client's IP address, and if it was from a country other than my own, require the user to solve a captcha.
Changing the names of the form parameters is also helpful.
So is sending an individual hash with each form GET request and checking it when the POST arrives. A minimal sketch of the delay and hash checks follows.
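A minimal sketch of those last two ideas, assuming PHP 7+ and hypothetical file names (form.php / register.php):

    <?php
    // form.php (hypothetical name): issue a token and remember when the form was served.
    session_start();

    $token = bin2hex(random_bytes(16));      // individual hash for this form request
    $_SESSION['form_token']  = $token;
    $_SESSION['form_issued'] = time();

    echo '<form method="post" action="register.php">
            <input type="hidden" name="token" value="' . $token . '">
            <!-- ...visible fields... -->
            <input type="submit" value="Register">
          </form>';

    <?php
    // register.php (hypothetical name): reject posts that are too fast or carry the wrong token.
    session_start();

    $tooFast  = (time() - ($_SESSION['form_issued'] ?? 0)) < 5;  // bots post without delay
    $badToken = !hash_equals($_SESSION['form_token'] ?? '', (string)($_POST['token'] ?? ''));

    if ($tooFast || $badToken) {
        http_response_code(403);
        exit('Registration rejected.');
    }
    // ...proceed with the normal registration flow...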

Related

How to stop people from flooding PHP API with requests?

I am using a simple PHP API that takes requests and connects to a MySQL DB to store/retrieve user information. For example, login is done via an HTTP POST to the API with a username and password.
How do I prevent people from flooding my API with requests and potentially creating thousands of accounts in a few seconds?
Thanks
You could serve a token that is generated and remembered on the server side, rendered into your forms, and validated when the form is sent back to your server. That prevents a client from simply firing off new POST requests without actually requesting the corresponding form from your server, since they need the token your server generated to get through.
Then there is the captcha already mentioned, which would be overkill for a login form from my point of view; but for something like registering a new user, the captcha in combination with the token sounds very good to me.
UPDATE
I just found this article about flood protection for script execution in general. I think their approach is good, as is the IP tracking, provided you can use memcache or something similar to speed the checks up.
First, when registering a user, also save their IP address at registration time in the DB.
If that IP already registered within the previous 45 minutes, reject the request (a sketch of this check follows).
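A minimal sketch of the 45-minute IP throttle, assuming PDO/MySQL and a hypothetical registrations table with ip and created_at columns:

    <?php
    // Reject a registration if this IP already registered within the last 45 minutes.
    $pdo = new PDO('mysql:host=localhost;dbname=app', 'dbuser', 'dbpass');

    $stmt = $pdo->prepare(
        'SELECT COUNT(*) FROM registrations
         WHERE ip = :ip AND created_at > (NOW() - INTERVAL 45 MINUTE)'
    );
    $stmt->execute([':ip' => $_SERVER['REMOTE_ADDR']]);

    if ($stmt->fetchColumn() > 0) {
        http_response_code(429);   // too many requests from this address
        exit('Please wait before registering again.');
    }

    // ...create the account, saving the IP alongside it...
    $insert = $pdo->prepare('INSERT INTO registrations (ip, created_at) VALUES (:ip, NOW())');
    $insert->execute([':ip' => $_SERVER['REMOTE_ADDR']]);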
Another method is the captcha; personally I prefer a different method that I found to be even more effective against robots.
Instead of asking the user to "type what they see in an image" to verify they are human (or a robot with sophisticated image processing),
add another field (for example, city) and hide it with JavaScript.
Robots will submit that field to the server; humans will not.
Note that a robot would have to run the JavaScript to know which fields are hidden, and that is a time-consuming step they usually skip
(see the Turing halting problem). A sketch of this honeypot follows.
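A minimal sketch of that honeypot, with hypothetical names; the trap field looks normal to a bot but is hidden from real browsers by JavaScript:

    <form method="post" action="register.php">
        <div id="city-field">
            <label>City <input type="text" name="city" value=""></label>
        </div>
        <!-- ...the real fields... -->
        <script>
            // Real browsers run this and never see the trap field; most bots do not.
            document.getElementById('city-field').style.display = 'none';
        </script>
    </form>

    <?php
    // register.php (hypothetical name): any value in the trap field marks a bot.
    if (!empty($_POST['city'])) {
        exit; // silently drop the submission
    }
    // ...continue with registration...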

What techniques are there for preventing multiple submissions to a competition?

The Project
We have a competition coded in PHP, with CodeIgniter. The form has validation on email addresses and mobile numbers. The page itself is hosted inside an iframe on a different domain (it's an agency-client relationship).
The Problem
We get users with 1000s of entries. We know they are fake because:
They use the same mobile number - presumably they figured out a mobile number that passes the validation and then reuse it every time.
The email addresses are all on weird domains, with some of the domains repeated multiple times.
However, the IP addresses are unique, the entries are spread over a few days, the domains themselves have MX records, the user-agents look normal.
The client doesn't want to do anything which could result in fewer entries.
The Question
What are the pros and cons of methods like Captcha? What UI and code patterns have you used that worked?
One method I have read about is to accept suspicious entries, so that spammers' entries go through, but flag their data as 'suspicious' and check it manually. What data can I check to decide whether an entry is suspicious?
Some methods you could use:
Captcha: stops bots submitting the form.
Email Validation: send them an email with a unique link to activate their competition entry. Stops invalid email addresses (a sketch of this flow follows the list).
Mobile Number Validation: send them a text message with an activation code. Stops invalid phone numbers.
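A minimal sketch of the email-validation flow, assuming PHP 7+, a connected PDO handle in $pdo, and a hypothetical entries table:

    <?php
    // On submission: store the entry unconfirmed and mail a unique activation link.
    $token = bin2hex(random_bytes(16));
    $stmt  = $pdo->prepare(
        'INSERT INTO entries (email, mobile, token, confirmed) VALUES (:e, :m, :t, 0)'
    );
    $stmt->execute([':e' => $_POST['email'], ':m' => $_POST['mobile'], ':t' => $token]);

    mail($_POST['email'], 'Confirm your entry',
         "Click to confirm your entry: https://example.com/confirm.php?token=$token");

    <?php
    // confirm.php (hypothetical name): flip the entry to confirmed when the link is clicked.
    $stmt = $pdo->prepare('UPDATE entries SET confirmed = 1 WHERE token = :t');
    $stmt->execute([':t' => $_GET['token'] ?? '']);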
In my opinion your approach should not be to prevent submission of entries but to require a level of validation on the details entered.
CONS of CAPTCHA:
Users hate it, and it can be frustrating when implemented poorly (a failed captcha that resets the other form fields, for instance).
Can be difficult for legitimate users to complete when the letters are hard to read.
Doesn't always work: someone scammed Ticketmaster by beating reCAPTCHA a few months ago, for instance*.
Ugly, more code to implement, and it shifts the burden of responsibility from you to the users. "PROVE YOU ARE HUMAN" is not what I want to see when sending a form; it's rather insulting.
@Nick's got the right idea: use text/email validation. IP checking can be OK sometimes, but as you said, you're getting unique IPs with the same mobile number, so it's not reliable.
There are lots of great posts here regarding CAPTCHA alternatives, definitely worth a read if you plan on employing one. You'll probably have to find a balance between making it easy for the user (encouraging submissions) and front-end security techniques.
Why, though, can't you simply disregard duplicate mobile numbers, or duplicate phone number + IP combinations? Just because they can submit multiple times doesn't mean you have to accept it. If it is a human, let them think they are sending in multiple votes :)
*Ticketmaster used various means to try to thwart Wiseguy’s operation, at one point switching to a service called reCaptcha, which is also used by Facebook. It’s a third-party Captcha that feeds a Captcha challenge to a site’s visitors. When a customer tries to purchase tickets, Ticketmaster’s network sends a unique code to reCaptcha, which then transmits a Captcha challenge to the customer.
But the defendants allegedly were able to thwart this, as well. They wrote a script that impersonated users trying to access Facebook, and downloaded hundreds of thousands of possible Captcha challenges from reCaptcha, prosecutors maintained. They identified the file ID of each Captcha challenge and created a database of Captcha “answers” to correspond to each ID. The bot would then identify the file ID of a challenge at Ticketmaster and feed back the corresponding answer. The bot also mimicked human behavior by occasionally making mistakes in typing the answer, authorities said.
Captcha is effective spam protection, but it confuses real people all too often.
There is a workaround, though: you can use JavaScript to hide the captcha from real users (browsers with JavaScript turned on) while it stays "visible" to spam bots (which have no JS). It's quite simple: with JS you set the div holding the captcha to display:none and create a hidden input whose value contains the answer from the captcha image. A sketch follows.
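A minimal sketch of that trick, assuming the expected answer is kept in $_SESSION['captcha_answer'] (a hypothetical key). Note that because the answer is written into the page for the JS to copy, a bot that parses the HTML (rather than merely skipping JS) can still pass:

    <div id="captcha-box">
        <img src="captcha.php" alt="captcha">
        <input type="text" name="captcha">
    </div>
    <script>
        // JS-capable browsers hide the captcha and answer it automatically;
        // bots without JS are left facing the visible challenge.
        document.getElementById('captcha-box').style.display = 'none';
        var auto = document.createElement('input');
        auto.type  = 'hidden';
        auto.name  = 'captcha';
        auto.value = '<?php echo $_SESSION['captcha_answer']; ?>';
        document.forms[0].appendChild(auto);
    </script>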
The strongest approach may be email validation, though it sometimes means rewriting part of the application. When a user submits a reply, register it as inactive and send a validation email to the address provided. If the address is valid, clicking the link validates it and you can switch the reply to active status.
A good workaround to keep users from re-submitting forms on refresh is to redirect them to the same page after the form is submitted and processed (see the sketch below). Yes, it takes a second or two longer to view the result, but it's much safer.
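The redirect-after-POST part is just a header; a minimal sketch, with a hypothetical page name:

    <?php
    // After processing the POST, bounce back to the page via GET
    // so a browser refresh cannot re-submit the form.
    header('Location: form.php?submitted=1');
    exit;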

Human verification without user action

For a system I'm working on I've got a bit of a problem: I'm messing with one of the basic rules of HTTP and I'm allowing users to post data through a GET request.
Don't get mad at me yet: I've got a reason for this: Users arrive in my application from an external environment and I can't prompt them for any extra input (so all necessary data is in the GET query). They should be able to close the browser window right after it opens and the input should be saved. And no, I can't do this through AJAX, an API or other under-the-hood method.
These requirements pretty much rule out captchas, calculations, forms, etc. So I'm left with the problem that I really do want some type of verification to prevent bots/crawlers from "accidentally" submitting something.
One of the solutions I am looking into is a very lightweight landing page that submits itself through JavaScript onload, but it would be the ugliest thing in my application, so I'm trying to avoid it. Another is to have the landing page do none of the processing and instead use an AJAX call for it. However, that would mean older browsers (and many mobile phones) would need another solution.
Background: Application written in PHP5.3, built on Yii Framework, 100% cross-browser compatible (this includes pretty much every mobile phone out there).
Some more background: the "external environments" I'm talking about vary from e-mail clients to websites. Manipulating our content at runtime isn't possible.
Update: Here's what I'm going to do: I'm probably going to combine solutions posted here in a fallback mechanism so that a chain of verifications will be attempted:
1. Ajax verification
2. Non-Ajax javascript verification (automatic form submission)
3. Prompt for user input (user has to click a confirm button)
Besides this I'm going to implement a bot trap as described at http://www.kloth.net/internet/bottrap.php
After I'm done with building this I'll update the post if I did anything different.
It's hard to tell from the question where your app ends and the external environments begin, but one simple bot-removal technique I use is to add a hidden field named 'login' or 'name' and give it an empty value.
Humans will never fill in this hidden field, but spam bots almost always do, so you can discard any request where that field is not empty.
Stopping crawlers, and not only spam bots, is harder. I've never done it, but here are some thoughts: you could add a hidden 'human' input to the form on the first mousemove event (though keyboard-only users, and think of blind users, would then be treated as robots). So if that field is missing, you could fall back to a JavaScript confirm() asking "Press OK to confirm you are a robot, or Cancel if you are human".
You can give your anchor link a default value that the hidden field's JS will overwrite. Most crawlers will not overwrite the values, especially if they would have to cancel a confirmation dialog to get the right behavior (and the mousemove event avoids the confirmation for most real users). A sketch of the mousemove flag follows.
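A minimal sketch of the mousemove flag, with a hypothetical field name; as noted above, keyboard-only users would need the confirm() fallback:

    <form id="entry-form" method="post" action="save.php">
        <!-- ...fields... -->
    </form>
    <script>
        // Flag the client as (probably) human on the first mouse movement.
        document.addEventListener('mousemove', function handler() {
            var flag = document.createElement('input');
            flag.type  = 'hidden';
            flag.name  = 'human';
            flag.value = '1';
            document.getElementById('entry-form').appendChild(flag);
            document.removeEventListener('mousemove', handler);
        });
    </script>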
If you are able to modify the place your users are coming from, you could try including a checksum. Calculate some kind of checksum or hash over all the fields in the GET request and add it to the request itself (e.g. through JavaScript, but do it at the place your users are coming from, not where they are landing). Then, in your application, reject all hits with an incorrect checksum. A sketch follows.
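A minimal sketch with both ends in PHP (the answer suggests JavaScript at the source, but JS would expose the secret to anyone reading the page; hypothetical names throughout):

    <?php
    // Shared secret known to the link generator and the receiving app.
    $secret = 'replace-with-a-long-random-secret';

    // At the source: sign the query parameters before building the link.
    $params = ['user' => '42', 'action' => 'save'];
    ksort($params);                                  // canonical parameter order
    $sig  = hash_hmac('sha256', http_build_query($params), $secret);
    $link = 'https://example.com/land.php?' . http_build_query($params + ['sig' => $sig]);

    // At the landing page: recompute the checksum and compare.
    $given = $_GET['sig'] ?? '';
    $check = $_GET;
    unset($check['sig']);
    ksort($check);
    if (!hash_equals(hash_hmac('sha256', http_build_query($check), $secret), $given)) {
        http_response_code(403);   // incorrect checksum: reject the hit
        exit;
    }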

Protecting Forms From Scriptable Spamming

So I have a form that creates an account, and right now the account is created by calling a JavaScript REST API. I was thinking that might be really easy to abuse programmatically, since anyone could read the JavaScript to find the URL to spam, and that it might be safer to do the processing in a PHP script. Then I realized they could just look at the form to find the URL just as easily if I don't go through JavaScript. The form will process only POST data, but I'm not sure that is enough, or whether it matters if I process it through JavaScript or PHP.
What is the best way to prevent someone from spamming a form programmatically (i.e. prevent them from writing server-side code, like PHP, or client-side code, like JavaScript, that spams the processing script)?
One way is to filter the bots out with a captcha such as reCAPTCHA, but it's not 100% protection.
Using a captcha is probably the first method:
Google's Version
Secondly, I would validate the data on the server side and possibly add email verification; if the email is not verified, I would have a cron job clean out the unverified rows in your table (a sketch follows).
With these two methods you should avoid a good majority of it.
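A minimal sketch of that cleanup, with hypothetical table/column names:

    <?php
    // cleanup.php - run from cron, e.g.:  0 * * * * php /path/to/cleanup.php
    $pdo = new PDO('mysql:host=localhost;dbname=app', 'dbuser', 'dbpass');

    // Drop accounts whose email was never verified within 24 hours.
    $pdo->exec(
        'DELETE FROM users
         WHERE email_verified = 0
           AND created_at < (NOW() - INTERVAL 24 HOUR)'
    );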
Go for reCAPTCHA. It's pretty easy.
You can obtain a key pair there by registering your website URL. Use that key to generate the reCAPTCHA image/textbox in your form. Your form's data gets posted and added to the database only if the entry matches the text displayed in the image, otherwise not (that's a server-side check you have to keep). You'll find plenty of related code via Google :)
Another technique, which most websites follow nowadays, is to send the user an account activation link via email. The account is created only when that activation link is clicked. You can also set an expiration time (say, 24 hours) on the link.
Hope this helps.

Do you need captcha validation in a newsletter subscription?

I am using a custom captcha PHP script along with newsletter scripts that let users subscribe with an email address. Registration is done with a jQuery.post call.
My question is whether I am really safe if I remove the captcha validation from my subscription script. The subscription is simple, for example:
mydomain/subscribe.php?email=myemail@somedomain.com
subscribe.php is called with the email as a parameter via jQuery.post. I am new to web programming and don't have much idea about spammers in the above scenario. Any advice would be greatly appreciated.
Thanks
Spambots fill out every form they can get their hands on.
I receive subscriptions from bots every day and I'm slowly adding captchas to all my forms.
A bot does not know whether the form it is filling out is a subscription, a contact form, or a mass-mailing opportunity.
If you don't protect yourself, either with captchas or by moving some of the logic into JavaScript functions, you will send emails to all these addresses.
This increases your traffic, reduces your control over your subscribers, makes a mess of your stats, and sometimes you might email users who never wanted to subscribe, because a bot subscribed them through a trick (e.g. realuser+buy-viagra-now@gmail.com also reaches the user realuser@gmail.com).
So my advice: stay with the captcha.
Or modify your form to make more use of JavaScript (e.g. load the form via AJAX), because bots are not normal users with a browser; they only simulate one.
Of the answers I have read, only the one from favo is really true. A spambot is a computer program that most likely goes through Google page by page, searching your page for any input box and submit button.
I have experience with spam on my subscriptions, and adding a captcha can cost you subscriptions. Take it off for now and, if you can, moderate the emails before adding them to the mailing list. If you see you are getting too much spam, then go ahead and put the captcha back up. Really, one or two spam emails a week is normal.
So I recommend moderation until it gets ridiculous.
Depends. You'd probably be safe against general malicious subscription attempts, but not against ones specifically targeting your signup. I personally wouldn't chance it. Did a majority of subscribers complain about a difficult signup process?
What's the motivation behind removing the captcha?
I've never heard of spamming subscription lists.
Spammers are known for sending their own spam, not subscribing to someone else's %)
By removing the captcha, you would basically allow bots to subscribe.
Why they would want to depends on your content. If the newsletter contains no links, or only links to other non-interactive (or captcha-protected) resources, you're good.
If your newsletter contains links to resources where a bot can go and spam other people, chances are they will.
Something simple I have done in the past is to add a few hidden fields. Most bots will fill them in, while a normal user can't, so you simply add logic that does nothing if the hidden fields contain data.
It is not fail-safe, but it has certainly stopped large numbers of bot sign-ups on sites where I have implemented it.
I am using code like this:
jQuery.post("php/varify_captcha.php", jQuery("#contact_form").serialize(),function(result, status){
if(status == "success"){
jQuery.post("php/send_mail.php", jQuery("#contact_form").serialize(),function(result, status){
alert(status);
}
}
}
First the captcha is validated, then the mail is sent. My question is whether I should move the captcha validation inside "send_mail.php". I have no idea about spammers and their ability to read and execute code; if they can read it, then they can call "send_mail.php" directly. Let me know if the above method is safe, or whether I should move the captcha validation inside "send_mail.php" to make the code a bit safer.
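For what it's worth, a hedged sketch of checking the captcha inside send_mail.php itself, assuming PHP 7+ and that the expected answer lives in $_SESSION['captcha_answer'] (a hypothetical key), so that calling the script directly still fails:

    <?php
    // send_mail.php: validate the captcha here too, not only in varify_captcha.php,
    // so a bot cannot skip straight to this script.
    session_start();

    $given    = (string)($_POST['captcha'] ?? '');
    $expected = $_SESSION['captcha_answer'] ?? null;   // assumed session key

    if ($expected === null || !hash_equals((string)$expected, $given)) {
        http_response_code(403);
        exit('Captcha failed.');
    }
    unset($_SESSION['captcha_answer']);   // one-time use

    // ...build and send the mail here...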
