I have a classifieds website, and I just asked a question regarding whether I need to use a captcha whenever users post a new classified, etc.
Some places on my website (like contact forms) do need some sort of human verification (captcha etc...).
But I don't know whether I need one when posting a new classified, so I'm turning to you guys to decide.
Here is the procedure today:
User clicks on "new classified".
User fills out a form containing a lot of inputs/selects etc. The selects are populated via JavaScript, by the way, if that makes any difference.
The user then clicks the "Preview Classified" button.
Next is the "Preview" page, where the user sees what their classified will look like.
Here, all the form information from the first page is stored in SESSION variables.
If the user is satisfied, they have to choose a password. This is done by filling out another small form at the bottom of the "Preview" page, with two inputs, "Password" and "Verify Password". This password will later be used to delete the classified, edit it, etc.
When the password is entered and validated with JavaScript (not empty, no special chars, etc.), the user may press the "Insert Classified" button.
The "Preview" page is submitted to an "insert_ad_into_database.php" page.
Remember the information stored in SESSION variables? Well, this page uses those variables to insert the data into the databases (MySQL and the Solr indices).
If the classified got inserted, a "Success" text is echoed; otherwise "ERROR" is echoed.
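Roughly, the session hand-off works like this (the field names, table layout and PDO connection below are made-up placeholders, not my actual code):

<?php
// preview.php (simplified) -- store the first form's fields in the session
session_start();
$_SESSION['classified'] = array(
    'headline'    => isset($_POST['headline']) ? trim($_POST['headline']) : '',
    'description' => isset($_POST['description']) ? trim($_POST['description']) : '',
);
// ...render the preview plus the small password form here...

<?php
// insert_ad_into_database.php -- read the session data set on the preview page
session_start();
if (empty($_SESSION['classified'])) {
    exit('ERROR'); // user skipped the preview step
}
$ad  = $_SESSION['classified'];
$pdo = new PDO('mysql:host=localhost;dbname=classifieds', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO classifieds (headline, description, password_hash) VALUES (?, ?, ?)');
$ok = $stmt->execute(array(
    $ad['headline'],
    $ad['description'],
    password_hash($_POST['password'], PASSWORD_DEFAULT), // PHP >= 5.5; never store the password in plain text
));
unset($_SESSION['classified']); // avoid duplicate inserts on refresh
echo $ok ? 'Success' : 'ERROR';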
Do you think I need a captcha on this, given that there are actually two forms which need to be filled out (the classified form and the password form)?
Thanks
I already gave my full answer to this in your other question (let the principle of "least barrier to entry" guide you), but I'll give a recap summary here:
No, this is not safe from spam bots (but it would require a fairly sophisticated spam bot to get through this -- a standard-issue "post something to a form" spambot will break down here).
But you also shouldn't include a captcha until you've seen an actual problem with spam.
Spam bots (especially sophisticated ones) are way less common than you might think, and captchas hurt new user acquisition way more than you might think. The trade-off is easily in the direction of no captcha.
I want to hide part of a page (a div) for bots (such as Google Bot), but it should be visible to Humans.
I did quite a bit of research on the above topic but could not find a proper solution. Using the knowledge obtained from searching, I came up with the following workaround.
1. Users are offered a Google reCAPTCHA to solve.
2. The user solves the captcha and submits a request to view the content.
3. A persistent cookie and a new session variable are created.
4. The above details are recorded in a database along with the IP and User-Agent.
5. As long as the session is valid, the user can view the content.
6. If the session is absent but a valid cookie is present, a new session variable is created and the records in the database are updated.
7. If both cookie and session are invalid, the reCAPTCHA is offered again.
So users do not have to solve a captcha on each visit to the site, and a valid cookie can be used on different pages for the same purpose.
Any comments on how to limit the number of times a user has to solve a captcha, or on any weaknesses of the proposed method, would be highly appreciated.
May I suggest reversing the approach: add an input text box, then use CSS to make it invisible to the human eye, and during validation check whether this invisible text box has been filled. If it has, fail the validation, since only a bot could have seen it. This is called honeypot validation.
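A minimal sketch of such a honeypot (the field name and CSS class are made up for the example):

<style>.hp { position: absolute; left: -9999px; }</style>
<!-- A field humans never see; bots auto-filling every input will populate it -->
<input type="text" name="website" class="hp" tabindex="-1" autocomplete="off">

<?php
// Server side: discard the request if the invisible field was filled
if (!empty($_POST['website'])) {
    exit; // almost certainly a bot
}

Positioning the field off-screen rather than using display: none is a common precaution, since some bots skip fields that are explicitly hidden.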
Never depend on data from cookies: all user input is dangerous and can be manipulated, so step 6 is wrong.
About IP addresses: in some countries it is against the law to store your visitors' IP addresses without their consent, so make sure your users know their IP addresses will be stored, should they decide to view the content and continue to use the site.
I'll try to be as brief as possible regarding what I'm trying to accomplish:
Let's say I'm sending a message via SMS and email to 5,000 contacts (but I want this to work whether there are only 10 or 1,000,000, no matter the volume).
In this message, I'm promoting an event like, say, a dinner party at some fancy place. Everyone who received this message is invited.
Every message contains a URL linking to a landing page dedicated to the event. This page is responsive; it opens with a nice pic plus some text, basic information, and a PHP form to answer the invitation.
This form has a few fields. In this example, let's say:
Are you coming to the party? (answer to choose from: "Yes" or "No thanks")
Will you be accompanied? (answer to choose from: "Just me", "+1" or "+2")
Anything else you'd want to let us know? (free typing field)
[SUBMIT]
Each URL in the messages sent is actually a UNIQUE redirection, each one belonging to one of the 5,000 contacts reached.
Each contact receives his own link in both the SMS and the email.
All of this means that in the database I use to send the messages, each contact has been assigned a redirection/short URL.
This way, I can know who even just opened the landing page, without needing to ask them to fill in any information.
I just know that, for example:
Mr. Skywalker is http://website.com/dinnerparty/01
Ms. Organa is http://website.com/dinnerparty/02
Mr. Solo is http://website.com/dinnerparty/1138
Mr. Calrissian is http://website.com/dinnerparty/4469
And they all redirect to http://website.com/dinnerparty/
So far so good, right?!
Now here's the tricky part...
What I need you guys to help me with, is how to "link and lock" the PHP form on the page to these unique URLs.
Basically, when Mr. Calrissian clicks on /4469, I want him to be able to click "Yes" (I'm coming to the party), "+2", [SUBMIT].
Once he has submitted that, if he refreshes the page, he can no longer submit the form. Instead, he sees a message like "Your answer has been received, thank you!"
The page is essentially the same for everyone in the contact list, but as it's HIS link, it's HIS form that shows.
After that, I need to be able to retrieve this information (the choices he made).
I know this can be done because I came across this kind of process before. I just don't know how to reproduce it! I'm not a programmer, just a noob trying to learn "by myself".
So far, I think I have to somehow give an ID to each of the 5,000 forms, and I guess the information saying "this URL = this form" is going to be contained in the URL, which will look something like 'http://website.com/dinnerparty-356a192b7913b04c54574d18c28d46e6395428ab'.
Does all this ring any bells? My research so far (playing with the keywords "PHP" and "URL") just taught me how to transfer information from a contact form to the content of another page with the GET array, but that's not what I need.
So if you got any clue, any lead regarding how to realize this, I'd be grateful!
Thanks for reading, any help will be much appreciated!
PS: sorry for my bad English ;)
First things first, don't use incremental IDs. They are prone to what's known as a sequential number attack. Use UUIDs instead. Otherwise, a guest could enter the next/previous/random ID and possibly RSVP for another guest. This is bad news for you, but great for an attacker (they'll be doin' it for the lulz... and it could cost the victim big $$$, such as buying unnecessary dinners, for example). See https://www.youtube.com/watch?v=CgJudU_jlZ8 and https://www.youtube.com/watch?v=gocwRvLhDf8 for some good explanations.
Secondly, you should keep the UUID in the URL, such as http://website.com/guest/123e4567-e89b-12d3-a456-426655440000 -- using a script or something like .htaccess, you can pick up the UUID and pass it to a single script. For example:
RewriteEngine On
# In .htaccess (per-directory) context the URL is matched without its leading
# slash, so the pattern must not start with "/"
RewriteRule ^guest/(.*)$ /my-form.php?id=$1 [L]
From there, in your my-form.php, you can check whether $_GET['id'] exists, and then validate it (does it exist in the database? Has it previously been submitted? etc.). If it validates, you can display the form as usual and post to self (with the UUID still in the URL). On submit, you validate the ID again and process the submission.
Then, on subsequent page loads, you can deny the form from being submitted again... or, conversely, allow users to EDIT their response. Maybe they'll no longer be a +2... but the use case is up to you; some RSVPs don't allow edits after X date, so keep this in mind.
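A rough sketch of that flow (table and column names are hypothetical, and PDO is used here just as an example):

<?php
// my-form.php -- look up the guest by UUID and decide what to show
$pdo = new PDO('mysql:host=localhost;dbname=rsvp', 'user', 'pass');

$id   = isset($_GET['id']) ? $_GET['id'] : '';
$stmt = $pdo->prepare('SELECT * FROM guests WHERE uuid = ?');
$stmt->execute(array($id));
$guest = $stmt->fetch(PDO::FETCH_ASSOC);

if (!$guest) {
    exit('Unknown invitation.'); // no such UUID
}

if ($guest['responded_at'] !== null) {
    exit('Your answer has been received, thank you!'); // already submitted
}

if ($_SERVER['REQUEST_METHOD'] === 'POST') {
    // Record the answer and lock the form for future visits
    $stmt = $pdo->prepare('UPDATE guests SET coming = ?, plus = ?, note = ?, responded_at = NOW() WHERE uuid = ?');
    $stmt->execute(array($_POST['coming'], $_POST['plus'], $_POST['note'], $id));
    exit('Your answer has been received, thank you!');
}

// Otherwise render the form, posting back to this same URL so the UUID stays in the query string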
--
Now, on mass SMS/email services: you would want to have a database of users already. In order to send unique links, you'd just need to do what's known as a "mail merge" (or merge tags, etc.). Basically, you would put the URL in the newsletter/message like http://website.com/guest/[guest_id], and the mail merge would replace [guest_id] with the UUID from your database.
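If you ever wanted to do that merge yourself instead of through a service, it's plain string substitution (the contact data below is invented for the example):

<?php
// Build each contact's personalised message from a template
$template = 'You are invited! RSVP here: http://website.com/guest/[guest_id]';
$contacts = array( // in practice, rows fetched from your contacts table
    array('name' => 'Skywalker',  'uuid' => '123e4567-e89b-12d3-a456-426655440000'),
    array('name' => 'Calrissian', 'uuid' => '9b2e8f0a-1c3d-4e5f-8a6b-7c8d9e0f1a2b'),
);

foreach ($contacts as $contact) {
    $message = str_replace('[guest_id]', $contact['uuid'], $template);
    // hand $message to your SMS/email provider's API here
}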
There are plenty of mail/sms mass mailing services out there, so I don't need to list any. Pick one that has an API and fits your requirements.
Hope this helps. Good luck :)
I need to build a registration system which requires collecting a large amount of data (many fields) from the registering user, which is then inserted into a couple of tables in a database.
I don't really want to display a very long form to the user, for the sake of better UX.
This system will not run online; it is just a web app running on the desktop.
I need help, pointers, references, etc. on how I can better organize the registration process to make it more user-friendly.
This question, How to encourage a user to fill in long application forms?, has been helpful so far.
As long as you don't mind requiring that your users have JavaScript, I would use AJAX. Let's say you have 50 fields that you can logically combine into 4 different sets: the first may be about the person, asking for name, email, etc., while the next set asks for historical or employment information, like on an application.
Make one form for each set, and present a new user with the first. When he completes the first page, instead of a "Submit" or "Register" button, use an AJAX call and a "Next" button to capture the info and switch to the next page of the form with the next set of fields. You could use the AJAX calls to hold the information in a temp table in your database, and then, once the entire process is complete, write it to your members/users table.
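As a rough sketch, each "Next" click could post its page of fields to a small endpoint like this one (names are invented; here the session stands in for the temp table, which is fine for a single-desktop app):

<?php
// save-step.php -- called via AJAX after each page of the multi-step form
session_start();

$step = isset($_POST['step']) ? (int) $_POST['step'] : 0;

// Park this step's fields until the whole registration is complete
$_SESSION['registration'][$step] = $_POST;

if ($step === 4) { // last step: merge everything and write the real rows
    $data = array();
    foreach ($_SESSION['registration'] as $fields) {
        $data = array_merge($data, $fields);
    }
    // ...INSERT INTO your users/details tables using $data...
    unset($_SESSION['registration']);
}

echo 'ok'; // the front end swaps in the next set of fields on success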
You could do what other surveys and checkouts do and add a "title" above the form fields on each page, so that as users move through registration they can monitor their own progress.
I'd recommend checking out the Amazon checkout, or really any multi-page survey (you may even be able to set one up yourself on SurveyMonkey), to see how a large number of form fields can be broken down logically in a user-friendly way.
Hope it helps.
Check out this link: http://www.smashingmagazine.com/2011/05/05/innovative-techniques-to-simplify-signups-and-logins/
It talks about login and registration forms and how to make them more user-friendly. One suggestion included in the article is as follows:
At registration, don't ask the user too many questions; only the basic data, like their name, for example. Then ask for more detailed data the first time the user logs in. This way the registration won't take too long.
Maybe this helps you out :)
For a system I'm working on I've got a bit of a problem: I'm messing with one of the basic rules of HTTP and I'm allowing users to post data through a GET request.
Don't get mad at me yet; I've got a reason for this: users arrive in my application from an external environment, and I can't prompt them for any extra input (so all necessary data is in the GET query). They should be able to close the browser window right after it opens, and the input should still be saved. And no, I can't do this through AJAX, an API, or another under-the-hood method.
These requirements kind of rule out captcha, calculations, forms etc. So I'm left with the problem that I really do want some type of verification to prevent bots/crawlers from "accidentally" submitting something.
One of the solutions I am looking into is a very lightweight landing page that submits itself through JavaScript onload, but it would be the ugliest thing in my application, so I'm trying to avoid it. Another is to let the landing page do none of the processing and instead use an AJAX call for it. This would, however, mean that older browsers (and many mobile phones) would need a different solution.
Background: the application is written in PHP 5.3, built on the Yii framework, and is 100% cross-browser compatible (this includes pretty much every mobile phone out there).
Some more background: the "external environments" I'm talking about vary from e-mail clients to websites. Manipulating our content at runtime isn't possible.
Update: Here's what I'm going to do: I'm probably going to combine solutions posted here in a fallback mechanism so that a chain of verifications will be attempted:
1. Ajax verification
2. Non-Ajax javascript verification (automatic form submission)
3. Prompt for user input (user has to click a confirm button)
Besides this, I'm going to implement a bot trap as described at http://www.kloth.net/internet/bottrap.php
After I'm done with building this I'll update the post if I did anything different.
It's hard to tell where your app is and where the external environments really are. But one simple bot-removal technique I use is to put a hidden field named 'login' or 'name' in the form and give it an empty value.
Humans will never fill in this hidden field, but spam bots almost always do. So you can discard any request where that field is not empty.
Now, you must prevent crawlers, not just spam bots. I've never done this, but here are some thoughts. You could add a hidden 'human' input to the form on the first mouseMove event (though keyboard-only users -- and think about blind people -- would then be considered robots). So, if this field is not there, you could launch a JavaScript confirm asking "Confirm that you are a robot, or click cancel if you are human".
You could give your anchor link a default value that this hidden field's value will overwrite in JS. Most crawlers will not overwrite the values, especially if they must cancel a confirmation to get the right behavior (and the mouseMove event lets most users avoid the confirmation entirely).
If you are able to modify the place your users are coming from, you could try including a checksum. Calculate some kind of checksum or hash of all the fields in the GET request and add it to the GET request itself (e.g. through JavaScript, but do it in the place your users are coming from, not where they are landing). Then, in your application, reject all hits with an incorrect checksum.
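For example, an HMAC over the query fields can serve as that checksum, assuming the links are generated somewhere the secret can stay private (the parameter names below are invented):

<?php
// Sending side: sign the parameters before building the link
$secret = 'long-random-secret-known-to-both-ends';
$params = array('user' => '42', 'action' => 'confirm');
ksort($params); // fixed field order so both sides hash the same string
$params['sig'] = hash_hmac('sha256', http_build_query($params), $secret);
$url = 'http://example.com/landing.php?' . http_build_query($params);

// Receiving side (landing.php): recompute and compare before processing anything
$sig   = isset($_GET['sig']) ? $_GET['sig'] : '';
$check = $_GET;
unset($check['sig']);
ksort($check);
$expected = hash_hmac('sha256', http_build_query($check), $secret);
if ($expected !== $sig) { // on PHP >= 5.6, prefer hash_equals()
    exit; // tampered with, or crawled without a valid signature
}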
I have a classifieds website, where anyone (no need for login currently) can post a classified. It is PHP based.
The procedure for posting is currently like this:
click on "New Classified" --->
fill in a form of all information and hit "View classified before publishing it" --->
the form submits to a "verify classifieds" page, where users verify their inputs --->
If everything is okay in the "verify" page, then the user hits OK and the classified is published.
The above procedure isn't exactly optimized. The first page (new_classified), where the form is, is pretty good, but the second page (verify) uses a number of hidden inputs in another form to carry over the previous page's form inputs.
Now you know how it works on my site.
The issue today is that a lot of companies want to publish their classifieds, and a lot of classifieds at the same time. This means they currently have to fill out the form again and again.
I am thinking about creating a login, for companies only, so that their information is automatically entered into the form; all they would have to fill out are the specific classified details like "headline", "description", etc.
How should I do this in my case? Sessions?
This means I will have to create a new MySql table (I use MySql mainly) and store company-profiles there.
So, do you think converting to sessions is a lot of work? Worth it? More reliable?
I have never used sessions so I wouldn't know.
As a last note, you should know that I use a picture-upload tool on the first page, "new_classified". When a user chooses a file to upload, the page is automatically refreshed, and the image is then displayed on the same page under the "images uploaded" section. I hope the session won't interfere with this approach.
Thanks
I think it is worth your while to add logins; even on a very basic level it will help you identify who is using your site, etc.
What makes a good login system is probably a big debate among developers; whether it's basic or not doesn't matter, as I think the concepts stay the same.
In your case I would suggest session cookies along with a login table containing the user details. This lets you verify the user on more than one occasion during his/her visit to the site.
A login is checked against a user entry in a table, and then a session cookie is created. You can also choose to never expire this session.
You can then check at every step that the user is the one who is supposed to be logged in, and fetch the company's details by username. That makes for a better query, in my opinion.
Sessions aren't a lot of work, and they're relatively easy to learn.
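A minimal sketch of what that looks like (table and column names are just examples):

<?php
// login.php -- verify credentials, then remember the company in the session
session_start();
$pdo  = new PDO('mysql:host=localhost;dbname=classifieds', 'user', 'pass');
$stmt = $pdo->prepare('SELECT id, password_hash FROM companies WHERE username = ?');
$stmt->execute(array($_POST['username']));
$row = $stmt->fetch(PDO::FETCH_ASSOC);

if ($row && password_verify($_POST['password'], $row['password_hash'])) { // PHP >= 5.5
    $_SESSION['company_id'] = $row['id']; // logged in for the rest of the visit
}

<?php
// Top of new_classified.php (and any other page that needs the login)
session_start();
if (empty($_SESSION['company_id'])) {
    header('Location: login.php');
    exit;
}
// From here you can pre-fill the form with the company's stored profile

And since the session ID lives in a cookie, the refresh your image-upload tool triggers won't log the company out.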
http://www.php.net/manual/en/book.session.php
http://www.9lessons.info/2010/02/php-login-script-with-encryption.html is a good example of what you can do with this. Still, have a look around; there are plenty of great tutorials like this on the web.