better implementations of email activation links - php

I've been seeing a lot of activation links sent via email, and I implemented one myself, but I'm just not satisfied with it (the long activation links from other sites kind of look cool, but I can't see the point of them being so long). Here's what my activation link looks like:
site/controller/method/4/MJKL
The 3rd segment is the user ID, and the 4th one is a token randomly generated during registration.
I often see this implemented with query strings, but what's the difference between using query strings and using URL segments?
Will it help if I pass any more information other than the user ID and the token?

They might have a longer token in there to reduce the chances of an attacker guessing it correctly.
Don't bother passing any more information than you need. And don't be jealous just because the other URL is longer. Size doesn't matter, or so they tell me ;)

Is it the length you're concerned with, or the look of the URL?
I'm guessing you're using Zend Framework or something similar, that's why it shows "segments" as opposed to a parameter string.
Have you thought about using something like TinyURL? The TinyURL API with PHP is super easy.
Edit: Another option, if you are building HTML emails, is simply keeping the anchor text short:
Click here to activate
I'm still assuming you want to make the URL shorter. If you want to make it longer, you could always append a session ID, a random hash or some other relatively useless information on the end of it that's ignored later.

If you're using an MVC setup, then it generally makes more sense to use the segmented (and also more SEO-friendly) URL style. However, this is no different from passing a query string, because the server (most likely Apache) is taking the input URL segments and passing them as a query string to the script anyway.
As for the long ID, that's not necessary. You can either generate a custom, shorter ID tag, or use something like uniqid() to generate a shorter unique ID for the user to activate with.
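A short but hard-to-guess token can be generated with PHP's CSPRNG; this is a minimal sketch (not tied to any particular framework), and note that random_bytes() requires PHP 7+, while uniqid() is time-based and therefore guessable, so it is best avoided for security tokens:

```php
<?php
// Generate a 16-byte (32 hex character) activation token.
// random_bytes() is cryptographically secure, unlike uniqid(),
// which is derived from the current time and can be predicted.
function makeActivationToken(): string
{
    return bin2hex(random_bytes(16)); // 128 bits of entropy
}

$token = makeActivationToken();
// Store $token alongside the user row, then email a link such as:
// site/controller/method/4/<token>
```

Thirty-two characters is still far shorter than many sites' activation links, but with 128 bits of entropy, guessing it is not a practical attack.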

Related

Share login between PHP and ASP Classic (VBScript)

I'm trying to update/maintain an older web site that was initially written in Classic ASP/VBScript, and later had PHP pages added. I'd like to set it up so that PHP handles the login, but then that logged in state can be shared between PHP and ASP/VBScript. Note that the pages and languages are fairly intermingled -- somebody spending time on the site might come across several different pages in each language, in no particular order.
(Eventually I expect it to be completely rewritten in PHP, but I have to eat this elephant one bite at a time; for now I'm simply trying to improve security.)
Let's assume I've successfully logged in and validated the user in PHP using something like phpPass. How do I tell the ASP/VBScript page they just pulled up that they're logged in? How can I best do this securely?
(And thank you for any help!)
You cannot share sessions across Classic ASP/VBScript and PHP as they create/use them differently. My solution isn't that secure but would work:
Log the user in via 1 of the languages (say PHP)
Pass the initial session variable to a URL and get ASP to look at the querystring and then create another session for ASP there.
That would deal with it...although not that secure!
The best answer I've been able to find for this issue was the following. Specific to sharing a login between Classic ASP and ASP.net, but the methodology is exactly the same:
As you probably already know, classic asp and asp.net cannot share the
same session state, so you do need to have a mechanism to log from one
into the other.
What I would do is: when someone logs in, create a unique GUID that
you save in the database for that user. When you jump from one site to
the other, pass that GUID into the query string. When you try to
auto-log them into the other site, look up that GUID and see if it's
attached to anyone. If it is, log them in.
This way you aren't passing anything that a user could guess or
decrypt.
Additionally, it's smart to add a timestamp to the database as well; and the GUID should only be valid for a second or two. Log in on the PHP end, then flip over to ASP and check the GUID.
Not totally secure, but appears to be about as secure as I'm going to find.
source: https://stackoverflow.com/a/921575/339440
Edit to add: per comments, also record the user's IP address to the database and compare it on the ASP side. No teleporting allowed!
CORRECTION: In this case "GUID" is a misnomer. What you need here is a random string of characters, not a GUID. A GUID is a semi-random construct with one of a handful of specific formats, and is not applicable here.
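Putting the pieces together (random string rather than GUID, timestamp, IP check), the issuing side of this handoff could be sketched in PHP like this; the table and column names are hypothetical:

```php
<?php
// Issue a single-use handoff token when the user jumps from the PHP
// site to the Classic ASP site. Table/column names are illustrative.
function issueHandoffToken(PDO $db, int $userId, string $ip): string
{
    $token = bin2hex(random_bytes(32)); // random string, not a GUID
    $stmt = $db->prepare(
        'INSERT INTO handoff_tokens (user_id, token, ip, created_at)
         VALUES (?, ?, ?, ?)'
    );
    $stmt->execute([$userId, $token, $ip, time()]);
    return $token;
}

// The ASP side would then look up the token, check that created_at
// is within a second or two, compare the stored IP to the requester's
// IP, delete the row (single use), and log the user in if it all matches.
```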

url shortener services API final link gets hit

I have used URL shortener services, such as goo.gl or bit.ly, to shorten long URLs in my applications using their respective APIs. These APIs are very convenient; unfortunately, I have noticed that the long URL gets hit when they shorten it. Let me explain the issue a bit. Let's say, for instance, that I want users to validate something (such as an email address, or a confirmation), and my application offers them a link to visit in order to do so. I take this long URL and use the API to shorten it. The target link (a PHP script, for example) gets hit when I call the shorten API, which makes the validation process useless.
One solution would be to add an intermediate button on the target page which the user has to click to confirm, but that adds another step to the validation process, which I would like to keep simple.
I would like to know if anyone has already encountered this problem, or if anyone has a clue how to solve it.
Thanks for any help.
I can't speak to Google but at Bitly we crawl a portion of the URLs shortened via our service to support various product features (spam checking, title fetching, etc) which is the cause of the behavior you are seeing.
In this type of situation we make two recommendations:
Use robots.txt to mark relevant paths as "disallowed". This is a light form of protection, as there's nothing forcing clients to respect robots.txt, but well-behaved bots like BitlyBot or GoogleBot will respect your robots.txt file.
As mentioned by dwhite.me in a comment and as you acknowledged in your post, it is usually best not to perform any state-changing actions in response to GET requests. As always, there's a judgement call on the associated risks vs. the added complexity of a safer approach.
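For the first recommendation, a robots.txt along these lines asks well-behaved crawlers to skip the confirmation URLs (the /validate/ path is just an example; use whatever path your activation script actually lives under):

```
# robots.txt (example path): keep well-behaved bots off validation links
User-agent: *
Disallow: /validate/
```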

php security should I hide query string in url

A number of my pages are produced from results pulled from MySQL using $_GET. It means the URLs end like this: /park.php?park_id=1. Is this a security issue, and would it be better to hide the query string from the URL? If so, how do I go about doing it?
Also I have read somewhere that Google doesn't index URLs with a ?, this would be a problem as these are the main pages of my site. Any truth in this?
Thanks
It's only a security concern if this is sensitive information. For example, you send a user to this URL:
/park.php?park_id=1
Now the user knows that the park currently being viewed has a system identifier of "1" in the database. What happens if the user then manually requests this?:
/park.php?park_id=2
Have they compromised your security? If they're not allowed to view park ID 2, then this request should fail appropriately. But is it a problem if they happen to know that there's an ID of 1 or 2?
In either case, all the user is doing is making a request. The server-side code is responsible for appropriately handling that request. If the user is not permitted to view that data, deny the request. Don't try to stop the user from making the request, because they can always find a way. (They can just manually type it in. Even without ever having visited your site in the first place.) The security takes place in responding to the request, not in making it.
There is some data they're not allowed to know. But an ID probably isn't that data. (Or at least shouldn't be, because numeric IDs are very easy to guess.)
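A minimal sketch of that idea, with a hypothetical access-control map standing in for whatever permission system the site really uses:

```php
<?php
// Authorization happens when the request is handled, not by hiding
// the ID. $acl is a stand-in for a real permission check: it maps
// user ID => list of park IDs that user may view.
function canViewPark(array $acl, int $userId, int $parkId): bool
{
    return in_array($parkId, $acl[$userId] ?? [], true);
}

// park.php would then do something like:
// if (!canViewPark($acl, $currentUserId, (int)$_GET['park_id'])) {
//     http_response_code(403); // deny the request itself
//     exit;
// }
```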
No, there is absolutely no truth to it.
ANY data that comes from a client is subject to spoofing. No matter if it's in a query string, or a POST form or URL. It's as simple as that...
As far as "Google doesn't index URLs with a ?", who-ever told you that has no clue what they are talking about. There are "SEO" best practices, but they have nothing to do with "google doesn't index". It's MUCH more fine grained than that. And yes, Google will index you just fine.
@David does show one potential issue with using an identifier in a URL. In fact, this has a very specific name: A4: Insecure Direct Object Reference.
Note that it's not that using the ID is bad. It's that you need to authorize the user for the URL. So doing permissions solely by the links you show the user is BAD. But if you also authorize them when they hit the URL, you should be fine.
So no, in short, you're fine. You can go with "pretty urls", but don't feel that you have to because of anything you posted here...

Is hiding GET parameters by rewriting it to POST and redirect a good idea?

I want to hide parameters from the URL. I'm using UUIDs instead of IDs, and when I pass one in the URL it looks a bit long and ugly. My first thought was to use little forms with hidden inputs instead of anchors, but it would be uncomfortable to replace every anchor with a form, and it also won't work when an anchor is already placed inside another form.
So my second thought was rewriting $_GET to $_POST/$_SESSION and then redirecting back to the same script. All the variables would be available, and the URL would be clean and short.
But what about the performance of this solution? Is it a good idea to do it this way?
Any help or other ideas will be appreciated. Thanks in advance.
Don't change GET to POST or vice versa for prettiness. Both HTTP methods are handled very differently in many contexts, and you don't want to cause these kind of side effects.
POST requests cannot be self-contained in a URL, i.e. try to send someone the link to a site that requires a POST request. POST requests screw with browser history, i.e. try clicking the back button to go back to a page submitted via POST. POST requests aren't indexed by search engines.
POST requests are supposed to be used to modify data on the server. Don't use them for all regular requests.
If you need prettier URLs, find some other way to reference your records. Or just stop caring about it, it's really not that important.
You will of course lose ALL search engine benefits across your entire site if you universally adopt this strategy. You should only really use $_POST when you are submitting data that needs to be saved to a storage medium (or where you are sending secure data, over HTTPS etc.); otherwise the recommendation is $_GET for 'requested' data. So you'll need to identify the use case here and follow that pattern.
I understand what you are saying about 'ugly' URLs, but would advise you to be cautious in looking for remedial action on this. One way, of course, would be to do a URL rewrite on the incoming parameters, but this would require database lookups etc. (to get the mapped nice URL string), so it could be costly.
I'll get back with any other thoughts as they occur.
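That rewrite approach can be sketched as follows, with illustrative paths: an .htaccess rule hands a slug to a PHP script, which then looks the record up by its stored URL string instead of exposing the UUID.

```apache
# .htaccess (illustrative): route /post/some-nice-slug to post.php,
# which looks the slug up in the database and renders the record.
RewriteEngine On
RewriteRule ^post/([a-z0-9-]+)/?$ post.php?slug=$1 [L,QSA]
```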

How to reject direct query?

I am using a URL to query some posts by their ID.
http://domain.com/page-name/?id=123
Visitors click the URL and will open the page and get the right post.
However, anybody who wants to can type this URL into a browser and get the post; they can even get a lot of different posts if they know other IDs. How can I reject this kind of query?
By the way, my site provides embed code for posts, so I need to allow access from other websites.
The easiest way probably would be to check the HTTP Referer via $_SERVER['HTTP_REFERER'] and make sure the visitor clicked the link on one of your pages. This will, however, prevent any kind of bookmarking as well.
Another solution would be to use something else than IDs as URL parameter. Those would be hard to guess. You could use an MD5-Hash of the id + date or something instead of just the ID. (Of course you would have to store the hash in the database!)
On some pages you can see another approach. It is mainly used for search engine optimization, but can work for you as well. Generate a string from the title of the post (something like "news_new_blog_software") and store that in the database. Then use mod_rewrite to redirect all calls of http://domain.tld/post/* to a PHP file and over there check if the string after /post/ is in your database. This might look a little nicer than MD5 hashes, but you would have to ensure URL strings are not used several times.
If you want to make it really secure there is basically no other way than using some kind of login to check the access privileges.
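The hash suggestion above can be done a bit more safely with an HMAC keyed by a server-side secret: the token can then be recomputed for verification but not guessed by visitors. This is a sketch, and the secret shown is obviously a placeholder:

```php
<?php
// Derive an unguessable token from the post ID using a keyed HMAC.
// Unlike a bare md5($id), a visitor cannot recompute this without
// knowing the server-side secret (placeholder value here).
const SECRET = 'replace-with-a-long-random-secret';

function postToken(int $id): string
{
    return hash_hmac('sha256', (string)$id, SECRET);
}

function isValidToken(int $id, string $token): bool
{
    return hash_equals(postToken($id), $token); // constant-time compare
}

// The URL becomes e.g. /page-name/?id=123&key=<token>, and the
// script rejects the request unless isValidToken() returns true.
```

A side benefit is that nothing extra needs to be stored in the database, since the token is recomputed on each request.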
However, anybody who wants to can type this URL into a browser and get the post; they can even get a lot of different posts if they know other IDs.
Exactly. That is the purpose of the World Wide Web.
And there is absolutely no reason in rejecting direct queries.
In fact, from the technological point of view, every request to you site is a "direct" one.
You are probably trying to solve some other problem (most likely an imaginary one). If you care to tell us what it is, you will get the right solution.
You can generate some kind of secret key and append it to the link URL, something like
http://domain.com/page-name/?id=123&key=1234567890
Some specific data required to generate this key is stored in a cookie.
You can use an MD5 hash of a random value + timestamp + page ID, saving that random value to a cookie. Every time you get a request, check whether the key is present in the request parameters and whether the user has the cookie; then calculate the hash and compare it with the one in the request.
You can pass the ID in a hidden field and use the POST form method.
You need authorisation, not this. This would stop people clicking through to your site from search engines or other websites.
If you don't want to implement authorisation/login, then why not try implementing the First Click Free: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=74536