I am trying to send a Topic name in the URL like,
<a href="hello?TopicN=blahblahblha">
and then output the topic name as the forum topic title. The problem is that the user can just change the name if they want. That does no harm, since I don't really do anything with the name, but I was wondering: is there a way to encrypt or swap the letters so it's not so obvious what TopicN is?
I also tried MD5, but MD5 is only one-way, so that doesn't help. I could use sessions, but I'm not sure, since I already store user login details in sessions.
Any ideas and examples would be helpful,
Thank you
MD5 is hashing, not encryption, so that won't help.
Consider passing an identifier to a row in your database so that you can look up the title.
<a href="hello.php?TopicN=1234">
Trusting the client is a big no-no for many reasons, and this is one of them.
Using the session to store this information would work, but it seems inappropriate given that (I suspect) TopicN could (or, does) change frequently.
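For instance, a minimal sketch of that lookup, assuming a PDO connection in $pdo and a hypothetical topics table:
// hello.php: look up the title by the numeric identifier from the URL.
$id = (int) $_GET['TopicN'];
$stmt = $pdo->prepare('SELECT title FROM topics WHERE id = ?'); // "topics" is a hypothetical table
$stmt->execute(array($id));
$title = $stmt->fetchColumn();
echo $title !== false ? htmlspecialchars($title) : 'Unknown topic';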
Good luck!
Ian
You could use base64_encode and base64_decode; it makes it less obvious:
base64_encode('blahblahblha') = YmxhaGJsYWhibGhh
base64_decode('YmxhaGJsYWhibGhh') = blahblahblha
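For example, a minimal sketch (note that base64 output can contain +, / and =, so it should be urlencoded when placed in a URL):
// Building the link:
$topic = 'blahblahblha';
echo '<a href="hello.php?TopicN=' . urlencode(base64_encode($topic)) . '">Topic</a>';

// On hello.php, decoding it again:
$topic = base64_decode($_GET['TopicN']);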
I'd question the purpose or benefit of this requirement though.
I would even advise you to do the reverse.
Make it more search engine and user friendly, like so: forum/thread/blabla
If you are using Apache, look up the rewrite engine (mod_rewrite).
Try this:
File: .htaccess
RewriteEngine On
RewriteRule url/(.*) url_parser.php?url=$1 [L,QSA]
File with the links (the link target is the base64-encoded URL, matching the rewrite rule above):
<a href="url/<?php echo base64_encode('hello.php?TopicN=blahblahblha'); ?>">Link</a>
File: url_parser.php
<?php
// Decode the base64-encoded URL passed in by the rewrite rule:
$encodedURL = filter_input(INPUT_GET, 'url', FILTER_SANITIZE_STRING, FILTER_FLAG_STRIP_LOW | FILTER_FLAG_STRIP_HIGH);
$unencodedURL = base64_decode($encodedURL);
$urlParts = parse_url($unencodedURL);
$newURL = $urlParts['path'];
// See if there are any encoded _GET parameters on this URL and re-attach them:
if (isset($urlParts['query'])) {
    $newURL .= '?' . $urlParts['query'];
}
// Add any extra _GET parameters from the outer request (appended by the QSA flag):
if ($_SERVER['QUERY_STRING'] !== '') {
    $newURL .= (strpos($newURL, '?') === false ? '?' : '&') . $_SERVER['QUERY_STRING'];
}
// Add any passed #fragment:
if (isset($urlParts['fragment'])) {
    $newURL .= '#' . $urlParts['fragment'];
}
// Redirect to the new URL:
header('Location: ' . $newURL);
// If you want to hide the URL completely, output it with readfile() instead of redirecting:
// readfile($newURL);
I have a bit of an odd question; I'm sure most of you realise that these sorts of questions arise out of certain situations that a developer has no control over!
I would like to work out how to keep a querystring parameter in the URL at all times. If the parameter is not set, I'd like a default to be appended to the URL: ?param=something
I asked a previous question relating to this and have been able to use .htaccess to add a default query, but this only works for the initial request, whereas I need to ensure it is always present in the address.
I am thinking of using a cookie, set with PHP and then queried with .htaccess.
So, I am asking if this is possible, and if there is a better way of doing this?
Without changing all the URLs I can't see this being possible, and as discussed it is a very hacky way to do it.
I would suggest you give them a "special" button that they use for copying a link. Make them use this button to copy the link instead of the URL. You can then control the data properly without hacking the website.
Edit: You could add some jQuery to append your param to ALL links. Have you thought about this?
$(document).ready(function() {
    $('a[href]').each(function() {
        // Use '&' if the link already has a query string:
        var sep = this.href.indexOf('?') === -1 ? '?' : '&';
        this.href = this.href + sep + 'param=<?php echo urlencode($_SESSION["myparam"]); ?>';
    });
});
You could get the param from the initial page load / session, but this would require your users to have JS enabled (which most people do now).
P.S. Untested semi-pseudo-code. I can test it properly if you decide to go down this route.
The $_GET and $_POST superglobals are not meant to be used like this. I don't know why it is required, but you can get the required functionality using sessions or cookies. What you are trying to do is not practical; the industry standard is to use a session or a cookie. Try them out; I'm sure it will help you. If you have any issues, let me know.
If you are using Apache 2, then you could add a rewrite rule that appends a GET parameter to each request that doesn't have "param=..." in its URL. Something like (untested):
RewriteCond %{QUERY_STRING} !(\A|&)param=
RewriteRule (.*) $1?param=defaultvalue [QSA]
See mod_rewrite's documentation for details. You can add another RewriteCond on %{REQUEST_URI} if you want to limit this to certain URLs.
I only learned PHP about 3 months ago, so I hope I am not leading you down the wrong path; this is what I would try.
Edit: It has been about 5 months now since I learned PHP. My previous version before this edit only "built" the GET string to append (and it probably didn't work).
// Ensure the default parameter is present:
if (!isset($_GET['param'])) { $_GET['param'] = 'something'; }

// Rebuild the query string with 'param' first, then the other GET values:
$get = '?param=' . urlencode($_GET['param']);
foreach ($_GET as $key => $value) {
    if ($key !== 'param') { // skip 'param'; we want the other gets
        $get .= '&' . urlencode($key) . '=' . urlencode($value);
    }
}
Then preg_replace your file extensions to append it:
$string = 'your webpage before output';
$patterns = array();
$patterns[0] = '/\.html/';
$patterns[1] = '/\.php/';
$patterns[2] = '/yoursite\.com /'; // with trailing space
$replacements = array();
$replacements[0] = '.html' . $get;
$replacements[1] = '.php' . $get;
$replacements[2] = 'yoursite.com/index.php' . $get;
Then print your string
echo preg_replace($patterns, $replacements, $string);
I want to allow users of my site to post URLs. These URLs would then be rendered on the site in the href attributes of <a> tags. Basically, user A posts a URL, my site displays it on the page as an <a> tag, then user B clicks it to see pictures of kittens.
I want to prevent javascript execution and xss attacks, and ensure there are no malformed urls in the output I generate.
Example: User A posts a malformed URL, supposedly to pictures of kittens. My site generates an <a> tag from user A's data, then user B clicks the resulting link. User A has actually posted a malformed URL which adds a JavaScript "onclick" event to the tag to send the victim's cookies to another site.
So I want to only allow correctly formed urls, and block out anything other than http/https protocols. Since I'm not allowing anything here which doesn't look like a url, and the user is not providing me html, it should be pretty simple to check by parsing and reforming the url.
My thinking is that parse_url should fail with an error on malformed urls, or it replaces illegal characters with '_'. I can check the separated parts of the url for allowed protocols as well. Then by constructing a url using http_build_url, I take the parts separated by parse_url and put them back together into a url which is known to be correctly formed. So by breaking them down this way first, I can give the user an error message when it fails instead of putting a sanitized broken url in my page.
The question is, will this prevent xss attacks from doing evil if a user clicks the link? Does the parsed and rebuilt url need further escaping? Is there a better way to do this? Shouldn't this be a solved problem by now with functions in the standard php libraries?
I really don't want to write a parser myself and I'm not going to even consider regular expressions.
Thanks!
What you need to do is just escape content properly when building your HTML. This means that when a value has a " in it, you build your HTML with &quot; instead.
Protecting against XSS isn't primarily about validating URLs; it's about proper escaping (although you probably want to be sure that it's an http: or https: link).
For a more detailed list of what to escape when building html strings (ie: the href attribute) see HTML, URL and Javascript Escaping
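For example, a minimal sketch of building the link with proper escaping ($userSuppliedURL is a hypothetical variable holding the submitted URL; the scheme check is an assumption added on top of the escaping advice):
$url = $userSuppliedURL;
$scheme = parse_url($url, PHP_URL_SCHEME);
if (in_array($scheme, array('http', 'https'), true)) {
    // htmlspecialchars with ENT_QUOTES escapes both " and ' in the attribute value.
    echo '<a href="' . htmlspecialchars($url, ENT_QUOTES) . '">kittens</a>';
}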
No, parse_url is not meant to be a URL validator.
You can use filter_var for this:
filter_var($someURL, FILTER_VALIDATE_URL);
So, in PHP, you would use something like:
<?php
$userlink = "http://google.com";
$newlink = htmlentities($userlink);
$link = "<a href=\"$newlink\">$newlink</a>";
?>
Depending on a few other things, you might just validate the URL by checking if it points to any content. Here is an example:
<?php
// URL to test (placeholder value):
$url = "http://www.example.com/";
$content = @file_get_contents($url);
if (!empty($content)) {
    echo "Success:<br /><iframe src=\"$url\" style=\"height:400px; width:400px; margin:0px auto;\"></iframe>";
} else {
    echo "Failed: Nothing exists at this url.";
}
?>
cURL is another option. With cURL you can fetch just the HTTP headers and then check the status code it returns, i.e. 404 = page not found, 200 = OK, 201 = Created, 202 = Accepted, etc.
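A minimal sketch of that approach using the standard cURL functions (example.com is a placeholder):
$ch = curl_init('http://www.example.com/');
curl_setopt($ch, CURLOPT_NOBODY, true);         // request headers only (HEAD)
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // don't echo the response
curl_exec($ch);
$status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);

echo ($status >= 200 && $status < 400) ? "OK ($status)" : "Failed ($status)";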
Good luck!
~John
http://iluvjohn.com/
I am really new to web applications. I am using PHP, and I have this code:
if(isset($_GET['return']) && !empty($_GET['return'])){
return = $_GET['return'];
header("Location: ./index.php?" . $return);
} else {
header("Location: ./index.php");
}
The $return variable is a URL variable which can easily be changed by a hacker.
E.g. I get the $return variable from this: www.web.com/verify.php?return=profile.php
Is there anything I should take care of? Should I use htmlentities in this line:
header("Location: ./index.php?" . htmlentities($return));
Is it vulnerable to attack by a hacker?
What should I do to prevent hacking?
Apart from that typo on line 2 (it should be $return = $_GET['return'];), you should do $return = urlencode($return) to make sure that $return is a valid query string, as it's passed as a parameter to index.php.
index.php should then verify that return is a valid URL that the user has access to. I do not know how your index.php works, but if it simply displays a page, then you could end up with something like index.php?/etc/passwd or similar, which could indeed be a security problem.
Edit: What security hole do you get? There are two possible problems that I could see, depending how index.php uses the return value:
If index.php redirects the user to the target page, then I could use your site as a relay to redirect the user to a site I control. This could be either used for phishing (I make a site that looks exactly like yours and asks the user for username/password) or simply for advertising.
For example, http://yoursite/index.php?return=http%3A%2F%2Fwww.example.com looks like the user accesses YourSite, but then gets redirected to www.example.com. As I can encode any character using the %xx notation, this may not even be obvious to the user.
If index.php displays the file from the return parameter, I could try to pass in the name of some system file like /etc/passwd and get a list of all users. Or I could pass something like ../config.php and get your database connection details.
I don't think that's the case here, but this is such a common security hole I'd still like to point it out.
As said, you want to make sure that the URL passed in through the querystring is valid. Some ways to do that could be:
$newurl = "http://yoursite/" . $return;
this could ensure that you are always only on your domain and never redirect to any other domain
$valid = file_exists($return)
This works if $return is always a page that exists on the hard drive. By checking that return indeed points to a valid file you can filter out bogus entries
If return would accept querystrings (i.e. return=profile.php?step=2) then you would need to parse out the "profile.php" path
Have a list of valid values for $return and compare against it
this is usually impractical, unless you really designed your application so that index.php can only return to a given set of pages
There are many ways to skin this cat, but generally you want to somehow validate that $return points to a valid target. What those valid targets are depends on your specification.
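For instance, a minimal sketch of the whitelist approach (the page names are hypothetical):
$allowed = array('profile.php', 'settings.php', 'inbox.php'); // valid targets for this app
$return = isset($_GET['return']) ? $_GET['return'] : '';

if (in_array($return, $allowed, true)) {
    header('Location: ./' . $return);
} else {
    header('Location: ./index.php'); // fall back to a safe default
}
exit;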
If you're running an older version of PHP 4 or 5, then I think you will be vulnerable to header injection: someone can set return to a URL, followed by a line return, followed by any other headers they want to make your server send.
You could avoid this by sanitising the string first. It might be enough to strip line returns but it would be better to have an allowed list of characters - this might be impractical.
4.4.2 and 5.1.2: This function now prevents more than one header to be sent at once as a protection against header injection attacks.
http://php.net/manual/en/function.header.php
What would happen if you put in a page that didn't exist? For example:
return=blowup.php
or
return=http://www.google.co.uk
or
return=http%3A%2F%2Fwww.google.co.uk%2F
You could obfuscate the reference by not including the .php in the variable. You could then append it in the code-behind and check for the existence of the file in your directory / use a switch statement of allowable values before redirecting to it.
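A minimal sketch of that switch approach, with hypothetical page names and the .php appended in code:
$return = isset($_GET['return']) ? $_GET['return'] : '';
switch ($return) {
    case 'profile':   // allowable values, extension omitted from the variable
    case 'settings':
        header('Location: ./' . $return . '.php');
        break;
    default:
        header('Location: ./index.php');
}
exit;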
In this case, it depends more on what's done with that part of the query string on index.php. If it's being sent to a database query, output, eval(), or exec(), then yes, it's a very common security hole. Most other situations will be safe unfiltered, but it's best to write your own general-purpose sanitizing function which converts quotes of all varieties to their HTML entities, as well as equals signs.
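A rough sketch of such a sanitizing function, as an illustration of the idea rather than a complete defence:
// Converts double and single quotes to HTML entities and neutralizes equals signs.
function sanitize_param($value) {
    $value = htmlspecialchars($value, ENT_QUOTES, 'UTF-8'); // handles " and '
    return str_replace('=', '&#61;', $value);
}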
The things I would do are:
Define what types of return values are allowed.
Write down all types of possible return values.
Then, make conclusions: what characters are not allowed, what is the maximum url length, what domains are allowed, etc.
Finally: make a filter function according to above conclusions.
I think the main thing a hacker can do is this: since you redirect to whatever $_GET['return'] contains, an attacker can use it for XSS, or to redirect the victim to a virus or something like it. But there is not much else they can do with it.
I have this very simple script that allows the user to specify the URL of any site. The script then replaces the URL in the "data" attribute of an object tag to display the site of the user's choice inside the object on the HTML page.
How could I validate the input so the user can't load any page from my own site inside the object? I have noticed that it will display my code.
The code:
<?php
$url = 'http://www.google.com';
if (array_key_exists('_check', $_POST)) {
$url = $_POST['url'];
}
//gets the title from the selected page
$file = @fopen($url, "r") or die("Can't read input stream");
$text = fread($file,16384);
if (preg_match('/<title>(.*?)<\/title>/is',$text,$found)) {
$title = $found[1];
} else {
$title = "Untitled Document";
}
?>
Edit (more details):
This is NOT meant to be a proxy. I am letting the users decide which website is loaded into an object tag (similar to an iframe). The only thing PHP is going to read is the title tag from the input URL, so it can be loaded into the title of my site. (Don't worry, it's not to trick the user.) Although it may display the title of any site, it will not bypass any filters in any other way.
I am also aware of the vulnerabilities involved in what I am doing; that's why I'm looking into validation.
As gahooa said, I think you need to be very careful with what you're doing here, because you're playing with fire. It's possible to do safely, but be very cautious with what you do with the data from the URL the user gives you.
For the specific problem you're having though, I assume it happens if you get an input of a filename, for example if someone types "index.php" into the box. All you need to do is make sure that their URL starts with "http://" (or "https://") so that fopen uses the network method instead of opening a local file. Something like this before the fopen line should do the trick:
if (!preg_match('#^https?://#', $url))
    $url = 'http://' . $url;
parse_url: http://us3.php.net/parse_url
You can check for scheme and host.
If the scheme is http, then make sure the host is not your website. I would suggest using preg_match to grab the part between the dots; as in www.google.com or google.com, use preg_match to get the word google.
If the host is an IP address, I am not sure what you want to do in that situation. By default, the preg_match would only get the middle two numbers and the dot (assuming you try to use preg_match to get the site name before the .com).
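A minimal sketch of the scheme/host check with parse_url, assuming $url holds the user's input:
$parts = parse_url($url);
if ($parts === false
    || !isset($parts['scheme'], $parts['host'])
    || !in_array($parts['scheme'], array('http', 'https'), true)
    || strcasecmp($parts['host'], $_SERVER['HTTP_HOST']) === 0) {
    // Reject: malformed URL, disallowed scheme, or pointing back at this site.
    die('Invalid or disallowed URL');
}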
Are you aware that you are creating an open HTTP proxy, which can be a really bad idea?
Do you even need to fetch the contents of the URL? Why don't you let your user's browser do that by supplying it with the URL?
Assuming you do need to fetch the URL, consider validating against a known "whitelist" of URLs. If you can't restrict it to a known list, then you are back to the open proxy again...
Use a regular expression (preg) to ensure it is a good HTTP url, and then use the CURL extension to do the actual request.
Mixing the fopen() family of functions with user supplied parameters is a recipe for potential disaster.
You could use PHP's filter extension:
filter_var($url, FILTER_VALIDATE_URL) or
filter_input(INPUT_POST, 'url', FILTER_VALIDATE_URL);
http://php.net/manual/en/function.filter-input.php
Also try the documents referenced by this PHP wiki post on filter functions, by Yasuo Ohgaki:
https://wiki.php.net/rfc/add_validate_functions_to_filter?s[]=filter
https://www.securecoding.cert.org/confluence/display/seccode/Top+10+Secure+Coding+Practices
https://www.owasp.org/index.php/OWASP_Secure_Coding_Practices_-_Quick_Reference_Guide
http://cwe.mitre.org/top25/mitigations.html
I'm using php and I have the following code to convert an absolute path to a url.
function make_url($path, $secure = false){
return (!$secure ? 'http://' : 'https://').str_replace($_SERVER['DOCUMENT_ROOT'], $_SERVER['HTTP_HOST'], $path);
}
My question is basically, is there a better way to do this in terms of security / reliability that is portable between locations and servers?
The HTTP_HOST variable is not a reliable or secure value, as it is also sent by the client, so be sure to validate its value before using it.
I don't think security is going to be affected, simply because this is a URL being printed to a browser... the worst that can happen is exposing the full directory path to the file, and potentially creating a broken link.
As a little side note, if this is being printed in an HTML document, I presume you are passing the output through something like htmlentities... just in case the input $path contains something like a <script> tag (XSS).
To make this a little more reliable though, I wouldn't recommend matching on 'DOCUMENT_ROOT', as sometimes it's either not set or won't match (e.g. when Apache rewrite rules start getting in the way).
If I was to re-write it, I would simply ensure that 'HTTP_HOST' is always printed...
function make_url($path, $secure = false){
return (!$secure ? 'http://' : 'https://').$_SERVER['HTTP_HOST'].str_replace($_SERVER['DOCUMENT_ROOT'], '', $path);
}
... and if possible, update the calling code so that it just passes the path, so I don't even need to consider removing the 'DOCUMENT_ROOT' (i.e. what happens if the path does not match the 'DOCUMENT_ROOT'?)...
function make_url($path, $secure = false){
return (!$secure ? 'http://' : 'https://').$_SERVER['HTTP_HOST'].$path;
}
Which does leave the question... why have this function?
On my websites, I simply have a variable defined at the beginning of script execution which sets:
$GLOBALS['webDomain'] = 'http://' . (isset($_SERVER['HTTP_HOST']) ? $_SERVER['HTTP_HOST'] : '');
$GLOBALS['webDomainSSL'] = $GLOBALS['webDomain'];
Where I use GLOBALS so it can be accessed anywhere (e.g. in functions)... but you may also want to consider making it a constant (define) if you know this value won't change (I sometimes change these values later in a site-wide configuration file, for example if I have an HTTPS/SSL certificate for the website).
I think this is the wrong approach.
URLs in HTML support relative locations. That is, you can write <a href="page.html">link</a> to refer to a page that has the same path in its URL as the current page. You can also write <a href="/full/path/page.html">link</a> to provide a full path on the same website. These two tricks mean your website code doesn't really need to know where it is to provide working URLs.
That said, you might need some tricks so you can have one website on http://www.example.com/dev/site.php and another on http://www.example.com/testing/site.php. You'll need some code to figure out which directory prefix is being used, but you can use a configuration value to do that (see the sketch below). By which I mean a value that belongs to that (sub-)site's configuration, not the version-controlled code!
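A minimal sketch of such a configuration value (the constant name and paths are hypothetical):
// config.php for the /dev deployment; the /testing copy sets '/testing' instead.
define('BASE_PATH', '/dev');

// Elsewhere, build site-internal links from it:
echo '<a href="' . BASE_PATH . '/site.php">link</a>';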