HTML5 domain locking? - php

I've got a project where we're creating a dynamic html5 based video player with a bunch of Ajax controls and features. The intention is that this player will be used by other domains. Our old player used flash and could easily domain-lock, but now is there any method at all to do domain locking in HTML5?
Keep in mind that it's not just the video itself; we also want to load HTML content for our Ajax-based controls. An iframe seems like the obvious choice for this, but then there's no way to do domain locking.
Any ideas?

You could use the function below, but it's pretty obvious what it's doing, so anyone can just remove the domain lock.
There are services out there that will lock your page to a domain name, I know of two off the top of my head.
jscrambler.com - this is a paid tool, but it might be overkill if all you want to do is lock your domain.
DomainLock JS - this is a free domain locking tool.

I came here looking for the same thing. But I think I have an answer worked out.
The best way I've found so far is to strip location.href of its http:// prefix and then check the first few characters against a whitelist of domains. So:
if (checkAllowedDomains())
{
    initApplication();
}
else
{
    // it's not the right domain, so redirect them!
    top.location.href = "http://www.snoep.at";
}

function checkAllowedDomains()
{
    var allowed_domains = [];
    allowed_domains.push("www.snoep.at");
    allowed_domains.push("www.makinggames.nl");
    allowed_domains.push("www.google.com");
    // add whatever domain here!

    // strip the protocol so the string starts with the host name
    var domain = top.location.href.replace(/^https?:\/\//, '');
    var pass = false;
    for (var i = 0; i < allowed_domains.length; i++)
    {
        // the location must start with a whitelisted host
        // (comparing top.location.hostname for exact equality would be stricter still)
        if (domain.indexOf(allowed_domains[i]) === 0)
        {
            pass = true;
        }
    }
    return pass;
}
This bit of code checks against several allowed_domains; you can easily extend the array.
The problem with the code is that it's very readable. So I'd advise you to run it through a JS minifier to make it less obvious, and to include it in EVERY JS file on your page. initApplication() is a function that starts your page or application.
Because you strip http:// from the location (which may or may not be there) and then check only the beginning of the string against the full allowed domain (including the www!), you rule out lookalike subdomains such as google.com.mydomain.com that would otherwise throw the check off.
Hope this helps.

Try reading the Referer header, and if the requesting site isn't on your whitelist, don't display the player.
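For instance, a minimal PHP sketch of that check (the domain names are placeholders; keep in mind the Referer header is client-supplied, so it can be spoofed or missing, making this a deterrent rather than hard security):

<?php
// Whitelist of hosts allowed to embed the player (placeholders).
$allowed = array('www.snoep.at', 'www.makinggames.nl');

$referer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
$host = parse_url($referer, PHP_URL_HOST);

if (!$host || !in_array($host, $allowed, true)) {
    header('HTTP/1.1 403 Forbidden');
    exit('This player may not be embedded from your domain.');
}
// ...otherwise serve the player markup and Ajax content...
?>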

Related

Open all external links to a new tab without JS

I was wondering if it is possible, through .htaccess or some other way (but NOT JS), to make all external links (links that do not point to my own domain) open in a new tab (target="_blank").
Is this even possible?
Thank you!
For any link that you generate in your page (I'm assuming you are generating the page with PHP), just do:
if (strpos($link, 'yourdomain.com') === false)
{
//append your target="_blank" to the link here
}
This searches the link for your domain and, if the link is not on your domain, makes it open in a new tab.
See http://us1.php.net/strpos
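A fleshed-out sketch of that idea (renderLink is a hypothetical helper and 'yourdomain.com' a placeholder, not anything from the question):

<?php
// Hypothetical helper: builds an <a> tag and opens external links in a new tab.
function renderLink($url, $text) {
    $target = '';
    if (strpos($url, 'yourdomain.com') === false) {
        // not our domain, so open it in a new tab
        $target = ' target="_blank"';
    }
    return '<a href="' . htmlspecialchars($url) . '"' . $target . '>'
         . htmlspecialchars($text) . '</a>';
}

echo renderLink('http://www.google.com/', 'Google');      // gets target="_blank"
echo renderLink('http://yourdomain.com/faq.php', 'FAQ');  // opens in the same tab
?>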
There are only 3 ways to decide this:
the target attribute
JS
browser settings or plugins (depends on which browser is used; most tend to use JS)
If you don't want to use JS, then you are pretty much only left with target. You could insert it "automatically" with PHP/Ruby/Python/Java code (whatever you use to generate your HTML) by using search-and-replace functions, as sketched below.
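A hedged sketch of such a search-and-replace pass in PHP ('yourdomain.com' is a placeholder; note that regexes over HTML are fragile and only suitable for simple, well-formed markup):

<?php
// Post-process the generated page: add target="_blank" to anchors
// whose href does not contain our own host.
ob_start(function ($html) {
    return preg_replace_callback(
        '/<a\s+href="([^"]+)"/i',
        function ($m) {
            if (strpos($m[1], 'yourdomain.com') === false) {
                return $m[0] . ' target="_blank"';
            }
            return $m[0];
        },
        $html
    );
});
// ...generate the page as usual; the callback runs when the buffer is flushed.
?>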
If you write your HTML yourself then you can set it for each link by hand.
I see no reason why you would need more options, but if you do: you're out of luck.
Browsers don't get to see .htaccess, and your server only delivers HTML files; it has no control over how they are processed. The browser decides this on its own (this is where you could suggest that all your users install a plugin for it).
There is a proposed CSS3 property for this:
a
{
    target-name: new;
    target-new: tab;
}
But unfortunately, it's not supported by any browser.

Detect SSL when proxy *always* claims a secure connection

I want to detect whether or not a user is viewing a secure page and redirect if not (for logging in).
However, my site travels through a proxy before I see the server variables and the proxy (right now) is telling me that $_SERVER['HTTPS'] is 'on' when the URI clearly indicates otherwise. It also shows 'on' when the user is navigating 'securely'.
Navigating through http:// and https:// both output that $_SERVER['SERVER_PORT'] = 443.
I don't have the ability to make any changes to the proxy, so I want to know:
Does PHP have any other options for detecting the truth, or...
Am I stuck resorting to JavaScript's mechanisms for detection and redirection?
I mined this question for ideas but they mostly revolve around the $_SERVER['HTTPS'] variable being trustworthy. Bah!
It appears that this question describes at least something similar, but the asker was able to resolve it by adapting an Apache solution.
Are there any other PHP SERVER variables or tricks available to detect what the user's URI begins with? The only differences between the $_SERVER variables when my site is viewed over http versus https are the following:
_FCGI_X_PIPE_ (appears random)
HTTP_COOKIE (sto-id-47873 is included in the non-secure version but I did not put it there)
REMOTE_ADDR (This and the next two keep changing inexplicably!)
REMOTE_HOST
REMOTE_PORT ('proxy people', why are you continually changing this?)
Are any of these items strong enough to put one's weight upon without it splintering and causing pain later? Perhaps I shouldn't trust anything as filtered through the proxy since it could change at any given time.
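For reference, the kind of PHP check I was hoping for would look like the sketch below; it assumes the proxy adds an X-Forwarded-Proto header carrying the original scheme, which mine may well not do (and which a client could spoof unless the proxy overwrites it):

<?php
// Hedged sketch: prefer the proxy's forwarded-protocol header, if any,
// and fall back to the (here untrustworthy) HTTPS server variable.
function isHttps() {
    if (isset($_SERVER['HTTP_X_FORWARDED_PROTO'])) {
        return strtolower($_SERVER['HTTP_X_FORWARDED_PROTO']) === 'https';
    }
    return isset($_SERVER['HTTPS']) && $_SERVER['HTTPS'] !== 'off';
}
?>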
Here is my plan to use JavaScript for this purpose; is it the best I have?
function confirmSSL() {
    if (location.protocol != "https:") {
        var locale = location.href;
        locale = locale.replace(/http:\/\//, "https://");
        location.replace(locale);
    }
}
<body onLoad="confirmSSL()">...
I think if the user has JavaScript disabled in my community, then they hopefully know what they are doing. They should be able to manually get themselves into a secure zone. What sort of <noscript> suggestions would be commonplace / good practice? Something like this, perhaps?:
<noscript>Navigate using https://blah.more.egg/fake to protect your information.</noscript>
PHP solutions that work (with good explanation) will be given preference for the correct answer. Feel free to submit a better JavaScript implementation or link to one.
Many thanks!
Although already partially discussed in the question's comments, I'll summarize some suggestions concerning the redirection logic in JavaScript:
Generally, using window.location instead of location is advisable; an explanation can be found here.
A regex seems like overkill for a simple replacement of the protocol.
The redirection logic should be executed as soon as possible, because in the event of a redirect, any further document processing is wasted work.
Browsers with JavaScript disabled should at least show a notification prompting the user to switch to https.
I suggest using the following code (adapted from here), which is short and efficient:
<head>
    <script type="text/javascript">
        if (window.location.protocol != "https:") {
            window.location.href = "https:" + window.location.href.substring(window.location.protocol.length);
        }
    </script>
    ...
</head>
<body>
...
<noscript>Please click here to use a secure connection!</noscript>
...
Just use the client-side approach. If your proxy is non-configurable, then that option is out. Detecting and redirecting via js is fine.
There is also a way to achieve the redirection without client-side JavaScript. This method can be especially helpful if JavaScript is disabled in the client's browser.
The steps are pure PHP and pretty simple:
Start a session
If it's a fresh session, redirect to the https location
If the session isn't new, it can be assumed that the user has already been redirected
Example code:
<?php
session_start();
if( !isset($_SESSION['runningOnHttps']) ) {
    $_SESSION['runningOnHttps'] = true;
    header('Location: https://my-cool-secure-site.com');
    exit; // stop processing; the client is being redirected
}
?>
Naturally, you could restrict this functionality to those browsers with JavaScript disabled in order to create a kind of 'hybrid mode': whenever there is a new session with a non-JS browser, make a request to some kind of callback script that notifies the server to send a Location header:
some_landingpage.php sends an initial <noscript> containing a hidden iframe, which will load redirect.php:
if( !isset($_SESSION['checkWasMade']) ) {
    $_SESSION['checkWasMade'] = true;
    echo '<noscript>
        <iframe src="redirect.php" style="visibility: hidden; position: absolute; left: -9000px;"></iframe>
    </noscript>';
}
A request to redirect.php will let you know that JavaScript is disabled and gives you the chance to force a redirect by sending a Location header (see above) with the next actual request.
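A minimal sketch of what redirect.php might look like (the flag name is an assumption; the answer doesn't spell it out):

<?php
// redirect.php: only ever requested by the hidden <noscript> iframe,
// so reaching this script means the client has JavaScript disabled.
session_start();
$_SESSION['jsDisabled'] = true; // hypothetical flag name
// On the next regular request, the landing page can check this flag and
// send: header('Location: https://my-cool-secure-site.com'); exit;
?>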
Of course, this method will only work reliably if the protocol doesn't change (magically?) during one session.
UPDATE:
All of the above-mentioned methods for handling non-JavaScript user agents could be rendered moot by an even neater approach:
I just learned that <noscript> can also be included inside the <head>, which allows one to just redirect via <meta> tags.
Hence, some_landingpage.php could send an initial meta-refresh inside <noscript>:
// The following echo must appear inside the HTML head
if( !isset($_SESSION['checkWasMade']) ) {
    $_SESSION['checkWasMade'] = true;
    echo '<noscript>
        <meta HTTP-EQUIV="REFRESH" content="0; url=https://my-cool-secure-site.com">
    </noscript>';
}

Redirect a user to an external site while linking to an internal page?

How can I redirect a user to an external site while linking to an internal page?
I have seen examples like:
example.com/go/ksdjfksjdhfls
example.com/?go=http://www.new-example.com
... And many more...
How is this achieved in PHP?
Does this have any pros/cons with regards to SEO ?
I don't see any benefit in this approach, but there are a few ways to achieve it. To do it with the GET query, you would simply need the following code:
HTML:
<a href="link.php?site=http://www.google.com">Google!</a>
PHP:
if (filter_var($_GET['site'], FILTER_VALIDATE_URL)) {
    header('Location: ' . $_GET['site']);
}
With the above example, it will actually take the user to that location, not to:
http://example.com/link.php?site=http://www.google.com
To achieve the url being "local" while pulling up a remote site, you'd either have to:
Mess with URL rewriting, which can get messy and I'm not sure will let you do the above
Retrieve the remote page via curl and display it, which may screw up links on the "remote" page
Use iframes and set the iframe to be the size of the page. Note that this last method, while least offensive, is recognized as a potential security breach known as 'clickjacking', since it's used to trick users into clicking on a link for one page which is hiding a malicious link to another site. Many servers and browsers are taking steps to avoid this (for instance, Google does not allow iframing of its home page), so this may also reach dead ends.
So of the three server-side methods I can think up, one may or may not be possible, and is a pain. One will be crippled and put a heavy load on the server. The last is a known bad guy and is likely not to work for many cases.
So I'd just go with a redirect, and really, if you don't need the address bar to show the local URL, then I'd just have a direct link.
All of this raises the question: what are you hoping to accomplish?
Put this at the beginning, before any output to the browser:
<?php
header('Location: http://example.com/index.php');
?>
Set up an index.php file which sets the header location to the URL in the GET parameter.
example.com/?go=http://www.new-example.com :
// example.com/index.php
<?php
if (isset($_GET['go'])) {
    $go = $_GET['go'];
    header('Location: ' . $go); // concatenate; a single-quoted '$go' would not be interpolated
    exit;
} // else if other commands
// else (no command) load regular page
?>
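Note that redirecting to a raw $_GET value like this is an open redirect, which spammers love to abuse. A safer sketch (the keys and URLs are illustrative) maps short names to known destinations:

<?php
$destinations = array(
    'new-example' => 'http://www.new-example.com',
);
if (isset($_GET['go'], $destinations[$_GET['go']])) {
    header('Location: ' . $destinations[$_GET['go']]);
    exit;
}
// fall through to the regular page otherwise
?>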
example.com/go/ksdjfksjdhfls :
// example.com/go/ksdjfksjdhfls/index.php
<?php
header('Location: http://someurl.com');
?>
I use .htaccess rules for this. No PHP needed.
i.e.
Redirect 307 /go/somewhere-else http://www.my-affiliate-link.com/
So visiting http://www.mywebsite.com/go/somewhere-else will redirect to http://www.my-affiliate-link.com/.
On my site, I use "nofollow" to tell the search engines not to follow the link. The 307 status code means "temporary redirect".
<a href="http://www.mywebsite.com/go/somewhere-else" rel="nofollow">Click here!</a>
For example.com/?go=http://www.new-example.com, you can use an iframe and set its src attribute to http://www.new-example.com:
<!DOCTYPE HTML>
<html>
<head>
</head>
<body>
<iframe src="http://www.new-example.com" width="100%" height="100%"></iframe>
</body>
</html>

Showing only top-level for website

I have a website, say accessible under http://example.com.
For this, I have several PHP-scripts like index.php, intro.php, faq.php, contact.php etc.
So a typical use-case would look like so:
A user goes to http://example.com (which serves http://example.com/index.php), then clicks on "Introduction" and is taken to http://example.com/intro.php.
While all this is working nicely, I wondered if there is a way to hide the names of the PHP-scripts completely, so the URL will always read as http://example.com/, regardless whether the user is on index.php, intro.php, faq.php etc.
Using RewriteRules seems not to be the way to go, as they basically work in the other direction: making it easier for the user to enter a specific URL (e.g. making the ".php" optional).
However, I want only the URL of the site to be visible to the user, not the individual scripts along the way.
Is something similar actually possible with individual scripts or would this require all the individual scripts to be combined into one and then to use constructs such as:
if ($_POST['destination'] == "intro")
{
    // DO ALL THE Introduction MARKUP
}
Thank you.
Best.
You could use a full-page iframe, and load intro.php in the iframe. This way, the user stays on the same page, but the page in the iframe changes.
One way (working, but not very good) is to include all your scripts in index.php and call functions which draw specific pages from those scripts. These calls must depend on some POST variable.
You could use AJAX calls to load the new contents when a user clicks on a link. Then you could create your website like usual, but add a script similar to this one (using jQuery):
$(function() {
    $('a').click(function() {
        // links use 'href', not 'src'
        $.get($(this).attr('href'), function(data) {
            document.write(data);
        });
        return false;
    });
});
I haven't tried this code, but something along these lines should work.
This would of course not work in browsers that do not support JavaScript, and you would need to take care of forms in another way, so a full-page iframe might be an easier solution.
Given your further explanation, I'd go with the single clean index.php, with the other scripts included as needed (I'd even put them outside your document root so they can't be accessed directly, either by accident or on purpose):
index.php:
<?php
$action = isset($_POST['action']) ? $_POST['action'] : 'index';
switch ($action) {
    case 'intro':
        require '../pages/intro.php';
        break;
    case 'somethingelse':
        require '../pages/somethingelse.php';
        break;
    case 'index':
    default:
        require '../pages/index.php';
}
?>
Possibly even somewhat optimized with a whitelisted array of possible pages (see the sketch below). This keeps your original index.php small and tidy, while still allowing all the more complex stuff in dedicated files. There's no actual need for JavaScript (it's not needed for the functionality, though it can of course be used as desired) or for pseudo-hidden URLs via frames (which most of the time don't fool a search indexer, or anyone with the smallest amount of HTML knowledge who just wants to use direct URLs).
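A sketch of that whitelisted-array variant (using the same hypothetical page files as above):

<?php
// One lookup table instead of a growing switch statement.
$pages = array(
    'intro'         => '../pages/intro.php',
    'somethingelse' => '../pages/somethingelse.php',
    'index'         => '../pages/index.php',
);
$action = isset($_POST['action']) ? $_POST['action'] : 'index';
$file = isset($pages[$action]) ? $pages[$action] : $pages['index'];
require $file;
?>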

Best way to avoid code injection in PHP

My website was recently attacked by what seemed to me to be innocent code:
<?php
if ( isset( $_GET['page'] ) ) {
    include( $_GET['page'] . ".php" );
} else {
    include("home.php");
}
?>
There were no SQL calls, so I wasn't afraid of SQL injection. But, apparently, SQL isn't the only kind of injection.
This website has an explanation and a few examples of avoiding code injection: http://www.theserverpages.com/articles/webmasters/php/security/Code_Injection_Vulnerabilities_Explained.html
How would you protect this code from code injection?
Use a whitelist and make sure the page is in the whitelist:
$whitelist = array('home', 'page');
if (in_array($_GET['page'], $whitelist)) {
    include($_GET['page'] . '.php');
} else {
    include('home.php');
}
Another way to sanitize the input is to make sure that only allowed characters (no "/", ".", ":", ...) are in it. However, don't use a blacklist of bad characters; use a whitelist of allowed characters:
$page = preg_replace('/[^a-zA-Z0-9]/', '', $page);
... followed by a file_exists() check (a combined sketch follows below).
That way you can make sure that only scripts you want to be executed are executed (for example this would rule out a "blabla.inc.php", because "." is not allowed).
Note: This is kind of a "hack", because then the user could execute "h.o.m.e" and it would give the "home" page, since all it does is remove the prohibited characters. It's not intended to stop smartasses who want to do cute stuff with your page, but it will stop people doing really bad things.
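Putting the character whitelist and the file_exists() check together might look like this (a sketch, not a drop-in solution):

<?php
$page = isset($_GET['page']) ? $_GET['page'] : 'home';
$page = preg_replace('/[^a-zA-Z0-9]/', '', $page); // allowed characters only
$file = dirname(__FILE__) . '/' . $page . '.php';
if ($page === '' || !file_exists($file)) {
    $file = dirname(__FILE__) . '/home.php'; // fall back to the home page
}
include $file;
?>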
BTW: Another thing you could do in your .htaccess file is to prevent obvious attack attempts:
RewriteEngine on
RewriteCond %{QUERY_STRING} http[:%] [NC]
RewriteRule .* /–http– [F,NC]
RewriteRule http: /–http– [F,NC]
That way all page accesses with "http:" in the URL (and query string) result in a "Forbidden" error message, without even reaching the PHP script. That means less server load.
However, keep in mind that no "http" is then allowed anywhere in the query string. Your website MIGHT require it in some cases (maybe when filling out a form).
BTW: If you can read German: I also have a blog post on that topic.
The #1 rule when accepting user input is to always sanitize it. Here, you're not sanitizing your page GET variable before passing it to include(). You should perform a basic check to see if the file exists on your server before you include it.
Pek, there are many things to worry about in addition to SQL injection, and even different types of code injection. Now might be a good time to look a little further into web application security in general.
From a previous question on moving from desktop to web development, I wrote:
The OWASP Guide to Building Secure Web Applications and Web Services should be compulsory reading for any web developer that wishes to take security seriously (which should be all web developers). There are many principles to follow that help with the mindset required when thinking about security.
If reading a big fat document is not for you, then have a look at the video of the seminar Mike Andrews gave at Google a couple years back about How To Break Web Software.
I'm assuming you deal with files in the same directory:
<?php
if (isset($_GET['page']) && !empty($_GET['page'])) {
    $page = urldecode($_GET['page']);
    $page = basename($page);
    $file = dirname(__FILE__) . "/{$page}.php";
    if (!file_exists($file)) {
        $file = dirname(__FILE__) . '/home.php';
    }
} else {
    $file = dirname(__FILE__) . '/home.php';
}
include $file;
?>
This is not too pretty, but should fix your issue.
pek, for a short-term fix apply one of the solutions suggested by other users. For a mid- to long-term plan you should consider migrating to one of the existing web frameworks. They handle all the low-level stuff like routing and file inclusion in a reliable, secure way, so you can focus on core functionality.
Do not reinvent the wheel. Use a framework. Any of them is better than none. The initial time investment in learning it pays back almost instantly.
Some good answers so far; it's also worth pointing out a couple of PHP specifics:
The file-open functions use wrappers to support different protocols. This includes the ability to open files over a local Windows network, HTTP and FTP, amongst others. Thus in a default configuration, the code in the original question can easily be used to open any arbitrary file on the internet and beyond; including, of course, all files on the server's local disks (that the webserver user may read). /etc/passwd is always a fun one.
Safe mode and open_basedir can be used to prevent files outside of a specific directory from being accessed.
Also useful is the config setting allow_url_fopen, which can disable URL access for the file-open functions. (Note that in current PHP versions allow_url_fopen can only be changed in php.ini or the server config, not via ini_set() at runtime.)
These are all nice fall-back safety guards, but please use a whitelist for file inclusion.
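A quick, read-only way to inspect these settings from a script (these are standard php.ini keys; as noted above, allow_url_fopen itself can't be toggled at runtime):

<?php
var_dump(ini_get('allow_url_fopen'));   // "1" means remote URLs can be opened
var_dump(ini_get('allow_url_include')); // governs include/require of URLs
var_dump(ini_get('open_basedir'));      // an empty string means no restriction
?>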
I know this is a very old post and I expect you don't need an answer anymore, but IMHO a very important aspect is still missing, and I'd like to share it for other people reading this post. In your code, including a file based on the value of a variable makes a direct link between the value of a field and the requested result (page becomes page.php). I think it is better to avoid that.
There is a difference between the request for some page and the delivery of that page. If you make this distinction you can make use of nice URLs, which are very user- and SEO-friendly. Instead of a field value like 'page' you could make a URL like 'Spinoza-Ethica'. That is then a key in a whitelist, or a primary key in a database table, and will return a hardcoded filename or value. This method has several advantages besides a normal whitelist:
The back-end response is effectively independent of the front-end request. If you want to set up your back-end system differently, you do not have to change anything on the front end.
Always make sure you end up with hardcoded filenames or an equivalent from the database (preferably a return value from a stored procedure), because it is asking for trouble to use information from the request to build the response.
Because your URLs are independent of the delivery from the back end, you will never have to rewrite your URLs in the .htaccess file for this kind of change.
The URLs presented to the user are user-friendly, informing the user about the content of the document.
Nice URLs are very good for SEO, because search engines look for relevant content, and a URL that is in line with the content gets a better rating. At least a better rating than a URL that is definitely not in line with the content.
If you do not link directly to a PHP file, you can translate the nice URL into any other type of request before processing it. That gives the programmer much more flexibility.
You will have to sanitize the request, because you get the information from a standard untrustworthy source (the rest of the Web). Using only nice URLs as possible input makes the sanitization process much simpler, because you can check whether the returned URL conforms to your own format. Make sure the format of the nice URL does not contain characters that are used extensively in exploits (like ', ", <, >, -, &, ; etc.).
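A minimal sketch of that mapping (the slugs, file paths, and the 'doc' parameter are all illustrative; in practice the map could come from a database, e.g. a stored procedure, as described above):

<?php
$map = array(
    'Spinoza-Ethica' => 'pages/spinoza-ethica.php',
    'Kant-Critique'  => 'pages/kant-critique.php',
);
$slug = isset($_GET['doc']) ? $_GET['doc'] : '';
// validate the slug's format before looking it up
if (preg_match('/^[A-Za-z0-9-]+$/', $slug) && isset($map[$slug])) {
    include $map[$slug];
} else {
    include 'pages/home.php';
}
?>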
#pek - That won't work, as your array keys are 0 and 1, not 'home' and 'page'.
This code should do the trick, I believe:
<?php
$whitelist = array(
    'home',
    'page',
);
if (in_array($_GET['page'], $whitelist)) {
    include($_GET['page'] . '.php');
} else {
    include('home.php');
}
?>
As you have a whitelist, there shouldn't be a need for file_exists() either.
Think of a URL in this format:
www.yourwebsite.com/index.php?page=http://malicodes.com/shellcode.txt
If shellcode.txt contains SQL or PHP injection code, then your website will be at risk, right? Do think of this; using a whitelist would help.
There is a way to filter all variables to avoid hacking. You can use PHP IDS or the OSE Security Suite to help prevent it. After installing the security suite, you need to activate it; here is the guide:
http://www.opensource-excellence.com/shop/ose-security-suite/item/414.html
I would suggest you turn on layer 2 protection; then all POST and GET variables will be filtered (especially the ones I mentioned), and if attacks are found, it will report them to you immediately.
Safety is always the priority.
