In my site's administration area, I have been using mysqli_real_escape_string when retrieving form input that goes into the database. It works fine but I realize that it does not prevent script injections. I mean I can pass through scripts like:
<script>alert('hello');</script>
What do I use in addition to this to prevent a malicious admin from injecting some nasty stuff?
htmlentities()?
strip_tags()?
htmlspecialchars()?
What is the proper way to sanitize form input in back-end forms where HTML is not required in the input data? I am confused.
htmlentities() and htmlspecialchars() are used when you're outputting data. Escaping for the database and encoding for HTML output are two different concerns.
If you don't want HTML, my recommendation would be to use strip_tags() to clean the input of any HTML tags, and use htmlspecialchars()/htmlentities() when you're outputting the content.
Also, you might consider switching to PDO with prepared statements. It is a preferred and more secure way of running your queries.
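A minimal sketch of the PDO approach, for illustration only (the DSN, credentials and comments table are assumptions, not part of the original answer):

$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8mb4', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Bound parameters are sent separately from the SQL, so no manual escaping is needed.
$stmt = $pdo->prepare('INSERT INTO comments (author, body) VALUES (?, ?)');
$stmt->execute(array($_POST['author'], $_POST['comment']));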
The term you are looking for is Cross-Site Scripting, or XSS for short. Searching for that should give you plenty of resources, such as this question right here on Stack Overflow.
The proper answer is highly dependent on your application.
Many administration systems need a way for admins to manipulate HTML. But some HTML is more dangerous than others.
As JohnP said, strip_tags() can be handy, since the second parameter allows you to explicitly allow certain harmless tags (such as <b> or <p>) while stripping out anything else (such as <script> or <iframe>) - there is a short sketch of this below.
If you need more sophistication than that, you'll need to do a more careful analysis and come up with a solution tailored to your needs. (Hint: If that solution involves using regular expressions to match HTML tags, you probably want to take a step back)
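A minimal sketch of the whitelist use of strip_tags() (the allowed tags here are just an example, not a recommendation from the answer above):

// Keep only <b>, <p> and <em>; everything else, including <script>, is stripped.
$clean = strip_tags($_POST['body'], '<b><p><em>');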
You should use htmlentities().
You can use the magic_quotes feature to sanitize input if you're using PHP 4, or PHP 5.2 or earlier.
Related
Following on from a question I asked about escaping content when building a custom CMS, I wanted to find out how dangerous not escaping content from the DB can be - assume the data has been filtered/validated prior to insertion in the DB.
I know it's a best practice to escape output but I'm just not sure how easy or even possible it is for someone to 'inject' a value into page content that is to be displayed.
For example let's assume this content with HTML markup is displayed using a simple echo statement:
<p>hello</p>
Admittedly it won't win any awards as far as content writing goes ;)
My question is: can someone alter that for evil purposes, assuming it was filtered/validated prior to DB insertion?
Always escape for the appropriate context; it doesn't matter if it's JSON or XML/HTML or CSV or SQL (although you should be using placeholders for SQL and a library for JSON), etc.
Why? Because it's consistent. And being consistent is also a form of being lazy: you don't need to ponder if the data is "safe for HTML" because it shouldn't matter. And being lazy (in a good way) is a valuable programming trait. (In this case it's also being lazy about avoiding having to fix "bugs" due to changes in the future.)
Don't omit escaping "because it will never contain data that needs to be escaped" .. because, one day, over a course of a number of situations, that assumption will be wrong.
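A minimal sketch of escaping for the appropriate context at output time (the variable names are placeholders, not from the original answer):

// HTML context: escape at the moment of output, not before storing.
echo '<p>' . htmlspecialchars($comment, ENT_QUOTES, 'UTF-8') . '</p>';

// JavaScript context: a different escaping rule applies, e.g. via json_encode().
echo '<script>var comment = ' . json_encode($comment) . ';</script>';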
If you do not escape your HTML output, one could simply insert scripts into the HTML code of your page - running in the browser of every client that visits your page. It is called Cross-site scripting (XSS).
For example:
<p>hello</p><script>alert('I could run any other Javascript code here!');</script>
In the place of the alert(), you can use basically anything: access cookies, manipulate the DOM, communicate with other servers, et cetera.
Well, this is a very easy way of inserting scripts, and strip_tags can protect against this one. But there are hundreds of more sophisticated tricks, that strip_tags simply won't protect against.
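As one illustration of a trick that gets past strip_tags (assuming <img> happens to be on your allowed list; the payload is hypothetical):

// strip_tags() keeps attributes on allowed tags, so the event handler survives intact.
$input = '<img src="x" onerror="alert(document.cookie)">';
echo strip_tags($input, '<img>');   // prints the tag unchanged, handler and all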
If you really want to store and output HTML, HTMLPurifier could be your solution:
Hackers have a huge arsenal of XSS vectors hidden within the depths of the HTML specification. HTML Purifier is effective because it decomposes the whole document into tokens, removing non-whitelisted elements, checking the well-formedness and nesting of tags, and validating all attributes according to their RFCs. HTML Purifier's comprehensive algorithms are complemented by a breadth of knowledge, ensuring that richly formatted documents pass through unstripped.
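A minimal sketch of HTML Purifier in use (assuming the library is installed; the include path depends on your setup):

require_once 'HTMLPurifier.auto.php';

$config   = HTMLPurifier_Config::createDefault();
$purifier = new HTMLPurifier($config);

$cleanHtml = $purifier->purify($dirtyHtml);   // safe to echo into the page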
It could also, for example, be a problem linked with some other vulnerability, e.g. SQL injection. Then someone would be able to bypass the filtering/validation prior to adding data to the DB and display whatever they want.
If you are pulling the word hello from the database and displaying it, nothing will happen. If the content contains <script> tags, though, then it is dangerous, because a user's cookies can be stolen and used to hijack their session.
I'm a little confused about the StripTags filter as used in Zend. I think it's meant to strip tags that could result in XSS. So shouldn't that mean it should be used when outputting data in the views? I've seen it being used with form inputs:
->addFilter('StripTags')
Should it be used with both input in the forms and output in the views, or does it work by filtering the data before it even enters the database (in which case that wouldn't be a good idea)?
Not so much a direct answer to your question and more an alternative approach.
In the blog post "HTML Sanitisation: The Devil's In The Details (And The Vulnerabilities)", Padraic Brady discusses HTML sanitisation and various components for doing it. He expresses significant concerns about the use of the StripTags filter for that purpose.
HTMLPurifier seems to be a better choice.
StripTags is used with output in the views.
Note that displaying text in an editable field (such as a textarea) is actually still "output in the view".
Data should not be preprocessed/transformed before entering the database.
The StripTags filter will not be applied unless you explicitly retrieve the filtered value through
$stripedValue = $form->getValue('fieldName');
according to the unofficial ZF2 documentation:
https://zf2.readthedocs.org/en/latest/modules/zend.filter.set.html#striptags
Zend\Filter\StripTags is potentially unsecure
Be warned that Zend\Filter\StripTags should only be used to strip all available tags.
Using Zend\Filter\StripTags to make your site secure by stripping some unwanted tags will lead to unsecure and dangerous code.
Zend\Filter\StripTags must not be used to prevent XSS attacks. This filter is no replacement for using Tidy or HtmlPurifier.
So use it at your own risk...
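Putting the mechanics above together, a minimal ZF1-style sketch (the element and field names are made up for illustration):

$element = new Zend_Form_Element_Textarea('fieldName');
$element->addFilter('StripTags');
$form->addElement($element);

// The filter only runs when you ask for the filtered value:
$filtered = $form->getValue('fieldName');
$raw      = $form->fieldName->getUnfilteredValue();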
I'm using jWYSIWYG in a form I'm creating that posts to a database, and I was wondering: how can you prevent a malicious user from trying to inject code in the frame?
Doesn't the editor need brackets (which I'd normally strip during the post process) in order to display styles?
If the editor allows arbitrary HTML, you're fighting a losing battle since users could simply use the editor to craft their malicious content.
If the editor only allows for a subset of markup, then it should use an alternative syntax (similar to how Stack Overflow does it), or you should escape all HTML except for specific, whitelisted tags.
Note that it's pretty easy to not do this correctly so I would use a third-party solution that has been appropriately tested for security.
Ultimately, the output is in your own hands: the time you insert it into the database is when you need to make sure you strip away anything malicious. The simplest way is probably to use htmlentities() against such data; however, there are other ways bad guys can bypass that. Here is a nice script, also used by the popular Kohana PHP framework in its input class, to guard against possible XSS attacks:
http://svn.bitflux.ch/repos/public/popoon/trunk/classes/externalinput.php
I have encountered similar situations, and I have started using HTMLPurifier on my PHP backend which will prevent every attack vector I can think of. It is easy to install, and will allow you to whitelist the elements and attributes. It also prevents the XSS attacks that could still exist whilst using htmlentities.
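A short sketch of that whitelisting with HTML Purifier (the allowed tag/attribute list is just an example):

$config = HTMLPurifier_Config::createDefault();
// Allow only a small set of elements, and href only on anchors.
$config->set('HTML.Allowed', 'p,strong,em,ul,ol,li,a[href],blockquote,code');

$purifier = new HTMLPurifier($config);
$safe = $purifier->purify($untrustedHtml);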
I developed a web application that permits my users to manage some aspects of a web site dynamically (yes, some kind of CMS) in a LAMP environment (Debian, Apache, PHP, MySQL).
Well, for example, they create a news item in their private area on my server, and then it is published on their website via a cURL request (or by Ajax).
The news is created with a WYSIWYG editor (FCKeditor at the moment, probably TinyMCE in the near future).
So I can't disallow HTML tags, but how can I be safe?
What kinds of tags MUST I delete (JavaScript?)?
That covers being server-safe.. but how do I stay 'legally' safe?
If a user uses my application to carry out XSS, could I get into legal trouble?
If you are using PHP, an excellent solution is to use HTMLPurifier. It has many options to filter out bad stuff and, as a side effect, guarantees well-formed HTML output. I use it to view spam, which can be a hostile environment.
It doesn't really matter what you're looking to remove; someone will always find a way to get around it. As a reference, take a look at this XSS Cheat Sheet.
As an example, how are you ever going to remove this valid XSS attack:
<IMG SRC=javascript:alert('XSS')>
Your best option is to allow only a subset of acceptable tags and remove everything else. This practice is known as whitelisting and is the best method for preventing XSS (besides disallowing HTML).
Also use the cheat sheet in your testing; fire as much as you can at your website and try to find some ways to perform XSS.
The general best strategy here is to whitelist specific tags and attributes that you deem safe, and escape/remove everything else. For example, a sensible whitelist might be <p>, <ul>, <ol>, <li>, <strong>, <em>, <pre>, <code>, <blockquote>, <cite>. Alternatively, consider human-friendly markup like Textile or Markdown that can be easily converted into safe HTML.
Rather than allow HTML, you should have some other markup that can be converted to HTML. Trying to strip out rogue HTML from user input is nearly impossible, for example:
<scr<script>ipt etc="...">
Removing <script> from this will leave
<script etc="...">
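A tiny demonstration of why a single-pass removal fails on input like that (the naive str_ireplace stand-in is purely illustrative):

$input = '<scr<script>ipt>alert(1)</scr</script>ipt>';
$naive = str_ireplace(array('<script>', '</script>'), '', $input);
echo $naive;   // <script>alert(1)</script>  -- the attack survives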
Kohana's security helper is pretty good. From what I remember, it was taken from a different project.
However, I tested out
<IMG SRC=javascript:alert('XSS')>
from LFSR Consulting's answer, and it escaped it correctly.
For a C# example of the whitelist approach, which Stack Overflow uses, you can look at this page.
If removing the tags is too difficult, you could reject the whole HTML input until the user enters valid data.
I would reject the HTML if it contains any of the following tags:
frameset, frame, iframe, script, object, embed, applet.
Tags you also want to disallow are head (and its sub-tags), body and html, because you want to provide those yourself and you do not want the user to manipulate your metadata.
But generally speaking, allowing the user to provide his own html code always imposes some security issues.
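A rough sketch of that "reject rather than clean" approach (DOMDocument-based; the function name is made up, and head/body/html are best checked separately because loadHTML adds its own wrappers):

function containsForbiddenTags($html, array $forbidden)
{
    $doc = new DOMDocument();
    libxml_use_internal_errors(true);   // tolerate fragments and sloppy markup
    $doc->loadHTML('<?xml encoding="utf-8"?><div>' . $html . '</div>');
    libxml_clear_errors();

    foreach ($forbidden as $tag) {
        if ($doc->getElementsByTagName($tag)->length > 0) {
            return true;
        }
    }
    return false;
}

$forbidden = array('frameset', 'frame', 'iframe', 'script', 'object', 'embed', 'applet');
if (containsForbiddenTags($_POST['content'], $forbidden)) {
    // reject the submission and ask the user to resubmit without the disallowed tags
}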
You might want to consider, rather than allowing HTML at all, implementing some stand-in for HTML like BBCode or Markdown.
I use PHP's strip_tags() function because I want users to be able to post safely, and I allow just a few tags that can be used in a post. This way nobody can attack your website through script injection, so I think strip_tags() is the best option.
It is a very useful PHP function; you can use it like this:
$string = strip_tags($_POST['comment'], "<b>");
I run a website (sort of like a social network) that I wrote myself. I allow the members to send comments to each other. For each comment, I take the text and then call this line before saving it in the DB:
$com = htmlentities($com);
When I want to display it, I call this piece of code:
$com = html_entity_decode($com);
This works out well most of the time. It allows the users to copy/paste youtube/imeem embed code and send each other videos and songs. It also allows them to upload images to photobucket and copy/paste the embed code to send picture comments.
The problem I have is that some people are basically putting JavaScript code in there as well, which tends to do nasty stuff such as opening alert boxes, changing the location of the webpage, and things like that. I am trying to find a good solution to this problem once and for all. How do other sites allow this kind of functionality?
Thanks for your feedback
First: htmlentities(), or just htmlspecialchars(), should be used for escaping strings that you embed into HTML. You shouldn't use them for escaping strings when you insert them into a SQL query - use mysql_real_escape_string (for MySQL) or, better yet, use prepared statements with bound parameters. Make sure that magic_quotes is turned off or disabled; otherwise, when you manually escape strings, they end up escaped twice.
Second: You don't unescape strings when you pull them out again. E.g. there is no mysql_real_unescape_string. And you shouldn't use stripslashes either - if you find that you need to, then you probably have magic_quotes turned on; turn it off instead, and fix the data in the database before proceeding.
Third: What you're doing with html_entity_decode completely nullifies the intended use of htmlentities. Right now, you have absolutely no protection against a malicious user injecting code into your site (you're vulnerable to cross-site scripting, a.k.a. XSS). Strings that you embed into an HTML context should be escaped with htmlspecialchars (or htmlentities). If you absolutely have to embed HTML into your page, you have to run it through a cleaning solution first. strip_tags does this - in theory - but in practice it's very inadequate. The best solution I currently know of is HtmlPurifier. However, whatever you do, it is always a risk to let random users embed code into your site. If at all possible, try to design your application so that it isn't needed.
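A minimal sketch of the pattern described above: store the raw comment with a bound parameter and escape only at output time ($pdo, the comments table and $row are assumptions for illustration):

// Saving: no htmlentities() here; the bound parameter keeps the SQL safe.
$stmt = $pdo->prepare('INSERT INTO comments (body) VALUES (?)');
$stmt->execute(array($com));

// Displaying: escape for the HTML context at the moment of output.
echo htmlspecialchars($row['body'], ENT_QUOTES, 'UTF-8');

// If limited HTML (e.g. embed codes) must be allowed, run it through a cleaner
// such as HtmlPurifier instead of echoing it raw.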
I so hope you are scrubbing the data before you send it to the database. It sounds like you are a prime target for a SQL injection attack. I know this is not your question, but it is something that you need to be aware of.
Yes, this is a problem. A lot of sites solve it by only allowing their own custom markup in user fields.
But if you really want to allow HTML, you'll need to scrub out all "script" tags. I believe there are libraries available that do this. But that should be sufficient to prevent JS execution in user-entered code.
This is how Stack Overflow does it, I think, over at RefactorMyCode.
You may want to consider Zend Filter; it offers a lot more than strip_tags, and you do not have to include the entire Zend Framework to use it.
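A minimal standalone sketch (ZF1-style; the paths and field names are illustrative, and note the security warning quoted earlier - this strips all tags rather than making HTML safe):

require_once 'Zend/Filter/StripTags.php';   // standalone use, no full framework bootstrap

$filter = new Zend_Filter_StripTags();      // per the warning above, use it to strip ALL tags
$clean  = $filter->filter($_POST['comment']);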