I am passing a textarea input box's contents via POST to my PHP file from HTML (no JavaScript allowed).
I then use SimpleXML to get the feed at the URL the user entered.
Unfortunately, the user can enter anything into the textarea, which I am told is dangerous.
What is the recommended way to clean and secure the POST contents using PHP to get them ready and safe for the SimpleXML procedure?
(Basically, to be sure they are not malicious and to check that they are a valid URL.)
The contents of the $_POST array are strings, so there is nothing inherently unsafe there.
The user enters PHP code? It won't be executed, so no problem there (this, among many other reasons, is why you should avoid things like eval()). Whatever PHP function or command they write is read as a plain string, and strings are not harmful whatever they contain.
The user enters malicious JavaScript? Still no problem, since JavaScript inside PHP, or inside a database for that matter, is useless: it needs a browser to execute.
This leads to the real issue: user-supplied content needs to be "sanitized" only right before it is passed to the target medium. If you are going to feed it to a database, use the escaping tools provided by your engine. If you are going to output it on a web page, that is when you need to protect against XSS attacks.
Sanitizing a POST array per se, before actually doing anything with its content, is wrong, because you never know in advance when and where that content will be used; so don't reach for strip_tags() or similar functions the moment you receive the POST value. Pass it along as is and add the necessary escaping/sanitizing only where it is needed.
Only you know what you actually need to do with the data, so act accordingly.
"Which I am told is dangerous." - It is wrong.
"What is the recommended way to clean and secure the POST contents" - I am afraid there is nothing to secure.
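For the original question, a minimal sketch of that approach might look like the following (the field name feed_url is made up, and the error handling is deliberately crude):

```php
<?php
// Validate that the submitted value really is an http(s) URL before using it.
$url = isset($_POST['feed_url']) ? $_POST['feed_url'] : '';
$url = filter_var($url, FILTER_VALIDATE_URL);

if ($url === false || !preg_match('#^https?://#i', $url)) {
    die('Please enter a valid http(s) URL.');
}

// SimpleXML only parses markup; nothing in the fetched feed gets executed.
$xml = @simplexml_load_file($url);
if ($xml === false) {
    die('The URL did not return valid XML.');
}

// Escape only at the point of output, not when the value arrives.
echo 'Feed title: ' . htmlspecialchars((string) $xml->channel->title, ENT_QUOTES, 'UTF-8');
```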
Related
I've been setting a div equal to some user-inputted data using jQuery's $('#div').text(data). The security I had was to process the data on the server side first, returning htmlentities(data), etc. I noticed that when the output is displayed, the entities are not decoded. They are decoded if I use jQuery's html(data) instead.
Since it doesn't appear that the HTML is rendered for the data passed to text(data), would it be okay to pass anything to it without first using htmlentities?
What is the proper security setup for this scenario? I read that html(data) was susceptible to some security holes, so that's why I was trying text(data) instead.
For example, I have a JavaScript-powered form creation tool. You use links to add HTML blocks of elements (like input fields) and TinyMCE to edit the text. These are saved via an autosave function that does an AJAX call in the background on specific events.
The save function being called does the database protection, but I'm wondering if a user can manipulate the DOM to add anything he wants (like custom HTML, or an unwanted script).
How safe is this, if at all?
The first thing that comes to mind is that I should probably search for, and remove, any inline JavaScript from the received HTML code.
Using PHP, jQuery, Ajax.
Not safe at all. You can never trust the client. It's easy even for a novice to modify DOM on the client side (just install Firebug for Firefox, for example).
While it's fine to accept HTML from the client, make sure you validate and sanitize it properly with PHP on the server side.
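A very rough sketch of that kind of server-side filtering is shown below (the helper name is made up, and strip_tags() keeps attributes on the tags it allows, so a dedicated sanitizer such as HTML Purifier is usually the better choice):

```php
<?php
// Hypothetical helper: keep only a small set of formatting tags.
function filter_submitted_html($html)
{
    // Remove every tag except the ones listed here.
    $clean = strip_tags($html, '<p><br><strong><em><ul><ol><li>');

    // strip_tags() does NOT touch attributes, so strip obvious
    // inline event handlers and javascript: URLs as an extra pass.
    $clean = preg_replace('/\son\w+\s*=\s*("[^"]*"|\'[^\']*\'|\S+)/i', '', $clean);
    $clean = preg_replace('/javascript\s*:/i', '', $clean);

    return $clean;
}

$safeHtml = filter_submitted_html($_POST['form_html']);
```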
Are you saving the full inline HTML in your database?
If so, try to rework this and only save the necessary data to your backend. ALL fields should also be checked to make sure they arrive in the expected format.
All inline JS is easily removed.
You can never trust the user!
Absolutely unsafe, unless you take the steps to make it safe, of course. Stack Overflow allows certain tags, filtered so that users can't do malicious things. You'll definitely need to do something similar.
I'd opt to sanitize input on the server side so that everyone gets their input sanitized, whether they've blocked scripts or not. Something like http://www.phpclasses.org/package/3746-PHP-Remove-unsafe-tags-and-attributes-from-HTML-code.html or http://grom.zeminvaders.net/html-sanitizer, applied to whatever the AJAX call submits, would be a pretty good solution.
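If you go with HTML Purifier, the basic server-side usage is roughly this (assuming the library is installed with its standard autoloader; the allowed-tag list is just an example):

```php
<?php
require_once 'HTMLPurifier.auto.php';

$config = HTMLPurifier_Config::createDefault();
// Whitelist only the elements/attributes the form builder actually needs.
$config->set('HTML.Allowed', 'p,br,strong,em,ul,ol,li,a[href]');

$purifier = new HTMLPurifier($config);
$cleanHtml = $purifier->purify($_POST['form_html']);
```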
I have a page like this: the user writes a URL into a form and submits it. Once the URL is submitted, I connect to that page with cURL and search for a string. If it finds the string, it adds the URL to our database. If not, it gives an error to the user.
I sanitize the URL with htmlspecialchars() plus a regex that only allows A-Z, 1-9 and the :/-. symbols. I also sanitize the content retrieved from the other website with htmlspecialchars().
My question is, can they enter a URL like
www.evilwebsite.com/shell.exe or shell.txt
Would PHP run it, or simply look at the HTML output? Is it safe as it is, and if not, what should I do?
Thank you.
PS: allow_url_fopen is disabled. That's why I use cURL.
I don't see why htmlspecialchars or a regex would be necessary here; you don't need those. Also, there is no way that PHP will "automatically" parse the content retrieved using cURL. So yes, it is safe (unless you do things like eval() with the output).
However, when processing the retrieved content later, be aware that the input is user-provided and needs to be handled accordingly.
cURL makes a request to a server and the server sends back data. If there were an executable file on a web server, you'd get back the binary contents of the file. Unless you write the file to disk and execute it, there should be no problem. Security in that sense should not be an issue.
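A sketch of that flow (the cURL options are standard; the search string is a placeholder):

```php
<?php
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body as a string
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 10);
$body = curl_exec($ch);
curl_close($ch);

// The response is just bytes in a PHP string; nothing in it is executed.
if ($body !== false && strpos($body, 'the-string-you-look-for') !== false) {
    // The URL qualifies; insert it with a parameterized query at this point.
}
```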
Every now and then, I get unusual data saved to the database from my PHP form that looks like this:
Mr. Smith&amp;#039;s
What could be causing this, and is there a better way to remove the entities than using preg_replace, since the php decode functions don't properly decode the entire thing?
I would suggest looking at the code that processes the form data before insertion into the database. If you are sanitising the data to be displayed on a web page, use htmlentities($var); if you are only sanitising it for security purposes, look into prepared statements / stored procedures, or just mysql_real_escape_string($var). If all else fails, post the code and we'll have a look.
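For instance, with PDO the insertion side would look roughly like this (the credentials, table and column names are made up):

```php
<?php
// Parameterized insert: the driver handles quoting, including apostrophes.
$pdo = new PDO('mysql:host=localhost;dbname=test;charset=utf8', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO customers (name) VALUES (:name)');
$stmt->execute(array(':name' => $_POST['name'])); // store the raw value

// Encode only when printing the value back into an HTML page.
echo htmlentities($_POST['name'], ENT_QUOTES, 'UTF-8');
```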
This must be due to some technical problem. The best approach is to decode the entities and, after that, if you still find something matching
/&([a-z]+|#[0-9]+);/
do not accept the form; just alert the user about the invalid value.
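A sketch of that check (decode first, then reject anything that still contains an entity):

```php
<?php
$value = html_entity_decode($_POST['name'], ENT_QUOTES, 'UTF-8');

// If an entity pattern survives decoding, the input was double-encoded
// (or deliberately crafted), so reject it and ask the user to re-enter it.
if (preg_match('/&([a-z]+|#[0-9]+);/i', $value)) {
    die('Invalid characters in the submitted value, please try again.');
}
```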
You could take a look at CodeIgniter's source code, where they remove entities, at https://bitbucket.org/ellislab/codeigniter/src/c07dcadf094e/system/libraries/Security.php in the xss_clean method. This will give you a good idea of how to clean most of them more effectively.
I have a form and, as of right now, you can type any JavaScript you want into it. Any XSS, etc.
How do I go about creating a whitelist so you can only post certain characters?
At some point I would like anything that starts with http:// to be converted to a link.
Thanks
Is this efficient?
http://htmlpurifier.org/
jQuery or JavaScript is preferred.
Well, no, you can't do that, you see? Because even if you 'sanitize' your data using JavaScript, no one's stopping anyone from:
turning off JavaScript
using a browser's developer console to mess with the data
doing the POST directly, without a browser
In other words, you have to perform the validation/sanitization on the server side. Javascript validation is there to enhance the experience of your users (by providing instant feedback on invalid input, for example).
Still, many high-load applications use partial client-side validation as well (but all input still has to be sanitized before it is written to the db).
As you will be using PHP, I suggest escaping your $_POST values with htmlspecialchars(), mysql_real_escape_string() and so on.
You will have to use a regular expression to convert anything that starts with "http://" into links (you could also use explode('.', $_POST['yourInput']), which may be easier for you).
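A rough server-side version of that idea (escape everything first, then linkify; the regex is deliberately simple and only meant as an illustration):

```php
<?php
// Escape everything so no user-supplied markup survives.
$text = htmlspecialchars($_POST['yourInput'], ENT_QUOTES, 'UTF-8');

// Then turn bare http(s):// URLs back into anchor tags.
$text = preg_replace(
    '#\bhttps?://[^\s<]+#i',
    '<a href="$0" rel="nofollow">$0</a>',
    $text
);

echo $text;
```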