Echoing external XML Files - php

I'm currently building an app that renders RSS and Atom feeds on the client side. I can't directly send an AJAX request to "https://stackoverflow.com/feeds/tag/php", but I can send a request to my server that just echoes the XML file, like:
<?php
echo file_get_contents('https://stackoverflow.com/feeds/tag/php');
?>
What are the security implications (if any) of doing this?

Stack Overflow is now able to tamper with the data your clients get and replace it with something malicious or annoying. (To be fair, they could do that even if you were able to use the URL directly.)
Your clients are now able to cause your server to make a lot of requests to Stack Overflow, which may block you for DoSing the site or something like that. (I do hope you apply a modicum of caching.)

You may be able to use the Filter functions to sanitize the data before the echo. In general, unless the host you're getting the data from is controlled by you and doesn't allow general users to upload or add data that will be echoed, I wouldn't trust it. You just never know what someone might be able to get through.

I would write a script that runs on cron, fetches the data, and writes it to your own database/filesystem/cache (your choice), then serves it to users asynchronously.
You never know how slowly the other server will respond, and if it really does respond slowly, it slows your site down too.
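For example, a minimal sketch of that caching approach in PHP (the feed URL is the one from the question; the cache path and TTL are placeholders, and the cache directory is assumed to exist and be writable):
<?php
// Caching proxy sketch: serve a cached copy of the feed and only hit the
// remote server when the cached copy is older than the TTL.
$feedUrl   = 'https://stackoverflow.com/feeds/tag/php'; // remote feed (from the question)
$cacheFile = __DIR__ . '/cache/feed-php.xml';           // assumed existing, writable directory
$ttl       = 600;                                       // refresh at most every 10 minutes

if (!is_file($cacheFile) || (time() - filemtime($cacheFile)) > $ttl) {
    $xml = file_get_contents($feedUrl);
    if ($xml !== false) {
        file_put_contents($cacheFile, $xml);
    }
}

header('Content-Type: application/xml; charset=utf-8');
if (is_file($cacheFile)) {
    readfile($cacheFile);
}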

You have to send
header('Content-Type: application/xml');
Then the client will handle it as XML, and no XSS can occur as far as I can tell.

Does HTTPS make POST data encrypted?

I am new to the world of programming and I have learnt enough about basic CRUD-type web applications using HTML-AJAX-PHP-MySQL. I have been learning to code as a hobby and as a result have only been using a WAMP/XAMP setup (localhost). I now want to venture into using a VPS and learning to set it up and eventually open up a new project for public use.
I notice that whenever I send form data to my PHP file using AJAX or even a regular POST, if I open the Chrome debugger and go to "Network", I can see the data being sent and also which backend PHP file the data is being sent to.
If a user can see this, can they intercept this data, modify it, and send it to the same backend PHP file? If they create their own simple HTML page and send the POST data to my PHP backend file, will it work?
If so, how can I avoid this? I have been reading up on using HTTPS but I am still confused. Would using HTTPS mean I would have to alter my code in any way?
The browser is obviously going to know what data it is sending, and it is going to show it in the debugger. HTTPS encrypts that data in transit and the remote server will decrypt it upon receipt; i.e. it protects against any 3rd parties in the middle being able to read or manipulate the data.
This may come as a shock to you (or perhaps not), but communication with your server happens exclusively over HTTP(S). That is a simple text protocol. Anyone can send arbitrary HTTP requests to your server at any time from anywhere, HTTPS encrypted or not. If you're concerned about somebody manipulating the data being sent through the browser's debugger tools… your concerns are entirely misdirected. There are many simpler ways to send an arbitrarily crafted HTTP request to your server without even going to your site.
Your server can only rely on the data it receives and must strictly validate the given data on its own merits. Trying to lock down the client side in any way is futile.
This is even simpler than that.
Whether you are using GET or POST to transmit parameters, the HTTP request is sent to your server by the user's client, whether it's a web browser, telnet or anything else. The user can know what these POST parameters are simply because it's the user who sends them - regardless of the user's personal involvement in the process.
You are taking the problem from the wrong end.
One of the most important rules of programming is:
Never trust user input! Users can and will make mistakes, and some of them will try to damage you or steal from you.
Welcome to the club.
Therefore, you must not allow your code to perform any operation that could damage you in any way if the POST or GET parameters you receive aren't what you expect, be it by mistake or with malicious intent. If your code, by the way it's designed, leaves you vulnerable to harm simply because someone sends specific POST values to one of your pages, then your design is at fault and you should redo it with that problem in mind.
Since this is a major issue when designing programs, you will find plenty of documentation, tutorials and tips on how to prevent your code from turning against you.
Don't worry, it's not that hard to handle, and the fact that you came up with this concern by yourself shows how good you are at figuring things out and how committed you are to producing good code; there is no reason why you should fail.
Feel free to post another question if you are stuck regarding a particular matter while taking on your security update.
HTTPS encrypts data in transit, so it won't address this issue.
You cannot trust anything client-side. Any data sent via a webform can be set to whatever the client wants. They don't even have to intercept it. They can just modify the HTML on the page.
There is no way around this. You can, and should, do client side validation. But, since this is typically just JavaScript, it can be modified/disabled.
Therefore, you must validate all data server side when it is received. Digits should be digits, strip any backslashes or invalid special characters, etc.
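As an illustration only (the field names here are made up), strict server-side validation in PHP might look something like this:
<?php
// Sketch of strict server-side validation; the field names are hypothetical.
$age   = filter_input(INPUT_POST, 'age', FILTER_VALIDATE_INT,
                      ['options' => ['min_range' => 0, 'max_range' => 150]]);
$email = filter_input(INPUT_POST, 'email', FILTER_VALIDATE_EMAIL);

if ($age === false || $age === null || $email === false || $email === null) {
    http_response_code(400); // reject anything that isn't what we expect
    exit('Invalid input');
}
// Only now is it safe to work with $age and $email.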
Everyone can send whatever they want to your application. HTTPS just means that they can't see and manipulate what others send to your application. But you always have to work under the assumption that what is sent to your application as POST, GET, COOKIE or whatever is evil.
In HTTPS, the TLS channel is established before any HTTP data is transferred, so from that point of view there is no difference between GET and POST requests.
It is encrypted, but that is only supposed to protect against man-in-the-middle (MITM) attacks.
Your PHP backend has no idea where the data it receives comes from, which is why you have to assume any data it receives comes straight from a hacker.
Since you can't protect against unsavoury data being sent, you have to ensure that you handle all data received safely. Some steps to take involve ensuring that any files uploaded can't be executed (e.g. if someone uploads a PHP file instead of an image), ensuring that data received never directly interacts with the database (i.e. https://xkcd.com/327/), and ensuring you don't trust someone just because they say they are logged in as a user.
To protect further do some research into whatever you are doing with the received post data and look up the best practices for whatever it is.
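For the database point in particular (the xkcd link above), parameterised queries are the usual defence. A rough PDO sketch, with invented table, column and credential names:
<?php
// Never concatenate request data into SQL; bind it instead.
// The DSN, credentials, table and column names below are invented for the example.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'dbuser', 'dbpass', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

$stmt = $pdo->prepare('SELECT id, name FROM students WHERE name = :name');
$stmt->execute([':name' => $_POST['name'] ?? '']);
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);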

Restricting access to php file

I'm currently writing an Android app that accesses a PHP file on my server and displays JSON data provided by my MySQL database.
Everything works great and I love the simplicity of it, but I'm not too comfortable with the fact that someone could just type in the URL of this PHP file and be presented with a page full of potentially sensitive data.
What advice would you give me to prevent access to this PHP file from anyone except those using my android app?
Thanks very much for any information.
The keyword is authentication. HTTP authentication is designed for just that purpose!
There are two forms of HTTP auth:
Basic: easy to set up, less secure
Digest: harder to set up, more secure
Here is the PHP manual.
And this is what you can do in your Android app.
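A minimal sketch of what Basic auth can look like on the PHP side (the username and password here are placeholders; in practice check against properly stored credentials, and only over HTTPS):
<?php
// HTTP Basic auth sketch; 'androidapp' / 's3cret' are placeholders only.
if (!isset($_SERVER['PHP_AUTH_USER'])
    || $_SERVER['PHP_AUTH_USER'] !== 'androidapp'
    || !hash_equals('s3cret', (string) ($_SERVER['PHP_AUTH_PW'] ?? ''))) {
    header('WWW-Authenticate: Basic realm="API"');
    header('HTTP/1.0 401 Unauthorized');
    exit('Authentication required');
}

header('Content-Type: application/json');
echo json_encode(['status' => 'ok']); // your real JSON payload goes here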
There isn't really a fool-proof way to do this. However you can require the user agent to match that of your application. You can also hide a private key in your application that is passed as POST data to your PHP file. Now, neither of these will stop someone who is determined to get at the raw output, but it will slow down the people who are just screwing around killing a little time seeing what they can accomplish.
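A rough sketch of both checks, purely as obfuscation (the key and the user-agent string are placeholders and can be spoofed by anyone who inspects the app):
<?php
// Obfuscation-only checks: both values are placeholders and both are spoofable.
$expectedKey = 'replace-with-a-long-random-string';
$clientKey   = (string) ($_POST['app_key'] ?? '');
$userAgent   = $_SERVER['HTTP_USER_AGENT'] ?? '';

if (!hash_equals($expectedKey, $clientKey) || strpos($userAgent, 'MyAndroidApp') === false) {
    http_response_code(403);
    exit;
}
// ...otherwise build and return the JSON as before.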
Why not only return a valid response if the request is sent with the following header:
Content-Type: application/json
If the request doesn't pass it as the Content-Type, then you just terminate the script (as regular browsers usually want to get text/html or similar things). It's not really worth locking everything tight shut, as if your app can get the data from your server, any user would have the opportunity too.
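Something along these lines, keeping in mind the header is trivially forged:
<?php
// Reject requests that don't announce a JSON content type (easily forged).
$contentType = $_SERVER['CONTENT_TYPE'] ?? '';
if (stripos($contentType, 'application/json') === false) {
    http_response_code(406);
    exit;
}
// ...otherwise build and return the JSON response as usual.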

can you send $_POST to an externally hosted php file?

This is a question of security, so I am not looking for a solution on how to do this, I just want to make sure that it cannot be done.
Let's say I have a file called login.php and it's hosted online and running live, let's say on http://www.rimmer.sk/login.php
Now, let's imagine this file looks like this:
<?php
if (isset($_POST['register'])) {
    echo 'all is done !';
}
?>
Question: Can you, externally, send $_POST['register'] to my file, or can this be done only internally from files hosted within the same virtualhost?
It can be done. Everyone can send you a POST (or a GET, for that matter) request. There is no limit that forbids requests from outside your virtualhost.
(Maybe that's not a problem for you, maybe it is. But note that not everyone can set your $_SESSION, so an external domain cannot alter that.)
In short: yes, it can be posted from an external site.
Yes this can be done very easily. Take a look at: http://php.net/manual/en/book.curl.php
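For illustration, such an external POST with the cURL extension could look roughly like this (using the URL from the question):
<?php
// Sketch: POSTing the 'register' field to the script from anywhere on the web.
$ch = curl_init('http://www.rimmer.sk/login.php');
curl_setopt_array($ch, [
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => ['register' => '1'],
    CURLOPT_RETURNTRANSFER => true,
]);
$response = curl_exec($ch);
curl_close($ch);

echo $response; // prints "all is done !" if the request went through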
Of course I can post from an external location, after all that's what the user's browser does when they submit the form. I can therefore write a script to post the register field to your server with ease.
What use case are you imagining? There are lots of security options (firewalls etc) but without knowing what you are trying to achieve, it's hard to give specifics.
One way of denying script attacks is to generate one-time passwords (OTPs) on the server that you send to the browser with each registration form; then, when you get a response back, check that the OTP is valid. This at least adds another layer of security.
But as I say ... without knowing more it's hard to be specific.
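A rough sketch of that one-time password idea using the session (requires PHP 7+ for random_bytes; the field and session key names are invented):
<?php
session_start();

// On GET: generate a one-time token and embed it in the registration form.
if ($_SERVER['REQUEST_METHOD'] === 'GET') {
    $_SESSION['form_token'] = bin2hex(random_bytes(32));
    echo '<form method="post">'
       . '<input type="hidden" name="token" value="' . $_SESSION['form_token'] . '">'
       . '<input type="submit" name="register" value="Register">'
       . '</form>';
    exit;
}

// On POST: the token must match the one we issued, and it is then discarded.
$sent = (string) ($_POST['token'] ?? '');
if (empty($_SESSION['form_token']) || !hash_equals($_SESSION['form_token'], $sent)) {
    http_response_code(403);
    exit('Invalid or missing token');
}
unset($_SESSION['form_token']); // one-time use only
echo 'all is done !';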
or can this be done only internally from files hosted within the same virtualhost?
Quite the contrary. This cannot be done only internally. In fact, a form is sent not from the server's internals but from the user's browser, so every legitimate submission is already an external request.

Securing JSONP?

I have a script that uses JSONP to make cross-domain AJAX calls. This works great, but my question is: is there a way to prevent other sites from accessing and getting data from these URLs? I basically would like to make a list of sites that are allowed and only return data if they are in the list. I am using PHP and figure I might be able to use "HTTP_REFERER", but have read that some browsers will not send this info... Any ideas?
Thanks!
There really is no effective solution. If your JSON is accessible through the browser, then it is equally accessible to other sites. To the web server, a request originating from a browser and one originating from another server are virtually indistinguishable aside from the headers. Like ILMV commented, referrers (and other headers) can be falsified. They are, after all, self-reported.
Security is never perfect. A sufficiently determined person can overcome any security measures in place, but the goal of security is to create a high enough deterrent that most people would be dissuaded from putting in the time and resources necessary to compromise it.
With that thought in mind, you can create a barrier to entry high enough that other sites would probably not bother making requests. You can generate single-use tokens that are required to grab the JSON data. Once a token is used to grab the JSON data, the token is subsequently invalidated. In order to retrieve a token, the web page must be requested first: the token is embedded within the page in JavaScript and is then put into the AJAX call for the JSON data. Combine this with time-expiring tokens and sufficient obfuscation in the JavaScript, and you've created a high enough barrier.
Just remember, this isn't impossible to circumvent. Another website could extract the token out of the javascript, and or intercept the ajax call and hijack the data at multiple points.
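A compressed sketch of that token flow on the PHP side, using the session for storage purely for brevity (a database or cache would work the same way; the file, parameter and session key names are invented):
<?php
// data.php sketch: only answer JSONP requests carrying an unused, unexpired token.
session_start();

$token   = (string) ($_GET['token'] ?? '');
$issued  = $_SESSION['jsonp_token'] ?? null;      // set when the page was rendered
$expires = $_SESSION['jsonp_token_expires'] ?? 0;

if ($issued === null || !hash_equals($issued, $token) || time() > $expires) {
    http_response_code(403);
    exit;
}
// Invalidate immediately: each token fetches the data exactly once.
unset($_SESSION['jsonp_token'], $_SESSION['jsonp_token_expires']);

// Whitelist callback characters before echoing the callback name back.
$callback = preg_replace('/[^A-Za-z0-9_.]/', '', $_GET['callback'] ?? 'callback');
header('Content-Type: application/javascript');
echo $callback . '(' . json_encode(['value' => 42]) . ');';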
Do you have access to the servers/sites that you would like to give access to the JSONP?
What you could do, although it is not ideal, is to add a record to a database on page load recording the IP that is allowed to view the JSONP, then on the JSONP load, check whether that record exists. Perhaps have an expiry on the record if appropriate.
e.g.
http://mysite.com/some_page/ - user loads page, add their IP to the database of allowed users
http://anothersite.com/anotherpage - as above, add to database
load JSONP, check the IP exists in the database.
After one hour, delete the record from the db, so another page load would be required, for example.
Although this could quite easily be worked around if the scraper (or other sites) managed to work out what method you are using to allow users to view the JSONP, they'd only have to hit the page first.
How about using a cookie that holds a token used with every jsonp request?
Depending on the setup you can also use a variable if you don't want to use cookies.
Working with importScripts from a Web Worker is much the same as JSONP.
Do a double check like theAlexPoon said: main script to web worker, web worker to server and back, with a security query. If the web worker answers the main script without being asked, or with the wrong token, it's better to forward your website to nirvana. If the server is asked with the wrong token, don't answer. Cookies will not be sent with an importScripts request, because document is not available at the web worker level. Always send security-relevant cookies with a POST request.
But there are still a lot of risks. The man in the middle knows how.
I'm certain you can do this with htaccess -
Ensure your headers are sending "HTTP_REFERER" - I don't know any browser that won't send it if you tell it to. (If you're still worried, fall back gracefully.)
Then use htaccess to allow/deny access from the right referer.
# deny all except requests whose Referer matches the allowed domain
SetEnvIf Referer "domain\.com" allowed_referer
Order Deny,Allow
Deny from all
Allow from env=allowed_referer

Possible to use Javascript to get data from other sites?

Is it possible for a web page using JavaScript to get data from another website? In my case I want to get it for calculations and graphing a chart. But I'm not sure if this is possible or not due to security concerns. If it is considered a no-no but there is a workaround, I would appreciate being told the workaround. I don't want to have to gather this information on the server side if possible.
Any and all help is appreciated.
Learn about JSONP format and cross-site requests (http://en.wikipedia.org/wiki/JSON#JSONP).
You may need to use a "PHP proxy" script on your server side, which will get the information from the other websites and provide it to your JavaScript.
The only reliable way is to let "your" webserver act as a proxy. In PHP you can use the cURL functions to fire an HTTP request to an external site and then just echo the response.
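A bare-bones version of such a proxy with the cURL extension might look like this; the target URL is hard-coded deliberately (and is just an example) so the script can't be abused as an open proxy:
<?php
// proxy.php sketch: fetch one fixed remote resource and echo it to the browser.
$ch = curl_init('https://example.com/data.json'); // hard-coded example target, not user input
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_FOLLOWLOCATION => true,
    CURLOPT_TIMEOUT        => 10,
]);
$body = curl_exec($ch);
curl_close($ch);

header('Content-Type: application/json');
echo $body !== false ? $body : '{}';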
You can't pull data from another server due to the same origin policy. You can do some tricks to get around it, such as putting the URL in a <script> tag, but in your case it wouldn't work for just parsing HTML.
Use simple_dom_html to parse your data server side; it is much easier than doing it in JavaScript anyway.
A simple way you might be able to do this is to use an inline iframe. If the web page you are getting the data from has no headers, or you can isolate the data being pulled in (to say an image or SWF), this might work.
Cross-domain JavaScript used to be impossible; using a (PHP) proxy was a workaround for that.
JSONP changes this entirely: it allows you to request JavaScript from another server (if it has an API that supports JSONP; a lot of the bigger web players like Google, Twitter, Yahoo, ... do), specifying the callback function in your code that needs to be triggered to act on the response.
The response in JavaScript will contain:
a call to the callback function you defined
the actual payload as a JavaScript object.
Frameworks like jQuery offer easy support for JSONP out of the box.
Once you have the raw data, you could tie into Google Chart Tools to create graphs on the fly and insert them in your web app.
Also worth considering is support for XMLHttpRequest Access Control (CORS), which is supported in some modern browsers.
If the service provider that you are trying to access via a web page has this set up, it is a very simple call to XMLHttpRequest and you will get access to the resources on that site without the need for JSONP (especially useful for requests that are not GET, e.g. POST, HEAD, etc.).
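If you control the other site and it runs PHP, enabling this is just a matter of a few response headers. A minimal sketch with a placeholder origin:
<?php
// Allow cross-origin XMLHttpRequest from one specific site (the origin is a placeholder).
header('Access-Control-Allow-Origin: https://www.example.com');
header('Access-Control-Allow-Methods: GET, POST, HEAD');
header('Access-Control-Allow-Headers: Content-Type');

if ($_SERVER['REQUEST_METHOD'] === 'OPTIONS') {
    exit; // preflight request: the headers above are the whole answer
}

header('Content-Type: application/json');
echo json_encode(['series' => [1, 2, 3]]); // the data to graph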
