I'm building an application that uses components from the Facebook API, and there are certain requirements the app has to meet before it can go to review. I've solved the other requirements, but I'm not quite sure whether I need HTTPS or whether it would work with HTTP just fine.
You only need HTTPS if you create a Page/Tab App or a Canvas App. Check out the App Settings; those platforms specifically ask for an https link.
Working with HTTP is fine. Token security is best done with appsecret_proof, and it does not really matter whether you use HTTP or HTTPS for that (a minimal sketch follows).
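For illustration, here is a minimal PHP sketch of generating appsecret_proof; the token and secret values are placeholders. The proof is an HMAC-SHA256 of the access token, keyed with the app secret, and is appended to Graph API calls:

    <?php
    // Placeholder values - substitute your real app secret and user token.
    $appSecret   = 'YOUR_APP_SECRET';
    $accessToken = 'USER_ACCESS_TOKEN';

    // appsecret_proof = HMAC-SHA256 of the access token, keyed with the app secret.
    $appSecretProof = hash_hmac('sha256', $accessToken, $appSecret);

    // Append it to Graph API calls so Facebook can verify the call
    // comes from a server that knows the app secret.
    $url = 'https://graph.facebook.com/me'
         . '?access_token=' . urlencode($accessToken)
         . '&appsecret_proof=' . $appSecretProof;

    echo file_get_contents($url);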
That being said, having HTTPS is better than not having it. But the question was "do I need it", so... see my very first sentence :)
I don't know specifically about Facebook's rules, but:
If the front end (JavaScript) has access to a token that identifies the user, you should probably use HTTPS to protect that token.
If you manipulate personal data of the user, then in many countries (European countries, for example) you have a legal obligation to protect it.
As HTTPS protects your website against unwanted modifications (ISPs injecting ads) and protects your users, in 2016 the question is less "should I use HTTPS?" and more "do I have a good reason not to?"
I know I can disallow robots using robots.txt, but a few search engines do not follow it. I have an API where my users send transactional info to insert/update/delete etc. using my API request parameters, but when I look at my logs, a huge number of hits have been made to my .php page. I googled for a way to block this in my PHP API page and found nothing.
So I landed on SO to get help from the experts: is there any way I can block/disallow search engine robots from accessing my base API URL?
The main approaches that I know of for dealing with bots that are ignoring robots.txt are to either:
Blacklist them via your firewall or server (a minimal application-level sketch follows this list)
Only allow whitelisted users to access your API
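For illustration, here is a minimal application-level version of the blacklist in PHP; the IP addresses are made-up examples, and a real firewall rule would reject the traffic earlier and more cheaply:

    <?php
    // Hypothetical blacklist of misbehaving bot IPs - maintain your own list.
    $blacklist = ['198.51.100.23', '203.0.113.42'];

    if (in_array($_SERVER['REMOTE_ADDR'], $blacklist, true)) {
        http_response_code(403);
        exit('Forbidden');
    }

    // ...normal API handling continues here...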
However, you should ask yourself whether they're having any impact on your website. If they're not spamming you with requests (which would be a DDoS attack) then you can probably safely ignore them and filter them out of your logs if you need to analyse real traffic.
If you're running a service that people use and you don't want it to be wide open to spam, then here are a few more options for limiting usage:
Restrict access to your API to just your users by assigning each of them an API token (see the sketch after this list)
Rate limit your API (either via the server and/or via your application)
Read the User Agent (UA) of your visitors: a lot of bots will mention that they're bots or have fake UAs, while the malicious ones will pretend to be regular users
Implement more advanced measures such as limiting access to a region if a lot of requests suddenly come from there in a short period of time
Use DDoS protection services such as CloudFlare
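As a rough illustration of the first two options, here is a minimal PHP sketch; the token list, the APCu-based counter, and the limits are all assumptions rather than a production design:

    <?php
    // Hypothetical issued tokens - in practice, store these in a database.
    $validTokens = ['token-abc123', 'token-def456'];

    $token = $_GET['api_token'] ?? '';
    if (!in_array($token, $validTokens, true)) {
        http_response_code(401);
        exit('Missing or invalid API token');
    }

    // Naive rate limit: at most 60 requests per token per minute,
    // counted in APCu (assumes the APCu extension is enabled).
    $key = 'rate:' . $token . ':' . floor(time() / 60);
    if (apcu_add($key, 1, 60)) {   // first request in this minute
        $count = 1;
    } else {
        $count = apcu_inc($key);   // subsequent requests
    }
    if ($count > 60) {
        http_response_code(429);
        exit('Rate limit exceeded');
    }

    // ...handle the API request...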
There's no perfect solution, and each option involves trade-offs. If you're worried about DDoS then you could start by looking into your server's capabilities; for example, here's an introduction to how NGINX can control traffic: https://www.nginx.com/blog/rate-limiting-nginx/
In a nutshell, any IP hitting your site can be a bot so you should defend by imposing limits and analysing behaviour, since there's no way to know for sure who is a malicious visitor and who isn't until they start using your service.
I can't seem to find anything definitive about the HTTPS migration: what will happen to existing apps that are just on HTTP? Will they cease to work? Or is it just the case that any new apps will need to be on HTTPS and the old ones will be fine?
If I read it right, I think all existing apps will need to be converted.
Cheers guys
From what I understand, any existing apps that haven't entered a secure url will be disabled. My guess is they'll be stuck in sandbox mode until you enter that url.
I just set up a test app today that embeds into the canvas. I found that while it requires it to be HTTPS, it doesn't care if you use a self-signed certificate, so they're not insisting that users spend money to keep using their canvas apps.
I have a PHP application that pulls in pages from a separate domain via iframes. These pages use cookies.
I've noticed some browsers have a default setting that blocks all third-party cookies. This is going to cause quite a problem for me.
I've heard mention of P3P but can't find much about how to implement it with cookies.
Any help most appreciated,
Jonesy
It would be extremely bad if you could access an external site's cookies just by embedding it in an iframe. Just imagine if you were able to access facebook.com's session cookie just by embedding it.
Just to clarify what Maerlyn is saying: what you're describing is impossible. A website can only access cookies from its own domain. When you go to facebook.com, your Facebook cookies are sent to that domain. When you go to Google, your Facebook cookies are NOT sent there. There is no way for Google to look at your Facebook cookies, even if it uses iframes. Period. This is a security feature.
So, I suggest you look at other ways to design your software system. For example, if the website you're embedding has an API, I'd use that. Or do a back-end service synchronization to pull in user information. In any case, you need the consent of the other service / other domain to do this.
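That said, if the framed pages are under your control and the real problem is their own cookies being blocked as third-party (the P3P case the question mentions), older Internet Explorer versions would accept such cookies when the framed site sent a P3P compact policy header. A minimal sketch, with a placeholder policy string:

    <?php
    // Send a P3P compact policy before setting cookies so that IE
    // accepts them in a third-party (iframe) context. The tokens
    // below are placeholders - publish a policy that matches your
    // actual data practices.
    header('P3P: CP="CAO PSA OUR"');

    setcookie('embedded_session', 'some-value', 0, '/');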
I know the general definition, but I need more details on how to implement them in general, and in PHP specifically, and what exactly are the features I gain from them?
SSL stands for "Secure Sockets Layer", and it's a method of encrypted HTTP communication (among other things). It encrypts the traffic between a web browser and a server, making it possible to send secure data without fear of eavesdropping.
SSL is a web-server level technology, and has nothing to do with PHP. You can enable any web server with SSL, whether it has PHP on it or not, and you don't have to write any special PHP code in order to make your PHP pages show up over SSL.
There are many, many guides to be found on the internet about how to set up SSL for whatever webserver you might be using. It's a broad subject. You could start here for Apache.
Some webservers are configured to mirror the whole site, so you can get every page over HTTP or HTTPS, depending on what you prefer or where links send the browser. HTTPS is secure, but a bit slower, and it puts more strain on your hardware.
So you might implement your site and shop as usual, but decide to put everything from the cart to the checkout, payment and so on under HTTPS. To accomplish this, all links to the shopping cart are made absolute and prefixed with https:// instead of http://. Now, if people click on the shopping cart icon, they're transferred to the secure version, and because all links from there on are relative again, they stay there.
But! They might replace the https with http manually, or land on the unencrypted version via a malicious link, etc.
In this case, you probably want to check whether your script was called over HTTPS ($_SERVER['HTTPS'] rather than $_SERVER['SERVER_PROTOCOL'], which only reports the HTTP version) and deny execution if not (good practice), or issue a redirect to the secure site.
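A minimal sketch of that check and redirect; the host and path handling are simplified assumptions:

    <?php
    // $_SERVER['HTTPS'] is non-empty (and not 'off') when the request
    // came in over HTTPS; it is absent on plain HTTP.
    $isHttps = !empty($_SERVER['HTTPS']) && $_SERVER['HTTPS'] !== 'off';

    if (!$isHttps) {
        // Redirect to the same URL over HTTPS instead of serving the page.
        $target = 'https://' . $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI'];
        header('Location: ' . $target, true, 301);
        exit;
    }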
On a side note: HTTPS no longer uses SSL exclusively; TLS (the successor to SSL, see RFC 2818) is more modern.
Rule of thumb: users should have the choice between HTTP and HTTPS in noncritical environments, but should be forced onto HTTPS for the critical parts of your site (login/cart/payment/...) to prevent malicious attacks.
I have two websites, one driven by ASP.NET and the other by PHP. The PHP site is hosted with a relatively inexpensive host provider ('unlimited' bandwidth and disk space for $10 a month). The PHP site also provides REST URLs which would help me in monetizing my website.
The other site (the 'main' site, as it were) is an ASP.NET website which houses the login mechanism. My goal is to allow users to log in to the ASP.NET site and then be redirected to the PHP based domain. Is there an easy and feasible solution that accomplishes this?
I have a few questions with regards to that approach:
How would I pass session information and variables from the ASP.NET Application to the PHP based application, to facilitate the aura of 'Single Sign On'?
Would a 'simple' cookie be able to handle this scenario? Or would I need to use encrypted query strings?
There is no 'sensitive' data on these sites, so securing user data isn't a top priority. The site was built 'for fun'.
Are there hosts that allow subdomains to be hosted on a different language platform than the main domain? If I had www.example.com hosted on an ASP.NET server, could I have a subdomain (forum.example.com) hosted on a PHP server? Is this possible?
Any help on this is greatly appreciated.
Although more complex, I would go with the same methodology as the OpenID spec and use the Diffie-Hellman exchange. This allows two parties with no prior trust to establish trust for a certain period of time.
Info for PHP
Info for VB.NET
I would go for a cookie if both sites are on the same domain (a minimal sketch follows). One advantage of cookies over encrypted query strings is that they are automatically passed between requests, so you don't have to think about them when building your URLs. One downside of cookies is that users can disable them.
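For illustration, here is a hypothetical sketch of a domain-wide, HMAC-signed cookie; the secret, cookie name, and domain are placeholders, and the ASP.NET site would set the equivalent cookie at login:

    <?php
    // Shared secret both applications know. Placeholder value -
    // keep the real one out of source control.
    $secret = 'shared-sso-secret';

    // Value format is "username|expiry|signature", scoped to the
    // parent domain so subdomains like forum.example.com can read it.
    function set_sso_cookie(string $user, string $secret): void {
        $expiry  = time() + 3600; // valid for one hour
        $payload = $user . '|' . $expiry;
        $sig     = hash_hmac('sha256', $payload, $secret);
        setcookie('sso', $payload . '|' . $sig, $expiry, '/', '.example.com');
    }

    // Verifying the cookie on the PHP side:
    function verify_sso_cookie(string $secret): ?string {
        if (!isset($_COOKIE['sso'])) {
            return null;
        }
        $parts = explode('|', $_COOKIE['sso']);
        if (count($parts) !== 3) {
            return null;
        }
        [$user, $expiry, $sig] = $parts;
        $expected = hash_hmac('sha256', $user . '|' . $expiry, $secret);
        if (!hash_equals($expected, $sig) || time() > (int)$expiry) {
            return null; // tampered with or expired
        }
        return $user;
    }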
Store the sessions in a database and create/use a session type that is cross-platform. You might have to build it yourself. But you should know that passing sessions between different languages like this can be dangerous (security-wise); see the sketch below for the general shape.
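A rough sketch of the PHP half of that idea, assuming a shared sessions table that the ASP.NET application also writes to; the table layout, DSN, and cookie name are all hypothetical:

    <?php
    // Hypothetical shared table:
    //   CREATE TABLE shared_sessions (
    //       token   VARCHAR(64) PRIMARY KEY,
    //       user    VARCHAR(64) NOT NULL,
    //       expires INT NOT NULL
    //   );
    $pdo = new PDO('mysql:host=localhost;dbname=sso', 'dbuser', 'dbpass');

    // The ASP.NET site inserts a row at login and puts the token in a
    // cookie; the PHP site only has to look it up.
    function current_user(PDO $pdo): ?string {
        $token = $_COOKIE['sso_token'] ?? '';
        $stmt  = $pdo->prepare(
            'SELECT user FROM shared_sessions WHERE token = ? AND expires > ?'
        );
        $stmt->execute([$token, time()]);
        $user = $stmt->fetchColumn();
        return $user === false ? null : $user;
    }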