Website Administration Location + PHP cURL

I'm building an online dating website at the moment.
There needs to be an admin backend to the site to approve users, photos, etc.
I can add this admin part of the site, login and all, to the same domain,
e.g. www.domainname.com/admin.
Or, from my experience with PHP cURL, I could put the admin site on a different domain and pass the requests through with cURL.
Question: is it more secure to put the admin code/site on a completely different domain, or does it really not matter if it sits on the same domain? Hacking/security is the real point of this.
Thanks.

Technically it might be more secure to run the admin interface from a different server, hosted on a subdomain with its own IP/vhost. Alternatively, use a proxy module for your webserver (see Apache mod_proxy) to proxy requests from yourdomain.com/admin to admin.otherdomain.com, and enforce additional IP or access restrictions on the proxied URL using .htaccess or equivalent.
Of course, if those other domains are web accessible, then they are only as secure as the users and passwords that use them.
For corporate applications, you may want to make the admin interface accessible from a VPN connection, but I don't know if that applies to you.
If a vulnerability on your public webserver lets someone get shell access, a separate admin host makes it slightly harder for them to gain administrative access, since they won't have the code for the administration portion.
In other words, it can provide additional security depending on the lengths you go to, but it is not necessarily a solid solution on its own.
Using something like cURL is a possibility, but you'd have far less troubleshooting to do with a more conventional method like a proxy or a subdomain on another server.
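As a rough sketch, the mod_proxy approach might look like this in the public server's Apache configuration (the hostnames and address range are placeholders, and mod_proxy/mod_proxy_http must be enabled; Deny/Allow is Apache 2.2 syntax):

# Forward /admin to the separate admin host, and only allow
# requests from a trusted address range.
<Location /admin>
    ProxyPass http://admin.otherdomain.com/
    ProxyPassReverse http://admin.otherdomain.com/
    Order deny,allow
    Deny from all
    Allow from 203.0.113.0/24
</Location>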

Related

How to offer calls over HTTPS exclusively

I'm writing some APIs for another website to be able to interact with my website. They say this in their documentation:
All calls made over HTTPS.
I don't know what that means for me on my end. Does it just mean my site needs to be served from https://www.mywebsite.com instead of http://www.mywebsite.com?
What do I need to do on my end (PHP-based code) to accept "calls over HTTPS"?
I don't need any code written or anything like that, I just need to understand the scope of what I'm trying to do. Is it my code that deciphers the HTTPS call? Is it the server that I'm hosted on? What does this mean?
You need an SSL certificate installed on your server, which you can get from a Certificate Authority like Thawte or Verisign. Once that is done, your site will be able to serve the same content over https://... and http://...
You can then restrict it via the webserver's configuration to allow only https://... (the simplest thing to do). Or you can leave the default, which allows both, and decide in the particular script being called whether it accepts both or only one of them.
But for simplicity (especially while the ins and outs of SSL are still new to you), you probably ought to just restrict your webserver to serve only https://... How you do that depends on whether you are using IIS, Apache HTTPD, etc.
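If you instead go the per-script route mentioned above, a minimal sketch of the check might look like this; Apache and IIS both set $_SERVER['HTTPS'] to a non-empty value such as "on" for SSL requests (IIS uses "off" otherwise):

<?php
// Reject plain-HTTP requests by redirecting to the HTTPS equivalent.
if (empty($_SERVER['HTTPS']) || $_SERVER['HTTPS'] === 'off') {
    $url = 'https://' . $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI'];
    header('Location: ' . $url, true, 301);
    exit;
}
// ...from here on, the script only ever runs over HTTPS...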

Is it possible to host a PHP web application under my own domain, but rewrite the url to another domainname?

I'm currently building a simple web application in PHP that other companies can use as one of their services. I want to host the application myself rather than install it on one of their servers, but I do want the accessibility that installing it there would offer. Example:
www.mywebapp.com is where i would host the web application.
www.company.com would be the domain name of the client.
webapp.company.com should redirect to www.mywebapp.com/?c=company. Upon navigation, webapp.company.com/view.php?v=test would likewise be redirected to www.mywebapp.com/view.php?c=company&v=test, and so on throughout the web app.
Can someone explain how I can achieve this, and whether it is the best option given my requirements?
I recommend that you switch to implementing an API. That's how many corporations solve this problem: they issue an API key that lets your server know which client is calling and therefore what to serve them (see the sketch after the resource list below).
Resources on APIs:
Google Tech Talk: http://www.youtube.com/watch?v=aAb7hSCtvGw [1:00:19 long]
http://blog.programmableweb.com/2011/01/06/from-the-trenches-web-api-design-best-practices/
Directory of some existing APIs: http://www.programmableweb.com/apis/directory
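To make that concrete, here is a minimal sketch of key-based client identification (the keys and client names are made up):

<?php
// Map of issued API keys to client identifiers.
$keys = array(
    'a1b2c3d4' => 'company',
    'e5f6g7h8' => 'othercorp',
);

$apiKey = isset($_GET['api_key']) ? $_GET['api_key'] : '';
if (!isset($keys[$apiKey])) {
    header('HTTP/1.0 403 Forbidden');
    exit('Invalid API key');
}

$client = $keys[$apiKey]; // now serve this client's content

Each client embeds its key in requests to your server, so www.mywebapp.com can serve client-specific content without any per-domain setup.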
I think your idea IS possible if both servers are set up correctly, but doesn't it feel wrong to you?
You would need an A record for both domains pointing to the same server:
http://corz.org/serv/tricks/htaccess2.php?page=all#section-rewrite_sub-domains
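A rough .htaccess sketch of such a rule for the scenario in the question, assuming a wildcard DNS record and both hosts served from the same document root:

RewriteEngine On
# Rewrite webapp.<client>.com/<path> to /<path>?c=<client>; the
# QUERY_STRING check stops the rule re-firing on the rewritten URL,
# and QSA keeps the original query string (e.g. v=test) intact.
RewriteCond %{QUERY_STRING} !(^|&)c=
RewriteCond %{HTTP_HOST} ^webapp\.([^.]+)\.com$ [NC]
RewriteRule ^(.*)$ /$1?c=%1 [QSA,L]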

Allowing the usage of custom domain names for web application projects

I would like to know the best way to enable custom domains when creating web applications. For instance, when you sign up for something like Basecamp, you create your own 'subdomain' which you then use to log in to your Basecamp account.
Also, if you look at things like hosted ecommerce sites, you can use your own custom domain instead of a subdomain of theirs.
Personally, I can't see these web applications 'parking' each custom domain on the web app's hosting account, or adding DNS entries when a subdomain is used as Basecamp does.
Therefore, the only way I can think of to do something like this dynamically is to use mod_rewrite to route everything to a certain script that does the 'routing' based on the URL. For the custom domains, the customer would just need to add a CNAME for their domain pointing to something like custom.webappname.com, which then gets picked up by mod_rewrite and the PHP routing file.
If this is the best way forward, are there any performance issues with routing all incoming requests via this 'routing file'?
Sorry if I'm not clear; I tried to explain it the best I can.
Yes, your solution would be the best way. That is how other sites do it. Rerouting all requests through a central file using rewrite rules incurs a small performance penalty but it is well worth it.
In fact, in most applications you are already paying that penalty anyway. Any framework that uses the FrontController pattern already does exactly this, and that includes pretty much all of them: Zend, Symfony 1 and 2, CakePHP, CodeIgniter and many others.
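For reference, the rewrite rule behind that pattern is usually just a few lines of .htaccess, along these lines:

RewriteEngine On
# Send every request that isn't an existing file or directory
# through the single front controller.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)$ index.php [QSA,L]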
It really depends on the web app you're working with.
For example, we have a hosted CMS. Using the cPanel API, we create an actual hosting account for each customer and install about 50KB of files on account creation, including a default template, an initial set-up script (which handles the DB install, initial population of data and basic settings, among other things) and a few basic front-end control scripts such as the contact form. That said, we don't provide access to the hosting account; all interaction is via our web app. In our case this applies regardless of whether it's a subdomain or a fully qualified domain. Our customers have the option of hosting their domain themselves or having us host it; because we run a complete cPanel hosting infrastructure it makes no difference to us where the DNS lives, but if the customer keeps it elsewhere it is entirely their responsibility.
The reason we have this hosting set-up is so customers can upload their own templates, for disk storage management (we aren't interested in being a file server, but customers need some space for PDFs, images, etc.) and to make sure the content of one customer doesn't get mixed in with the content of another. As a premium paid-for service, our lawyers recommended at a minimum separate, identifiable folders on the server for file storage.
Another example is Blogger/Blogspot; it is well known that they use mod_rewrite for their subdomains. That is appropriate for them, since otherwise they would have to create a separate DNS zone for each blog at a minimum, which is a pain (hence why we use cPanel), on top of all the other virtual hosting set-up.
With mod_rewrite, as you will be aware, a single wildcard DNS zone covers all the subdomains and the rewrite rule is easy to apply. From there it is simply a matter of creating a folder and redirecting requests for the subdomain to it, or directing them to the script that controls your app, depending on what you're doing.
The truth is, for an automated system using subdomains I would use mod_rewrite; but for something more complex, like a fully blown premium CMS requiring full legal conformity, quota management, and suspend, terminate and file-removal functions, I would recommend looking at a hosting control panel such as cPanel as a possible solution.
You've got the right idea. You keep a single codebase, running (for the sake of argument) on a single IP. You don't need to worry about virtual hosts, or even mod_rewrite (aside from whatever your application needs).
Your web app simply handles any and all requests to that IP (on port 80 or 443, or whatever).
When your application bootstraps in response to a request, it peeks at $_SERVER['HTTP_HOST'], and configures itself for the client associated with that domain name. If not found, it 404s, or whatever makes the most sense.
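A minimal sketch of that bootstrap, with a made-up client table:

<?php
// Map incoming host names to client configurations.
$clients = array(
    'webapp.company.com'   => 'company',
    'webapp.othercorp.com' => 'othercorp',
);

$host = strtolower($_SERVER['HTTP_HOST']);
if (!isset($clients[$host])) {
    header('HTTP/1.0 404 Not Found');
    exit;
}

$client = $clients[$host]; // load this client's settings and carry on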

How to reliably identify a website

I have a file that is being linked to from other websites.
The file: http://site.com/file.img
Website A links to it: <img src="http://site.com/file.img">
Website B links to it: <img src="http://site.com/file.img">
I need to reliably identify which of these websites is accessing the file, but I know that $_SERVER['HTTP_REFERER'] can be spoofed. What other ways do I have to reliably confirm the requesting site? By IP? Get them to register an IP? Set up an API key? What options are there?
If a website is only linking to a file, the "website" itself will never actually access your image. Instead, the client who's viewing the site will make a request for the image.
As such, you're depending on information sent by the client, which is completely out of your control and not reliable at all. If you have the opportunity to set some sort of unique cookie on the client, you may be able to use this in some fashion for extended identification, but even that won't be reliable.
There is no 100% reliable solution.
Getting the referrer is the best you can do without getting into complicated territory.
If you don't mind complicated, then read on: set up your Web server to serve file.img only to Website A and Website B, then require that Website A and Website B set up a proxy configuration on their end that will retrieve file.img on behalf of their visitors.
Example:
A visitor to Website A loads a page that contains an image tag like <img src="http://websiteA.com/file.img"/> (note reference to Website A rather than your site). Client requests file.img from WebsiteA.com accordingly. Website A is configured to proxy requests for the path /file.img to your server, http://site.com/file.img. Your site verifies that it is in fact Website A that is requesting the image and then serves it to Website A's proxy. Website A then serves it to the visitor.
Basically, that makes it a pain for Websites A and B, gives you a performance hit, and also requires further configuration on your part. But I imagine that would satisfy your requirement.
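Server-side, the verification step could be as simple as an allow-list of the two proxies' addresses (a minimal sketch; the IPs, image type and path are placeholders):

<?php
// Only the known proxy addresses of Website A and Website B
// are allowed to fetch the image.
$allowed = array('203.0.113.10', '198.51.100.20');

if (!in_array($_SERVER['REMOTE_ADDR'], $allowed, true)) {
    header('HTTP/1.0 403 Forbidden');
    exit;
}

header('Content-Type: image/png'); // adjust to the actual image type
readfile('/path/to/file.img');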
Have a look at how OpenID relying parties are implemented; the approach allows one site to authenticate against another. The protocol specification will give you a hint of the effort and overhead required to reliably implement such a scheme.
http://googlecode.blogspot.com/2010/11/googles-sample-openid-relying-party.html

How can I restrict / authorize access to PHP script?

There is this PHP script on my website which I don't want people to be able to run by just typing its name in the browser.
Ideally I would like this script to be run only by registered users and only from within a Windows app (which I will have to provide). Can this be done ?
Alternatively, how can I protect this script so that it can only be called from a specific page or script?
Also how can I hide the exact URI from appearing on the address bar?
Thanks!
If you are running Apache for your webserver, you can protect it with a username/password combo using .htaccess. It takes a little configuration if your server is not already configured to allow .htaccess. Here are the Apache docs.
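A typical .htaccess for that looks like the following (the .htpasswd path is a placeholder; the file itself is created with Apache's htpasswd utility):

AuthType Basic
AuthName "Restricted area"
AuthUserFile /home/user/.htpasswd
Require valid-user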
If you need authentication based on application-specific factors, you can put something like this at the top of your script:
<?php
if (!$user->isLoggedIn()) {
    // Pretend the script doesn't exist
    header('HTTP/1.0 404 Not Found');
    exit; // stop here so the rest of the script never runs
}
Do you have a question about how you would implement isLoggedIn?
You can also use mod_rewrite to rewrite URIs, and those directives can go inside your .htaccess as well. mod_rewrite can rewrite incoming requests transparently (from the browser's perspective) so a request for /foo/bar can be translated into secret_script.php/foo/bar. Docs for mod_rewrite.
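A small sketch of such a rule (the path and script name are just illustrative):

RewriteEngine On
# The browser requests /foo/bar; internally the request is handled
# by secret_script.php without the real name ever being exposed.
RewriteRule ^foo/bar$ secret_script.php/foo/bar [L]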
However you decide to implement this, I would urge you to not rely solely on the fact that your script's name is obscure as a means to secure your application. At the very least, use .htaccess with some per-user authentication, and consider having your application authenticate users as well.
As Jesse says, it's possible to restrict your script to logged in users. There are a large number of questions on this already. Search for PHP authentication.
However, it is not possible to restrict it to a single application. It is fairly simple to use a program like Wireshark to see exactly how the program logs in and makes requests. At that point, anyone can reproduce its behavior manually or in their own application.
There are a variety of different ways you could go about securing a script. All have pluses and minuses, and it's likely that the correct answer for your situation will be a combination of several.
As mentioned, you could lock the account down with Apache; it's a good start. Similarly, you could build a strong salted, challenge-based login system such as this one: http://www.devarticles.com/c/a/JavaScript/Building-a-CHAP-Login-System-An-ObjectOriented-Approach/ If you use SSL as well, you're essentially getting the kind of security banks use on their websites: not perfect, but certainly not easy to break into.
But there are other ideas to consider too. Park your script in a class file that is inaccessible via direct URI, then call its functions from an intermediary view script; not perfect, but it does limit the ways someone could reach the file directly. Consider adding a 'qualifier' to the URL via a simple GET parameter and have the script check for it or fail; again, not a great solution on its own, but one additional layer to dissuade the bad guys. If you control who gets access (you know exactly which networks are involved), you could even go so far as to limit the IPs or HTTP referers allowed to access the file. Consider setting and checking cookies with a clear expiration. And don't forget your robots file, so crawlers don't stumble upon the script you're trying to protect.
A while back my company did a membership app using Delphi on the front end, talking to PHP and MySQL on the back end; it was a bit clunky, given that we were all web application developers. If you're so inclined, perhaps Adobe Flex might be an option. But ultimately you'll have to open a door that the application can talk to, and if someone were determined, they could theoretically dig through your app to find the credentials and use them to gain instant access to the site. If you're going the desktop-app route, perhaps it's time to consider having the app skip the intermediary script altogether and do its work on the local machine, communicating directly with the remote database.
You can also deny direct access to the folder via .htaccess, and put PHP authentication in front that redirects logged-in users to those PHP files.
