I've got a web app, and now I've been told to implement SSL for it. I've never done that before, but I think I've understood how from documentation on the Internet.
But my app has two sides: the user interface, which is fine under SSL, and a second side with some files that need to bypass the certificate. These files are accessed by remote machines and share the same libraries as some of the user interface files.
I've managed the bypass using symbolic links for the shared libraries, but I'm not sure if this is the proper way to do it. I mean, if I don't use symbolic links to the shared libraries, I can't use these scripts.
Thanks in advance for any light!
Do you mean you are calling the web pages from another machine but want to bypass the certificate check? You can usually use curl for this: curl --insecure {url} (short form: curl -k {url}). The wget equivalent is --no-check-certificate.
Otherwise you can probably just call those files without the 's' in the scheme (http instead of https) and be fine.
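If the remote machines fetch those files with PHP's cURL extension, a minimal sketch of skipping the certificate check looks like this (the URL is a placeholder, and turning verification off should only be done for hosts you control):
// Fetch a file over HTTPS without verifying the certificate.
// This removes protection against man-in-the-middle attacks, so use it only for hosts you control.
$ch = curl_init('https://www.example.com/shared/file.txt');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false); // skip certificate chain verification
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 0);     // skip host name verification
$data = curl_exec($ch);
if ($data === false) {
    echo 'cURL error: ' . curl_error($ch);
}
curl_close($ch);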
I have a PHP app on Heroku that references a JavaScript file stored at a non-secure location. My developer tools are telling me that they won't load it: "Mixed Content".
How do I get Heroku to serve over HTTP instead of HTTPS?
If your app is served via HTTPS, there is no way you can force the external content to be loaded via HTTP. It would require a security downgrade, and browsers refuse to perform such an action.
The best solution is to make sure that even that JavaScript file is served via HTTPS. You can easily host it behind a proxy, or store it somewhere you can use HTTPS. Today it is very easy to find an HTTPS-enabled storage system.
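If you can't move that file, one option (just a sketch; the script name and the external URL are made up) is a tiny PHP passthrough on your own HTTPS domain, so the browser only ever talks to your secure origin:
// js-proxy.php (hypothetical name): serve an external, HTTP-only script from your HTTPS app.
$source = 'http://legacy.example.com/widget.js'; // assumed location of the external script
$body = file_get_contents($source);
if ($body === false) {
    http_response_code(502); // the upstream fetch failed
    exit;
}
header('Content-Type: application/javascript');
header('Cache-Control: public, max-age=3600'); // cache so you don't re-fetch on every request
echo $body;
Your page would then reference /js-proxy.php instead of the external http:// URL.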
I don't think you can achieve this on the Heroku side. You have to manage these things on your side.
Thanks
I'm building an online dating website at the moment.
There needs to be an admin backend to the site to approve users/photos etc.
I can add this admin part of the site/login etc to the same domain.
eg: www.domainname.com/admin
Or, from my experience with PHP cURL, I can put this site on a different domain and cURL the requests through.
Question: is it more secure to put the admin code/site on a completely different domain, or does it really not matter if it sits on the same domain? Hacking/security is really the point of this.
thx
Technically it might be more secure if you ran it from a different server and hosted it on a subdomain using a different IP/vhost, or used a proxy module for your web server (see Apache mod_proxy) to proxy requests from yourdomain.com/admin to admin.otherdomain.com, enforcing additional IP or access control with .htaccess or equivalent on the proxied URL (see the sketch at the end of this answer).
Of course, if those other domains are web accessible, then they are only as secure as the users and passwords that use them.
For corporate applications, you may want to make the admin interface accessible from a VPN connection, but I don't know if that applies to you.
If there is a vulnerability on your public web server that allows someone to get shell access, a separate server makes it slightly more difficult for them to gain administrative access, since the code for the administration portion isn't on the compromised machine.
In other words, it can provide additional security depending on the lengths you go to, but is not necessarily a solid solution.
Using something like cURL is a possibility, but you'd have far less troubleshooting to do using a more conventional method like a proxy or a subdomain on another server.
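As a rough sketch of the mod_proxy approach (host names and the IP range are placeholders; adjust for your setup), the public site's virtual host could forward /admin to the separate admin host and restrict who can reach it:
# Requires mod_proxy, mod_proxy_http and mod_ssl to be enabled; goes inside the public site's virtual host.
SSLProxyEngine On
<Location /admin>
    ProxyPass https://admin.otherdomain.com/
    ProxyPassReverse https://admin.otherdomain.com/
    # Apache 2.4 syntax; on 2.2 use Order/Deny/Allow instead.
    Require ip 203.0.113.0/24
</Location>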
I'm not an expert and don't want to make a mistake, so please forgive me if the answer is obvious (better safe than sorry).
I finished a Flex app using FB4.5, uploaded it to a shared host, and tested it fine. I'm now in the process of securing the app using HTTPS, but have landed in a quagmire.
First:
I forced the load of all pages to HTTPS with .htaccess so that the Flex app loads over SSL. The problem is that I get a connection failure ('BadVersion') when the app makes a data service call through the gateway.php file, because of the .htaccess rule (the call is looking for http rather than https). I believe I can hardcode the https path in the Flash Builder class file, but I don't want to prevent the app from working on my dev machine either. Any thoughts here?
Also, even if the gateway.php file is called over SSL, will the subsequent calls to the PHP files containing the actual SQL queries fail because of the SSL forced by the .htaccess directives?
Second:
Instead of using .htaccess, I have also successfully used PHP to secure the initial launch of the app with an https redirect statement at the beginning. This allows the app to work, calling the gateway.php file fine, because it isn't forcing https on everything. BUT, this defeats the purpose of trying to get everything encrypted.
Third:
Is it necessary to have the gateway.php file served over SSL, given that it transmits binary AMF?
Thoughts? Explanations? Things I'm missing? Suggestions?
Thanks in advance.
If your AMF calls are going over HTTPS, then you need to use a SecureAMFChannel rather than a vanilla AMFChannel.
Typically this is configured either in the client, where you have declared your RemoteObject or ChannelSet, or in the services-config.xml file.
Most likely, this mismatch is what's causing the BadVersion error you're getting.
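As a rough sketch (the channel id and endpoint path are placeholders, and your generated services-config.xml may differ), the secure channel definition typically looks something like this, alongside the plain AMF one:
<!-- Sketch only: adjust the id and the endpoint URI to match your project. -->
<channel-definition id="my-secure-amf" class="mx.messaging.channels.SecureAMFChannel">
    <endpoint uri="https://{server.name}:{server.port}/gateway.php"
              class="flex.messaging.endpoints.SecureAMFEndpoint"/>
</channel-definition>
Keeping both a plain and a secure channel definition, and picking the channel based on how the app was loaded, also lets the same project keep working over plain http on your dev machine.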
I'm vaguely aware that on a computer joined to a domain IE can be asked to send some extra headers that I could use to automatically sign on to an application. I've got apache running on a windows server with mod_php. I'd like to be able to avoid the user having to log in if necessary. I've found some links talking about Kerberos and Apache modules.
http://www.onlamp.com/pub/a/onlamp/2003/09/11/kerberos.html?page=last
https://metacpan.org/pod/Apache2::AuthenNTLM
Since I'm running on Windows it's proven to be non-trivial to get Perl or Apache modules installed. But doesn't PHP already have access to HTTP headers?
I found this but it doesn't do any authentication, it just shows that PHP can read the NTLM headers.
http://siphon9.net/loune/2007/10/simple-lightweight-ntlm-in-php/
I'd like to be able to have my users just point to the application and have them automatically authenticated. Has anyone had any experience with this or gotten it to work at all?
UPDATE
Since originally posting this question, we've changed setups to nginx and php-fcgi, still running on Windows. Apache2 and php-cgi is probably one of the slowest setups you could configure on Windows. It's looking like Apache might still be needed (it works with php-fcgi), but I would prefer an nginx solution.
I also still don't understand (and would love to be educated on) why HTTP server plugins are necessary and why we can't have a PHP-based, web-server-agnostic solution.
All you need is the mod_auth_sspi Apache module.
Sample configuration:
AuthType SSPI
SSPIAuth On
SSPIAuthoritative On
SSPIDomain mydomain
# Set this if you want to allow access with clients that do not support NTLM, or via proxy from outside. Don't forget to require SSL in this case!
SSPIOfferBasic On
# Set this if you have only one domain and don't want the MYDOMAIN\ prefix on each user name
SSPIOmitDomain On
# AD user names are case-insensitive, so use this for normalization if your application's user names are case-sensitive
SSPIUsernameCase Lower
AuthName "Some text to prompt for domain credentials"
Require valid-user
And don't forget that you can also use Firefox for transparent SSO in a Windows domain: Simply go to about:config, search for network.automatic-ntlm-auth.trusted-uris, and enter the host name or FQDN of your internal application (like myserver or myserver.corp.domain.com). You can have more than one entry, it's a comma-separated list.
I'd be curious about a solution that uses OpenID as a backend (of sorts) for this... I wasn't seeing anything that would hook into ActiveDirectory directly when I googled (quickly). However, it could be pretty painless to implement over plain HTTP(S) (you'd be an OpenID provider that checked credentials against your local AD). In a best case scenario, you might be able to just add a couple classes to your app and be off and running -- no web server modules required. There is a lot of open source code out there for either side of this, so if nothing else, it's worth taking a look. If you exposed the backend to the users (i.e. gave them OpenID URLs), you'd have the added benefit of them being able to log in to more than just your internal sites using these credentials. (Example: Stack Overflow.)
As an aside, I'd be against making it so that Internet Explorer is required. I'm not sure if that is the goal from the way you wrote the question, but depending on your IT environment, I'd expect people who use Firefox or Safari (or Opera or ...) to be less than enthusiastic. (You're not developing against IE first, are you? That's been painful whenever I've done so.) This is not to say that you couldn't use this feature of IE, just that it shouldn't be the only option. The link you posted stated that NTLM worked with more than IE, but since I don't have any experience with it, it's hard to judge how well that would work.
I had a similar problem which I needed to solve for my organization.
I was looking into using adLDAP.
There is some documentation on the site for achieving seamless authentication with Active Directory too.
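For reference, a minimal sketch of a credential check with adLDAP might look like the following (the domain details are placeholders, and the constructor options reflect my reading of the adLDAP docs; this covers validating credentials against AD, not the seamless/NTLM part):
// Sketch only: option names and values are assumptions based on the adLDAP documentation.
require_once 'adLDAP.php';
session_start();
$adldap = new adLDAP(array(
    'account_suffix'     => '@corp.example.com',            // assumed UPN suffix
    'base_dn'            => 'DC=corp,DC=example,DC=com',
    'domain_controllers' => array('dc01.corp.example.com'),
));
if ($adldap->authenticate($_POST['username'], $_POST['password'])) {
    $_SESSION['user'] = $_POST['username']; // credentials are valid against Active Directory
} else {
    exit('Login failed');
}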
One option for you is to use CAS (central authentication service).
It has php client library.
How-to link to MS Active Directory: http://www.ja-sig.org/wiki/display/CASUM/Active+Directory
You would require Apache Maven 2, though.
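With the phpCAS client library, protecting a page looks roughly like this (a sketch; the CAS server host and context path are placeholders, and newer phpCAS releases may expect extra parameters):
// Sketch: point phpCAS at your CAS server and require a login.
require_once 'CAS.php';
phpCAS::client(CAS_VERSION_2_0, 'cas.corp.example.com', 443, '/cas');
// For a quick test only; configure proper certificate validation in production.
phpCAS::setNoCasServerValidation();
// Redirects to the CAS login page if the user is not authenticated yet.
phpCAS::forceAuthentication();
$user = phpCAS::getUser(); // the authenticated username, backed by Active Directory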
For IIS/PHP FCGI, you need to send out an unauthorized header:
function EnableAuthentication()
{
    // Call this only when the request has not been authenticated yet
    // (e.g. $_SERVER["REMOTE_USER"] is still empty).
    $realm = "yoursite";
    header("HTTP/1.1 401 Unauthorized");
    header('WWW-Authenticate: Digest realm="'.$realm.'",qop="auth",nonce="'.uniqid().'",opaque="'.md5($realm).'"');
    exit;
}
You can then get at the username with:
$winuser = $_SERVER["REMOTE_USER"];
I then make sure the $winuser is in my database of allowed users.
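The allowed-user check can be as simple as this sketch (the table and column names are made up; adjust to your schema):
// Hypothetical "allowed_users" table with a "username" column.
$pdo = new PDO('mysql:host=localhost;dbname=myapp', 'dbuser', 'dbpass');
$stmt = $pdo->prepare('SELECT 1 FROM allowed_users WHERE username = ?');
$stmt->execute(array($winuser));
if ($stmt->fetchColumn() === false) {
    header("HTTP/1.1 403 Forbidden");
    exit('Not authorized');
}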
Be SURE to test this under a non-privileged account. When I first installed this I tested it and it worked fine, but later, when a standard non-server-admin user tried it, it failed. It turns out some of the temporary directories need to have their permissions changed for guest users. I can't recall the exact settings.
I am hoping there is a way to do this but have had difficulty searching because the terms all seem too general and the results don't seem to be what I'm looking for.
I travel a lot and am often in areas where the internet on a shared computer is the ONLY method for accessing the web. I can't use SSH or SFTP.
I would like to make a web page (hosted publicly, accessible to the world) that will 1) be password protected itself (I can do that with .htaccess, pretty sure), and 2) when I'm logged in, present me a list of links that, IF CLICKED, will rename, remove, or move files on a server that's NOT the same server the web page is served from.
Can this be done?
Basically, with SSH I'd just ssh to Server 1 (where the web server is) and then ssh AGAIN to the Server 2 (where the files I want to access are). Easy. But with just access to HTTP, it seems like this is much more complex.
I suspect the method is to create public/private keys and then use some PHP commands to trigger processes that are tied to hyperlinks on the page? For example:
Delete file 001
Rename file foo to bar
My idea is that the "deletefile001.php" file would contain the necessary SSH call and point to a bash script that does what I need (something like the sketch below). That way, I wouldn't need interactive SSH access; it would all be set up in advance, with the process started by clicking the link.
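Roughly what I have in mind (a sketch using the PECL ssh2 extension; the host, user, key paths, and remote script path are all made up):
// deletefile001.php (sketch): run a pre-written script on Server 2 over SSH using key auth.
// Assumes the PECL ssh2 extension is installed and the keypair was set up in advance.
$conn = ssh2_connect('server2.example.com', 22);
if (!$conn || !ssh2_auth_pubkey_file($conn, 'webuser',
        '/home/webuser/.ssh/id_rsa.pub', '/home/webuser/.ssh/id_rsa')) {
    exit('Could not connect or authenticate');
}
// The script on Server 2 does the actual rename/remove/move.
$stream = ssh2_exec($conn, '/home/webuser/bin/deletefile001.sh');
stream_set_blocking($stream, true);
echo stream_get_contents($stream); // show the script's output on the page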
Lastly, what kind of security risks are there in this? Can spiders trigger the links automatically? I'm not worried too much about people accessing maliciously, but mainly about accidental triggering that would cause sudden loss of the files.
Any and all help would be fantastic. Thank you!
What you're looking for is a web-based FTP or SSH client, something that would be a very large task to code yourself. Here are a couple of frameworks that I found after a few minutes of Google searching. I cannot guarantee the quality or the security of any of these services, as I have not used them myself.
Open-source frameworks that you could install on your own server:
Web-based FTP framework: Monsta FTP
Web-based SSH framework: Web Console
Online services that you could use independently of your website:
Online FTP client: net2ftp
Online SSH client: consoleFISH