Not showing connection source code in WordPress - PHP

WordPress has a plugin editor that allows visitors to view my plugin's source code. I have some MySQL database connection strings and Azure connection strings that it would be dangerous to let others see.
$connectionstring = "DefaultEndpointsProtocol=[http|https];AccountName=[yourAccount];AccountKey=[yourKey]";
This is an example of what I do not want to show the visitor. Could I put this in an external PHP file, hidden away from their sight? Is there any way I could accomplish this efficiently and securely?

It seems like there are ways to secure these settings, described in 48 Ways to Keep Your WordPress Site Secure, which includes:
4. Secure wp-config.php
Lock down wp-config.php—it’s one single location that contains a wealth of critical data regarding your database, username, and password. Only you should have access.
To deny access to this file, you should add the code below at the top of the .htaccess file:
<files wp-config.php>
order allow,deny
deny from all
</files>
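Once wp-config.php itself is locked down, you can keep the sensitive value there (or in a file outside the web root that wp-config.php includes) and have the plugin reference a constant instead of embedding the string in its own source; the plugin editor only exposes files under the plugins directory, so nothing defined in wp-config.php shows up there. A minimal sketch, where AZURE_STORAGE_CONNECTION_STRING is just an example name I picked, not a constant WordPress defines:
<?php
// In wp-config.php (or a file it includes from outside the web root).
// The constant name is only an example, not something WordPress defines.
define('AZURE_STORAGE_CONNECTION_STRING',
    'DefaultEndpointsProtocol=https;AccountName=yourAccount;AccountKey=yourKey');

// In the plugin, read the constant instead of hard-coding the string.
if (defined('AZURE_STORAGE_CONNECTION_STRING')) {
    $connectionstring = AZURE_STORAGE_CONNECTION_STRING;
} else {
    wp_die('Storage connection is not configured.'); // fail closed if the constant is missing
}
This way the string only ever lives in a file that the .htaccess rule above already protects.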
You don't ever want to expose connection strings client-side. It's like locking your front door, then hanging the keys up on a hook on the same door.
If you can't secure this sort of thing in WordPress, you need some kind of server-side access layer, such as a RESTful API, that only responds to requests from your specific WordPress domain.

Related

Protecting a folder using .htpasswd, but allowing PHP download while protecting the password

I have password protected a folder using .htpasswd and .htaccess that contains digital assets that I want to control the downloading of using php.
I was planning on offering a download link using the mechanism:
http://username:password@www.website.com/directory/
However, I don't want people to have access to the username and password. In other words, I want to make a PHP gateway file with a different URL that decides whether or not to offer the download, based on information available in the database.
This is a security thing, so I'm not sure where to start. I'm sure I could hash together some code, but I'm not confident about it. How can I do this securely? Any help greatly appreciated.
If you have the technical possibility, I would suggest you store the assets outside of the web-accessible folders entirely, so you don't need to rely on .htaccess for protection. That way your PHP gateway script is the only way to access those files.
I won't go into detail about writing the script itself; there are many ways to do it, and what is best depends very much on your requirements, so more information would be needed to give specific advice (a rough sketch follows below). If your assets are very large, streaming them through your script might not work due to memory/time limitations; in that case you could symlink them from the safe location to a public location with a randomly hashed path/filename for a limited time and give that link out.
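A rough sketch of such a gateway, assuming the assets live in a directory outside the web root; the path, the ?file= parameter, and the user_can_download() database check are placeholders to replace with your own logic:
<?php
// download.php: the PHP gateway; the asset files themselves sit outside the web root.
session_start();

// Example path; replace with wherever you store the protected assets.
$assetDir = '/home/myaccount/protected_assets';

// basename() strips any directory components, preventing path traversal.
$file = isset($_GET['file']) ? basename($_GET['file']) : '';
$path = $assetDir . '/' . $file;

// user_can_download() is a placeholder for your own database lookup.
if ($file === '' || !is_file($path) || !user_can_download($file)) {
    header('HTTP/1.0 403 Forbidden');
    exit('Access denied.');
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $file . '"');
header('Content-Length: ' . filesize($path));
readfile($path);
exit;
Because the script streams the file with readfile(), the real location of the asset and the .htpasswd credentials never appear in any URL you hand out.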

Allow Directory Listing only from specific URL

Setup
Server: Apache 2.2.
I do have access to httpd.conf, but if necessary the solution can use .htaccess.
The goal:
To permit directory listing only when the request comes from a specific URL.
So only if the user was able to access a specific URL on my site will he/she be able to access this directory.
Currently I only have this configuration that allows all to access this directory:
<Directory "/home/myaccount/app/Ui/policies/gray_list">
Options Indexes FollowSymLinks
</Directory>
It is possible to evaluate the Referer header for this (you said requests coming from a specific URL, so a page offering a link); however, note that this is not reliable. The Referer is a simple HTTP header, so it is very easy to manipulate/spoof.
The only reliable approach is to use session handling: enforce some kind of authentication on the referring page and only grant a directory listing if the authentication process protecting that page has led to a valid session. This has to be done at scripting level, though; I am not aware of a straightforward approach using Apache features only. A sketch of the idea follows below.
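A minimal sketch of that scripting-level approach, assuming the referring page sets a session flag after its own authentication succeeds; the flag name and the directory path are just examples:
<?php
// listing.php: emulates a directory index, but only for visitors with a valid session.
session_start();

// 'gray_list_allowed' is an example flag your referring page would set after authentication.
if (empty($_SESSION['gray_list_allowed'])) {
    header('HTTP/1.0 403 Forbidden');
    exit('Not authorized.');
}

$dir = '/home/myaccount/app/Ui/policies/gray_list';
foreach (scandir($dir) as $entry) {
    if ($entry === '.' || $entry === '..') {
        continue;
    }
    echo htmlspecialchars($entry) . "<br>\n";
}
The real directory would then keep indexing disabled (drop the Indexes option), so this script becomes the only way to obtain a listing.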

How to authenticate direct requests to files in a specific directory

Our site is powered by WordPress and I'm using a plugin which allows users to upload files. We have reasonable security in place to prevent malicious files from being uploaded to the server, but those files are then directly accessible to the public. For example, anyone can directly access any of those files like so:
http://www.website.com/path/to/files/file_1.pdf
http://www.website.com/path/to/files/file_2.pdf
http://www.website.com/path/to/files/file_3.docx
This is a security/privacy problem because those files could contain personal data that should not be available to anyone.
I know I can block access to the entire directory using .htaccess, but then the plugin will stop working. Instead, I think I need to use .htaccess to redirect those requests to a script which checks if the current user is authorized to view them. The tricky part is that direct requests to those files bypass the WordPress app, so none of the core functions (like is_user_logged_in()) are available if I redirect to some intermediary page.
It seems like I either need to write a script to check for the WordPress authorization cookies manually (which sounds like a huge hassle) or somehow loop in WordPress.
Any suggestions for an elegant way to add this security layer without breaking the plugin?
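One common pattern, not tied to any particular plugin, is to rewrite requests for the protected upload folder to a small handler that bootstraps WordPress by including wp-load.php, which makes core functions such as is_user_logged_in() available again. A rough sketch, assuming the handler sits next to wp-load.php in the site root and the files live under an example wp-content/uploads/protected directory:
<?php
// serve-file.php: requests for the protected upload folder are rewritten to this script.
// Including wp-load.php boots WordPress, so is_user_logged_in() works here.
require_once __DIR__ . '/wp-load.php';

// The directory and the ?file= parameter are examples; adjust to your plugin's upload path.
$baseDir = WP_CONTENT_DIR . '/uploads/protected';
$file    = isset($_GET['file']) ? basename($_GET['file']) : '';
$path    = $baseDir . '/' . $file;

if (!is_user_logged_in() || $file === '' || !is_file($path)) {
    status_header(403);
    exit('You are not allowed to access this file.');
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $file . '"');
header('Content-Length: ' . filesize($path));
readfile($path);
exit;
The matching .htaccess rule would rewrite requests for /path/to/files/ to this handler instead of letting Apache serve the files directly, so the plugin's uploads keep working while direct downloads get checked.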

Configure httpd.conf to allow host access to my website

I have a server which is online right now, but requires authentication when accessing, so it is basically closed to everyone but me.
Thing is, I don't want to "Open" the website to the public, but I need to test my website on different browsers.
One way is to do it from websites like browsershots.org, which require access to my website. But my website is "closed" (requires authentication) to anyone except me.
I have these lines in my apache2.conf (or httpd.conf, as it is also known):
<Directory /var/www>
AuthType Basic
AuthName "Some name"
AuthUserFile "dir/to/some/file"
Require user some_user
</Directory>
The lines above allow access only to somebody with the username "some_user" and a password, which is stored in "dir/to/some/file".
Now, is there any way to give access to the website from a host also?
My problem is, like I said, that when trying to cross-browser check my website from sites which require a URL to my website, they are all blocked because of the authentication I have.
Do I have to turn off the authentication in order to be able to cross-browser check?
Thanks
If you could verify what IP address they would be hitting your website from, you could use a combination of the Allow and Deny directives to make sure that only requests originating from browsershots.org's IP address get through.
http://httpd.apache.org/docs/2.0/mod/mod_access.html
You can create a page that shows the visitor's IP, visit your site from browsershots.org, then use that address in your Apache config.
What about if you removed the authentication, but then added PHP code to restrict access by IP, so that the site was only accessible from your own computer? Would that work for your purposes? Something like this:
http://www.wmtips.com/php/simple-ways-restrict-access-webpages-using.htm#ip
Edit: sjobe has the better plan. Same idea, but that way you can still let BrowserShots do the work.
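A minimal PHP sketch combining both suggestions; the addresses in the allow-list are placeholders you would fill in once the first step tells you where the testing service connects from:
<?php
// Step 1: temporarily expose this on a page to learn the caller's address.
// echo $_SERVER['REMOTE_ADDR'];

// Step 2: restrict the site to an allow-list once you know the addresses involved.
$allowed = array(
    '203.0.113.10',   // example: your own machine
    '203.0.113.77',   // example: the testing service's address
);

if (!in_array($_SERVER['REMOTE_ADDR'], $allowed, true)) {
    header('HTTP/1.0 403 Forbidden');
    exit('Access denied.');
}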

How can I restrict / authorize access to PHP script?

There is this PHP script on my website which I don't want people to be able to run by just typing its name in the browser.
Ideally I would like this script to be run only by registered users and only from within a Windows app (which I will have to provide). Can this be done ?
Alternatively, how can I protect this script so that it can only be called from a specific page or script?
Also how can I hide the exact URI from appearing on the address bar?
Thanks!
If you are running Apache for your webserver, you can protect it with a username/password combo using .htaccess. It takes a little configuration if your server is not already configured to allow .htaccess. Here are the Apache docs.
If you need authentication based on application-specific factors, you can put something at the top of your script like
<?php
// $user stands for whatever session/user object your application provides.
if (!$user->isLoggedIn()) {
    // pretend the script doesn't exist, then stop execution
    header('HTTP/1.0 404 Not Found');
    exit;
}
Do you have a question about how you would implement isLoggedIn?
You can also use mod_rewrite to rewrite URIs, and those directives can go inside your .htaccess as well. mod_rewrite can rewrite incoming requests transparently (from the browser's perspective) so a request for /foo/bar can be translated into secret_script.php/foo/bar. Docs for mod_rewrite.
However you decide to implement this, I would urge you to not rely solely on the fact that your script's name is obscure as a means to secure your application. At the very least, use .htaccess with some per-user authentication, and consider having your application authenticate users as well.
As Jesse says, it's possible to restrict your script to logged-in users. There are a large number of questions on this already. Search for PHP authentication.
However, it is not possible to restrict it to a single application. It is fairly simple to use a program like Wireshark to see exactly how the program logs in and makes requests. At that point, they can reproduce its behavior manually or in their own application.
There are a variety of different ways that you could go about securing a script. All have pluses and minuses, and it's likely that the correct answer for your situation will be a combination of several.
As mentioned, you could lock down the account with Apache; it's a good start. Similarly, you could build a powerful salted security system such as this: http://www.devarticles.com/c/a/JavaScript/Building-a-CHAP-Login-System-An-ObjectOriented-Approach/ If you use SSL as well, you're essentially getting yourself the kind of security banks use on their websites: not perfect, but certainly not easy to break into.
But there are other ideas to consider too. Park your script in a class file that sits inaccessible via direct URI, then call the various functions from an intermediary view script. Not perfect, but it does limit the ways that someone could directly access the file. Consider adding a "qualifier" to the URL via a simple GET parameter and have the script check for the qualifier or fail (a sketch follows below); again, not a great solution on its own, but one additional layer to dissuade the bad guys. If you control who is getting access (you know exactly which networks), you could even go so far as to limit the IPs or the HTTP referers that are allowed to access the file. Consider setting and checking cookies with a clear expiration. Don't forget to set your robots file so that crawlers don't stumble upon the script you're trying to protect.
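A rough sketch of the qualifier and referer ideas combined; the parameter name, token value, and allowed referer are all examples, and since the Referer header is easy to spoof, treat this as a deterrent rather than real authentication:
<?php
// Example deterrents only: an obscure token plus a referer check, not real authentication.
$expectedToken  = 'example-secret-token';        // example value; rotate it periodically
$allowedReferer = 'https://www.example.com/app'; // example page allowed to link here

$token   = isset($_GET['qualifier']) ? $_GET['qualifier'] : '';
$referer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';

if ($token !== $expectedToken || strpos($referer, $allowedReferer) !== 0) {
    header('HTTP/1.0 404 Not Found');   // pretend the script does not exist
    exit;
}

// ...the protected work happens below this point...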
A while back my company did a membership app using Delphi on the front end, talking to PHP and MySQL on the back end; it was a bit clunky, given that we were all web application developers. If you're so inclined, perhaps Adobe Flex might be an option. But ultimately, you'll have to open a door that the application can talk to, and if someone were determined enough, they could theoretically dig through your app to find the credentials and use them to gain instant access to the site. If you're going the desktop app route, perhaps it's time to consider having the app avoid talking to an intermediary script and do its work on the local machine, communicating with the remote database directly.
You can also deny access to the folder in .htaccess and use PHP authentication that then redirects to those PHP files.
