I have a PHP-enabled site, with directory listing turned off.
But when I used Acunetix (web vulnerability scanning software) to scan my site, and other high-profile websites, it was able to list all directories and files.
I don't know why this is happening, but I have a theory: maybe the software is using English words, checking whether a folder exists by trying names like "include/", "css/", "/images", etc. Then, maybe it is able to list files that way.
Because, if directory listing is off, I don't know what more there is to do.
So I devised this plan: if I give my folders/files hard-to-guess names like I3Nc_lude, 11css11, etc., maybe it would be difficult for the software to find them. What do you think?
I know I could be dead wrong about this, and the idea might be laughable, but that is why I am asking for help.
How do you completely forbid directory listing?
Ensure all directories from the root of your site have directory listings disabled. Listing is typically on by default when you set up a new server.
Assuming that directory listing in your web server is not your issue, keep in mind that any resources in your site (CSS files, JS sources, and of course HREFs) can be traversed with little or no effort (typically a few lines of JavaScript). There is no way to hide anything that you've referenced. This is most likely what you are seeing reflected in the scan.
Alternatively, if you use SVN or other version control systems to deploy your site, these can often be used to determine the path of every file in your codebase.
Probably the most common mistake people make when first creating sites is keeping all their files in the webroot, which makes it somewhat trivial to figure out where things are.
IMHO the best approach is to have your code in a separate directory outside the webroot and load it as needed (this is how most MVC frameworks work). You can then control entirely what can and cannot be accessed via the web. You can have hundreds of classes in a directory, and as long as they are not in the webroot, no one will ever be able to see them, even if directory listing were to become enabled.
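For illustration, here is a minimal front-controller sketch of that idea (the paths and class name are made-up examples, not something from your setup):

<?php
// /var/www/public/index.php -- the only script that lives inside the webroot.
// Application code sits in /var/www/app, which the web server never serves.
define('APP_PATH', dirname(__DIR__) . '/app');

// Pull a class in from outside the webroot; it cannot be requested by URL
// because only /var/www/public is exposed by the server.
require APP_PATH . '/controllers/HomeController.php';

$controller = new HomeController();
echo $controller->index();

Even if directory listing were turned on for the docroot, a visitor would only ever see index.php and your static assets.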
The scanners aren't using some kind of language-based brute-force attack; that would be far too costly and invasive even for the most inept hacker. Your web server (Apache, IIS, whatever) is serving up the structure to anyone who asks.
I found this solution at the link below; it should apply to you, I hope.
http://www.velvetblues.com/web-development-blog/dont-get-hacked-6-ways-to-secure-your-wordpress-blog/
Hide Your Directory Structure
It is also good practice to hide your directory structure. By default, many WordPress installations enable any visitors to snoop and see all files in folders lacking an index file. And while this might not seem dangerous, it really is. By enabling visitors to see what files are in each directory, they can better plot their attack.
To fix this problem, you can do one of two things:
Option 1: Use An Index File
For each directory that you want to protect, simply add an index file. A simple index.html file will suffice.
Option 2: Use An .htaccess File
The preferred way of hiding the directory structure is to use the following code in an .htaccess file.
Options -indexes
That just sounds like a nightmare to manage. Focus on securing the files the best you can with all preventative measures. Don't rely on security through obscurity. If someone wants in, some random directory names will only slow them down slightly.
I have a WordPress website, and recently we discovered that it was infected by several malware scripts that inject code using the common base64 and eval functions.
We were able to clean most of the infected files, but there are still some scripts being injected into index.html.
All these injected scripts make requests to sites that immediately trigger my computer's antivirus.
So the question here is: how can I track down which file loads these lines? How can I know which file prints them? I can't just search for the injected string, since the code is obfuscated with base64.
The truth is, it's probably going to be more than one file, and/or it's going to be something hidden deep in a plugin/upload folder.
This is going to be a bit time-consuming, but these are generally the steps I follow when fixing a hacked site to narrow things down and make sure I got all the crap out:
1) Before you do anything else, make sure you have a backup of both the files and db. That way, if you accidentally delete something, it's easy to restore.
2) Delete any unused themes or plugins, and make sure all existing plugins are up-to-date.
3) Update WordPress to the current version. Seriously. Keeping up-to-date is important. If you're more than two major releases behind, you'll want to update incrementally. (https://codex.wordpress.org/Upgrading_WordPress_-_Extended_Instructions)
4) After you've updated, connect via FTP and look for files older than when you updated. Look for extra files that shouldn't be there; this can be tricky, because hacked files are usually named things like wp-shortcode-s.php. I usually have a copy of the WP core files open in a window beside my FTP client as a reference.
5) Check the first few lines of code in the PHP and JS files in your plugins folder for malicious code (a scripted approach is sketched after this list). Again, you might want to have a freshly downloaded copy of the plugin to compare files to.
6) Check the uploads folder and subfolders for malicious files.
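For step 5, instead of eyeballing every file you can script a first pass. This is only a rough sketch; the patterns are assumptions and will throw false positives, since plenty of legitimate plugins also call these functions:

<?php
// scan.php -- run from the command line: php scan.php /path/to/wordpress
// Lists PHP/JS/HTML files whose contents match common injection patterns,
// together with their modification time, so odd timestamps stand out.
$root     = isset($argv[1]) ? $argv[1] : __DIR__;
$patterns = '/eval\s*\(|base64_decode\s*\(|gzinflate\s*\(|str_rot13\s*\(/i';

$files = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($root, FilesystemIterator::SKIP_DOTS)
);

foreach ($files as $file) {
    if (!in_array($file->getExtension(), array('php', 'js', 'html'), true)) {
        continue;
    }
    $contents = file_get_contents($file->getPathname());
    if ($contents !== false && preg_match($patterns, $contents)) {
        echo date('Y-m-d H:i', $file->getMTime()), '  ', $file->getPathname(), PHP_EOL;
    }
}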
I also keep checking my hacked site here to see how I'm doing:
http://isithacked.com/
And when you're finished, you might want to read up on how to harden WP to make it more difficult to hack.
Depending on the source of the malware, it's hard to give you a precise hint. There are a few more in-depth walk-throughs about the topic that you can find on Google; here are some good examples which could help:
https://www.wordfence.com/docs/how-to-clean-a-hacked-wordpress-site-using-wordfence/
https://blog.sucuri.net/2011/02/cleaning-up-an-infected-web-site-part-i-wordpress-and-the-pharma-hack.html
Also, if you are on a shared host, the issue could potentially be coming from another compromised user. Hopefully you have a clean version of the site, so that moving to another host (and upgrading) is an option.
Why is it always recommended to place framework files outside of the public root?
Especially given that sometimes a framework doesn't have .ini or .inc files that can be opened by a browser.
Well, there is definitely nothing to be gained from placing framework sources inside the web root. Since the choice of where to place the files is therefore free, it's only logical to go with the principle of least privilege: you don't need web access to these files, so they shouldn't get it.
A more concrete reason is that framework sources can easily disclose the brand and version of a framework being used on a website (although this information can also usually be gained by examining the generated content); this in turn can make it easier for malicious users to exploit known or newly discovered vulnerabilities.
This is safer because if there is any misconfiguration in the web server, it is possible that script files (be it .php, .asp, or whatever) will be spat out in plain text, and a potential attacker then sees all your source code and defined passwords. So the best practice is to put only an index.php file in the webroot, which in turn includes the bootstrap script from outside the webroot.
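A rough sketch of that arrangement (the paths here are only examples):

<?php
// /var/www/example.com/public/index.php -- the only file inside the webroot.
// Even if the server is misconfigured and spits this out as plain text,
// it reveals nothing but a single require line; the real code, config and
// passwords live in /var/www/example.com/private, outside the webroot.
require dirname(__DIR__) . '/private/bootstrap.php';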
I remember one real-world example: in Latvia, where I live, we have a large social network, "draugiem.lv" (in our country more popular than Facebook), and a few years ago all of their PHP source code was leaked by a misconfigured server, as I described earlier.
In addition to the standard reasons cited by the other answers - server mis-configuration, principle of least privilege, etc - it is worth noting that many frameworks, including Zend Framework, can use config files that are in formats other than PHP, e.g., .ini, .yml, etc.
If these were in the public-accessible web root, then - depending upon server config - they would be served directly to anyone who requests them. Since these config files typically contain sensitive information like db passwords, API-keys, etc, it is certainly desirable to make them as inaccessible as possible.
As an example, consider application/configs/application.ini. If the doc root were at the project-folder level, then a request for:
http://example.com/application/configs/application.ini
would deliver the keys to the castle.
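With the doc root pointed at public/ instead of the project folder, that request simply returns a 404, while the application still reads the file from disk. A sketch using plain parse_ini_file (Zend Framework has its own config classes, so treat the key name below as a made-up example):

<?php
// public/index.php -- document root is the project's public/ directory.
// application/configs/application.ini sits one level up, outside the doc root.
$config = parse_ini_file(
    dirname(__DIR__) . '/application/configs/application.ini',
    true  // keep [production], [staging], ... sections separate
);

// Use the sensitive values server-side only; never echo them.
$dbPassword = $config['production']['resources.db.params.password'];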
I'm wondering whether there is a difference between having my site within /var/www or /home/myuser/public/sites. I've just seen some tutorials that point to the former and others to the latter, but they didn't mention any key benefits of placing it in one or the other. If you know some articles that explain this best, please share them with me.
Huge thanks.
The main difference that I can think of from one directory to another is if it's on a separate partition and you set different flags, such as noexec or nosuid. Apart from that, the actual directory you use doesn't matter.
However, with that said, by default a user's home directory is created within /home, so especially on something like a shared hosting server it makes sense to have the DocumentRoot as a sub-directory of the user's home directory. That allows users to easily modify their files without you having to give them permissions outside their home directory, which in turn makes it easy to "lock" them in with things such as FTP or SFTP with a chroot.
There are no pros or cons sir. You have the freedom to use what you want to.
I'm new to CodeIgniter. I notice that all the CodeIgniter folders (cache, config, controllers, core, errors, etc.) contain an index.html file that basically says "Directory access is forbidden". Correct me if I'm wrong, but I don't think it is possible to get to any of these folders from the web based on CodeIgniter's default configuration.
What is the purpose of these index.html files? Can I just delete them, or do I leave them alone?
Thanks much.
The purpose of them is to prevent the contents of the directory from displaying if directory listing is enabled on your server. Apache servers by default have directory listing enabled.
There are several circumstances in which you might be able to browse to a folder directly. These would mainly be caused by a server which is not configured properly, or by an exploit. Therefore it is really best if you just leave the index.html files alone (they aren't hurting anything, and they don't take up much space).
I'd even go as far as to suggest that you too add an index.html file to any and all folders which you create.
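If you do decide to drop a placeholder into every folder, you don't have to do it by hand; a small one-off script can handle it (just a sketch, and the placeholder markup is only an example):

<?php
// add_index.php -- run once: php add_index.php /path/to/your/app
// Writes an index.html into every subdirectory that doesn't already have one.
$root        = isset($argv[1]) ? $argv[1] : __DIR__;
$placeholder = "<html><body>Directory access is forbidden.</body></html>\n";

$dirs = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($root, FilesystemIterator::SKIP_DOTS),
    RecursiveIteratorIterator::SELF_FIRST
);

foreach ($dirs as $item) {
    if ($item->isDir() && !file_exists($item->getPathname() . '/index.html')) {
        file_put_contents($item->getPathname() . '/index.html', $placeholder);
    }
}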
They are there as fail-safes, i.e. in case for some reason the directory structure were to become publicly browsable.
I can't see any reason to remove them.
If your codeigniter installation (system and app folders) is outside of your public server directory, then they're not going to help with anything since they could never be served. In that case, it doesn't matter whether they exist or not, since you could never get to their directories anyway.
I say remove them for two reasons:
1) If Apache is configured to allow directory browsing, then it doesn't matter what your index.html says. So claiming that "Directory access is forbidden" when it really isn't amounts to security through obscurity, which is an undesirable security strategy.
2) I disagree with the idea that "if it's not hurting anything, just leave it alone". I've spent many an hour trying to figure out the purpose of a particular piece of code, only later to find out that it wasn't doing anything at all. Remove unused code. The inheritors of your projects will curse you less.
They are there for your security: if someone tries to access one of your folders on the server via your domain URL (and your server is configured in the wrong way), serving that HTML file prevents them from seeing a listing of those files.
So yes, it is safest to keep the files.
I'm looking to centralize a lot of my web applications code, so that multiple components have access to the same core functionality. This is how I have the website set up:
/var/www/website - domain.com
/var/www/subdomain1 - subdomain1.domain.com
/var/www/subdomain2 - subdomain2.domain.com
Naturally I've had a lot of trouble when it comes to the duplication of common functionality, as any changes made to one area would also need to be applied to other areas. My proposed solution is to create a new directory in /var/www which will contain all of the core scripts:
/var/www/code - core code
I would then set the PHP include directory to /var/www/code, so scripts can include these files without having to specify the absolute path.
Can you think of any more efficient ways of centralizing the code?
Many thanks!
Your approach is good enough for this purpose.
A little suggestion:
store your front-end scripts in a directory like /var/www/website/www instead of /var/www/website. That is where the index file, AJAX processors, and scripts like that will live. Your project-based includes (as well as other miscellaneous stuff), however, would be stored in a directory like /var/www/website/includes. It is a simple yet efficient defense against attacks on your include files.
So your document roots would be /var/www/website/www (domain) and /var/www/website/subdomain/www/ (subdomain).
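A quick sketch of how the front script in that layout would reach the includes directory (the file name is just an example):

<?php
// /var/www/website/www/index.php -- document root is /var/www/website/www.
// Anything under /var/www/website/includes can never be fetched by URL,
// because it sits outside the document root.
require __DIR__ . '/../includes/bootstrap.php';   // example include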
It seems that you are thinking correctly:
Share Code between multiple PHP sites
It's only a suggestion, but you should put only the public content in /var/www/*, which may end up being publicly accessible (either because of your HTTP server or because of some misconfiguration), and create some other directories for your shared code/libs, like /usr/local/lib/php/*.
For more security you should frame it with open_basedir, adding the private and public dirs, as well as the upload and session dirs.
And don't forget to version your libs, e.g.:
/usr/local/lib/php/myLib-1.0
/usr/local/lib/php/myLib-1.2
etc.
Thus, you'll be able to make changes without breaking everything.
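For illustration, the front script can then point PHP at whichever library version it wants (a sketch; the library name comes from the example above, the file name is made up):

<?php
// /var/www/website/index.php
// Prepend the versioned shared-library directory to the include path,
// so upgrading to myLib-1.2 later is a one-line change.
// (If you use open_basedir, remember to whitelist /usr/local/lib/php too.)
set_include_path('/usr/local/lib/php/myLib-1.0' . PATH_SEPARATOR . get_include_path());

require_once 'Database.php';   // resolves to /usr/local/lib/php/myLib-1.0/Database.php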