I'm new to CodeIgniter. I notice that all CodeIgniter folders (cache, config, controllers, core, errors, etc.) contain an index.html file that basically says "Directory access is forbidden". Correct me if I'm wrong, but I don't think it is possible to get to any of these folders from the web with CodeIgniter's default configuration.
What is the purpose of these index.html files? Can I just delete them, or do I leave them alone?
Thanks much.
The purpose of these files is to prevent the contents of the directory from being displayed if directory listing is enabled on your server. Apache servers have directory listing enabled by default.
There are several scenarios in which, given the right circumstances, someone might be able to browse to one of these folders directly; these mainly come down to a server that is not configured properly, or an exploit. So it is really best to just leave the index.html files alone (they aren't hurting anything, and they take up hardly any space).
I'd even go so far as to suggest that you add an index.html file to every folder you create yourself.
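The exact markup varies between CodeIgniter versions, but the placeholder is nothing more than a few lines of static HTML along these lines:

<html>
<head>
    <title>403 Forbidden</title>
</head>
<body>
    <p>Directory access is forbidden.</p>
</body>
</html>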
They are there as a fail-safe, i.e. in case the directory structure somehow becomes publicly browsable.
I can't see any reason to remove them.
If your CodeIgniter installation (the system and app folders) is outside of your public server directory, then they're not going to help with anything, since they could never be served; in that case it doesn't matter whether they exist or not, because nobody can get to their directories anyway.
I say remove them for two reasons:
1) If Apache is configured to allow directory browsing, then it doesn't matter what your index.html says. So claiming that "Directory access is forbidden" when it's really not amounts to security through obscurity, which is an undesirable security strategy.
2) I disagree with the idea that "if it's not hurting anything, just leave it alone". I've spent many an hour trying to figure out the purpose of a particular piece of code, only later to find out that it wasn't doing anything at all. Remove unused code. The inheritors of your projects will curse you less.
They are there for your security: if someone tries to access a folder on your server directly by its URL (and your server is configured the wrong way), that HTML file gets served instead of a listing of the folder's contents.
So yes, it is safe to keep the files.
I have a PHP-enabled site with directory listing turned off.
But when I used Acunetix (web vulnerability scanning software) to scan my site, and other high-profile websites, it was able to list all directories and files.
I don't know why this is happening, but I have a theory: maybe the software is guessing common English names, checking whether folders like "include/", "css/", or "images/" exist, and is able to list files that way.
Because, if directory listing is off, I don't know what more there is to do.
So, I devised this plan, that if I give my folders/files difficult names like I3Nc_lude, 11css11, etc., maybe it would be difficult for the software to find the names. What do you think?
I know I could be dead wrong about this, and the idea might be laughable, but that is why I am asking for help.
How do you completely forbid directory listing?
Ensure all directories from the root of your site have directory listings disabled. It is typically on by default when you set up a new server.
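If the server is Apache and you can't touch the main configuration, a one-line .htaccess file in the site root is usually enough (this assumes AllowOverride permits the Options directive):

# .htaccess in the site root: disable automatic directory listings
Options -Indexes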
Assuming that directory listing in your webserver is not your issue, keep in mind that any resources you have in your site (CSS files, JS sources, and of course HREFs) can be traversed with little or no effort (typically a few lines of JavaScript). There is no way to hide anything that you've referenced. This is most likely what you are seeing reflected in the scan.
Alternatively, if you use SVN or other version control systems to deploy your site, often these can be used to determine the path of every file in your codebase.
Probably the most common mistake people make when first creating sites is that they keep all their files in the webroot, and it becomes somewhat trivial to figure out where things are.
IMHO the best approach is to have your code in a separate directory outside the webroot, and then load it as needed (this is how most MVC frameworks work). You can then control entirely what can and cannot be accessed via the web. You can have hundreds of classes in a directory, and as long as they are not in the webroot, no one will ever be able to see them, even if directory listing were to become enabled.
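As a rough sketch of that layout (the file and directory names here are purely illustrative), the only PHP file under the webroot is a front controller, and everything else lives one level up where the web server can't reach it:

<?php
// public_html/index.php - the only PHP file inside the webroot (illustrative layout)
define('APP_PATH', dirname(__DIR__) . '/app');   // e.g. /var/www/app, outside the webroot

require APP_PATH . '/bootstrap.php';             // loads config, classes, etc.

// Dispatch based on a query parameter; nothing inside /app is ever
// reachable by a direct browser request.
$page = isset($_GET['page']) ? $_GET['page'] : 'home';
$controller = APP_PATH . '/controllers/' . basename($page) . '.php';  // basename() blocks "../" tricks

if (is_file($controller)) {
    require $controller;
} else {
    header('HTTP/1.0 404 Not Found');
    echo 'Page not found';
}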
The checkers aren't using some kind of language-based brute-force attack; that would be far too costly and invasive even for the most inept hacker. Your web server (Apache, IIS, whatever) is serving up the structure to anyone who asks.
I found this solution at the link below; it should apply to you, I hope.
http://www.velvetblues.com/web-development-blog/dont-get-hacked-6-ways-to-secure-your-wordpress-blog/
Hide Your Directory Structure
It is also good practice to hide your directory structure. By default, many WordPress installations enable any visitors to snoop and see all files in folders lacking an index file. And while this might not seem dangerous, it really is. By enabling visitors to see what files are in each directory, they can better plot their attack.
To fix this problem, you can do one of two things:
Option 1: Use An Index File
For each directory that you want to protect, simply add an index file. A simple index.html file will suffice.
Option 2: Use An .htaccess File
The preferred way of hiding the directory structure is to use the following code in an .htaccess file.
Options -indexes
That just sounds like a nightmare to manage. Focus on securing the files as best you can with all the preventative measures available. Don't rely on security through obscurity. If someone wants in, some random directory names will just slow them down slightly.
I have found a similar attack to the one mentioned here:
giant regex hack
The file keeps getting recreated and I cannot see where it's coming from. How can I fix this? Has anyone had a similar experience? I am running Joomla 1.5.25.
How can I trace the script that includes this file? How do I secure the site?
It's always recommended that you keep the permissions of your .htaccess file read-only.
chmod 0555 .htaccess
As for the hack being recreated again and again, there could be numerous reasons, such as the host not being properly configured; on shared hosting, a script running for a different domain may be able to access scripts for your own domain.
Also check what kinds of files you allow to be uploaded, whether through the admin side or the frontend. Make sure that not just any file type is accepted; executable files like PHP should not be allowed to be uploaded.
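As a rough sketch of that idea (generic PHP, not Joomla-specific code), an upload handler can whitelist a few harmless extensions and reject everything else. An extension check alone isn't bulletproof, but combined with storing uploads outside the webroot it keeps uploaded PHP from ever executing.

<?php
// Illustrative upload check - field and directory names are just examples.
$allowed = array('jpg', 'jpeg', 'png', 'gif', 'pdf');

$name = $_FILES['upload']['name'];
$ext  = strtolower(pathinfo($name, PATHINFO_EXTENSION));

if (!in_array($ext, $allowed, true)) {
    die('File type not allowed.');
}

// Store under a generated name, ideally outside the webroot, so the
// file can never be executed even if the check is somehow bypassed.
move_uploaded_file($_FILES['upload']['tmp_name'], '/var/uploads/' . uniqid() . '.' . $ext);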
Yikes, that's not a good situation. I've seen it happen a few times and more often than not the solution was to recreate the website from a clean Joomla install, copy over the data and reinstall components.
However, first check that the permissions are ok (no 666, 777, etc.) and definitely check the VEL to see if any of your extensions have known vulnerabilities http://docs.joomla.org/Vulnerable_Extensions_List
You could also check on the Inj3ctor database http://www.1337day.com/ .
Most hacks like these happen via out-of-date extensions, open permissions, or, as linuxeasy mentioned, a poorly configured host.
I would highly recommend installing jhackguard or eyesite on the website. Eyesite will monitor your files and notify you when changes occur: http://extensions.lesarbresdesign.info/extensions/eyesite
I'm starting a project in PHP, and I want to structure my files properly from the start (unlike my last project, which had almost every file in a single directory). The problem is the following, which I will describe with an example:
Take the following files: index.php, includes/header.php, and css/common.css. index.php 'includes' the header (as will many other php files). The header then calls common.css so that its html elements can be placed properly. common.css will also provide styling for general elements in index.php and other files.
Notice that since the header is being included, when the header calls common.css it does so relative to the location of the file that included it; in this case, index.php. But if I add, say, modules/friends.php and call the header from it, it will be looking for the CSS file in the wrong spot!
Initially I tried to remedy this by using the actual path for when I call CSS files. However, my local machine and web server have a different layout of directories, and therefore I cannot simply call /var/www/whatever.
Can anyone help me or redirect me to a place where this sort of thing is documented?
Thanks,
Paragon
Always specify absolute paths to all your resources: .css, .js, images, etc...
http://en.wikipedia.org/wiki/Absolute_path
However, my local machine and web server have a different layout of directories, and therefore I cannot simply call /var/www/whatever.
You can. Web paths are not the same thing as local filesystem paths. When you specify a path on the web, the leading / refers to the webroot (the directory your project is served from), not your filesystem root.
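A minimal sketch of what that looks like in the shared header, using the file names from the question: the leading slash makes the browser resolve the stylesheet from the webroot, no matter which page included the header.

<?php // includes/header.php ?>
<!-- Resolved from the webroot, so it works whether index.php or modules/friends.php included this file -->
<link rel="stylesheet" href="/css/common.css">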
Congratulations on recognizing a huge problem.
Yes, this is always the big, important question that you need to answer at the start.
I've finally learned -- and this is after quite a few years -- to try my best to make the file structure on the development machine (my PC, say) be exactly like the file structure on the host machine (a Linux host, for example). That one thing alone has saved me unending hours of grief.
If you can accomplish that, then the rest is a piece of cake, believe me. You can put files in whatever directories you want, wherever it makes sense to you, on both machines. You can figure out what files should go where.
If you don't bother to try for near-identical file-directory setups on both machines, you are forever going to be wondering, as you edit away, "Hey, what machine am I on? If I'm on the host, then very-important-file.php is in /toplevel, and everything else is under it. But if I'm on the PC, then very-important-file.php is over here in /my-files, see, and then other files are on different levels and did I delete that file and ..." My God, don't make me think, much less think about that mindless crap.
You can handle and remember just the root being in different spots on different machines, but other than that, forget it.
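One way to keep that single difference manageable (just a sketch; the names are examples) is to define the root once in a small bootstrap file and build every other path from it:

<?php
// bootstrap.php - the only place that knows where the project root is on this machine
define('PROJECT_ROOT', __DIR__);   // resolves correctly on the PC and on the Linux host alike

// Everything else is built from that one constant:
require PROJECT_ROOT . '/includes/header.php';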
Now when you come to run your stuff, you will always know where the pieces of that stuff are: CSS files, JS files, whatever. PLUS you can (maybe; if you're lucky) debug your code on the PC or the host equally well, with no differences and with no changes anywhere. PLUS when you upload your new code, you can FTP it up to the host in one big chunk rooted where you like. (Which has the very nice ancillary benefit of your being able to move files around wherever you want on the development machine.)
Piece of cake! Don't pass up this chance to save yourself days or weeks (literally) of time.
Always IMHO.
I am always reading that you should store your database credentials outside of your document root, because normally you would have them in a file named db.inc or something similar.
I can understand this and naturally it makes perfect sense.
What I don't understand is why you would make the file one that you either need to configure Apache to hide, or need to put in a secure location, in the first place.
What is the issue with making it, say, db.php? Then Apache knows to execute the script first and return the output (which would presumably be blank in most cases).
Maybe I am being dumb and missing an inherent security flaw, but are there any issues with just storing your details in a .php file? I mean Wordpress and other major open source PHP applications manage to get away with it, but is this because they can't make their script talk to folders outside of www or because it is just as secure as any other method?
Maybe I am being dumb and missing an inherent security flaw, but are there any issues with just storing your details in a .php file?
A tiny slip up in the configuration of Apache, and the file starts being served raw instead of being processed by the PHP engine.
I mean Wordpress and other major open source PHP applications manage to get away with it, but is this because they can't make their script talk to folders outside of www or because it is just as secure as any other method?
They accept increased risk for increased convenience.
Storing files containing (database) credentials outside the document root is always a good idea.
Say you upgrade Apache but forget to update the PHP configuration. Any file in the document root could then be downloaded without being parsed.
Wordpress, Joomla, phpBB and others are made to be portable, that is, to reside in one folder.
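A minimal sketch of the outside-the-document-root layout (the directory and variable names are assumptions for the example): the credentials file sits one level above the webroot and is pulled in with a filesystem path, so the web server never serves it directly.

<?php
// public_html/index.php
// ../config/db.php lives outside the document root, so even a misconfigured
// web server cannot hand its source to a browser.
require dirname(__DIR__) . '/config/db.php';   // defines $dsn, $dbUser, $dbPass (example names)

$pdo = new PDO($dsn, $dbUser, $dbPass);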
I was going to ask what the best way to do this is, but then decided I should ask whether or not it is even necessary. I have never seen it done in JSP development, but it appears to be common practice in PHP. What is the reasoning behind this, and if I do not protect against this, what else should I be taking into consideration?
The reason this is more common in PHP than other similar languages has to do with PHP's history. Early versions of PHP had the "register_globals" setting on as a default (in fact, it may not have even been a setting in really early versions). Register_globals tells PHP to define global variables according to the query string. So if you queried such a script thusly:
http://site.com/script.php?hello=world&foo=bar
... the script would automatically define a variable $hello with value "world" and $foo with value "bar."
For such a script, if you knew the names of key variables, it was possible to exploit the script by specifying those variables on the query string. The solution? Define some magic string in the core script and then make all the ancilliary scripts check for the magic string and bail out if it's not there.
Thankfully, almost nobody uses register_globals anymore, but many scripts are still very poorly written and make stupid assumptions that cause them to do damage if they are called out of context.
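The classic form of that check looks something like this (the constant name is arbitrary): the front controller defines a constant, and every ancillary script refuses to run without it.

<?php
// index.php - the single entry point
define('IN_APP', true);                      // the "magic string"
require 'includes/functions.inc.php';

<?php
// includes/functions.inc.php - an ancillary script
if (!defined('IN_APP')) {
    die('Direct access is not allowed.');    // bail out when called out of context
}
// ... rest of the include ...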
Personally, I avoid the whole thing by using the Symfony framework, which (at least in its default setup) keeps the controllers and templates out of the web root altogether. The only entry point is the front controller.
If you include everything from outside web root then it's not an issue as nothing can be loaded directly.
Well, this is to prevent sensitive includes from being served by the web server directly. It's certainly not an all-inclusive security measure, but it could help with your particular setup.
If, however, your user was in a position to include the file from their own script, it won't help at all.
I emit a 404 page, not as a serious security measure but only because I don't like leaking information about the internals of a site, even the names of internal files.
But if the file just contains functions then there's no real harm in omitting the check.
It also isn't just a security feature in PHP, but more a consequence of how many MVC-based PHP sites function. If, for example, you were to call a SugarCRM module file directly, the page load would fail because the controller, view and model were not previously loaded, and you'd have no DB config/connection information either. So, to make sure all dependencies are loaded, the user is forced through a known entry point, i.e. index.php.
I just found an approach in the .NET MVC system that you could replicate for PHP using Apache rewrites and .htaccess files, or, if you are using IIS, a web.config file.
As the MVC pattern doesn't need the user to access .aspx files directly, these are not served and a 404 is sent instead. If you have a naming convention for included files ("*.inc.php", for example), you could redirect *.inc.php requests to a 404 for specific folders; in Apache rewrites, supplying R=404 at the end of the rule will return that HTTP status to your client.
Some of these examples may help: Apache Rewrite Examples
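For the Apache case, a sketch of such a rule might look like this (the pattern and its placement in an .htaccess file are just an example):

# .htaccess - answer any request for *.inc.php with a 404 instead of executing it
RewriteEngine On
RewriteRule \.inc\.php$ - [R=404,L]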
As already mentioned in some of the other answers, you shouldn't need to do this. If a file isn't supposed to be served up by the web server, you shouldn't leave it within the web folder. Includes should be placed in a directory outside the web root.
Apart from that, the proper way to tell the user that a page doesn't exist is by emitting a 404 status, using:
header("HTTP/1.0 404 Not Found");
exit;
If you don't do this, it is hard for non-humans (e.g. search engines) to distinguish between a regular page and a non-page.
This is very important, because if you are editing your site while running the Google Toolbar, it can find your internal PHP files and put them into search results. At best this will create an awkward experience for users, but if you are a sloppy programmer it could reveal database connection information.