Is there any way to prevent a user from writing code that includes a PHP file if they have access to the server? I can't use an .htaccess "deny from all" type of solution because, as I understand it, that only applies to remote access. I also can't use a solution like the ones described here (link from Tyler Carter), because I process all pages through one central call which includes the appropriate page parts. Thus, $_SERVER['PHP_SELF'] always returns the same file, and I can't get the __FILE__ of the calling script from within the called script. So neither of those works.
I simply want to prevent the parts included by the central file from themselves calling other files on the server outside their own directory.
I hope I am missing something and that this can be accomplished with .htaccess?
I guess a bigger question is if this is even a worthwhile security endeavor?
No.
If the user has the right to write files to your web-root, there is nothing you can do to prevent him/her from creating PHP files (short of writing your own FileSystem).
If you don't trust a user, they should not have the ability to write to your FS. This attack is actually quite commonly used in the real world against CMS installations. A user with file upload permissions uploads a PHP shell to take over the server.
Related
I have found in my server, that some files are stored inside the webroot, for example
example.com/files/longnamewithcode_1234293182391823212313231230812.pdf
They are stored for a web app and have sensitive info in them.
If you access example.com/files you get an empty index.html, so you can't directly list the directory. Anyway, I'm concerned about this: it doesn't feel safe, and I would like to know what kinds of attacks could be used to access the files. I understand that some brute-force attack would be possible, but I guess the long coded names make it less of a problem.
Finally, I would say that the correct way is storing the files outside the web folder and return them with PHP, but I'm not sure I'll be able to have access to the code to change this.
If you have to make the files accessible from the webroot via the webserver, you can't really make it safer than using a sufficient amount of entropy in the file names, but that still doesn't account for users simply sharing the links once they somehow get hold of them.
If you want to implement the permission checking inside PHP, take a look at the various X-Sendfile implementations on popular webservers, like mod_xsendfile (Apache), X-Accel-Redirect (nginx) or X-LIGHTTPD-send-file (lighttpd). This allows you to use the webserver to serve the file basically as efficiently as accessing it directly from the webroot, after you have validated the accessing user.
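For example, a gatekeeper script for Apache with mod_xsendfile might look like this. This is a minimal sketch: the script name, the /var/private_files directory and the session-based login check are all assumptions, not part of the original setup.

<?php
// download.php - validate the user, then hand the transfer off to Apache.
session_start();

// Only logged-in users may download (assumed session layout).
if (empty($_SESSION['user_id'])) {
    http_response_code(403);
    exit('Access denied');
}

// basename() strips path components, preventing directory traversal.
$file = basename($_GET['file'] ?? '');
$path = '/var/private_files/' . $file; // directory outside the webroot

if ($file === '' || !is_file($path)) {
    http_response_code(404);
    exit('Not found');
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $file . '"');
header('X-Sendfile: ' . $path); // Apache streams the file, not PHP

The X-Sendfile header is consumed by mod_xsendfile, so the client never sees the real path, and PHP is freed up as soon as the headers are sent.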
Have you considered an .htaccess file to restrict who is allowed to access those sensitive files? You tagged it, but I'm not sure why you are not using it. =)
If you wish to block everything in the folder you can use an .htaccess file to block all connections.
Try
deny from all
in the .htaccess file in that directory. This will deny all access to anything in that directory including the index file.
The question is
Are the files supposed to be accessed by users freely?
If yes, don't worry about those files too much (as long as they're not writeable).
If no, i.e. users have to be logged in to download them, move them out of the publicly accessible folder and deeper into the application. Write a PHP script that will manage the permissions for the file, e.g. /download?file_id=1, as sketched below.
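A minimal sketch of such a script, assuming a PDO connection and a files table with id, owner_id and filename columns (the table layout, paths and credentials here are all illustrative):

<?php
// download.php - look up the file by id, check ownership, then stream it.
session_start();

if (empty($_SESSION['user_id'])) {
    http_response_code(403);
    exit;
}

$pdo = new PDO('mysql:host=localhost;dbname=app', 'dbuser', 'dbpass');
$stmt = $pdo->prepare('SELECT filename FROM files WHERE id = ? AND owner_id = ?');
$stmt->execute([(int)($_GET['file_id'] ?? 0), $_SESSION['user_id']]);
$row = $stmt->fetch();

if (!$row) {
    http_response_code(404);
    exit;
}

// The files themselves live outside the document root.
$path = '/srv/app/storage/' . basename($row['filename']);
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($row['filename']) . '"');
readfile($path);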
So I'm a bit confused about what crafty users can and can't see on a site.
If I have a file with a bunch of PHP script in it, the user can't see it just by clicking "view source." But is there a way they can "download" the entire page, including the PHP?
What permission settings should pages be set to if there is PHP script that must execute on load, but that I don't want anyone to see?
Thanks
2 steps.
Step 1: So long as your PHP is being processed properly, this is nothing to worry about... so make sure it is.
Step 2: As an insurance measure, move the majority of your PHP code outside of the web server directory and then just include it from the PHP files that remain in the directory. PHP includes via the file system and therefore has access to the files, but the web server does not. On the off chance that the web server gets messed up and serves your raw PHP code (this happened to Facebook at one point), the user won't see anything but a reference to a file they can't access.
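For example (a minimal sketch; the directory layout, file names and run_application() entry point are assumptions):

<?php
// public_html/index.php - the only PHP file that lives inside the webroot.
// The real application code sits one level up, where the web server
// cannot serve it directly but PHP can still read and include it.
require __DIR__ . '/../app/bootstrap.php';

run_application(); // hypothetical entry point defined in the included code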
PHP files are processed by the server before being sent to your web browser. That is, the actual PHP code, comments, etc. cannot be seen by the client. For someone to access your php files, they have to hack into your server through FTP or SSH or something similar, and you have bigger problems than just your PHP.
It depends entirely on your web server and its configuration. It's the web server's job to take a url and decide whether to run a script or send back a file. Commonly, the suffix of a filename, file's directory, or the file's permission attributes in the filesystem are used to make this decision.
PHP is a server-side scripting language that is executed on the server. There is no way it can be accessed client-side.
If PHP is enabled, and if the scripts are properly enclosed in PHP tags, none of the PHP code will make it past your web server. To make things further secure, disable directory browsing, and put an empty index.php or index.html in all the folders.
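On Apache, for instance, directory browsing can be switched off with a single .htaccess directive (assuming the host's AllowOverride settings permit it):

Options -Indexes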
Ensure that you adhere to secure coding practices too. There are quite a number of articles on the web. Here is one: http://www.ibm.com/developerworks/opensource/library/os-php-secure-apps/index.html
We have several different client directories (each its own domain) that include/require the central app from a different location on the server. Basically each domain is an extension of the centralized code, but very lean, because none of the main code needs to be duplicated.
If we wanted to give clients/resellers access to edit their own PHP code, how would we prevent them from reading the central code that we wish to protect?
Basically we want to prevent them from creating some code that opens, reads, TARs, or somehow outputs the source code, but we must still allow the include.
open_basedir does almost this; it prevents opening the code, but in doing so it also prevents the include.
Are code encryption solutions (e.g. Zend Guard) our only option, or is there something like open_basedir that allows includes? I've also thought about disabling all the read functions and writing my own that check the source.
Thoughts?
The answer is no, you cannot give the "read" permission and prevent them from reading...
If they can "include" the code, they can also write a simple PHP script that reads your central app files and prints the contents to the screen, for example.
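Something as short as this would do it (the path is illustrative):

<?php
// Any script the client can run on the same filesystem can dump the source.
echo file_get_contents('/var/www/central_app/secret_logic.php');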
I believe you cannot restrict reading if you allow reading globally. However, you could filter access to your site in an .htaccess file with %{REMOTE_HOST} or similar. Basically, if you are able to identify your clients by their remote location (IP or URL), then I believe you can restrict reading of specific directories based on who is accessing the site. Can you give me an example of your PHP code for the reseller access to your site?
I ended up using Smarty to give limited capabilities to clients (templating), while keeping the PHP secure.
I've seen recommendations to store some or all php include files some place other than in the web document root directory (username/public_html in my case) for the specific reason of protecting php files with sensitive information (like database connection and login info) in the event that the web server hiccups and stops protecting php files and they become 'visible' to outsiders who know where to look.
It seems somewhat paranoid to me, but I'm guessing people have gotten burned badly on this before, so I'm willing to go along. The suggestion usually takes the form of having the include files in something like '../include_files/' so it's not directly in the document root and not directly accessible to outsiders through the web server.
My question is this: is there a significant difference in security between that way and just putting your 'include_files' directory under the document root and sticking an .htaccess file in there (with the appropriate entries)? Would putting an .htaccess file in '../include_files/' make any significant improvement there?
TIA,
Monte
Using .htaccess adds overhead since Apache has another item it needs to check for and process.
Keeping files out of web root isn't being paranoid, it's good practice. What happens if someone accesses one of the "include" files directly and it throws out revealing errors because all the pre-requisite files weren't loaded?
Each file needs to have its own security checks to make sure it is running in the expected environment. Each executable file in a web-accessible area is a potential security hole.
It really depends on what you have in your include_files. The most important thing is that you put any credentials (database logins, etc.) outside of the document root. Everything else really is secondary and doesn't matter that much.
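One common pattern (a sketch; the paths and values are placeholders) is a config file that lives above the document root and is pulled in by the public scripts:

<?php
// ../config/db.php - above the document root, unreachable via HTTP.
return [
    'dsn'  => 'mysql:host=localhost;dbname=app',
    'user' => 'dbuser',
    'pass' => 'secret',
];

And in a script inside the document root:

<?php
$config = require __DIR__ . '/../config/db.php';
$pdo = new PDO($config['dsn'], $config['user'], $config['pass']);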
If you don't want anyone stealing your source code then try to follow Zend conventions:
application
library
public
DocumentRoot points to public and that just contains media files, js/css files. HTML/views, db logic, conf/credentials are in application. Third party libraries are in library.
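In that layout, the only script the web server can reach is a thin front controller (a minimal sketch; the file names are assumptions):

<?php
// public/index.php - DocumentRoot points here; everything else is unreachable.
define('APPLICATION_PATH', __DIR__ . '/../application');
set_include_path(__DIR__ . '/../library' . PATH_SEPARATOR . get_include_path());

// Hand off to the real application outside the webroot.
require APPLICATION_PATH . '/bootstrap.php';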
Theoretically, if you just stick an .htaccess file in the folder, the .php files could still end up being served directly, for example if the server is ever configured to ignore .htaccess files (AllowOverride None).
Taking them out of the server root, however, keeps them from ever being accessed by someone who is browsing your website.
Is it possible to "deny from all" (Apache .htaccess style) using PHP?
I can't use .htaccess because I'm using a different web server, so I want to use PHP to work around it.
So let's say a user is trying to access a folder named 'david': all contents and subdirectories should be denied from viewing.
No
PHP cannot be used to protect folders.
Because it is not PHP that serves requests, but the web server.
You can move this directory above the document root to prevent web access to it.
But permissions will not help you at all.
Use chmod to change the permissions on that directory. Note that the user running PHP needs to own it in that case.
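For example (a sketch; the path is illustrative, and as noted above PHP's chmod() only works here if the PHP user owns the directory):

<?php
// Make the 'david' directory accessible to its owner only.
chmod('/var/www/david', 0700);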
If you just want to prevent indexing of the folder, you can create an index.php file that does a simple redirect. Note: requests for a valid filename will still be let through.
<?php
header("Location: /"); // redirect the user to the root directory
exit; // stop the script so nothing else is output
Without cooperation from the webserver, the only way to protect your files is
to encrypt them, in an archive maybe, whose password only your script would know and tell no one. That will end up wasting CPU, as the server will be decrypting it all the time; or
to use an incredibly deranged file naming scheme, a scheme you won't ever describe to anyone, and that only your PHP script can sort through.
Still, the data could be downloaded, bandwidth would go to waste, and encrypted files could be decrypted.
It all depends on how much that data matters. And how much your time costs, as these convoluted layers of somewhat penetrable obfuscation will likely eat huge chunks of developer time.
Now, as I said... that would be without cooperation from the webserver... but what if the webserver is cooperating and doesn't know?
I've seen some Apache webservers (can anyone confirm it's in the standard distribution?), for instance, come preloaded with a rule denying access to any file starting with .ht, not only .htaccess but everything similar: .htproxy, .htcache, .htwhatever_comes_to_mind, .htyourmama...
Chances are your server could be one of those.
If that's the case... rename your hidden files .hthidden-<filename1>, .hthidden-<filename2>... and you'll have access to them only through PHP file functions, like readfile()
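For example (a sketch; the file name and content type are illustrative):

<?php
// The web server refuses direct requests for .ht* files,
// but PHP can still read them from disk and stream them out.
header('Content-Type: application/pdf');
readfile(__DIR__ . '/.hthidden-report.pdf');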