Web directory structure problem - PHP

This is my directory structure:
-st.ambulance
---public_html
---resources
My document root is public_html. I have a script in resources that I need to execute securely (meaning: normal users shouldn't be able to execute it). How would I do this?

Normal users already won't be able to execute it if they can't get to it in their browser. If it's not in your document root, they shouldn't be able to get to it. So, it's already safe, unless one of the scripts in your document root is including it, or your site gets hacked.
As far as executing the script yourself, find out whether it should be run from the command line or as a web page. If it can run at the command line, just ssh in and run it. If it needs to be run as a web page, have your web server serve the resources directory to an admin (sub)domain, secured by https, and password protect it with something like basic http authentication.

What is the script? I guess that doesn't matter so much anyway, but what you can do is use:
require_once(__DIR__ . "/../resources/script.php");
I should add that you of course need to build the path relative to the including file (here via __DIR__); a bare "/../resources/..." would resolve from the filesystem root.


How to send header:location to an out-of-web-root file

I'm making a web application which will only allow registered members to download zip folders from a folders directory.
I need to know the proper way to secure the folder: only members stored in my database should be able to access the files, but if somebody finds the directory and a file name, there's nothing to stop them accessing it.
I've been doing some research and found some approaches but they all have major drawbacks.
1.) Put the files outside of the web root, then use readfile() to send the data.
This is how I currently have it set up. The major drawback is that I'm on a shared server where the max execution time for a script is 30 seconds (and can't be changed), so if the file is big or the user's connection is slow, the script will time out before the download completes.
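For reference, option 1 usually looks something like the following minimal sketch. The base directory, session key, and file names are illustrative assumptions, not taken from the question; the core of it is resolving the requested name safely against a directory Apache cannot serve.

```php
<?php
// Minimal sketch of option 1: files live outside the web root and PHP
// streams them to logged-in members. All paths and names are invented.

// Resolve a requested file name against the private directory, refusing
// anything that escapes it (e.g. "../../etc/passwd").
function resolve_download(string $base, string $name): ?string
{
    $path = realpath($base . '/' . basename($name));
    $root = realpath($base);
    if ($path === false || $root === false || strpos($path, $root . '/') !== 0) {
        return null;
    }
    return $path;
}

// Usage in the download script, after the member check:
// session_start();
// if (empty($_SESSION['member_id'])) { http_response_code(403); exit; }
// $path = resolve_download('/home/site/private_zips', $_GET['file'] ?? '');
// if ($path === null) { http_response_code(404); exit; }
// header('Content-Type: application/zip');
// header('Content-Length: ' . filesize($path));
// header('Content-Disposition: attachment; filename="' . basename($path) . '"');
// readfile($path);
```

The basename() call strips any directory components from the request, and the realpath() prefix check is a second guard against path traversal.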
2.) .htaccess and .htpasswd inside a webroot directory.
The problem with this is I don't want to have to ask the user for a password again, unless there's a way to allow PHP to send the password and then send a header to the actual zip file that needs to be downloaded.
3.) Keeping the files in the webroot but obfuscating the file names so they are hard to guess.
This is just totally lame!
What I really would like to do is keep the files outside the web root and then send a header('Location: ...') to that document to force a download; but obviously, since it's not in the web root, the browser won't see it. Is there a way around this? Is there a way to redirect to an out-of-web-root file with header('Location: /file') to force a download, thus allowing Apache to serve the file rather than PHP with readfile()?
Is there some easier way to secure the folders and serve them with Apache that I am just not coming across? Has anybody experienced this problem before, and is there an industry-standard way to do this better?
I know this may resemble a repeat question, but none of the answers to the other, similar question gave any useful information for my needs.
What I really would like to do is keep the files outside the web root then just send a header:location to that document to force a download; obviously, as it's not in the web root, the browser won't see it.
More to the point, it is outside the web root so it doesn't have a URL that the server can send in the Location header.
Is there a way around this? Is there a way to redirect to an out-of-web-root file with header('Location: /file') to force a download?
No. Preventing the server from simply handing over the file is the point of putting it outside the web root. If you could redirect to it, then you would just be back in "hard to guess file name" territory with the added security flaw of every file on the server being public over HTTP.
Is there some easier way to secure the folders and serve with Apache that I am just not coming across?
Your options (some of which you've expressed already in the form of specific implementations) are:
Use hard to guess URLs
Put the file somewhere that Apache won't serve it by default and write code that will serve it for you
Use Apache's own password protection options
There aren't any other approaches.
Is there some easier way to secure the folders and serve with Apache that I am just not coming across?
No, there isn't an easier way (but that said, all three implementations you've described are "very easy").
Another approach, which I consider really dirty but might get around your resource constraints:
Keep the files outside the web root
Configure Apache to follow symlinks
On demand: Create a symlink from under the web root to the file you want to serve
Redirect to the URI of that symlink
Have a cron job running every 5 minutes to delete old symlinks (put a timestamp in the symlink filename to help with this)
It's effectively a combination of the first two options in my previously bulleted list.
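The symlink steps above can be sketched as follows; the directory and URI values are assumptions, Apache needs Options +FollowSymLinks on the link directory, and a cron job should delete stale links (the leading timestamp in the name makes them easy to find).

```php
<?php
// Rough sketch of the symlink trick: create a timestamped, hard-to-guess
// symlink under the web root and return the URI to redirect to.
// All paths here are made-up examples.
function publish_symlink(string $file, string $linkDir, string $baseUri): ?string
{
    // e.g. "1700000000-9f8e1a2b3c4d5e6f-a.zip"
    $name = time() . '-' . bin2hex(random_bytes(8)) . '-' . basename($file);
    if (!symlink($file, $linkDir . '/' . $name)) {
        return null;
    }
    return $baseUri . '/' . rawurlencode($name);
}

// Usage:
// $uri = publish_symlink('/home/site/private_zips/a.zip',
//                        '/home/site/public_html/dl', '/dl');
// if ($uri !== null) { header('Location: ' . $uri); exit; }
```

Apache then serves the file directly, so PHP's execution time limit never applies to the transfer itself.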

PHP script with root access?

I need to call a PHP script from a remote computer which will tell the remote server to change a DNS record. Is this possible? I have everything working fine in a test environment, but PHP doesn't have access to /var/named/mydomain.com.db. It's a VPS of mine that I have root access to, so getting access isn't a problem; I just want the script to do it for me. Any ideas?
Assuming the other server is running BIND, the files could be owned by named, not root. That would be a much simpler option. Also, you could make them 777 and then have PHP simply edit them.
Three ways immediately come to mind:
Changing the permissions of the file.
Creating a daemon to do it.
Creating a command and configuring sudo to run it without a password.
Changing the permissions of the file
The simplest case would be to change the permissions of the file. To do this, you'll probably at least want to know
the user reading the file, and
the user your PHP script runs as.
If the users are the same, then of course the answer is simple: change the owner to that user. Otherwise, if the file contains no sensitive information, you might be able to change the owner to the PHP user and grant read access to all users. Finally, you might be able to create a new group containing those two users and change the group of the file. Then you could change the user of the file to the PHP user and grant write permissions, change the group to the group you created and grant read permissions, and grant no permissions to everyone else.
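The shared-group scheme described above ends up with mode 0640: read/write for the owner, read for the group, nothing for everyone else. Here is a tiny sketch against a temporary file; chown()/chgrp() require root, so they appear only as comments, and "dnsedit" is a made-up group name.

```php
<?php
// Sketch of the permission scheme: owner rw, group r, others none (0640).
$file = tempnam(sys_get_temp_dir(), 'zone'); // stand-in for the real zone file

// chown($file, 'www-data');  // the PHP user owns the file and may write
// chgrp($file, 'dnsedit');   // the shared group may read
chmod($file, 0640);           // rw for owner, r for group, none for others
```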
Creating a daemon to do it
This is rather involved; I can't say I would recommend this option. Essentially, you have a background process running with whatever privileges are necessary. This process can do whatever needs to be done and communicate with processes of lesser privilege. When the PHP script needs to modify that file, it can send a request over to the daemon, which can then change the file on behalf of the PHP script.
Command with sudo
This is probably one of the best ways to do it. Essentially, you'll need to create some command that does whatever you need to do with the file; then, once you know it works when run as root (or preferably some less privileged user which still has the necessary privileges), you can configure sudo to let the PHP user execute it as the more privileged user without a password.
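A sketch of that sudo setup: one dedicated command is whitelisted for the web user, so PHP can run exactly that command and nothing else. The command path, arguments, and sudoers line below are all made-up examples, not a known tool.

```php
<?php
// Hypothetical sudoers entry, installed via visudo in /etc/sudoers.d/:
//   www-data ALL=(root) NOPASSWD: /usr/local/bin/update-dns

// escapeshellarg() keeps user-supplied values from injecting shell syntax.
function build_dns_command(string $domain, string $ip): string
{
    return 'sudo /usr/local/bin/update-dns '
         . escapeshellarg($domain) . ' ' . escapeshellarg($ip);
}

// Usage:
// exec(build_dns_command('mydomain.com', '203.0.113.10'), $output, $status);
// if ($status !== 0) { error_log(implode("\n", $output)); }
```

Keeping the privileged logic in a separate, audited command (rather than granting sudo on a shell or editor) is what limits the blast radius here.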

Execute shell commands with sudo via PHP

So far my search has shown the potential security holes that can be opened while trying to perform a sudo'd command from within PHP.
My current problem is that I need to run a bash script as sudo on my work web server via PHP's exec() function. We currently host a little less than 200 websites. The website that will be doing this is restricted to only be accessible from my office's IP address. Will this remove any potential security issues that come with any of the available solutions?
One of the ways is to add the apache user to the sudoers file; I assume this will apply to the entire server, so it will still pose an issue on all other websites.
Is there any solution that will not pose a security threat when used on a website that has access restricted to our office?
Thanks in advance.
Edit: A brief background
Here's a brief description of exactly what I'm trying to achieve. The company I work for develops websites for tourism related businesses, amongst other things. At the moment when creating a new website I would need to setup a hosting package which includes: creating the directory structure for the new site, creating an apache config file which is included into httpd.conf, adding a new FTP user, creating a new database for use with the website CMS to name a few.
At the moment I have a bash script on the server which creates the directory structure, adds the user, creates the Apache config file and gracefully restarts Apache. That's just one part; what I'm looking to do is use this shell script from a PHP script to automate the entire website generation process in an easy-to-use way, for other colleagues and just general efficiency.
You have at least 4 options:
Add the apache user to the sudoers file (and restrict it to run the one command!)
In this case some security hole in your PHP apps may run the script too (if they can include the calling PHP, for example, or even bypass the restriction to your IP by using another URL that also calls the script, via mod_rewrite).
Flag the script with the setuid (s) bit
Dangerous, don't do it.
Run another web server that only binds to a local interface and is not accessible from outside
This is my preferred solution, since the page calling the PHP can be linked from your main web server and its security can be handled separately. You can even create a new user for this server. Some simple server does the job; there are server modules for Python and Perl, for example. It is not even necessary to enable exec() in your main PHP installation at all!
Run a daemon (inotify for example, to watch file events) or cronjob that reads some file or db-entry and then runs the command
This may be too complex, and it has the disadvantage that the daemon cannot check which script generated the entry.

How to protect PHP from the public?

So I'm a bit confused about what crafty users can and can't see on a site.
If I have a file with a bunch of PHP script, the user can't see it just by clicking "view source." But is there a way they can "download" the entire page, including the PHP?
What permission settings should pages be set to if there is PHP script that must execute on load, but that I don't want anyone to see?
Thanks
2 steps.
Step 1: So long as your PHP is being processed properly, there is nothing to worry about... so make sure of that.
Step 2: As an insurance measure move the majority of your PHP code outside of the Web server directory and then just include it from the PHP files that are in the directory. PHP will include on the file system and therefore have access to the files, but the Web server will not. On the off chance that the Web server gets messed up and serves your raw PHP code (happened to Facebook at one point), the user won't see anything but a reference to a file they can't access.
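Step 2 can be sketched as follows. The demonstration builds a throwaway layout in a temporary directory; in a real deployment "public_html" is the document root and "resources" sits beside it where the web server cannot serve files. All file and function names here are invented.

```php
<?php
// Demonstration of keeping code outside the document root.
$root = sys_get_temp_dir() . '/site' . getmypid();
mkdir($root . '/public_html', 0755, true);
mkdir($root . '/resources');

// The sensitive code lives outside the document root...
file_put_contents($root . '/resources/secret.php',
    "<?php function db_password(): string { return 'hunter2'; }");

// ...and the public entry script reaches it through the filesystem,
// which PHP can do even though Apache would never serve the file.
require_once $root . '/public_html/../resources/secret.php';
echo db_password(), "\n"; // prints hunter2
```

Even if the server were misconfigured and served the entry script raw, a visitor would only see the require line, not the code it pulls in.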
PHP files are processed by the server before being sent to your web browser. That is, the actual PHP code, comments, etc. cannot be seen by the client. For someone to access your PHP files, they would have to hack into your server through FTP or SSH or something similar, at which point you have bigger problems than just your PHP.
It depends entirely on your web server and its configuration. It's the web server's job to take a url and decide whether to run a script or send back a file. Commonly, the suffix of a filename, file's directory, or the file's permission attributes in the filesystem are used to make this decision.
PHP is a server-side scripting language that is executed on the server. There is no way it can be accessed client-side.
If PHP is enabled, and the code is properly enclosed in PHP tags, none of the PHP code will go past your web server. To make things more secure, disable directory browsing, and put an empty index.php or index.html in all the folders.
Ensure that you adhere to secure coding practices too. There are quite a number of articles on the web; here is one: http://www.ibm.com/developerworks/opensource/library/os-php-secure-apps/index.html

PHP application to replicate websites from single code source

I'm attempting to build an application in PHP to help me configure new websites.
New sites will always be based on a specific "codebase", containing all necessary web files.
I want my PHP script to copy those web files from one domain's webspace to another domain's webspace.
When I click a button, an empty webspace is populated with files from another domain.
Both domains are on the same Linux/Apache server.
But I'm running into permission/ownership issues when copying across domains.
As an experiment, I tried using shell and exec commands in PHP to perform actions as "root".
(I know this can open major security holes, so it's not my ideal method.)
But I still had similar permission issues and couldn't get that method to work either.
Maybe a CGI script is a better idea, but I'm not sure how to approach it.
Any advice is appreciated.
Or, if you know of a better resource for this type of information, please point me toward it.
I'm sure this sort of "website setup" application has been built before.
Thanks!
I'm also doing something like this. The only difference is that I'm not making copies of the core files; the system has one core, and only specific files are copied.
If you want to copy files, you have to take the following into consideration:
An easy (less secure) way is to use the same user for all websites.
Otherwise (in case you want to provide different access levels), you must create a different owner for each website, and you must set the owner/group for the copied files (this will be done by root).
For the new website setup:
Either the main domain will run as root, and then it will be able to execute new website creation; or, if you don't want your main domain to be root, you can do the following:
Create a cronjob (or a PHP script that runs in a loop under the CLI) that is executed by root. It will check some database record every 2 minutes, for example, and from your main domain you can add a record with the setup info for the new hosted website (or just execute some script that gains root access and does it without cron).
The script that does this can be written in PHP; it can be done in any language you wish, it doesn't really matter as long as it gets the correct access.
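The root-run worker described above might look like this minimal sketch: a function that processes pending rows from a queue table, wrapped in a CLI loop. The table, column, and path names are made up for illustration, and the vhost/FTP/database steps are omitted.

```php
<?php
// Sketch of a root-run worker that polls a queue table the main domain
// writes setup requests into. All names here are invented examples.
function process_pending(PDO $pdo, string $docRootBase): int
{
    $rows = $pdo->query(
        'SELECT id, domain FROM setup_queue WHERE done = 0'
    )->fetchAll(PDO::FETCH_ASSOC);
    foreach ($rows as $row) {
        // Create the directory structure for the new site (running as root);
        // Apache config, FTP user and database creation would go here too.
        mkdir($docRootBase . '/' . $row['domain'] . '/public_html', 0755, true);
        $pdo->prepare('UPDATE setup_queue SET done = 1 WHERE id = ?')
            ->execute([$row['id']]);
    }
    return count($rows);
}

// Root-run CLI loop, checking every 2 minutes as suggested:
// while (true) { process_pending($pdo, '/var/www'); sleep(120); }
```

Because only root runs the loop and the web code merely inserts rows, the web server itself never needs elevated privileges.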
In my case I'm using the same user, since they are all my websites. The disadvantage is that the OS won't create restrictions; my PHP code will (I'm losing the advantage of user/group permissions between different websites).
Notice that open_basedir can cause you some hassle; make sure you exclude the correct paths (or disable it).
Also, there are some minor differences between FastCGI and suPHP (I believe they won't cause you too much trouble).
