Filesystem permissions breaking a cURL-based caching script - php

I'm writing a cURL script in PHP. Its purpose is to take a product or category code given to it (which type of code it is is ambiguous at that point; yes, it's possible for a category and a product to have the same code, but that's what business rules are for, and it's not what this question is about), and then attempt to load either a product page or a category page on our shopping cart with it. Whichever page returns a 200 response then gets its output cached into an HTML file in the DocumentRoot.
Problem is, the DocumentRoot isn't owned by Apache, and I don't feel comfortable giving global write permissions to the DocumentRoot, so while the script works for the most part, the page doesn't get cached.
I do not have root or su access to the server and cannot get either. I tried writing the file to the /tmp/ directory and then moving it, but the permissions won't let me. Is there a way around this without opening up a security hole? If not, would this be possible with a Perl CGI script or would I face the same problem?

If Apache doesn't have the rights to do something, then there's nothing you can do to bypass that short of installing an suid program to force a permission set, using suPHP to do the same, or simply granting the required permissions.
Another option is to grant Apache write permissions on a subdirectory of the DocumentRoot, and then use some mod_rewrite magic to make requests for those cached files get transparently rewritten to use the subdirectory instead. That way you've got a writable directory, but you don't have the issues of making the parent DocumentRoot writable.
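As a rough sketch of that rewrite (assuming the writable subdirectory is named cache/ and the cached pages are saved as <code>.html; both names are placeholders for your setup), the DocumentRoot's .htaccess could contain:
RewriteEngine On
# If a cached copy exists under cache/, serve it transparently
RewriteCond %{DOCUMENT_ROOT}/cache/$1.html -f
RewriteRule ^([^/]+)\.html$ cache/$1.html [L]
A second request pass won't loop here, because the rewritten path contains a slash and no longer matches the pattern.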

Related

Apache server file permissions and url accessibility

Is it possible to arrange file permissions/group ownership/etc in such a way that a file can be read by the function readFile() for a forced download, but it cannot be downloaded by navigating to the literal url of the file?
Maybe you could add the user that runs Apache/PHP to the group that owns the file, and set the permissions to read and write for the owner and the owner's group, with no permissions at all for others (-rwxrw---- or 0760).
I've never tested it, but it should work.
The Apache user will need read permissions. To prevent it from being navigated to, the best (and easiest) solution is to store the file outside of the web folder.
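A minimal sketch of that approach (the storage path is a placeholder, and real code would want more hardening around the filename handling):
<?php
// download.php - serve a file that lives outside the web root.
$base = '/home/site/private';   // hypothetical storage path outside the DocumentRoot
$name = isset($_GET['file']) ? basename($_GET['file']) : '';   // basename() blocks ../ traversal
$path = $base . '/' . $name;

if ($name === '' || !is_file($path)) {
    http_response_code(404);
    exit;
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $name . '"');
header('Content-Length: ' . filesize($path));
readfile($path);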

.htaccess: prevent php scripts from accessing parent/sibling directories

I'm not particularly experienced with .htaccess (outside of simple mod_rewrite and basic allow/deny), and am unsure of how to approach the following issue:
I have a directory structure as follows:
/parentDirectory
    /childDirectoryOne
    /childDirectoryTwo
I have a domain that points to /parentDirectory (we'll call it parent.com), and separate subdomains for each of the child directories (we'll call them one.parent.com and two.parent.com respectively).
These are all located on a shared host. I need to be able to grant FTP access to the subdirectories, but the problem is that right now someone could upload a PHP file to childDirectoryOne that scans the parent directory, thereby discovering its sibling directory, and could then move into the sibling directory and read sensitive information from files (like a dbConfig file).
What I have been attempting to do (with no success so far) is develop a set of .htaccess files that would prevent the scripts in the child directories from accessing the parent or sibling directories. I'm not even sure if this is possible. Unfortunately, my shared host has no support for setting up a chroot jail, so this is my last option for finding a solution (short of purchasing hosting for each and every FTP user so they can't access others' information).
It's considered bad practice to grant read, write, and execute permissions on a folder to people you don't absolutely trust.
The ability to upload an arbitrary script and execute it on the server is a very big deal (them accessing another folder is the least of your worries). People can completely destroy your server and all sites on it, access your db, overwrite other pages in any site, and the list goes on.
I would recommend disabling php entirely for uploaded files. You can put this in your .htaccess.
php_flag engine off
That being said, if you really want to do it this way, you can use the open_basedir directive:
<Directory /parentDirectory/childDirectoryOne>
php_admin_value open_basedir "/parentDirectory/childDirectoryOne"
</Directory>
NOTE!
You need something like safe_mode too (note that safe_mode was deprecated in PHP 5.3 and removed in 5.4; disable_functions is the modern replacement), because otherwise shell_exec(), exec(), and similar functions can be used to escape open_basedir and you will be hacked... BUT!! even that's not enough. Read here fully - https://puvox.software/blog/restrict-php-access-upper-directory/
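Since disable_functions is a PHP_INI_SYSTEM setting, it can only go in php.ini (or a php_admin_value line in the server config, not in .htaccess). A starting-point sketch; the exact list depends on what your sites legitimately need:
; php.ini - block the usual open_basedir escape hatches
disable_functions = exec,shell_exec,system,passthru,proc_open,popen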

Folder permissions when telling PHP to save a file to that folder?

I'm trying to use this Dagon Design PHP form to help a local non-profit publication enable their readers to submit photos. I've got the "mailer" part working -- the notifications work fine -- but the "saving a file to a folder" part isn't functioning.
On the form page, the author says "the directory must have write permissions," but I'm not sure "who" is writing to that folder -- is this PHP script considered "Owner" when it saves something on my site? Or do I need to allow save permissions for Owner, Group and Others?
I'm not sure why the script isn't saving the photos, but this seems like a good place to start. I've tried looking around on Stack for answers, but most questions seem to have to do with folder creation/permissions.
The page I'm clumsily trying to build is here, if that helps.
As Jon has said already, you don't want to allow write access to everyone.
It's also possible (depending on the hosting) that something like suEXEC is being employed - which will cause your PHP script to run as a user other than the webserver's (as reported by Dunhamzzz).
Probably your best approach, in my opinion, is a script calling whoami:
passthru('whoami');
Or alternatively you could try:
var_dump(posix_getpwuid(posix_geteuid()));
Bear in mind, this does give system information away to the world - so delete the script once you've used it!
Then, as you've correctly asserted in your question, it'll likely be the file permissions.
If you do have CLI access, you can update the permissions safely like so (the first command gets the group):
id -n -g <username>
chmod 770 <directory>
chown <username>:<group> <directory>
(You may have to prepend "sudo" to the "chown" command above, or find other means to run it as "root"... reply back if you get stuck.)
If you've not got access to run the command line, you'll presumably be doing this via an (S)FTP client or the like. I'm afraid the options get a little too broad at that point; you'll have to figure it out (or reply back with the client you're using!)
As always, YMMV.
Finally, bear in mind if this is your own code, people will at some point try uploading PHP scripts (or worse). If that directory is accessible via a public URL ... you're opening the hugest of security holes! (.htaccess, or non-document root locations are your friend.)
If you are not sure how your server is configured (and this influences who the final file owner is), then add write permission for everyone (chmod a+w folder), upload one file, and run ls -l to see the owner. Then you can adjust the permissions to allow write access for certain users only.
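For example (uploads is a placeholder for your actual directory name):
chmod a+w uploads    # temporarily world-writable
# ... upload a file through the form, then:
ls -l uploads        # the new file's owner is the user PHP runs as
chmod o-w uploads    # close world write again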
The PHP script that saves the files is running with the privileges of some user account on the server; the specific account depends on your OS and the web server configuration. On Linux and when PHP is running as an Apache module this user is the same user that Apache runs as.
Solving your problem reduces to determining which user account we are talking about and then ensuring that this user has permission to write to the save directory (either as owner or as a member of the group; giving write access to everyone is not the best idea).
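A minimal diagnostic sketch that answers both questions at once (the directory name is a placeholder; delete the script once you've used it, as noted above):
<?php
// whoami.php - report the user PHP runs as and whether the save dir is writable.
$dir = __DIR__ . '/photo-uploads';   // hypothetical save directory

if (function_exists('posix_geteuid')) {   // requires the posix extension
    $info = posix_getpwuid(posix_geteuid());
    echo 'PHP runs as: ', $info['name'], "\n";
}
echo $dir, ' is ', is_writable($dir) ? 'writable' : 'NOT writable', "\n";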
You'll need to give the webserver's user (probably Apache, nginx or similar) permissions on the directory, as that's what is executing the PHP.
You can quickly find out the Apache user with ps aux | grep apache; then you want to set the permissions of the upload directory for that user, something like this:
chown -R www-data:www-data images/uploads

Am I wrong to make a directory executable?

I am writing a file upload using Zend_Form_Element_File(). I created a directory called users in the public directory. When I loaded the page, I got an error saying the page was not found. I checked the directory and saw that the permissions were drwxr-xr-x. So I changed the permissions to drwxrw-rw- and loaded the page again. The page loaded properly. But when I uploaded a file, it produced an error again. So I finally changed the permissions to drwxrwxrwx and everything ran properly.
My question is: am I doing this the usual way that others do it? It seems strange to me to make a directory executable.
Can someone explain whether I'm doing it correctly? I am just learning the Zend Framework.
Directories must be executable if a program should be able to "enter" them. Entering a directory basically means accessing any file or directory below it.
Having "read" access to a folder allows you to list its contents; what "write" access does is pretty obvious.
However, for security reasons you should check if drwxrwx--- (770) is not sufficient; often your user and the webserver share a common group. If that's the case, there's no need to give any access to "world".
It would be even better to run your scripts as the same user as your own account - with FastCGI that wouldn't be too hard, but on shared hosting you usually don't have the access needed to set this up.
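A sketch of that shared-group setup (www-data and public/users are placeholders; check the actual group and path on your host):
chgrp www-data public/users   # hand the directory to the webserver's group
chmod 770 public/users        # owner + group get full access, world gets none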
Typically you set permissions on the directory so that they cascade down to the files within (via extended ACLs in the majority of cases). The issue that I see immediately is that you have granted world access, which is a bad idea. The only user that needs permissions on the directory (700 at most) is your web server, so I would revert the permissions to 700 ASAP.

PHP/CHMOD Questions

I am working on a PHP-based website. In the admin there is a section that checks a form field and, based on the field, looks for a folder on the server. This folder will be in a subdirectory. If it does not exist, it needs to be created. After that, whether it previously existed or not, PHP will write files to the folder.
These folders will hold images and PDF files that will be viewed and/or downloaded on the main site.
Here is an example directory structure: merchants/east/user123
In the above, merchants and east would definitely exist, and user123 may exist or otherwise be created.
Given that info, my questions are about folder permissions:
What should the folders be set to for the best security?
Should I open them up wider during operations, then chmod them (in PHP) to something more secure after I'm done?
What should the upper-level folders be set to?
770 would be a safe bet for the files. Setting it to that would disallow any public access. I would implement some sort of document delivery system in PHP: PHP can access the non-public files and then send them to the user.
The upper level folders could be set to the same.
Update
As others have said, you can easily chmod them to 600 without any issues. That's the more secure way of handling it (it prevents other users on the system from accessing the files). It also omits "execute", which isn't needed for reading files anyway. My personal practice is to leave the extras in unless there's a defined reason not to.
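A minimal sketch of the create-if-missing step (the base path is a placeholder; note that mkdir()'s mode argument is filtered by the process umask, hence the explicit chmod() afterwards):
<?php
// Ensure merchants/east/user123 exists with 770 permissions, then write to it.
$dir = '/var/www/site/merchants/east/user123';   // hypothetical base path

if (!is_dir($dir)) {
    mkdir($dir, 0770, true);   // recursive; the mode is reduced by the umask...
    chmod($dir, 0770);         // ...so set it explicitly afterwards
}
file_put_contents($dir . '/test.txt', 'write check');   // then write files here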
The upper-level folders would need to have read, write, and execute permissions for the Apache user; the top-level folder could be owned by Apache and have permissions like 755 to allow the webserver to read, write, and list files.
You might consider permissions of 750 or 700 if you are particularly concerned about other local users or services on the web server seeing the files in this directory.
For file permissions: 644 or 600, as conventionally files do not need execute permission.
A nice compromise might be to use 750 for directories and 640 for files with the owner set to apache, and to change the group (chgrp) so that the files' group allows access to the user you normally edit the website files with.
I can't think of any significant advantage to the PHP script increasing and then reducing the permissions.
I think you should consider #chunk's comment about keeping the uploaded files out of the public HTML directory completely and serving them back via a file delivery script. Otherwise you would need careful validation of the files' contents and a tightened-up Apache configuration for that particular directory - perhaps using some mimetype checking to make sure that the files really are docs and PDFs.
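A sketch of that mimetype check using PHP's fileinfo extension (the path and the allowed list are assumptions to adapt):
<?php
// Verify a stored file really is an image or PDF before serving it back.
$path    = '/var/www/private/merchants/east/user123/file.pdf';   // hypothetical
$allowed = array('image/jpeg', 'image/png', 'image/gif', 'application/pdf');

$finfo = new finfo(FILEINFO_MIME_TYPE);   // inspects file contents, not the extension
$mime  = $finfo->file($path);

if (!in_array($mime, $allowed, true)) {
    http_response_code(403);
    exit('File type not allowed');
}
header('Content-Type: ' . $mime);
readfile($path);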
