i have to admit, i never really completely got the *nix filesystem permission model. oh, the rwxrwxrwx stuff isn't too complicated, but i get confused easily about what happens when programs create new files and how i can handle them.
my current problem is the mixture of a (closed source) java-applet that does file uploads over ftp and php (it's on a dedicated server and the data isn't really critical, so i'm not all too concerned about world-writeability).
so, i have two users: ftp (1000) and apache/php (81). groups don't match, so they're basically "others", if i'm correct.
an "import" directory, set to 0777 owned by ftp.
if a visitor accesses the upload page, a subdirectory named after his username is created by my script. let's say the visitor's username is "foo", so it's "import/foo", set to 0777, user 81/php.
next, the visitor uses the java applet to upload a file to this directory (test.jpg). the file's permissions are now rw-r--r--, user ftp.
first question
the first thing i don't understand is: i'm able to unlink that file through php.
why? the users don't match, and the file isn't world writeable.
is this because of the parent directories world-write permission? understandably, i can't chown or chmod through php.
so far no problem, because as long as i can read and unlink, everything's ok.
second question
the java applet is able to upload whole directories, which is nice. if i do this, the new subdirectory import/foo/test has permissions rwxr-xr-x/ftp. the files in this directory are rw-r--r--/ftp.
now i'm out of luck. i can't do anything with those files (besides reading, which i do successfully), no unlinking, no chmod/chowning. they just sit around and gobble up disk space.
so, what's the plan behind the default permissions new files have? my guess is they get the permissions set through the umask, as long as the creating script doesn't chmod them to something else. am i right?
third question
what can i do about it? i mean, what would a sensible person do? can i/should i change the umask for the ftp user? (i just learned about umask yesterday). i'm not very comfortable with this, as it would affect all ftp traffic, wouldn't it? also, the server is dedicated but i'm not an admin, so my access is restricted.
i just had another idea. before starting this post i read up on the basic linux permission stuff. first, the sticky bit isn't set anywhere in the directory chain. and then, there it was: the "set user ID bit".
so, my current plan is to write a simple shell script, owned by ftp, that is other-executable with setuid. the script just transfers ownership of the files in the import directories to the php user.
then, after each import i just exec() that file from my php-script and process files further.
would that work? and more important: is that clean and legal? or would the sysadmin put a bounty on my head?
thx a lot!
update: i just tried to set the uid bit (4755) through winscp (through an ftp connection), but it somehow doesn't work - it seems to "forget" only the uid bit (the other bits get set). why is that? why can't the owner set the uid himself? is that server-specific or generally the case?
update 2:
wikipedia says it all
Due to the increased likelihood of security flaws, many operating systems ignore the setuid attribute when applied to executable shell scripts.
is it still possible for user root to set the uid-bit?
First answer
Correct, you have permission to modify the directory so you can unlink the file. Whether that file is readable or writable to you is irrelevant.
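To see the difference, here's a minimal sketch using the paths from the question (so the exact paths are assumptions): unlinking needs write permission on the directory, while chmod/chown act on the file's inode and are reserved for the owner.

```php
<?php
// Sketch only -- paths taken from the question: import/foo is 0777,
// import/foo/test.jpg is rw-r--r-- and owned by the ftp user, while this
// script runs as the apache/php user (uid 81).

$file = 'import/foo/test.jpg';

// chmod()/chown() operate on the file's inode, so only the owner (or root)
// may change it -- this fails for the php user and returns false:
var_dump(@chmod($file, 0666));

// unlink() only removes the directory entry, so it needs write permission
// on the *directory* (which is 0777), not on the file itself -- this works:
var_dump(unlink($file));
```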
Second answer
Yes, you are correct: the user's default umask will be used unless the script/applet then chmods the file permissions to something else.
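A tiny sketch of how that plays out (the /tmp paths are just for illustration): the creating call asks for a mode, and the umask strips bits from it.

```php
<?php
// Illustration only: how the umask shapes newly created files/directories.
// A umask of 022 clears the group/other write bits from the requested mode.

umask(022);

// touch()/fopen() request 0666 by default -> 0666 & ~022 = 0644 (rw-r--r--)
touch('/tmp/umask-demo-file');

// mkdir() requests 0777 by default       -> 0777 & ~022 = 0755 (rwxr-xr-x)
mkdir('/tmp/umask-demo-dir');

printf("file: %o\n", fileperms('/tmp/umask-demo-file') & 0777); // 644
printf("dir:  %o\n", fileperms('/tmp/umask-demo-dir') & 0777);  // 755
```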
Third answer
Setting the default umask for ftp is the simplest solution, but as you say this affects all files then created by the ftp user. If that user is only used for the upload via the applet (and it should be really) then this isn't really an issue I'd say.
The other option is to have a cron job running that executes a script (like you suggest), chmodding/chowning the files (and maybe virus scanning them etc.) from the FTP upload area to somewhere on the webroot.
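Something along these lines, purely as a rough sketch (the paths, the schedule and the idea of running it from the ftp user's crontab are assumptions, not a tested setup):

```php
#!/usr/bin/env php
<?php
// Rough sketch of a cleanup script run from the ftp user's crontab, e.g.
//   */5 * * * * /home/ftp/bin/release-uploads.php
// Because it runs as the ftp user (the owner of the uploaded files),
// chmod is allowed here, unlike from the apache/php side.

$importDir = '/var/www/import'; // assumed path

$it = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($importDir, FilesystemIterator::SKIP_DOTS),
    RecursiveIteratorIterator::CHILD_FIRST
);

foreach ($it as $item) {
    // Open up group/other write so the apache/php user can clean up later;
    // this is also the place to virus-scan or move files to the webroot.
    chmod($item->getPathname(), $item->isDir() ? 0777 : 0666);
}
```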
You don't state why having these files read-only to Apache is an issue (or is it the fact that the FTP root isn't under the web root?); maybe clarifying that would help point to a sensible solution? Generally you don't want to trust anything the user has given you unless you've vetted it first.
EDIT - just seen you're not an admin of the machine which makes things difficult.
what i learned today: the setuid bit is often deactivated for scripts.
the solution i've settled on: apache/php gets sudoer-rights for certain scripts (all scripts in one directory outside the webroot) that may run as either the ftp- or apache-user. i can then call those scripts with sudo out of php with system/exec/etc.
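roughly, what this looks like (the script path, user names and the sudoers line below are placeholders, not my actual config):

```php
<?php
// Sketch of the sudo approach. The admin adds a sudoers entry via visudo,
// roughly like this (path and users are placeholders):
//
//   apache ALL=(ftp) NOPASSWD: /usr/local/lib/upload-scripts/release-uploads.sh
//
// PHP, running as apache, may then run exactly that script as the ftp user:

$user = escapeshellarg('foo'); // the visitor's import subdirectory

exec('sudo -u ftp /usr/local/lib/upload-scripts/release-uploads.sh ' . $user, $output, $status);

if ($status !== 0) {
    error_log('release-uploads failed: ' . implode("\n", $output));
}
```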
i didn't know sudo could be configured in this way. amazing
Related
Good day everyone.
I am building a Laravel 4 application and I have some file permissions issues.
Once the file has been uploaded by the client, it's moved into a folder. However, the file gets very restrictive permissions, is owned by the user www-data and can't be touched/moved by anything else.
I need to know how to dynamically give Laravel permissions on those files, because I'm using CloudConvert to convert them.
I'm on Ubuntu 14.04 and my prod is on Debian. I already tried to chmod -R the folder, which works for files already saved, but when new ones are created it doesn't work anymore and they stay owned by www-data with read-only, restrictive permissions.
Thank you for your answers.
EDIT: I might have found something: chmod g+s with the right permissions. I read that it carries over to newly created files. I'll try once I'm home.
That account "www-data" is the default account for the web server. That is why it becomes the owner of the files and has full access to them, from a web angle. If what is happening is that you are trying to then convert the files outside the web context (like a cron job or something) then you need to make sure the same user (or maybe a group) owns the files. Have not tried CloudConvert (looks cool, will check it out) but maybe you can leverage Laravel's queues and that way execute your conversion with the web server's "www-data" account?
It was a stupid mistake on my side, I'm afraid.
I was renaming the file before curl could pass it on, making it impossible to know what the original file name was (getClientOriginalName()).
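In case it helps anyone else, the fix was simply to grab the name before the move (Laravel 4; the field name and storage path here are just examples):

```php
<?php
// Read the client's original name *before* renaming/moving the upload.

$file = Input::file('upload');                    // example field name

$originalName = $file->getClientOriginalName();   // still available here

$file->move(storage_path() . '/uploads', uniqid() . '.tmp');

// ...then pass $originalName on to curl/CloudConvert instead of trying to
// recover it from the renamed file on disk.
```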
One of my clients approached me to check and fix their hacked site. The site was developed by another developer, a very inexperienced one, and not even basic security was taken care of.
Well, the problem is that somehow PHP files were written to the images folder. The hackers also wrote an index.html which displays that the site is hacked. When I checked, the images folder had 777 permissions, so I came to the rough conclusion that it's because of the folder permissions. The hosting support guy says that some poorly written PHP script allowed files with any extension to be uploaded to the server, and the hackers then executed those files to gain access or do whatever they want.
I have a few questions:
Is it only through upload functionality that other PHP files can be uploaded?
Is there no other way to write files remotely, given that the folder permissions are 777?
The site has some FCKeditor editors and a couple of upload functionalities. I checked them; there are enough validations, so when files with extensions other than images or PDF are uploaded, they just return false.
Doesn't setting the folder permissions to a lower level fix the issue?
I asked the support guy to change the folder permissions, saying it would solve the issue, but he says there is some PHP file through which the other PHP files were written and he wants that fixed, otherwise the site cannot go live. He says that even if the folder permissions are changed, the hacker can change them back to 777 and execute whatever he wants because of that poorly written PHP file.
How should I approach finding whether there is such a PHP file? Any help or pointers would be much appreciated.
777 means that any user on the system (with execute access for all the parent directories, anyway) can add anything to that directory. Web users are not system users, though, and most web servers (Apache included) won't let random clients write files there right out of the box. You'd have to specifically tell the server to allow that, and i'm fairly certain that's not what happened.
If you're allowing any file uploads, though, the upload folder needs to at least be writable by the web server's user (or the site's, if you're using something like suPHP). And if the web server can write to that directory, then any PHP code can write to that directory. You can't set permissions high enough to allow uploads and low enough to keep PHP code from running, short of making the directory write-only (which makes it pretty useless for fckeditor and such).
The compromise almost certainly happened because of a vulnerability in the site itself. Chances are, either there's a file upload script that's not properly checking where it's writing to, or a script that blindly accepts a name of something to include. Since the PHP code typically runs as the web server's user, it has write access to everything the web server has write access to. (It's also possible that someone got in via FTP, in which case you'd better change your passwords. But the chances of the web server being at fault are slim at best.)
As for what to do at this point, the best option is to wipe the site and restore from backup -- as has been mentioned a couple of times, once an attacker has gotten arbitrary code to run on your server, there's not a whole lot you can trust anymore. If you can't do that, at least find any files with recent modification times and delete them. (Exploits hardly ever go through that much trouble to cover their tracks.)
Either way, then set the permissions on any non-upload, non-temp, non-session directories -- and all the existing scripts -- to disallow writes, period...particularly by the web server. If the site's code runs as the same user that owns the files, you'll want to use 555 for directories and 444 for files; otherwise, you can probably get by with 755/644. (A web server would only be able to write those if it's horribly misconfigured, and a hosting company that incompetent would be out of business very quickly.)
Frankly, though, the "support guy" has the right idea -- i certainly wouldn't let a site go live on my servers knowing that it's going to be executing arbitrary code from strangers. (Even if it can't write anything to the local filesystem, it can still be used to launch an attack on other servers.) The best option for now is to remove all ability to upload files. It's obvious that someone has no idea how to handle file uploads securely, and now that someone out there knows you're vulnerable, chances are you'd keep getting hacked anyway til you find the hole and plug it.
As for what to look for...unfortunately, it's semi vague, as we're talking about concepts above the single-statement level. Look for any PHP scripts that either include, require, or write to file names derived in any way from $_GET, $_POST, or $_COOKIE.
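For illustration only (the file and parameter names are invented), these are the shapes to grep for, plus a safer whitelist shape:

```php
<?php
// Dangerous shapes to hunt for -- request data reaching include/require or a
// file-writing call (shown as comments so nobody runs them by accident):
//
//   include 'pages/' . $_GET['page'] . '.php';
//   require $_COOKIE['template'];
//   file_put_contents($_POST['name'], $_POST['data']);
//
// A safer shape maps user input onto a fixed whitelist instead:

$pages = array(
    'home'    => 'pages/home.php',
    'contact' => 'pages/contact.php',
);

$key  = isset($_GET['page']) ? $_GET['page'] : 'home';
$page = isset($pages[$key]) ? $pages[$key] : $pages['home'];
include $page;
```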
Changing folder permissions won’t solve the issue unless you’re using CGI, since PHP probably needs to be able to write to an upload folder, and your web server probably needs to be able to read from it. Check the extension of any uploaded files!
(So no, 0777 permissions don’t mean that anyone can upload anything.)
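A minimal sketch of what that extension check could look like (the field name and target directory are assumptions):

```php
<?php
// Minimal extension whitelist on upload -- a sketch, not a complete defence.

$allowed = array('jpg', 'jpeg', 'png', 'gif', 'pdf');

$ext = strtolower(pathinfo($_FILES['upload']['name'], PATHINFO_EXTENSION));

if (!in_array($ext, $allowed, true)) {
    die('file type not allowed');
}

// Never reuse the client-supplied name on disk; generate your own.
move_uploaded_file(
    $_FILES['upload']['tmp_name'],
    '/var/www/uploads/' . uniqid('upload_') . '.' . $ext
);
```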
As cryptic mentioned, once a hacker can run code on your server then you have to assume that all files are potentially dangerous. You should not try to fix this yourself - restoring from a backup (either from the client or the original developer) is the only safe way around this.
Once you have the backup files ready, delete everything on your site and upload the backup - if it is a shared host you should contact them as well in case other files are compromised [rarely happens though].
You've identified 2 issues: the permissions and the lack of extension checking. However, have you any evidence that these were the means by which the system was compromised? You've not provided anything to support this assertion.
Changing the permissions to something more restrictive would have provided NO PROTECTION against users uploading malicious PHP scripts.
Checking the extensions of files might have made it a bit more difficult to inject PHP code into the site, but it WOULD NOT PREVENT IT.
Restoring from backup might remove the vandalized content but WILL NOT FIX THE VULNERABILITIES in the code.
You don't have the skills your client (who is probably paying you for this) needs to resolve this. And acquiring those skills is a much longer journey than reading a few answers here (although admittedly it's a start).
Is it only through upload functionality that other PHP files can be uploaded? Is there no other way to write files remotely, given that the folder permissions are 777?
There definitely are multiple possible ways to write a file in the web server’s document root directory. Just think of HTTP’s PUT method, WebDAV, or even FTP that may be accessible anonymously.
The site has some FCKeditor editors and a couple of upload functionalities. I checked them; there are enough validations, so when files with extensions other than images or PDF are uploaded, they just return false.
There are many things one can do wrong when validating an uploaded file. Trusting the reliability of information the client sent is one of the biggest mistakes one can do. This means, it doesn’t suffice to check whether the client says the uploaded file is an image (e.g. one of image/…). Such information can be easily forged. And even proper image files can contain PHP code that is being executed when interpreted by PHP, whether it’s in an optional section like a comment section or in the image data itself.
Doesn't setting the folder permissions to a lower level fix the issue?
No, probably not. The upload directory must be writable by PHP’s and readable by the web server’s process. Since both are probably the same and executing a PHP file requires only reading permissions, any uploaded .php file is probably also executable. The only solution is to make sure that the stored files don’t have any extension that denote files that are executed by the web server, i.e. make sure a PNG is actually stored as .png.
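To make the last point concrete, here is a sketch (directory and field names are assumptions) that derives the type from the file's contents and forces a matching, non-executable extension:

```php
<?php
// Sketch: decide the stored extension from the file's actual contents, not
// from the client-supplied name, so nothing ever lands on disk as .php.

$map = array(
    'image/jpeg'      => 'jpg',
    'image/png'       => 'png',
    'image/gif'       => 'gif',
    'application/pdf' => 'pdf',
);

$finfo = new finfo(FILEINFO_MIME_TYPE);
$mime  = $finfo->file($_FILES['upload']['tmp_name']);

if (!isset($map[$mime])) {
    die('unsupported file type');
}

// Generated name + forced extension: "shell.php" or "image.php.png" can
// never end up executable, because the stored name is entirely ours.
$target = '/var/www/uploads/' . uniqid('f') . '.' . $map[$mime];
move_uploaded_file($_FILES['upload']['tmp_name'], $target);
```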
I'm hosting my website in a shared server, so my options are limited. For instance, I don't have access to the exec function.
My problem is that my very easy addLog PHP function needs rights to write into log files. I had read in another post that the PHP user is usually treated as "others" in the traditional UNIX owner-group-other permission scheme.
However, since I built my own www structure, I was thinking of letting my ./log directory's permissions be set to 777 so the script could write to the necessary logs.
The application only needs to be able to write to the log files; no read or execute permissions are necessary. (I don't even know what they are when it comes to UNIX permissions) Additionally, I am not storing any database information in the logs; I may, however, store stack traces.
Is there a security risk to set the log directory's permissions to 777?
There's nothing in this folder except logs.
Are the read and execute rights necessary on this directory?
I only really need to write (append) into the logs.
Remember that it is possible that an application error could contain database login information, query strings, as well as the file structure of your application. There may be certain files that you have hidden from users and robots, but that are still in the document root. By allowing anybody (777) on the shared environment to view the log files (or any files), you open yourself to more risk.
If PHP is writing the log it should also be able to read the log, in theory. Bottom line is making the file owner the same user that will be writing to the file. Then use appropriate permissions.
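Roughly like this, as a sketch only (the path and mode are just examples): the PHP user creates the file itself, so it owns it, and the mode keeps "others" out.

```php
<?php
// Sketch: append-only logging where the PHP user owns the file it creates,
// so the ./log directory does not need to be 777 (0700/0770 can be enough if
// the PHP user owns the directory).

function addLog($message)
{
    $file  = __DIR__ . '/log/app.log';  // assumed location
    $isNew = !file_exists($file);

    file_put_contents($file, date('c') . ' ' . $message . "\n", FILE_APPEND | LOCK_EX);

    if ($isNew) {
        chmod($file, 0640);             // owner rw, group r, others nothing
    }
}

addLog('something happened');
```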
Of course if you are not concerned with anything on your site being secure, and there are no DB connections, it really doesn't matter.
Really depends on the shared environment. I have seen poorly set up shared hosting where users are able to FTP traverse and view others files/folders. 777 is not a good idea on shared hosting. (Better to be safe, than sorry)
this might seem like a stupid question but I've Googled to no avail.
I've always thought of PHP as a language for creating dynamic database driven sites, and I've never thought about using it to move system files on the actual server (as I have never had a need to). My question is:
can a standard PHP 5.3.x.x installation move, copy or edit system files (I'm using a Linux server as an example) around in /bin or maybe /etc?
is this a good idea/practise?
It has never occurred to me that if a malicious hacker were to be able to inject some PHP into a site, that they would effectively be granted access to the entire Linux server (and all its system files). I have only ever thought of PHP as something that operates inside the /vhosts directory (perhaps naively).
Sorry if this sounds like a stupid question, but I can't really test my theory as if my boss was to see me writing/uploading/executing a script that moved stuff around in the Linux file system I would be dead.
Thanks for your help guys! :)
PHP can do to your server whatever the permissions of the user account it runs as allow it to do. PHP as a language is not restricted in any way (at least, in terms of permissions); it is the user account that is restricted.
This is why people will usually create a user for Apache/nginx/insert web server here to run as, and only give it permissions to manipulate files and directories related to the web server. If you don't give this user access permissions to /bin or /etc, it can't do anything that will affect them.
is this a good idea/practice?
Normally not. Leave system administration to your sysadmin and not the user requesting your PHP scripts.
PHP can attempt to call many system commands to move or directly edit files on the hard disk. Whether it succeeds depends on the security settings.
Let's assume you're running PHP through Apache and Apache is set up to run all processes as the user www-data - a default setup for OSes like Debian. If you give the user www-data permission to edit /etc, then yes, PHP can read and write to files in /etc.
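A tiny sketch of how you could check this from a script (the paths here are only examples):

```php
<?php
// Sketch: whether a write succeeds depends entirely on the permissions of
// the user PHP runs as (e.g. www-data), not on PHP itself.

var_dump(is_writable('/etc/hosts'));         // false on any sane setup
var_dump(is_writable('/var/www/uploads'));   // true only if www-data may write there (example path)

// An actual write attempt simply fails where that user lacks permission:
$ok = @file_put_contents('/etc/test-from-php', "hello\n");
var_dump($ok !== false);                     // false unless /etc is writable by the PHP user
```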
There is only one major drawback, as you identified: security, security and security. You'd also better be sure that your PHP works properly, as one wrongly written file could now take down the entire server.
I would also definitely not practice on your server behind your boss's back. Look into getting a cheap virtual machine, either hosted elsewhere or on your own machine courtesy of VirtualBox.
Yes, it can. It's a programming language; it can do anything.
It completely depends on who is running it. If it's root, it can do anything. If it's just a normal user, say bob, it can not do much outside its home /home/bob. Apache is like bob too; Apache usually runs under the www-data, www or apache user names.
I'm developing a WYSIWYG type site builder in JS for people who don't know HTML/CSS. It's all done, but I want to make this as simple as possible. In a perfect world, they'd just upload all the files to their host and be done with it. The problem I'm having is, I have some files and folders that need writing to, but PHP doesn't have permission unless I CHMOD those specific files and folders to 777.
I really don't want to do this and was hoping I had some alternative, nor do I want to be criticized for forcing CHMOD 777 upon everyone. Is there anything I can do (that would be simple for my users) to allow PHP to write to files/folders without having to grant permission to EVERYONE?
I can't have PHP create the files/folders itself because it doesn't have access to write to the root directory either.
You could chgrp the files to the web server's group (or PHP's, if it's set up to run as its own user) and chmod 770 them. But that wouldn't get you much securitywise.
Alternatively, you could do what some other PHP CMSes (like Joomla) do -- when a file needs to be modified, have the server connect back to itself via FTP (with the site owner's credentials) and upload the replacement file.
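Very roughly, that approach looks like this (host, credentials and paths are placeholders, and you'd want real error handling):

```php
<?php
// Rough sketch of the "connect back to yourself over FTP" approach.
// Host, credentials and paths are placeholders.

$conn = ftp_connect('ftp.example.com');
if (!$conn || !ftp_login($conn, 'siteowner', 'secret')) {
    die('ftp connection failed');
}
ftp_pasv($conn, true);

// The file arrives owned by the site owner's account, so nothing under the
// web root has to be world-writable for PHP to "edit" it.
ftp_put($conn, '/public_html/pages/about.html', '/tmp/about.html.new', FTP_BINARY);

ftp_close($conn);
```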
Truth be told, though, any way you choose to allow people to modify files on the server is going to have its pitfalls, and securitywise will generally be almost as bad as making the whole site world-writable. No matter how you do it, i suggest you make damn sure your authentication and access control mechanisms are up to snuff, as you're taking those responsibilities upon yourself especially when you allow web users to edit files.
Have the users CHMOD 777 the root directory, have your script create the new folder, and then have them restore the root directory's permissions.