So I have a PHP application. Permissions are set appropriately for the www-data user and group. The problem is that PHP needs to access files generated by another server process (a Java process) running under a different user. Those files also live in a directory not registered with the vhost.
So I'm trying to figure out the best way to do this. I think I can create a symlink to that directory inside my PHP folder that's already accessible via the virtual host (after checking that Apache follows symlinks). Then I'd still have to change the permissions on the actual files, right? Maybe add www-data to the group that creates those files? Does that mean www-data would potentially have access to all files owned by that group? Are the Apache directory permissions enough to prevent a potential attacker from moving outside the directory containing the specific files I want served?
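A minimal sketch of the symlink-plus-group idea, using scratch directories and the current user's primary group as a stand-in for www-data (changing group ownership to www-data would need root; all paths here are illustrative):

```shell
# Scratch stand-ins for the Java output dir and the vhost's web root.
mkdir -p demo/java-output demo/webroot
echo "report" > demo/java-output/report.txt

# 1. Symlink the directory into the web root. Apache will only follow
#    it if the vhost has "Options FollowSymLinks" (or SymLinksIfOwnerMatch).
ln -s "$(pwd)/demo/java-output" demo/webroot/reports

# 2. Grant the web server's group read access. In production this would
#    be chgrp www-data; here the current user's group stands in.
chgrp "$(id -gn)" demo/java-output demo/java-output/report.txt
chmod g+rx demo/java-output          # group needs x to traverse the dir
chmod g+r  demo/java-output/report.txt

# The file is now reachable through the symlinked path.
cat demo/webroot/reports/report.txt  # prints "report"
```

Note that group read on the directory answers the "access to all files owned by that group" worry: www-data gains access only to files whose group bits allow it, not to everything the group's users own.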
Alternatively, I could create a new virtual host that runs as the user that owns those files, and just access the files via a different subdomain.
I could also have the files created with www-data as their group and group read permissions set on them.
Anyway, just seeing whether there's a standard best practice for this issue.
Good day everyone.
I am building a Laravel 4 application and I have some file permissions issues.
Once a file has been uploaded by the client, it's moved into a folder. However, the file ends up with very restrictive permissions, owned by the www-data user, and can't be touched or moved by anything else.
I need to know how to give Laravel permission to work with those files dynamically, because I'm using CloudConvert to convert them.
I'm on Ubuntu 14.04, and my production server runs Debian. I already tried chmod -R on the folder, which works for files already saved, but newly created files still end up owned by www-data with restrictive, read-only permissions.
Thank you for your answers.
EDIT: I might have found something: chmod g+s on the upload directory. From what I've read, the setgid bit makes newly created files inherit the directory's group. I'll try once I'm home.
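The mechanics of that setgid idea can be sketched as follows, with the current user's group standing in for www-data (assigning www-data would need root). Note the setgid bit only affects files created after it is set; it does not retroactively fix existing ones:

```shell
mkdir -p uploads
chgrp "$(id -gn)" uploads   # in production: chgrp www-data uploads
chmod 2775 uploads          # leading 2 = setgid: new files inherit the group

touch uploads/new-file      # simulate the app writing an upload
stat -c '%G' uploads/new-file   # group matches the directory's group
ls -ld uploads                  # mode string shows an "s" in the group slot
```

setgid controls only the group, not the mode bits of new files; those still come from the creating process's umask, so the two are usually combined.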
That "www-data" account is the default account for the web server, which is why it becomes the owner of the files and has full access to them from the web side. If you then need to convert the files outside the web context (in a cron job, for example), you need to make sure the same user (or perhaps a group) owns the files. I haven't tried CloudConvert (looks cool, will check it out), but maybe you could leverage Laravel's queues so the conversion runs under the web server's "www-data" account?
It was a stupid mistake on my side, I'm afraid.
I was renaming the file before curl could get to it, so it had no way of knowing the original file name (getClientOriginalName()).
I have the following code:
mkdir($thumb_dir);
which creates a directory in the proper location, but when I view the permissions it is
Owner : nobody
Group : nobody
I don't have shell access to run chown. How do I prevent the owner from being set to nobody, and how do I delete the folder I've already made, since I don't have permission?
It's a GoDaddy shared server...
You can delete empty directories with rmdir().
nobody is the user that runs the Apache process. You can't change the owner from within PHP, nor can you delete the folder via shell access (or make any changes to it whatsoever) without root permissions; you can manipulate it only through PHP.
This happens because the Web server is run by the nobody user. Therefore, everything you do on the file system will be done with the privileges of nobody.
There is typically no way for you to change anything about that. You'll have to manage with the Apache user being different from the FTP user you have. If you create a directory with PHP, you'll only be able to delete it with PHP (using rmdir() when the directory is empty), and if you create files you will most likely have to delete them from PHP as well.
I suggest that you create your directory structure with your FTP user and keep as little PHP-generated content around as possible because of that.
You can alleviate the symptoms with permissive modes (via chmod), but that's generally not a good idea security-wise.
Use rmdir($thumb_dir); to delete it.
You cannot change your PHP user on a shared server.
I'm developing a WYSIWYG type site builder in JS for people who don't know HTML/CSS. It's all done, but I want to make this as simple as possible. In a perfect world, they'd just upload all the files to their host and be done with it. The problem I'm having is, I have some files and folders that need writing to, but PHP doesn't have permission unless I CHMOD those specific files and folders to 777.
I really don't want to do this and was hoping I had some alternative, nor do I want to be criticized for forcing CHMOD 777 upon everyone. Is there anything I can do (that would be simple for my users) to allow PHP to write to files/folders without having to grant permission to EVERYONE?
I can't have PHP create the files/folders itself because it doesn't have access to write to the root directory either.
You could chgrp the files to the web server's group (or PHP's, if it's set up to run as its own user) and chmod 770 them. But that wouldn't gain you much security-wise.
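The chgrp + 770 approach looks like this on scratch paths, with the current user's group standing in for the web server's (a real deployment would chgrp to www-data, which needs root):

```shell
mkdir -p writable
chgrp "$(id -gn)" writable    # stand-in for: chgrp www-data writable
chmod 770 writable            # owner and group: full access; others: none

touch writable/page.html      # a file the site builder would edit
chmod 660 writable/page.html  # files don't need the execute bit
stat -c '%a %G' writable/page.html
```

Compared with 777, this at least shuts out every local user who isn't the owner or in the web server's group.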
Alternatively, you could do what some other PHP CMSes (like Joomla) do: when a file needs to be modified, have the server connect back to itself via FTP (with the site owner's credentials) and upload the replacement file.
Truth be told, though, any way you choose to let people modify files on the server is going to have its pitfalls, and security-wise it will generally be almost as bad as making the whole site world-writable. However you do it, I suggest you make damn sure your authentication and access control mechanisms are up to snuff, as you're taking those responsibilities upon yourself, especially when you allow web users to edit files.
Have the users chmod 777 the root directory, have your script create the new folder, and then have them restore the root directory's original permissions.
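That temporary-777 workflow, sketched on a scratch directory (the path stands in for the site root; the window during which the directory is world-writable should be kept as short as possible):

```shell
mkdir -p siteroot
old_mode=$(stat -c '%a' siteroot)   # remember the original mode

chmod 777 siteroot                  # 1. user opens up the root
mkdir siteroot/generated            # 2. the installer script creates its folder
chmod "$old_mode" siteroot          # 3. user restores the original mode

stat -c '%a' siteroot               # back to the original mode
```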
I have a script, and I'd like to access the home directories of users in a Linux environment.
Web root : /var/www/html/
And there are user directories such as : /home/john
/home/david
etc.
There are files in the users' home directories, and the home directories are owned by their respective users, e.g.:
/home/david/file.txt user: david group: david
Is it possible to access these files with Apache? I assume it isn't, because of the permissions.
Is there a way around this?
In other words, I'd like my PHP program at /var/www/html/index.php to be able to access a file like /home/david/foo.txt.
How can I get this done? Thanks.
The best way would be to have the users place the specific files they need served into a pub directory, then chmod 777 that directory.
If you want to access arbitrary files in the home directories, you would have to run Apache as root, which is a big security risk. (While you could change the permissions of the home directories, in my experience that can break a lot of programs.)
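A sketch of that pub-directory layout on placeholder paths. If Apache only needs to read the files, 755 on the directory and 644 on the files is enough; the 777 suggested here is looser than necessary:

```shell
# Placeholder for a user's home; a real setup would use /home/david.
mkdir -p home-david/pub
chmod 711 home-david            # others may traverse, but not list, the home
chmod 755 home-david/pub        # world can list and read the pub dir

echo "shared" > home-david/pub/file.txt
chmod 644 home-david/pub/file.txt   # world-readable, not writable
stat -c '%a' home-david/pub/file.txt
```

The 711 on the home directory lets Apache reach the pub subdirectory without being able to enumerate anything else in the home.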
Maybe I am a bit paranoid when it comes to these things, but in my opinion there is something conceptually wrong here:
A script that is exposed to the web should never be given access to users' home directories. One reason is that a compromise of the web server might expose files in those home directories to anyone who can reach the web server. Another is that home directories are (at least to my understanding) where users keep more or less personal/private files that should not be available to other users; otherwise they would have placed them in a public directory.
While I am not sure what your use case is, I suggest it might be better to use a different concept where Apache does not need access to the home directories of other users in the first place.
I have a PHP script that processes file uploads. The script tries to organise the uploaded files and may create new folders to move them into if needed. These files will be below the www root directory (i.e., a web browser will be able to access them).
My question is, what permissions should I set for the folders that get created and for the files that are moved into them (using mkdir() and move_uploaded_file())?
Your web server needs read and write permission in those folders; execute permission should be revoked (assuming a UNIX-like system). Otherwise, a user could upload a script and have it executed by sending an HTTP request for it.
But IMO the whole concept is a potential security hole. Better to store the files in a folder outside the web server root, so that no direct access is possible. In your web application, you can have a PHP download page that scans the upload directory and displays a list of download links. These links lead to another script that reads the files from your storage directory and sends them to the user.
Yes, this is more work. But the scenario is very common, so you should be able to find source code with example implementations easily. And it is much less work than having your server hacked...
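The outside-the-webroot layout described above can be sketched like this; paths are placeholders, and in production the group would be www-data rather than the current user's:

```shell
mkdir -p app/public app/storage/uploads   # web root vs. private storage
chgrp -R "$(id -gn)" app/storage          # stand-in for: chgrp -R www-data
chmod 750 app/storage/uploads             # web server can read/traverse;
                                          # no world access, no direct URLs

touch app/storage/uploads/doc.pdf         # a moved upload
chmod 640 app/storage/uploads/doc.pdf     # readable, never executable
stat -c '%a' app/storage/uploads/doc.pdf
```

Because app/storage sits outside app/public, no URL maps to the uploads; only the download script can hand the files out, after whatever access checks you implement.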
To answer the question specifically: 766 (no execute permission for group or others) would be the loosest you would want to use. At the other end, 700 would allow no one but the web user to touch the files.
But really, it all depends on what you're doing with the files.