Is it possible to restrict PHP's file access to its document root? Basically, on my work's server we have our domains in a file structure like:
/home/something/domains/domain1/
/home/something/domains/domain2/
/home/something/domains/domain3/
Right now, any script on domain1 has read and write access to anything in /home/something/domains/ including all of our other domains. I would like to restrict file access for a script in domain1 to only that domain.
The server is hosted with MediaTemple on their Grid service, so I don't have root access or even vhost config access. We can change php.ini, and I know it offers open_basedir, but that doesn't seem to solve my problem, since I could only restrict file access to /domains/ and not to the individual domains. Any help appreciated.
What I'm really trying to do:
This server was recently hacked, and the hackers were overwriting domains/.htaccess, which affected all our sites. We have tons of sites, and many of them contain lots of lines of bad code. The attackers uploaded WSO, a hacking backdoor/shell which gave them full access to everything. I don't know how they got in; I guess it was either the TimThumb exploit, one of the millions of lines of bad code, or they somehow got our FTP password. We updated the old TimThumb copies, changed the password, and removed all the bad files we found, but since there is a decent chance that whatever exploit they used is still on the server, or that we missed some backdoor, I would at least like to limit their access to the single domain that contains the exploit or unfound backdoor.
My initial thought was to set open_basedir for each of the virtual hosts (even if you have to ask your host admin to do it for you), but I am doubtful that alone will solve the problem: I am fairly certain external/shell commands run from PHP scripts can still operate on directories outside the designated tree.
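For reference, the two php.ini directives involved would look like this (the paths are placeholders; note that disable_functions can only be set in php.ini itself, and a single php.ini shared by all domains can only pin everything to the common parent directory, which is exactly the limitation described in the question):

open_basedir = "/home/something/domains/domain1/:/tmp/"
disable_functions = "exec,passthru,shell_exec,system,proc_open,popen"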
After more consideration, the closest way I can think of to get what you want with your setup would be to create chroot-jailed user accounts for each vhost and have your webserver use those accounts through a mechanism like the Apache 2 MPM ITK, which I can only assume your hosting provider will have trouble setting up.
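For what it's worth, with mpm-itk the per-vhost user assignment is a single directive in each vhost block. A sketch only; the usernames are made up, and this requires control of the Apache config, which the Grid service doesn't give you:

<VirtualHost *:80>
    ServerName domain1.example
    DocumentRoot /home/something/domains/domain1
    AssignUserID domain1user domain1group
</VirtualHost>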
Related
I'm trying to replace the contents of a file in a different folder than my PHP file's folder, and I'm getting an error: "Failed to open stream, Permission Denied"
I'm using the file_put_contents function to change the file contents. I searched online for a solution and found that the directory is only allowed to be written to by its owner/user. I checked the directory properties in FileZilla (FTP) and confirmed that the directory was not writable by group or public.
In FileZilla, I tried allowing the directory to be written to by public, and the PHP file was then able to write to the file in that folder.
Therefore, I think I can set the permissions on the file only, not the directory, and replace its content by making just that file publicly writable. But I don't understand what the owner/group/public options mean. This is supposed to be a website's webserver on a paid hosting plan, and I'm not sure whether the public write option is safe, or why there would even be user groups for a webserver hosting only a single website.
Since only a PHP file can change the contents on a webserver, why is a public option provided at all? If it's for uploads, then the upload page resides on the server too! I cannot access a terminal over FTP or cPanel, so I cannot run chmod, etc.
Please could someone provide more detail regarding security risks to files with public write permissions?
A server is still a computer, like any other. This means that several users can exist on it, and that a lot of things can happen.
Your server might not necessarily stay only a web server. Obviously it is also an FTP server, and you can probably also access it using SSH. In the future you might want to use it for another purpose.
In any case, even if you use it just as a web server, setting a folder's permissions as writable by every user is dangerous because of the security holes your code will almost inevitably have. Restrictive permissions act as a safeguard: even if you write something dangerous in the future, they contain the damage.
That is the reason why, on some systems, the software responsible for serving HTTP requests (typically Apache) runs as a specific user such as httpd.
User groups are just what the name says: defined groups of users. You can read about them easily online.
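To make the owner/group/public distinction concrete, here is a small sketch in PHP (the path is hypothetical, and chmod() from PHP only succeeds on files the PHP process actually owns):

<?php
// Octal permission bits: owner / group / others ("public" in FileZilla).
// 0644 = owner read+write, group read-only, others read-only.
// 0666 = everyone can write - this is what "public write" means; avoid it.
$file = '/home/example/public_html/data/counter.txt'; // hypothetical path

// Loosen just this one file instead of the whole directory:
if (@chmod($file, 0664)) {
    file_put_contents($file, "updated\n");
    chmod($file, 0644); // tighten again afterwards
}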
A strange thing occurred today. I made a CI-based site, and a hacker managed to:
Overwrite my index.php file by making a file upload to root;
Inject code directly into my index.php, replacing everything with a dummy HTML-formatted page.
I don't know which of the above actually occurred.
The site is quite simple (no input forms, no DB, etc.). I started developing it with CodeIgniter since the client didn't know what he wanted, so I ended up using the framework just for templating and compression.
I have strong doubts that a security hole was offered to the hacker on the PHP side. I am inclined to believe the issue lies in my hosting service's bad server configuration (I had a heated chat with them; they say they will look into it).
I find it very curious that only the index.php was (apparently) modified. The application and system folders are also in the root, since I have no FTP access above it; if I were a hacker, I would have deleted every file in the root before letting my fancy index perform its show.
How did this happen? What do you think is most likely possible?
Unfortunately, no one will give you a straight answer without full access to the server, the system logs, etc. It could be one of many things: on shared hosting, bad configuration of the server alone is often enough (meaning that whoever compromises one site has compromised them all). It could be an outdated service on the server, where the attacker used a publicly available exploit. It might also be a CI-based exploit, private or public...
Chances are, if you are confident that your website itself couldn't have been hacked, it is most likely a badly configured shared hosting environment with loose permissions, allowing the attacker to reach system commands and folders that don't belong to his user. That is typically preceded by uploading a PHP shell through some vulnerable site on the same box, after which it is as simple as browsing the web server's folders.
The second most likely cause, I would say, is an outdated, exploitable service running on the shared host.
If there is any "signature" in the HTML you were talking about, you might want to Google it and see what comes back. You can also try to execute some system commands via PHP, something you shouldn't be able to do, such as running ls a level below your web root; if you can, it is likely the attacker reached your files that way.
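A quick probe along those lines might look like this (assuming shell_exec isn't disabled on your host; adjust the path so it points outside your own account):

<?php
// If this lists a directory outside your own account, user isolation is weak.
$out = shell_exec('ls -la ../../.. 2>&1');
echo '<pre>' . htmlspecialchars((string) $out) . '</pre>';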
I run a server where several users have their own web directories, located at /home/user/public_html.
Nginx is running as the http user.
Everything works OK, but if a PHP script needs to create a file, it fails with a permission error.
How may I fix this?
Is it possible to tell Nginx to create all files under /home/username as "username"?
Regards && TIA ^^.
Yes, this is possible. Actually it's not Nginx; it's the FastCGI PHP module which acts as a specific user. There is a good explanation available of how to set up a Debian/Ubuntu system to do that.
Additionally, I would create two users and one group per account (e.g. user1, user1-www and user1-group). For FTP you can use user1/user1-group, while the FastCGI PHP process is configured to run as user1-www/user1-group.
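With PHP-FPM, that per-account split is a short pool definition. A sketch, with the pool name, socket path and usernames as placeholders:

[user1]
user = user1-www
group = user1-group
listen = /var/run/php-fpm-user1.sock
listen.owner = http
listen.group = http
php_admin_value[open_basedir] = /home/user1/public_html/:/tmp/

In the matching Nginx server block you would then point fastcgi_pass at unix:/var/run/php-fpm-user1.sock.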
Now you can limit read and write access to the folders of one account.
Your clients are free to allow or forbid write access to any file or directory based on their requirements.
If a PHP process can modify PHP code, you are exposed to being hacked and to your sites being manipulated; limiting write access to data files only reduces this danger. If you were on your own I wouldn't mind, but providing hosting services to other people increases your responsibility.
On my webserver I have two directories which are publicly accessible. One of them is httpdocs/ and the other is httpsdocs/. You can probably guess what they're for.
The problem came up when I wanted to create a CMS for myself. I want it to load pages over HTTPS for security reasons, but my files are inside the httpdocs directory. I could move just the CMS files to httpsdocs, but then what if I also want a login on the normal website, so people can sign in and view content that is hidden from unregistered users?
Now my question is: is there a way to combine those two directories, so all the files live inside httpdocs and I can still access them with the security HTTPS provides?
Things that you may need to know:
The webserver runs PHP 5.2.9
I only have FTP access to the webserver and to these directories.
The server runs apache, so .htaccess is possible.
There are a ton of avenues, but most are blocked if you can't alter httpd.conf or if certain restrictions are in place.
Your best bet would probably be to simply abandon http: and stick to https:, creating a redirecting .htaccess in the http: folder. That way everything runs from a single directory.
Unless, of course, there is some drawback that would prevent your users from using HTTPS instead of HTTP.
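Such a redirect can be as small as this .htaccess in httpdocs/ (assuming mod_rewrite is available; example.com is a placeholder for your domain):

RewriteEngine On
RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]

Since httpdocs/ only ever serves plain HTTP requests, no RewriteCond is needed here.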
Getting your ISP to make the https folder a symlink to the http folder would be the easiest solution here.
It sounds like you are using a hosting service that keeps these separated. If that's the case, then the answer is no, you can't combine these two directories into one. You can ask your provider if they can make that arrangement, but it would require changes to the Apache configuration that you can't make yourself.
That being said, barring the ability to have that configuration modified, the only other answer you are going to find is to move everything under the httpsdocs directory, unless you can get another account set up or your host offers the ability to set up subdomains with HTTPS connections; either of those would be an alternative solution.
I have a Joomla 1.0 website running on a shared host to which I don't have shell access (only FTP is available). Recently my website was flagged as a malware site by Google, and I noticed that the .htaccess file had been modified with malicious content. Redirection rules to a website called 'depositpeter.ru' were added to the .htaccess:
ErrorDocument 400 http://depositpeter.ru/mnp/index.php
ErrorDocument 401 http://depositpeter.ru/mnp/index.php
...
If I clean this .htaccess file, it gets modified back with the malicious contents a few minutes later.
I suspect some backdoor PHP and JavaScript has been injected into our codebase that constantly modifies the .htaccess file. However, I have no idea how this malware landed on my site in the first place; I'm pretty sure no FTP user uploaded it. A virus scan found a user-uploaded image injected with PHP.ShellExec malware (I'm not sure how this PHP.ShellExec works or whether it is related to the .htaccess virus, though).
My question is: how should I start troubleshooting and cleaning up this malware? I'm pretty clueless and have little experience dealing with web malware. Any help is greatly appreciated!
It might be beyond your power to fix this yourself. But here are some things that you should do.
Download any Apache/PHP logs you have; these can point to the security holes being exploited. If you can find the relevant entries, make sure the holes are closed.
Remove the image that is indicated as infected.
Contact your host - several hosting companies have automated solutions to find and clean up common vulnerabilities. Also, if your site is infected, odds are, other clients on the same server are, too.
Conversely, it might be another client on the same server that's causing this problem for you.
Add an .htaccess file in the uploads directory that would prevent access to anything other than uploaded images. It might look something like this:
Order deny,allow
Deny from all
<FilesMatch "\.(jpe?g|bmp|png)$">
Allow from all
</FilesMatch>
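Since the scan found PHP code hidden inside an uploaded image, it is also worth making sure nothing in the uploads directory can be executed as PHP. If your host runs PHP as an Apache module, one line added to that same .htaccess does it (this is mod_php-specific; CGI/FastCGI setups need a different mechanism):

php_flag engine off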
If your host hasn't blocked the functions that allow PHP to invoke system commands (you'd be surprised) and you know what to do, you can mimic shell access with a custom PHP script using system, exec, popen and some other functions. I use a script I made myself: https://github.com/DCoderLT/Misc_Tools/blob/master/sh/sh.php . It's fairly primitive, but it got the job done when I needed it to.
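Not that script, but a bare-bones sketch of the idea (the token name and secret are placeholders you must change; upload it over FTP, use it, and delete it immediately, since anyone who finds it can run commands):

<?php
// Minimal emergency "shell over HTTP". Use once, then delete.
if (!isset($_GET['token']) || $_GET['token'] !== 'change-this-long-random-secret') {
    header('HTTP/1.0 404 Not Found');
    exit;
}
header('Content-Type: text/plain');
$cmd = isset($_POST['cmd']) ? $_POST['cmd'] : 'id';
system($cmd . ' 2>&1'); // merge stderr into the output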
Future considerations:
Make backups. Your hosting company might provide these going back a certain period of time.
Keep an eye on updates. Subscribe to the Joomla announcements mailing list and apply updates as quickly as you can. Popular applications like Joomla and WordPress are frequent and easy targets for script kiddies and automated bots.
Make backups.
Make sure your hosting company has the server set up properly, so that user A cannot affect user B's files (file permissions, suEXEC or similar). I don't know how common this is these days, but it used to be a frequent oversight.
Make backups.
Don't leave write permissions enabled on files and folders that don't need it.
Make backups.
What kind of PHP framework/CMS are you running there? The first thing would be to update it. The second would be to remove write permissions on the directories where the PHP shell gets put. The third thing I'd do is remove the PHP shell itself (look for files that don't belong to your CMS/framework).
Good luck!