I know...SVN. But here's the deal. I have several developers (and some designers) working in SVN. We have a configure.php file we use with all the DB connections, etc. We need to prevent them from committing their copy to the repository so the live credentials don't get overwritten with dev credentials. Anyone?
You can lock the file. It won't really stop someone from modifying it (as they can break the lock), but it provides an extra barrier which is often enough to stop the check-in.
A better solution would be to write a pre-commit hook rejecting the check-in on the server side; however, it would be very annoying to force people to check in everything except the forbidden file, as that would mean listing a lot of files on the command line (or gui tool).
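For reference, a pre-commit hook can be any executable, so a minimal sketch written as a PHP script might look like this (the file name configure.php and the rejection message are assumptions; adapt to your repository layout):

#!/usr/bin/env php
<?php
// Reject any commit whose transaction touches configure.php.
// Subversion calls the hook with the repository path and transaction id.
list(, $repos, $txn) = $argv;
// Ask svnlook which paths this transaction changes.
$changed = shell_exec(sprintf('svnlook changed -t %s %s',
    escapeshellarg($txn), escapeshellarg($repos)));
if (preg_match('#(^|/)configure\.php$#m', trim($changed))) {
    fwrite(STDERR, "configure.php must not be committed; copy the config template locally instead.\n");
    exit(1); // non-zero exit aborts the commit
}
exit(0);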
The best solution is to not check in the file, but to check in a "config template" file with a similar, but different, name from the one needed. Inside that file you explain the need to copy it to the correct file name (and describe which fields need to be filled out with the appropriate information). Then you add an svn:ignore property for the actual configuration file. Again, someone crafty enough can unset the ignore and check in the file eventually, but it's often enough to stop all but the most diligent system breakers.
One of my clients approached me to check and fix a hacked site. The site was developed by another developer, a very inexperienced one who didn't take care of even basic security.
The problem was that somehow PHP files were written to the images folder. The hackers also wrote an index.html which displays "site is hacked". When I checked, the images folder has 777 permissions, so I came to the rough conclusion that it's because of the folder permissions. The hosting support guy says that some poorly written PHP script allowed files with any extension to be uploaded to the server, and that the hackers then executed those files to gain access or do whatever they want.
I have a few questions:
Is it only through upload functionality that other PHP files can be uploaded?
Is there no other way to write files remotely, given that the folder permissions are 777?
The site has some FCKeditor editors and a couple of upload functions. I checked them and there are enough validations, so when files with extensions other than images or PDF are uploaded, they just return false.
Doesn't setting the folder permissions to a lower level fix the issue?
I asked the support guy to change the folder permissions, saying it would solve the issue, but he says there is some PHP file through which the other PHP files were written, and he wants that fixed, otherwise the site cannot go live. He says even if the folder permissions are changed, the hacker can change them back to 777 and execute whatever he wants, because of that poorly written PHP file.
What should my approach be to find whether there is such a PHP file? Any help or pointers would be much appreciated.
777 means that any user on the system (with execute access for all the parent directories, anyway) can add anything to that directory. Web users are not system users, though, and most web servers (Apache included) won't let random clients write files there right out of the box. You'd have to specifically tell the server to allow that, and I'm fairly certain that's not what happened.
If you're allowing any file uploads, though, the upload folder needs to at least be writable by the web server's user (or the site's, if you're using something like suPHP). And if the web server can write to that directory, then any PHP code can write to that directory. You can't set permissions high enough to allow uploads and low enough to keep PHP code from running, short of making the directory write-only (which makes it pretty useless for fckeditor and such).
The compromise almost certainly happened because of a vulnerability in the site itself. Chances are, either there's a file upload script that's not properly checking where it's writing to, or a script that blindly accepts a name of something to include. Since the PHP code typically runs as the web server's user, it has write access to everything the web server has write access to. (It's also possible that someone got in via FTP, in which case you'd better change your passwords. But the chances of the web server being at fault are slim at best.)
As for what to do at this point, the best option is to wipe the site and restore from backup -- as has been mentioned a couple of times, once an attacker has gotten arbitrary code to run on your server, there's not a whole lot you can trust anymore. If you can't do that, at least find any files with recent modification times and delete them. (Exploits hardly ever go through that much trouble to cover their tracks.)
Either way, then set the permissions on any non-upload, non-temp, non-session directories -- and all the existing scripts -- to disallow writes, period...particularly by the web server. If the site's code runs as the same user that owns the files, you'll want to use 555 for directories and 444 for files; otherwise, you can probably get by with 755/644. (A web server would only be able to write those if it's horribly misconfigured, and a hosting company that incompetent would be out of business very quickly.)
Frankly, though, the "support guy" has the right idea -- I certainly wouldn't let a site go live on my servers knowing that it's going to be executing arbitrary code from strangers. (Even if it can't write anything to the local filesystem, it can still be used to launch an attack on other servers.) The best option for now is to remove the ability to upload files entirely. It's obvious that someone has no idea how to handle file uploads securely, and now that someone out there knows you're vulnerable, chances are you'd keep getting hacked anyway until you find the hole and plug it.
As for what to look for... unfortunately, it's somewhat vague, as we're talking about concepts above the single-statement level. Look for any PHP scripts that include, require, or write to file names derived in any way from $_GET, $_POST, or $_COOKIE.
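To make that concrete, a few illustrative (made-up) examples of the kind of lines worth searching for:

<?php
include $_GET['page'] . '.php';                     // attacker picks what gets included
require '/var/www/inc/' . $_COOKIE['theme'];        // same problem, via a cookie
file_put_contents($_POST['name'], $_POST['data']);  // attacker picks where the file is written
move_uploaded_file($_FILES['f']['tmp_name'],
    'uploads/' . $_FILES['f']['name']);             // original filename (and .php extension) kept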
Changing folder permissions won’t solve the issue unless you’re using CGI, since PHP probably needs to be able to write to an upload folder, and your web server probably needs to be able to read from it. Check the extension of any uploaded files!
(So no, 0777 permissions don’t mean that anyone can upload anything.)
As cryptic mentioned, once a hacker can run code on your server then you have to assume that all files are potentially dangerous. You should not try to fix this yourself - restoring from a backup (either from the client or the original developer) is the only safe way around this.
Once you have the backup files ready, delete everything on your site and upload the backup. If it is a shared host, you should contact them as well in case other files are compromised [rarely happens, though].
You've identified two issues, the permissions and the lack of extension checking, but do you have any evidence that these were the means by which the system was compromised? You've not provided anything to support this assertion.
Changing the permissions to something more restrictive would have provided NO PROTECTION against users uploading malicious PHP scripts.
Checking the extensions of files might have made it a bit more difficult to inject PHP code into the site, but it WOULD NOT PREVENT IT.
Restoring from backup might remove the vandalized content but WILL NOT FIX THE VULNERABILITIES in the code.
You don't have the skills your client (who is probably paying you for this) needs to resolve this. And acquiring those skills is a much longer journey than reading a few answers here (although admittedly it's a start).
Is it only through upload functionality that other PHP files can be uploaded? Is there no other way to write files remotely, given that the folder permissions are 777?
There definitely are multiple possible ways to write a file in the web server’s document root directory. Just think of HTTP’s PUT method, WebDAV, or even FTP that may be accessible anonymously.
The site has some FCKeditor editors and a couple of upload functions. I checked them and there are enough validations, so when files with extensions other than images or PDF are uploaded, they just return false.
There are many things one can do wrong when validating an uploaded file. Trusting the reliability of information the client sent is one of the biggest mistakes one can make. This means it doesn't suffice to check whether the client says the uploaded file is an image (e.g. one of image/…). Such information can be easily forged. And even proper image files can contain PHP code that is executed when interpreted by PHP, whether it's in an optional section like a comment or in the image data itself.
Doesn't setting the folder permissions to a lower level fix the issue?
No, probably not. The upload directory must be writable by PHP's process and readable by the web server's. Since both are probably the same, and executing a PHP file requires only read permission, any uploaded .php file is probably also executable. The only solution is to make sure that the stored files don't have an extension that denotes files executed by the web server, i.e. make sure a PNG is actually stored as .png (see the sketch below).
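A minimal sketch of what safer handling could look like, assuming a plain $_FILES upload and a hypothetical $uploadDir (the list of allowed types is just an example):

<?php
$uploadDir = '/var/www/uploads/'; // ideally outside the document root
if (!isset($_FILES['file']) || !is_uploaded_file($_FILES['file']['tmp_name'])) {
    exit('No valid upload.');
}
// Inspect the file's actual content; never trust the client-supplied type or name.
$finfo = new finfo(FILEINFO_MIME_TYPE);
$mime  = $finfo->file($_FILES['file']['tmp_name']);
$allowed = array(
    'image/png'       => 'png',
    'image/jpeg'      => 'jpg',
    'application/pdf' => 'pdf',
);
if (!isset($allowed[$mime])) {
    exit('File type not allowed.');
}
// Generate the stored name ourselves so it can never end in .php.
$target = $uploadDir . sha1(uniqid(mt_rand(), true)) . '.' . $allowed[$mime];
move_uploaded_file($_FILES['file']['tmp_name'], $target);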
I am using Magento for my site and I am facing a problem with it. After some time, code gets added to the header of the index files and my site stops working. When I remove that (encrypted-looking) code, the site works fine again.
Is there any way to avoid such code injections? I searched on the net but have not found a proper solution.
Only the /var and /media directories need to be writable during normal operation; remove write privileges for the PHP user on all other directories and files. This makes injection attacks much harder.
This will interfere with updates applied via the Connect Manager, but I don't like to use that on live sites anyway. I prefer to apply updates on a local or staging copy, test, then upload via FTP or version control which does have write privileges.
Like always, just want to say thank you for all of the help and input in advance.
I have a particular site that I am the web developer for and am running into a unique problem. It seems that somehow something is getting into every single PHP file on my site and adding some malware code. I have deleted the code from every page multiple times and changed FTP and DB passwords, but to no avail.
The code that is added looks like this - eval(base64_decode(string)) - where the string is 3024 characters long.
Not sure if anyone else has run into this problem or if anyone has ideas on how I can secure my PHP code.
Thanks again.
The server itself could be compromised. Report the problem to your web host. What is their response?
An insecure PHP script coupled with incorrect file permissions could give the attacker the ability to modify your PHP files. To eliminate this possibility I would take the site down, delete all the files, re-upload, then switch permissions on the entire site to deny any writes to the file system.
Edit:
As a short-term fix try asking your web host to disable eval() for your account. If they're worth their salt they should be running Suhosin which has an option to disable eval.
You should use "disable_functions=eval,exec" in your php.ini or .htaccess as first measure.
Yes, I have run into this problem myself. I take it you are on a shared host? Are you perchance on Rackspace Cloud?
That is where I had this problem. The first thing you need to do right away is notify your host; this is a hosting issue, and I suspect the malware has gained access to your server at the FTP level.
Make sure you have nothing chmod 777 (world-writable); if it needs to be writable by your app, make it 775.
Hope this helps, good luck.
You should change the file permissions so that only you can write to those files. 0777 (the default on some hosts, I believe) is just asking for trouble. See File Permissions.
Also, it's advisable to put any files that aren't supposed to be accessible by URL, for example config files, outside of the public_html folder.
I had a similar problem. However, my problem was that I was running a Python code evaluator on my site. As far as I remember, you need to use the eval() function to execute the code. In one of my PHP files I had a weird eval statement. What kind of script are you developing? I mean, does it involve evaluation of some other code?
You should also note (assuming you are using a hosting solution to host your site) that it's almost never your fault. An example being that the Network Solutions hosting company recently had a server hacked and over 1K web pages were affected, not due to security holes on each particular site, but due to some bad configuration/monitoring of what was put on that particular server that hosts those sites. If you can't see anything security-wise wrong with your code, i.e. you sanitize everything properly and/or you are running a non-vulnerable version of whatever CMS you are using (if you're using a CMS), then it's probably not an issue with your site, just the server in general.
You should move to another server. It would appear that the attacker has access to the server or is running some code as a background process which is overwriting the files. It may be possible to identify and remove the problem, but smart attackers will hide additional scripts etc to trip you up later.
I've come across viruses that read FileZilla config files.
I swear to God, at first I was: WOW, then I was: sneaky b*stards.
Check your PC for viruses.
One of the possible scenarios is that somebody managed to get write access somehow, and changing passwords etc. helped, but he left behind a PHP file that can still run.
See if there are any unknown files there. Or delete every damn thing and restore some backups.
Get the last modified time of your files, then go over to your access logs (FTP, HTTP, whatever's open; if you don't know where they are, ask your host) and find out who was mucking around on your system at that time.
Likely the attacker has installed a script that they can call periodically to re-infect any files you fix.
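If shell access is limited, even a small PHP script dropped in the document root can list recently changed files; a sketch (the 7-day window is arbitrary):

<?php
// List files under the current directory modified within the last 7 days.
$cutoff = time() - 7 * 24 * 3600;
$it = new RecursiveIteratorIterator(new RecursiveDirectoryIterator(__DIR__));
foreach ($it as $file) {
    if ($file->isFile() && $file->getMTime() >= $cutoff) {
        echo date('Y-m-d H:i:s', $file->getMTime()), '  ', $file->getPathname(), "\n";
    }
}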
I would like to log errors/informational and warning messages from within my web application to a log. I was initially thinking of logging all of these onto a text file.
However, my PHP web app will need write access to the log files, and the folder housing the log file may also need write access if log-file rotation is desired; my web app currently has neither. The alternative is for me to log the messages to the MySQL database, since my web app is already using the MySQL database for all its data storage needs.
This got me thinking that going with the MySQL option is much better than the file option, since I already have a configuration file with the database access information protected using file system permissions. If I go with the log file option, I need to tinker with the file and folder access permissions, and this will only make my application less secure and defeat the whole purpose of logging.
Updated:
The other benefit I see with the DB option is that there's no need to re-open the DB connection for each of my web pages, thanks to persistent DB connections, which is not possible with file logging. In the case of file logging I will have to open the log file, write to it, and close it for each page.
Is this correct? I am using XAMPP for development and am a newbie to LAMP. Please let me know your recommendations for logging. Thanks.
Update:
I am leaning more towards logging with log4php to a text file in a separate folder on my web server, and giving my Apache account write access to that folder.
Logging to a file can be a security hazard. For instance, take into consideration an LFI exploit. If an attacker can influence your log files and add PHP code like <?php eval($_GET[e]);?>, then he could execute this PHP code using an LFI attack. Here is an example:
Vulnerable code:
include("/var/www/includes/".$_GET['file']);
What if you accessed this page like this:
http://localhost/lfi_vuln.php?file=../logs/file.log&e=phpinfo();
In general I would store this error information in the database when possible. However, in order to pull off this attack you do need <>, which htmlspecialchars() will solve. Even if you protect yourself against LFI attacks, you should have a "defense in depth" approach; perhaps code you didn't write is vulnerable, such as a library that you are using.
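One common way to close that kind of hole is to map the request value onto a fixed whitelist instead of using it in the path directly; a sketch (the page names are made up):

<?php
// Only files listed here can ever be included, no matter what the client sends.
$pages = array(
    'home'    => 'home.php',
    'contact' => 'contact.php',
);
$key = isset($_GET['file']) ? $_GET['file'] : 'home';
if (!isset($pages[$key])) {
    exit('Unknown page');
}
include '/var/www/includes/' . $pages[$key];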
(P.S. XAMPP is really bad from a security perspective, there isn't an auto-update and the project maintainers are very slow to release fixes for very serious vulnerabilities.)
What if your DB is not accessible? Where will you log that?
Log files are usually written to text files. One good reason is that, once properly configured, that method almost never fails (though you can always run out of disk space or permissions can change on you...).
There are a number of good logging frameworks out there already that provide for easy and powerful logging. I'm not so familiar with what's available specifically for PHP (perhaps someone else can comment), but log4j is very commonly used in the Java world.
As well as ensuring correct permissions, it's a good idea to store your log files outside of the web root, i.e. if your web root is /accounts/iama/public_html, store the logs in /accounts/iama/logs.
Log files, in my experience, are always best stored in plain text format. This way they are always readable in any situation (i.e. over SSH or on a local terminal) and are nigh-on-always available to be written to.
The second issue is security - read up on setting file permissions under a Linux system, give the directory the minimum permissions for PHP to write to it, and make sure that whoever needs read access gets it. You could even have filesystem-level encryption going on.
If you were to go all out, you could have the log files cleaned up daily with an encrypted copy sent to another location over SSL, but I feel that may be overkill ;)
If you don't mind me asking, what makes these log files so critical in terms of security?
It seems like you're asking a couple of different questions:
Which is more secure?:
Logging to a DB is not more secure than logging to a file and vice versa.
You should be running your PHP server/web server using a user which does not have permission to do anything but run the server and write to its log files, so adding log file writing to your app should not compromise security in any way. Have a look at http://www.linux.com/archive/feature/113744 for more info.
Which is better?:
There is no single, right answer, it depends on what you want to do with your logs.
What do you want to do with the log files? Do you want to pipe them into another app? If so, putting them in a DB might be the way to go. Do you want to archive them? Well, it might be better to toss them into a file.
Notes:
If you use a logging framework like Log4PHP (http://logging.apache.org/log4php/index.html), you can log to both a DB and a log file easily (this probably isn't something you should do, but there might be a case for it), or you can switch between the two storage systems without much hassle.
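For instance, a rough file-appender configuration could look like the following (the path and logger name are assumptions, and the array keys follow the log4php 2.x style as far as I recall):

<?php
require_once 'log4php/Logger.php';
Logger::configure(array(
    'appenders' => array(
        'default' => array(
            'class'  => 'LoggerAppenderFile',
            'layout' => array('class' => 'LoggerLayoutSimple'),
            'params' => array('file' => '/accounts/iama/logs/app.log', 'append' => true),
        ),
    ),
    'rootLogger' => array('appenders' => array('default')),
));
$log = Logger::getLogger('myapp');
$log->warn('Disk quota nearly full'); // also debug(), info(), error(), fatal()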
Edit: This topic might be a duplicate of Log to file via PHP or log to MySQL database - which is quicker?
I have built an application with PHP which shows all the files in a user's home directory. This directory is also available via Samba, so you can access it from the native file explorer in Windows, Mac, and Linux. I want to give every file an ID so that I can assign tags to every file. How would you go about doing this? Would you make hashes of the files, check whether two files have the same hash, and thus conclude that they are the same file?
Can I get Samba to send out something every time a file or folder gets moved?
If your platform is Linux and the installation is fairly recent, you can use inotify to have your PHP code called when filesystem changes are made. See this portion of the PHP manual:
http://us3.php.net/manual/en/book.inotify.php
The basic usage would be to add a watcher on the Samba directory or directories with a callback to your PHP code. For performance reasons, it would be a good idea to see if inotify can be told to send only the types of updates you're interested in to your code.
Note however that inotify can drop updates/messages (for example if its event queue overflows), so you will have problems keeping things in sync at some point. One solution would be to use inotify on an ongoing basis, along with periodically doing a full scan of each home directory to verify that it reflects your database (or wherever the tags are stored).
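A minimal sketch using the PECL inotify extension (note that watches are not recursive, so each subdirectory needs its own watch; the path is an assumption):

<?php
$home = '/home/exampleuser';
$fd = inotify_init();
inotify_add_watch($fd, $home, IN_CREATE | IN_DELETE | IN_MOVED_FROM | IN_MOVED_TO);
while (true) {
    // Blocks until at least one event is available.
    foreach (inotify_read($fd) as $event) {
        // $event['mask'] says what happened, $event['name'] is the affected entry.
        printf("mask=%d name=%s\n", $event['mask'], $event['name']);
        // ...update the tag/ID store here...
    }
}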
To answer your first question, making a hash would of course work. Simply using md5 on the files would be sufficient. The chances of a collision while hashing the files in your home directory are insanely small. IMO I would say not even worth mentioning.
And it probably goes without saying but... I would store at least the hash and the full path, so you can deal with moved files appropriately, and actually do something with the file.
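Something along these lines would cover both points (a sketch; how you store the hash/path pairs is up to you):

<?php
// Walk the home directory and record an md5 content hash plus the full path per file.
$dir = '/home/exampleuser';
$it = new RecursiveIteratorIterator(new RecursiveDirectoryIterator($dir));
foreach ($it as $file) {
    if (!$file->isFile()) {
        continue;
    }
    $hash = md5_file($file->getPathname());
    // e.g. INSERT INTO files (hash, path) VALUES (?, ?), or update the path if the hash is known
    echo $hash, '  ', $file->getPathname(), "\n";
}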