How to lock (as in, prevent changes to) an uploaded file - PHP

I am working on a web application where file uploads and revisions are tracked. Once a project is ready, it is submitted for an approval process. Upon "submit for approval", I want to lock down all of the project's attached files to prevent further changes.
The file uploads are handled by my own simple forms, and the files are tracked in a MySQL db.
Is there any way to set the files as read-only so they cannot be deleted, renamed, moved, etc., but can still be viewed? (Prevent change even by FTP or a cPanel file manager.)
The idea is to protect the integrity of what has been approved. At the least I will be using .htaccess to prevent viewing the uploads folder directly.
Obviously, someone could SSH into the server, sudo su, and do whatever, but I am thinking of the less tech-savvy folks who need a GUI.

You want application-level control, not operating-system-level control. I.e., don't set the files to read-only. Instead, have your application recognize that the project is in "submitted" status, and disable the features that would allow a user to change them.
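A minimal sketch of such a guard, assuming a hypothetical `projects` table with a `status` column (names here are illustrative, not prescribed):

```php
<?php
// Hypothetical guard for any handler that modifies project files.
// The projects table and its status column are illustrative names.
function assertProjectEditable(PDO $pdo, int $projectId): void
{
    $stmt = $pdo->prepare('SELECT status FROM projects WHERE id = ?');
    $stmt->execute([$projectId]);

    if (in_array($stmt->fetchColumn(), ['submitted', 'approved'], true)) {
        http_response_code(403);
        exit('This project is locked for approval; its files cannot be changed.');
    }
}
```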
If you're worried about somebody changing files with FTP or cPanel outside the project app then you likely have a personnel issue, not a technical one. I'm never a fan of using technology to solve a personnel issue.
That said, to additionally ensure that no OS-level changes have occurred, you can generate a hash of the contents of all the component files at the time the project is submitted, and then store those hashes in the database along with the project data. This will allow you, at any later date, to determine whether the files have changed since then.
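A rough sketch of that approach (the `file_hashes` table and function names are assumptions, not a prescribed schema):

```php
<?php
// Sketch: snapshot SHA-256 hashes of a project's files at submission time.
// The file_hashes table layout is an assumption, not a prescribed schema.
function snapshotProjectHashes(PDO $pdo, int $projectId, array $filePaths): void
{
    $stmt = $pdo->prepare(
        'INSERT INTO file_hashes (project_id, path, sha256, hashed_at)
         VALUES (?, ?, ?, NOW())'
    );
    foreach ($filePaths as $path) {
        $stmt->execute([$projectId, $path, hash_file('sha256', $path)]);
    }
}

// Later, at any date, verify a file against its stored hash:
function fileUnchanged(string $path, string $storedHash): bool
{
    return hash_equals($storedHash, hash_file('sha256', $path));
}
```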

Related

PHP security issue - file uploads

One of my clients approached me to check and fix a hacked site. The site was developed by another developer, a very inexperienced one who hadn't taken care of even basic security.
Well, the problem was that somehow PHP files were written to the images folder. The hackers also wrote an index.html which displays that the site is hacked. When I checked, the images folder had 777 permissions, so I came to the rough conclusion that it's because of the folder permissions. The hosting support guy says that some PHP file has a poorly written script which allowed a file with any extension to be uploaded to the server, and then the hackers executed those files to gain access or do whatever they want.
I have a few questions:
Is it only through upload functionality that other PHP files can be uploaded?
Is there no other way to write files remotely, given that the folder permissions are 777?
The site has some FCKeditor instances and a couple of upload functions. I checked them; there are enough validations, so when files with extensions other than images or PDF are uploaded they just return false.
Doesn't setting folder permissions to a lower level fix the issue?
I asked the support guy to change the folder permissions, saying it would solve the issue, but he says there is some PHP file through which the other PHP files were written, and he wants that fixed, otherwise the site cannot go live. He says that even if the folder permissions are changed, the hacker can change them back to 777 and execute whatever he wants, because of that poorly written PHP file.
How should I go about finding whether there is such a PHP file? Any help or pointers would be much appreciated.
777 means that any user on the system (with execute access for all the parent directories, anyway) can add anything to that directory. Web users are not system users, though, and most web servers (Apache included) won't let random clients write files there right out of the box. You'd have to specifically tell the server to allow that, and I'm fairly certain that's not what happened.
If you're allowing any file uploads, though, the upload folder needs to at least be writable by the web server's user (or the site's, if you're using something like suPHP). And if the web server can write to that directory, then any PHP code can write to that directory. You can't set permissions high enough to allow uploads and low enough to keep PHP code from running, short of making the directory write-only (which makes it pretty useless for fckeditor and such).
The compromise almost certainly happened because of a vulnerability in the site itself. Chances are, either there's a file upload script that's not properly checking where it's writing to, or a script that blindly accepts a name of something to include. Since the PHP code typically runs as the web server's user, it has write access to everything the web server has write access to. (It's also possible that someone got in via FTP, in which case you'd better change your passwords. But the chances of the web server being at fault are slim at best.)
As for what to do at this point, the best option is to wipe the site and restore from backup -- as has been mentioned a couple of times, once an attacker has gotten arbitrary code to run on your server, there's not a whole lot you can trust anymore. If you can't do that, at least find any files with recent modification times and delete them. (Exploits hardly ever go through that much trouble to cover their tracks.)
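If you need to do that sweep in PHP rather than from a shell, a minimal sketch might look like this (the root path and cutoff are placeholders; review matches by hand before deleting anything):

```php
<?php
// Sketch: list files modified in the last N days, to spot dropped shells.
// Adjust $root and $days for your layout.
$root = '/path/to/site';   // hypothetical document root
$days = 14;
$cutoff = time() - $days * 86400;

$it = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($root, FilesystemIterator::SKIP_DOTS)
);
foreach ($it as $file) {
    if ($file->isFile() && $file->getMTime() >= $cutoff) {
        echo $file->getPathname(), ' ', date('Y-m-d H:i', $file->getMTime()), PHP_EOL;
    }
}
```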
Either way, then set the permissions on any non-upload, non-temp, non-session directories -- and all the existing scripts -- to disallow writes, period...particularly by the web server. If the site's code runs as the same user that owns the files, you'll want to use 555 for directories and 444 for files; otherwise, you can probably get by with 755/644. (A web server would only be able to write those if it's horribly misconfigured, and a hosting company that incompetent would be out of business very quickly.)
Frankly, though, the "support guy" has the right idea -- I certainly wouldn't let a site go live on my servers knowing that it's going to be executing arbitrary code from strangers. (Even if it can't write anything to the local filesystem, it can still be used to launch an attack on other servers.) The best option for now is to remove all ability to upload files. It's obvious that someone has no idea how to handle file uploads securely, and now that someone out there knows you're vulnerable, chances are you'd keep getting hacked anyway until you find the hole and plug it.
As for what to look for... unfortunately, it's somewhat vague, as we're talking about concepts above the single-statement level. Look for any PHP scripts that either include, require, or write to file names derived in any way from $_GET, $_POST, or $_COOKIE.
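For instance, this is the shape of the vulnerable pattern to grep for, alongside a whitelisted alternative (the page names are made up):

```php
<?php
// The classic pattern to hunt for -- request data flowing into include/require:
include 'pages/' . $_GET['page'] . '.php';   // DANGEROUS: path traversal / LFI

// A safer shape: whitelist the allowed values instead.
$pages = ['home' => 'pages/home.php', 'about' => 'pages/about.php'];
$key = $_GET['page'] ?? 'home';
include $pages[$key] ?? $pages['home'];      // unknown keys fall back safely
```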
Changing folder permissions won’t solve the issue unless you’re using CGI, since PHP probably needs to be able to write to an upload folder, and your web server probably needs to be able to read from it. Check the extension of any uploaded files!
(So no, 0777 permissions don’t mean that anyone can upload anything.)
As cryptic mentioned, once a hacker can run code on your server then you have to assume that all files are potentially dangerous. You should not try to fix this yourself - restoring from a backup (either from the client or the original developer) is the only safe way around this.
Once you have the backup files ready, delete everything on your site and upload the backup - if it is a shared host you should contact them as well in case other files are compromised [rarely happens though].
You've identified 2 issues: the permissions and the lack of extension checking. However, have you any evidence that these were the means by which the system was compromised? You've not provided anything to support this assertion.
Changing the permissions to something more restrictive would have provided NO PROTECTION against users uploading malicious PHP scripts.
Checking the extensions of files might have made it a bit more difficult to inject PHP code into the site, but it WOULD NOT PREVENT IT.
Restoring from backup might remove the vandalized content but WILL NOT FIX THE VULNERABILITIES in the code.
You don't have the skills your client (who is probably paying you for this) needs to resolve this. And acquiring those skills is a much longer journey than reading a few answers here (although admittedly it's a start).
Is it only through upload functionality that other PHP files can be uploaded? Is there no other way to write files remotely, given that the folder permissions are 777?
There definitely are multiple possible ways to write a file in the web server’s document root directory. Just think of HTTP’s PUT method, WebDAV, or even FTP that may be accessible anonymously.
The site has some FCKeditor instances and a couple of upload functions. I checked them; there are enough validations, so when files with extensions other than images or PDF are uploaded they just return false.
There are many things one can do wrong when validating an uploaded file. Trusting the reliability of information the client sent is one of the biggest mistakes one can make. This means it doesn't suffice to check whether the client says the uploaded file is an image (e.g. one of image/…). Such information can be easily forged. And even proper image files can contain PHP code that is executed when interpreted by PHP, whether in an optional section like a comment or in the image data itself.
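A sketch of content-based validation that doesn't trust the client (the allowed list is illustrative; adjust to your needs):

```php
<?php
// Sketch: validate server-side instead of trusting the client's MIME type.
// finfo inspects the file contents; getimagesize double-checks images.
$tmp = $_FILES['upload']['tmp_name'];

$finfo = new finfo(FILEINFO_MIME_TYPE);
$mime  = $finfo->file($tmp);

$allowed = ['image/png', 'image/jpeg', 'image/gif', 'application/pdf'];
if (!in_array($mime, $allowed, true)) {
    exit('Rejected: unexpected file type.');
}
if (strpos($mime, 'image/') === 0 && getimagesize($tmp) === false) {
    exit('Rejected: not a valid image.');
}
```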
Doesn't setting folder permissions to a lower level fix the issue?
No, probably not. The upload directory must be writable by PHP's process and readable by the web server's. Since both are probably the same, and executing a PHP file requires only read permissions, any uploaded .php file is probably also executable. The only solution is to make sure that the stored files don't have any extension that denotes files executed by the web server, i.e. make sure a PNG is actually stored as .png.
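A minimal sketch of that renaming step, deriving the stored extension from the file's actual contents rather than the client's file name (the mapping and upload path are placeholders):

```php
<?php
// Sketch: derive the stored extension from the validated MIME type, never
// from the client's file name, so "shell.php" can never land as .php.
$finfo = new finfo(FILEINFO_MIME_TYPE);
$mime  = $finfo->file($_FILES['upload']['tmp_name']);

$extensions = ['image/png' => 'png', 'image/jpeg' => 'jpg', 'application/pdf' => 'pdf'];
if (!isset($extensions[$mime])) {
    exit('Rejected: unexpected file type.');
}
$name = bin2hex(random_bytes(16)) . '.' . $extensions[$mime];
move_uploaded_file($_FILES['upload']['tmp_name'], '/srv/uploads/' . $name);  // hypothetical path
```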

Securing Uploaded Files (PHP and HTML)

I have a simple site which allows users to upload files (among other things, obviously). I am teaching myself PHP/HTML as I go along.
Currently the site has the following traits:
--When users register, a folder is created in their name.
--All files the user uploads are placed in that folder (with a time stamp added to the name to avoid any issues with duplicates).
--When a file is uploaded information about it is stored in an SQL database.
simple stuff.
So, now my question is what steps do I need to take to:
A. Prevent Google from archiving the uploaded files.
B. Prevent users from accessing the uploaded files unless they are logged in.
C. Prevent users from uploading malicious files.
Notes:
I would assume that B would automatically achieve A. I can restrict users to only uploading files with .doc and .docx extensions. Would this be enough to guard against C? I would assume not.
There are a number of things you want to do, and your question is quite broad.
For the Google indexing, you can work with /robots.txt. You did not specify whether you also want to apply an ACL (Access Control List) to the files, so that might or might not be enough. Serving the files through a script might work, but you have to be very careful not to use include, require, or similar constructs that might be tricked into executing code. Instead, open the file, read it, and serve it through file-operation primitives.
Read about "path traversal". You want to avoid that, both in upload and in download (if you serve the file somehow).
The definition of "malicious files" is quite broad. Malicious for whom? You could run an antivirus on the upload, for instance, if you are worried about your site being used to distribute malware (you should be). If you want to make sure that people can't harm the server, you have to at the very least make sure they can only upload a limited set of file types. Checking extensions and MIME type is a beginning, but don't trust that (you can embed code in a PNG and it's valid if it's included via include()).
Then there is the problem of XSS, if users can upload HTML content or anything that gets interpreted as such. Make sure to serve a Content-Disposition header and a non-HTML content type.
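Putting those pieces together, a download script might look roughly like this (isLoggedIn() and lookupFile() are placeholders for your own auth and lookup logic):

```php
<?php
// Sketch of a download script: auth first, then stream the file from a
// non-web-accessible directory with headers that force a download.
if (!isLoggedIn()) {           // placeholder for your session check
    http_response_code(403);
    exit;
}

$file = lookupFile($_GET['id']);   // placeholder: resolves to a path under a private dir

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($file['name']) . '"');
header('Content-Length: ' . filesize($file['path']));
header('X-Content-Type-Options: nosniff');
readfile($file['path']);           // read and output -- never include()!
```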
That's a start, but as you said there is much more.
Your biggest threat is going to be if a person manages to upload a file with a .php extension (or some other extension that results in server side scripting/processing). Any code in the file runs on your server with whatever permissions the web server has (varies by configuration).
If the end result of the uploads is just that you want to be able to serve the files as downloads (rather than let someone view them directly in the browser), you'd be well off to store the downloads in a non web-accessible directory, and serve the files via a script that forces a download and doesn't attempt to execute anything regardless of the extension (see http://php.net/header).
This also makes it much easier to facilitate only allowing downloads if a person is logged in, whereas before, you would need some .htaccess magic to achieve this.
You should not upload files to directories the web server serves if you do not want them to be directly accessible.
I suggest you use X-Sendfile, which is a header that instructs the server to send a file to the user. Your PHP script called 'fetch so-and-so file' would do whatever authentication you have in place (I assume you have something already) and then return the header. So long as the web server can access the file, it will then serve the file.
See this question: Using X-Sendfile with Apache/PHP
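For illustration, a minimal X-Sendfile response might look like this (it assumes Apache's mod_xsendfile is installed, and that your authentication has already run; the path is made up):

```php
<?php
// Minimal X-Sendfile sketch. Apache's mod_xsendfile must be enabled;
// authentication is assumed to have happened before this point.
$path = '/srv/protected/files/report.pdf';  // hypothetical stored file

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="report.pdf"');
header('X-Sendfile: ' . $path);  // Apache streams the file itself
exit;
```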

Allowing Users To Upload ANY File. Is My Method Insecure?

I'm creating a file sharing site, similar to Megaupload or Rapidshare. Just like those sites mentioned I need to allow ANY filetype.
I was thinking about a solution, and need to know if there are any security risks with it, or whether there is a better solution to my problem.
1. User uploads file
2. Check file size; if below 100 MB, begin upload
3. Hash the filename using IP, timestamp, and a salt
4. Store in a directory that is not accessible from the web
5. Store filename, description, and hashed file name in database
Upload done. Now for the downloading:
1. User requests download
2. Connect to database, locate file ID
3. If the file ID is found, copy the file from the file server location and prepare for file transfer
It's important to note that NOTHING CAN EVER RUN ON THE SERVER. So users can't upload malicious files and launch attacks on the server. When requesting the file, it will immediately launch a download, and never run.
Now, with the above in mind, are there any flaws in the model above that could allow malicious users to attack the servers?
For the purpose of answering the question, assume the rest of the site is secure.
This sounds fine, provided it is properly implemented, of course; the only step that's missing, IMO, is checking for possible filename collisions between steps 3 and 4, and/or using a completely random file name. After all, the IP address and timestamp are not really relevant information in this context.
On the download side, copying the file on the server should not be necessary. Streaming it from the secret location should be enough, seeing as it will never be visible or accessible to the end user.
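For example, a random-name variant of steps 3 and 4 of the upload flow might look like this (the storage path is a placeholder):

```php
<?php
// Sketch: a fully random stored name instead of IP + timestamp + salt.
// random_bytes() gives cryptographic randomness; collisions are negligible,
// but the loop guards against them anyway.
$storageDir = '/srv/private-uploads';   // hypothetical non-web-accessible path

do {
    $stored = bin2hex(random_bytes(16));
} while (file_exists("$storageDir/$stored"));

move_uploaded_file($_FILES['file']['tmp_name'], "$storageDir/$stored");
// Save the original name, description, and $stored in the database.
```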

Getting data without scraping

I've got a directory where people submit data. It's stored as pending while it's moderated to make sure it's OK.
Once approved, I'd like another couple of sites that I control, and a few I won't (on different servers), to be able to grab that data. This would be on a cron or something, so there wouldn't be any human interaction. Moderation on those sites would be fully dependent on that first moderation.
How do I go about doing this securely?
I've thought about grabbing it as RSS, parsing it, and storing it. I've thought about doing SOAP requests, grabbing XML files, etc.
What would YOU do?
A logical means of securely distributing the data would be to use (S)FTP, ideally with a firewall that only permits access from the various permitted machines by IP, etc.
To enable this, once you have the file on the "source" machine, you could simply:
Move the file into a local FTP folder. (You'll quite possibly have to FTP it in, even though it's on the same machine, depending on user rights, etc.) As a tip, FTP it into a temp directory in the FTP folder and then move (rename, in FTP parlance) it into a "for collection" folder once the FTP has completed. By doing this, you'll ensure that no partial files are collected.
Periodically check (via cron) the "for collection" folder from the various permitted machines.
Grab the file(s) if there are any new files awaiting collection.
There are a variety of PHP functions to assist with this, including ftp_ssl_connect which uses SSL-FTP.
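A rough sketch of the collection side using those functions (host, credentials, and paths are all placeholders):

```php
<?php
// Sketch of the periodic collection job over FTPS.
$conn = ftp_ssl_connect('files.example.com');   // placeholder host
if (!$conn || !ftp_login($conn, 'collector', 'secret')) {
    exit('FTPS connection failed');
}
ftp_pasv($conn, true);

// Grab every file awaiting collection, then remove it from the queue.
foreach (ftp_nlist($conn, '/for-collection') ?: [] as $remote) {
    $local = '/srv/incoming/' . basename($remote);
    if (ftp_get($conn, $local, $remote, FTP_BINARY)) {
        ftp_delete($conn, $remote);   // only delete once safely collected
    }
}
ftp_close($conn);
```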
However, all that aside, it might be a lot less hassle to use something like rsync over ssh.
Why not have the storage site shoot a request over to the "subscribing" sites indicating new information is available (push notification)?
I.e., just make a page request to a "newinfo.php?newinfo=true" or whatever on each of the sites. Then each of those sites can do whatever it likes, knowing there's more information available.
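A minimal sketch of that push using cURL (the subscriber URLs are made up):

```php
<?php
// Sketch: after moderation approves an item, ping each subscriber URL
// so it knows to pull the new data on its own schedule.
$subscribers = [
    'https://site-a.example.com/newinfo.php?newinfo=true',
    'https://site-b.example.com/newinfo.php?newinfo=true',
];
foreach ($subscribers as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 5);   // don't let one slow site block the rest
    curl_exec($ch);
    curl_close($ch);
}
```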

How can a hacker put a file on my server root (Apache, PHP, 1and1)

I have a site hosted on 1and1, and a couple of weeks ago I noticed a hacker had put a .php file on the server that, when viewed in a browser, exposed my DB schema, DB connection strings, FTP account (for file uploads using a form), etc. Naturally I panicked, and I wiped the server and re-uploaded my files. Fortunately I hash passwords using MD5 and I don't store things like credit card details.
Now, I have checked my files, and for all user input I use a clean function (htmlentities, mysql_real_escape_string, etc.) that strips the input of any XSS or SQL injection. I have also made sure that the session key gets regenerated when a user's status changes (like when they log into their account) to prevent session hijacking; my folder permissions are set to 755 and file permissions to 644.
Has anyone got any idea how this could have happened, or whether I'm missing something?
The most common cause is a trojan horse that steals passwords from the FTP client on the developer's PC.
One option is through an upload script of some kind; for example, I have seen poorly protected image upload scripts allow this behaviour. I've actually seen this once in a script that resized images on the fly but used GET variables for the location of the image to resize. It's also worth checking any usages of exec() or system() for possible weaknesses. If it's possible to access your server via SSH, you could also check all commands run recently using the command history.