How do you track files in SMB with an application? - php

I have built a PHP application which shows all the files in a user's home directory. This directory is also available via Samba, so it can be accessed from the native file explorer on Windows, Mac and Linux. I want to give every file an ID so that I can assign tags to each file. How would you go about doing this? Would you make a hash of each file and, when you see the same hash again, conclude that it's the same file?
Can I trigger Samba to send out a notification every time a file or folder gets moved?

If your platform is Linux and the installation is fairly recent, you can use inotify to have your PHP code called when filesystem changes are made. See this portion of the PHP manual:
http://us3.php.net/manual/en/book.inotify.php
The basic usage would be to add a watcher on the Samba directory or directories with a callback to your PHP code. For performance reasons, it would be a good idea to see if inotify can be told to send only the types of updates you're interested in.
Note, however, that inotify can drop updates/messages under load (its event queue can overflow), so you will have problems keeping things in sync at some point. One solution would be to use inotify on an ongoing basis along with periodically doing a full scan of each home directory to verify it reflects your database (or wherever the tags are stored).
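For illustration, a minimal sketch of that basic usage with the PECL inotify extension might look like the following; the watched path and the update_tag_path() helper are placeholders, not something from the question:

    <?php
    // Minimal sketch using the PECL inotify extension. The watched path and
    // the update_tag_path() helper below are hypothetical.
    $fd = inotify_init();

    // Only request the event types we care about (creates, deletes, moves)
    // to keep the message volume down.
    $mask = IN_CREATE | IN_DELETE | IN_MOVED_FROM | IN_MOVED_TO;
    inotify_add_watch($fd, '/home/someuser', $mask);

    while (true) {
        // Blocks until at least one event is available.
        foreach (inotify_read($fd) as $event) {
            if ($event['mask'] & (IN_MOVED_FROM | IN_MOVED_TO)) {
                // A move within the watched tree produces an IN_MOVED_FROM /
                // IN_MOVED_TO pair sharing the same 'cookie', so the stored
                // path can be updated without re-hashing the file.
                update_tag_path($event['cookie'], $event['name']);
            }
        }
    }

Note that inotify_add_watch() is not recursive, so each subdirectory would need its own watch.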

To answer your first question: making a hash would of course work. Simply running md5 over the files would be sufficient; the chances of a collision among the files in a home directory are vanishingly small, in my opinion not even worth worrying about.
And it probably goes without saying, but I would store at least the hash and the full path, so you can deal with moved files appropriately and actually do something with the file.
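As a rough sketch of that idea (the table layout and the MySQL-style upsert are assumptions, not something from the answer):

    <?php
    // Walk the home directory and record an md5 hash plus the last known
    // path for every file. Assumes a `files` table with a unique key on
    // `hash`; the schema is invented for illustration.
    function index_home_dir(PDO $db, string $home): void
    {
        $it = new RecursiveIteratorIterator(new RecursiveDirectoryIterator(
            $home, FilesystemIterator::SKIP_DOTS));
        $stmt = $db->prepare(
            'INSERT INTO files (hash, path) VALUES (?, ?)
             ON DUPLICATE KEY UPDATE path = VALUES(path)');
        foreach ($it as $file) {
            if ($file->isFile()) {
                // The same hash showing up at a new path suggests a move.
                $stmt->execute([md5_file($file->getPathname()), $file->getPathname()]);
            }
        }
    }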


php security issue - file uploads

One of my clients approached me to check and fix a hacked site. The site was developed by another, very inexperienced developer who didn't take care of even basic security.
The problem was that somehow PHP files were written to the images folder. The hackers also wrote an index.html which displays "site is hacked". When I checked, the images folder had 777 permissions, so I came to the rough conclusion that the folder permissions were to blame. The hosting support guy says that some poorly written PHP script allowed files with any extension to be uploaded to the server, and the hackers then executed those files to gain access and do whatever they wanted.
I have a few questions:
Is it only through upload functionality that other PHP files can be uploaded?
Is there no other way to write files remotely, given that the folder permissions are 777?
The site has some FCKeditor instances and a couple of upload functions. I checked them; there are enough validations, so when files with extensions other than images or PDF are uploaded they just return false.
Doesn't setting the folder permissions to a lower level fix the issue?
I asked the support guy to change the folder permissions, saying that would solve the issue, but he says there is some PHP file through which the other PHP files were written, and he wants that fixed, otherwise the site cannot go live. He says that even if the folder permissions are changed, the hacker can change them back to 777 and execute whatever he wants because of that poorly written PHP file.
How should I go about finding whether such a PHP file exists? Any help or pointers would be much appreciated.
777 means that any user on the system (with execute access for all the parent directories, anyway) can add anything to that directory. Web users are not system users, though, and most web servers (Apache included) won't let random clients write files there right out of the box. You'd have to specifically tell the server to allow that, and I'm fairly certain that's not what happened.
If you're allowing any file uploads, though, the upload folder needs to at least be writable by the web server's user (or the site's, if you're using something like suPHP). And if the web server can write to that directory, then any PHP code can write to that directory. You can't set permissions high enough to allow uploads and low enough to keep PHP code from running, short of making the directory write-only (which makes it pretty useless for fckeditor and such).
The compromise almost certainly happened because of a vulnerability in the site itself. Chances are, either there's a file upload script that's not properly checking where it's writing to, or a script that blindly accepts a name of something to include. Since the PHP code typically runs as the web server's user, it has write access to everything the web server has write access to. (It's also possible that someone got in via FTP, in which case you'd better change your passwords. But the chances of the web server being at fault are slim at best.)
As for what to do at this point, the best option is to wipe the site and restore from backup -- as has been mentioned a couple of times, once an attacker has gotten arbitrary code to run on your server, there's not a whole lot you can trust anymore. If you can't do that, at least find any files with recent modification times and delete them. (Exploits hardly ever go through that much trouble to cover their tracks.)
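If you can run PHP from the CLI on the box, a quick way to surface recently modified files might be something like this (the path and time window are placeholders):

    <?php
    // List files modified in the last 7 days under the site's document root.
    $cutoff = time() - 7 * 24 * 3600;
    $it = new RecursiveIteratorIterator(new RecursiveDirectoryIterator(
        '/var/www/site', FilesystemIterator::SKIP_DOTS));
    foreach ($it as $file) {
        if ($file->isFile() && $file->getMTime() > $cutoff) {
            echo date('Y-m-d H:i', $file->getMTime()), '  ',
                 $file->getPathname(), PHP_EOL;
        }
    }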
Either way, then set the permissions on any non-upload, non-temp, non-session directories -- and all the existing scripts -- to disallow writes, period...particularly by the web server. If the site's code runs as the same user that owns the files, you'll want to use 555 for directories and 444 for files; otherwise, you can probably get by with 755/644. (A web server would only be able to write those if it's horribly misconfigured, and a hosting company that incompetent would be out of business very quickly.)
Frankly, though, the "support guy" has the right idea -- I certainly wouldn't let a site go live on my servers knowing that it's going to be executing arbitrary code from strangers. (Even if it can't write anything to the local filesystem, it can still be used to launch an attack on other servers.) The best option for now is to remove all ability to upload files. It's obvious that someone has no idea how to handle file uploads securely, and now that someone out there knows you're vulnerable, chances are you'd keep getting hacked anyway until you find the hole and plug it.
As for what to look for... unfortunately, it's somewhat vague, as we're talking about concepts above the single-statement level. Look for any PHP scripts that include, require, or write to file names derived in any way from $_GET, $_POST, or $_COOKIE.
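As a crude heuristic (nowhere near a real static analyzer), something like this could flag suspicious lines for manual review; the path is a placeholder:

    <?php
    // Flag lines where include/require or file-writing calls appear in the
    // same statement as raw user input. Expect false positives and misses;
    // this is only a starting point for a manual review.
    $pattern = '/\b(include|include_once|require|require_once|fopen|file_put_contents)\b'
             . '[^;]*\$_(GET|POST|COOKIE|REQUEST)/';
    $it = new RecursiveIteratorIterator(new RecursiveDirectoryIterator(
        '/var/www/site', FilesystemIterator::SKIP_DOTS));
    foreach ($it as $file) {
        if ($file->isFile() && $file->getExtension() === 'php') {
            foreach (file($file->getPathname()) as $n => $line) {
                if (preg_match($pattern, $line)) {
                    printf('%s:%d: %s', $file->getPathname(), $n + 1, $line);
                }
            }
        }
    }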
Changing folder permissions won’t solve the issue unless you’re using CGI, since PHP probably needs to be able to write to an upload folder, and your web server probably needs to be able to read from it. Check the extension of any uploaded files!
(So no, 0777 permissions don’t mean that anyone can upload anything.)
As cryptic mentioned, once a hacker can run code on your server then you have to assume that all files are potentially dangerous. You should not try to fix this yourself - restoring from a backup (either from the client or the original developer) is the only safe way around this.
Once you have the backup files ready, delete everything on your site and upload the backup - if it is a shared host you should contact them as well in case other files are compromised [rarely happens though].
You've identified two issues: the permissions and the lack of extension checking. However, do you have any evidence that these were the means by which the system was compromised? You've not provided anything to support that assertion.
Changing the permissions to something more restrictive would have provided NO PROTECTION against users uploading malicious PHP scripts.
Checking the extensions of files might have a made it a bit more difficult to inject PHP code into the site, it WOULD NOT PREVENT IT.
Restoring from backup might remove the vandalized content but WILL NOT FIX THE VULNERABILITIES in the code.
You don't have the skills your client (who is probably paying you for this) needs to resolve this. And acquiring those skills is a much longer journey than reading a few answers here (although admittedly it's a start).
Is it only through upload functionality that other PHP files can be uploaded? Is there no other way to write files remotely, given that the folder permissions are 777?
There definitely are multiple possible ways to write a file in the web server’s document root directory. Just think of HTTP’s PUT method, WebDAV, or even FTP that may be accessible anonymously.
The site has some FCKeditor instances and a couple of upload functions. I checked them; there are enough validations, so when files with extensions other than images or PDF are uploaded they just return false.
There are many things one can do wrong when validating an uploaded file. Trusting the reliability of information the client sent is one of the biggest mistakes one can make. This means it doesn't suffice to check whether the client says the uploaded file is an image (e.g. one of image/…); such information can easily be forged. And even proper image files can contain PHP code that gets executed when interpreted by PHP, whether in an optional section like a comment or in the image data itself.
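To make that concrete, here is a hedged sketch of server-side validation; the field name, destination path, and accepted types are assumptions for illustration:

    <?php
    // Check the actual bytes, not the client-sent MIME type, and store the
    // file under a server-chosen name so the extension can never be .php.
    $tmp = $_FILES['upload']['tmp_name'];

    $info = getimagesize($tmp); // reads the real image header
    $allowed = [IMAGETYPE_PNG => 'png', IMAGETYPE_JPEG => 'jpg', IMAGETYPE_GIF => 'gif'];
    if ($info === false || !isset($allowed[$info[2]])) {
        die('not an accepted image');
    }

    // Never reuse the client-supplied filename or extension.
    $dest = '/var/www/uploads/' . bin2hex(random_bytes(16)) . '.' . $allowed[$info[2]];
    move_uploaded_file($tmp, $dest);

Since, as noted above, even a valid image can carry a PHP payload in a comment section, re-encoding the image (e.g. imagecreatefromstring() followed by imagepng()) is a further hardening step; the essential part is that the stored extension is chosen by the server.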
Doesn't setting the folder permissions to a lower level fix the issue?
No, probably not. The upload directory must be writable by PHP's process and readable by the web server's. Since both are probably the same, and executing a PHP file requires only read permission, any uploaded .php file is probably also executable. The only solution is to make sure the stored files don't get an extension that the web server treats as executable, i.e. make sure a PNG is actually stored as .png.

PHP application to replicate websites from single code source

I'm attempting to build an application in PHP to help me configure new websites.
New sites will always be based on a specific "codebase", containing all necessary web files.
I want my PHP script to copy those web files from one domain's webspace to another domain's webspace.
When I click a button, an empty webspace is populated with files from another domain.
Both domains are on the same Linux/Apache server.
But I'm running into permission/ownership issues when copying across domains.
As an experiment, I tried using shell and exec commands in PHP to perform actions as "root".
(I know this can open major security holes, so it's not my ideal method.)
But I still had similar permission issues and couldn't get that method to work either.
Maybe a CGI script is a better idea, but I'm not sure how to approach it.
Any advice is appreciated.
Or, if you know of a better resource for this type of information, please point me toward it.
I'm sure this sort of "website setup" application has been built before.
Thanks!
I'm also doing something like this. The only difference is that I'm not making copies of the core files: the system has one core, and only specific files are copied.
If you want to copy files, then you have to take the following into consideration:
An easy (less secure) way is to use the same user for all websites.
Otherwise (in case you want to provide separate access), you must create a different owner for each website and set the owner/group on the copied files (this has to be done by root).
For the new website setup:
Either the main domain runs as root, in which case it can perform the new website creation itself, or, if you don't want your main domain to run as root, you can do the following:
Create a cron job (or a PHP script that runs in a loop under the CLI) that is executed by root. It checks some database table every 2 minutes, for example, and from your main domain you add a record with the setup info for the new hosted website (or just execute some script that gains root access and does it without cron); a sketch of this follows below.
The script that does the creation can be written in PHP, or in any language you wish; it doesn't really matter as long as it runs with the correct access.
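A sketch of that cron-driven approach, with invented table and path names, might look like this (run from root's crontab, e.g. every 2 minutes):

    #!/usr/bin/env php
    <?php
    // Runs as root from cron; picks up pending site records written by the
    // main domain and creates each site's webspace. Schema/paths invented.
    $db = new PDO('mysql:host=localhost;dbname=hosting', 'setup', 'secret');

    foreach ($db->query('SELECT id, domain, owner FROM pending_sites') as $site) {
        $docroot = '/var/www/' . $site['domain'];
        mkdir($docroot, 0755, true);

        // Copy the core files, then hand ownership to the site's own user.
        shell_exec('cp -a /var/www/_codebase/. ' . escapeshellarg($docroot));
        shell_exec('chown -R ' . escapeshellarg($site['owner']) . ': '
                   . escapeshellarg($docroot));

        $db->prepare('DELETE FROM pending_sites WHERE id = ?')->execute([$site['id']]);
    }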
In my case I'm using the same user, since they are all my websites. The disadvantage is that the OS won't enforce restrictions; my PHP code will (I'm losing the advantage of user/group permissions between different websites).
Note that open_basedir can cause you some hassle; make sure you include the correct paths (or disable it).
Also, there are some minor differences between FastCGI and suPHP (though I believe they won't cause you too much trouble).

Prevent users from committing a file in SVN

I know... SVN. But here is the deal. I have several developers (and some designers) working in SVN. We have a configure.php file we use with all the db connections, etc. We need to prevent them from committing their copy to the repository so the live credentials don't get overwritten with dev credentials. Anyone?
You can lock the file. It won't really stop someone from modifying it (as they can break the lock), but it provides an extra barrier which is often enough to stop the check-in.
A better solution would be to write a pre-commit hook that rejects the check-in on the server side (a sketch follows below); however, it would be very annoying to force people to check in everything except the forbidden file, as that would mean listing a lot of files on the command line (or GUI tool).
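For instance, such a hook can be written in PHP just as well as in shell; here is a rough sketch (the path pattern is an assumption):

    #!/usr/bin/env php
    <?php
    // Save as hooks/pre-commit in the repository and make it executable.
    // Subversion invokes it with the repository path and transaction name.
    list(, $repo, $txn) = $argv;

    $changed = shell_exec('svnlook changed -t ' . escapeshellarg($txn)
                          . ' ' . escapeshellarg($repo));

    foreach (explode("\n", trim($changed)) as $line) {
        // Each line looks like "U   trunk/configure.php".
        if (preg_match('#\bconfigure\.php$#', $line)) {
            fwrite(STDERR, "Committing configure.php is not allowed.\n");
            exit(1); // a non-zero exit status rejects the commit
        }
    }
    exit(0);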
The best solution is to not check in the file at all, but to check in a "config template" file with a similar but different name from the one needed. Inside that file you explain the need to copy it to the correct file name (and describe which fields need to be filled out with the appropriate information). Then you add an svn:ignore property for the actual configuration file. Again, someone crafty enough can unset the ignore and check in the file eventually, but it's often enough to stop all but the most diligent system breakers.
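The template itself can be as simple as the real file with placeholder values, e.g. a hypothetical configure.php.template:

    <?php
    // configure.php.template -- copy to configure.php (which is listed in
    // svn:ignore) and fill in your local values.
    define('DB_HOST', 'localhost');
    define('DB_USER', 'CHANGE_ME');
    define('DB_PASS', 'CHANGE_ME');
    define('DB_NAME', 'CHANGE_ME');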

Downloading PHP content from another domain (safe way)?

So, if this question has been asked before, I'm sorry. I'm not exactly sure what to search for.
Introduction:
All the domains I maintain now are hosted on my server, so I have not run into this problem yet.
I have created a structure, similar to WordPress, for uploading and editing images.
I regularly create changes in the functions and upload them to a single folder. When the user logs in, the contents are automatically downloaded into their folder.
What I am wanting to do:
Now, say I have a user that is not hosted on my server. I cannot use copy(), but is there a safe and secure way to echo the contents of each PHP file (obviously, I can echo) into another file on the user's server?
For example:
Currently I can copy from jasonleodurbin.com to geodun.com (same server), but say I want to copy jasonleodurbin.com/test.php to somedomain.com/test.php.
I had some thoughts, like giving each user a private key and sending that to a file like echo.php. echo.php would grab the contents of every file (that has been modified recently) and echo that to the screen. The requesting server would take that content and copy it into its respective .php file.
I assume I could send the key through GET, but since I have never dabbled into the security implications of anything (I am a hobbyist), I don't know how secure this is.
Are there any suggestions or directions that someone could send me?
I appreciate the help!
I'm assuming this is sensitive data. If that's the case, then I would suggest encrypting the file using PGP keys. Either way, you need a method to send the file from your server to their server. I can't recall exactly how I did it, but I used to send encrypted data files from our remote server to a server in-house. We used PGP keys to encrypt and decrypt once they arrived in-house. As for the method we used to send the files across the web, I believe we used SCP (you need shell access on the server).
You could use FTP, but how about setting it up so that they only have access to a particular directory, so they can't touch anything else? You'll need a script to grab the file from the FTP location and store it in the appropriate directory per user.
Just thought of something: store the file in a protected folder and have the user download it using curl. I believe you can specify a username/password with curl.
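A minimal sketch of that curl idea from PHP (the URL and credentials are placeholders):

    <?php
    // Fetch a file from a password-protected folder with HTTP basic auth.
    $ch = curl_init('https://example.com/protected/test.php.txt');
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_USERPWD, 'client-user:client-password');
    $body = curl_exec($ch);
    if ($body === false) {
        die(curl_error($ch));
    }
    curl_close($ch);
    file_put_contents(__DIR__ . '/test.php', $body);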
Several options:
Upload the newest version of test.php as test.phps (a PHP source file, which will be displayed instead of run) in a location known to the client. It is then up to them to download this file and install it on their web server.
pros: not much effort required on your part, no keys or encryption required.
cons: everyone can view the contents of your PHP file if they know where to look, no guarantee that clients will actually get updated versions of the file.
Copy the file to the client's web server. Use scp, ftp, or some such method to update test.php on the client's web server whenever you change it.
pros: the file will always be up to date. Reasonably secure if you use scp.
cons: an extra step is required for you; you will have to remember to do this each time you change test.php. You will need access to the client's web server for this to work.
Automated copy at a timed interval. Set up a cron script that syncs test.php to the client's web server at a certain time each hour/day/week/whatever.
pros: not much repeated effort required on the part of either party. Reasonably secure if you use scp.
cons: could break silently if something changes, unless you email yourself when an error occurs. You will still need access to the client's machine for this to work.
There are probably plenty of other ways to do this as well, but these are a few to get you started.
Use a version control system, such as subversion. Just check in your code to the repository each time you make some changes you want to push, and run an update from the clients. If you're already using a version control system, create a production-branch where you commit your changes when they're ready to be pushed to clients.
It can be done from the clients in pure php (slightly experimental) with library from here or here, with a PHP extension, or with a wrapper to the native svn client.
This gives you security, as each user can have their own password, which you can revoke if you so please. You can also get encryption by running through an ssh tunnel (which limits your library choices to the wrapper, I think), but really, I wouldn't worry too much about encryption; who's going to be looking at the traffic between the servers? Unless you're doing top-secret type stuff.
It also gives you automatic change detection: you don't have to roll your own way of keeping track of which files were updated, as this is done when you commit your changes.
It's a proven way of keeping code bases up to date, so I don't see why you would implement your own. It also gives you the extra advantage of being able to roll back changes if (when) there's a problem with the code update.

Future proof file storage

I accept file uploads from users. Each file has a pointer in the db which has info on the file location in the filesystem.
Currently, I'm storing the files in the filesystem non-categorically, and each file is just named with a unique value. All categorisation, naming, etc. is done in the app using the db.
A factor that I'm concerned about is that of file synchronization issues.
If I wanted to set up file system synchronization where, for example, the user's files are automatically updated by bridging with a pc app, would this system still work well?
I have no idea how such a system would work so hopefully I can get some input.
Basically, is representing a file's name and location purely in the database optimal, especially if said file may be synchronized with a pc application?
Yes, the way you are doing this is the best way to do it. You are using a file system to store files and a database to store structured data.
One suggestion I would make is that you create a directory tree on the file system. You may one day run up against a maximum-files-per-directory limitation of your file system. I have built systems that create a new subdirectory for each day or week.
Make sure you have good backups of the database as well as the document repository.
All you need to make such a system work is to make sure the API you use (or, more likely, create) can talk to the database and to the filesystem in a sensible way. Since this is what your site is already doing anyway, it shouldn't be hard to implement.
The mere fact that your files are given identifiers instead of plain-English names is mostly irrelevant with regard to remote synchronization.
Store a file hash (e.g. SHA-1) in the database rather than a path, and have a separate database connect the hash with the path. Write a small app that synchronizes the hash database, so that when you move your files to a different location it'll be easy to build a new database with updated paths.
That way you can also have the system load the file from a different location depending on which hash database you use to locate the file, so it offers some transparency if you need people to be able to access the same file from diverse locations (e.g. NFS or WebDAV).
We use exactly this model for file storage, along with (shameless plug) SabreDAV to make it seem to the end-user it's a normal filesystem.
I think this is a perfectly fine model; as long as the file lookup is documented and files are easily retrieved, there shouldn't be an issue. Just make backups of your DB :)
One other piece of advice: we use md5() on the file id to generate a unique filename, and use parts of the hash to generate a directory structure. For example, id 1 yields b026324c6904b2a9cb4b88d6d61c81d1, and the resulting filename becomes:
b02/632/4c6/904b2a9cb4b88d6d61c81d1. The reason for this is that most stable filesystems can become very slow once there is a high number of files (or directories) in one directory. It's much, much faster to traverse a few subdirectories.
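As a sketch, the split described above (3/3/3 characters plus the remainder of the hash) could be implemented like this:

    <?php
    // Turn a file id into a sharded storage path, e.g. a hash of
    // b026324c6904b2a9cb4b88d6d61c81d1 becomes b02/632/4c6/904b2a9cb4b88d6d61c81d1.
    function idToPath(string $id): string
    {
        $hash = md5($id);
        return substr($hash, 0, 3) . '/' . substr($hash, 3, 3) . '/'
             . substr($hash, 6, 3) . '/' . substr($hash, 9);
    }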
The Boring Answer™:
I think it depends on what you wanna do, as always :)
I mean take your regular web hosting company. Developers are synching files to web servers all the time. Would it make sense for a web server to store hash-generated file names in a db that pointed to physical files? No. Then you couldn't log in with your FTP-client and upload files like that, and you'd have to code a custom module to get Apache to work etc. Instant headache.
Does it make sense for Flickr to use a db? Yes, absolutely! (Then again, you can't log in with an FTP-client and manage your photos—and that's probably a good thing!)
Just remember, a file system is a (very simple) db too. And it's a db that comes with a lot of useful free tools.
my 2¢
/0
