With the current issues with Network Solutions sites being hacked, I'm in need of a tool (preferably freeware) that I can install into my site and it will email me the second a file change/update occurs.
Any recommendations welcome :)
This site is on a shared server hosting package.
You can't install a true IDS on shared hosting; that is the host's responsibility.
A hack-ish solution:
You could create a script that runs periodically (using cron or some other mechanism), checksums all files, compares the checksums with a previously stored record, and notifies you if there are differences.
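A minimal sketch of such a script, meant to be run from cron. All paths and the notification address are placeholders you'd adjust for your own site:

    <?php
    // Hypothetical integrity check: hashes every file under the web root,
    // compares against the last run, and emails on any differences.
    $webRoot   = '/home/youruser/public_html';       // placeholder path
    $stateFile = '/home/youruser/.file-hashes.json';
    $notify    = 'you@example.com';

    // Build a map of relative path => checksum for every file.
    $current  = [];
    $iterator = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator($webRoot, FilesystemIterator::SKIP_DOTS)
    );
    foreach ($iterator as $file) {
        if ($file->isFile()) {
            $relative = substr($file->getPathname(), strlen($webRoot) + 1);
            $current[$relative] = md5_file($file->getPathname());
        }
    }

    // Compare against the stored baseline, if one exists.
    if (is_file($stateFile)) {
        $previous = json_decode(file_get_contents($stateFile), true) ?: [];
        $added    = array_keys(array_diff_key($current, $previous));
        $removed  = array_keys(array_diff_key($previous, $current));
        $changed  = [];
        foreach (array_intersect_key($current, $previous) as $path => $hash) {
            if ($previous[$path] !== $hash) {
                $changed[] = $path;
            }
        }
        if ($added || $removed || $changed) {
            $body = "Added:\n"      . implode("\n", $added)
                  . "\n\nRemoved:\n" . implode("\n", $removed)
                  . "\n\nChanged:\n" . implode("\n", $changed);
            mail($notify, 'File changes detected', $body);
        }
    }

    // Accept the current state as the baseline for the next run.
    file_put_contents($stateFile, json_encode($current));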
To find out if your script itself was deleted by the attack (1), you must also create a script sitting on a remote server (something like Google App Engine, perhaps) that pings your shared-server script and checks that it gets an expected result (a hash based on the current time, perhaps) – if not, it emails you.
(1) This is actually quite unlikely, most attacks don't delete files
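A rough sketch of that remote watchdog; the URL, secret, and address are hypothetical. Both sides derive the expected value from the current hour, so a mismatch or a failed fetch means trouble:

    <?php
    // Hypothetical watchdog, run from cron on a *different* host.
    // heartbeat.php on the shared server would contain one line:
    //     <?php echo hash_hmac('sha256', gmdate('YmdH'), 'shared-secret');
    // (This can misfire right at the top of the hour -- fine for a sketch.)
    $expected = hash_hmac('sha256', gmdate('YmdH'), 'shared-secret');
    $actual   = @file_get_contents('http://example.com/heartbeat.php');

    if ($actual === false || trim($actual) !== $expected) {
        mail('you@example.com', 'Heartbeat check failed',
             'Response: ' . var_export($actual, true));
    }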
http://www.la-samhna.de/samhain/
However, this won't work on shared hosting, so you'll need either a VPS or a dedicated server.
I've used Tripwire before. It worked really well. ...it's not freeware.
You could find some good options by searching the term "IDS" or "Intrusion Detection System"
I second the suggestion of Joel L above - usually any cron job output is emailed to the address you pick when you set up the cron job.
If you rarely change themes or plugins, then this is a good way to go.
When you do make a change, you can just update the "baseline" checksum values.
I need to check out the Mute Screamer plugin, though; that may be best.
The best free and open source Intrusion Prevention System (IPS) for web applications (as in a Web Application Firewall, or WAF) is Mod_Security. But no system will stop it all, especially with WordPress, which won a Pwnie Award for being so insecure. I would think seriously about ditching WordPress for any other blog engine.
Another option, best suited if you are in a shared hosting environment, is to use PHP-IDS. The name is a bit deceptive; it's actually a regular-expression-based IPS. All of the regular expressions used by PHP-IDS have been ported to Mod_Security. Mod_Security provides a much better level of protection (IPS) and logging (IDS).
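For reference, this is roughly how PHP-IDS (the old 0.x API) gets wired into a page, per its documented usage; the library and config paths are placeholders:

    <?php
    // Rough sketch based on PHP-IDS's documented usage; paths are placeholders.
    set_include_path(get_include_path() . PATH_SEPARATOR . '/path/to/phpids/lib');
    require_once 'IDS/Init.php';

    // PHP-IDS inspects whichever input arrays you hand it.
    $request = array(
        'REQUEST' => $_REQUEST,
        'GET'     => $_GET,
        'POST'    => $_POST,
        'COOKIE'  => $_COOKIE,
    );

    $init   = IDS_Init::init('/path/to/phpids/lib/IDS/Config/Config.ini.php');
    $ids    = new IDS_Monitor($request, $init);
    $result = $ids->run();

    if (!$result->isEmpty()) {
        // getImpact() is a numeric severity; log it, or block above a threshold.
        error_log('PHP-IDS impact ' . $result->getImpact() . ': ' . $result);
    }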
I originally wrote this in a comment on The Rook's answer, but it might get lost in all that noise:

PHP-IDS indeed looks interesting, as it can be used in a shared server hosting environment, which in general will not be the case for Tripwire or Mod_Security.

Interestingly, there is a WordPress plugin which nicely integrates (an older version of) PHP-IDS into WordPress, so that might be worth looking into.
Rook: I think it is probably because WordPress security flaws get patched quickly once discovered. This does mean that anyone running an install must watch for new releases and install them as quickly as they can.
You could version the site with Subversion/Git/etc. Doing a simple 'svn status' or 'git status' would let you tell whether the source files had changed. However, it obviously won't catch any modifications someone may have made to the database content, and it'll get a little messy when someone updates plugins (or WordPress itself), as so much will have changed.
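If the site is a git working copy, a tiny cron script can automate that check; a sketch (path and address are placeholders):

    <?php
    // Hypothetical cron job; assumes the web root is a git working copy.
    // `git status --porcelain` prints one line per modified or untracked
    // file, so any output at all means something on disk changed.
    chdir('/home/youruser/public_html');   // placeholder path
    exec('git status --porcelain 2>&1', $lines, $exitCode);

    if ($exitCode !== 0) {
        mail('you@example.com', 'git status failed', implode("\n", $lines));
    } elseif ($lines) {
        mail('you@example.com', 'Unexpected file changes', implode("\n", $lines));
    }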
Take a look at http://www.guardio.net for uptime and file integrity monitoring.
Related
I am building a site platform similar to Wordpress that allows my users to download a .zip file, upload it onto their server, and be good to go.
I know everyone says eval() is evil - but the code will not include any user or variable input.
The benefit here is that updates will occur automatically. I can just change the code being grabbed on my server.
My clients using the code will have pretty low traffic sites - so I'm not worried about overloading their server. Most of the heavy lifting will be done by us.
Here's the basic code concept:
    $code = file_get_contents("http://myserver.com/code.txt");
    eval($code);
Is this a realistic option? What security holes do I need to worry about?
It's "realistic" in the sense that it will work, but at the same time it sounds like a sysadmin's nightmare. If you are meaning to have a client download and execute remote code every time a request is made, your clients are at your whim if the master server goes down or is unreachable at any point. It's now a mission-critical service you'll have to keep running forever for as long as your clients need it.
You list automatic updates as a benefit, but is it? In nearly every software platform, the features users depend on can change over time; function signatures can change, or functionality may be dropped entirely in favour of a more refined alternative. Since it sounds like you're writing some form of framework, can you guarantee that future versions will always be backwards-compatible? Not everyone is using the cutting-edge version of a piece of software in production for a reason -- they want what they are using to be stable. If an upgraded version of your platform rolls out overnight, and it breaks some custom code written by the client (at least one of them will try doing this, even if you don't want them to) or even old, standard functionality that was deprecated but still worked with the previous release, how are they going to roll it back to a version that works?
It just sounds like something that will eventually incur a ton of technical debt.
I've spent some time now developing a web application in php. It has mostly been just for the fun of learning as a side project, but the app now has a few users which I don't want to upset by breaking things as I develop.
At the moment, I have a very rudimentary method for managing the development: I use a text editor (UltraEdit) to write the code and use its built-in FTP to upload the files to the server. In terms of version control, I have two domains and only push files to the "live" domain when they work, but that's it. The domains are hosted on a cPanel shared hosting package, which I doubt could handle even minor spikes in traffic. I looked at Slicehost yesterday for something more scalable, but that looks like a bit of a learning curve from where I am now.
I know I could do better than this, but where to start? I think I need advice on three things
1 - code writing tool
2 - version control / management
3 - scalable hosting
I've deliberately asked these in the same question as I'd like to know if one choice impacts another. Is there a good integrated solution?
Thanks in advance as ever.
Each part of your question has been answered before. The list below covers some of the common tools to use, with links to appropriate searches on Stack Overflow. There is no all-in-one package, and going into detail about these tools in one question is out of scope, so I am afraid you have to do some digging:
SVN, Git or Mercurial
UnitTesting (PHPUnit or SimpleTest)
Continuous Integration
Phing (for deployment)
phpqatools.org
Netbeans or Eclipse (for an IDE)
Disclaimer: list is not meant to be complete and order is not important
There is a lot going on here. I'll give you my two cents though.
I used to use UltraEdit. Now I use NetBeans; it's a fully integrated development environment and it makes my life soooo much easier. It's free too. I can't imagine ever going back to UltraEdit. Also, which brings me to number two, NetBeans has SVN and CVS integration.
I would use Subversion for version control. In my experience it does everything you need. Others use Git and Mercurial, but I think SVN is widely supported and easy enough to set up. As for pushing code to the server, I've begun using SVN for this too: I first SSH into the server and check out the code into the public_html directory, then set up an alias to do svn updates through the command line. It's way easier than FTPing, in my opinion. There are other deployment methods, but I've never used them. See this question:
What is your preferred php deployment strategy?
Obviously, shared hosting is not going to handle traffic as well as a dedicated server. But there are lots of things you can do to improve performance before moving to a dedicated server. Check out this article: http://developer.yahoo.com/performance/rules.html
It seems that you're after a robust deployment strategy as opposed to a development one. But, correct me if I'm wrong.
In terms of a 'code writing tool', IDE choice is a subjective discussion. Feel free to work with the one you are most comfortable with; for me, this is NetBeans.
As for a deployment strategy, I think it was best summed up in this answer.
Your point about scalable hosting is fairly broad. We would need much better forecast metrics to give better advice. However, for now, if scalable hosting is a worry, then maybe look into some sort of cloud hosting.
Have you looked at using WAMP/XAMPP/MAMP/LAMP for development locally? Uploading via FTP for every change is a pain.
You could do that locally and see that everything works, then push it to your test domain and run through it again, and then finally push live.
Might want to look at something like SpringLoops for doing your version control. This has the advantage of doing the deploy, and then if it goes pear-shaped you can revert it (the free account gets 3 deploys a day).
I wouldn't worry about scalable hosting just now; just focus on the site and the coding. You've only got a few users, so wait till it starts becoming a problem before moving (however, I suggest looking into your options). Don't try and get all cloud-ready; it might not ever be a problem.
PS: go with Linode over Slicehost.
I suggest using Aptana Studio ( http://www.aptana.org ). It is an Eclipse-based IDE with all the tools you need integrated into it. It has integrated PHP development tools, plus Git or SVN for version control.
I've used shared hosting as well. Once, another site on the same server suffered DoS attacks and my site became unreachable as well. Otherwise, it could work at reasonable speed after some optimization. It served 1,000-3,000 users per day.
Dedicated servers are much better. Or you can use Amazon Web Services. I know they are more expensive.
1 - code writing tool
Zend Studio. I would recommend Linux as well if you are going to use Linux servers.
2 - version control / management
SVN + phing (if you are going to build serious applications).
3 - scalable hosting
Amazon or RackSpace.
For your editor, just use what you are comfortable and productive with. You most definitely should have version control in place. You never know when you need to roll back a version or two.
I always keep at least 3 versions on the production server. I use symlinks to point the web server at the latest version. If there is a problem, you just need to recreate the symlink to point to an older version.
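A sketch of that switch as a small command-line PHP script; the release paths are placeholders, and rename() is used because it replaces the old symlink atomically:

    <?php
    // Hypothetical release switcher, run from the command line as
    //     php switch-release.php 2024-01-15
    // The web server's document root points at /var/www/current, a symlink
    // into one of several kept release directories.
    $releases = '/var/www/releases';            // placeholder paths
    $current  = '/var/www/current';
    $target   = $releases . '/' . ($argv[1] ?? '');

    if (!is_dir($target)) {
        fwrite(STDERR, "Unknown release: $target\n");
        exit(1);
    }

    // Create the new link under a temporary name, then rename() it over the
    // old one -- rename() is atomic, so the site never sees a half state.
    $tmp = $current . '.tmp';
    @unlink($tmp);
    symlink($target, $tmp);
    rename($tmp, $current);
    echo "Now serving $target\n";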
As for shared hosting, you'd be surprised at how much bandwidth you can get. I have a $10/month shared host that gets about 500K page views a month. Generally, it's not that your shared host can't handle the load; it's that the hosting provider puts too many "shares" on the same server. So it depends on how many resources everyone else on the same server is using. If you are having issues, you can always ask to be moved to a different server.
Our relatively small development team is getting a bit sick of Dreamweaver. The only functionality that we're reliant on is its file check in system. As the team is likely to grow over the next few months we need to address these issues.
Subversion has come to our attention, but we are unsure if it will suit our requirements.
All we need is to be made aware of whether or not someone is working on a particular file before we request it from the server, and to block any overriding of a checked-out file.
Any recommendations or advice on general best development practices would be much appreciated.
Thanks in advance.
The way this is generally done with SVN is not to "block" people the way you ask for; on the contrary, SVN allows several developers to work on the same file, and the modifications each one made are then "merged".
This merge is mostly automatic, but when it is "too complex" (like when two developers modified the same portion of a file), a human being has to resolve the "conflicts", indicating which modifications from which developer have to be kept.
This way of working by merges, instead of locks, seems a bit unusual at first, but once you get it (you'll need to take some time to explain it to your team, and how to use it efficiently, of course), it works really nicely: I've used SVN on projects with more than 10 developers, with absolutely no problem (a few conflicts once in a while, but you solve them and that's it).
On the other side, locking files so only one developer can work on them can block the whole team: what if one file is locked by a guy, and he goes for a coffee break? And at this same moment, someone else needs to modify the same file to be able to work?
For more information about SVN, you can take a look at this online book. It gives lots of useful information :-) (you will probably not need all of it, but a quick look can do no harm ^^ )
As a sidenote, if you are developing in PHP on a big application, an IDE like Eclipse PDT can help a lot; and there are plugins, like Subversive, that can be used to integrate SVN access into Eclipse.
There are a number of version control solutions you can look into. Are you sure you want to restrict people from working on similar files? Users can fork the development branch and have their own; when they check in, any conflicts will need to be merged together. Some of the version control systems come with tools to handle most of the conflicts for you.
It's not always a good idea to lock files that another developer may need to modify. One problem you can run into is if the person who has the file checked out is out of the office and their machine is inaccessible.
If that is functionality that you MUST have, most version control systems will allow you to configure your branch to work in such a way.
A few of the version control systems out there are: CVS, SVN, Git, SourceSafe, ClearCase.
If you decide to use SVN with Dreamweaver, there's an extension to integrate some of the commands into the IDE: Subweaver.
Subversion is OK for our team. We're trying to decide if we should lock files or use the default copy-modify-merge model. Could you give me some examples of when you would have multiple developers on the same file? It sounds a little bizarre, as I can't see why it can't be a one-man job.
I just noticed a PHP config parameter called allow_url_include, which allows you to include a PHP file hosted elsewhere as you would a local one. This seems like a bad idea, but "why is this bad" is too easy a question.
So, my question: When would this actually be a good option? When it would actually be the best solution to some problem?
Contrary to the other responders here, I'm going to go with "No". I can't think of any situation where this would be a good idea.
Some quick responses to the other ideas:
Licensing: would be very easy to circumvent.
Single library for multiple servers: I'm sorry, but this is a very dumb solution to something that should be solved by syncing files from, for example, a:
source control system
packaging / distribution system
build system
or a remote filesystem (NFS was mentioned).
Remote library from Google: nobody benefits from loading a slow, non-caching PHP library over HTTP. This is not (asynchronous) JavaScript.
I think I covered all of them..
Now..
Your question was about 'including a file hosted elsewhere', which I think you should never attempt. However, there are uses for allow_url_include. This setting covers more than just http://; it also covers user-defined protocol handlers, and I believe even phar://. For these, there are quite a few valid uses.
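To illustrate, here is a hypothetical user-defined wrapper registered with STREAM_IS_URL, so that including through it is governed by allow_url_include; everything here (the tpl:// scheme, the class, the generated code) is invented for the example:

    <?php
    // A made-up tpl:// wrapper that produces PHP code on the fly (a real
    // one might decrypt a file or pull from a cache instead). Because it
    // is registered with STREAM_IS_URL, including through it requires
    // allow_url_include=On.
    class TemplateWrapper
    {
        public  $context;   // set by PHP; must be public
        private $pos = 0;
        private $data;

        public function stream_open($path, $mode, $options, &$opened_path)
        {
            $name = parse_url($path, PHP_URL_HOST);
            $this->data = "<?php echo 'Hello from {$name}';";
            return true;
        }

        public function stream_read($count)
        {
            $chunk = substr($this->data, $this->pos, $count);
            $this->pos += strlen($chunk);
            return $chunk;
        }

        public function stream_eof()
        {
            return $this->pos >= strlen($this->data);
        }

        public function stream_stat()
        {
            return [];
        }
    }

    stream_wrapper_register('tpl', 'TemplateWrapper', STREAM_IS_URL);

    include 'tpl://greeting';   // prints: Hello from greeting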
The only things I can think of are:
For a remote library, for example the Google APIs.
Alternately, if you are something like Facebook, with devs in different locations, and the devs use includes from different stage servers (DB, dev, staging, production).
Once again during dev: including a third-party program that is in heavy beta flux, so you always get the most recent build without having to compile it yourself (for example, using a remote TinyMCE beta that you are building against and that will be done before you reach production).
However, if the remote server goes down, it kills the app, so for most people, not a good idea for production usage.
Here is one example that I can think of.
At my organization, my division is responsible for both the intranet and internet site. Because we are using two different servers, and in our case two different subdomains, I could see a case for having a single library that is used by both servers. This would allow both servers to use the same class. This wouldn't be a security problem, because you have complete control over both servers, and it would be better than trying to maintain two versions of the same class.
Since you have control over the servers, and because having an external-facing server and an internal server requires separation (because of the firewall), this would be a better solution than trying to keep a copy of the same class in two locations.
Hmm...
[insert barrel scraping noise here]
...you could use this as a means of licensing software, in that the license key, etc. could be stored on the remote system (managed by the seller). By doing this, the seller would retain control of all the systems attempting to access the key.
However, as you say the list of reasons this is a horrifying idea outweigh any positives in my mind.
It is pretty standard practice now for desktop applications to be self-updating. On the Mac, every non-Apple program that uses Sparkle is, in my book, an instant win. For Windows developers, this has already been discussed at length. I have not yet found information on self-updating web applications, and I hope you can help.
I am building a web application that is meant to be installed like Wordpress or Drupal - unzip it in a directory, hit some install page, and it's ready to go. In order to have broad server compatibility, I've been asked to use PHP and MySQL -- is that **MP? In any event, it has to be broadly cross-platform. For context, this is basically a unified web messaging application for small businesses. It's not another CMS platform, think webmail.
I want to know about self-updating web applications. First of all, (1) is this a bad idea? As of Wordpress 2.7 the automatic update is a single button, which seems easy, and yet I can imagine so many ways this could go terribly, terribly wrong. Also, isn't the idea that the web files are writable by the web process a security hole?
(2) Is it worth the development time? There are probably millions of WP installs in the world, so it's probably worth the time it took the WP team to make it easy, saving millions of man hours worldwide. I can only imagine a few thousand installs of my software -- is building self-upgrade worth the time investment, or can I assume that users sophisticated enough to download and install web software in the first place could go through an upgrade checklist?
If it's not a security disaster or waste of time, then (3) I'm looking for suggestions from anyone who has done it before. Do you keep a version table in your database? How do you manage DB upgrades? What method do you use for rolling back a partial upgrade in the context of a self-updating web application? Did using an ORM layer make it easier or harder? Do you keep a delta of version changes or do you just blow out the whole thing every time?
I appreciate your thoughts on this.
Frankly, it really does depend on your userbase. There are tons of PHP applications that don't automatically upgrade themselves. Their users are either technical enough to handle the upgrade process, or just don't upgrade.
I propose two steps:
1) Seriously ask yourself what your users are likely to really need. Will self-updating provide enough of a boost to adoption to justify the additional work? If you're confident the answer is yes, just do it.
Since you're asking here, I'd guess that you don't know yet. In that case, I propose step 2:
2) Release version 1.0 without the feature. Wait for user feedback. Your users may immediately cry for a simpler upgrade process, in which case you should prioritize it. Alternately, you may find that your users are much more concerned with some other feature.
Guessing at what your users want without asking them is a good way to waste a lot of development time on things people don't actually need.
I've been thinking about this lately in regards to database schema changes. At the moment I'm digging into WordPress to see how they've handled database changes between revisions. Here's what I've found so far:
$wp_db_version is loaded from wp-includes/version.php. This variable corresponds to a Subversion revision number, and is updated when wp-admin/includes/schema.php is changed. (Possibly through a hook? I'm not sure.) When wp-admin/admin.php is loaded, the WordPress option named db_version is read from the database. If this number is not equal to $wp_db_version, wp-admin/upgrade.php is loaded.
wp-admin/includes/upgrade.php includes a function called dbDelta(). dbDelta() scans $wp_queries (a string of SQL queries that will create the most recent database schema from scratch) and compares it to the schema in the database, altering the tables as necessary so that the schema is brought up-to-date.
upgrade.php then runs a function called upgrade_all() which runs specific upgrade_NNN() functions if $wp_db_version is less than target values. (ie. upgrade_250(), the WordPress 2.5.0 upgrade, will be run if the database version is less than 7499.) Each of these functions run their own data migration and population procedures, some of which are called during the initial database setup script. Nicely cuts down on duplicate code.
So, that's one way to do it.
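A minimal sketch of the same version-table pattern outside WordPress; the options table, the DB_VERSION constant, and the upgrade steps are all hypothetical:

    <?php
    // Hypothetical version-table upgrade, in the WordPress style described
    // above. DB_VERSION is bumped whenever the schema changes; each step
    // upgrades exactly one version, so any older install catches up in order.
    const DB_VERSION = 3;

    function current_db_version(PDO $pdo)
    {
        $row = $pdo->query("SELECT value FROM options WHERE name = 'db_version'")
                   ->fetch(PDO::FETCH_ASSOC);
        return $row ? (int) $row['value'] : 0;
    }

    function upgrade_database(PDO $pdo)
    {
        $steps = [
            1 => 'ALTER TABLE messages ADD COLUMN read_at DATETIME NULL',
            2 => 'CREATE INDEX idx_messages_read ON messages (read_at)',
            3 => "UPDATE messages SET read_at = NOW() WHERE status = 'read'",
        ];

        for ($v = current_db_version($pdo) + 1; $v <= DB_VERSION; $v++) {
            if (isset($steps[$v])) {
                $pdo->exec($steps[$v]);
            }
            // Record progress after each step (assumes a unique key on name).
            $pdo->exec("REPLACE INTO options (name, value) VALUES ('db_version', $v)");
        }
    }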
Yes, it would be a security issue if PHP went and overwrote its files from some place on the internet with no warning. There's no guarantee that the server is connecting correctly to your update server (it might download code crafted by someone else if DNS poisoning occurred), giving someone else access to your clients' data. Therefore, digital signing would be important.
The user could control updates by setting permissions on the web directory so that PHP only has read access to the files - this procedure could simply be documented with your program.
One question remains (I really don't know the answer to): can PHP overwrite files if it's currently using them (e.g. if the update.php file itself needed to be updated)? Worth testing.
I suppose you've already ruled this out, but you could host it as a service. (Think wordpress.com)
I'd suggest that you package your application with PEAR and set up a channel. Your users can then upgrade the application through a standard interface (PEAR). It's not entirely automatic (unless the users have some kind of automation running on top of PEAR), but it's standard, so any sysadmin can maintain it.
I think your best option is an update-checking mechanism that will alert the administrator when there are updates.
As you mention, there are a number of potential security problems. Due to those alone, I would suggest not doing this. Instead, try creating a fairly smart upgrading script.
Just my 2 cents: I'd consider an automatically self-updating application within my CMS a security hole, so if you decide to code this feature, you should consider implementing different levels of this behavior:
Automatically update
Check for updates and notify
Disable
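A hypothetical sketch of those three levels as a config switch; the endpoint URL, mail address, and apply_update() helper are made up:

    <?php
    // Hypothetical sketch; UPDATE_MODE is one of 'auto', 'notify', 'off'.
    define('UPDATE_MODE', 'notify');

    function check_for_updates($installedVersion)
    {
        if (UPDATE_MODE === 'off') {
            return;
        }

        $response = @file_get_contents('https://updates.example.com/latest.txt');
        $latest   = $response === false ? '' : trim($response);
        if ($latest === '' || version_compare($latest, $installedVersion, '<=')) {
            return;   // unreachable server, or already up to date
        }

        if (UPDATE_MODE === 'notify') {
            mail('admin@example.com', 'Update available', "Version $latest is out.");
        } else {
            apply_update($latest);   // hypothetical: download, verify signature, unpack
        }
    }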