Criteria for choosing refactoring instead of a complete re-write - php

I'm working on a project with a codebase of about 3,500 files (probably a few hundred less than that, actually). The project is written in PHP and is quite messy: the documentation is hard to understand, OOP and procedural programming are mixed, dependencies are unclear, and the people who built the system were beginner programmers, with everything that entails.
To be honest, what they have done is impressive; it is a working product and all that. But debugging and adding new features is a real chore.
Now to my question: what are some good criteria for deciding whether we should refactor the whole project or do a complete rewrite? I should mention that rewriting parts of the system as we go is probably a no-go, because everything is interdependent.

If the application is big, successful, and does interesting things, an attempt to rewrite it by hand will likely fail. The rewrite will never have enough mass to displace the original, and you can't throw the original away until it does, meaning you will continue to have to enhance the original until the replacement is ready. Twice as much work, no additional value. You are much better off, likely, keeping the application and cleaning it up.
You might consider two ideas to help reduce the size of the code base:
Run test coverage on the code. Normally people use this to validate application functionality, but if you enable coverage collection and simply run the application for an extended period of time, it will tell you which code is never exercised and is therefore likely dead. If the system is a mess, it probably has a lot of dead code, and this is an easy way to find candidates. There are test coverage tools that can collect data on your entire application with low overhead; you can use them on production code.
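A rough sketch of that idea using Xdebug (the include and log paths are made up; with Xdebug 3 you would also need xdebug.mode=coverage, and purpose-built production coverage tools will have lower overhead than this):

```php
<?php
// coverage_prepend.php - wired in via the auto_prepend_file ini directive.
// Assumes the Xdebug extension is loaded; paths below are placeholders.
xdebug_start_code_coverage();

register_shutdown_function(function () {
    // Map of file => [line => status] for everything executed in this request.
    $coverage = xdebug_get_code_coverage();

    // Append the list of touched files to a log; after a few weeks of real
    // traffic, any file that never appears here is a dead-code candidate.
    file_put_contents(
        '/tmp/touched_files.log',
        implode(PHP_EOL, array_keys($coverage)) . PHP_EOL,
        FILE_APPEND | LOCK_EX
    );
});
```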
Run a clone detector. This will find duplicated code. Outright duplication is easy to remove. A good clone detector will find not just identical clones but parametric clones, that is, code which has been copy/paste/edited in a way that can be summarized with parameters. Parametric clones require more finesse to remove, but finding them points to likely missed abstractions, and removing them (by replacing them with objects or methods) inserts those missing abstractions into your code, making further maintenance involving that idea easier. (It isn't a very well known fact, but clone removal also raises test coverage rates, if you are really using test coverage to verify functionality!)
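To make the "parametric clone" point concrete, here is an invented example of two copy/paste/edited functions and the parameterized abstraction that replaces them:

```php
<?php
// Two copy/paste/edited blocks that differ only in a status value,
// i.e. a parametric clone (all names here are hypothetical):
function activeCustomerEmails(PDO $db): array {
    $stmt = $db->prepare("SELECT email FROM customers WHERE status = 'active'");
    $stmt->execute();
    return $stmt->fetchAll(PDO::FETCH_COLUMN);
}

function suspendedCustomerEmails(PDO $db): array {
    $stmt = $db->prepare("SELECT email FROM customers WHERE status = 'suspended'");
    $stmt->execute();
    return $stmt->fetchAll(PDO::FETCH_COLUMN);
}

// The missing abstraction the clone detector points at: one function with
// the varying part turned into a parameter.
function customerEmailsByStatus(PDO $db, string $status): array {
    $stmt = $db->prepare('SELECT email FROM customers WHERE status = :status');
    $stmt->execute(['status' => $status]);
    return $stmt->fetchAll(PDO::FETCH_COLUMN);
}
```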

The basic criterion will probably be time (= money).
Estimation is always hard when something is big, so split the work into the smallest possible chunks.
Mind the point of view: your estimate will probably differ from the estimate of the previous programmers, as it will be biased by the style you prefer. Maybe a better choice would be to hire the previous programmers to "fix" the code.
In any case, you should state the minimum requirements for calling the app well written and easy to maintain, discuss this with your boss and the team, split the work into the smallest possible sprints, and estimate the time required for each of them.
If you can convince your boss that the rewrite will result in a better ROI overall, do the rewrite. Otherwise, learn the "new style". If you have the luxury of stopping the business for the duration of the rewrite, or spare time in which to do it, you're lucky. You're even more lucky if the code is testable, or if you at least have some unit tests.
If that is not possible, you should at least implement functional tests to help with either case (refactoring or rewrite).
Note that every one of us will probably prefer rewriting from scratch to refactoring. And then someone will want to rewrite that rewrite...
So refactoring is always a good choice, assuming you already have tests or are able to implement them easily.

If the application already works and there are no major malfunctions, I would leave most of it as it is, since rewriting everything from scratch is always costly.
Perhaps you should consider reducing the number of files/lines wherever you see duplicated code, creating objects/functions to centralize things as much as possible. This helps a lot with eventual maintenance and doesn't require a large amount of time. Especially if the code was written by beginners, as you say, you will easily find the "wrong approaches" to improve.

Related

retroactive phpUnit testing: is it worth it?

We have about seven different websites that we have developed in-house. They're sites that track different HR applications and help some of our people get their jobs done via scheduling. Today, the head software designer told me to start writing test cases using phpUnit for our existing code. Our main website has at least a million lines of code and the other websites are all offshoots of that, probably in the tens of thousands to hundreds of thousands of lines.
None of this code was written with any apparent design method. Is it worth it for us to actually go back through all of the code and apply PHPUnit testing to it? I feel that if we wanted to do this, we probably should have been doing it from the start. Also, if we do decide to start doing this unit testing, shouldn't we adopt TDD from here on? I know for a fact that it will not be a priority.
tl;dr: I've been told to write post-rollout test cases for existing code, but I know that the existing code and future code has not been/will not be created with the principles of TDD in mind. Is it worth it? Is it feasible?
Do you still change the code? Then you will benefit from writing tests.
The harder question is: What to test first? One possible answer would be: Test the code that you are about to change, before you change it.
You cannot test all your code years after it has been written; that would likely cost more time and money than the benefit your company would gain.
Also, it is very hard to "just start with PHPUnit"; the benefits of such an approach will probably not be that great for the first few months or years, because you'd still be testing very small units of code without being able to check whether the whole system works. You should also start with end-to-end tests that use a virtual browser, "clicking" through your pages and expecting certain text to be displayed. Search keywords would be "Selenium" or "Behat".
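As a rough sketch of what such an end-to-end test could look like, assuming the php-webdriver package and a running Selenium server (the URL, field names, and expected text are invented placeholders):

```php
<?php
// Rough end-to-end sketch using php-webdriver plus a Selenium server on :4444.
// The URL, field names, and expected text are invented placeholders.
require 'vendor/autoload.php';

use Facebook\WebDriver\Remote\RemoteWebDriver;
use Facebook\WebDriver\Remote\DesiredCapabilities;
use Facebook\WebDriver\WebDriverBy;

$driver = RemoteWebDriver::create('http://localhost:4444/wd/hub', DesiredCapabilities::firefox());

$driver->get('http://staging.example.test/login');
$driver->findElement(WebDriverBy::name('username'))->sendKeys('testuser');
$driver->findElement(WebDriverBy::name('password'))->sendKeys('secret');
$driver->findElement(WebDriverBy::cssSelector('button[type="submit"]'))->click();

// If the welcome text shows up, the whole stack (routing, DB, templates) worked.
if (strpos($driver->getPageSource(), 'Welcome back') === false) {
    echo "FAIL: login flow broken\n";
} else {
    echo "OK\n";
}
$driver->quit();
```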
I know that the existing code and future code has not been/will not be created with the principles of TDD in mind
What happens in the future is a question of the attitude of all the developers. For existing code without tests, the tests will likely never be written. New code written by unwilling developers will not be tested either. But willing developers can make a difference.
Note that this takes more than the lead software designer telling the team to start testing. You'll need coaching and proper professional help if you haven't done this before, plus an infrastructure that constantly runs these tests to ensure they keep working. Setting all this up is quite an effort, and if the lead software designer or any higher boss is ready to spend time and money on it, you should consider yourself very happy, because you can learn something and build more reliable software.
If not, the approach will likely fail because of passive-aggressive denial.
Automated tests are an extremely good idea when you want to refactor or rewrite code. If you have a PHPUnit test for a function or class, you can rewrite the code and then confirm with the test that it still behaves the same as it did before. Otherwise you can end up breaking other parts of your code when you refactor or rewrite things.
You probably won't be able to test the code in its current form. It's not just a matter of throwing tests at some poorly written code, because that code probably isn't testable.
To be able to write the tests, you will probably need to refactor the code. Depending on its quality, that could mean rewriting it entirely.
I'd recommend testing only the new additions to the code. So if you need a new method in a class, or a new function, or whatever, it must be tested. Personally I prefer TDD, but for people new to testing it can be too much to ask them to think about a test before they have even refactored the code, so you can stick to testing afterwards.
To test the new additions, you'll need to do some refactoring of the existing code. For example, if you add a method that logs information, the logger should be injected so that it can be stubbed, which probably means writing some kind of container. That does not mean you have to inject the logger everywhere it is used, just in the places where you are adding or changing something that must be tested. This can become quite tricky and may require someone well versed in testing to assist (a sketch of the injection idea follows the summary below).
In short, my recommendation is:
test only the new additions and the new code
refactor the code to be sure that the code under test is really testable
see whether the tests give you confidence and are worth the effort, and keep going
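Here is a minimal sketch of the logger-injection idea mentioned above; the class and method names are made up, and the test stubs the logger with a PHPUnit mock:

```php
<?php
use PHPUnit\Framework\TestCase;

// Hypothetical example: the new method receives its logger instead of
// reaching for a global, so a test can substitute a stub.
interface Logger {
    public function info(string $message): void;
}

class OrderArchiver {
    /** @var Logger */
    private $logger;

    public function __construct(Logger $logger) {
        $this->logger = $logger;
    }

    public function archive(int $orderId): bool {
        // ...existing archiving logic would go here...
        $this->logger->info("archived order {$orderId}");
        return true;
    }
}

class OrderArchiverTest extends TestCase {
    public function testArchiveLogsAndSucceeds(): void {
        $logger = $this->createMock(Logger::class);
        $logger->expects($this->once())->method('info');

        $archiver = new OrderArchiver($logger);
        $this->assertTrue($archiver->archive(42));
    }
}
```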

Cleaning Code: Fresh Start vs Hardcore Cleaning

So I got a budget to clean up code that I've inherited from another programmer. The code is in really bad shape: there are lots of files not being used and lots of functions not being used. It actually looks like linear programming instead of object-oriented code. My issue is whether or not I should
a) Clean the code, restructure it, and try to delete all unused resources
b) start with a fresh framework (it's using CodeIgniter), and copy over the files that are needed to run
I like option b because it kind of reminds me of a new laptop from Best Buy: I can either spend the time removing all the bloatware or do a wipe and create the system from scratch, which I always do. The only problem is that this time there's a lot more involved than just creating a fresh system. Here are the pros and cons:
PROS
Clean system that is very easy to maintain
Don't need to go around searching for unused files
Easier to keep track of what I've done
CONS
Lots of things can break
Might miss required files
Might take longer
What do you think? Can you give me some of your pros and cons, and what you might do in a situation like this?
Update
A lot of people say that I am missing an important part: budget (time and money). We do have a decent budget, but my project manager is willing to go over it if that ensures a more solid system with quicker turnaround for new features. It's hard for me to quantify the budget because you have not seen the code, so giving hours won't help, which is why I did not say anything about it. I guess from your perspective, consider this a budget that can accommodate either solution.
Often a hybrid approach works better. Keep the gold nuggets and toss the rest. Most likely there is some really effective code even in the worst project. And for things that would be painful to rewrite and already work well, you might just clean those up a little.
It takes a little judgment to determine what to keep, but you can kind of have the best of both that way.
Cleaning up someone else's code can be a nightmare. If you can actually choose, and the stakeholders agree, I'd strongly recommend starting over.
I had an experience like this last year, and to this date the software is still in pretty bad shape. It's almost impossible to track every mistake made by the programmers we inherited the code from, and we spend a LOT of time on support for hidden bugs and the like.
I guess it somewhat depends on your time constraints and how intimately you'll need to know the project.
I just finished restructuring some code written by a math-majoring grad student. Bright guy, but he's not a programmer, and the code was, as you said, very linear. Long story short, I rewrote about 90% of the code and took more time doing it than I would have liked. It would have been faster for me to start from scratch, using his code as a reference. Not only that, if I had planned on making as many changes as I did, I would have put more thought into the restructuring process. On the bright side, I now know all the code and concepts in this project very well.
On the other hand, if you don't plan on making many changes or having to maintain the code, maybe it's not worth the hassle. Get it to a usable state and tuck it back under the rug, so to speak.
My 2 cents...
This question reminds me of a Joel article I read a while back:
"The single worst strategic mistake that any software company can make: they decided to rewrite the code from scratch... It's harder to read code than to write it."
http://www.joelonsoftware.com/articles/fog0000000069.html

process of commenting and improving already written program?

Please allow my intro to properly define the scope of my question:
I'm still very new to the programming world. This all started for me when I had an idea for a software program, but no programming experience. I ended up going the outsourcing route to get the program, and after almost a year, we do have it live and functioning.
This specific program is written in PHP and is 100% web-based. We're using lots of AJAX, jQuery, etc.
Now a year into it, I have been learning and learning wherever I can (lots of learning here!!!) I'm mainly focusing on Java now to build up to Objective-C and the iPhone fun (probably like 99% of all other newbie programmers out there).
I'm really learning so much, and one of the biggest things I'm learning about is proper commenting and scalability.
I'm seeing now that the job we just finished is sorely lacking in both those areas. I want to add to and build upon this program, but not only do I not have much experience, I'm also finding it really hard to even get an idea of what the functions do without those comments...
So my question is: what is the best course of action to begin picking up the pieces of this program? A full re-write is out of the question, and I don't think it is needed.
I'm sure this is not the first time some newbie programmer, software developer has been down this path...what do others do here?
Is it common for a programmer to come into a project very far along and then "clean up" the mess in order to make things move forward productively?
If this is the wrong place for this question (and I understand it may well be) can someone point me to where this would be more appropriate?
Thanks!
Joel
We call it "refactoring" and it's an important part of programming.
First, you must have a rock-solid set of automated tests. Usually we have unit tests that we run with a unit testing framework.
http://www.testingtv.com/2009/09/24/test-driven-development-with-refactoring/
Then you can make changes and run the tests to confirm that nothing was broken by your changes.
In some cases, you have to "reverse engineer" the tests around the existing programs. This is not very difficult: you focus on the interfaces that are "external", "major", or "significant".
Since you're reverse-engineering, it's hard, at first, to determine what should be tested (because it's an important external feature) and what should not be tested (because it's an implementation detail).
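A sketch of what such a reverse-engineered ("characterization") test might look like; the function name and the expected figures are hypothetical, captured by running the existing code once and recording what it returns:

```php
<?php
use PHPUnit\Framework\TestCase;

// Characterization test: we don't assert what calculateShippingCost() *should*
// do, only what it currently does, so later changes that alter the behaviour
// get flagged. The function name and the figures (in cents) are hypothetical.
class ShippingCostTest extends TestCase
{
    public function testCurrentBehaviourForDomesticOrder(): void
    {
        // Values captured by running the existing code once and noting the output.
        $this->assertSame(495, calculateShippingCost(1.2, 'US'));
    }

    public function testCurrentBehaviourForHeavyInternationalOrder(): void
    {
        $this->assertSame(3250, calculateShippingCost(18.0, 'DE'));
    }
}
```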
I'm really learning so much, and one of the biggest things I'm learning about is proper commenting and scalability.
First, I'm curious what you've learned about "proper commenting" as this varies drastically. For some, it's documenting every class and function. For others, it may be documenting every line of code or no code at all.
After having gone through some of the different phases above, I'm with Uncle Bob Martin who, in Clean Code, says that you document decisions, not what the code does. The code itself should be readable and not need documentation. By adding comments describing behavior, you've created duplication that will eventually fall out of sync. Rather, the code should document itself: using well-named functions and variables helps describe exactly what the author intended. I'd highly recommend Clean Code for a full discussion of these concepts.
As for scalability, it's usually something that you want to build in. Scalability might be a function of good design, or a proper design for the requirements, but poor design will make scalability a nightmare.
I'm seeing now that this job we just finished is very sorely lacking in both those areas. I am wanting to add and build upon this program, and not only do I not have much experience, but I'm seeing that it's really hard for me to even get an idea about the functions without these comments...
I see this as an indicator of one of two things:
that the code isn't well written (yeah, that's highly subjective),
or that you don't yet fully understand everything you need to,
or a little bit of both.
Writing good, intention-revealing code is hard and takes years of practice.
So my question is: what is the best course of action to begin picking up the pieces of this program? A full re-write is out of the question, and I don't think it is needed.
As other posters have mentioned: refactoring. Refactoring is the process of changing code to improve readability and maintainability without changing functionality. Get a good book on refactoring, or start reading everything you can online. It's a critical skill.
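A tiny, made-up example of what that means in practice: the behavior stays the same, only the readability improves:

```php
<?php
// Before: works, but the intent is buried (hypothetical legacy code).
function p($a) {
    $t = 0;
    foreach ($a as $x) { if ($x['q'] > 0) { $t += $x['q'] * $x['pr']; } }
    return $t * 1.2;
}

// After: identical behaviour, but the names and an extracted helper
// document what is going on (the 1.2 turns out to be VAT).
const VAT_MULTIPLIER = 1.2;

function lineTotal(array $line): float {
    return $line['q'] * $line['pr'];
}

function orderTotalWithVat(array $lines): float {
    $net = 0.0;
    foreach ($lines as $line) {
        if ($line['q'] > 0) {
            $net += lineTotal($line);
        }
    }
    return $net * VAT_MULTIPLIER;
}
```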
Is it common for a programmer to come into a project very far along and then "clean up" the mess in order to make things move forward productively?
Unfortunately it is. It takes a lot of diligence to avoid falling into this trap. Try to make your code a little bit better every day.
I don't know about this being the wrong place or not, but I'll answer as I can:
Is it common for a programmer to come into a project very far along and then "clean up" the mess in order to make things move forward productively?
Yes, in my experience this is very common. I have been doing contract work for over 10 years, and I can't count the number of times I've had to come in and clean up something hastily put together to either make it scale or to be able to add functionality onto it. This is especially common when you outsource the programming to another company, the incentive there is to get it working and out of the door as quickly as possible.
So my question is: what is the best course of action to begin picking up the pieces of this program? A full re-write is out of the question, and I don't think it is needed.
I don't know that there is a "good" answer to this question. The only thing I can tell you is to take it one method at a time and document what each one does as you figure it out. If you still have access to the people who initially wrote the program, you can ask them for documentation on the system, but if that was not included as part of the original work spec, I doubt they will have any.
I'm really learning so much, and one of the biggest things I'm learning about is proper commenting and scalability.
As you have found on your own, proper commenting is important. I'm not convinced of the importance of building scalability in from the beginning, going by the YAGNI principle. I think that as any program grows it is going to go through growing pains, whether in scalability or functionality. Could someone have built Twitter from the start with the kind of scalability it currently needs in mind? Possibly, but there is also the very real possibility that it would have flopped.
Is it common for a programmer to come into a project very far along and then "clean up" the mess in order to make things move forward productively?
It's definitely common for pretty much EVERY programmer :)
Having said that, remember the IIABDFI principle: If It Ain't Broke, Don't Fix It.
Understanding how the program works and what the pieces are is useful.
Trying to improve it without a specific goal and a business purpose in mind is not.
The big question is how well the program, as it currently runs, meets the needs of those who use it. While it may not be the best-looking code, it does work, which may mean that you end up doing 101 refactoring exercises around it to get enough of the basics down to make other changes.
While you may be able to ask the original writers of the program, this can be a sore spot if, for example, they think the code is awesome and you think it is crap. It is an idea, but one that should be carefully weighed before you end up burning bridges because they think you can't appreciate the genius of what was done.
Often things aren't done in an optimal way at first, and as one learns better ways to do things, they get done better. There is a limit to that, of course, but I'd start with the idea that you have some refactoring lessons ahead to get the basics of the app under your belt, and then start putting in enhancements and other changes to see what was really done in the end.

How do I implement a test framework in a legacy project

I've got a big project written in PHP and Javascript. The problem is that it's become so big and unmaintainable that changing some little portion of the code will upset and probably break a whole lot of other portions.
I'm really bad at testing my own code (as a matter of fact, others point this out daily), which makes it even more difficult to maintain the project.
The project itself isn't that complicated or complex, it's more the way it's built that makes it complex: we don't have predefined rules or lists to follow when doing our testing. This often results in lots of bugs and unhappy customers.
We started discussing this at the office and came up with the idea of switching to test-driven development instead of the develop-like-hell-and-maybe-test-later approach (which almost always ends up being fix-bugs-all-the-time).
After that background, the things I need help with are the following:
How do I implement a test framework into an already existing project? (3 years in the making and counting)
What kinds of frameworks are there for testing? I figure I'll need one framework for JavaScript and one for PHP.
What's the best approach for testing the graphical user interface?
I've never used Unit Testing before so this is really uncharted territory for me.
G'day,
Edit: I've just had a quick look through the first chapter of "The Art of Unit Testing" which is also available as a free PDF at the book's website. It'll give you a good overview of what you are trying to do with a unit test.
I'm assuming you're going to use an xUnit type framework. Some initial high-level thoughts are:
Edit: make sure that everyone is in agreement as to what constitutes a good unit test. I'd suggest using the above overview chapter as a good starting point and, if needed, taking it from there. Imagine having people run off enthusiastically to create lots of unit tests while holding different understandings of what a "good" unit test is. It'd be terrible to find out later that 25% of your unit tests aren't useful, repeatable, reliable, etc.
add tests to cover small chunks of code at a time. That is, don't create a single, monolithic task to add tests for the existing code base.
modify any existing processes to make sure new tests are added for any new code written. Make it a part of the review process of the code that unit tests must be provided for the new functionality.
extend any existing bugfix processes to make sure that new tests are created to show the presence of the bug and then prove its absence once it is fixed. N.B. Don't forget to roll back your candidate fix to reintroduce the bug, to verify that it really is that single patch that corrects the problem and that it is not being fixed by a combination of factors.
Edit: as you start to build up the number of your tests, start running them as nightly regression tests to check nothing has been broken by new functionality.
make a successful run of all existing tests an entry criterion for the review process of a candidate bugfix.
Edit: start keeping a catalogue of test types, i.e. test code fragments, to make the creation of new tests easier. No sense in reinventing the wheel all the time. The unit test(s) written to test opening a file in one part of the code base is/are going to be similar to the unit test(s) written to test code that opens a different file in a different part of the code base. Catalogue these to make them easy to find.
Edit: where you are only modifying a couple of methods of an existing class, create a test suite to hold the complete set of tests for the class, then add only the individual tests for the methods you are modifying to that test suite. (This uses xUnit terminology, as I'm now assuming you'll be using an xUnit framework like PHPUnit.)
use a standard convention for naming your test suites and tests, e.g. testSuite_classA, which will then contain individual tests named test_<function>_<case>, for example test_fopen_bad_name and test_fopen_bad_perms. This helps minimise the noise when moving around the code base and looking at other people's tests. It also has the benefit of helping people when they come to name their tests in the first place, by freeing up their minds to work on the more interesting stuff, like the tests themselves. (A small sketch follows at the end of this list.)
Edit: I wouldn't use TDD at this stage. By definition, TDD wants all tests present before the changes are in place, so you would have failing tests all over the place as you add new testSuites to cover the classes you are working on. Instead, add the new testSuite and then add the individual tests as required, so you don't get a lot of noise in your test results from failing tests. And, as Yishai points out, taking on the task of learning TDD at this point in time will really slow you down. Put learning TDD down as a task for when you have some spare time; it's not that difficult.
as a corollary of this, you'll need a tool to keep track of those existing classes where the testSuite exists but tests have not yet been written to cover the other member functions of the class. This way you can keep track of where your test coverage has holes. I'm talking at a high level here, where you can generate a list of classes and the specific member functions for which no tests currently exist. A standard naming convention for the tests and testSuites will greatly help you here.
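As a small sketch of that naming convention (the FileOpener class is invented, and note that PHPUnit's default discovery looks for *Test.php files, so you would list suites named like this explicitly in phpunit.xml):

```php
<?php
use PHPUnit\Framework\TestCase;

// Naming sketch only: one suite per production class (here a hypothetical
// FileOpener), one test per method-plus-scenario, so failures read like
// "testSuite_FileOpener::test_fopen_bad_perms" in the report.
class testSuite_FileOpener extends TestCase
{
    public function test_fopen_bad_name(): void
    {
        $opener = new FileOpener();
        $this->assertFalse($opener->open('/no/such/file.txt'));
    }

    public function test_fopen_bad_perms(): void
    {
        $opener = new FileOpener();
        $this->assertFalse($opener->open('/etc/shadow'));
    }
}
```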
I'll add more points as I think of them.
HTH
You should get yourself a copy of Working Effectively with Legacy Code. It will give you good guidance on how to introduce tests into code that was not written to be tested.
TDD is great, but you first need to put the existing code under test, to make sure that the changes you make don't alter existing required behavior while you are introducing new changes.
However, introducing TDD now will slow you down a lot before you get back going, because retrofitting tests, even only in the area you are changing, is going to get complicated before it gets simple.
Just to add to the other excellent answers, I'd agree that going from 0% to 100% coverage in one go is unrealistic - but that you should definitely add unit tests every time you fix a bug.
You say that there are quite a lot of bugs and unhappy customers. I'd be very positive about incorporating strict TDD into the bugfixing process, which is much easier than implementing it across the board. After all, if there really is a bug that needs to be fixed, then creating a test that reproduces it serves several goals:
It's likely to be a minimal test case to demonstrate that there really is an issue
With confidence that the (currently failing) test highlights the reported problem, you'll know for sure if your changes have fixed it
It will forever stand as a regression test that will prevent this same issue recurring in future.
Introducing tests to an existing project is difficult and likely to be a long process, but doing it at the same time as fixing bugs is such an ideal opportunity (in parallel with introducing tests gradually in the "normal" sense) that it would be a shame not to take the chance and make lemonade from your bug reports. :-)
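A sketch of what such a bug-reproducing test could look like; the Cart class, the ticket number, and the discount rule are all invented:

```php
<?php
use PHPUnit\Framework\TestCase;

// Hypothetical bug report #1234: totals come out wrong when a discount code
// is applied twice. Written *before* the fix, this test fails and pins down
// the reported behaviour; after the fix it passes and guards against regression.
class Bug1234DiscountAppliedTwiceTest extends TestCase
{
    public function testDiscountIsOnlyAppliedOnce(): void
    {
        $cart = new Cart();
        $cart->addItem('widget', 10000);    // price in cents
        $cart->applyDiscountCode('SAVE10');
        $cart->applyDiscountCode('SAVE10'); // the step customers reported

        $this->assertSame(9000, $cart->total());
    }
}
```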
From a planning perspective, I think you have three basic choices:
take a cycle to retrofit the code with unit tests
designate part of the team to retrofit the code with unit tests
introduce unit tests gradually as you work on the code
The first approach may well last a lot longer than you anticipate, and your visible productivity will take a hit. If you use it, you will need to get buy-in from all your stakeholders. However, you might use it to kickstart the process.
The problem with the second approach is that you create a distinction between coders and test writers. The coders will not feel any ownership for test maintenance. I think this approach is worth avoiding.
The third approach is the most organic, and it gets you into test-driven development from the get go. It may take some time for a useful body of unit tests to accumulate. The slow pace of test accumulation might actually be an advantage in that it gives you time to get good at writing tests.
All things considered, I think I'd opt for a modest sprint in the spirit of approach 1, followed by a commitment to approach 3.
For the general principles of unit testing I recommend the book xUnit Test Patterns: Refactoring Test Code by Gerard Meszaros.
I've used PHPUnit with good results. PHPUnit, like other JUnit-derived projects, requires that the code to be tested be organized into classes. If your project is not object-oriented, you'll need to start refactoring loose procedural code into functions, and functions into classes.
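For example (all names invented), pulling an inlined calculation out of a page script into a class gives PHPUnit something to instantiate:

```php
<?php
// Before (hypothetical legacy page script): logic inlined among the HTML.
//   $price = $row['price'];
//   if ($qty >= 10) { $price = $price * 0.9; }
//   echo $price * $qty;

// After: the calculation lives in a class that a PHPUnit test can construct,
// while the page script just calls it.
class PriceCalculator
{
    public function totalFor(int $qty, float $unitPrice): float
    {
        $price = $qty >= 10 ? $unitPrice * 0.9 : $unitPrice; // bulk discount
        return $price * $qty;
    }
}

// In the page script:
// echo (new PriceCalculator())->totalFor($qty, $row['price']);
```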
I've not personally used a JavaScript framework, though I would imagine that those frameworks also require your code to be structured into (at least) callable functions, if not full-blown objects.
For testing GUI applications, you may benefit from using Selenium, though a checklist written by a programmer with good QA instincts might work just fine. I've found that MediaWiki, or your favorite wiki engine, is a good place to store checklists and related project documentation.
Implementing a framework is in most cases a complex task, because you essentially start rebuilding your old code on top of some new, solid framework parts. The old parts have to start communicating with the framework: they must receive callbacks and return states, surface those to the user somehow, and suddenly you have two systems to test.
If you say that your application itself isn't that complex, but it has become complex due to a lack of testing, it might be a better option to rebuild the application. Put some common frameworks like Zend to the test, gather your requirements, find out whether the framework you tried suits those requirements, and decide if it's useful to start over.
I'm not very sure about unit testing, but NetBeans has a built-in unit testing suite.
If the code is really messy, it is possible that it will be very hard to do any unit testing. Only sufficiently loosely coupled and sufficiently well-designed components can be unit tested easily. However, functional testing may be a lot easier to implement in your case. I would recommend taking a look at Selenium. With this framework, you will be able to test your GUI and the backend at the same time. However, it most probably won't help you catch bugs as well as unit testing could.
Maybe this list will help you and your mates to re-structure everything:
Use UML, to design and handle exceptions (http://en.wikipedia.org/wiki/Unified_Modeling_Language)
Use BPMS, to design your work-flow so you won't struggle (http://en.wikipedia.org/wiki/Business_process_management)
Get a list of PHP frameworks that also offer JavaScript support (e.g. Zend with jQuery)
Compare these frameworks and take the one that best matches your project design and the coding structure used before
You may also want to consider using things like ezComponents and DTrace for debugging and testing
Do not be afraid of changes ;)
For GUI testing you may want to take a look at Selenium (as Ignas R already pointed out), or you may want to take a look at this tool as well: STIQ.
Best of luck!
In some cases, doing automated testing may not be such a good idea, especially when the code base is dirty and the PHP mixes its behavior with the JavaScript.
It can be better to start with a simple checklist (with links, to make it faster) of the tests that should be done manually before each delivery.
When coding in a 3-year-old minefield, you'd better protect yourself with plenty of error checking. The 15 minutes spent writing THE proper error message for each case will not be wasted.
Use the bridging method: bridge the ugly, lengthy function fold() with a call to fnew(), which is a wrapper around some clean classes; call both fold() and fnew(), compare the results, log the differences, throw the code into production, and wait for your fishes. When doing this, always use one cycle for refactoring and a separate cycle for changing results (do not even fix bugs in the old behavior, just bridge it).
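A sketch of that bridging idea, keeping the fold()/fnew() placeholder names (both are assumed to exist elsewhere in the legacy codebase); the old result is still the one returned, so behavior cannot change until you decide to flip over:

```php
<?php
// Bridge sketch: fold() is the ugly legacy function, fnew() the wrapper
// around the cleaned-up classes; both are assumed to be defined elsewhere.
// Both run, differences get logged, and the old result is still returned,
// so production behaviour stays identical for now.
function fold_bridged(...$args)
{
    $old = fold(...$args);
    try {
        $new = fnew(...$args);
        if ($old !== $new) {
            error_log('fold/fnew mismatch: ' . json_encode([
                'args' => $args,
                'old'  => $old,
                'new'  => $new,
            ]));
        }
    } catch (Throwable $e) {
        error_log('fnew threw: ' . $e->getMessage());
    }
    return $old; // keep the legacy behaviour until the log stays quiet
}
```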
I agree with KOHb, Selenium is a must !
Also have a look at PHPure.
Their software records the inputs and outputs of a working PHP website, and then automatically writes PHPUnit tests for functions that do not access external sources (DB, files, etc.).
It's not a 100% solution, but it's a great start.

Trying out Test-Driven Development

After reading this post I kinda felt I was in the same position as the guy who asked the question. I love technology, and coming up with new ideas to solve real-world problems gets my neurons excited, but the other part of the equation, actually getting things done (fast), is normally a pain in the ass to accomplish, especially when I'm doing this for myself.
Sometimes I feel plain bored with the code; other times I spend more time moving the cursor around the text editor and staring at my code, trying to come up with a solution that is better than the one I already have. I hear this is a disease called perfectionism.
I've read in that same post (and also a few times here on SO) that TDD actually helps with getting things done, but I've never given TDD a chance, either because I'm too lazy to learn and set it up, or because I don't think I need it, since I can do all the tests I need inside my head.
Do you also believe that TDD actually helps to GTD?
What do I need to know about TDD?
What about alternatives to TDD?
What would be the best methodology to organize / develop a TDD web app?
What libraries should I use (if any) to make my life easier?
PS: I'm primarily (but not exclusively) working with PHP here.
Personally I think TDD is at best overkill and at worst an impediment to the creative process of programming. Time spent laboriously writing unit tests for each as-yet-unwritten method or class would be better spent solving the original problem. That being said, I am a big fan of unit tests and believe wholeheartedly in them. If I have a particularly complex or troublesome piece of code, I'm more than happy to write 20 unit tests for a single method, but generally AFTER I have solved the problem. TDD, just like every other programming paradigm, is no silver bullet. If it suits you, use it; if not, keep looking.
But take my opinion with a grain of salt. A much more interesting one comes from Kent Beck: How deep are your unit tests?
Do you also believe that TDD actually helps to GTD?
My biggest concern was simply not being able to test code. It was too complex; our core libraries weren't built around an easily testable interface. So we wrote tests for whatever we could. In the end we ended up refactoring our core libraries to make life easier. Besides that, it's a change of mindset, and I would definitely consider allocating more time to your first TDD project just to flush out some of the problems you may hit along the way.
What do I need to know about TDD?
TDD is not a substitute for a methodology; it is a beneficial addition, or at least it's supposed to be. Done right, TDD greatly improves software design. It also acts as your internal documentation: if you want somebody to look at your API and understand how it works, they can simply look at your well-named, well-formed tests.
What about alternatives to TDD?
Like I said, I wouldn't consider this a substitute for a methodology. There is one alternative, and that is not to use it :)
What would be the best methodology to organize / develop a TDD web app?
We have been fairly successful with scrum/agile, if that's what you're asking.
What libraries should I use (if any) to make my life easier?
My PHP knowledge expired five years ago, so I'll let somebody else answer this one.
Either way, just my 2 cents. If you feel like reading, here is a good overview: http://www.agiledata.org/essays/tdd.html
I've recently started using "fat models, thin controllers" (http://www.amitshah.in/php/controller-vs-model.html): shovelling as much code into the model (and out of the view/controller) as possible.
I use PHPUnit (and the Zend Framework support for it) to test only a few complex models in my web apps. Writing unit tests that exhaustively check a 2-line function that executes a simple SQL query is a waste of time, IMHO. Over the last couple of years I've got lazier and lazier about writing tests, and for most web apps it's not worth it because the code is so simple.
But in a recent project (an e-commerce site that tracks stock levels over multiple warehouses with composite products), I started doing test-first development because I knew there was going to be a lot of subtle complexity that I couldn't keep in my head all at once. The only way to keep everything working as my thinking developed was to write some tests. With some parts it seemed more natural to write the class and then the tests; other parts were test-first; yet other parts didn't need tests at all because they were trivial. TDD is a tool, not a religion. Use it when it works; stop if it doesn't.
Where I see the benefit of TDD is in reducing complexity and increasing speed (the rate at which I can solve problems). If I write some tests that prove my code is working, then as soon as all the tests pass I can move on to the next problem. That puts the fun back into coding for me.
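For what it's worth, here is a sketch of the kind of model-level test meant here; the Inventory class and its rules are invented stand-ins for the warehouse/composite-product logic described:

```php
<?php
use PHPUnit\Framework\TestCase;

// Invented stand-in for the "stock levels across warehouses" model: the point
// is that the subtle rule (a composite product is only in stock while every
// part is available somewhere) lives in a fat model that is cheap to test.
class InventoryTest extends TestCase
{
    public function testCompositeProductUsesTheScarcestPart(): void
    {
        $inventory = new Inventory();
        $inventory->setStock('warehouse_a', 'frame', 5);
        $inventory->setStock('warehouse_b', 'wheel', 12);

        // A bike needs 1 frame and 2 wheels: 5 frames vs 6 wheel-pairs => 5 bikes.
        $inventory->defineComposite('bike', ['frame' => 1, 'wheel' => 2]);

        $this->assertSame(5, $inventory->availableUnits('bike'));
    }
}
```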
