In its finest form, as practiced by KentBeck, the approach looks like this.
With a couple of features added to the debugger, like a StubButton, you build most of the code in the debugger ... it's like being able to observe the engine running while you tune it.
If you haven't experienced this, it probably sounds like baling wire and chewing gum. It isn't. You are intimately connected to the objects that need to do the job for you. You follow where Smalltalk leads, crafting clear, small methods as you go. If I could give you this one experience, my life would be complete. (Not that I'd log out, you understand.) --RonJeffries
This is the way I used to do it in Smalltalk. Now, when I'm using VisualAgeJava, I find that it can't be done as easily, because when I add a method, change a signature, etc., things stop compiling. But VisualAgeJava kindly shows me with little red X's what I have to fix. So I fix them and then use breakpoints and the debugger (until a change I make in the debugger gives me new little red X's).
It's still a beautiful thing and gets the same result. --KenAuer
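For those who haven't seen the Java flavor of this cycle, here is a minimal sketch of the stub-first step Ken describes; the Report class and its method names are hypothetical, invented for illustration:

  // Hypothetical Report class, invented for illustration.
  public class Report {
      public String render() {
          // Call the methods you wish existed; the IDE flags them with
          // little red X's until the stubs below appear.
          return header() + body();
      }

      // Stubs added to satisfy the compiler. Set a breakpoint here and
      // grow the real behavior while the program runs under the debugger.
      private String header() {
          throw new UnsupportedOperationException("not yet implemented");
      }

      private String body() {
          throw new UnsupportedOperationException("not yet implemented");
      }
  }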
I've been having mixed results with this "top-down" approach, and I'm pretty sure it's my fault, because the technique sounds so beautiful. I'm having trouble figuring out when to write the tests for the lower-level methods. If I try to write the test when the walkback occurs, it tends to break the flow of the process. If, instead, I try to maintain the flow, I find myself producing beautiful, clear methods - but when I'm done, I find myself with only one high-level test written. (At that point, I can go back and write the lower-level tests, but that doesn't sound like the right way to do things.)
Is there a better way of doing this? Maybe I need to train myself to incorporate the lower-level-test-writing into the flow.
(A week later...)
Having had the answer pounded through my head by several people, I'll attempt to write it up here. Please fix the parts that I get wrong. --AdamSpitz
The trick is to do things so simply that there's no need to go very deep during any test-code-refactor cycle. Start off by writing the simplest test you can think of, and make it pass by writing the simplest code that you can - which probably means just hard-coding the correct answer.
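In JUnit, the framework KentBeck co-wrote for Java, the first step might look something like this (Fibonacci is a hypothetical stand-in, not an example from the discussion above):

  import junit.framework.TestCase;

  public class FibonacciTest extends TestCase {
      // The simplest test I can think of.
      public void testFirstValue() {
          assertEquals(1, Fibonacci.of(1));
      }
  }

  class Fibonacci {
      static int of(int n) {
          return 1; // the simplest code that passes: hard-code the answer
      }
  }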
You don't just leave the hard-coded answer in there, of course. You go back and write another test - this time the second-simplest test you can think of. Maybe this time the hard-coded answer isn't good enough anymore, so you generalize the method so that it passes both tests.
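Continuing the hypothetical Fibonacci example, the second test breaks the hard-coded answer, and the method gets generalized just enough to pass both:

  public class FibonacciTest extends junit.framework.TestCase {
      public void testFirstValue() {
          assertEquals(1, Fibonacci.of(1));
      }

      // The second-simplest test; "return 1" no longer passes.
      public void testThirdValue() {
          assertEquals(2, Fibonacci.of(3));
      }
  }

  class Fibonacci {
      // Generalized just enough to satisfy both tests.
      static int of(int n) {
          if (n <= 2) return 1;
          return of(n - 1) + of(n - 2);
      }
  }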
It might seem stupid to do the two steps separately, but it's important to realize that while you were doing that first step, the code told you a little bit about how it wanted to be structured, even if it didn't tell you how it wanted to be implemented. And it all happened without straying very far from a place where the tests would pass.
I suppose I'm just daft, but all this talk of automated test suites sounds like pie in the sky to me--but maybe I'm reading too much into it. It seems to me that when code gets complex, no test that I could write would convince me that it is sound. On the other hand, if I can think of a test, I almost don't need it, because it'll be so simple. That said, I'd love to have a suite of tests that I can run anytime I want to verify that the whole system works, but how to do it? I mean, a test that tells me whether a bunch of code still processes some test data in a predictable way is nice, but what I really need is a test that tells me when a side effect has appeared, and that's an animal I'm not likely to see.
I work mostly on small projects in Lotus Notes and LotusScript (Visual Basic). I test as I go, but in most cases I just can't see how a cost-effective automated test can be built. As a practical matter, I write tiny tests to verify each function, but mostly rely on manual functional tests to assess the system. I'll admit it does not work perfectly, and would not work for large projects, but if there is a practical alternative for small projects, I'd like to learn about it.
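One practical answer: a side effect only trips a test that asserts on state you expected to stay unchanged, so the trick is to assert on more than the value you are directly computing. A minimal JUnit-style sketch, with a hypothetical Account class invented for illustration:

  import junit.framework.TestCase;

  class Account {
      private int balance;
      Account(int opening) { balance = opening; }
      void withdraw(int amount) { balance -= amount; }
      int balance() { return balance; }
  }

  public class AccountTest extends TestCase {
      public void testWithdrawLeavesOtherAccountsAlone() {
          Account a = new Account(100);
          Account b = new Account(50);
          a.withdraw(30);
          assertEquals(70, a.balance());
          // This extra assertion is what catches an accidental side effect.
          assertEquals(50, b.balance());
      }
  }

This won't prove the system sound, but each such assertion makes an unintended side effect a little more likely to trip something.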
I chanced across the following today. I think it's a beauty - YouArentGonnaNeedIt delightfully illustrated.
http://developer.java.sun.com/developer/bugParade/bugs/4263747.html

For those who may not want to sign up in the Sun Java database, a few key phrases are...
  while (go up through parent pointers) {
      if (is a window) {
          if (no warning string) {
              // Just a comment, asking if something should be done here.
          }
          break;
      }
  }
I think it's partly about the definition of "need". I know I need this bit before I can do that bit, so I do this bit first. But that bit doesn't exist yet, so I don't really "need" this bit at all (according to this characterisation of XP). So instead I write that bit first, then I find it doesn't work, so I stop what I am doing and go back and write this bit, because now I do need it. -- DaveHarris
I think JustInTimeProgramming is a better phrase for describing this part of ExtremeProgramming. It's also quite trendy.
There are analogies between JustInTimeProgramming and JustInTimeCompiling. In both cases, by deferring work as late as possible you maximize the information available to do the work, which means you can do a better job. In both you avoid doing work that turns out not to be necessary. In both you minimize the amount of produced code that has to be warehoused.
For me at least, this phrase is more precise, more positive, less confrontational. It emphasises getting it done on time rather than not doing it. -- DaveHarris
The two are related, and in ExtremeProgramming we use them both. JIT affirms certain positive behaviors. YAGNI prohibits certain negative behaviors. Some times call for "good dog", other times call for whacking the programmer on the nose with a rolled-up newspaper. --RonJeffries