ExtremeProgramming emphasizes testing, communication, and several other practices that speed development and make it more reliable. Catalysis emphasizes rigorous specification to aid the development of reusable components. ExtremeProgramming does not specifically aim at reusability. However, my observation is that UnitTests can act as specifications of external behavior. Their focus is a bit different from the practices in DesignByContract. If we follow this line of reasoning, we could look at ExtremeReuse practices.
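To make that observation concrete, here is a minimal sketch, assuming JUnit 4 and a hypothetical Stack class: each test names one piece of externally visible behavior, so the suite reads as a partial specification of the component rather than as a check bolted on afterwards.

  import static org.junit.Assert.*;
  import org.junit.Test;

  // Each test method states one clause of the external behavior;
  // together they read as a (partial) specification of Stack.
  public class StackSpecTest {
      @Test
      public void newStackIsEmpty() {
          assertTrue(new Stack().isEmpty());
      }

      @Test
      public void pushThenPopReturnsWhatWasPushed() {
          Stack s = new Stack();
          s.push("x");
          assertEquals("x", s.pop());
      }

      @Test(expected = IllegalStateException.class)
      public void popOnEmptyStackIsAnError() {
          new Stack().pop();
      }
  }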
ExtremeCatalysis is a very tongue-in-cheek phrase. If ExtremeProgramming and Catalysis could be unified, we'd have a GrandUnifiedMethodology?. That'd be something to chew on. Regardless, it is interesting to see the same thing approached from opposite sides.
Surely the major issue here is the expense of refactoring all those CatalysisMethodology diagrams/models. What happens when you refactor? How do you cope with change? -- DafyddRees
Specifications stabilize far faster than code, and don't need as much refactoring because they are simpler. Also, specifications, unlike code, can be described using a series of refinements starting from the very abstract and adding detail. Your most abstract specification can describe the big picture; successive refinements split areas of concerns and add more detail. The split between levels acts like an encapsulation barrier. Changes near the bottom need not filter to the top. On the occasions that they do, they indicate serious mistakes.
Appropriate programming style helps capture "refinement" in code, but not with the ease of doing refinements in a specification. See also AllThoseDiagrams.
That's fair enough, perhaps I shouldn't have said "all" those diagrams ;-) --DafyddRees
Refactoring specifications is exactly as easy as refactoring code if you never compile or test the code. If you do compile or test, you have to expend a bit more effort with code, but you've also learned something that you could never learn from a specification: whether your thinking was faulty or not.
True enough, but remember that specs can be more concise, because you can totally disregard implementation. That said, you can "compile" your specs to varying degrees - from simple typechecking of OCL (Object Constraint Language), to programming them more or less directly in functional and logic languages. It is even cooler if your language includes refinement as a first-class structure (I can't recollect an example just yet). -- AamodSane
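As a rough illustration of "compiling" a spec fragment: an OCL-style invariant such as context Account inv: balance >= overdraftLimit can be restated as a runtime check. The Java sketch below is illustrative only; Account and its fields are hypothetical, and a real OCL tool would typecheck the constraint against a model rather than execute it inline.

  // Illustrative only: an OCL-style invariant,
  //   context Account inv: balance >= overdraftLimit
  // restated as an executable check. Account is hypothetical.
  public class Account {
      private int balance = 0;
      private final int overdraftLimit = -100;

      public void withdraw(int amount) {
          balance -= amount;
          checkInvariant(); // crude stand-in for evaluating the spec
      }

      private void checkInvariant() {
          if (balance < overdraftLimit) {
              throw new IllegalStateException(
                  "invariant violated: balance >= overdraftLimit");
          }
      }
  }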
Specifications, unlike code, can be described using a series of refinements starting from the very abstract and adding detail.
Please explain in what ways code cannot be described using a series of refinements, starting from the very simple/abstract and adding detail. --RonJeffries
See SpecVsCode -- AamodSane
Of course code can be described in that way. The problem is that we typically only keep the lowest level of abstraction as our code base. Sometimes we attempt to preserve the abstractions artificially using "high level" classes or what-have-you (that then invoke the lower-level stuff), but mainstream programming environments just don't provide facilities that allow you to keep all the historic levels of abstraction, so that you can squint your eyes and see the big picture. This is usually the role of "design documentation". And we know what happens to that... -- JohnDaniels
As I see it, appropriate tools are the main prerequisite to make ExtremeCatalysis work. One of those would be a programming language that truly allows code to be used as the canonical representation of the design. Such a language would need the means to express design elements such as classes, associations, and collaborations directly; a flexible and powerful macro system seems to be the key (my proposed candidate for a language that may live up to this is DylanLanguage). Then there is a need for tools that work on the code at higher levels of abstraction: say, point one end of an association at a different class, or change its cardinality. -- MichaelSchuerig
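Short of a Dylan-style macro system, one can at least approximate this in a mainstream language by reifying the design element. A hedged Java sketch, with all names illustrative: the association becomes an explicit object rather than a raw field, so its target end and cardinality live in one place where a tool could inspect or change them.

  // Illustrative sketch: an association reified as an object, so its
  // cardinality (and, in a fuller version, its ends) can be inspected
  // and changed in one place. All names are hypothetical.
  import java.util.ArrayList;
  import java.util.List;

  class Association<T> {
      private final List<T> targets = new ArrayList<T>();
      private final int maxCardinality;

      Association(int maxCardinality) {
          this.maxCardinality = maxCardinality;
      }

      void link(T target) {
          if (targets.size() >= maxCardinality) {
              throw new IllegalStateException("cardinality exceeded");
          }
          targets.add(target);
      }

      List<T> targets() {
          return targets;
      }
  }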
I'm generally disappointed in the way a lot of folks use UML --- just to draw pictures of the code. I'm very unkeen on that --- I think they're missing the point. Tools that translate straight between pictures and program code are IMHO of limited usefulness for large practical systems. If your code is a mess, converting it to pictures won't make it any better; and conversely, UML is an inadequate notation for programming.
If UML is good for anything, it's for presenting partial views of a program.
Why would I want to write something other than the final code? (1) Because I want to specify behavior before I start coding. (2) Because I want to describe the behavior or protocols common to many components or objects, separately from their various implementations --- for example, integrating enterprise applications; or defining plug-substitutable components; or just doing some neat polymorphic programming. (3) Because I want to discuss an implementation at a high level for the benefit of maintainers. (4) Within a pattern, I want to describe a general template that will be realised in a design.
Tests are useful models for the behavioral abstractions, (1) and (2). The others need something else --- UML collaboration diagrams for example.
But I still would like to use UML+OCL for behavioral models. (Let's be clear, modeling != pictures. There are pictorial programming languages, and textual modeling languages. Pictures are useful provided you have a clear meaning for them. UML could currently do with some improvement in that area, though we have a fairly specific interpretation of it in Catalysis, and I'm contributing to efforts to improve the UML semantics.)
Why use anything other than a programming language? Mostly because tests can be very long relative to an equivalent piece of notation purpose-built for modeling (which OCL+UML could sorta be seen as, if you squint your eyes and look sideways). They can therefore be difficult to work with rapidly while discussing the design. Specifics:
And we do have ways of exercising UML models (using 'snapshots'), though only on paper. The objective of incremental development is to cycle with the customer very rapidly: when we're gathering the initial requirements, the Catalysis analyst can raise questions about inconsistencies and holes in the customer's expressed requirements long before the pure XPer can. However, XP overtakes once the high-level model is sorted: there is no point in doing fine-detail models, or tedious long models that write postconditions for get and set operations.
-- AlanCameronWills

I must agree with Alan and John Daniels. Coding IS modeling. Anything that can be expressed in a programming language like Java or C++ can also be represented in a visual notation like UML. The difference between VISUAL modeling and coding hinges on the amount of information a model can convey in one go. A class diagram can neatly sum up the structure of a large chunk of program code (thousands of lines) on a single page. Visual modeling helps us cope with complexity. It enables us to separate out different aspects of the complete model and view it from all sorts of interesting and helpful angles. All sorts of design flaws are easier to spot if you use the right notation. This is why mathematicians invented graphs. Would you be happy buying a new car if you knew the design was never modeled visually?
There is another aspect to this debate that hasn't been touched upon, which is modeling concepts that have nothing to do with code. In ExtremeProgramming an understanding of the problem domain comes through the use of shared metaphors. The idea is that a system metaphor is chosen that everybody understands. Of course, in practice it's very rare that two people will have exactly the same understanding of a problem, so XP relies chiefly on large amounts of communication and feedback to steer the team's understanding towards the correct interpretation. Using a precise modeling notation at this point could dramatically reduce the amount of communication required.
A clear and precise understanding of the requirements is a major factor in meeting them - and thus in project success. Using test cases as specifications has its drawbacks, not least of which is that a complete specification would often require an infinite number of test cases. Specification by example works more than 90% of the time because the interpreted meaning of a test case is usually the right one. Having said that (here I go again), would you be happy buying a new car if you knew that the people who built it only interpreted the specification right 90% of the time? Ambiguous requirements are much harder to test, of course - without a clear specification, how do the XP-ers choose their test cases? Arguably they carry around a precise behavioural spec in their heads. How nice it would be if they just took the time to write it down for others to see...
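To illustrate the "infinite test cases" point: an example-based test pins down a single input, while a written spec clause quantifies over all of them. The sketch below (a hypothetical abs function; run with java -ea to enable assertions) can only ever sample the input space, however many cases are added.

  // Sketch: example-based tests versus a written property.
  // The loop encodes the spec clause "result >= 0 and
  // (result == x or result == -x)", but can only sample the
  // infinite input space. Run with: java -ea AbsSpec
  public class AbsSpec {
      static int abs(int x) { return x < 0 ? -x : x; } // hypothetical unit under test

      public static void main(String[] args) {
          // Example-based: one case, one interpretation.
          assert abs(-3) == 3;

          // Property-style sampling of the written spec.
          int[] samples = {0, 1, -1, 42, -42, 1000000};
          for (int x : samples) {
              int r = abs(x);
              assert r >= 0 && (r == x || r == -x);
          }
      }
  }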
XP and Catalysis are by no means mutually exclusive, as an XP-like process (test-first, constant communication and feedback, refactoring, the planning game, go easy on documentation etc etc... all good ideas in practice) can only be enhanced by a more precise and rigorous approach to specification and refinement. Long live ExtremeCatalysis!
-- JasonGorman
It is often tempting to think about systems development as the creation of software systems that solve specific computational problems. As Jason points out, this is often not the case.
Large business systems are usually built from a combination of people, processes (formal and informal), hardware and software. Software, like the others, must support the ultimate goals of the business. Changes to a business, such as the development of a software system, generally have many stakeholders with differing (often conflicting) views and needs.
Reconciling and refining these views and then providing a structure for the resulting problem is a complex process. To do this in a way that is open and traceable (and therefore resilient to change) is even more so.
This process has little to do with software and yet is a vital part of the development process: Unless we understand the problem, we can't develop a solution. Unless the problem has a robust structure, we can't break it up into manageable pieces and expect the solution to survive future change.
Testing is also a vital part of the development process because it provides validation that the system conforms to our understanding of what a system should do. But how do we validate our understanding? Non-technical stakeholders are unlikely to validate test code directly; and so the gap must be bridged by formal or informal communication. The problem with the latter, however, is that it is imprecise and untraceable. This is the real world - people forget things, misinterpret things and twist things. For all the benefits of informal communication, it is not (by itself) a robust way of ensuring that the system does what everybody is expecting, or what is best for the business.
There is also a big risk that developers under pressure will choose to test aspects of the system that are easy to test, missing those that are obscure or difficult to test. Ultimately, testing can only prove one thing - that a system does not behave correctly.
This said, proving the behaviour of even relatively simple systems is horrifically difficult and tends to completely stifle creativity. But then software engineering is all about making informed trade-offs, and it is our responsibility to choose the appropriate tools and techniques for each job that we do. Nobody would argue that a methodology for designing web sites is automatically suited to the development of nuclear power station control systems. The most general advice that I can offer is to evaluate the methodology and approach, on a per-project basis, as thoroughly as any other tool that you would use.
-- ColinCassidy
I've noticed over the last two years a distinct clash of cultures between die-hard XP-ers and folk from a Formal Methods background, and I think that is all it is - a culture thing. My own issue with XP is that, in practice, XP teams seem to be entirely focussed on code and almost oblivious to "higher problems" like the business goals of a project or the context in which applications will be used (usability). ExtremeProgramming doesn't prescribe this, but it does seem to attract those kinds of developers.
With hindsight I find that TestDrivenDevelopment needs to be extended to include higher-level tests - beyond system-level and into business-level. This essentially means developing systems (as opposed to software) to specifically meet testable business goals and ensuring those goals are met (goals being simply assertions about what should be true about the business - e.g., "the sales hotline should be answered within 3 rings"). UML, despite its many weaknesses, can be utilised to build models that allow us to capture business goals as assertions about the business model (e.g., "customer.credit <= customer.creditLimit"). In many cases software is just a red herring!
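For instance, the goal above can be written down as an executable business-level assertion. A minimal sketch, with all class and field names hypothetical, checking the business model rather than any particular piece of software:

  // Minimal sketch: the business goal
  //   customer.credit <= customer.creditLimit
  // as an assertion over a (hypothetical) business model.
  class Customer {
      int credit;
      int creditLimit;
  }

  class BusinessGoals {
      static void assertCreditWithinLimit(Iterable<Customer> customers) {
          for (Customer c : customers) {
              if (c.credit > c.creditLimit) {
                  throw new AssertionError("business goal violated");
              }
          }
      }
  }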
This, of course, goes down like a tonne of bricks in planning meetings or retrospectives. It gets little or no resistance from the Formal Methods people, though... :-)
-- JasonGorman
This page mirrored in ExtremeProgrammingRoadmap as of April 29, 2006