Thursday, July 18, 2013

Agile Development leads to Alzheimer's

Iterative development and design helps you feel your way towards understanding what the customer really needs: trying out new ideas, evaluating designs, experimenting, responding to feedback and reacting to changing circumstances. Everything gets better as you learn more about the domain, the customer, and the language and technologies that you are using. This is important early in development, and just as important later, as the product matures and moves into maintenance, where you are constantly tuning and fixing things and dealing with exceptions.

But there are downsides as well. Iterative development erodes code structure and quality. Michael Feathers, who has been studying different code bases over time, has found that changes made iteratively to code tend to bias more towards the existing structure of the code, that developers make more compromises working this way. Code modules that are changed often get bigger, fatter and harder to understand.

Working iteratively you will end up going over the same ground, constantly revisiting earlier decisions and designs, changing the same code over and over. You’re making progress – if change is progress – but it’s not linear and it’s not clean. You’re not proceeding in a clear direction to a “right answer” because there isn’t always a right answer. Sometimes you go backwards or in circles, trying out variants of old ideas and then rejecting them again, or just wandering around in a problem space trying stuff until something sticks. And then somebody new comes in who doesn’t understand or doesn’t like the design, tries something else, and leaves it for the next guy to pick up. Changes in design, false starts, dead ends and flip flops leave behind traces in the code. Even with constant and disciplined refactoring, the design won’t be as clean or as simple as it would be if you “got it right the first time”.

It doesn’t just wear down the code, it wears down the team too

Iterative development also has an erosive effect on an organization’s memory – on everyone’s understanding of the design and how the system works. For people who have been through too many changes in direction, shifts in priorities and backtracking, it’s difficult to remember what changed and when, what was decided and why, what design options were considered and why they were rejected before, what exceptions and edge cases came up that needed to be solved later, and what you need to know when you’re trying to troubleshoot a problem, fix a bug or make another design change.

Over the course of the last 6 or more years we've changed some ideas and some parts of the code a dozen times, or even dozens of times, sometimes in small, subtle but important ways, and sometimes in fundamental ways. Names stay the same, but they don’t mean what they used to.

The accumulation of all of these decisions and changes in design and direction muddies things. Most people can keep track of the main stories, the well-used main paths through the system. But it’s easy for smart people who know the design and code well to lose track of the details, the exception paths and dependencies, the not-always-logical things that were done for one important customer 25 or 50 or 110 releases ago, just because. It gets even more confusing when changes are rolled out incrementally, or turned on and off in A/B testing, so that the system behaves differently for different customers at different times.
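This kind of per-customer divergence is easy to picture in code. A minimal, hypothetical sketch (the flag name, customer IDs and discount rule are all invented for illustration) of how behaviour forks under a staged rollout:

```python
# Hypothetical per-customer rollout flags; names and values are invented.
ROLLOUT = {"new_pricing": {"acme", "globex"}}  # customers in the A/B group

def price(customer_id: str, base: float) -> float:
    """Pricing that behaves differently depending on rollout membership."""
    if customer_id in ROLLOUT["new_pricing"]:
        return round(base * 0.95, 2)   # new discounted path, still in rollout
    return round(base, 2)              # legacy path everyone else still sees

print(price("acme", 100.0))     # in the rollout group → 95.0
print(price("initech", 100.0))  # not in the group    → 100.0
```

Six months and a dozen flags later, "how does pricing work?" has no single answer – it depends on which customer, and when, which is exactly the memory burden described above.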

People forget or misremember things, and make wrong assumptions. It’s hard to troubleshoot the system, to understand when a problem was introduced and why, especially when you need to go back and recreate a problem that happened in the past. Or when you’re doing trend analysis and trying to understand why user behaviour changed over time – how exactly did the system work then? Testers miss bugs because they aren't clear about the impact of a change, and people report bugs – and sometimes even fix bugs – that aren't bugs at all; they've just forgotten what is supposed to happen in a specific case.

When making changes iteratively and incrementally, people focus mostly on the change that they are working on now, and they forget or don’t bother to consider the changes that have already been made. A developer thinks they know how things work because they’ve worked on this code before, but they forget or don’t know about an exception that was added in the past. A tester understands what needs to be tested based on what has just been changed, but can’t keep track of all of the compatibility and regression details that also need to be checked.

You end up depending a lot on your regression test suite to capture the correct understanding of how the system really works including the edge cases, and to catch oversights and regression mistakes when somebody makes a fix or a change. But this means that you have to depend on the people who wrote and maintained the tests and their understanding and their memory of how things work and what impact each change has had.
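A regression suite only preserves this knowledge if the tests say why an edge case exists. A minimal sketch of what that looks like – the shipping function, the free-shipping rule and the customer history are all invented for illustration:

```python
def shipping_cost(weight_kg: float, express: bool) -> float:
    """Hypothetical shipping calculation carrying an inherited edge case."""
    cost = 5.0 + 1.5 * weight_kg
    if express:
        cost *= 2
    # Edge case added many releases ago: orders under 1 kg ship free.
    # Without the tests below, a later "cleanup" could silently drop it.
    if weight_kg < 1.0:
        return 0.0
    return round(cost, 2)

def test_light_orders_ship_free():
    # Documents the old special case, not just the happy path.
    assert shipping_cost(0.5, express=True) == 0.0

def test_standard_order():
    # 5.0 base + 1.5 * 2 kg = 8.0
    assert shipping_cost(2.0, express=False) == 8.0

test_light_orders_ship_free()
test_standard_order()
```

The test is the durable record; but as the post notes, it is only as good as the understanding of whoever wrote and maintained it.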

Iterative development comes with costs

It’s not just the constant pace, the feeling of being always-on, always facing a deadline that wears people down over time. It’s also the speed of change, the constant accumulation of small decisions, and reversing or altering those decisions over and over that wears down people’s understanding, that wears down the mental model that everyone holds of how the system works and how the details tie together. All of this affects people’s accuracy and efficiency, and their confidence.

I am not sure that there is a way to avoid this. Systems, teams, people all age, and like in real life, it’s natural that people will forget things. The more changes that you make, the more chances there are for you to forget something.

Writing things down isn't much of a help here. The details can all be found somewhere if you look: in revision history, in documentation, in the test suite and in the code. The problem is more with how people think the system works than it is with how the system actually works; with how much change people can keep up with, can hold in their heads, and how this affects the way they think and the way they work.

When you see people losing track of things, getting confused or making mistakes, you need to slow down, review and reset. Make sure that before people try to fix or change something they have a solid understanding of the larger design – that they are not just focusing on the specific problem they are trying to solve. Two heads are better than one in these cases. Pair people up: especially developers and testers together, to make sure that they have a consistent understanding of what a change involves. Design and code reviews too, to make sure that you’re not relying too much on one person’s memory and understanding. Just like in real life, as we get older, we need to lean on each other a bit more.


Anonymous said...

I agree with all the points you made, but I disagree with the conclusion that "iterative development comes with costs" or "agile development leads to Alzheimer's".

I mean, even if you do start the project saying "this is not going to change so I will do a clean design", you end up modifying the code anyway, and you fall into the exact same traps!

So you'd better say, IMHO, that "development leads to Alzheimer's", not particularly agile development.

To avoid these traps, there is IMHO one possibility that is VERY VERY difficult to do right: you need to have lots of small, separate applications communicating with each other, so that you can ditch an entire application if necessary.

This is very difficult because at one point or another you may start to depend on another app, and then you have the same problems – but I think it can mitigate the risks for a lot longer.

Marko said...

Not enough code refactoring leads to problems, not agile development.

People will forget things one way or the other. One should not depend on having developers who remember how something is implemented, but on ANY developer being able to look it up quickly and without difficulty.

Jim Bird said...

Any kind of change is erosive. Iterative development, because it accelerates change, accelerates erosion. Refactoring mitigates this, if it's done well, but it doesn't solve it completely.

I agree that people can always go back to the code - the only real source of truth. But people won't do this every time, especially people who have been working on the same code for a while - they already know what it does, or think they do. And this is where problems come in.

Paddy3118 said...

You might want to inject some simulated annealing into your iterative flow. Every so many iterations you need to "turn down the temperature" and "crystallize": take a good look at your system to inject some focus by removing outliers of functionality, and take the time to converse with your customers (who might only be your marketing guys). Explain that the general "cost" is escalating and that certain trimmings are needed, projected to cost Y, which must either be passed on to customers or taken out of your profits.

Play the game so that others know that features don't come for free, and that some features, by your cost-benefit calcs, need culling.

Take your gains by doing more than just dropping code: the features you select for culling should be chosen to let you add structure to your codebase.

At other iterations you would be much more open to added features, but evaluated and implemented in the light of the improved structure of the codebase – the equivalent of raising the temperature in your simulated annealing.

tvk said...

What about writing good commit comments, so that everybody can check in the IDE what's in the code and for what reason? (Eclipse: annotation view, version control history.) It's best to mention the bug or story ID in the commit comment.

What about writing proper unit tests, which are a kind of live documentation, prevent breaking old features, and also help keep the code clean and maintainable?

What about using code analysis tools, which give a clear picture of technical debt and its trends?

What about considering the O and D of the SOLID principles (Open/Closed and Dependency Inversion), which aren't so popular but greatly help in writing modular code? Modularity is breaking big complexity down into small pieces. We badly need it.

What about boyscouting – always making a small cleanup in the code you are working on?

Iterative development by itself is not agile. Think about the fourth value of the Agile Manifesto: 'Responding to change over following a plan.' This says you have to react if technical debt grows too large. If you don't, you are not doing agile development.

Jim Bird said...


All of your points are valid. Iterative development is not necessarily agile - but agile development is iterative, and it is the impact of sustained, continuous iterative change that I am concerned about. Teams can and should do all of the things that you recommend - they will all help with managing technical debt. But they won't make the impact of constant change go away. It's unavoidable.

Anonymous said...

The whole point of agile is that in the beginning the people involved do not completely know what the problem to be solved is, or what the best way to solve it is.

This uncertainty implies that small iterations are a good idea. And they are.

Of course this refactoring (rewriting) comes with a cost. But it is a perfectly valid approach if the cost is significantly smaller than that of the alternative: doing comprehensive up-front design and modeling (known as the waterfall model).

It feels like you are bashing agile with unreasonable arguments. Agile, in its various flavors, is a trade-off and a very good approach when circumstances warrant it. In my experience the cost is not significant, even if, in hindsight, the circumstances didn't really require an agile approach.

When you go in from the beginning with the idea that many portions of the codebase will be rewritten, it is logical to choose technologies and languages which make this as easy as possible. The agile posse has done very extensive research into minimizing the cost and risks of this refactoring.

Jim Bird said...


My post isn't meant as a criticism, but as an observation. There is a lot of value in iterative, incremental development. But over an extended time it wears down the code (even if you refactor - see Michael Feathers' recent observations on refactoring, which shows that changes over time are biased towards the existing code structure, amplifying the impact of early decisions, whether good or bad) and it wears down the team's memory of what decisions were made when and why.

Erosion and entropy are inevitable if you're changing the same code a hundred or more times over several years, and you're releasing changes constantly.

Unknown said...

I have to ask, what is the alternative? Ask the customer for 4 months to rewrite the software, or continue to add features as needed, which is what agile offers. What is the suggested alternative to agile? -WH
