Monday, June 11, 2007

I want a revolution (every twenty years)

When I was in school, I was in the midst of a revolution.

Software was engineering, and they were going to do it right. Doing software "right" had meant the Top Down Software Process model, and it was Dogma. The revolutionaries wanted the Waterfall Process model.

The very first process model was the Heroic model. The "heroes" were math PhDs and electrical engineers brainstorming how to make computers work for the first time. Computers came out of the effort to win World War II. They were labor-saving tools for mathematicians working on trigonometric artillery tables. Later, computer technology began to incorporate control theory from aviation.

AT&T, the heavily regulated telephone monopoly, used computers to compress even more customer calls into its widely installed base of wires and call centers, driving down its costs and increasing profits without having to get an Act of Congress.

But the Heroic process model did not scale to larger problems.

By the 1960's, computers were a big business. Software to control the computers was being written to solve large, complex problems. A lot of money, equipment, and even lives were at stake. There were spectacular project failures. Rockets flew off course. Phone calls were noisy or not there at all. Software was delivered late, buggy, or not at all.

A new generation of wise men of computers devised a revolution. They replaced the Heroic process model with the Top Down Process Model:

1) gather requirements from customers

2) decompose the problem into functionality to meet the requirements

3) design modular systems to meet the functionality

4) implement the design

5) test

6) document

7) maintain

The Top Down Process scaled to larger problems than the Heroic model of just putting a bunch of smart guys in a room and hoping for the best.

As time went on, the space race drove software to new heights with the need to solve complex problems in orbital math, control, and communications. Also, computers started to be used in commerce in a very big way. Banks and stock markets realized that computers could record large volumes of transactions quickly, cheaply, and accurately.

The apex of the Top Down model was CASE (Computer Aided Software Engineering) in the early 1980’s. The idea was that you could have a design that was so good, so clear, and so coherent that you could capture it all in “The Tool,” push a button, and a working system would pop out the back end. This idea was very popular with large software customers like the Defense Department, which had spent hundreds of millions of dollars on systems that, in the end, didn’t really do everything that the DoD wanted.

Twenty years had passed. There were spectacular failures. Banks lost track of money. Radar systems would not talk to the alert system, which would not talk to the missile systems. Software was delivered late, buggy, or not at all.

In the 1980’s, a new generation of wise men of computers devised a revolution, the Waterfall Process model:

1) gather requirements from customers

2) decompose the problem into functionality to meet the requirements

3) design modular systems to meet the functionality

4) implement the design

5) test

6) iterate: if not done, goto 1

7) document

8) maintain

Step 6, planned iteration, was a big deal. They were dealing with the scaling problem by admitting that problems are large and complex, and that you could not just be "smart enough" to design it right on the first pass. Plan to iterate. Design, build, test, repeat.

RAD (Rapid Application Development) was the vanguard of this model. If you did your software in rapid iterations, you would arrive at a working solution in an economically feasible amount of time.

Huge and important systems were developed. A personal computer with word processing, spreadsheets, and email went onto the desk of nearly every office worker.

The internet with email and the web revolutionized commerce and communication. Google organized and indexed the internet and effectively made everyone smarter.

Twenty years passed. The problems scaled larger. There were spectacular failures. The internet is full of security threats from crooks. The FBI could not get its case files into the hands of its agents and other law enforcement. The Vista operating system was delivered years late and with many of its original features removed. Software was delivered late, buggy, or not at all.

In the 2000’s, another generation of wise men of computers had a revolution, Agile Development:

1) gather requirements from customers

2) refactor the problem into functionality to meet the requirements

3) write automatic tests

4) implement the design

5) iterate: if not done, goto 1

6) document

7) maintain

The software design phase was compressed even further. Testing moved from step 5 to step 3. The iteration cycles are even shorter than in RAD.

Good software is being written this way. Automatic testing is a very powerful tool for writing modular code with good interfaces. It allows our projects to be robust in the face of change.
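As a rough sketch of what that looks like in practice (the parse_price function and its tests here are hypothetical, invented purely for illustration), the Agile habit is to write the tests first, let them pin down the interface, and then write the code to make them pass:

    import unittest

    # Hypothetical example: in the test-first style, the two tests
    # below are written before parse_price and define its interface;
    # the function is then written to make them pass.
    def parse_price(text):
        """Parse a price string like "$1,234.56" into a float."""
        return float(text.replace("$", "").replace(",", ""))

    class TestParsePrice(unittest.TestCase):
        def test_plain_number(self):
            self.assertEqual(parse_price("42"), 42.0)

        def test_dollar_sign_and_commas(self):
            self.assertEqual(parse_price("$1,234.56"), 1234.56)

    if __name__ == "__main__":
        unittest.main()

Once tests like these run automatically on every build, any change that breaks the interface fails loudly and immediately, which is exactly what makes a project robust in the face of change.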

So every twenty years, a new generation of wise men has advanced the state of the art with a revolution:

Heroes of computers in the 1940’s

Top Down model in the 1960’s

Waterfall in the 1980’s

Agile in the 2000's

Each revolution had to live with and accommodate the old ways of doing things, because those ways still worked for a given class and size of problem.

But the problems will continue to grow in size and complexity. Agile programming makes testing a core principle, and that is good.

However, I think that in fifteen to twenty years, a new generation of wise men of computing will stage a revolution. A new process model will replace Agile as we get to the limits of what it can do. Unit testing code, even with 100% code coverage, does not mean you test 100%, 50%, or even 10% of the states and interactions of the code. Multi-threading and distributed computing will make the systems larger and more diffuse, and the complexity of the interactions will dominate the complexity of a single component. We want to be able to change parts of large distributed systems while they are still running - how do you test that completely? Software will get delivered late, buggy, or not at all.
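To see why coverage and state testing are not the same thing, here is a small hypothetical Python example. The two tests below execute every line and every branch of settings_flags, so a coverage tool reports 100%, yet they never visit the one input combination where the bug shows:

    # Hypothetical illustration: 100% line and branch coverage,
    # but an untested combination of states hides a bug.
    def settings_flags(logging_on, debug_on):
        """Pack two configuration switches into one flags word."""
        flags = 0
        if logging_on:
            flags |= 1
        if debug_on:
            flags = 2  # Bug: should be flags |= 2; clobbers the logging bit.
        return flags

    def test_logging_only():
        assert settings_flags(True, False) == 1  # passes

    def test_debug_only():
        assert settings_flags(False, True) == 2  # passes

    test_logging_only()
    test_debug_only()
    # Every line and branch has now executed, yet the combination
    # (True, True), which returns 2 instead of the correct 3, was
    # never tested.

Two switches have four states; ten have 1,024. Coverage tools count lines, not states, and multi-threaded, distributed systems multiply the state space far faster than any test suite can follow.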

Around 2020 or so, a new generation of wise men of computing will have their revolution. The problems will have scaled, and we will have to do something new.

Viva La Revolution!
