Death March author Ed Yourdon admits he was wrong
The man who made his fortune on software methodologies now says they are irrelevant
Edward Yourdon invented much of modern software engineering in the 1970s. He has produced a steady stream of books for the past 20-plus years, many of them reporting on the latest trends in the field. In his most recent, Death March, he renounces many of the formalisms he created two decades ago, saying that they are irrelevant in today's Internet-speed world. We trace Yourdon's about-face through the 1990s, examining Decline & Fall of the American Programmer (1992), Rise & Resurrection of the American Programmer (1996), and finally his most recent offering, Death March: The Complete Software Developer's Guide to Surviving 'Mission Impossible' Projects. (3,400 words)
After a long and distinguished career as a software development methodologist, Yourdon's recent books have shown an almost complete change of heart. The book he wrote a year ago, Rise & Resurrection of the American Programmer, effectively said, "Maybe this stuff is not for everybody." His new book, Death March: The Complete Software Developer's Guide to Surviving 'Mission Impossible' Projects, goes further by saying that in today's Internet-speed world, much of it is not for anybody.
Ed Yourdon put himself on the map in the mid-1970s with a set of formalisms for software development collectively called structured methods. His work of that period is represented by the classic, oft-revised Structured Design: Fundamentals of a Discipline of Computer Program and Systems Design. At that time, the first wave of large commercial systems had been designed and implemented. Programmers had languages like FORTRAN and COBOL, but there were few tools for managing large projects like airline reservation systems, banking back offices, or space flights. At best, ad-hoc management techniques were used.
Structured methods were the result of Yourdon's insight that large systems could be described in standard ways that are more abstract than what's possible in FORTRAN or COBOL code. His work was one of the most successful attempts to provide some structure and sanity to large system development -- projects that involve tens or even hundreds of engineers. It included an elaborate graphical notation scheme for depicting elements of systems, which was sort of like a visual programming language. But it was not executable; it was simply a way of notating components of systems and their interrelationships so that developers and management could see the big picture. Using structured methods, you could describe systems from the top level down to a level at which relatively small pieces could then be programmed, in whatever language made sense, with confidence that they would all link together into a working whole.
At first, you could only use structured methods' notation by hand; there were no tools available to generate code from structure charts or dataflow diagrams. The tools, called CASE tools (for Computer Aided Software Engineering), came slowly. At first, they allowed programmers to draw the pictures on a screen and add documentation. The tools would make the drawings look nice, annotate them, sort, search, print, etc., but still not generate code. They were expensive and required weeks of training, but the promised benefits were clear: increased software reliability, ease of project management, predictability of software maintenance, and so on.
CASE tools that actually generated code came out in the mid to late 1980s. But by then, much of the world had moved on beyond FORTRAN and COBOL. Other computer scientists, like Alan Kay at Xerox PARC and Barbara Liskov at MIT, designed programming languages in the mid-1970s (Smalltalk and CLU, respectively) with certain abstraction techniques built into them, in contrast to Yourdon's language-independent notation. Smalltalk and CLU embodied what became known as object-oriented design principles, which by the late 1980s were supplanting the older structured methodologies. Object-orientation became very popular through its incorporation into languages like C++ and development environments like NeXTStep.
Because of the lack of good tools to support structured methods (and similar methodologies) and the growth in popularity of object-oriented programming, traditional CASE tools never really caught on beyond large software development companies, defense contractors, and industrial giants like AT&T -- the places where they added the most value. So Yourdon teamed up with Peter Coad and adapted his principles to object-oriented design. This didn't quite work either: other object-oriented design methodologies, such as those of Grady Booch, James Rumbaugh, and Ivar Jacobson -- which were designed from scratch rather than as an outgrowth of older methodologies -- proved more popular. Coad/Yourdon and other object-oriented methodologies were dismissed by some as old-timers' attempts to get with the program (sorry).
Object-oriented CASE tools were developed that -- finally -- not only allowed programmers to draw pretty pictures but also generated code. The OO/CASE market has consolidated to its current state, where there is a single dominant vendor -- Grady Booch's Rational Inc. -- which hired Rumbaugh and Jacobson (and acquired Jacobson's company, Objectory). And the bottom dropped out of the market for diagram-only CASE tools (known as "upper CASE") in the early 1990s: the software vendor ShapeWare introduced a Booch diagram add-on template for its excellent, low-cost Visio graphics package for Windows -- resulting in a reasonable upper CASE tool for something like $200.
In other words, CASE tools and structured methodologies were a nice idea that never quite achieved the mass acceptance that folks like Ed Yourdon hoped for. Two important factions actively eschewed structured methods: small development groups and major packaged software houses like Microsoft. Yourdon became known as one of the most important founders of modern software engineering principles, but his ideas were more admired in theory than put into actual practice.
All of this must have depressed Yourdon during the late 1980s and early 1990s. In 1992, he wrote a book with the provocative title Decline & Fall of the American Programmer. A cynical view of that book's alarmist thesis goes something like this:
Let's start with the whole idea of programming as a profession. Programming has never been allowed to become a profession like law, medicine, or architecture -- at least not in America. If you believe Ed Yourdon's earlier process-obsessed view of the world, then somehow, during the history of the computer, programmers metamorphosed from "genius rocket scientists" to Dilbert-like zhlubs who write COBOL code in large banks: mechanical craftsmen who are more analogous to bricklayers than to architects. American industry, as a whole, still hasn't quite figured out that good programmers are skilled individual contributors -- like trial lawyers, surgeons, or financial analysts -- and should be treated as such, not as peons who get Peter-principled into management. Industry applied the command-and-control management philosophy, which was in vogue in the 1960s and 70s, to software development; structured and similar methodologies were natural outgrowths of such mentality. Only recently have a few enlightened software companies begun to figure out how to manage the so-called "cowboy" programmers who disdain bureaucracy but get most of the work done, and get it done quickly.
Another of Yourdon's invalid assumptions is that it's not necessary for programmers to have or use good communication skills. In other words, at the most basic level, it's perfectly OK for Filipinos who don't speak English to create software (and write documentation) for Americans, and to do it by communicating over long-distance telephone and e-mail. Yourdon admits in Rise & Resurrection that this assumption was wrong. But it did not become wrong. It was always wrong.
When good enough is good enough
The book Rise & Resurrection of the American Programmer, which was published last year, is Yourdon's attempt to explain the explosive growth and continued global domination of the American software industry in light of his previous book. To his immense credit, he does so by examining the statements he made in Decline & Fall and seeing how they held up four years later.
Rise & Resurrection viewed software engineering through the lens of such late 1980s and 90s management themes as decentralization, small-team empowerment, and lean-and-mean operation. Large corporations have been reducing the role of corporate management and giving more operating autonomy to smaller divisions, on the rationale that local empowerment must go hand-in-hand with local accountability and local performance. Analogously, Yourdon notices that the most productive software teams are those that are given the freedom to choose their own tools, methodologies, and working styles. (A cynical version of this philosophy is "If you can do it the way you want, then you have fewer excuses for failing.")
Yourdon's view is reactive, in that it reports on what has worked for existing organizations rather than prescribing how software should be developed. He stresses notions like "personal best practices," "dynamic processes" (i.e., those that can change during a project), and the need to accommodate superstar "cowboy" programmers. All of this runs exactly opposite to his former command-and-control view of the world.
The most startling new concept in Rise & Resurrection is that of "good enough software." This essentially means removing the time-honored stricture that all software must be bug-free before it is released; instead, software quality becomes a third variable added to the traditional development time vs. functionality tradeoff. Yourdon observes that, through highly-competitive free market forces, users have essentially been deciding how much quality they want. It may not be necessary to have a zero-defect word processor, even though no one doubts the importance of bug-free air traffic control systems and nuclear power plants. For most types of packaged PC software, users want OK quality now rather than bulletproofing later.
This all makes sense, but it flies in the face of some of the (admittedly extreme) claims of certain methodology czars in previous decades: that "quality is free" as long as you adopt the proper formal methodology and use it correctly. Supposedly, if you captured all of the users' requirements using some formal notation, used certain methods to transform those requirements into functional specs, and coded precisely to those functional specs, then your odds of producing high-quality software increased dramatically. There was even an experimental technology called "transformational software development" that purported to do all of this automatically and thus produce zero-defect code. We all now know that this is bunk, because it is impossible to capture user requirements completely and accurately. Users rarely know, and can even more rarely articulate, what they want from software; and even if they can, the requirements are highly likely to change over the duration of the project. So now Yourdon sees quality as something that requires as much work as functionality.
The most famous proponent of "good enough" software is, of course, Microsoft. In one chapter of Rise & Resurrection, Yourdon takes us into the belly of the Beast and determines what processes Microsoft uses to develop software. He reports that, despite the company's reputation as a collection of juvenile hackers, Microsoft has implemented some judiciously-chosen processes and does track some quality and productivity metrics, but that front-line developers are kept blissfully unaware of them. The company empowers its top developers and adds processes where they make sense and do not interfere.
Rise & Resurrection gives an excellent overview of then-current (1995) thinking about software tools and processes. Yourdon makes room for dissenting opinions. But the majority of voices represented are those of his fellow methodologists and metrics gurus, rather than, say, the hotshot programmer whose incredible ideas and productivity put a startup company on the map, or even the development manager for a huge-selling shrink-wrapped application. He spends a lot of time talking about how certain methodologies, or process measurement schemes like the Software Engineering Institute's Capability Maturity Model (CMM), could be made to apply in bits and pieces rather than all the way. Certain aspects of this stance are hard to dispute -- essentially, the notions that you should make informed choices about methodologies and processes rather than ignorant ones, and that any process is better than no process. But overall, he comes across like a guitar teacher who has despaired of teaching young Eddie Van Halen wannabes lots of scales, positions, and arpeggios when they just want to learn "Eruption" from the record well enough to play it for their girlfriends next weekend.
He also summarizes certain aspects of software development that have helped change its mechanics over the years. First, teams of programmers can be more productive, and thus smaller, thanks to the latest generation of rapid development tools a la PowerBuilder and Delphi; Java technology, which makes software automatically platform-independent; and other innovations. This is especially important in light of another fairly recent discovery: that large projects are bad. People like Capers Jones and Meilir Page-Jones (unrelated), who measure all sorts of things about software development projects, have published statistics that lead to this inescapable conclusion. For example, they have found that two things rise with the size of projects: risk of failure (serious time and cost overruns, lack of user acceptance, project cancellation), and the percentage of project time and resources devoted to nonproductive activity (meetings, memos, coordination). In fact, Capers Jones found that almost three-quarters of the effort on the largest projects is non-productive. Second, technology changes so rapidly nowadays that any technological decisions made in year X are guaranteed to be obsolete in year X+3.
In other words, the only projects worth doing are small and quick, and today's tools allow the most skilled people to get more done that way. At Sun, our CIO has a rule that limits the size (10 engineers), duration (1 year), and cost ($1 million) of any internal development project. I can say from personal experience that this is a good rule. It's unfortunate, though, that technologists often develop project ideas based on some grand vision involving a fundamental architecture (e.g., client/server) that requires years of pain and agony to develop. Reality dictates that such projects are virtually impossible to do successfully. You have to break them down into smaller chunks and thus live with architectural compromises. That's one reason why I'm glad I'm not in software development anymore.
Marching at Internet speed
Small and quick are the only projects that make sense anymore, but, to make matters worse, management often requires that they become even smaller and quicker. Increased business competition, more volatile corporate politics, and other factors engender many projects that have utterly unreasonable deadlines and are resourced with half the money and people needed to get them done. Yourdon calls these death march projects. Sound familiar? It should.
In Yourdon's latest book, Death March, he concludes from his own findings that at least half of all software projects are "death march" projects; such projects are becoming the norm rather than the exception. I'm surprised that the percentage is that low. Death March describes the world of corporate software development that I'm glad I left a couple of years ago. It's a world of paradoxes.
In other words, there's no escaping the "death march," except in certain hidebound large corporate situations in which the projects probably aren't worth doing anyway. So what do you do when you are faced with a six-month timeline on a project that looks like it should take a year? You know you will be faced with long hours and unbearable pressure. What tools, what processes will help you get it done? That's what Death March is about.
Death March's messages are prosaically simple: use common sense; maintain a low profile so that you can avoid corporate bureaucracy; don't start learning new software tools on a "death march" project; stick with what you know and can use immediately; be adamant about trading off time for functionality and quality. The book is also full of practical tips for project managers who must run "death march" projects, on such subjects as maximizing productivity amid horrifyingly long hours, retaining and motivating good people, negotiating with management, navigating political waters, and so on.
This book contains advice that is disappointingly mundane, considering the source. Here is the man who virtually invented software development methodologies, now telling us to avoid the corporate "Methodology Police" in order to get the project done on time. He also makes a big deal out of the concept of triage, whose application to software development anyone who has seen M*A*S*H would understand intuitively. His advice is unquestionably good, but it amounts to this: forget all that highfalutin, fancy process stuff. Just use your common sense, stick with the tools you like best, and get the work done without unnecessary overhead.
Who needs Ed Yourdon to tell them this? Perhaps the value of the message is to shake old-style software development organizations, which were built on Yourdon's (and others') strictures, out of their torpor. For them, this book would be like Cotton Mather telling his flock that a glass of wine with dinner every night is medically beneficial and makes you less uptight.
These books are nevertheless worth reading, for a couple of reasons. First, Ed Yourdon is a very good writer. His style is reminiscent of the messages he is preaching: it's engaging, conversational, and dynamic, but it contains amusing vestiges of his former rigid outlook -- mainly in the form of textbook-like section and subsection numbers that don't really fit. Second, he brings the perspective of one who has probably seen more different software development situations than anyone else on Earth. And, admittedly, he has always claimed that blind adherence to (or inappropriate adoption of) any software methodology, whether his or another, leads to disastrous results. He has always encouraged us to use common sense and put tools and processes in perspective. That perspective is always worthwhile, even if some of the baggage he carries seems old and worn.
Title: Death March: The Complete Software Developer's Guide to Surviving 'Mission Impossible' Projects
Author: Edward Yourdon
Publisher: Prentice Hall Computer Books
List price: $24.95
Bill Rosenblatt is an enterprise IT architect at Sun Microsystems, where he specializes in media technology. Reach Bill at email@example.com.