Calculating the ROI for Requirements Analysis
by Martin L. Shoemaker (The UML Guy)

My buddy Josh Holmes has written an excellent post on the Return on Investment (ROI) for software. I recommend it to anyone who sees software as a business, not just a job or a hobby.

Last week, the always-worth-reading Patrick Greene made a comment that got me thinking specifically about the ROI of Requirements Analysis. Most teams and most managers know they have a requirements problem; but too many of them say, “But we’re too busy to fix it, so we’ll just start coding.” Or “We’d like to fix it, but we can’t, so we’ll just start coding.” Or “Yeah, but our customers won’t answer our questions, so we’ll just start coding.” Or “Our executives want progress; they want us to just start coding.”

And that leads me to name a common, pernicious antipattern, JSC: Just Start Coding. This is not Agile Development, but it masquerades as Agile. It really means, “We’re gonna be late no matter what we do, and our executives are wondering why we’re not coding already. So we’re gonna Just Start Coding, and call it Agile.”

Which is exactly wrong. Good Agile, real Agile, is very requirements-centric. It’s about surfacing requirements and particularly requirements miscommunications quickly and frequently, so that we stay as close to the right path as possible, and waste as little time as possible on wrong paths. Agile makes short, quick moves, and short, quick corrections.

JSC, on the other hand, just makes quick moves, with no mechanism in place for planning and correction. “We’re on the wrong path? Who cares? At least we’re coding!”

And yes, let’s admit the dark side: we developers often contribute to this mess. Coding is comfortable. Requirements analysis (for many of us) is uncomfortable. It’s really easy to ignore nagging doubts and bury your concerns in lots of fun coding challenges. “Well, it doesn’t make sense to me; but this is what they asked for, so this is what I’ll build.” We abdicate responsibility for the wrong requirements.

But I want to follow in Josh’s footsteps, discussing software as a business, not just a job. And for a business, JSC has a cost. A huge cost. And Requirements Analysis can eliminate much or all of that cost. Maybe a little ROI calculation can help us to win executive support to do the job right.

Some Fundamental Assumptions

The ROI calculations that I’ll make are based on three fundamental assumptions: two from industry research, and one from faith.

  1. Barry Boehm’s COCOMO II research tells us that poor requirements analysis can add anywhere from 40% to 100% to your schedule.
  2. Many different sources (summarized by Steve McConnell) tell us that the average project is underestimated by roughly 100%.
  3. Those two are the research; here’s the faith. I contend that, with a relatively minor investment of time compared to the overall schedule, your team can drastically improve its requirements, reducing that 40-100% schedule slip to nearly zero.

You can’t fix the tendency to underestimate (though you can make a good start by reading Steve McConnell); but I believe that most of what you need to fix requirements failures is simply paying attention to requirements early and refusing to ignore the obvious failures. I’ve worked on too many teams where everyone knew the requirements were fatally flawed, yet no one did anything about them. Everyone saw those flaws as “just the way things are around here”. They erected a giant SEP (“Somebody Else’s Problem”) field around the requirements flaws, and they Just Started Coding.

I don’t think it has to be that way. I think that we as developers have the responsibility and the power to fix most of our requirements problems, if only we can persuade our managers to let us – and persuade ourselves to try. I’ve written a whole book (coming soon, I hope) on how we can perform better as Analyst-Developers; but to get management to back us, we have to show them a business reason. The assumptions above are part of that reason.

A Tale of Four Projects

To perform my ROI calculations, I’m going to envision four projects. Well, no, not really. Rather, I’m going to envision four different scenarios for the same project:

  1. Just Start Coding (Typical). In this scenario, the team starts coding from the earliest possible moment. Management has a target date (they’ll call it an Estimate, but McConnell will call that a delusion). The more likely date is twice as far down the road, per Assumption 2 above; and then, because the team fails to do adequate requirements analysis, the final schedule is 1.4 times that doubled schedule, for a total of 2.8 times the original target.
  2. Just Start Coding (Out-of-Control). This scenario is similar to Just Start Coding (Typical), but the requirements analysis is even worse than usual, adding another 100% to the project. And these are factors, not sums; the final duration is 4 times the original target.
  3. Analyzed (Time Bound). In this scenario, we assume enough Requirements Analysis at the start to recognize that the project cannot possibly meet the target date; and management tells us that the date is sacrosanct. We must meet that date, even if we have to cut features to do so. It’s always smarter and less error-prone to cut those features at the start than at the end, so we do so. Then we continue a small fraction of Requirements Analysis throughout the duration of the project, likely trimming more features as we go. We hit the target date with a lean but critical subset of the desired features, and a high-quality implementation of those features. We’ll add the other features next time.
  4. Analyzed (Feature Bound). In this scenario, we assume enough Requirements Analysis at the start to recognize that the project cannot possibly meet the target date; and management tells us that the features are sacrosanct. We must supply those features, even if we have to slip the date to do so. Our final schedule will be 100% longer than the target; but when we finish, we’ll have the complete set of desired features, and a high-quality implementation of those features.

Obviously, some managers or customers will want both Time Bound and Feature Bound. It can’t be done. Break the news to them gently.

Oh, all right, I know: some managers will demand Time Bound and Feature Bound. Let’s call this Scenario 5. It’s the Scenario where you do the analysis right up front, realize there’s no way to succeed, realize that management demands you violate the laws of space and time, and get your resume out on CareerBuilder.com so that you can be working somewhere else when the mandatory overtime and the punitive firings and the lawsuits begin. But since there’s no ROI for Scenario 5, we’ll ignore that scenario as we go forward.
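
To make those multipliers concrete, here’s a quick sketch of the schedule arithmetic for the first four scenarios, using a hypothetical 6-month target date. The 6 months is just an example value; the multipliers come straight from the scenarios above.

```python
# Schedule arithmetic for the four scenarios, with a hypothetical 6-month target.

TARGET_MONTHS = 6.0                    # management's target date ("Estimate")
LIKELY_MONTHS = TARGET_MONTHS * 2.0    # Assumption 2: estimates run ~100% low

scenarios = {
    "JSC (Typical)":            LIKELY_MONTHS * 1.4,  # +40% for requirements failure
    "JSC (Out-of-Control)":     LIKELY_MONTHS * 2.0,  # +100% for requirements failure
    "Analyzed (Time Bound)":    TARGET_MONTHS,        # hit the date by cutting features
    "Analyzed (Feature Bound)": LIKELY_MONTHS,        # all features, 100% past the target
}

for name, months in scenarios.items():
    print(f"{name:26s} {months:5.1f} months ({months / TARGET_MONTHS:.1f}x the target)")
```

For a 6-month target, that works out to 16.8, 24, 6, and 12 months respectively.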

More Assumptions

Next we need a few more assumptions (based on my personal experience, believe them as you will) and some parameters. First, the assumptions:

  • Up-front analysis requires only about 10% of the likely schedule. If your target is 6 months and so your likely schedule is 16.8 months (6 * 2 for estimation error * 1.4 for analysis error), then your up-front analysis time is about 1.68 months.
  • Because the analysis is work you would have to do to succeed anyway, just pushed to the front of the discussion, it involves the whole team, and yet doesn’t add to the final schedule.
  • Analysis is a skill most teams don’t have in abundance. You need a few skilled lead analysts to work with the team, roughly 1 lead analyst per 5 team members.
  • Once the initial analysis is done, you need little bits of analysis throughout the rest of the project. That will involve the lead analysts for about 20% of the total remaining schedule.

If you don’t like those assumptions, you can treat them as parameters and vary them; but these values fit with my experience.
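
To see how much analysis effort those bullets actually add up to, here’s a small sketch. The 6-month target and 10-person team are hypothetical example values, and I’ve used the Feature Bound scenario (12 months total) for the ongoing-analysis portion; everything else just follows the percentages above.

```python
import math

# Analysis-effort arithmetic from the assumptions above (hypothetical example values).
TARGET_MONTHS   = 6.0
LIKELY_MONTHS   = TARGET_MONTHS * 2.0 * 1.4   # 6 * 2 (estimation error) * 1.4 (analysis error)
ANALYZED_MONTHS = TARGET_MONTHS * 2.0         # Feature Bound scenario: all features, 12 months
TEAM_SIZE       = 10                          # developers (hypothetical)

up_front_months = 0.10 * LIKELY_MONTHS        # up-front analysis: ~10% of the likely schedule
lead_analysts   = math.ceil(TEAM_SIZE / 5)    # ~1 lead analyst per 5 team members
remaining       = ANALYZED_MONTHS - up_front_months
ongoing_months  = 0.20 * remaining            # each lead analyst stays ~20% involved afterward

print(f"Up-front analysis: {up_front_months:.2f} months, whole team plus {lead_analysts} lead analysts")
print(f"Ongoing analysis:  {ongoing_months:.2f} months of each lead analyst's time over the remaining {remaining:.2f}")
```

With those numbers, the up-front investment is 1.68 months, and each lead analyst stays involved for roughly another two months spread across the rest of the project.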

As for the parameters, you really only need two:

  • How much does an hour of a developer’s time cost you (total)?
  • How much does an hour of a lead analyst’s time cost you (total)?

And that’s all you need to know. The rest of the possible factors – duration of the project, team size, etc. – all divide out, because everything is calculated as ratios of time to time, dollars to dollars, and so on. ROI is a relative measure.

And to be honest, you really don’t need to know the absolute costs for developers and lead analysts, just the ratio. As long as the ratio remains constant, you can plug in any values you like.
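
If you want to see the shape of the calculation before asking for the spreadsheet, here’s a rough back-of-the-envelope sketch built on the same example values. The rates, team size, and target are placeholders, and defining ROI as net savings divided by the analysts’ cost is a simplification of my own, so don’t expect it to reproduce the exact percentages below; it’s only meant to show how the pieces fit together.

```python
# Rough sketch of the ROI arithmetic: Analyzed (Feature Bound) vs. JSC (Typical).
# All values are illustrative; only the analyst-to-developer cost ratio matters.

DEV_RATE     = 100.0             # cost per developer-month (any currency/unit)
ANALYST_RATE = 1.5 * DEV_RATE    # lead analyst costs 50% more

TEAM_SIZE     = 10               # developers (hypothetical)
LEAD_ANALYSTS = TEAM_SIZE // 5   # ~1 lead analyst per 5 team members

TARGET   = 6.0                   # target schedule in months (hypothetical)
JSC      = TARGET * 2.0 * 1.4    # JSC (Typical): 16.8 months, developers only
ANALYZED = TARGET * 2.0          # Analyzed (Feature Bound): 12 months

# Developer cost is team size * rate * duration in both cases.
jsc_cost = TEAM_SIZE * DEV_RATE * JSC
dev_cost = TEAM_SIZE * DEV_RATE * ANALYZED

# The analysis "investment" is the lead analysts' time: the up-front period,
# plus ~20% involvement for the rest of the project.
up_front   = 0.10 * JSC
ongoing    = 0.20 * (ANALYZED - up_front)
investment = LEAD_ANALYSTS * ANALYST_RATE * (up_front + ongoing)

savings = jsc_cost - (dev_cost + investment)
print(f"JSC (Typical) cost:          {jsc_cost:10.0f}")
print(f"Analyzed cost + analysts:    {dev_cost + investment:10.0f}")
print(f"ROI = savings / investment = {savings / investment:.0%}")
```

Scale the rates up or down together, or change the target date, and the final percentage doesn’t move. That’s the sense in which ROI is a relative measure.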

The Results

So given those assumptions and parameters, what’s the ROI?

Are you sitting down?

My parameters have a lead analyst costing 50% more than a developer. I don’t usually find that to be the case, but I wanted to make analyst costs relatively high to skew the ROI down.

And even with that skew, I ended up with these calculations:

  • An Analyzed (Feature Bound) project has an ROI of 652% when compared to a typical JSC project, and 2,081% when compared to an Out-of-Control JSC project.
  • An Analyzed (Time Bound) project has an ROI of 1,943% when compared to a typical JSC project, and 3,371% when compared to an Out-of-Control JSC project.

Those are astonishing ROI figures. Unheard-of ROI figures.

And of course, those are ideals. Your Requirements Analysis won’t be 100% effective at cutting out lost time. Let’s assume you only cut out half the lost time that’s due to requirements failure. Then you get numbers like these:

  • An Analyzed (Feature Bound) project has an ROI of 24% when compared to a typical JSC project, and 776% when compared to an Out-of-Control JSC project.
  • An Analyzed (Time Bound) project has an ROI of 364% when compared to a typical JSC project, and 1,116% when compared to an Out-of-Control JSC project.

Still nothing to sneeze at. And I think you can do a lot better than just half. I know I can.

And although I've only calculated ROI based on labor savings, labor costs are not the only costs associated with requirements failures. Requirements failures increase costs across the board. Besides a longer schedule, you get more overtime, higher shipping costs, higher travel costs, missed commitments, lost opportunity costs, interest payments, performance penalties… even costly litigation.

And there's also a cost in quality. Reworked code has more bugs than original code. The rework is done under tighter deadlines, there’s more pressure, and there’s more overtime. Developers cut corners in response. This results in more bugs; and those bugs lead to more delays and even more schedule pressure. There's a nasty feedback effect here, and it can kill a project. Requirements Analysis can cut that feedback loop.

Conclusion

Again, this is not BDUF (Big Design Up Front). I just spent a week at a client’s site, and spec’ed out their major requirements in that week in more useful detail than they had developed in months. Better Requirements Analysis does not mean bigger; it means more thorough. It means recognizing when you have questions, and making sure you answer them. It means identifying priorities and making choices.

And it means expecting some resistance. It’s not by accident that we have a systemic problem with requirements in our industry. But it’s in our power to change that.

My spreadsheet for this ROI calculation is available if you want it. Leave a comment. If you disagree with any of my assumptions, leave a comment about that, too. I want to make the clearest, most justifiable case possible to my stakeholders (and to yours!) to convince them that defining the problem in a comprehensible, verifiable form is essential to solving the problem.

Posted on Monday, May 4, 2009 7:53 PM | Categories: It's all about communication, Requirements Patterns and Antipatterns, Development Processes, Personal

