This is the first of three posts dealing with aspects of management that I consider important:
- Iron Triangle management
- Process model
- Delivery mode
The reason I think these aspects are important is that they affect the way we handle various management issues, particularly our choice of metrics. Metrics that apply to one option in one of these areas may be meaningless or misleading when applied to a different option.
As George Box famously wrote, all models are wrong, but some are useful. The model I present here is based on my own experience and observation. Being a model, it is wrong by definition; I hope it is also useful.
The order in which the aspects are listed reflects their relative effect on our choices, in my opinion. The strongest effect comes from our approach to managing the Iron Triangle. That is the subject of this post.
The process model is also important, especially because many people conflate the process model with the approach to Iron Triangle management. For example, a common conceptual error is to assume that if we are using a timeboxed iterative process then we are, by definition, doing adaptive development. When we then have difficulty making sense of the metrics we are using, it is because we are trying to measure things that are not happening. I have seen people get pretty frustrated about this. My observation is that Iron Triangle management and process model are orthogonal. When we are clear about that, we can choose metrics appropriate to our own situation.
The final aspect, delivery mode, refers to the difference between continuous product support and discrete projects. It has substantially less impact on our choice of metrics than the first two factors, but sometimes can be important.
Before going further, let’s clarify the meaning of Iron Triangle. You may know it as the triple constraint. It refers to three key elements of any project or initiative: scope, schedule, and budget. Some people include quality in the list, but in my view quality is part of scope, since a product that doesn’t meet the quality standard necessary for acceptance can’t be considered “done.” So, by Iron Triangle I mean scope, schedule, and budget.
In my experience, there are basically just two ways to manage the Iron Triangle, and the choice depends primarily on how the project plan is used to guide and track progress. To assess our progress, we compare our actual results with our plan. When we see variance, we change something. The two approaches are (a) change reality to conform with the plan, or (b) change the plan to take reality into account. Either way, the plan is our reference for assessing progress.
The choice of approach has implications for the way we construct the plan, the metrics we use to track progress, and the corrective measures we tend to take when we detect variance from plan. When the plan itself is the definition of success, we use approach (a) to deal with variances. Sticking to the plan is, by definition, the right thing to do. When the plan is treated as a point-in-time navigational aid to help us stay on track toward customer-defined goals, we use approach (b) to deal with variances. In this case, the plan defines expectations but not success. We always have some sort of plan and some sort of expectations.
There’s a saying that appears in many variations and in many contexts that goes something like this: “No plan survives first contact with the enemy.” It is traditionally credited to the Prussian field marshal Helmuth von Moltke the Elder. In any case, the idea applies to software projects as well as to other activities. Even when we are using approach (a) to deal with the Iron Triangle, the plan is subject to change. There will be formal change control procedures that can be used to make minor changes. To make major changes to the plan, people re-assess risks and costs and “re-baseline” the plan.
Even with these mechanisms for change, the general assumption with approach (a) is that the plan itself is the definition of success. When there is variance, managers ask, “What are you doing to get back on plan?” If we are to understand the mentality about the Iron Triangle so that we can choose appropriate metrics, we must be clear about this fact, even if people are adamant that they are “flexible” about accommodating change. It isn’t a question of making people feel good about being flexible in the midst of an inflexible process; it’s about choosing metrics that will help us make well-informed business decisions throughout the project.
With approach (b), the definition of success focuses more on what will be delivered at the end of the initiative than on what was envisioned at the beginning. The general approach is sometimes called adaptive development. The idea is that we learn as we go along. Everyone involved learns more about what will really be needed in the solution, about what the solution can look like, and about how to progress from where we are today to where we need to be in the future. Frequent change is part and parcel of the approach, and so there are few formalities around changing the plan. When there is variance from expectations, managers ask, “What is the best result that can realistically be achieved?”
What are the key differences in these two approaches that inform our choice of metrics? I think there is just one: The level of detail in the plan at the outset.
With approach (a), it is necessary to resolve or mitigate all risks, identify all architectural issues, and define the full scope of the work in advance. Otherwise, it would not be possible for the plan to serve as the definition of success. Metrics used with this approach are usually based on the notion of percentage complete in one form or another. The comprehensive, detailed plan firmly defines the meaning of 100%. It defines 100% of scope, 100% of schedule, and 100% of budget in advance. Popular metrics such as Earned Value and Earned Schedule compare actual performance to date with the plan’s definition of 100%. When the plan is re-baselined, we establish new definitions of 100%, which are considered to be stable for purposes of percentage-based metrics.
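To make the percentage-complete idea concrete, here is a minimal sketch of the standard Earned Value arithmetic (BAC, PV, EV, AC are the conventional terms; the project figures are invented for illustration):

```python
# Minimal sketch of standard Earned Value Management arithmetic.
# All project figures below are invented for illustration.

def earned_value_metrics(bac, planned_pct, earned_pct, actual_cost):
    """bac = Budget At Completion; the percentages are fractions of
    the plan's fixed definition of 100% scope/schedule."""
    pv = bac * planned_pct   # Planned Value: work scheduled to date
    ev = bac * earned_pct    # Earned Value: work actually completed to date
    spi = ev / pv            # Schedule Performance Index (< 1 means behind)
    cpi = ev / actual_cost   # Cost Performance Index (< 1 means over budget)
    return {"PV": pv, "EV": ev, "SPI": round(spi, 2), "CPI": round(cpi, 2)}

# A project with a $500k budget, 40% of work scheduled to date,
# 30% actually complete, and $180k spent so far:
print(earned_value_metrics(500_000, 0.40, 0.30, 180_000))
```

Note that every number here leans on the plan’s fixed definition of 100%: if that definition is unstable, SPI and CPI lose their meaning.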
With adaptive development, people identify and resolve or mitigate risks at the macro level, establish a high-level estimate for overall cost and time, and define a vision for the capability to be developed. Up to that point, the work is very similar to approach (a), but then they begin work based on the high-level plan or roadmap. They analyze needs and design detailed portions of the solution incrementally as they progress. They defer decisions so that they will avoid investing in work that is ultimately discarded or revised. They deliver incrementally in order to solicit feedback from stakeholders along the way, and adjust course accordingly.
With this approach there need be no stable definition of 100% of any of the three sides of the Iron Triangle before development begins, in principle. In practice, people do firmly define at least one factor. When time-to-market is the key business driver for the project, then schedule may be firmly defined at the outset. When scope has to be delivered in full for the solution to be meaningful, then scope may be firmly defined at the outset. When a fixed amount of funding is available, then budget may be firmly defined at the outset. But the plan is not comprehensive, and one or two sides of the Iron Triangle are meant to be flexible. Lacking a stable definition of 100%, percentage-based metrics are meaningless. The adaptive approach calls for different metrics.
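A tiny illustration (invented numbers) of why percentage-based metrics break down without a stable definition of 100%: the team’s real progress does not change, but the reported percentage does, because the denominator (total scope) moves.

```python
# Invented numbers: percent complete is only meaningful when the
# denominator -- total scope -- is stable.

def percent_complete(items_done, total_scope):
    return 100.0 * items_done / total_scope

before = percent_complete(40, 100)   # 40.0% against the original scope
# Stakeholder feedback adds 60 items to the backlog:
after = percent_complete(40, 160)    # 25.0% -- same work done, "less" progress
```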
Identifying the approach in use
Since people tend to use popular buzzwords to mean whatever they need the words to mean, how can we recognize the approach to Iron Triangle management that people are really using in any given organization? In my experience, there are several (but not too many) key distinguishing characteristics that tell us which approach people are using. They are:
|  | Traditional | Adaptive |
|---|---|---|
Funding | Fixed, predetermined budget | Incremental funding |
Scope definition | Comprehensive work breakdown structure (WBS) fully defined prior to starting development | High-level feature list and list of necessary quality attributes (a.k.a. non-functional requirements) |
Estimation for release planning | Bottom-up roll-up of time-based task estimates | Top-down, high-level feature estimates |
Short-term planning | Based on task estimates and detailed work schedule in the WBS | Collaborative relative sizing of work items at the last responsible moment |
Risk assessment (business risk) | Comprehensive assessment up front followed by preemptive resolution prior to beginning work | Contingency plans based on high-level assessment up front; business trade-offs determined at last responsible moment |
Risk assessment (“technical” risk) | Conflated with business risk assessment. | Planned, empirical discovery of technical viability of alternative approaches |
Stakeholder involvement | Initial agreement on scope, schedule, cost, and acceptable quality codified in a contract or other formal mechanism | Initial agreement on product vision followed by direct (or proxy) participation in the development process |
Definition of success | All originally-planned features delivered on time and on budget | Functionality needed by customers/stakeholders as of the time of delivery at the right price point and at a reasonable level of quality to meet goals |
Change management | Formal process to address unplanned changes in scope, schedule, and/or budget; perceived as a corrective intervention when work goes off plan due to human error in the planning stages | Change is integral to the process and is not seen as a result of poor planning or execution; process is based on expectation of frequent change, so usually includes little formal change management overhead |
Process compliance | Processes are defined by process specialists; during execution, teams are expected to follow the defined process | Ongoing improvement of the defined process based on day-to-day learning is expected and is part of the job of teams executing the plan |
Governance | Compliance with governance requirements is mandated | Compliance with governance requirements is mandated |
Most of these carry roughly equal weight in judging how the Iron Triangle is managed, but the first one, funding, is more significant than the others. It’s a commonplace to say that nothing happens without funding. How the funding happens gives us a strong clue as to the prevailing mentality about the Iron Triangle in the organization.
The funding model and the Iron Triangle
Conventionally, funding is determined before work begins on an initiative. The initiative has a predetermined budget allocation. Ongoing reassessment of funding needs is not part of the formal delivery process. Changes in the budget allocation are treated as exceptions.
Here is the tell-tale clue by which you can recognize this funding model: Changes in funding are driven from the project management level. When the PM realizes that some required change in scope or some unanticipated occurrence calls for a change in funding, he/she has to initiate a formal process to request that change.
With the adaptive approach, funding is allocated for shorter periods of time; say, three or four months. Periodically, all work in progress in the organization is reassessed against the ever-evolving understanding of business needs, market conditions, and revised business plans. Each initiative then receives more, less, or the same funding for the next three- or four-month period, as well as any adjustments in other resources that may be deemed appropriate.
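As an illustration, incremental funding can be sketched as a recurring, proportional reallocation of a funding pool at the portfolio level (the initiative names and value scores below are invented):

```python
# Hypothetical sketch of incremental funding: each period the portfolio
# level re-splits a fixed pool across initiatives in proportion to their
# currently-assessed business value, rather than honoring a multi-year
# budget fixed up front. Names and scores are invented.

def reallocate(pool, value_scores):
    """Split `pool` across initiatives in proportion to their scores."""
    total = sum(value_scores.values())
    return {name: pool * score / total for name, score in value_scores.items()}

q1 = reallocate(900_000, {"billing": 3, "mobile app": 5, "reporting": 1})
# Next quarter the scores are reassessed and the split changes:
q2 = reallocate(900_000, {"billing": 1, "mobile app": 6, "reporting": 2})
```

The design point is that the reassessment is driven from above the individual project, which is exactly the tell-tale clue described below.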
Here is the tell-tale clue by which you can recognize this funding model: Changes in funding are driven from the portfolio management or program management level. Reassessment for purposes of allocating funds is a recurring and regular part of the overall delivery process. Changes in funding are expected and planned for, and not considered exceptions.
Here is the key implication for recognizing the organization’s approach to Iron Triangle management: If a conventional funding model is used, it is all but impossible for any other aspects of the work to take an adaptive approach. Therefore, this is a clear sign that the organization uses approach (a) to manage the Iron Triangle.
However, the opposite assertion is not true. The fact an organization uses incremental funding does not automatically mean they take an adaptive approach to the Iron Triangle. It only means an adaptive approach is not precluded by the funding model. In this case we have to look at additional factors to determine how the Iron Triangle is managed.
Scope definition and the Iron Triangle
Given an incremental funding model, we know it is possible to take an adaptive approach to the Iron Triangle. We can usually tell whether that is actually true by looking at the way in which scope is defined for software development initiatives.
To support approach (a) to Iron Triangle management, the organization’s delivery process typically includes relatively extended analysis and design “phases.” The output from these phases will look like a comprehensive WBS and a technical design that, while possibly not fully detailed, is more detailed than really necessary, and that includes work far out in the future in the project plan. Project management tools will be Gantt-like, notwithstanding the buzzwords in the tools’ names or in the titles of formal progress reports. Metrics may be in use that are usually associated with adaptive methods, but their meaning is twisted to represent some flavor of “percentage complete,” which by definition requires people to know what “complete” means in some detail.
In contrast, if you see that the formal process interleaves analysis and design work with other aspects of development, such as coding, testing, and creating end-user documentation, and if the scope definition describes work items far out in the future only at a high level, without detail, then the indication is that people are taking an adaptive approach to the work.
Attitude also plays a role here. When people use incremental funding in conjunction with a comprehensive WBS, consider their reasoning for doing so. If they are concerned about minimizing losses in the event the planned scope, schedule, and budget will not be met, the implication is that the Iron Triangle is king, and they are using approach (a). If they are using an incremental approach to the comprehensive WBS as a means of delivering the maximum possible value to stakeholders, the implication is that they are using an adaptive approach to tackle a project whose scope happened to be well-known at the outset, and they are using approach (b). Note that people will almost always tell you they understood the “requirements” at the outset, if the organizational culture requires them to say so; and they will almost always say they are working in an adaptive way if they believe that is what people wish to hear. Consider, instead, the way they work and the attitudes they exhibit.
Other factors
The remaining distinguishing characteristics in the table provide supporting evidence for our observations of whether the organization takes approach (a) or approach (b) to the Iron Triangle. They are not the primary determining factors, though. They are more in the nature of effects than causes.