The Cone of Uncertainty

I recently sat in a call for a project where two separate cost consultants were being given the opportunity to engage the BIM process and take advantage of any model-based data they could derive. The client was allowing them some time to engage the process and use the project as a pilot to identify their own gaps, whatever those may be.

After some talk, a leading voice emerged, claiming that the traditional method would be preferable because it would be the fastest. The comparison focused on how quickly each approach could produce the same product required by industry standard and their scopes. On those parameters, this person was probably correct. The odds were stacked against the new process, as the lack of experience in the tools alone would doom the surveyors to a slower process than usual.

These parameters, however, failed to make a true comparison, since the two products would be fundamentally different. The traditional method would lead to a reasonably accurate pricing estimate, but with a considerable allowance for uncertainty. The new process would require an allowance as well, but the transparency afforded by having the design model available should move the needle.

In software development project management, the Cone of Uncertainty represents the “…evolution of the amount of uncertainty during a project” (source). The premise is that at the onset of a project, little is actually known about what the product will look like in the end and how it is going to be made. As we move through the process of design and development, decisions are made and paths are selected, until eventually unknowns “…decrease, reaching 0% when all residual risk has been terminated or transferred. This happens by the end of the project i.e. by transferring the responsibilities to a separate maintenance group.”

[Figure: the Cone of Uncertainty graph. Source: modernanalyst.com via Federico on Pinterest]
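
The shape of the cone is easier to feel with numbers. As a minimal sketch, the Python below uses the multipliers Boehm and McConnell famously reported for software estimates; the phase names and dollar figures are illustrative, not taken from this project:

```python
# Classic Cone of Uncertainty multipliers (per Boehm/McConnell, for
# software estimates): at each milestone, actual outcomes historically
# fell within [low * estimate, high * estimate] of a point estimate.
CONE = {
    "initial concept":             (0.25, 4.00),
    "approved product definition": (0.50, 2.00),
    "requirements complete":       (0.67, 1.50),
    "detailed design complete":    (0.90, 1.10),
}

def estimate_range(point_estimate: float, phase: str) -> tuple[float, float]:
    """Bracket a single-point estimate by the uncertainty of its phase."""
    low, high = CONE[phase]
    return point_estimate * low, point_estimate * high

# The same $10M figure, quoted at concept vs. after detailed design:
print(estimate_range(10_000_000, "initial concept"))           # (2500000.0, 40000000.0)
print(estimate_range(10_000_000, "detailed design complete"))  # (9000000.0, 11000000.0)
```

The exact multipliers matter less than the shape: the same point estimate is worth far more once uncertainty has been squeezed out of it.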

The same article states that “Most environments change so slowly that they can be considered static for the duration of a typical project, and traditional project management methods therefore focus on achieving a full understanding of the environment through careful analysis and planning.” This strikes a clear parallel with the traditional stages of the design delivery process. A lot of work is done through design development and construction documents to more clearly define a building’s scope of work. ‘Careful analysis and planning’ is done, sometimes for years, to produce a set of specifications and drawings that will act as the basis of design in the bidding and construction phases.

It continues: “Well before any significant investments are made, the uncertainty is reduced to a level where the risk can be carried comfortably.” It stands to reason, then, that a rich source of insight into the designer’s intent – as the model is – would go a long way toward moving the dial farther to the right in the graph above. Even in a model of poor craft, having a clear understanding of the ‘gaps’ in the design is of value.

Model- and data-driven approaches to studying how a model was built, and the intent behind it, should therefore give the quantity surveyor a richer context, essentially ‘de-risking’ the estimating process by increasing transparency and democratizing access to information.
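
To make that concrete, here is a minimal sketch of a model-based quantity takeoff; the element schema, quantities, and unit rates are hypothetical, not drawn from any particular BIM platform or cost database:

```python
# A sketch of model-driven takeoff: roll quantities up by element type,
# price what the cost database covers, and surface the 'gaps' explicitly
# instead of burying them in a blanket allowance.
from collections import defaultdict

# Hypothetical elements exported from a design model.
model_elements = [
    {"type": "concrete_slab", "unit": "m3", "qty": 420.0},
    {"type": "cmu_wall",      "unit": "m2", "qty": 1300.0},
    {"type": "curtain_wall",  "unit": "m2", "qty": 800.0},  # no rate yet: a known gap
]

# Hypothetical unit rates from the surveyor's own cost data.
unit_rates = {"concrete_slab": 310.0, "cmu_wall": 95.0}

priced = defaultdict(float)
gaps = []
for el in model_elements:
    rate = unit_rates.get(el["type"])
    if rate is None:
        gaps.append(el)  # a visible unknown, not hidden risk
    else:
        priced[el["type"]] += el["qty"] * rate

print(f"Priced subtotal: {sum(priced.values()):,.0f}")
print("Unpriced scope (needs a named allowance):", [g["type"] for g in gaps])
```

The estimate still carries an allowance, but it is now attached to named scope rather than to a general fear of the unknown.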

The person described above identified risk in the many unknowns that came with the new process and new information, and would mitigate that risk by recommending we not engage it. Instead, he should see the opportunity: a more complete estimate, delivered earlier in the project schedule, with fewer unknowns. That outcome, regardless of whether a model exists, should be at the core of their internal research and development efforts. In this case, they are being given the opportunity to do that R&D jointly with a client. I hope they take that chance.