The Cone of Uncertainty

I recently sat in a call for a project where two separate cost consultants were being given the opportunity to engage the BIM process and take advantage of any model-based data they could derive. The client was allowing them some time to explore the process and use the project as a pilot to identify their own gaps, whatever those may be.

After some talk, a leading voice emerged, claiming that the traditional method would be preferable because it would be the fastest. The comparison focused on how quickly each approach could produce the same product required by industry standards and their scopes. Based on those parameters, this person was probably correct. The odds were stacked against the new process, as inexperience with the tools alone would doom the surveyors to a slower process than usual.

These parameters, however, failed to make a true comparison, since the two products would be fundamentally different. The traditional method would lead them to a reasonably accurate pricing estimate, but with a considerable allowance due to uncertainty. The new process would require this allowance as well, but the transparency afforded by having the design model available should move the needle.

In software development project management, the Cone of Uncertainty represents the ‘…evolution of the amount of uncertainty during a project’ (source). The premise is that at the onset of a project, little is actually known about what the product will look like in the end or how it is going to be made. As we move through the process of design and development, decisions are made and paths are selected until eventually unknowns ‘…decrease, reaching 0% when all residual risk has been terminated or transferred. This happens by the end of the project i.e. by transferring the responsibilities to a separate maintenance group.’

Source: modernanalyst.com via Federico on Pinterest
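
To put rough numbers on the cone, here is a small Python sketch of my own (not from the article) that applies the estimate-multiplier ranges commonly cited for Boehm's version of the cone, for example in Steve McConnell's Software Estimation, to a single point estimate. The phase names and multipliers are illustrative assumptions, not figures from the source above.

```python
# A minimal sketch of the Cone of Uncertainty: the same point estimate
# carries a shrinking error band as a project moves through its phases.
# The multipliers follow ranges commonly cited for Boehm's cone; treat
# them as illustrative, not authoritative.

CONE = [
    ("Initial concept",             0.25, 4.00),
    ("Approved product definition", 0.50, 2.00),
    ("Requirements complete",       0.67, 1.50),
    ("Design complete",             0.80, 1.25),
    ("Detailed design complete",    0.90, 1.10),
    ("Accepted product",            1.00, 1.00),
]

def uncertainty_band(point_estimate, phase_index):
    """Return the (low, high) range around an estimate at a given phase."""
    _, lo, hi = CONE[phase_index]
    return point_estimate * lo, point_estimate * hi

if __name__ == "__main__":
    estimate = 1_000_000  # a hypothetical $1M cost estimate
    for i, (phase, _, _) in enumerate(CONE):
        lo, hi = uncertainty_band(estimate, i)
        print(f"{phase:<28} ${lo:>12,.0f} .. ${hi:>12,.0f}")
```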

The same article states that “Most environments change so slowly that they can be considered static for the duration of a typical project, and traditional project management methods therefore focus on achieving a full understanding of the environment through careful analysis and planning.” This strikes a clear parallel with the traditional stages of the design delivery process. A lot of work is done through design development and construction documents to more clearly define a building’s scope of work. ‘Careful analysis and planning’ is done, sometimes for years, to come up with a set of specifications and drawings that will act as the basis of design in the bidding and construction phases.

It continues: “Well before any significant investments are made, the uncertainty is reduced to a level where the risk can be carried comfortably.” It stands to reason, then, that a rich source of insight into the designer’s intent, such as the model, would go a long way toward moving the dial farther to the right in the graph above. Even in a model of poor craft, having a clear understanding of the ‘gaps’ in the design is of value.

Model- and data-driven approaches to studying how a model was built, and the intent behind it, should therefore give the quantity surveyor a richer context, essentially ‘de-risking’ the estimating process by increasing transparency and democratizing access to information.
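
As a hypothetical illustration of what that access can look like in practice, the short Python sketch below uses the open-source ifcopenshell library to take a census of an IFC export of a design model. The file name is made up, and a real quantity takeoff would go far deeper than counting elements by class; the point is only that the model is queryable data.

```python
# A minimal sketch of model-based takeoff, assuming the design model has
# been exported to IFC and that ifcopenshell is installed
# (pip install ifcopenshell). The file name is hypothetical.
from collections import Counter

import ifcopenshell

model = ifcopenshell.open("design_model.ifc")  # hypothetical export

# Count building elements by IFC class. Even this crude census shows a
# quantity surveyor where the model is rich and where the 'gaps' are.
counts = Counter(el.is_a() for el in model.by_type("IfcElement"))

for ifc_class, qty in counts.most_common():
    print(f"{ifc_class:<30} {qty:>6}")
```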

The person described above identified risk in the many unknowns that came with the new process and new information, and would mitigate that risk by recommending we not engage it. Instead, he should see the opportunity to deliver a more complete estimate earlier in the project schedule, with fewer unknowns. That goal, regardless of whether a model exists, should be at the core of their internal research and development efforts. In this case, they are being given the opportunity to do that R&D jointly with a client. I hope they take that chance.

Managing Complexity at GSAPP

This past semester I had the privilege of teaching a short six-week seminar at Columbia’s GSAPP. The class is called Managing Complexity, and its core objective is to expose students to the world of collaborative design using a core technology as the venue.

In the first go at this class, I decided to use CATIA V5R20. There were two main drivers behind this decision: first, the old solid-modeling concept of parts and assemblies as a federated model is SUPER clear and a good step in the right direction from what they already know from linking and worksessions in Rhino. Second, I think having architects exposed to a structured modeling process like this is a good mental exercise, since it forces them to think methodically.

The class focused on exposing students to real-world applications of multi-author models in CATIA, Revit, and Navisworks. I showed a couple of projects and explained how different scopes within a building are modeled not just by different people, but by completely different companies that can easily be thousands of miles apart, yet are completely dependent on one another.

The first step was to help them further understand the concept of BIM. They all had experience with the tools that support BIM processes, but few, if any, had participated in a BIM process. Although this was super light and we only had six weeks, the core objective of the class was to have them experience BIM through the lens of plural authorship and collaboration. Most students, except for one or two, had no experience with CATIA either.

Assignment

They formed groups of two to three and selected a project to model collectively. I wanted them to work on a built project. I also wanted them to pick something that would leverage the tool being used and further expose them to the concept of ‘best tool for the job’. I also wanted a fairly high level of detail in the models. For these reasons, I suggested a few projects by well-known architects that were NOT buildings: three footbridges and two ski jumps.

Here are the five projects that I recommended:

These structures are very interesting to me. They are not like buildings in that they do not necessarily have levels, nor are they enclosed…But they are also big custom fabrication jobs, which makes them ideal candidates for a good solid model. A tool like Revit could be used to model a bridge like one of these, but you would have to be an advanced user willing to deal with tons of workarounds in order to get there. Again, picking the best tool for the job was part of the lesson. CATIA or Inventor would do just fine.

I took on the modeling of the Zapallar Pedestrian Bridge by Enrique Browne (Chile), a personal favorite. I used this bridge to walk them through the process.

Wireframe model of Zapallar bridge

The assignment was clear:

  1. Select a project
  2. Devise a strategy for modeling the project
  3. Design a structure for the Product model
  4. Author model – Wireframing
  5. Author model – Solids
  6. Assemble onto wireframe
  7. Iterate
  8. Document

Student Work

We ended up with three teams of at least two students each. A technical objective was to make sure they used a model structure based on the Generative Shape Design, Part Design, and Assembly Design workbenches. Too often people ignore proper model structure and end up building a whole assembly within a single part. There are many drawbacks to this, but I won’t get into them here. The main objective was plural authorship, which by definition required them to collaborate and work on different pieces that would later be brought together.
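
To make the point concrete, here is a toy sketch, in Python rather than anything CATIA-specific, of the federated structure the assignment asked for: an assembly is a tree whose leaves are parts, and each part file has a single author. The bridge breakdown and author names are hypothetical. Collapse this tree into one monolithic part and you are left with one file and one effective owner, which is exactly what kills plural authorship.

```python
# A toy sketch (not CATIA code) of a federated part/assembly structure:
# an assembly is a tree whose leaves are parts, each with one author,
# so teammates can model in parallel while the assembly just references
# their files.
from dataclasses import dataclass, field

@dataclass
class Part:
    name: str
    author: str  # one owner per file -> no edit conflicts

@dataclass
class Assembly:
    name: str
    children: list = field(default_factory=list)  # Parts or sub-Assemblies

    def walk(self, depth=0):
        """Print the tree, showing who owns each part file."""
        print("  " * depth + self.name)
        for child in self.children:
            if isinstance(child, Assembly):
                child.walk(depth + 1)
            else:
                print("  " * (depth + 1) + f"{child.name}  [{child.author}]")

# Hypothetical breakdown for one of the bridge projects:
bridge = Assembly("Bridge", [
    Assembly("Deck", [Part("deck_surface", "student A"),
                      Part("railings", "student A")]),
    Assembly("Structure", [Part("arch", "student B"),
                           Part("hangers", "student B")]),
    Part("wireframe_skeleton", "instructor"),
])
bridge.walk()
```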

Here are some images of the final projects.

Volantin Pedestrian Bridge by Sang Wook Lee

Volantin Pedestrian Bridge by Karl Bengzon

Railway Footbridge at Roche-sur-Yon by Yue Du and Xian Lai

All in all, I was very happy to see the results. We had some great successes with the modeling concepts, and we even had some successes with the concept of plural authorship. Next time I’ll try to harp on this a bit more. Of the three teams, only one worked together well, and even that team never got to assembling their files. All of them could have split their files up further into parts to create a truer assembly. In general, though, it was a great class. Congrats, guys.