Make your own way

A couple of weeks ago I took part in the yearly Intersections Symposium held by the Fuse Lab at City Tech. I was part of a panel (aptly) named “Tools” with Axel Kilian from Princeton SOA, David Rife from Arup Associates, and Jonatan Schumacher from Thornton Tomasetti. It was great to get a chance to catch up with these guys, and I recommend you follow their work where you can.

The central question posed to the panel was ‘how are students to prepare for and keep up with the fast-moving advance of digital tools?’ It’s a great question, and one the professional world grapples with as well. The explosion of digital tools in the last decade has had a profound impact on practice just as much as on the academy. We hear a lot today (…has it been 6 years??) about BIM and how this process is changing established workflows. For some reason, we did not hear as much about the effect that Rhino (+ Grasshopper) and SketchUp also had on our industry. I sense these were more disruptive technologies in their day, giving designers en masse access to cheap or free 3D modeling.

Tinkering and architecture

In any case, David, Jonatan and I, the three non-academics on the panel, all took a similar approach to the topic, presenting concepts and examples of pushing past the limitations of commercially available tools to find workflows and solutions for our projects.

My part focused on defining the difference between tools and workflows. A workflow is a series of connected steps, or a recipe, as Dave Fano would call it. A tool (a digital tool in this case) is the means by which we perform any one of those steps efficiently and with high quality. A good workflow will re-purpose tools in ways their makers never imagined. A good workflow has a purpose.

At the time of the symposium, and only a few months after the site launched, AEC Apps already listed more than 425 tools. This number is made up of both highly successful commercial products and one-off, hacked-together add-ins made by many of us. My sense is that this number is still low, but it already begins to expose the magnitude of options that we as designers face every day in tool selection as we design the workflows that enable our design process.

Apps listed on AEC Apps

Some may see this as a problem, and will call for consolidation of options to a select few. They will do so in the name of interoperability or standardization, but this is the wrong move for AEC.

An industry in dire need of more applied research should find ways to add fuel to the fire and encourage designers to continue to take on tool-making as part of the design process, while also challenging software vendors to continue to innovate. But we shouldn’t stop there. We also need to share more. Sharing our experiments and tools, whether they be a model jig, a good family, a few lines of code, or even a polished add-in, does much to push us all forward. It is through these examples and the conversations that happen online that we learn, adjust and forge ahead.

The underlying lesson of the talk was that emphasis must be given to the problem being solved, and not the tool. The question is not ‘what does the tool enable us to do?’, but instead, ‘what do we want to do, and how can these tools help us get there?’. By refocusing the conversation on solving real building problems we will find it much easier to make decisions about which tools to use, and when it may be worthwhile to build our own.


Thanks to Brian Ringley for the great photos.

Thanks to NY City Tech for the invite. I had a lot of fun. If you’re interested in all the work that is happening over there, have a look:

http://openlab.citytech.cuny.edu/fuselab/

@NYCCTfab

http://www.nycctfab.com/

http://www.flickr.com/photos/nycctfab/with/8949366921/

Some Examples of How BIM Data Can Help Us Make Better Decisions

This is a presentation I gave at last year’s AIA TAP at CIFE. It was a short presentation on some of the potential benefits of mining BIM data and studying it to surface truths, patterns, and other insights about our projects, processes and teams. I showed some specific examples of how we are using some of this data today to make more informed decisions.

Archiving and search

One of the lowest-hanging fruits of all this is the idea that if we can access all the information that is stored in these models, we can literally search through our past projects. Imagine being able to search for which doors or which floor-to-floor heights were used on a building you finished 5 years ago, without having to run down to the archives.
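As a rough illustration of what that could look like, here is a minimal sketch in Python. It assumes each archived project’s model has already been exported to a JSON file of element records; the archive/ folder, file layout and field names (category, parameters, type_name) are all hypothetical, not a description of any particular tool:

```python
# A rough sketch of project-wide search over exported model data.
# Assumes each archived project's model has been exported to a JSON
# file of element records -- the "archive/" folder, file layout, and
# field names (category, parameters, type_name) are all hypothetical.
import json
from pathlib import Path


def load_projects(archive_dir="archive"):
    """Load every exported model as {project_name: [element records]}."""
    projects = {}
    for path in Path(archive_dir).glob("*.json"):
        with open(path) as f:
            projects[path.stem] = json.load(f)
    return projects


def search(projects, category=None, **params):
    """Find elements matching a category and/or exact parameter values."""
    hits = []
    for project, elements in projects.items():
        for el in elements:
            if category and el.get("category") != category:
                continue
            if all(el.get("parameters", {}).get(k) == v for k, v in params.items()):
                hits.append((project, el))
    return hits


if __name__ == "__main__":
    projects = load_projects()
    # "Which doors did we use on that building we finished five years ago?"
    for project, door in search(projects, category="Doors", Mark="D-101"):
        print(project, door.get("type_name"))
```

The point is not this particular format; it is that once the data is out of the authoring tool, ‘search the archive’ becomes a few lines of code rather than a trip to the basement.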

Comparative analysis

It is common for us as an industry to use precedents, but most of the time we are limited by relatively obvious attributes in our selection. We use things like typology, project size, client, client type and region as good meta-organizers. What if we wanted to take that further? What if we wanted to identify projects by their floor-to-floor heights? Or what if we wanted to identify projects by the use of some particular piece of mechanical equipment? By that same measure, what could we learn if we could take successful projects and compare them against those considered to be less successful?
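Again only as a hedged sketch: assuming the same kind of exported element data has been flattened into a single table (the CSV path and column names below are invented for the example), these comparisons become straightforward queries:

```python
import pandas as pd

# One flat table of element records across all archived projects --
# the file and the columns (project, category, elevation, type_name)
# are assumptions made for this sketch.
elements = pd.read_csv("archive/all_projects_elements.csv")

# Floor-to-floor heights per project, derived from level elevations.
levels = (elements[elements["category"] == "Levels"]
          .sort_values(["project", "elevation"]))
levels["floor_to_floor"] = levels.groupby("project")["elevation"].diff()
print(levels.groupby("project")["floor_to_floor"].describe())

# Which projects used a particular piece of mechanical equipment?
ahu_projects = elements.loc[
    (elements["category"] == "Mechanical Equipment") &
    (elements["type_name"].str.contains("AHU", na=False)),
    "project"
].unique()
print(sorted(ahu_projects))
```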

Smarter QA/QC

When working with large models and large teams, ‘what is in the model’ can be almost impossible to determine. The rate at which people add elements is faster than the speed at which we can check them. Our QA/QC tools are not in line with our speed of production. Running model checks against a set performance standard is another big opportunity here. The information is there; we just need to be able to see it in the right form, and it needs to come to us at the right speed. It needs to keep up with production.
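To make that concrete, a model check can be as simple as a list of small rules run over the same exported element records as above. Here is a minimal sketch; the rules, thresholds and parameter names (FireRating, Area) are illustrative assumptions rather than an actual office standard:

```python
# Minimal sketch of automated model checking -- rules, thresholds and
# field names are illustrative assumptions, not an office standard.
def check_door_fire_ratings(elements):
    """Flag doors that carry no fire-rating value."""
    return [el for el in elements
            if el.get("category") == "Doors"
            and not el.get("parameters", {}).get("FireRating")]


def check_room_areas(elements, min_area_m2=9.0):
    """Flag rooms smaller than the assumed minimum area."""
    return [el for el in elements
            if el.get("category") == "Rooms"
            and el.get("parameters", {}).get("Area", 0) < min_area_m2]


CHECKS = [check_door_fire_ratings, check_room_areas]


def run_checks(elements):
    """Run every check and report a count of failures per rule."""
    for check in CHECKS:
        failures = check(elements)
        print(f"{check.__name__}: {len(failures)} issue(s)")
```

Run on every sync or on a nightly schedule, checks like these can surface issues at something much closer to the speed at which the model is being built.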

The Cone of Uncertainty

I recently sat in on a call for a project where two separate cost consultants were being given the opportunity to engage the BIM process and take advantage of any model-based data they could derive. The client was allowing them some time to engage the process and use the project as a pilot to identify their own gaps, whatever those may be.

After some talk, a leading voice emerged making the claim that the traditional method would be preferable as it would be the fastest. The comparison focused on how quickly each approach could produce the same product required by industry standard and by their scopes. Based on these parameters, this person was probably correct. The odds were stacked against the new process, as the lack of experience with the tools alone would doom the surveyors to a slower process than usual.

These parameters, however, failed to make a true comparison, since the two products would be fundamentally different. The traditional method would lead them to a reasonably accurate pricing estimate, but with a considerable allowance due to uncertainty. The new process would require this allowance as well, but the transparency afforded by having the design model available should move the needle.

In software development project management, the Cone of Uncertainty represents the ‘…evolution of the amount of uncertainty during a project.’ (source) The premise is that at the onset of a project, little is actually known about what the product will look like in the end and how it is going to be made. As we move through the process of design and development, decisions are made and paths are selected, until eventually unknowns ‘…decrease, reaching 0% when all residual risk has been terminated or transferred. This happens by the end of the project i.e. by transferring the responsibilities to a separate maintenance group.’

Source: modernanalyst.com via Federico on Pinterest
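To put rough numbers on that premise, here is a small sketch using the estimate-range multipliers commonly cited in the software-estimation literature (Boehm and McConnell). The milestones, factors and the point estimate are illustrative only; they are not taken from the article quoted above or from any project of ours:

```python
# How the cone narrows: the same nominal estimate carries a shrinking
# band of uncertainty as decisions get made. The multipliers are the
# ones commonly quoted in software-estimation texts; treat them as
# illustrative, not as measured data.
CONE = [
    ("Initial concept",          0.25, 4.00),
    ("Approved definition",      0.50, 2.00),
    ("Requirements complete",    0.67, 1.50),
    ("Design complete",          0.80, 1.25),
    ("Detailed design complete", 0.90, 1.10),
]

nominal = 1_000_000  # a hypothetical point estimate, in dollars

for milestone, low, high in CONE:
    print(f"{milestone:26s} {nominal * low:>12,.0f} - {nominal * high:>12,.0f}")
```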

The same article states that “Most environments change so slowly that they can be considered static for the duration of a typical project, and traditional project management methods therefore focus on achieving a full understanding of the environment through careful analysis and planning.” This strikes a clear parallel with the traditional stages of the design delivery process. A lot of work is done through design development and construction documents to more clearly define a building’s scope of work. ‘Careful analysis and planning’ is done, sometimes for years, to come up with a set of specifications and drawings that will act as the basis for design in the bidding and construction phase.

It continues, “Well before any significant investments are made, the uncertainty is reduced to a level where the risk can be carried comfortably.” It stands to reason, then, that a rich source of insight into the designer’s intent – as is the model – would go a long way toward moving the dial farther to the right in the above graph. Even in a model of poor craft, having a clear understanding of the ‘gaps’ in the design is of value.

Model- and data-driven approaches to studying how a model was built, and the intent behind it, should then give the quantity surveyor a richer context, essentially ‘de-risking’ the estimating process by increasing transparency and democratizing access to information.

The person described above identified risk in the many unknowns that came with the new process and new information, and would act to mitigate that risk by recommending we not engage it. Instead, he should see the opportunity to provide a more complete estimate earlier in the project schedule, with fewer unknowns. This, regardless of whether a model exists, should be at the core of their internal research and development efforts. In this case, they are being given the opportunity to work with a client jointly on that R&D. I hope they take that chance.