Bob Bea, the nation's foremost forensic engineer, is the guy to call when levees break or oil rigs explode.
In 1967, Eric Brown, who taught failure analysis at Imperial College London, described structural engineering as "the art of molding materials we do not really understand into shapes we cannot really analyze, so as to withstand forces we cannot really assess, in such a way that the public does not really suspect." His observation can be applied to many modern engineering marvels.

It is one of the ironies of our time that while new technologies allow us to go ever faster, deeper, and higher, they are increasingly laden with a "catastrophic complexity" that can overwhelm our ability to operate them safely. As the gap between man and his machines narrows – as computers become an extension of our nervous systems – the distinction between engineering problems and human problems has blurred.

The cause of most large-scale calamities, notes Bea, is "the human factor." Too often, designers fall in love with their ideas and lose sight of practical, quotidian concerns. "We engineers tend to believe in the myth of perfection," he says. "We understand the logic of systems and machines. What we don't understand is all of you illogical humans. We aren't trained to take into account things like hubris, greed, sloth, office politics, and the rest of it. It's not part of our skill set. But it needs to be. Dealing with the human factor is almost always more complicated than the technology."

Indeed, Bea's database reveals that 20 percent of engineering disasters are the result of "intrinsic uncertainties" – i.e., natural events such as floods, tornadoes, and dust storms – while 80 percent are rooted in "extrinsic uncertainties" – i.e., human error.

Consider the Challenger disaster in 1986, which was ostensibly a technical problem but was really caused by organizational dysfunction at NASA. The space shuttle broke apart shortly after launch, when an O-ring seal in one of its solid rocket boosters failed, killing its seven crew members. Investigating the accident, Bea found that NASA had known that O-rings could be compromised by cold weather: The January launch day was unusually cold, yet NASA controllers did not wait for the temperature to rise – a decision driven by inflexible managers and their schedules. The result was "an avoidable tragedy," says Bea. Similarly, when the space shuttle Columbia was destroyed during atmospheric reentry in 2003, killing all seven crew members, the given cause was foam insulation that broke off during launch and damaged the orbiter's wing, but the root cause was NASA's broken culture. "We look back and say these accidents could have been prevented," Bea sighs. "NASA delivered incredibly high levels of individual performance, but they struggled to manage groups of people."

Looming behind such behavioral concerns is an intractable engineering problem: Much of America's critical infrastructure is aging and on the verge of failure. (A case in point: the 2007 collapse of an outmoded and improperly maintained highway bridge in Minneapolis, which killed 13 people.) Once the envy of the world, our power grids, bridges, tunnels, highways, dams, airports, drinking-water systems, and wastewater-treatment plants are in desperate need of investment and updating.

In 2009, the American Society of Civil Engineers (ASCE) gave U.S. infrastructure an average grade of D, meaning it is nearing failure, and estimated that the cost to repair and modernize these building blocks of society would be at least $2 trillion. But the cost of allowing them to fail is even greater, both in the short term (leading to expensive and inconvenient stopgap measures) and in the long term (limiting America's efficiency and global competitiveness). The longer we avoid the problem, the greater the risk. Or, as Bea puts it: "We wait, we're fucked."