If failure is the greatest teacher, then Venkat Venkatasubramanian, the Reilly professor of chemical engineering at Purdue University, is one of its star pupils.
For 25 years, he’s studied risk analysis and the management of complex systems—or, in layman’s terms, how to keep horrendously complicated, excruciatingly twitchy technological edifices from collapsing under their own weight.
That’s meant taking close, post-mortem looks at everything from the Deepwater Horizon oil spill in the Gulf of Mexico to Japan’s recent nuclear crisis, then trying to figure out what went wrong and when.
“I often say that a disaster is a terrible thing to waste,” Venkatasubramanian said. “When these accidents happen, let’s at least draw the right lessons from it.”
Recent years have offered plenty of such lessons, from oil spills and meltdowns to the housing bubble and Bernie Madoff’s Ponzi scheme. Venkatasubramanian has examined them all and discovered that, while complex systems can serve drastically different purposes, they all fail for surprisingly similar reasons.
He got into this esoteric field in 1984, while finishing his doctorate and starting his postdoctoral work at Carnegie Mellon University. In December of that year, a Union Carbide-owned pesticide plant in Bhopal, India, suffered a catastrophic failure, releasing a cloud of highly toxic methyl isocyanate that killed thousands and injured tens of thousands more.
“I became interested in the question of how to prevent such disasters, through better design and control of chemical plants, so that these risks can at least be minimized, if not avoided entirely,” he said.
His work is of interest to plenty of people and businesses in the “real world,” including Eli Lilly and Co., which collaborates with him to explore the use of sophisticated computer models in pharmaceutical product development.
“Complex systems are a fact of life and by their very nature are challenging to understand,” said Lilly senior research fellow Dr. Henry Havel, who has collaborated with Venkatasubramanian for four years.
“Through the work of professor Venkatasubramanian and others in his field, considerable advancements have been made to provide mankind with the ability to understand and navigate complex systems.”
‘Like a genie’
In his early work, Venkatasubramanian focused mostly on risk mitigation at chemical plants—a field where the stakes are extremely high. As Bhopal proved, when chemical plants have a bad day, everybody—or at least, everybody downwind—has a bad day, too.
“A chemical plant is like a genie, with one important difference,” Venkatasubramanian said. “A chemical plant only grants you good things if you can keep it contained in its bottle. When it gets out, it’s a disaster. And this genie wants to get out every second. You have to work hard to keep it bottled.”
But there are other kinds of genies. Over the years, Venkatasubramanian’s studies have expanded to include pretty much all complex systems, chemical-related or not.
And by “complex” he means human constructions so complicated and interconnected that they more closely resemble ecosystems than mere machines or processes.
In these ecosystems, changing a couple of random factors can trigger unexpected—sometimes catastrophic—consequences, making it literally impossible to predict all the ways in which they can fail. Or even most of them.
“There are literally zillions of those kinds of combinations,” Venkatasubramanian said. “It’s hard to anticipate them.”
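The arithmetic behind that remark is easy to check. Even a modest system of, say, 50 components that can each be in a normal or a faulted state (a figure chosen purely for illustration, not drawn from any real plant) has more joint failure combinations than anyone could ever enumerate:

```python
# Hypothetical illustration: a system of 50 two-state components.
# Every component can be "normal" or "faulted", so the number of
# joint states doubles with each component added.
components = 50
joint_states = 2 ** components  # every possible failure combination
print(f"{joint_states:,} possible combinations")  # over a quadrillion
```

Real plants have far more components, each with more than two possible states, which is why exhaustively anticipating every failure mode is a lost cause.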
After last summer’s BP oil disaster, Venkatasubramanian wrote a cover story for the American Institute of Chemical Engineers’ AIChE Journal on that challenge.
The article highlighted obvious, though rarely noted, similarities among disasters in different fields.
“I argue that there are lots of common features across different kinds of system failures,” Venkatasubramanian said. “Except we don’t seem to look at them that way.”
If we did look at them that way, this is what he thinks we’d see: Modern technology creates more and more complex engineered systems, which are increasingly difficult to design, control and maintain.
The “nonlinear” interactions of a system’s myriad components can lead to “emergent” behavior—hard-to-predict, hard-to-control developments not anticipated by the designer.
Stir in human error and equipment failure, and you get a recipe for disaster.
In such situations, failures stem from a myriad of factors that come together in an unexpected way, though management rarely rushes to admit it. Venkatasubramanian said disasters are almost always initially blamed on a single person, and those responsible almost inevitably label them totally unpredictable and unexpected. Bhopal was at first pinned (incorrectly) on a single disgruntled employee, and Enron’s collapse was laid at the feet of the company’s chief financial officer, Andrew Fastow.
Post-mortems usually reveal not a solitary smoking gun, but a long-running, near-systemic deterioration of safety precautions.
“In the subprime [mortgage meltdown] case, for example, there were lots of signs that most people ignored,” Venkatasubramanian said.
Hindsight being 20/20, it’s pretty easy to understand the danger of a train once it’s run you over. The trick is to see it in time to step out of the way. And the best way to do that, Venkatasubramanian said, is with extreme vigilance, close regulation and fanatical attention to safety.
“You watch the systems even more closely,” he said. “And you pay attention to all kinds of things. And you design systems that are inherently safer. You cannot have 100 percent safety but we can get very, very close to it. This is going to be more and more important.”
And therein lies perhaps the biggest roadblock to Venkatasubramanian’s ideas. It’s tough enough to get budget-constrained managers to buy a new coffeemaker for the break room, let alone spend to guard against seemingly improbable disasters. Or to get next-quarter-obsessed chief financial officers to take the long view.
“They can do all this and all they’ll see is the expense, but the benefit comes only in the long run,” Venkatasubramanian said. “Thirty years later, we’ll know we had a fantastic safety record.”
Which is why safety can get shortchanged. Venkatasubramanian believes the erosion can start small, with a corner cut here and there, often to no perceived effect, which only invites more cuts.
“Generally speaking, when engineers design these systems, they put in a certain amount of cushion,” he said. “So you cut corners here and there and find that things still work, so you cut a few more. You keep doing this and eventually you eat away your safety margin, and it’s just a matter of time before something happens.”
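The erosion Venkatasubramanian describes can be sketched as a toy model (all numbers hypothetical, not taken from any real facility): each corner cut shaves a fraction off the remaining design margin, nothing visibly fails for years, and then a perfectly routine disturbance exceeds what is left.

```python
# Toy sketch of safety-margin erosion. The figures are illustrative:
# a 10% cut per year against a margin of 100 arbitrary units.
DESIGN_MARGIN = 100.0        # cushion the engineers built in
CUT_FRACTION = 0.10          # each corner cut removes 10% of what remains
ROUTINE_DISTURBANCE = 25.0   # size of an ordinary upset the system must absorb

margin = DESIGN_MARGIN
failure_year = None
for year in range(1, 31):
    margin *= 1 - CUT_FRACTION            # "cut corners here and there"
    if margin < ROUTINE_DISTURBANCE:      # things "still work" until they don't
        failure_year = year
        break

print(f"Margin gone in year {failure_year}: {margin:.1f} units left "
      f"vs a routine disturbance of {ROUTINE_DISTURBANCE}")
```

The point of the sketch is the silence before the break: for thirteen straight years in this model, every cut appears cost-free, exactly the feedback loop Venkatasubramanian warns about.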
The problem is that the human brain isn’t wired to worry about long-range, somewhat nebulous threats. For instance, Venkatasubramanian notes that while proper diet and regular exercise can prevent heart disease, you’ll routinely find far more people waiting in line at the Wendy’s drive-through than waiting for a turn at the weight machines at the gym.
This blindness to encroaching threats has become something of a media talking point. Writer and business speaker Margaret Heffernan recently wrote a book on the topic called “Willful Blindness.”
“You need to be aware that it’s human nature not to see what you don’t want to see,” Heffernan said. “You have to put systems in place that assure that people bring these things to your attention. When you turn a blind eye to something, it makes you feel safer. But what it’s doing is putting you in greater and greater danger.”
Venkatasubramanian offers a couple of ways to keep us safe in spite of ourselves.
For instance, he said it’s crucial to design systems—be they chemical plants or federal monetary policy—that are inherently safer.
At Bhopal, for instance, some 60 tons of lethal methyl isocyanate were kept on-site.
A far better approach, Venkatasubramanian said, would be to synthesize the chemical on demand during the manufacturing process and consume it immediately.
“At any given time, you may have only tens of kilograms on hand instead of thousands of kilograms,” he said. “Even when things happen, damage is limited. They may be a little more expensive to build, design and operate, but in the long run it pays off.”
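The scale of that difference is easy to see with back-of-envelope arithmetic. Using the roughly 60 tons reported at Bhopal against a hypothetical on-demand inventory of 50 kilograms (a figure chosen only for illustration, not cited by Venkatasubramanian):

```python
# Back-of-envelope comparison of worst-case releasable inventory.
# 60 tons is the figure reported at Bhopal; 50 kg is a hypothetical
# make-and-consume inventory, chosen only for illustration.
BULK_STORAGE_KG = 60 * 1000   # ~60 tons of methyl isocyanate stored on-site
ON_DEMAND_KG = 50             # on-demand inventory (hypothetical)

reduction = BULK_STORAGE_KG / ON_DEMAND_KG
print(f"Worst-case release shrinks by a factor of {reduction:,.0f}")
```

Whatever the exact numbers, the design principle is the same: the less hazardous material a plant holds at any moment, the less a worst-case failure can release.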
Given humanity’s inability to see trouble coming, some might see a Chicken Little-ish quality to Venkatasubramanian’s quest for safety.
But he fears that mind-set only increases the risk of calamities. If all disasters are essentially alike, that means what happens in one can provide a teaching moment for all of us. Perhaps the lessons learned from a nuclear meltdown could help prevent a financial meltdown.
“That’s one of the messages I try to give my students,” Venkatasubramanian said. “I say we can never take safety for granted. You have to work at it. You have to think about the unthinkable.”