Queen Bess Island Restoration

FILE - In this June 4, 2010, file photo, a clean-up worker picks up blobs of oil in absorbent snare on Queen Bess Island at the mouth of Barataria Bay near the Gulf of Mexico in Plaquemines Parish, La. Nearly $17 million in Deepwater Horizon oil spill money would rebuild a barrier island bird rookery off Louisiana to more than seven times its current size under a recently released plan. Queen Bess Island was the first spot where brown pelicans were returned to Louisiana after the pesticide DDT wiped them out. It was heavily hit by oil from the 2010 spill. (AP Photo/Gerald Herbert, File)

Next month marks the ninth anniversary of the BP Deepwater Horizon oil rig explosion off the coast of Louisiana, which killed 11 workers, injured 17 others, and spewed millions of gallons of oil into the Gulf of Mexico.

For those of us closest to the accident, the April 20, 2010, explosion will always be, first and foremost, a grave tragedy. But for analysts who study such things, the mishap is also something else: a case study yielding insights about how similar mistakes might be prevented in the future.

Or so we’ve been reminded by “Meltdown,” a 2018 book by Chris Clearfield and András Tilcsik that’s just been published in paperback. The subtitle of “Meltdown” is “What Plane Crashes, Oil Spills, and Dumb Business Decisions Can Teach Us About How to Succeed at Work and at Home.”

Clearfield is a former derivatives trader who lives in Seattle. Tilcsik, who researches organizational behavior, lives in Toronto. "Meltdown" examines a number of systems failures, including Deepwater Horizon, a crash on the Washington, D.C., Metro, and an accidental overdose in a state-of-the-art hospital.

Although the incidents vary, the authors maintain that accidents are, in many ways, more alike than we often assume. Their “underlying causes,” Clearfield and Tilcsik write, “turn out to be surprisingly similar. These events have a shared DNA, one that researchers are just beginning to understand. That shared DNA means that failures in one industry can provide lessons for people in other fields ...”

The Deepwater Horizon explosion, the authors conclude, "all came down to BP's failure to manage the complexity of its well. Just as radioactivity makes it hard to observe the core of a nuclear reactor directly, the high-pressure, underwater environment obscured what was going on inside the well. Drillers just couldn't 'send a guy down' to see what was happening miles below the earth. Instead, they had to rely on computer simulations and indirect measurements like well pressure and pump flow ... Horizon's crew was operating on the razor's edge of catastrophe, but they didn't know it."

As the Deepwater Horizon crew battled the blowout, according to "Meltdown," "complexity struck again. The rig's elaborate emergency systems were just too overwhelming. There were as many as thirty buttons to control a single safety system, and a detailed emergency handbook described so many contingencies it was hard to know which protocol to follow."

Clearfield and Tilcsik conclude that BP’s “approach to safety might have worked in a simpler system, like a routine onshore drilling operation ... But Deepwater Horizon was a complex offshore rig. It operated squarely in the danger zone.”

Their point is that as complicated operations like oil exploration evolve, safety procedures must evolve with them. It's an abiding lesson of Deepwater Horizon, one worth remembering nearly a decade after a disaster that shook Louisiana and the world.