Risk and reward as more complexity leads to new problems

In discussing the recent fine levied against BP for the 2010 oil spill in the Gulf of Mexico, an interesting question can be raised: are events and problems like this simply inevitable given the growing complexity of society?

In 1984, a Yale University sociologist named Charles Perrow published a book called “Normal Accidents: Living with High-Risk Technologies.” He argued that as technologies become more complex, accidents become inevitable.

The more complex safety features that are built in, the more likely it is that something will go wrong. You not only add technical complexity, with more things that can go wrong, but you also add a human element of complacency. The more often things don’t go wrong, the more likely people are to think they won’t. The phrase for this is “normalization of deviance,” coined by Boston College sociologist Diane Vaughan, who studied the 1986 explosion of the space shuttle Challenger.

“Normal accident” and “normalization of deviance” come to mind because 10 days ago, the oil company BP agreed to plead guilty to 12 felony and two misdemeanor criminal charges in connection with the 2010 explosion of the Deepwater Horizon drilling rig in the Gulf of Mexico. Eleven workers were killed and nearly 5 million barrels of oil (210 million gallons) poured into the Gulf over 87 days…

But it requires complex systems that will, at some point, fail. Politically, the government can only seek to explain those risks, try to minimize them with tough regulation and make sure those who take big risks have the means to redress inevitable failure.

If these sorts of events are inevitable given more complexity and activity (particularly in the field of drilling and extraction), how do we balance the risks and rewards of such activity? How much money and effort should be spent trying to minimize risky outcomes? This is a complex social question that involves a number of factors. Unfortunately, such discussions often happen after the fact rather than ahead of possible occurrences. This is what Nassim Taleb discusses in The Black Swan: we can do certain things to prepare for, or at least think about, both known and unknown events. We shouldn’t be surprised that oil accidents happen, and we should have some idea of how to tackle the problem or make things better after the fact. A fine against the company is punitive, but will it remedy the consequences of the event or guarantee that no such event happens in the future? Probably not.

At the same time, I wonder if such events are more difficult for us to understand today because we have strong narratives of progress. Although it is not often stated this explicitly, we tend to think such problems can be eliminated through technology, science, and reason. Yet complex systems have points of frailty. Perhaps the technology hasn’t been tested in all circumstances. Perhaps unforeseen or unpredictable environmental or social forces arise. And, perhaps most of all, these systems tend to involve humans who make mistakes, unintentionally or intentionally. This doesn’t mean we can’t strive for improvements, but it does mean we should keep in mind our limitations and the problems that might arise.
