We are still trying to cope with 19th-century social changes

I recently heard a talk by historian Heath Carter about his new book Union Made: Working People and the Rise of Social Christianity in Chicago, in which he drew connections between the Gilded Age and our own era of inequality. Thinking further about the topic, I was struck by how many issues that were pertinent then are still present today. While it is difficult to know exactly when social processes begin and end, here is an incomplete list of concerns from the 1800s that we are still trying to figure out:

-How do we cope with all the people moving from small towns/agricultural areas to big cities?

-How can we have fulfilling lives in an industrialized, mechanized, global, capitalistic economic system? How do we deal with influential corporations as well as the ultra-rich?

-How can welfare states operate effectively with numerous interest groups and big money involved?

-Can science and religion coexist?

-On the whole, does mass culture (through mass media whether newspapers, telegraph, radio, TV, or Internet) help or hinder society?

-Can technological progress solve many of our problems?

-How can society – particularly nation-states – be cohesive and unified given increasing heterogeneity and specialization?

-What is the role of the self compared to the shaping and undeniable influence of growing (and often necessary) social institutions?

-What kind of relationship should we have with nature given industrialization and modifications to the environment?

-Under what pretexts and at what costs should major wars and social conflicts be waged?

Put another way, it is little surprise to me that the discipline of sociology emerged when it did: as these large-scale social changes were getting underway, numerous people were interested in explaining their effects. But these are long-term social processes that may take decades or centuries to play out across a variety of contexts as they interact with one another.

Science joke that leads to sociology

A review of a sci-fi novel uses a joke to help make the argument that sociological issues underlie scientific and technological issues:

There’s an old joke in science: Applied physics is called chemistry, applied chemistry is called biology, and applied biology is called sociology.

The purpose of the joke?

Each is more complex, vague, and unpredictable than the last. The problems Robinson lays out proceed neatly along that ladder of intricacy.

Scientific knowledge and technological breakthroughs may not matter much if social groups don’t know what to do with them or cannot use them well to benefit society.

Big claim in a new book title: “Society Explained”

Go big or go home with your sociology book titles, as the new book Society Explained illustrates:

Rousseau had a couple of overriding goals in writing the book, his third.

One goal was to make the case that technological change, the use of social media and a sense of both economic and personal powerlessness are causing people to turn inward and become increasingly self-absorbed.

“People are much more alone than they need to be,” Rousseau said.

His other goal was to write a book that “is not dull or jargon-filled,” as many sociology texts tend to be, using personal examples and historical perspective.

“I was trying to take a very down-to-earth look at how our society functions,” he said.

In that, he appears to have struck a chord. After reviewing more than 7,000 titles, the American Library Association has named “Society Explained” one of the top 25 academic books of 2014.

Does this book offer one or a few key social forces that explain society today, or does it take the typical introduction-to-sociology approach of surveying numerous subfields? I would expect the former with such a title, though I’ve seen enough books to suspect the latter is true. Alas, society is complex, with numerous moving parts, and it doesn’t have the kind of universal laws found in the natural sciences. (What is the sociological equivalent of the law of gravity?) Yet this is precisely what makes the subject so fascinating.

Aurora fire illustrates need for redundancy in key infrastructure systems

A fire at an Aurora FAA facility caused all sorts of airport problems in Chicago and across the country:

The FAA said it’s working “closely with the airlines that serve the Chicago-area airports to minimize disruptions for travelers” and expects to “continue to increase the traffic flow at those two airports over the weekend.” FAA officials did not respond Saturday to requests for more information.

At least 778 flights had been canceled Saturday out of both airports by just before 3 p.m., according to Flightstats, a website that monitors air traffic.

O’Hare was able to operate at around 60 percent of its usual Saturday capacity, said Doug Church, a spokesman for the National Air Traffic Controllers Association.

Because of the fire at the Aurora facility, O’Hare’s control tower can’t receive or send to other control centers the airlines’ automated flight plans, so airlines are having to fax them to O’Hare. That’s requiring two controllers to staff every position at the main O’Hare tower and forced the closure of the auxiliary north tower at the airport, Church said.

While this was certainly an unusual accident, it illustrates the fragility of some of our key infrastructure: the behind-the-scenes equipment and people that keep airplanes flying and airports operating. As many have noted, flying in the United States has become quite humdrum in recent decades, partly because of the general efficiency of this system. No one likes delays, lost luggage, or maintenance problems, but the number of flights in the air on a daily basis and the relative ease of traveling across long distances are still pretty remarkable.

What we need are redundancies in these key systems in case something does go wrong. As the article notes, the whole system isn’t shut down because flight plans can be sent by fax. But isn’t there a quicker way – digital photos or scans, say – to do this? Can’t it be done by one person? Then again, building redundant systems often requires significant money upfront, a luxury many systems don’t have. At the very least, this incident in Aurora should prompt some rethinking of what can be done better the next time a key facility breaks down.

“Normal accidents” and black swans in the complex systems of today

A sociological idea about the problems that can arise in complex systems is related to Nassim Taleb’s idea of black swans:

This near brush with nuclear catastrophe, brought on by a single foraging bear, is an example of what sociologist Charles Perrow calls a “normal accident.” These frightening incidents are “normal” not because they happen often, but because they are almost certain to occur in any tightly connected complex system.

Today, our highly wired global financial markets are just such a system. And in recent years, aggressive traders have repeatedly played the role of the hungry bear, setting off potential disaster after potential disaster through a combination of human blunders and network failures…

In his book Normal Accidents, Perrow stresses the role that human error and mismanagement play in these scenarios. The important lesson: failures in complex systems are caused not only by the hardware and software problems but by people and their motivations.

See an earlier post dealing with the same sociological ideas. Nassim Taleb discusses this quite a bit and suggests that knowing about this complexity should lead us to act differently: we should try to minimize the disastrous risks while looking for opportunities for extraordinary success (if there are inevitable yet unknown opportunities for crisis, there could also be moments where low-risk investments pay off spectacularly).

If these are inherent traits of complex systems, does this mean more people will argue against such systems in the future? I could imagine some claiming this means we should have smaller systems and more local control. However, we may be at the point where even much smaller groups can’t escape a highly interdependent world. And, as sociologist Max Weber noted, bureaucratic structures (a classic example of complex organizations or systems) may have lots of downsides but they are relatively efficient at dealing with complex concerns. Take the recent arguments about health care: people might not like the government handling more of it but even without government control, there are still plenty of bureaucracies involved, it is a complex system, and there is plenty of potential for things to go wrong.

Argument: Tom Wolfe’s “sociological novel” about Miami doesn’t match reality

A magazine editor from Miami argues Tom Wolfe’s latest “sociological novel” Back to Blood doesn’t tell the more complex story of what is going on today in that city:

TOM WOLFE has often declared that journalistic truth is far stranger — and narratively juicier — than fiction, a refrain he’s returned to while promoting his latest sociological novel, the Miami-focused “Back to Blood.” With cultural eyes turning to Miami for this week’s Art Basel fair, and on the heels of a presidential election in which South Florida was once again in the national spotlight, “Back to Blood” would seem a perfectly timed prism.

Yet Mr. Wolfe would have done well to better heed his own advice. The flesh-and-blood reality not only contradicts much of his fictional take, it flips the enduring conventional wisdom. Miami is no longer simply the northernmost part of Latin America, or, as some have snarked, a place filled with folks who’ve been out in the sun too long.

For Mr. Wolfe, the city remains defined by bitter ethnic divisions and steered by la lucha: the Cuban-American community’s — make that el exilio’s — frothing-at-the-mouth fixation on the Castro regime across the Florida Straits. The radio format whose beats Miami moves to isn’t Top 40, rap or even salsa, but all Fidel, all the time. It’s a crude portrait, established in the ’80s, reinforced by the spring 2000 telenovela starring Elián González, hammered home in the media by that fall’s Bush v. Gore drama and replayed with the same script every four years since.

Yet the latest data hardly depicts a monolithic Cuban-exile community marching in ideological lock step. Exit polls conducted by Bendixen & Amandi International revealed that 44 percent of Miami’s Cuban-Americans voted to re-elect President Obama last month, despite a Mitt Romney TV ad attempting to link the president with Mr. Castro. The result was not only a record high for a Democratic presidential candidate, it was also a 12 percentage-point jump over 2008.

Can a novel, even a sociological one, capture all of the nuances of a big city? Or is a novel more about capturing a spirit or the way these complexities influence a few characters? While I do enjoy fictional works, this is why I tend to gravitate toward larger-scale studies of bigger patterns. One story or a few stories can explore nuance and detail, but it is hard to know how representative these smaller stories are of a larger whole. In Wolfe’s case, is his book a fair-minded view of what is taking place all across Miami, or does he pick up on a few fault lines and exceptional events?

While browsing in a bookstore the other day, I noticed an interesting book that tries to bridge this gap: The Human Face of Big Data. On the one hand, our world is becoming one where large datasets with millions of data points are the norm, and it may become harder and harder for novels to capture all of the patterns and trends. Yet we don’t want to lose perspective on how this data, and the resulting policies and actions, affect real people.

Risk and reward as more complexity leads to new and more problems

In discussing the recent fine levied against BP for the 2010 oil spill in the Gulf of Mexico, an interesting question can be raised: are events and problems like this simply inevitable given the growing complexity of society?

In 1984, a Yale University sociologist named Charles Perrow published a book called “Normal Accidents: Living with High-Risk Technologies.” He argued that as technologies become more complex, accidents become inevitable.

The more complex safety features that are built in, the more likely it is that something will go wrong. You not only add technical complexity (more things to go wrong) but you add a human element of complacency. The more often things don’t go wrong, the more likely it is that people think they won’t. The phrase for this is “normalization of deviance,” coined by Boston University sociologist Diane Vaughan, part of the team that examined the 1986 explosion of space shuttle Challenger.

“Normal accident” and “normalization of deviance” come to mind because 10 days ago, the oil company BP agreed to plead guilty to 12 felony and two misdemeanor criminal charges in connection with the 2010 explosion of the Deepwater Horizon drilling rig in the Gulf of Mexico. Eleven workers were killed and nearly 5 million barrels of oil (210 million gallons) poured into the Gulf over 87 days…

But it requires complex systems that will, at some point, fail. Politically, the government can only seek to explain those risks, try to minimize them with tough regulation and make sure those who take big risks have the means to redress inevitable failure.

If these sorts of events are inevitable given more complexity and activity (particularly in the field of drilling and extraction), how do we balance the risks and rewards of such activity? How much money and effort should be spent trying to minimize risky outcomes? This is a complex social question that involves a number of factors. Unfortunately, such discussions often happen after the fact rather than ahead of possible occurrences. This is what Nassim Taleb discusses in The Black Swan: we can do certain things to prepare for, or at least think about, known and unknown events. We shouldn’t be surprised that oil accidents happen, and we should have some idea of how to tackle the problem or make things better after the fact. A fine against the company is punitive, but will it remedy the consequences of the event or guarantee that no such event happens in the future? Probably not.

At the same time, I wonder if such events are more difficult for us to understand today because we do have strong narratives of progress. Although it is not often stated this explicitly, we tend to think such problems can be eliminated through technology, science, and reason. Yet, complex systems have points of frailty. Perhaps technology hasn’t been tested in all circumstances. Perhaps unforeseen or unpredictable environmental or social forces arise. And, perhaps most of all, these systems tend to involve humans who make mistakes (unintentionally or intentionally). This doesn’t necessarily mean that we can’t strive for improvements but it also means we should keep in mind our limitations and the possible problems that might arise.