Social science assumes “human living is not random”

Noted sociologist of religion Grace Davie gives a brief description of her work:

My work, like that of all social scientists, rests on the assumption that human living is not random. Why is it, for example, that Christian churches in the West are disproportionately attended by women? That requires an explanation.

This is a good starting point for describing the social sciences. There are patterns to human social life, and we cannot rely on anecdotes or personal impressions to determine whether those patterns exist or how to understand them. We want to apply a scientific perspective to these patterns and explain why those patterns, and not others, exist. Then we might delve deeper into levels of analysis, theoretical assumptions, and techniques of data collection and analysis – three areas where the various social science disciplines differ.

An award-winning guess at London’s fate in 50 years

Here is part of the winning entry of a UK contest to predict how social science research will affect life in 50 years:

Amidst decentralisation, London continued to grow, steadily gaining devolved powers. As 2043 arrived, the city into which I had moved 28 years previously was unrecognisable. From the 900m high tower in which I now lived, I surveyed a transforming cityscape, embracing recent technological developments. In 2022, Saudi Arabia completed Kingdom Tower, the world’s first kilometre high building. Besides technological innovation, it also had profound cultural implications; a range of social science consultants having pioneered community creation models. Under their guidance, its 5.5million ft2 of floor space offered offices, malls, accommodation and even artificial forests, stimulating a self-contained society with a culture of independence. Twelve years and four towers later, Kingdom City was a thriving metropolis of 2.1 million people. It represented a triumph for private finance and social science collaboration, setting a precedent for socially conscious corporation rule with minimal state involvement. Kingdom City prompted numerous equivalent developments throughout the Middle East and Asia in the late 2030s; social theory informed, self-contained, and privately administered. These express-cities dealt with population problems and boosted economies with ease, vindicating social planning.

Meanwhile, London had developed an immense housing crisis; its ballooning population shackled by construction regulation. London was desperate to emulate aforementioned eastern successes. It turned to its collection of world leading institutions, representing internationally renowned social psychologists, human geographers and many more, to plan ground-breaking reinvention. Throughout the 2040s, backed by multinational finance, London set about implementing their vision. Whilst primarily based around sociological community-seeding-housing ideas, this also facilitated a transport revolution. London already scorned cars, championing cycling and enjoying an unrivalled underground system following four Crossrail projects. Driverless electric vehicles had been increasingly present since the mid-2020s as battery technology improved. By the mid-30s, London proposed banning all human-driven petrol-fuelled vehicles, but the UK government was opposed; concerned that decreased fuel imports might jeopardise Gulf State relations. By the early-2040s, London was powerful enough to press ahead. Again the social sciences, bolstered by increasingly successful corporate ventures into city design, were instrumental in infrastructure planning, embedding the belief that public and corporate desires for liveability and efficiency were compatible. Resultantly, in 2053, the last human drove through the city. Simultaneously, influential internet scholars drove complete 5G rollout, providing unparalleled internet access. Contrastingly, large parts of the rest of the UK lacked 4G, creating a national digital divide. The scene was now set for divorce. In 2056 the government accepted a federalisation referendum. On May 4th 2058, London voted to become the UK’s fifth state.

Today, whilst technically federalised, London is essentially sovereign. Since the early-50s, state involvement has been nominal, particularly following parliament’s relocation to Manchester. London, like many 20th century capitals, now more closely resembles the Martian colonies than the nation surrounding it. These old nation states, largely unaltered from 2015, are increasingly inferior, especially as Space X’s mines and hydroponic innovations further improve city living standards. Social science’s guidance of private capital has enabled Jakarta, Doha and many more to smoothly transcend state structures, each now existing as a well-organised corporate amalgamation. This change is evident in my current work. Whilst trickledown economics and stringent immigration controls have all but ended real-term deprivation, inequality remains entrenched. Employed by London Inc., who are concerned by talent prevention, I am currently developing proposals to stimulate social mobility. This is just one example of how corporate-social science synergy is cultivating prosperous city societies in 2065.

These predictions appear to hinge on social science and private industry working together for London’s good, or at least for technological advancement. How many social scientists today would be interested in such collaboration, particularly if it meant that corporations could profit immensely or that the rich would continue to get richer? As the essay hints, such improvements could come at the expense of many other UK residents as London continues to grow and the rest of the country falls behind.

Maybe we should just file this away for five decades from now to see if any of this comes true…

Comparing ethnography and journalism, stories vs. data

A letter writer to the New York Times unfavorably compares ethnography and social scientific methods to journalism:

David Brooks’s review of George Packer’s book “The Unwinding: An Inner History of the New America” (June 9) is befuddling. First, Brooks praises Packer’s “gripping narrative survey” of recession-era life, comparing it to earlier efforts like that of John Dos Passos. Then, bizarrely, he faults Packer for not providing a “theoretical framework and worldview” that would include “sociology, economics or political analysis.” Narrative description and evocation has for centuries been among the most powerful forms of argument — so powerful, in fact, that the social psychologists Brooks admires appropriated the styles and cloaked them in the pseudoscientific garb of “ethnography” (which we used to call “journalism”).

Have we reached a point where devotion to instrumental reason is so maniacal we can’t handle mere stories anymore? Or perhaps we accept stories only when they’re accompanied by the tenuous methodology of social “scientists.” I would bet that a single profile by Packer, one of America’s best journalists, provides a better snapshot of real life than the legions of sociology and economics articles published since the crash.

Here is someone suspicious of social science. This is not an uncommon position. There is no doubt that stories and narratives are powerful; they also have a longer history than the social sciences, which developed in and after the Enlightenment. Yet we live in a world where science and data have also become powerful arguments.

Intriguingly, ethnography is a social scientific method that might help bridge this gap between narrative and data. The method differs from journalism in some important ways but also shares some similarities. The ethnographer doesn’t just work with statistics and data from a distance or through a few interviews. Through an extended engagement with the research subject, sometimes living with the subjects for months or years, the researcher gets an insider perspective while also trying to maintain objectivity. The participant observer also engages with broader social science theories and ideas, trying to understand how specific experiences and groups line up with larger theories and models. The particular case is of interest, but the connection to the bigger picture is ultimately very important. At the same time, ethnographies are often written in a more narrative style than social science journal articles (unless we are talking about journal articles utilizing ethnography).

Stories and data can both be illuminating. I know which side I tend to favor (I am a sociologist, after all), but I also enjoy narratives and “mere stories.”

Science more about consensus than proven facts

A new book titled The Half-Life of Facts looks at how science is more about consensus than canon. A book review in the Wall Street Journal summarizes the argument:

Knowledge, then, is less a canon than a consensus in a state of constant disruption. Part of the disruption has to do with error and its correction, but another part with simple newness—outright discoveries or new modes of classification and analysis, often enabled by technology. A single chapter in “The Half-Life of Facts” looking at the velocity of knowledge growth starts with the author’s first long computer download—a document containing Plato’s “Republic”—journeys through the rapid rise of the “@” symbol, introduces Moore’s Law describing the growth rate of computing power, and discusses the relevance of Clayton Christensen’s theory of disruptive innovation. Mr. Arbesman illustrates the speed of technological advancement with examples ranging from the magnetic properties of iron—it has become twice as magnetic every five years as purification techniques have improved—to the average distance of daily travel in France, which has exponentially increased over the past two centuries.

To cover so much ground in a scant 200 pages, Mr. Arbesman inevitably sacrifices detail and resolution. And to persuade us that facts change in mathematically predictable ways, he seems to overstate the predictive power of mathematical extrapolation. Still, he does show us convincingly that knowledge changes and that scientific facts are rarely as solid as they appear…

More commonly, however, changes in scientific facts reflect the way that science is done. Mr. Arbesman describes the “Decline Effect”—the tendency of an original scientific publication to present results that seem far more compelling than those of later studies. Such a tendency has been documented in the medical literature over the past decade by John Ioannidis, a researcher at Stanford, in areas as diverse as HIV therapy, angioplasty and stroke treatment. The cause of the decline may well be a potent combination of random chance (generating an excessively impressive result) and publication bias (leading positive results to get preferentially published)…

Science, Mr. Arbesman observes, is a “terribly human endeavor.” Knowledge grows but carries with it uncertainty and error; today’s scientific doctrine may become tomorrow’s cautionary tale. What is to be done? The right response, according to Mr. Arbesman, is to embrace change rather than fight it. “Far better than learning facts is learning how to adapt to changing facts,” he says. “Stop memorizing things . . . memories can be outsourced to the cloud.” In other words: In a world of information flux, it isn’t what you know that counts—it is how efficiently you can refresh.

To add to the conclusion of this review as cited above, what matters is less the specific content of scientific facts and more the scientific method one uses to arrive at scientific conclusions. There is a reason the scientific process is taught starting in grade school: the process is supposed to help observers get around their own biases and observe reality in a reliable and valid way. Of course, whether our biases can actually be eliminated and how we go about observing both matter for our results, but it is the process itself that remains intact.

This also gets to an issue some colleagues and I have noticed where college students talk about “proving” things about the world (natural or social). The language of “proof” implies that data collection and analysis can yield unchanging facts which cannot be disputed. But, as this book points out, this is not how science works. When researchers find something interesting, they report their finding, and others then go about retesting it or applying it to new areas. Over time, knowledge accumulates. To put it in the terms of this review, a consensus is eventually reached. But new information can counteract this consensus, and the paradigm-building process starts over again (a la Thomas Kuhn in The Structure of Scientific Revolutions). This doesn’t mean science can’t tell us anything, but it does mean that the theories and findings of science can change over time (and here is another interesting discussion point: what exactly are a law, a theory, and a finding?).

In the end, science requires a longer view. As I’ve noted before, the media tends to play up new scientific findings but we are better served looking at the big picture of scientific findings and waiting for a consensus to emerge.

NYT lays out three options for how personal religious faith could influence sociological work

At the end of a column looking at this summer’s public debate over research findings from sociologist Mark Regnerus, the writer suggests there are three ways personal religious faith could influence a sociologist’s work:

So if there is not really a Christian method in sociology, but there is a role for a self-described Christian in sociology, as Dr. Regnerus once averred, then what is that role? One can imagine several answers.

First, the religious — or atheist, for that matter — sociologist might have a set of topics that she finds particularly relevant to her beliefs. Given their traditions’ emphasis on traditional family, for example, a conservative Catholic or evangelical Protestant could reasonably gravitate toward the study of family structure.

Second, a scholar might have faith that good research ultimately brings people to God or furthers his plans. A Christian historian might trust that even a modest study of the Spanish-American War, or of Rhode Island history, would do a small part to reveal the providential nature of all history.

Finally, a scholar might be a “Christian scholar” by virtue of the pride he takes in his faith, especially in the secular academy. Dr. Regnerus was a proud Christian witness, once upon a time. But these days he won’t discuss his faith, even with a Christian magazine. Two weeks ago, Christianity Today ran a lengthy interview with Dr. Regnerus in which he said nothing about his religious beliefs.

Option one presented here seems to be the one that would be most acceptable to the broader scientific community. Lots of researchers have personal interests that guide them to particular areas of study, but then we tend to assume (or hope), a la Weber’s arguments about value-free sociology, that the findings will not be unduly influenced by these personal interests. At the same time, some might argue that completely separating personal life and research results may be a modernist dream.

I suspect options two and three wouldn’t get as much broad support.

It would also be interesting to see how this would play out if we weren’t talking about personal religious beliefs but other personal beliefs. For example, Jonathan Haidt has been looking at politics within social psychology and thinking about how these personal (and more collective) beliefs might influence a whole field.

Question at the beginning of urban planning: “beautiful people or beautiful cities”?

Here is part of an overview of the “birth of urban planning” and how the field began with a “focus on place at the expense of people”:

Before then, there were three types of people thinking about how a city should look and function — architects, public health officials, and social workers. Each group approached the question of city building very differently.

The architects were focused on the city as a built environment, implementing ideas like L’Enfant’s grand vision for Washington, D.C., and the New York City grid (set out by the Commissioner’s Plan of 1811). The public health professionals, on the other hand, were consumed with infrastructure. They knew there was a connection between certain diseases and social conditions, even if they didn’t know precisely what it was. Planning how a water system would work, or where waste should go, or how to get garbage out of a city, was the most effective way to stop diseases from spreading (see, for example, John Snow, who figured out in the 1850s that a single water pump on Broad Street in London had infected hundreds of people with cholera). And lastly the social workers wanted to use the city to improve the lives of the people living there. They wanted cleaner tenements, spaces for immigrant children to play, and more light and fresh air for residents.

These thinkers were brought together by the pressure cooker that was the Industrial Revolution. “At that moment, we began to look for technological ways to expand the city,” says Elliott Sclar, a professor of urban planning at Columbia University. “All of a sudden here’s a pressure to comprehensively plan. You can’t just put a privy wherever you want.”…

At that conference, and in the years that followed, any one of these early urban planning strains could have taken over as the intellectual giant in the field. Though the social workers and the public health officials continued to play a role, urban planning’s intellectual history ended up grounded in architecture.

That outcome is thanks in a large part to the creation of the country’s first urban planning school, at Harvard. The University founded a school of landscape architecture in 1898. It was, effectively, a vanity project, slavishly devoted to Frederick Law Olmstead (in fact, it was started by Olmstead’s son). At the same time, it was a place to start. Soon after, they began offering classes in city planning, a first for higher education in America.

This could be an intriguing intellectual “what if”: what if urban planning had initially followed a public health or social work path? How might our cities be different and how would that have changed our culture?

This reminds me of the roots of sociology. Like urban planning, sociology became a more formal academic discipline around the turn of the 20th century. While some people had been practicing sociology and urban planning, it took time for this to become institutionalized and formalized. Similarly, American sociology had its roots in a few influential departments, particularly Chicago, which shaped the early years of the field. Indeed, I suspect a number of the social sciences were formalized in this period as the cultural turn toward science and rationality combined with expanding college campuses.

Chuck Todd: President Obama takes an anthropological view of the world

In an interview, journalist Chuck Todd explains how President Obama sees the world:

CHUCK TODD: I would say the real danger for the president on issues like this, is less about this, and more about–Paul Begala one time said this to me–he said, you know, the guy really is his mother’s son sometimes when it comes to studying society.  He’s anthropological about it.  Remember that time when he was studying people in Pennsylvania, and he said to that fundraiser in Pennsylvania, you know they cling to their guns.  He wasn’t meaning it as demeaning in his mind, but it came across that way.

ANDREA MITCHELL: It’s intellectualized.

TODD: He’s the son of an anthropologist, and I think sometimes he goes about religion that way, almost in this, as I said because he’s very well studied on, not just Christianity but on a lot of religions, but in that, frankly, anthropological way, and that can come across as distant.

As you can see from the link above, conservatives don’t like this characterization, partly because they think intellectuals, and perhaps social scientists in particular (see this example regarding social psychologists), are already against them. But this is an interesting observation if correct: Obama may see the world like a social scientist, looking at larger patterns and trends and making observations. An anthropological view may reveal unpleasant or unspoken truths and provide some insights, but it may also be unfamiliar to some and may be mixed up with political agendas rather than remaining simply “value-free” (a la Max Weber).

This also raises an intriguing question about what background Americans prefer a president to have. In the past, being a general, or at least having served in the armed forces, was important, but this has declined in significance. In each of the last two presidential elections, one party ran a candidate who was a veteran, and both lost. Is a business leader better equipped? What about an academic? This is not simply confined to liberals; Newt Gingrich has a background as an academic historian. Hollywood or entertainment stars? Think Ronald Reagan, Jesse Ventura, Arnold Schwarzenegger, etc. Perhaps the best way to look at this is to work in the other direction and focus on the different traits that polling organizations have asked about. Here are the results of a Gallup poll from a few months ago:

While more than nine in 10 Americans would vote for a presidential candidate who is black, a woman, Catholic, Hispanic, or Jewish, significantly smaller percentages would vote for one who is an atheist (54%) or Muslim (58%). Americans’ willingness to vote for a Mormon (80%) or gay or lesbian (68%) candidate falls between these two extremes.