James Q. Wilson on the difficulties of studying culture

In a long opinion piece looking at possible explanations for the reduction in crime in America, James Q. Wilson concludes by suggesting that cultural explanations are difficult to test and develop:

At the deepest level, many of these shifts, taken together, suggest that crime in the United States is falling—even through the greatest economic downturn since the Great Depression—because of a big improvement in the culture. The cultural argument may strike some as vague, but writers have relied on it in the past to explain both the Great Depression’s fall in crime and the explosion of crime during the sixties. In the first period, on this view, people took self-control seriously; in the second, self-expression—at society’s cost—became more prevalent. It is a plausible case.

Culture creates a problem for social scientists like me, however. We do not know how to study it in a way that produces hard numbers and testable theories. Culture is the realm of novelists and biographers, not of data-driven social scientists. But we can take some comfort, perhaps, in reflecting that identifying the likely causes of the crime decline is even more important than precisely measuring it.

I find it a little strange that a social scientist wants to leave culture to the humanities (“novelists and biographers”). This sounds like a traditional social science perspective: culture is a slippery concept that is difficult to quantify and generalize about. I can imagine this viewpoint from quantitatively minded social scientists who would ask, “where is the data?”

But there is a lot of good research regarding culture that uses data. Some of it is fuzzier qualitative data drawn from ethnographies, long interviews, and observation. Other data on culture comes from more traditional sources such as large surveys. And if you put together enough of these data-driven studies, qualitative and quantitative, I think you could develop some hypotheses and ideas regarding American culture and crime. Perhaps all of this data can’t fit into a regression, and perhaps this isn’t the way crime is traditionally studied, but that doesn’t mean we have to simply abandon cultural explanations and studies.

Venkatesh argues Anderson’s recent book highlights sociology’s identity problem

Sudhir Venkatesh reviews Elijah Anderson’s new book The Cosmopolitan Canopy (earlier review here) and argues that the text is emblematic of a larger identity crisis within sociology:

Anderson’s struggle to make sense of the current multicultural situation is not only a function of his own intellectual uncertainty. It is also a symptom of the field in which he is working, which is confused about its direction. Where sociology once gravitated to the most pressing problems, especially the contentious issues that drove Americans apart, it no longer seems so sure of its mission. With no obvious crisis, disaster, or glaring source of inequity as a backdrop demanding public action, a great American intellectual tradition gives every sign of weathering a troubled transition…

Anderson’s fascinating foray and his inability to tie together the seemingly contradictory threads highlight the new challenges that face our field. On the one hand, sociology has moved far away from its origins in thoughtful feet-on-the ground analysis, using whatever means necessary. A crippling debate now pits the “quants,” who believe in prediction and a hard-nosed mathematical approach, against a less powerful, motley crew—historians, interviewers, cultural analysts— who must defend the scientific rigor and objectivity of any deviation from the strictly quantitative path. In practice, this means everyone retreats to his or her comfort zone. Just as the survey researcher isn’t about to take up with a street gang to gather data, it is tough for an observer to roam free, moving from one place to another as she sees fit, without risking the insult: “She’s just a journalist!” (The use of an impenetrable language doesn’t help: A common refrain paralyzing our field is, “The more people who can understand your writing, the less scientific it must be.”)

For Anderson to give up “fly on the wall” observation, his métier, and put his corporate interviews closer to center-stage would risk the “street cred” he now regularly receives. This is sad because Anderson is on to the fact that we have to re-jigger our sociological methods to keep up with the changes taking place around us. Understanding race, to cite just one example, means no longer simply watching people riding the subway and playing chess in parks. The conflicts are in back rooms, away from the eavesdropper. They are not just interpersonal, but lie within large institutions that employ, police, educate, and govern us. A smart, nimble approach would be to do more of what Anderson does—search for clues, wherever they may lie, whether this means interviewing, observing, counting, or issuing a FOIA request for data.

If you search hard enough, you can find pockets of experimentation, where sociologists stay timely and relevant without losing rigor. It is not accidental they tend to move closer to our media-frenzied world, not away from it, because it’s there that some of the most illuminating social science is being done, free of academic conventions and strictures. At Brown and Harvard, sociologists are using the provocative HBO series, The Wire, to teach students about urban inequality. At Princeton and Michigan, faculty make documentary films and harness narrative-nonfiction approaches to invigorate their research and writing. At Boston University, a model turned sociologist uses her experiences to peek behind the unforgiving world of fashion and celebrity. And the Supreme Court’s decision to grant the plaintiffs a “class” status in the Wal-Mart gender-discrimination case will hinge on an amicus brief submitted by a sociologist of labor. None of this spirited work occurs without risk, as I’ve found out through personal experience. Each time I finish a documentary film, one of my colleagues will invariably ask, “When are you going to stop and get back to doing real sociology?”

I have several thoughts about this:

1. I think it is helpful (and perhaps unusual) to see this piece at Slate.com rather than in an academic journal. At the same time, is this only possible for an academic like Venkatesh who has a best-selling popular book (Gang Leader for a Day) and is also tied to the Freakonomics crowd?

2. Venkatesh seems to be bringing up two issues.

a. The first issue is one of direction: what are the main issues or areas in which sociology could substantially contribute to society? If some of the issues of the early days such as race (still an issue, though Anderson’s data suggests it exists in different forms) and urbanization (generally settled in favor of suburbanization in America) are no longer that noteworthy, what is next? Consumerism? Gender? Inequality between the rich and poor? Exposing the contradictions still present in society (Venkatesh’s conclusion)?

This is not a new issue. Isn’t this what public sociology was supposed to solve? There also has been some talk about fragmentation within the discipline and whether sociology has a core. Additionally, there is occasional conversation about why sociology doesn’t seem to get the same kind of public or policy attention as other fields.

b. The second issue is one of data. While both Anderson and Venkatesh are well-known for practicing urban ethnography (as Venkatesh notes, a tradition going back to the early 20th century work of the Chicago School), Venkatesh notes that even Anderson had to move on to a different technique (interviewing) to find the new story. More broadly, Venkatesh places this change within a larger battle between quantitative and qualitative data where people on each side discuss what is “real” data.

This quantitative vs. qualitative debate has also been around for a while. One effort in recent years to address it is the move to mixed methods, where researchers use multiple sources and techniques to reach a conclusion. But it also seems that a common way to critique the work of others is to jump right to the methodology and suggest that it is so limited that one cannot come to much of a conclusion. Most (if not all) data is imperfect, and there are often legitimate questions regarding validity and reliability, but researchers are usually working with the best available data given time and monetary constraints.

In the end, I’m not sure Venkatesh provides many answers. So, perhaps just like his own conclusions regarding Anderson’s book (“Better to point [these contradictions] out, however speculative and provisional the results may be, than to hide from the truth.”), we should be content just that these issues have been outlined.

(Here is an outsider’s take on this piece: “One thing that’s the matter with sociology is that like economics the discipline’s certitude of conclusion outran its methodological rigor. Being less charitable, sociology is just an ideology which occasionally dons the gown of dispassionate objectivity to maintain a semblance of respectability.” Ouch.)

Scorecasting: Freakonomics for the sports world

A movement has been growing in the sports world over the last few decades: the use of lots of data to make decisions. Some of this data goes against “conventional wisdom,” such as whether players can be “clutch” (there is some good work on which NBA players you would want taking the final shot with the game on the line) and what should actually be valued in free agents (MLB’s shift toward statistics like on-base percentage over home runs and RBIs).
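As a quick aside on why a statistic like on-base percentage gets favored over home runs and RBIs: it credits every way of reaching base relative to plate appearances. Here is a minimal Python sketch of the standard formula; the season line in the example is made up purely for illustration.

```python
def on_base_percentage(hits, walks, hit_by_pitch, at_bats, sacrifice_flies):
    """Standard OBP formula: times on base divided by the plate
    appearances that count toward the statistic (AB + BB + HBP + SF)."""
    return (hits + walks + hit_by_pitch) / (at_bats + walks + hit_by_pitch + sacrifice_flies)

# Hypothetical season line: 150 H, 70 BB, 5 HBP, 500 AB, 5 SF
print(round(on_base_percentage(150, 70, 5, 500, 5), 3))  # 0.388
```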

A new book, Scorecasting, tackles a number of sports issues from a quantitative perspective. Read an interview (including a few examples from the book) with one of the authors here.

It will be interesting to see just how mainstream these sorts of ideas become. Does the average sports fan, or even the average sports broadcaster, want to rely on these kinds of data as opposed to their intuition or gut feelings? Numbers may provide a better explanation, but numbers also carry all sorts of perceptions, including the idea that people are just twisting the data to fit their explanation and that sports statistics are developed by geeks who can’t play sports (or something along these lines).

I, for one, would like to have more quantitative data available to me when watching sports. Information like a batter’s average for particular parts of the plate (usually split into nine segments) or on a particular count would be useful. The data might seem overwhelming, but ultimately I think it helps people see the patterns underlying their favorite sport. For example, a home run hit on an 0-2 count in the 9th inning to win the game is impressive in its own right. But knowing how rarely home runs are hit on 0-2 counts, even more so for some batters, adds to the feat.
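To give a sense of how a split like this could be computed, here is a minimal Python sketch that tallies home run rates by count from plate-appearance records. The records, field names, and numbers are hypothetical, invented for illustration rather than drawn from any real dataset.

```python
from collections import defaultdict

# Hypothetical plate-appearance records: (count when the at-bat was resolved, outcome)
plate_appearances = [
    ("0-2", "strikeout"), ("0-2", "home_run"), ("0-2", "groundout"),
    ("3-1", "home_run"), ("3-1", "walk"), ("1-1", "single"),
    ("0-2", "flyout"), ("3-1", "double"), ("0-2", "strikeout"),
]

# Tally outcomes per count, then compute the home run rate for each count.
totals = defaultdict(int)
homers = defaultdict(int)
for count, outcome in plate_appearances:
    totals[count] += 1
    if outcome == "home_run":
        homers[count] += 1

for count in sorted(totals):
    rate = homers[count] / totals[count]
    print(f"{count}: {homers[count]}/{totals[count]} HR ({rate:.3f})")
```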

Combining quantitative and qualitative data collection on the Internet

I’ve recently seen several mentions of a new project out of Princeton called All Our Ideas. Here is how the creators describe the project:

All Our Ideas is a research project to develop a new form of social data collection that combines the best features of quantitative and qualitative methods. Using the power of the web, we are creating a data collection tool that has the scale, speed, and quantification of a survey while still allowing for new information to “bubble up” from respondents as happens in interviews, participant observation, and focus groups.

Of course, one of the problems with surveys is that they force respondents to fit their responses to the questions that are asked. If you ask bad questions, you get bad results; if you don’t provide the options respondents want, you don’t really get the kind of data you need. Qualitative data, on the other hand, tends to be limited to a smaller sample because it takes more time to interview people or conduct focus groups.
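I don’t know the details of how All Our Ideas is built, but one plausible way to get both survey-style quantification and bottom-up input is a pairwise-comparison design: respondents pick between two ideas at a time and can also add new ideas to the pool. Here is a minimal, purely illustrative Python sketch of scoring such votes; the idea names and vote data are hypothetical, not the project’s actual code.

```python
from collections import defaultdict

# Hypothetical pairwise votes: (winning idea, losing idea)
votes = [
    ("more bike lanes", "longer library hours"),
    ("longer library hours", "free wifi downtown"),
    ("more bike lanes", "free wifi downtown"),
    ("community gardens", "more bike lanes"),  # a respondent-submitted idea
]

wins = defaultdict(int)
appearances = defaultdict(int)
for winner, loser in votes:
    wins[winner] += 1
    appearances[winner] += 1
    appearances[loser] += 1

# Score each idea by the share of its matchups that it won.
for idea in sorted(appearances, key=lambda i: wins[i] / appearances[i], reverse=True):
    print(f"{idea}: {wins[idea]}/{appearances[idea]} matchups won")
```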

I will be very curious to see what emerges out of this website.