Murdered cats and suburban troubles in the US and Britain

The Croydon Cat Killer leads to reflection on how Americans and Brits view troubles in their suburbs:

When I told a friend I was writing about the Croydon cat killer, as he (or a copycat) appears to be holidaying in Washington State, her lips collapsed into a little moue, and then she looked away. “What?” I pressed, and she paused before replying, earnestly, “But what if he comes for you?” It was a risk I’d considered, having just celebrated our kitten’s first birthday, but one I am willing to take, because this story — some believe the same man has killed more than 500 cats over the last four years — is compelling and terrifying. And it encourages obsession: It pricks at ancient anxieties.

In midcentury America, the suburbs were seen by some as a dangerous social experiment — this style of living brought sickness. Suburban men fell ill from the stress of commuting; suburban women, trapped at home, had it even worse. In a best-selling 1961 study the authors renamed these regions “Disturbia.”

The place of suburbs in our collective psyche has been on my mind recently, as last year, with great internal drama, I moved out of the city, got a cat for my daughter — pets, of course, traditionally being tools for children to practice grief upon — and settled all the way down. In Britain the idea of suburbia has none of the David Lynchian perversion or drama of the United States. But it’s still thought of as an in-between place, a punch line, where small neat gardens reflect the dimensions of their owners’ minds. Suffocating, but safe. Until a predator shatters the illusion…

A year ago, after our baby was born, my partner and I moved to the area where I grew up, to a quiet street at the end of the Northern Line where the capital opens out into golf courses and garden centers, and I immediately began boring him with much existential whining about the shame of having returned to the safety of a life I’d thought left behind. Then, a month after we moved, our house was broken into. The bed was stained with muddy footprints — the burglar had turned over our furniture and opened my face cream, seemingly confused by the lack of jewelry. That night, tidying up, my partner said quietly, “I wonder what he thought of us.” The city had broadcast its dangers, using sirens and loud lights, but we learned quickly the suburbs hide theirs; here, on school fences, cartoon drawings warn of the threat of accidents and strangers’ cars in cute, childish scribbles. Now we always keep a light on.

This is not an uncommon story: a person or family moves to the suburbs expecting an ideal life centered on home and family. Something occurs, often a crime or an unpleasant encounter with other suburbanites, that shatters the happy suburban illusion. The suburbanite then lives on edge. This is also the plot of innumerable movies, books, and other cultural products.

On one hand, this is very understandable. The suburbs, particularly in the United States, are often sold as an idyllic place: neighborhoods are supposed to be safe, kids are supposed to grow up without worry and get ahead, and families are supposed to enjoy plenty of good times together. These expectations do not always pan out, for a variety of reasons: an emphasis on privacy (which limits both exposure to and discussion of what might otherwise be seen as typical events), occasional crime, and personal choices.

On the other hand, most suburban places are relatively safe. A single encounter with crime could be very traumatic. Yet, on the whole, wealthier suburban communities do have less crime. Plus, crime on the whole is down compared to several decades ago. Perhaps we just know more about the crimes that do occur – a curse of too much information – and it is hard to keep the big picture in mind.

Perhaps the biggest issue here is the setup of the suburbs as a perfect place. This is a powerful cultural narrative. Yet, no communities are perfect. Simply making it to a nice home in a nice suburb is not a guarantee of a happy life. While there has been talk of developing resiliency in cities, do we also need resilient suburbanites who are able to weather some tough situations?

Creative writing at the suburban shopping mall

The Mall of America is offering a unique opportunity:

The Mall of America is taking applications for a summer writing residency, which makes now a good time to question whether our collective taste for absurd mash-ups has gone too far. Is this an attempt to out-quirk the Amtrak residency, where writers typed on trains? What’s next — a sculptors’ retreat in a Chevron station? A poetry workshop at Ikea?

Maybe, but think about it: The people-watching alone would make easy fodder. Look — there goes a man in a plaid shirt, walking past Foot Locker into Sephora. He emerges with a small bag. The story practically writes itself.

Submit your residency application by Friday, and you could be selected to commemorate the shopping destination’s 25th birthday by spending five days writing while “deeply immersed in the Mall atmosphere.” (Nights are spent in a hotel, not sleeping on a department store mattress as I’d hoped.)

I agree: the shopping mall could be part of all sorts of interesting stories. But I do wonder how many of them would be positive in the hands of today’s writers. Too often, stories of suburban life follow a similar script: a happy-looking family or couple spirals downward as the facade falls away from their American Dream. Such scripts could involve the postwar suburban tract home, current McMansions, or the shopping mall, perhaps the symbol of suburban affluence, consumerism, and emptiness. What is the shopping mall but a poor facsimile of authentic Main Streets?

Maybe this would be an even more challenging task: do such a residency and craft a range of stories representing the experiences of all those mall visitors and employees.

Haidt argues Anthro and Soc are the worst academic monocultures

Jonathan Haidt discusses the monoculture of academia and names two disciplines that may be the worst:

JOHN LEO: To many of us, it looks like a monoculture.

JONATHAN HAIDT: Yes. It is certainly a monoculture. The academic world in the humanities is a monoculture. The academic world in the social sciences is a monoculture – except in economics, which is the only social science that has some real diversity. Anthropology and sociology are the worst — those fields seem to be really hostile and rejecting toward people who aren’t devoted to social justice.

JOHN LEO: And why would they be hostile?

JONATHAN HAIDT: You have to look at the degree to which a field has a culture of activism. Anthropology is a very activist field. They fight for the rights of oppressed people, as they see it. My field, social psychology, has some activism in it, but it’s not the dominant strain. Most of us, we really are thinking all day long about what control condition wasn’t run. My field really is oriented towards research. Now a lot of us are doing research on racism and prejudice. It’s the biggest single area of the field. But I’ve never felt that social psychology is first and foremost about changing the world, rather than understanding it. So my field is certainly still fixable. I think that if we can just get some more viewpoint diversity in it, it will solve the bias problem.

Interesting view from the outside as Haidt says later in the interview, “Anthro is completely lost. I mean, it’s really militant activists.” From the inside, a lot of sociology faculty and students seem to be at least partly motivated by wanting to address particular social issues or problems. Whether that clouds their research judgment more than social psychologists – who just want to understand the world, as any scientist would claim – would be interesting to explore.

If you haven’t read it, Haidt’s book The Righteous Mind is fascinating. He argues that opposing sides – say in politics or academic disciplines – have different narratives about how the world works and this causes them to simply talk past each other. In a 2012 piece, Haidt describes the moral narratives of the American political left and right:

A good way to follow the sacredness is to listen to the stories that each tribe tells about itself and the larger nation. The Notre Dame sociologist Christian Smith once summarized the moral narrative told by the American left like this: “Once upon a time, the vast majority” of people suffered in societies that were “unjust, unhealthy, repressive and oppressive.” These societies were “reprehensible because of their deep-rooted inequality, exploitation and irrational traditionalism — all of which made life very unfair, unpleasant and short. But the noble human aspiration for autonomy, equality and prosperity struggled mightily against the forces of misery and oppression and eventually succeeded in establishing modern, liberal, democratic, capitalist, welfare societies.” Despite our progress, “there is much work to be done to dismantle the powerful vestiges of inequality, exploitation and repression.” This struggle, as Smith put it, “is the one mission truly worth dedicating one’s life to achieving.”

This is a heroic liberation narrative. For the American left, African-Americans, women and other victimized groups are the sacred objects at the center of the story. As liberals circle around these groups, they bond together and gain a sense of righteous common purpose.

Contrast that narrative with one that Ronald Reagan developed in the 1970s and ’80s for conservatism. The clinical psychologist Drew Westen summarized the Reagan narrative like this: “Once upon a time, America was a shining beacon. Then liberals came along and erected an enormous federal bureaucracy that handcuffed the invisible hand of the free market. They subverted our traditional American values and opposed God and faith at every step of the way.” For example, “instead of requiring that people work for a living, they siphoned money from hard-working Americans and gave it to Cadillac-driving drug addicts and welfare queens.” Instead of the “traditional American values of family, fidelity and personal responsibility, they preached promiscuity, premarital sex and the gay lifestyle” and instead of “projecting strength to those who would do evil around the world, they cut military budgets, disrespected our soldiers in uniform and burned our flag.” In response, “Americans decided to take their country back from those who sought to undermine it.”

This, too, is a heroic narrative, but it’s a heroism of defense. In this narrative it’s God and country that are sacred — hence the importance in conservative iconography of the Bible, the flag, the military and the founding fathers. But the subtext in this narrative is about moral order. For social conservatives, religion and the traditional family are so important in part because they foster self-control, create moral order and fend off chaos. (Think of Rick Santorum’s comment that birth control is bad because it’s “a license to do things in the sexual realm that is counter to how things are supposed to be.”) Liberals are the devil in this narrative because they want to destroy or subvert all sources of moral order.

Holding so tightly to different understandings of the world means that compromising is very difficult.

Will Beatles songs eventually become as well known as nursery rhymes?

Scientist and musician Daniel Levitin wrote about the ubiquity of Beatles songs a while back:

One hundred years from now Beatles songs may be so well known that every child will learn them as nursery rhymes, and most people will have forgotten who wrote them. They will have become sufficiently entrenched in popular culture that it will seem as if they’ve always existed, like Oh Susannah, This Land Is Your Land, and Frère Jacques.

Why can we listen to certain songs across a lifetime and still find pleasure in them? Great songs activate deep-rooted neural networks in our brains that encode the rules and syntax of our culture’s music. Through a lifetime of listening, we have learned what is essentially a complex calculation of statistical probabilities of what chord is likely to follow what, and how melodies are formed. Skilful composers play with these expectations, meeting and violating them in interesting ways. In my laboratory we’ve found that listening to a familiar song that you like activates the same parts of the brain as sex or opiates do. But there is no one song that does this for everyone; musical taste is both variable and subjective…

On the bus to my office, the radio played And I Love Her and a Portuguese immigrant my grandmother’s age sang along. How many people can hum even two bars of Beethoven’s Fourth, or Mozart’s 30th? I recently played one minute of these to an audience of 700 people – professional musicians included – but not one recognised these pieces. Then I played a half-second of two Beatles songs – a fraction of the first “aah” of Eleanor Rigby and the guitar chord that opens A Hard Day’s Night – and virtually everyone shouted out the song names, more than could recognise the Mona Lisa.

To a neuroscientist, the Beatles’ longevity can be explained by the fact that their music creates subtle and rewarding schematic violations of popular musical forms, causing a symphony of neural firings from the cerebellum to the prefrontal cortex. To a musician, each listening showcases subtle nuances not heard before, details of arrangement and intricacy that slowly reveal themselves across hundreds or thousands of listenings. I have to admit, they’re getting better all the time.
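Levitin’s “statistical probabilities of what chord is likely to follow what” can be sketched as a first-order Markov model: count which chords follow which in a corpus, then score a new progression by how surprising its transitions are. This is only a toy illustration, not Levitin’s actual method; the chord corpus, the 12-chord vocabulary size, and the smoothing scheme are all invented for the example.

```python
import math
from collections import Counter, defaultdict

def train_chord_model(sequences):
    """Count chord-to-chord transitions in a corpus (first-order Markov model)."""
    counts = defaultdict(Counter)
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            counts[a][b] += 1
    return counts

def surprisal(model, seq):
    """Total surprisal (in bits) of a chord sequence under the model.
    Higher values mean more expectation-violating transitions.
    Add-one smoothing over an assumed 12-chord vocabulary handles unseen moves."""
    total = 0.0
    for a, b in zip(seq, seq[1:]):
        n = sum(model[a].values())
        p = (model[a][b] + 1) / (n + 12)
        total += -math.log2(p)
    return total

# Hypothetical corpus of I-IV-V-style progressions
corpus = [["C", "F", "G", "C"], ["C", "G", "C"],
          ["C", "F", "C"], ["C", "F", "G", "C"]]
model = train_chord_model(corpus)

familiar = surprisal(model, ["C", "F", "G", "C"])
surprising = surprisal(model, ["C", "Eb", "G", "C"])
# The off-key Eb transition is rarer in the corpus, so it scores higher
assert surprising > familiar
```

A skilled composer, on this account, keeps most transitions in the high-probability range and spends the surprisal budget at chosen moments.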

While the neuroscience piece is interesting in its own right, a sociologist might be more interested in thinking about what songs make it to this level of common knowledge or become part of cultural narratives across societies. How many times does a song have to be played? Does it matter in what venues the song is performed? I could imagine a mega radio hit vs. a lesser-known song that gets licensed dozens of times in the coming decades in commercials. Does the relative importance of the musical artist matter? Some of this has to do with diffusion and the various gatekeepers at play. The Beatles likely have all these factors (except the commercials) in their favor, as they were a cultural phenomenon, produced numerous #1 singles around the world, changed the music industry (from recording techniques to the decision to stop touring), and were generally liked by critics.

Just thinking back, I feel like I only tend to hear this argument about what songs will last into the future when people are worried about their idea of bad music (maybe boy bands of the late 1990s, Britney Spears, Lady Gaga, etc.) becoming the equivalent of nursery rhymes.

Common narrative: bucolic suburbs surprised by deviance

A recent revelation in the Baltimore suburbs illustrates a common story across media platforms: idyllic suburban communities are shocked by hidden deviance and crime that is suddenly exposed.

The hills in Clarks Glen are gently rolling, the homes McMansions. And the lawns are mowed to the near-perfection a country club groundskeeper might envy.

It’s the very model of affluent suburbia, hardly a place where anyone thinks the man next door would be stopped by customs agents on his way to China with the makings of missile detectors in his bags.

But appearances can be deceiving.

Zhenchun “Ted” Huang, a longtime resident of the Clarksville subdivision in Howard County, pleaded guilty this month to federal charges that he tried to fraudulently obtain electronic devices that can be used in fabricating missile detectors and other high-grade military equipment…

In Clarks Glen, the development where he lived for at least eight years, former neighbors were astonished to hear the news. They saw Huang, an electrical engineer, as anything but the cloak-and-dagger type.

Instead, they said, he was a taciturn man who mowed his lawn once a week, whether it was needed or not, and rarely socialized.

On one hand, people in the suburbs are genuinely shocked by such stories. They often move to nice suburbs to escape issues like crime, to say nothing of international espionage. Nobody wants to think that a sex offender is lurking down the street where they let their kids play. These sorts of problems are more often associated with cities or less affluent locales.

On the other hand, reactions like this sound like a TV show. Oh wait, is this an episode of The Americans or a Hollywood movie or a John Keats novel about the hidden problems of suburbia? One shouldn’t be completely naive about what can be lurking in any community, let alone suburbs. I’m not advocating for paranoia or hypervigilance – this isn’t the best way to promote social ties or community life – but people everywhere are capable of dastardly deeds. The reactions of neighbors like those quoted above might say more about how well suburban neighbors know each other (often not very well) than the overall actions of suburbanites.

Perhaps the issue here is the overselling of suburban life over the decades. If suburbs were, and often still are, marketed as escapes from social problems (there is a long history of suburban developers, residents, and leaders suggesting as much), as places that are perfect for children and offer private space and the American Dream, then any actions that contrast with that image are viewed quite negatively.

Carefully designing museum exhibits of traumatic events

Museums help us know and interpret our past, so what is the best way to design exhibits that tackle traumatic events?

Working to affect the museumgoer’s subconscious is how Layman talks about exhibition design. First, he strives to understand – reading, consulting with historians, trying to learn the material as well as the curators do in order to find what resonates, what surprises. When it comes to putting materials in galleries, yes, he wants to manipulate you, but for the purposes of telling the story.

“We do a technique called ‘swing focus’ as the visitors go through,” Layman said. “Their eye catches one thing after the next, and it works all the way through, and the story, then, it just unfolds almost intuitively. It comes off the walls, and the people get lost in this story, and it becomes a very moving experience.”

Earlier this winter, Layman was in the opening galleries at the Illinois Holocaust Museum, in Skokie, the ones that, in parallel, establish what Jewish life was like in Europe before World War II and how the Nazis rose to power in Germany.

The two hours Layman took to explain what his firm did in Skokie, a sort of ultimate guided tour, were absolutely fascinating. The museum deftly takes viewers into some of humanity’s least human moments and then escorts them back out. It works so well, in part, because every inch of the design is pored over. “We pay attention to excruciating detail on absolutely everything,” he said.

It sounds like the purpose is trying to tell an immersive narrative. This narrative is carefully crafted and meant to give the attendee a particular viewpoint on the world. Museums can reinforce existing cultural narratives, particularly in their ability to involve all the senses.

I like museums and what they can offer: original artifacts and powerful experiences. Yet, as someone who values education, museums seem like they can only go so far: they provide an introduction to most topics. If the museum is the only place a person encounters an important topic like the Holocaust, then that is not enough. I would encourage my students to find out for themselves, to find original texts and numerous interpretations and start developing what they think on their own. Museums can do some of this, but there simply isn’t enough space (and this process requires a lot more text than the typical museumgoer would be willing to read) to tell the whole story.

A fascinating example of this is The Sixth Floor Museum at Dealey Plaza in Dallas, Texas. Before going, I wondered how it would handle conspiracy theories about JFK’s assassination. As it turns out, the museum has a whole section on the various theories at the end without making a strong statement against them. The better parts of the museum tell the story of JFK’s rise through artifacts, texts, and videos. The culmination of the journey is looking at the reconstructed spot at the sixth-floor window from which Lee Harvey Oswald fired at the president. I could see that taking this all in moved numerous visitors. Altogether, the museum is a well-done taste of JFK’s life, legacy, and the theories surrounding his death, but an individual could spend years going through all that is out there and trying to make sense of it. The museum isn’t the final word but rather an authoritative starting point.
American preferences for returning to a past decade shaped by when they grew up, their politics

The Economist reports on a recent poll that asked Americans to which decade of the 20th century they would most like to return:

In our latest weekly Economist/YouGov poll, we asked Americans which decade of the 20th century they would most like to go back to. Most popular was the 1950s. The decade of economic boom following the second world war is regarded as a time of consumerism, conservatism and cold-war caution. It was an age of stay-at-home wives, novel household appliances and new suburbs—yet was also most popular among women. The haze of Woodstock and Haight-Ashbury in the 1960s rolled up in second place. Republicans in particular preferred the morally uncomplicated 1950s under President Eisenhower and the 1980s of Reagan; Democrats tended to opt for Bill Clinton’s 1990s. In general, people yearned for their youth. Over 50% of those over 65 wanted to revisit the 1950s and 1960s, while 45- to 64-year-olds pined for the 1980s. The youngest were torn between the jazz age of the 1920s and the 1990s, their own salad days.

On one hand, this might be somewhat meaningless: stereotypes of entire decades are much too simplistic, and even The Economist falls into that trap in its descriptions. On the other hand, perhaps knowing what decade people would prefer to return to gives us some indication of what people are trying to accomplish now. If your preferred era is the 1950s, you might pursue different social norms and policies compared to someone who most fondly recalls the 1960s. Indeed, conservatives and liberals might both want to push such a narrative: Republicans to return to the prosperous and calm 1950s (maybe also their vision of the 1980s) while Democrats would prefer the more liberating and exciting 1960s (and perhaps also the 1990s).

Crafting the perfect Gothic McMansion in a 21st century novel

A review of the new novel Fallen Land suggests the McMansion at the heart of the book plays a big role:

The McMansion, that derisively nicknamed trophy home of suburban arrivistes, is different things to all people: the darling of building contractors, the forest-guzzling residential equivalent of the SUV to land preservationists.

Among American practitioners of the modern Gothic novel, the McMansion has rarely been rendered with the resplendent gloom of, say, Shirley Jackson’s Hill House, or the magisterial melancholy of Edgar Allan Poe’s House of Usher. In his smashing followup to his formidable debut novel “Absolution,” however, Patrick Flanery has fashioned a crumbling 21st-century manor that can hold its own among those authors’ most sepulchral, allegorical inspirations.

The trappings of “Fallen Land’’ are pure old-school Hollywood. Imagine a housing development that evokes the splashy-cum-sinister Victorian fantasy of “Meet Me in St. Louis” and Hitchcock’s “Shadow of a Doubt” and you have Dolores Woods, a Midwestern subdivision committed to a regressive aesthetic “in which the past was preferable and this country was at its greatest before it tried to tear itself apart in the middle of the nineteenth century.” The community’s pastiche array of gabled roofs and picket fences disguise the jerry-built nature of its construction: pop-up palaces whose yawning spaces and teetering infrastructure “terrify where they were meant to comfort,” the American Dream turned nightmare.

The development’s showpiece, classically enough, has been erected atop the site of tragic events from a darker epoch whose emotional undercurrents will haunt the home’s new tenants, Julia and Nathaniel Noailles. The Noailles have relocated from Boston with their smart, idiosyncratic son Copley (named for the hotel address where he was conceived) in pursuit of snazzier positions: she with a university lab, he with a mega-corporation that powers virtually every private enterprise on earth, including the fascistic private school in which Copley is newly installed.

I’ve noted before that the McMansion has become a popular tragic setting for modern stories. See this post about McMansions and horror films. The McMansion represents a hollow setting, a place that may look impressive but is empty at its core. The people who inhabit such homes are similar: people who thought purchasing a big home would bring satisfaction but are sadly mistaken. Even worse, the inhabitants – and it sounds like those in Fallen Land fit the bill – might be bad people, the kinds who squander money, are mean or amoral, and are up to nefarious purposes. Altogether, these stories suggest at the least that tragedies befall those in McMansions, with the stronger versions arguing that McMansion dwellers and their homes alike are rotten to the core.

Perhaps my argument would be strengthened by searching for counterexamples: can we find many positive depictions of McMansion dwellers in novels, movies, TV shows, etc.?

Narratives built around the sociological “small world theory” of social networks

A review of a new novel highlights recent ideas in the sociological analysis of social networks:

In sociology, the “small world theory” holds that any two people can be connected to one another along a chain of no more than a few acquaintances (typically six, the fabled “six degrees of separation”). Though the research behind it is at best contentious, there’s something deeply appealing about its logic-defying simplicity, something exciting about what it implies. In a world that can seem vast and alienating, the idea that we’re all much closer than it seems is, at first glance, comforting. The flip side is that our influence may extend further than we realize.

In his second novel, “The Illusion of Separateness,” Simon Van Booy presents a cast of characters who have had a profound effect on one another’s lives, yet cannot see the bonds that link them. He divides his book into six separate narratives, each following a different character through different eras, from the Second World War to the present: Martin, a retirement home caretaker; Mr. Hugo, a disfigured Wehrmacht veteran; Sébastien, a lovelorn young boy; John, whose B-24 bomber is shot out of the sky over France; Amelia, a blind museum curator and John’s granddaughter; and Danny, a budding filmmaker. Van Booy presents their stories in a nonlinear fashion, shifting back and forth from character to character, decade to decade.

Van Booy’s premise — that we are all linked in ways we may not fully understand, and that our smallest actions can have a significant effect on the lives of others — is fairly banal, and its execution verges on overly sentimental. He builds to the scenes in which his characters cross paths with great ceremony, yet these intersections are the book’s weakest moments. While the plot seems to aspire to present an overarching sense of meaning, Van Booy never quite drives it home. For some, the significance is inscrutable, as when John and Mr. Hugo engage in a tense, but ultimately inconsequential standoff in a field in war-torn France. Others, like when Martin cradles the dying Mr. Hugo in the book’s opening pages, seem like contrivances meant to give the narrative the appearance of structure and meaning.

For more popular descriptions of recent sociological research on this topic, see Six Degrees by Duncan Watts, Connected by Nicholas Christakis and James Fowler, and Linked by Barabasi (a physicist who covers a lot of ground).
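As a rough illustration, the “six degrees” claim can be sketched with a toy small-world network in the style of Watts and Strogatz: start from a ring lattice where everyone knows only their nearest neighbors, rewire a small fraction of ties at random, and the average number of steps between random pairs collapses. The parameters below are invented for the sketch, not drawn from any study.

```python
import random
from collections import deque

def small_world_graph(n=1000, k=4, p=0.1, seed=42):
    """Ring lattice of n nodes, each tied to its k nearest neighbors,
    with each forward edge rewired to a random node with probability p
    (a Watts-Strogatz-style construction)."""
    rng = random.Random(seed)
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(1, k // 2 + 1):
            target = (i + j) % n
            if rng.random() < p:  # replace the lattice tie with a random shortcut
                target = rng.randrange(n)
                while target == i or target in adj[i]:
                    target = rng.randrange(n)
            adj[i].add(target)
            adj[target].add(i)
    return adj

def avg_path_length(adj, samples=200, seed=1):
    """Average shortest-path length between random node pairs, via BFS."""
    rng = random.Random(seed)
    nodes = list(adj)
    total, count = 0, 0
    for _ in range(samples):
        src, dst = rng.sample(nodes, 2)
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            if u == dst:
                break
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        if dst in dist:  # skip the rare unreachable pair
            total += dist[dst]
            count += 1
    return total / count if count else float("inf")

adj = small_world_graph()
# A few random shortcuts keep the average distance small even with 1,000 nodes
# (the exact value varies with n, k, and p)
print(round(avg_path_length(adj), 1))
```

The sociological caveat in the quoted review still applies: short paths existing in a network is not the same as people being able to find or use them.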

This kind of narrative involving interweaving stories is not new. It seems to be popular in movies in the last 10-15 years – I’m thinking of films like Crash, Love Actually, and others that make use of intersecting characters. This leads to two thoughts:

1. Is this a good instance of social science discoveries, that social networks influence people without their knowledge, influencing popular culture? It could be relatively easy to track whether this is a new kind of plot or whether it has a longer history.

2. The reviewer suggests that while the author has impressive prose, the overall structure of the story is lacking. Since we do indeed live in social worlds strongly influenced by social networks, how can that be effectively translated into a compelling narrative? Going back to the movies I mentioned above, those intersecting storylines involved quite a bit of individual or small group interaction by the end of the movie. In contrast, this book seems to be going for a more disconnected set of stories. Setting up the structure of a social network narrative likely involves balancing the connections alongside the individual interactions that tend to characterize and propel narratives.

Should the Newtown shootings be considered a mass shooting, a school shooting, mass murder, workplace violence…

Sociologist Joel Best provides a history of how American scholars and media have classified mass shootings:

It may seem self-evident that the killings at Sandy Hook Elementary ought to be classified as a shooting event, or as a school shooting or a mass shooting. Of course we classify events into categories that make sense to us, and it is easy to take familiar categories for granted. We learn of terrible crimes and we are accustomed to commentators talking about incidents as instances. But the ways we make sense of the world—the terms we use to describe that world—are created by people, and they are continually evolving, so that specific categories come into and fall out of favor. In fact, in recent decades, Americans have understood events like the Newtown killings in a variety of ways…

By the early 1980s, the Federal Bureau of Investigation promoted the distinction between mass murder and serial murder. The Bureau had a new databank—the Violent Criminal Apprehension Program, or VICAP—that could help law enforcement identify similar crimes that had occurred in other jurisdictions. But in the aftermath of revelations about the FBI’s surveillance of the civil rights movement, an effort to expand the bureau’s domestic data collection invited suspicion and resistance. The FBI used the serial murderer menace—and particularly the idea that serial killers might be nomadic, able to kill in different jurisdictions without the authorities ever recognizing that crimes in different places might be linked—to justify the VICAP program. That set the stage for Clarice Starling and all the other heroic FBI agents who began pitting their wits against serial murderers in crime fiction and movies. “Son of Sam” would no longer be classified with Charles Whitman…

Journalists notice patterns, so similarities between cases invite the creation of new categories. For example, in 1986, a postal worker killed 14 postal employees; then, in 1991, there were two more incidents involving former postal workers killing employees at post offices. This led to the expression going postal. Eventually, after further incidents in 1993, the Postal Service responded with a program to improve their workplace and prevent violence. Some criminologists began writing about workplace violence, although this category was defined as including any violence in a workplace, not just mass murders. Under that definition, a large share of workplace violence involved robberies. (According to one analysis, the three most common sites for workplace violence were taxicabs, liquor stores, and gas stations—all isolated settings likely to have cash on hand.)

At the end of the 1990s, attention shifted to schools. During the 1997–98 academic year, there were heavily publicized incidents in West Paducah, Kentucky; Jonesboro, Arkansas; and Springfield, Oregon. The Jonesboro story made the cover of Time, which featured a photo of one of the shooters as a young child wearing camouflage and holding a rifle, with the caption “Armed & Dangerous.” Thus, the expression school shooting was already familiar a year before the April 1999 killings at Columbine. Advocates and academics began compiling databases of school violence, although the results were surprising: The average number of deaths per year fell, from 48 during the period from the fall of 1992 through the spring of 1997, to 32 during the period spanning September 1997 through the end of the school year in 2001, even though Columbine and the other best-publicized cases occurred during the latter period. In spite of commentators declaring that the nation was experiencing a wave or epidemic of school shooting, the evidence suggested that violent deaths in schools were declining.

Best has written a lot about how social categories, experiences, and data then get used in political and civic discussions. The classification as a school shooting or the result of mental illness then shapes the rest of the discussion including what should be done in the future. Social problems can’t be social problems until they can be shown to be worth the public’s attention.

What Best doesn’t do is try to forecast how the Newtown shooting will come to be known in the future. What is the dominant narrative that will develop? And how will it be used?