Fewer people out and about in cities in 2020, so a higher risk of being a victim of crime?

A working paper tries to put crime in the recent context of fewer people moving around cities in 2020:


Each of these metrics basically reports the same thing: A huge and prolonged decrease in the total number of hours people spent out and about in the American city. This decline peaked in April 2020, but urbanites stayed sedentary throughout the year, relative to 2019.

For crime data, the duo used statistics from New York, Los Angeles, and Chicago, which enabled them to sort for violent crime that occurred in public—a category that included streets, parks, alleyways, commercial establishments, and offices.

The results: From March to December, 2020, public violence in the three cities was 19 percent lower than it had been in 2019. But when put into the context of how little Americans left the house that year, that data takes on a different significance. In April, for example, violent street crime fell by 30 percent—but the risk of being a victim of such a crime rose by almost 40 percent. A similar pattern held for the whole year: Even as street crime fell, the risk of being a victim of a crime rose between 15 and 30 percent over the previous year, depending on which measure of “outdoor activity” was used. In short, if you spent time in public, you were more likely to be robbed or assaulted in public in 2020 than in 2019.

For what it’s worth, that risk remained very, very small: 12 violent crimes per million outdoor hours, or more than 80,000 safely-spent outdoor hours for each violent crime.
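
To make the arithmetic behind “fewer crimes but higher risk” concrete, here is a minimal Python sketch of the per-outdoor-hour rate calculation. Only the 19 percent drop in public violence and the 12-crimes-per-million-hours figure come from the write-up above; the baseline counts and the 35 percent drop in outdoor hours are illustrative assumptions, not the paper’s data.

```python
# A minimal sketch of the risk-per-outdoor-hour logic, not the paper's data.
# Assumptions: the 2019 baseline and the 35% drop in outdoor hours are made up;
# the 19% drop in public violence and the 12-per-million figure come from the
# article quoted above.

def risk_per_million_hours(crimes, outdoor_hours):
    """Violent crimes per one million person-hours spent in public."""
    return crimes / outdoor_hours * 1_000_000

# Hypothetical 2019 baseline for a large city.
crimes_2019 = 10_000
hours_2019 = 1_000_000_000  # person-hours spent in public

# 2020: public violence falls 19%, but outdoor activity falls even more
# steeply (assume 35% here for illustration).
crimes_2020 = crimes_2019 * (1 - 0.19)
hours_2020 = hours_2019 * (1 - 0.35)

rate_2019 = risk_per_million_hours(crimes_2019, hours_2019)
rate_2020 = risk_per_million_hours(crimes_2020, hours_2020)

print(f"2019 rate: {rate_2019:.1f} per million outdoor hours")
print(f"2020 rate: {rate_2020:.1f} per million outdoor hours")
print(f"Change in per-hour risk: {rate_2020 / rate_2019 - 1:+.0%}")  # about +25%

# At the quoted 12 crimes per million outdoor hours, the expected gap
# between incidents is still enormous:
print(f"Outdoor hours per violent crime: {1_000_000 / 12:,.0f}")  # ~83,000
```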

This is an interesting way to think about perceptions of crime: even if fewer crimes were committed, they might feel like more if there was less activity overall. It reminds me of the images of empty streets and cities that circulated in the early days of the COVID-19 pandemic. Once-busy places simply did not have people. How did this affect perceptions of safety in public settings?

Would the same idea apply to media reporting on crime: because of a lack of other public activity (beyond COVID-19), did crime receive more attention even if there were fewer crimes? Perceptions of crime might be more important than the actual statistics themselves. Americans can be fearful even as numbers go down.

Connecting big drops in homicide rates and race and ethnicity

A new sociological study finds that homicide rates across racial and ethnic groups have fallen and that the gaps between those groups have narrowed:

The study revealed that three of the most significant social trends over the past 20 years — mass incarceration, rapid immigration and growing wealth inequality — all contributed to the reduction in the gaps between the white homicide victimization rate and those for blacks and Hispanics.

As a result, the black-white homicide victimization rate gap decreased by 40 percent, the Hispanic-white gap dropped by 55 percent and the black-Hispanic gap shrunk by 35 percent, according to the study to be published Thursday in the April issue of the American Sociological Review…

In fact, the study found that an influx of immigrants actually decreases homicides. “People who decide to come here are not people with strong tendencies toward violent crime,” Light said. “They are coming here for educational opportunities, employment opportunities and opportunities to help their families.”…

The study also showed that the increasing racial/ethnic disparities in incarceration rates were associated with significant reductions in black-white and black-Hispanic homicide victimization rate gaps. However, the authors were quick to caution against drawing the conclusion that even more incarceration would produce even more benefits because the findings have to be viewed in a larger context.

There are several matters of public perception that this study seems to address. Many Americans are not aware of these declines and instead think crime has risen (see earlier posts here and here). Or, how about the data on immigration and crime, where higher rates of immigration lead to lower homicide rates? Or the roughly 35-40 percent decrease in the homicide rates for whites, blacks, and Latinos?

Thinking more broadly, what would it actually take for the American public to change their perceptions about crime? Could this sociology study help convince average Americans that violent crime rates have significantly dropped in recent decades? Would the media have to stop highlighting violent crime? Would the entertainment industry (movies, TV, video games, books, etc.) have to become less violent? Thinking about this particular study, perhaps positive changes to race relations would help…

Compared to unprotected sex, Americans underestimate risks of driving

A recent study looks at how Americans compare the risks of driving and unprotected sex:

Imagine that a thousand people—randomly selected from the U.S. population—had unprotected sex yesterday. How many of them will eventually die from contracting HIV from that single sexual encounter?

Now, imagine a different thousand people. These people will drive from Detroit to Chicago tomorrow—about 300 miles. How many will die on the trip as a result of a car crash?…

If you’re anything like the participants in a new study led by Terri D. Conley of the University of Michigan, the HIV estimate should be bigger—a lot bigger. In fact, the average guess for the HIV case was a little over 71 people per thousand, while the average guess for the car-crash scenario was about 4 people per thousand.

In other words, participants thought that you are roughly 17 times more likely to die from HIV contracted from a single unprotected sexual encounter than you are to die from a car crash on a 300-mile trip.

But here’s the deal: Those estimates aren’t just wrong, they’re completely backward.

According to statistics from the U.S. Centers for Disease Control and Prevention and the United States National Highway Traffic Safety Administration, you are actually 20 times more likely to die from the car trip than from HIV contracted during an act of unprotected sex.
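
As a quick back-of-the-envelope check, here is a short sketch that works only with the figures quoted above – the participants’ average guesses and the article’s 20-to-1 reversal – and does not attempt to reproduce the underlying CDC or NHTSA probabilities.

```python
# Relative-risk arithmetic using only the figures quoted above; the actual
# per-event death probabilities are not reproduced here.

perceived_hiv_deaths_per_1000 = 71   # average guess for the HIV scenario
perceived_crash_deaths_per_1000 = 4  # average guess for the car-crash scenario

perceived_ratio = perceived_hiv_deaths_per_1000 / perceived_crash_deaths_per_1000
print(f"Perceived: HIV seen as {perceived_ratio:.1f}x deadlier than the drive")

# The article's CDC/NHTSA comparison: the 300-mile drive is actually about
# 20 times more likely to be fatal than the single unprotected encounter.
actual_ratio_drive_over_hiv = 20

# Multiplying the two reversals together gives the rough size of the miss.
total_misperception = perceived_ratio * actual_ratio_drive_over_hiv
print(f"Combined misperception: roughly {total_misperception:.0f}x")  # ~355x
```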

While the rest of this article goes on to talk about perceptions of sex in the United States, these findings are consistent with others suggesting Americans don’t see driving as a threat to their safety. Driving is one of the riskier behaviors Americans regularly engage in: more than 30,000 Americans are killed each year in vehicle accidents. (It should be noted that this figure has dropped from the low 50,000s of the late 1960s and early 1970s; driving today is safer than in the past.) Yet Americans tend to like driving (or at least what it enables) and find it necessary in their daily lives (due to social and political choices we have made), so those deaths and accidents are treated as acceptable losses.

Of course, it may not be long before driverless cars eliminate vehicle deaths entirely and we no longer have to confront our difficulties in weighing these risks.

Perceptions of extreme weather affected by social context

A new study in Environmental Sociology finds that people view extreme weather differently depending on their context:

“Odds were higher among younger, female, more educated, and Democratic respondents to perceive effects from extreme weather than older, male, less educated, and Republican respondents,” said the study’s author, Matthew Cutler of the University of New Hampshire.

There were other correlations, too. For example, people with lower incomes had higher perceptions of extreme weather than people who earned more. Those who live in more vulnerable areas, as might be expected, interpret the effects of weather differently when the costs to their homes and communities are highest.

Causes of extreme weather and the frequency of extreme weather events is an under-explored area from a sociological perspective. Better understanding is important to building more resilient and adaptive communities. After all, why prepare or take safety precautions if you believe the weather isn’t going to be all that bad or occur all that often?…

“The patterns found in this research provide evidence that individuals experience extreme weather in the context of their social circumstances and thus perceive the impacts of extreme weather through the lens of cultural and social influences. In other words, it is not simply a matter of seeing to believe, but rather an emergent process of both seeing and believing — individuals experiencing extreme weather and interpreting the impacts against the backdrop of social and economic circumstances central to and surrounding their lives,” Cutler concludes.

Context matters! (Many sociology studies could be summed up this way.) Weather may have some objective features – it can be measured, quantified, examined, and predicted (to a small degree). Yet we all experience it slightly differently based on what shapes us. While it sounds like this study focuses more on demographic factors, I wonder whether there would also be big differences based on general attitudes about nature: is it something bigger than humans, with a life of its own, or something humans can control or escape because of our increasing knowledge? Plus, humans are often not the best at detecting patterns; we perceive things to be related when they are not, or vice versa.

Perhaps this helps explain why so many people can make small talk about the weather. It isn’t just that it affects us; rather, we all view it in slightly different ways. One person’s big storm that requires changing their behavior might be just an inconvenience to someone else.

Americans not so sure playing field is level, American dream attainable

Data from recent years suggests fewer Americans think they can get ahead:

Surveys continue to show that Americans, in large numbers, still believe in many of the tenets of the American dream. For example, majorities of Americans believe that hard work will lead to success. But, their belief in the American dream is wavering. Between 1986 and 2011, around 50 percent of those polled by Pew consistently said they felt that the American dream was “somewhat alive.” However, over that same time period, the share who said it was “very alive” decreased by about half, and the share that felt it was “not really alive” more than doubled…

The majority of Americans once thought the playing field was more or less level. No more. Back in 1998, a Gallup poll about equal opportunity found that 68 percent thought the economic system was basically fair, while only 29 percent thought it was basically unfair. In 2013, feelings about fairness had reversed: Only 44 percent thought the economic system was fair, while 50 percent had come to feel it was unfair. Another 2013 poll found that by an almost two-to-one margin (64 to 33 percent), Americans agreed that “the U.S. no longer offers an equal chance to get ahead.”

Perhaps as a result of all of this, there are signs that the very idea of the American dream is changing. The American dream has long been equated with moving up the class ladder and owning a home. But polling leading up to the 2012 election revealed something new—middle-class Americans expressed more concern about holding on to what they had than they were with getting more. Echoing these concerns, Pew reported in 2015 that when asked which they would prefer—financial security or moving up the income ladder—92 percent selected security. This is a seven percentage point increase since just 2011, when 85 percent selected security over economic mobility.

And while majorities of Americans continue to say that home ownership is a key part of the American dream in general, when a survey asked people which things were the most important to their personal American dream, only 26 percent selected “owning a nice home” as a top choice, while 37 percent chose “achieving financial security” and 36 percent chose “being debt free.” In a 2013 Allstate/National Journal Heartland Monitor poll that asked respondents to define what it means to be middle class, 54 percent of respondents chose “having the ability to keep up with expenses and hold a steady job while not falling behind or taking on too much debt,” and only 43 percent defined being middle class as earning more, buying a home, and saving…

Three thoughts:

  1. Presumably, the economic crisis of the late 2000s contributed to this, but so likely have other trends, such as declining trust in social institutions and the decades-in-the-making changes brought about by economic globalization.
  2. Some have suggested that these numbers mean Americans no longer want these traditional markers of the American dream – like owning a home. More precisely, the surveys suggest Americans are more pessimistic about their own chances of owning a home. But if the economy turned around (wages started going up, more good jobs became available, etc.), I suspect many Americans would return to earlier behaviors. Maybe this would change if the pessimism and economic trouble continue. Yet Americans have shown a willingness over the last century or so to consume at high levels when economic times are good.
  3. There has never truly been an “equal chance of getting ahead” in the United States. There have been times – such as after World War II – when prosperity was more broadly shared among the population and the gap between the rich and the poor shrank. Additionally, perceptions of this matter beyond the social realities: if people feel that social conditions are unequal, those conditions are unequal in their consequences.

How a growing suburb plans to remain “a small town at heart”

Many growing suburbs claim to still be small towns in spirit. Here is how the mayor of Warrenville makes this argument while explaining a new development:

At the September 21 City Council meeting, nearly unanimous preliminary approval was given to a new development that will occupy a 4.3-acre site adjacent to the Warrenville Library called Settlers Pointe. This moderately dense development will consist of 34 single-family homes, 14 two-story and 20 three-story units, selling in the $350,000 to $450,000 price range. I believe this project will be a wonderful addition to Warrenville on many levels, but there was a time when I would have viewed this development through a different lens, and because of its density, would have been adamantly opposed to it as “not in line with the character of Warrenville”…

In the case of Settlers Pointe, it will be good for Warrenville in many ways. It is an attractive development being done by an accomplished and quality developer (google David Weekley Homes) who knows the market. You have told us that a very high priority is economic development. Essential to that goal is “rooftops”. Businesses will not invest in areas without enough people to support them. These new homes will help spur the redevelopment of our downtown, something else you have given us as a priority…

Rural may no longer be geographically possible for our town, but we have resolved to remain a small town at heart. This is the “character” that you have consistently told us is most important to you to enhance and preserve. It is independent of housing style or lot size. The people who choose to come to Settlers Pointe in Warrenville will do so because they see who we are and want to be one of us: small town folks enjoying the best of all possible worlds.

This explanation seems a bit odd to me given the relatively small size of the development – it is a small site, though centrally located – yet the argument is made in much the same way as pitches I’ve seen in other suburbs in my research. Here are some key elements:

  1. Americans generally like the idea of small towns. As this earlier post put it, American politicians push small-town values in a suburban country. The vast majority of Americans live in urban areas – over 80% – yet they hold to older visions of community life. Appealing to small-town ideals is a safe move.
  2. Broader social forces have pushed a community past its old identity and the community can’t go back. Once there is a certain level of growth or enough time has passed, “progress” is happening with or without us. (Of course, there are plenty of communities that try to freeze things in time. See this example. But those who support new development often say this can’t be done – and they’re probably right when thinking about the long term.)
  3. New growth can be good, even as it contributes to change and a newer identity. Economic reasons are typically cited: business growth is good, an expanded tax base is good, new attention from potential new residents is good.
  4. The development under approval is not too different from what already exists. If there is a group fighting the project, they will argue otherwise.
  5. Even with change and growth, it is possible to hold on to the “character” or “spirit” of a small town. Local officials typically refer to the actions of residents and community groups, implying that people still know and care for each other. For example, Naperville leaders suggest their suburb with over 140,000 people still has this spirit.

Of course, these arguments are often challenged by residents who don’t see it the same way. NIMBY responses typically resist fundamental change to a community; the way it is now is why those residents moved into town. But some change is inevitable, so perhaps these arguments are really about the degree of perceived change. Will this “fundamentally” alter the community? Is this a slippery slope? This can be the case with development decisions, but significant change tends to come through a chain of decisions, and these patterns are easier to diagnose in hindsight. (See Naperville as an example.) Residents can also feel relatively powerless compared to the local politicians or businesses who have the power to make decisions, while local leaders tend to claim they are looking out for the good of the whole community.

Change is not easy in suburbs. And while the process may look different in its physical manifestations, the elements of the arguments made both for and against development follow some common patterns.

Majority of Americans wrong about the decline in global poverty

Nicholas Kristof discusses the role of the media in contributing to incorrect knowledge about global poverty:

One survey found that two-thirds of Americans believed that the proportion of the world population living in extreme poverty has almost doubled over the last 20 years. Another 29 percent believed that the proportion had remained roughly the same.

That’s 95 percent of Americans — who are utterly wrong. In fact, the proportion of the world’s population living in extreme poverty hasn’t doubled or remained the same. It has fallen by more than half, from 35 percent in 1993 to 14 percent in 2011 (the most recent year for which figures are available from the World Bank).

When 95 percent of Americans are completely unaware of a transformation of this magnitude, that reflects a flaw in how we journalists cover the world — and I count myself among the guilty…

The world’s best-kept secret is that we live at a historic inflection point when extreme poverty is retreating. United Nations members have just adopted 17 new Global Goals, of which the centerpiece is the elimination of extreme poverty by 2030. Their goals are historic. There will still be poor people, of course, but very few who are too poor to eat or to send children to school. Young journalists or aid workers starting out today will in their careers see very little of the leprosy, illiteracy, elephantiasis and river blindness that I have seen routinely.

Kristof and a growing number of others have noted that certain aspects of life are getting better for many people – like decreasing violence around the world or lower crime rates in the United States – yet the general public is not aware of this. The media are certainly complicit, but they are not the only social force at work here.

Turning to my own discipline of sociology, several sociologists, including Ulrich Beck, Barry Glassner, and Harvey Molotch, have written books on the topic of fear. Yet it doesn’t seem to get much attention from the discipline as a whole. Of course, sociologists regularly point out social problems (critics may say even invent social problems) and often try to offer arguments for why people and those in power should do something about them.

If there is positive psychology, how about positive sociology? Here is a rumbling or two.

Baseball games average 17 minutes 58 seconds of action

One analysis suggests baseball games average 17 minutes and 58 seconds of action, surpassing football games:

By WSJ calculations, a baseball fan will see 17 minutes and 58 seconds of action over the course of a three-hour game. This is roughly the equivalent of a TED Talk, a Broadway intermission or the missing section of the Watergate tapes. A similar WSJ study on NFL games in January 2010 found that the average action time for a football game was 11 minutes. So MLB does pack more punch in a battle of the two biggest stop-and-start sports. By seven minutes.

The WSJ reached this number by taking the stopwatch to three different games and timing everything that happened. We then categorized the parts of the game that could fairly be considered “action” and averaged the results. The almost 18-minute average included balls in play, runner advancement attempts on stolen bases, wild pitches, pitches (balls, strikes, fouls and balls hit into play), trotting batters (on home runs, walks and hit-by-pitches), pickoff throws and even one fake-pickoff throw. This may be generous. If we’d cut the action definition down to just the time when everyone on the field is running around looking for something to do (balls in play and runner advancement attempts), we’d be down to 5:47.
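
For those curious what that tallying looks like mechanically, here is a rough Python sketch of the approach the WSJ describes: time each segment of a game, keep only the categories counted as “action,” and average across games. The segment durations below are invented for illustration; only the category list follows the quoted description.

```python
# Rough sketch of WSJ-style action tallying. The timed segments are
# hypothetical; only the action categories mirror the quoted description.

ACTION_CATEGORIES = {
    "ball_in_play", "stolen_base_attempt", "wild_pitch", "pitch",
    "batter_trot", "pickoff_throw",
}

# (category, seconds) pairs for each timed game; the numbers are made up.
games = [
    [("pitch", 700), ("ball_in_play", 280), ("between_pitches", 8200),
     ("pickoff_throw", 30), ("commercials", 2400)],
    [("pitch", 650), ("ball_in_play", 320), ("between_pitches", 8500),
     ("stolen_base_attempt", 25), ("commercials", 2300)],
    [("pitch", 720), ("ball_in_play", 300), ("between_pitches", 8100),
     ("batter_trot", 40), ("commercials", 2500)],
]

def action_seconds(segments):
    """Total seconds spent in categories counted as action."""
    return sum(sec for cat, sec in segments if cat in ACTION_CATEGORIES)

average = sum(action_seconds(g) for g in games) / len(games)
minutes, seconds = divmod(round(average), 60)
print(f"Average action per game: {minutes}m {seconds}s")
```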

I’m sure some might quibble with the methodology. Yet, the findings suggest two things:

1. A significant amount of the excitement about sporting events may have to do with the time between action rather than the action itself. Sure, we care a lot about the plays, but the fun includes the anticipation between plays as well as the conversation and analysis that takes place then. In other words, sports involve a lot of patience.

2. The “feel” of the action may matter more for perceptions than the actual measurement of action. Football and other sports feature faster action with more players moving at a time, giving the impression of more total action. This particularly shows up on television. Perhaps it is more a question of whether fans prefer group action or more solitary action.

Selection bias with Derek Jeter’s fielding: bad stats, memorable moments

As Derek Jeter’s career winds down, one baseball pundit wrestles with how Jeter’s defense numbers are so bad even as we remember some of his great fielding moments:

Data-mindful observers couldn’t figure out why the decorated Yankee kept winning those Gold Gloves and garnering raves for his defense. Stats such as Ultimate Zone Rating and Defensive Runs Saved didn’t merely suggest that Jeter was overrated; they pegged him as downright terrible. Even the best glovemen lose range as they age, which means Jeter actually hurt himself by playing past his 40th birthday and seeing his career defensive totals dip as a result, but the figures are unnerving regardless. Based on Baseball-Reference’s Runs From Fielding, which is based on DRS, Jeter’s combination of subpar defense and exceptional longevity don’t merely make him a defensive liability; they make him the worst defensive player relative to others at his position in baseball history.

That ranking is incredibly hard to fathom because of a very human weakness: selection bias. People remember a few extraordinary events, then ignore or even repress the information that might contradict that initial impression. With Jeter in particular, it’s nearly impossible to make the visceral reactions agree with the data, because Jeter has pulled off some of the most incredible defensive plays we’ve ever seen.

How to put this all together?

So really, it’s OK to agree in part with both sides of the argument. Even if we acknowledge the flaws of advanced defensive stats that aren’t yet based on play-by-play data or dispute the claim that Jeter was the worst ever, we can comfortably say he was overrated defensively by many people for many years, and cost the Yankees their share of outs. But we can also say that every huge-leverage play like The Flip negated a handful of squibbers through the infield during random April games in Cleveland, even if they left him as a net-negative defender on the leaderboards. Jeter might not have deserved five Gold Gloves, but he does deserve credit for crafting memorable plays that can’t simply be chalked up to coincidence or luck.

In other words, memorable plays that lead to key victories can go a long way toward wiping out more objective data gathered over a longer period of time. Of course, this is true across a broad range of contexts beyond baseball; showing unusual urban crimes and police responses as “normal” could have a similar effect on television viewers. In the long run, though, perceptions may have a shorter shelf life: as the people who witnessed those events – like “The Flip” – stop remembering them or die, the data lives on.

Chicago P.D. promotes untruths about urban police work

Gregg Easterbrook points out that the TV show Chicago P.D. takes numerous liberties in depicting urban police and crime:

NBC promotes Chicago P.D. by implying it shows the gritty, realistic truth of urban police work, much as the network promoted Hill Street Blues a generation ago. But Chicago P.D. isn’t vaguely realistic. The 15-episode first season depicted a half-dozen machine-gun battles on Chicago streets. Gunfire is distressingly common in Chicago, but nothing like what the show presents. Mass murders, explosions and jailbreaks are presented as everyday events in the Windy City. A dozen cops have been gunned down in the series so far; that’s more than the total killed on duty by gunfire in the actual city during the current decade. Officers on Chicago P.D. obtain in minutes the sort of information that takes real law enforcement months to compile. A detective barks, “Get me a list of all gang-affiliated males in this neighborhood.” A moment later, she’s holding the info.

The antihero protagonist is said to have been in prison for corruption but released “by order of the police chief.” This really is not how the justice system works. Then a cop-killer also is released “by order of the police chief,” which sets up a plot arc in which the good guys seek vengeance. In the real Chicago — or any big city — a convicted cop-killer would never see sunlight again.

Okay, it’s television. But what’s disturbing about Chicago P.D. is audiences are manipulated to think torture is a regrettable necessity for protecting the public. Three times in the first season, the antihero tortures suspects — a severe beating and threats to cut off an ear and shove a hand down a running garbage disposal. Each time, torture immediately results in information that saves innocent lives. Each time, viewers know, from prior scenes, the antihero caught the right man. That manipulates the viewer into thinking, “He deserves whatever he gets.”…

NBC executives don’t want to live in a country where police have the green light to torture suspects. So why do they extol on primetime the notion that torture by the police saves lives? Don’t say to make the show realistic. Nothing about “Chicago P.D.” is realistic — except the scenery.

One excuse is that this is just TV. At the same time, shows like this perpetuate myths about urban crime and police. While crime in cities has dropped in recent decades, shows like this suggest worse things are happening: not just gun violence but open use of machine guns, not just a few crooked cops but consistently crooked cops in a crooked system, and not just occasional abuses but routine torture of prisoners. There may be a little truth in each of these things, but consistently showing them leads to incorrect perceptions, which then affect people’s actions (voting, whether they visit the city, who they blame for social problems, etc.).