When communities resist and protest COVID-19 testing and treatment sites

NIMBY attitudes can be present even – or maybe especially – during pandemics:

Last week, residents in Darien, Connecticut, a tony exurb of New York City, successfully lobbied to shut down plans for a coronavirus testing site, despite surging demand. The reason? Complaints from neighbors. As it turns out, the “Not In My Backyard” impulse to block new development — which has been implicated in the severe affordability crisis affecting cities from coast to coast — translates far too neatly into blocking certain measures needed to stop the spread of the virus.

In a similar case in Ewing, New Jersey, a local landlord issued a cease-and-desist letter to the operator of a coronavirus testing center amid complaints about congestion in the parking lot. As The Trentonian reported, one resident who wanted to be tested in order to protect his three-year-old child wasn’t subtle about how he felt about the decision: “It blows my f**king mind.”

Community resistance from neighbors of testing sites is a rerun of the fierce NIMBY reaction to potential coronavirus quarantine sites. Back in February, California began looking for a place to shelter Americans returning from abroad with the virus and settled on an isolated medical campus in Costa Mesa. But after local residents complained, city officials sought and received a court injunction to stop the project.

As the need for quarantine sites expanded, so did the NIMBY backlash. Finding sites that won’t suffer the same fate has proven to be a major hurdle as the federal government attempts to manage the crisis. Back when the focus was still on returning cruise ship passengers, officials in Alabama went to the mat to keep passengers of the Diamond Princess cruise ship out of a local FEMA facility, eventually forcing the federal government to scrap the plan altogether. Similar fights have played out from Seattle to San Antonio, potentially undercutting the response to the coronavirus at key early stages. As a result, the federal government largely shifted quarantining efforts to military bases, where complaining neighbors hold less sway…

At first glance, it might seem like efforts to block potentially life-saving public health screenings and complaints about community character have little in common. But in both cases, the formula is the same: Whether out of an understandable fear of the unknown or a selfish desire to shift the burden elsewhere, local impulses are given veto power over broader social needs. Under normal conditions, the inability to constructively manage this means higher rents. In a public health emergency, it could be lethal.

In addition to what is in the last paragraph quoted above, I am struck by the resistance to facilities and sites that would only be temporary. It is one thing to object to a long-term health facility (see recent posts about a drug treatment facility in the western suburbs of the Chicago area here and here) but another to resist something that is needed now and presumably will not be permanent. Of course, this could be part of the fear: if a site treats COVID-19 now, could it later be turned into a more permanent fixture in the community?

The logical extension of the NIMBY claims would be to push COVID-19 treatment sites or testing facilities to communities that could not resist them. When this plays out in areas like housing or unwanted land uses, communities with less wealth and political power tend to become home to land uses that wealthier communities refuse. If such a pattern occurs here (and there is evidence that health differs dramatically by location in the United States), it could be evidence that pandemics deepen locational and health inequalities.

Infectious diseases in urban and suburban life

Americans already have a predilection for suburban life; might a global pandemic push even more people out of cities and to the edges of metropolitan regions? One take regarding safety in suburban life:

As maps like this show, major metropolitan areas are bearing the brunt of the Covid-19 infections spreading across North America. And that makes sense: Though there’s no way to know for sure how the virus arrived, it almost certainly came by way of an international flight to a major airport (or several of them). But while infectious disease spreads faster where people are more densely clustered — hence the strategy of social distancing to contain the coronavirus — that doesn’t necessarily make suburban or rural areas safer, health experts say…

That is not to say that cities aren’t Petri dishes — they are. Relative to rural areas, urban centers do provide stronger chains of viral transmission, with higher rates of contact and larger numbers of infection-prone people. And historically, urbanites paid a price for this vulnerability…

Modern transportation networks have made the population shield that rural areas once provided much more porous. Now that humans and freight can travel from, say, Hong Kong to Los Angeles in less than 13 hours — and arrive by vehicle to somewhere sparsely populated hours after that — outbreaks can happen just about anywhere. New pathogens tend to arrive sooner in global hubs, but that doesn’t mean they can’t quickly reach rural locales and proliferate from there, says Benjamin Dalziel, a professor of mathematics at Oregon State University who studies population dynamics…

But while the CDC recommends decreasing social contact to limit the spread of the virus, that’s just as doable in a downtown apartment as a countryside manor. Says Viboud: “If you’re staying at home and limiting outside contact, you’d achieve the same purpose.”

Three thoughts come to mind:

  1. This highlights the connectedness of cities and suburbs today, even if significant physical distance separates communities. People travel around the world, to other regions, and throughout regions at rates unmatched in human history, and such travel is relatively easy. Cities and suburbs are not separate places; they are parts of interdependent regions that are highly connected to other places.
  2. Safety and health were part of the creation of the suburbs in the United States, but it is hard to know how much this might matter in the future. Given all the reasons people now settle in the suburbs, would avoiding communicable diseases be a top factor? I would think not, particularly compared to factors like housing prices, amenities (schools, quality of life, etc.), or demographics.
  3. If particular places are not that much safer, does the sprawl of American life then limit the response to any illness? Imagine the Chicago region with dozens of hospitals spread throughout the region, all needing to be equipped, as opposed to the same number of people packed into a smaller area where it is easier to get supplies and personnel to medical facilities. Or consider the need to supply grocery stores across a huge region.

Bringing medical clinics to vacant shopping mall space

Filling emptying shopping malls can be a hard task. Add medical services to the list of possible replacement uses:

Mall of America in Minneapolis, America’s largest mall, announced plans last week to open a 2,300-square-foot walk-in clinic in November with medical exam rooms, a radiology room, lab space and a pharmacy dispensary service. Mall of America is teaming up with University of Minnesota physicians and a Minnesota-based health care system to operate the clinic…

While mall leases for clothing retailers declined by more than 10% since 2017, medical clinics at malls have risen by almost 60% during the same period, according to Drew Myers, real estate analyst at CoStar Group. The growth of medical clinic leases at malls has been the “strongest among all major retail sectors over the past five years,” he said.

Mall landlords are betting that when patients visit for a flu shot or eye exam, they’ll shop around for clothes or electronics. Adding medical clinics also makes sense for mall owners because they draw in doctors, nurses and technicians every day who may shop and eat at restaurants, according to a May research report by real estate firm JLL. Health care providers are also attractive tenants for mall landlords because they tend to have high credit ratings and sign longer leases compared with other retailers, JLL analysts noted.

On the provider and health insurer side, shopping malls give companies convenient locations to set up outpatient care posts and preventative care locations for patients. Providers are increasingly looking to these lower-cost clinics to help patients avoid expensive trips to the emergency room.

In addition to blending shopping and medical trips (dubbed “medtail” in the article), the medical offices can serve the new residents and commercial uses now occupying shopping mall space. Just wait until a hospital takes over a mall and patients and visitors can walk out one door and into a clothing store down the hall.

More broadly, this hints at a blending of activity within single structures that suburbs are not used to. Suburbs are known for separating land uses, often with the goal of protecting single-family homes. Suburban downtowns, places where multiple uses might be found, are limited and now often seem geared more toward entertainment and cultural use. Could the shopping mall truly be a community center in the coming decades with more residential units, medical offices, and community spaces?

Americans consume more media, sit more

A recent study shows Americans are sitting more and connects this to increased media usage:

That’s what Yin Cao and an international group of colleagues wanted to find out in their latest study published in JAMA. While studies on sitting behavior in specific groups of people — such as children or working adults with desk jobs — have recorded how sedentary people are, there is little data on how drastically sitting habits have changed over time. “We don’t know how these patterns have or have not changed in the past 15 years,” says Cao, an assistant professor in public health sciences at the Washington University School of Medicine.

The researchers used data collected from 2001 to 2016 by the National Health and Nutrition Examination Survey (NHANES), which asked a representative sample of Americans ages five and older how many hours they spent watching TV or videos daily in the past month, and how many hours they spent using a computer outside of work or school. The team analyzed responses from nearly 52,000 people and also calculated trends in the total time people spent sitting from 2007 to 2016. Overall, teens and adults in 2016 spent an average of an hour more each day sitting than they did in 2007. And most people devoted that time parked in front of the TV or videos: in 2016, about 62% of children ages five to 11 spent two or more hours watching TV or videos every day, while 59% of teens and 65% of adults did so. Across all age groups, people also spent more time in 2016 using computers when they were not at work or school compared to 2003. This type of screen time increased from 43% to 56% among children, from 53% to 57% among adolescents and from 29% to 50% among adults…

The increase in total sitting time is likely largely driven by the surge in time spent in front of a computer. As eye-opening as the trend data are, they may even underestimate the amount of time Americans spend sedentary, since the questions did not specifically address time spent on smartphones. While some of this time might have been captured by the data on time spent watching TV or videos, most people spend additional time browsing social media and interacting with friends via texts and video chats — much of it while sitting.

Does this mean the Holy Grail of media is screen time that requires standing and/or walking around to avoid sitting too much? Imagine a device that requires some movement to work. This does not have to be a pedal-powered gaming console or smartphone; perhaps just a smartphone that needs to move 100 feet every five minutes to keep working. (Then imagine the workarounds, such as riding a motorized scooter while watching a screen à la WALL-E.)
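For fun, here is a minimal sketch of what such a movement gate might look like. Everything in it is hypothetical: get_position, lock_screen, and unlock_screen are stand-ins for whatever location and display APIs a real device would expose, and the 100-foot and five-minute thresholds come straight from the thought experiment above.

```python
import math
import time

# A minimal sketch of the hypothetical "movement gate" described above.
# get_position, lock_screen, and unlock_screen are placeholders for whatever
# APIs a real device would provide; nothing here is a real smartphone SDK.
MIN_DISTANCE_FT = 100        # required movement per window
CHECK_INTERVAL_SEC = 5 * 60  # five-minute window

def distance_ft(a, b):
    """Straight-line distance in feet between two (x, y) positions."""
    return math.hypot(b[0] - a[0], b[1] - a[1])

def movement_gate(get_position, lock_screen, unlock_screen):
    """Pause the screen whenever the user fails to move the minimum distance."""
    last_position = get_position()
    while True:
        time.sleep(CHECK_INTERVAL_SEC)
        current_position = get_position()
        if distance_ft(last_position, current_position) < MIN_DISTANCE_FT:
            lock_screen()    # not enough movement in the window: pause playback
        else:
            unlock_screen()  # enough movement: keep the screen going
        last_position = current_position
```

The sketch also illustrates the workaround problem: the logic has no way to know whether the distance was covered on foot or on a motorized scooter.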

Of course, the answer might be to simply consume less media content on screens. This might prove difficult: Nielsen reports American adults consume 11 hours of media a day. Even as critics have assailed television, films, and Internet and social media content, Americans still choose (and are also pushed) to watch more.

Mutant statistic: marketing, health, and 10,000 steps a day

A recent study suggests the advice to take 10,000 steps a day for better health may not be based in research:

I-Min Lee, a professor of epidemiology at the Harvard University T. H. Chan School of Public Health and the lead author of a new study published this week in the Journal of the American Medical Association, began looking into the step rule because she was curious about where it came from. “It turns out the original basis for this 10,000-step guideline was really a marketing strategy,” she explains. “In 1965, a Japanese company was selling pedometers, and they gave it a name that, in Japanese, means ‘the 10,000-step meter.’”

Based on conversations she’s had with Japanese researchers, Lee believes that name was chosen for the product because the character for “10,000” looks sort of like a man walking. As far as she knows, the actual health merits of that number have never been validated by research.

Scientific or not, this bit of branding ingenuity transmogrified into a pearl of wisdom that traveled around the globe over the next half century, and eventually found its way onto the wrists and into the pockets of millions of Americans. In her research, Lee put it to the test by observing the step totals and mortality rates of more than 16,000 elderly American women. The study’s results paint a more nuanced picture of the value of physical activity.

“The basic finding was that at 4,400 steps per day, these women had significantly lower mortality rates compared to the least active women,” Lee explains. If they did more, their mortality rates continued to drop, until they reached about 7,500 steps, at which point the rates leveled out. Ultimately, increasing daily physical activity by as little as 2,000 steps—less than a mile of walking—was associated with positive health outcomes for the elderly women.

This sounds like a “mutant statistic” of the kind sociologist Joel Best describes. The study suggests the figure originally arose for marketing purposes and was less about the actual numeric quantity and more about a particular cultural reference. From there, the figure spread until it became a normal part of cultural life and organizational behavior as people and groups aimed to walk 10,000 steps. Few people likely stopped to think about whether 10,000 was an accurate figure or an empirical finding. As a marketing ploy, it seems to have worked.

This should raise larger questions about how many other publicly known figures are more fabrication than empirically based. Do these figures tend to pop up in health statistics more than in other fields? Does countering the figures with an academic study stem the tide of their usage?
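As a side note, the step-to-distance arithmetic quoted above is easy to check. A quick sketch, assuming an average stride of roughly 2.5 feet (my assumption; the article does not give a stride length):

```python
# Convert daily step counts to approximate walking distance in miles.
FEET_PER_STEP = 2.5   # assumed average stride length
FEET_PER_MILE = 5280

def steps_to_miles(steps, feet_per_step=FEET_PER_STEP):
    """Approximate distance in miles for a given daily step count."""
    return steps * feet_per_step / FEET_PER_MILE

for steps in (2_000, 4_400, 7_500, 10_000):
    print(f"{steps:>6} steps ~ {steps_to_miles(steps):.1f} miles")

# 2,000 steps comes out to roughly 0.9 miles, consistent with the article's
# "less than a mile of walking"; 10,000 steps is roughly 4.7 miles.
```

Nothing about this arithmetic validates the 10,000 figure; it simply shows how modest the 2,000-step increment discussed in the study really is.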


“Distinctive behaviors of the actively religious” across countries

Pew analyzed international data and found that, in a number of countries, individuals who are actively religious behave differently than others in their nation:

[Chart: “Distinctive Behaviors of the Actively Religious”]

It appears that religiosity affects certain areas more consistently – particularly smoking, voting, happiness, and participation in nonreligious organizations – than others, even as the relationships between religiosity and health, well-being, and prosocial behaviors differ across countries. Of course, explaining why some of these relationships exist and not others – for example, why the more religious do not smoke yet religiosity has no impact on obesity or exercise – gets more complicated…

Argument that obesity and McMansions are linked

One “muckraker” tries to suggest that bigger houses – such as McMansions – make it easier for people to be obese:

No, the truth is that like cars, McMansion houses, food portions and soft drink sizes, Americans are getting bigger every day–and because it is happening everywhere, few notice. Worse, the harder we try to lose poundage with low calorie foods, fitness centers and personal trainers, the bigger we are becoming. Even people in non-industrialized countries are packing on the pounds as Big Food peddles its high calorie, addictive processed food in “new markets.”

A correlation without causation argument. And you do not have to go to McMansions to make the same claim: the average size of new homes has increased from roughly 1,000 square feet to 2,500 square feet over sixty years. But how might we really show that having some bigger items in our lives leads to having other bigger items in our lives? Would the reverse also be true: if we had increasingly smaller items in our lives, would we desire smallness overall? If these are all linked, perhaps we could tie this to the big American frontier or the large American ideals at the founding of the country.

Perhaps there are other arguments to be made here. Do McMansions offer more space for people to spread out? Or, could heavier people be more likely to purchase McMansions (and is this related more to their stage in life)?

Smoking as a marker of social class

Recent data shows who in America is smoking and who is not:

Among the nation’s less-educated people — those with a high-school-equivalency diploma — the smoking rate remains more than 40 percent, according to the Centers for Disease Control and Prevention. Today, rural residents are diagnosed with lung cancer at rates 18 to 20 percent above those of city dwellers. By nearly every statistical measure, researchers say, America’s lower class now smokes more and dies more from cigarettes than other Americans.


This widening gap between classes carries huge health implications and is already reshaping the country’s battle over tobacco control. Cigarette companies are focusing their marketing on lower socioeconomic communities to retain their customer base, researchers say. Nonprofit and advocacy groups are retooling their programs for the complex and more difficult work of reaching and treating marginalized groups…

When smoking first gained popularity in the early 20th century, it was a habit of the rich, a token of luxury dusted with Hollywood glamour. Then came the 1964 surgeon general’s report on its deadly effects, and during the next 3½ decades, smoking among the nation’s highest-income families plummeted by 62 percent. But among families of the lowest income, it decreased by just 9 percent.

It is remarkable how little one encounters smoking in wealthier communities compared to less well-off places. Would smoking be one of the best single lifestyle indicators that someone has less education? Imagine a game where you had to guess someone’s education or social class based on observing their normal behavior in public.

Thinking more broadly, perhaps the newest major marker of having more education and a higher social class is good health and the lifestyle associated with it, everything from gym memberships to regular jogging to eating patterns to intense outdoor sports and hobbies. It is not just smoking; these class differences cut across a variety of conditions and behaviors.

Zoning trade-off: privacy vs. adverse effects

The conclusion of Sonia Hirt’s book Zoned in the USA sums up the advantages and disadvantages of a zoning system that privileges the single-family home:

Arguably, zoning – the kind of zoning that makes explicitly private space the formative compositional element of America’s settlements – does deliver the gift of privacy to American families. But put all the other arguments mentioned in the previous paragraphs together, and one begins to wonder whether the original promises of zoning were either highly suspect from the beginning or have since been turned on their heads. Paradoxically (from the viewpoint of zoning’s founders), we may now have more pollution and worse public health with our current zoning than we would have if we had modified our land-use laws more substantially over the last hundred years.

As Hirt discusses, residents can have their own private homes – the largest new single-family homes in the world – but that comes at the cost of more traffic and commuting, worse pollution, greater land consumption, and worse health, along with some unrealized promises of zoning, including reduced crime. Some would argue that the privacy is overrated as well: compared to many other countries, Americans have given up on public life.

While it is easier to imagine mixed uses in dense urban neighborhoods – imagine Jane Jacobs’ vision of a bustling mixed-use New York neighborhood – it is harder to imagine mixed uses throughout the vast expanses of American suburbs. Even New Urbanists have tended to design neighborhoods or shopping centers dropped into suburban settings rather than remaking the whole fabric of suburban communities. From the beginning of American suburbs, there was the idea that the urban dweller was escaping to a cottage in nature. The home out there offered refuge from people, dirt, and bustle. Today, this legacy lives on when suburban residents oppose certain land uses near their homes for fear of a lower quality of life and subsequently reduced property values.

Ultimately, would the American suburbs even exist without the fundamental desire for privacy?

In one decade, major suburban areas go from lowest drug overdose rates to highest

A new report details the rise of drug overdose deaths in suburbs:

Released Wednesday, the Robert Wood Johnson Foundation’s 2017 County Health Rankings and an accompanying report analyze county-level data from all 50 states on more than 30 public health outcomes and behaviors. The report finds there’s been a clear flip in the geography of addiction: One decade ago, large suburban areas experienced the lowest rates of premature deaths due to drug overdoses. In 2015, they had the highest.

The Johnson Foundation’s analysis doesn’t pinpoint which counties experienced the most dramatic gains in drug-induced death. What it does is rank every county in the U.S., by state, using data that reflects local health conditions, such as diabetes and obesity, as well as measures that can predict health outcomes, including teen birth, smoking rates, and grocery store access…

Comparing those numbers to the Johnson Foundation report, I found startling disconnects between deadly drug problems and places that have an otherwise fairly “healthy” facade. For example, Essex County ranks sixth out of the 14 counties in the Bay State by the new report—middle-of-the-road when it comes to the chronic health conditions that normally wave red flags for public health researchers. Yet it’s increasingly afflicted by drug-related deaths.

On the fringes of Cincinnati, Boone County, Kentucky, ranks first out of 120 across its state on all other health rankings. As in Essex County, rates of diabetes, smoking, and teen births are relatively low; poverty is suppressed, and employment is solid. Yet a look at CDC data shows the county saw its drug-related death rate leap from 26 in 2010 to nearly 46 in 2015. Ranked smack in the middle of Ohio’s 88 counties and also included in the Cincinnati metro area, Clermont County saw a similar leap. Another example: Clay County, part of the Jacksonville, Florida, metro area, is 11th of the Sunshine State’s 67 counties. But drug-related deaths increased from 14 in 2010 to 23 in 2015.

It has been, and will continue to be, interesting to observe how this is treated by the media, government, and the public. This would be a good case for studying how a social problem develops: American society is so large that not everything can receive the attention it deserves. For example, how do reactions to suburban drug use differ from how Americans treat drug use in cities (or rural areas, which rarely get any attention)? How is the drug use explained: as part of criminal activity, irresponsibility, broken homes and/or neighborhoods, wealth, or addiction? Because these deaths are happening to suburbanites – who, as this article notes, are supposed to be healthier and often are – the story will be different.