About one-third of Americans have a four-year college degree, and they are living longer and more prosperous lives while the rest face rising death rates and declining prospects, said researcher Angus Deaton, a professor at the University of Southern California’s Center for Health Policy and Economics.
For a good segment of Americans, college is the expected path after high school and one that leads to future opportunities, particularly jobs. But many American adults did not or do not follow that path, and this has all kinds of consequences. At the least, it can remind current college students and instructors that college is an opportunity and/or a blessing, not just something to be endured for later outcomes. More broadly, that degree can separate workers in the job market, lead to subsequent educational opportunities, and, as this study suggests, interact with health.
If people gather for Thanksgiving, experts are advising they meet and eat outside. Here is one example:
How much safer is an outdoor meal than an indoor meal?
Much, much safer. Almost all transmission of this virus happens indoors.
Even if people are close together?
Eating outdoors doesn’t mean you’re invincible. Still try to stay six feet apart. If you huddle together around a cramped table and have close, face-to-face conversations with the people next to you, you could absolutely infect them.
This is the time for the patio or lawn, found at millions of single-family homes across the suburbs, to shine. The lawn does not have to be just a status symbol; it can confer health benefits by allowing people to spread out.
This is not the first time the suburban lawn has been said to boost health. As urbanization gathered pace in the nineteenth century, suburban lawns provided space away from polluted and noisy cities. Listening to the radio the other day, I again heard how River Forest, Illinois, was intentionally designed with features meant to highlight nature.
Before COVID-19, the suburban lawn was also said to aid good health. It helps people get outside to work and move around (canceled out by the use of gas-powered equipment?). It encourages kids to play in a safe space. Depending on the season and/or weather, the patio and yard can act as an outdoor extension of private living space.
Now, the lawn and patio can be a private spot away from COVID-19. Outsiders are not welcome. The fresh air, breeze, and distance can limit transmission. Nature, or “nature” in many suburban settings, can serve as an oasis. All that lawn and patio maintenance can be put to use. And, hopefully, people can stay COVID free.
Being 35 or older is labeled by the medical community as “advanced maternal age.” In diagnosis code speak, these patients are “elderly,” or in some parts of the world, “geriatric.” In addition to being offensive to most, these terms—so jarringly at odds with what is otherwise considered a young age—instill a sense that one’s reproductive identity is predominantly negative as soon as one reaches age 35. But the number 35 itself, not to mention the conclusions we draw from it, has spun out of our collective control…
The 35-year-old threshold is not only known by patients, it is embraced by doctors as a tool that guides the care of their patients. It’s used bimodally: If you’re under 35, you’re fine; if you’re 35 or older, you have a new host of problems. This interpretation treats the issue at hand as what is known as a “threshold effect.” Cross the threshold of age 35, it implies, and the intrinsic nature of a woman’s body has changed; she falls off a cliff from one category into another. (Indeed, many of my patients speak of crossing age 35 as exactly this kind of fall, with their fertility “plummeting” suddenly.) As I’ve already stated, though, the age-related concerns are gradual and exist along a continuum. Even if the rate of those risks accelerates at a certain point, it’s still not a quantum leap from one risk category to another.
This issue comes up frequently in science and medicine. In order to categorize things that fall along a continuum, things that nature itself doesn’t necessarily distinguish as being separable into discrete groups, we have to create cutoffs. Those work very well when comparing large groups of patients, because that’s what the studies were designed to do, but to apply those to individual patients is more difficult. To a degree, they can be useful. For example, when we are operating far from those cutoffs—counseling a 25-year-old versus a 45-year-old—the conclusions to draw from that cutoff are more applicable. But operate close to it—counseling a 34-year-old trying to imagine her future 36-year-old self—and the distinction is so subtle as to be almost superfluous.
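The cutoff-versus-continuum point can be sketched in a few lines of Python. The risk curve below is purely hypothetical (an invented shape and invented constants, not clinical data); it only illustrates how a hard threshold at 35 imposes a categorical jump on an underlying gradual function.

```python
# Illustrative only: a hypothetical, smoothly increasing "risk" curve by age.
# The functional form and constants are invented, not clinical data.
def gradual_risk(age):
    return 0.01 * 1.08 ** (age - 25)  # rises roughly 8 percent per year

def threshold_label(age):
    # The binary cutoff as used in practice: under 35 vs. 35 and over.
    return "advanced maternal age" if age >= 35 else "standard"

for age in (25, 34, 36, 45):
    print(age, threshold_label(age), round(gradual_risk(age), 3))
```

Far from the cutoff, the label tracks a real gap (in this toy model, risk at 45 is several times risk at 25); but a 34-year-old and a 36-year-old differ by a factor of only 1.08² ≈ 1.17 while landing in different categories, which is exactly the "threshold effect" the article questions.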
The trade-offs seem clear. A single cutoff at age 35, where the data turns from one category to another, simplifies the research findings (though the article suggests they may not actually point to 35) and allows doctors and others to offer clear guidance. The number is easy to remember.
A continuum, on the other hand, might better fit the data where there is not a clear drop-off at an age near 35. The range offers more flexibility for doctors and patients to develop an individualized approach.
Deciding which is better requires thinking about the advantages of each, the purpose of the categories, and who wants what information. The “easy” answer is that both sets of categories can exist; people could keep in mind a rough estimate of 35 while doctors and researchers could have conversations where they discuss why that particular age may or may not matter for a person.
More broadly, learning more about continuums and considering when they are worth deploying could benefit our society. I realize I am comfortable with them; sociologists suggest many social phenomena fall along a continuum with many cases falling in between. But, this tendency toward continuums or spectrums or more nuanced or complex results may not always be helpful. We can decry black and white thinking and yet we all need to regularly make quick decisions based on a limited number of categories (I am thinking of System 1 thinking described by behavioral economists and others). Even as we strive to collect good data, we also need to pay attention to how we organize and communicate that data.
Writing in the journal PNAS, researchers from several California universities describe how they used anonymized cell phone location data and census info to show a dramatic reversal in how mobile Americans have been this year. Before Covid-19 struck, rich Americans moved about more than poor Americans—they can always afford to travel. But between January and April, that flipped. Rich folk are now far more likely to stay completely at home than poor folk: The study found that 25 percent more high earners stayed completely at home during the pandemic, compared to the number of them who had stayed home before. That increase was only 10 percent among low earners. And that has major implications for how we as a nation can fight the pandemic.
“In the early stages of the Covid-19 pandemic, there was a clear mobility response across the board,” says University of California, Davis environmental economist Joakim Weill, lead author on the paper. “In the US, everyone started to stay at home more. But we also found that there is a clear differential between wealthier communities and poor communities, where individuals in wealthier neighborhoods tended to stay at home much more than people in poorer neighborhoods.”…
Close to half of the wealthiest Americans stayed completely at home on weekdays in April, compared to less than 40 percent of low-earners. The poor traveled farther distances on average: In the same month, people who live in lower-income areas traveled between 5 and 6 kilometers, while the rich traveled closer to 4. The rich nearly halved their visits to recreational and retail areas in April, while the poor cut their visits by only a quarter—perhaps because their jobs required them to return to work there.
To be clear, the researchers can’t definitively say why the data shows this dramatic discrepancy, but they can begin to speculate. For one, essential workers often earn lower incomes, like clerks at grocery stores and pharmacies. Indeed, the US Bureau of Labor Statistics has found that among Americans 25 and older with less than a high school diploma, just 5 percent teleworked in June. On the other hand, 54 percent of Americans with a bachelor’s or more advanced degree were able to work remotely.
As the article goes on to note, the fact that anyone can contract COVID-19 is not the same as saying everyone has the same likelihood of contracting COVID-19. Those with resources have more options for responding to crises as well as more options for treatment. These differences are generally present in health, but a large pandemic reveals some of the underlying patterns that deserve attention.
Last week, residents in Darien, Connecticut, a tony exurb of New York City, successfully lobbied to shut down plans for a coronavirus testing site, despite surging demand. The reason? Complaints from neighbors. As it turns out, the “Not In My Backyard” impulse to block new development — which has been implicated in the severe affordability crisis affecting cities from coast to coast — translates far too neatly into blocking certain measures needed to stop the spread of the virus.
In a similar case in Ewing, New Jersey, a local landlord issued a cease-and-desist letter to the operator of a coronavirus testing center amid complaints about congestion in the parking lot. As The Trentonian reported, one resident who wanted to be tested in order to protect his three-year-old child wasn’t subtle about how he felt about the decision: “It blows my f**king mind.”
Community resistance from neighbors of testing sites is a rerun of the fierce NIMBY reaction to potential coronavirus quarantine sites. Back in February, California began looking for a place to shelter Americans returning from abroad with the virus and settled on an isolated medical campus in Costa Mesa. But after local residents complained, city officials sought and received a court injunction to stop the project.
As the need for quarantine sites expanded, so did the NIMBY backlash. Finding sites that won’t suffer the same fate has proven to be a major hurdle as the federal government attempts to manage the crisis. Back when the focus was still on returning cruise ship passengers, officials in Alabama went to the mat to keep passengers of the Diamond Princess cruise ship out of a local FEMA facility, eventually forcing the federal government to scrap the plan altogether. Similar fights have played out from Seattle to San Antonio, potentially undercutting the response to the coronavirus at key early stages. As a result, the federal government largely shifted quarantining efforts to military bases, where complaining neighbors hold less sway…
At first glance, it might seem like efforts to block potentially life-saving public health screenings and complaints about community character have little in common. But in both cases, the formula is the same: Whether out of an understandable fear of the unknown or a selfish desire to shift the burden elsewhere, local impulses are given veto power over broader social needs. Under normal conditions, the inability to constructively manage this means higher rents. In a public health emergency, it could be lethal.
In addition to what is in the last paragraph quoted above, I am struck by the resistance to facilities and sites meant to address temporary concerns. It is one thing to object to a long-term health facility (see recent posts about a drug treatment facility in the western suburbs of the Chicago area here and here) but another to resist something that is needed now and is presumably not permanent. Of course, this could be part of the fear: if a site treats COVID-19, could it later be turned into a more permanent fixture in the community?
The logical extension of the NIMBY claims would be to push COVID-19 treatment or testing facilities to communities that could not resist them. When this plays out in areas like housing or unwanted land uses, communities with less wealth and political power tend to become home to land uses that wealthier communities refuse. If such a pattern occurs here (and there is evidence that health differs dramatically by location in the United States), it would be evidence that pandemics deepen locational and health inequalities.
As maps like this show, major metropolitan areas are bearing the brunt of the Covid-19 infections spreading across North America. And that makes sense: Though there’s no way to know for sure how the virus arrived, it almost certainly came by way of an international flight to a major airport (or several of them). But while infectious disease spreads faster where people are more densely clustered — hence the strategy of social distancing to contain the coronavirus — that doesn’t necessarily make suburban or rural areas safer, health experts say…
That is not to say that cities aren’t Petri dishes — they are. Relative to rural areas, urban centers do provide stronger chains of viral transmission, with higher rates of contact and larger numbers of infection-prone people. And historically, urbanites paid a price for this vulnerability…
Modern transportation networks have made the population shield that rural areas once provided much more porous. Now that humans and freight can travel from, say, Hong Kong to Los Angeles in less than 13 hours — and arrive by vehicle to somewhere sparsely populated hours after that — outbreaks can happen just about anywhere. New pathogens tend to arrive sooner in global hubs, but that doesn’t mean they can’t quickly reach rural locales and proliferate from there, says Benjamin Dalziel, a professor of mathematics at Oregon State University who studies population dynamics…
But while the CDC recommends decreasing social contact to limit the spread of the virus, that’s just as doable in a downtown apartment as a countryside manor. Says Viboud: “If you’re staying at home and limiting outside contact, you’d achieve the same purpose.”
Three thoughts come to mind:
This highlights the connectedness of cities and suburbs today, even when significant physical distance separates communities. People travel around the world, to other regions, and throughout regions at rates unmatched in human history, and such travel is relatively easy. Cities and suburbs are not separate places; they are parts of interdependent regions that are highly connected to other places.
Safety and health were part of the creation of the suburbs in the United States, but it is hard to know how this might matter in the future. Given all the reasons people now settle in the suburbs, would avoiding communicable diseases be a top factor? I would think not, particularly compared to factors like housing prices, amenities (schools, quality of life, etc.), or demographics.
If particular places are not that much safer, does the sprawl of American life then limit the response to any illness? Imagine equipping dozens of hospitals spread throughout the Chicago region, as opposed to serving the same number of people packed into a smaller area where it is easier to get supplies and personnel to medical facilities. Or consider the challenge of supplying grocery stores across a huge region.
Mall of America in Bloomington, Minnesota, America’s largest mall, announced plans last week to open a 2,300-square-foot walk-in clinic in November with medical exam rooms, a radiology room, lab space and a pharmacy dispensary service. Mall of America is teaming up with University of Minnesota physicians and a Minnesota-based health care system to operate the clinic…
While mall leases for clothing retailers declined by more than 10% since 2017, medical clinics at malls have risen by almost 60% during the same period, according to Drew Myers, real estate analyst at CoStar Group. The growth of medical clinic leases at malls has been the “strongest among all major retail sectors over the past five years,” he said.
Mall landlords are betting that when patients visit for a flu shot or eye exam, they’ll shop around for clothes or electronics. Adding medical clinics also makes sense for mall owners because they draw in doctors, nurses and technicians every day who may shop and eat at restaurants, according to a May research report by real estate firm JLL. Health care providers are also attractive tenants for mall landlords because they tend to have high credit ratings and sign longer leases compared with other retailers, JLL analysts noted.
On the provider and health insurer side, shopping malls give companies convenient locations to set up outpatient care posts and preventative care locations for patients. Providers are increasingly looking to these lower-cost clinics to help patients avoid expensive trips to the emergency room.
That’s what Yin Cao and an international group of colleagues wanted to find out in their latest study published in JAMA. While studies on sitting behavior in specific groups of people — such as children or working adults with desk jobs — have recorded how sedentary people are, there is little data on how drastically sitting habits have changed over time. “We don’t know how these patterns have or have not changed in the past 15 years,” says Cao, an assistant professor in public health sciences at the Washington University School of Medicine.
The researchers used data collected from 2001 to 2016 by the National Health and Nutrition Examination Survey (NHANES), which asked a representative sample of Americans ages five and older how many hours they spent watching TV or videos daily in the past month, and how many hours they spent using a computer outside of work or school. The team analyzed responses from nearly 52,000 people and also calculated trends in the total time people spent sitting from 2007 to 2016. Overall, teens and adults in 2016 spent an average of an hour more each day sitting than they did in 2007. And most people devoted that time parked in front of the TV or videos: in 2016, about 62% of children ages five to 11 spent two or more hours watching TV or videos every day, while 59% of teens and 65% of adults did so. Across all age groups, people also spent more time in 2016 using computers when they were not at work or school compared to 2003. This type of screen time increased from 43% to 56% among children, from 53% to 57% among adolescents and from 29% to 50% among adults…
The increase in total sitting time is likely largely driven by the surge in time spent in front of a computer. As eye-opening as the trend data are, they may even underestimate the amount of time Americans spend sedentary, since the questions did not specifically address time spent on smartphones. While some of this time might have been captured by the data on time spent watching TV or videos, most people spend additional time browsing social media and interacting with friends via texts and video chats — much of it while sitting.
Does this mean the Holy Grail of media is screentime that requires standing and/or walking around to avoid sitting too much? Imagine a device that requires some movement to work. This does not have to be a pedal-powered gaming console or smartphone but perhaps just a smartphone that needs to move 100 feet every five minutes to continue. (Then imagine the workarounds, such as riding a motorized scooter while watching a screen a la Wall-E.)
Of course, the answer might be to just consume less media content on screens. This might prove difficult. Nielsen reports American adults consume 11 hours of media a day. Even as critics have assailed television, films, and Internet and social media content, Americans still choose (and are pushed as well) to watch more.
I-Min Lee, a professor of epidemiology at the Harvard University T. H. Chan School of Public Health and the lead author of a new study published this week in the Journal of the American Medical Association, began looking into the step rule because she was curious about where it came from. “It turns out the original basis for this 10,000-step guideline was really a marketing strategy,” she explains. “In 1965, a Japanese company was selling pedometers, and they gave it a name that, in Japanese, means ‘the 10,000-step meter.’”
Based on conversations she’s had with Japanese researchers, Lee believes that name was chosen for the product because the character for “10,000” looks sort of like a man walking. As far as she knows, the actual health merits of that number have never been validated by research.
Scientific or not, this bit of branding ingenuity transmogrified into a pearl of wisdom that traveled around the globe over the next half century, and eventually found its way onto the wrists and into the pockets of millions of Americans. In her research, Lee put it to the test by observing the step totals and mortality rates of more than 16,000 elderly American women. The study’s results paint a more nuanced picture of the value of physical activity.
“The basic finding was that at 4,400 steps per day, these women had significantly lower mortality rates compared to the least active women,” Lee explains. If they did more, their mortality rates continued to drop, until they reached about 7,500 steps, at which point the rates leveled out. Ultimately, increasing daily physical activity by as little as 2,000 steps—less than a mile of walking—was associated with positive health outcomes for the elderly women.
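The dose-response shape described here, with benefit rising to about 7,500 steps per day and then leveling off, can be caricatured with a simple saturating function. The exponential form and the 3,000-step scale below are assumptions chosen for illustration, not values fitted to the JAMA data.

```python
import math

# Hypothetical saturating dose-response curve: "benefit" grows with daily
# steps but flattens. The functional form and scale are illustrative
# assumptions, not results from the study.
def relative_benefit(steps, scale=3000):
    return 1 - math.exp(-steps / scale)

for steps in (2700, 4400, 7500, 10000):
    print(steps, round(relative_benefit(steps), 2))
```

In this toy curve, most of the attainable benefit is reached by roughly 7,500 steps, and additional steps add little, mirroring the leveling-out Lee's team observed rather than a simple "more is always better" rule.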
This sounds like a “mutant statistic” of the kind sociologist Joel Best describes. The study suggests the figure originally arose for marketing purposes and was less about the actual numeric quantity and more about a particular cultural reference. From there, the figure spread until it became a normal part of cultural life and organizational behavior as people and groups aimed to walk 10,000 steps. Few people likely stopped to think about whether 10,000 was an accurate figure or an empirical finding. As a marketing ploy, it seems to have worked.
This should raise larger questions about how many other publicly known figures are more fabrication than empirically based. Do these figures tend to pop up in health statistics more than in other fields? Does countering the figures with an academic study stem the tide of their usage?
It appears that religiosity affects certain areas more consistently – particularly smoking, voting, happiness, and participation in nonreligious organizations – than others, even as these relationships between religiosity and health, well-being, and prosocial behaviors differ across countries. Of course, why some of these relationships exist and not others, even within the same category (the more religious smoke less, for example, yet religiosity has no impact on obesity or exercise), gets more complicated…