Bringing medical clinics to vacant shopping mall space

Filling vacant space in emptying shopping malls can be a hard task. Add medical services to the list of possible replacement uses:

Mall of America in Bloomington, Minnesota, America’s largest mall, announced plans last week to open a 2,300-square-foot walk-in clinic in November with medical exam rooms, a radiology room, lab space and a pharmacy dispensary service. Mall of America is teaming up with University of Minnesota physicians and a Minnesota-based health care system to operate the clinic…

While mall leases for clothing retailers declined by more than 10% since 2017, medical clinics at malls have risen by almost 60% during the same period, according to Drew Myers, real estate analyst at CoStar Group. The growth of medical clinic leases at malls has been the “strongest among all major retail sectors over the past five years,” he said.

Mall landlords are betting that when patients visit for a flu shot or eye exam, they’ll shop around for clothes or electronics. Adding medical clinics also makes sense for mall owners because they draw in doctors, nurses and technicians every day who may shop and eat at restaurants, according to a May research report by real estate firm JLL. Health care providers are also attractive tenants for mall landlords because they tend to have high credit ratings and sign longer leases compared with other retailers, JLL analysts noted.

On the provider and health insurer side, shopping malls give companies convenient locations to set up outpatient care posts and preventative care locations for patients. Providers are increasingly looking to these lower-cost clinics to help patients avoid expensive trips to the emergency room.

In addition to blending shopping and medical trips (dubbed “medtail” in the article), the medical offices can serve the new residents and commercial tenants now occupying shopping mall space. Just wait until the new hospital takes over the mall and patients and visitors can walk out one door and into a clothing store down the hall.

More broadly, this hints at a blending of activity within single structures that suburbs are not used to. Suburbs are known for separating land uses, often with the goal of protecting single-family homes. Suburban downtowns, places where multiple uses might be found, are limited and now often seem geared more toward entertainment and cultural use. Could the shopping mall truly be a community center in the coming decades with more residential units, medical offices, and community spaces?

When medical care didn’t contribute as much to improved health outcomes

An interesting piece on the efficacy of medicine and medical procedures (TLDR: they aren’t always effective but doctors and patients feel compelled to try something) ends with this suggestion about the power medicine has over the public:

Historians of public health know that most of the life-expectancy improvements in the last two centuries stem from innovations in sanitation, food storage, quarantines, and so on. The so-called “First Public Health Revolution”—from 1880 to 1920—saw the biggest lifespan increase, predating antibiotics or modern surgery.

In the 1990s, the American Cancer Society’s board of directors put out a national challenge to cut cancer rates from a peak in 1990. Encouragingly, deaths in the United States from all types of cancer since then have been falling. Still, American men have a ways to go to return to 1930s levels. Medical innovation has certainly helped; it’s just that public health has more often been the society-wide game changer. Most people just don’t believe it.

In 2014, two researchers at Brigham Young University surveyed Americans and found that typical adults attributed about 80 percent of the increase in life expectancy since the mid-1800s to modern medicine. “The public grossly overestimates how much of our increased life expectancy should be attributed to medical care,” they wrote, “and is largely unaware of the critical role played by public health and improved social conditions determinants.” This perception, they continued, might hinder funding for public health, and it “may also contribute to overfunding the medical sector of the economy and impede efforts to contain health care costs.”

It is a loaded claim. But consider the $6.3 billion 21st Century Cures Act, which recently passed Congress to widespread acclaim. Who can argue with a law created in part to bolster cancer research? Among others, the heads of the American Academy of Family Physicians and the American Public Health Association. They argue against the new law because it will take $3.5 billion away from public-health efforts in order to fund research on new medical technology and drugs, including former Vice President Joe Biden’s “cancer moonshot.” The new law takes money from programs—like vaccination and smoking-cessation efforts—that are known to prevent disease and moves it to work that might, eventually, treat disease. The bill will also allow the FDA to approve new uses for drugs based on observational studies or even “summary-level reviews” of data submitted by pharmaceutical companies. Prasad has been a particularly trenchant and public critic, tweeting that “the only people who don’t like the bill are people who study drug approval, safety, and who aren’t paid by Pharma.”

We might attribute this overconfidence in medical care among Americans to two cultural traits: (1) a belief that science can and should solve problems and lead to better lives and (2) an interest in efficient solutions to complex problems. Yet one takeaway is that a healthier lifestyle may be boring and hard to implement (at both the individual and community level) but could be more effective in the long term than medical intervention.

Medical TV shows skew how Americans view doctors, health

The portrayals of medical work on television have had an effect on American viewers:

A 2005 survey by the Centers for Disease Control and Prevention found that the majority of primetime TV viewers reported learning something new about a disease or other health issue over six months of viewing. About one-third of viewers took some kind of action after learning about a health issue on TV…

As a result, “a fan of medical dramas … can develop a skewed perception of what are more or less prevalent health issues in the real world,” study author Dr. Jae Eun Chung, an assistant professor in the school of communication at Howard University, told me in an email. Heavy viewers of medical dramas in her study were less likely to rate cardiovascular disease and cancer as important societal issues (when they are, in fact, the top two causes of death in the U.S.), and when it came to cancer, they were more fatalistic, “more likely to say that cancer prevention is uncertain and that the disease is fatal.”…

Studies of modern medical shows have found fictional doctors’ professionalism disappointing at best. In an analysis of 50 episodes of Grey’s Anatomy and House, researchers found that the characters handled issues involving patient consent well 43 percent of the time. “The remainder [of the depictions] were inadequate,” the study says.

The analysis also found several incidents of doctors endangering patients without being punished, sexual misconduct (of course), and disrespect. The study notes that “88 percent of disrespectful incidents in House involved Dr. House.”

But despite all the inappropriate romances, and Dr. House’s rude mouth, the analysis found that there’s one arena in which TV doctors still shine: caring for patients.

An interesting contrast: TV doctors are caring, but to be so they cut corners on professionalism as they treat a whole range of odd medical conditions. Additionally, could there be a compelling medical TV show that addresses common American health problems like obesity and heart disease? (Cancer does get some coverage on these shows.)

All of this makes me wonder whether all professions would really want their activities portrayed on TV. While certain fields, like sociology, may not get much airtime, wouldn’t TV likely warp any of these professions in the name of entertainment and stereotypes?

Quick Review: League of Denial

I had a chance this past week to read the book League of Denial and see the PBS documentary of the same name. Some thoughts about the story of the NFL and concussion research (focusing mostly on the book, which provides a more detailed narrative):

1. I know some fans are already complaining of “concussion fatigue,” but it is hard to think of football the same way after hearing this story. For decades, we have celebrated players for their toughness, and yet the game may be ruining their brains.

2. The human story in all of this is quite interesting. This includes some of the former football players who have been driven to the edge by their football-related brain injuries. At the same time, the story among the doctors is also fascinating: the chase for fame, published articles, and acquired brains. Running through the whole book is the question of who is really doing this research for the right reasons. Even if the NFL’s research appears irrevocably tainted, are the researchers on the other side completely neutral or pure of heart?

3. The whole scientific process is laid out in the book (glossed over more in the documentary)…and I’m not sure how it fares. You have scientists fighting each other to acquire brains. You have peer-reviewed research – supposed to help prevent erroneous findings – that is viewed by many as erroneous from the start. You have scientists fighting for funding, an ongoing battle for all researchers as they must support their work and have their own livelihoods. In the end, consensus seems to be emerging but the book and documentary highlight the messy process it takes to get there.

4. The comparisons of the NFL to Big Tobacco seem compelling: the NFL tried to bury concussion research for decades and still doesn’t admit to a long-term impact of concussions on its players. One place where the comparison might break down for the general public (and scientific research could change this in the near future): the worst problems appear in long-time NFL players. When exactly does CTE start in the brains of football players? There is some evidence that younger players, in college or high school, might already have CTE, but we need more evidence to be sure. If it is established that kids as young as junior high already have CTE, and that CTE derives from regular hits at a young age (not the big knock-out blows), the link to Big Tobacco might be complete.

5. It is not really part of this story, but I was struck again by how relatively little we know about the brain. Concussion research didn’t really take off until the 1990s, even though concussions had afflicted football players for decades. (One sport where it had been studied: boxing.) Much of this research is quite new and is a reminder that we humans don’t know as much as we might think.

6. This also provides a big reminder that the NFL is big business. Players seem the most aware of this: they can be cut at any time and an injury outside of their control could end their careers. The league and owners do not come off well here as they try to protect their holdings. The employees – the players – are generally treated badly: paid well if they perform but thrown aside otherwise. This may lead to a “better product” on the field but the human toll is staggering.

7. How opinions about concussions change, among both fans and players, will be fascinating to watch. It will take quite a shift for players to move from the tough-guy image to being willing to consider their futures more carefully. Fans may become more understanding as their favorite players weigh what concussions might do to their lives. Will the NFL remain as popular? Hard to say, though I imagine most fans had little problem watching plenty of gridiron action this past Saturday and Sunday.

Health includes social and behavioral dimensions

There may be privacy concerns about the government having behavioral and social data as part of medical records but that doesn’t necessarily mean they aren’t important factors when looking at health:

The Centers for Medicare and Medicaid Services (CMS) wants to require health care providers to include “social and behavioral” data in Electronic Health Records (EHR) and to link patient’s records to public health departments, it was announced last week.

Health care experts say the proposal raises additional privacy concerns over Americans’ personal health information, on top of worries that the Obamacare “data hub” could lead to abuse by bureaucrats and identity theft…

The “meaningful use” program already requires doctors and hospitals to report the demographics of a patient and if he smokes to qualify for its first step. The second stage, planned for 2014, will require recording a patient’s family health history.

The National Academy of Sciences will make recommendations for adding social and behavioral data for stage three, which will be unveiled in 2016.

Maybe these are separate concerns: one might argue such data is worthwhile but not trust the government with it. But I suspect there are some who don’t like the collection of social and behavioral data at all; they would argue it is too intrusive. People have made similar complaints about the Census: why exactly does the government need this data anyway?

However, we know that health is not just a physical outcome. You can’t separate health from behavior and social interactions. There is a lot of potential here for new understandings of health and its multidimensionality. Take something like stress: there are physical reactions to it, but it is strongly influenced by context. Treatments could include medication, but that only addresses the physical symptoms rather than limiting or resolving stressful situations.

We’ll see how this plays out. I suspect, federal government involvement or not, medical professionals will be looking more at the whole person when addressing physical concerns.

Quick Review: The Immortal Life of Henrietta Lacks

After a few people mentioned a particular New York Times bestseller to me recently, I decided to read The Immortal Life of Henrietta Lacks. While the story itself was interesting, there is a lot of material here that could be used in research methods and ethics classes. A few thoughts about the book:

1. The story is split into two narratives. One covers both the progress science has made with Henrietta Lacks’s cells and the struggle of her family to understand what has actually been done with them. The story of scientific progress is unmistakable: we have come a long way in identifying and curing some diseases in the last sixty years. (This narrative reminded me of the book The Emperor of All Maladies.)

2. The second narrative is about the personal side of scientific research and how patients and relatives interpret what is going on. The author initially finds that the Lacks know very little about how their sister’s or mother’s cells have been used. These problems are compounded by race, class, and educational differences between the Lacks and the doctors utilizing Henrietta’s cells. In my opinion, this aspect is understated in the book. At the least, it is a reminder of how inequality can affect health care. But I think this personal narrative is the best part of the book. When I talk in class about the reasons for Institutional Review Boards, informed consent, and ethics, students often wonder how much social science research can really harm people. As this book discusses, there are some moments in relatively recent history that we would agree were atrocious: Nazi experiments, the Tuskegee experiments, experiments in Guatemala, and so on. Going beyond those egregious cases, this book illustrates the kind of mental and social harm that can result from research even if using Henrietta’s cells never physically harmed the Lacks. I’m thinking about using some sections of this narrative in class to illustrate what could happen; even if new research appears to be safe, we have to make sure we are protecting our research subjects.

3. This book reminded me of the occasionally paternalistic side of the medical field. The book suggests this isn’t just an artifact of the 1950s or a racial division; doctors appear slow in addressing concerns some people might have about the use of human tissue in research. I realize that there is a lot at stake here: the afterword of the book makes clear how difficult it would be to regulate all of this and how regulation might severely limit needed medical research. At the same time, doctors and other medical professionals could go further in explaining the processes and the possible outcomes to patients. Perhaps this is why the MCAT is moving toward including more sociology and psychology.

4. There is room here to contrast the discussions about using body tissue for research and online privacy. In both cases, a person is giving up something personal. Are people more disturbed by their tissue being used or their personal information being used and sold online?

All in all, this book discusses scientific breakthroughs, how patients can be hurt by the system, and a number of ethical issues that have yet to be resolved.

Adding sociology to the MCAT

I’ve wondered recently about sociology in grade schools and here is news that sociology has been added to the MCAT, the entrance exam for medical schools:

The 2015 change marks the fifth major alteration of the MCAT since it was introduced in the 1920s.

The new MCAT will include the addition of an entirely new section titled “Psychological, Social and Biological Foundations of Behavior,” as well as a more intense examination of the biomedical sciences, such as genetics and biochemistry…

Koetje said the addition of psychology and sociology to the test was necessary because of the advances made in health care as well as the sociocultural changes within the health care system.

“Patients are more complex today, and medical schools have to ensure that these students will be capable of treating the whole person and everything that comes with that,” Koetje said.

As the article hints, will more schools now have pre-med students take courses in sociology and psychology in order to prepare? In response to this question (part of a larger set of FAQs about the change), here is what the AAMC says:

Examinees who would not otherwise take biochemistry, cellular and molecular biology, introductory psychology, and introductory sociology would need to study the concepts tested. We do not anticipate the need for additional coursework in research methods and statistics.

I wonder exactly what sociological concepts will be included on the test. I assume race, social class, and gender are non-negotiable?

This also gives sociology teachers more room to tell undergraduates in Intro to Sociology classes that sociology is helpful for many fields, including medicine.