Two surveys, one from 2014 and one more recent, suggest many older Americans want to stay put:
But the vast majority of older Americans—more than 70 percent of those over 50, according to a 2014 AARP survey—plan to “age in place,” or stay in their homes or communities. And the desire to stay put persists across urban, suburban, and rural residents—even in Snow Belt cities and among those with the financial resources to buy that condo in Boca or Scottsdale…
Welltower, a company that owns health-care real estate, from retirement communities to outpatient medical office buildings, recently surveyed 3,000 people to find out more about this desire among urbanites to age in place. Respondents were of various ages—Baby Boomers, Generation X, and Millennials—and lived in 10 cities across the country, from Seattle to Houston to Boston. One Canadian city—Toronto—was also included.
The survey showed that 7 out of 10 urbanites still want to live in their city after the age of 80. For Boomers, the share was higher, at 8 out of 10. The result was fairly uniform across the cities. Though some residents ranked their metropolises higher for livability for older residents—Washington, D.C., Miami, and Chicago got the highest marks, while Los Angeles, San Francisco, and New York City received the lowest—all respondents were still largely interested in staying and complimentary of their respective cities…
As CityLab reported earlier this year, this presents numerous challenges, especially for those who want to age in place. Only 1 percent of our housing stock is currently equipped with “universal design” elements that aid older residents, like no-step entrances, single-floor living, and wide halls and doorways. And more older adults also means more lower-income adults, who will struggle to afford the rent or mortgage, let alone modify their living space or employ in-home nursing care.
Three quick thoughts in response:
- While we know a lot about residential segregation by race and class, could we be headed toward scenarios where the elderly and younger adults live in very different environments? The two groups could value very different urban features and differ on which amenities they should pay for through taxes. Some of these issues already pop up when senior living facilities are proposed or when more school funding is requested.
- What would happen if communities did not respond much to changes that would help the elderly? Would they revolt?
- Even though the elderly say they do not want to move, perhaps some cities and suburbs could gain a competitive edge by catering to this group. A neighborhood within a particular city could make changes so that while people would have to move, they would not necessarily have to go far. Or, certain communities could become regional centers for the elderly.
As this article notes, this demographic change is coming within the next few decades and it will be interesting to see how communities react.
One way to revive America’s cities may be to adapt to increasing densities in American suburbs:
But this analysis also misses something important. These trends don’t just represent people’s moving decisions — they also represent changes in the places themselves. If enough people move to a low-density area, it becomes a high-density area.
People are pouring into Dallas and San Diego. So unless those cities continue to sprawl ever farther out across the countryside, the new arrivals will increase density. People will want to live close to their jobs instead of enduring hour-long commutes. Apartment blocks will spring up where once-empty fields or single-family homes stood. Today’s fast-growing suburb is tomorrow’s urban area.
In other words, the great urban revival might not be ending, it might just be relocating. Instead of piling into existing cores, Americans might simply be creating new ones across the country. And if each of these new cities creates the productivity advantages enjoyed by places like San Francisco and New York City, this could be a good thing for the economy.
This is an intriguing concept: some suburbs, because of their popularity, willingness to build taller structures, and population size, might become like cities. This has already happened to some degree in a number of suburbs across the country.
Yet, just because a location has a certain number of people or reaches a certain population density does not necessarily mean that it feels or operates like a city. We also already have some denser urban areas – see the Los Angeles suburbs, which are pretty dense compared to many metropolitan areas – but that does not automatically make them cities or urban. What is required? Most American cities have:
- a core or multiple cores that are multi-use and include a good number of businesses or offices;
- walkability that extends for a good distance (beyond just a suburban downtown or large shopping center), plus mass transit options that reach beyond the core(s) – in other words, good options beyond driving a car;
- a vibrancy and diversity that could range from thriving economic activity to restaurants and bars to filled public spaces;
- and an identity, among residents and others, that the area is a city.
Imagine Naperville, Illinois really wanted to become a city. It starts approving dense residential and commercial projects throughout the community. (Just to note: the local government has rejected these in the past.) The population ticks upward past 200,000 or even 300,000. There are still some pockets of single-family homes and vestiges of small-town life. How long would it take for the conditions of a city discussed above to arise? How would the community adapt to having so many businesses along I-88 rather than downtown? Would this limit the number of people who ride into Chicago on the Metra each day? (Naperville right now has the busiest stops in the whole system.) How would a city atmosphere develop? This all would take significant time and effort, perhaps decades, before Naperville would be considered a city from both the inside and the outside.
An article discussing the difficulties of avoiding flooding in a sprawling city like Houston includes this summary of a key problem:
One problem is that people care about flooding, because it’s dramatic and catastrophic. They don’t care about stormwater management, which is where the real issue lies. Even if it takes weeks or months, after Harvey subsides, public interest will decay too. Debo notes that traffic policy is an easier urban planning problem for ordinary folk, because it happens every day.
It is difficult to get people interested in infrastructure that does not affect them daily or that they cannot see. Yet, flooding is a regular issue in many cities and suburban areas, and it can be very hard to remedy once development has already occurred. Indeed, it is difficult to imagine abandoning entire cities or major developments:
The hardest part of managing urban flooding is reconciling it with Americans’ insistence that they can and should be able to live, work, and play anywhere. Waterborne transit was a key driver of urban development, and it’s inevitable that cities have grown where flooding is prevalent. But there are some regions that just shouldn’t become cities.
Given the regularity of flooding in developed areas, it is interesting that there are not more solutions available in the short term. Portable, massive levees? Water gates that can be installed quickly? Superfast pumps that can remove water?
With ownership of dogs on the rise, it is trickier to find public space in cities for the canines and their owners:
It’s not surprising that relations between dog walkers and dog owners are fraught: They’re competing for finite real estate. Market research shows dog ownership has skyrocketed some 29 percent nationwide in the past decade, an increase propelled largely by higher-income millennials. As young adult professionals increasingly put off having families, dogs have become “starter children,” as Joshua Stephens wrote in The Atlantic in 2015. With demand growing, cities and developers are building more dog-friendly zones both in response to and in anticipation of more four-legged residents. Off-leash dog parks are growing faster than any other type of park in America’s largest cities, with 2,200 counted as of 2010.
And when square footage is at a premium, dog parks are the setting of some of the most contentious fights for public space….
Resistance to dog parks takes on a different tenor when animals seem to displace humans in housing-crunched cities. New dog owners are disproportionately younger and whiter than the residents of the cities they move into, and that has real estate implications: For one third of Americans aged 18 to 36 who’d purchased a first home, finding better space for a dog was the primary motivator, according to a SunTrust Mortgage poll. When young, white, affluent dog owners snap up properties in historically lower-income neighborhoods of color—and start advocating for amenities like dog parks, which can bump up property values further—the optics are complicated…
Consider that, on Chicago’s predominantly black South Side, there’s not a single designated dog park, despite the efforts of local dog-owners and city aldermen. To address the needs of lower-income communities like this—as well as gentrifying ones—planners should approach dog parks as they do any other, says Wolch: listen to, and account for, their needs in an ingenuous way. To mitigate the displacement effects of a property value pick-up, affordable housing solutions should come to the dog-park planning table, too.
Three quick thoughts:
- It would be nice to know more about the “average” dog park. What does it cost to build and maintain compared to a typical park? How many people does it serve, and what population does it cater to? Is it a good public use of space compared to other options? (One thing the article above does not address is whether there is an overall shortage of public space.)
- I’m surprised there isn’t more creativity in developing solutions. If land is at a premium, could there be some dog parks inside structures? Imagine a high-rise where one of the amenities is half a floor of dog park space. (I assume this is feasible.) Where there is more available land, such as on Chicago’s South Side, why couldn’t community groups or private interests put together a dog park? Imagine a non-profit that buys vacant lots and improves them for this purpose or an organization that charges a membership fee to their dog parks.
- At this point, it sounds like dog parks are more of a luxury good usually located in wealthier or whiter neighborhoods. What would it take to incorporate dog parks into public planning processes and/or see dog parks as necessary parts of thriving neighborhoods? Dog owners could band together and demand dog parks.
A recent study suggests cities and agriculture may have started much earlier than previously thought:
For centuries, archaeologists believed that ancient people couldn’t live in tropical jungles. The environment was simply too harsh and challenging, they thought. As a result, scientists simply didn’t look for clues of ancient civilizations in the tropics. Instead, they turned their attention to the Middle East, where we have ample evidence that hunter-gatherers settled down in farming villages 9,000 years ago during a period dubbed the “Neolithic revolution.” Eventually, these farmers’ offspring built the ziggurats of Mesopotamia and the great pyramids of Egypt. It seemed certain that city life came from these places and spread from there around the world.
But now that story seems increasingly uncertain. In an article published in Nature Plants, Max Planck Institute archaeologist Patrick Roberts and his colleagues explain that cities and farms are far older than we think. Using techniques ranging from genetic sampling of forest ecosystems and isotope analysis of human teeth, to soil analysis and lidar, the researchers have found ample evidence that people at the equator were actively changing the natural world to make it more human-centric.
It all started about 45,000 years ago. At that point, people began burning down vegetation to make room for plant resources and homes. Over the next 35,000 years, the simple practice of burning back forest evolved. People mixed specialized soils for growing plants; they drained swamps for agriculture; they domesticated animals like chickens; and they farmed yam, taro, sweet potato, chili pepper, black pepper, mango, and bananas…
“The tropics demonstrate that where we draw the lines of agriculture and urbanism can be very difficult to determine. Humans were clearly modifying environments and moving even small animals around as early as 20,000 years ago in Melanesia, they were performing the extensive drainage of landscapes at Kuk Swamp to farm yams [and] bananas… From a Middle East/European perspective, there has always been a revolutionary difference (“Neolithic revolution”) between hunter gatherers and farmers, [but] the tropics belie this somewhat.”
Two things strike me:
- The article suggests that this finding occurred only now because scholars assumed the tropics weren’t worth examining. This happens more often than researchers want to admit: we explore certain phenomena for certain reasons, and this may blind us to other phenomena or explanations. In a perfect world, there would be so many researchers that everything could be covered, and research that rules out explanations or shows the absence of a phenomenon would be valued more highly.
- That cities and agriculture took a longer time to develop does not seem too surprising. The shift to more anchored lives – tied to farming and larger population centers – would have been quite a change. Arguably, the world is still going through this process with the pace of urbanization increasing tremendously in the last century and nations and cities desperately trying to catch up.
Now that scientists are looking into this matter, hopefully we get a more complete understanding soon.
Compared to the rhetoric of the mid to late twentieth century, the possibility of a nuclear attack on an American city gets little attention outside of occasional new threats:
Calculating the range of the missile in the direction of some major US cities gives the approximate results in Table 1.
Table 1 shows that Los Angeles, Denver, and Chicago appear to be well within range of this missile, and that Boston and New York may be just within range. Washington, D.C. may be just out of range.
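The rankings in the quoted passage come down to great-circle distance from the launch site. As a rough illustration only – real range analysis accounts for trajectory and the Earth’s rotation, and the coordinates below are my own approximations, not figures from the quoted article – a simple haversine calculation reproduces the ordering, with Los Angeles closest and Washington, D.C. farthest:

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + \
        cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

# Approximate coordinates (my assumptions, not from the quoted article)
PYONGYANG = (39.03, 125.75)
CITIES = {
    "Los Angeles": (34.05, -118.24),
    "Chicago": (41.88, -87.63),
    "New York": (40.71, -74.01),
    "Washington, D.C.": (38.91, -77.04),
}

for name, (lat, lon) in CITIES.items():
    # Distances come out near 9,500 km for LA and over 11,000 km for D.C.
    print(f"{name}: ~{haversine_km(*PYONGYANG, lat, lon):,.0f} km")
```

A city roughly 1,500 kilometers farther along the great circle can fall outside a missile’s reach even when closer cities are well within it, which is why the quoted estimate places Los Angeles and Chicago inside the range and Washington, D.C. just beyond it.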
Relations between the two countries are not good at this point. Yet, is anyone in a major American city or metropolitan region really worried about this? Cities are full of a lot of younger residents (people who didn’t live through the earlier nuclear threat), history suggests no country would use a weapon on a major metropolitan area (except the United States in 1945), Americans are pretty confident in their military abilities (even if they haven’t had to actually use their nuclear capabilities or defenses recently), and cities have plenty of other concerns to consider (from inequality to affordable housing to congestion).
I suppose one could argue that we have become too comfortable in light of an ongoing existential threat (and there are plenty of nuclear weapons beyond what North Korea might have and the discussions about dirty bombs are not too old). However, perhaps this suggests we have come a long way since the 1950s as few American urban dwellers or suburbanites will lose much sleep over this.
One of my studies, From I Love Lucy in Connecticut to Desperate Housewives’ Wisteria Lane: Suburban TV Shows, 1950-2007, recently came out in print in Sociological Focus. Here is the abstract for the piece and I’ll add a few thoughts afterward:
The majority of Americans now live in suburbs, and a number of scholars have highlighted how various pop culture objects, from novels to television shows, have either reflected or encouraged suburban life. An analysis of the top 30 Nielsen-rated television shows from 1950 to 2007, a period of both rapid suburbanization and television growth, reveals that suburban TV shows did not dominate popular television. There is slightly more evidence for reflection theory with more sets of seasons with higher numbers of suburban-set shows following decades of rapid suburban growth. Additionally, the number of suburban-set shows was also influenced by the popularity of the genres of sitcoms and dramas. These findings suggest a need for further research into why relatively few popular shows were set in suburbs compared to big cities and how viewing settings on television directly influences suburban aspirations and behavior.
In sum: even if suburban-set television shows have been a staple of fall lineups and reruns since the 1950s, they often do not rank among the most highly rated, and there is limited evidence that they inspired suburban growth.
All that said, I think there is a lot to be done in connecting television depictions of locations with behaviors and attitudes. While Americans still watch multiple hours of TV a day on average, it is not fully clear how all that viewing affects people. What does it mean if the suburbs tend to be depicted in certain ways – either family sitcoms or the underside of happy-looking suburban life – and cities are depicted in other ways – the main setting for crime or police shows, which are heavily represented in top-rated shows going back decades? On the whole, few shows are able or willing to delve deeply into a location and its people – such as the celebrated The Wire – even though they have the hours to do so. Does the generic big city or suburb on TV change viewers?