Increasing gap in wealth between older and younger generations in America

It isn’t too surprising that older Americans have more wealth than younger Americans, but perhaps the bigger story is that this gap has increased in recent decades:

The wealth gap between younger and older Americans has stretched to the widest on record, worsened by a prolonged economic downturn that has wiped out job opportunities for young adults and saddled them with housing and college debt.

The typical U.S. household headed by a person age 65 or older has a net worth 47 times greater than a household headed by someone under 35, according to an analysis of census data released Monday.

While people typically accumulate assets as they age, this wealth gap is now more than double what it was in 2005 and nearly five times the 10-to-1 disparity a quarter-century ago, after adjusting for inflation.

The median net worth of households headed by someone 65 or older was $170,494. That is 42 percent more than in 1984, when the Census Bureau first began measuring wealth broken down by age. The median net worth for the younger-age households was $3,662, down by 68 percent from a quarter-century ago, according to the analysis by the Pew Research Center.
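The figures quoted above hang together arithmetically. A quick back-of-the-envelope check (a sketch in Python, using only the numbers reported in the Pew analysis) recovers both the 47-to-1 ratio and the roughly 10-to-1 disparity from a quarter-century earlier:

```python
# Medians reported by the Pew Research Center analysis of census data.
older_now = 170_494   # median net worth, householder age 65+
younger_now = 3_662   # median net worth, householder under 35

# The headline ratio: older households vs. younger households.
ratio_now = older_now / younger_now
print(round(ratio_now))  # ~47, matching the "47 times greater" claim

# Back out the 1984 medians from the reported inflation-adjusted changes:
# older households up 42 percent, younger households down 68 percent.
older_1984 = older_now / 1.42
younger_1984 = younger_now / (1 - 0.68)
print(round(older_1984 / younger_1984))  # ~10, the "10-to-1 disparity"
```

So the reported percentage changes and the reported ratios are internally consistent, which is a useful sanity check on a statistic this striking.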

The analysis in the story suggests that this growing gap is indicative of tougher economic conditions brought about by difficulties in finding a job, the delaying of marriage, growing college debt, and less of an ability to purchase a home when younger.

I wonder how this gap might translate into social or political action. Older Americans are well known for their relatively high voting turnout compared to younger Americans who are more fickle. Would younger Americans vote consistently about down-the-road issues like the national debt, Social Security, and other things they may be several decades from personally experiencing? Is this less consistent voting behavior among younger Americans the reason that there aren’t more safety nets for younger adults? Are Millennials, and not “Walmart Moms,” the next major voting bloc to emerge?

How much of this should raise concern about the economic welfare of younger Americans now, or should we be more worried about how this later, rougher start in life will lead to less wealthy Americans (with its impact on American society) decades down the road?

It would be interesting to tie this to information about the demographics of the Occupy Wall Street protests. Media reports have tended to portray many of the protestors as college students or just out of college – how true is this? Of the public support for the movement, how much comes from younger people versus older demographics (who might be more supportive of the Tea Party)?

Census data shows increase in people living in neighborhoods with concentrated poverty

New Census data shows that the population of the “poorest poor” in America has grown (about 20.5 million Americans), particularly in neighborhoods of concentrated poverty:

After declining during the 1990s economic boom, the proportion of poor people in large metropolitan areas who lived in high-poverty neighborhoods jumped from 11.2 percent in 2000 to 15.1 percent last year, according to a Brookings Institution analysis released Thursday. Such geographically concentrated poverty in the U.S. is now at the highest since 1990, following a decade of high unemployment and rising energy costs.

Extreme poverty today continues to be prevalent in the industrial Midwest, including Detroit, Grand Rapids, Mich., and Akron, Ohio, due to a renewed decline in manufacturing. But the biggest growth in high-poverty areas is occurring in newer Sun Belt metro areas such as Las Vegas, Riverside, Calif., and Cape Coral, Fla., after the plummeting housing market wiped out home values and dried up construction jobs.

As a whole, the number of poor in the suburbs who lived in high-poverty neighborhoods rose by 41 percent since 2000, more than double the growth of such city neighborhoods.

Elizabeth Kneebone, a senior research associate at Brookings, described a demographic shift in people living in high-poverty neighborhoods, which have less access to good schools, hospitals and government services. As concentrated poverty spreads to new areas, including suburbs, the residents are now more likely to be white, native-born and high school or college graduates — not the conventional image of high-school dropouts or single mothers in inner-city ghettos.

Two things to note: the percentage of people living in concentrated-poverty areas is back at 1990 levels, and these areas themselves have shifted to new places like the suburbs and the Sun Belt. Are we any better off in addressing this issue than we were when scholars like William Julius Wilson in the 1980s and Paul Jargowsky in the 1990s first called attention to it?

It is interesting that there is very little in current political or cultural discourse about the “poorest poor” as most of the current talk centers on the middle class or perhaps the working class. Even Occupy Wall Street seems to be about the middle and working classes. Perhaps much of this group’s anger is driven by the middle class, which now feels the pinch of the economic crisis, but the “poorest poor” have been dealing with similar and/or worse concerns for decades.

Lord Giddens as “Blair guru”

I occasionally run across stories involving Anthony Giddens, well-known sociologist, speaking about political issues in Britain. Here is another example of the actions of the “Blair guru”:

Labour peer Lord Giddens, who brought the debate on 13 October entitled Universities: Impact of Government Policy, said ministers appeared to be pursuing policies of “ill-considered, untutored radicalism” that were not based in proper research and had “imponderable outcomes”.

The academic, who advised former prime minister Tony Blair and is professor of sociology at LSE, said the reforms would leave England as a “global outrider” with one of the lowest levels of public support for higher education in the industrialised world.

He said the “ideological thrust” of the Browne Review should have been rejected and instead tuition fees only gradually raised alongside the maintenance of direct public support for universities, due to their “massive” beneficial impact on society.

“Universities are not a sort of supermarket where education can be chosen like a washing powder off the shelf. Students are not simply consumers, making day-to-day purchasing decisions. They will make a one-off decision,” he said.

Reading these stories, it seems like Giddens has more political clout than most sociologists. Is this simply a function of having been close to Tony Blair, did Giddens do specific work/research that put him in contact with politicians, or does Britain simply have a different culture regarding public intellectuals and how sociologists can be involved in social and government life?

American language about government policy and economic life shifts from community to individualism

Here is an interesting argument about how common American discourse about public policy and economic life has shifted since the 1930s:

In 1934, the focus was on people, family security and the risks to family economic well-being that we all share. Today, the people have disappeared. The conversation is now about the federal budget, not about the real economy in which real people live. If a moral concept plays a role in today’s debates, it is only the stern proselytizing of forcing the government to live within its means. If the effect of government policy on average people is discussed, it is only as providing incentives for the sick to economize on medical costs and for the already strapped worker to save for retirement.

From the 1930s to the 1960s, as the Princeton historian Daniel T. Rodgers demonstrates in his recent book, “The Age of Fracture,” American public discourse was filled with references to the social circumstances of average citizens, our common institutions and our common history. Over the last five decades, that discourse has changed in ways that emphasize individual choice, agency and preferences. The language of sociology and common culture has been replaced by the language of economics and individualism.

In 1934, the government was us. We had shared circumstances, shared risks and shared obligations. Today the government is the other — not an institution for the achievement of our common goals, but an alien presence that stands between us and the realization of individual ambitions. Programs of social insurance have become “entitlements,” a word apparently meant to signify not a collectively provided and cherished basis for family-income security, but a sinister threat to our national well-being.

Over the last 50 years we seem to have lost the words — and with them the ideas — to frame our situation appropriately.

This is a fascinating line: “The language of sociology and common culture has been replaced by the language of economics and individualism.” This reminds me of the findings about how public opinion changes when asked about “welfare” versus “assistance for the poor.” The concepts are similar but the connotations of the specific terms matter.

Is the end argument here that changing the language will lead to more communal understandings or does reversing the “Bowling Alone” phenomenon have to come first? It would be helpful to know what exactly these commentators think happened in this period beyond simply the change in language. Could we argue that the success of the community-oriented policies of the mid 1900s that led to a booming economy, rising incomes, suburbanization, and homeownership was “too successful” in that it led to these shifts in language and focus?

Military sociologist coined the term “Don’t Ask, Don’t Tell”

While the policy of “Don’t Ask, Don’t Tell” disappeared yesterday in the American armed forces, I wonder how many people know the term originated with a sociologist:

The “Don’t Ask, Don’t Tell” policy that prohibited gays from serving openly in the military is over, and the web is full of renewed interest in the phrase’s history. Who, folks want to know, coined the expression?

Credit goes to the late Charles Moskos, a military sociologist and professor from Northwestern University. The phrase, which was later expanded to “Don’t Ask, Don’t Tell, Don’t Pursue, Don’t Harass,” came about during the first term of the Clinton administration. At the time, the policy was viewed as a kind of compromise. It allowed gay men and women to serve in the military, provided they did not openly admit to their sexual preference. It also prohibited other military personnel from asking questions. In other words, don’t ask, don’t tell…

As a younger man, Moskos served in the United States Army as a company clerk, before going on to a distinguished academic career. In 1997, he was honored by the American Sociological Association. According to an article from Northwestern, “some of the gay and lesbian and sex and gender people organized a silent protest” due to “Don’t Ask, Don’t Tell.” After the ceremony, he spoke to the protesters “and made friends with some of them, even though they disagree with his position.”

Beyond the controversial policy, Moskos was seen as a highly influential voice in military policy. The Wall Street Journal called him the country’s “most influential military sociologist.” Though he was the person behind the policy, Moskos did recognize its shortcomings. “I always say about ‘Don’t Ask, Don’t Tell’ what Winston Churchill said about democracy: ‘It’s the worst system possible except for any other,'” remarked Moskos.

Here is Moskos’ 2008 obituary from the Washington Post.

It is not too often these days that you hear about military sociologists. While I haven’t looked into the topic much, I get the sense that they used to be more common back before sociologists (and academics in other disciplines) started raising more critical questions about US foreign and military policy. Would it be acceptable at any universities these days to have or start a “war studies” program or center as opposed to “peace studies” programs or centers more commonly found today?

Balancing libertarian and humanitarian instincts when using the word “NIMBY”

Megan McArdle discusses how the word NIMBY is a pejorative term that tends to be used in instances when the user doesn’t approve of particular uses (as opposed to uses that they would approve of):

I think this is a little bit too cute.  I read DePillis pretty regularly, and I don’t usually see her calling out, say, people opposing a local Wal-Mart as “NIMBYs”; they’re “opposition groups”.  The term NIMBY seems to be reserved for people who oppose locating things in their back yards that DePillis herself thinks are laudable.  Small wonder that when she uses the word, people take it as a perjorative.

Nonetheless, she has a point: many people oppose having necessary but potentially disruptive things located near them, even if you think those things are a good idea; if you do, you should own it, not make up ridiculously implausible stories about how those inner-city kids wouldn’t really enjoy a halfway house in a nice, suburban neighborhood; they’d be much happier in a crack-infested ghetto like the one where they came from.  Don’t you know you shouldn’t remove creatures from their natural habitat?
 
In the case of people in some DC neighborhoods, they may even be justified.  Anacostia–and my own neighborhood–house an unusually large number of social service organizations, because land has been cheap, and the communities have lacked the socioeconomic power to block new projects the way that, say, Dupont and Friendship Heights have.  I don’t know the statistics on Anacostia, but Eckington/Truxton Circle house thirteen social service groups, from women’s shelters to So Others Might Eat, a wonderful organization that serves thousands of meals to homeless people every day.  Frankly, I haven’t found them disruptive–and indeed, didn’t really know they were there until controversy erupted over a plan to build a fourteenth service facilities.  But the fact remains that a lot of the homeless people hang out in what passes for the area’s park space between meals, and more than a few spend the day drinking single-serving beers from the area’s many liquor stores…
 
In this case, my libertarian instinct squares with my humanitarian instinct: at least in the case of private charities, I cannot, in good conscience, oppose letting them do whatever they want with the property they buy (within reasonable limits on things like toxic fumes and all-night jackhammer parties.)  But I don’t think it’s helpful to brand my neighbors who do as NIMBYs.  Oversaturation of neighborhoods with social services is a genuine problem for those neighborhoods.  We should treat it with at least as much respect as we give to those who don’t want to live near a big-box store.

McArdle seems to be suggesting that the use of the term NIMBY escalates a discussion about land use to an unhelpful level. As soon as the word is brought out, the terms of the discussion change as the user implies that people are being selfish, and those being called NIMBY then have to go on the defensive. Additionally, NIMBY is in the eye of the beholder: what one person would see as desirable is an abomination to another.

The term McMansion, something I have spent a lot of time studying, is used in a similar manner. Just like NIMBY, the term evokes larger issues such as excessive consumption, sprawl, the disruption of a neighborhood, etc. McMansion and NIMBY are not simple descriptive terms that just refer to a big house or opposition to a particular land use. Both are politicized terms. NIMBY often refers to wealthier, white, more educated homeowners who want to protect their private utopias that many see as exclusionary and government subsidized.

Are there helpful alternatives to the term NIMBY?

Two sociological studies on political self-selection in academia

The topic of political bias in academia comes up now and again – it was in the news earlier this year when a social psychologist made a presentation at a professional meeting. In bringing up the topic again, two sociological studies about self-selection in academia are briefly discussed:

Tierney describes the research of George Yancey, professor of sociology at the University of North Texas, who found that more than a quarter of sociologists he surveyed would be favorable toward a Democrat or an ACLU member and unfavorable toward a Republican; about 40 percent said they would have an unfavorable attitude toward a member of the NRA or an evangelical. “If you were a conservative undergraduate,” Tierney asks, “would you risk spending at least four years in graduate school in the hope of getting a job offer from a committee dominated by people who don’t share your views?”

Tierney also mentions a field experiment, conducted by Neil Gross, professor of sociology at the University of British Columbia, in which researchers posing as potential graduate students sent emails to various humanities departments — including literature, history, sociology, political science, and economics — describing their interests and credentials and asking if the department might be a good fit for them. Some of the mock applicants mentioned working for the McCain campaign and some for Obama. There was no discernible difference in the promptness of the reply or the enthusiasm expressed in the replies. This was taken as proof that discrimination is not a serious factor. But couldn’t it be that a feeler e-mail is not the same thing as an actual application, and it costs nothing to respond positively to something that is only potential? (Alternatively, could it be that many humanities departments are so aching for good students that they can’t afford to discourage potential applicants who at least exhibit signs of life? By the way, isn’t there something dishonest in this kind of research?)

Several quick thoughts:

1. Gross’ study doesn’t sound like dishonest research to me: it might include a little deception (suggesting there is a student behind the email) but ultimately it is just an email.

2. There may indeed be a different response for graduate students, who are needed (to some degree – some programs can be pickier than others) and may still be moldable, versus other academics or people outside the academic realm. If graduate departments showed overt biases, they might find themselves with fewer applications, decreasing their pool.

3. Yancey’s research sounds like it found disapproval of conservatives, but those expressing it are still minorities among sociologists. Perhaps sociologists were unwilling to reveal their true feelings, but it suggests there is still room for alternative viewpoints.

On the whole, I’m glad we have some studies about this rather than just having to rely on sweeping generalizations and anecdotes.

The social history of the food pyramid

With the unveiling later this week of a replacement to the food pyramid (it will be a “plate-shaped symbol, sliced into wedges for the basic food groups and half-filled with fruits and vegetables”), the New York Times provides a quick look at the background of the food pyramid:

The food pyramid has a long and tangled history. Its original version showed a hierarchy of foods, with those that made up the largest portions of a recommended diet, like grains, fruit and vegetables, closest to the wide base. Foods that were to be eaten in smaller quantities, like dairy and meat, were closer to the pyramid’s tapering top.

But the pyramid’s original release was held back over complaints from the meat and dairy industry that their products were being stigmatized. It was released with minor changes in 1992.

A revised pyramid was released in 2005. Called MyPyramid, it turned the old hierarchy on its side, with vertical brightly colored strips standing in for the different food groups. It also showed a stick figure running up the side to emphasize the need for exercise.

But the new pyramid was widely viewed as hard to understand. The Obama administration began talking about getting rid of it as early as last summer. At that time, a group of public health experts, nutritionists, food industry representatives and design professionals were invited to a meeting in Washington where they were asked to discuss possible alternative symbols. One option was a plate.

Two things stand out to me:

1. This is partly about changing nutritional standards but also is about politics and lobbying. Food groups are backed by businesses and industries that have a stake in this. Did they play any part in this new logo?

2. This is a graphical design issue. The old food pyramid suggests that certain foods should be the basis/foundation for eating. The most recent pyramid is a bit strange as the pyramid is broken into slivers, so the tapering aspect of a pyramid seems to have been discarded. The new logo sounds like it will be a more proportion-based graphic where people can quickly see what percentage of their diet should be devoted to different foods. Since this is a logo that is likely to be slapped on many educational materials and food packages, it would be helpful if it were easy to understand.

Lakoff on Obama: a progressive moral vision plus systems thinking

George Lakoff has an interesting take on President Obama’s April 13th speech. While the speech was ostensibly about the budget, Lakoff argues that Obama was making two larger points:

1. President Obama was laying out a progressive vision of democracy. Here is how Lakoff sums it up:

The basic idea is this: Democracy is based on empathy, that is, on citizens caring about each other and acting on that care, taking responsibility not just for themselves but for their families, communities, and their nation. The role of government is to carry out this principle in two ways: protection and empowerment.

Obama quotes Lincoln: “to do together what we cannot do as well for ourselves.” That is what he calls patriotism. He spotlights “the American belief… that each one of us deserves some basic measure of security… that no matter how responsibly we live our lives, hard time or bad luck, crippling illness or a layoff, may strike any one of us.” He cites the religious version of this moral vision: “There but for the grace of God go I.” The greatness of America comes from carrying out such moral commitments as Medicare, Social Security, and Medicaid.

It would be an interesting public discussion to have over whether these three programs are a moral commitment. I suspect that a good number of Americans would see it this way but this is not the typical angle taken in public discourse.

2. President Obama highlighted the role of systems and how a budget cannot be isolated from other important needs and goals in society:

President Obama, in the same speech, laid the groundwork for another crucial national discussion: systems thinking, which has shown up in public discourse mainly in the form of “systemic risk” of the sort that led to the global economic meltdown. The president brought up systems thinking implicitly, at the center of his budget proposal. He observed repeatedly that budget deficits and “spending” do not occur in isolation. The choice of what to cut and what to keep is a matter of factors external to the budget per se.

Long-term prosperity, economic recovery, and job creation, he argued, depend upon maintaining “investments” — investments in infrastructure (roads, bridges, long-distance rail), education, scientific research, renewable energy, and so on. The maintenance of American values, he argued, is outside of the budget in itself, but is at the heart of the argument about what to cut. The fact is that the rich have gotten rich because of the government — direct corporate subsidies, access to publicly-owned resources, access to government research, favorable trade agreements, roads and other means of transportation, education that provides educated workers, tax loopholes, and innumerable government resources taken advantage of by the rich, but paid for by all of us. What is called a “tax break” for the rich is actually a redistribution of wealth from the poor and middle class whose incomes have gone down to those who have considerably more money than they need, money they have made because of tax investments by the rest of America…

Progressives tend to think more readily in terms of systems than conservatives. We see this in the answers to a question like, “What causes crime?” Progressives tend to give answers like economic hardship, or lack of education, or crime-ridden neighborhoods. Conservatives tend more to give an answer like “bad people — lock ’em up, punish ’em.” This is a consequence of a lifetime of thinking in terms of social connection (for progressives) and individual responsibility (for conservatives). Thus conservatives did not see the president’s plan, which relied on systemic causation, as a plan at all for directly addressing the deficit.

This sort of systems thinking sounds like sociological approaches to the world: the complex social realm can be difficult to understand and predict, but settling on simple (often individualistic) explanations leaves much to be desired.

I can imagine that conservatives might find holes in Lakoff’s argument, not least that all of this explanation still doesn’t say much about how the United States could deal with its budget issues. But Lakoff highlights the cultural ideas and values surrounding political debate: speeches and political activities may be about budgets and practical matters, but there are underlying values that guide such actions.

Sarkozy joins growing chorus of Western European leaders who have said multiculturalism has failed

In a recent interview, French President Nicolas Sarkozy said multiculturalism has failed in his country:

“My answer is clearly yes, it is a failure,” he said in a television interview when asked about the policy which advocates that host societies welcome and foster distinct cultural and religious immigrant groups.

“Of course we must all respect differences, but we do not want… a society where communities coexist side by side.

“If you come to France, you accept to melt into a single community, which is the national community, and if you do not want to accept that, you cannot be welcome in France,” the right-wing president said.

“The French national community cannot accept a change in its lifestyle, equality between men and women… freedom for little girls to go to school,” he said.

“We have been too concerned about the identity of the person who was arriving and not enough about the identity of the country that was receiving him,” Sarkozy said in the TFI channel show.

British Prime Minister David Cameron, German Chancellor Angela Merkel, Australia’s ex-prime minister John Howard and Spanish ex-premier Jose Maria Aznar have also recently said multicultural policies have not successfully integrated immigrants.

Based on what Sarkozy said in this interview, it sounds like he either has a different definition of multiculturalism or a different end goal. A contrast to multiculturalism would be assimilation where newcomers to a country (or any group) should quickly or eventually adopt the customs and values of the country they have entered. Sarkozy is suggesting that because some immigrants have not done this, multiculturalism has failed. But Sarkozy seems to be explaining how assimilation has failed. The Oxford English Dictionary defines multiculturalism thusly: “the policy or process whereby the distinctive identities of the cultural groups within such a society are maintained or supported.” In this sense, a long-running policy of multiculturalism ends up changing the larger culture to some degree. It sounds like Sarkozy (and some of these other leaders) are not as interested in this. Can French or English or German culture change and incorporate elements of cultures from immigrants living within their borders?

These comments from various leaders seem to have been motivated in part by growing Muslim populations in these nations.

It is also interesting to note that there is not a whole lot of public discussion about this in the United States. Some of this may be more below the surface, particularly when issues like immigration arise (though this has been overwhelmed by economic concerns). Can you imagine an American political leader of any party making a statement like these Western European leaders have?