The benefits of institutions over charismatic authority for evangelicals

American evangelicals may often prize celebrity pastors and figures, but sociologist and college president Michael Lindsay argues that institutions provide more lasting impact:

Weber distinguished between different kinds of authority. Traditional authority is what the Queen of England has. You inherit it from your parents. Rational-legal authority is what President Obama has. You’re on top of a major bureaucracy, and that’s how you get things done. And then there’s charismatic authority. This is the authority that Billy Graham had. It’s the authority that Jesus had. It’s the authority that gathers and collects around an outstanding individual, a persona.

But in order for that person to have lasting impact, Weber says, it has to be routinized; in other words, it has to be channeled into an institutional form. The authority of a charismatic individual has to be transferred into a rational-legal bureaucracy. So, for instance, the Billy Graham Evangelistic Association is a great example of the routinization of charisma. After Billy Graham is gone, his ministry will continue. Charles Colson died two years ago. But much of his work is continuing in Prison Fellowship even though the founder is no longer there.

So, while it is true that evangelicalism does prize the personality, and there is a cult of celebrity in the church, what we are witnessing is evangelicals coming to appreciate the importance and the primacy of institutions.

Charismatic leaders are rare, and it can often be difficult to take the better things they do and embed them in institutions. Yet institutions can have incredible staying power and operate at a broader level of society.

While evangelicals may be showing more interest in institutions, such a viewpoint rubs against the typical evangelical tendency toward individualism. The charismatic leader fits the American story of working hard and making something of oneself. The attractive leader can pull in individuals through new technologies, much as evangelicals effectively used the ascendant radio and television scenes. (Interestingly, I’ve seen much less about evangelicals effectively harnessing the Internet for their ends. Perhaps such an analysis will come with time.) Appealing to institutions requires both leaders and adherents to turn their focus more to the communal than to their own interests. This is a difficult switch, particularly in certain areas, as Smith and Emerson demonstrate in Divided By Faith with the inability of white evangelicals to move beyond the individual to the social dimensions of race in America.

Wealthy Chinese seeking out McMansions

The Financial Times suggests there is one primary reason more Chinese homebuyers are choosing McMansions: they are status symbols. One note: the McMansions hinted at in this article sound more opulent than the average American McMansion.

Critics of McMansions would often argue a similar process is at work in the United States: McMansion owners want to impress others with their large house. While the price is not so much of an issue (much smaller pieces of real estate in desirable locations can cost much more), the homes show off through an impressive/ostentatious front, plenty of interior space, nice furnishings, and lots of stuff. On the other hand, I suspect a good number of owners purchased such homes because they say they need the space or got a good deal or liked the amenities of the home and neighborhood.

I’m not sure these are mutually exclusive arguments. Homebuyers can want a suburban experience and want to do it in a home that broadcasts their success. After all, the suburban single-family home represents middle- or upper-class success as well as expressions of individualism.

The evolving definition and usage of “selfie”

The word “selfie” was Oxford Dictionaries’ word of the year in 2013, but its usage and meaning continue to evolve:

A selfie isn’t just “a photograph that one has taken of oneself,” but also tends to be “taken with a smartphone or webcam and uploaded to a social media website,” as the editors at Oxford Dictionaries put it. That part is key because it reinforces the reason why we needed to come up with a new name for this kind of self-portraiture in the first place.

Think of it this way: A selfie isn’t fundamentally about the photographer’s relationship with the camera, it’s about the photographer’s relationship with an audience. In other words, selfies are more parts communication than self-admiration (though there’s a healthy dose of that, too).

The vantage point isn’t new; the form of publishing is.

This explains why we call the photo from the Oscars “Ellen’s selfie” — because she was the one who published it. Selfies tether the photographer to the subject of the photo and to its distribution. What better way to visually represent the larger shift from observation to interaction in publishing power?

Ultimately, selfies are a way of communicating narrative autonomy. They demonstrate the agency of the person behind the lens, by simultaneously putting that person in front of it.

The key to the selfie is not that people are taking photos of themselves for the first time in history; rather, they are doing it with new purposes, to tell their own stories to their online public. This is what social media and Web 2.0 are all about: putting the power into the hands of users to create their own narratives. The user now gets to decide what they want to broadcast to others. One scholar described it as giving average people the ability to be a celebrity within their online social sphere. The selfie is also part of a shift toward telling these narratives through images rather than words – think of the shift from updating Facebook statuses years ago to posting interesting pictures on Instagram today.

Sociologist argues hidden shame destructive in modern society

Sociologist Thomas Scheff argues that hidden shame is a large problem in modern society:

According to Scheff a society that fosters individualism (ours, for example) provides a ripe breeding ground for the emotion of shame because people are encouraged to “go it alone, no matter the cost to relationships,” he said. “People learn to act as if they were complete in themselves and independent of others. This feature has constructive and creative sides, but it has at least two other implications: alienation and the hiding of shame.”

Scheff noted that while shame is no less prevalent now than in previous years or decades or generations, it is more hidden. “Shame is a biological entity like other emotions, but people are more ashamed of it than they are of the others,” he said. “The hiding of emotions is more widespread in modern societies than in traditional ones.”…

The problem with that kind of thinking, however, is that shame is, in reality, a very useful emotion. “Shame is the basis of morality,” Scheff said. “You can’t have a moral society without shame. It provides the weight for morality. There are a hundred things in your head about what you should or shouldn’t do, but the one that hits you is the one that has shame behind it.”

Scheff suggests that shame — or the reaction to it — can manifest itself in larger acts of aggression, such as wars and other military conflicts. “Especially for leaders, both shame and anger are carefully hidden behind a veil of rationality,” he writes in the article. “The Bush administration may have been deeply embarrassed by the 9/11 attack during their watch and their helplessness to punish the attackers. The invasion of Iraq on the basis of false premises might have served to hide their shame behind anger and aggression.”

I remember reading Scheff’s work in a microsociology course in grad school, where he was cited as a key example of the growing body of research in the subfield of the sociology of emotions. While we tend to chalk up emotions to an individual’s psychological and physiological state, the emotions we feel and how we can express them also depend on social forces. Thus, if individualism is a key feature of early 21st century life, particularly for younger adults/millennials, displaying feelings of shame contradicts this individualistic approach. For example, one of the findings about younger adults in the National Study of Youth and Religion (discussed in Souls in Transition) is that they have very few regrets about their past actions. This is indicative of an individualistic approach to life: regrets may be based on the idea that the individual didn’t live up to some standard. But to have shame or regrets, the individual has to be anchored to a particular moral system.

Scheff’s solution to hidden shame?

The answer, according to Scheff, is to have a good laugh. “That is, laugh at yourself or at the universe or at your circumstances, but not at other people. Most of the laughing we do in comedy is good. No matter the actors, we are really laughing at our own selves that we see in their foolishness.”

It would then be interesting to study who, in using humor, laughs more at themselves than at others. Is most of our humor/comedy today, compared to the past, directed at others rather than exploring our own shame and embarrassing moments?

Who wants to be in the “McMansion and minivans” category?

Big data makes it possible to slice up Americans into all sorts of consumer categories like “McMansions and minivans.” However, how many would want to be in that category?

Acxiom provides “premium proprietary behavioral insights” that “number in the thousands and cover consumer interests ranging from brand and channel affinities to product usage and purchase timing.” In other words, Acxiom creates profiles, or digital dossiers, about millions of people, based on the 1,500 points of data about them it claims to have. These data might include your education level; how many children you have; the type of car you drive; your stock portfolio; your recent purchases; and your race, age, and education level. These data are combined across sources—for instance, magazine subscriber lists and public records of home ownership—to determine whether you fit into a number of predefined categories such as “McMansions and Minivans” or “adult with wealthy parent.” Acxiom is then able to sell these consumer profiles to its customers, who include twelve of the top fifteen credit card issuers, seven of the top ten retail banks, eight of the top ten telecom/media companies, and nine of the top ten property and casualty insurers.

Acxiom may be one of the largest data brokers, but it represents a dramatic shift in the way that personal information is handled online. The movement toward “Big Data,” which uses computational techniques to find social insights in very large groupings of data, is rapidly transforming industries from health care to electoral politics. Big Data has many well-known social uses, for example by the police and by managers aiming to increase productivity. But it also poses new challenges to privacy on an unprecedented level and scale. Big Data is made up of “little data,” and these little data may be deeply personal.

This is not new, though the amount of data advertisers and others have – often given voluntarily on the Internet – may have increased in recent years. What might be more interesting, given that this is happening, is to present Americans with the categories they are in and see how they react. Neither McMansions nor minivans have very good reputations. McMansions are seen as ugly houses owned by people who just want to make a splash, not own a quality house or participate in a close-knit community. Minivans signify suburban parents schlepping kids from place to place. Think of the Toyota commercials from a few years back that tried to make owning a minivan cool. Put together these two functional objects that also serve as status markers, and I suspect many people would not want to identify themselves as being in such an uncool group. Yet there are plenty of people in such a group. Drive through any well-to-do suburb and both the homes and the parking lots (lots of Toyota and Honda minivans as well as a range of upscale SUVs – is there a category for “McMansions and SUVs”?) reveal a certain lifestyle built around home, kids, school, and safety. It may be derided by outsiders, and the people on the inside might not self-identify as such (and they might object to being lumped into a group – we Americans are individuals, after all), but these are fairly popular choices to which marketers and businesses can then cater.

Peter Berger: new atheist megachurches really about forming a denomination

Sociologist Peter Berger offers his take on the news that some atheists are looking to form their own megachurches.

How then is one to understand the phenomenon described in the story? I think there are two ways of understanding it. First, there is the lingering notion of Sunday morning as a festive ceremony of the entire family.  This notion has deep cultural roots in Christian-majority countries (even if, especially in Europe, this notion is rooted in nostalgia rather than piety).  Many people who would not be comfortable participating in an overtly Christian worship service still feel that something vaguely resembling it would be a good program to attend once a week, preferably en famille. Thus a Unitarian was once described as someone who doesn’t play golf and must find something else to do on Sunday morning. This atheist gathering in Los Angeles is following a classic American pattern originally inspired by Protestant piety—lay people being sociable in a church (or in this case quasi-church) setting. They are on their best behavior, exhibiting the prototypical “Protestant smile”.  This smile has long ago migrated from its original religious location to grace the faces of Catholics, Jews and adherents of more exotic faiths. It has become a sacrament of American civility. It would be a grave error to call it “superficial” or “false”. Far be it from me to begrudge atheists their replication of it.

However, there is a more important aspect to the aforementioned phenomenon: Every community of value, religious or otherwise, becomes a denomination in America. Atheists, as they want public recognition, begin to exhibit the characteristics of a religious denomination: They form national organizations, they hold conferences, they establish local branches (“churches”, in common parlance) which hold Sunday morning services—and they want to have atheist chaplains in universities and the military. As good Americans, they litigate to protect their constitutional rights. And they smile while they are doing all these things.

As far as I know, the term “denomination” is an innovation of American English. In classical sociology of religion, in the early 20th-century writings of Max Weber and Ernst Troeltsch, religious institutions were described as coming in two types: the “church”, a large body open to the society into which an individual is born, and the “sect”, a smaller group set aside from the society which an individual chooses to join. The historian Richard Niebuhr, in 1929, published a book that has become a classic, The Social Sources of Denominationalism. It is a very rich account of religious history, but among many other contributions, Niebuhr argued that America has produced a third type of religious institution—the denomination—which has some qualities derived from both the Weber-Troeltsch types: It is a large body not isolated from society, but it is also a voluntary association which individuals choose to join. It can also be described as a church which, in fact if not theologically, accepts the right of other churches to exist. This distinctive institution, I would propose, is the result of a social and a political fact. The denomination is an institutional formation seeking to adapt to pluralism—the largely peaceful coexistence of diverse religious communities in the same society. The denomination is protected in a pluralist situation by the political and legal guarantee of religious freedom. Pluralism is the product of powerful forces of modernity—urbanization, migration, mass literacy and education; it can exist without religious freedom, but the latter clearly enhances it. While Niebuhr was right in seeing the denomination as primarily an American invention, it has now become globalized—because pluralism has become a global fact. The worldwide explosion of Pentecostalism, which I mentioned before, is a prime example of global pluralism—ever splitting off into an exuberant variety of groupings.

The argument: a pluralistic society, created through a set of legal and social codes, encourages denominations. Thus, if atheists want to be part of an American landscape, they must adapt to the forms that give religious groups the ability to band together and rally to their cause.

I wasn’t sure why atheists would want megachurches when these don’t have the greatest reputations (though they may be popular and influential), but I’m even less sure that atheists would want to be part of denominations. Much of the story of American religion in the last 50 years is the decline of denominations and the trend toward more independent, non-denominational churches that are not constrained by hierarchies. Similarly, individuals have moved from seeking membership in religious organizations to a more individualized form of religious expression, immortalized as “Sheilaism” in Habits of the Heart and illustrated by the increasing number of “religious nones.” On one hand, denominations allow religious congregations to band together and exert more collective force, but Americans also don’t like to be limited by social structures.

Durkheim, modern American hyper-individualism, and moral consensus today

One commentator links Durkheim’s ideas about suicide, anomie, and society to individualism in America today:

Here in the West, we take individualism and freedom to be foundational to the good life. But Durkheim’s research revealed a more complicated picture. He concluded that people kill themselves more when they are alienated from their communities and community institutions. “Men don’t thrive as rugged individualists making their mark on the frontier,” the University of Virginia sociologist W. Bradford Wilcox pointed out recently: “In fact, men seem to be much more likely to end up killing themselves if they don’t have traditional support systems.” Places where individualism is the supreme value; places where people are excessively self-sufficient; places that look a lot like twenty-first century America—individuals don’t flourish in these environments, but suicide does.

Durkheim’s work emphasizes the importance of community life. Without the constraints, traditions, and shared values of the community, society enters into a state of what Durkheim called anomie, or normlessness. This freedom, far from leading to happiness, often leads to depression and social decay (as the “twerking” Miley Cyrus perfectly exemplified recently at the Video Music Awards). Durkheim thought that the constraints—if not excessive—imposed on individuals by the community ultimately helped people lead good lives.

But we live in a culture where communitarian ideals, like duty and tradition, are withering away. Even conservatives, who should be the natural allies of these virtues, have in large part become the champions of an individualism that seems to value freedom, the market, and material prosperity above all else, leaving little room for the more traditional values that well known thinkers like Russell Kirk and Richard Weaver cherished. “Man is constantly being assured today that he has more power than ever before in history,” wrote Weaver in Ideas Have Consequences (1948), “but his daily experience is one of powerlessness. . . . If he is with a business organization, the odds are great that he has sacrificed every other kind of independence in return for that dubious one known as financial.”…

Let’s return to the Google Books Ngram Viewer to illustrate the point. When Twenge, Campbell, and their colleague Brittany Gentile analyzed books published between 1960 and 2008, they found that the use of words and phrases like “unique,” “personalize,” “self,” “all about me,” “I am special,” and “I’m the best” significantly increased over time. Of course, it is not just in our books where this narcissism appears. It is also throughout the popular culture, not least in pop music. When a group of researchers, including Campbell and Twenge, looked at the lyrics of the most popular songs from 1980 to 2007, they found that the songs became much more narcissistic and self-centered over time. In the past three decades, the researchers write, the “use of words related to self-focus and antisocial behavior increased, whereas words related to other-focus, social interactions, and positive emotion decreased.”

Durkheim was very much concerned with social cohesion, moral consensus, and the interdependence of individuals in modern society. Individuals may think that they are self-sufficient or able to do a lot on their own, but much of their lives is built on the efforts of others.

Another aspect of this might be the declining participation of Americans in civic groups as outlined by Robert Putnam in Bowling Alone. This doesn’t mean Americans are completely withdrawn but it does suggest they might be more wary of collectives or only choose to participate when it suits them. This is how you can view online social networks like Facebook and Twitter: they enable social interaction but it is at the demand of individual users as they get to decide when and how they interact.

You could flip this around and ask a different question: what are Americans all committed to? Where do we still have moral consensus? Perhaps in declining trust in institutions. Perhaps in celebrating Super Bowl Sunday. Perhaps in the idea that homeownership is a key part of the American Dream. Perhaps in religiosity (even with the rise of the “religious nones,” some of whom still believe in God). Here are a few other things 90% of Americans can agree on:

Yet there are some opinions that 90% of the public, or close to it, shares — including a belief that citizens have a duty to vote, an admiration for those who get rich through hard work, a strong sense of patriotism and a belief that society should give everyone an equal opportunity to succeed. Pew Research’s political values surveys have shown that these attitudes have remained remarkably consistent over time.

The proportion saying they are very patriotic has varied by just four percentage points (between 87% to 91%) across 13 surveys conducted over 22 years. Similarly, in May 1987, 90% agreed with the statement: “Our society should do what is necessary to make sure everyone has an equal opportunity to succeed.” This percentage has remained at about 90% ever since (87% in the most recent political values survey).

It is not that we have zero social cohesion these days. The argument here could be two-fold: (1) social cohesion has declined from the past; (2) social cohesion today has changed – it might be more “alone together” than anything else, where we can be around others at times and share some common values but generally want to follow our own paths, as long as they aren’t impeded too much by the paths of others.

Argument: the movie “42” ignores Jackie Robinson’s role in the larger Civil Rights Movement

Peter Dreier argues that the new movie 42 fails to properly situate Jackie Robinson in a larger context: as part of a larger social movement.

The film portrays baseball’s integration as the tale of two trailblazers—Robinson, the combative athlete and Rickey, the shrewd strategist—battling baseball’s, and society’s, bigotry. But the truth is that it was a political victory brought about by a social protest movement. As an activist himself, Robinson would likely have been disappointed by a film that ignored the centrality of the broader civil rights struggle…

42 is the fourth Hollywood film about Robinson. All of them suffer from what might be called movement myopia. We may prefer our heroes to be rugged individualists, but the reality doesn’t conform to the myth embedded in Hollywood’s version of the Robinson story…

Starting in the 1930s, reporters for African-American papers (especially Wendell Smith of the Pittsburgh Courier, Fay Young of the Chicago Defender, Joe Bostic of the People’s Voice in New York, and Sam Lacy of the Baltimore Afro-American), and Lester Rodney, sports editor of the Communist paper, the Daily Worker, took the lead in pushing baseball’s establishment to hire black players. They published open letters to owners, polled white managers and players (some of whom were threatened by the prospect of losing their jobs to blacks, but most of whom said that they had no objections to playing with African Americans), brought black players to unscheduled tryouts at spring training centers, and kept the issue before the public. Several white journalists for mainstream papers joined the chorus for baseball integration.

Progressive unions and civil rights groups picketed outside Yankee Stadium, the Polo Grounds, and Ebbets Field in New York City, and Comiskey Park and Wrigley Field in Chicago. They gathered more than a million signatures on petitions, demanding that baseball tear down the color barrier erected by team owners and Commissioner Kennesaw Mountain Landis. In July 1940, the Trade Union Athletic Association held an “End Jim Crow in Baseball” demonstration at the New York World’s Fair. The next year, liberal unions sent a delegation to meet with Landis to demand that major league baseball recruit black players. In December 1943, Paul Robeson, the prominent black actor, singer, and activist, addressed baseball’s owners at their annual winter meeting in New York, urging them to integrate their teams. Under orders from Landis, they ignored Robeson and didn’t ask him a single question…

Robinson recognized that the dismantling of baseball’s color line was a triumph of both a man and a movement. During and after his playing days, he joined the civil rights crusade, speaking out—in speeches, interviews, and his column—against racial injustice. In 1949, testifying before Congress, he said: “I’m not fooled because I’ve had a chance open to very few Negro Americans.”

Fascinating. Robinson can be applauded for his individual efforts and we can also recognize that he was part of a larger movement – it doesn’t have to be one or the other. But, our narratives, now prominently told in biopic movies, love to emphasize the individual. This is part of a larger American issue regarding an inability to recognize and discuss larger social structures, forces, and movements.

Many Americans might assume the Civil Rights Movement begins in the mid-1950s with Brown vs. Board of Education or the actions of Rosa Parks (this is where the Wikipedia article on the subject starts) but things were stirring in Robinson’s day. While baseball was America’s sport (pro football didn’t start its meteoric rise until a decade or so later) and Robinson’s play was influential, there were other efforts going on. In 1948 the military was integrated via an order from President Truman. After World War II, blacks tried to move into better housing, often found in white neighborhoods, but faced serious (sometimes violent) opposition in a number of locations.

I’ve been conflicted about whether I should see this movie as a big baseball fan. Sports movies are a little too mawkish for me and don’t ever really reflect how the game is played. This argument is not helping the movie’s cause…

When Chicago suburbs disqualify candidates running for public office

Local government and control is a cherished part of suburban life. But the Chicago Tribune highlights on its front page today how often Chicago suburban governments disqualify candidates running for local office:

For its investigation, the Tribune focused on races that critics say are the most troubling: suburban candidates running for city and village offices. Reporters canvassed every suburb in the Chicago region, reviewed scores of objections filed against candidates and interviewed dozens of those involved in the system. The newspaper found:

Widespread abuse. At least 200 candidates faced objections this year, with only a small fraction alleging serious matters, such as criminal histories, residency issues or outright fraud. Ultimately panels kicked 76 candidates off the ballot across three dozen suburbs.

Rampant bias. Of those knocked off, most fell at the hands of panels stacked with members who had a political stake in their own decisions. Conflicts also went beyond simple politics: Even relatives ruled on their own family members’ cases.

Wild inconsistencies. The rules are not evenly applied, with similar infractions leading some panels to remove candidates, but not other panels.

Costly tabs. The challenges cost taxpayers in some towns tens of thousands of dollars each election cycle, many times in suburbs that can least afford it…

The Tribune studied local election systems in the suburbs of the nation’s other largest metro areas: New York, Los Angeles, Dallas and Philadelphia. None has Illinois’ combination of difficulty getting on the suburban ballot and ease in getting kicked off.

Local government is often thought to be more non-partisan than elections at higher levels of government. But non-partisanship does not necessarily mean that officeholders aren’t still looking to stay in office and will do what they can to keep challengers out. Local races can be particularly nasty even as very few people vote. I suspect most suburbanites would not like what the Tribune found but ironically probably wouldn’t be too motivated to vote on the issue, pressure politicians about their concerns, or run for office themselves to change the situation.

Underlying all of this in the suburbs is that suburban culture promotes letting people do their own thing and trying to avoid public friction. A great source on this is the book The Moral Order of a Suburb by M. P. Baumgartner. Here is how the Amazon book description puts it:

Drawing on research, observation, and hundreds of in-depth interviews conducted during a twelve-month study of an affluent New York City suburb, M.P. Baumgartner reveals that the apparent serenity of the suburb is caused by the avoidance of open conflict. She contends that although nonviolence, nonconfrontation, and tolerance produce a superficial social harmony, these behaviors arise from disintegrative tendencies in modern culture–transience, fragmentation, weak family and communal ties, isolation, and indifference–conditions customarily viewed as sources of disorder, antagonism, and violence. A kind of moral minimalism pervades the suburbs, a disorganized social order that, with the suburbs’ rapid growth in America, promises to be the moral order of the future.

This is a paradox of the suburbs: we tend to think of transience and fragmentation as leading to social disorder, but Baumgartner argues these very conditions are what actually produce the suburbs’ surface harmony.

Argument: individualistic political arguments don’t work in cities since they require contributing to the “public good”

After looking at the Democratic vote advantage in cities for the 2012 election, here is an argument about why individualistic political arguments don’t work in cities:

If Republicans are ever going to earn real votes in cities in the future, though, they’ll have to do more than just talk about them differently. The real problem seeps much deeper. As the Republican Party has moved further to the right, it has increasingly become the party of fierce individualism, of “I built that” and you take care of yourself. Cities, on the other hand, are fundamentally about the shared commons. If you live in a city and you think government – and other people – should stay out of your life, how will you get to work in the morning? Who will police your neighborhood? Where will you find a public park when your building has no back yard?

In a good piece on the GOP’s problem with geography earlier this week, The New Republic’s Lydia DePillis interviewed Princeton Historian Kevin Kruse, who made this point succinctly: “There are certain things in which the physical nature of a city, the fact the people are piled on top of each other, requires some notion of the public good,” he said. “Conservative ideology works beautifully in the suburbs, because it makes sense spatially.”

The real urban challenge for conservatives going forward will be to pull back from an ideology that leaves little room for the concept of “public good,” and that treats all public spending as if it were equally wasteful. Cities do demand, by definition, a greater role for government than a small rural town on the prairie. But the return on investment can also be much higher (in jobs created through transportation spending, in the number of citizens touched by public expenditures, in patents per capita, in the sheer share of economic growth driven by our metropolises).

Density makes all of these things possible, and it requires its own kind of politics. There’s no reason why the Democratic Party should have an exclusive lock on this idea. Investing government money efficiently – as Republicans want to do – is also about focusing on how it’s spent in cities. While Republicans are mulling this over in the next four years, it may help to look at Howard’s map. What is going on in those dark blue dots? What does it mean to live in those places – and to live there and hear from politicians that “government should get out of the way?”

This reminds me of some of the observations of early sociologists about the transition from more rural village and farm life to urban life in the late 1800s and early 1900s. Cities aren’t just different because there are more people who are living and working closer together; this changes the social interactions (think of Simmel’s talk of the blase attitude in cities) as well as the social interdependence (think of Durkheim’s discussion of the division of labor).

One way Republicans could positively argue about cities: along with their surrounding metropolitan regions, cities are economic engines. A thriving economy needs thriving firms in these regions that encourage innovation, provide jobs, and interact with and operate in nearby communities.

Are there cities that are more individualistic than others? Can you have a global city that has a more individualistic ethos?