In discussion of Occupy Wall Street, McMansions seen as part of the culture war

As part of a fascinating larger discussion about who the members of Occupy Wall Street actually are (the almost-elite versus the elite?), Megan McArdle suggests McMansions are part of the larger culture war in the United States:

Orwell goes on to point out that it is the anxious lower-upper-middle-class who have the most venom towards those below them–precisely because to preserve their status, they have to keep themselves sharply apart from the workers and tradesmen. And I think that that does apply here as well, at least to some extent. One of the interesting things about going back to my business school reunion earlier in the month was simply the absence of the sort of cutting remarks about flyover country that I have grown used to hearing in any large gathering of people. I didn’t notice it until after the events were over, because it was a slow accumulation of all the jokes and rants I hadn’t heard about NASCAR, McMansions, megachurches, reality television, and all the other cultural signifiers that make up a small but steady undercurrent of my current social milieu, the way Polish jokes did when I was in sixth grade.

Some of my former classmates now live in flyover country, of course, but mostly, I think, they just didn’t care. No one seemed very interested in the culture war.

So why does that same culture war seem so important to so many of the people that I know in New York and DC? (“The intellectuals”, as one of my classmates laughingly called us, when I started dropping statistics in the middle of cocktail chitchat, and then lamely explained that this is kind of what passes for fascinating small talk in DC.)

It’s not entirely crazy to suspect, as Orwell did, that this has something to do with money. Specifically, you sneer at the customs of the people you might be mistaken for. For aside from a few very stuffy conservatives, no white people I know sneer at hip-hop music, telenovelas, Tyler Perry films, or any of the other things often consumed by people of modest incomes who don’t look like them. They save it for Thomas Kinkade paintings, “Cozy cottage” style home decoration, collectibles, child beauty pageants, large pickup trucks, and so forth.

It is fascinating to think about the comments McArdle describes: in some circles there is a distinct set of profane objects, while those same objects barely rate as topics of conversation among “average” people in middle America. Being in academia also means hearing more of these comments. I would add Walmart as another significant “cultural signifier” in these conversations.

The McMansion is an interesting addition to this group; the term is often used with quite a bit of scorn. Of course, most people in flyover country don’t own McMansions (though perhaps they aspire to own them), but many communities allow them. I have found that the use of the term McMansion is often tied to sprawl, another issue that can separate the big cities from flyover country. McMansions are often seen as part of the larger package of sprawl, which includes an emphasis on cars, big houses, wasted natural resources, and a lack of beauty and quality.

I don’t know whether she is aware of it, but it sounds like McArdle is making Bourdieu’s argument: those with more education judge objects by aesthetics and a deeper understanding while those with more money purchase for functionality. Take a McMansion: someone with more education might note its lack of quality and its contribution to sprawl and wish for an architect-designed home. Someone with more money might note that eight family members can fit easily in the home, each with their own bedroom, bathroom, and play space.

A side note: I did have to laugh when McArdle suggests that dropping statistics into conversation is also a signifier. If so, I am guilty…

(A caveat: these sorts of flyover country/big city or red vs. blue state dichotomies are always more complex than they are commonly presented in public discourse. But just because these are broad terms describing tens of millions of people doesn’t mean there isn’t some truth to them.)

The contradictions of social commentary: too much and too little saving and moving are both bad

In this week’s column, Gregg Easterbrook points out two interesting contradictory social messages regarding the behavior of the American public:

Ten years ago the fact that Americans had a negative savings rate — by borrowing, most spent more than their incomes — was said to be very bad. In the last three years the personal savings rate has gone steadily up as Americans react to the unsettled economy by spending less and building up reserves. That was supposed to be bad, too. Prominent commentators blamed “higher personal saving” for dampened demand, which in turn slows GDP growth. So not saving is bad, and saving is also bad.

Wait — the latest indicators are that not saving is on the upswing again, and of course, that’s bad. The sudden surge in not saving “raises the question of whether consumers are returning to their old spendthrift habits.”

In the recent past, the fact Americans move often has been decried as rootlessness and a barometer of too many unconnected to the life of their communities. “Bowling alone” and all that. Now comes word that the Great Recession has led to a sharp decline in moving. Previously, moving was said to be bad. Now not moving is bad, being spun as evidence of “loss of mobility.”

Here are several possible interpretations of these conflicting takes:
1. Commentators have drifted toward seeing only negative traits in people’s behaviors.

2. Most commentators genuinely don’t know what is good or bad for the economy or larger society, so they reflexively criticize whatever behavior is currently on the rise. Sometimes they may be right, sometimes they may be wrong.

2a. An added bonus: when the situation appears to be “bad” (often a social construction itself), commentators want some action, any action, that attempts to reverse the trend. Doing nothing is seen as worse than trying something that has a reasonable chance of failing.

3. Americans themselves live with these tensions. Take the mobility issue. We have always had a running battle in the United States between rootedness/community and mobility. We say we value civic organizations and discussions but we also are willing to drop everything and leave if a great job offer comes along. Plenty of people wrestle with this on a regular basis. These commentators simply reflect true tensions in American culture.

Is America caught between democratic inclusion and economic stratification?

Here is an interesting discussion topic: America’s ability to weave new groups of people into the democratic process seems inversely related to its ability to achieve greater economic equality.

It’s a puzzle: one dispossessed group after another — blacks, women, Hispanics and gays — has been gradually accepted in the United States, granted equal rights and brought into the mainstream.

At the same time, in economic terms, the United States has gone from being a comparatively egalitarian society to one of the most unequal democracies in the world…

European countries have done a better job of protecting workers’ salaries and rights but have been reluctant to extend the benefits of their generous welfare state to new immigrants who look and act differently from them. Could America’s lost enthusiasm for income redistribution and progressive taxation be in part a reaction to sharing resources with traditionally excluded groups?

“I do think there is a trade-off between inclusion and equality,” said Gary Becker, a professor of economics at the University of Chicago and a Nobel laureate. “I think if you are a German worker you are better off than your American equivalent, but if you are an immigrant, you are better off in the U.S.”

I often bring this up in my introduction to sociology course: while the United States has an ugly history of racism and discrimination (and there is still much to do), the scale of inclusion in the United States (a country of roughly 310 million people) is remarkable, particularly compared to some of the issues European countries face.

In the end, does this have to be a zero-sum game? Is there a country in the world that has successfully done both of these things? Is there a system that could accomplish both?

And getting into the territory of values and morality, which of these outcomes is more worthwhile if you could only have one?

“The Steve Jobs Anti-Eulogy” raises some interesting points

Now that the media blitz following the death of Steve Jobs has slowed, there is more space to consider the coverage. Here are five interesting observations from one writer who also wins points for invoking “Victorian sociologist Herbert Spencer” and Malcolm Gladwell:

1. People write about Steve to write about themselves…

2. Individuals do not make history. Populations do…

So the idea that Steve Jobs changed history is just plain bad analysis. Victorian sociologist Herbert Spencer argued that attributing historical events to the decisions of individuals was a hopelessly primitive, childish, and unscientific position. After he published these views in The Study of Sociology, the case was closed. At least for professional historians.

3. You can tell a lot about a society by the people they honor…

4. Steve Jobs sheds more light on the nature vs. nurture debate than he does on the history debate…

5. Espousing the glories of genius gets us nowhere.

What I like most about these observations is that they try to place Jobs within his context. They also raise larger questions, including “what does it mean to be a genius,” “what values does society promote,” and “are societal or group trends more important than individual actions.”

Asking “why aren’t Americans moving to the city”

Even as the percentage of Americans who live in the suburbs has increased over the decades, one writer asks “why aren’t Americans moving to the city?”

Polling by the real estate advising firm RCLCO finds that 88 percent of Millennials want to live in cities. Their parents, the Baby Boomers, also express a burning desire to live in denser, less car-dependent settings. But in the past decade, many major cities saw population declines, and the overwhelming majority of population growth was in the suburbs…

Methinks we may have jumped the gun on the whole collapse of the suburbs bit…

For the Millennials, the showstopper was jobs, or lack thereof. They managed to survive the last few years of college, but lacking paying work in the city, they’ve moved back in with mom and dad. So now they’re all kicking it in the TV room back on Deerhaven Drive, watching It’s Always Sunny in Philadelphia reruns and dreaming of big city living.

There are other factors that have slowed down the great urban migration that predate our recent economic woes: Crime rates are down nationwide, but that has done little to diminish the perception that cities are dark, violent places. Poverty, addiction, and blight still haunt many urban centers. Then there are the kids. The Millennials aren’t the first generation of young people to get all stoked about the city. The ones before them continue to pick up and leave as soon as Junior hits school age.

Of course, much of this is the result of ill-advised investment: We’ve poured money into unsustainable suburban development while starving the urban centers. (One writer on this website recently argued convincingly that subsidized sprawl is a giant Ponzi scheme.)

But I think there is a deeper force at work here. Here’s another headline that reads like it could have come out of the Onion: “Almost half of Americans want to live somewhere else.”

It’s actually from USA Today, and the accompanying story looks at a 2009 Pew Research Center poll that found that 46 percent of the public “would rather live in a different type of community from the one they’re living in now — a sentiment that is most prevalent among city dwellers.”…

Listen, I don’t mean to belabor this point. This is all just to say that the urban renaissance is not a fait accompli.

This seems like a reasonable argument to me: there is no guarantee, as some critics have suggested, that Americans will see the error of the suburbs and flock back to the city. For many Americans, the suburbs offer the best of the available living options: they combine elements of more rural living (a bit of land) with elements of more urban living (amenities nearby). Attacks on the suburbs won’t necessarily change their minds, though higher costs of living (gas prices, less valuable houses) might.

The cited survey is also interesting. The Pew website about the survey is titled “For Nearly Half of America, Grass Is Greener Somewhere Else.” Are Americans simply afflicted with an itch to be somewhere else? Is this manifest destiny in action? Also in this survey:

Americans are all over the map in their views about their ideal community type: 30% say they would most like to live in a small town, 25% in a suburb, 23% in a city and 21% in a rural area.

If you combine the small town and suburban percentages (30% + 25% = 55%), you get almost exactly the percentage of Americans who live in the suburbs. So when people responded that they would prefer a small town, did they really mean a suburban small town or a more rural small town? And is living in a rural area more about living on a farm or a five-acre plot of land far from a big city?

No sociological explanations for “the year of the sitcom”?

A critic suggests we don’t need big sociological explanations to understand why television viewers have returned to sitcoms:

For the Chinese, this is the Year of the Rabbit; to the Jews, it’s 5772. And for journalists covering the TV business? That’s simple: It’s the Year of the Sitcom! Early coverage of the 2011–12 small screen season’s winners and losers has understandably focused on the fact that comedies such as New Girl, Suburgatory, and 2 Broke Girls seem to be doing far better than other kinds of programming this fall. This is what those of us who cover entertainment call a “trend,” and as such, we feel a profound professional responsibility to dig deep and search our souls for the answers: Why laughter? Why now? This will almost certainly result in a dramatic uptick in articles featuring sprawling sociological theories supported by quotes from ubiquitous TV historian Robert J. Thompson and all manner of Hollywood insiders: People want to laugh in a down economy! Comedies only take 30 minutes to watch, and we’re all too busy for dramas! We’ve found a funnier, totally new way to make comedies that’s unlike anything you’ve seen before! But no matter how intelligently the stories are written, or how wise the talking heads doing the explaining might be, the bottom line about TV’s alleged sitcom renaissance is much simpler. It’s just not nearly as interesting…

To understand what’s happening with comedies right now, consider how things often work in the movie business. After X-Men hit big in 2000, Hollywood decided to make Spider-Man and many, many more superhero movies. After audiences demonstrated a willingness to watch girls be gross in Bridesmaids, you could almost hear studio bosses shouting from their offices, “Get me the next Kristen Wiig!” TV is no different; it can just react to trends more quickly. And so, when ABC’s Modern Family rocketed on to TV in 2009, networks suddenly started feeling sitcoms might be worth the risk again, as co-creator Steve Levitan told Variety last summer. “My guess is that programmers see the success of a show like Modern Family and it gives them the impetus, the appetite to program more comedies,” he told the industry trade. This is why, post-MF, CBS decided to roll the dice and try half-hours on Thursdays; Fox chose to double down its efforts at finding live-action laughers by launching an hour-long post-Glee sitcom block; and this fall, new sitcom blocks have popped up on both Tuesdays (ABC) and Wednesdays (NBC). All told, that’s eight new half-hour slots for comedy to try to gain a foothold with viewers. Since TV types love talking in sports metaphors, put it this way: More at-bats generally result in more runners getting on base, and with a little luck, more runs scored. Likewise, while producing lots and lots of comedies is no guarantee of success (NBC once programmed a massive eighteen sitcoms one fall), you’re almost certainly going to up the odds of finding worthwhile new comedies by aggressively playing the game rather than sitting on the bench and hoping reality shows get you the win…

Bottom line? There may be no grand logic behind why sometimes we watch a lot of comedies and other times we waste our time on reality shows or obsess over the personal lives of melodramatic medical practitioners. And often it’s just a matter of finding the right balance of numbers of shows (a glut is a glut) and networks figuring out the best way to schedule them. So let’s all resist the urge to make up sociological or economic explanations for the sitcom’s resurgence. (Thereby freeing up Robert J. Thompson’s day: Hey, Bob, why don’t you and Paul Dergarabedian go whale watching? You deserve a break from all the quoting!) Yes, these are tough times, but they do not necessarily make people more eager to laugh: In boom times, do people come home and say, “I’ve been smiling all day and I’m tired of it: give me something dour to balance me out!” They do not. And viewers are not being lured back by new innovations in comedy: Sure, Zooey Deschanel is a unique personality, but Two and a Half Men remains top-rated, and that’s just The Odd Couple with more erection jokes. (Though who could forget the Odd Couple classic, “Felix gets his junk caught in his tie-clip case”?) As ever, trends are just another way of saying that success breeds imitation, whether it’s comedies, dramas, movies, or Angus hamburgers — available for a limited time only!

A few thoughts:

1. So the best explanation is that TV networks have simply put more sitcoms out there and several have caught on? This Moneyball-esque explanation (you are bound to have more hit shows if you simply put more out there!) could have some merit; see the quick probability sketch at the end of this post. Think about the music, movie, book publishing, and TV industries. The companies behind the products have little idea which particular products will prove successful, so they throw all sorts of options at the public. To have a successful year within each industry, only a few of these products have to achieve spectacular success. Essentially, these few popular ones can subsidize the rest of the industry. There is no magic formula for writing a successful sitcom, movie, book, or album, so companies throw a lot of products at the wall and see what sticks.

2. A note: the people peddling “sprawling sociological theories” sound like they are not sociologists but rather “pop sociologists.” To really get at this issue, we would have to compare the success of different genres over time to see whether there is a relationship between genre and the social circumstances of the moment. Yes, I agree that people can be quick to find big explanations for new phenomena…and do so without consulting any data. Knee-jerk reactions are not too helpful.

3. At the same time, one might argue that the tastes of the public are guided, or at least prompted, by some of these sociological factors. While there are no set formulas, won’t “good shows” win out? Not in all circumstances – think of the “critical darlings” versus the shows that actually end up being popular. Perhaps we need to ask a different question: how do shows become popular? What kinds of marketing campaigns pull people in, and how does effective “word of mouth” spread?
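To make the “more at-bats” logic from my first thought concrete, here is a minimal back-of-the-envelope sketch in Python. The 10% per-show hit rate is purely an assumed figure for illustration (there is no real industry data behind it); the point is only that the chance of landing at least one hit rises quickly with the number of shows a network fields:

```python
# Back-of-the-envelope sketch: treat each new sitcom as an independent
# "at-bat" with the same probability p of becoming a hit. The 10% value
# is an assumption for illustration, not an industry statistic.
p = 0.10  # assumed per-show probability of becoming a hit

for n in (3, 8, 18):  # 8 = this fall's new half-hour slots; 18 = NBC's old glut
    at_least_one_hit = 1 - (1 - p) ** n  # P(at least one hit) = 1 - (1 - p)^n
    print(f"{n:2d} shows -> chance of at least one hit: {at_least_one_hit:.0%}")
```

Under these made-up numbers, three shows give roughly a one-in-four chance of producing a hit, eight shows better-than-even odds, and eighteen shows around 85%. That is the whole Moneyball-esque argument in one formula: more swings, more chances for something to stick.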

Are the suburbs truly American?

The suburbs are a key part of American life: a majority of Americans live in them and they are part of the American Dream. So can we really ask whether they are truly American?

In his review of American Horror Story, which premieres tonight on FX, Slate’s TV critic Troy Patterson writes that the show’s “title carries more weight than its content can bear.” He then quotes a book review by Joyce Carol Oates of Curtis Sittenfeld’s American Wife:

Is there a distinctly American experience? The American, by Henry James; An American Tragedy, by Theodore Dreiser; The Quiet American, by Graham Greene; The Ugly American, by William Lederer and Eugene Burdick; Philip Roth’s American Pastoral; and Bret Easton Ellis’ American Psycho—each suggests, in its very title, a mythic dimension in which fictitious characters are intended to represent national types or predilections…. ‘American’ is an identity fraught with ambiguity, as in those allegorical parables by Hawthorne in which ‘good’ and ‘evil’ are mysteriously conjoined.

But wait: Are only Americans “fraught with ambiguity”? Oates lets these titles—and, especially, the many, many lesser books she might have mentioned—off the hook too easily. Too many books—and movies, and now TV shows—use the word “American” in their titles as a cheap shortcut to gravitas and sociological importance…

Besides bullying us with their national import, these titles often reinforce the fairly exaggerated ideas we tend to have about the uniqueness of this country. There are many things particular to and remarkable about the United States, but let’s not get carried away. Capitalism is not uniquely American (sorry, American Psycho). Suburbs are not uniquely American (sorry, American Beauty—the movie, I mean; and yes, plastic bags float in the wind in other countries, too).

This seems related to American exceptionalism: we like to think we have done everything in the best way. Perhaps we have the best capitalist system. (A lot of people might argue with this these days.) But picking on the suburbs here seems misguided: are there really other countries in the world that can match the American suburbs? A few countries, such as Australia and Canada, have suburbs like ours. Most other developed nations have limited suburban development, and sometimes their patterns are the inverse of American suburbs, with the wealthier living closer to the center of the cities and the poor living more on the edges. But Australia and Canada have relatively few people compared to the United States, and I’m not sure a suburban culture pervades their national identities the way it does ours.

Perhaps we are particularly jingoistic in our naming but the American suburbs do seem to be uniquely American.

The beginnings of the word “individualism” in de Tocqueville’s Democracy in America

Americans are often described as individualists. Where exactly did this term come from? It can be partly attributed to a famous work by French observer Alexis de Tocqueville.

It is interesting to note that the word “individualist” wasn’t part of the vocabulary of the first colonists or even the revolutionaries. It is a 19th Century word, likely first used out of necessity by the translators of Alexis de Tocqueville’s Democracy in America — an almost sociological work based on the author’s visit to America during the 1830s.

On the matter of American individualism de Tocqueville wrote: “There are more and more people who… have gained enough wealth and understanding to look after their own needs. Such folk owe no man anything and hardly expect anything from anybody. They form the habit of thinking of themselves in isolation and imagine that their destiny is in their hands. … Each man is forever thrown back on himself alone and there is danger that he might be shut up in the solitude of his own heart.”

Importantly, de Tocqueville saw several social forces that worked against the isolation of individualism and the danger of being locked in solitary: the family, the church and a set of civic virtues fostered, he believed, by American mothers. Whether or not we agree with this particular formulation, we might agree on a more general point. In discussions of American individualism, it is important to treat it as part of a balanced pair — often, yoked in a tense arrangement with one side headed for individual isolation and the other toward full immersion in a community. As long as the forces are fairly equal, the arrangement stays centered…

Three hundred years later, Herbert Hoover coined the now famous phrase “rugged individualist.” But he, too, saw a natural constraining partner for his American creation — the right of others to exercise opportunities arising from their own individuality.

The Oxford English Dictionary lists a translation of de Tocqueville’s work, Democracy in America, as the second use of the term “individualism.” I wonder if this is an accurate translation of de Tocqueville – what exactly did he intend to say?

Just because the word came along in the 1830s doesn’t mean that Americans were not individualists prior to this use. At the same time, could we argue that Americans have increasingly adopted this label and tried to live up to it? As labeling theory might suggest, Americans have acted in accordance with expectations and perhaps this has even become easier because of the country’s burgeoning wealth and power after World War II.

But as this commentator suggests, individualism is often limited by ever-present ties to the larger community. We complain about taxes but don’t want the services paid for by taxes to disappear. De Tocqueville’s work is partly famous because he also discusses the propensity of Americans to volunteer for organizations, a zeal that surprised him. But then we have more recent works like Bowling Alone that suggest Americans have largely lost this zeal, withdrawing into more personal networks and generally retreating from public life. Are we at the individualistic end of the pendulum swing now, and will we soon swing back to a middle ground?

Was the popularity of the Kennedy mystique a rejection of 1950s American suburbs?

The Kennedy mystique has been well established in American culture: John F. and Jackie Kennedy swept into the White House, ushering in the television age and the space age and jumpstarting the 1960s. But I hadn’t connected this mystique to what critics saw as the bland American suburbs of the 1950s:

In the normal course of the apparat’s work, elevating the Kennedys requires the denigration of the Eisenhowers, the 1950s, and the supposed dullness of the country that the Kennedys rescued us from—“our country of suburbs and Ozzie and Harriet, poodle skirts and one kind of cheese,” as Diane Sawyer oddly put it, while the screen showed a golden brick of Velveeta. Jackie by contrast wore clothes by designers who would have gone into a dead faint at the sight of a poodle skirt. When the Kennedys moved in, added the court historian Michael Beschloss, “we had a White House that looked like a bad convention hotel.” The Kennedys brought French cuisine to the White House, Diane Sawyer added. “No more Eisenhower cheese sauce and cole slaw… In our middle-class nation, it wasn’t easy for us to fathom this first lady.” Jackie herself is heard complaining about the marks that Ike’s golf shoes left in the flooring. Dwight Eisenhower, lumbering ox.

This view of the suburbs fits well with a set of suburban critiques that began in the 1950s: the suburbs were bland and conformist, populated by people who couldn’t really act like those nice suburban families on TV and whose tastes were merely popular. In comparison to the Eisenhowers and Ozzie and Harriet on TV, the Kennedys were the cultural elite, the fashionable people with refined tastes and opinions. This same argument can be heard today and still pits two sets of people against each other: urban intellectuals versus middle-class suburbanites, progressives versus conservatives, the fashionable and novel versus the bland and predictable, upscale shoppers versus Walmart (or maybe Target on the slightly higher end) patrons. Perhaps it all goes back to those arguments in the early years of America, when Thomas Jefferson advocated for a more rural America and Alexander Hamilton pushed for the capital to be in New York City.

Having to “prove” racism versus assuming that it is a common feature of American life

In defending some comments she made regarding white liberals and their support for President Obama, Melissa Harris-Perry looks at three common objections to conversations about race in the United States: “prove it,” “I have black friends,” and “who made you an expert?” While these are all familiar responses, the first one is a particularly sociological point that raises questions about how we view society and how this plays out in court:

The first is a common strategy of asking any person of color who identifies a racist practice or pattern to “prove” that racism is indeed the causal factor. This is typically demanded by those who are certain of their own purity of racial motivation. The implication is if one cannot produce irrefutable evidence of clear, blatant and intentional bias, then racism must be banned as a possibility. But this is both silly as an intellectual claim and dangerous as a policy standard.

In a nation with the racial history of the United States I am baffled by the idea that non-racism would be the presumption and that it is racial bias which must be proved beyond reasonable doubt. More than 100 years of philosophical, psychological and sociological research that begins, at least, with the work of W.E.B. Du Bois has mapped the deeply entrenched realities of racial bias on the American consciousness. If anything, racial bias, not racial innocence is the better presumption when approaching American political decision-making. Just fifty years ago, nearly all white Democrats in the US South shifted parties rather than continuing to affiliate with the party of civil rights. No one can prove that this decision was made on the basis of racial bias, but the historical trend is so clear as to require mental gymnastics to imagine this was a choice not motivated by race.

Progressives and liberals should be particularly careful when they demand proof of intentionality rather than evidence of disparate impact in conversations about racism. Recall that initially the 1964 Civil Rights Act made “disparate impact” a sufficient evidentiary claim for racial bias. In other words, a plaintiff did not need to prove that anyone was harboring racial animus in their hearts, they just needed to show that the effects of a supposedly race neutral policy actually had a discernible, disparate impact on people of color. The doctrine of disparate impact helped to clear many discriminatory housing and employment policies off the books.

Michelle Alexander brilliantly demonstrates in The New Jim Crow, the pernicious effect of the Supreme Court moving away from disparate impact as a standard to forcing plaintiffs to demonstrate racist intention. This new standard has encouraged the explosive growth of incarceration of African-Americans, turning a blind eye to disparate impact while it demands “proof” of racial bias.

I believe we must be careful and judicious in our conversations about racism. But I also believe that those who demand proof of interpersonal intention to create a racist outcome are missing the point about how racism works. Racism is not exclusively about hooded Klansmen; it is also about the structures of bias and culture of privilege that infect the left as well.

I like how Harris-Perry flips this objection: looking at the broad sweep of American history, from its days of more overt racism to the more covert racism of today, why don’t we assume that racism plays a role in everyday life in this society? Can we really assume, as many seem to do, that the issues with race ended at some point, whether with the Civil Rights legislation of the 1960s, the election of minority politicians, or the end of segregation in the South? With plenty of indicators of racial disparity today, from online comments by young adults to incarceration rates to homeownership to wealth to residential segregation, perhaps we should see racism as a default feature of American society until proven otherwise.

Harris-Perry hints at one reason why it is difficult for Americans to see the effects of racism: the court system has shifted the burden of proof to demonstrating “racist intention.” Without the proverbial smoking gun, it becomes more difficult to build arguments from data and patterns alone, even overwhelming ones. While the recent court case involving gender discrimination at Walmart, in which sociologists sided with the plaintiffs, isn’t about race, it illustrates some of the same principles. The data suggest discrimination may have taken place, as women were disproportionately passed over for promotions and pay raises. But without “proof” that this was a deliberate Walmart policy meant to harm women, the numbers may not be enough. The same holds true with race: “statistical discrimination,” stereotyping large groups of people, may be deemed acceptable because no individual or corporation can be held directly responsible for the outcome.