If one survey option receives the most votes (18%), can the item with the least votes (2%) be declared the least favorite?

The media can have difficulty interpreting survey results. Here is one recent example involving a YouGov survey that asked about the most attractive regional accents in the United States:

Internet-based data analytics and market research firm YouGov released a study earlier this month that asked 1,216 Americans over the age of 18 about their accent preferences. The firm provided nine options, ranging from regions to well-known dialects in cities. Among other questions, YouGov asked, “Which American region/city do you think has the most attractive accent?”

The winner was clear. The Southeastern accent, bless its heart, took the winning spot, with the dialect receiving 18 percent of the vote from the study’s participants. Texas wasn’t too far behind, nabbing the second-most attractive accent at 12 percent of the vote…

The least attractive? Chicago rolls in dead last, with just 2 percent of “da” vote.

John Kass did not like the results and consulted a linguist:

I called on an expert: the eminent theoretical linguist Jerry Sadock, professor emeritus of linguistics from the University of Chicago…

“The YouGov survey that CBS based this slander on does not support the conclusion. The survey asked only what the most attractive dialect was, the winner being — get this — Texan,” Sadock wrote in an email.

“Louie Gohmert? Really? The fact that very few respondents found the Chicago accent the most attractive, does not mean that it is the least attractive,” said Sadock. “I prefer to think that would have been rated as the second most attractive accent, if the survey had asked for rankings.”

In the original YouGov survey, respondents were asked, “Which American region/city do you think has the most attractive accent?” and could select only one option. The Chicago accent received the fewest selections.

However, Sadock has a point. Respondents could only select one option. If they had the opportunity to rank them, would the Chicago accent move up as a non-favorite but still-liked accent? It could happen.
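A toy simulation can illustrate Sadock’s point. The ballots below are hypothetical (not YouGov’s data): each one ranks three accents, and Chicago is everyone’s second choice. A single-choice survey sees only first picks and puts Chicago dead last, while a ranking method like a Borda count puts it second.

```python
from collections import Counter

# Hypothetical ranked ballots, most to least attractive.
ballots = [
    ("Southeastern", "Chicago", "Texas"),
    ("Southeastern", "Chicago", "Texas"),
    ("Texas", "Chicago", "Southeastern"),
    ("Texas", "Chicago", "Southeastern"),
    ("Southeastern", "Chicago", "Texas"),
]

# A single-choice survey only sees each ballot's top pick.
first_choices = Counter(ballot[0] for ballot in ballots)
print(first_choices)  # Chicago receives zero first-choice votes

# A ranked survey can use every position, e.g. a Borda count:
# 2 points for 1st place, 1 for 2nd, 0 for 3rd.
borda = Counter()
for ballot in ballots:
    for points, accent in zip((2, 1, 0), ballot):
        borda[accent] += points
print(borda)  # Chicago, everyone's second choice, finishes second
```

Fewest first-choice votes, yet second overall once full preferences are counted: exactly the gap between “least chosen as favorite” and “least attractive.”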

Additionally, the responses were fairly diverse. The original “winner,” the Southeastern accent, was selected by only 18% of those surveyed, meaning that over 80% of respondents did not pick the leading response. Is it fair to call this the favorite accent of Americans when fewer than one-fifth of respondents selected it?

Communicating the nuances of survey results can be difficult. Yet journalists and others should resist the urge to immediately identify “favorites” and “losers” in situations where the data do not show an overwhelming favorite and respondents did not have the opportunity to rate all of the possible responses.

McMansion as a verb: “could McMansion R.I.’s coast”

I have seen numerous creative uses of the word McMansion, but using the term as a verb is rare. Here is some of the story with the headline “Planners Concerned New Rules Could McMansion R.I.’s Coast”:

Two bills recently approved by the General Assembly support the construction of taller buildings along the Ocean State’s shoreline, which, according to some municipal planners and building officials, would essentially result in the walling off of the coast…

“This new bill would allow for three to three and a half floors instead of two,” Warner said. “We promote elevating above base flood elevation and the changes we made two years ago are working well. This bill isn’t adding any incentive or benefit for flood protection or protection against extreme weather. It does nothing to protect buildings from damage. We’d be building elevated mansions.”

Perhaps the use of McMansion as a verb is a function of writing a concise headline. The meaning of “to McMansion” seems clear: to construct large, undesirable homes. It also echoes the use of the word McMansion alongside words like “invasion” or “sprouting,” which suggest the spread of McMansions. Whether this use of McMansion as a verb is better than the existing term mansionization is unclear.

The issue is one that many communities in the United States face: just how large should new homes or teardowns be allowed to be? At the same time, the shoreline adds further complications in that debates rage about who should have access to beaches and how the land should be best used to benefit the community in the long term. For an example of the shoreline issues, see my review of One Big Home, which details the fight over mansions on Martha’s Vineyard.

Repealing a suburb’s English language resolution amid demographic change

The Chicago suburb of Carpentersville passed a resolution in 2007 saying English was the official language. The suburb continued to change and now officials have repealed the resolution:

Local officials say the English resolution caused nothing but controversy, and that progress came instead from targeting troublemakers, not Spanish speakers. Now, as one of the most diverse communities in the Chicago area, leaders hope to put the controversy behind them.

There’s also the demographic and political reality that Hispanics now account for slightly more than 50 percent of Carpentersville’s population of about 38,000, up from about 40 percent when the language measure was passed. Whites now make up about a third of the local populace, with most of the rest African- or Asian-American…

Still, it’s a touchy subject. When asked about the change in local law, Village President John Skillman, a lifelong resident, downplayed it. He said village documents and meetings will continue to be in English, and emphasized that the resolution made no concrete changes in the first place…

At the same time, efforts have been made to reach across ethnic boundaries. Last year, in addition to its Fourth of July fireworks, the village held a Mexican Independence Day celebration, and this year, its first Cinco de Mayo festival.

It is a relatively quick turnaround from a set of white candidates running for office, winning enough votes to join the Village Board, and passing this resolution (and other measures aimed at undocumented immigrants) to repealing that same resolution eleven years later. At the least, it could suggest there is power in being part of local government: in a suburb of roughly 38,000 people, it may not take much to run for local office and campaign for particular issues. Regardless of what side of a political issue a resident is on, running for local office can make a difference.

The rest of the article hints at ways the suburb has come to terms with an increasing Latino population: Latino businesses in town, addressing gang activity, local festivals, and whether residents experienced discrimination. But there is a lot more that could be addressed here. Did such a resolution significantly change day-to-day life? (The article suggests no.) How much do white, Latino, and black residents interact and participate in each other’s social networks? How does this play out in civic institutions like schools, religious groups, and community organizations? Resolutions or ordinances can certainly have a symbolic effect, but there are a number of layers to community life and interactions in a suburb like Carpentersville.

(Side note: this is an apropos follow-up to yesterday’s post about how many Americans speak a language other than English at home. This affects more than just home life.)

20% of Americans speak a language other than English at home

Occasionally, statistics about the United States stand out. Here is one I recently saw involving language as reported by the AP:

In the United States, one in five people age 5 and over speak a language other than English at home, according to data from the U.S. Census Bureau. In immigrant-friendly Los Angeles, more than half of people do.

This is likely linked to relatively high levels of immigration in recent decades (and projections for more foreign-born residents in years to come). Pew summarizes the trend:

There were a record 43.2 million immigrants living in the U.S. in 2015, making up 13.4% of the nation’s population. This represents more than a fourfold increase since 1960, when only 9.7 million immigrants lived in the U.S., accounting for just 5.4% of the total U.S. population…

[Figure: Pew Research Center chart of the U.S. foreign-born population]

And by far, the language other than English spoken most at home is Spanish.
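As a quick sanity check on the Pew figures quoted above (the numbers come straight from the excerpt; the code is just arithmetic):

```python
# Pew's figures: 43.2 million immigrants in 2015 vs. 9.7 million in 1960.
immigrants_2015 = 43.2  # millions
immigrants_1960 = 9.7   # millions

increase = immigrants_2015 / immigrants_1960
print(round(increase, 2))  # about 4.45, i.e. "more than a fourfold increase"
```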

The evolving definition and usage of “selfie”

The word “selfie” was the Oxford Dictionary’s word of the year in 2013 but its usage and meaning continues to evolve:

A selfie isn’t just “a photograph that one has taken of oneself,” but also tends to be “taken with a smartphone or webcam and uploaded to a social media website,” as the editors at Oxford Dictionaries put it. That part is key because it reinforces the reason why we needed to come up with a new name for this kind of self-portraiture in the first place.

Think of it this way: A selfie isn’t fundamentally about the photographer’s relationship with the camera, it’s about the photographer’s relationship with an audience. In other words, selfies are more parts communication than self-admiration (though there’s a healthy dose of that, too).

The vantage point isn’t new; the form of publishing is.

This explains why we call the photo from the Oscars “Ellen’s selfie” — because she was the one who published it. Selfies tether the photographer to the subject of the photo and to its distribution. What better way to visually represent the larger shift from observation to interaction in publishing power?

Ultimately, selfies are a way of communicating narrative autonomy. They demonstrate the agency of the person behind the lens, by simultaneously putting that person in front of it.

The key to the selfie is not that people are taking photos of themselves for the first time in history; rather, they are doing it with new purposes, to tell their own stories to their online public. This is what social media and Web 2.0 are all about: putting the power into the hands of users to create their own narratives. The user now gets to decide what they want to broadcast to others. One scholar described it as giving average people the ability to be a celebrity within their online social sphere. The selfie is also part of a shift toward telling these narratives through images rather than words – think about the relative shift from updating Facebook statuses years ago to now posting interesting pictures on Instagram.

Analyzing gendered uptalk on Jeopardy!

As part of a household that regularly watches Jeopardy! via the magic of DVR, I was intrigued to read about this sociological study of uptalk on the show:

Linneman’s study involves issues deeper than how game show contestants talk—specifically, the implications uptalk has for gender identities. According to his article, “The primary sociological controversy surrounding uptalk concerns the fact that women use uptalk more often than men do, and some interpret this as a signal of uncertainty and subordination.”

Linneman found that both gender and uncertainty played a role: “On average, women used uptalk nearly twice as often as men. However, if men responded incorrectly, their intonation betrayed their uncertainty: their use of uptalk shot up dramatically.”

The use of uptalk is not merely an academic concern, as Linneman discovered with one of his results.

“One of the most interesting findings coming out of the project is that success has an opposite effect on men and women on the show…The more successful a man is on the show, uptalk decreases. The opposite is true for women…I think that says something really interesting about the relationship between success and gender in our society, and other research has found this too: successful women in a variety of ways get penalized.”

Uptalk’s sometimes-negative connotations bring up the subject of how women speak, a provocative issue.

While this isn’t an earthshaking finding, two things are very interesting here:

1. It is a reminder that language usage and speech patterns reflect larger social forces. While individuals may have unique ways of expressing themselves, language and expression are also learned behaviors influenced by others.

2. Selecting Jeopardy! as the research case for this particular phenomenon is clever. While uptalk is related to perceptions of a lack of confidence, the contestants on the show should have less reason for nervousness than others might about being on TV. In order to make it on air, they have to be smart enough to pass a qualifying test and then pass an in-person audition. In other words, the contestants, male and female, are bright people. Granted, being in front of a camera is a different matter, but these contestants aren’t caught completely unaware, nor should they be fully perplexed by the questions they are trying to answer.
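To make the study’s kind of analysis concrete, here is a toy cross-tabulation with made-up counts (NOT Linneman’s actual data), showing how one could compute uptalk rates by gender and by whether the response was correct:

```python
# Hypothetical response counts, chosen so the toy patterns mirror the
# reported findings: women use uptalk about twice as often overall, and
# men's uptalk jumps after incorrect responses.
responses = {
    # (gender, answered_correctly): (uptalk_responses, total_responses)
    ("woman", True):  (400, 1000),
    ("woman", False): (400, 800),
    ("man",   True):  (160, 1000),
    ("man",   False): (240, 800),
}

def uptalk_rate(gender, correct=None):
    """Share of responses delivered with uptalk, optionally filtered
    by whether the response was correct."""
    cells = [(u, n) for (g, c), (u, n) in responses.items()
             if g == gender and (correct is None or c == correct)]
    return sum(u for u, _ in cells) / sum(n for _, n in cells)

print(uptalk_rate("woman"))               # 0.444... overall
print(uptalk_rate("man"))                 # 0.222... -- half the women's rate
print(uptalk_rate("man", correct=False))  # 0.30, up from 0.16 when correct
```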

The world of McDonalds, McQuarks, and McMansions

Wired has a few recent pieces that are related to McMansions. First, an “Alt Text” piece parodies other “theoretical particles” that might follow the recent Higgs boson news:

McQuark

This subatomic particle is found in all McDonald’s food, and is the reason that all the menu offerings — including the burgers, shakes and dipping sauces — taste “McDonaldy,” as if they were all just carved out of a big lump of McSubstance. Currently, the McQuark is the universe’s only trademarked subatomic particle, although Motorola, maker of the Photon smartphone, is attempting to gain traction against Apple’s battery of lawsuits by patenting actual photons.

Wired‘s Matt Simon follows up and defines McMansions:

The most widely used of these pejoratives is McMansions. These are the quickly produced cookie-cutter homes that some say lack taste.

It would be interesting to hear more from McDonald’s about how they feel about the expanding usage of such terms, particularly McMansion. According to Wikipedia, McDonald’s was not too happy about the term “McJob”:

The term “McJob” was added to Merriam-Webster’s Collegiate Dictionary in 2003, over the objections of McDonald’s. In an open letter to Merriam-Webster, Cantalupo denounced the definition as a “slap in the face” to all restaurant employees, and stated that “a more appropriate definition of a ‘McJob’ might be ‘teaches responsibility.'” Merriam-Webster responded that “[they stood] by the accuracy and appropriateness of [their] definition.”

On 20 March 2007, the BBC reported that the UK arm of McDonald’s planned a public petition to have the OED’s definition of “McJob” changed. Lorraine Homer from McDonald’s stated that the company feels the definition is “out of date and inaccurate”. McDonald’s UK CEO, Peter Beresford, described the term as “demeaning to the hard work and dedication displayed by the 67,000 McDonald’s employees throughout the UK”. The company would prefer the definition to be rewritten to “reflect a job that is stimulating, rewarding … and offers skills that last a lifetime.”…

According to Jim Cantalupo, former CEO of McDonald’s, the perception of fast-food work being boring and mindless is inaccurate, and over 1,000 of the men and women who now own McDonald’s franchises began behind the counter. Because McDonald’s has over 400,000 employees and high turnover, Cantalupo’s contention has been criticized as being invalid, working to highlight the exception rather than the rule.

In 2006, McDonald’s undertook an advertising campaign in the United Kingdom to challenge the perceptions of the McJob. The campaign, developed by Barkers Advertising and supported by research conducted by Adrian Furnham, professor of psychology at University College London, highlighted the benefits of working for the organization, stating that they were “Not bad for a McJob”. So confident were McDonald’s of their claims that they ran the campaign on the giant screens of London’s Piccadilly Circus.

Instead of trying to change or block the definition, why doesn’t McDonald’s try to introduce its version of a “Mc-” term that it can then work to define? Of course, such things can be quickly turned around on the Internet but McDonald’s has plenty of resources and reach. I’m sure they could develop a positive version and there are still plenty of people going to their restaurants…

The need to study language AND culture

A literature professor argues that in order to truly learn and use a language, you must also learn about the culture in which the language is used:

I have been asked several times at my university in Oman to do a brief “cultural introduction” to native speakers of English from North America and Europe who have come to improve their Arabic. I start by mentioning that there is a large difference between learning how to speak a language and learning how to navigate a culture. Then I segue into a discussion of how to dress appropriately. My watchwords are: no knees or elbows on display in public. Usually, at this point, several of the listeners look angry, disbelieving and/or bored, especially the men wearing tight, casual T-shirts and women in spaghetti-strap underwear shirts…

My attempts to make Westerners understand that they will need to make adjustments to fit into Omani society have not gone well. The most common response is, “But I am me. They will just have to accept me as I am.” The problem with the “I need to be me” response is that most Westerners do not realize that the consequences of “being me” are not the same as in the West. Omanis rarely use direct confrontation and will simply avoid a person who they feel is violating cultural norms.

The trick is to find a balance between integration and self-integrity while learning not just the language but also how to use it in a culturally appropriate manner. For example, most Gulf Arabs use an indirect communication style. They will rarely make a negative comment in public and never convey negative information that they do not want to share. For example, if there is a specific need to convey a warning or bad news, Omanis will often recruit an intermediary to deliver it. That is why I, a non-Omani, have been asked to give the “dress and act politely” lecture to Western students…

The other answer I get when Westerners refuse to, say, comb their hair, smile when greeting an Omani, or stand up to shake hands, is that “I don’t need to talk to people—I just need the language.” As a literature professor, I find this bewildering. Imagine a person who visited Britain having read all major political-theory textbooks but never having seen Monty Python, read Wordsworth, tasted tea, or been to a soccer game. Could that person cope with references to the “Beeb,” “Oxbridge,” “Beckham,” “twee,” or “pillock”? Words such as “slamming,” “in the dumps,” “bummed,” or “shambolic” don’t show up in vocabulary lists. So much of daily language is slang and metaphors that if a person is not speaking often with native speakers, she or he will never be able to carry on a normal conversation in that country. The last response I often get from Arabic language learners, is “I don’t plan to live in this country, so I don’t need to fit in here.” While it is true that the people who say this may never live in Oman, if they have careers that involve familiarity with the Arabic language, literature, politics, or business, they will probably meet some Omanis down the line. Imagine the icebreaker or dealmaker comments that a person will have at hand if she or he can greet an Omani with a local expression or a local joke.

There is much more to language than just the words, grammar, and inflections: language is a window into much larger cultural frameworks that are full of complicated symbols, values, and meanings.

It sounds like the language learners described above want to learn the language but don’t want the language to affect them too much. In other words, they want the skill of being able to speak another language (which can be perceived as quite marketable) but they want to keep the language at arm’s length. To some degree, this sounds like modern-day ethnocentrism: “I want to learn your language to be able to talk to you but I don’t want to have to learn about what makes you tick because that wouldn’t be worth my time.” Of course, it could very well be worth one’s time for business or political or social purposes, as the examples at the end of the last quoted paragraph above illustrate.

But it sounds like a larger issue here is explaining to students why one should learn a new language: is it about checking off a box on a list of high school or college requirements? Is it about being able to put this on a resume? Is it about becoming “smarter” or more “cosmopolitan”? Is it about learning how to authentically interact with cultures different from your own? This last reason fits with calls for students to learn cross-cultural skills as they go on to navigate a world where cultures more frequently interact and sometimes clash.

Debating the merits of using the word “cancer”

Many would say that they know what cancer is. But medical experts suggest it is not so clear, and that “cancer” may not be the best label for every situation to which it is usually applied.

Though it is impossible to say whether the treatment was necessary in this case, one thing is growing increasingly clear to many researchers: The word “cancer” is out of date, and all too often it can be unnecessarily frightening…

“The definition of cancer has changed,” said Dr. Robert Aronowitz, a professor of history and sociology of medicine at the University of Pennsylvania.

Many medical investigators now speak in terms of the probability that a tumor is deadly. And they talk of a newly recognized risk of cancer screening — overdiagnosis. Screening can find what are actually harmless, if abnormal-looking, clusters of cells.

But since it is not known for sure whether they will develop into fatal cancers, doctors tend to treat them with the same methods that they use to treat clearly invasive cancers. Screening is finding “cancers” that did not need to be found. So maybe “cancer” is not always the right word for them.

This is an interesting discussion to read about after having recently completed reading The Emperor of All Maladies: A Biography of Cancer. Several points are found in both works:

1. Our knowledge of cancer is constantly evolving. We don’t know as much about it as the public might think.

2. Different cancers present different issues, which accounts for some of the trouble with the term “cancer.” Cancers don’t have a common cause or necessarily act in the same way.

3. Screening is a big issue. Who should get screened? Is it cost-effective?

One other issue that I don’t see discussed in this article or in the book: is part of the problem with the word “cancer” the connotations that this has for people? In his book, Mukherjee suggests that cancer is associated with a bleak prognosis. When patients hear this term, they know they are in for a very difficult fight. Would changing the use of the term shift some of this conversation away from the immediate fear involving cancer to a more medical term that requires more explanation and obscures the severity a bit? Is this also in even just a little way about public relations?

American language about government policy and economic life shifts from community to individualism

Here is an interesting argument about how common American discourse about public policy and economic life has shifted since the 1930s:

In 1934, the focus was on people, family security and the risks to family economic well-being that we all share. Today, the people have disappeared. The conversation is now about the federal budget, not about the real economy in which real people live. If a moral concept plays a role in today’s debates, it is only the stern proselytizing of forcing the government to live within its means. If the effect of government policy on average people is discussed, it is only as providing incentives for the sick to economize on medical costs and for the already strapped worker to save for retirement.

From the 1930s to the 1960s, as the Princeton historian Daniel T. Rodgers demonstrates in his recent book, “The Age of Fracture,” American public discourse was filled with references to the social circumstances of average citizens, our common institutions and our common history. Over the last five decades, that discourse has changed in ways that emphasize individual choice, agency and preferences. The language of sociology and common culture has been replaced by the language of economics and individualism.

In 1934, the government was us. We had shared circumstances, shared risks and shared obligations. Today the government is the other — not an institution for the achievement of our common goals, but an alien presence that stands between us and the realization of individual ambitions. Programs of social insurance have become “entitlements,” a word apparently meant to signify not a collectively provided and cherished basis for family-income security, but a sinister threat to our national well-being.

Over the last 50 years we seem to have lost the words — and with them the ideas — to frame our situation appropriately.

This is a fascinating line: “The language of sociology and common culture has been replaced by the language of economics and individualism.” This reminds me of the findings about how public opinion changes when asked about “welfare” versus “assistance for the poor.” The concepts are similar but the connotations of the specific terms matter.

Is the end argument here that changing the language will lead to more communal understandings, or does reversing the “Bowling Alone” phenomenon have to come first? It would be helpful to know what exactly these commentators think happened in this period beyond simply the change in language. Could we argue that the community-oriented policies of the mid-1900s that led to a booming economy, rising incomes, suburbanization, and homeownership were “too successful,” in that they led to these shifts in language and focus?