Majority of young adults “see online slurs as just joking”

A recent survey of teenagers and young adults suggests that they are more tolerant of offensive or pejorative terms in the online realm:

Jaded by the Internet free-for-all, teens and 20-somethings shrug off offensive words and name-calling that would probably appall their parents, teachers or bosses. And an Associated Press-MTV poll shows they don’t worry much about whether the things they tap into their cellphones and laptops could reach a wider audience and get them into trouble.

Seventy-one percent say people are more likely to use slurs online or in text messages than in person, and only about half say they are likely to ask someone using such language online to stop…

But young people who use racist or sexist language are probably offending more people than they realize, even in their own age range. The poll of 14- to 24-year-olds shows a significant minority are upset by some pejoratives, especially when they identify with the group being targeted…

But they mostly write off the slurs as jokes or attempts to act cool. Fifty-seven percent say “trying to be funny” is a big reason people use discriminatory language online. About half that many say a big reason is that people “really hold hateful feelings about the group.”…

It’s OK to use discriminatory language within their own circle of friends, 54 percent of young people say, because “I know we don’t mean it.” But if the question is put in a wider context, they lean the other way, saying 51-46 that such language is always wrong.

This would seem to corroborate the idea that online anonymity, or the format of comments sections, frees people up to say things they wouldn’t say in real life. Perhaps this happens because there is no face-to-face interaction, because it is harder to identify people, or because there are few repercussions. In the end, the verbal and non-verbal cues that might stop people from saying these things in person simply don’t exist online.

I would be interested to see more research about this “joking” and how young adults understand it. Humor can be one of the few areas of life where people can address controversial topics with fewer consequences. Of course, there are limits on what is acceptable, but these limits often vary by context, particularly in peer-driven settings like high school or college where being “cool” means everything. These young adults likely know this intuitively, since they wouldn’t use the same terms around parents or other adults. Are they, then, more polite around authority figures and saving it all up for online settings, or are they more uncivil in general, as some would argue?

For an important issue like racism, does this mean that many in the next generation think being or acting racist is okay as long as they are among friends but not okay to exhibit in public settings? Is it okay to be racist as long as it is accompanied by a happy emoticon or a j/k?

Knowing that this is a common issue, what is the next step in cutting down on this offensive humor, as we are already seeing in many media sites’ comments sections? And who gets to do the policing – parents, schools, websites?

The norms of college protests in court

Arguments in a California courtroom revolve around this question: what are the norms governing college protests?

Sociologist Steven Clayman took the stand on Thursday, the final day of testimony. He is an expert in “speaker-audience interaction,” and has written a scholarly article titled, “Booing: The Anatomy of a Disaffiliative Response,” which examines environments such as presidential debates, TV talk shows and British Parliament. He believes audience participation cannot be prevented because members of the crowd are “free agents,” able to express approval or disapproval of what a speaker is saying.

Having watched a video of the Irvine 11 incident, Clayman affirmed that the audience response seemed to be a “normal and unavoidable” part of Ambassador Michael Oren’s speech.

Lead prosecutor Dan Wagner then fired, “It’s unavoidable that 10 people would stand up with planned statements that have nothing to do with what the speaker is saying? . . . Are you saying that the only way to prevent [protests] is to put a straitjacket and muzzle on them?” The questions were stricken by the judge.

Ten UC Irvine and UC Riverside students have been charged with misdemeanor conspiracy to commit a crime and misdemeanor disruption of a meeting. To be convicted of the latter, one must commit an act that violates the “implicit customs” or “explicit rules” for the event. The defense team claims the defendants did neither, arguing that they were merely following the norms and customs of protests on college campuses.

So what exactly is “normal” college protest behavior? A number of colleges have faced these questions in recent years as protests have moved from taking place outside the event to occurring during the event. Think of the “Don’t Tase Me Bro” incident of 2007. Or witness the various pie-throwing attempts involving politicians. I wonder if this trial is then less about whether such actions are harmful and more about how these norms have changed over the decades and whether there is widely understood agreement about those changes.

Of course, this particular trial in California involves a number of contentious political and social issues.

I wonder if this case, and other similar ones, will lead more schools to create explicit rules about what is and is not allowed in on-campus protests and to make those rules widely known.

60% of British teenagers, 37% of adults “highly addicted” to their smartphones

A recent British study found that many teenagers are “highly addicted” to their smartphones:

Britons’ appetite for Facebook and social networks on the go is driving a huge demand for smartphones – with 60% of teenagers describing themselves as “highly addicted” to their device – according to new research by the media regulator, Ofcom…

The study, published on Thursday, also shows that smartphones have begun to intrude on our most private moments, with 47% of teenagers admitting to using their device in the toilet. Only 22% of adults confessed to the same habit. Unsurprisingly, mobile-addicted teens are more likely than adults to be distracted by their phones over dinner and in the cinema – and more would answer their phone if it woke them up…

Of the new generation of smartphone users, 60% of teenagers classed themselves as “highly addicted” to their device, compared to 37% of adults.

Ofcom surveyed 2,073 adults and 521 children and teenagers in March this year. The regulator defines teenagers as aged between 12 and 15, with adults 16-years-old and above.

Perhaps these results are not that surprising but they lead to several thoughts about addiction:

1. Since this is self-reported, couldn’t the percentage of teenagers and adults who are “highly addicted” actually be higher? If asked, how many people would admit to being “highly addicted” to things that they were actually addicted to?

2. That this many people were willing to say that they are “highly addicted” suggests that this addiction is probably considered to be normal behavior. If everyone or most people are actually addicted to using their smartphones, doesn’t this turn into a norm rather than an addiction in the eyes of the public? In twenty years, when these teenagers are the ones running these surveys, they may not use the same language or terms to describe phone/mobile device/computer use.

Why we need “duh science”

A lot of studies are completed every year, and the results of some seem more obvious than others, producing what this article calls “duh research.” Here is why experts say these studies are still necessary:

But there’s more to duh research than meets the eye. Experts say they have to prove the obvious — and prove it again and again — to influence perceptions and policy.

“Think about the number of studies that had to be published for people to realize smoking is bad for you,” said Ronald J. Iannotti, a psychologist at the National Institutes of Health. “There are some subjects where it seems you can never publish enough.”…

There’s another reason why studies tend to confirm notions that are already widely held, said Daniele Fanelli, an expert on bias at the University of Edinburgh in Scotland. Instead of trying to find something new, “people want to draw attention to problems,” especially when policy decisions hang in the balance, he said.

Kyle Stanford, a professor of the philosophy of science at UC Irvine, thinks the professionalization of science has led researchers — who must win grants to pay their bills — to ask timid questions. Research that hews to established theories is more likely to be funded, even if it contributes little to knowledge.

Here we get three possible answers as to why “duh research” takes place:

1. It takes time for studies to draw attention and become part of cultural “common sense.” One example cited in this article is cigarette smoking. One study wasn’t enough to show a relationship between smoking and negative health outcomes. Rather, it took a number of studies before there was a critical mass of evidence that the public accepted. While the suggestion here is that this is mainly about convincing the public, it also makes me think of the general process of science, in which numerous studies find the same thing and knowledge becomes accepted.

2. These studies could be about social problems. There are many social ills that could be deserving of attention and funding and one way to get attention is to publish more studies. The findings might already be widely accepted but the studies help keep the issue in the public view.

3. It is about the structure of science/the academy where researchers are rewarded for publications and perhaps not so much for advancing particular fields of study. “Easy” findings help scientists and researchers keep their careers moving forward. These structures could be altered to promote more innovative research.

All three of these explanations make some sense to me. I wonder how much the media plays a role in this: why do media sources cite so much “duh research” when there are other kinds of research going on as well? Could these be “easy” journalistic stories that fit particular established narratives or causes? Do universities/research labs tend to promote these studies more?

Of course, the article also notes that some of these studies can turn up unexpected results. I would guess that quite a few important findings have come out of research for which someone at the outset could easily have predicted a well-established answer.

(It would be interesting to think more about the relationship between sociology and “duh research.” One frequent knock against sociology is that it is all “common sense.” Aren’t we already aware of our interactions with others and of how our culture operates? But we often don’t have time for analysis and understanding in our everyday activities, and we often simply go along with prevailing norms and behaviors. It all may seem obvious until we are put in situations that challenge our understandings, like stepping into new settings or different cultures.

Additionally, sociology goes beyond the individual, anecdotal level at which many of us operate. We can often create a whole understanding of the world based on our personal experiences and what we have heard from others. Sociology looks at the structural level and works with data, looking to draw broad conclusions about human interaction.)

A “grand, unifying theory of humor”?

A marketing and psychology professor argues that he can explain all humor:

There may be many types of humor, maybe as many kinds as there are variations in laughter, guffaws, hoots, and chortles. But McGraw doesn’t think so. He has devised a simple, Grand Unified Theory of humor—in his words, “a parsimonious account of what makes things funny.” McGraw calls it the benign violation theory, and he insists that it can explain the function of every imaginable type of humor. And not just what makes things funny, but why certain things aren’t funny. “My theory also explains nervous laughter, racist or sexist jokes, and toilet humor,” he told his fellow humor researchers…

The theory they lay out: “Laughter and amusement result from violations that are simultaneously seen as benign.” That is, they perceive a violation—”of personal dignity (e.g., slapstick, physical deformities), linguistic norms (e.g., unusual accents, malapropisms), social norms (e.g., eating from a sterile bedpan, strange behaviors), and even moral norms (e.g., bestiality, disrespectful behaviors)”—while simultaneously recognizing that the violation doesn’t pose a threat to them or their worldview. The theory is ludicrously, vaporously simple. But extensive field tests revealed nuances, variables that determined exactly how funny a joke was perceived to be.

I’ll attempt a quick and dirty translation into sociological terms: each society or culture has particular norms about right and wrong behavior, and violating these norms often leads to negative sanctions. But according to this academic, humor works because the recipient sees that the violation isn’t an attempt to overthrow the norms. The key appears to be the ability to show that the intended humor is “benign,” that the person sharing the humor has good intentions or still operates within the culture’s larger norms. Humor ceases to be humor when hearers think that the teller has “hit too close to home” or is mean-spirited.

After reading about this attempt at theory, I’m a little surprised that I haven’t read more from sociologists about humor. I know there is some work out there on this but in my reading and training, I remember hearing little about this basic feature of everyday life.

Considering the portrayal of single women

At the beginning of a film review, a British reviewer highlights a sociological study about how people treat and interact with single women:

Apparently, couples still shun the female singleton, fearful that she’ll wreck their marriages or at least their dinner-party numbers. One survey found that half of its sample never had single women as visitors, and 19% knew no single women at all. Casual disregard for this social group goes unremarked. Our prime minister insists that marriage must be prioritised and rewarded. The last government repeatedly identified “hard-working families” as its abiding concern. WAGs, meanwhile, are celebrated as much as manless Anistons are pitied.

In a world centred on cosily coupled units, leftover women labour under an enduring disadvantage. When they’re not ignored completely, they’re expected to provide tireless but unrecompensed support for people who matter more than them, as babysitters, carers or shoulders to cry on. When a mother is called upon to bunk off work to attend a nativity play, her unpartnered colleague is expected to take up the slack.

Cinema hasn’t done much for the benighted single woman.

The sociological study in question included 48 married Australians. It is an interesting area of gender roles to consider: the norm in society is still to find a spouse or partner by a certain age, and cultural values and norms, plus supportive public policies, put pressure on people to do so. This is particularly the case in many churches, where singleness is frowned upon.

But hasn’t there been some pushback in the cultural realm on this front? Perhaps not in movies but television shows like “Cougar Town” have taken up this issue. Some of it may depend on the end goal: is the message of such films and TV shows (and books and music) that single women need to find men/husbands to be complete?

Learning the norms of audience behavior at the orchestra concert

Going to a concert by a major symphony orchestra, such as the Chicago Symphony Orchestra, is an event: certain behavior is expected of the audience. An article from the Chicago Tribune offers some tips, along with a comment from a musician, about how newcomers can learn to appreciate going to the orchestra:

It is extremely hard for anyone without significant exposure to classical music to truly understand it, he said.

“It’s something that has to be cultivated,” he said. “Beethoven’s music is filled with philosophy. …You can’t just come to one concert and understand it.”

But he hopes beginners try. One concert, after all, can lead to another. And another.

I would like to know when exactly symphony halls became places of quiet and decorum. Accounts of classical music in the early 20th century, such as The Rest Is Noise, describe some concerts, particularly those featuring modern music by the likes of Stravinsky, as scenes of openly displayed emotion. Classical music wasn’t just nice background music; it was tied to bigger ideas and revolutionary thoughts.

The possible housing bubble in China

While the American housing crisis continues, FinanceAsia takes a look at the current housing situation in China:

Many homebuyers nowadays in China consider their property assets as part of their long-term savings plan, as well as a hedge against inflation.

Why property? China’s tightly run financial system leaves only three places for its zealous savers to put their money. Bank deposits are one option. But they yield 2.25%, less than the 3.1% rise in May’s consumer price inflation. The equity markets are a second choice. But stocks have been performing poorly; Shanghai’s benchmark index was one of the world’s worst performers in the first half of 2010. (And the bond market is underdeveloped.) Even with its high transaction costs and manic price moves, property has become the preferred investment choice for everyone from young married couples to middle-aged factory workers trying to ensure their retirement.

Recent statistics show that there are about 64 million apartments and houses that have remained empty during the past six months, according to Chinese media reports. On the assumption that each flat serves as a home to a typical Chinese family of three (parents and one child), the vacant properties could accommodate 200 million people, which account for more than 15% of the country’s 1.3 billion population. But instead, they remain empty. This is in part because many Chinese believe that a home is not a real home unless you own the flat. And so people prefer buying to renting, and as a result, the rental yield is relatively low.

That’s a lot of vacant property. This is a testament to the power of cultural norms regarding housing: since renting is less desirable, a large percentage of the housing stock goes unoccupied. Also, savings behavior seems partly driven by these norms (and perhaps also by limited economic returns elsewhere) – houses have developed into investments rather than just places to live.
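As a quick aside, the quoted figures are easy to sanity-check. Here is a minimal back-of-the-envelope sketch in Python that uses only the article’s own assumptions (roughly 64 million vacant units, three people per household, and a population of about 1.3 billion); the variable names are mine:

```python
# Rough check of the vacancy figures quoted above (numbers taken from the
# article, not independently verified).
vacant_units = 64_000_000          # vacant apartments and houses
people_per_household = 3           # parents plus one child
population = 1_300_000_000         # approximate population of China

potential_residents = vacant_units * people_per_household
share_of_population = potential_residents / population

print(f"Vacant units could house about {potential_residents / 1e6:.0f} million people")
print(f"That is roughly {share_of_population:.1%} of the population")
# Prints ~192 million people, or about 14.8% of the population -- close to,
# though slightly under, the "more than 15%" cited in the article.
```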

I don’t know much about the Chinese housing market but it is intriguing to read about non-American norms and values attached to housing. I wonder how these norms and values developed over time.