Sociology professor who taught class on Lady Gaga becomes “Gaga sensei” and celebrity himself

Read about the fame found by a sociology professor who taught a class titled “Lady Gaga and the Sociology of Fame”:

Deflem’s entry into the world of celebrity began quietly enough. He had an idea for a course looking at Lady Gaga’s rise to fame – and examining it from a sociological point of view – in the summer of 2010 and got the go-ahead to design it. In October 2010, the course was announced in the university newspaper. From there – to the astonishment of many – the course suddenly became news across the globe.

In the weeks that followed, Deflem was swamped with requests for interviews and media appearances to discuss the course. They came from the New York Times, the BBC, the Washington Post, MTV, Billboard, Elle and USA Today. Media from countries including Italy, Germany, Ireland, Slovenia, India, Vietnam, Lebanon, Oman and even Zambia ran pieces about it. He fended off accusations that he had cynically designed the course and its title just to get such attention. “There is no way I could have planned this. I am not that smart,” he said.

But that was just the beginning. Soon he got an avalanche of criticism from figures like conservative firebrand Ann Coulter as well as Christian fundamentalists. His course even became an answer on the game show Who Wants To Be A Millionaire.

Lady Gaga herself noticed the course and talked about it in radio interviews and in a chat with broadcast journalist Anderson Cooper on the flagship news programme 60 Minutes. Saturday Night Live did a skit about Lady Gaga featuring a fan of the star who was dressed to look like Deflem…

He was also amazed at the lack of agency he had over his own fate and image as it spiralled out of control in the hands of hundreds of journalists. “You kind of undergo it. You experience it. You do not really have any control,” he said.

Does this then count as participant observation?

The course did indeed get a lot of attention (see an earlier post here), but it sounds like it was worthwhile in the end: it allowed a sociology professor to take a current topic and use it to teach sociology, as well as to learn from the inside about the nature of celebrity.

I still think it would be interesting to hear sociologists discuss their opinions about courses like this or Michael Eric Dyson’s courses on hip-hop. The name and subject matter of such a course can stir up controversy, but the controversy helps draw attention to a discipline that doesn’t generally receive much. Plus, how different is giving a course a provocative name and then using it to teach sociology well from the current events and examples many sociology professors already use in the classroom?

Female characters in recent movies and TV shows still marginalized

A new study shows female characters in recent movies and TV shows still play very different roles than men:

The study, led by sociologist Stacy L. Smith, analyzed 11,927 speaking roles on prime-time television programs aired in spring 2012, children’s TV shows aired in 2011 and family films (rated G, PG, or PG-13) released between 2006 and 2011. Smith’s team looked at female characters’ occupations, attire, body size and whether they spoke or not.

The team’s data showed that on prime-time television, 44.3 percent of females were gainfully employed — compared with 54.5 percent of males. Women across the board were more likely to be shown wearing sexy attire or exposing some skin, and body size trends were apparent: “Across both prime time and family films, teenaged females are the most likely to be depicted thin,” Smith wrote in the study’s executive summary. The ratio of men to women in STEM fields was 14.25 to 1 in family films and 5.4 to 1 on prime time TV. Perhaps most telling are the percentages of speaking female characters in each media form: only 28.3 percent of characters in family films, 30.8 percent of characters in children’s shows, and 38.9 percent of characters on prime time television were women.

In a summary of the study’s findings, the researchers reported that they found a lack of aspirational female role models in all three media categories, and cited five main observations: female characters are sidelined, women are stereotyped and sexualized, a clear employment imbalance exists, women on TV come up against a glass ceiling, and there are not enough female characters working in STEM fields.

This reminds me of the video Killing Us Softly 4: similar images of women are spread throughout advertising and other areas. Television and movies don’t exactly depict reality, but we can still ask what values they are portraying. It is not just about entertainment; sure, people want to escape from the real world from time to time, but any sort of media creates and works with values and ideas. Of course, the real values portrayed by television and movies may be consumerism (for example, in the latest Bond film) and making money.

TV shows for teenagers show professors as “old, boring, white, and mean”

Here is how college professors are portrayed on television shows for teenagers: “old, boring, white, and mean.”

They may be fictional characters, but their small-screen images may affect students in big ways, says one researcher. Barbara F. Tobolowsky, an assistant professor of educational leadership and policy studies at the University of Texas at Arlington, found that television’s image of the professor is intimidating, uninterested, and generally old, boring, and white. She is scheduled to present a working paper on her research, “The Primetime Professoriate: Representations of Faculty on Television,” this week at the annual meeting of the Association for the Study of Higher Education…

While previous studies of television have focused on how much time students spend watching TV and not studying, Ms. Tobolowsky looked at the content of those television shows. In her study, Ms. Tobolowsky, who has a master’s degree in film history and criticism and who previously worked in the film industry, analyzed 10 shows that aired from 1998 to 2010 and were geared toward the 12-to-18-year-old demographic group in the Nielsen ratings.

Scrutinizing professors in those shows, she examined characters’ clothing and ways of talking, camera angles and background music, and a variety of other film nuances to break down how enthusiastic the faculty were and how they interacted with students, along with other criteria.

On the whole, she says, professors on the television shows tended to be relatively old, white, and traditional, wearing sweater-vests and sporting graying hair. Young, female, and minority professors on the shows tended to teach only at arts-oriented institutions or community colleges. Most were intimidating or, at the very least, distant, throwing a scare into characters like Matt on 7th Heaven, who worried he’d seem weak if he asked a question in class.

This study seems to suggest that shows for teenagers depict professors as the enemy. While not all teenagers love school, I wonder if this is part of a larger message on television and in movies that the learning part of school isn’t that important while the social aspects (think of the message in Mean Girls) are what really matter. Of course, there is a genre of movies that depicts heroic teachers, but these are formulaic in their own ways.

It would be interesting to compare these depictions to how professors are portrayed on shows aimed at adults. I’m reminded of the TNT show Perception, which features Eric McCormack playing a neuroscientist at a Chicago-area university. (Disclosure: I know about this show because I tend to catch a few minutes of its opening after watching Major Crimes, which I watch because of The Closer.) The show tends to open this way: McCormack is at the front of a classroom full of eager students who are hanging on his every word. At the side of the room is his trusty graduate student TA, who occasionally chimes in. McCormack has scribbled all sorts of profound things on the board, and he ends class with a deep question or a witty joke. When the class ends, he quickly leaves the classroom and gets wrapped up in some fascinating case. Sound like a typical college classroom? While the professor here is depicted as a cool young guy, this is hardly a realistic picture of most college classrooms.

I realize what takes place day in and day out in a college classroom likely does not make scintillating television. Indeed, have you watched DVDs of The Great Courses? Yet this doesn’t mean nothing worthwhile is going on in that classroom, or that professors must be severely stereotyped one way or the other depending on the audience.

Argument: George Lucas is the “greatest artist of our time”

Camille Paglia explains why she believes George Lucas is “the greatest artist of our time”:

Who is the greatest artist of our time? Normally, we would look to literature and the fine arts to make that judgment. But Pop Art’s happy marriage to commercial mass media marked the end of an era. The supreme artists of the half century following Jackson Pollock were not painters but innovators who had embraced technology—such as the film director Ingmar Bergman and the singer-songwriter Bob Dylan. During the decades bridging the 20th and 21st centuries, as the fine arts steadily shrank in visibility and importance, only one cultural figure had the pioneering boldness and world impact that we associate with the early masters of avant-garde modernism: George Lucas, an epic filmmaker who turned dazzling new technology into an expressive personal genre.

The digital revolution was the latest phase in the rapid transformation of modern communications, a process that began with the invention of the camera and typewriter and the debut of mass-market newspapers and would produce the telegraph, telephone, motion pictures, phonograph, radio, television, desktop computer, and Internet. Except for Futurists and Surrealists, the art world was initially hostile or indifferent to this massive surge in popular culture. Industrial design, however, rooted in De Stijl and the Bauhaus, embraced mechanization and grew in sophistication and influence until it has now eclipsed the fine arts.

No one has closed the gap between art and technology more successfully than George Lucas. In his epochal six-film Star Wars saga, he fused ancient hero legends from East and West with futuristic science fiction and created characters who have entered the dream lives of millions. He constructed a vast, original, self-referential mythology like that of James Macpherson’s pseudo-Celtic Ossian poems, which swept Europe in the late 18th century, or the Angria and Gondal story cycle spun by the Brontë children in their isolation in the Yorkshire moors. Lucas was a digital visionary who prophesied and helped shape a host of advances, such as computer-generated imagery; computerized film editing, sound mixing, and virtual set design; high-definition cinematography; fiber-optic transmission of dailies; digital movie duplication and distribution; theater and home-entertainment stereo surround sound; and refinements in video-game graphics, interactivity, and music.

Read the entire interesting argument.

Four quick thoughts:

1. This broadens the common definition of artist. It acknowledges the shift away from “high art” – the sort of music, painting, and cultural works typically found in museums or other respected venues – toward “popular art” like movies and music.

2. The argument doesn’t seem to be that Lucas is the best filmmaker or best storyteller. Rather, it is based more on his ability to draw together different cultural strands in a powerful way. Paglia argues he brought together art and technology, combined stories from the past and present, and promoted the use and benefits of new technologies that were influential far beyond his own films.

3. Another way to think of a “great artist” is to try to project the legacy of artists. How will George Lucas be viewed in 50 or 100 years? Of course, this is hard to do. But part of creating this legacy starts now, as people review an artist’s career, though the assessment could change with future generations. I wonder: if technology is changing at a quicker pace, does this also mean the legacy of cultural creators will have a shorter cycle? For example, if movies as we know them today are relics in 50 years, will Lucas even matter?

4. How would George Lucas himself react to this? Who would he name as the “greatest artist” of today?

Evidence: TV shows can lower fertility rates

An article about the cultural power of television discusses several studies that show TV programs can lower fertility rates:

Several years ago, a trio of researchers working for the Inter-American Development Bank set out to help solve a sociological mystery. Brazil had, over the course of four decades, experienced one of the largest drops in average family size in the world, from 6.3 children per woman in 1960 to 2.3 children in 2000. What made the drop so curious is that, unlike the Draconian one-child policy in China, the Brazilian government had in place no policy to limit family size. (It was actually illegal at some point to advertise contraceptives in the overwhelmingly Catholic country.) What could explain such a steep drop? The researchers zeroed in on one factor: television.

Television spread through Brazil in the mid-sixties. But it didn’t arrive everywhere at once in the sprawling country. Brazil’s main station, Globo, expanded slowly and unevenly. The researchers found that areas that gained access to Globo saw larger drops in fertility than those that didn’t (controlling, of course, for other factors that could affect fertility). It was not any kind of news or educational programming that caused this fertility drop but exposure to the massively popular soap operas, or novelas, that most Brazilians watch every night. The paper also found that areas with exposure to television were dramatically more likely to give their children names shared by novela characters.

Novelas almost always center around four or five families, each of which is usually small, so as to limit the number of characters the audience must track. Nearly three quarters of the main female characters of childbearing age in the prime-time novelas had no children, and a fifth had one child. Exposure to this glamorized and unusual (especially by Brazilian standards) family arrangement “led to significantly lower fertility”—an effect equal in impact to adding two years of schooling.

In a 2009 study, economists Robert Jensen and Emily Oster detected a similar pattern in India. A decade ago, cable television started to expand rapidly into the Indian countryside, where deeply patriarchal views had long prevailed. But not all villages got cable television at once, and its random spread created another natural experiment. This one yielded extraordinary results. Not only did women in villages with cable television begin bearing fewer children, as in Brazil, but they were also more able to leave their home without their husbands’ permission and more likely to disapprove of husbands abusing their wives, and the traditional preference for male children declined. The changes happened rapidly, and the magnitude was “quite large”—the gap in gender attitudes separating villages introduced to cable television from urban areas shrunk by between 45 and 70 percent. Television, with its more progressive social model, had changed everything.

Four quick thoughts:

1. Such shows (TV and radio) have been used deliberately by public health organizations to fight AIDS. It is one thing to hold training sessions and to open and maintain clinics, but it is another to have successful soap operas that promote certain behaviors.

2. These situations provided some fascinating natural experiments. I occasionally ask students this very question: how might you set up a natural experiment to test the effects of television? In the United States, outside of some ultra-controlled environment a la The Truman Show, it is difficult to answer this question quickly. (A sketch of how such a design might be analyzed follows this list.)

3. Sociologist Juliet Schor nicely explains the mechanism behind this in The Overspent American. Mass media presents average residents with a new, commonly known reference group to which they can compare themselves. Instead of primarily comparing themselves to neighbors or acquaintances, viewers start seeing what “middle-class” or “normal” looks like on television and then work to emulate that.

4. Media output is not simply entertainment – something is being promoted. Being able to watch and experience this critically is crucial in a world awash with media and information.
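As a concrete illustration of point #2, here is a minimal difference-in-differences sketch of how a staggered television rollout might be analyzed. This is only a toy example, not the actual specification from the Brazil or India studies: the data are simulated, and the variable names (village, treated, post, births) and the -0.4 “effect” are all hypothetical.

```python
# Toy difference-in-differences sketch for a TV natural experiment.
# Hypothetical setup: half the villages gain TV access in year 5; we
# compare their before/after change in births to the change in
# villages that never gain access.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n_villages, n_years = 200, 10
gets_tv = rng.random(n_villages) < 0.5  # which villages are ever treated

records = []
for v in range(n_villages):
    for t in range(n_years):
        post = int(t >= 5)                # TV rollout happens in year 5
        exposed = int(gets_tv[v]) * post  # 1 only in TV villages, post-rollout
        # Baseline fertility plus a hypothetical -0.4 exposure effect and noise
        births = 3.0 - 0.4 * exposed + rng.normal(0, 0.5)
        records.append((v, t, int(gets_tv[v]), post, births))

df = pd.DataFrame(records, columns=["village", "year", "treated", "post", "births"])

# The coefficient on treated:post is the difference-in-differences estimate:
# the extra change in fertility in TV villages relative to non-TV villages.
model = smf.ols("births ~ treated + post + treated:post", data=df).fit()
print(model.params["treated:post"])  # recovers roughly -0.4
```

The key idea is the interaction term: TV villages are not compared to non-TV villages directly, but in terms of how much more their fertility changed across the rollout, which nets out both fixed village differences and common time trends.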

Sociologist: Oscars are “insiders rewarding insiders”

As people watching the Oscars last night might have wondered what some of the winning films were about (Best Picture winner The Artist has taken in just over $31 million at the box office), a sociologist argues that the Oscars represent “insiders rewarding insiders”:

“The annual Oscars are a vital component of our cultural machinery, not only reflecting taste but producing it – and thereby creating profit for moviemakers,” says Ben Agger, director of the Center for Theory in the University of Texas at Arlington’s sociology department, in an e-mail. “The voters are insiders rewarding insiders.”…

A Los Angeles Times report found that 94 percent of Academy members are white and 77 percent are male, with blacks making up only 2 percent and Latinos less than that. The median age of Oscar voters is 62, with just 14 percent under 50 years old.

This has led to accusations of gender and race bias. But Charles Bernstein, who for 10 years was chairman of the Academy Award rules committee, is a bit tired of the yearly accusations that come AMPAS’s way.

“The Academy is not a democracy but a meritocracy,” he says.

The job of the Academy is not to reflect but to lead, he adds. These are great professionals who have achieved distinction in motion picture-making, and they are merely saying, “Here is what we most respect.”

This is a classic culture question: does culture reflect society (perhaps its organizations and social conditions, or the demands of consumers)? Or, put another way, should cultural products be rewarded for being popular, for being the best, or for being outside the box?

This could be viewed as a gatekeeper issue: who gets to decide the merits of a cultural product? I suspect the battle between “mass culture” and “high culture” will not be settled anytime soon. At this point, what would Hollywood gain by changing the current system? The Oscars are popular television and there are still enough blockbusters for Hollywood to keep moving forward. At the Oscar gathering I attended, another attendee and I were thinking through an award titled “the movie American movie-goers loved the most,” perhaps determined by the box-office winner or by votes from people who actually attended the movies (something like the older system of all-star balloting at sporting events). I also wouldn’t be surprised if the Oscars found a way to include some voting input from the public, even if it was more symbolic than anything else. Perhaps their solution right now is to include enough popular films (like Bridesmaids) and celebrities (like Tom Cruise and Jennifer Lopez) in the show to keep people happy even though the popular people aren’t going to win.

If we truly are headed toward a more individualistic, more culturally diffuse world, we might expect that the Oscars and Grammys and all sorts of cultural gatekeepers (official reviewers, critics, etc.) will face more trouble. This would not only be an issue of whether a majority of a culture actually experiences significant works (an interesting question in itself) but also of whether the public actually cares what the gatekeepers think (why watch the Oscars if they don’t even talk about movies that most people see?). I don’t think we are close to the end of the gatekeepers, but this is going to continue to be a fault line to watch.

Was there really cultural consensus in America in 1963?

Virginia Postrel takes issue with one recent claim from Charles Murray that 1963 America was some sort of golden era of cultural consensus. Postrel raises two counterpoints:

There are two big problems with this fable [of cultural consensus]. The first is that the old consensus was an illusion. Editing out anomalies was essential to the whole concept of a single culture as defined not merely by basic values but by taste and experience. Some of those anomalies were huge.

Take religion, a topic that looms large in Murray’s analysis. In 1976, Gallup for the first time asked people whether they had had a “born again” experience in which they committed themselves to Jesus Christ. It was a concept largely unknown to the popular media before the emergence of Jimmy Carter…

That’s the second problem with Murray’s fable: The cultural consensus was not just an illusion. It was an unhealthy one. Instead of promoting understanding, it fed contempt.

One piece of evidence is right on page 2 of the book: “The Beverly Hillbillies,” the highest-rated TV show the week Kennedy was killed. As Murray points out, nearly a third of American households watched it on CBS every week — astounding numbers by today’s standards. “The Beverly Hillbillies” was not just popular. It was, by most measures, the biggest hit in sitcom history. By its fourth week on the air, it had knocked Lucille Ball out of her top spot, and it only fell from the top 10 in its ninth and final season. It even saved “The Dick Van Dyke Show,” a flop in its original slot, by providing a big lead-in audience in an era when it was hard to change the channel. In a true consensus culture, everyone would have loved it…

Critics damned “The Beverly Hillbillies” as utter trash. The New York Times called it “steeped in enough twanging-guitar, polkadot gingham, deliberative drawl, prolific cousins and rural no-think to make each half hour seem as if it contained 60 minutes.” Variety declared it “painful to sit through.” Newsweek said it was “the most shamelessly corny show in years.”

So while Murray wants to tell a story of a dividing America, Postrel is suggesting there has always been an America divided between the elites and the masses. It seems to me that there would be ways to collect data to answer whether the divide today is more pronounced, and more problematic, than it was in the past.

This reminds me of all the suburban critiques that quickly emerged after World War II. While there are indeed viable issues to raise about suburban life (whether it is a good use of land and resources, whether it could be planned better, whether concessions could be made so that it is accessible by more than just cars, whether it could offer opportunities for the elderly and teenagers, whether it is welcoming to all people, etc.), there is also some scorn in this analysis. There was a lot of concern about “mass culture” and how the average American was being tempted by low-brow culture; Marxist commentators labeled this the trade-off of “lawns for pawns.” These viewpoints tended to come from upper-class, urban commentators who couldn’t understand why so many Americans wanted the suburban lifestyle, which the commentators argued was simply a glittering facade with no depth. One sociologist who jumped into this fray was Herbert Gans. Writing about the suburban experience (after living in Levittown, unlike many of the negative commentators) and about popular culture, Gans used sociological data and theory to poke holes in this commentary, suggesting that perhaps society wasn’t rapidly unraveling and that we weren’t all doomed to live in the land of Idiocracy.

This is a reminder of a few things:

1. Analyzing American culture all at once is a tough task that requires good data and nuance.

2. Closing this gap between high and low culture may be a worthy task but it is not an easy one.

3. America will have to move forward while balancing these multiple perspectives of high and low culture. Either side demonstrating contempt for the other (think about attacks on “academic elites” or “mass culture”) isn’t helpful.