When television shows help interpret history

What responsibility do television shows have to depict history accurately? Take the case of The Crown:


Historical dramas might similarly warp our attitude toward history, encouraging us to expect that cause and effect are obvious, or that world events hinge on single decisions by identifiable individuals. Academics have been trying to demolish the great-man theory of history for more than a century; television dramas put it back together, brick by brick.

What matters here is that we are having the right arguments about these ethical and dramatic decisions, not lobbing grenades at each other from opposing trenches of the culture war. Reasonable people can disagree over artistic license and the writer’s duty of care to her or his subjects. And none of this would be an issue if so many people didn’t love The Crown. Dowden is right to argue that the show is so popular that its interpretation of history will become the definitive one for millions of viewers.

That is something Netflix could mitigate, if it wanted to. Not with a pointless disclaimer, but with an accompanying documentary, rounding out the stories told in the drama. (There is a Crown podcast, featuring Morgan, but I mean something packaged more obviously alongside the main series.) There is certainly an appetite for one: Three unrelated Diana documentaries now clog up my Netflix home screen, and newspapers have published multiple articles separating fact from fiction.

Ultimately, it is not illegitimate to create narratives out of real lives. In fact, a good historical drama has to do so. But when we talk about the monarchy, modern Britain, and the legacy of divisive politicians like Thatcher, The Crown should be the start of a conversation, not the last word.

Television, and mass media more broadly, has the potential to shape how people understand the world. This is not only because people find it a compelling window on the world; the sheer amount of time Americans spend watching TV each day means that television depictions have at least some influence.

Given this, it is interesting to consider whether Netflix and other producers and distributors of television should do more to depict history accurately. How possible is this? Here are a few problems that might arise:

  1. Balancing a historical drama with an accompanying documentary might help. But documentaries are also told from particular points of view, and how many viewers will watch all of both?
  2. History is an ongoing narrative. The Crown comes from a particular point of view in a particular time that may or may not align with other depictions before and after. Imagine some time passes after Queen Elizabeth dies and another director with a different vision comes along – how different is the story in facts and tone?
  3. Other media could present different realities in different ways. History often requires working with a variety of sources, not just visuals. How about at least giving viewers additional resources to consult?
  4. How much should TV viewers know, or be expected to know, about the particular phenomena they observe?

Public understandings of history, academic understandings of history, and other interpretations of history have the potential to interact with and shape each other. How exactly The Crown helps shape the ongoing conversation about the monarchy, Queen Elizabeth, and all the involved actors remains to be seen – and studied.

How residents of Great Britain choose where to live

A new study looks at why people live where they do in Great Britain:

Not surprisingly, the key things that matter to people about the neighborhoods they live in include a mix of housing costs, being close to family, and proximity to where they work. More than a quarter (28 percent) of respondents cited housing costs and proximity to friends as key factors in the neighborhoods where they live, followed by the size and type of available housing (22 percent), and proximity to their workplace or their partner’s workplace (21 percent)…

The full report offers this conclusion:

Where people choose to live is largely determined by their stage of life. Young people aged between 25 and 34 prioritise proximity to the workplace, cost of housing, and access to leisure and cultural facilities when choosing where to live. Those aged between 35 and 55 tend to value access to good schools, and the size and type of their houses. And those aged over 55 prioritise access to countryside and green space.
These preferences help to explain the differing demographics seen across cities and their surrounding areas – different parts of cities are more able to offer amenities that are prioritised by people at different stages of their lives.

Overall, it sounds like two factors matter most even across the age differences: a favorable location with regard to social necessities (jobs and relationships) and good but affordable housing. Of course, obtaining both may be quite difficult for many: families and friends don’t necessarily prioritize living near each other over living close to work or going where the jobs are; employers tend to be concentrated in certain locations; affordable and desirable housing can be very difficult to find in many popular areas; and consumers can’t exactly find housing that is everything they want.

If age or life stage matters so much, should planners and others really pursue lifestyle communities that appeal to just one group, or provide options for multiple groups within individual communities?

The BBC on why many think the suburbs are boring

A sociologist suggests British suburbs are not quite as boring as some might think:

Unlike the usual presumption of suburbs as quiet, featureless places “where nothing ever happens”, recent years have seen dramatic happenings in suburbs, not least the riots of 2011 in places like Ealing and Croydon in London.

In many ways the 21st Century suburb faces some thoroughly modern problems. There is crumbling infrastructure, with hollowed out High Streets. There is pressure on public services prompted by population increases, as witnessed in the annual scramble for school places…

But far from being cultural deserts, suburbs have been a fertile breeding ground for artistic movements. It is from the nation’s Acacia Avenues that almost all post-war pop has emerged, even if its artists would rather make out that they hailed from high-rise hell and so be more “edgy”…

Suburbia has shifted to become a place of dynamism housing ethnically mixed populations, as illustrated by the 2011 Census figures, in contrast to the assumptions of uniformity.

This description could also fit some of the changes in American suburbs in recent decades. Inner-ring suburbs, adjacent to big cities, face big city problems. A number of suburbs are looking for revenue due to cuts in federal and state aid. Suburbs are often marked by single-family homes. More suburbs are seeking out cultural and entertainment opportunities, at least in part to generate increased tax revenues. Increasing numbers of non-whites and poorer residents now live in suburbs. In fact, the final paragraph of the op-ed seems to suggest American and British suburbs are not so different:

We should smash the stereotypes of nondescript suburbia and rather than being embarrassed by them, celebrate those places on the edges of our cities that give our nation its essential character.

The essential character of Britain is in its suburbs?

With these changes afoot, it is interesting to consider why suburbs continue to have this image as boring. As the op-ed says, some of this is due to media portrayals of banal suburban life, whether through television sitcoms or songs by musicians railing against their suburban upbringings. It is also due to academics and other socially influential people arguing against suburbs. When I think about it, I don’t know if I would say these portrayals suggest suburbs are boring; the critiques are often more pointedly negative. Boring implies there isn’t much going on, but the criticisms of suburbs invoke individualism, racism, materialism, classism, and other social ills.

Cultural differences: British produce popular bands, Americans produce popular solo artists

Here is an interesting musical argument: among the world’s best-selling music artists, Britain is represented by bands while the United States has mainly solo artists.

That fact confirms a rule that becomes more and more noticeable the further down you look on the list of the greatest-selling artists of all time: The biggest bands in the world are British, and the biggest solo artists are North American.

The top 20 artists, in order, are The Beatles, Michael Jackson, Madonna, Led Zeppelin, Elton John, Pink Floyd, Mariah Carey, Celine Dion, AC/DC, Whitney Houston, The Rolling Stones, Queen, ABBA, The Eagles, U2, Billy Joel, Phil Collins, Aerosmith, Frank Sinatra, and Barbra Streisand. The list is perfectly split between 10 solo artists and 10 groups. Eight of the 10 solo artists are from North America, while eight of the 10 bands are from outside America, the majority being British. Remarkably, the country that invented rock and roll has not produced any of the top seven rock bands. America’s strongest contender, in at No. 8, is often-derided soft-rock stalwarts The Eagles…

It’s hard to avoid wondering whether political/social mores play a role in the dichotomy. America, after all, likes to think of itself as a land of individualists. Elvis, Jackson, and Madonna all came from humble beginnings, surrounded by poverty and family tragedy. They epitomized the American dream, and so you might argue that the more left-leaning Europeans are happier to celebrate the collectivism of a band. If we look to what’s thought to be the most ideologically “right” genre, this theory holds true: Of the 25 greatest selling country-music stars of all time, all are solo artists. The UK’s two bestselling solo stars, meanwhile, do not fit the rags-to-riches mold of the American singers, but are rather privileged virtuosos who were in stage school from a very young age (Phil Collins, Elton John).

But an arguably sturdier explanation lies in the way those first two giants, Elvis and The Beatles, influenced listeners, musicians, and recording industries in their respective countries. The most-talented aspiring artists on the east side of the Atlantic, from Bono to Freddie Mercury, wanted to be in a band like the Beatles. In the States and Canada everyone from Madonna to Michael Jackson wanted to be the next King.

I’m not sure I buy this final argument. After all, a number of these important early British bands like The Beatles and The Rolling Stones learned much of their craft from American solo artists like Elvis, Little Richard, Muddy Waters, and others. Every artist in America wanted to be Elvis and every British artist wanted to be like The Beatles?

Another aspect of this is that even solo artists need backing bands and collaborators. It is not like the solo artist does everything alone even if they get much of the credit. Additionally, many bands have more dominant and less dominant members. Many bands have struggled with this as members vie for attention. In the end, perhaps this is more about notions of who gets to take credit for musical achievements: the front person or the collective?

This topic seems ripe for more prolonged study. The argument is based on the top 20 artists of all time, which may represent a statistical anomaly compared to a broader slice of chart-toppers. And why not expand the study to other countries that might have quite different musical cultures?

Sociologist discusses why the BBC’s “class calculator” can help the field of sociology

Check out the BBC’s class calculator and this argument from a sociologist about how the calculator matters for sociology:

As an academic sociologist, this take-up, while exciting, is also disconcerting. I am more used to debating social class with my academic peers than seeing the topic taken up so actively in the public arena, and it has been subject to much biting comment. We are deluged by emails complaining about how the calculator puts you in the wrong class, with the wrong labels. Eminent sociologists such as David Rose are concerned with the quality of the social science lying behind the work (do we really need Bourdieu rather than Weber?). Guy Standing is not convinced about our use of his “precariat” (precarious proletariat) term as the label for the most disadvantaged class that we uncover. There are already numerous spoofs and take-offs of the class model and its measurement. Given this furore, I want to explain what we are trying to achieve sociologically with this project. Is this a model of a new kind of accessible social science? Or is it a worrying case of pandering to media headlines?

We are relaxed about people having fun “placing” themselves and discussing this with family and friends, and arguing with us sociologists along the way. It has led to a wider collective discussion on Twitter and Facebook, which we see as a desirable resource for a public-facing sociology in a digital age. We do need to set the record straight, however. The Class Calculator was designed by the BBC to mimic the more complex model we had developed on the basis of the survey data, and the two should not be conflated. As numerous people have pointed out, changing just one response can shift you between different classes. This would not be possible within the latent class analysis we deployed, where all six measures are simultaneously used to allocate class membership. Actually, this kind of simplification was deliberate, as the measures used in the Class Calculator were chosen precisely to make respondents aware of the most important factors in placing people into classes. But it still poses questions about whether we have been simplistic.

Let me be blunt. The concept of class matters, because we need a way of connecting accentuating economic inequalities to social and cultural differences which permeate our society. Rather than seeing our lifestyles and social networks as somehow separate from economic inequalities, there are overlaps that can work together to produce social advantage and disadvantage. For all its problems, the concept of class remains fundamental to making these connections. Sure, we would all rather not live in a class-divided society. But in reality, the markers of class cannot be doubted. Our model seeks to find a way of making these connections, arguing that occupational measures alone are too blunt a tool for this purpose…

In my view, probably the most important finding from our research is the existence of a distinctive “elite” class. We are so used to turning the telescope on the poor and disadvantaged that sociologists have had little to say about those who are at the apex of British society. Sociological studies of class have no specific place for an elite category. What we have shown is that this very wealthy class is now clearly distinguished from all the other classes in Britain, and the economic differences are huge. That is a powerful and unsettling finding.
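The distinction the researchers draw above – a quiz that can flip your class on a single changed answer versus latent class analysis that weighs all six measures at once – can be illustrated with a toy sketch. Everything here is hypothetical: the class names, profile numbers, and income threshold are invented for illustration, and a simple nearest-centroid rule merely stands in for the actual latent class model.

```python
# Toy contrast (not the BBC's actual calculator or the GBCS model):
# a decision-rule quiz keys class off single thresholds, so changing one
# answer can flip the result; a model that uses all six measures jointly
# (here, nearest centroid over six 0-1 scores) is less sensitive to any
# single response.
import math

# Hypothetical class profiles across six economic/social/cultural measures.
CENTROIDS = {
    "elite":       [0.9, 0.9, 0.9, 0.8, 0.8, 0.5],
    "established": [0.6, 0.6, 0.6, 0.6, 0.6, 0.6],
    "precariat":   [0.1, 0.1, 0.1, 0.2, 0.1, 0.3],
}

def quiz_class(answers):
    """Decision-rule quiz: class keyed to the income answer alone."""
    income = answers[0]
    if income > 0.8:
        return "elite"
    if income > 0.4:
        return "established"
    return "precariat"

def joint_class(answers):
    """Assign the class whose full six-measure profile is closest."""
    return min(CENTROIDS, key=lambda c: math.dist(answers, CENTROIDS[c]))

person = [0.85, 0.2, 0.2, 0.2, 0.1, 0.3]  # high income, low everything else
tweaked = [0.75] + person[1:]             # nudge one answer below a threshold

# The quiz flips class on a single changed answer; the joint model does not.
print(quiz_class(person), "->", quiz_class(tweaked))    # elite -> established
print(joint_class(person), "->", joint_class(tweaked))  # precariat -> precariat
```

The joint model also shows why the real analysis can place a high-income respondent outside the “elite”: the other five measures pull the overall profile elsewhere, which a one-threshold quiz cannot capture.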

It is a simple little survey (it took me a few minutes, a bit longer than necessary because I was mentally converting dollars to pounds), but it sounds like it might have some potential for research and reflection.

I wonder how well this might work in an American setting. Britain is known for being more class-conscious than the United States; most Americans would prefer to say they are middle class. So, what would happen if PBS or the New York Times or an equivalent news source ran such a survey? Would it be beneficial in that it could help show people where they really fall in society, rather than the middle-class status so many claim?

Sociologist: lower rates of poverty the result of “robust social policy”

A profile of sociologist David Brady summarizes his arguments about how a larger welfare state limits poverty:

Brady’s 2009 book Rich Democracies, Poor People: How Politics Explain Poverty, offers an analysis of social inequality that is counter to the prevailing notion that it is an inescapable outcome of individual failings – known as the “culture of poverty” – or the result of rising unemployment. It shows that among affluent western societies there are immense variations in poverty: from almost 20% of the population in the US at one end of a scale, followed by Canada, Australia, Spain and Italy, to less than 10% at the other end – where the Scandinavian countries sit – with the UK and Germany somewhere in between.

The reason for such stark differences lies not with the numbers of single mums or jobless people but with whether a country has made larger investments in the welfare state, argues Brady. For those countries that have spent proportionately more on pensions, healthcare, family assistance and unemployment compensation – what we in Britain call the welfare state and Brady refers to as “social policy” – poverty levels will be lower…

British attitude surveys have shown a marked decline in support for redistribution since the mid-1980s, and opinion polls suggest a majority of the British public believes that the government pays out too much in benefits and that welfare levels overall should be reduced…

He challenges poverty campaigns in the UK to address head-on politicians’ concerns around benefit dependency and the so-called something for nothing culture. “Spending on social policy is something for something,” he asserts. “[It is] a social investment in the next generation – on good schools and childcare – that manages against risk by preventing people from falling into poverty. And, above all, it is a citizen’s right.”

While this profile talks about how Brady’s work fits with current British politics and government cost-cutting, I imagine he would have some commentary about the current situation in the United States.

I would be interested to hear Brady discuss whether there are trade-offs to this kind of welfare state spending or whether it really is, on balance, a net good. For example, if a country spends all of that money fighting poverty, does it limit its ability to spend in other important areas?

This gets more complicated when Brady introduces the idea of rights. In America, we often have cost-benefit arguments about government spending – can we afford it, or is it worth the money spent? If we spend money in one direction, say promoting job creation, will we get money back on the other end, say in less paid out in unemployment benefits? The idea of rights shifts the discussion away from finances alone and suggests it is more about values than money.

Maybe I should just track down the book…

Sociologist: Canadians and Americans are more alike than people might think

A Canadian sociologist argues that Americans and Canadians are quite similar:

But experts suggest English Canadians — though the QMI Agency poll found we’re still divided whether stereotyping is widespread — are alike on most fronts.

In fact, so much so that most of us could blend in with our U.S. cousins, according to one scholar.

Ed Grabb, a professor in the University of British Columbia’s Department of Sociology, has begun a new course outlining how Canadians and Americans, while not identical, are more alike than most of us would have thought.

In fact, on things like attitudes toward health care, government and individuality, research has found we’re very similar.

Even differences in religion are shrinking. In 1991, Americans were 16% more likely than Canadians to take in a religious service at least once a week.

By 2006, that number had dropped to 11%.

While Grabb sees regional differences in both countries — during national elections, Quebec generally pulls Canada to the left just as the southern U.S. pulls that nation to the right — he’s also noticed a softening of old hackneyed chestnuts.

“I do think the Alberta redneck jibe is an endangered species,” Grabb said.

“I think that the assumption that all Ontarians are affluent is also going by the boards.”

It would be interesting to see comparisons across the board: income, political and social views (both at home and abroad), religion, education, and consumer purchases and entertainment choices. Then, compare these to what Americans and Canadians think about each other. Why do I think Canadians would know way more about Americans than the other way around?

I also want to know how to explain this. Both the United States and Canada are settler colonies, but we have different histories, as Canada has had a different relationship with Great Britain over the last few centuries. Perhaps people might fall back on the frontier hypothesis, since both countries pursued territorial expansion and span two geographically and culturally different coasts. Perhaps today we tend to share a lot of media and cultural influences. For example, how many Americans care, or could have told without being told, that Justin Bieber is Canadian? Perhaps our geopolitical position away from major international wars has led to similar ways of viewing the world. Perhaps the better way to differentiate between the countries is to refer to the “Jesusland” map, where Canada joins with the East and West American coasts plus some of the Great Lakes states, and red America is the South, Great Plains, and Mountain West.

Lord Giddens as “Blair guru”

I occasionally run across stories involving Anthony Giddens, well-known sociologist, speaking about political issues in Britain. Here is another example of the actions of the “Blair guru”:

Labour peer Lord Giddens, who brought the debate on 13 October entitled Universities: Impact of Government Policy, said ministers appeared to be pursuing policies of “ill-considered, untutored radicalism” that were not based in proper research and had “imponderable outcomes”.

The academic, who advised former prime minister Tony Blair and is professor of sociology at LSE, said the reforms would leave England as a “global outrider” with one of the lowest levels of public support for higher education in the industrialised world.

He said the “ideological thrust” of the Browne Review should have been rejected and instead tuition fees only gradually raised alongside the maintenance of direct public support for universities, due to their “massive” beneficial impact on society.

“Universities are not a sort of supermarket where education can be chosen like a washing powder off the shelf. Students are not simply consumers, making day-to-day purchasing decisions. They will make a one-off decision,” he said.

Reading these stories, it seems like Giddens has more political clout than most sociologists. Is this simply a function of having been close to Tony Blair, did Giddens do specific work/research that put him in contact with politicians, or does Britain simply have a different culture regarding public intellectuals and how sociologists can be involved in social and government life?

Laughter and fun decline precipitously during the life course

A study from the University of Glamorgan found that age 52 is when “both men and women begin to suffer a sharp decline in their sense of humour and get increasingly grumpy.”

Also, humor and laughter drop quite a bit between infancy and the teenage years, and then keep declining through adulthood:

The study found that while an infant can laugh aloud as many as 300 times every day, life rapidly becomes far less fun.

As Harry Enfield’s Kevin and Perry so deftly depicted, things soon change. While teenagers are the age group most likely to laugh at other people’s misfortunes, they laugh on average just six times a day.

Things get even bleaker in what should be the relatively carefree twenties, when we laugh four times a day.

This rises to five times a day throughout the thirties, when having children is cited as a major factor in restoring a sense of humour.

By the time we reach 50, Brits are laughing just three times a day, while the average 60-year-old manages a hearty guffaw just 2.5 times in the same period.

Just five or fewer laughs a day throughout adulthood? Assuming this is somewhat generalizable to Americans, it suggests that we need to laugh more.