Argument: The Myth of ‘I’m Bad at Math’

Two professors argue that being good at math is about hard work, not genetics:

We hear it all the time. And we’ve had enough. Because we believe that the idea of “math people” is the most self-destructive idea in America today. The truth is, you probably are a math person, and by thinking otherwise, you are possibly hamstringing your own career. Worse, you may be helping to perpetuate a pernicious myth that is harming underprivileged children—the myth of inborn genetic math ability…

Again and again, we have seen the following pattern repeat itself:

  1. Different kids with different levels of preparation come into a math class. Some of these kids have parents who have drilled them on math from a young age, while others never had that kind of parental input.
  2. On the first few tests, the well-prepared kids get perfect scores, while the unprepared kids get only what they could figure out by winging it—maybe 80 or 85%, a solid B.
  3. The unprepared kids, not realizing that the top scorers were well-prepared, assume that genetic ability was what determined the performance differences. Deciding that they “just aren’t math people,” they don’t try hard in future classes, and fall further behind.
  4. The well-prepared kids, not realizing that the B students were simply unprepared, assume that they are “math people,” and work hard in the future, cementing their advantage.

Thus, people’s belief that math ability can’t change becomes a self-fulfilling prophecy.

Interesting argument: if you believe you can’t do well at a subject, you probably won’t. The authors then go on to hint at broader social beliefs: Americans tend to believe in talent, while other countries tend to emphasize the value of hard work.

This lines up with what I was recently reading about athletes in The Sports Gene. The author reviews a lot of research suggesting that training and genetics both matter. But genetics may not matter in the way people typically think: more often, it matters less that people are “naturally gifted” and more that some people learn more quickly than others. So the 10,000 hours of practice needed to become an expert, an idea popularized by Malcolm Gladwell, is the average time it takes. Some people can get there much more quickly and some much more slowly due to their different rates of learning.

United States now #1 oil-producing country in the world

The United States is again #1 in oil production, passing Saudi Arabia:

The United States has overtaken Saudi Arabia to become the world’s biggest oil producer as the jump in output from shale plays has led to the second biggest oil boom in history, according to leading U.S. energy consultancy PIRA.

U.S. output, which includes natural gas liquids and biofuels, has swelled 3.2 million barrels per day (bpd) since 2009, the fastest expansion in production over a four-year period since a surge in Saudi Arabia’s output from 1970-1974, PIRA said in a release on Tuesday…

Last month, China surpassed the United States as the largest importer of crude, according to the U.S. government, as the rise of domestic output cuts the U.S. dependence on overseas oil.

“(The U.S.) growth rate is greater than the sum of the growth of the next nine fastest growing countries combined and has covered most of the world’s net demand growth over the past two years,” PIRA Energy Group wrote.

Three quick thoughts:

1. People don’t often think of the United States as having lots of oil, even though natural resources within the US have been important throughout its history. Given this news, does this change how US residents and others around the world view the US? Does it then change how the US views the Middle East and other oil-rich nations?

2. The article notes that this was the fastest production increase over a four-year period since the early 1970s. The average person may not be terribly aware of this, but those opposed to fracking should be able to use this info: this is quite a rapid change.

3. When will peak oil really arrive? One article recently suggested this oil boom is not the last; there is more untapped oil in the oceans. As that article suggests, this new supply may make it even more difficult to talk about the impact of oil on the environment.

Internet commenters can’t handle science because they argue by anecdote, think studies apply to 100% of cases

Popular Science announced this week that it is no longer allowing comments on its stories because “comments can be bad for science”:

But even a fractious minority wields enough power to skew a reader’s perception of a story, recent research suggests. In one study led by University of Wisconsin-Madison professor Dominique Brossard, 1,183 Americans read a fake blog post on nanotechnology and revealed in survey questions how they felt about the subject (are they wary of the benefits or supportive?). Then, through a randomly assigned condition, they read either epithet- and insult-laden comments (“If you don’t see the benefits of using nanotechnology in these kinds of products, you’re an idiot”) or civil comments. The results, as Brossard and coauthor Dietram A. Scheufele wrote in a New York Times op-ed:

Uncivil comments not only polarized readers, but they often changed a participant’s interpretation of the news story itself.
In the civil group, those who initially did or did not support the technology — whom we identified with preliminary survey questions — continued to feel the same way after reading the comments. Those exposed to rude comments, however, ended up with a much more polarized understanding of the risks connected with the technology.
Simply including an ad hominem attack in a reader comment was enough to make study participants think the downside of the reported technology was greater than they’d previously thought.

Another, similarly designed study found that just firmly worded (but not uncivil) disagreements between commenters impacted readers’ perception of science…

A politically motivated, decades-long war on expertise has eroded the popular consensus on a wide variety of scientifically validated topics. Everything, from evolution to the origins of climate change, is mistakenly up for grabs again. Scientific certainty is just another thing for two people to “debate” on television. And because comments sections tend to be a grotesque reflection of the media culture surrounding them, the cynical work of undermining bedrock scientific doctrine is now being done beneath our own stories, within a website devoted to championing science.

In addition to rude comments and ad hominem attacks leading to changed perceptions of scientific findings, here are two common misunderstandings of how science works that often show up in online comments (these are also common misconceptions offline):

1. Internet conversations are ripe for argument by anecdote. This happens all the time: a study is described and then the comments fill with people saying the study doesn’t apply to them or someone they know. A single counterexample usually says very little, and scientific studies are often designed to be as generalizable as possible. Think of jokes made about global warming: one blizzard or one cold season doesn’t necessarily invalidate a general upward trend in temperatures.

2. Argument by anecdote is related to a misconception about scientific studies: the findings often do not apply to 100% of cases. Scientific findings are probabilistic, meaning there is some room for error (this does not mean science doesn’t tell us anything – it means the real world is hard to measure and analyze – and scientists try to limit error as much as possible). Thus, scientists tend to talk in terms of relationships being more or less likely. This tends to get lost in news stories that suggest 100% causal relationships.
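A quick simulation can make both points concrete (a hypothetical teaching sketch, not anything from the Popular Science piece): generate fifty “annual temperatures” with a small built-in warming trend plus random noise, then count how many individual years are still colder than the year before them. The trend is real, yet anecdotal counterexamples are plentiful.

```python
# Sketch: a real upward trend can coexist with many anecdotal counterexamples.
# All numbers here are made up for illustration.
import random

random.seed(42)

years = 50
trend_per_year = 0.03  # steady underlying warming (degrees per year)
temperatures = [
    15.0 + trend_per_year * t + random.gauss(0, 0.5)  # trend plus noise
    for t in range(years)
]

# The general trend: compare the first decade's average to the last decade's.
early_avg = sum(temperatures[:10]) / 10
late_avg = sum(temperatures[-10:]) / 10

# The "anecdotes": individual years colder than the year before them.
colder_years = sum(
    1 for prev, cur in zip(temperatures, temperatures[1:]) if cur < prev
)

print(f"First-decade average: {early_avg:.2f}")
print(f"Last-decade average:  {late_avg:.2f}")
print(f"Years colder than the year before: {colder_years} of {years - 1}")
```

The last decade averages noticeably warmer than the first, yet nearly half the individual years are colder than the year before them – exactly the kind of single data point an argument by anecdote seizes on.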

In other words, in order to have online conversations about science, you have to have readers who know the basics of scientific studies. I’m not sure my two points above are necessarily taught before college but I know I cover these ideas in both Statistics and Research Methods courses.

“Why do we believe North America’s biggest cities are dangerous when they are, in fact, among the safest places in the world?”

Just how dangerous are North American cities?

Why do we believe North America’s biggest cities are dangerous when they are, in fact, among the safest places in the world? In large part, because it was once true: For most of the 20th century (and a good part of the 19th), our big cities really were dangerous. Murders, muggings, armed robberies and sexual assaults were big-city phenomena, and the way to escape physical danger was to move away. Today, the opposite is true.

If you really want to find murder city, you need to get out of North America. The most violent cities in the world are places that used to be small and peaceful, but have very recently become huge cities. And no wonder: The cities of the Southern and Eastern hemispheres are doing today what our cities did a century ago: Absorbing huge, formerly rural populations. In 50 years, Kinshasa has grown from 500,000 to 8 million people; Istanbul from 900,000 to 12 million…

“The twenty-first century,” it concludes, “is witness to a crisis of urban violence.” The two billion people becoming city-dwellers are facing the “urban dilemma” – they realize that moving to the city is an improvement in their lives by most known measures, but it does expose them to greater risk and danger. So while urbanization has cut world poverty in half and lifted billions out of starvation, the hives of crime and danger in the city are preventing the next step into prosperity: “This dark side of urbanization threatens to erase its potential to stimulate growth, productivity and economic dividends.”

By no means is this inevitable. Cities are not naturally more violent: Yes, Caracas and Cape Town have horrendous murder rates. On the other hand, very densely-populated cities such as Dhaka and Mumbai have rates below their national averages – they are actually safer places to live than the villages migrants are leaving behind. In poor countries, and here in the West, the really huge cities are often much safer than the small and medium-sized ones, where the real corruption and danger lie. In India, which has been galvanized by a rape crisis in the fast-urbanizing north, new research shows that rates of sexual assault and rape remain higher in rural areas. And we have learned from Brazil and South Africa that big, bold interventions can make dangerous cities safer.

It is helpful to keep a global perspective on this issue. What counts as violent is relative: Americans tend to compare their cities to other American big cities, perhaps within regions or to the biggest cities in the country.

Another reason our big cities are seen as violent: urban violence is a consistent media story, even as violent crime rates have dropped in many cities.

Unrest in Paris suburbs highlights changes in suburbs around the world

Tensions ran high last week in a Paris suburb as immigrants reacted to their economic and social conditions:

Weekend violence outside Paris triggered by France’s controversial veil ban has highlighted how tensions with the Muslim community are adding to an already-volatile mix of poverty and alienation in the country’s blighted suburbs.

The unrest in the Paris suburb of Trappes erupted after a man was arrested for allegedly attacking a police officer who stopped his wife over wearing a full-face veil in public.

Feelings of anti-Muslim discrimination, coupled with unemployment and tensions with police are creating an “explosive” mix in the suburbs, said Veronique Le Goaziou, a sociologist and expert on urban violence in France…

A few kilometres from the Chateau de Versailles, Trappes is a poor city of 30,000 surrounded by wealthy neighbours. In 2010, half the households lived on less than 13,400 euros ($17,600) a year and unemployment was at 15 percent.

“This is a terrifyingly common situation,” said sociologist Michel Kokoreff. “We are in an area that has problem after problem, where people have a profound feeling of abandonment.”

This is a reminder that the monolithic view of suburbs – that around the world they primarily consist of the middle and upper classes living in isolated communities – is simply not the case in many places. American suburbs are increasingly diverse (recent posts: more poor residents, more aging residents, more immigrants looking for opportunities) and suburbs outside many European cities have been poorer from the beginning (though the increasing religious diversity is of more recent decades). Altogether, there are plenty of suburban problems for American and French suburbs to address. The actions taken (or not taken) have the potential to set the course for individual suburbs, and for suburbs as a whole, for decades to come.

New TV shows with young adults feature unrealistically large city apartments

The dwellings of many young adults on television are quite large:

Plenty of things are unrealistic about television: No iconic moment in my life has ever been accompanied by Ellie Goulding’s “Anything Can Happen,” despite how much I wish it were. But the perpetual tiny-but-annoying quirk that most shows are guilty of is the unemployed twentysomething with a fabulous apartment. I’m onto you, Girls: No matter how much junk you throw around in Marnie and Hannah’s onetime-shared living space, it doesn’t hide the fact that they’ve got a ton of room. I live in New York City; I know you’re lying to me.

This isn’t anything new, of course. The go-to example is usually Carrie Bradshaw and her ridiculous Manhattan apartment with its gorgeous walk-in closet full of Manolos when her only source of explained income was a weekly newspaper column. But while everyone loves some good 1998 nostalgia (the Friends’ West Village apartments are another egregious example), the trend of the unbelievably large home isn’t fading away.

I’m not simply talking about gorgeous, jealousy-inspiring apartments; I totally get and buy into the fact that say, Dr. Lahiri from The Mindy Project would have an awe-worthy living space to bring all of her meet-cute boyfriends. What I can’t get behind is recent shows like dearly-departed Happy Endings (perpetually unemployed Max’s “gross loft” in Chicago is gorgeous) or 2 Broke Girls‘ Williamsburg, Brooklyn, apartment (They’re supposed to actually be broke, not heiresses!) where the characters ostensibly “have no money,” yet are somehow chilling around complaining about said fact in an abode that would retail for hundreds of thousands of dollars.

Sure, this is a minor issue. None of this is getting in the way of my enjoyment of all of these shows. But there is some point during each of these programs’ respective runs — often more than once — where I’ll laugh out loud at the sheer ridiculousness of it. It’s all I can do; I can’t change the channel: basically all shows with twenty-something characters are guilty of this. Weirdly enough, the most realistic living set-up on television right now might be the Big Brother house, with all 16 of its residents fighting in a Hunger Games of sorts for limited bed space.

Several quick thoughts:

1. If Big Brother is perceived to be more realistic, these other shows may have some problems.

2. Of the examples cited above, most of the urban apartments are in New York City, with one in Chicago (Happy Endings). Manhattan and some of its surrounding areas are among the most expensive in the country, so the housing situation as portrayed on TV is really unrealistic. At the same time, TV shows with young adults in cheaper markets like Atlanta, Houston, or Dallas could feature bigger apartments without losing all realism.

3. This is not a new phenomenon on TV. In The Overspent American, sociologist Juliet Schor talks about the expanding middle-class lifestyle on television in the later decades of the 20th century. As the years went by, middle-class people on TV had more and more material goods, larger houses, and had fewer concerns about work and money. Schor then argues that TV contributed to changing perceptions among Americans in what they needed to own to have “the good life.”

4. Shows on channels like HGTV don’t help. It seems like every show features a person looking for the most-updated features. Granted, their price range varies quite a bit but the homes tend to be on the larger side. This is simply unrealistic for many emerging adults.

5. There is potential here for some TV shows to work with more size-appropriate dwellings. How about a show about young people revolving around micro-apartments? How about bringing back the starter home on TV?

Mapping the boundaries of the Midwest

Mapping the Midwest has now become a crowdsourced project built around a simple question: “What’s the Midwest to you?”

That’s the question design and planning firm Sasaki Associates is asking visitors to its new exhibit, “Reinvention in the Urban Midwest,” which opens at the Boston Society of Architects (BSA) Space this week. The project includes an interactive survey that contains a timeless challenge: Draw the geographic boundaries of what counts as the U.S. Midwest…

Judging by the maps drawn by others and myself, it appears Montana, Wyoming, Colorado, Kentucky, Arkansas, and Oklahoma are the states of most contention. I personally felt I had no choice but to cut some of them in half. Perhaps the correct answer is still the textbook answer: the states of most intensified yellow (at least as identified by those who’ve lived in the Midwest the longest) make up the U.S. Census Bureau’s definition of the Midwest: Illinois, Indiana, Michigan, Ohio and Wisconsin to the east, plus Iowa, Kansas, Minnesota, Missouri, Nebraska, North Dakota and South Dakota to the west. (As a commenter pointed out, cartographer and historian Bill Rankin has also done a Midwest mapping project, in which he overlaid 100 different maps of the Midwest and made the confounding observation that there was “no area that was included on every single map.”)

So the geographic boundaries of what most Americans consider the Midwest aren’t exactly clear, but Sasaki has also included another set of maps that reveal a much less murky truth: the Midwest has urbanized in a vastly different way from the rest of the United States. The graphic below maps out the population densities found in urban areas from four U.S. regions in 2010 (a darker shade signifies a larger, denser population)…

The Midwest is characterized by small but strong urban centers that transition sharply to rural surroundings. This pattern has of course grown from the region’s historical focus on agricultural land use. Sasaki’s recent work in Iowa suggests a continued population decline in rural areas but growing population density in more urban areas. However, the growth of urban areas in the Midwest is not uniform. The firm has further identified that agricultural cities in the plains sub-region, such as Des Moines, Iowa, or Lincoln, Nebraska, are indeed growing due to factors like de-ruralization and in-migration to city centers, while traditionally heavy-industry cities in the forest sub-region, such as Milwaukee or St. Louis, are still losing population.

One takeaway: the Midwest is a fuzzily-defined entity that perhaps has more to do with perceptions and culture than with exact geography. It would help to also ask the people who drew the maps to type in words they associate with the Midwest. I like the contrasting maps between those who have spent more of their lives in the Midwest and those who have not: there are some clear differences.

The connection between farmland and cities is a good catch. Big cities like Chicago or Omaha were (and still are) intimately connected to agricultural commodities that needed to be distributed and sold through them. For example, if you look at early railroad construction in the Chicago region, much of it was linked to shipping products from the plains – everything from wheat (southwestern Wisconsin) to lead (Galena) – which were then distributed further east. Or look at the trading of commodities in places like Chicago and the creation of new kinds of markets. Even though there are big gaps between the Chicago area and the rest of Illinois – they operate as very different worlds – both would strongly consider themselves part of the same region, even if they can’t speak to the deeper ties that connect them.

Social inertia in time use between the 1960s and today

A sociologist who has examined recent time use surveys suggests not much has changed since the 1960s:

John Robinson, a sociology professor from the University of Maryland whose research has focused heavily on Americans’ time use, said the most striking aspect of the latest American Time Use Survey is how closely it resembles similar information from before the 2008 recession — and from as early as the 1960s when time-use surveys first came into being.

The annual Bureau of Labor Statistics publication documents how Americans spend their time. In 2012, employed people worked for about 7.7 hours each day, spent two hours on household chores and took between five and six hours on leisure activities, with close to three of those hours spent plopped in front of the television…

Although today’s Americans spend their time similarly to their counterparts in the decade of discontent, Mr. Robinson noted some important changes in the by-the-minute breakdown. Men and women spend much more equal amounts of time at work, on housework and on leisure activities than they did in the 1960s.

Time spent watching TV has inched upward with every passing year, and although Mr. Robinson expected Internet use to slowly eat into TV time, the Web has yet to take up a large chunk of Americans’ time. The latest survey found men and women both spend less than 30 minutes of leisure time per workday on the computer.

Regardless, both Internet and TV use fall into the same category of activity: sedentary behavior.

This sounds like a good example of persistent social patterns. Without any official guidelines or norms about how people should spend their time, people are living fairly similarly to how they did in the 1960s. If daily life hasn’t changed much, perhaps it is more important to ask people’s perceptions about their time use. Do they feel better today about how they spend their days compared to fifty years ago? These perceptions are shaped by a number of factors, including generational changes where the younger adults of the 1960s are now the older adults of today.

The easier target for analysis: did people in the past expect that the people of the future would spend their time watching TV? I doubt it. At the same time, it suggests television has some staying power as a form of entertainment and information.

NHTSA asks states to wait on driverless cars

The federal government is asking states to wait on approving self-driving cars for general use until more research can be conducted:

The National Highway Traffic Safety Administration unveiled new recommendations to states for self-driving cars, urging them to be used only for testing and to require safeguards to ensure they can be taken over by a driver in the case of a malfunction.

NHTSA also said it was embarking on a four-year research effort on self-driving or autonomous vehicles as it considers requiring features like automatic braking, in which the car takes action to prevent crashes.

“We believe there are a number of technological issues as well as human performance issues that must be addressed before self-driving vehicles can be made widely available,” NHTSA said in its 14-page automated driving policy statement. “Self-driving vehicle technology is not yet at the stage of sophistication or demonstrated safety capability that it should be authorized for use by members of the public for general driving purposes. Should a state nevertheless decide to permit such non-testing operation of self-driving vehicles, at a minimum, the state should require that a properly licensed driver (i.e., one licensed to drive self-driving vehicles) be seated in the driver’s seat and be available at all times in order to operate the vehicle in situations in which the automated technology is not able to safely control the vehicle.”

NHTSA says as self-driving cars improve, they will reconsider. NHTSA says self-driving cars being tested in California, Florida and Nevada by Google and Audi of America should have the capability of detecting that their automated vehicle technologies have malfunctioned “and informing the driver in a way that enables the driver to regain proper control of the vehicle.” The Michigan Legislature is also considering allowing self-driving car testing…

Safety on the roads is an important concern, but I’d be interested to see how much testing it might take for the government to approve self-driving cars. And even if the safety questions appear to work out fairly quickly, will it take more time to reassure the public that such cars are safe?

It would also be interesting to know how alert drivers are going to have to be while not driving. If the driver needs to be ready to retake control at any moment, how relaxing is not driving going to be?

New public relations push for public housing

Here is a new public relations initiative for public housing:

A new public relations initiative called ReThink is trying to change those attitudes. Funded by Housing Authority Insurance, Inc., which provides insurance to public and affordable housing projects, ReThink aims to educate Americans about the benefits of public housing not only for the people who live in it, but for society as a whole.

Perceptions of public housing, according to research funded as part of the ReThink project, are a jumble of preconceptions and contradictory attitudes. Sixty-three percent of those surveyed say they would support public housing in their communities, but 53 percent don’t want to live close to it. Sixty-one percent believe that public housing has some positive impact on its residents, but nearly a third of respondents (31 percent) don’t think public housing residents are hard-working members of society…

Advocates, she says, need to educate “Joe Six-Pack” on how public housing should be one of those priorities for the nation’s cities, because it encourages stability and community among America’s neediest residents.

To that end, on ReThink’s website, you’ll find first-person stories from public-housing residents whose lives have been transformed by the availability of public housing. The highly produced two- to three-minute spots cut against the popular image of public housing residents as unemployed, directionless, and without ambition.

See the ReThink website with the videos here.

Sounds interesting, but this is a tough sell for many Americans. It may be easier to convince people that public housing is needed for a small portion of American residents (currently less than 1% of the US population, according to ReThink), but it becomes much harder to suggest that more money should go toward it or that public housing developments should be located anywhere near middle- and upper-class residents. The stigma is hard to overcome, even with positive stories today as well as positive stories of the past, like those featured in The Pruitt-Igoe Myth, where past residents talked about what beautiful places the housing projects once were.

Also, ReThink doesn’t offer much on their website about what this public housing will look like. Are we talking mixed-income developments? Scattered-site housing? These details could go a long way toward the success or failure of a public relations push.

I am curious to see how people react to this…