Scientific misinformation flows through online echo chambers

New research examines how scientific misinformation is dispersed:

Research published this week in the journal Proceedings of the National Academy of Sciences maps out the factors that influence the spread of scientific misinformation and skepticism within online social networks — and the findings were disturbing.

“Our analysis shows that users mostly tend to select content according to a specific narrative and to ignore the rest,” Dr. Walter Quattrociocchi, a computer scientist at the IMT Institute for Advanced Studies in Italy and one of the study’s authors, told The Huffington Post in an email. Users are driven to content based on the brain’s natural confirmation bias — the tendency to seek information that reinforces pre-existing beliefs — which leads to the formation of “echo chambers,” he said…

For the study, the researchers conducted a quantitative analysis of articles shared on Facebook related to either conspiracy theories or fact-based science news. They found that users tended to cluster within homogeneous, polarized groups and, within those groups, to share the same types of content, perpetuating the circulation of similar ideas.

Is the problem the echo chambers themselves, or that people believe misinformation (when certain people want them to believe something else)? The way this Huffington Post article is written suggests that conservatives get stuck in these echo chambers – particularly on an issue like climate change – and never get a chance to engage with accurate information. The implication is that something needs to be done to break into, or out of, these echo chambers: once people are exposed to ideas beyond the cluster of people like them, they will find the truth. But it may not work exactly this way:

  1. What if people actually are exposed to a range of information and still believe certain things? Exposure to a range of ideas is no guarantee that people will believe the right things.
  2. How does echo chamber participation on the conservative side compare with echo chamber influence on the liberal side? The study found echo chambers on both sides – the conspiracy side and the science side. Humans gravitate toward people like themselves, a phenomenon called homophily that has been documented in numerous network studies (a toy sketch of this dynamic follows the list). Are we worried, in general, that people might be too influenced by echo chambers (rather than figuring things out for themselves), or are we more worried about whether people end up with the correct ideas? Depending on one’s perspective on a particular issue, echo chambers could be positive or negative influences.
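On the homophily point, here is a minimal toy sketch. It is my own illustration, not the method or data of the PNAS study: it only shows that when agents simply prefer ties to like-minded agents, polarized clusters emerge on both sides of a split, regardless of which side holds the accurate view.

```python
# Toy sketch (my own assumptions, not the PNAS study's method): homophily
# alone -- agents preferring ties to like-minded agents -- is enough to
# produce polarized clusters on *both* sides of a belief split.

import random

random.seed(1)

N_AGENTS = 200
BELIEFS = ["science", "conspiracy"]

# Each agent holds one belief and starts with random friendships.
beliefs = [random.choice(BELIEFS) for _ in range(N_AGENTS)]
friends = {i: set(random.sample(range(N_AGENTS), 10)) - {i} for i in range(N_AGENTS)}

def share_like_minded() -> float:
    """Average share of each agent's ties that hold the same belief."""
    return sum(
        sum(beliefs[f] == beliefs[i] for f in friends[i]) / len(friends[i])
        for i in range(N_AGENTS)
    ) / N_AGENTS

print(f"before rewiring: {share_like_minded():.0%} of ties are like-minded")

# Homophilous rewiring: repeatedly drop a cross-belief tie and replace it
# with a tie to a randomly chosen like-minded agent.
for _ in range(20_000):
    i = random.randrange(N_AGENTS)
    cross = [f for f in friends[i] if beliefs[f] != beliefs[i]]
    if cross:
        friends[i].discard(random.choice(cross))
        same = [j for j in range(N_AGENTS) if j != i and beliefs[j] == beliefs[i]]
        friends[i].add(random.choice(same))

print(f"after rewiring:  {share_like_minded():.0%} of ties are like-minded")
# Both the "science" and the "conspiracy" agents end up talking mostly to
# their own side -- echo chambers form on both sides of the split.
```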

Social scientists critique Facebook’s study claiming the news feed algorithm doesn’t lead to a filter bubble

Several social scientists have some concerns about Facebook’s recent findings that its news feed algorithm is less important than the choices of individual users in limiting what they see to what they already agree with:

But even that’s [sample size] not the biggest problem, Jurgenson and others say. The biggest issue is that the Facebook study pretends that individuals choosing to limit their exposure to different topics is a completely separate thing from the Facebook algorithm doing so. The study makes it seem like the two are disconnected and can be compared to each other on some kind of equal basis. But in reality, says Jurgenson, the latter exaggerates the former, because personal choices are what the algorithmic filtering is ultimately based on: “Individual users choosing news they agree with and Facebook’s algorithm providing what those individuals already agree with is not either-or but additive. That people seek that which they agree with is a pretty well-established social-psychological trend… what’s important is the finding that [the newsfeed] algorithm exacerbates and furthers this filter bubble.”

Sociologist and social-media expert Zeynep Tufekci points out in a post on Medium that trying to separate and compare these two things represents the worst “apples to oranges comparison I’ve seen recently,” since the two things that Facebook is pretending are unrelated have significant cumulative effects, and in fact are tied directly to each other. In other words, Facebook’s algorithmic filter magnifies the already human tendency to avoid news or opinions that we don’t agree with…

Christian Sandvig, an associate professor at the University of Michigan, calls the Facebook research the “not our fault” study, since it is clearly designed to absolve the social network of blame for people not being exposed to contrary news and opinion. In addition to the framing of the research — which tries to claim that being exposed to differing opinions isn’t necessarily a positive thing for society — the conclusion that user choice is the big problem just doesn’t ring true, says Sandvig (who has written a paper about the biased nature of Facebook’s algorithm).
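To make the “additive, not either-or” point concrete, here is a minimal toy sketch. The two-stage model and all of the numbers are my own assumptions; nothing here comes from the Facebook study or from Facebook’s actual ranking system. It only illustrates that when user self-selection and algorithmic ranking each screen out some share of cross-cutting stories, their effects multiply rather than substitute for one another.

```python
# Toy illustration (made-up numbers, not the Facebook study or Facebook's
# actual ranking system): user self-selection and algorithmic filtering
# can compound rather than substitute for each other.

import random

random.seed(0)

N_ITEMS = 10_000           # hypothetical stories in a user's potential feed
CROSS_CUTTING_SHARE = 0.5  # half the stories disagree with the user's views

# Assumed (made-up) probabilities of surviving each filtering stage;
# agreeable stories are assumed to survive both stages every time.
P_USER_CLICKS_CROSS = 0.3  # user chooses to read a cross-cutting story
P_ALGO_SHOWS_CROSS = 0.7   # algorithm ranks a cross-cutting story into view

def simulate(with_algorithm: bool) -> float:
    """Return the share of stories actually read that are cross-cutting."""
    read_cross = read_agree = 0
    for _ in range(N_ITEMS):
        cross = random.random() < CROSS_CUTTING_SHARE
        if cross:
            shown = (random.random() < P_ALGO_SHOWS_CROSS) if with_algorithm else True
            if shown and random.random() < P_USER_CLICKS_CROSS:
                read_cross += 1
        else:
            read_agree += 1
    return read_cross / (read_cross + read_agree)

print(f"user choice only:        {simulate(False):.1%} cross-cutting reading")
print(f"user choice + algorithm: {simulate(True):.1%} cross-cutting reading")
# The two filters multiply: roughly 0.3 versus 0.3 * 0.7 of cross-cutting
# stories survive, so the algorithm deepens, rather than merely duplicates,
# the bubble created by personal choice.
```

With these made-up numbers, user choice alone already skews reading toward agreeable stories, and layering the algorithmic filter on top skews it further – which is the sense in which the critics say the algorithm “exacerbates” the filter bubble.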

Research on political echo chambers has grown in recent years and has included examinations of blogs and TV news channels. Is Facebook “bad” if it follows the same pattern of reinforcing boundaries? It may not be surprising if it does, but I’m reminded of what I’ve read about Mark Zuckerberg’s intentions for Facebook: to bring people together in ways that wouldn’t happen otherwise. If Facebook’s own goal is to cross traditional boundaries – boundaries usually maintained by homophily (people choosing to associate with people largely like themselves) and by protecting the in-group against out-group interlopers – then does falling into the familiar pattern mean the company is not meeting its own goals? I recently took a user survey from Facebook that didn’t ask much about crossing boundaries; instead it asked about things like having fun, being satisfied with the Facebook experience, and whether I was happy with the number of my friends.