Researchers adjust as Americans report being more religious when asked by phone than when responding online

Research findings suggest Americans answer questions about religiosity differently depending on the mode of the survey:


Researchers found the cause of the “noise” when they compared the cellphone results with the results of their online survey: social desirability bias. According to studies of polling methods, people answer questions differently when they’re speaking to another human. It turns out that sometimes people overstate their Bible reading if they suspect the people on the other end of the call will think more highly of them if they engaged the Scriptures more. Sometimes, they overstate it a lot…

Smith said that when Pew first launched the trend panel in 2014, there was no major difference between answers about religion online and over the telephone. But over time, he saw a growing split. Even when questions were worded exactly the same online and on the phone, Americans answered differently on the phone. When speaking to a human being, for example, they were much more likely to say they were religious. Online, more people were more comfortable saying they didn’t go to any kind of religious service or listing their religious affiliation as “none.”…

After re-weighting the online data set with better information about the American population from its National Public Opinion Reference Survey, Pew has decided to stop phone polling and rely completely on the online panels…

Pew’s analysis finds that, today, about 10 percent of Americans will say they go to church regularly if asked by a human but will say that they don’t if asked online. Social scientists and pollsters cannot say for sure whether that social desirability bias has increased, decreased, or stayed the same since Gallup first started asking religious questions 86 years ago.

This shift in how religion is studied highlights broader methodological considerations that are always helpful to keep in mind:

  1. Both methods and people/social conditions change. More and more surveying (and other data collection) is done via the Internet and other technologies. This can change who responds, how people respond, and more. At the same time, actual religiosity changes and social scientists try to keep up. This is a dynamic process, and researchers should expect their methods to keep evolving in pursuit of better data.
  2. Social desirability bias is not the same as people lying to or being dishonest with researchers; that would imply an intentionally false answer. It is more about context: the mode of the survey – phone or online – shapes who the respondent is responding to, and in a human interaction we might answer differently. In interactions, we act with impression management in mind, wanting to be viewed in particular ways by the person with whom we are interacting.
  3. Studying any aspect of religiosity benefits from multiple methods and multiple approaches to the same phenomena. A single measure of church attendance can tell us something, but multiple data points gathered with multiple methods provide a more complete picture. Surveys have particular strengths but are weaker in other areas. Survey results should be put alongside data from interviews, ethnographies, focus groups, historical analysis, and more to see what consensus can be reached. All of this may be out of reach for individual researchers or single projects, but the field as a whole can help find the broader patterns.

Claim of social desirability bias in immigration polls

Social desirability bias is the idea that people responding to surveys or other forms of data collection will give the socially acceptable answer rather than what they really think. A sociologist argues that this is the case for immigration polls:

A Gallup survey taken last year found 45 percent believe immigration should be decreased, compared to 17 percent saying it should be increased and 34 percent saying it should be kept at present levels. But should such figures be taken at face value? University of California, Berkeley, sociologist Alexander Janus argues not. Using a polling technique designed to uncover hidden bias, he concluded about 61 percent of Americans support a cutoff of immigration. Janus, who published his findings in the journal Social Science Quarterly, argues that “social desirability pressures” lead many on the left to lie about their true feelings on immigration — even when asked in an anonymous poll. In an interview, he discussed the survey he conducted in late 2005 and early 2006:

THE SURVEY: “The survey participants were first split into two similar groups. Individuals in one of the groups were presented with three concepts — ‘The federal government increasing assistance to the poor,’ ‘Professional athletes making millions of dollars per year,’ and ‘Large corporations polluting the environment’ — and asked how many of the three they opposed. Individuals in the second group were given the same three items as individuals in the first group, plus an immigration item: ‘Cutting off immigration to the United States.’ They were asked how many of the four they opposed. The difference in the average number of items named between the two groups can be attributed to opposition to the immigration item. The list experiment is superior to traditional questioning techniques in the sense that survey participants are never required to reveal to the interviewer their true attitudes or feelings.”…

“I estimated that about 6 in 10 college graduates and more than 6 in 10 liberals hide their opposition to immigration when asked directly, using traditional survey measures.”

This sounds like an interesting technique because, as he mentions, the respondents never have to say exactly which items they oppose.
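To see how the list experiment’s arithmetic works, here is a minimal Python sketch using simulated responses rather than Janus’s actual data; the group size, the baseline rate for the non-sensitive items, and the 61 percent “true” opposition rate are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical setup, for illustration only (not Janus's data):
n = 1000                # respondents per group
true_opposition = 0.61  # assumed share opposing the immigration item

# Control group: how many of the 3 non-sensitive items each respondent opposes.
control = rng.binomial(3, 0.4, size=n)

# Treatment group: the same 3 items plus the immigration item.
treatment = rng.binomial(3, 0.4, size=n) + rng.binomial(1, true_opposition, size=n)

# The difference in mean counts estimates opposition to the sensitive item,
# even though no respondent ever says which specific items they oppose.
estimate = treatment.mean() - control.mean()
se = np.sqrt(treatment.var(ddof=1) / n + control.var(ddof=1) / n)

print(f"Estimated opposition to the immigration item: {estimate:.1%}")
print(f"Rough 95% interval: {estimate - 1.96 * se:.1%} to {estimate + 1.96 * se:.1%}")
```

The appeal of the design is exactly what the excerpt describes: the estimate comes from comparing group averages, so no individual’s answer on the sensitive item is ever revealed.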

In the long run for immigration policy, does it matter much to liberals if people are secretly against immigration as long as they are willing to support it publicly? Of course, it could influence individual or small-group interactions and how willing people are to participate in rallies and public events. But if people are still willing to vote in a socially desirable way, is this good enough?

I wonder if there are other numbers out there that are influenced by social desirability bias…