Pew Research: the response rate for a typical phone survey is now 9% and response rates are down across the board
Earlier this year, Pew Research described a growing problem for pollsters: over 90% of the public doesn’t want to participate in telephone surveys.
It has become increasingly difficult to contact potential respondents and to persuade them to participate. The percentage of households in a sample that are successfully interviewed – the response rate – has fallen dramatically. At Pew Research, the response rate of a typical telephone survey was 36% in 1997 and is just 9% today.
The general decline in response rates is evident across nearly all types of surveys, in the United States and abroad. At the same time, greater effort and expense are required to achieve even the diminished response rates of today. These challenges have led many to question whether surveys are still providing accurate and unbiased information. Although response rates have decreased in landline surveys, the inclusion of cell phones – necessitated by the rapid rise of households with cell phones but no landline – has further contributed to the overall decline in response rates for telephone surveys.
A new study by the Pew Research Center for the People & the Press finds that, despite declining response rates, telephone surveys that include landlines and cell phones and are weighted to match the demographic composition of the population continue to provide accurate data on most political, social and economic measures. This comports with the consistent record of accuracy achieved by major polls when it comes to estimating election outcomes, among other things.
This is not to say that declining response rates are without consequence. One significant area of potential non-response bias identified in the study is that survey participants tend to be significantly more engaged in civic activity than those who do not participate, confirming what previous research has shown. People who volunteer are more likely to agree to take part in surveys than those who do not. This has serious implications for a survey’s ability to accurately gauge behaviors related to volunteerism and civic activity. For example, telephone surveys may overestimate such behaviors as church attendance, contacting elected officials, or attending campaign events.
Read on for more comparisons between those who do tend to participate in telephone surveys and those who do not.
This has been a growing problem for years now: more people don’t want to be contacted, and cell phone users are harder to reach. One way this might be combated is to offer participants small incentives, something already done with some online panels and more commonly used in mail surveys. These incentives wouldn’t be large enough to sway opinion or to attract a sample of people who participate only for the reward, but they could be enough to raise response rates. They could be thought of as just enough to acknowledge and thank people for their time. I don’t know what the profit margins of firms like Gallup or Pew are, but I imagine they could offer these small incentives quite easily.
This does suggest that the science of weighting is increasingly important. Having government benchmarks is essential, hence the need for updated Census figures. However, it is not inconceivable that the Census could be scaled back: this is a proposal more often made by conservatives, based either on the money spent on the Census Bureau or on objections to its “invasive” questions. Relying on those benchmarks may also make the Census even more political, since years of polling would depend on getting the figures “right,” where “right” may look different depending on which side of the political aisle one is on.
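The weighting the Pew study relies on can be illustrated with a minimal sketch of post-stratification: each respondent is weighted by the ratio of their demographic group’s population share (e.g., from Census benchmarks) to its share of the sample. The respondent data and age-group shares below are hypothetical, not from Pew or the Census.

```python
from collections import Counter

# Hypothetical sample: age group of each respondent.
# Phone samples often skew older, as simulated here.
sample = ["18-34"] * 15 + ["35-64"] * 45 + ["65+"] * 40

# Hypothetical population benchmarks (e.g., Census-style shares).
benchmarks = {"18-34": 0.30, "35-64": 0.50, "65+": 0.20}

n = len(sample)
sample_share = {g: c / n for g, c in Counter(sample).items()}

# Weight for each group: population share divided by sample share.
# Under-represented groups get weights above 1, over-represented
# groups get weights below 1.
weights = {g: benchmarks[g] / sample_share[g] for g in benchmarks}

for g, w in sorted(weights.items()):
    print(f"{g}: weight = {w:.2f}")
```

Applying each group’s weight to its respondents makes the weighted sample match the population composition, which is why accurate, up-to-date benchmark figures matter so much when response rates are this low.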