As the years drifted by, it took more and more voters per cluster for us to get a single voter to agree to an interview. Between 1984 and 1989, when caller ID was rolled out, more voters began to ignore our calls. The advent of answering machines and then voicemail further reduced responses. Voters now screen their calls more aggressively, so cooperation with pollsters has declined steadily year by year. Whereas once I could extract one complete interview from five voters, it can now take calls to as many as 100 voters to complete a single interview, even more in some segments of the electorate…
I offer my own experience from Florida in the 2020 election to illustrate the problem. I conducted tracking polls in the weeks leading up to the presidential election. To complete 1,510 interviews over several weeks, we had to call 136,688 voters. In hard-to-interview Florida, only 1 in 90-odd voters would speak with our interviewers. Most calls to voters went unanswered or rolled over to answering machines or voicemail, never to be interviewed despite multiple attempts.
The final wave of polling, conducted Oct. 25-27 to complete 500 interviews, was the worst for cooperation. We could finish interviews with only four-tenths of one percent of our pool of potential respondents. As a result, this supposed “random sample survey” seemingly yielded, as did almost all Florida polls, lower support for President Trump than he earned on Election Day.
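The arithmetic behind these figures is worth making explicit. A quick sketch using the numbers reported above (the final-wave pool size is not given in the article; it is back-calculated here from the stated 0.4% completion rate, so treat it as an implied estimate):

```python
# Cooperation rates implied by the Florida tracking polls described above.
completed = 1_510           # interviews completed over several weeks
dialed = 136_688            # voters called to get those interviews
rate = completed / dialed   # overall cooperation rate

print(f"Overall cooperation: {rate:.2%} (about 1 in {dialed / completed:.0f} voters)")

# Final wave: 500 interviews at "four-tenths of one percent" cooperation.
# The pool size below is inferred, not reported.
final_completed = 500
final_rate = 0.004
implied_pool = final_completed / final_rate
print(f"Implied final-wave pool: roughly {implied_pool:,.0f} voters")
```

This matches the "1 in 90-odd" figure in the text, and suggests the final wave alone drew on a pool of over a hundred thousand potential respondents.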
After the election, I noted wide variations in completion rates across different categories of voters, but nearly all were still too low for any actual randomness to be assumed or implied.
This is a basic Research Methods class issue: if you cannot collect a good sample, you are going to have a hard time reflecting reality for the population.
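A toy simulation makes the Research Methods point concrete. The numbers here are hypothetical, not the author's data: suppose the electorate favors candidate A 52/48, but A's supporters answer the phone only half as often as B's. No amount of additional dialing fixes the resulting bias, because the nonresponse is not random:

```python
import random

random.seed(0)

# Hypothetical electorate: 52% support A, 48% support B.
# Hypothetical differential nonresponse: A's supporters respond at 0.5%,
# B's supporters at 1.0%. Both rates are illustrative assumptions.
respond_prob = {"A": 0.005, "B": 0.010}

N = 1_000_000  # voters dialed
responses = []
for _ in range(N):
    voter = "A" if random.random() < 0.52 else "B"
    if random.random() < respond_prob[voter]:
        responses.append(voter)

share_a = responses.count("A") / len(responses)
print(f"True support for A: 52.0%; polled support: {share_a:.1%}")
```

Even with a million calls, the poll converges on roughly 35% support for A (0.52 × 0.005 / (0.52 × 0.005 + 0.48 × 0.010) ≈ 0.351), badly understating the true 52%. Sample size cannot rescue a sample whose response mechanism is correlated with the outcome being measured.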
Here is the part I understand less. This is not a new issue. As noted above, response rates have been falling for decades. Part of it is new technology. Some of it involves new behavior, such as ignoring phone calls, and growing distrust of political polling. The sheer amount of polling and data collection that takes place now can also lead to survey fatigue.
But it is interesting that the techniques used to collect these data have remained roughly the same. Of course, polling has moved from landlines to cell phones, and perhaps even to texting or recruited online pools of potential voters. The technology has changed some, but the idea is the same: reach out to a broad set of people and hope that a representative enough sample responds.
Perhaps it is time for new techniques. The old ones have some advantages: they can reach a large number of people relatively quickly, and researchers and consultants are accustomed to them. And I do not have the answers for what might work better. Researchers embedded in different communities who could collect data over time? Finding public spaces frequented by diverse populations and approaching people there? Working more closely with bellwether or representative places or populations to track what is going on there?
Even with these low response rates, polling can still tell us something. It is not as bad as picking randomly or flipping a coin. Yet it has not been accurate enough in recent years. If researchers want to collect valid and reliable polling data in the future, new approaches may be in order.