Political pollsters sitting out the holidays in Georgia

The Senate run-offs in Georgia are attracting a lot of attention but pollsters are largely not participating:


After a disastrous November election for the polling industry, when the polls again underestimated President Donald Trump (who lost regardless) as well as GOP candidates down the ballot, pollsters are mostly sidelined in the run-up to the Jan. 5 Georgia elections, which most observers regard as toss-ups.

The public polls that drove so much of the news coverage ahead of November — and generated tremendous distrust afterward — have all but disappeared in Georgia, and they are set to stay that way: Some of the most prolific, best-regarded media and academic pollsters told POLITICO they have no plans to conduct pre-election surveys in Georgia…

Part of the reason public pollsters are staying away from Georgia is the awkward timing of the races. With the elections being held on Jan. 5, the final two weeks of the race are coinciding with the Christmas and New Year’s holidays — typically a time when pollsters refrain from calling Americans on the phone. The voters who would answer a telephone poll or participate in an internet survey over the holidays might be meaningfully different from those who wouldn’t, which would skew the results.
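
To get a sense of how much this kind of differential non-response can matter, here is a minimal simulation sketch in Python. Every number in it is invented for illustration: a hypothetical electorate with 52% support for one candidate, whose supporters are half as likely to answer the phone over the holidays.

```python
import random

random.seed(42)

# Hypothetical illustration; all numbers are invented.
# 52% of the electorate backs Candidate A, but over the holidays
# A's supporters answer the phone at half the rate of everyone else.
N = 100_000
true_support = 0.52
p_respond = {True: 0.05, False: 0.10}  # response propensity by support

responses = []
for _ in range(N):
    supports_a = random.random() < true_support
    if random.random() < p_respond[supports_a]:
        responses.append(supports_a)

observed = sum(responses) / len(responses)
print(f"True support:  {true_support:.1%}")  # 52.0%
print(f"Poll estimate: {observed:.1%}")      # roughly 35%
```

A gap in response rates that large is extreme, but even a modest difference in who picks up the phone can shift the estimate well beyond any reported margin of error.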

Most major public pollsters are choosing not to field surveys over that time period, but the four campaigns don’t have a choice in the matter. The closing stretch of the races represents their final chances to shift resources or make changes to the television and digital advertising — decisions that will be made using multiple data streams, including polling.

Trying to reach members of the public via telephone or text or web is already hard enough. Response rates have been dropping for years. New devices have new norms. Figuring out who will actually vote is not easy.

Imagine trying to get a good sample during the holidays. On one hand, more people are likely not working and at home. On the other hand, this is a time for family, getting away from the daily grind, and relaxing. How many people will want to take time to talk about politics? Add in the post-election letdown and COVID-19 worries, and December 2020 could make polling an extra challenging task.

I know answering the door was not in vogue even before COVID-19, but I wonder how well a door-to-door polling strategy might work in Georgia. Such an approach would require more labor, but the races are limited to a single state, and with people likely to be at home, it could reach some respondents the phone cannot.

97% response rate for American Community Survey

The Census Bureau regularly collects data through the American Community Survey and it has a high response rate:

“Since 2005, the American Community Survey has produced an annual overall survey response rate of around 97 percent,” says James Treat, chief of the American Community Survey Office. He compares filling out a survey to serving on a jury, paying taxes or getting a valid driver’s license.

The Census Bureau can do more than push patriotic buttons to persuade people. Under Title 13 of the U.S. Code, a person who willfully ignores the survey can be fined as much as $100. That fine could be as high as $500 if you lie — maybe claim to access the Internet through a “mobile broadband plan” because you don’t want to admit to having a “dial-up service.”

Treat says the Census Bureau has a thorough procedure to check for inconsistencies and inaccuracies and that people don’t need to worry about their private information being shared with immigration officials, cops, the IRS, employers or cable-service providers.

Given today's concerns about survey fatigue, this response rate is astounding. That is a good thing, since the data are used by all sorts of government agencies as well as researchers. Even though the ACS draws occasional attention from lawmakers who want to cut budgets, it doesn't raise the same kind of ire as the decennial census and its massive undertaking.

Plans for an Internet-driven Census in 2020

The next decennial census may be conducted largely via the Internet:

People may be asked to fill out their census forms on the Internet instead of sending them through the mail. Census takers may use smartphones instead of paper to complete their counts…

Despite outreach and advertising campaigns, the share of occupied homes that returned a form was 74 percent in 2010, unchanged from 2000 and 1990. The majority of the money the bureau spends during a census goes to getting everyone else to fill out their forms, Census Director John H. Thompson said…

Americans are ready for an Internet-driven census, officials said. During 2014 tests in Washington, D.C., and nearby Montgomery County, Maryland, 55 percent of the families who were asked to fill out their census tests on the Internet responded without major prodding, an “exceptional response,” Thompson said. Census workers used iPhones to collect information in follow-up visits…

For government officials, going digital means they can do real-time analysis on areas to figure out which households have not responded, and be able to use their workers on the ground more efficiently, he said.

Three things I’d love to know:

1. Officials cite a high response rate, but how representative are the responses? In other words, who is likely to fill out the Census online? Internet users as a whole skew younger and wealthier (the digital divide), which might bias the online data.

2. How exactly are households matched to email addresses? Or do people go to a website and input their own address which is then matched with a government database?

3. Given the threats to digital security, is the Census Bureau prepared to defend the data (particularly by preventing information from being matched to particular addresses)?

US unemployment figures distorted by non-response and repeated administration of the survey

Two new studies suggest unemployment figures are pushed downward by the data collection process:

The first report, published by the National Bureau of Economic Research, found that the unemployment number released by the government suffers from a problem faced by other pollsters: lack of response. This problem dates back to a 1994 redesign of the survey, when it went from paper-based to computer-based, although neither the researchers nor anyone else has been able to offer a reason why the redesign has affected the numbers.

What the researchers found was that, for whatever reason, unemployed workers, who are surveyed multiple times, are most likely to respond the first time they are given the survey and to ignore it later on.

The report notes, “It is possible that unemployed respondents who have already been interviewed are more likely to change their responses to the labor force question, for example, if they want to minimize the length of the interview (now that they know the interview questions) or because they don’t want to admit that they are still unemployed.”

This ends up inaccurately weighting the later responses and skewing the unemployment rate downward. It also seems to have increased the number of people who once would have been designated as officially unemployed but today are labeled as out of the labor force, which means they are neither working nor looking for work.
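
A quick back-of-the-envelope sketch shows the mechanism. The retention rates below are invented for illustration; the point is only that when unemployed panelists stop answering at a higher rate than employed ones, the measured rate falls below the true rate.

```python
# Hypothetical illustration; the retention rates are invented.
labor_force = 10_000
unemployed = 600                      # true unemployment rate: 6.0%
employed = labor_force - unemployed

# In a later interview wave, suppose employed respondents keep
# answering at 80% while unemployed respondents answer at only 55%.
responding_employed = employed * 0.80        # 7,520
responding_unemployed = unemployed * 0.55    # 330

measured = responding_unemployed / (responding_employed + responding_unemployed)
print(f"True rate:     {unemployed / labor_force:.1%}")  # 6.0%
print(f"Measured rate: {measured:.1%}")                  # about 4.2%
```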

And the second study suggests some of this data could be collected via Twitter by looking for key phrases.

This generally highlights the issue of survey fatigue, where respondents are less likely to respond and to completely fill out a survey. That hampers important data collection efforts across a wide range of fields. Given the importance of the unemployment figures for American politics and economic life, this is a data problem worth solving.

A side thought: instead of searching Twitter for key words, why not deliver survey instruments like this through Twitter or smartphones? The surveys would have to be relatively short, but they would seem less time-consuming and might yield better data.

Pew Research: the response rate for a typical phone survey is now 9% and response rates are down across the board

Earlier this year, Pew Research described a growing problem for pollsters: over 90% of the public doesn't want to participate in telephone surveys.

It has become increasingly difficult to contact potential respondents and to persuade them to participate. The percentage of households in a sample that are successfully interviewed – the response rate – has fallen dramatically. At Pew Research, the response rate of a typical telephone survey was 36% in 1997 and is just 9% today.

The general decline in response rates is evident across nearly all types of surveys, in the United States and abroad. At the same time, greater effort and expense are required to achieve even the diminished response rates of today. These challenges have led many to question whether surveys are still providing accurate and unbiased information. Although response rates have decreased in landline surveys, the inclusion of cell phones – necessitated by the rapid rise of households with cell phones but no landline – has further contributed to the overall decline in response rates for telephone surveys.

A new study by the Pew Research Center for the People & the Press finds that, despite declining response rates, telephone surveys that include landlines and cell phones and are weighted to match the demographic composition of the population continue to provide accurate data on most political, social and economic measures. This comports with the consistent record of accuracy achieved by major polls when it comes to estimating election outcomes, among other things.
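
The weighting Pew describes is usually some form of post-stratification: each respondent's weight is the population share of their demographic cell divided by that cell's share of the sample. Here is a minimal sketch; the age cells, benchmarks, and sample counts are all invented for illustration.

```python
# Minimal post-stratification sketch; all numbers are invented.
# weight = population share of the cell / sample share of the cell,
# so the weighted sample matches the known demographic benchmarks.
population_share = {"18-29": 0.20, "30-49": 0.33, "50-64": 0.25, "65+": 0.22}
sample_counts    = {"18-29": 60,   "30-49": 240,  "50-64": 350,  "65+": 350}

n = sum(sample_counts.values())
weights = {cell: population_share[cell] / (count / n)
           for cell, count in sample_counts.items()}

for cell, w in weights.items():
    print(f"{cell}: weight = {w:.2f}")
# 18-29 respondents, scarce in a phone sample, get weights above 3;
# overrepresented older respondents get weights below 1.
```

In practice pollsters weight on several variables at once via raking (iterative proportional fitting), but the principle is the same.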

This is not to say that declining response rates are without consequence. One significant area of potential non-response bias identified in the study is that survey participants tend to be significantly more engaged in civic activity than those who do not participate, confirming what previous research has shown. People who volunteer are more likely to agree to take part in surveys than those who do not do these things. This has serious implications for a survey’s ability to accurately gauge behaviors related to volunteerism and civic activity. For example, telephone surveys may overestimate such behaviors as church attendance, contacting elected officials, or attending campaign events.

Read on for more comparisons between those who do tend to participate in telephone surveys and those who do not.

This has been a growing problem for years now: more people don't want to be contacted, and cell phone users are harder to reach. One way this might be combated is to offer participants small incentives, as is already done with some online panels and is more common in mail surveys. The incentives would need to be small enough not to sway opinion or attract only people who want the reward, yet large enough to raise response rates: just enough to acknowledge and thank people for their time. I don't know what the profit margins of firms like Gallup or Pew are, but I imagine they could offer these small incentives quite easily.

This does suggest that the science of weighting is increasingly important. Having government benchmarks really matters, hence the need for updated Census figures. However, it is not inconceivable that the Census could be scaled back: this is often a conservative proposal, based either on the money spent on the Census Bureau or on the “invasive” questions asked. It also may make the Census even more political, as years of polling might depend on getting the figures “right,” depending on which side of the political aisle one is on.