97% response rate for American Community Survey

The Census Bureau regularly collects data through the American Community Survey, which has a high response rate:

“Since 2005, the American Community Survey has produced an annual overall survey response rate of around 97 percent,” says James Treat, chief of the American Community Survey Office. He compares filling out a survey to serving on a jury, paying taxes or getting a valid driver’s license.

The Census Bureau can do more than push patriotic buttons to persuade people. Under Title 13 of the U.S. Code, a person who willfully ignores the survey can be fined as much as $100. That fine could be as high as $500 if you lie — maybe claim to access the Internet through a “mobile broadband plan” because you don’t want to admit to having a “dial-up service.”

Treat says the Census Bureau has a thorough procedure to check for inconsistencies and inaccuracies and that people don’t need to worry about their private information being shared with immigration officials, cops, the IRS, employers or cable-service providers.

Given today's concerns about survey fatigue, this response rate is astounding. That is a good thing, since the data is used by all sorts of government agencies as well as researchers. And even though the ACS draws occasional attention from lawmakers who want to cut budgets, it doesn't raise the same kind of ire as the decennial census and its massive undertaking.

US unemployment figures distorted by nonresponse to repeated rounds of the survey

Two new studies suggest unemployment figures are pushed downward by the data collection process:

The first report, published by the National Bureau of Economic Research, found that the unemployment number released by the government suffers from a problem faced by other pollsters: lack of response. The problem dates back to a 1994 redesign of the survey, when it went from paper-based to computer-based, although neither the researchers nor anyone else has been able to explain why the redesign affected the numbers.

What the researchers found was that, for whatever reason, unemployed workers, who are surveyed multiple times, are most likely to respond the first time they are given the survey and tend to ignore it later on.

The report notes, “It is possible that unemployed respondents who have already been interviewed are more likely to change their responses to the labor force question, for example, if they want to minimize the length of the interview (now that they know the interview questions) or because they don’t want to admit that they are still unemployed.”

This ends up inaccurately weighting the later responses and skewing the unemployment rate downward. It also seems to have increased the number of people who once would have been designated as officially unemployed but today are labeled as out of the labor force, which means they are neither working nor looking for work.
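To make the mechanism concrete, here is a minimal simulation sketch of the general idea, not the CPS estimator itself: a panel where unemployed respondents become less likely to answer in later interview rounds, so an estimate that pools all completed interviews with equal weight understates the true unemployment rate. All the numbers, including the response probabilities, are invented for illustration.

```python
import random

# Hypothetical illustration (not the official methodology): simulate a
# rotation panel where unemployed respondents answer less often in later
# waves, then compare a naive pooled estimate against the true rate.

random.seed(42)

TRUE_UNEMPLOYMENT = 0.08                 # assumed true rate in the population
N_PEOPLE = 100_000                       # panel size
N_WAVES = 4                              # times each person is interviewed
RESPONSE_IF_EMPLOYED = 0.90              # assumed response probability
RESPONSE_IF_UNEMPLOYED = [0.90, 0.70, 0.60, 0.50]  # falls off in later waves

# For simplicity, give each person a fixed labor-force status.
people = [random.random() < TRUE_UNEMPLOYMENT for _ in range(N_PEOPLE)]

responses = []  # (wave, is_unemployed) for every completed interview
for wave in range(N_WAVES):
    for is_unemployed in people:
        p = RESPONSE_IF_UNEMPLOYED[wave] if is_unemployed else RESPONSE_IF_EMPLOYED
        if random.random() < p:
            responses.append((wave, is_unemployed))

# Naive estimate: pool all completed interviews with equal weight.
naive_rate = sum(u for _, u in responses) / len(responses)

# Estimate from first-wave interviews only, where response is still even.
first_wave = [u for w, u in responses if w == 0]
first_wave_rate = sum(first_wave) / len(first_wave)

print(f"true rate:         {TRUE_UNEMPLOYMENT:.3f}")
print(f"first-wave rate:   {first_wave_rate:.3f}")
print(f"naive pooled rate: {naive_rate:.3f}  (biased downward)")
```

Because the unemployed are underrepresented among later-wave respondents, the pooled figure comes in below the truth, which is the direction of bias the study describes.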

And the second study suggests some of this data could be collected via Twitter by looking for key phrases.
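As a rough sketch of that idea (not the study's actual method), the simplest version is keyword matching: count posts that contain job-loss phrases and track the count over time as a crude signal. The phrases and sample posts below are made up for illustration; a real pipeline would pull text from a tweet stream.

```python
from collections import Counter

# Hypothetical job-loss phrases to search for in post text.
JOB_LOSS_PHRASES = ["lost my job", "laid off", "got fired", "looking for work"]

def is_job_loss_post(text: str) -> bool:
    """Return True if the text contains any job-loss phrase."""
    lowered = text.lower()
    return any(phrase in lowered for phrase in JOB_LOSS_PHRASES)

# Stand-in data; in practice this would come from a live feed or archive.
sample_posts = [
    ("2014-03", "Just got laid off after six years. Rough day."),
    ("2014-03", "Coffee with friends this afternoon."),
    ("2014-04", "Lost my job last week, starting the search again."),
    ("2014-04", "New phone arrived, loving it so far."),
]

# Monthly counts of job-loss mentions form the crude index.
monthly_counts = Counter(month for month, text in sample_posts
                         if is_job_loss_post(text))

for month in sorted(monthly_counts):
    print(month, monthly_counts[month])
```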

All of this highlights the broader issue of survey fatigue, in which respondents become less likely to respond at all or to fill out a survey completely. That hampers important data collection efforts across a wide range of fields. Given how much unemployment figures matter to American politics and economic life, this is a data problem worth solving.

A side thought: instead of searching Twitter for key words, why not deliver survey instruments like this through Twitter or smartphones? The surveys would have to be relatively short, but they could have the advantage of seeming less time-consuming and could yield better data.