More evidence that Americans don’t like answering survey questions about income

While looking at data about the wage gap between men and women, two researchers discovered that respondents to the American Community Survey may not have been completely accurate in reporting their own incomes or the incomes of others in their households:

The authors, whose study will be published in the journal Social Science Research, identified these biases by examining data from the American Community Survey, which is also conducted by the Census Bureau. Respondents are interviewed multiple times, one year apart. When the researchers looked at how responses to these questions changed across the subsequent interviews (controlling for other factors), they found that people answered more generously for themselves than other people had for them.

About half of the data on this income question in the American Community Survey have long come from “proxy reporters” — people answering on behalf of others in their household. In the early ’80s, a majority of these proxy reporters were women. “They were simply around to answer the phone call,” Reynolds said, noting that women had not entered the work force full time back then to the extent that they have today.

On the whole, these female survey respondents likely under-reported the income of their husbands, and over-reported their own — creating the skewed impression that the gender gap in America was much smaller in the early ’80s than it really was…

Once Reynolds and Wenger had calculated the extent of these biases, they went back to the data we’ve long used to measure the wage gap and readjusted it. Over time, as more women have entered the labor force, men have also become more likely to answer these surveys for themselves. And that impacts the data, too. The existing analysis — based on what the authors call the “naïve approach” to this data — suggested that the wage gap in America between 1979 and 2009 closed by about 16 percent (or $1.19 per hour). Wenger and Reynolds put that number instead at 22 percent (or $1.76). And so we have been 50 percent off in this basic calculation.
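For what it’s worth, the “50 percent off” line checks out as a back-of-the-envelope calculation using the dollar figures quoted above (the arithmetic below is my own illustration, not from the study):

```python
# Figures quoted above: change in the wage gap, 1979-2009, in dollars per hour.
naive = 1.19      # the "naive approach" estimate
adjusted = 1.76   # Wenger and Reynolds' adjusted estimate

# How much larger is the adjusted change, relative to the naive one?
understatement = (adjusted - naive) / naive
print(f"The naive estimate is off by about {understatement:.0%}")  # ~48%, i.e. roughly 50 percent
```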

Interesting finding. As I tell my students, how you collect the data matters a lot for your conclusions. How willing will other researchers be to change their data and conclusions based on this “quirk” in the data? Had no other researcher ever thought about this before, or have others considered the issue and moved forward anyway?

Researchers need to be particularly careful in dealing with questions about income. If you ask about specific incomes, you are likely to get a lot of missing data because people are not comfortable answering. If you ask too broadly (say, by using very large categories), you may not be able to do much with the data. The researcher has to find some compromise that yields the most fine-grained data possible while ensuring that people are still willing and able to answer the question.
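To make that tradeoff concrete, here is a toy sketch (the bracket boundaries and responses are invented for illustration) of how exact-dollar answers and broad categories lose information in different ways:

```python
# Toy illustration of the fine-grained vs. coarse tradeoff in income questions.
# Bracket boundaries and example responses are hypothetical.

def to_bracket(income):
    """Collapse an exact income into a broad category."""
    if income is None:          # respondent declined to answer
        return "missing"
    if income < 25_000:
        return "under $25k"
    if income < 100_000:
        return "$25k-$100k"
    return "over $100k"

# Exact-dollar question: more detail, but more refusals (None).
exact_responses = [18_500, None, 42_000, None, 87_000, 135_000]

# Categorical question: fewer refusals in practice, but coarser information.
bracketed = [to_bracket(r) for r in exact_responses]
print(bracketed)
# ['under $25k', 'missing', '$25k-$100k', 'missing', '$25k-$100k', 'over $100k']
```

In the coarse version, a $42,000 earner and an $87,000 earner land in the same bucket; the exact-dollar version keeps that detail but at the cost of more refusals.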

Does this suggest that other surveys that ask a single person to report on their whole household may also be skewed?
