Statistics can be used well, and they can be used not so well. Here is an example where the headline statistic suggests something different from the rest of the story:
Of an already small pool of millionaires and billionaires, 1,470 didn’t pay any federal income taxes in 2009, according to the Internal Revenue Service.
Just over 0.1% of taxpayers — or 8,274 out of 140 million total — made more than $10 million in 2009, according to the agency. More than 235,000 taxpayers earned $1 million or more, according to a recent report from the agency.
But of the high earners who avoided paying income taxes, many did so due to heavy charity donations or foreign investments.
About 46% of all American households won’t pay federal income tax in 2011, many due to low income, tax credits for child care and exemptions, according to the nonpartisan Tax Policy Center.
The headline makes it sound like there are a lot of millionaires who are avoiding paying taxes. The actual percentage hinted at in the story suggests something else: less than 0.63% of all millionaires (1,470/235,000, fewer than 1 in 100) paid no taxes. In the midst of a political debate about whether to raise taxes for the wealthy in America, each side could grab on to factual yet different figures: the 1,500 figure sounds high, as if the country is missing out on a lot of money, while the 0.63% figure suggests almost all pay some taxes. It wouldn’t take much to include both figures, the actual number and the percentage, in the story.
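The arithmetic behind the percentage is quick to check. A short Python sketch (the variable names are my own; the figures come from the story above):

```python
# Figures from the story: millionaires who paid no federal income tax in 2009,
# out of all taxpayers who earned $1 million or more.
nonpayers = 1_470
millionaires = 235_000

share = nonpayers / millionaires
print(f"{share:.2%} of millionaires paid no income tax")   # about 0.63%
print(f"roughly 1 in {round(millionaires / nonpayers)}")   # roughly 1 in 160
```

The same fact reads very differently as "1,470 millionaires paid no tax" and as "about 1 in 160 millionaires paid no tax", which is exactly the framing choice the story leaves to the reader.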
Examples like this contribute to the reaction some people have when they see statistics in the media: how can I trust any of them if writers just use the figures that suit them? All statistics become suspect, and it is then hard to get a handle on what is going on in the world.