About six years ago I wrote, “In 2000, the Justice Policy Institute (JPI) found evidence that more black men are in prison than in college,” in my first “Breaking Barriers” (pdf) report. At the time, I did not question the veracity of this statement. The statement fit well among other stats that I used to establish the need for more solution-focused research on black male achievement…
Today there are approximately 600,000 more black men in college than in jail, and the best research evidence suggests that the line was never true to begin with. In this two-part entry in Show Me the Numbers, the Journal of Negro Education’s monthly series for The Root, I examine the dubious origins, widespread use and harmful effects of what is arguably the most frequently quoted statistic about black men in the United States…
In September 2012, in response to the Congressional Black Caucus Foundation’s screening of the film Hoodwinked, directed by Janks Morton, JPI issued a press release titled, “JPI Stands by Data in 2002 on Education and Incarceration.” However, if one examines the IPEDS data from 2001 to 2011, it is clear that many colleges and universities were not reporting enrollment data to IPEDS 10 years ago.
In 2011, 4,503 colleges and universities across the United States reported having at least one black male student. In 2001, only 2,734 colleges and universities reported having at least one black male student, with more than 1,000 not reporting any data at all. When perusing the IPEDS list of colleges with significant black male populations today but none reported in 2001, I noticed several historically black colleges and universities, including Bowie State University, and my own alma mater, Temple University. Ironically, I was enrolled at Temple as a doctoral candidate in 2001.
When I first saw this, I thought it might be an example of what sociologist Joel Best calls a “mutant statistic”: a statistic that may originally be grounded in fact but at some point undergoes a transformation and keeps getting repeated until it seems unchallengeable.
There might be some mutant statistic at work here, but it also appears to be an issue of methodology. As Toldson points out, this looks like a missing-data problem: the 2001 survey did not include data from over 1,000 colleges. When more colleges were counted in 2011, the findings changed. If it is a methodological issue, it should have been caught at the outset.
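The missing-data problem can be illustrated with a toy calculation (the numbers below are hypothetical, not actual IPEDS figures): when an institution fails to report, a naive total silently counts its students as zero, so a year with many non-reporters looks like a year with far fewer students.

```python
# Toy illustration of the missing-data problem (hypothetical numbers,
# NOT real IPEDS figures). Institutions that do not report are
# silently treated as enrolling zero students.

# Enrollment per institution; None means the college did not report.
reports_2001 = [500, 300, None, 450, None, 200]   # many non-reporters
reports_2011 = [500, 300, 350, 450, 400, 200]     # fuller reporting

def naive_total(reports):
    """Sum enrollments, counting non-reporters as zero."""
    return sum(r or 0 for r in reports)

def reporting_count(reports):
    """Number of institutions that actually submitted data."""
    return sum(r is not None for r in reports)

print(naive_total(reports_2001), reporting_count(reports_2001))  # 1450 4
print(naive_total(reports_2011), reporting_count(reports_2011))  # 2200 6
```

In this sketch the apparent 2001-to-2011 "growth" is partly an artifact of more complete reporting, which is exactly the hazard in comparing raw IPEDS totals across years with very different numbers of reporting institutions.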
As Best notes, it can take a long time for bad statistics to be corrected. It will be interesting to see how long this particular “fact” continues to be repeated.