Credibility, statistics, and the legal profession

Elie Mystal at Above the Law has this take on a recent story involving credibility, statistics, and the legal profession:

This week, the law schools at Columbia, NYU, and Fordham have come under fire for their allegedly inflated employment statistics. A story in the New York Post specifically called out the top New York-area law schools for shady reporting of graduate outcomes when it comes to graduates employed by the schools….

I want to take a step back and look at what we’re really fighting about here: some of the best law schools in New York City have put out a statistic about how many graduates get jobs, and the New York Post and a bunch of other people immediately called “bulls**t.” Think about that. Even if the law schools can somehow convince people that, technically, their published information isn’t riddled with lies, we’re living in a world where such data can be assumed to be false absent a long and detailed explanation and discussion from the law schools. When somebody notices a discrepancy between a school’s numbers and what’s in the newspaper, we assume the school was full of crap, not that the newspaper got it wrong.

I suppose this isn’t very flattering to either newspapers or law schools.  Perhaps Americans now trust journalists more than lawyers (or at least legal educators)?

Commenting a few months ago on a scandal within academic sociology, Brian suggested several approaches to dealing with uncertain statistics:

This reminds me of Joel Best’s recommendations regarding dealing with statistics. One common option is to simply trust all statistics. Numbers look authoritative, often come from experts, and they can be overwhelming. Just accepting them can be easy. At the other pole is the common option of saying that all statistics are simply interpretation and are manipulated so we can’t trust any of them. No numbers are trustworthy. Neither approach is a good option, but they are relatively easy options. The better route to go when dealing with scientific studies is to have the basic skills necessary to understand whether they are good studies or not and how the process of science works [emphasis added].

Brian’s point is a good one. Unfortunately, it’s not possible to implement his “third way” here, because the root problem is a lack of raw information, not an inability to replicate experimental or study results. The question is not, in the theoretical abstract: how many law students will get jobs when the economy is in condition X? The question is rather: as a matter of historical fact, how many 2010 law school graduates (or 2011, or 2009, or whatever) actually had jobs by date Y?

Elie faults the American Bar Association, which oversees and accredits law schools, for the current disaster of unreliable data:

The ABA is supposed to represent lawyers and law schools to the public. It’s supposed to regulate them so that the public can trust that moral and ethical standards are being upheld and enforced. And on that scale, the ABA has been an unmitigated failure. It’s done a disservice to all law schools. Nobody can trust any law school because the ABA has failed to impose effective oversight over all of them.

That’s tragic. A society is supposed to be proud of its institutions of higher learning, but the ABA has robbed us of that pride in our nation’s law schools. We no longer get to feel like our justice system is populated by people trained to the highest ethical standards, because we can’t even trust our law schools to tell us the truth about how many people got hired.

If the numbers published by law schools under the ABA’s oversight are unreliable, and the schools themselves are the primary source of the underlying data, then it’s very difficult to derive these numbers through other means, especially in a form that allows for legitimate comparisons between schools and over time. There are workarounds, of course, like journalistic attempts to compile and/or verify employment statistics independently of the law schools. But those are obviously imperfect solutions, as The Economist recently noted in its analysis of the (surprisingly analogous) problem of Argentinian inflation statistics:

Since 2007 Argentina’s government has published inflation figures that almost nobody believes. These show prices as having risen by between 5% and 11% a year. Independent economists, provincial statistical offices and surveys of inflation expectations have all put the rate at more than double the official number (see article). The government has often granted unions pay rises of that order….

We [The Economist] hope that we can soon revert to an official consumer-price index for Argentina. That would require INDEC to be run by independent statisticians working unhindered. Until then, readers are better served by a credible unofficial figure than a bogus official one.

Unfortunately, for the foreseeable future, I think we’re going to need a lot more of those credible unofficial figures, both for Argentinian inflation and for law school placement statistics.