
Progress Towards Law School Transparency

By Kyle McEntee and Patrick J. Lynch
February 29, 2012

We have a few updates on our progress towards greater law school transparency. The first is a rundown of voluntary website improvements made by law schools in advance of the ABA standards reform. The second is to announce that LST has obtained its 40th NALP report from an ABA-approved law school.

Transparency Index: Class of 2010

Back in January, we released the Transparency Index, which evaluates every ABA-approved law school's website against 19 criteria measuring how transparently the school details post-graduation outcomes for the class of 2010.

We said:

Taken together, these and other findings illustrate how law schools have been slow to react to calls for disclosure, with some schools conjuring ways to repackage employment data to maintain their images. Our findings play into a larger dialogue about law schools and their continued secrecy against a backdrop of stories about admissions data fraud, class action lawsuits, and ever-rising education costs. These findings raise a red flag as to whether schools are capable of making needed changes to the current, unsustainable law school model without being compelled to through government oversight or other external forces.

There is good, bad, and disappointing news. The bad news is that secrecy is still the norm: schools continue to selectively withhold class of 2010 information to suit their own interests, to the possible detriment of prospective students. With this year's admissions cycle well underway and admitted students closing in on their final decisions, applicants could really use the critical information schools have at their fingertips. While we can place some responsibility at the feet of those who knowingly choose to attend schools that withhold critical information, their poor choices still stem from law schools' bad acts, especially because many prospective students still do not understand the extent of the information gaps that law schools have become so adept at hiding.

The good news is that we've seen big improvements to a number of law school websites following the publication of our Winter 2012 Transparency Index Report. Further, the figures below likely understate the progress: we've only updated the Live Index as we've come across evidence of improvements or have been directed to updates (usually by the schools themselves). As more schools respond positively to criticism, it is also getting easier to identify the bad actors.

  • 22% of schools do not provide evaluable class of 2010 information, down from 27%.
  • 64% of schools indicate how many graduates actually responded to their survey, up from 49%. Response rates give applicants a way to gauge the usefulness of survey results, a sort of back-of-the-envelope margin of error (see the sketch following this list). Without the rate, schools can advertise employment rates north of 95% without explaining that the true employment rate is unknown, and likely lower.
  • 39% of law schools indicate how many graduates worked in legal jobs, up from 26%. 20% provide the full-time legal rate, up from 11%. We are aware of no additional schools providing the full-time, long-term employment rate. (It is still just two schools, or 1%.)
  • 28% of schools indicate how many graduates were employed in full-time vs. part-time jobs, up from 17%.
  • 16% indicate how many were employed in long-term vs. short-term jobs, up from 10%.
  • 16% of schools report how many graduates were employed in school-funded jobs, up from 10%.
  • 59% of schools provide at least some salary information, up from 49%. But most of those schools (63%) still provide the information in ways that mislead the reader, down from 78%.
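
To make the response-rate point concrete, below is a minimal sketch, in Python, of that back-of-the-envelope arithmetic. The 95% advertised rate and 70% response rate are hypothetical figures for illustration, not drawn from any school's actual report.

    # A minimal illustrative sketch (not LST's methodology): given the employment
    # rate a school advertises among survey respondents and the share of graduates
    # who actually responded, the true employment rate can only be bounded.

    def true_employment_bounds(reported_rate, response_rate):
        """Return the (lowest, highest) possible true employment rates."""
        employed_respondents = reported_rate * response_rate
        non_respondents = 1 - response_rate
        low = employed_respondents                     # every non-respondent unemployed
        high = employed_respondents + non_respondents  # every non-respondent employed
        return low, high

    # Hypothetical example: a school advertises 95% employment, but only 70% of
    # graduates responded to the survey.
    low, high = true_employment_bounds(0.95, 0.70)
    print(f"True employment rate lies between {low:.1%} and {high:.1%}")
    # -> True employment rate lies between 66.5% and 96.5%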

These are substantial changes, and the schools that have made them (list available here) deserve some praise. It bears restating, however, that every school could meet all 19 criteria used in the Transparency Index, so our praise comes with that caveat.

The disappointing news is that one of the most transparent schools from our original report has decided to be less transparent. The University of Houston Law Center previously met thirteen criteria; it now meets only seven. (Original Disclosure; New Disclosure.)

In particular, Houston no longer shares the percentage of graduates in full-time and part-time jobs, in school-funded jobs, or in full-time legal jobs. It also no longer indicates when graduates obtained their jobs (timing) or how they obtained them (source). The school's salary information is now misleading as well, because it no longer indicates the response rate for each salary figure provided.

When asked why the school took such a large step backwards, the dean of career services said that Houston was now simply doing what other schools were doing. She also claimed the change was an overall improvement because the new page includes 2008 and 2009 employment data, although that is barely more than what is already available in our data clearinghouse (2008 & 2009).

For the unacquainted, Houston copied the University of Chicago's presentation standard, and in doing so actually decreased its level of disclosure. We criticized Chicago's standard back in January for this particular reason:

Last month, the University of Chicago Law School received widespread acclaim for its decision to provide improved consumer information about the class of 2010. We believe Chicago received acclaim because the job outcomes for Chicago's 2010 graduates appear to be strong relative to other law schools' outcomes. The positive responses confused the unexpected quality of outcomes, which for Chicago graduates remained strong despite the retraction in attorney hiring, with actual quality of disclosure. Chicago coupled tabular data with language about the need for transparency, leading people to claim that Chicago has set the market. But if every law school disclosed employment data according to Chicago's incomplete methodology, most would still continue to mislead prospective law students.

The Chicago methodology does not distinguish between full- and part-time jobs, between long- and short-term jobs, or whether positions are funded by the law school. Nor does it indicate the numerator for the salary figures. For Chicago, this is a relatively minor oversight because it collected salary data from 94% of employed graduates. But the further the response rate moves away from 100%, the more important it is that the rate be disclosed for every category that a school provides salary information for. Few, if any, other schools have a response rate above 90%.

Predictably, Chicago's puffery about its dedication to transparency has done more harm than good.

LST Obtains Its 40th NALP Report

There remains reason to be optimistic, however. LST has obtained NALP reports from six more law schools since our original announcement.

This brings the total to 40 law schools. While such incremental improvements to transparency suggest a long road ahead, we do consider 40 a significant threshold. LST will continue advocating for law schools to share their class of 2010 reports and, when the time comes, will officially request the class of 2011 NALP reports too. Accepted students who don't want to wait until then can start contacting schools to request the 2011 data, now that the February 15 reporting deadline has passed. If you are successful in leveraging your acceptances to procure the data, please consider sharing it with us and directly with other applicants on popular discussion boards like TLS.