LST Requests Class of 2011 NALP Reports

Following our success in collecting NALP reports from schools last year, we are now asking schools to make their class of 2011 reports available.

For the class of 2010, we managed to collect 50 NALP reports. These reports helped us expand our data clearinghouse so that we could become the place to go for the most thorough and easy-to-compare employment information. This year, our goal is to double the number of reports we collect.

Even with the recent improvements to law school transparency, won through immense pressure on the ABA, these reports contain helpful data that schools are not required to make public:

  • Salary Data (aggregated in categories)
  • Job Source (e.g., OCI, networking, direct mailings)
  • Job Offer Timing (before graduation, before bar results, after bar results)
  • Job Status (employed graduates who are still seeking or not seeking)
  • Job Region and Job States
  • Job Type Breakdowns by Employer Type (e.g., JD Advantage jobs in government)

Check out Seattle University School of Law’s Class of 2011 NALP report, which the school sent to us unprompted, to see what schools have to offer this year.

We hope that schools share our sense of urgency and help us put comparable employment information into the hands of consumers. Check out the full letter after the jump.

Data Clearinghouse Updates

LST has received a number of inquiries from schools since updating our employment data clearinghouse. In most instances, the schools did not understand the data they were publishing, whether on their websites or through U.S. News.

Of the inquiries we received, two complaints, which came from administrators at Santa Clara and Toledo, warranted updates and corrections. These schools informed us that we were using wrong or incomplete data. They were right, though the problems stemmed entirely from what the schools had supplied to U.S. News.

In Toledo’s case, a law school representative misinterpreted questions on the U.S. News survey and therefore supplied the magazine with incorrect data. In Santa Clara’s case, the school had policies and procedures in place that led to under-reporting what the school actually knew about its graduates. In both cases we were able to work with the schools to identify the source of the problems and have corrected the errors with data supplied by the schools.

(We’d also like to mention that prior to releasing the class of 2010 data clearinghouse, we contacted ten law schools that made the same mistake as Toledo with their U.S. News-supplied data. Four of these schools confirmed the discrepancies and have provided us with the correct data.)

We’ve added the following note to Santa Clara’s class of 2010 profile:

After joint review with Santa Clara, we have restored the school’s profile using data provided by Santa Clara following its internal review of each 2010 graduate’s student file. Rather than relying on student-supplied data, which is what the school reported to U.S. News and on its website (the original data in the school’s profile), Santa Clara added data the administration culled from conversations and basic investigation.

Note: One major change concerns the 28 jobs Santa Clara originally reported as non-professional. Santa Clara tells us “[t]his was done in error.” While these graduates were still employed, Santa Clara does not know what sort of credentials (e.g., bar passage required) those graduates’ jobs required. However, Santa Clara does know 12 of these 28 graduates’ employer types (e.g., law firm) and expected working hours (i.e., FT or PT).

We are happy to report that Toledo chose to resolve questions about graduate employment outcomes by disclosing its class of 2010 NALP report, joining 47 other ABA-approved law schools that have done so. Toledo’s class of 2010 NALP report now appears in our report database and can be viewed here.

Finally, we welcome law schools to continue contacting us with their concerns. These conversations are always valuable and almost always lead to improving law school transparency.

Progress Towards Law School Transparency

We have a few updates on our progress towards greater law school transparency. The first is a rundown of voluntary website improvements made by law schools in advance of the ABA standards reform. The second is an announcement that LST has obtained its 40th NALP report from an ABA-approved law school.

Transparency Index: Class of 2010

Back in January, we released the Transparency Index, an index of every ABA-approved law school website. It measures, through the lens of 19 criteria, how transparently each school’s website details post-graduation outcomes for the class of 2010.

We said:

Taken together, these and other findings illustrate how law schools have been slow to react to calls for disclosure, with some schools conjuring ways to repackage employment data to maintain their images. Our findings play into a larger dialogue about law schools and their continued secrecy against a backdrop of stories about admissions data fraud, class action lawsuits, and ever-rising education costs. These findings raise a red flag as to whether schools are capable of making needed changes to the current, unsustainable law school model without being compelled to through government oversight or other external forces.

There is good, bad, and disappointing news. The bad news is that secrecy is still the norm: schools are still opting to selectively withhold class of 2010 information to suit their own interests, to the possible detriment of prospective students. With this year’s admissions cycle well underway, and admitted students closing in on their final decisions, people could really use the critical information schools have at their fingertips. We can place some responsibility at the feet of those who knowingly choose to attend schools that withhold critical information, but their poor choices still stem from law schools’ bad acts. After all, many prospective students still do not understand the extent of the gaps in information, which law schools have become so adept at hiding.

The good news is that we have seen big improvements to a number of law school websites following the publication of our Winter 2012 Transparency Index Report. These figures likely understate the progress: we have only updated the Live Index as we have come across evidence of improvements or have been directed to updates (usually by the schools themselves). As more and more schools respond positively to criticism, it is also getting easier to identify the bad actors.

  • 22% of schools do not provide evaluable class of 2010 information, down from 27%.
  • 64% of schools indicate how many graduates actually responded to their survey, up from 49%. Response rates give applicants a way to gauge the usefulness of survey results, a sort of back-of-the-envelope margin of error (see the sketch after this list). Without the rate, schools can advertise employment rates north of 95% without explaining that the true employment rate is unknown, and likely lower.
  • 39% of law schools indicate how many graduates worked in legal jobs, up from 26%. 20% provide the full-time legal rate, up from 11%. We are aware of no additional schools providing the full-time, long-term employment rate. (It is still just two schools, or 1%.)
  • 28% of schools indicate how many graduates were employed in full-time vs. part-time jobs, up from 17%.
  • 16% indicate how many were employed in long-term vs. short-term jobs, up from 10%.
  • 16% of schools report how many graduates were employed in school-funded jobs, up from 10%.
  • 59% of schools provide at least some salary information, up from 49%. But most of those schools (63%) present the information in ways that mislead the reader, down from 78%.
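
To make the response-rate point concrete, here is a minimal sketch in Python, using hypothetical numbers rather than any particular school's figures, of the back-of-the-envelope bounds that a disclosed response rate makes possible:

```python
# Hypothetical numbers for illustration; not any particular school's figures.
reported_rate = 0.95   # employment rate among graduates who answered the survey
response_rate = 0.70   # share of the class that answered the survey

# Worst case: every non-respondent is unemployed.
lower_bound = reported_rate * response_rate
# Best case: every non-respondent is employed.
upper_bound = lower_bound + (1 - response_rate)

print(f"Confirmed employed: {lower_bound:.1%}")              # 66.5%
print(f"True rate: {lower_bound:.1%} to {upper_bound:.1%}")  # 66.5% to 96.5%
```

Without the response rate, a reader sees only the 95% headline; with it, a reader can see that as little as two-thirds of the class is confirmed to be employed.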

These are substantial changes. The schools that have made them (list available here) deserve some praise. However, it bears repeating that every school could meet all 19 of the criteria we used in the Transparency Index, so our praise comes with that caveat.

The disappointing news is that one of the most transparent schools from our original report has decided to be less transparent. The University of Houston Law Center previously met 13 criteria; it now meets only 7. (Original Disclosure; New Disclosure.)

In particular, Houston no longer shares the percentage of graduates in full-time versus part-time jobs, in school-funded jobs, or in full-time legal jobs. It also no longer indicates when graduates obtained their jobs (timing) or how they obtained them (source). The school’s salary information is now misleading as well, because it no longer indicates the response rate for each salary figure provided.

When asked why the school took such a huge step backwards, the dean of career services said that Houston was now just doing what other schools were doing. She also claimed the change was an overall improvement because the new page includes 2008 and 2009 employment data, although that is barely more than what is already available in our data clearinghouse (2008 & 2009).

For the unacquainted, Houston copied the University of Chicago’s presentation standard, and in doing so actually decreased its level of disclosure. We criticized Chicago’s standard back in January for this particular reason:

Last month, the University of Chicago Law School received widespread acclaim for its decision to provide improved consumer information about the class of 2010. We believe Chicago received acclaim because the job outcomes for Chicago’s 2010 graduates appear to be strong relative to other law schools’ outcomes. The positive responses confused the unexpected quality of outcomes, which for Chicago graduates remained strong despite the contraction in attorney hiring, with the actual quality of disclosure. Chicago coupled tabular data with language about the need for transparency, leading people to claim that Chicago has set the market. But if every law school disclosed employment data according to Chicago’s incomplete methodology, most would still continue to mislead prospective law students.

The Chicago methodology does not distinguish between full- and part-time jobs or between long- and short-term jobs, and it does not indicate whether positions are funded by the law school. Nor does it indicate the numerator for the salary figures. For Chicago, this is a relatively minor oversight because it collected salary data from 94% of employed graduates. But the further the response rate falls below 100%, the more important it is that the rate be disclosed for every category in which a school provides salary information. Few, if any, other schools have a response rate above 90%.

Predictably, Chicago’s puffery about its dedication to transparency has done more harm than good.
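
To make the numerator point concrete, here is a minimal sketch, again in Python and again with hypothetical counts rather than any school's actual figures, of how much of the employed class a salary figure actually describes:

```python
# Hypothetical counts for illustration only.
def salary_coverage(salary_reports: int, employed: int) -> float:
    """Share of employed graduates a salary figure actually describes."""
    return salary_reports / employed

# Near-complete reporting (a Chicago-like 94%): omitting the numerator
# is a relatively minor oversight.
print(f"{salary_coverage(94, 100):.0%} of employed graduates covered")  # 94%

# A more typical school: the advertised median describes a self-selected
# minority, likely skewed toward higher earners.
print(f"{salary_coverage(45, 100):.0%} of employed graduates covered")  # 45%
```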

LST Obtains Its 40th NALP Report

There remains reason to be optimistic, however. LST has obtained NALP reports from six more law schools since our original announcement.

This brings the total to 40 law schools. While such incremental improvements to transparency suggest a long road ahead, we do consider 40 a significant threshold. LST will continue advocating for law schools to share their class of 2010 reports and, when the time comes, will officially request the class of 2011 NALP reports too. Accepted students who do not want to wait until then can start contacting schools to request the 2011 data, now that the February 15 reporting deadline has passed. If you succeed in leveraging your acceptances to procure the data, please consider sharing it with us and directly with other applicants on popular discussion boards like TLS.