NYU Plays ‘Hide the Ball’ With Employment Data

New York University School of Law has decided to continue withholding valuable employment data from prospective students. After sending NYU a third request to publish its NALP report, we finally received a reply from Dean Ricky Revesz:

Thank you for your note. I expect you are aware that, since the end of last year, we have added a substantial amount of employment data to the NYU Law website. If there is more information that would be suitable and helpful for us to provide, we are happy to consider doing that. For example, we are now looking into posting data of the type found in Table 12 of the NALP form (Source of Job by Employer Type), since that would likely be of interest to prospective and current students. Please let me know if there is additional information that you think would be helpful for us to publish.

Our letter was quite clear about what would be helpful: NYU needs to publish its NALP report. It was a short letter and referenced the NALP report five times. (Read it here.) There is no other way to read any of our three requests and it’s insulting that Dean Revesz would feign ignorance while referencing the very report we requested.

We responded to Dean Revesz, repeating that we believe the entire NALP report should be published and noting what information NYU has not released (along with the sections of the NALP report that contain this information):

  • Full-time and part-time employment by required/preferred credentials (“FT/PT Jobs”)
  • Classification of business jobs (“Business Jobs”)
  • Position held in a law firm (“Type of Law Firm Job”)
  • When the job was obtained (“Timing of Job Offer”)
  • Salaries by job type (“Employment Status Known” and “Employment Categories”)
  • Salaries by location (“Jobs Taken by Region” and “Location of Jobs”)
  • Salaries by firm size (“Size of Firm”)

Dean Revesz’s response to our last communication was disappointing:

As I indicated in my last note to you, we will continue to add data to the area of our website that contains employment statistics.

We must read this to say that NYU has no plans to publish the additional information we have requested, even though it is of value to both the legal profession and law school applicants. The data NYU indicated, 14 months after collection, that it was “looking into” publishing (the source of job by employer type) still has not been added to NYU’s website, despite more than another month having passed. The immense intellectual, technological, and financial resources of NYU have proven unequal to the task of posting an 8×7 table.

(To see the data NYU knowingly withholds, check out our reconstruction of its NALP report in our data clearinghouse. Fields shown in dark gray represent information NYU possesses but has not released.)

NYU to Professor Campos: Come at me, bro!

“Having trouble knowing what to believe? We have a proposal for Paul Campos: come audit our numbers. We’ll show you a list of all NLJ 250 firms to which we sent associates in 2010 and 2011. Pick a reasonably sized sample from that group, and compare them to firm-verifiable data. Then let us, and the world, know what you find.” (NYU’s Rebuttal)

Professor Paul Campos called into question NYU’s biglaw placement rate, citing a discrepancy between the numbers reported by NYU and by the National Law Journal. NYU’s response was clear:

Come at me, bro!

Something about NYU’s response (I’m a 2008 grad) just didn’t sit right with me. It wasn’t the acerbic tone the rebuttal took. It was that Law School Transparency had already requested that NYU be more open about its job placement rates. LST made that request twice, and today we have renewed it.

Like all but six law schools, NYU has in its possession, right now, a NALP report with detailed job placement statistics for the class of 2010. This report contains a wealth of information, including the size of the firms graduates went to work for; salary information for a multitude of categories; whether jobs are full-time or part-time, permanent or temporary; what states the jobs are in; whether a law firm job is as a lawyer, a clerk, or a paralegal; and whether graduates found their jobs through OCI, through a job posting, or by returning to a pre-law school employer. It paints a very detailed picture of what happened to the class of 2010, but it’s a picture that NYU has decided not to let anyone else see.

That’s what makes NYU’s response to Professor Campos so strange.

NYU professes its openness, its honesty, its transparency. With one hand it extends an offer to verify that things are as it says they are, but with the other hand it folds up the numbers and locks them away. “Come audit our books…. But not those books!”

Having trouble knowing what to believe? We have a proposal for New York University: disclose your numbers. You’ll publish the employment data collected by NALP for the class of 2010 (and for 2011 when you receive that report this summer). Then let us, and the world, know what you already know.

Read the letter after the jump

National Jurist grades transparency; another school provides LST with its NALP report

In this month’s National Jurist, the magazine’s editors used the data from our Live Transparency Index to grade how transparent each law school website is. While we do not encourage or approve of ranking or grading schools based on the number of criteria met, because some criteria are more important than others, we did provide National Jurist with a spreadsheet of the data to ease their workload. These are the same data that were available on the Live Index as of the end of February 2012; note that some schools have updated their websites since that time.

In other news, another school provided LST with its class of 2010 NALP report: St. Mary’s University. Over the next few weeks, we will be encouraging a letter-writing campaign to convince other schools to provide us with their class of 2010 NALP reports. In the meantime, we are busy inputting the data from all of the NALP reports we have received.

Progress Towards Law School Transparency

We have a few updates on our progress towards greater law school transparency. The first is a rundown of voluntary website improvements made by law schools in advance of the ABA standards reform. The second is to announce that LST has obtained its 40th NALP report from an ABA-approved law school.

Transparency Index: Class of 2010

Back in January, we released the Transparency Index, an index of every ABA-approved law school website. It measures how transparent law schools are on their websites in detailing post-graduation outcomes for the class of 2010 through the lens of 19 criteria.

We said:

Taken together, these and other findings illustrate how law schools have been slow to react to calls for disclosure, with some schools conjuring ways to repackage employment data to maintain their images. Our findings play into a larger dialogue about law schools and their continued secrecy against a backdrop of stories about admissions data fraud, class action lawsuits, and ever-rising education costs. These findings raise a red flag as to whether schools are capable of making needed changes to the current, unsustainable law school model without being compelled to through government oversight or other external forces.

There is good, bad, and disappointing news. The bad news is that secrecy is still the norm, with schools still opting to selectively withhold class of 2010 information to suit their own interests (and to the possible detriment of prospective students). With this year’s admissions cycle well underway, and admitted students closing in on their final decisions, people could really use the critical information schools have at their fingertips. While we can place some responsibility at the feet of those who will knowingly choose to attend schools that are withholding critical information, their poor choices still stem from law schools’ bad acts, especially when it is clear that many prospective students still do not understand the extent of the gaps in the information schools provide, gaps that law schools have become so adept at hiding.

The good news is that we’ve seen big improvements to a number of law school websites following the publication of our Winter 2012 Transparency Index Report. Further, these figures likely understate the improvements: we’ve only updated the Live Index as we’ve come across evidence of improvements or have been directed to updates (usually by the schools themselves). As more and more schools respond positively to criticism, it is also getting easier to identify who the bad actors are.

  • 22% of schools do not provide evaluable class of 2010 information, down from 27%.
  • 64% of schools indicate how many graduates actually responded to their survey, up from 49%. Response rates provide applicants with a way to gauge the usefulness of survey results, a sort of back-of-the-envelope margin of error. Without the rate, schools can advertise employment rates north of 95% without explaining that the true employment rate is unknown, and likely lower. (A short illustration of this arithmetic follows the list.)
  • 39% of law schools indicate how many graduates worked in legal jobs, up from 26%. 20% provide the full-time legal rate, up from 11%. We are aware of no additional schools providing the full-time, long-term employment rate. (It is still just two schools, or 1%.)
  • 28% of schools indicate how many graduates were employed in full-time vs. part-time jobs, up from 17%.
  • 16% indicate how many were employed in long-term vs. short-term jobs, up from 10%.
  • 16% of schools report how many graduates were employed in school-funded jobs, up from 10%.
  • 59% of schools provide at least some salary information, up from 49%. But most of those schools (63%) provide the information in ways that mislead the reader, down from 78%.
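
To make that back-of-the-envelope arithmetic concrete, here is a minimal sketch of how a response rate bounds a reported employment rate. This is purely an illustration with invented numbers, not LST’s methodology and not drawn from any school’s actual data:

```python
def employment_rate_bounds(reported_rate, response_rate):
    """Bound the employment rate across ALL graduates, given the rate
    among survey respondents and the survey response rate.
    Hypothetical illustration only; the numbers below are invented."""
    employed_respondents = reported_rate * response_rate     # employed graduates we actually know about
    worst_case = employed_respondents                        # assume every non-respondent is unemployed
    best_case = employed_respondents + (1 - response_rate)   # assume every non-respondent is employed
    return worst_case, best_case

# A school advertising "95% employed" on a 70% survey response rate:
low, high = employment_rate_bounds(0.95, 0.70)
print(f"True rate lies somewhere between {low:.1%} and {high:.1%}")  # 66.5% and 96.5%
```

In other words, without the response rate a prospective student cannot tell whether “95% employed” describes nearly the whole class or closer to two-thirds of it.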

These are substantial changes. The schools that have made them (list available here) deserve some praise. However, it bears restating that every school could meet all 19 of the criteria we used in the Transparency Index, so our praise comes with that caveat.

The disappointing news is that one of the most transparent schools from our original report has decided to be less transparent. The University of Houston Law Center previously met thirteen criteria; it now meets only seven. (Original Disclosure; New Disclosure.)

In particular, Houston no longer shares the percentage of graduates in full-time and part-time jobs, in school-funded jobs, or in full-time legal jobs. It also no longer indicates when graduates obtained their jobs (timing) or how they obtained them (source). The school’s salary information is now misleading as well, because it no longer indicates the response rate for each salary figure provided.

When asked why the school took such a huge step backwards, the dean of career services said that Houston was now just doing what other schools were doing. She also claimed the change was an improvement overall because the new presentation includes 2008 and 2009 employment data as well, although that is barely more than what’s already available in our data clearinghouse (2008 & 2009).

For the unacquainted, Houston copied the University of Chicago’s presentation standard, and in doing so actually decreased its level of disclosure. We criticized Chicago’s standard back in January for this particular reason:

Last month, the University of Chicago Law School received widespread acclaim for its decision to provide improved consumer information about the class of 2010. We believe Chicago received acclaim because the job outcomes for Chicago’s 2010 graduates appear to be strong relative to other law schools’ outcomes. The positive responses confused the unexpected quality of outcomes, which for Chicago graduates remained strong despite the contraction in attorney hiring, with actual quality of disclosure. Chicago coupled tabular data with language about the need for transparency, leading people to claim that Chicago has set the market. But if every law school disclosed employment data according to Chicago’s incomplete methodology, most would still continue to mislead prospective law students.

The Chicago methodology does not distinguish between full- and part-time jobs, between long- and short-term jobs, or whether positions are funded by the law school. Nor does it indicate how many reported salaries underlie each salary figure. For Chicago, this is a relatively minor oversight because it collected salary data from 94% of employed graduates. But the further the response rate moves away from 100%, the more important it is that the rate be disclosed for every category in which a school provides salary information. Few, if any, other schools have a response rate above 90%.

Predictably, Chicago’s puffery about its dedication to transparency has done more harm than good.

LST Obtains Its 40th NALP Report

There remains reason to be optimistic, however. LST has obtained NALP reports from six more law schools since our original announcement.

This brings the total to 40 law schools. While such incremental improvements to transparency suggest a long road ahead, we do consider 40 a significant threshold. LST will continue advocating for law schools to share their class of 2010 reports and, when the time comes, officially request the class of 2011 NALP reports too. Accepted students who don’t want to wait until then can start contacting schools to request the 2011 data, now that we have passed the February 15 reporting deadline. If you are successful in leveraging your acceptances to procure the data, please consider sharing it with us and directly with other applicants on popular discussion boards like TLS.

LST Obtains 34 Class of 2010 NALP Reports

Update: We were alerted that Indiana–Bloomington had made its NALP report available online prior to this story’s publication. We’ve made this correction throughout this story.

Update 2: We apologize to the University of Utah S.J. Quinney College of Law. On February 10th, the school provided LST its NALP report, but we did not include it in this story.

Update 3: To see more schools that have since complied, see here.

On December 14, 2011, we wrote all ABA-approved law school deans to request the class of 2010 NALP report that each school received in June 2011. We are pleased to announce that we have obtained 34 of these reports (17.3% of ABA-approved law schools and 17.8% of schools with NALP reports). 30 of the 34 law schools sent a report to LST; the other four were pulled from school websites.

We asked for these reports to help prospective law students find the law schools that best meet their career objectives. Together, these reports provide prospectives access to timely, thorough, and comparable employment information. They make LST’s website an even more valuable source of information for prospective law students. We hope to soon add the employment data contained in the NALP reports to our data clearinghouse. In the meantime, these reports are available here. In addition, the list below links to each school’s NALP report.

Schools Providing the Class of 2010 NALP Report to LST (30):

Non-Responding Schools with accessible NALP Reports (4):

  • Akron
  • Northern Illinois University
  • Thomas Jefferson
  • Indiana–Bloomington

We want to make a special note that six law schools do not submit employment data to NALP and therefore do not have these forms: Notre Dame, Pepperdine, St. Louis, and the three law schools in Puerto Rico. This does not mean that these schools do not have ample employment data, however. Rather, these are the only schools that do not have a prepared form in front of them that can quite easily be disclosed to prospective law students.

Interestingly, the vast majority of schools providing LST with the 2010 NALP reports were public institutions. This may be because these schools interpreted our request as an open records request. Two schools (Wisconsin and UNC) explicitly treated our request in this way.

We received a handful of responses expressly declining to provide LST with the NALP report. Consistent with past communications with law schools, a number of schools indicated that they had meaningful employment information already available on their websites. In almost every case, this was (and remains) false.

In addition, there were two “no” responses that stood out. The full text of the responses is below.

First, Ave Maria argued that its NALP report does not provide meaningful information. Dean Milhizer claimed, “Our school is small with a unique mission, and our employment outcomes are reflective of this.” He suggested that additional information would be needed to assist prospective students. He did not, however, suggest what kind of information would be useful. The school’s website contains little information on what graduates found for work, with much of that information serving to mislead applicants.

Second, Chapman University argued that its report was confidential. Dean Campbell’s response is very misleading. It is true that NALP promises that it will not share the graduate-level data or the reports generated from those data with anyone except the law school. But schools are under no such obligation, either contractually or (for the NALP reports) legally. If Chapman were required to keep the data confidential, it could not provide these employment statistics on its website. Instead, the NALP reports remain confidential only because law schools decline to share them with those who would find the information most useful. Fortunately for prospective students, at least 33 schools disagree that the NALP reports are confidential.

Overall, as the Live Transparency Index shows, schools are increasingly sharing more employment information. But the vast majority of law schools still leave critical gaps in their presentation of employment information, gaps which the NALP reports would fill. These 34 law schools have demonstrated leadership that is sorely lacking at other law schools. While these schools still need to choose how to share employment information on their websites, they understand the importance of providing free access to comparable information now. Our hope is that these schools pave the way for changes at other schools, many of which are still acting as if their applicants do not deserve access to comparable consumer information.

Ave Maria School of Law:

Dear Mr. McEntee and Mr. Lynch:

I applaud your efforts to assist prospective law students in obtaining reliable information about the law schools of interest to them. However, Ave Maria School of Law does not believe that its Class of 2010 Summary Report from NALP will provide meaningful information about our school to prospective students. Our school is small with a unique mission, and our employment outcomes are reflective of this. In our judgment, the Summary Report does not provide sufficient information about the types of positions obtained by our 2010 graduates, and so to release the report in a vacuum without additional information would not be of assistance to prospective students.

This year, as in years past, AMSL will comply with NALP’s guidelines on reporting employment outcomes for the Class of 2011, and we will be participating in the ABA’s new annual survey of these outcomes.

Sincerely,

Eugene R. Milhizer
President and Dean

Chapman University:

Dear Law School Transparency,

You have requested that our School of Law send you the otherwise confidential report that we supply to NALP. We have years of experience with NALP, and on that basis, we know that the information we send will be appropriately treated, consistent with good ethics and all applicable federal and state laws. Regrettably, we do not have that kind of basis with your organization. Accordingly, we think it is most appropriate to continue to keep our submission to NALP confidential.

Nevertheless, our School of Law maintains a very informative website, and we post a great deal of data on our graduates’ employment there. That information is available to the public, and we invite you to consult this source if you would like.

Sincerely,

Tom Campbell
Dean

Breaking: 12 more law schools facing class actions

The Law Offices of David Anziska, together with Strauss Law PLLC and six other law firms, publicly announced moments ago that they have filed complaints against 12 more law schools. To date, 15 of the country’s 197 ABA-approved law schools are facing class action suits. (Thomas Jefferson, New York Law School, and Thomas Cooley have already been sued, with the first lawsuit already in discovery.)

These lawsuits should be of grave concern to the ABA, both as the only federally recognized accrediting body and as the legal profession’s largest and most powerful trade organization. Nearly 8% of its member schools have been formally accused of fraud by 74 former students. While positive results for the plaintiffs would further confirm what LST has drawn attention to over the past two years, the underlying problem of poor ABA governance will remain unchanged by the results. Recent efforts to reform the accreditation standards are a start, but the ABA has yet to show that it will take any significant corrective action against schools. While these lawsuits will attempt to hold schools accountable for past misleading actions, it will be up to the ABA to ensure its member schools do not continue the fraud that is widespread throughout American legal education.

The new batch includes 11 schools from Anziska and Strauss’s October 2011 announcement. The twelfth is Golden Gate University School of Law, as Above the Law announced late last year.

All 12 Schools:

  • Albany Law School
  • Brooklyn Law School
  • California Western School of Law
  • Chicago-Kent College of Law
  • DePaul University College of Law
  • Florida Coastal School of Law
  • Golden Gate University School of Law
  • Hofstra Law School
  • John Marshall School of Law (Chicago)
  • Southwestern Law School
  • University of San Francisco School of Law
  • Widener University School of Law

As momentum for holding law schools accountable grows and people start to realize the courts are their only remedy, LST expects more class actions will be filed this year. These allegations concern a long history of consumer-disoriented behavior, which unfortunately continues today at a great number of schools. LST’s Winter 2012 Transparency Index shows just how poorly the newly sued schools are doing when it comes to being honest about what their graduates found for work. Just one of the twelve schools currently discloses the number of graduates who found full-time, permanent jobs for which bar passage was required.

Transparency Index Performance of Newly-Sued Schools

  • Albany Law School (NY): Does not indicate # in FT/PT jobs or LT/ST jobs. Provides Legal Employment Rate.
  • Brooklyn Law School (NY): Does not indicate # in school-funded jobs, FT/PT jobs, or LT/ST jobs. Provides misleading salary figures.
  • California Western School of Law (CA): Struggled with its graduate survey response rate more than most schools. Does not indicate # in school-funded jobs, FT/PT jobs, or LT/ST jobs. Provides misleading salary figures.
  • Chicago-Kent College of Law (IL): Does not indicate # in school-funded jobs, FT/PT jobs, or LT/ST jobs. Provides misleading salary figures.
  • DePaul University College of Law (IL): Does not indicate graduate survey response rate. Does not indicate # in school-funded jobs, FT/PT jobs, or LT/ST jobs. Provides misleading salary figures.
  • Florida Coastal School of Law (FL): Struggled with its graduate survey response rate more than most schools. Does not indicate # in school-funded jobs, FT/PT jobs, or LT/ST jobs. However, it does provide the Legal Employment Rate. Provides misleading salary figures.
  • Golden Gate University School of Law (CA): Struggled with its graduate survey response rate more than most schools. Does not indicate # in school-funded jobs or LT/ST jobs. However, it does provide the FT Legal Employment Rate.
  • Hofstra Law School (NY): Does not indicate # in school-funded jobs, FT/PT jobs, or LT/ST jobs. Provides misleading salary figures and employer list.
  • John Marshall School of Law (Chicago) (IL): Does not indicate # in school-funded jobs or LT/ST jobs. Provides the FT Legal Employment Rate. Provides many misleading salary figures.
  • Southwestern Law School (CA): One of the best-performing schools, with 12 criteria met. One of two schools that currently provide the Full-time, Long-term Legal Employment Rate. Does not indicate # in school-funded jobs.
  • University of San Francisco School of Law (CA): Does not provide employment statistics on its website.
  • Widener University School of Law (DE/PA): Struggled with its graduate survey response rate more than most schools. Does not indicate # in school-funded jobs, FT/PT jobs, or LT/ST jobs. However, it does provide the FT Legal Employment Rate.

View the press release after the jump »»

Winter 2012 Transparency Index Report

Today, we’re releasing a new feature on our website. The Transparency Index is an index of every ABA-approved law school website. It measures how transparent law schools are on their websites about their post-graduation outcomes for the class of 2010. From January 1, 2012 to January 3, 2012, the LST team analyzed and documented every site using 19 criteria chosen after contemplating what matters to a prospective law student looking to invest three years and a lot of money in a professional degree. The results from this period are LST’s Winter 2012 Transparency Index.

The Transparency Index is not a ranking system. It would not be very meaningful to rank a school by the number of criteria met because different criteria vary in importance. In other words, just because one school meets more criteria than another school does not mean that the first school is more transparent than the second.

It is also important to note that law school websites are fluid and that schools may respond to external stimuli, including LST’s official request for school NALP reports, by improving their web disclosure policies. In fact, some schools may have improved public employment information shortly after our data collection dates.

Over the next few weeks, we will make the Transparency Index more user friendly and update school information when we learn of the updates. Meanwhile, we encourage law schools to learn from the index, to update their websites with the TIQ Criteria in mind, and to alert us when they do so.

Full report is available here.
Winter 2012 Data is available here.
Live Transparency Index is here.

Executive Summary

As a new year unfolds and the debate about legal education reform continues, efforts in furtherance of law school transparency remain critical. While transparency of law schools’ post-graduation employment data will not solve all of legal education’s problems, it can put pressure on the current law school model and thereby act as a catalyst for broader legal education reform. This is true whether it occurs through the process of seeking transparency or because of the information that such disclosure ultimately reveals.

Having had their long-standing practice of withholding basic consumer information called into question, law schools have responded with new attempts at disclosure in advance of the ABA’s new requirements. Adequate disclosure should be easy to achieve; law schools have possessed ample information, in an easily publishable format, for many months. But as the findings of this report show, the vast majority of U.S. law schools are still hiding critical information from their applicants.

This report reflects LST’s analysis of the class of 2010 employment information available on ABA-approved law school websites in early January 2012. The Winter 2012 Index reveals a continued pattern of consumer-disoriented activity. Our chief findings are as follows:

  • 27% (54/197) do not provide any evaluable information on their websites for class of 2010 employment outcomes. Of those 54 schools, 22 do not provide any employment information on their website whatsoever. The other 32 schools demonstrate a pattern of consumer-disoriented behavior.
  • 51% of schools fail to indicate how many graduates actually responded to their survey. Response rates provide applicants with a way to gauge the usefulness of survey results, a sort of back-of-the-envelope margin of error. Without the rate, schools can advertise employment rates north of 95% without explaining that the true employment rate is unknown, and likely lower.
  • Only 26% of law schools indicate how many graduates worked in legal jobs. 11% indicate how many were in full-time legal jobs. Just 1% indicate how many were in full-time, long-term legal jobs.
  • 17% of schools indicate how many graduates were employed in full-time vs. part-time jobs. 10% indicate how many were employed in long-term vs. short-term jobs. 10% of schools report how many graduates were employed in school-funded jobs.
  • 49% of schools provide at least some salary information, but the vast majority of those schools (78%) provide the information in ways that mislead the reader.

Taken together, these and other findings illustrate how law schools have been slow to react to calls for disclosure, with some schools conjuring ways to repackage employment data to maintain their images. Our findings play into a larger dialogue about law schools and their continued secrecy against a backdrop of stories about admissions data fraud, class action lawsuits, and ever-rising education costs. These findings raise a red flag as to whether schools are capable of making needed changes to the current, unsustainable law school model without being compelled to through government oversight or other external forces.