The ABA Council on Legal Education and Admissions to the Bar took an enormous step this morning toward helping prospective law students make informed decisions. The Council, which is the sole accrediting body for U.S. law schools, unanimously approved the Questionnaire Committee’s recommended procedures for the improved collection and sharing of employment data. The recommendation is based on last December’s Questionnaire Committee hearing, at which interested parties, including LST’s executive director, presented on the issue of consumer information transparency.
You can review the now-approved recommendation here (pages 22–28), although the vote added two caveats to the recommendation. Under the new policy, for at least one year, the ABA will work with NALP to leverage NALP’s present collection, cleansing, and distribution practices, subject to the ABA and NALP reaching a contractual agreement concerning confidentiality. Working with NALP this year will allow the ABA to avoid unnecessary costs, divergent information, and confusion.
Several Council members were concerned, however, about using a third party—NALP—as part of the Council’s regulatory oversight; hence the desire to reach an appropriate contractual arrangement with NALP. Bryan O’Keefe, the Law Student Division’s student representative to the Council, clarified that these logistical concerns will not hold up publishing the improved data in the upcoming year.
The Council agreed that these issues will not prevent improving data disclosure in the next questionnaire. In the long term, the Council wishes to exert direct control over the process, either by hiring its own staff or by selecting a third-party vendor through an RFP (request for proposal).
The New Disclosure Policy
Caveat One: The Council’s Executive Committee will work with the Questionnaire Committee to reach an appropriate contractual agreement with NALP.
Caveat Two: The Council will begin work as soon as possible to directly collect the relevant data, either through increased staff or an outside vendor.
Schools are required to report:
Employment Status (100% of the class will be accounted for with these categories)
Job Credentials: employed in a job requiring bar passage, in a job for which a JD is preferred, in another professional job, in a non-professional job, or in a job of unknown type.
Non-Employed Status: pursuing a graduate degree; unemployed – not seeking or unemployed – seeking; and status unknown.
Employer Type (100% of the class will be accounted for with these categories)
Law Firms: various sizes based on total attorneys at the firm globally (8 total + an unknown category).
Other Employers: business and industry; government; public interest; judicial clerkships; academia; and employer type unknown.
United States: the three states where the most graduates are employed and number employed in each.
International: the number of graduates employed internationally.
Schools will also report data, where applicable, about whether the jobs are full-time/part-time and long-term/short-term, as well as indicate the number of jobs that are funded by the law school or university.
In addition to the placement data, the ABA/LSAC Official Guide will publish state-specific salary information based on graduates reporting from all law schools. According to the Questionnaire Committee, providing only school-specific salary data gives “limited and perhaps confusing information” to applicants.
As the Committee also correctly points out, school-specific data is less representative than the state-specific datasets. Moreover, when only ~50% of graduates report a salary, granular categories like “Employed in a Law Firm of 2-10 Attorneys” are unlikely to have sufficient school-specific data to warrant sharing. (This is a flaw we identified with the 509 Subcommittee’s proposal.)
According to Mr. O’Keefe:
By using aggregate data, students will be able to have a more accurate picture of possible salaries.
When you combine the job data and the salary data, applicants will be able to discern the exact job prospects of an individual school, and using the aggregate salary data, develop a solid idea of what that job prospect makes in any given state. The employment location variable will allow students to assess where those graduates end up working—in essence, you will be able to tell where you are most likely geographically to end up getting that job and making that salary.
We agree that this solution will provide prospectives a more thorough and accurate window into the placement opportunities at various law schools. However, we do not think this solution goes far enough in light of our goal of helping prospective law students find the schools that best meet their individual career objectives. The adopted solution does not provide enough graduate-level detail for those seeking to make a meaningful connection between a given school’s post-graduate outcomes and the characteristics of those jobs, including salary, required credentials, and location. It is still too difficult to connect these dots. Nevertheless, it is a monumental improvement that should be celebrated.
The Questionnaire Committee also indicates that it will develop ideas for how to improve the accuracy of the data entered by law school graduates and staff.
[A]s an example of a possible response, the ABA might require that annually or at the time of the sabbatical site visit there be random audits of placement data submitted in the annual surveys. If performed at the time of the site visit, these audits might be “informal” and performed by a member of the site team. We have serious concerns, however, with the ability of a site team member to perform this function in the context of a site visit.
We share these concerns. Site team members do not typically have auditing expertise and usually only collect facts from and about the law school. Mechanically, the site team likely would only examine the data collected by the law school to compare to what the school has reported. This creates an echo chamber, duplicates NALP’s data cleansing (which aims to catch mistakes), and would therefore provide only a marginal return on investment. Villanova-style lying would still be too easy to achieve, although even a single set of third-party eyes might deter schools.
An alternative might be to require schools, on a random basis, to provide a more “formal” audit performed by a CPA firm of their placement survey responses. These random audits could be performed annually or at the time of the site visit. Obviously, this would involve greater expense and we would have to look at ways that the expense could be minimized and distributed among law schools, whether they are audited or not. Finally, there may be other methods of performing such an audit, or an alternative to it; we will consider them also.
As is clear from these suggestions, the idea that all law schools should be subject to yearly auditing is not on the table. It seems unnecessary to us because we do not believe the problem lies with falsified data, but misleading information. An effective alternative, which we’ve already shared with the ABA committees, would be to use the LST Proposal. Through sufficient disaggregation, individual graduates could verify how the school reported their outcomes from behind a veil of anonymity provided by the proposal’s structure. This provides a nearly zero-cost alternative that would deter law schools from fabricating outcomes. Upon an accusation of foul play, the school would tender its source (the survey) and the graduate its evidence of misrepresentation.
The Questionnaire Committee operated with a clear mission:
Our objective in selecting, obtaining, and providing [employment] data are several-fold: (1) to provide correct and complete data (a) to law school applicants to assist them in making decisions on whether to go to law school and, if so, which law school to attend, and (b) to current law students and recent graduates to assist them in making job decisions; and (2) to obtain and provide this information in a way that will require the least amount of additional, unnecessary effort by law schools, particularly in their career services offices.
This echoes LST’s mission and objectives. It’s important that the ABA continues to prioritize law school transparency as legal education continues to change over the next decade.
We look forward to the Questionnaire Committee and Council optimizing the annual questionnaire over the next few years with these objectives in mind because there is still a need for more improvement. Elsewhere within the Section of Legal Education, the Standards Review Committee, operating with similar objectives, continues its work on Standard 509. Together, these approved and prospective changes are a great start.