ABA Reforms Employment Outcome Disclosure

The ABA Council on Legal Education and Admissions to the Bar took an enormous step this morning toward helping prospective law students make informed decisions. The Council, which is the sole accrediting body for U.S. law schools, unanimously approved the Questionnaire Committee’s recommended procedures for the improved collection and sharing of employment data. The recommendation grew out of last December’s Questionnaire Committee hearing, at which interested parties, including LST’s executive director, presented on the issue of consumer information transparency.

You can review the now-approved recommendation here (pages 22–28), although the vote added two caveats, described below. Under the new policy, for at least one year, the ABA will work with NALP to leverage NALP’s existing collection, cleansing, and distribution practices, subject to the ABA and NALP reaching a contractual agreement concerning confidentiality. Working with NALP this year will allow the ABA to avoid unnecessary costs, divergent information, and confusion.

Several Council members were concerned, however, about using a third party (NALP) as part of the Council’s regulatory oversight, hence the desire to reach an appropriate contractual arrangement with NALP. Bryan O’Keefe, the Law Student Division’s student representative to the Council, clarified that these logistical concerns will not hold up publishing the improved data in the upcoming year.

The Council agreed that these issues will not prevent improving data disclosure in the next questionnaire. In the long term, the Council wishes to exert direct control over the process, either by hiring its own staff or by selecting a third-party vendor through an RFP (request for proposal).

The New Disclosure Policy

Caveat One: The Council’s Executive Committee will work with the Questionnaire Committee to reach an appropriate contractual agreement with NALP.
Caveat Two: The Council will begin work as soon as possible to directly collect the relevant data, either through increased staff or an outside vendor.

Job Data

Schools are required to report:

Employment Status (100% of the class will be accounted for with these categories)
- Job Credentials: employed in a job requiring bar passage, in a job for which a JD is preferred, in another professional job, in a non-professional job, or in a job of unknown type.
- Non-Employed Status: pursuing a graduate degree; unemployed – not seeking or unemployed – seeking; and status unknown.

Employer Type (100% of the class will be accounted for with these categories)
- Law Firms: various sizes based on total attorneys at the firm globally (8 size categories plus an unknown category).
- Other Employers: business and industry; government; public interest; judicial clerkships; academia; and employer type unknown.

Employment Location:
- United States: the three states where the most graduates are employed and the number employed in each.
- International: the number of graduates employed internationally.

Schools will also report data, where applicable, about whether the jobs are full-time/part-time and long-term/short-term, as well as indicate the number of jobs that are funded by the law school or university.
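To make the reporting structure concrete, here is a minimal sketch, in Python, of how a single school’s submission might be organized under these categories. The names below are our own illustrative choices, not the official questionnaire labels, and the firm-size bands are left generic because the specific band boundaries are not listed here.

```python
# Hypothetical sketch of a school-level report under the new categories.
# Names are illustrative, not the ABA's official questionnaire labels.

EMPLOYMENT_STATUS = [
    # employed, by job credentials
    "bar_passage_required", "jd_preferred", "other_professional",
    "non_professional", "job_type_unknown",
    # non-employed
    "pursuing_graduate_degree", "unemployed_not_seeking",
    "unemployed_seeking", "status_unknown",
]

# Eight law-firm size bands (by total attorneys at the firm globally) plus an
# unknown band, alongside the other employer types; the band boundaries are
# assumptions, so they are left generic here.
EMPLOYER_TYPE = [f"law_firm_size_band_{i}" for i in range(1, 9)] + [
    "law_firm_size_unknown", "business_industry", "government",
    "public_interest", "judicial_clerkship", "academia",
    "employer_type_unknown",
]

def validate_report(status_counts, employer_counts, class_size):
    """Each breakdown must account for 100% of the graduating class."""
    assert set(status_counts) <= set(EMPLOYMENT_STATUS)
    assert set(employer_counts) <= set(EMPLOYER_TYPE)
    assert sum(status_counts.values()) == class_size
    assert sum(employer_counts.values()) == class_size
```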

Salary Data

In addition to the placement data, the ABA/LSAC Official Guide will publish state-specific salary information based on graduates reporting from all law schools. According to the Questionnaire Committee, providing only school-specific salary data offers “limited and perhaps confusing information” to applicants.

As the Committee also correctly points out, school-specific data is less representative than the state-specific datasets. Moreover, when only ~50% of graduates report a salary, granular categories like “Employed in a Law Firm of 2-10 Attorneys” are unlikely to have sufficient school-specific data to warrant sharing. (This is a flaw we identified with the 509 Subcommittee’s proposal.)
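As a back-of-the-envelope illustration (the numbers are hypothetical, not drawn from any school’s report), the sketch below shows why a roughly 50% salary response rate leaves granular school-level categories too thin to publish responsibly, while state-level aggregation across all schools does not suffer the same problem.

```python
# Hypothetical numbers: why ~50% salary reporting thins out granular
# school-specific salary categories.
class_size = 200        # one school's graduating class (assumed)
response_rate = 0.5     # roughly half of graduates report a salary
num_categories = 15     # firm-size bands plus other employer types (assumed)

reported = class_size * response_rate     # 100 salaries reported
per_category = reported / num_categories  # roughly 7 salaries per category

print(f"~{per_category:.0f} salaries per category at a single school")
# Aggregating the same categories by state, across every school's graduates,
# multiplies the sample size and makes the published figures far more stable.
```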

According to Mr. O’Keefe:

By using aggregate data, students will be able to have a more accurate picture of possible salaries.

When you combine the job data and the salary data, applicants will be able to discern the exact job prospects of an individual school, and using the aggregate salary data, develop a solid idea of what that job prospect makes in any given state. The employment location variable will allow students to assess where those graduates end up working— in essence, you will be able to tell where you are most likely geographically to end up getting that job and making that salary.

We agree that this solution will provide prospectives a more thorough and accurate window into the placement opportunities at various law schools. However, we do not think this solution goes far enough toward our goal of helping prospective law students find the schools that best meet their individual career objectives. The adopted solution does not provide enough graduate-level detail for those seeking to make a meaningful connection between a given school’s post-graduate outcomes and job characteristics such as salary, required credentials, and location. It is still too difficult to connect these dots. Nevertheless, it is a monumental improvement that should be celebrated.

Auditing

The Questionnaire Committee also indicates that it will develop ideas for how to improve the accuracy of the data entered by law school graduates and staff.

[A]s an example of a possible response, the ABA might require that annually or at the time of the sabbatical site visit there be random audits of placement data submitted in the annual surveys. If performed at the time of the site visit, these audits might be “informal” and performed by a member of the site team. We have serious concerns, however, with the ability of a site team member to perform this function in the context of a site visit.

We share these concerns. Site team members do not typically have auditing expertise and usually only collect facts from and about the law school. Mechanically, the site team likely would only examine the data collected by the law school to compare to what the school has reported. This creates an echo chamber, duplicates NALP’s data cleansing (which aims to catch mistakes), and would therefore provide only a marginal return on investment. Villanova-style lying would still be too easy to achieve, although even a single set of third-party eyes might deter schools.

An alternative might be to require schools, on a random basis, to provide a more “formal” audit performed by a CPA firm of their placement survey responses. These random audits could be performed annually or at the time of the site visit. Obviously, this would involve greater expense and we would have to look at ways that the expense could be minimized and distributed among law schools, whether they are audited or not. Finally, there may be other methods of performing such an audit, or an alternative to it; we will consider them also.

As is clear from these suggestions, the idea that all law schools should be subject to yearly auditing is not on the table. That seems unnecessary to us because we do not believe the problem lies with falsified data, but with misleading information. An effective alternative, which we’ve already shared with the ABA committees, would be to use the LST Proposal. Through sufficient disaggregation, individual graduates could verify how the school reported their outcomes from behind a veil of anonymity provided by the proposal’s structure. This is a nearly zero-cost approach that would deter law schools from fabricating outcomes. Upon an accusation of foul play, the school would tender its source (the survey) and the graduate its evidence of misrepresentation.
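As a rough sketch of how that verification could work (our own simplification for illustration, not the LST Proposal’s actual specification): if a school publishes one name-free, disaggregated row per graduate, each graduate can confirm that a row matching their true outcome appears, and a fabricated list will fail for at least one graduate.

```python
# Illustrative simplification of anonymous verification under disaggregated
# disclosure; not the LST Proposal's actual specification.
from collections import Counter

# The school publishes one name-free outcome row per graduate (hypothetical rows).
published = [
    ("bar_passage_required", "law_firm_2_10", "full_time", "long_term"),
    ("jd_preferred", "business_industry", "full_time", "short_term"),
    ("unemployed_seeking", None, None, None),
]

def outcomes_consistent(published_rows, actual_outcomes):
    """True only if every graduate's actual outcome can be matched to a
    distinct published row; misreporting leaves at least one graduate unmatched."""
    remaining = Counter(published_rows)
    for outcome in actual_outcomes:
        if remaining[outcome] == 0:
            return False  # this graduate's outcome was misreported
        remaining[outcome] -= 1
    return True
```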

Going Forward

The Questionnaire Committee operated with a clear mission:

Our objective in selecting, obtaining, and providing [employment] data are several-fold: (1) to provide correct and complete data (a) to law school applicants to assist them in making decisions on whether to go to law school and, if so, which law school to attend, and (b) to current law students and recent graduates to assist them in making job decisions; and (2) to obtain and provide this information in a way that will require the least amount of additional, unnecessary effort by law schools, particularly in their career services offices.

This echoes LST’s mission and objectives. It’s important that the ABA continue to prioritize law school transparency as legal education changes over the next decade.

We look forward to the Questionnaire Committee and Council optimizing the annual questionnaire over the next few years with these objectives in mind, because there is still room for improvement. Elsewhere within the Section of Legal Education, the Standards Review Committee, operating with similar objectives, continues its work on Standard 509. Together, these approved and prospective changes are a great start.

18 thoughts on “ABA Reforms Employment Outcome Disclosure”

  1. Open Letter to the entire Legal Community

    Re: Law School Transparency and Employment Statistics

    No one argues with the theory of law school transparency; the issues lie in the means to achieve that goal. I believe that many of those calling for the increased amount and detail of the reported information do not have a true understanding of the challenges faced in collecting this information. While demanding more, they do not suggest which tools may be used to gather more thorough information, nor do they offer assistance in its collection. Law School Career Services Offices (CSO’s) cannot mandate, coerce, bribe, or otherwise direct their new graduates to answer the annual questionnaire. It is a completely voluntary exercise from which only goodwill and nagging manage to elicit responses.

    It is for this reason that small or solo CSO’s suffer most noticeably; they simply do not have the manpower to make these repetitive outreach efforts. To be sure, many students respond to the CSO’s request in the first or second contact. However, despite phone calls and emails encouraging responses, the graduates, for any number of reasons, do not supply the information. Further, a statistically significant number of our graduates simply “drop off the radar screen” as they change all of their contact information from what we have on file. The helplessness and frustration are palpable in our CSO in January.

    Compounding this stress now is US News & World Report’s new method of calculating the overall employment rate used in their ranking system. Instead of using the number of graduates whose status is known as the denominator, they are using the number of total graduates. What this does in practical, mathematical terms is calculate all unknown graduates as unemployed. Regardless of the reason that the graduates failed to respond, to presume that they are unemployed is inaccurate, misleading and extremely detrimental to all the constituencies that rely on this information. It serves the exact opposite function of transparency. Could it be that these graduates are indeed employed but merely too busy working in fulfilling and demanding legal jobs to respond? I sure hope so, but the new method of calculation crushes that possibility without statistical justification. Calculating the employment rate based upon the known statuses of graduates who have responded supplies us with a representative sample – a well-known and widely accepted statistical practice. To deviate from this simply counters good reporting practices.
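    To see the arithmetic, consider a purely hypothetical class of 200 graduates, of whom 150 report being employed, 20 report being unemployed, and 30 never respond (the numbers below are illustrative only):

    ```python
    # Hypothetical class: 150 employed, 20 unemployed, 30 status unknown.
    employed, unemployed, unknown = 150, 20, 30
    known_status = employed + unemployed       # 170 graduates responded
    total_graduates = known_status + unknown   # 200 graduates overall

    rate_known = employed / known_status       # ~88.2%: unknowns excluded
    rate_total = employed / total_graduates    # 75.0%: unknowns counted
                                               # as if unemployed
    print(f"{rate_known:.1%} (old method) vs {rate_total:.1%} (new method)")
    ```

    Under the new method, every non-respondent lowers the published rate, regardless of whether that graduate is actually working.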

    Without compulsion from an entity that still retains influence over these graduates, the task becomes one of sheer man-hours that can be dedicated to it. The pressure to provide responses for the entire class was already keenly felt, but now will have serious negative impacts on the classes of current students for whom the CSO is responsible. Time spent by the CSO staff in exhausting every avenue of collecting the graduate data takes away the availability of our counselors to meet with current students and address their more immediate needs. Again, this is an acute issue for small and solo CSO’s. How can we justify hours spent on the phone with perhaps no fruitful resolution when there are students in our offices who need our attention? If we better serve the current students, we are cultivating the relationships that may result in employment for them and increase the probability of future responsiveness. We are always striving to do better each year.

    A secondary problem arises when the CSO has only some information that was gathered more informally and/or from third parties. With the pressure to report on all of the graduates, what level of certainty must be attained before the information can be reported? If a professor recalls a conversation with a graduate at a Bar reception, can that be entered as employment data? What about information posted by friends on Facebook? The verification process may take up more valuable time, rather than taking the conservative route of not reporting that graduate’s status as truly known.

    A slippery slope, a downward spiral: these are the characteristics of the new chase for information to satisfy the ever-hungry statistics beast. The statistical beast does not serve any master, either; merely its own greediness. Prospective law students will not get any more accurate information from this process; in fact, it may be worse than before. Striving to keep the law schools’ data as transparent as it can be is a laudable goal, but it has been severely derailed in this instance. The new ranking system races full steam ahead without notice to the law schools and without a plan of guidance or tools to help them reach the same point. I would ask for partnerships among the law schools, the various state bars, and the American Bar Association to determine how we can all reach out to the new graduates to obtain full and reliable information. Perhaps as part of the first-year CLE requirement? Perhaps as a condition of joining the ABA as an incentivized newly licensed attorney? And certainly with a revision of the calculation method used by US News & World Report.

    Respectfully submitted,

    Linda A. Spagnola – Wendling

    Assistant Dean for Career Services
    Turner Law Building
    North Carolina Central University School of Law
    Durham, NC 27707

  2. Linda,

    Here are two solutions I came up with in a matter of seconds after reading your comment. 1) Distribute and collect the questionnaire at the same time as cap-and-gown pickup. 2) Issue the questionnaire to students and collect it while they are captive in the classroom (in the same manner as the class/professor reviews). Sure, you cannot mandate that students fill out the questionnaire, but you can easily make the process painless.

    The point is, collecting the data from students isn’t that problematic, just use a little common sense.

    — Current student

  3. Current student,

    Most law schools that I know of already do what you suggest or some variation of it. The problem is that the employment data is meant to be a snapshot of the graduating class at nine months post-graduation, and for schools that have a large number of graduates going to employers such as D.A.’s offices or smaller firms that do not hire until after bar results, the information gathered at graduation may not be accurate nine months down the road. I agree that better employment data is desirable, but schools have no leverage to compel graduates to supply this information at the time that NALP requests it.

  4. If students don’t graduate unemployed, horribly in debt, and extremely bitter against their alma maters for luring them in with employment statistics that in no way reflect reality, the information shouldn’t be that hard to collect 9 months down the road… I loved my experience in college, and if my undergrad tried to solicit this information, I would send it to them in a second (hell, I responded to their stuff even when I was working 80+ hour weeks before starting law school). No system is perfect, but the recent changes are a huge improvement.

  5. Linda,

    Can we please speak to every member of your graduating class, to see whether (a) they feel like you have added any value to their job search process and (b) they agree with your arguments? I know the answer to (a) for my school (Loyola LA) would be no, and if Loyola LA’s career services office made such arguments I would consider them laughable and not worth a serious debate.

  6. Linda, based on my own experience, and the experience of almost every single person I have ever talked to on the subject, either at my law school or other law schools, career services offices are not particularly productive. They can’t force outside organizations to hire; all they can do is post the jobs they receive, which does not really take that much time. They tend to give platitudes and general advice that everyone has already heard (network, volunteer, etc.), and they also tend to go out of their way only to help people who don’t need it (i.e., those who are already being heavily recruited). Unfortunately, CSOs have also been implicated in the systematic misrepresentation of post-law school employment statistics. Collecting and accurately reporting those statistics is, frankly, a better use of their time in my opinion.

  7. Linda,

    While students don’t have to report, I believe many would if you, at enrollment, charged a $100 fee that would be returned with interest at the end of law school after reporting employment. You could also look at LinkedIn (you know, the site you recommend thinking it might actually get law graduates a job, HA!).

    Thank you for your input, but I bet your staff has the attention to detail and perseverance to endure conducting a 150-person survey.

  8. Schools like North Carolina Central University School of Law are the reason reform is needed. TTTT schools promise students 89% employment, but I doubt 20% are employed by graduation. You might as well spend your time gathering this data, since no amount of phone calls will convince top (or, in your school’s case, mid-size) firms to even allow your students to interview for custodial positions unless they’re in the top 10% and on law review.

    Hopefully this is just the first step.

  9. “Regardless of the reason that the graduates failed to respond, to presume that they are unemployed is inaccurate, misleading and extremely detrimental to all the constituencies that rely on this information. It serves the exact opposite function of transparency. Could it be that these graduates are indeed employed but merely too busy working in fulfilling and demanding legal jobs to respond? I sure hope so, but the new method of calculation crushes that possibility without statistical justification.”

    LOL
    —————–

    Linda,

    People like you are the reason that the ABA needs to get its act together and insist on common sense reforms. Do you really think that it is more accurate, honest, and beneficial for prospective students to be led to believe that non-responders “are indeed employed but merely too busy working in fulfilling and demanding legal jobs to respond”? Come on now, get real.

    My hunch: You have sold yourself this baloney so that you can sleep at night. Must be nice to know that your life in “Southern Wake county on a peaceful wooded acre with her husband Raymond, her two beautiful little girls Emmelia and Katerina, and Winston the yellow Labrador Retriever” is built primarily on the shattered dreams of students who were duped by faulty information into crushing debt.

    ref: http://web.nccu.edu/law/faculty/administration/spagnola.html

  10. As a current law school student, I can honestly say that despite reaching out to them, the career services people at my school didn’t in any way help me find legal employment. I’m not saying that this is necessarily a bad thing. Frankly, I think that it’s really the student’s job to find work. But along with the above posters, I would MUCH rather have career services employees spend 100% of their time figuring out employment stats than “helping me find a job.”

  11. At the law school I went to (Wake Forest) thirty years ago, they freely admitted that if you hadn’t starved to death in 60 days after graduation, they counted you as employed. I remember, and it is the reason that I will never give money to the school, getting the alumni newsletter with every single job opening listed having already passed the application deadline. I sat down and cried. But if I walked across the street from my apartment to the placement office, it would be locked. Dreadful, dreadful place.

  12. Did you see the NALP’s response? It is an amazing piece of work. Let me provide some quotes.

    http://www.nalp.org/uploads/documents/doc1.pdf

    “beginning in February 2012, the ABA Annual Law School Questionnaire process will require that all law schools report individual student record level employment data to the ABA. This will, in effect, duplicate the research effort that NALP has successfully undertaken for the last 37 years.”

    Successfully? If the NALP had done its job successfully, there wouldn’t be this uproar.

    “This decision, made, as we understand it, by the Council’s Executive Committee, without a vote by the entire Council, and without comment or input from the public,”

    Are these people crazy?! No comment or input from the public? The public, LST included, was the sole motivator of this change.

    “One of the chief harms caused by this action is that it will require a dual reporting burden by the law schools, who now will be asked to report individual student record level employment data to both the ABA and NALP.”

    No, the NALP is completely useless and dispensable.

    “Worst, we fear, is that if schools are required to separately report employment outcomes to the ABA, there is a great risk that many of them will no longer report their data to NALP. This will inevitably lead to the reduction in the amount of information we have about the entry-level legal employment process, and will have the long-term effect of producing less transparency about the legal job market and not more.”

    No, see last comment.

    “For those schools that are able to meet the dual burden, it will drain resources that might otherwise be used to counsel and support students in their job search process.”

    First of all it’s very easy to collect this information. What’s hard is presenting it in a fraudulent and deceptive manner. Furthermore, I doubt students will worry too much about losing the “counsel and support” of their CSO office.

    “Finally, we object in the strongest terms possible to the Section’s unlicensed use of NALP’s research terms and definitions in its plan to collect student record level data directly from the schools. The ABA may well decide that it should survey schools directly about what happens to their students when they complete their legal education, but in order to do so the ABA must develop its own survey instrument and research terms. The actions of the Council’s Executive Committee this week have effectively taken the intellectual property that NALP has developed over the last 37 years.”

    WOOWOWWOWOWOOWOWOW! The NALP is claiming that they invented surveys, job categories, salary information and employment status!!!!!! WHAT A BUNCH OF INTELLECTUALLY DISHONEST *******!!!!!!
