Recent incoming law school classes have earned progressively lower LSAT scores and GPAs than prior classes. Journalists in the legal and national press report on this "brain drain." Academics and bloggers discuss it online and at conferences. But does the so-called brain drain really matter? Does a several-point drop in a school's median LSAT score—a flawed window into an individual's ability to help future clients—mean we should worry? Through this investigation, we've set out to advance the conversation.
We started with a basic observation. As long as the bar exam guards entrance to the legal profession, law schools should be held accountable for enrolling students who face significant risk of not passing that exam. Failing to earn a license does not eliminate all of the value law schools provide, but that failure significantly decreases the value of a law degree for a typical graduate. Fewer students would undertake three years of law school and significant debt without the prospect of practicing law.
According to the Law School Admission Council's (LSAC) National Longitudinal Bar Passage Study, the LSAT is the best available pre-law-school predictor of whether a student will pass or fail the bar exam. Further, the ABA accreditation standards reinforce the notion that law schools should be held accountable for graduates' bar exam outcomes. The standards also reinforce the relevance of LSAT scores in examining whether law schools make responsible enrollment choices.
This report digs into law school data to analyze the choices law schools are making in the face of significant financial pressure caused by falling demand and unhealthy revenue dependencies. This report also prescribes actions for various legal education stakeholders to take to improve the state of legal education.
We are not aware of a time when so many law schools had something like an open enrollment policy. Law school entrance has long been very competitive, but since 2003, it's become easier to get into law school every year. Last year, 4 in 5 people who applied to law school were admitted to at least one school. To a real extent, we're in uncharted territory. For those reasons and more, this report is not the final word. But we do intend it to enhance a difficult and contentious conversation about the choices law schools are making in the face of significant financial pressure.
|Risk Category|LSAT Score|LSAT Percentile|
|---|---|---|
|Minimal Risk|156-180|≥ 67.4|
|Low Risk|153-155|55.6 - 63.9|
|Modest Risk|150-152|44.3 - 52.5|
|High Risk|147-149|33.0 - 40.3|
|Very High Risk|145-146|26.1 - 29.5|
|Extreme Risk|120-144|≤ 22.9|
The table to the left reflects an analytical framework for assessing how much risk of failing the bar exam a student faces based on their LSAT score. The framework was developed by David Frakt, a lieutenant colonel in the Air Force Reserve, defense attorney, and former law professor. We applied his framework to entering classes at ABA-approved law schools. For example, a school is "high risk" if its 25th percentile LSAT score for its 2014 1L class is 147, 148, or 149. Unless otherwise stated, risk levels reflect the 25th percentile LSAT score. Schools that are high, very high, or extreme risk are deemed to be "serious risk."
If a school's 25th percentile student is "high risk," then at least 25% of the class faces a high risk of not completing school and of failing the bar. Similarly, if a school's 75th percentile student is "high risk," then at least 75% of the class faces that risk.
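The risk framework described above can be sketched as a simple lookup. This is our own illustration, not the report's code; the band boundaries come from the table, and the function names are invented for clarity.

```python
# A minimal sketch of David Frakt's risk framework as described above.
# Band boundaries are taken from the risk table; names are our own.

def risk_category(lsat: int) -> str:
    """Map an LSAT score (120-180) to a risk category."""
    if lsat >= 156:
        return "Minimal Risk"
    if lsat >= 153:
        return "Low Risk"
    if lsat >= 150:
        return "Modest Risk"
    if lsat >= 147:
        return "High Risk"
    if lsat >= 145:
        return "Very High Risk"
    return "Extreme Risk"

# Per the framework, a school's risk level is the category of its
# 25th percentile LSAT score, and high/very high/extreme risk
# schools are deemed "serious risk."
SERIOUS = {"High Risk", "Very High Risk", "Extreme Risk"}

def school_risk(lsat_25th: int) -> str:
    return risk_category(lsat_25th)
```

For example, a school whose 2014 entering class has a 25th percentile LSAT of 148 would classify as "High Risk," and therefore as a serious-risk school.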
The table to the right may help explain how these quartile marks are calculated. Imagine that Schools X and Y both enroll 16 new students. To calculate the LSAT interquartile range and median, each school lines its students up in ascending order by LSAT score. The 4th student is the 25th percentile (16 × .25); the 8th student is the 50th percentile, or median (16 × .50); and the 12th student is the 75th percentile (16 × .75). Looking at School X and School Y's identical LSAT breakdowns (144/150/152) shows how class composition can vary between schools with the exact same interquartile range and median.
As such, we know nothing about the upper or lower quartiles beyond what the student at each quartile mark scored and what we can infer by comparing quartiles. For example, School Y's bottom three students performed substantially better on the LSAT than School X's bottom three, yet for both schools the published 25th percentile tells us only that those students scored 144 or less.
This affects how we understand risk at each quartile. Using the risk chart to the left, both School X and School Y have at least 25% of the class in the extreme risk category. It turns out that exactly 25% of School Y's class faces extreme risk, with the rest of the class facing modest or low risk. For School X, 31.25% of its class faces extreme risk, and 43.75% faces high, very high, or extreme risk. In other words, a school can easily hide lower scores when the only public data are the interquartile range and median—three data points. For schools where the 75th percentile student is high risk, the entire class could be high risk, but without more granular data, we're stuck saying "at least 75%."
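The quartile mechanics above can be made concrete with two hypothetical 16-student rosters. These score lists are our own invention, constructed to match the document's figures: both report identical 144/150/152 quartile marks, yet School X has five students (31.25%) at or below 144 while School Y has exactly four (25%).

```python
# A sketch of the quartile calculation described above, using the
# document's method: with n students sorted ascending, the n*0.25th,
# n*0.50th, and n*0.75th students mark the quartiles.
# The two rosters below are hypothetical, built to mirror the
# School X / School Y example.

def reported_quartiles(scores):
    """Return the (25th percentile, median, 75th percentile) marks."""
    s = sorted(scores)
    n = len(s)
    return s[n * 25 // 100 - 1], s[n * 50 // 100 - 1], s[n * 75 // 100 - 1]

school_x = [130, 135, 140, 144, 144, 146, 148, 150,
            150, 150, 150, 152, 152, 152, 152, 160]
school_y = [140, 142, 143, 144, 150, 150, 150, 150,
            152, 152, 152, 152, 153, 154, 155, 155]

# Both rosters publish the same three data points: 144/150/152.
assert reported_quartiles(school_x) == reported_quartiles(school_y)

# ...yet their shares of extreme-risk students (LSAT <= 144) differ:
# 5/16 (31.25%) at School X versus 4/16 (25%) at School Y.
extreme_x = sum(s <= 144 for s in school_x) / len(school_x)
extreme_y = sum(s <= 144 for s in school_y) / len(school_y)
```

This is why the published interquartile range and median can mask how much of a class sits in the riskiest bands: everything between and below the quartile marks is invisible.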
You can read more about the risk framework in our full analysis.
All school-level data, unless otherwise noted, come from the American Bar Association Section of Legal Education and Admissions to the Bar, which collects data from law schools each year. Student borrowing data come from U.S. News & World Report, which also collects data from law schools annually, and sometimes directly from schools. National enrollment, applicant, and LSAT data come from the Law School Admission Council.
We analyze every ABA-approved law school, except the three in Puerto Rico and three others: Lincoln Memorial University, Belmont University, and the University of Massachusetts. We excluded the Puerto Rican schools because the LSAT is administered in English while the bar exam in Puerto Rico can be taken in Spanish. We excluded UMass because it would not provide us with 2010 admissions data. We excluded Lincoln Memorial and Belmont because they did not enroll students in 2010.