Class of 2012 NLJ 250 Statistics

The National Law Journal (NLJ) released its annual report this weekend on the law schools that send the most graduates to the 250 largest American law firms (NLJ 250). In this post we’ll answer a few basic questions about this important employment outcome measure. This is the first published Class of 2012 employment information.

What is the NLJ 250?

The NLJ 250 includes the 250 largest law firms headquartered in the United States. This is measured by the firm-reported annual average number of full-time and full-time equivalent attorneys working at the firm, in any office, in 2012. This does not include temporary or contract attorneys, though it does include non-partner track attorneys.

Where do the data come from?

Methodology via the NLJ:

Methodology: Data for this Go-To Law Schools special report were provided by the law firms surveyed for the NLJ 250, The National Law Journal’s annual survey of the nation’s 250 largest law firms by headcount. We received data from 190 firms. For firms that did not submit new associate numbers, we relied on data from ALM Media LLC’s RivalEdge database and independent reporting. We determined rankings by the percentage of 2012 juris doctor graduates who took associate jobs at NLJ 250 firms. The rankings do not reflect law graduates who took jobs as clerks following graduation. Our data do not include new associates at Paul, Weiss, Rifkind, Wharton & Garrison or King & Spalding.

What do these numbers tell us?

Large firm placement percentage is an important, albeit imperfect, proxy for the number of graduates with access to the most competitive and highest-paying jobs. The percentage, accordingly, tells us which schools most successfully place students in these highly sought-after jobs. Successful large firm placement is best analyzed by looking at multiple years’ worth of data. (View the NLJ 250 from the class of 2010 here and from the class of 2011 here.)

What do these numbers not tell us?

First, self-selection controls all post-graduation outcomes. Nobody is coerced into a job they are offered (unless you consider debt pressure or other strong personal influences coercive), so these numbers do not provide more than a proxy for opportunities. Opportunities, after all, are prospective students’ real concern when analyzing employment information, and these rankings do not necessarily reflect a school’s ability to place students into NLJ 250 firms.

Many graduates, particularly at the top schools, choose to clerk after graduation instead of working for these law firms. While not all of these graduates would have secured employment at the NLJ 250 firms, some could have. For this reason, one popular technique used to understand a school’s placement ability is adding the percentage of graduates at NLJ 250 firms to the percentage of graduates clerking for Article III judges. This method is not perfect; read our white paper for a more detailed explanation of the strengths and weaknesses of this technique.

Second, NLJ 250 firm jobs are not the only competitive, high-paying firm jobs. Boutique law firms are also very competitive, with some paying New York City market rates and above. Additionally, the NLJ 250 does not include large, prestigious internationally based law firms with American offices.

Third, not all NLJ 250 firm jobs are equally competitive. Law firms from different regions and of differing caliber have varying preferences for the students from different law schools, including how far into the class they are willing to reach. That is, two schools that place an equal percentage of graduates in NLJ 250 firms may do so for reasons other than similar preferences among equally competitive NLJ 250 firms.

Fourth, the rankings include data only about the law schools that placed at least 8.22% of their entire classes in NLJ 250 firms. All other American law schools placed a lower, unknown percentage at NLJ 250 firms. The remaining schools range from 0% to 8.22%, and probably do not fall into a normal distribution.

If you have more questions, please feel free to email us or reply to this post. We will update this post as needed.

2012 placement into NLJ 250 firms by law school

Rank School NLJ 250 Grads Total J.D.s % of Class
1 University of Pennsylvania Law School 163 270 60.37%
2 University of Chicago Law School 119 216 55.09%
3 Columbia Law School 245 460 53.26%
4 New York University School of Law 253 478 52.93%
5 Northwestern University School of Law 144 280 51.43%
6 Harvard Law School 297 590 50.34%
7 Duke Law School 107 221 48.42%
8 Stanford Law School 86 182 47.25%
9 University of California, Berkeley School of Law 139 307 45.28%
10 Cornell Law School 85 192 44.27%
11 University of Virginia School of Law 151 357 42.30%
12 University of Michigan Law School 149 388 38.40%
13 Georgetown University Law Center 193 616 31.33%
14 Yale Law School 68 222 30.63%
15 University of California at Los Angeles School of Law 97 333 29.13%
16 University of Southern California Gould School of Law 63 220 28.64%
17 Vanderbilt University Law School 51 194 26.29%
18 University of Texas School of Law 96 372 25.81%
19 Fordham University School of Law 114 487 23.41%
20 University of California, Irvine School of Law 13 56 23.21%
21 George Washington University Law School 122 542 22.51%
22 Boston University School of Law 58 273 21.25%
23 Boston College Law School 54 256 21.09%
24 University of Illinois College of Law 40 213 18.78%
25 Washington University in St. Louis School of Law 49 300 16.33%
26 University of Notre Dame Law School 32 196 16.33%
27 Southern Methodist University Dedman School of Law 44 280 15.71%
28 Emory University School of Law 39 253 15.42%
29 University of Houston Law Center 35 262 13.36%
30 College of William and Mary Marshall-Wythe School of Law 27 206 13.11%
31 Howard University School of Law 19 148 12.84%
32 University of North Carolina School of Law 32 260 12.31%
33 University of Arizona James E. Rogers College of Law 17 141 12.06%
34 Washington and Lee University School of Law 15 129 11.63%
35 University of Washington School of Law 20 181 11.05%
36 University of Minnesota Law School 25 230 10.87%
37 Seton Hall University School of Law 32 310 10.32%
38 University of Kentucky College of Law 15 148 10.14%
39 Loyola Law School, Los Angeles 41 414 9.90%
40 University of California Hastings College of the Law 43 443 9.71%
41 Wake Forest University School of Law 15 155 9.68%
42 Villanova University School of Law 24 255 9.41%
43 University of Georgia School of Law 21 225 9.33%
44 Indiana University Maurer School of Law–Bloomington 19 208 9.13%
45 University of California, Davis School of Law 17 198 8.59%
46 Santa Clara University School of Law 26 306 8.50%
47 University of Wisconsin Law School 24 284 8.45%
48 Rutgers School of Law–Camden 23 274 8.39%
49 Loyola University Chicago School of Law 23 274 8.39%
50 University of Tennessee College of Law 12 146 8.22%

Class of 2011 NLJ 250 Statistics

The National Law Journal (NLJ) released its annual report this weekend on the law schools that send the most graduates to the 250 largest American law firms (NLJ 250). In this post we’ll answer a few basic questions about this important employment outcome measure. This is the first publicly available Class of 2011 employment information.

What is the NLJ 250?

The NLJ 250 includes the 250 largest law firms headquartered in the United States. This is measured by the firm-reported annual average number of full-time and full-time equivalent attorneys working at the firm, in any office, in 2011. This does not include temporary or contract attorneys.

Where do the data come from?

First, the NLJ collects survey data from the law firms themselves, not the law schools. A significant percentage of all NLJ 250 firms responded to the survey about first-year hiring. (The NLJ would not comment on the exact percentage.) The NLJ then contacts the law schools to fill in the gaps — but never relies directly on the schools’ word. The final figures are minimums, representing only the people the NLJ verified to its satisfaction; at no point does the NLJ extrapolate from a smaller number to a larger one.

What do these numbers tell us?

Large firm placement percentage is an important, albeit imperfect, proxy for the number of graduates with access to the most competitive and highest-paying jobs. The percentage, accordingly, tells us which schools most successfully place students in these highly sought-after jobs. Successful large firm placement is best analyzed by looking at multiple years’ worth of data. (View the NLJ 250 from 2010 here.)

What do these numbers not tell us?

First, self-selection controls all post-graduation outcomes. Nobody is coerced into a job they are offered (unless you consider debt pressure or other strong personal influences coercive), so these numbers do not provide more than a proxy for opportunities. Opportunities, after all, are prospective students’ real concern when analyzing employment information, and these rankings do not necessarily reflect a school’s ability to place students into NLJ 250 firms.

Many graduates, particularly at the top schools, choose to clerk after graduation instead of working for these law firms. While not all of these graduates would have secured employment at the NLJ 250 firms, many could have. For this reason, one popular technique used to understand a school’s placement ability is adding the percentage of graduates at NLJ 250 firms to the percentage of graduates clerking for Article III judges. This method is not perfect; read our white paper for a more detailed explanation of the strengths and weaknesses of this technique.

Second, NLJ 250 firm jobs are not the only competitive, high-paying firm jobs. Boutique law firms are also very competitive, with some paying New York City market rates and above. Additionally, the NLJ 250 does not include large, prestigious internationally based law firms with American offices.

Third, not all NLJ 250 firm jobs are equally competitive. Law firms from different regions and of differing caliber have varying preferences for the students from different law schools, including how far into the class they are willing to reach. That is, two schools that place an equal percentage of graduates in NLJ 250 firms may do so for reasons other than similar preferences among equally competitive NLJ 250 firms.

Fourth, the rankings include data only about the law schools that placed at least 6.49% of their entire classes in NLJ 250 firms. All other American law schools placed a lower, unknown percentage at NLJ 250 firms. The remaining schools range from 0% to 6.49%, and probably do not fall into a normal distribution.

If you have more questions, please feel free to email us or reply to this post. We will update this post as needed.

2011 placement into NLJ 250 firms by law school

Rank School NLJ 250 Grads Total Grads % of Class
1 University of Pennsylvania Law School 156 274 56.93%
2 Northwestern University School of Law 149 286 52.1%
3 Columbia Law School 235 455 51.65%
4 Harvard Law School 285 583 48.89%
5 Stanford Law School 87 181* 48.07%
6 University of California, Berkeley School of Law (Boalt Hall) 140 305 45.9%
7 University of Chicago Law School 92 203 45.32%
8 Duke Law School 89 219* 40.64%
9 New York University School of Law 187 466 40.13%
10 University of Virginia School of Law 150 377 39.79%
11 Cornell Law School 72 188* 38.3%
12 University of Southern California Gould School of Law 68 207 32.85%
13 University of Michigan Law School 119 378 31.48%
14 Georgetown University Law Center 198 637 31.08%
15 Yale Law School 61 205 29.76%
16 University of California at Los Angeles School of Law 78 344 22.67%
17 Vanderbilt University Law School 43 195 22.05%
18 Boston College Law School 62 285 21.75%
19 University of Texas School of Law 82 382 21.47%
20 Fordham University School of Law 84 429 19.58%
21 Boston University School of Law 48 269* 17.84%
22 George Washington University Law School 92 518 17.76%
23 University of Notre Dame Law School 26 190 13.68%
24 Washington University School of Law (St. Louis) 42 315 13.33%
25 Washington and Lee University School of Law 16 126 12.7%
26 Emory University School of Law 28 225 12.44%
27 Yeshiva University Benjamin N. Cardozo School of Law 45 380 11.84%
28 University of Washington School of Law 21 182 11.54%
29 University of Minnesota Law School 29 261 11.11%
29 University of Illinois College of Law 21 189 11.11%
31 Southern Methodist University Dedman School of Law 28 272 10.29%
32 University of Houston Law Center 27 281 9.61%
33 West Virginia University College of Law 12 126* 9.52%
34 Wake Forest University School of Law 15 158 9.49%
35 University of California, Davis School of Law 17 195 8.72%
36 University of North Carolina School of Law 21 246 8.54%
37 University of California Hastings College of the Law 35 412 8.5%
38 University of Missouri-Columbia School of Law 12 142* 8.45%
39 Seton Hall University School of Law 24 293 8.19%
40 Rutgers School of Law-Newark 19 248 7.66%
41 Howard University School of Law 12 157* 7.64%
42 Villanova University School of Law 19 252 7.54%
43 University of Maryland School of Law 20 281 7.12%
44 University of Wisconsin Law School 18 254 7.09%
45 Samford University Cumberland School of Law 11 157 7.01%
46 Temple University James E. Beasley School of Law 22 319 6.9%
46 University of Alabama School of Law 12 174* 6.9%
48 Brigham Young University J. Reuben Clark Law School 10 148 6.76%
49 Brooklyn Law School 30 455 6.59%
50 University of Miami School of Law 25 385 6.49%

*Graduate class size based on latest data from the ABA/LSAC Official Guide to Law Schools.

Breaking: Class Action Suit Filed Against Thomas Jefferson School of Law

Update: Follow the latest on Alaburda v. TJSL here.

At least one graduate has chosen to seek judicial relief from her alma mater in a class action that could include over 2,300 graduates of Thomas Jefferson School of Law in San Diego, California. Sara Randazzo broke the news (subscription required) at midnight PDT in the Daily Journal. The story will be available in print Friday morning.

The complaint (see the case summary below) alleges that Thomas Jefferson School of Law (TJSL) has engaged in “fraudulent and deceptive business practices,” including “a practice of misrepresenting its post-graduation employment statistics,” and that “the disservice TJSL is doing to its students and society generally is readily apparent.” The complaint cites a number of news articles over the last few years, and quotes from law school faculty and administrators to demonstrate the widespread consensus that schools are engaged in unfair and misleading practices. You can check out the complaint for yourself here. The complaint was filed by lead plaintiff Anna Alaburda, a 2008 honors graduate of TJSL. Additional court documents are attached to this post.

This lawsuit is of historical significance. It is the latest example of the breakdown of trust between law schools and their students, their graduates, and the profession. Law schools have a duty to be honest and ethical in their reporting and presentation of employment data. This lawsuit shows that at least some members of the profession believe these duties are legal requirements, in addition to being merely professional or educational in nature. Perhaps importantly for some critics, Ms. Alaburda decided to attend law school before the legal market collapsed and before stories of misleading information were widespread.

Current Employment Information

As of today, TJSL is still providing misleading employment information (the “TJSL Report”) on its website for the Class of 2009. Compounding the problem, TJSL has thus far declined to report any Class of 2010 information on its website, despite having already collected sufficient employment data about the class when it reported to NALP back in March of this year. Almost every law school could do a much better job educating prospective students about the nature of the jobs obtained by their graduates; TJSL is no different. The most serious fault we find with the TJSL Report is how the school misrepresents starting salaries.

The underlying data match between the TJSL Report and U.S. News-provided information

The TJSL Report claims that the school collected at least some data from 86% of graduates (respectable, though still putting them in the bottom 5% of all law schools), and that of those graduates 84.7% were employed. This means that 72.8% of Class of 2009 graduates were known to be employed, which is the same as what the career services office reported to U.S. News. Likewise, both sources indicate that 80% of the graduates known to be employed were employed in the private sector, i.e. working for law firms or in business & industry in some (any) capacity. This data match makes it possible for us to examine TJSL’s advertised placement success with the more detailed reporting rates submitted to U.S. News.
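That 72.8% figure is just the product of the two reported rates; a quick sketch (using only the percentages stated above) confirms the match:

```python
# TJSL Report figures: 86% of graduates provided data, and 84.7% of those
# respondents were employed. Multiplying the two gives the share of the
# whole class known to be employed.
response_rate = 0.86
employed_among_respondents = 0.847

known_employed_share = response_rate * employed_among_respondents
print(round(known_employed_share * 100, 1))  # 72.8, matching U.S. News
```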

TJSL Salaries

Based on our calculations from the data submitted to U.S. News, only 17% of those working full time in the private sector reported a salary. This means that at most 22 graduates reported salary data for full-time, private-sector jobs to TJSL. (This puts TJSL in the bottom 10% of law schools by percentage reporting.)

We say “at most” because the U.S. News salary figures only include full-time jobs. Only about half of TJSL graduates in the Class of 2009 had full-time jobs. Some of these were likely with law firms and in business, but probably not all of them. The only thing we gain from the information provided in the TJSL Report is that at least five salaries underlie the average salary figures for law firm practice ($62,443) and for business jobs ($90,267). Based on the other data, each average probably uses data for only a few more graduates than the minimum five. As such, the $90,267 and $62,443 average salaries are each based on data for between 2% and 8% of the entire class (for a total not to exceed 10%).

The substance of these salary averages is not apparent from TJSL’s Report or website. In fact, the picture the published averages paint is far more appealing than reality. The business salary average is significantly higher than the California mean salary for the business category, $83,977, according to NALP.

For law firm jobs, the problem is a little different. While the national mean salary for law firms is $115,254, that average is misleading on its face because 40% of the salaries used to calculate the average were $160,000 and 5% were $145,000. If we factor these salaries – the salaries most likely to be reported – out of the average, the average reduces to $80,007.
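The adjusted average can be reproduced from the figures above (a sketch; it assumes the remaining 55% of reported salaries account for the rest of the NALP mean):

```python
# NALP national mean law firm salary, and the shares of reported salaries
# at the two big-firm pay points (figures from the text above).
overall_mean = 115_254
share_160k, share_145k = 0.40, 0.05

# Remove the $160,000 and $145,000 salaries from the weighted average and
# solve for the mean of the remaining 55% of reported salaries.
remaining_share = 1 - share_160k - share_145k
remaining_mean = (overall_mean
                  - share_160k * 160_000
                  - share_145k * 145_000) / remaining_share
print(round(remaining_mean))  # 80007
```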

Although this average still likely skews high, the effect of large firm salaries on the adapted average is apparent. Those with higher salaries are far more likely to report. These salaries are also usually publicly known, thus the graduates do not need to report their salary to be included in these averages since schools can report any salaries they have reason to believe are accurate. This adjustment is not only common at law schools, but encouraged by NALP. As the TJSL Report states, “Our annual employment statistics are compiled in accordance with the [sic] NALP’s Employment Report and Salary Survey.”

The main point here is that the average salary reported in the TJSL Report skews high without context: no salary ranges, percentiles, or observations beyond the five-graduate floor have been provided. TJSL could, if it wanted, provide the following chart as specific context. This information, specific to all NALP-reporting graduates working in California, comes from NALP’s Class of 2009 Jobs & JDs. TJSL receives a copy of this report, since it is an active participant in NALP’s research. Our example uses all California salary information because 83% of TJSL’s graduates known to be employed were employed in California.

Firm Type # Grads 25th Median 75th Middle 90% Avg.
(The “# Grads” column is TJSL data; the percentiles, middle 90% range, and average are California salary data for all graduates.)
2-10 Attys. 36 $52,000 $62,400 $72,000 $36,000 – $100,000 $63,526
11-25 Attys. 2 $60,000 $70,000 $80,000 $45,000 – $135,000 $77,096
26-50 Attys. 3 $70,000 $78,000 $95,000 $50,000 – $130,000 $83,152
51-100 Attys. 4 $79,000 $90,000 $135,000 $62,500 – $160,000 $105,449
101-250 Attys. 2 $100,000 $145,000 $160,000 $85,000 – $160,000 $135,171
251+ Attys. 7 $160,000 $160,000 $160,000 $140,000 – $160,000 $156,904

The total number of TJSL graduates in each category indicates that the salaries TJSL used to calculate its published average firm salary skew even higher than normal. If between 5 and 17 graduates reported a law firm salary, at least some were from jobs paying six figures. But it’s difficult to know how many of those were six-figure jobs because the employer category includes non-attorneys making significantly less than attorneys with the same employer. Of course, prospective law students could know all of this if the school had decided to tell them.

Overall, it is easy to see why a prospective TJSL student today would be misled into thinking that a $200,000 investment in the TJSL degree is worth it. It remains to be seen whether our analysis holds for previous years, as well as whether what we consider misleading is sufficiently fraudulent, misrepresentative, or unfair according to a California state court.

TJSL is not alone

Countless other law schools across the country engage in similarly misleading practices, making them equally at risk of facing a class action. Every law school has the opportunity to provide better information and better context for that information. Some schools are proactively reforming how they present employment data, but many more have not yet felt compelled to change their behavior. Lawsuits like this will make law schools quickly rethink how they promote their programs.
See a summary of the complaint after the jump »»

NYLS’s Deceptive Practices

We recently learned of an email sent to accepted students by William D. Perez, Assistant Dean of Admissions and Financial Aid for New York Law School. The email is a response to what Dean Perez considers to be misinformation about law schools in the media. In an effort to convince accepted students to reconsider NYLS, his email tries to balance out the discussion by sharing some positive facts about NYLS. You can view the entire email here (will appear in a lightbox).

Our issue is not that Dean Perez wants to allay fears about law school in general and NYLS in particular. Any school, especially one where the average debt for 2009 graduates who borrowed was $119,437, should believe that its opportunities justify the cost of attendance and should share information that materially affects a prospective student’s cost-benefit analysis. Our issue is that NYLS has not provided nearly enough information, either in Dean Perez’s email or in its publications, to support some of the claims made in this effort to recruit next year’s class. Next week, we will submit our concerns to Dean Perez and Dean Richard Matasar in the hopes they will act responsibly to resolve what is possibly a violation of accreditation Standard 509.

Dean Perez claims that “our graduates are getting jobs, earning money and able to repay their loans.” But available information demonstrates otherwise. At worst, Dean Perez has overstated this claim in a deceptive and irresponsible manner. At best, NYLS has failed to meaningfully portray the data he believes supports these propositions. We’ll begin by addressing the employment and salary information that NYLS provides to prospective law students, and then move on to discuss the (un)importance of loan default statistics.

Getting Jobs. Earning Money.

NYLS’s employment statistics webpage (“Statistics Page”) (source) is designed for prospective law students trying to answer questions about job opportunities at NYLS. But it takes specialized knowledge about the reporting process and access to third-party information to recognize that these numbers are misleading.

For starters, NYLS provides its nine-month employment rate (89.7%), the breakdown of its employed graduates (first table below), and some of their salaries (first and second tables below).

Salaries
Employer Type Percentage Range (Min-Max) Average
Private Practice 45.6% $28,000 – $160,000 $120,197
Corporate/Business 23.7% $50,000 – $96,000 $75,167
Government 8.2% $41,000 – $72,000 $56,054
Public Interest 16% n/a n/a
Judicial Clerkship 3.4% $42,000 – $58,200 $45,887
Academic 3.1% $40,000 – $45,000 $42,500

You might expect that this table reflects a breakdown of the 89.7% of its 440 graduates because this is the “employment rate for the Class of 2009” as of February 15, 2010. This rate, although not unusual, is not what it seems. It’s actually an adjusted rate, which, until this year, U.S. News used for its rankings:

Employment Rate =
(graduates known to be employed OR enrolled in a FT degree program
+ 25% of graduates whose employment status is unknown)
÷
(total graduates − graduates who are unemployed and not seeking work)

Based on Class of 2009 employment data submitted to U.S. News, we arrive at a rate of 89.6%. The result is off by 0.1% due to rounding error, but nevertheless confirms NYLS’s rate calculation. As such, the employer type breakdown reflects only 82.7% of the class, because those are the only graduates reporting an employer type.

Next we look at the salary information. Understanding the salary figures on this table requires understanding the size of the dataset. This is difficult based on what NYLS says about its size: “Approximately 20% of our 2009 graduates reported salary information.” There is no clarity about what the denominator is. It could be the entire graduating class (440), the number of graduates counted as employed using the adjusted rate (395, from the 89.7% rate), or the actual number of graduates with any job (364, from the 82.7% rate). We will assume that it uses 364 graduates as the denominator on the grounds that these are the only graduates who could be expected to report a salary.

From this, we know that the first table includes salaries for roughly 16% of the entire class (73 graduates). But we have no indication from the Statistics Page as to the distribution of graduates throughout employer types, other than knowing that zero graduates working in public interest jobs reported a salary.
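The denominator chain can be sketched in a few lines (the 82.7% and 20% figures come from the discussion above; treating the 364 employed graduates as the salary-reporting denominator is our assumption, not NYLS’s statement):

```python
# NYLS Class of 2009: 440 graduates; 82.7% reported an employer type;
# "approximately 20%" reported a salary. We assume the 20% applies to
# the 364 graduates with any job.
total_grads = 440
employed = round(total_grads * 0.827)       # 364 graduates with any job
salary_reporters = round(employed * 0.20)   # roughly 73 graduates

share_of_class = salary_reporters / total_grads * 100
print(employed, salary_reporters, round(share_of_class, 1))  # 364 73 16.6
```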

NYLS also breaks down the private practice employer type by the salaries attained. The below table breaks down the “Private Practice” row in the first table. Accordingly, this table reflects the job outcomes for 37.7% of the class, or 45.6% of the 82.7% of graduates who were employed.

Salaries
Law Firm Size Percentage Range (Min-Max) Average
501+ 20% $145,000 – $160,000 $159,500
251 – 500 6% $120,000 – $160,000 $155,000
101 – 250 4% $90,000 – $160,000 $136,667
51 – 100 4% $62,000 – $90,000 $81,750
26 – 50 3% $55,000 – $55,000 $55,000
11 – 25 11% $47,000 – $65,000 $57,000
2 – 10 51% $28,000 – $80,000 $54,583
Unknown 1% n/a n/a

The salaries are equally problematic in this second table. Just as we cannot tell how many of those 73 graduates reporting a salary were in a particular employer type category, we cannot tell how many are working for law firms and represented in this table. Based on the distribution of salaries in the first table, at least 11 graduates were in categories other than private practice. This means that these salary figures by firm size represent at most 14% of the class when you combine all of the rows, though the number is assuredly smaller.

What does all of this mean? Although the Statistics Page includes a cautionary statement that only about 20% of graduates reported salaries, the information provided is still deceptive. It took numerous calculations and data from a third party to figure out how few graduates actually underlie these figures. Yet, when you read these tables, an unknowing prospective who is contacted by Dean Perez and told that “[NYLS] graduates are getting jobs, earning money and able to repay their loans” will see large salaries that can reasonably be taken as evidence of this advertised, short-term solvency. The method NYLS employs to present salary statistics can be persuasive to the unknowing applicant, but it clearly does not reflect reality when, for example, the advertised $159,500 salary average for graduates employed at 501+ attorney law firms reflects the average for, at most, 7.5% of the class.
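The 7.5% figure follows from multiplying the published shares together (a sketch using only the percentages stated above):

```python
# Published NYLS shares: 82.7% of the class reported an employer type,
# 45.6% of those were in private practice, and 20% of private-practice
# graduates were at 501+ attorney firms.
employed_share = 0.827
private_practice_share = 0.456
big_firm_share = 0.20

pp_of_class = employed_share * private_practice_share  # share of whole class
big_firm_of_class = pp_of_class * big_firm_share       # at 501+ firms
print(round(pp_of_class * 100, 1), round(big_firm_of_class * 100, 1))
# 37.7 7.5
```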

Dean Perez claims that graduates are earning money, even though the school only reports what one-fifth of the Class of 2009 was earning. If his office has information on the other four-fifths, it would be a good idea to share it when making such claims, rather than lead prospectives to think that the salary information provided is reflective of actual practice. And if NYLS does not possess salary data for the other 80% of the class, then the administration needs to review its recruiting policies and determine whether these statements are designed to mislead and/or have the effect of misleading the consumer. We think they do. When only 62% of the entire class is working in a bar-required position, there’s ample room to be skeptical of the claims made by Dean Perez.

Dean Perez also claims that New York Law School had more favorable or comparable employment statistics than [Hofstra, Buffalo, Touro, Albany, CUNY, Pace, Syracuse, Fordham, Cardozo, Saint John’s, and Brooklyn]. These are important claims that require adequate evidence, regardless of the economic climate and media attention. In the context of the email, this claim is especially troublesome because it seeks to sway applicants by stating that, despite all of the criticism, this particular law school really is a worthwhile investment. That may be true (and we will not make that call), but the school cannot simply prove its value by comparing itself to the other New York schools. No school can prove its value this way without first having sufficient transparency about the post-graduation employment outcomes of its own graduates.

[See our data clearinghouse to see if you agree that NYLS has “more favorable or comparable statistics” compared to these other New York schools: NYLS, Hofstra, Buffalo, Touro, Albany, CUNY, Pace, Syracuse, Fordham, Cardozo, Saint John’s, Brooklyn.]

Paying Back Loans

Loan default rates, contrary to Dean Perez’s assumption, do not indicate the value of a program. With the federal Income Based Repayment and Income Contingent Repayment plans, no individual with federal student loans should default. Defaults merely suggest poor advice by financial aid offices and/or poor self-discipline. A graduate can make minimum wage and have significantly reduced monthly loan payments, thanks to these programs. Both programs have their downsides: interest accumulates and can cause debts to balloon over the life of the payment plan, and in certain scenarios the debtor will be taxed on the forgiven debt at the end of the repayment period. But they are programs designed to make sure that people don’t default. If the default rates are low, the school should be applauded for providing sound financial advice, but it is hardly evidence that NYLS graduates are by-and-large doing well, particularly when we only know the salaries for 20% of the class.

Misleading The Consumer

Selective presentation is deceptive. The manner in which NYLS portrays salaries and job outcomes, while not outright lying, deceives the reader into thinking she is more informed about the employment opportunities at NYLS than she really is. Despite NYLS possessing better information (and even reporting some of that information to U.S. News), the school has declined to share information on the Statistics Page that it knows would be valuable, such as the fact that 58.4% of all 2009 NYLS graduates were employed full-time, while 45.2% were working full-time, bar-required jobs. Omission of such important, value-adding information is so obvious that it suggests NYLS actually intends to deceive. Such a perception has enormous ramifications for how people view legal education in this country. This behavior is precisely why we are prompting reform.

Law schools are sophisticated suppliers of a service; they understand what consumers want to believe as truth, particularly consumers facing full tuition costs and six figures of debt. With no incentive to do otherwise, schools hide or otherwise misrepresent the data that might scare applicants away. And when the applicants get wind of it through exposure in the media, we see responses like that of Dean Perez. Absent tougher regulations that require improved disclosure while prohibiting claims like these from being made without factual support, some law schools in the United States will continue undermining the educational purpose they are supposed to serve.

Have a Complaint about Your School? How to File with the ABA

We have heard from many law school alumni and current students about problems they encountered regarding how their school reports post-graduation outcomes. Many have alleged intentional acts of deception on the part of their law schools, whether regarding the reporting of their own employment information or that of their friends. At the same time, some commentators have accused the ABA Section of Legal Education of lax enforcement concerning violations of the accreditation standards. One way to encourage better enforcement (and better compliance) is to file an official complaint with the Section of Legal Education.

NOTE: We have requested more information from the Consultant on Legal Education, the Accreditation Committee Chair, and representatives of the Section of Legal Ed in Chicago. This post will be updated when we receive a response.

How to file a complaint

For starters, complaints are governed by Rules of Procedure. The complaint form (.doc) is available on this page, which also explains the complaint requirements and process.

A complaint should include a clear and concise description of the allegation and any evidence upon which the allegation is based (with any relevant supporting documentation). [Rule 24(d)3(i).] You must allege a violation of one or more of the accreditation standards, which you can read through here. The complaint must state the timeframe of the alleged lack of compliance (limited to one year from filing), a description of any steps taken to exhaust the law school’s grievance process, and any actions taken by the law school in response to the complaint. [Rule 24(d)3(ii) and (iii).] Any other channels being pursued by the complainant should be disclosed, including legal action. The complainant must also provide a release authorizing the Consultant’s Office to send a copy of the complaint to the dean of the law school.

Any person may bring a complaint alleging noncompliance with the standards; no other harm or damages need to be alleged. The filing of a complaint can lead to an investigation by the Consultant on Legal Education and sanctions by the Accreditation Committee or Council of Legal Education. Per Rule 16 of the Rules of Procedure, sanctions can include monetary penalties, refunds for part or all of the tuition and/or fees paid by students, censure (both private and public), publication of a corrective statement, remedial action, and probation. A school on probation is at risk of being removed from the list of approved law schools.

Allegations that a school has violated or is currently violating one or more standards are serious. Separate from official sanctions of the law school, culpable individuals may be asked to resign or be terminated for cause by their school. A school’s reputation may be damaged even if sanctions don’t ultimately rise to the most serious levels. For these reasons we ask that you consider whether the evidence you have is strong enough to warrant an investigation by the ABA. A suspicion that your employment status was misreported, for example, may not be enough without supporting documentation.

Before filing a complaint with the ABA, you should first contact the school and request that it cease violating the standard. One exception is if you wish to file anonymously; in that case, see the discussion of site evaluation comments below.

What actions might qualify as non-compliance?

Of the 52 accreditation standards that currently regulate law school behavior, only one (Standard 509, found in Chapter 5) deals with employment reporting. The seven interpretations of Standard 509, as with all interpretations, carry the same force as the standard itself. This consumer protection standard requires schools to publish certain “basic consumer information” in a “fair and accurate manner reflective of actual practice.” While the accompanying Interpretations only list “employment rates and bar placement statistics” as basic, this list is not exhaustive. You can read more on the current employment reporting requirements here.

Complaints grouped under this standard might fall into two camps. The first comprises allegations that the school misreported the employment rates or bar placement statistics, focusing on the text of Interpretation 509-1. Schools are required to report on the annual questionnaire the employment status of each graduate, as of February 15th, for the second-most recent graduating class.

If you have reason to believe that you or members of your class were miscounted as of that date, despite having reported accurate employment data, and if you can support that belief with documentation such as emails or surveys, then you should consider notifying the school and filing a complaint. Depending on the allegation, this could take sophisticated coordination. You likely need to document a sizable percentage of your classmates’ post-graduation outcomes to show that the reported percentages must have been wrong. For example, sworn statements from 10% of your class stating that they were unemployed as of February 15th would be good evidence that your school’s reported 95% employment rate is incorrect.

Many recent graduates have contacted us claiming that there was no way the school reported the results of their class accurately. However, it is important to first understand the reporting requirements to see whether the school was just following protocol, as the standards themselves make it very easy to legitimately hide individual outcomes. You may not think that a part-time job waiting tables should qualify you as employed, but it is appropriate under current reporting standards. A violation of Standard 509 would occur if the school instead reported you to U.S. News as employed full-time, or in a JD-preferred or bar admission-required job.

The second camp of violations would be allegations that the basic consumer information provided on a law school’s website or in promotional brochures to law school applicants is misleading and therefore not presented in a “fair or accurate manner reflective of actual practice.” Supporting documentation would necessarily include the publications, and you should describe why they are not reflective of actual practice.

Not willing to file yourself?

One of the Section of Legal Education’s requirements is that complaints will be closed if they are made anonymously, unless the Consultant determines that there are extraordinary circumstances for keeping someone’s identity secret from the school. We understand that graduates may be reluctant to allege noncompliance by their schools, and that there may be other situations (for example, employees of the school) where someone might be discouraged from whistleblowing if their name will be dragged through the process.

[We have contacted the Consultant for more information on what has counted in the past as extraordinary circumstances, and will update this post when we hear back.]

Some of the complaint procedures may discourage you from filing. For one thing, a complainant has no right within the rules to appeal a decision by the Consultant’s office to close the complaint. A complainant also will not be informed about the proceedings or given access to view the school’s response if one is requested by the Consultant. If the complaint is eventually presented to the Accreditation Committee, there is no appeal process if the Committee sides with the school. And regardless of the outcome, a complainant will only be notified about the stage at which the matter was resolved. From what we can tell, all proceedings are closed to the public.

If you have evidence that a school has been in noncompliance and you believe your situation is an extraordinary circumstance, you can contact LST. We will work with you to determine whether the complaint is actionable, and, if appropriate, file the complaint ourselves. NOTE: This does not guarantee that we will file a complaint; it only means that we will review the information to decide if we want to file the complaint on your behalf.

Complaint alternatives

As an alternative to filing a complaint, you can also file a comment as part of the accreditation process. Each ABA-approved law school is recertified once every five years through a process that is taken very seriously by the administration. To conduct the accreditation, the Section of Legal Education sends a delegation of volunteers (often professors, administrators, and judges) to the school as a site evaluation team. The team visits the school to collect facts and gather opinions, including thoughts of employees and students, so that the Accreditation Committee and Council of the Section of Legal Education can evaluate whether the school is in compliance with accreditation standards.

Comments must be submitted at least eight weeks prior to the next site visit; visits are conducted during the school year. You can find a draft schedule of all visits through 2014 on the ABA Section of Legal Education’s website.

Written comments related to current compliance with the Standards for the Approval of Law Schools may be submitted to the Consultant’s Office no later than eight weeks before the site visit’s beginning date. Comments should be sent to the Deputy Consultant on Legal Education at the American Bar Association, 321 N. Clark Street, Chicago, Illinois, 60654.

Seven law schools will be audited next fall: Arizona, Baylor, Chicago-Kent, Idaho, Missouri-Columbia, Ohio, and Temple. Another twenty are scheduled for next spring. These visits are an excellent time to ask the site evaluation teams to fulfill their responsibilities by taking a hard look at how a particular law school is educating and potentially misleading applicants.

If you have any questions or comments, please feel free to contact us. We plan to begin filing complaints soon.

Class of 2009 U.S. News Data

Each year, U.S. News collects data from almost every ABA-approved law school through an annual survey. As with prior years (Class of 2007 and 2008), we have collected Class of 2009 employment data and cost of attendance data in a spreadsheet for easy use and comparison. While there is a significant time lag—the data came out a few weeks ago for the graduates from nearly two years ago—the spreadsheet serves as one of just a few sources of employment information that prospectives can use to easily compare schools in a standardized fashion.

Also in accord with prior years, we added a few of our own metrics to help readers understand and compare data. These include (1) the percent of the graduating class represented by the private sector salary information and (2) the percent of the graduating class with a known salary.

These metrics show how much (or little) we know about salary outcomes from the salary information provided by using the U.S. News reporting standard. As it turns out, the touted salary medians are poor pictures of the graduating classes for the vast majority of schools. On average, the “median salary” represents what only 29% of private sector graduates made, or just 17% of the entire graduating class. The ABA will hopefully soon begin regulating these practices, ensuring that schools cannot advertise salary information without important qualifiers such as the percentage of all graduates included in salary figures.
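To make the arithmetic behind these two metrics concrete, here is a minimal sketch with invented numbers; the class size and response counts below are hypothetical, not any school’s actual figures:

```python
# Hypothetical figures illustrating LST's two coverage metrics. Real numbers
# would come from each school's U.S. News reporting.
class_size = 400               # total graduates
private_sector_grads = 234     # graduates in private-sector jobs
salaries_reported = 68         # private-sector salaries actually known

# (1) Percent of private-sector graduates the "median salary" covers
private_coverage = salaries_reported / private_sector_grads * 100
# (2) Percent of the entire graduating class with a known salary
class_coverage = salaries_reported / class_size * 100

print(f"{private_coverage:.0f}% of private-sector grads")  # 29%
print(f"{class_coverage:.0f}% of the class")               # 17%
```

With figures like these, a school could truthfully advertise its private-sector median while describing the outcomes of fewer than one in five graduates.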

This year, partially in an effort to better qualify the salary figures, U.S. News made important changes to the way it reports employment data collected from law schools. We reported back in December that changes were coming as a result of our discussions with Bob Morse, Director of Data Research at U.S. News, who implemented our ideas as we proposed. These changes will provide prospective students a more thorough picture of post-graduation outcomes from many ABA-approved law schools. Additionally, these changes will improve our data clearinghouse, since they allow us to eliminate most of our assumptions as we turn the data into more meaningful information.

Information Quality

For the most part, schools do a good job reporting data in the manner requested by U.S. News. That is, there are few discrepancies that cannot be explained by small rounding errors (where the percentages do not add up to 100 percent when they should) or schools misunderstanding the clerkship survey questions. But people should not assume that we are out of the woods. Reporting data according to the U.S. News standard is a separate achievement from presenting true, meaningful information that’s useful for prospectives trying to make informed decisions. The job characteristic data and LST’s new metrics demonstrate just how different the picture can be between how some schools present their data and the reality.

In the coming weeks, we will start a discussion about practices at some schools that may deserve the ABA’s attention. If you know of individual schools where Class of 2009 data evidences schools presenting information in a misleading manner, please do not hesitate to let us know.

Separately, a preliminary data review did find a few errors in the data reported for 2009 grads. We’ve already alerted Bob Morse about the errors, and we have been told that corrections are on the way. Our spreadsheet fixes these errors and highlights other areas of concern.

LST’s Data Clearinghouse

We will soon release the newest installment of our data clearinghouse, once we are satisfied with the underlying data. The clearinghouse helps applicants visualize the data in a way that isn’t intuitively obvious. Many applicants have been using it to better understand school-specific employment information and to make better estimates about future job prospects. We expect it will be even more useful this year because it requires fewer assumptions and prospectives can trace how schools fared over the prior three years.

As always, the data clearinghouse will reflect only cleansed data. We have not and cannot audit the data for accuracy. In the meantime, if you spot any errors or have any comments, please do not hesitate to leave a comment here or email us at .

LST’s Proposal: The Job Outcome List and a National Salary Database

The 509 Subcommittee’s first draft proposal for a revised Standard 509 is a good start. But as we described in our analysis, the proposed revisions are only the first step towards greater transparency. The proposal does not go far enough to disaggregate the current employment information, resulting in a reporting standard that will still struggle to help match prospectives to the law schools that best meet their career objectives.

We have been working on our own proposal, separate from the LST Standard, for a few months now. We have discussed it with key people in the Section of Legal Education, law school administrators, and briefly with NALP’s Executive Director, Jim Leipold. It was born out of discussion at December’s Questionnaire Committee hearing. These conversations have helped shape The LST Proposal into a solution that meets the needs of all interested parties.

The LST Proposal

Our proposal can and should co-exist with the chart proposed by the 509 Subcommittee. Together, the proposals provide prospective students a quick overview of the employment opportunities at various schools while also allowing a more detailed, holistic view for those students who wish to delve deeper. We are hopeful that implementing the two proposals would result in more informed decisions and a more efficient allocation of students to the schools that best meet their career and educational objectives.

The LST Proposal has two core elements. First, each school would report graduate-level data about post-graduation employment outcomes on a “Job Outcome List.” For each graduate, schools would report, as applicable:

  • Employment status
  • Employer type
  • Full-time or part-time
  • Required credentials
  • Location
  • Whether the graduate received special funding
  • Job Source

[View the detailed categories on this chart]

These data are already reported to NALP by all but six ABA-approved law schools (St. Louis University, University of Kentucky, Columbia University, and the three law schools in Puerto Rico). The Job Outcome List would be publicly available.

Second, schools would report known salary data for each graduate. Schools also already report these data to NALP. However, unlike the data on the Job Outcome List, the salary data would not be publicly available. Instead, the Section of Legal Education would create a national database of salary data just like the database NALP already has and reports about in Jobs & J.D.s. The database would include all employment data contributed by law schools each year.

The result would be a public, national database of job outcomes and salaries that respects individual and employer privacy desires. Prospective students would use this database for a general idea of lawyer pay in certain locations for certain jobs, as well as an indicator of the short-term economic value recent graduates are attaining with each school’s J.D.

Mechanics of the National Database

Pairing a national salary database with school-by-school, disaggregated employment information would allow prospectives to understand entry-level salaries without identifying the compensation of any individual graduate. To do this, the database would provide salaries for small, though statistically significant, cross-sections of law school graduates. The cross-sections would be created by using the factors that many prospectives consider to be part of their career objectives: employer type, location, and key job characteristics.

For example, for Class of 2009 graduates, the average starting salary for full-time, bar-required jobs at Los Angeles law firms with 51-100 attorneys was $97,287. The 10th, 25th, 50th, 75th, and 90th salary percentiles are, respectively, $75,000, $80,000, $90,000, $95,000, and $145,000. In Atlanta, the average starting salary for the same category is $107,619, and the salary percentiles are, respectively, $80,000, $90,000, $90,000, $130,000, and $145,000.
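Once the salaries for a cross-section are pooled, its summary statistics are straightforward to compute. The sketch below uses an invented salary list and a simple nearest-rank percentile; the real database would presumably follow NALP’s own methodology:

```python
# Minimal sketch of computing a cross-section's salary summary (e.g.,
# full-time bar-required jobs at 51-100 attorney firms in one city).
# The salary list is invented for illustration.
def percentile(salaries, p):
    """Nearest-rank percentile of a salary list."""
    ordered = sorted(salaries)
    rank = max(1, round(p / 100 * len(ordered)))
    return ordered[rank - 1]

salaries = [75_000, 80_000, 85_000, 90_000, 90_000,
            92_000, 95_000, 110_000, 145_000, 160_000]

summary = {p: percentile(salaries, p) for p in (10, 25, 50, 75, 90)}
average = sum(salaries) / len(salaries)
```

Reporting the full percentile spread, rather than a lone median, is what lets prospectives see how widely outcomes vary within a single job category.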

Under The LST Proposal, prospectives would be able to match these salaries to a school’s actual placement track record in different places in different jobs. Under the 509 Subcommittee’s current draft, if a school collects fewer than five salary data points for a particular category, it reports no salary information at all. Prospectives remain unaware of how graduates fared because the only information available is that Y graduates obtained jobs with 51-100 attorney law firms, with no indication of location or required job credentials.

In order to understand what these salary percentiles mean to a prospective student considering X school, each school must provide enough disaggregated information to allow prospectives to match outcomes to the national salary database. This connectivity is crucial to an operational national salary database. This is one function that the Job Outcome List would serve.

There are a few ways to design the database, and we are hopeful that the ABA, NALP, LST, and other interested parties can have open discussions about how to best execute this vision. Initially, it is our view that between one and five years of salary data, back-provided by NALP, can be aggregated to create a richer salary dataset. The number of years used would depend on the type of job and location, as salaries have shifted more or less for different cross-categories of employment outcomes. (E.g., New York City 501+ attorney firm salaries have remained relatively stable over at least the last three years.)

Additionally, it is our view that the narrowest salary picture should be provided whenever possible. If enough data exist for 51-100 attorney law firms in Atlanta, city-level figures should be available. If not, the database would provide the next narrowest regional dataset. These higher-level datasets might be Fulton County, Metro Atlanta, Georgia, the South Atlantic (DE, DC, FL, GA, MD, NC, SC, VA, WV), and the United States. The categories could also carve certain locations out of a larger geographical area. For example, one category might be 2-10 attorney law firms in Georgia minus Metro Atlanta. The possibilities hinge only on having large enough datasets. Regardless of whether the narrowest set is available, each higher-level dataset should be associable with each listed job outcome.

Other Advantages of the Employment Lists

The benefits of this proposal do not end with the addition of elaborate, privacy-respecting salary information to the marketplace. After all, the jobs graduates take are often based on more than salaries, so a proposal that aims to help match prospectives to their best fit cannot end with only salary information. To this end, the Job Outcome List will help prospectives understand the various kinds of jobs graduates take at particular law schools. Its components offer various insights into the entry-level market and how each school fits into that market.

Long-term Help

Focusing on a single year of data is dangerous, but an improved standard must start somewhere. The concern is certain to be more pronounced when there is more disaggregated information available for public consumption. The fear that prospectives will pay too much attention to the first year of new data, while grounded in reality, is but a consequence of improved transparency at law schools. The LST Proposal will be best after three or five years. At that point, prospectives would be more able to discern which schools can best meet their individual objectives. And that should be everybody’s goal.

The 509 Subcommittee’s Draft Proposal: An Explanation and Evaluation

This is our third post in a series of posts (see the first and the second) where we contemplate the 509 Subcommittee’s draft proposal and the facts needed to understand how it would advance transparency at ABA-approved law schools. This post will explain the new proposal and evaluate it using the three criteria we set out in the second post.

The Subcommittee’s Proposal

On March 14th, the Subcommittee released its first draft proposal for a revised standard for the reporting of employment data. David Yellen, dean of Loyola University Chicago School of Law and chair of the Standard 509 Subcommittee, will present this proposal to the Standards Review Committee on Saturday, April 2, 2011 in Chicago. We will provide updates on any changes that come out of the meeting.

The draft proposal has three parts: a memorandum explaining the subcommittee’s operating assumptions and goals, a new Standard 509(b), and a chart that each law school would be required to fill out and post on its website each year.

The Memorandum

In the memorandum, the Subcommittee states that the goal is to “provide more meaningful and consistent employment information to prospective law students . . . [that will] greatly assist prospective students in making informed decisions about whether to go to law school or which law school to attend.” Right away the Subcommittee recognizes that schools already gather a great deal of data, and that it follows that sharing more information with prospective students will require only a small (and, implicitly, justified) burden.

The Subcommittee describes the consumer protection standard, Standard 509, as “a vague standard” that enables schools to provide limited and hard-to-compare information. The fact that reporting practices vary so widely among schools makes it very difficult for prospectives to understand the employment outcomes of a particular set of graduates. What’s more, the Subcommittee continues to recognize that the presentation of information is occasionally misleading. This reflects previous comments made by Dean Yellen.

The memorandum then cabins the problems with the current information into two categories: employment rates and salary information. The Subcommittee establishes two principles regarding the first category. First, “the percentages disclosed should be based on the entire graduating class, with only those known to be employed being counted as such.” The second principle regards the variety of jobs graduates take, and the problem of providing misleading impressions about the true successes of a school’s graduates. “[T]he best approach is to require schools to disclose more disaggregated data about . . . categories of jobs.” These categories include nonprofessional jobs, part-time jobs, temporary jobs, and jobs funded in part by the school.

Regarding the second category, the Subcommittee recognizes the limited utility of salary medians and the likelihood that readers will misunderstand what the medians refer to and how they are calculated. The Subcommittee proposes that “all salary information clearly indicate the number of respondents and percentage of all graduates included.” This is an important revision that will change the manner in which many schools currently portray salary statistics. For examples of how problematic this can be, check out LST’s data clearinghouse. (The linked example shows a school that reported a median salary of $160,000, despite it being the median for only about 16% of the entire class.)

Proposed Standard 509(b)

The first proposal made by the Subcommittee is as follows:

Standard 509. BASIC CONSUMER INFORMATION
(b) A law school must publicly disclose the employment outcomes of its graduates by preparing and posting on its website the attached chart.
(1) The employment information must be accurate as of February 15th for persons who graduated with a JD or equivalent degree between September 1 two calendar years prior and August 31 one calendar year prior.
(2) The information must be posted on the school’s website by March 31 each year.
(3) The information posted must remain on the school’s website for at least three years, so that at any time, at least three graduating classes’ data is posted.
(4) The information must be gathered and disclosed in accordance with the instructions and definitions issued by the Section’s Questionnaire Committee.
(5) Any additional employment information the law school discloses must be fair, accurate and not misleading.
(A) Any publicly disclosed statistics regarding graduates’ salaries must clearly identify the number of salaries and the percentage of graduating students included.

The proposed Standard 509(b) requires that schools publicly disclose the employment outcomes of the most recent graduating class as true on the first February 15th following graduation. Schools must disclose these outcomes, at minimum, on the “attached chart” by the first March 31st following graduation. It also requires schools to keep the chart on their websites for at least three years. Finally, it adds a catch-all in 509(b)(5) to protect against predatory, opportunistic practices. This specifically includes a solution to misleading median salary practices that some law schools currently use.

The Chart

[View the chart]

The proposed Standard 509(b) “attached chart” aims to exhibit the outcomes of the entire graduating class as of the first February 15th following graduation. The chart disaggregates the current information into smaller categories to illuminate the outcomes graduates achieve at a particular school. The chart is also the first official recognition by an arm of the Section of Legal Education that salary information is in fact “basic consumer information.”

There are two classes of categories on this chart: employment status and employment type. For each category and subcategory, schools must report the percentage of all graduates, rather than of only employed graduates, as well as the raw number of graduates included in the calculation. This decision aims to limit the impact of creative accounting and less than forthright attempts at collecting employment data from graduates.

The employment status class places all graduates into four exhaustive categories: employed, pursuing a graduate degree full-time, unemployed, and employment status unknown.

The chart breaks down “employed graduates” in two ways. First, it divides all employed graduates into four exhaustive kinds of employment: full-time long-term, full-time short-term, part-time long-term, and part-time short-term.

Second, it breaks all employed graduates into exhaustive categories based on the credentials required (or preferred) to do the job: bar passage required, J.D. preferred, other professional, or non-professional. It then further breaks each of those categories into (the same) four exhaustive kinds of employment: full-time long-term, full-time short-term, part-time long-term, and part-time short-term.

The employment type class breaks all employed graduates into six exhaustive categories based on the type of employer: law firms, business & industry, government, public interest, judicial clerkships, and academic. Of those categories, the law firm and judicial clerkships categories are further broken down by type. The law firms are disaggregated by size and the clerkships are disaggregated by level of government (state or federal).

Finally, full-time salaries will accompany each category of full-time employed graduates (except solo practitioners) whenever at least five salaries are reported in a given category. These salaries will be reported as 25th, 50th, and 75th percentiles, along with the number of salaries used to create the quartiles. There is also a space for schools to report the total number of jobs they funded.
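As we read the draft, the chart’s salary rule amounts to a simple threshold check before quartiles are published. A sketch, with the function name and salary figures invented for illustration:

```python
import statistics

# Quartiles accompany a category only when at least five salaries were
# reported; otherwise the category appears without salary figures.
def chart_salary_row(salaries, minimum=5):
    """Return (25th, 50th, 75th percentile, count), or None below the minimum."""
    if len(salaries) < minimum:
        return None  # category reported on the chart without salaries
    q1, q2, q3 = statistics.quantiles(salaries, n=4)
    return (q1, q2, q3, len(salaries))
```

The five-salary floor protects the privacy of graduates in thin categories, at the cost of suppressing salary data at smaller schools where many categories will fall under the threshold.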

A Good Start, But More To Be Done

The 509 Subcommittee is off to a really strong start in reforming how schools report employment information. It was made clear to us that this is only a preliminary draft, and that the Subcommittee expects more changes will be made. We hope this is the case.

The principles guiding the Subcommittee are sound. It is true that the information must be meaningful, consistent, and help prospectives make informed decisions about whether to, and where to, attend law school. But the execution of these principles still leaves something to be desired. If approved as a new accreditation standard in its current form, the proposal would certainly help prospective students and drastically cut down on misleading statistics. At the same time, it runs the risk of only providing superficial comfort, because it would not help match students to the schools that best meet their career objectives as efficiently as legal education needs.

As we previously outlined, we will use three criteria to assess the draft proposal.

(1) Does it disaggregate the current information?
(2) Does it demonstrate the economic value of a school’s J.D.?
(3) Does disclosure operate on an accelerated schedule?

Does it disaggregate the current information?

This proposal does disaggregate the current information. It helps show the nature of the jobs graduates obtained and with whom the graduates were employed. But as evidenced by comparing this draft proposal to the LST Standard, the vague “employed at 9 months” standard, where “a job is a job,” can be disaggregated to varying degrees. We’ve concluded that this draft does not disaggregate the current information to an adequate degree.

The more disaggregated employment information is, and the more data provided at that degree, the more likely it is that there will be privacy norm concerns. With these norms in mind, there is a legitimate interest in not disclosing all of the employment data that law schools already collect. On the other hand, law schools already collect all of the data needed to help prospectives make informed decisions, so cost concerns are greatly overblown (as the Subcommittee recognizes). As such, the appropriate level of disaggregation must balance privacy norms against the usefulness of additional disaggregation to anybody trying to understand the entry-level market for a school’s graduates.

It is the job of the Section of Legal Education to use its regulatory power to enforce the right balance. The Section must force schools to share the appropriate level of disaggregated information and must not opt to require less useful information because law schools have competitive concerns. The important question thus becomes how much weight the Section of Legal Education should give to schools that believe that more disaggregated information could (i) hurt their recruiting efforts, (ii) cause prospectives to focus too much on the first job in making their law school decision (as opposed to something else the schools think prospectives should focus on), and (iii) cause confusion through information overload.

Among the opportunities for improvement is how well the proposal connects job outcome features together. It does not disaggregate the locations of these jobs and does not show how the job, employer, and location connect for individual graduates. For example, we might be able to tell that 60% of a school's graduates are working at jobs that require bar passage, but we do not know what percentage of those are working in business & industry. Likewise, we might know that 15% of a school's graduates work in 2-10 attorney law firms, but we cannot tell what percentage of those graduates are working there as attorneys. This is not merely a theoretical concern: a sizeable percentage of law school graduates work in non-attorney positions in law firms. The decision not to disaggregate further directly contravenes the Subcommittee's principle against providing misleading impressions about the true successes of a school's graduates.

Part of the reason additional disaggregation is so important is that it would minimize the effect of national rankings on student decision-making by offering a window directly into where graduates land shortly after graduating. With this proposal, a prospective's choice might still hinge on where a school ranks each year in U.S. News rather than on how well a school can help a student achieve her goals. Prospectives need clarity about how a school fits into the legal hiring market.

After all, the Subcommittee's stated goal is to help prospectives make "informed decisions about whether to go to law school or which law school to attend." The proposed solution is satisfactory only insofar as the goal is to differentiate between schools using percentage differences in broad, albeit more disaggregated, categories. It will still be too difficult to know the challenges graduates face in achieving their career objectives, which usually involve a combination of location, employer type, and required credentials. Nor, without sufficient granularity, will prospectives easily identify a school's placement niches. Altogether, prospectives will still struggle to understand schools' unique placement abilities.

Another issue with the Subcommittee’s method of disaggregation is that it actually creates new gaps in the information (though not to a debilitating extent) and thus an incentive for creative accounting. One of the purposes of disaggregating the nine-month employment rate is to limit how much schools hide employment outcomes. Unnecessary gaps undermine this purpose.

The total number of graduates in each subcategory, taken together, should equal the total number in the parent category. For example, the total number of graduates who are employed, unemployed, pursuing a graduate degree, or whose employment statuses are unknown should equal the total number of graduates in the graduating class because the categories are exhaustive.

The unknown status category is very important for identifying gaps in the employment status data. However, an unknown category is missing from all other exhaustive groups except the group for type of law firms. The employment type category, required credentials subcategory, judicial clerkships subcategory, and the full time and part time (and corresponding long and short term) subcategories all need an unknown field so that the numbers in the subcategories all equal the parent category’s total number.

Helping prospectives understand where data gaps exist encourages them to ask the right questions and serves to limit false impressions due to extrapolating outcomes from unrepresentative segments of the graduating class. Unfortunately, allowing schools to report graduates as “unknown” in any category incentivizes schools to avoid learning or researching employment outcomes. However, it is more important that the gaps created by non-reporting graduates are readily identifiable. As such, all exhaustive categories and subcategories need to account for each graduate.
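The bookkeeping rule at issue, that exhaustive subcategories (including an "unknown" field) must sum to their parent category's total, amounts to a one-line consistency check. A minimal sketch with hypothetical numbers:

```python
def unaccounted(parent_total, subcategory_counts):
    """Graduates missing from an exhaustive set of subcategories.
    A nonzero result is a reporting gap that an 'unknown' field
    should make explicit rather than leave invisible."""
    return parent_total - sum(subcategory_counts.values())

# Hypothetical class of 200 reporting employment status:
status = {
    "employed": 150,
    "unemployed": 25,
    "pursuing graduate degree": 10,
    "unknown": 7,
}
gap = unaccounted(200, status)  # 8 graduates unaccounted for
```

If every exhaustive category and subcategory carried its own "unknown" field, this difference would always be zero and any gap would be visible on the face of the report.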

Does it demonstrate the economic value of a school’s J.D.?

It is a huge step forward for the Subcommittee to recognize salary information as “basic consumer information.” As of right now, the only standardized, school-specific salary information is courtesy of U.S. News. Until this year, even U.S. News salary information was too opaque.

The Subcommittee’s proposal does a decent job with highlighting what new graduates make and, accordingly, demonstrates some of the economic value of each school’s J.D. This new salary information would allow prospective students to roughly understand how well graduates can service their debts immediately after law school. For the Class of 2009, the average graduate had $98,055 of law school debt, which translates to about a $1200/month loan payment.

While the Subcommittee’s approach is useful and likely the best way for schools to report school-specific salary outcomes without using job-specific salary data, it is not the approach we think the Subcommittee should take. A better way would be to leverage the reported salary data of all law schools together the way NALP does in its annual Job’s and J.D.’s. Certainly, if prospectives knew about this publication, which costs non-members $90, they could use it to have a better understanding of entry-level salaries for law school graduates. But there is currently no way to bridge the gap between this salary information and an individual school’s graduates, and the Subcommittee’s proposal does not help on that front, so it is limitedly useful for those trying to decide which law school to attend.

The aforementioned lack of connectivity between employers, job credentials, and job location makes it very difficult for prospectives to understand how the new salary information affects them, particularly for loan payments. For example, a $160,000 starting salary for a new associate grows differently in New York City than in Houston due to salary compression in years two through seven. Additionally, $70,000 in New York City does not go as far as $70,000 in Philadelphia, Raleigh, or Nashville. The geographic impact on the ease of loan repayment cannot be overstated. Even if a prospective has the Jobs & JDs book, that information can only take her so far because its salary breakdowns are very specific (e.g., attorneys in 2-10 person law firms in X city). Nothing in the new standard or chart helps answer these important questions.

There is a separate concern about whether each category would have meaningful salary information associated with it. For example, 10 graduates may work at small firms, with only four reporting salaries. In this case, the four salaries do not get reported and thus serve no use. They are simply swept away. However, if these four salaries were added to a national salary database, those four become 40 or even 400, and the result is meaningful salary information about jobs that would not otherwise have any. Unfortunately, this resource cannot be utilized on a school-by-school basis without more disaggregation. In our next post we will explain our proposal for doing this in depth.

Does disclosure operate on an accelerated schedule?

Yes. In striking this balance between cost concerns and the need for timely information about the most recent graduating class, the Subcommittee has paved the way for significant improvements beginning as early as next year. At the Questionnaire Committee hearing in December, law school administrators expressed concern that requiring schools to report information too soon would be too high a burden given cost constraints. But by limiting the Standard 509 requirements to only data that schools submit to NALP in February/March, the Subcommittee erases these concerns. Even small career services staffs will be able to comply with the standard provided they already report to NALP, which nearly every ABA-approved law school does. Given that collection methods are now mostly electronic (through Symplicity or other user-entry databases), assembling and posting the data according to the proposed Standard 509(b) would take very few work hours and limited financial resources beyond what schools already allocate voluntarily.

Concluding Thoughts

The goal of a revised Standard 509(b) must be to help students make informed decisions about which (if any) school best meets their career objectives. While a good start, we think that, as currently conceived, the Subcommittee’s proposal will fail to adequately achieve this basic goal.

We ask that each member of the Committee imagine herself as a prospective student trying to choose a school to invest thousands of hours and dollars into. Each member must then think about how soundly she can act after analyzing employment information reported according to the new standard, and consider how well she actually understands the school’s ability to help her achieve her career objectives. We suspect that this thought experiment would leave each member uncomfortably uncertain. This uncertainty, at a minimum, should be addressed through a non-theoretical exploration of the standard’s implications. Before accepting a new standard, the Standards Review Committee should compare a few schools using real employment information presented as it would be under the proposed revisions.

An improved Standard 509 has the ability to wage an important battle against the influence of U.S. News on the decision-making of prospective law students. But without sufficient disaggregation of the current employment information, the effects can only be minimal. Under the current proposal, it is still too easy to imagine a prospective student choosing the #55 ranked school located on the east coast over the #81 ranked school on the west coast because she does not know, for example, what to make of the schools’ minute differences in percentage employed in mid-sized firms as it pertains to her goals of working out west in a mid-sized firm. Without adequate information to dissuade her, she might come to the head-scratching conclusion that #55 must be better because it is ranked higher. This is bound to worsen now that there are 45 more schools ranked on a national scale.

Each year, the Section of Legal Education makes an effort to minimize the effect of national rankings. We are sure that almost every law school administrator would agree with the Section’s sentiments, and revising Standard 509 is the chance to show that these are not empty words. We look forward to working with the Subcommittee to improve this first draft.

Three Critical Features for the ABA’s Collective Solution to Employment Reporting

This is the second post in a series (see the first) contemplating the 509 Subcommittee's proposal and the facts needed to understand how it would advance transparency at ABA-approved law schools. This post provides three criteria we will use to judge the ABA's actions. In our next post, we will evaluate the new proposal using these criteria.

Whatever standard the Standards Review Committee and Questionnaire Committee together adopt, it must:

(1) Disaggregate the current information
(2) Demonstrate the economic value of a school’s J.D.
(3) Operate on an accelerated schedule

(1) Disaggregate the current information

 
The most serious handicap of the current reporting standards is that the standards allow outcomes to be hidden in aggregate form. For prospectives seeking to make an informed decision, and law schools seeking to fulfill their educational responsibilities, the new standard must provide an accurate picture of the entry-level job market for each school. To do this, any new standard must characterize the jobs graduates obtain beyond “a job is a job.” This includes the nature of the jobs graduates obtain, with whom the graduates are employed, and the locations of these jobs. Gaps in the information also must be clearly visible to limit prospectives from extrapolating from unrepresentative segments of the graduating class.
 
The best way to achieve this is by requiring graduate-level detail, just like NALP has been collecting for years. This allows prospective students to know the challenges they face for achieving their educational and career objectives, which will help them maximize the value of their time spent in law school. The granularity also respects school regionality and encourages schools to develop their placement niches. Whether this niche is in a particular region or city, a field of law, or a sector, this feature publicizes each school’s unique placement ability. Displaying where all graduates go post-graduation can help match students to the right programs, minimizing the effect of national rankings on student decision-making. The choice then becomes less about what a school ranks each year in U.S. News and more about how each school can help a student achieve her goals. If it is clear where a school fits into the legal hiring market, schools will be encouraged to adapt and innovate, and may even be able to reduce costs.

This does not mean that law schools must share how much each individual graduate makes at her first job, as we have done with the LST Standard. Rather, law schools just need to provide enough graduate-level detail to enable prospectives to make a meaningful connection between the post-graduate outcomes for a given school’s graduates and the regional market rates for those jobs.
 

(2) Demonstrate the economic value of a school’s J.D.

 
While disaggregating the current information into graduate-level detail allows for rough estimates of economic value, the ABA does not currently consider salaries to be basic consumer information. It is time for the ABA to recognize the importance of starting salary as basic consumer information. Some prospectives come to law school straight from undergrad with low opportunity costs, and others change careers or work first, but almost all will eventually pay an enormous amount of money for the privilege of earning a J.D. It is difficult to separate the question of "how much will I make?" from "how much will my monthly loan payments be right after I graduate?" Likewise, it is difficult to think about the salary a graduate earns separately from where that graduate lives and works. New salary information must be presented in a way that allows prospective students to understand how graduates begin earning the income they need to juggle loan payments, living expenses, and everything else a new member of the legal profession must pay for.
 
It is clear that a graduate's starting salary is only a part of the economic value a graduate can derive from the degree, and that many graduates (notably solo practitioners) may see a sharp upward trend in their earnings over the first five to ten years. However, entry-level salaries are a good place to start, and the nine-month mark is the least costly point at which to assemble a comprehensive picture of a graduating class. The Bureau of Labor Statistics provides salary information for lawyers, but lawyers represent only a portion, even if a large one, of law school graduates. The important question is the value of the law degree itself. Between 60% and 70% of all 2009 law school graduates had jobs, as of February 15, 2010, that required a J.D. Of those that did not, some will eventually find work as attorneys. Likewise, some of the graduates who work as lawyers after law school will soon leave the profession. None of this warrants hiding information about post-graduate outcomes. Career trajectories are hard to predict, but they all necessarily include the first job.
 

(3) Operate on an accelerated schedule

 
The data and information reported on the annual questionnaire and on law school websites must be published in a timely manner. The 2009–2010 questionnaire was due October 31, 2010. This included employment information about the class of 2009, which was finalized on February 15, 2010, and will not be published anywhere until after the admissions cycle for the Prospective Class of 2014 has just about concluded. The Class of 2009 information will not appear in the Official Guide until after the Class of 2010 data has been assembled and reported to NALP. The ABA must publish this information sooner, along with other consumer information as it becomes available.

There is no reason why law schools cannot either submit employment data to the ABA or provide employment information on their websites by the end of March each year for the most recent graduating class. According to Jim Leipold, Executive Director of NALP, data straggles into NALP from February 15 through March 15, but by the end of that period almost every school has reported all of its employment data. This data is fresh in everyone's minds and can be readily provided to the consumer at low cost to career services staff.

In the old days, there was good reason why prospectives needed to wait to see this data: law schools submitted it to NALP on individual paper forms. Simply put, times have changed. Ninety percent of law schools now submit the data in an electronic format, downloaded from whatever system the school uses to survey graduates. And although schools already have the Class of 2010 data accessible, to our knowledge no school has posted any 2010 employment information on its website.

While the problems with the current employment information (see our white paper for more detail) are separate from the terrible job market, the present job market makes the current reporting schedules unquestionably unacceptable. Regardless of whether job placement for a given year was good or bad, prospectives should still be able to see the full picture. But when the available information is so outdated that it differs greatly from current placement trends (as evidenced by the new NLJ 250 statistics), not providing up-to-date information to consumers grossly undermines the obligations law schools have to their students and to the legal profession. This is particularly true when the information has already been collected and can be disclosed to the consumer with relative ease.

These three evaluation criteria were originally communicated to the Questionnaire Committee and Dean David Yellen (Chair of the 509 Subcommittee) at the December 2010 Questionnaire Committee hearing by LST’s Executive Director.

The Current Employment Information Reported to the ABA

This is the first of a series of posts where we contemplate the 509 Subcommittee’s proposal and the facts needed to understand how it would advance transparency at ABA-approved law schools. This post begins this process by describing the current employment information that schools report to the ABA according to Standard 509 and the annual questionnaire. Later we provide three criteria to judge the ABA’s actions and then will evaluate the new proposal with those criteria in mind.
  
Law schools must report "basic consumer information" about their programs to the ABA, including information about the employment outcomes of their graduates. Currently, the ABA requires that schools report employment rates nine months after graduation, as well as basic bar passage statistics. The annual questionnaire requires that schools report these placement rates for the second-most-recent class, roughly 16 months after most of the graduates earned their degrees. It takes about two years from graduation for the ABA to publish the information for public consumption.
 
These employment rates include the employment status of all graduates, as well as the type of employer, type of job, and geographic location of all employed graduates. For all of these categories, “a job is a job.”

The employment status includes five exhaustive categories: employed, unemployed—seeking, unemployed—not seeking, pursuing an advanced degree, and unknown. Although exhaustive, the total number of graduates in each category inexplicably does not always add up to the total number of graduates. As one of many examples in the most recent Official Guide, New York Law School does not account for eight graduates while reporting according to these exhaustive categories. The ABA disclaims any warranty as to the accuracy of the information submitted by law schools, so it is unlikely that anybody will correct even basic errors.
 
The employer type rate only considers what business the employer engages in, rather than the type of job the graduate works for that employer. Accordingly, the percentage of graduates “employed in law firms” includes lawyers, paralegals, and administrative assistants. Likewise, “employed in business and industry” includes everyone from an in-house lawyer to a short-order cook. The job-type rate aims to shed some light on these logical disconnects.

NALP’s annual reports on the entry-level hiring market indicate that the disconnect is not merely theoretical, as a sizeable percentage of graduates take these non-law jobs at law firms and in business each year. That graduates take these jobs is not necessarily a problem. The problems are that it is unclear to readers that there exists a disconnect and that, once realized, readers cannot determine what types of non-law jobs these graduates take. Perhaps, originally, all that mattered was the bar-passage-required rate versus the not-required rate. But when a school advertises the versatility of a J.D., unassuming consumers are likely to think many of these graduates are doing something with their degree other than becoming a paralegal or short-order cook. The reality is that just about every graduate needs to find some way to earn money because most of them used student loans to pay for their education.

The current ABA employment reporting standard is seriously limited by its form and substance. This standard aggregates employment outcomes and makes it difficult for prospectives to understand the various employment opportunities for new J.D.'s. Separate from problems with the standards themselves, schools' individualized reporting policies often package information in ways that are not only difficult to compare, but oftentimes misleading. While these policies arguably violate Standard 509, the "fair and accurate manner reflective of actual practice" portion of the standard has yet to be enforced.
 
What follows is that prospective law students rarely make informed decisions about whether, and where, to attend law school. The ability to make an informed decision directly relates to prospective law students’ ability to access quality information, and the available resources are inadequate for prospectives who strive to take a detailed, holistic look at the diverse employment opportunities at different law schools.

Because prospectives usually do not have enough information about employment outcomes to make an informed decision, they often look to other resources to facilitate comparisons among schools. Most famously, U.S. News provides a yearly law school ranking that prospectives often use as a proxy for schools' job placement opportunities. While the U.S. News ranking drives down transaction costs for prospectives seeking to acquire and explain information, it also causes prospectives to make decisions based on minute, arguably arbitrary rankings disparities. U.S. News's decision to rank the former third tier will only exacerbate this problem.
 
These problems have existed for quite some time, and are divorced from schools' current struggle to help their graduates find gainful employment. That said, the economic climate is creating ever-larger implications for the legal profession. Law school in the U.S. is now an extremely costly proposition in terms of both direct attendance costs and opportunity costs. Tuition continues to rise, debt is not dischargeable in bankruptcy, and the expected value of all outcomes is less than it was just a few years ago. The result is more graduates for whom uninformed decisions will adversely affect their well-being. Caveat emptor may be an attractive quip when consumers choose to buy inherently dangerous goods, but it is not applicable when even the most informed prospectives really have no idea what kind of return follows from investing in a particular J.D.

This post is derived from our white paper. Many of these comments were also presented by LST’s Executive Director at the ABA Questionnaire Committee hearing in December, 2010.