Breaking: Class Action Suit Filed Against Thomas Jefferson School of Law

Update: Follow the latest on Alaburda v. TJSL here.

At least one graduate has chosen to seek judicial relief from her alma mater in a class action that could include over 2,300 graduates of Thomas Jefferson School of Law in San Diego, California. Sara Randazzo broke the news (subscription required) at midnight PDT in the Daily Journal. The story will be available in print Friday morning.

The complaint (see the case summary below) alleges that Thomas Jefferson School of Law (TJSL) has engaged in “fraudulent and deceptive business practices,” including “a practice of misrepresenting its post-graduation employment statistics,” and that “the disservice TJSL is doing to its students and society generally is readily apparent.” The complaint cites a number of news articles over the last few years, and quotes from law school faculty and administrators to demonstrate the widespread consensus that schools are engaged in unfair and misleading practices. You can check out the complaint for yourself here. The complaint was filed by lead plaintiff Anna Alaburda, a 2008 honors graduate of TJSL. Additional court documents are attached to this post.

This lawsuit is of historic significance. It is the latest example of the breakdown in trust between law schools and their students, their graduates, and the profession. Law schools have a duty to be honest and ethical in how they report and present employment data. This lawsuit shows that at least some members of the profession believe these duties are legal requirements, not merely professional or educational ones. Perhaps importantly for some critics, Ms. Alaburda decided to attend law school before the legal market collapsed and before stories of misleading information were widespread.

Current Employment Information

As of today, TJSL is still providing misleading employment information (the “TJSL Report”) on its website for the Class of 2009. Compounding the problem, TJSL has thus far declined to report any Class of 2010 information on its website, despite having already collected sufficient employment data about that class when it reported to NALP back in March of this year. Almost every law school could do a much better job educating prospective students about the nature of the jobs obtained by their graduates; TJSL is no different. The most serious fault we find with the TJSL Report is how the school misrepresents starting salaries.

The underlying data for the TJSL Report match the U.S. News-provided information

The TJSL Report claims that the school collected at least some data from 86% of graduates (respectable, though still putting them in the bottom 5% of all law schools), and that of those graduates 84.7% were employed. This means that 72.8% of Class of 2009 graduates were known to be employed, which is the same as what the career services office reported to U.S. News. Likewise, both sources indicate that 80% of the graduates known to be employed were employed in the private sector, i.e. working for law firms or in business & industry in some (any) capacity. This data match makes it possible for us to examine TJSL’s advertised placement success with the more detailed reporting rates submitted to U.S. News.
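As a quick sanity check on that arithmetic, here is a minimal sketch in Python using the percentages quoted above; the rounding is ours.

    # Reconciling the TJSL Report with the figure reported to U.S. News (Class of 2009)
    response_rate = 0.86        # share of graduates TJSL collected data from
    employed_of_known = 0.847   # share of those respondents reported as employed

    known_employed_rate = response_rate * employed_of_known
    print(f"{known_employed_rate:.1%}")  # 72.8%, matching the U.S. News submission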

TJSL Salaries

Based on our calculations from the data submitted to U.S. News, only 17% of those working full time in the private sector reported a salary. This means that at most 22 graduates reported salary data for full-time, private-sector jobs to TJSL. (This puts TJSL in the bottom 10% of law schools by percentage reporting.)

We say “at most” because the U.S. News salary figures only include full-time jobs. Only about half of TJSL’s Class of 2009 graduates had full-time jobs. Some of these were likely with law firms and in business, but probably not all of them. The only thing we gain from the information provided in the TJSL Report is that at least five salaries underlie the average salary figures for law firm practice ($62,443) and for business jobs ($90,267). Based on the other data, each average probably uses data for only a few more graduates than the minimum five. As such, the $90,267 and $62,443 average salaries are each based on data for between 2% and 8% of the entire class (for a total not to exceed 10%).

The substance of these salary averages is not apparent from TJSL’s Report or website. In fact, the picture the published averages present is far rosier than reality. The business salary average is significantly higher than the California mean salary for the business category, $83,977, according to NALP.

For law firm jobs, the problem is a little different. While the national mean salary for law firms is $115,254, that average is misleading on its face because 40% of the salaries used to calculate the average were $160,000 and 5% were $145,000. If we factor these salaries – the salaries most likely to be reported – out of the average, the average reduces to $80,007.
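For readers who want to reproduce that adjustment, here is a rough sketch of the arithmetic; the only assumption is that the remaining 55% of reported salaries can be treated as a single group.

    # Backing the $160,000 and $145,000 salaries out of NALP's national law firm mean
    national_mean = 115_254
    share_160k, share_145k = 0.40, 0.05            # shares of reported salaries, per NALP
    remaining_share = 1 - share_160k - share_145k

    # national_mean = 0.40*160,000 + 0.05*145,000 + 0.55*adjusted_mean
    adjusted_mean = (national_mean - share_160k*160_000 - share_145k*145_000) / remaining_share
    print(round(adjusted_mean))  # roughly 80,007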

Although this average still likely skews high, the effect of large-firm salaries on the adjusted average is apparent. Those with higher salaries are far more likely to report. These salaries are also usually publicly known, so graduates do not need to report their salaries to be included in these averages; schools can report any salary they have reason to believe is accurate. This practice is not only common at law schools, but encouraged by NALP. As the TJSL Report states, “Our annual employment statistics are compiled in accordance with the [sic] NALP’s Employment Report and Salary Survey.”

The main point here is that the average salary reported in the TJSL Report skews high without context: no salary ranges, percentiles, or observational data beyond the five-graduate floor have been provided. TJSL could, if it wanted, provide the following chart as specific context. This information, covering all NALP-reported graduates working in California, comes from NALP’s Class of 2009 Jobs & JDs. TJSL receives a copy of this report, since it is an active participant in NALP’s research. Our example uses all California salary information because 83% of TJSL’s graduates known to be employed were employed in California.

Firm Type | # TJSL Grads | 25th | Median | 75th | Middle 90% | Avg.
(The # Grads column reflects TJSL data; the salary columns reflect California data for all graduates.)
2-10 Attys. | 36 | $52,000 | $62,400 | $72,000 | $36,000 – $100,000 | $63,526
11-25 Attys. | 2 | $60,000 | $70,000 | $80,000 | $45,000 – $135,000 | $77,096
26-50 Attys. | 3 | $70,000 | $78,000 | $95,000 | $50,000 – $130,000 | $83,152
51-100 Attys. | 4 | $79,000 | $90,000 | $135,000 | $62,500 – $160,000 | $105,449
101-250 Attys. | 2 | $100,000 | $145,000 | $160,000 | $85,000 – $160,000 | $135,171
251+ Attys. | 7 | $160,000 | $160,000 | $160,000 | $140,000 – $160,000 | $156,904

The number of TJSL graduates in each category indicates that the salaries TJSL used to calculate its published average firm salary skew even higher than normal. If between 5 and 17 graduates reported a law firm salary, at least some were from jobs paying six figures. But it’s difficult to know how many of those were six-figure jobs because the employer category includes non-attorneys making significantly less than attorneys with the same employer. Of course, prospective law students could know all of this if the school had decided to tell them.

Overall, it is easy to see how a prospective TJSL student today could be misled into thinking that a $200,000 investment in a TJSL degree is worth it. It remains to be seen whether our analysis holds for previous years, as well as whether what we consider misleading is sufficiently fraudulent, misrepresentative, or unfair according to a California state court.

TJSL is not alone

Countless other law schools across the country engage in similarly misleading practices, making them equally at risk of facing a class action. Every law school has the opportunity to provide better information and better context for that information. Some schools are proactively reforming how they present employment data, but many more have not yet felt compelled to change their behavior. Lawsuits like this will make law schools quickly rethink how they promote their programs.

NYLS’s Deceptive Practices

We recently learned of an email sent to accepted students by William D. Perez, Assistant Dean of Admissions and Financial Aid for New York Law School. The email is a response to what Dean Perez considers to be misinformation about law schools in the media. In an effort to convince accepted students to reconsider NYLS, his email tries to balance out the discussion by sharing some positive facts about NYLS. You can view the entire email here.

Our issue is not that Dean Perez wants to allay fears about law school in general and NYLS in particular. Any school, especially one where the average debt for 2009 graduates borrowing was $119,437, should believe that its opportunities justify the cost of attendance and should share information that materially affects a prospective’s cost-benefit analysis. Our issue is that NYLS has not provided nearly enough information, either in Dean Perez’s email or in its publications, to support some of the claims made in this effort to recruit next year’s class. Next week, we will submit our concerns to Dean Perez and Dean Richard Matasar in the hopes they will act responsibly to resolve what is possibly a violation of accreditation Standard 509.

Dean Perez claims that “our graduates are getting jobs, earning money and able to repay their loans.” But available information demonstrates otherwise. At worst, Dean Perez has overstated this claim in a deceptive and irresponsible manner. At best, NYLS has failed to meaningfully portray the data he believes supports these propositions. We’ll begin by addressing the employment and salary information that NYLS provides to prospective law students, and then move on to discuss the (un)importance of loan default statistics.

Getting Jobs. Earning Money.

NYLS’s employment statistics webpage (“Statistics Page”) (source) is designed for prospective law students trying to answer questions about job opportunities at NYLS. But it takes specialized knowledge about the reporting process and access to third-party information to recognize that these numbers are misleading.

For starters, NYLS provides its nine-month employment rate (89.7%), the breakdown of its employed graduates (first table below), and some of their salaries (first and second tables below).

Salaries
Employer Type | Percentage | Range (Min-Max) | Average
Private Practice | 45.6% | $28,000 – $160,000 | $120,197
Corporate/Business | 23.7% | $50,000 – $96,000 | $75,167
Government | 8.2% | $41,000 – $72,000 | $56,054
Public Interest | 16% | n/a | n/a
Judicial Clerkship | 3.4% | $42,000 – $58,200 | $45,887
Academic | 3.1% | $40,000 – $45,000 | $42,500

You might expect that this table reflects a breakdown of the 89.7% of its 440 graduates because this is the “employment rate for the Class of 2009” as of February 15, 2010. This rate, although not unusual, is not what it seems. It’s actually an adjusted rate, which, until this year, U.S. News used for its rankings:

Employment Rate = (graduates known to be employed or enrolled in a full-time degree program + 25% of graduates whose employment status is unknown) / (total graduates – graduates who are unemployed and not seeking work)

Based on Class of 2009 employment data submitted to U.S. News, we come to a rate of 89.6%. The result is off by .1% due to rounding error, but nevertheless confirms NYLS’s rate calculation. As such, the employer type breakdown reflects only 82.7% of the class, because those are the only graduates reporting an employer type.
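For illustration, here is a minimal sketch of that adjusted-rate calculation as a function; the example counts at the bottom are hypothetical placeholders, not NYLS’s actual survey responses.

    def adjusted_employment_rate(employed, enrolled_ft, unknown, total, not_seeking):
        """Old U.S. News rate: employed or enrolled graduates plus 25% of graduates
        with unknown status, divided by graduates considered to be in the market."""
        return (employed + enrolled_ft + 0.25 * unknown) / (total - not_seeking)

    # Hypothetical counts for illustration only (not NYLS's reported figures)
    print(f"{adjusted_employment_rate(350, 10, 60, 440, 5):.1%}")  # 86.2%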

Next we look at the salary information. Understanding the salary figures on this table requires understanding the size of the dataset. This is difficult based on what NYLS says about its size: “Approximately 20% of our 2009 graduates reported salary information.” There is no clarity about what the denominator is. It could be the entire graduating class (440), the number of graduates counted as employed using the adjusted rate (395, from the 89.7% rate), or the actual number of graduates with any job (364, from the 82.7% rate). We will assume that it uses 364 graduates as the denominator on the grounds that these are the only graduates who could be expected to report a salary.

From this, we know that the first table includes salaries for roughly 16% of the entire class (73 graduates). But we have no indication from the Statistics Page as to the distribution of graduates throughout employer types, other than knowing that zero graduates working public interest jobs reported a salary.
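A short sketch of the three possible denominators follows; the third reading is the one we assume above.

    total_grads = 440
    possible_denominators = {
        "entire class": total_grads,
        "adjusted 'employed' count (89.7%)": round(0.897 * total_grads),  # ~395
        "graduates with any job (82.7%)": round(0.827 * total_grads),     # ~364
    }
    for label, denom in possible_denominators.items():
        reporting = round(0.20 * denom)
        print(f"20% of {label}: ~{reporting} salaries, {reporting/total_grads:.1%} of the class")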

NYLS also breaks down the private practice employer type by the salaries attained. The below table breaks down the “Private Practice” row in the first table. Accordingly, this table reflects the job outcomes for 37.7% of the class, or 45.6% of the 82.7% of graduates who were employed.

Salaries
Law Firm Size | Percentage | Range (Min-Max) | Average
501+ | 20% | $145,000 – $160,000 | $159,500
251 – 500 | 6% | $120,000 – $160,000 | $155,000
101 – 250 | 4% | $90,000 – $160,000 | $136,667
51 – 100 | 4% | $62,000 – $90,000 | $81,750
26 – 50 | 3% | $55,000 – $55,000 | $55,000
11 – 25 | 11% | $47,000 – $65,000 | $57,000
2 – 10 | 51% | $28,000 – $80,000 | $54,583
Unknown | 1% | n/a | n/a

The salaries in this second table are just as problematic. Just as we cannot tell how many of those 73 graduates reporting a salary were in a particular employer type category, we cannot tell how many were working for law firms and are represented in this table. Based on the distribution of salaries in the first table, at least 11 graduates were in categories other than private practice. This means that these salary figures by firm size represent at most 14% of the class when you combine all of the rows, though the number is assuredly smaller.

What does all of this mean? Although the Statistics Page includes a cautionary statement that only about 20% of graduates reported salaries, the information provided is still deceptive. It took numerous calculations and data from a third party to figure out how few graduates actually underlie these figures. Yet an unknowing prospective who is contacted by Dean Perez and told that “[NYLS] graduates are getting jobs, earning money and able to repay their loans” will read these tables and see large salaries that can reasonably be taken as evidence of this advertised, short-term solvency. The method NYLS employs to present salary statistics can be persuasive to the unknowing applicant, but it clearly does not reflect reality when, for example, the advertised $159,500 salary average for graduates employed at 501+ attorney law firms reflects the average for, at most, 7.5% of the class.
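That 7.5% ceiling follows directly from combining the two tables; a minimal sketch of the arithmetic:

    # Upper bound on the share of the class behind the $159,500 average (501+ attorney firms)
    private_practice_share = 0.456 * 0.827          # 45.6% of the 82.7% of graduates with any job
    firms_501_plus_share = 0.20 * private_practice_share   # 20% of private practice graduates

    print(f"{firms_501_plus_share:.1%}")  # about 7.5% of the class, at most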

Dean Perez claims that graduates are earning money, even though the school only reports what one-fifth of the Class of 2009 was earning. If his office has information on the other four-fifths, it would be a good idea to share it when making such claims, rather than lead prospectives to think that the salary information provided is reflective of actual practice. And if NYLS does not possess salary data for the other 80% of the class, then the administration needs to review its recruiting policies and determine whether these statements are designed to mislead and/or have the effect of misleading the consumer. We think they do. When only 62% of the entire class is working in a bar-required position, there’s ample room to be skeptical of the claims made by Dean Perez.

Dean Perez also claims that New York Law School had more favorable or comparable employment statistics than [Hofstra, Buffalo, Touro, Albany, CUNY, Pace, Syracuse, Fordham, Cardozo, Saint John’s, and Brooklyn]. These are important claims that require adequate evidence, regardless of the economic climate and media attention. In the context of the email, this claim is especially troublesome because it seeks to sway applicants by stating that, despite all of the criticism, this particular law school really is a worthwhile investment. That may be true (and we will not make that call), but the school cannot simply prove its value by comparing itself to the other New York schools. No school can prove its value this way without first having sufficient transparency about the post-graduation employment outcomes of its own graduates.

[See our data clearinghouse to see if you agree that NYLS has “more favorable or comparable statistics” compared to these other New York schools: NYLS, Hofstra, Buffalo, Touro, Albany, CUNY, Pace, Syracuse, Fordham, Cardozo, Saint John’s, Brooklyn.]

Paying Back Loans

Loan default rates, contrary to Dean Perez’s assumption, do not indicate the value of a program. With the federal Income Based Repayment and Income Contingent Repayment plans, no individual with federal student loans should default. Defaults merely suggest poor advice by financial aid offices and/or poor self-discipline. A graduate can make minimum wage and have significantly-reduced monthly loan payments, thanks to these programs. Both programs have their downsides: interest accumulates and can cause debts to balloon over the life of the payment plan, and in certain scenarios the debtor will be taxed on the forgiven debt at the end of the repayment period. But they are programs designed to make sure that people don’t default. If the default rates are low, the school should be applauded for providing sound financial advice, but it is hardly evidence that NYLS graduates are by-and-large doing well, particularly when we only know the salaries for 20% of the class.

Misleading The Consumer

Selective presentation is deceptive. The manner in which NYLS portrays salaries and job outcomes, while not outright lying, deceives the reader into thinking she is more informed about the employment opportunities at NYLS than she really is. Despite NYLS possessing better information (and even reporting some of that information to U.S. News), the school has declined to share information on the Statistics Page that it knows would be valuable, such as the fact that 58.4% of all 2009 NYLS graduates were employed full-time, while 45.2% were working full-time, bar-required jobs. Omission of such important, value-adding information is so obvious that it suggests NYLS actually intends to deceive. Such a perception has enormous ramifications for how people view legal education in this country. This behavior is precisely why we are prompting reform.

Law schools are sophisticated suppliers of a service; they understand what consumers want to believe as truth, particularly consumers facing full tuition costs and six figures of debt. With no incentive to do otherwise, schools hide or otherwise misrepresent the data that might scare applicants away. And when the applicants get wind of it through exposure in the media, we see responses like that of Dean Perez. Absent tougher regulations that require improved disclosure while prohibiting claims like these from being made without factual support, some law schools in the United States will continue undermining the educational purpose they are supposed to serve.

Class of 2009 U.S. News Data

Each year, U.S. News collects data from almost every ABA-approved law school through an annual survey. As with prior years (Class of 2007 and 2008), we have collected Class of 2009 employment data and cost of attendance data in a spreadsheet for easy use and comparison. While there is a significant time lag—the data came out a few weeks ago for the graduates from nearly two years ago—the spreadsheet serves as one of just a few sources of employment information that prospectives can use to easily compare schools in a standardized fashion.

Also in accord with prior years, we added a few of our own metrics to help readers understand and compare data. These include (1) the percent of the graduating class represented by the private sector salary information and (2) the percent of the graduating class with a known salary.
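Sketched below is how those two metrics can be computed from the reported counts; the variable names are our own shorthand, not official U.S. News field labels.

    def lst_metrics(total_grads, private_sector_salaries_reported, all_salaries_reported):
        """(1) Share of the class represented by the private sector salary information,
        (2) share of the class with a known salary."""
        return (private_sector_salaries_reported / total_grads,
                all_salaries_reported / total_grads)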

These metrics show how much (or little) we know about salary outcomes from the salary information provided by using the U.S. News reporting standard. As it turns out, the touted salary medians are poor pictures of the graduating classes for the vast majority of schools. On average, the “median salary” represents what only 29% of private sector graduates made, or just 17% of the entire graduating class. The ABA will hopefully soon begin regulating these practices, ensuring that schools cannot advertise salary information without important qualifiers such as the percentage of all graduates included in salary figures.

This year, partially in an effort to better qualify the salary figures, U.S. News made important changes to the way it reports employment data collected from law schools. We reported back in December that changes were coming as a result of our discussions with Bob Morse, Director of Data Research at U.S. News, who implemented our ideas as we proposed. These changes will provide prospective students a more thorough picture of post-graduation outcomes from many ABA-approved law schools. Additionally, these changes will improve our data clearinghouse, since they allow us to eliminate most of our assumptions as we turn the data into more meaningful information.

Information Quality

For the most part, schools do a good job reporting data in the manner requested by U.S. News. That is, there are few discrepancies that cannot be explained by small rounding errors (where the percentages do not add up to 100 percent when they should) or schools misunderstanding the clerkship survey questions. But people should not assume that we are out of the woods. Reporting data according to the U.S. News standard is a separate achievement from presenting true, meaningful information that’s useful for prospectives trying to make informed decisions. The job characteristic data and LST’s new metrics demonstrate just how different the picture can be between how some schools present their data and the reality.

In the coming weeks, we will start a discussion about practices at some schools that may deserve the ABA’s attention. If you know of individual schools whose Class of 2009 data evidence misleading presentation of information, please do not hesitate to let us know.

Otherwise, a preliminary data review did find a few errors in the data reported for 2009 grads. We’ve already alerted Bob Morse about the errors and we have been told that corrections are on the way. Our spreadsheet fixes these errors and highlights other areas for concern.

LST’s Data Clearinghouse

We will soon release the newest installment of our data clearinghouse, once we are satisfied with the underlying data. The clearinghouse helps applicants visualize the data in a way that isn’t intuitively obvious. Many applicants have been using it to better understand school-specific employment information and to make better estimates about future job prospects. We expect it will be even more useful this year because it requires fewer assumptions and prospectives can trace how schools fared over the prior three years.

As always, the data clearinghouse will reflect only cleansed data. We have not and cannot audit the data for accuracy. In the meantime, if you spot any errors or have any comments, please do not hesitate to leave a comment here or email us at .

The Current Employment Information Reported to the ABA

This is the first of a series of posts in which we contemplate the 509 Subcommittee’s proposal and the facts needed to understand how it would advance transparency at ABA-approved law schools. This post begins the process by describing the current employment information that schools report to the ABA under Standard 509 and the annual questionnaire. Later posts will provide three criteria for judging the ABA’s actions and then evaluate the new proposal with those criteria in mind.
  
Law schools must report “basic consumer information” about their programs to the ABA, including information about the employment outcomes of their graduates. Currently, the ABA requires that schools report employment rates nine months after graduation, as well as basic bar passage statistics. The annual questionnaire requires that schools report these placement rates for the second-most-recent class, roughly 16 months after most of the graduates earned their degree. It takes about 2 years from graduation for the ABA to publish the information for public consumption.
 
These employment rates include the employment status of all graduates, as well as the type of employer, type of job, and geographic location of all employed graduates. For all of these categories, “a job is a job.”

The employment status includes five exhaustive categories: employed, unemployed—seeking, unemployed—not seeking, pursuing an advanced degree, and unknown. Although exhaustive, the total number of graduates in each category inexplicably does not always add up to the total number of graduates. As one of many examples in the most recent Official Guide, New York Law School does not account for eight graduates while reporting according to these exhaustive categories. The ABA disclaims any warranty as to the accuracy of the information submitted by law schools, so it is unlikely that anybody will correct even basic errors.
 
The employer type rate only considers what business the employer engages in, rather than the type of job the graduate works for that employer. Accordingly, the percentage of graduates “employed in law firms” includes lawyers, paralegals, and administrative assistants. Likewise, “employed in business and industry” includes everyone from an in-house lawyer to a short-order cook. The job-type rate aims to shed some light on these logical disconnects.

NALP’s annual reports on the entry-level hiring market indicate that the disconnect is not merely theoretical, as a sizeable percentage of graduates take these non-law jobs at law firms and in business each year. That graduates take these jobs is not necessarily a problem. The problems are that it is unclear to readers that there exists a disconnect and that, once realized, readers cannot determine what types of non-law jobs these graduates take. Perhaps, originally, all that mattered was the bar-passage-required rate versus the not-required rate. But when a school advertises the versatility of a J.D., unassuming consumers are likely to think many of these graduates are doing something with their degree other than becoming a paralegal or short-order cook. The reality is that just about every graduate needs to find some way to earn money because most of them used student loans to pay for their education.

The current ABA employment reporting standard is seriously limited by its form and substance. This standard aggregates employment outcomes and makes it difficult for prospectives to understand the various employment opportunities for new J.D.’s. Separate from problems with the standard itself, schools’ individualized reporting policies often package information in ways that are not only difficult to compare, but oftentimes misleading. While these practices arguably violate Standard 509’s requirement that information be presented in a “fair and accurate manner reflective of actual practice,” that portion of the standard has yet to be enforced.
 
What follows is that prospective law students rarely make informed decisions about whether, and where, to attend law school. The ability to make an informed decision directly relates to prospective law students’ ability to access quality information, and the available resources are inadequate for prospectives who strive to take a detailed, holistic look at the diverse employment opportunities at different law schools.

Because prospectives usually do not have enough information about employment outcomes to make an informed decision, they often look to other resources to facilitate comparisons among schools. Most famously, U.S. News provides a yearly law school ranking that prospectives often use as a proxy for schools’ job placement opportunities. While the U.S. News ranking drives down transaction costs for prospectives seeking to acquire and explain information, it also causes prospectives to make decisions based on minute, arguably arbitrary rankings disparities. U.S. News’s decision to rank the former-third tier will only exacerbate this problem.
 
These problems have existed for quite some time, and are divorced from schools’ current struggle to help their graduates find gainful employment. That said, the economic climate is creating ever-larger implications for the legal profession. Law school in the U.S. is now an extremely costly proposition in terms of both direct attendance costs and opportunity costs. Tuition continues to rise, debt is not dischargeable in bankruptcy, and the expected value of all outcomes is less than it was just a few years ago. The result is more graduates for whom uninformed decisions will adversely affect their well-being. Caveat emptor may be an attractive quip when consumers choose to buy inherently dangerous goods, but it is not applicable when even the most informed prospectives really have no idea what kind of return follows from investing in a particular J.D.

This post is derived from our white paper. Many of these comments were also presented by LST’s Executive Director at the ABA Questionnaire Committee hearing in December 2010.

Class of 2010 NLJ 250 Statistics

The National Law Journal (NLJ) released its annual report last month on the law schools that send the most graduates to the 250 largest American law firms (NLJ 250). In this post we’ll answer a few basic questions about this important employment data. To our knowledge, this is the first Class of 2010 employment information publicly provided.

While this topic has received pretty extensive coverage, explaining the basic information available about post-graduation outcomes is necessary to understanding why the ABA must regulate law schools more strictly and extensively.

A significant segment of our readership consists of prospective students trying to make sense of a vast amount of hard-to-parse information about the various opportunities at different law schools across the United States. Helping them do so is what the data clearinghouse is for, and what we will keep doing for all employment information about the entry-level market.

What is the NLJ 250?

The NLJ 250 includes the 250 largest law firms headquartered in the United States. This is measured by the firm-reported annual average number of full-time and full-time equivalent attorneys working at the firm, in any office, in 2010. This does not include temporary or contract attorneys.

Where do the data come from?

The NLJ collects survey data from the law firms themselves, not the law schools. A significant percentage of all NLJ 250 firms responded to the survey about first-year hiring. (We have inquired with the NLJ as to the exact percentage and will update this post when the NLJ gets back to us.)

What do these numbers tell us?

Large firm placement percentage is an important, albeit imperfect, proxy for the number of graduates with access to the most competitive and highest paying jobs. The percentage, accordingly, tells us which schools most successfully place students in these highly sought-after jobs. Successful large firm placement is best analyzed by looking at multiple years’ worth of data.

What do these numbers not tell us?

First, self-selection controls all post-graduation outcomes. Nobody is coerced into a job they are offered (unless you consider debt pressure or other strong personal influences coercive), so these numbers do not provide more than a proxy for opportunities. Opportunities, after all, are prospective students’ real concern when analyzing employment information, and these rankings do not necessarily reflect a school’s ability to place students into NLJ 250 firms.

Many graduates, particularly at the top schools, choose to clerk after graduation instead of working for these law firms. While not all of these graduates would have secured employment at the NLJ 250 firms, many could have. For this reason, one popular technique used to understand a school’s placement ability is adding the percentage of graduates at NLJ 250 firms to the percentage of graduates clerking for Article III judges. This method is not perfect; read our white paper (beginning on page 28) for a more detailed explanation of the strengths and weaknesses of this technique.
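Here is a minimal sketch of that technique; the inputs below are hypothetical percentages, not any particular school’s figures.

    def big_firm_plus_clerkship(nlj250_pct, article_iii_clerk_pct):
        """Rough proxy for a class's access to the most competitive outcomes:
        NLJ 250 placement plus Article III clerkship placement."""
        return nlj250_pct + article_iii_clerk_pct

    # Hypothetical school: 30% at NLJ 250 firms, 8% clerking for Article III judges
    print(f"{big_firm_plus_clerkship(0.30, 0.08):.0%}")  # 38%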

Second, NLJ 250 firm jobs are not the only competitive, high-paying firm jobs. Boutique law firms are also very competitive, with some paying New York City market rates and above. Additionally, the NLJ 250 does not include large, prestigious internationally-based law firms with American offices.

Third, not all NLJ 250 firm jobs are equally competitive. Law firms from different regions and of differing caliber have varying preferences for the students from different law schools, including how far into the class they are willing to reach. That is, two schools that place an equal percentage of graduates in NLJ 250 firms may do so for reasons other than similar preferences among equally competitive NLJ 250 firms.

Fourth, the rankings include data only about the law schools that placed at least 10.57% of their entire class in NLJ 250 firms. All other American law schools placed a lower, unknown percentage at NLJ 250 firms. The remaining schools likely range from 0% to 10.57%, and probably do not fall into a normal distribution.

If you have more questions, please feel free to email us or reply to this post. We will update this post as needed.

2010 placement into NLJ 250 firms by law school

Rank School NLJ 250 Grads Total Grads % of Class
1 University of Chicago Law School 115 195 58.97%
2 Cornell Law School 112 192 58.33%
3 Columbia Law School 239 433 55.20%
4 University of Pennsylvania Law School 145 272 53.31%
5 Harvard Law School* 287 577 49.74%
6 University of Virginia School of Law 175 374 46.79%
7 University of California, Berkeley School of Law 135 296 45.61%
8 Northwestern University School of Law 126 284 44.37%
9 New York University School of Law* 209 483 43.27%
10 University of Michigan Law School 158 372 42.47%
11 Stanford Law School 72 173 41.62%
12 Duke Law School 81 213 38.03%
13 Georgetown University Law Center 242 644 37.58%
14 University of California at Los Angeles School of Law 123 350 35.14%
15 Yale Law School* 67 198 33.84%
16 Boston College Law School 89 265 33.58%
17 Boston University School of Law 81 270 30%
18 Vanderbilt University Law School 62 208 29.81%
19 University of Southern California Gould School of Law 56 195 28.72%
20 University of Texas School of Law 101 379 26.65%
21 Fordham University School of Law* 123 479 25.68%
22 George Washington University Law School 127 513 24.76%
23 University of Notre Dame Law School 41 172 23.84%
24 Emory University School of Law 54 255 21.18%
25 Washington University in St. Louis School of Law 51 269 18.96%
26 University of Illinois College of Law 35 195 17.95%
27 Southern Methodist University Dedman School of Law 42 259 16.22%
28 College of William and Mary Marshall-Wythe School of Law 33 214 15.42%
29 University of California, Davis School of Law 30 195 15.38%
30 Wake Forest University School of Law 25 166 15.06%
31 Howard University School of Law 20 133 15.04%
32 Georgia State University College of Law 22 162 13.58%
33 Seton Hall University School of Law 41 320 12.81%
34 Yeshiva University Benjamin N. Cardozo School of Law 48 381 12.6%
35 University of California Hastings College of the Law 52 419 12.41%
36 University of Wisconsin Law School 31 252 12.3%
37 University of Iowa College of Law 24 197 12.18%
38 University of Maryland College of Law** 29 242 11.98%
39 University of Minnesota Law School 34 284 11.97%
40 Villanova University School of Law** 28 235 11.91%
41 University of North Carolina School of Law 28 237 11.81%
42 Ohio State University Michael E. Moritz College of Law 23 198 11.62%
42 University of Houston Law Center 33 284 11.62%
44 Tulane University Law School 29 252 11.51%
45 University of Georgia School of Law 25 218 11.47%
46 Temple University James E. Beasley School of Law 33 293 11.26%
47 Brigham Young University J. Reuben Clark Law School 16 145 11.03%
48 Loyola University Chicago School of Law 29 266 10.9%
49 Rutgers School of Law-Newark 28 258 10.85%
50 Washington and Lee University School of Law 13 123 10.57%

* Graduate class size based on average of last three years.
** Graduate class size based on latest data in ABA/LSAC Official Guide to Law Schools.

Source: National Law Journal

Vanderbilt Class of 2009 Uncertified List

This uncertified list includes – as of graduation – the city, state, and employer name for each employed 2009 Vanderbilt graduate, as well as a mention for each graduate pursuing another graduate degree. The original list appeared in Vanderbilt’s 2009 Career Services brochure. We transcribed the whole list into the attached spreadsheet.

If any readers are aware of similar lists that disaggregate employment summaries for all or nearly all graduates, we are happy to publish them when you provide them. As always, we strive to make this important employment data available to everybody, even when it does not meet our standard. Per our privacy policy, we will not reveal your identity.

A Way Forward

Update: We have submitted our article to a number of law reviews and posted a working draft to SSRN. The article will hopefully spark some discussion in academia as we move forward to implement our new standard. Please send us your comments.

A Way Forward: Improving Transparency in Employment Reporting at American Law Schools
The decision to attend law school in the 21st century requires an increasingly significant financial investment, yet very little information about the value of a legal education is available for prospective law students. Prospectives use various tools provided by schools and third parties while seeking to make an informed decision about which law school to attend. This Article surveys the available information with respect to one important segment of the value analysis: post-graduation employment outcomes.

One of the most pressing issues with current access to information is the ability to hide outcomes in aggregate statistical forms. Just about every tool enables this behavior, which, while misleading, often complies with the current ABA and U.S. News reporting standards. In this Article, we propose a new standard for employment reporting grounded in compromise. Our hope is that this standard enables prospectives to take a detailed, holistic look at the diverse employment options at different law schools. In time, improved transparency at American law schools can produce generations of lawyers who are better informed about the range of jobs obtainable with a law degree.

Chicago Class of 2009 Uncertified List

Thanks to our tipster for sending this uncertified list to us. It includes the city, state, and employer name for each employed graduate. The original includes each graduate’s name, but we opted to redact each name for the sake of individual privacy.

4/28/2010 Update: The University of Chicago Law School published this list in their Fall 2009 Alumni magazine. This was not the first time Chicago released this data to certain stakeholders. According to Marsha Ferziger Nagorsky, Assistant Dean for Communications and Lecturer in Law at The University of Chicago Law School, “for at least forty years [Chicago has] provided an annual list of our graduates and their employers – first in our student facebook (lists in the facebook dating back at least to 1970-71 are available in our library), and more recently in our alumni magazine.” Thank you Dean Nagorsky for contacting us.

5/3/2010 Update: We are not sure which frozen date this list reflects. Because this list was published in the Fall 2009 Alumni Magazine, it represents some point in time between graduation (May 2009) and the ABA, U.S. News, and NALP reporting date (February 15th, 2010).

Duke Class of 2007 Uncertified List

On March 20th, 2008, Duke released an employment list for its Class of 2007 graduates at the admitted student Open House. Shortly after, an admitted student compiled a spreadsheet like the one created by the Vanderbilt admitted student. Although we have obtained the spreadsheet, we have thus far failed to track down the original document given to the admitted students at the Open House. The spreadsheet is attached to this post. We also plan to improve the document in the near future.

Update: On July 28th, 2009, LST emailed Associate Dean and Director of Duke’s Career & Professional Development Center, Bruce Elvin, to obtain a copy of the original document. We will report back when he responds.

Vanderbilt Class of 2007 Uncertified List

On March 14th, 2008, at Admitted Students Day, Vanderbilt’s admissions office released an uncertified list of where, and with whom, 196 of the 223 graduates were employed. Shortly after, one of the admitted students compiled this information in a spreadsheet. Both of these documents are attached to this post.

To our knowledge, this is the most comprehensive employment list publicly released. It’s LST’s hope that this was the first step in a new direction.

Update 5/3/2010: We have learned that other schools have released employment lists like this in the past. According to David Lat from Above the Law, “at Yale Law School — back in 1996, so almost 15 years ago — we were given detailed lists showing where the past few classes ended up working. The graduates were listed in alphabetical order, and below each person’s name was the name and address of their employer.” According to Marsha Ferziger Nagorsky, Assistant Dean for Communications and Lecturer in Law at The University of Chicago Law School, “for at least forty years [Chicago has] provided an annual list of our graduates and their employers – first in our student facebook (lists in the facebook dating back at least to 1970-71 are available in our library), and more recently in our alumni magazine.”

We do wish to emphasize that while releasing employment information to current students and alumni is admirable, it does not constitute “publishing” in a manner consistent with our standard. Our stated goal is to make this employment information available to the public (and particularly to prospective law students). As it turns out, it is still the case that the Vanderbilt uncertified list, “to our knowledge . . . is the most comprehensive employment list publicly released” because it was available to prospective law students. If you know otherwise, please contact us.