LST Revamps Law School Decision Tool

With law school application deadlines looming, a tough legal job market, and high education costs, prospective students need useful information to make smart choices about whether or where to attend law school.

Today, with the launch of Law School Transparency’s fully customizable Score Reports (http://www.LSTScoreReports.com), prospective law students receive essential help making informed decisions.

Attending law school is a life-changing decision that deserves fair and clear presentation of relevant information. The Score Reports organize admissions, employment, and cost data to show the big picture and the fine detail. The Score Reports help prospective law students find the schools that can meet their career goals and evaluate the projected time and financial commitment.

We have added the following features:

  • Financial Planning Worksheets. These help students plan their budgets and see how much debt will be owed when the first loan payment is due, as well as how much that payment will be.
  • Custom Scores and Reports. Students can choose what matters most to them. From large firm and public service placement to LSAT and GPA statistics, students can create reports and change them to see how schools stack up. To compare apples to apples, students can also create custom scores based on the types of jobs they value.
  • Head-to-head comparisons. Students can compare up to four law schools at once on our head-to-head page.
  • Enhanced school reports. These provide mountains of admissions, employment, and cost data, including salary information, job trends, enrollment trends, and a projected debt table.
  • Scholarship negotiation help. Because almost all law schools use a high tuition, high discount model, students must negotiate scholarship amounts and scholarship terms (GPA/class rank requirements, escalation with tuition increases).
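The arithmetic behind a debt-projection worksheet like the one described above can be sketched in a few lines. This is an illustrative model only, assuming simple (non-compounding) interest accrual during school and a standard 10-year amortized repayment plan; it is not LST's actual worksheet methodology, and the rate and borrowing amounts below are hypothetical.

```python
def balance_at_repayment(amount_borrowed: float, annual_rate: float,
                         months_accruing: float) -> float:
    """Balance when the first payment comes due, assuming unsubsidized
    loans accrue simple (non-compounding) interest while in school."""
    return amount_borrowed * (1 + annual_rate / 12 * months_accruing)

def monthly_payment(balance: float, annual_rate: float,
                    n_months: int = 120) -> float:
    """Standard amortized monthly payment (10-year plan by default)."""
    r = annual_rate / 12
    if r == 0:
        return balance / n_months
    return balance * r / (1 - (1 + r) ** -n_months)

# Hypothetical example: $45,000 borrowed at the start of each of three
# years at 7.9%. Year-1 money accrues interest for roughly 39 months
# before repayment begins (six months after graduation), year-2 for 27,
# year-3 for 15.
balance = sum(balance_at_repayment(45_000, 0.079, m) for m in (39, 27, 15))
payment = monthly_payment(balance, 0.079)
```

Under these assumptions, roughly $135,000 borrowed becomes an appreciably larger balance by the first due date, which is exactly the gap the worksheets are designed to make visible before enrollment.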

Note that the LST Score Reports are not rankings. Indeed, we believe national rankings of law schools do not make any sense. Law schools function in local markets; few schools have a national reach. Two in three employed graduates work in the state where their school is located.

Law School Transparency is a Georgia nonprofit legal education policy and watchdog organization. Our mission is to make entry to the legal profession more transparent, affordable, and fair. We’re best known for advocating for increased employment data transparency. The LST Score Reports are the product of our advocacy successes.

Class of 2012 NLJ 250 Statistics

The National Law Journal (NLJ) released its annual report this weekend on the law schools that send the most graduates to the 250 largest American law firms (NLJ 250). In this post we’ll answer a few basic questions about this important employment outcome measure. This is the first published Class of 2012 employment information.

What is the NLJ 250?

The NLJ 250 includes the 250 largest law firms headquartered in the United States. This is measured by the firm-reported annual average number of full-time and full-time equivalent attorneys working at the firm, in any office, in 2012. This does not include temporary or contract attorneys, though it does include non-partner track attorneys.

Where do the data come from?

Methodology via the NLJ:

Methodology: Data for this Go-To Law Schools special report were provided by the law firms surveyed for the NLJ 250, The National Law Journal’s annual survey of the nation’s 250 largest law firms by headcount. We received data from 190 firms. For firms that did not submit new associate numbers, we relied on data from ALM Media LLC’s RivalEdge database and independent reporting. We determined rankings by the percentage of 2012 juris doctor graduates who took associate jobs at NLJ 250 firms. The rankings do not reflect law graduates who took jobs as clerks following graduation. Our data do not include new associates at Paul, Weiss, Rifkind, Wharton & Garrison or King & Spalding.

What do these numbers tell us?

Large firm placement percentage is an important, albeit imperfect, proxy for the number of graduates with access to the most competitive and highest-paying jobs. The percentage, accordingly, tells us which schools most successfully place students in these highly sought-after jobs. Successful large firm placement is best analyzed by looking at multiple years' worth of data. (View the NLJ 250 from the class of 2010 here and from the class of 2011 here.)

What do these numbers not tell us?

First, self-selection controls all post-graduation outcomes. Nobody is coerced into a job they are offered (unless you consider debt pressure or other strong personal influences coercive), so these numbers do not provide more than a proxy for opportunities. Opportunities, after all, are prospective students’ real concern when analyzing employment information, and these rankings do not necessarily reflect a school’s ability to place students into NLJ 250 firms.

Many graduates, particularly at the top schools, choose to clerk after graduation instead of working for these law firms. While not all of these graduates would have secured employment at the NLJ 250 firms, some could have. For this reason, one popular technique used to understand a school’s placement ability is adding the percentage of graduates at NLJ 250 firms to the percentage of graduates clerking for Article III judges. This method is not perfect; read our white paper for a more detailed explanation of the strengths and weaknesses of this technique.

Second, NLJ 250 firm jobs are not the only competitive, high-paying firm jobs. Boutique law firms are also very competitive, with some paying New York City market rates and above. Additionally, the NLJ 250 does not include large, prestigious internationally-based law firms with American offices.

Third, not all NLJ 250 firm jobs are equally competitive. Law firms from different regions and of differing caliber have varying preferences for the students from different law schools, including how far into the class they are willing to reach. That is, two schools that place an equal percentage of graduates in NLJ 250 firms may do so for reasons other than similar preferences among equally competitive NLJ 250 firms.

Fourth, the rankings include data only about the law schools that placed at least 8.22% of their entire classes in NLJ 250 firms. All other American law schools placed a lower, unknown percentage at NLJ 250 firms. The remaining schools range from 0% to 8.22%, and probably do not fall into a normal distribution.

If you have more questions, please feel free to email us or reply to this post. We will update this post as needed.

2012 placement into NLJ 250 firms by law school

Rank School NLJ 250 Grads Total J.D.s % of Class
1 University of Pennsylvania Law School 163 270 60.37%
2 University of Chicago Law School 119 216 55.09%
3 Columbia Law School 245 460 53.26%
4 New York University School of Law 253 478 52.93%
5 Northwestern University School of Law 144 280 51.43%
6 Harvard Law School 297 590 50.34%
7 Duke Law School 107 221 48.42%
8 Stanford Law School 86 182 47.25%
9 University of California, Berkeley School of Law 139 307 45.28%
10 Cornell Law School 85 192 44.27%
11 University of Virginia School of Law 151 357 42.30%
12 University of Michigan Law School 149 388 38.40%
13 Georgetown University Law Center 193 616 31.33%
14 Yale Law School 68 222 30.63%
15 University of California at Los Angeles School of Law 97 333 29.13%
16 University of Southern California Gould School of Law 63 220 28.64%
17 Vanderbilt University Law School 51 194 26.29%
18 University of Texas School of Law 96 372 25.81%
19 Fordham University School of Law 114 487 23.41%
20 University of California, Irvine School of Law 13 56 23.21%
21 George Washington University Law School 122 542 22.51%
22 Boston University School of Law 58 273 21.25%
23 Boston College Law School 54 256 21.09%
24 University of Illinois College of Law 40 213 18.78%
25 Washington University in St. Louis School of Law 49 300 16.33%
26 University of Notre Dame Law School 32 196 16.33%
27 Southern Methodist University Dedman School of Law 44 280 15.71%
28 Emory University School of Law 39 253 15.42%
29 University of Houston Law Center 35 262 13.36%
30 College of William and Mary Marshall-Wythe School of Law 27 206 13.11%
31 Howard University School of Law 19 148 12.84%
32 University of North Carolina School of Law 32 260 12.31%
33 University of Arizona James E. Rogers College of Law 17 141 12.06%
34 Washington and Lee University School of Law 15 129 11.63%
35 University of Washington School of Law 20 181 11.05%
36 University of Minnesota Law School 25 230 10.87%
37 Seton Hall University School of Law 32 310 10.32%
38 University of Kentucky College of Law 15 148 10.14%
39 Loyola Law School, Los Angeles 41 414 9.90%
40 University of California Hastings College of the Law 43 443 9.71%
41 Wake Forest University School of Law 15 155 9.68%
42 Villanova University School of Law 24 255 9.41%
43 University of Georgia School of Law 21 225 9.33%
44 Indiana University Maurer School of Law–Bloomington 19 208 9.13%
45 University of California, Davis School of Law 17 198 8.59%
46 Santa Clara University School of Law 26 306 8.50%
47 University of Wisconsin Law School 24 284 8.45%
48 Rutgers School of Law–Camden 23 274 8.39%
49 Loyola University Chicago School of Law 23 274 8.39%
50 University of Tennessee College of Law 12 146 8.22%

Rutgers – Camden School of Law’s Dean Stands by Marketing Campaign

This weekend we wrote about a recruitment letter sent by Rutgers – Camden School of Law’s admissions dean, Camille Andrews. We alleged that the letter contained incomplete, deceptive, and false information, and that as a result Dean Andrews should resign from her post and the ABA should conduct an investigation and bring appropriate sanctions against the law school.

In an article published in Inside Higher Ed, Camden’s Dean Rayman Solomon responded. Neither Dean Solomon nor Dean Andrews responded to us directly, and we have only the portions of Dean Solomon’s statements published by Inside Higher Ed:

Dean Rayman Solomon is standing by Andrews. Solomon said the recruitment material was accurate but that he’s “open to discussion” about the best way to reach prospective students going forward. The promotion in question targeted potential applicants who took the GMAT, not the LSAT, the typical law school admission test. The goal, Solomon said, was to reach a new audience and introduce the Rutgers-Camden program. Students could then go online to get more information.

“This was one letter saying are you interested, have you thought about it?” Solomon said. “This is not our entire marketing campaign. This is telling people that we have a program.”

But were the numbers misleading?

“I don’t know how to respond,” Solomon said. “If you have a hundred people, would four of them be misled? Would one be misled? Would 98 be misled? [It was] a piece that was designed to get people to think about something they hadn’t thought about. This wasn’t the only information they could get about it.”

We appear to agree with Dean Solomon on the purpose. The May 2012 letter was designed to get students to think about law school or a legal career who were not known to be interested in attending law school starting in August 2012. We bet we also agree on the following three points:

  • Camden waived the application fee to reduce the application barrier
  • Camden discussed employment outcomes to show its placement successes in a bad economy
  • Camden discussed salary outcomes and salary potential to inform the cost-benefit analysis of the campaign targets

However, we clearly disagree about whether Camden’s employment outcome claims adequately reflect reality and whether targeting people who had not yet expressed interest in law school was appropriate given the very short decision window and lack of knowledge about their professional goals.

Nevertheless, neither LST nor Camden knows the actual effect of the campaign on the letter recipients. Frankly it doesn’t matter whether many people or zero people enroll. We care about how Camden conducts itself in the law school marketplace; Camden unfairly used employment statistics to augment its argument that the law school is a safe haven from a bad economy. In this regard Camden crossed the ethical (and likely legal) line from mere puffery to deceptive advertising. These facts are troubling irrespective of whether prospective students are sophisticated, unsophisticated, or indifferent.

The thrust of Dean Solomon’s response is that this is but a single letter that isn’t a big deal and shouldn’t affect decision making. To that we ask, what could the employment statistics have been meant to do other than affect application and enrollment decisions? The letter was part of a recruitment campaign, not a teaser for a movie due out next summer. Camden should strive to have all of its communications with students be accurate and honest. Dean Solomon further states that the misinformation is okay because other information is out there. It would appear that he is saying “you should know not to take our statements at face value.” That’d be a pitiful position for a law school dean to take.

It’s not acceptable to provide prospective students with false and misleading information just because the truth is available somewhere else. Interpretation 509-4 to ABA Standard 509 clearly states that reporting consumer information accurately somewhere does not absolve a school’s responsibility to present such information in a fair and accurate manner elsewhere.

Interpretation 509-4
Standard 509 requires a law school fairly and accurately to report basic consumer information whenever and wherever that information is reported or published. A law school’s participation in the Council-designated publication referred to in Interpretation 509-2 and its provision of fair and accurate information for that book does not excuse a school from the obligation to report fairly and accurately all basic consumer information published in other places or for other purposes.

It’s worthwhile to emphasize that Dean Solomon disputed our analysis and not our numbers. He also said he is open to discussion. So are we, and we’ve sent him the following email:

We would like to know what specifically in our analysis you believe is incorrect.

1. Does the category “JD Advantage” include only jobs in the legal field?
2. If #1 is no, did any Camden graduates have a “JD Advantage” job not in the legal field? If so, how many?
3. Do you think the advertised private practice starting salary of $74,000 represents the average of all 2011 graduates employed in private practice?
4. How many graduates reported earning salaries of at least $130,000?
5. Do you believe the answer to #4 can fairly be described as “many”?
6. Are statements about employed graduates meaningful without disclosing how many non-employed graduates there are?

Please respond via email. If you do not have adequate information to answer any of these questions, please say so. In addition to the email, we would be happy to schedule a time to talk about the data, our analysis, Camden’s forthcoming remedial measures, and the internal policies Camden plans to adopt to prevent repeat violations of ABA Standard 509.

We reemphasize that the letter must stand on its own merits. This letter was intended to create a first impression with prospective students and paint in their minds a picture of financial security if they attend law school at Rutgers – Camden School of Law. Later discovering that the letter was deceptive does not erase the deception.

We will post a new story if/when Dean Solomon responds.

LST Calls for Dean’s Resignation and ABA Investigation

Last week we became aware of an ongoing recruiting campaign by Rutgers – Camden School of Law that targets students who were not considering law school. As a part of this campaign, Camille Andrews, Associate Dean of Enrollment, sent students an email with bold statements about the employment outcomes achieved by the class of 2011. When compared to the school’s self-published employment data, we see Dean Andrews’s statements range from misleading to plainly false. Because the statements made in this email are demonstrably deceptive and are in clear violation of ABA Standard 509, Dean Andrews should resign immediately from her administrative appointment.

There are two important layers to this story. First, Dean Andrews made unfair statements about the employment outcomes of Camden graduates. These statements exaggerate the successful outcomes of Camden graduates and attempt to influence student behavior. The realities of Camden’s placement are far different from what Dean Andrews discloses. (More on this below.)

Second, Camden has extended a special offer for people who haven’t followed the normal application process and haven’t expressed an interest in law school or legal practice. (The email recipients had taken the GMAT, not the LSAT.) The Camden Special allows the students to avoid delay and enroll this August. By portraying Camden as some down-economy safe haven that leads to status and riches, Dean Andrews is attempting to enroll the exact students who ought not to attend law school: people who have not had time to carefully weigh the pros and cons of this significant investment.

In addition to ensuring that Dean Andrews resigns, Camden must also take swift, corrective action in all cases where prospective students received emails containing these or similar false, misleading, or incomplete statements. We also call on the American Bar Association to conduct a full investigation and bring appropriate sanctions against the school for violations of the ABA Standards, especially Standard 509(a) and Interpretation 509-4. Not only is Camden an institution of higher learning, but it also serves as a gateway to the legal profession. The degree of recklessness displayed by Dean Andrews, and the Camden administration for permitting a representative to deceive potential students, cannot be tolerated. It’s the latest example of a law school having no accountability for its recruiting practices. These practices must stop.

What follows is an analysis of each unfair statement made by Dean Andrews. We can do this analysis because Camden has made the relevant employment data publicly available, though that accessibility does not excuse false, misleading, and incomplete statements that the administration should know leave readers with incorrect impressions. Each statement is itself a black eye for Rutgers – Camden School of Law, but it is the cumulative effect of all of the statements, and of law schools’ bad behavior generally, that makes resignation, corrective action, and sanctions imperative.

Analysis of Statements by Dean Andrews for Rutgers – Camden School of Law

[Chart: Camden class of 2011 employment data, created from the data Camden provided on its website.]

“[O]f those employed nine months after graduation, 90% were employed in the legal field”

This is problematic on two levels. First, it excludes non-employed graduates from the calculation to provide a false sense of success. There were 242 graduates in Camden’s 2011 graduating class. Of these, 199 were employed. Camden uses 199 as the denominator with no indication that it has excluded 17.8% of the class from the calculation. While the statement does disclose that it is “of those employed,” the number of unemployed graduates is so large that the statement requires context to avoid misrepresenting what it means. The advertised “90% of employed” actually only represents 74% of the whole class.

Second, “in the legal field” implies “as a lawyer,” yet Camden groups non-lawyers with lawyers to create the “in the legal field” category. Specifically, Camden has combined two distinct categories: jobs that require bar admission (154 grads) and jobs where the J.D. was an advantage (25 grads). The advertised “90% of employed” actually works out to 63.6% of the class in lawyer jobs, with another 10.6% in jobs where the J.D. was an advantage.

The “J.D. Advantage” category that Camden uses to boost its “in the legal field” rate includes jobs as paralegals, law school admissions officers, and a host of jobs not credibly considered “in the legal field.” A graduate falls into this category when the employer sought an individual with a J.D. (and perhaps even required a J.D.), or for which the J.D. provided a demonstrable advantage in obtaining or performing the job, but the job itself does not require bar passage, an active law license, or involve practicing law.
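The denominator games described above can be checked directly from the figures Camden published (242 graduates, 199 employed, 154 in jobs requiring bar admission, 25 in "J.D. Advantage" jobs):

```python
class_size = 242       # Camden's 2011 graduating class
employed = 199         # employed nine months after graduation
bar_required = 154     # jobs requiring bar admission ("lawyer jobs")
jd_advantage = 25      # "J.D. Advantage" jobs

in_legal_field = bar_required + jd_advantage               # 179 graduates

pct_of_employed = in_legal_field / employed * 100          # the advertised figure (~90%)
pct_of_class = in_legal_field / class_size * 100           # what a reader likely assumes (~74%)
pct_excluded = (class_size - employed) / class_size * 100  # silently dropped (~17.8%)
lawyer_jobs_pct = bar_required / class_size * 100          # ~63.6% of the class
```

The same four inputs produce either "90%" or "74%" depending solely on which denominator a school chooses to advertise.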

“[O]f those employed nine months after graduation . . . 90% were in full time positions.”

This likewise excludes non-employed graduates without indicating that 17.8% of the class has been excluded. Once again, 90% of employed actually means only 74% of the whole class.

“Our average starting salary for a 2011 graduate who enters private practice is in excess of $74,000, with many top students accepting positions with firms paying in excess of $130,000.”

There are a number of distinct problems with this statement. First, Camden does not accurately state what the average reflects. The average is “for a 2011 graduate who enters private practice and reported a salary” not “for a 2011 graduate who enters private practice.” This is not a trivial distinction. Only 46.6% of graduates in private practice reported a salary. Of those that did so, the numbers were slanted towards higher salaries at large firms. 83.3% of graduates at firms with 101 or more attorneys reported their salaries, while only 37.0% of those at smaller firms reported a salary. The low overall response rate and the bias towards higher salaries being reported mean that the average of responses is not the average “for a 2011 graduate who enters private practice.”

Second, Camden does not disclose the salary response rate. The private practice salary response rate (46.6%) indicates that private practice salaries don’t tell the whole story. The letter also does not state that only 24% of the class was in private practice. This means the “average starting salary” actually reflects the average salary for just 11.2% of the class. None of this was communicated to the recipients of Dean Andrews’ email.
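The shrinking denominator compounds multiplicatively, and the 11.2% figure above can be reproduced in two lines:

```python
private_practice_share = 0.24  # share of the class employed in private practice
response_rate = 0.466          # share of those graduates who reported a salary

# The advertised "average starting salary" describes only this slice
# of the whole graduating class.
share_of_class_represented = private_practice_share * response_rate  # ~0.112
```

Each undisclosed rate looks modest on its own; multiplied together, they leave the headline average describing barely one graduate in nine.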

Third, Camden uses the average salary figure without any statistical context. NALP, LST, and many academic commentators have emphasized for years that average salaries tend to mislead more than inform. This is because reported salaries fall into a bimodal distribution. For the class of 2010 (across all law schools), there is one peak from $40,000 to $65,000, accounting for nearly half of reported salaries, and another distinct peak at $160,000. This bimodal distribution means that very few graduates make the mean salary of $84,111.
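A toy example shows why the mean is unrepresentative under a bimodal distribution. The numbers here are stylized for illustration; they are not the national NALP figures or Camden's data:

```python
# Stylized bimodal salary sample: a large low-salary cluster and a
# smaller cluster at big-firm pay.
salaries = [50_000] * 9 + [160_000] * 3

mean = sum(salaries) / len(salaries)  # 77,500

# Graduates whose actual salary falls anywhere near the mean:
near_mean = [s for s in salaries if abs(s - mean) <= 10_000]
```

In this sample the mean is $77,500, yet not a single graduate earns within $10,000 of it. Quoting the mean alone invites readers to expect a salary almost nobody receives.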

Based on the salary data Camden produces on its website, we see a similar distribution to the national picture across private practice salaries. There were 27 salaries provided; between 8 and 12 were above the $74,000 average by at least 30%; the rest were below the average, with 14 or more at least 20% below the average.

Fourth, Camden claims that many of its top students have accepted positions with firms paying “in excess of $130,000.” To be sure, “many” is ambiguous. It might reasonably mean 40% of the class, or even perhaps 20%. With the “top” qualifier, it might not even strain credibility to claim that 10% of the class constitutes “many” top students. Based on the published data, Camden knows that at most five graduates reported a salary of $130,000+, or 2.1% of the entire class. After analyzing the salary data in detail, we think just one graduate did. Whether it is one or five, “many” is far from accurate.

That said, we do know that eight graduates (or 3.3%) made at least $100,000. We also know that Camden grossly exaggerated the salary outcomes of its graduates right after exalting placement success and right before pointing out how its alumni are among the very richest of all lawyers. Of course, this is the same school that reported to U.S. News that its 2011 graduates had an average of only $27,423 in debt, even though the estimated total debt was well into the six figures for a New Jersey resident graduating in 2011 receiving no tuition discount. Fewer than a third (31.7%) of students received tuition discounts, with just 4.3% receiving more than a 50% discount on tuition.

“Rutgers is also ranked high in the nation at placing its students in prestigious federal and state clerkships.”

Like Camden, we have only class of 2010 data with which to compare clerkship placement rankings. On federal clerkships, Camden does okay, tied for 33rd. In terms of the percentage of students placed in federal clerkships, it is as close to 16th place as it is to last (188th). Suffice it to say that this exaggeration caps off a legion of false, misleading, and incomplete information used to induce applicants who had not even taken the LSAT.


Class of 2011 NLJ 250 Statistics

The National Law Journal (NLJ) released its annual report this weekend on the law schools that send the most graduates to the 250 largest American law firms (NLJ 250). In this post we’ll answer a few basic questions about this important employment outcome measure. This is the first Class of 2011 employment information publicly provided.

What is the NLJ 250?

The NLJ 250 includes the 250 largest law firms headquartered in the United States. This is measured by the firm-reported annual average number of full-time and full-time equivalent attorneys working at the firm, in any office, in 2011. This does not include temporary or contract attorneys.

Where do the data come from?

First, the NLJ collects survey data from the law firms themselves, not the law schools. A significant percentage of all NLJ 250 firms responded to the survey about first-year hiring. (The NLJ would not comment on the exact percentage.) The NLJ then contacts the law schools to fill in the gaps — but never relies directly on their word. The final figures are minimums, representing only the hires the NLJ verified to its liking; at no point does the NLJ extrapolate from a smaller number to a larger one.

What do these numbers tell us?

Large firm placement percentage is an important, albeit imperfect, proxy for the number of graduates with access to the most competitive and highest-paying jobs. The percentage, accordingly, tells us which schools most successfully place students in these highly sought-after jobs. Successful large firm placement is best analyzed by looking at multiple years' worth of data. (View the NLJ 250 from 2010 here.)

What do these numbers not tell us?

First, self-selection controls all post-graduation outcomes. Nobody is coerced into a job they are offered (unless you consider debt pressure or other strong personal influences coercive), so these numbers do not provide more than a proxy for opportunities. Opportunities, after all, are prospective students’ real concern when analyzing employment information, and these rankings do not necessarily reflect a school’s ability to place students into NLJ 250 firms.

Many graduates, particularly at the top schools, choose to clerk after graduation instead of working for these law firms. While not all of these graduates would have secured employment at the NLJ 250 firms, many could have. For this reason, one popular technique used to understand a school’s placement ability is adding the percentage of graduates at NLJ 250 firms to the percentage of graduates clerking for Article III judges. This method is not perfect; read our white paper for a more detailed explanation of the strengths and weaknesses of this technique.

Second, NLJ 250 firm jobs are not the only competitive, high-paying firm jobs. Boutique law firms are also very competitive, with some paying New York City market rates and above. Additionally, the NLJ 250 does not include large, prestigious internationally-based law firms with American offices.

Third, not all NLJ 250 firm jobs are equally competitive. Law firms from different regions and of differing caliber have varying preferences for the students from different law schools, including how far into the class they are willing to reach. That is, two schools that place an equal percentage of graduates in NLJ 250 firms may do so for reasons other than similar preferences among equally competitive NLJ 250 firms.

Fourth, the rankings include data only about the law schools that placed at least 6.49% of their entire classes in NLJ 250 firms. All other American law schools placed a lower, unknown percentage at NLJ 250 firms. The remaining schools range from 0% to 6.49%, and probably do not fall into a normal distribution.

If you have more questions, please feel free to email us or reply to this post. We will update this post as needed.

2011 placement into NLJ 250 firms by law school

Rank School NLJ 250 Grads Total Grads % of Class
1 University of Pennsylvania Law School 156 274 56.93%
2 Northwestern University School of Law 149 286 52.1%
3 Columbia Law School 235 455 51.65%
4 Harvard Law School 285 583 48.89%
5 Stanford Law School 87 181* 48.07%
6 University of California, Berkeley School of Law (Boalt Hall) 140 305 45.9%
7 University of Chicago Law School 92 203 45.32%
8 Duke Law School 89 219* 40.64%
9 New York University School of Law 187 466 40.13%
10 University of Virginia School of Law 150 377 39.79%
11 Cornell Law School 72 188* 38.3%
12 University of Southern California Gould School of Law 68 207 32.85%
13 University of Michigan Law School 119 378 31.48%
14 Georgetown University Law Center 198 637 31.08%
15 Yale Law School 61 205 29.76%
16 University of California at Los Angeles School of Law 78 344 22.67%
17 Vanderbilt University Law School 43 195 22.05%
18 Boston College Law School 62 285 21.75%
19 University of Texas School of Law 82 382 21.47%
20 Fordham University School of Law 84 429 19.58%
21 Boston University School of Law 48 269* 17.84%
22 George Washington University Law School 92 518 17.76%
23 University of Notre Dame Law School 26 190 13.68%
24 Washington University School of Law (St. Louis) 42 315 13.33%
25 Washington and Lee University School of Law 16 126 12.7%
26 Emory University School of Law 28 225 12.44%
27 Yeshiva University Benjamin N. Cardozo School of Law 45 380 11.84%
28 University of Washington School of Law 21 182 11.54%
29 University of Minnesota Law School 29 261 11.11%
29 University of Illinois College of Law 21 189 11.11%
31 Southern Methodist University Dedman School of Law 28 272 10.29%
32 University of Houston Law Center 27 281 9.61%
33 West Virginia University College of Law 12 126* 9.52%
34 Wake Forest University School of Law 15 158 9.49%
35 University of California, Davis School of Law 17 195 8.72%
36 University of North Carolina School of Law 21 246 8.54%
37 University of California Hastings College of the Law 35 412 8.5%
38 University of Missouri-Columbia School of Law 12 142* 8.45%
39 Seton Hall University School of Law 24 293 8.19%
40 Rutgers School of Law-Newark 19 248 7.66%
41 Howard University School of Law 12 157* 7.64%
42 Villanova University School of Law 19 252 7.54%
43 University of Maryland School of Law 20 281 7.12%
44 University of Wisconsin Law School 18 254 7.09%
45 Samford University Cumberland School of Law 11 157 7.01%
46 Temple University James E. Beasley School of Law 22 319 6.9%
46 University of Alabama School of Law 12 174* 6.9%
48 Brigham Young University J. Reuben Clark Law School 10 148 6.76%
49 Brooklyn Law School 30 455 6.59%
50 University of Miami School of Law 25 385 6.49%

*Graduate class size based on latest data from the ABA/LSAC Official Guide to Law Schools.

Winter 2012 Transparency Index Report

Today, we’re releasing a new feature on our website. The Transparency Index is an index of every ABA-approved law school website. It measures how transparent law schools are on their websites about their post-graduation outcomes for the class of 2010. From January 1, 2012 to January 3, 2012, the LST team analyzed and documented every site using 19 criteria chosen after contemplating what matters to a prospective law student looking to invest three years and a lot of money in a professional degree. The results from this period are LST’s Winter 2012 Transparency Index.

The Transparency Index is not a ranking system. It would not be very meaningful to rank a school by the number of criteria met because different criteria vary in importance. In other words, just because one school meets more criteria than another school does not mean that the first school is more transparent than the second.

It is also important to note that law school websites are fluid and that schools may respond to external stimuli, including LST’s official request for school NALP reports, by improving their web disclosure policies. In fact, some schools may have improved public employment information shortly after our data collection dates.

Over the next few weeks, we will make the Transparency Index more user friendly and update school information when we learn of the updates. Meanwhile, we encourage law schools to learn from the index, to update their websites with the TIQ Criteria in mind, and to alert us when they do so.

Full report is available here.
Winter 2012 Data is available here.
Live Transparency Index is here.

Executive Summary

As a new year unfolds and the debate about legal education reform continues, efforts in furtherance of law school transparency remain critical. While transparency of law schools’ post-graduation employment data will not solve all of legal education’s problems, it can put pressure on the current law school model and thereby act as a catalyst for broader legal education reform. This is true whether it occurs through the process of seeking transparency or because of the information that such disclosure ultimately reveals.

Having had their long-standing practice of withholding basic consumer information called into question, law schools have responded with new attempts at disclosure in advance of the ABA’s new requirements. Adequate disclosure should be easy to achieve; law schools have possessed ample information, in an easily publishable format, for many months. But as the findings of this report show, the vast majority of U.S. law schools are still hiding critical information from their applicants.

This report reflects LST’s analysis of the class of 2010 employment information available on ABA-approved law school websites in early January 2012. The Winter 2012 Index reveals a continued pattern of consumer-disoriented activity. Our chief findings are as follows:

  • 27% of law schools (54/197) do not provide any evaluable information on their websites for class of 2010 employment outcomes. Of those 54 schools, 22 provide no employment information on their websites whatsoever; the other 32 demonstrate a pattern of consumer-disoriented behavior.
  • 51% of schools fail to indicate how many graduates actually responded to their survey. Response rates provide applicants with a way to gauge the usefulness of survey results, a sort of back-of-the-envelope margin of error. Without the rate, schools can advertise employment rates north of 95% without explaining that the true employment rate is unknown, and likely lower.
  • Only 26% of law schools indicate how many graduates worked in legal jobs. 11% indicate how many were in full-time legal jobs. Just 1% indicate how many were in full-time, long-term legal jobs.
  • 17% of schools indicate how many graduates were employed in full-time vs. part-time jobs. 10% indicate how many were employed in long-term vs. short-term jobs. 10% of schools report how many graduates were employed in school-funded jobs.
  • 49% of schools provide at least some salary information, but the vast majority of those schools (78%) provide the information in ways that mislead the reader.
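The back-of-the-envelope reasoning behind the response-rate bullet can be made concrete: an advertised rate counts only survey respondents, so the true rate for the whole class lies between two bounds depending on what happened to the non-respondents. A hypothetical sketch (the class figures are illustrative, not drawn from any school's report):

```python
def employment_rate_bounds(employed: int, respondents: int, class_size: int):
    """Advertised, worst-case, and best-case employment rates (percent).

    The advertised rate divides by respondents only. Non-respondents'
    outcomes are unknown, so the true class-wide rate lies between
    treating them all as unemployed (worst) and all employed (best).
    """
    unknown = class_size - respondents
    advertised = 100 * employed / respondents
    worst = 100 * employed / class_size              # non-respondents unemployed
    best = 100 * (employed + unknown) / class_size   # non-respondents employed
    return advertised, worst, best

# Hypothetical class: 200 graduates, 140 respond, 133 report being employed.
adv, worst, best = employment_rate_bounds(133, 140, 200)
print(f"advertised {adv:.1f}%, true rate between {worst:.1f}% and {best:.1f}%")
```

Here a school could advertise 95.0% employment even though, class-wide, the true figure could be as low as 66.5% — which is why the response rate matters.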

Taken together, these and other findings illustrate how law schools have been slow to react to calls for disclosure, with some schools conjuring ways to repackage employment data to maintain their images. Our findings play into a larger dialogue about law schools and their continued secrecy against a backdrop of stories about admissions data fraud, class action lawsuits, and ever-rising education costs. These findings raise a red flag as to whether schools are capable of making needed changes to the current, unsustainable law school model without being compelled to through government oversight or other external forces.

LST Requests Class of 2010 Employment Information From Law Schools

This morning, we sent a letter to all ABA-approved law schools asking that they provide us class of 2010 employment information so that we may expand our data clearinghouse. Our goal is to provide thorough and comparable employment information to prospective law students. While some law schools are improving the quality of information they share, it is critical that people be able to compare law schools through a standardized presentation.

It is true that the ABA Section of Legal Education and Admissions to the Bar has taken important first steps towards reducing the provision of misleading information by law schools. However, these steps have failed to include critical information about the class of 2010, including the rate of graduates employed in legal jobs and the rate of graduates employed in full-time jobs. The section underestimated how prospective law students, pundits, and elected officials would react, despite mounting evidence of widespread consumer-disoriented behavior at law schools and within the section.

We hope that schools share our sense of urgency and help us put comparable employment information into the hands of consumers.


Op/Ed on The Careerist: The Cooley Strategy Exposed

This op/ed is available on The Careerist.

The Cooley Strategy

Last week, Nelson Miller, associate dean of Thomas M. Cooley Law School’s Grand Rapids campus, wrote an editorial, “Lawyer Employment Remains Strong,” that appeared in The Careerist. Using employment data from the Bureau of Labor Statistics, he argues that lawyer job prospects are strong, that the legal profession has less risk than others, and that any noise questioning the value of obtaining a J.D. is as erroneous as it is inflammatory.

We will not spend much time discrediting Dean Miller’s “data-based” arguments, including Cooley’s Report One, which is the basis of this latest editorial. (That report has been thoroughly and thoughtfully discredited in an article by Matt Leichter.) To make a long story short, the underlying data upon which Report One depends excludes at least one broad segment of law school graduates: people who never became lawyers in the first place because they couldn’t find legal jobs.

So what is Miller’s editorial really about? Is it just an honest attempt by a law school administrator to educate students and allay unfounded fears propagated through the media? We don’t think so.

Law schools like Cooley are facing significant hardship because prospective students are increasingly informed about the risk of taking on six-figure debt for the chance of entering the legal profession. In addition to numerous anecdotes, we are seeing this play out in fewer LSAT-takers and law school applicants. Unfortunately for these schools, this will translate into fewer people willing to pay $30,000, $40,000, or even $50,000 per year in tuition.

Miller has every incentive to distract consumers and conceal what Cooley graduates face after graduation. The 2009 Cooley graduates had an average law school debt in excess of $106,000, but only 42.2 percent obtained full-time legal work by February 2010. This statistic does not even account for Cooley’s unparalleled attrition rate, and we do not know how 2010 and 2011 graduates fared on these post-graduation metrics because Cooley does not share this information with its applicants.

The truth is that unless Miller and the rest of the Cooley administration can convince almost 2,000 people next year that a Cooley investment is worthwhile, they will be forced to make a series of hard business decisions in the coming years. This includes whether to keep the Michigan-based school’s new Florida campus and other satellite campuses open.

Commissioning reports, in-house rankings, and aggressive public relations are all part of a very smart strategy. The Cooley administration understands how these efforts affect prospective students. If Cooley can convince prospective students that law school is a magic ticket to financial security, it can continue to operate without introspection about what’s really wrong with legal education today.

As prospective students become more informed and the ABA exerts greater oversight to protect consumers of legal education, some enterprising deans will find ways to reduce tuition and class sizes, adapting their schools’ models to stay in business. Others will close up shop for lack of demand. And in the interim period, representatives like Miller will attempt to convince anyone who will listen that there is nothing wrong with taking on $106,000 in nondischargeable debt for their Cooley law degree. This continued, shameless promotion is part of the reason his law school has been hauled into court by former graduates amidst allegations of fraud and misrepresentation.

Miller’s advocacy for his law school at others’ expense betrays his ethical responsibilities as both a lawyer and an educator. This country needs law school administrators who are capable of ethically recruiting and training the next generation of lawyers, judges, advocates, and educators. We do not need people running law schools who engage in Miller’s level of deception.

Case Update: Amended Alaburda Complaint Includes New Allegation

With the recent joint announcement by Law Offices of Dave Anziska and Strauss Law PLLC that the firms have drafted complaints against 15 ABA-approved law schools and intend to file them as class actions, we thought it would be a good idea to revisit the first class action against a law school for misleading employment information. We reached out to the lead attorney handling Alaburda v. TJSL, Brian Procel of Miller Barondess, LLP, for an update on where things stand.

The most recent Amended Complaint (available below), filed September 15, 2011, contains a new allegation:

5. Furthermore, TJSL also misleads students by concealing the fact that these post-graduate employment figures are based on a small sample of graduating students rather than the entire class of graduates. Specifically, TJSL conceals the fact that its statistics are based on surveys and questionnaires that are sent to only a fraction of its graduates. Not all graduates receive surveys or questionnaires.

If discovery reveals this allegation to be true, the school may have more to worry about than the Alaburda complaint.

Risk of ABA Sanctions?

Many schools have defended the gaps in their employment information by stating that graduates simply don’t respond to their requests, and that nothing the school does can get graduates to voluntarily report more and better data. This conclusion is suspect, given that graduates are less likely to report when they feel let down by the school. A high non-response rate should raise eyebrows about the quality of a school’s services. But purposely not contacting certain graduates, if substantiated, may constitute a violation of the ABA’s Accreditation Standards. This would make TJSL subject to probation or even a loss of accreditation.

Such sanctioning could happen irrespective of whether Alaburda’s attorneys are successful in recovering under one or more claims. As weak as the ABA’s current accreditation standards are, law schools must publish “basic consumer information . . . in a fair and accurate manner reflective of actual practice.” What constitutes “basic consumer information” has, in practice, been restricted to the overall employment rate and bar passage data. (This means that schools could technically present any other employment information, e.g. salary statistics, in an inaccurate manner without risking their accreditation.)

But a pattern of failing to survey some graduates looks like it would constitute a violation of the standards, particularly if the behavior was motivated by a belief that the unsurveyed graduates are likely to report undesirable outcomes. Schools are all over the ethical map in terms of how to creatively count or massage the data graduates report to them, but an outright failure to even contact some graduates should not be ignored by the ABA.

Current Students Suing?

Meanwhile, Alaburda’s lead attorney is “optimistic the class will be certified” given that “the alleged misrepresentations are uniform.” Keep in mind, the class includes not only recent graduates but also current law students. Much of the media’s attention has focused on graduates bringing claims against their alma maters, but both the TJSL complaint and the complaints against Cooley and New York Law School contemplate including current students. At least one of the draft complaints to be filed against the 15 additional law schools also lists current law students as eventual members of the class. This could make for an interesting development if any of the classes are certified: current students would continue to pay tuition while waiting to see whether they can recover for the allegedly fraudulent acts that induced them to enroll.

Note: as with the two other firms handling claims against law schools, Mr. Procel reports that they “have received dozens of inquiries from graduates of other law schools who are interested in filing suit.”