Rutgers – Camden School of Law’s Dean Stands by Marketing Campaign

This weekend we wrote about a recruitment letter sent by Rutgers – Camden School of Law’s admissions dean, Camille Andrews. We alleged that the letter contained incomplete, deceptive, and false information, and that as a result Dean Andrews should resign from her post and the ABA should conduct an investigation and bring appropriate sanctions against the law school.

In an article published in Inside Higher Ed, Camden’s Dean Rayman Solomon responded. Neither Dean Solomon nor Dean Andrews responded to us directly, and we have only the portions of Dean Solomon’s statements published by Inside Higher Ed:

Dean Rayman Solomon is standing by Andrews. Solomon said the recruitment material was accurate but that he’s “open to discussion” about the best way to reach prospective students going forward. The promotion in question targeted potential applicants who took the GMAT, not the LSAT, the typical law school admission test. The goal, Solomon said, was to reach a new audience and introduce the Rutgers-Camden program. Students could then go online to get more information.

“This was one letter saying are you interested, have you thought about it?” Solomon said. “This is not our entire marketing campaign. This is telling people that we have a program.”

But were the numbers misleading?

“I don’t know how to respond,” Solomon said. “If you have a hundred people, would four of them be misled? Would one be misled? Would 98 be misled? [It was] a piece that was designed to get people to think about something they hadn’t thought about. This wasn’t the only information they could get about it.”

We appear to agree with Dean Solomon on the purpose. The May 2012 letter was designed to get students who were not known to be interested in law school to think about a legal career, and to consider starting law school as soon as August 2012. We bet we also agree on the following three points:

  • Camden waived the application fee to reduce the application barrier
  • Camden discussed employment outcomes to show its placement successes in a bad economy
  • Camden discussed salary outcomes and salary potential to inform the cost-benefit analysis of the campaign targets

However, we clearly disagree about whether Camden’s employment outcome claims adequately reflect reality and whether targeting people who had not yet expressed interest in law school was appropriate given the very short decision window and lack of knowledge about their professional goals.

Nevertheless, neither LST nor Camden knows the actual effect of the campaign on the letter recipients. Frankly, it doesn’t matter whether many people or zero people enroll. We care about how Camden conducts itself in the law school marketplace; Camden unfairly used employment statistics to bolster its argument that the law school is a safe haven from a bad economy. In this regard Camden crossed the ethical (and likely legal) line from mere puffery to deceptive advertising. These facts are troubling irrespective of whether prospective students are sophisticated, unsophisticated, or indifferent.

The thrust of Dean Solomon’s response is that this is but a single letter, not a big deal, and that it shouldn’t affect decision making. To that we ask: what could the employment statistics have been meant to do other than affect application and enrollment decisions? The letter was part of a recruitment campaign, not a teaser for a movie due out next summer. Camden should strive for all of its communications with students to be accurate and honest. Dean Solomon further suggests that the misinformation is okay because other information is out there. He appears to be saying, “you should know not to take our statements at face value.” That’d be a pitiful position for a law school dean to take.

It’s not acceptable to provide prospective students with false and misleading information just because the truth is available somewhere else. Interpretation 509-4 to ABA Standard 509 clearly states that reporting consumer information accurately somewhere does not absolve a school of its responsibility to present such information fairly and accurately elsewhere.

Interpretation 509-4
Standard 509 requires a law school fairly and accurately to report basic consumer information whenever and wherever that information is reported or published. A law school’s participation in the Council-designated publication referred to in Interpretation 509-2 and its provision of fair and accurate information for that book does not excuse a school from the obligation to report fairly and accurately all basic consumer information published in other places or for other purposes.

It’s worthwhile to emphasize that Dean Solomon disputed our analysis and not our numbers. He also said he is open to discussion. So are we, and we’ve sent him the following email:

We would like to know what specifically in our analysis you believe is incorrect.

1. Does the category “JD Advantage” include only jobs in the legal field?
2. If #1 is no, did any Camden graduates have a “JD Advantage” job not in the legal field? If so, how many?
3. Do you think the advertised private practice starting salary of $74,000 represents the average of all 2011 graduates employed in private practice?
4. How many graduates reported earning salaries of at least $130,000?
5. Do you believe the answer to #4 can fairly be described as “many”?
6. Are statements about employed graduates meaningful without disclosing how many non-employed graduates there are?

Please respond via email. If you do not have adequate information to answer any of these questions, please say so. In addition to the email, we would be happy to schedule a time to talk about the data, our analysis, Camden’s forthcoming remedial measures, and the internal policies Camden plans to adopt to prevent repeat violations of ABA Standard 509.

We reemphasize that the letter must stand on its own merits. This letter was intended to create a first impression with prospective students and paint in their minds a picture of financial security if they attend law school at Rutgers – Camden School of Law. Later discovering that the letter was deceptive does not erase the deception.

We will post a new story if/when Dean Solomon responds.

LST Calls for Dean’s Resignation and ABA Investigation

Last week we became aware of an ongoing recruiting campaign by Rutgers – Camden School of Law that targets students who were not considering law school. As part of this campaign, Camille Andrews, Associate Dean of Enrollment, sent students an email with bold statements about the employment outcomes achieved by the class of 2011. Compared against the school’s self-published employment data, Dean Andrews’s statements range from misleading to plainly false. Because the statements made in this email are demonstrably deceptive and in clear violation of ABA Standard 509, Dean Andrews should resign immediately from her administrative appointment.

There are two important layers to this story. First, Dean Andrews made unfair statements about the employment outcomes of Camden graduates. These statements exaggerate the successful outcomes of Camden graduates and attempt to influence student behavior. The realities of Camden’s placement are far different from what Dean Andrews discloses. (More on this below.)

Second, Camden has extended a special offer for people who haven’t followed the normal application process and haven’t expressed an interest in law school or legal practice. (The email recipients had taken the GMAT, not the LSAT.) The Camden Special allows the students to avoid delay and enroll this August. By portraying Camden as some down-economy safe haven that leads to status and riches, Dean Andrews is attempting to enroll the exact students who ought not to attend law school: people who have not had time to carefully weigh the pros and cons of this significant investment.

In addition to ensuring that Dean Andrews resigns, Camden must also take swift, corrective action in all cases where prospective students received emails containing these or similar false, misleading, or incomplete statements. We also call on the American Bar Association to conduct a full investigation and bring appropriate sanctions against the school for violations of the ABA Standards, especially Standard 509(a) and Interpretation 509-4. Not only is Camden an institution of higher learning, but it also serves as a gateway to the legal profession. The recklessness displayed by Dean Andrews, and by the Camden administration in permitting a representative to deceive potential students, cannot be tolerated. It’s the latest example of a law school facing no accountability for its recruiting practices. These practices must stop.

What follows is an analysis of each unfair statement made by Dean Andrews. We can do this analysis because Camden has made the relevant employment data publicly available, though that accessibility does not excuse false, misleading, and incomplete statements that the administration should know leave readers with incorrect impressions. Each statement is itself a black eye for Rutgers – Camden School of Law, but it’s the cumulative effect of all of the statements, and of law schools’ bad behavior generally, that makes resignation, corrective action, and sanctions imperative.

Analysis of Statements by Dean Andrews for Rutgers – Camden School of Law

[Chart: Camden’s class of 2011 employment data, created from the data Camden provided on its website.]

“[O]f those employed nine months after graduation, 90% were employed in the legal field”

This is problematic on two levels. First, it excludes non-employed graduates from the calculation to provide a false sense of success. There were 242 graduates in Camden’s 2011 graduating class. Of these, 199 were employed. Camden uses 199 as the denominator with no indication that it has excluded 17.8% of the class from the calculation. While the statement does disclose that it is “of those employed,” the number of unemployed graduates is so large that the statement requires context to avoid misrepresenting what it means. The advertised “90% of employed” actually only represents 74% of the whole class.

Second, “in the legal field” implies “as a lawyer,” yet Camden groups non-lawyers with lawyers to create the “in the legal field” category. Specifically, Camden has combined two distinct categories: jobs that require bar admission (154 grads) and jobs where the J.D. was an advantage (25 grads). The advertised “90% of employed” actually works out to 63.6% of the class in lawyer jobs, with another 10.3% in jobs where the J.D. was an advantage.
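Both calculations above are simple denominator swaps, so a short sketch may help readers verify them. This is our own illustration, using only the figures quoted in this post, not anything from Camden’s letter:

```python
# Recomputing Camden's advertised rates from the figures quoted above.
total_grads = 242      # class of 2011
employed = 199         # employed nine months after graduation
bar_required = 154     # jobs requiring bar admission
jd_advantage = 25      # "J.D. Advantage" jobs

in_legal_field = bar_required + jd_advantage   # 179, Camden's combined category

advertised = in_legal_field / employed         # ~89.9%: the letter's "90% of those employed"
whole_class = in_legal_field / total_grads     # ~74.0%: the same jobs over the whole class
lawyer_jobs = bar_required / total_grads       # ~63.6%: lawyer jobs over the whole class
excluded = 1 - employed / total_grads          # ~17.8%: graduates dropped from the denominator

print(f"advertised {advertised:.1%} vs whole class {whole_class:.1%} "
      f"(lawyer jobs {lawyer_jobs:.1%}; excluded {excluded:.1%})")
```

Swapping the denominator from employed graduates to all graduates is the entire difference between the advertised figure and ours.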

The “J.D. Advantage” category that Camden uses to boost its “in the legal field” rate includes jobs as paralegals, law school admissions officers, and a host of jobs not credibly considered “in the legal field.” A graduate falls into this category when the employer sought an individual with a J.D. (and perhaps even required one), or when the J.D. provided a demonstrable advantage in obtaining or performing the job, but the job itself does not require bar passage or an active law license and does not involve practicing law.

“[O]f those employed nine months after graduation . . . 90% were in full time positions.”

This likewise excludes non-employed graduates without indicating that 17.8% of the class has been excluded. Once again, 90% of employed actually means only 74% of the whole class.

“Our average starting salary for a 2011 graduate who enters private practice is in excess of $74,000, with many top students accepting positions with firms paying in excess of $130,000.”

There are a number of distinct problems with this statement. First, Camden does not accurately state what the average reflects. The average is “for a 2011 graduate who enters private practice and reported a salary,” not “for a 2011 graduate who enters private practice.” This is not a trivial distinction. Only 46.6% of graduates in private practice reported a salary, and the responses were slanted towards higher salaries at large firms: 83.3% of graduates at firms with 101 or more attorneys reported their salaries, while only 37.0% of those at smaller firms did. The low overall response rate and the bias towards higher reported salaries mean that the average of responses is not the average “for a 2011 graduate who enters private practice.”

Second, Camden does not disclose the salary response rate. The private practice salary response rate (46.6%) indicates that private practice salaries don’t tell the whole story. The letter also does not state that only 24% of the class was in private practice. This means the “average starting salary” actually reflects the average salary for just 11.2% of the class. None of this was communicated to the recipients of Dean Andrews’ email.
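The coverage math is worth making explicit. A minimal sketch, again using only the percentages quoted above:

```python
# How much of the class does the $74,000 "average starting salary" describe?
# Figures are taken from this post; the calculation is our own.
private_practice_share = 0.24   # share of the class in private practice
salary_response_rate = 0.466    # share of those in private practice who reported a salary

coverage = private_practice_share * salary_response_rate
print(f"the advertised average reflects about {coverage:.1%} of the class")  # ~11.2%
```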

Third, Camden uses the average salary figure without any statistical context. NALP, LST, and many academics have emphasized for years that average salaries tend to mislead more than inform. This is because reported salaries fall into a bimodal distribution. For the class of 2010 (across all law schools), there is one peak from $40,000 to $65,000, accounting for nearly half of reported salaries, and another distinct peak at $160,000. This bimodal distribution means that very few graduates make the mean salary of $84,111.
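To illustrate why the mean of a bimodal distribution describes almost nobody, here is a small synthetic example; the sample is invented to loosely mimic the national shape described above and is not drawn from any school’s actual data:

```python
import statistics

# Synthetic bimodal sample: one cluster between $40k and $65k (most salaries)
# and a second cluster at $160k, loosely mimicking the national class-of-2010 shape.
salaries = [40_000, 45_000, 50_000, 55_000, 60_000, 65_000] * 8 + [160_000] * 15

mean = statistics.mean(salaries)
near_mean = sum(1 for s in salaries if abs(s - mean) <= 10_000)
print(f"mean = ${mean:,.0f}; salaries within $10,000 of the mean: {near_mean}")
# The mean lands in the valley between the two peaks, where no one actually earns.
```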

Based on the salary data Camden publishes on its website, private practice salaries show a distribution similar to the national picture. There were 27 salaries provided; between 8 and 12 were above the $74,000 average by at least 30%, while the rest were below the average, with 14 or more at least 20% below it.

Fourth, Camden claims that many of its top students have accepted positions with firms paying “in excess of $130,000.” To be sure, “many” is ambiguous. It might reasonably mean 40% of the class, or even perhaps 20%. With the “top” qualifier, it might not even strain credibility to claim that 10% of the class constitutes “many” top students. Based on the published data, Camden knows that at most five graduates reported a salary of $130,000+, or 2.1% of the entire class. After analyzing the salary data in detail, we think just one graduate did. Whether it is one or five, “many” is far from accurate.

That said, we do know that eight graduates (or 3.3%) made at least $100,000. We also know that Camden grossly exaggerated the salary outcomes of its graduates right after touting its placement success and right before pointing out how its alumni are among the very richest of all lawyers. Of course, this is the same school that reported to U.S. News that its 2011 graduates had an average of only $27,423 in debt, even though the estimated total debt was well into the six figures for a New Jersey resident who graduated in 2011 with no tuition discount. Fewer than a third (31.7%) of students received tuition discounts, and just 4.3% received more than a 50% discount on tuition.

“Rutgers is also ranked high in the nation at placing its students in prestigious federal and state clerkships.”

Like Camden, we have only class of 2010 data with which to compare clerkship placement. With federal clerkships, Camden does okay, tied for 33rd. In terms of the percentage of students placed in federal clerkships, it’s as close to 16th place as it is to last (188th). Suffice it to say that this exaggeration caps off a legion of false, misleading, and incomplete information used to induce applicants who didn’t even take the LSAT.


Data Clearinghouse Updates

LST has received a number of inquiries from schools since updating our employment data clearinghouse. In most instances the schools did not understand the data they were publishing, either on their websites or through U.S. News.

Of the inquiries we received, two complaints, which came from administrators at Santa Clara and Toledo, warranted updates/corrections. These schools informed us that we were using wrong or incomplete data. They were right in this regard, though the problems stemmed exclusively from what schools supplied to U.S. News.

In Toledo’s case, a law school representative misinterpreted questions on the U.S. News survey and therefore supplied the magazine with incorrect data. In Santa Clara’s case, the school had policies and procedures in place that led to under-reporting what the school actually knew about its graduates. In both cases we were able to work with the schools to identify the source of the problems and have corrected the errors with data supplied by the schools.

(We’d also like to mention that prior to releasing the class of 2010 data clearinghouse, we contacted ten law schools that made the same mistake as Toledo with their U.S. News-supplied data. Four of these schools confirmed the discrepancies and have provided us with the correct data.)

We’ve added the following note to Santa Clara’s class of 2010 profile:

After joint review with Santa Clara, we have restored the school’s profile using data provided by Santa Clara following its internal review of each 2010 graduate’s student file. Rather than relying on student-supplied data, which is what the school reported to U.S. News and reported on its website (the original data in the school’s profile), Santa Clara added data the administration culled from conversations and basic investigation.

Note: One major change is with the 28 jobs Santa Clara originally reported as non-professional. Santa Clara tells us “[t]his was done in error.” While these graduates were still employed, Santa Clara does not know what sort of credentials (e.g. bar passage required) those graduates’ jobs required. However, Santa Clara does know 12 of these 28 graduates’ employer types (e.g. law firm) and expected working hours (i.e. FT or PT).

We are happy to report that Toledo chose to resolve questions about graduate employment outcomes by disclosing its 2010 NALP report, joining 47 other ABA-approved law schools that have done so. Toledo’s class of 2010 NALP report now appears in our report database and can be viewed here.

Finally, we welcome law schools to continue contacting us with their concerns. These conversations are always valuable and almost always lead to improving law school transparency.

Progress Towards Law School Transparency

We have a few updates on our progress towards greater law school transparency. The first is a rundown of voluntary website improvements made by law schools in advance of the ABA standards reform. The second is to announce that LST has obtained its 40th NALP report from an ABA-approved law school.

Transparency Index: Class of 2010

Back in January, we released the Transparency Index, an index of every ABA-approved law school website. It measures how transparent law schools are on their websites in detailing post-graduation outcomes for the class of 2010 through the lens of 19 criteria.

We said:

Taken together, these and other findings illustrate how law schools have been slow to react to calls for disclosure, with some schools conjuring ways to repackage employment data to maintain their images. Our findings play into a larger dialogue about law schools and their continued secrecy against a backdrop of stories about admissions data fraud, class action lawsuits, and ever-rising education costs. These findings raise a red flag as to whether schools are capable of making needed changes to the current, unsustainable law school model without being compelled to through government oversight or other external forces.

There is good, bad, and disappointing news. The bad news is that secrecy is still the norm, with schools still opting to selectively withhold class of 2010 information to suit their own interests (and to the possible detriment of prospective students). With this year’s admissions cycle well underway, and admitted students closing in on their final decisions, people could really use the critical information schools have at their fingertips. While we can place some responsibility at the feet of those who knowingly choose to attend schools that withhold critical information, their poor choices still stem from law schools’ bad acts. Many prospective students still do not understand the extent of the gaps in information, which law schools have become so adept at hiding.

The good news is that we’ve seen big improvements to a number of law school websites following the publication of our Winter 2012 Transparency Index Report. Further, these figures likely understate the improvements: we’ve only updated the Live Index as we’ve come across evidence of improvements or have been directed to updates (usually by the schools themselves). As more and more schools respond positively to criticism, it is also getting easier to identify who the bad actors are.

  • 22% of schools do not provide evaluable class of 2010 information, down from 27%.
  • 64% of schools indicate how many graduates actually responded to their survey, up from 49%. Response rates provide applicants with a way to gauge the usefulness of survey results, a sort of back-of-the-envelope margin of error; see the sketch after this list. Without the rate, schools can advertise employment rates north of 95% without explaining that the true employment rate is unknown, and likely lower.
  • 39% of law schools indicate how many graduates worked in legal jobs, up from 26%. 20% provide the full-time legal rate, up from 11%. We are aware of no additional schools providing the full-time, long-term employment rate. (It is still just two schools, or 1%.)
  • 28% of schools indicate how many graduates were employed in full-time vs. part-time jobs, up from 17%.
  • 16% indicate how many were employed in long-term vs. short-term jobs, up from 10%.
  • 16% of schools report how many graduates were employed in school-funded jobs, up from 10%.
  • 59% of schools provide at least some salary information, up from 49%. But the majority of the 59% of schools (63%) provide the information in ways that mislead the reader, down from 78%.
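
As promised in the response-rate bullet above, here is that back-of-the-envelope calculation in code; the 95% advertised rate and 70% response rate are hypothetical values chosen for illustration, not any particular school’s figures:

```python
def whole_class_bounds(respondent_rate: float, response_rate: float) -> tuple[float, float]:
    """Bounds on the whole-class employment rate implied by a respondent-only rate.

    The low bound assumes every non-respondent is unemployed; the high bound
    assumes every non-respondent is employed.
    """
    low = respondent_rate * response_rate
    high = respondent_rate * response_rate + (1 - response_rate)
    return low, high

# Hypothetical school: advertises "95% employed," but only 70% of graduates responded.
low, high = whole_class_bounds(0.95, 0.70)
print(f"true employment rate: somewhere between {low:.1%} and {high:.1%}")  # 66.5% to 96.5%
```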

These are substantial changes. The schools that have made them (list available here) deserve some praise. However, it needs to be restated that every school could meet all 19 of the criteria we used in the Transparency Index, so our praise comes with that caveat.

The disappointing news is that one of the most transparent schools from our original report has decided to be less transparent. The University of Houston Law Center previously met thirteen criteria; it now meets only seven. (Original Disclosure; New Disclosure.)

In particular, Houston no longer shares the percentage of graduates in full-time and part-time jobs, in school-funded jobs, and in full-time legal jobs. It also no longer indicates when the graduates obtained the job (timing), nor how the graduate obtained the job (source). The school now also provides misleading salary information because the school no longer indicates the response rate for each salary figure provided.

When asked why Houston took such a huge step backwards, the dean of career services said the school was now just doing what other schools were doing. She also claimed the change was an overall improvement because the new presentation includes 2009 and 2008 employment data, although that is barely more than what’s already available in our data clearinghouse (2008 & 2009).

For the unacquainted, Houston copied the University of Chicago’s presentation standard, and in doing so actually decreased its level of disclosure. We criticized Chicago’s standard back in January for this particular reason:

Last month, the University of Chicago Law School received widespread acclaim for its decision to provide improved consumer information about the class of 2010. We believe Chicago received acclaim because the job outcomes for Chicago’s 2010 graduates appear to be strong relative to other law schools’ outcomes. The positive responses confused the unexpected quality of outcomes, which for Chicago graduates remained strong despite the contraction in attorney hiring, with the actual quality of disclosure. Chicago coupled tabular data with language about the need for transparency, leading people to claim that Chicago has set the market. But if every law school disclosed employment data according to Chicago’s incomplete methodology, most would still continue to mislead prospective law students.

The Chicago methodology does not distinguish between full- and part-time jobs, between long- and short-term jobs, or whether positions are funded by the law school. Nor does it indicate the numerator for the salary figures. For Chicago, this is a relatively minor oversight because it collected salary data from 94% of employed graduates. But the further the response rate moves away from 100%, the more important it is that the rate be disclosed for every category that a school provides salary information for. Few, if any, other schools have a response rate above 90%.

Predictably, Chicago’s puffery about its dedication to transparency has done more harm than good.

LST Obtains Its 40th NALP Report

There remains reason to be optimistic, however. LST has obtained NALP reports from six more law schools since our original announcement.

This brings the total to 40 law schools. While such incremental improvements to transparency suggest a long road ahead, we do consider 40 a significant threshold. LST will continue advocating for law schools to share their class of 2010 reports and, when the time comes, will officially request the class of 2011 NALP reports too. Accepted students who don’t want to wait until then can start contacting the schools to request the 2011 data, now that we have passed the February 15 reporting deadline. If you are successful in leveraging your acceptances to procure the data, please consider sharing it with us and directly with other applicants on popular discussion board sites like TLS.

Class of 2011 NLJ 250 Statistics

The National Law Journal (NLJ) released its annual report this weekend on the law schools that send the most graduates to the 250 largest American law firms (NLJ 250). In this post we’ll answer a few basic questions about this important employment outcome measure. This is the first publicly available employment information for the class of 2011.

What is the NLJ 250?

The NLJ 250 includes the 250 largest law firms headquartered in the United States. This is measured by the firm-reported annual average number of full-time and full-time equivalent attorneys working at the firm, in any office, in 2011. This does not include temporary or contract attorneys.

Where do the data come from?

First, the NLJ collects survey data from the law firms themselves, not the law schools. A significant percentage of all NLJ 250 firms responded to the survey about first-year hiring. (The NLJ would not comment on the exact percentage.) The NLJ then contacts the law schools to fill in the gaps — but never relies directly on their word. The final figures reached are minimums, representing only the people the NLJ verified to their liking; at no point does the NLJ extrapolate from a smaller number to a larger number.

What do these numbers tell us?

Large firm placement percentage is an important, albeit imperfect, proxy for the number of graduates with access to the most competitive and highest paying jobs. The percentage, accordingly, tells us which schools most successfully place students in these highly sought-after jobs. Successful large firm placement is best analyzed by looking at multiple years’ worth of data. (View the NLJ 250 from 2010 here.)

What do these numbers not tell us?

First, self-selection controls all post-graduation outcomes. Nobody is coerced into a job they are offered (unless you consider debt pressure or other strong personal influences coercive), so these numbers do not provide more than a proxy for opportunities. Opportunities, after all, are prospective students’ real concern when analyzing employment information, and these rankings do not necessarily reflect a school’s ability to place students into NLJ 250 firms.

Many graduates, particularly at the top schools, choose to clerk after graduation instead of working for these law firms. While not all of these graduates would have secured employment at the NLJ 250 firms, many could have. For this reason, one popular technique used to understand a school’s placement ability is adding the percentage of graduates at NLJ 250 firms to the percentage of graduates clerking for Article III judges. This method is not perfect; read our white paper for a more detailed explanation of the strengths and weaknesses of this technique.
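Here is a minimal sketch of that combination technique; the school names and placement figures below are hypothetical illustrations, not numbers taken from the NLJ report:

```python
# Rough proxy for large-firm opportunity: NLJ 250 placement plus Article III
# clerkships. School names and figures are invented for illustration only.
schools = {
    "School A": {"nlj250": 0.40, "article_iii": 0.15},
    "School B": {"nlj250": 0.45, "article_iii": 0.03},
}

for name, rates in schools.items():
    proxy = rates["nlj250"] + rates["article_iii"]
    print(f"{name}: {proxy:.1%} combined placement proxy")
# School A edges out School B once clerks are counted (55.0% vs 48.0%), even
# though School B sends a larger share of its class directly to NLJ 250 firms.
```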

Second, NLJ 250 firm jobs are not the only competitive, high-paying firm jobs. Boutique law firms are also very competitive, with some paying New York City market rates and above. Additionally, the NLJ 250 does not include large, prestigious internationally-based law firms with American offices.

Third, not all NLJ 250 firm jobs are equally competitive. Law firms from different regions and of differing caliber have varying preferences for the students from different law schools, including how far into the class they are willing to reach. That is, two schools that place an equal percentage of graduates in NLJ 250 firms may do so for reasons other than similar preferences among equally competitive NLJ 250 firms.

Fourth, the rankings include data only about the law schools that placed at least 6.49% of their entire class in NLJ 250 firms. All other American law schools placed a lower, unknown percentage at NLJ 250 firms. The remaining schools range from 0% to 6.49%, and probably do not fall into a normal distribution.

If you have more questions, please feel free to email us or reply to this post. We will update this as needed.

2011 placement into NLJ 250 firms by law school

Rank School NLJ 250 Grads Total Grads % of Class
1 University of Pennsylvania Law School 156 274 56.93%
2 Northwestern University School of Law 149 286 52.1%
3 Columbia Law School 235 455 51.65%
4 Harvard Law School 285 583 48.89%
5 Stanford Law School 87 181* 48.07%
6 University of California, Berkeley School of Law (Boalt Hall) 140 305 45.9%
7 University of Chicago Law School 92 203 45.32%
8 Duke Law School 89 219* 40.64%
9 New York University School of Law 187 466 40.13%
10 University of Virginia School of Law 150 377 39.79%
11 Cornell Law School 72 188* 38.3%
12 University of Southern California Gould School of Law 68 207 32.85%
13 University of Michigan Law School 119 378 31.48%
14 Georgetown University Law Center 198 637 31.08%
15 Yale Law School 61 205 29.76%
16 University of California at Los Angeles School of Law 78 344 22.67%
17 Vanderbilt University Law School 43 195 22.05%
18 Boston College Law School 62 285 21.75%
19 University of Texas School of Law 82 382 21.47%
20 Fordham University School of Law 84 429 19.58%
21 Boston University School of Law 48 269* 17.84%
22 George Washington University Law School 92 518 17.76%
23 University of Notre Dame Law School 26 190 13.68%
24 Washington University School of Law (St. Louis) 42 315 13.33%
25 Washington and Lee University School of Law 16 126 12.7%
26 Emory University School of Law 28 225 12.44%
27 Yeshiva University Benjamin N. Cardozo School of Law 45 380 11.84%
28 University of Washington School of Law 21 182 11.54%
29 University of Minnesota Law School 29 261 11.11%
29 University of Illinois College of Law 21 189 11.11%
31 Southern Methodist University Dedman School of Law 28 272 10.29%
32 University of Houston Law Center 27 281 9.61%
33 West Virginia University College of Law 12 126* 9.52%
34 Wake Forest University School of Law 15 158 9.49%
35 University of California, Davis School of Law 17 195 8.72%
36 University of North Carolina School of Law 21 246 8.54%
37 University of California Hastings College of the Law 35 412 8.5%
38 University of Missouri-Columbia School of Law 12 142* 8.45%
39 Seton Hall University School of Law 24 293 8.19%
40 Rutgers School of Law-Newark 19 248 7.66%
41 Howard University School of Law 12 157* 7.64%
42 Villanova University School of Law 19 252 7.54%
43 University of Maryland School of Law 20 281 7.12%
44 University of Wisconsin Law School 18 254 7.09%
45 Samford University Cumberland School of Law 11 157 7.01%
46 Temple University James E. Beasley School of Law 22 319 6.9%
46 University of Alabama School of Law 12 174* 6.9%
48 Brigham Young University J. Reuben Clark Law School 10 148 6.76%
49 Brooklyn Law School 30 455 6.59%
50 University of Miami School of Law 25 385 6.49%

*Graduate class size based on latest data from the ABA/LSAC Official Guide to Law Schools.

LST Obtains 34 Class of 2010 NALP Reports

Update: We were alerted that Indiana–Bloomington had made its NALP report available online prior to this story’s publication. We’ve made this correction throughout this story.

Update 2: We apologize to the University of Utah S.J. Quinney College of Law. On February 10th, the school provided LST its NALP report, but we did not include it in this story.

Update 3: To see more schools that have since complied, see here.

On December 14, 2011, we wrote all ABA-approved law school deans to request the class of 2010 NALP report that each school received in June 2011. We are pleased to announce that we have obtained 34 of these reports (17.3% of ABA-approved law schools and 17.8% of schools with NALP reports). 30 of the 34 law schools sent a report to LST. The other four were pulled from school websites.

We asked for these reports to help prospective law students find the law schools that best meet their career objectives. Together, these reports provide prospectives access to timely, thorough, and comparable employment information. They make LST’s website an even more valuable source of information for prospective law students. We hope to soon add the employment data contained in the NALP reports to our data clearinghouse. In the meantime, these reports are available here. In addition, the list below links to each school’s NALP report.

Schools Providing the Class of 2010 NALP Report to LST (30):

Non-Responding Schools with accessible NALP Reports (4):

  • Akron
  • Northern Illinois University
  • Thomas Jefferson
  • Indiana–Bloomington

We want to make a special note that six law schools do not submit employment data to NALP and therefore do not have these forms: Notre Dame, Pepperdine, St. Louis, and the three law schools in Puerto Rico. This does not mean that these schools lack ample employment data, however. Rather, they are simply the only schools without a prepared form in hand that could quite easily be disclosed to prospective law students.

Interestingly, the vast majority of schools providing LST with the 2010 NALP reports were public institutions. This may be because these schools interpreted our request as an open records request. Two schools (Wisconsin and UNC) explicitly treated our request in this way.

We received a handful of responses expressly declining to provide LST with the NALP report. Consistent with past communications with law schools, a number of schools indicated that they had meaningful employment information already available on their websites. In almost every case, this was (and remains) false.

In addition, there were two “no” responses that stood out. The full text of the responses is below.

First, Ave Maria argued that its NALP report does not provide meaningful information. Dean Milhizer claimed, “Our school is small with a unique mission, and our employment outcomes are reflective of this.” He suggested that additional information would be needed to assist prospective students. He did not, however, suggest what kind of information would be useful. The school’s website contains little information on what graduates found for work, with much of that information serving to mislead applicants.

Second, Chapman University argued that its report was confidential. Dean Campbell’s response is very misleading. It is true that NALP promises that it will not share the graduate-level data or the reports generated from those data with anyone except the law school. But schools are under no such obligation, either contractually or (for the NALP reports) legally. If it were required to keep the data confidential, Chapman could not provide these employment statistics on its website. Instead, the NALP reports remain confidential only because law schools decline to share them with those who would find the information most useful. Fortunately for prospective students, at least 33 schools disagree that the NALP reports are confidential.

Overall, as the Live Transparency Index shows, schools are increasingly sharing more employment information. But the vast majority of law schools still leave critical gaps in their presentation of employment information – gaps which the NALP reports would fill. These 34 law schools have demonstrated leadership that is sorely lacking at other law schools. While these schools still need to choose how to share employment information on their websites, they understand the importance of providing free access to comparable information now. Our hope is that these schools pave the way for changes at other schools, many of which are still acting as if their applicants do not deserve access to comparable consumer information.

Ave Maria School of Law:

Dear Mr. McEntee and Mr. Lynch:

I applaud your efforts to assist prospective law students in obtaining reliable information about the law schools of interest to them. However, Ave Maria School of Law does not believe that its Class of 2010 Summary Report from NALP will provide meaningful information about our school to prospective students. Our school is small with a unique mission, and our employment outcomes are reflective of this. In our judgment, the Summary Report does not provide sufficient information about the types of positions obtained by our 2010 graduates, and so to release the report in a vacuum without additional information would not be of assistance to prospective students.

This year, as in years past, AMSL will comply with NALP’s guidelines on reporting employment outcomes for the Class of 2011, and we will be participating in the ABA’s new annual survey of these outcomes.

Sincerely,

Eugene R. Milhizer
President and Dean

Chapman University:

Dear Law School Transparency,

You have requested that our School of Law send you the otherwise confidential report that we supply to NALP. We have years of experience with NALP, and on that basis, we know that the information we send will be appropriately treated, consistent with good ethics and all applicable federal and state laws. Regrettably, we do not have that kind of basis with your organization. Accordingly, we think it is most appropriate to continue to keep our submission to NALP confidential.

Nevertheless, our School of Law maintains a very informative website, and we post a great deal of data on our graduates’ employment there. That information is available to the public, and we invite you to consult this source if you would like.

Sincerely,

Tom Campbell
Dean

Breaking: 12 more law schools facing class actions

The Law Offices of David Anziska, together with Strauss Law PLLC and six other law firms, publicly announced moments ago that they have filed complaints against 12 more law schools. To date, 15 of the country’s 197 ABA-approved law schools are facing class action suits. (Thomas Jefferson, New York Law School, and Thomas Cooley have already been sued, with the first lawsuit already in discovery.)

These lawsuits should be of grave concern to the ABA, both as the only federally-recognized accrediting body and as the legal profession’s largest and most powerful trade organization. Nearly 8% of its member schools have been formally accused of fraud by 74 former students. While positive results for the plaintiffs would further confirm what LST has drawn attention to over the past two years, the underlying problem of poor ABA governance will remain unchanged by the results. Recent efforts to reform the accreditation standards are a start, but the ABA has yet to show that it will take any significant corrective action against schools. While these lawsuits will attempt to hold schools accountable for past misleading actions, it will be up to the ABA to ensure its member schools do not continue the fraud that is widespread throughout American legal education.

The new batch includes 11 schools from Anziska and Strauss’s October 2011 announcement. The twelfth is Golden Gate University School of Law, as Above the Law announced late last year.

All 12 Schools:

  • Albany Law School
  • Brooklyn Law School
  • California Western School of Law
  • Chicago-Kent College of Law
  • DePaul University College of Law
  • Florida Coastal School of Law
  • Golden Gate University School of Law
  • Hofstra Law School
  • John Marshall School of Law (Chicago)
  • Southwestern Law School
  • University of San Francisco School of Law
  • Widener University School of Law

As momentum for holding law schools accountable grows and people start to realize the courts are their only remedy, LST expects more class actions will be filed this year. These allegations concern a long history of consumer-disoriented behavior, which unfortunately continues today at a great number of schools. LST’s Winter 2012 Transparency Index shows just how poorly the newly-sued schools are doing when it comes to being honest about what their graduates found for work. Just one of the twelve schools currently discloses the number of graduates who found full-time, permanent jobs for which bar passage was required.

Transparency Index Performance of Newly-Sued Schools

School State Transparency Index Performance
Albany Law School NY Does not indicate # in FT/PT jobs or LT/ST jobs. Provides Legal Employment Rate.
Brooklyn Law School NY Does not indicate # in school-funded jobs, FT/PT jobs, or LT/ST jobs. Provides misleading salary figures.
California Western School of Law CA Struggled with its graduate survey response rate more than most schools. Does not indicate # in school-funded jobs, FT/PT jobs, or LT/ST jobs. Provides misleading salary figures.
Chicago-Kent College of Law IL Does not indicate # in school-funded jobs, FT/PT jobs, or LT/ST jobs. Provides misleading salary figures.
DePaul University College of Law IL Does not indicate graduate survey response rate. Does not indicate # in school-funded jobs, FT/PT jobs, or LT/ST jobs. Provides misleading salary figures.
Florida Coastal School of Law FL Struggled with its graduate survey response rate more than most schools. Does not indicate # in school-funded jobs, FT/PT jobs, or LT/ST jobs. However, it does provide the Legal Employment Rate. Provides misleading salary figures.
Golden Gate University School of Law CA Struggled with its graduate survey response rate more than most schools. Does not indicate # in school-funded jobs or LT/ST jobs. However, it does provide the FT Legal Employment Rate.
Hofstra Law School NY Does not indicate # in school-funded jobs, FT/PT jobs, or LT/ST jobs. Provides misleading salary figures and employer list.
John Marshall School of Law (Chicago) IL Does not indicate # in school-funded jobs or LT/ST jobs. Provides the FT Legal Employment Rate. Provides many misleading salary figures.
Southwestern Law School CA One of the best performing schools with 12 met criteria. One of two schools that currently provide the Full-time, Long-term Legal Employment Rate. Does not indicate # in school-funded jobs.
University of San Francisco School of Law CA Does not provide employment statistics on its website.
Widener University School of Law DE/PA Struggled with its graduate survey response rate more than most schools. Does not indicate # in school-funded jobs, FT/PT jobs, or LT/ST jobs. However, it does provide the FT Legal Employment Rate.


Winter 2012 Transparency Index Report

Today, we’re releasing a new feature on our website. The Transparency Index is an index of every ABA-approved law school website. It measures how transparent law schools are on their websites about their post-graduation outcomes for the class of 2010. From January 1, 2012 to January 3, 2012, the LST team analyzed and documented every site using 19 criteria chosen after contemplating what matters to a prospective law student looking to invest three years and a lot of money in a professional degree. The results from this period are LST’s Winter 2012 Transparency Index.

The Transparency Index is not a ranking system. It would not be very meaningful to rank a school by the number of criteria met because different criteria vary in importance. In other words, just because one school meets more criteria than another school does not mean that the first school is more transparent than the second.

It is also important to note that law school websites are fluid and that schools may respond to external stimuli, including LST’s official request for school NALP reports, by improving their web disclosure policies. In fact, some schools may have improved public employment information shortly after our data collection dates.

Over the next few weeks, we will make the Transparency Index more user friendly and update school information when we learn of the updates. Meanwhile, we encourage law schools to learn from the index, to update their websites with the TIQ Criteria in mind, and to alert us when they do so.

Full report is available here.
Winter 2012 Data is available here.
Live Transparency Index is here.

Executive Summary

As a new year unfolds and the debate about legal education reform continues, efforts in furtherance of law school transparency remain critical. While transparency of law schools’ post-graduation employment data will not solve all of legal education’s problems, it can put pressure on the current law school model and thereby act as a catalyst for broader legal education reform. This is true whether it occurs through the process of seeking transparency or because of the information that such disclosure ultimately reveals.

Having had their long-standing practice of withholding basic consumer information called into question, law schools have responded with new attempts at disclosure in advance of the ABA’s new requirements. Adequate disclosure should be easy to achieve; law schools have possessed ample information, in an easily publishable format, for many months. But as the findings of this report show, the vast majority of U.S. law schools are still hiding critical information from their applicants.

This report reflects LST’s analysis of the class of 2010 employment information available on ABA-approved law school websites in early January 2012. The Winter 2012 Index reveals a continued pattern of consumer-disoriented activity. Our chief findings are as follows:

  • 27% (54/197) do not provide any evaluable information on their websites for class of 2010 employment outcomes. Of those 54 schools, 22 do not provide any employment information on their website whatsoever. The other 32 schools demonstrate a pattern of consumer-disoriented behavior.
  • 51% of schools fail to indicate how many graduates actually responded to their survey. Response rates provide applicants with a way to gauge the usefulness of survey results, a sort of back-of-the-envelope margin of error. Without the rate, schools can advertise employment rates north of 95% without explaining that the true employment rate is unknown, and likely lower.
  • Only 26% of law schools indicate how many graduates worked in legal jobs. 11% indicate how many were in full-time legal jobs. Just 1% indicate how many were in full-time, long-term legal jobs.
  • 17% of schools indicate how many graduates were employed in full-time vs. part-time jobs. 10% indicate how many were employed in long-term vs. short-term jobs. 10% of schools report how many graduates were employed in school-funded jobs.
  • 49% of schools provide at least some salary information, but the vast majority of those schools (78%) provide the information in ways that mislead the reader.

Taken together, these and other findings illustrate how law schools have been slow to react to calls for disclosure, with some schools conjuring ways to repackage employment data to maintain their images. Our findings play into a larger dialogue about law schools and their continued secrecy against a backdrop of stories about admissions data fraud, class action lawsuits, and ever-rising education costs. These findings raise a red flag as to whether schools are capable of making needed changes to the current, unsustainable law school model without being compelled to through government oversight or other external forces.

Class Action Updates: Plaintiffs’ Reply to Cooley’s Motion to Dismiss

In response to Thomas M. Cooley Law School’s motion to dismiss, the plaintiffs, represented by David Anziska and Jesse Strauss, have filed a reply brief.

The plaintiffs are asking the court to allow their case to proceed. They allege that Cooley “has been systematically defrauding thousands of prospective and enrolled students by reporting deceptive and misleading job placement data and salary information in a misguided attempt to inflate the value of a Cooley degree and thereby draw millions of undeserved tuition dollars.”

Cooley previously raised a number of defenses as to why it should not be subject to consumer protection claims. The reply addresses each defense in turn. Of note is the response to Cooley’s unreasonable reliance claim:

Cooley next alleges that to the extent that Plaintiffs relied upon the deceptive and misleading employment data, that reliance was unreasonable because Plaintiffs should have known that far fewer than the reported amount of Cooley graduates actually obtained full-time, permanent employment that required a Cooley degree. Def.’s Memorandum of Law, p. 39. In other words, Cooley has the audacity to argue that its own graduates unreasonably relied on Cooley’s marketing materials because they should have realized that Cooley’s reported employment statistics were inaccurate and that most Cooley graduates do not obtain full-time, permanent employment for which a JD degree is required or preferred. Aside from making a cynical and unprincipled argument, Cooley misstates the law.

The reply is attached.