New Law School Jobs Data Indicate Flat Entry-Level Legal Market

FOR IMMEDIATE RELEASE – LST has made class of 2013 job outcome data for 200 ABA-approved law schools available on the LST Score Reports. The Score Reports help students make smarter application and enrollment choices using admissions, employment, and cost information.

LST’s analysis of class of 2013 data collected by the American Bar Association sheds considerable light on how difficult the job market remains for law school graduates. These graduates fared 0.8 percentage points better than last year’s graduates on the key lawyer jobs statistic: 57.0% of 2013 graduates were employed in full-time, long-term legal jobs. Excluding jobs funded by the law schools, the figure is 55.3%, just a 0.2 percentage-point improvement over the class of 2012.

A devastating 26.8% were either underemployed (in short-term, part-time, or non-professional jobs) or not employed (unemployed or pursuing an additional degree). Unemployment is up to 13.7% from 13.2%, while underemployment is down to 11.4% from 12.5%.

Full-time, Long-Term Legal Jobs:

The national full-time, long-term legal rate is 57.0%.

  • By definition these jobs:
    • require bar passage or are judicial clerkships; and
    • require 35+ hours per week and have an expected duration of at least one year.
  • At 64 law schools (31.8%), 50% or fewer of graduates had these legal jobs.
    • 33 schools (16.4%) had 40% or less;
    • 13 schools (6.5%) had 33% or less.
  • 103 schools (51.2%) exceeded the national rate of 57.0%.
    • 51 schools (25.4%) had 66% or more;
    • 21 schools (10.4%) had 75% or more;
    • 5 schools (2.5%) had 90% or more.

The national full-time, long-term legal rate, excluding jobs funded by law schools, is 55.3%.

  • The richest schools were able to hire their struggling graduates full time and long term; only 18 schools (9.0%) employed 5.0% or more of their graduates in school-funded, long-term, full-time jobs that required bar passage.
    • 50% of these schools (9) were in the top 20 on the full-time, long-term rate without the benefit of the school-funded jobs; including school-funded jobs in the rate puts 67% of those schools (12) in the top 20.
    • Excluding school-funded jobs from the full-time, long-term legal rate caused all 5 schools over 90% to drop below that threshold.
  • Although the absolute number of full-time, long-term legal jobs funded by schools was relatively small (775, or 2.0% of all employed graduates), there were 50% more of these jobs this year than last year.

Underemployed or Not Employed:

  • The national rate is 26.8%.
  • A graduate counts as underemployed when he or she is in a non-professional job or employed in a short-term or part-time job.
  • A graduate counts as not employed when he or she is unemployed or pursuing an additional advanced degree.
  • 192 schools (95.5%) reported a rate of 10% or more.
    • 163 schools (81.1%) had 20% or more;
    • 129 schools (64.2%) had 25% or more;
    • 74 schools (36.8%) had 33% or more;
    • 36 schools (17.9%) had 40% or more;
    • 14 schools (7.0%) had 50% or more.
  • 30 schools (14.9%) had more underemployed and non-employed graduates than graduates employed in long-term, full-time legal jobs.
    • Last year, 24 schools qualified.
    • If we compare all long-term, full-time professional jobs (legal or not), 16 schools (8.0%) qualify.

Large Firms (at least 101 attorneys):

  • 12.9% of graduates were employed at large firms in full-time, long-term positions.
    • Graduates seek these jobs partly because they tend to pay the highest salaries.
    • Note that not all of these jobs are associate positions. An unknown number are paralegals, administrators, and staff attorneys.
    • This number is up 0.7 percentage points from 12.2% last year. These jobs are particularly unevenly distributed across law schools: graduates from 25 schools account for over 60% of these jobs; graduates from 10 schools account for 37%.
  • At only 63 schools (31.3%) were 10% or more of graduates in these jobs.
    • 28 schools (13.9%) had 20% or more;
    • 16 schools (8.0%) had 33% or more;
    • 9 schools (4.5%) had 50% or more;
    • 2 schools (1.0%) had 60% or more.

Although the class of 2013 is the largest ever at 46,776 graduates, it was only 0.8% larger than the class of 2012. The raw number of long-term, full-time legal jobs increased slightly by 574 jobs to 26,653. If we exclude positions funded by law schools, the raw number increased by just 319 jobs to 25,878. Overall, class of 2013 job statistics indicate a flat legal job market.
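As a sanity check on the figures above, the headline rates follow directly from the raw counts. This is a minimal sketch using only the totals stated in this release:

```python
# Raw counts reported for the class of 2013.
grads = 46_776                 # largest graduating class ever
legal_jobs = 26_653            # full-time, long-term legal jobs
legal_jobs_no_sf = 25_878      # same, excluding school-funded positions

rate = legal_jobs / grads * 100
rate_no_sf = legal_jobs_no_sf / grads * 100

print(f"FTLT legal rate: {rate:.1f}%")                # 57.0%
print(f"Excluding school-funded: {rate_no_sf:.1f}%")  # 55.3%
```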

The future remains grim for prospective law students. Law school enrollment was nearly 40,000 in the most recent year. The current entry-level legal market cannot support such large classes.

In addition to recent job outcome data, the Bureau of Labor Statistics projects only 19,650 new law jobs per year between 2012 and 2022, a number that is 10% less than an estimate two years ago that projected 21,880 new jobs per year between 2010 and 2020. That ten-year prediction was 9% less than an estimate a few years prior that projected 24,040 new lawyer jobs per year between 2008 and 2018.

These new jobs include all legal jobs, whether full-time or part-time, permanent or temporary. The BLS labor economists base occupation predictions on econometric models, together with continuous monitoring of the workplace. The macroeconomic model predictions aim to reflect how many new entrants the economy will support in each occupation. These estimates account for economic growth, structural change, retirement, and a host of other variables.

Labor market weaknesses amplify the troubling cost of obtaining a legal education in the United States. Students entering this fall (who graduate in 2017) will likely fare better on the job market. But even if every law school graduate obtained a job, the sky-high cost of legal education means that expected salaries for law school graduates threaten economic hardship. For many, it will be impossible to fulfill their student loan obligations without relying on the generosity of federal hardship programs, which Congress may greatly scale back in the near future.

The message from Law School Transparency to prospective law students remains the same: if you choose to go to law school, carefully assess the costs and the benefits. Focus on where graduates work (geographically) because 2 in 3 employed graduates work in the state in which their law school is located. Use our resources to study law school job outcomes, engage in financial planning, and negotiate the best deal you can with the law schools that can meet your career goals.

For the vast majority of prospective law students who have not received a sizable scholarship, it makes sense to wait for prices to drop further. If you decide to attend, it is essential to negotiate scholarship terms, not just the scholarship amount. You should seek to reduce or eliminate GPA or class rank stipulations, as well as to ensure that your scholarship will grow in proportion to law school tuition increases.

+ School profiles:
+ Comparison charts:


* * *


Established in 2009, Law School Transparency is a nonprofit legal education policy and watchdog organization. Our mission is to make entry to the legal profession more transparent, affordable, and fair.

Law School Graduates Continue to Face Brutal Entry-Level Market

LST has made Class of 2012 job outcome data (as of nine months after graduation) available for the 200 ABA-approved law schools. School-specific profiles and regional comparisons can be viewed on the LST Score Reports. The Score Reports are a popular law school decision tool based on employment data. They are not rankings; they help students make decisions on a local and regional basis rather than on a meaningless national scale.

The Class of 2012 outcome data shed considerable light on how difficult the job market remains for law school graduates. These graduates fared 1 percentage point better than last year’s graduates in lawyer jobs: 56.2% of 2012 graduates were employed in full-time, long-term lawyer jobs. Excluding jobs funded by the law schools, the figure is 55.1%. A devastating 27.7% were either underemployed (in a short-term, part-time, or non-professional job) or not employed (unemployed or pursuing an additional degree). The national non-response rate was 2.6%.

Full-time, Long-Term Legal Jobs:

The national full-time, long-term legal rate is 56.2%.

  • These jobs:
    • require bar passage or are judicial clerkships; and
    • require 35+ hours per week and have an expected duration of at least one year.
  • At 66 law schools (33.0%), less than 50% of graduates had these legal jobs.
    • 26 schools (13.0%) had less than 40%;
    • 11 schools (5.5%) had less than 33%.
  • 95 schools (47.5%) exceeded the national rate of 56.2%.
    • 50 schools (25.0%) had more than 66%;
    • 20 schools (10.0%) had more than 75%;
    • 6 schools (3.0%) had more than 90%.

Last month, the ABA agreed to further disaggregate school-funded jobs data. The new format sheds light on how many of the school-funded jobs were Bar Passage Required, J.D. Advantage, Professional, and Non-Professional. Like last year, these categories are also broken down by whether they are full- or part-time and long- or short-term.

The national full-time, long-term legal rate, excluding school-funded jobs, is 55.1%.

  • The richest schools were able to hire their struggling graduates full time and long term; only 15 schools (7.5%) had 5% or more of their graduates in long-term, full-time, school-funded jobs that required bar passage.
    • Many of these schools were top performers on the full-time, long-term rate.
    • Excluding the school-funded jobs from this rate caused five of the six schools over 90% to drop below that threshold; two of those five dropped below 80%.

Underemployed or Not Employed:

  • The national rate is 27.7%.
  • A graduate counts as underemployed when he or she is in a non-professional job or employed in a short-term or part-time job.
  • A graduate counts as not employed when he or she is unemployed or pursuing an additional advanced degree.
  • 187 schools (93.5%) reported a rate greater than 10%.
    • 153 schools (76.5%) had more than 20%;
    • 112 schools (56.0%) had more than 25%;
    • 58 schools (29.0%) had more than 33%;
    • 27 schools (13.5%) had more than 40%;
    • 8 schools (4.0%) had more than 50%.
  • 24 schools had more underemployed and non-employed graduates than graduates employed in long-term, full-time legal jobs.

Large Firms (at least 101 attorneys):

  • 12.2% of graduates were employed at large firms in full-time, long-term positions
    • Graduates seek these jobs partly because they tend to pay the highest salaries.
    • Note that not all of these jobs are associate positions. An unknown number are paralegals, administrators, and staff attorneys.
  • At only 51 schools (25.5%) were more than 10% of graduates in these jobs.
    • 27 schools (13.5%) had more than 20%;
    • 14 schools (7.0%) had more than 33%;
    • 8 schools (4.0%) had more than 50%.

Despite 2012 graduates taking 2,000 more long-term, full-time legal jobs than 2011 graduates, the percentage improvement was just 1 point because there were 5.4% more 2012 graduates than 2011 graduates. The Class of 2012 does not represent the apex for new J.D.s either; only after the class of 2013 will the number of graduates drop, though the total still looks to be in substantial disproportion to the number of jobs available in our quickly evolving profession.

The students entering this fall (who graduate in 2016) will likely fare better on the job market because fewer prospective students are deciding to take the LSAT, to apply to law school, and to attend now that post-graduation realities are transparent. But even if every law school graduate obtained a job, the sky-high cost of legal education means that expected salaries for law school graduates portend economic hardship. For many, it will be impossible to fulfill their student loan obligations without gambling on the continuation of federal hardship programs.

Law School Transparency’s executive director, Kyle McEntee, urged caution to students planning to enroll this fall. McEntee said, “Law school is too expensive relative to job outcomes. If you plan to debt-finance your education or use hard-earned savings, seriously think twice about attending a law school without a steep discount. For the vast majority of prospective law students who have not received a sizable scholarship, it makes sense to wait for prices to drop.”

+ School profiles:
+ Comparison charts:

Established in 2009, Law School Transparency is a nonprofit legal education policy organization. Our mission is to improve consumer information and reform the traditional law school model. We operate independently of any legal institutions, legal employers, or academic reports related to the legal market.

Law School Transparency Releases Annual Index of Law School Disclosure

The Transparency Index reflects Law School Transparency’s review of law school websites, through which we analyze the employment information that law schools use to market their programs. We measure not only whether law schools meet voluntary transparency standards, but also whether they meet the requirements from ABA Standard 509.

Our project has three important parts:

During the initial review period, we found that 78.4% (156/199) of ABA-approved law schools were not meeting the expectations set forth by Standard 509. We contacted every school’s dean, admissions office, and career services office with an offer to help them meet our criteria. 102 law schools took us up on our offer, and to date 84 of these schools have updated the consumer information on their websites.

See also:

New Alternative to U.S. News Law School Rankings

Today, Law School Transparency announces an alternative to the U.S. News law school rankings: The LST Score Reports.

LST has developed the Score Reports in an effort to produce a tool to help prospective students make application and enrollment decisions, keeping in mind that each person has a different risk tolerance, financial situation, and set of career aspirations.

The Score Reports are user-friendly tools for sorting law school employment outcomes, projected costs, and admissions stats. There is a Score Report for every state (including only schools that place graduates there), for every school (called a profile), and for various job types. They measure job outcomes, use a regional scope, and describe outcomes in real terms, allowing prospective students to make an educated decision not just about which school to attend, but about whether any school meets their needs.

The Score Reports are not rankings, although they do serve as an alternative to conventional law school rankings. But unlike rankings, the Score Reports do not reduce complex data to a single metric. Instead, the Score Reports focus on observable relationships to specific legal markets and job types. Only a small handful of schools have a truly national reach in job placement. The rest have a regional, in-state, or even just local reach. A decision tool should not obfuscate this reality; it should embrace it.

You can view the Score Reports, and read more about them, by following these links:

The Score Reports:
Guide to Using the Score Reports:
The Value of the U.S. News rankings:
Methodology, Published in the Journal of Legal Metrics:

Founded in 2009, Law School Transparency is a nonprofit legal education policy organization dedicated to improving consumer information and to reforming the traditional law school model. LST and its administrators operate independently of any legal institutions, legal employers, or academic reports related to the legal market. The LST Score Reports are a new project from LST Reform Central.

Progress Towards Law School Transparency

We have a few updates on our progress towards greater law school transparency. The first is a rundown of voluntary website improvements made by law schools in advance of the ABA standards reform. The second is to announce that LST has obtained its 40th NALP report from an ABA-approved law school.

Transparency Index: Class of 2010

Back in January, we released the Transparency Index, an index of every ABA-approved law school website. It measures how transparent law schools are on their websites in detailing post-graduation outcomes for the class of 2010 through the lens of 19 criteria.

We said:

Taken together, these and other findings illustrate how law schools have been slow to react to calls for disclosure, with some schools conjuring ways to repackage employment data to maintain their images. Our findings play into a larger dialogue about law schools and their continued secrecy against a backdrop of stories about admissions data fraud, class action lawsuits, and ever-rising education costs. These findings raise a red flag as to whether schools are capable of making needed changes to the current, unsustainable law school model without being compelled to through government oversight or other external forces.

There is good, bad, and disappointing news. The bad news is that secrecy is still the norm, with schools still opting to selectively withhold class of 2010 information to suit their own interests (and to the possible detriment of prospective students). With this year’s admissions cycle well underway, and admitted students closing in on their final decisions, people could really use the critical information schools have at their fingertips. While we can place some responsibility at the feet of those who will knowingly choose to attend schools that are withholding critical information, their poor choices still stem from law schools’ bad acts, especially when it is clear that many prospective students still do not understand the extent of the gaps in information, which law schools have become so adept at hiding.

The good news is that we’ve seen big improvements to a number of law school websites following the publication of our Winter 2012 Transparency Index Report. Further, these improvements are likely an underestimation: we’ve only updated the Live Index as we’ve come across evidence of improvements or have been directed to updates (usually by the schools themselves). As more and more schools respond positively to criticism, it is also getting easier to identify who the bad actors are.

  • 22% of schools do not provide evaluable class of 2010 information, down from 27%.
  • 64% of schools indicate how many graduates actually responded to their survey, up from 49%. Response rates provide applicants with a way to gauge the usefulness of survey results, a sort of back-of-the-envelope margin of error. Without the rate, schools can advertise employment rates north of 95% without explaining that the true employment rate is unknown, and likely lower.
  • 39% of law schools indicate how many graduates worked in legal jobs, up from 26%. 20% provide the full-time legal rate, up from 11%. We are aware of no additional schools providing the full-time, long-term employment rate. (It is still just two schools, or 1%.)
  • 28% of schools indicate how many graduates were employed in full-time vs. part-time jobs, up from 17%.
  • 16% indicate how many were employed in long-term vs. short-term jobs, up from 10%.
  • 16% of schools report how many graduates were employed in school-funded jobs, up from 10%.
  • 59% of schools provide at least some salary information, up from 49%. But most of those schools (63%) present the information in ways that mislead the reader, down from 78%.
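The response-rate point above can be made concrete with a small sketch. The figures here are hypothetical, not drawn from any school’s report: if a school advertises a 95% employment rate but only 80% of graduates answered its survey, the true rate is only bounded, not known.

```python
# Hypothetical figures for illustration only.
response_rate = 0.80            # share of graduates who answered the survey
employed_of_respondents = 0.95  # the advertised "employment rate"

# Worst case: every non-respondent is actually unemployed.
lower = response_rate * employed_of_respondents
# Best case: every non-respondent is actually employed.
upper = response_rate * employed_of_respondents + (1 - response_rate)

print(f"true employment rate is somewhere between {lower:.0%} and {upper:.0%}")
```

The 20-point spread in this hypothetical is the back-of-the-envelope margin of error that disclosing the response rate lets a reader compute.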

These are substantial changes. The schools that have made them (list available here) deserve some praise. However, it bears restating that every school could meet all 19 of the criteria we used in the Transparency Index, so our praise comes with that caveat.

The disappointing news is that one of the most transparent schools from our original report has decided to be less transparent. The University of Houston Law Center previously met thirteen criteria; it now meets only seven. (Original Disclosure; New Disclosure.)

In particular, Houston no longer shares the percentage of graduates in full-time and part-time jobs, in school-funded jobs, and in full-time legal jobs. It also no longer indicates when the graduates obtained the job (timing), nor how the graduate obtained the job (source). The school now also provides misleading salary information because the school no longer indicates the response rate for each salary figure provided.

When asked why the school took such a huge step backwards, the dean of career services said that Houston was now just doing what other schools were doing. She also claimed the change was an improvement overall because the new page includes 2009 and 2008 employment data, although that is barely more than what’s already available in our data clearinghouse (2008 & 2009).

For the unacquainted, Houston copied the University of Chicago’s presentation standard, and in doing so actually decreased its level of disclosure. We criticized Chicago’s standard back in January for this particular reason:

Last month, the University of Chicago Law School received widespread acclaim for its decision to provide improved consumer information about the class of 2010. We believe Chicago received acclaim because the job outcomes for Chicago’s 2010 graduates appear to be strong relative to other law schools’ outcomes. The positive responses confused the unexpected quality of outcomes, which for Chicago graduates remained strong despite the retraction in attorney hiring, with actual quality of disclosure. Chicago coupled tabular data with language about the need for transparency, leading people to claim that Chicago has set the market. But if every law school disclosed employment data according to Chicago’s incomplete methodology, most would still continue to mislead prospective law students.

The Chicago methodology does not distinguish between full- and part-time jobs, between long- and short-term jobs, or whether positions are funded by the law school. Nor does it indicate the numerator for the salary figures. For Chicago, this is a relatively minor oversight because it collected salary data from 94% of employed graduates. But the further the response rate moves away from 100%, the more important it is that the rate be disclosed for every category that a school provides salary information for. Few, if any, other schools have a response rate above 90%.

Predictably, Chicago’s puffery about its dedication to transparency has done more harm than good.

LST Obtains Its 40th NALP Report

There remains reason to be optimistic, however. LST has obtained NALP reports from six more law schools since our original announcement.

This brings the total to 40 law schools. While such incremental improvements to transparency suggest a long road ahead, we do consider 40 a significant threshold. LST will continue advocating for law schools to share their class of 2010 reports and, when the time comes, officially request the class of 2011 NALP reports too. Accepted students who don’t want to wait until then can start contacting the schools to request the 2011 data, now that we have passed the February 15 reporting deadline. If you are successful in leveraging your acceptances to procure the data, please consider sharing it with us and directly with other applicants on popular discussion board sites like TLS.

The 509 Subcommittee’s Draft Proposal: An Explanation and Evaluation

This is our third post in a series of posts (see the first and the second) where we contemplate the 509 Subcommittee’s draft proposal and the facts needed to understand how it would advance transparency at ABA-approved law schools. This post will explain the new proposal and evaluate it using the three criteria we set out in the second post.

The Subcommittee’s Proposal

On March 14th, the Subcommittee released its first draft proposal for a revised standard for the reporting of employment data. David Yellen, dean of Loyola University Chicago School of Law and chair of the Standard 509 Subcommittee, will present this proposal to the Standards Review Committee on Saturday, April 2, 2011 in Chicago. We will provide updates on any changes that come out of the meeting.

The draft proposal has three parts: a memorandum explaining the subcommittee’s operating assumptions and goals, a new Standard 509(b), and a chart that each law school would be required to fill out and post on its website each year.

The Memorandum

In the memorandum, the Subcommittee states that the goal is to “provide more meaningful and consistent employment information to prospective law students . . . [that will] greatly assist prospective students in making informed decisions about whether to go to law school or which law school to attend.” Right away the Subcommittee recognizes that schools already gather a great deal of data, and that it follows that sharing more information with prospective students will require only a small (and, implicitly, justified) burden.

The Subcommittee describes the consumer protection standard, Standard 509, as “a vague standard” that enables schools to provide limited and hard-to-compare information. The fact that reporting practices vary so widely among schools makes it very difficult for prospectives to understand the employment outcomes of a particular set of graduates. What’s more, the Subcommittee continues to recognize that the presentation of information is occasionally misleading. This reflects previous comments made by Dean Yellen.

The memorandum then cabins the problems with the current information into two categories: employment rates and salary information. The Subcommittee establishes two principles regarding the first category. First, “the percentages disclosed should be based on the entire graduating class, with only those known to be employed being counted as such.” The second principle regards the variety of jobs graduates take, and the problem of providing misleading impressions about the true successes of a school’s graduates. “[T]he best approach is to require schools to disclose more disaggregated data about . . . categories of jobs.” These categories include nonprofessional jobs, part-time jobs, temporary jobs, and jobs funded in part by the school.

Regarding the second category, the Subcommittee recognizes the limited utility of salary medians and the likelihood that readers will misunderstand what the medians refer to and how they are calculated. The Subcommittee proposes that “all salary information clearly indicate the number of respondents and percentage of all graduates included.” This is an important revision that will change the manner in which many schools currently portray salary statistics. For examples of how problematic this can be, check out LST’s data clearinghouse. (The linked example shows a school that reported a median salary of $160,000, despite it being the median for only about 16% of the entire class.)

Proposed Standard 509(b)

The first proposal made by the Subcommittee is as follows:

(b) A law school must publicly disclose the employment outcomes of its graduates by preparing and posting on its website the attached chart.
(1) The employment information must be accurate as of February 15th for persons who graduated with a JD or equivalent degree between September 1 two calendar years prior and August 31 one calendar year prior.
(2) The information must be posted on the school’s website by March 31 each year.
(3) The information posted must remain on the school’s website for at least three years, so that at any time, at least three graduating classes’ data is posted.
(4) The information must be gathered and disclosed in accordance with the instructions and definitions issued by the Section’s Questionnaire Committee.
(5) Any additional employment information the law school discloses must be fair, accurate and not misleading.
(A) Any publicly disclosed statistics regarding graduates’ salaries must clearly identify the number of salaries and the percentage of graduating students included.

The proposed Standard 509(b) requires that schools publicly disclose the employment outcomes of the most recent graduating class as true on the first February 15th following graduation. Schools must disclose these outcomes, at minimum, on the “attached chart” by the first March 31st following graduation. It also requires schools to keep the chart on their websites for at least three years. Finally, it adds a catch-all in 509(b)(5) to protect against predatory, opportunistic practices. This specifically includes a solution to misleading median salary practices that some law schools currently use.

The Chart

[View the chart]

The proposed Standard 509(b) “attached chart” aims to exhibit the outcomes of the entire graduating class as of the first February 15th following graduation. The chart disaggregates the current information into smaller categories to illuminate the outcomes graduates achieve at a particular school. The chart is also the first official recognition by an arm of the Section of Legal Education that salary information is in fact “basic consumer information.”

There are two classes of categories on this chart: employment status and employment type. For each category and subcategory, schools must report the percentage of all graduates, rather than of only employed graduates, as well as the raw number of graduates included in the calculation. This decision aims to limit the impact of creative accounting and less than forthright attempts at collecting employment data from graduates.

The employment status class places all graduates into four exhaustive categories: employed, pursuing a graduate degree full-time, unemployed, and employment status unknown.

The chart breaks down “employed graduates” in two ways. First, it sorts all employed graduates into four exhaustive kinds of employment: full-time long-term, full-time short-term, part-time long-term, and part-time short-term.

Second, it breaks all employed graduates into exhaustive categories based on the credentials required (or preferred) to do the job: bar passage required, J.D. preferred, other professional, or non-professional. It then further breaks each of those categories into (the same) four exhaustive kinds of employment: full-time long-term, full-time short-term, part-time long-term, and part-time short-term.

The employment type class breaks all employed graduates into six exhaustive categories based on the type of employer: law firms, business & industry, government, public interest, judicial clerkships, and academic. Of those categories, the law firm and judicial clerkships categories are further broken down by type. The law firms are disaggregated by size and the clerkships are disaggregated by level of government (state or federal).

Finally, full-time salaries will accompany each category (except solo practitioners) of full-time, employed graduates whenever there are at least five salaries reported in a given category. These salaries will be reported as 25th, 50th, and 75th percentiles, along with the number of salaries used to create the quartiles. There is also a space for schools to report the total number of jobs they funded.
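A minimal sketch of how a school might compute the quartile disclosure described above, using hypothetical salary data. The five-salary floor and the 25th/50th/75th percentiles come from the proposed chart; the function name and figures are ours:

```python
import statistics

def salary_disclosure(salaries):
    """Return the proposed disclosure for one employment category:
    25th/50th/75th percentile salaries plus the response count.
    Categories with fewer than five reported salaries are suppressed."""
    if len(salaries) < 5:
        return None
    p25, p50, p75 = statistics.quantiles(salaries, n=4, method="inclusive")
    return {"n": len(salaries), "p25": p25, "p50": p50, "p75": p75}

# Hypothetical salaries reported for one category.
print(salary_disclosure([50_000, 60_000, 70_000, 80_000, 160_000]))
# A category with too few responses reports nothing.
print(salary_disclosure([165_000, 45_000]))  # None
```

Reporting the count `n` alongside the quartiles is what prevents the misleading-median problem the Subcommittee identifies: a $160,000 figure reads very differently when it rests on five salaries rather than five hundred.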

A Good Start, But More To Be Done

The 509 Subcommittee is off to a really strong start in reforming how schools report employment information. It was made clear to us that this is only a preliminary draft, and that the Subcommittee expects more changes will be made. We hope this is the case.

The principles guiding the Subcommittee are sound. It is true that the information must be meaningful, consistent, and help prospectives make informed decisions about whether to, and where to, attend law school. But the execution of these principles still leaves something to be desired. If approved as a new accreditation standard in its current form, the proposal would certainly help prospective students and drastically cut down on misleading statistics. At the same time, it runs the risk of only providing superficial comfort, because it would not help match students to the schools that best meet their career objectives as efficiently as legal education needs.

As we previously outlined, we will use three criteria to assess the draft proposal.

(1) Does it disaggregate the current information?
(2) Does it demonstrate the economic value of a school’s J.D.?
(3) Does disclosure operate on an accelerated schedule?

Does it disaggregate the current information?

This proposal does disaggregate the current information. It helps show the nature of the jobs graduates obtained and with whom the graduates were employed. But as evidenced by comparing this draft proposal to the LST Standard, the vague “employed at 9 months” standard, where “a job is a job,” can be disaggregated to varying degrees. We’ve concluded that this draft does not disaggregate the current information to an adequate degree.

The more disaggregated employment information is, and the more data provided at that degree, the more likely it is that there will be privacy norm concerns. With these norms in mind, there is a legitimate interest in not disclosing all of the employment data that law schools already collect. On the other hand, law schools already collect all of the data needed to help prospectives make informed decisions, so cost concerns are greatly overblown (as the Subcommittee recognizes). As such, the appropriate level of disaggregation must balance privacy norms against the usefulness of additional disaggregation to anybody trying to understand the entry-level market for a school’s graduates.

It is the job of the Section of Legal Education to use its regulatory power to enforce the right balance. The Section must force schools to share the appropriate level of disaggregated information and must not opt to require less useful information because law schools have competitive concerns. The important question thus becomes how much weight the Section of Legal Education should give to schools that believe that more disaggregated information could (i) hurt their recruiting efforts, (ii) cause prospectives to focus too much on the first job in making their law school decision (as opposed to something else the schools think prospectives should focus on), and (iii) cause confusion through information overload.

Among the opportunities for improvement is how well the proposal connects job outcome features together. It does not disaggregate the locations of these jobs and does not show how the job, employer, and location connect for individual graduates. For example, we might be able to tell that 60% of a school’s graduates are working at jobs that require bar passage, but we do not know what percentage of those are working in business & industry. Likewise, we might know that 15% of a school’s graduates work in 2-10 attorney law firms, but we cannot tell what percentage of those graduates are working there as attorneys. This is not merely a theoretical concern: a sizeable percentage of law school graduates work in non-attorney positions in law firms. The decision not to disaggregate further directly contravenes the Subcommittee’s principle against providing misleading impressions about the true successes of a school’s graduates.
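The missing joint breakdown can be illustrated with a toy example (the records and counts here are hypothetical). The proposal reports only the two marginal tallies; the per-graduate cross-tabulation that would answer questions like "how many of the law firm jobs are attorney jobs?" is never disclosed:

```python
from collections import Counter

# Hypothetical per-graduate records: (credential required, employer type)
graduates = [
    ("bar passage required", "law firm"),
    ("bar passage required", "government"),
    ("J.D. preferred", "business & industry"),
    ("non-professional", "law firm"),  # a non-attorney job at a law firm
]

by_credential = Counter(c for c, _ in graduates)  # reported under the proposal
by_employer = Counter(e for _, e in graduates)    # reported under the proposal
joint = Counter(graduates)                        # NOT reported -- the gap

# by_employer shows 2 law firm jobs, but only the joint table reveals
# that just 1 of them requires bar passage.
```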

Part of the reason additional disaggregation is so important is that it would minimize the effect of national rankings on student decision-making by offering a window directly into what graduates do shortly after graduating. With this proposal, a prospective’s choice might still hinge on where a school ranks each year in U.S. News rather than on how well a school can help a student achieve her goals. Prospectives need clarity about how a school fits into the legal hiring market.

After all, the Subcommittee’s stated goal is to help prospectives make “informed decisions about whether to go to law school or which law school to attend.” The proposed solution is satisfactory only insofar as the goal is to differentiate between schools using percentage differences in broad, albeit more disaggregated, categories. It will still be too difficult to know the challenges graduates face in achieving their career objectives, which usually involve a combination of location, employer type, and required credentials. Nor, without sufficient granularity, will prospectives easily understand a school’s placement niches. Altogether, prospectives will still struggle to understand schools’ unique placement abilities.

Another issue with the Subcommittee’s method of disaggregation is that it actually creates new gaps in the information (though not to a debilitating extent) and thus an incentive for creative accounting. One of the purposes of disaggregating the nine-month employment rate is to limit how much schools hide employment outcomes. Unnecessary gaps undermine this purpose.

The total number of graduates in each subcategory, taken together, should equal the total number in the parent category. For example, the total number of graduates who are employed, unemployed, pursuing a graduate degree, or whose employment statuses are unknown should equal the total number of graduates in the graduating class, because the categories are exhaustive.

The unknown status category is very important for identifying gaps in the employment status data. However, an unknown category is missing from all other exhaustive groups except the group for type of law firms. The employment type category, required credentials subcategory, judicial clerkships subcategory, and the full time and part time (and corresponding long and short term) subcategories all need an unknown field so that the numbers in the subcategories all equal the parent category’s total number.

Helping prospectives understand where data gaps exist encourages them to ask the right questions and serves to limit false impressions due to extrapolating outcomes from unrepresentative segments of the graduating class. Unfortunately, allowing schools to report graduates as “unknown” in any category incentivizes schools to avoid learning or researching employment outcomes. However, it is more important that the gaps created by non-reporting graduates are readily identifiable. As such, all exhaustive categories and subcategories need to account for each graduate.
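The bookkeeping requirement is simple to state: with an “unknown” bucket in every exhaustive group, each parent total can be reconciled against its subcategories. A minimal sketch, with hypothetical numbers:

```python
def unaccounted(parent_total, subcategories):
    """Graduates missing from an exhaustive breakdown: the difference
    between the parent category total and the sum of its subcategories
    (which should include an 'unknown' bucket)."""
    return parent_total - sum(subcategories.values())

# 200 graduates broken down by employment status
statuses = {"employed": 150, "unemployed": 30,
            "graduate degree": 12, "unknown": 8}
gap = unaccounted(200, statuses)  # 0 means every graduate is accounted for
```

Under the current draft, most subcategory groups lack the “unknown” bucket, so this check cannot even be run; a nonzero gap is simply invisible.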

Does it demonstrate the economic value of a school’s J.D.?

It is a huge step forward for the Subcommittee to recognize salary information as “basic consumer information.” As of right now, the only standardized, school-specific salary information is courtesy of U.S. News. Until this year, even U.S. News salary information was too opaque.

The Subcommittee’s proposal does a decent job of highlighting what new graduates make and, accordingly, demonstrates some of the economic value of each school’s J.D. This new salary information would allow prospective students to roughly understand how well graduates can service their debts immediately after law school. For the Class of 2009, the average graduate had $98,055 of law school debt, which translates to about a $1200/month loan payment.
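That monthly figure is consistent with a standard ten-year amortization. The sketch below assumes the 7.9% Grad PLUS rate; actual blended rates vary by borrower, so this is illustrative only:

```python
def monthly_payment(principal, annual_rate, years=10):
    """Standard amortized monthly payment: P*r / (1 - (1+r)**-n),
    where r is the monthly rate and n the number of payments."""
    r = annual_rate / 12
    n = years * 12
    return principal * r / (1 - (1 + r) ** -n)

# Class of 2009 average law school debt at an assumed 7.9% rate,
# repaid over the standard 10-year term
payment = monthly_payment(98055, 0.079)  # roughly $1,185/month
```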

While the Subcommittee’s approach is useful and likely the best way for schools to report school-specific salary outcomes without using job-specific salary data, it is not the approach we think the Subcommittee should take. A better way would be to leverage the reported salary data of all law schools together, the way NALP does in its annual Jobs & JDs. Certainly, if prospectives knew about this publication, which costs non-members $90, they could use it to better understand entry-level salaries for law school graduates. But there is currently no way to bridge the gap between this salary information and an individual school’s graduates, and the Subcommittee’s proposal does not help on that front, so it is of limited use to those trying to decide which law school to attend.

The aforementioned lack of connectivity between employers, job credentials, and job location makes it very difficult for prospectives to understand how the new salary information affects them, particularly for loan payments. For example, a $160,000 starting salary for a new associate grows differently in New York City compared to Houston due to salary compression in years two through seven. Additionally, $70,000 in New York City does not go as far as $70,000 in Philadelphia, Raleigh, or Nashville. The geographic impact on the ease of loan repayment cannot be overstated. Even if a prospective has the Jobs & JDs book, that information can only take them so far because its salary breakdowns are very specific (e.g., attorneys in 2-10 person law firms in X city). Nothing in the new standard or chart helps answer these important questions.

There is a separate concern about whether each category would have meaningful salary information associated with it. For example, 10 graduates may work at small firms, with only four reporting salaries. In this case, the four salaries do not get reported and thus serve no use. They are simply swept away. However, if these four salaries were added to a national salary database, those four become 40 or even 400, and the result is meaningful salary information about jobs that wouldn’t otherwise have any. Unfortunately, this resource cannot be utilized on a school-by-school basis without more disaggregation. In our next post we will explain our proposal for doing this in depth.

Does disclosure operate on an accelerated schedule?

Yes. In striking this balance between cost concerns and the need for timely information about the most recent graduating class, the Subcommittee has paved the way for significant improvements beginning as early as next year. At the Questionnaire Committee hearing in December, law school administrators expressed concern that requiring schools to report information too soon would be too high a burden given cost constraints. But by limiting the Standard 509 requirements to only data that schools submit to NALP in February/March, the Subcommittee erases these concerns. Even small career services staffs will be able to comply with the standard provided they already report to NALP, which nearly every ABA-approved law school does. Given that collection methods are now mostly electronic (through Symplicity or other user-entry databases), assembling and posting the data according to the proposed Standard 509(b) would take very few work hours and limited financial resources beyond what schools already allocate voluntarily.

Concluding Thoughts

The goal of a revised Standard 509(b) must be to help students make informed decisions about which (if any) school best meets their career objectives. While a good start, we think that, as currently conceived, the Subcommittee’s proposal will fail to adequately achieve this basic goal.

We ask that each member of the Committee imagine herself as a prospective student trying to choose a school to invest thousands of hours and dollars into. Each member must then think about how soundly she can act after analyzing employment information reported according to the new standard, and consider how well she actually understands the school’s ability to help her achieve her career objectives. We suspect that this thought experiment would leave each member uncomfortably uncertain. This uncertainty, at a minimum, should be addressed through a non-theoretical exploration of the standard’s implications. Before accepting a new standard, the Standards Review Committee should compare a few schools using real employment information presented as it would be under the proposed revisions.

An improved Standard 509 has the ability to wage an important battle against the influence of U.S. News on the decision-making of prospective law students. But without sufficient disaggregation of the current employment information, the effects can only be minimal. Under the current proposal, it is still too easy to imagine a prospective student choosing the #55 ranked school located on the east coast over the #81 ranked school on the west coast because she does not know, for example, what to make of the schools’ minute differences in percentage employed in mid-sized firms as it pertains to her goals of working out west in a mid-sized firm. Without adequate information to dissuade her, she might come to the head-scratching conclusion that #55 must be better because it is ranked higher. This is bound to worsen now that there are 45 more schools ranked on a national scale.

Each year, the Section of Legal Education makes an effort to minimize the effect of national rankings. We are sure that almost every law school administrator would agree with the Section’s sentiments, and revising Standard 509 is the chance to show that these are not empty words. We look forward to working with the Subcommittee to improve this first draft.

Ave Maria’s Official Statement

We just received Ave Maria’s official statement via Karen Sloan, reporter for the National Law Journal. From Ave Maria Dean Eugene Milhizer:

Ave Maria School of Law is committed to providing outstanding value to our students. Initially, we were interested in participating in the Transparency Project because we felt that leadership was needed in disclosing to the public appropriate information related to the costs and benefits of attending law school.

Since our earlier indication that we would provide data, the ABA has undertaken concrete action to address this issue, and we are satisfied that meaningful steps are in motion. In light of these recent developments, coupled with our concerns about the confidentiality of personal information about our graduates (which is a special concern at a small law school such as ours), we have decided to give the ABA’s leadership a fair chance to address this issue before acting independently, and for now, to withdraw from the Transparency Project.

The ABA will have its fair chance, but it is not the schools’ job to provide that chance. And if the ABA fails, the schools will not pick up the slack. Rather, the Department of Education will have to reconsider whether the ABA Section of Legal Education and its accreditation committee suitably regulate American law schools.

So while it is true that the ABA has a responsibility to provide leadership on this issue, and that the Section of Legal Education committees are progressing towards a solution, the ABA only has to act because the schools are not voluntarily releasing the information prospective students need to make an informed investment. Individual law schools’ responsibilities to prospectives and the profession are not absolved or delayed because the ABA seeks to regulate law schools more carefully. Law schools have had their “fair chance” to comport with these responsibilities, and have failed to meet them.

Earlier: Apathy For Applicants Continues: Ave Maria Backs Out

Law School Transparency Reports: The Second Request

This past July, LST requested that all ABA-approved law schools provide employment data to LST for public consumption. At that time, the vast majority of law schools declined to respond or comment. In November, we finalized our official guidelines and made a second request of all ABA-approved law schools. While schools still have time to respond to our request and potentially commit to the LST Standard in time for the first wave of data, we thought it was time to issue a report on where things currently stand. Once again, the vast majority of schools have declined to respond or comment. The responses from the schools that did choose to respond were much the same.

Initial Request Responses

Law School Stance Primary reason for declining
American University Maybe Waiting for finalized Guidelines to decide
Ave Maria Yes
Creighton University No Compliance costs are too great
Northwestern University No LST is not well-established
Santa Clara University No Compliance costs are too great
University of Colorado No Compliance costs are too great
University of Florida No Prefer other means of improving information
University of Michigan Maybe Should make open records request instead
University of Tennessee No Violates privacy of students and employers
Vanderbilt University Maybe Waiting to examine impact on privacy
William Mitchell No No reason provided

Second Request Responses

Law School Stance Primary reason for declining
Loyola University Chicago No “most of the same [reasons] as the other deans who have declined”
University of Kansas No “not in a position to participate at this time”
Duquesne University No Violates privacy of students
Nova Southeastern University No Compliance costs are too great
University of Louisville Maybe see below

University of Louisville’s response to our second request was of particular note.

Via email:

Thank you for taking the time to speak with me today. I think we agree on the value of reporting employment data accurately. I think we also agree on some of the potential challenges with LST Standard compliance. At this point, the University of Louisville Law School can commit to reviewing our 2010 data with the LST Standard in mind. As we discussed, our challenges are going to be with salary determination, reporting, privacy and resources. Perhaps we can have another conversation after February 15. My goal, like yours, is to ensure that prospective consumers of law school education have complete and accurate information upon which to make a decision.

We will be following up with University of Louisville Law School next week. It’s good to see the school thinking critically about how it reports information to prospective law students. Louisville is among the few schools in recent months to act publicly on the notion that prospective students need more information to make informed decisions. We hope that Louisville and other schools promptly provide 2010 employment information after it is available.

In the meantime, if you are affiliated with a school that has yet to respond and you support the cause, we encourage you to contact the school’s administration. We’ve heard from a number of current and prospective students over the last month who are doing just that, so please check back as we report more on those initiatives.

Shocked about Kaplan’s Survey Results? New Information Comes to Light

Many articles have covered Kaplan Test Prep’s recent press release, in which Kaplan declared that “[i]n deciding where to apply, pre-law students consider a law school’s place in the rankings more important than affordability, geographic location, its academic program – and even more important than its job placement statistics.” The same release also reemphasized the results of an earlier survey, which showed that more than half of all prospective law students are ‘very confident’ they will get a legal job after graduation, while only 16% were ‘very confident’ about their peers’ prospects.

Based in part on these two statements, people are questioning whether prospective law students actually deserve better information about job prospects, and whether disclosing more information would even impact their decision-making process. In other words, if prospectives really don’t care about job prospects, why should we care about law school transparency?

The survey results seemed odd to us, so we decided to dig a little deeper by contacting Kaplan Test Prep and taking a closer look at the survey. Kaplan was kind enough to not only answer our questions promptly, but to also send the full survey and results.

After reading the survey, digesting the results, and learning more about Kaplan Test Prep, it turns out that the press release and ensuing coverage did not tell the whole story. Instead, the results reflect an application landscape where important information is scarce and application decisions are complex. LST has been arguing these two things for months now; many others have been doing so for years.

You can check out the survey’s full results and our comments after the jump.


Law School Transparency Reports: The Initial Request

Law School Transparency is pleased to announce a report on our initial request to law schools for detailed employment data. The report documents the initial request, surveys the ensuing media coverage, and analyzes schools’ reasons for declining. We have attached an electronic version of the report to this post. Hard copies of this report will also be mailed to the prelaw advisors at 100 U.S. undergraduate institutions, who together assist roughly 40,000 law school applicants in choosing where to go each year.

Read the executive summary after the jump »»