New Alternative to U.S. News Law School Rankings

Today, Law School Transparency announces an alternative to the U.S. News law school rankings: The LST Score Reports.

LST developed the Score Reports to help prospective students make application and enrollment decisions, keeping in mind that each person has a different risk tolerance, financial situation, and set of career aspirations.

The Score Reports are user-friendly tools for sorting law school employment outcomes, projected costs, and admissions stats. There is a Score Report for every state (covering only the schools that place graduates there), for every school (called a profile), and for various job types. The reports measure job outcomes, use a regional scope, and describe outcomes in real terms, allowing prospective students to make an educated decision not just about which school to attend, but about whether any school meets their needs.

The Score Reports are not rankings, although they serve as an alternative to conventional law school rankings. Unlike rankings, the Score Reports do not reduce complex data to a single metric. Instead, they focus on observable relationships to specific legal markets and job types. Only a small handful of schools have a truly national reach in job placement. The rest have a regional, in-state, or even just local reach. A decision tool should not obfuscate this reality; it should embrace it.

You can view the Score Reports, and read more about them, by following these links:

The Score Reports: http://www.lstscorereports.com
Guide to Using the Score Reports: http://www.lstscorereports.com/?r=guides&show=12
The Value of the U.S. News rankings: http://www.lstscorereports.com/?r=guides&show=13
Methodology, Published in the Journal of Legal Metrics: http://ssrn.com/abstract=2106814

Founded in 2009, Law School Transparency is a nonprofit legal education policy organization dedicated to improving consumer information and to reforming the traditional law school model. LST and its administrators operate independently of any legal institutions, legal employers, or academic reports related to the legal market. The LST Score Reports are a new project from LST Reform Central.

Rankings Pressure as a Motivating Force for Fraud

The Chicago Tribune has picked up on a part of the University of Illinois College of Law’s audit report that many readers lament as par for the course. While it’s unknown just how common intentional fraud is at U.S. law schools, the pressure to improve U.S. News rankings is common among law schools, and it is easy to see how that pressure sometimes translates into bad behavior.

College of Law admissions dean Paul Pless revealed another motive was at play [in starting its new admissions program]. By admitting high-achieving students in their junior years, without a law school entrance exam, the students’ high GPAs would be included in the class profile but no test scores could potentially drag down the class.

As the former dean’s own words explain:

“That way, I can trap about 20 of the little bastards with high GPA’s that count and no LSAT score to count against my median. It is quite ingenious,” Pless boasted in a 2008 e-mail exchange with an acquaintance about iLEAP, the early admissions program now in its fourth year.

Schools often admit “splitters” (high GPA/low LSAT or high LSAT/low GPA) to achieve ever-higher median scores. Admissions offices are restrained by the need to report the low LSAT or GPA along with the high one, because each low number drags down the corresponding median. Pless’s iLEAP program is a way to get credit for the splitters’ best quality (their high GPAs) without dragging down the LSAT median.
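To make the arithmetic concrete, here is a minimal sketch with invented numbers (not actual admissions data) showing how a high-GPA admit with no LSAT on file lifts the reported GPA median while leaving the LSAT median untouched:

```javascript
// Invented numbers for illustration only; not actual admissions data.
function median(values) {
  const sorted = [...values].sort((a, b) => a - b);
  const mid = Math.floor(sorted.length / 2);
  return sorted.length % 2 ? sorted[mid] : (sorted[mid - 1] + sorted[mid]) / 2;
}

const regularAdmits = [
  { gpa: 3.2, lsat: 166 },
  { gpa: 3.5, lsat: 163 },
  { gpa: 3.6, lsat: 160 },
];
// Early admits with high GPAs and no entrance exam on file.
const earlyAdmits = [
  { gpa: 3.9, lsat: null },
  { gpa: 3.95, lsat: null },
];

const allAdmits = regularAdmits.concat(earlyAdmits);
// Every admit counts toward the GPA median...
const gpaMedian = median(allAdmits.map((s) => s.gpa));
// ...but admits with no score simply drop out of the LSAT median.
const lsatMedian = median(
  allAdmits.filter((s) => s.lsat !== null).map((s) => s.lsat)
);

console.log(gpaMedian, lsatMedian);
// 3.6 and 163: the GPA median rises (from 3.5), the LSAT median is untouched.
```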

A review of the investigative file shows the intense culture in which Pless worked, one focused on improving the academic credentials of the incoming classes in part as a means to improving the already well-regarded school’s ranking.

The college’s strategic plans and annual reports focused on that ranking. Pless’ salary increases were tied to it. The law dean and other top officials exchanged emails about the benefits of different combinations of test scores and GPA medians to achieve it.

While most law school administrators wouldn’t brag so brazenly (and so sloppily, given that professional email communications at a public school are usually subject to open records laws) about the novel ways they’ve boosted medians, they all have their bag of tricks. They all know that job security is often tied to maintaining or improving those numbers.

The same pressures exist for improving employment statistics, though far less is known about intentional changes to underlying employment data. While it’s difficult to spin poor admissions numbers without resorting to fraud, it is far easier to use misleading tactics to dress up the unimpressive reality facing most law school graduates in today’s job market. This is particularly true when the entity charged with policing law schools, the ABA’s Section of Legal Education, has failed to actively investigate what admissions and career services offices are doing.

It’s odd, to put it lightly, that these pressures exist in the first place. Serious thought needs to be given to the institutional incentives that law schools face, particularly when those incentives seem to run against the interest consumers have in receiving information that’s presented in a fair and accurate manner.

As of now, there are few if any incentives to blow the whistle on unethical admissions practices. Such practices have likely spread beyond Villanova and Illinois, but they are difficult to catch; they would not even be uncovered during the inquiries that ABA site visit teams conduct roughly every five years, an important event for ABA accreditation.

It seems that we may have moved beyond the presumption that all law schools are operating ethically. For the sake of the schools that are acting ethically, and for law schools’ role as the gateway to the legal profession, it’s crucial that the bad apples be uncovered. We hope the schools with nothing to hide step up and ask LSAC to audit the past ten years of admissions data at all ABA-approved law schools. The cost of conducting such an audit would pay dividends in restored credibility.

Class of 2009 U.S. News Data

Each year, U.S. News collects data from almost every ABA-approved law school through an annual survey. As with prior years (Class of 2007 and 2008), we have collected Class of 2009 employment data and cost of attendance data in a spreadsheet for easy use and comparison. While there is a significant time lag—the data came out a few weeks ago for the graduates from nearly two years ago—the spreadsheet serves as one of just a few sources of employment information that prospectives can use to easily compare schools in a standardized fashion.

Also in accord with prior years, we added a few of our own metrics to help readers understand and compare data. These include (1) the percent of the graduating class represented by the private sector salary information and (2) the percent of the graduating class with a known salary.

These metrics show how much (or little) the salary information schools provide under the U.S. News reporting standard actually tells us about salary outcomes. As it turns out, the touted salary medians paint a poor picture of the graduating classes at the vast majority of schools. On average, the “median salary” represents what only 29% of private sector graduates made, or just 17% of the entire graduating class. The ABA will hopefully soon begin regulating these practices, ensuring that schools cannot advertise salary information without important qualifiers, such as the percentage of all graduates included in the salary figures.
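For concreteness, here is a minimal sketch of how those two metrics fall out of the survey counts. The field names are our own shorthand, and the counts are invented to mirror the averages above:

```javascript
// Hypothetical survey counts for one school; invented for illustration.
const school = {
  classSize: 200,               // total graduates
  privateSectorCount: 120,      // graduates employed in the private sector
  privateSectorReporting: 35,   // of those, how many reported a salary
  allSalariesKnown: 50,         // graduates with a known salary, any sector
};

// (1) percent of the class represented by the private sector salary info
const pctPrivateSalaryCoverage =
  school.privateSectorReporting / school.classSize;                // 0.175 ≈ 17%

// (2) percent of the class with a known salary
const pctKnownSalary = school.allSalariesKnown / school.classSize; // 0.25

// The touted "median private sector salary" only describes the reporters:
const pctOfPrivateSector =
  school.privateSectorReporting / school.privateSectorCount;       // ≈ 0.29

console.log(pctPrivateSalaryCoverage, pctKnownSalary, pctOfPrivateSector);
```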

This year, partially in an effort to better qualify the salary figures, U.S. News made important changes to the way it reports employment data collected from law schools. We reported back in December that changes were coming as a result of our discussions with Bob Morse, Director of Data Research at U.S. News, who implemented our ideas as we proposed them. These changes will provide prospective students with a more thorough picture of post-graduation outcomes at many ABA-approved law schools. Additionally, these changes will improve our data clearinghouse, since they allow us to eliminate most of our assumptions as we turn the data into more meaningful information.

Information Quality

For the most part, schools do a good job reporting data in the manner requested by U.S. News. That is, there are few discrepancies that cannot be explained by small rounding errors (where the percentages do not add up to 100 percent when they should) or schools misunderstanding the clerkship survey questions. But people should not assume that we are out of the woods. Reporting data according to the U.S. News standard is a separate achievement from presenting true, meaningful information that’s useful for prospectives trying to make informed decisions. The job characteristic data and LST’s new metrics demonstrate just how different the picture some schools present can be from reality.

In the coming weeks, we will start a discussion about practices at some schools that may deserve the ABA’s attention. If you know of individual schools whose Class of 2009 data evidences information presented in a misleading manner, please do not hesitate to let us know.

Otherwise, a preliminary data review did find a few errors in the data reported for 2009 grads. We’ve already alerted Bob Morse about the errors and we have been told that corrections are on the way. Our spreadsheet fixes these errors and highlights other areas for concern.

LST’s Data Clearinghouse

We will soon release the newest installment of our data clearinghouse, once we are satisfied with the underlying data. The clearinghouse helps applicants visualize the data in ways that are not intuitively obvious from the raw numbers. Many applicants have been using it to better understand school-specific employment information and to make better estimates about future job prospects. We expect it will be even more useful this year because it requires fewer assumptions and prospectives can trace how schools fared over the prior three years.

As always, the data clearinghouse will reflect only cleansed data; we have not audited the data for accuracy, nor can we. In the meantime, if you spot any errors or have any comments, please do not hesitate to leave a comment here or email us at .

U.S. News Asks Law School Deans to Go Beyond ABA Standards

Per Bob Morse’s blog, Morse Code:

U.S. News agrees with the efforts of Law School Transparency to improve employment information from law schools and make the data more widely available. We are also aware that the American Bar Association is studying changes to the standards that law schools must use when they report employment data for graduates. We agree that more still needs to be done by all parties. To that end, U.S. News Editor Brian Kelly reached out to law school deans in a letter mailed earlier this week.

The letter begins by focusing on how legal education is being perceived:

Dear Dean ___,

As you know, there have been some serious questions raised about the reliability of employment data reported by some schools of law to the American Bar Association and other sources. I write with some reluctance because it is not our role at U.S.News & World Report to be any sort of regulatory body over law schools or anyone else. We are a journalism company that gathers and analyzes information useful to our readers.

But I think we can all agree that it is not in anyone’s interest—especially that of prospective students—to have less than accurate data being put out by law schools. It’s creating a crisis of confidence in the law school sector that is unnecessary and we think could be easily fixed.

Specifically, employment after graduation is relevant data that prospective students and other consumers should be entitled to. Many graduate business schools are meticulous about collecting such data, even having it audited. The entire law school sector is perceived to be less than candid because it does not pursue a similar, disciplined approach to data collection and reporting.

We have encouraged Mr. Morse to provide more employment information to prospective law students. He has acknowledged our efforts in the past, and this week’s letter to the deans is a clear signal that he is making good on his word. U.S. News may not be the regulatory agency responsible for setting the disclosure standards, but its influence in altering law school behavior should not be underestimated. Regardless of whether law school deans view U.S. News’ request as a form of antagonism (given their many criticisms of the U.S. News rankings, which are due out next week), there is no doubt that administrators are finding themselves in an increasingly difficult position.

Mr. Morse is correct to point out that the ABA is the entity in charge of setting the standards. The ABA Section of Legal Education’s Standards Review Committee, chaired by Dean Polden of Santa Clara Law School, holds its next meeting on April 4 in Chicago. Our hope is that the SRC is taking note of the recent comments by Mr. Morse and others, even as it prepares to announce proposed changes to the disclosure requirements.

Perhaps more importantly, Mr. Morse is also correct to call on law schools to go beyond the current industry standards and disclose more information for the benefit of their students. Schools that have withheld voluntary reporting under the guise of waiting for ABA reform must revisit their policies and consider the ethical importance of moving beyond the status quo.

Read the full text of the letter after the jump:

U.S. News March 2011 Letter to Law School Deans »»

U.S. News Expresses Support for Greater Employment Transparency

We recently broke the news about U.S. News’ decision to reform its disclosure of surveyed employment information. The decision is significant in three ways. First, while only a small victory for those of us calling for more transparency, it’s a meaningful step towards reducing the number of prospectives who make uninformed decisions. These additional data will shed light on the meaning of the employment figures reported on U.S. News’ widely used website, even though the ABA already publishes answers to some (but not all) of the same questions.

Second, once U.S. News begins publishing all of the important employment data it collects from schools, it will expose the tendency schools have to revise their reported employment outcomes over time, a phenomenon NALP has termed the “data drift.” Third, and perhaps most promising, Bob Morse (the U.S. News guru) went on the record in favor of the transparency efforts, making an official pledge on behalf of his organization to publish more employment data.

From Morse Code, Mr. Morse’s U.S. News blog:

Law school students need as much information as possible to help them realistically understand the employment prospects from their school and the economic value of their degree in terms of their ability to pay back loans and earning power. U.S. News believes the information we will be publishing will help current students in those efforts. However, disclosure of employment data by law schools is still woefully lacking given the cost of attendance and poor job market. U.S. News strongly backs all ongoing efforts to require law schools to report even more detailed data on how recent grads have fared in the job market. We would collect and publish those statistics if they were available.

Improving Visibility and Revealing the Data Drift

U.S. News was very receptive to our suggested changes in all communications, and Mr. Morse’s public comments are a clear sign that the organization behind the notorious rankings is aware of the problems that prospective law students face. It is worth noting once again that U.S. News actually collects more information about employment outcomes (thanks to its market power) than the ABA does using its regulatory power.

Some commenters have rightly pointed out that many of our suggestions to U.S. News involve data that are already available from the ABA’s Official Guide to Law Schools. That is true: we only asked U.S. News to disclose data it already has, rather than lobbying for U.S. News to expand or refine its survey. But as it turns out, schools are providing different answers to the same questions over time.

According to Jim Leipold, Executive Director of NALP, schools are actually revising the data they submit for the same graduating class over time. At last week’s Questionnaire Committee hearing, Mr. Leipold spoke of the “data drift” in school-reported employment information, which makes the datasets submitted to the ABA and to U.S. News incompatible. For a given year, schools report data to NALP in February, to U.S. News in March, and to the ABA in October. For some schools, these figures are all different, even though they all reflect the same graduates’ post-graduation outcomes as of February 15 of the relevant year.

This is not because schools should answer questions differently when reporting to these three organizations; each organization uses the same definitions and denominators. Rather, schools refine their answers over time as more information becomes available. For example, a career services dean may learn in June that one of the students she could not track down was actually (un)employed in February, and thus report one more (un)employed graduate to the ABA than to NALP. This phenomenon makes using the ABA data to better understand the U.S. News information unreliable, because the datasets are incompatible. As such, the new data U.S. News has pledged to share is not as redundant as it first appears.
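A stylized illustration of the drift, with invented counts: the same cohort, snapshotted at the three reporting deadlines, yields three different datasets even though each claims to describe outcomes as of February 15.

```javascript
// Invented counts for one school's graduating class; illustration only.
const snapshots = {
  nalpFebruary: { knownEmployed: 148, statusUnknown: 16 },
  usNewsMarch:  { knownEmployed: 150, statusUnknown: 14 },
  abaOctober:   { knownEmployed: 153, statusUnknown: 9 },
};
// Each later snapshot moves graduates out of "unknown" as the career
// services office tracks them down, so a figure pulled from the ABA's
// dataset cannot safely be used to interpret the one U.S. News publishes.
console.log(snapshots);
```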

What’s Next?

It is important to recognize that U.S. News plays a significant role in how prospectives choose where to go, and that it shares responsibility with the law schools and the ABA for making sure prospectives have the information they need. The U.S. News rankings are particularly important in decisionmaking, for better or worse. Many, including Law School Transparency, have argued that prospectives put too much weight on a school’s rank. These arguments stem largely from the belief that the rankings suffer from a number of methodological flaws (law review citations at the end of this post), especially the attempt to rank local and regional schools on a national scale, the reliance on incongruent components, and the lack of a meaningful answer to “what do these rankings measure?”

We know that many prospectives use the U.S. News rankings as a proxy for employment opportunities, given the lack of meaningful information from the schools or the ABA. But is that what these rankings actually show? If we look at placement in the largest national law firms, based on the 2005 NLJ 250 chart (2007, 2008), the rankings do a pretty good job of sorting the most national law schools. But even then, they only have meaning on a national scale, which misses the important role regionality plays in legal hiring. For the tens of thousands of prospectives who will end up attending a school that doesn’t place into the NLJ 250, or who aren’t interested in working for large law firms, the rankings do not effectively distinguish schools by placement ability.

All of this is beside the point. Regardless of the many concerns about how the rankings distort the application process, there is no doubt that U.S. News’ data collection fills part of the gap that the ABA has historically left wide open. U.S. News’ commitment to disclosing more information suggests that the organization is not only paying attention to this debate but siding with those of us who are arguing for improved disclosure.

U.S. News is also showing a willingness to listen as it considers what steps to take next. Mr. Morse and his staff have acknowledged that they could provide more assistance, and they are actively seeking input on how they can help more effectively. This dedication to helping prospective students is especially important as the ABA considers changing what schools must report to maintain accreditation. One of the biggest differences between what the ABA requires and what U.S. News asks for (and usually gets) is salary information, which the ABA has not yet deemed to be basic consumer information. A call by U.S. News for the ABA to collect better consumer information has the potential to make a significant impact, and we will be keeping an eye on any changes over at the ABA. The Standards Review Committee holds its next meeting in a few weeks, when we look forward to continuing the discussion about the need for better regulation.

Some U.S. News rankings criticisms:
Brian Leiter, How to Rank Law Schools, 81 IND. L.J. 47–50 (2006); Andrew P. Morriss & William D. Henderson, Measuring Outcomes: Post-graduation Measures of Success in the U.S. News & World Report Law School Rankings, 83 IND. L.J. 791 (2008); Richard A. Posner, Law School Rankings, 81 IND. L.J. 13, 13 (2006); David A. Thomas, The Law School Rankings Are Harmful Deceptions: A Response to Those Who Praise the Rankings and Suggestions for a Better Approach to Evaluating Law Schools, 40 HOUS. L. REV. 419 (2003). But cf., e.g., Paul D. Carrington, On Ranking: A Response to Mitchell Berger, 53 J. LEGAL EDUC. 301 (2003); Russell Korobkin, In Praise of Law School Rankings: Solutions to Coordination and Collective Action Problems, 77 TEX. L. REV. 403 (1998).

U.S. News to Reform Its Disclosure of Surveyed Employment Information

Updated: 12/15/2010 4:15 pm CST
Thanks to around five months of correspondence between LST and U.S. News, we are happy to report that the organization behind the notorious law school rankings has agreed to make additional information available on its website. While the additional information will not change how U.S. News computes its rankings, it will make things much clearer for prospectives trying to unpack the employment statistics that U.S. News currently collects. This information is far more detailed than the information required by the ABA, and making it more visible will help prospectives while we continue lobbying the ABA to reform its practices.

In August, LST contacted U.S. News’ ranking guru, Bob Morse, with concerns about the display of employment information included for each law school on its website. Specifically, we were concerned that a number of the answers that schools provide to U.S. News in its annual survey were not being disclosed. The effect of the missing answers, we believed, was to limit the usefulness of the employment figures beyond their documented substantive and formal flaws (more on those in our white paper).

After listening to our concerns, U.S. News has agreed to reform how it discloses the results of its annual survey. In doing so, U.S. News will adopt our suggestions and will soon make changes to its website for the Class of 2008 employment information. These changes will remain in place when the new rankings become available (around 3/15/2011), which will include employment information about the Class of 2009.

As Mr. Morse explained by email:

Thanks for following up. I will try to answer your question clearly. As I said earlier USNEWS is doing a major redesign on the education section of usnews.com website which means Best Grad Schools and Law Schools. Our current plan is that U.S. News will add all the new fields that you suggest to current data for 2008 graduates and those new fields will show up when the redesign goes live in late Winter 2011. Then those fields will remain as part of best law schools web site when [we] launch the next rankings with 2009 graduates data around 3/15/2011. If you have other questions, let me know.
 
Bob Morse

Update. We received this email a few days ago with a few clarifications from Mr. Morse:

I read your blog post (http://www.lawschooltransparency.com/2010/12/u-s-news-to-reform-its-disclosure-of-surveyed-employment-data/ ) and I wanted to make a few key comments and clarifications from my earlier email.
1. U.S. News is in the midst of doing a major redesign of the education section of usnews.com as I have mentioned earlier.
2. Doing such a redesign is a very large scale project and the exact timing of when parts will be done is hard to determine.
3. The current plan is that the redesign will be rolled out in phases in late winter 2011.
4. The current plan is that the new law data fields for 2008 graduates will be added as part of one of those phases, but not as part of the first phase.
5. U.S. News is committed to adding these new data fields from our statistical survey; if we are unable to add them as part of the roll out (described above) of the redesign, the current plan is that they will be added for 2009 graduates data when the next Best Graduate Schools law school rankings are launched.

If you have questions, let me know.

Bob Morse

We made a number of suggestions by doctoring U.S. News’ current website. You can view the original here. Our mockup page can be viewed here. Although probably self-explanatory, “A157” (etc.) represents the answer to question 157 on the 2010 survey (the relevant survey questions are included after the jump). We highlighted the fields we changed in yellow.

Our suggestions are designed to tie the displayed fields more closely to what the survey questions actually ask schools. These suggestions include language changes, field position changes, and indented fields whenever a group fits under the previous field. Taken together, we think these changes more aptly tell the story of a school’s post-graduation outcomes.

Summary of Suggestions:

Better Language: Changed “Graduates employed at graduation” to “Employed at graduation rate”
Better Language: Changed “Graduates known to be employed nine months after graduation” to “Employed at nine months rate”

This reflects the fact that the percentage is not simply the share of graduates employed at graduation, but a rate calculated using a proprietary formula. The language must indicate to readers that the percentages that follow are school-reported rather than U.S. News-derived.

New Section: “Class of 2008 Graduates – Class Breakdown at Nine Months” adds new fields and subsumes the entire “Areas of Legal Practice (Class of 2008)” section

This clarifies that the percentages are for the nine-month measurement period. The goal with the new fields below is to show more realistic percentages, especially when the percentages of a category add up to 100% but do not reflect 100% of the class.

New Fields: This section includes 7 new fields:
     “Graduates whose employment status is unknown”
     “Graduates whose employment status is known”
     “Graduates known to be enrolled in a full-time degree program”
     “Graduates known to be unemployed and seeking work”
     “Graduates known to be unemployed and not seeking work”
     “Graduates known to be employed”
     “Percent employed in a judicial clerkship by an Article III federal judge”

The “Graduates whose employment status is unknown” and “Graduates whose employment status is known” fields, along with the “Graduates known to be employed” field, are the most important new additions. These figures are required for determining how much of the class is actually employed with certain types of employers, particularly those in private practice (law firms plus business and industry). The private practice percentages are crucial for determining how many graduates’ salaries schools actually used when calculating the salary information. These numbers would allow us to relax our assumptions in LST’s data clearinghouse.
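A minimal sketch, with invented counts and our own field names, of how these fields fit together, and of why the known/unknown split matters for the denominator behind any “employment rate”:

```javascript
// Invented counts for a hypothetical 200-graduate class; illustration only.
const c = {
  classSize: 200,
  statusUnknown: 14,      // graduates who could not be tracked down
  enrolledFullTime: 10,   // known to be in a full-time degree program
  unemployedSeeking: 22,
  unemployedNotSeeking: 4,
};

const statusKnown = c.classSize - c.statusUnknown; // 186
const knownEmployed =
  statusKnown - c.enrolledFullTime - c.unemployedSeeking - c.unemployedNotSeeking; // 150

// The same employment count yields very different "employment rates"
// depending on which denominator a school chooses to advertise:
console.log(knownEmployed / c.classSize);  // 0.75  (of the whole class)
console.log(knownEmployed / statusKnown);  // ~0.81 (of graduates with known status)
```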

The other fields round out the full picture of a school’s graduating class. While some underlying data are still missing, these nuances help draw attention to the complexity behind these apparently simple percentages.

Changing Location of the Salary Reporting Percentage:
     “Percent in the private sector who reported salary information”

One of the most glaring problems with how employment information is presented is the prevalence of the “median private sector salary” statistic, which is in reality only the median salary for the sometimes small percentage of graduates who actually reported salaries. This change puts the percent reporting salary up front, before people see the often exaggerated 25th percentile, median, and 75th percentile salary statistics. While moving the location of the percent reporting is a small change, it could help combat the tendency of prospectives to read the reported medians as true medians, when in reality they are often significantly higher than the actual (but undisclosed) median salary.

Interactivity:
We also made a technical suggestion about using jQuery tooltips:

Finally, and this is for your site developers, I noticed that your website loads jQuery. With this in mind, consider jQuery tooltips, part of the UI library (which I did not check to see if you were loading), instead of html attribute titles where possible. It’ll be a little easier to explain how, for example, the employment rates at graduation and 9 months are calculated. If you use just the basic HTML, it will be harder to make it readable because not all browsers recognize line-breaks in the title attribute. Moreover, even if you decide limited cross-browser support is fine, the decision of where to place the line-break is kind of a nuisance.
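For illustration, here is a minimal sketch of what that suggestion might look like, assuming jQuery UI’s tooltip widget is loaded; the selector and explanation text are our own invention, not U.S. News’ markup:

```html
<!-- A label that needs a multi-line explanation on hover. -->
<span class="emp-rate-label" title="">Employed at nine months rate</span>

<script>
  $(function () {
    // jQuery UI tooltips can render HTML, so <br> gives reliable line
    // breaks; a bare title attribute cannot do this consistently
    // across browsers.
    $('.emp-rate-label').tooltip({
      items: '.emp-rate-label',
      content:
        'School-reported figure.<br>' +
        'Reflects graduate outcomes as of February 15,<br>' +
        'roughly nine months after graduation.',
    });
  });
</script>
```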

For more explanations and the U.S. News survey and employment rate formula, read more after the jump.

Continue reading U.S. News to reform its disclosure of surveyed employment data »»

Class of 2008 U.S. News Employment Summary Data

We collected the employment summary data from the 2010 U.S. News rankings and put it in a spreadsheet for easy use and comparison. Additionally, we added a few of our own metrics that help readers compare data across schools.

We will eventually make the data even more accessible, but we cannot promise a specific date because we are gearing up for the first official data request. Nevertheless, our goal is to have a comparative tool available within the next month.

See the categories after the jump »»