Limits of Our Reports

Published on October 1, 2020

The LST Law School Reports suffer from a variety of limits, and those using them need to be conscious of what the reports do and do not mean. Many of these problems stem from inadequacies in the underlying data. Despite these limits, we still publish the LST Reports because we believe they are superior to every other tool available to prospective law students. Please note that the following limits concern only the reports' reliability as indicators for their purpose, not whether the reports solve every problem in want of a solution.

Limit: The Past Doesn't Guarantee the Future

The desire to use past employment data to predict the future is strong, and the ABA and schools recognize that this is why people look to employment information. While the past is not necessarily indicative of the future, examining the outcomes of recent graduating classes provides some idea of what to expect, barring major changes to the entry-level legal market. But even where major changes do call into question the reliability of information about past graduating classes, one can hypothesize about how particular schools will react within the new climate. That is, even as the market changes, older information retains value, but it is up to readers to draw these conclusions for themselves.

Limit: Self-Selection

While prelaw students are concerned with the job options they will have upon graduation, schools do not collect those data. Instead, we have information about actual employment outcomes and must use those outcomes as a proxy for opportunities. Consider a school that sees 40% of its graduates work for large firms and 20% work in public service. We do not know how many graduates working in public service could have worked for a large firm, nor how many graduates at large firms could have gone into public service. That information would be valuable—perhaps more valuable than outcomes—to a prelaw student. Accordingly, relying exclusively upon outcomes neglects very real differences in job prospects.

That said, be skeptical of schools that claim self-selection applies to an atypical proportion of their graduating classes. Ask why the class composition would cause outcomes to differ from the norm and what data support such a claim. Then ask for the data. For example, if a school claims its graduates desire non-legal jobs and that the outcomes support this result, inquire as to whether the historical outcomes support this claim, or if it is a new phenomenon. If the latter, wonder whether the shift in outcomes was due to an attitudinal shift in the graduating class, or perhaps because the jobs historically taken were unavailable.

There is the additional problem of geographic self-selection. By facilitating state-based sorting using a single year of geographical outcomes, we risk under- and overestimating placement by location. For any number of reasons, a school may have more or fewer graduates in a location in a given year. This is not ideal, though the problems will only be at the edges because schools only show up on geographical reports if the total number of graduates working in a location meets a minimum threshold.

Limit: Not All Law Jobs Created Equal

With the LST Employment Score, we treat all long-term, full-time legal jobs with employers the same. For example, a job with a large law firm counts the same as a job with a very small law firm, even though we have data for this distinction. We do not, however, have sufficient data to distinguish between lawyer and non-lawyer jobs at large law firms. Wide variances in pay, prestige, practice settings, and practice specialties also exist. Nor do we have enough data to distinguish among alternative internal staffing tracks, e.g. staff attorneys versus associates at law firms. Schools that share their NALP Reports with LST and the public, however, shed some light on these questions. View these data tables under the jobs tab on a school's profile.
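To illustrate the flattening this causes, here is a minimal sketch, not LST's actual methodology: the job records, field names, categories, and formula below are assumptions for illustration only. The point is simply that a score which counts every long-term, full-time legal job equally never looks at employer size.

```python
# Hypothetical, simplified illustration -- not LST's actual formula.
# Each record represents one graduate's reported first job.
graduates = [
    {"job_type": "bar_passage_required", "long_term": True,  "full_time": True,  "employer_size": "501+"},
    {"job_type": "bar_passage_required", "long_term": True,  "full_time": True,  "employer_size": "2-10"},
    {"job_type": "jd_advantage",         "long_term": True,  "full_time": True,  "employer_size": "2-10"},
    {"job_type": "unemployed",           "long_term": False, "full_time": False, "employer_size": None},
]

def employment_score(grads):
    """Share of graduates in long-term, full-time legal jobs.

    Note that employer_size never enters the calculation, so a job at a
    501+ lawyer firm and a job at a 2-10 lawyer firm count identically.
    """
    qualifying = [
        g for g in grads
        if g["job_type"] == "bar_passage_required" and g["long_term"] and g["full_time"]
    ]
    return len(qualifying) / len(grads)

print(f"Employment score: {employment_score(graduates):.0%}")  # 50%
```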

Limit: Incomplete Picture of Outcomes

Because data are collected ten months after graduation and published a bit later, the data are perpetually outdated and may not accurately reflect the present employment outlook. It is important to pay attention to school track records on their school profiles, employment trends in national, regional, and local markets, and BLS labor projections. The LST Reports data provide only a snapshot, showing placement in first jobs without looking further into a graduate's career. Some graduates in temporary jobs will find permanent professional work, while some permanent jobs will unexpectedly come to an end.

Though the picture is incomplete, ten-month outcomes are nevertheless very important. In the legal profession, the first job matters. In this education climate, with costs as high as they are, much of the emphasis falls on these outcomes because 3 in 4 graduates debt-financed at least some of their J.D. education. Initial loan payments are due shortly after graduation, whether or not the graduate's outcome reflects the successes he or she will find or lose throughout a career.

Limit: No Disaggregation of Job Characteristics by Location

This limit has much in common with the self-selection and incomplete-picture problems. All scores and rates on this site reflect only nationwide data. It is plausible, and quite likely, that a school will have differing levels of success in different states. This means that placement in other states may either inflate or deflate the scores/rates. Using New Jersey as an example, the Employment Score for School X may not match the school's success rate in New Jersey. The score could be higher or lower if we could instead focus only on those graduates obtaining work in New Jersey.

Were the data available, the scores and rates would still suffer from the self-selection problem. For instance, graduates may be inclined to move across state lines only after having received a job offer. This would create a very high score/rate within that state, but would not take into account students who want to move but have not found jobs. Likewise, students may cross state lines because they have been unable to find work in their preferred state and believe another state presents better opportunities.

Upshot: What Should You Do?

Ultimately, our goal is to better equip people with the tools they need to understand the decision to attend law school. More public information exists now than ever before. One type of new information is information about what data are missing. Acknowledging these gaps allows one to recognize which assumptions are supported and which are mere conjecture. Questioning assumptions like "law school is a ticket to financial security or success" or "if I get into law school, I should go" is huge progress and will result in better-informed decisions. Analyze the reports and then dig deeper into school data. See what's missing. Unbundle the information. Ask questions.