Guides by LST
Helping you make more informed application and enrollment decisions
Last Updated: May 3, 2014
This guide explains five failures of the U.S. News rankings and the (lack of a) connection between a school's job placement and its U.S. News rank.

Every year, the law school world overreacts to slightly-shuffled U.S. News rankings. Proud alumni and worried students voice concerns. Provosts threaten jobs. Prospective students confuse the annual shuffle with genuine reputational change.

Law school administrators react predictably. They articulate methodological flaws and lament negative externalities, but nevertheless commit to the rankings game through their statements and actions. Reassuring stakeholders bearing pitchforks has become part of the job description.

If the rankings measured something useful, the entire charade would be much easier to stomach. The unfortunate irony is that these rankings adversely affect the decision-making process for law school administrators and prospective law students alike. The stakes are high. Our profession and society need law schools that don't build inefficient metrics into their annual budgets. Dollars spent chasing U.S. News rankings divert funds away from students' education and stand in the way of reducing tuition.

In this guide, we examine five U.S. News rankings failures. We consider the methodology and underlying rankings theory from the perspective of a student who features job prospects prominently in his application and enrollment decisions. Considering the near-universal support for prioritizing job outcomes in that process, these failures demonstrate just some of the reasons the annual consternation hardly seems worth it.

We would like to categorically describe the U.S. News rankings as "worse than useless," but that bypasses an important threshold question: What are you using the rankings for?

If you'd like a vague idea of the median LSAT score and GPA of the most recent class, the U.S. News rankings provide it, though you could of course just look at the actual LSAT and GPA data. If you want to know what other professors think about a particular school, the rankings provide a clue about that as well, though much of it is likely due to an echo chamber effect. If you're a dean who wants to justify soaring tuition, poor financial management, and a thirst for competition, then the U.S. News rankings are really quite great.

But if what you're after is the answer to the question "Which law school should I go to?" or "Is this law school right for me?" then yes, the U.S. News rankings are worse than useless.

Five Failures of the U.S. News Rankings

[Image: word cloud of LSAT test-takers' responses on their purpose for getting a J.D. Source: ATL.]

As a tool for making application and enrollment decisions, the U.S. News rankings fail on a number of fronts. First, the rankings pay insufficient attention to what matters most to prospective students: job outcomes. In a survey of 600 students studying for the October 2012 LSAT, Breaking Media's research arm found that the two most popular words associated with the students' purpose for getting a J.D. were "career" and "work" (see the word cloud above). These are not exactly shocking results. Yet despite the importance of job outcomes to students, employment outcomes account for only 18% of a school's rank, and the rankings credit schools for jobs few attend law school to pursue.

Second, the rankings use a national scope, which places every school on the same scale. Only a handful of schools have a truly national reach in job placement. The rest have a regional, in-state, or even just local reach. The relative positioning of California Western and West Virginia in the rankings is virtually meaningless. Graduates from these schools do not compete with one another.

State Outcomes

It turns out that 158 schools place at least half of their employed class of 2013 graduates in a single state. Across all schools, 67% of employed graduates end up in their school's top state destination. A much smaller 8.2% of employed graduates go to a school's second-most-popular destination, and just 4.5% work in the third-most-popular destination. Only 20.4% of employed graduates (16.7% of the entire class) end up in a state other than the top three. Comparing schools across the country just doesn't make sense.

Third, U.S. News rankings follow an ordinal system that fails to show the degree of difference between schools. Are Columbia and NYU virtually tied? Or does the two-rank difference represent a wide gulf in quality? Is the so-called difference between Columbia and NYU the same as the difference between Cornell and Vanderbilt? Students weighing school prices need to know not just which school is better but how much better it is.

Fourth, performance changes over time, but year-to-year comparisons are virtually impossible using the U.S. News rankings. U.S. News will tell you that Stanford knocked Harvard out of the #2 spot in 2012-13, but the swap in rankings gives the average reader no clue as to why. Stanford may have improved while Harvard declined. Or Stanford may have improved while Harvard's quality stayed the same. Or perhaps both schools saw a decline in quality, but Harvard's decline was more severe. In fact, if every single school saw a marked decline in quality, the U.S. News rankings would not indicate that it happened. Students can know only relative performance.

Finally, U.S. News inexplicably places every ABA-accredited law school on its list of "The Best." The best at what? U.S. News doesn't say. But it implies that every school on the list is good. The truth is that once costs and employment outcomes are weighed against personal career goals, many schools are bad choices. The U.S. News rankings provide no help in drawing that line.

Parting Thoughts

U.S. News wants to be the duct tape of law school applications. No matter your career goals, geographic preference, or financial situation, it wants to be the answer. The truth is that the U.S. News rankings produce bad outcomes for applicants and law schools alike.

Not only do the U.S. News rankings measure the wrong things, but they put schools in the wrong order for job outcomes, make meaningless comparisons between schools that are not in competition, provide no way of telling what a difference in rank means, and strongly encourage schools to push tuition to brave new heights. They provide some great gossip and intrigue once a year, but when it comes to helping you make a $200,000 investment, they are worse than useless.

Rankings are not inherently bad. In fact, they are conceptually quite useful: they order comparable things to help people sort through more information than they know how to, or can, weigh. However, a ranking loses credibility when its methodology is unsound, through irrational weighting or meaningless metrics, or when its scope is too broad. As described above, the U.S. News rankings fall victim to many of these flaws.

LST developed the Score Reports as a tool to help prospective students make application and enrollment decisions, keeping in mind that each person has a different risk tolerance and financial situation. Though the Score Reports still have room for improvement, little of what makes the U.S. News rankings flawed applies to them. They unambiguously measure job outcomes, use a regional scope, and describe outcomes in real terms, allowing prospective students to make an educated decision about not just which school to attend, but whether any school meets their needs. Click here to read our guide to using the Score Reports.
