We would like to categorically describe the U.S. News rankings as "worse than useless," but that bypasses an important threshold question: What are you using the rankings for?
If you'd like a vague idea of the median LSAT score and GPA of the most recent class, the U.S. News rankings provide it, though you could of course just look at the actual LSAT and GPA data. If you want to know what other professors think about a particular school, the rankings provide a clue about that as well, though much is likely due to an echo chamber effect. If you're a dean and want to justify soaring tuition, poor financial management, and a thirst for competition, then the U.S. News rankings are really quite great.
But if what you're after is the answer to the question "Which law school should I go to?" or "Is this law school right for me?" then yes, the U.S. News rankings are worse than useless.
As a tool for making application and enrollment decisions, the U.S. News rankings fail on a number of fronts. First, the rankings do not get at what matters most to a prospective student: job outcomes. In a survey of 600 students studying for the October 2012 LSAT, Breaking Media's research arm found that the two words students most often associated with their purpose in getting a J.D. were "career" and "work." After all, a law school is a professional school.
But job outcomes account for only 20% of the U.S. News rank. The rankings also treat every job the same, from barrister to barista, and you cannot disaggregate the employment component from the rest of the ranking. (Editor's Note: The 2013-14 rankings now distinguish among job types.)
The second fundamental flaw in these rankings is that schools are analyzed on an irrelevant national scope. Only a small handful of schools have a truly national reach in job placement. The rest have a regional, in-state, or even just local reach. The relative positioning of California Western and West Virginia is as meaningless as the alignment of the stars on the day you take the LSAT.
Third, an ordinal ranking system fails to show the degree of difference between schools. Are Columbia and NYU virtually tied? Or does the two-rank difference represent a wide gulf in quality? Is the so-called difference between Columbia and NYU the same as the difference between Cornell and Vanderbilt? Once a student begins looking at the cost of enrollment and has scholarship offers from some schools, it becomes extremely important to know not just which school is better but how much better it is.
Fourth, performance changes over time, but year-to-year comparisons are virtually impossible using the U.S. News rankings. U.S. News will tell you that Stanford knocked Harvard out of the #2 spot in 2012-13, but the swap in rankings does not provide any clues as to why. Stanford may have improved while Harvard declined. Or, Stanford may have improved while Harvard's quality stayed the same. Or perhaps both schools saw a decline in quality, but Harvard's decline was more severe. In fact, if every single school saw a marked decline in quality, the U.S. News rankings would not indicate that this happened. Instead, consumers are left knowing only relative performance.
Finally, U.S. News inexplicably places every ABA-accredited law school on the list of "The Best." The best at what? U.S. News doesn't say. But it implies that every school on the list is good. The truth is that once costs and employment outcomes are considered, many schools are a bad choice for many people. The U.S. News rankings provide no help in drawing the line.
Prospective students often believe that while U.S. News ranks do not directly correlate to employment outcomes, they are at least a very good proxy. For the very top schools, this more or less holds up. But for the remaining 90%+ of law schools, which the vast majority of students will be attending, the correlation between U.S. News rank and job placement is minuscule.
The first chart below compares a law school's U.S. News rank to its LST Employment Score for the class of 2011. As you can see, outside of the very top schools you cannot use a U.S. News rank to predict employment outcomes.
The same holds true when U.S. News ranks are compared to the LST Under-Employment Score (second chart).
If the U.S. News rank is not a proxy for employment outcomes, what is it a proxy for? As discussed above, it tracks quite closely to the LSAT scores of the most recent class and somewhat closely to median GPAs. Are LSAT and GPA data relevant to the quality of a school? Cohort quality is certainly important, and you may decide the LSAT and GPA stats are a reliable proxy for cohort quality. If so, then just use the actual numbers, rather than the U.S. News rankings, which would merely be a proxy of a proxy.
Finally, we can see that the U.S. News ranking also tracks quite closely to a school's peer assessment score (what other professors think about the school). This isn't too surprising, given that the peer assessment score is the single largest component of a school's rank. It is also the least reliable. Unlike LSAT and GPA data, expenditures data, and library volume data, the peer assessment score asks professors for their feelings about schools with which they rarely, if ever, have regular interaction.
U.S. News wants to be the duct tape of law school applications. No matter what your career goals, geographic preference, or financial situation are, it wants to be the answer. The truth is that the U.S. News rankings produce bad outcomes for applicants and law schools alike.
Not only do the U.S. News rankings measure the wrong things, but they put schools in the wrong order for job outcomes, make meaningless comparisons between schools that are not in competition, provide no way of telling what a difference in rank means, and strongly encourage schools to push tuition to brave new heights. They provide for some great gossip and intrigue once a year, but when it comes to helping you make a $200,000 investment, they are worse than useless.
Rankings are not inherently bad. In fact, they are conceptually quite useful. They order comparable things to help people sort through more information than they know how to, or can, weigh. However, rankings lose credibility when their methodologies are unsound, through irrational weighting or meaningless metrics, or when their scope is too broad. As described above, the U.S. News rankings fall victim to many of these flaws.
LST has developed the Score Reports in an effort to produce a tool to help prospective students make application and enrollment decisions, keeping in mind that each person has a different risk tolerance and financial situation. Though the Score Reports have a number of opportunities for improvement, little of what makes the U.S. News rankings flawed applies to the Score Reports. They unambiguously measure job outcomes, use a regional scope, and use real terms about the outcomes to allow prospective students to make an educated decision about not just which school to attend, but whether any school happens to meet their needs. Click here to read our guide to using the Score Reports.