Note: a version of this piece was published last year, but the U.S. News rankings remain as toxic an influence as ever.

Tomorrow, the law school world will overreact to slightly shuffled U.S. News rankings. Proud alumni and worried students will voice concerns. Provosts will threaten jobs. Prospective students will confuse the annual shuffle with genuine reputational change.

Law school administrators will react predictably. They'll articulate the methodological flaws and lament the negative externalities, but will nevertheless commit to the rankings game through their statements and actions. Reassuring pitchfork-bearing stakeholders has become part of the job description.

If the rankings measured something useful, the entire charade would be much easier to stomach. The unfortunate irony is that these rankings distort decision-making for law school administrators and prospective law students alike. The stakes are high. Our profession and society need law schools that don't build annual budgets around inefficient metrics. Every dollar spent chasing U.S. News rankings is a dollar diverted from students' education, and the chase stands in the way of reducing tuition.

In this post, I examine five failures of the U.S. News rankings. I consider the methodology, and the ranking theory underlying it, from the perspective of a student who weighs job prospects heavily when deciding where to apply and enroll. Given the near-universal support for prioritizing job outcomes in that process, these failures show why the annual consternation is hardly worth it.

Word cloud of responses from LSAT test-takers surveyed about why they plan to obtain a J.D. Source: Above the Law.

First, the rankings pay insufficient attention to what matters most to prospective students: job outcomes. In a survey of 600 students studying for the October 2012 LSAT, Breaking Media's research arm found that the two words most associated with students' reasons for getting a J.D. were “career” and “work” (see the word cloud above). These are not exactly shocking results. Yet despite their importance, job outcomes account for only 18% of a school's rank, and even that component credits schools for jobs few people attend law school to pursue.
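To see how a weight that small dilutes the signal, here is a minimal Python sketch of a weighted composite score. Only the 18% employment share comes from the methodology discussed above; every other component name, weight, and score is a hypothetical stand-in, not the actual U.S. News formula.

```python
# A minimal sketch of a weighted composite score. Only the 18%
# employment weight comes from the text above; the other component
# names, weights, and scores are hypothetical stand-ins.

WEIGHTS = {
    "reputation": 0.40,   # hypothetical
    "selectivity": 0.25,  # hypothetical
    "resources": 0.17,    # hypothetical
    "employment": 0.18,   # the job-outcomes share discussed above
}

def composite(scores):
    """Weighted sum of component scores normalized to 0-1."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

# Two schools identical everywhere except a large gap in job outcomes:
school_a = {"reputation": 0.8, "selectivity": 0.8, "resources": 0.8, "employment": 0.9}
school_b = {"reputation": 0.8, "selectivity": 0.8, "resources": 0.8, "employment": 0.5}

print(composite(school_a))  # ~0.818
print(composite(school_b))  # ~0.746 -- a 40-point employment gap moves
                            # the composite by only about 7 points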

Second, the rankings are national in scope, placing every school on the same scale. Yet only a handful of schools have truly national reach in job placement; the rest place graduates regionally, in-state, or even just locally. The relative positioning of California Western and West Virginia in the rankings is virtually meaningless: graduates of these schools do not compete with one another.

State outcomes graphic: percentages are of the entire class.

It turns out that 158 schools placed at least half of their employed class of 2013 graduates in a single state. A school's top state destination accounts, on average, for 67% of its employed graduates. A much smaller 8.2% of employed graduates work in a school's second most popular destination, and just 4.5% in the third. Only 20.4% of employed graduates (16.7% of the entire class) end up in a state outside the top three. Comparing schools across the country just doesn't make sense.
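For the curious, here is a toy sketch of how concentration figures like these can be computed from per-graduate placement records. The records below are invented for illustration; the real analysis would run over school-level ABA employment data.

```python
from collections import Counter

# Invented per-graduate placement records for a single hypothetical
# school; the real analysis would use ABA employment data.
placements = (["NY"] * 120 + ["NJ"] * 15 + ["CT"] * 8
              + ["CA"] * 5 + ["TX"] * 4 + ["FL"] * 3)

counts = Counter(placements)
employed = sum(counts.values())

for state, n in counts.most_common(3):
    print(f"{state}: {n / employed:.1%} of employed graduates")

other = employed - sum(n for _, n in counts.most_common(3))
print(f"All other states: {other / employed:.1%}")
# The output shows a school competing for jobs in New York,
# not on a national market.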

Third, U.S. News rankings follow an ordinal system that fails to show the degree of difference between schools. Are Columbia and NYU virtually tied? Or does the two-rank difference represent a wide gulf in quality? Is the so-called difference between Columbia and NYU the same as the difference between Cornell and Vanderbilt? Students weighing school prices need to know not just which school is better but how much better it is.
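A toy example makes the problem concrete. The two score distributions below are invented, but both produce exactly the same ordinal ranking:

```python
# Two invented score distributions that produce the identical ordinal
# ranking, whether the schools are near-ties or separated by a gulf.

tight = {"School A": 94.0, "School B": 93.8, "School C": 93.7}
gulf  = {"School A": 94.0, "School B": 71.0, "School C": 55.0}

def ordinal(scores):
    return sorted(scores, key=scores.get, reverse=True)

print(ordinal(tight))  # ['School A', 'School B', 'School C']
print(ordinal(gulf))   # ['School A', 'School B', 'School C'] -- identical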

Fourth, performance changes over time, but year-to-year comparisons are virtually impossible with the U.S. News rankings. U.S. News will tell you that Stanford knocked Harvard out of the #2 spot in 2012-13, but the swap offers the average reader no clue as to why. Stanford may have improved while Harvard declined. Stanford may have improved while Harvard's quality held steady. Or perhaps both schools declined, with Harvard's decline the more severe. In fact, if every single school saw a marked decline in quality, the U.S. News rankings would give no indication that it happened. Students can know only relative performance.
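That last point is easy to demonstrate with another invented scenario: every school's underlying score drops sharply between two years, yet the published order, the only thing readers see, is identical.

```python
# Invented scenario: every school's underlying score falls sharply
# between two years, yet the published ordinal ranking is unchanged.

year_1 = {"Harvard": 98, "Stanford": 96, "Columbia": 92}
year_2 = {name: score - 20 for name, score in year_1.items()}

def ordinal(scores):
    return sorted(scores, key=scores.get, reverse=True)

print(ordinal(year_1))  # ['Harvard', 'Stanford', 'Columbia']
print(ordinal(year_2))  # ['Harvard', 'Stanford', 'Columbia'] -- no signal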

Finally, U.S. News inexplicably places every ABA-accredited law school on the list of “The Best.” The best at what? U.S. News doesn't say. But it implies that every school on the list is good. The truth is that once costs and employment outcomes are weighed against personal career goals, many schools are bad choices. The U.S. News rankings provide no help in drawing the line.

Rankings are not inherently bad. In fact, they are conceptually quite useful: they order comparable things to help people sort through more information than they could weigh on their own. But a ranking loses credibility when its methodology is unsound, whether through irrational weighting or meaningless metrics, or when its scope is too broad. The legal profession is worse off for elevating a publication that falls victim to these flaws each and every year.