Increasingly sophisticated data analyses are seeping into the legal industry—smoothed graphics, text mining, regression, statistics, machine learning and more. As that happens, leaders of law firms and law departments will encounter and rely more on quantitative tools that can improve and quicken their decisions. One such tool (spoiler alert: ANOVA) can spot when averages tell little or tell a lot.

This tool has applications throughout legal groups. Consider three uses. First, a law firm has four offices, and each lead office partner lobbies to be paid more because the office's average billable hours were high last year: How should the Comp Committee evaluate the data on which they base their requests? Second, the average client-satisfaction rating a law department receives from HR is 0.3 higher than the rating it receives from IT: Does that gap deserve attention? Third, a law firm's executive committee studies the average number of new matters opened by each practice group during the past several years and finds one laggard group: Should the committee take any action?

In situations like these, where averages could shape people's thinking, operational decisions will benefit from an analysis that discloses whether the differences among the averages actually matter. Can managers figure out whether some averages are sufficiently different to be relied on or whether, to the contrary, they are differences without a distinction?
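To make the idea concrete, here is a minimal sketch of how an analyst might run a one-way ANOVA on the first scenario, the four offices and their billable hours. The office names and the hour figures below are invented purely for illustration, and the sketch assumes Python with the SciPy library; a real analysis would use each timekeeper's actual recorded hours.

```python
from scipy import stats

# Hypothetical billable-hour samples for four offices (made-up numbers for
# illustration only; real data would come from the firm's time records).
office_a = [1720, 1840, 1790, 1905, 1760, 1830]
office_b = [1680, 1750, 1810, 1700, 1790, 1740]
office_c = [1900, 1870, 1950, 1820, 1890, 1910]
office_d = [1710, 1770, 1800, 1690, 1760, 1780]

# One-way ANOVA tests whether the group means differ by more than chance
# alone would explain. A small p-value (conventionally below 0.05) suggests
# at least one office's average truly stands apart; a large p-value suggests
# the gaps are likely noise.
f_stat, p_value = stats.f_oneway(office_a, office_b, office_c, office_d)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```

If the p-value is small, the Comp Committee has statistical grounds to treat the offices' averages as genuinely different; if it is large, the lobbying partners are pointing at differences the data cannot support.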