Man vs. Machine: Maybe Computers Aren't the Best at Predicting Recidivism
A Dartmouth College study found that individuals without expertise in criminal justice may be as accurate as trusted court software in predicting recidivism.
January 18, 2018 at 09:43 AM
In technology, conventional wisdom holds that machine learning can typically make better predictions than humans, weeding out biases and increasing accuracy by staggering amounts. A new study out of Dartmouth College, however, is challenging that assumption, particularly when it comes to the fate of those in the criminal justice system.
According to research from Dartmouth College, the Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) risk assessment tool is no more accurate at predicting recidivism than individuals with “little or no criminal justice expertise.” COMPAS, widely used among U.S. courts to determine recidivism risk, has, according to the research, been used in assessing more than one million offenders since 1998.
Carried out by a student-faculty research team, the Dartmouth study gave a group of nonexperts—workers contracted through Amazon's Mechanical Turk online marketplace—short descriptions of pretrial defendants taken from a database in Broward County, Florida, from 2013-2014. The descriptions provided seven features of each pretrial defendant, including age, sex, the crime they were charged with, whether the crime was a misdemeanor or felony, and previous criminal history. Using this information, participants were asked in a survey to predict whether a defendant would recidivate.
The research was conducted among a total of 800 participants, divided into two groups of 400. One group was allowed to see the pretrial defendant's race, while the other wasn't.
While COMPAS takes into account 137 features in determining recidivism risk, the tool's accuracy in this instance (65.2 percent) was “statistically the same” as that of the human participants (67 percent), a statement from Dartmouth said.
“As machine learning and artificial intelligence tools emerged in criminal justice, they kind of bypassed this middle step in ensuring they're as accurate as we think they are,” Julia Dressel, who conducted the research for her undergraduate thesis in computer science at Dartmouth, told LTN. “People are quick to assume they're accurate and objective and think of course these things should be used. … We have to step back and realize that might not always be the case.”
“Right out of the gate, you know something is concerning when the accuracy is 65 percent,” Hany Farid, professor of computer science at Dartmouth College and co-leader of the study, told LTN. “People answering an online survey as accurately as the software: That should give us more pause.”
He added that many judges might look positively on using analytics tools because of their perceived accuracy, but “I think you would weigh that prediction very differently if I told you, 'Hey, I polled 12 people online, and this is what they said.'”
COMPAS's proprietary algorithm is unknown outside of its developer, Northpointe Inc. In 2017, The New York Times reported that a Northpointe executive said, “We've created [the algorithms], and we don't release them, because it's certainly a core piece of our business.”
The Dartmouth researchers took the seven pieces of information given to the study's human participants and fed them into “the simplest possible machine algorithm, the kind of thing you would teach in an undergraduate course,” logistic regression, and “it got 65 percent [accuracy], right out of the gate.”
Taking it a step further, the researchers gave the algorithm two pieces of information—age and prior convictions—and it achieved 65 percent accuracy, the same as COMPAS.
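The two-feature result can be illustrated with a minimal sketch: a logistic regression classifier trained on just age and prior convictions, here implemented from scratch with gradient descent. The data below is synthetic and invented purely for illustration (it is not the Broward County dataset), and the generating rule, scaling choices, and hyperparameters are all assumptions.

```python
import math
import random

def train_logistic_regression(rows, labels, lr=0.1, epochs=500):
    """Fit weights w and bias b by stochastic gradient descent on logistic loss."""
    w = [0.0] * len(rows[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(rows, labels):
            z = b + sum(wi * xi for wi, xi in zip(w, x))
            p = 1.0 / (1.0 + math.exp(-z))  # sigmoid: predicted probability
            err = p - y
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    """Classify as 1 (predicted to recidivate) if the decision score is nonnegative."""
    return 1 if b + sum(wi * xi for wi, xi in zip(w, x)) >= 0 else 0

# Synthetic stand-in data: two features per defendant, age and prior convictions,
# scaled to [0, 1]. The label rule below is hypothetical, not from the study.
random.seed(0)
data, labels = [], []
for _ in range(400):
    age = random.uniform(18, 70)
    priors = random.randint(0, 15)
    p = 1.0 / (1.0 + math.exp(-(0.3 * priors - 0.05 * (age - 30))))
    labels.append(1 if random.random() < p else 0)
    data.append([age / 70.0, priors / 15.0])

w, b = train_logistic_regression(data, labels)
accuracy = sum(predict(w, b, x) == y for x, y in zip(data, labels)) / len(data)
print(f"training accuracy: {accuracy:.2f}")
```

The point of the sketch is not the exact number but the simplicity: a two-weight linear model of the kind taught in an introductory course can, per the study, match a proprietary 137-feature tool.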
COMPAS has previously been challenged in the courts. In one case, a Wisconsin man was sentenced to six years in prison by a judge who cited a COMPAS assessment score. The man appealed, and the case made it to the Wisconsin Supreme Court, which ruled against him. In 2017, the U.S. Supreme Court declined to hear the case.
COMPAS is also no stranger to criticism. A 2016 analysis by ProPublica found “that black defendants were far more likely than white defendants to be incorrectly judged to be at a higher risk of recidivism.” Whites, conversely, were more likely “incorrectly flagged as low risk.”
The racial disparity in recidivism scores stems partly from limitations of the algorithms themselves. Examining the ProPublica dataset—the same one used by Dartmouth—The Washington Post found that, while COMPAS doesn't account for race directly in its algorithm, many attributes it considers in predicting repeat offenses vary by race, such as prior arrests, which black defendants are more likely to have.
Citing a different review, the Dartmouth study also noted that accuracy wasn't just an issue for COMPAS and that “eight out of nine [algorithmic] approaches failed to make accurate predictions.”