AI-Based Crime Tools Aren't the Problem. The Biased Data They Use Is.
Lawyers and criminal justice advocates point to institutional racism, and the data it produces, as the driving force behind biased artificial intelligence-backed crime tools.
April 11, 2019 at 11:30 AM
Recently, the Los Angeles Police Department announced it would stop using algorithm-based programs to identify who's most likely to commit violent crimes, according to The Los Angeles Times. An audit of the program by the department's inspector general found, among other things, that the police department used inconsistent criteria to label people “chronic offenders.”
The LAPD is one of many U.S. police departments and courts leveraging artificial intelligence-backed software to assist in policing, bail and sentencing decisions. While such tools are grabbing headlines for disproportionately targeting blacks and Latinos, observers say the root of these tools' problems is biased data.
“All the algorithms we've seen being used in the criminal justice realm use prior criminal history as a primary factor,” said Nyssa Taylor, criminal justice policy counsel for the American Civil Liberties Union of Pennsylvania.
Over-policing and plea bargains entered into not necessarily out of guilt, but by a defendant being low-income, can create misleading incarceration and recidivism data. “When you train a tool on a data set of that, it's going to reproduce the same biases in society,” said Hannah Sassaman, a Media Mobilizing Project policy director and 2017 Soros Justice fellow focused on limiting sentencing based on predictive algorithms.
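The dynamic Sassaman describes can be illustrated with a toy simulation (the neighborhoods, rates, and numbers below are hypothetical, not drawn from the article): two populations with the same underlying offense rate, where heavier patrolling of one produces more recorded arrests, so any risk tool trained only on arrest records scores that group as riskier.

```python
import random

random.seed(0)

# Hypothetical setup: neighborhoods A and B have the SAME underlying
# offense rate, but B is patrolled three times as heavily, so the same
# behavior is three times as likely to produce a recorded arrest.
TRUE_OFFENSE_RATE = 0.10               # identical in both neighborhoods
DETECTION_PROB = {"A": 0.2, "B": 0.6}  # B is policed 3x as intensively
N_PEOPLE = 10_000

def recorded_arrest_rate(neighborhood):
    """Arrest records reflect offenses *and* patrol intensity."""
    arrests = 0
    for _ in range(N_PEOPLE):
        offended = random.random() < TRUE_OFFENSE_RATE
        if offended and random.random() < DETECTION_PROB[neighborhood]:
            arrests += 1
    return arrests / N_PEOPLE

# A tool trained only on these records concludes B is roughly three
# times "riskier," even though underlying behavior is identical.
for hood in ("A", "B"):
    print(hood, round(recorded_arrest_rate(hood), 3))
```

The point of the sketch is that the model never sees the identical `TRUE_OFFENSE_RATE`; it only sees arrest records already shaped by patrol decisions, which is the bias the advocates say the algorithms reproduce.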
“What people don't understand is what algorithms are: they are basically opinions disguised as facts,” said Philadelphia-based criminal defense attorney Troy Wilson. “People just think you are using numbers, but you're using experiences, institutional racism to justify a particular result.”
He added, “A predictive algorithm that is trying to predict that you are such a high risk wants to give you a higher jail sentence before you committed another crime.”
Wilson said such a decision should not be taken lightly. “You should freak out—that's another reason I'm vehemently against it. How in the hell are you going to predict I'm going to commit another crime” when using AI and flawed data, he asked.
While criminal justice reform advocates stressed the weight government policy and policing have on the data used in sentencing and bail tools, they also noted that proprietary algorithms present obstacles of their own.
In many cases, there is a lack of transparency over how such software reached its conclusions. Wilson said that because the software is proprietary, lawyers cannot cross-examine its developers.
Advocates for reform agreed the courts need to take account of racial disparities before deciding a defendant's sentence or likelihood to commit a crime.
“From my perspective the angst around assessment tools is a mild distraction from what should be the angst about the data we are using in the first place,” said Cherise Fanno Burdeen, CEO of Pretrial Justice Institute, an organization that assists states and local governments with pretrial reforms. “It's not the assessment tool itself that is biased, and data by itself cannot be biased, but it's generated by policies that are biased.”
She added, “There's no silver-bullet fix to racial equity; we have to go back to the policy.”