Do We Need a Private Right of Action Against Machine Bias?
To combat the use of biased decision-making technology by government agencies, some are advocating for a private right of action. But whether such a right works depends heavily on its scope and definitions.
October 11, 2018 at 10:30 AM
A number of local government agencies and courts around the U.S. are placing their trust in machines. Need help determining health care program eligibility? There's an algorithm for that. What about predicting the chances of inmate recidivism? Just enter some variables and press a button.
Sounds simple, right? Except, of course, for the actual algorithms themselves. In fact, they can be so complex that, unknown to the agencies that deploy them, they may be biased.
Take, for example, the algorithm-based Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) recidivism prediction tool, which a ProPublica investigation found to be biased against African-American inmates. Or the algorithmic program used by the Idaho Department of Health and Welfare to determine Medicaid eligibility, which was accused of incorrectly kicking many residents off the state program.
For those affected by the programs, the consequences of using algorithm-powered decision-making technology in public and legal services can be painfully unjust and long-lasting.
But there are ways to combat algorithmic machine bias. One can increase transparency into how such algorithms are developed and trained, for instance, and subject them to ongoing testing.
Some also believe it's equally important to hold the agencies and courts that use such tools accountable for their impact by empowering those negatively affected by biased technology with a private right of action.
The idea, however, is far from fleshed out, and there is much debate among tech and legal professionals over whether such a private right would help, or whether it is needed at all. It's clear, however, that much of the answer depends on the specifics of the law enabling the right, especially in how it defines discrimination.
The Private Right
Two dozen legal, tech, and civil rights organizations advocated for a private right of action in New York City in an August 2018 letter of recommendations sent to the city's automated decision-making systems task force. Created by legislation the New York City Council passed in late 2017, the task force aims to deliver a report by December 2019 on ways to regulate the use of automated decision-making technology by local government agencies.
The letter calls for the council to pass “a law providing a private right of action for individuals or groups of individuals that are injured by automated decision system determinations that are found to be discriminatory or produce discriminatory results.”
Rashida Richardson, one of the signers of the letter and director of policy research at the AI Now Institute, an interdisciplinary research center at New York University that studies the social implications of AI, said the private right of action was aimed at government agencies that use biased algorithmic decision-making tools.
It's this distinction that, for some, makes a private right of action more feasible. Sandra Mayson, assistant professor at the University of Georgia School of Law, said such a right makes more sense when used “against government agencies deploying the algorithms,” instead of the tech companies developing them. “It's the deployment of the algorithm that would cause harm, not the algorithm itself in a void.”
Still, while Mayson said she is “in favor of anything that promotes transparency of algorithmic decision-making and government decision-making … the private right itself won't mean much without a clear delineation of what we mean by discrimination, and that is the hard thing to lay out.”
She added that while “there can be technical quantifiable measures [of algorithmic discrimination], the problem is choosing which one should apply in a given situation.”
But that is likely to be easier said than done. Because “what we mean by discrimination is often amorphous,” Mayson said, there isn't even a consensus on whether particular algorithmic decision-making tools are biased. In fact, “there is a deep dispute about whether COMPAS was biased at all, and that's because people understand bias to mean different things,” she said.
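Mayson's point can be made concrete. The Python sketch below, which uses made-up confusion-matrix counts rather than the actual COMPAS data, compares two common measures of algorithmic bias: predictive parity (equal precision across groups) and false-positive-rate parity. A tool can satisfy one while failing the other, which is roughly the shape of the COMPAS dispute.

# Hypothetical counts, for illustration only -- not the real COMPAS figures.
# Each group gets a confusion matrix for a "high risk" prediction:
# tp/fp/fn/tn = true positives, false positives, false negatives, true negatives.
groups = {
    "group_a": {"tp": 300, "fp": 200, "fn": 150, "tn": 350},
    "group_b": {"tp": 120, "fp": 80, "fn": 180, "tn": 620},
}

for name, m in groups.items():
    # Predictive parity asks: of those flagged high risk, how many reoffended?
    ppv = m["tp"] / (m["tp"] + m["fp"])
    # Error-rate parity asks: of those who did NOT reoffend, how many were flagged?
    fpr = m["fp"] / (m["fp"] + m["tn"])
    print(f"{name}: precision={ppv:.2f}, false positive rate={fpr:.2f}")

# Output:
# group_a: precision=0.60, false positive rate=0.36
# group_b: precision=0.60, false positive rate=0.11
# Equal precision ("calibrated"), yet group_a's non-reoffenders are flagged
# roughly three times as often -- unbiased by one definition, biased by another.

When the two groups reoffend at different underlying rates, results in the fairness literature show that measures like these generally cannot all be equalized at once, so choosing which one a statute should enforce is a policy judgment, not a technical one.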
Legislating Away Bias?
The way U.S. law treats discrimination can support arguments both for and against a private right of action.
“The law uses two different analytical frameworks” to consider claims of discrimination: “disparate treatment,” which is the intentional differential treatment of a certain group of people, and “disparate impact,” where there is no intentional discrimination but disparate outcomes for different groups, Mayson explained.
Of the two, disparate treatment has the stronger legal foundation. “So the way that most discrimination laws are structured, they don't prohibit disparate impact in all or even most scenarios,” she said.
However, there are still some protections against disparate impact bias in U.S. law. “Existing laws generally require some showing of intentional disparate treatment on the basis of a protected trait or completely unjustified disparate impact for which there is no plausible excuse,” she noted.
Title VII of the Civil Rights Act of 1964, for example, makes unlawful employment practices that cause a “disparate impact on the basis of race, color, religion, sex, or national origin.”
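Title VII's disparate impact analysis even comes with a rough quantitative screen: under the EEOC's “four-fifths rule,” a selection rate for any group that falls below 80 percent of the highest group's rate is treated as initial evidence of adverse impact. A minimal sketch of that screen, using hypothetical applicant numbers:

# Hypothetical applicant data, illustrating the EEOC four-fifths rule.
# Selection rate = hired / applied, computed per group.
applicants = {
    "group_a": {"applied": 200, "hired": 60},   # rate 0.30
    "group_b": {"applied": 150, "hired": 27},   # rate 0.18
}

rates = {g: d["hired"] / d["applied"] for g, d in applicants.items()}
highest = max(rates.values())

for group, rate in rates.items():
    ratio = rate / highest
    verdict = "adverse impact indicated" if ratio < 0.8 else "within guideline"
    print(f"{group}: rate={rate:.2f}, ratio={ratio:.2f} -> {verdict}")

# group_a: rate=0.30, ratio=1.00 -> within guideline
# group_b: rate=0.18, ratio=0.60 -> adverse impact indicated

In principle, the same screening logic could be pointed at the output of an automated decision system, which is one reason existing disparate impact doctrine figures on both sides of the private-right debate.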
For some, such a private right of action would add little to the discrimination laws already on the books. Helen Goff Foster, a partner in the privacy and security practice at Davis Wright Tremaine and a 20-year federal privacy attorney who has worked at the Federal Trade Commission, the Department of Homeland Security, and the Obama White House, believes that a private right of action “most often helps nobody but class action lawyers.”
She instead advocates a technological solution to machine bias: working closely with AI developers to improve their technology. “With something that is very new like artificial intelligence software … the rush to regulation is often more harmful than it is good. You need to have a dialogue with the players about what the problems are and how do we fix them in the most efficient way.”
But the AI Now Institute's Richardson sees a private right of action against government agencies not as an extraneous law that would stifle innovation, but as a necessary one to require “good governance efforts” by agencies that use algorithmic decision-making tools. What's more, she believes such a private right of action is something a local government would implement, even though it saddles them with potentially more liability.
“I don't think it's impossible, and in fact I'm fairly encouraged that something of that nature could occur in New York City, particularly because the task force is expected to release its report with recommendations in December 2019,” she said.
“In the following two years, you are going to see a huge turnover in the city council … and when you have longtime legislators that are termed out, they are more willing to support and push more ambitious legislation.”