When Algorithms Are Racist: How to Protect Against Biased Algorithms
When using algorithms, companies should be aware of any inherent biases the formula may have and monitor its results, attorneys said.
March 07, 2019 at 09:30 AM
4 minute read
A study released in November 2018 examined how algorithms are used to decide loan approvals, a task that can be laden with bias. Companies that leverage algorithms can't turn a blind eye to the results their software produces; instead, they should understand how the algorithm works and what data it draws on, and monitor its results, a Big Law attorney said.
The paper, “Fairness Under Unawareness: Assessing Disparity When Protected Class Is Unobserved,” written by Cornell University professors, a Ph.D. student and Capital One staffers, found potential pitfalls when algorithms are applied to applicants who don't disclose a protected class, such as gender or race, on a loan application.
Nathan Kallus, a Cornell Tech professor and co-author of the paper, said that when an applicant doesn't disclose a protected class, regulators may be overestimating disparities by guessing race from ZIP code or other factors. When an applicant doesn't list their race on a loan application, institutions use “proxy variables,” such as the ZIP code or surname listed on the application, to predict the applicant's race, Kallus said.
“We wanted to investigate this approach by proxy and assess if it works,” Kallus said. “Obviously in such high-stakes domains, you really want to make sure you are doing the right thing. We really wanted to dig deep.”
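As a rough illustration of what inference by proxy can look like, here is a minimal sketch in Python. It is not the paper's method or any institution's actual model; the ZIP codes, group labels and demographic shares are invented for illustration.

```python
# Toy sketch of proxy-based inference: guess an applicant's group from ZIP-code
# demographics when the applicant leaves the protected-class field blank.
# All values below are invented for illustration.

# Hypothetical share of residents in each ZIP code belonging to each group.
zip_demographics = {
    "10001": {"group_a": 0.70, "group_b": 0.30},
    "60629": {"group_a": 0.35, "group_b": 0.65},
}

def infer_group(zip_code: str) -> str:
    """Return the most probable group for an applicant, based only on ZIP code."""
    shares = zip_demographics[zip_code]
    return max(shares, key=shares.get)

# An applicant who left race blank gets labeled by the proxy:
print(infer_group("60629"))  # -> "group_b", regardless of the applicant's actual group
```

The last line shows the core weakness: the proxy assigns every applicant in a ZIP code the same guess, no matter who they actually are.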
The study reviewed multiple algorithms used when protected classes aren't definitively known, and it found that they can both over- and underestimate disparity.
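To see how a proxy can distort a disparity measurement, here is a minimal simulation, again a sketch rather than anything from the study: the approval rates and the 80 percent proxy accuracy are assumed values chosen only to make the effect visible.

```python
# Minimal simulation: compare the true approval-rate gap between two groups with
# the gap measured after each applicant's group is guessed by a noisy proxy.
import random

random.seed(0)

applicants = []
for _ in range(10_000):
    group = random.choice(["a", "b"])
    # Assumed approval rates: 0.70 for group a, 0.60 for group b (illustrative only).
    approved = random.random() < (0.70 if group == "a" else 0.60)
    # The proxy guesses the group correctly only 80% of the time (assumed).
    proxy_group = group if random.random() < 0.80 else ("b" if group == "a" else "a")
    applicants.append((group, proxy_group, approved))

def approval_gap(group_index: int) -> float:
    """Approval-rate gap (group a minus group b) using the chosen group column."""
    rates = {}
    for g in ("a", "b"):
        subset = [row for row in applicants if row[group_index] == g]
        rates[g] = sum(1 for row in subset if row[2]) / len(subset)
    return rates["a"] - rates["b"]

print(f"true disparity:  {approval_gap(0):.3f}")
print(f"proxy disparity: {approval_gap(1):.3f}")  # distorted by the misclassified guesses
```

In this toy setup the proxy-based estimate understates the true gap; under other proxy error patterns it can just as easily overstate it, which is the over- and underestimation the study describes.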
Finding the fairest algorithm is difficult, Kallus said. While the paper doesn't recommend policy, Kallus suggested that collecting an applicant's protected class may be the easiest way to detect discriminatory institutional practices.
“What one might infer from these results is that maybe if you want to be fair, it's better to know who these people are. Maybe it's better to be aware,” Kallus said. However, such data can be misused or pose a privacy concern, he added.
The “garbage in, garbage out” adage also applies to the biased data and algorithms that fuel machine learning and artificial intelligence leveraged by court systems, regulators and financial institutions, where the decisions carry significant consequences.
“People create algorithms, and if the people that create them live in a racist world of systemic discrimination, they reproduce that discrimination in their computer code. And I'm not sure how to guard against that,” said Andrea Freeman, a University of Denver Sturm College of Law professor and author of a North Carolina Law Review “Racism in the Credit Card Industry” article.
However, regulators would hold companies accountable if their software made discriminatory decisions against protected classes, said Kevin Petrasic, chair of White & Case's global financial institutions advisory practice.
“The regulator can go after the vendor and institution, especially if it's patently discriminatory or lending that has a disparate impact against a protected class,” Petrasic said. A company that doesn't have controls in place to monitor its algorithms is “not going to be given too much empathy from the regulator's perspective,” he added.
Petrasic, a White & Case partner and former special counsel to the U.S. Treasury Department's Office of Thrift Supervision, said problems can arise from how an algorithm was structured and trained, leading to potential inherent biases. “All of those issues suggest that there's a tremendous amount of awareness that needs to be had to the use of algorithms,” he explained.
Petrasic added that financial institutions should have an “explainability factor” for their algorithms: they should be able to explain how the algorithm works and what controls are in place to monitor its results.
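One hedged sketch of what such a control might look like, assuming a simple check of approval rates by group over logged decisions; the “adverse impact ratio” framing and the 0.8 threshold are illustrative choices, not controls the article or any regulator prescribes:

```python
# Sketch of a monitoring control: compute each group's approval rate from logged
# decisions, divide by the highest group's rate, and flag ratios below a threshold
# for human review. Group labels, data and threshold are illustrative.
from collections import defaultdict

def adverse_impact_ratios(decisions, threshold=0.8):
    """decisions: iterable of (group_label, approved) pairs -> {group: (ratio, flagged)}."""
    counts = defaultdict(lambda: [0, 0])  # group -> [approved, total]
    for group, approved in decisions:
        counts[group][1] += 1
        counts[group][0] += int(approved)
    rates = {g: approved / total for g, (approved, total) in counts.items()}
    best = max(rates.values())
    return {g: (rate / best, rate / best < threshold) for g, rate in rates.items()}

# Made-up decision log: group_a approved 70 of 100, group_b approved 50 of 100.
log = (
    [("group_a", True)] * 70 + [("group_a", False)] * 30
    + [("group_b", True)] * 50 + [("group_b", False)] * 50
)
print(adverse_impact_ratios(log))
# group_b's ratio is 0.50 / 0.70 ≈ 0.71, below 0.8, so it would be flagged for review.
```

A periodic report along these lines is one concrete way to pair the “explainability factor” with ongoing monitoring of results.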