When Algorithms Are Racist: How to Protect Against Biased Algorithms
When using algorithms, companies should be aware of any inherent biases the underlying formula may have and monitor its results, attorneys said.
March 07, 2019 at 09:30 AM
A study released in November 2018 examined how algorithms are used to decide loan approval, a task that can be laden with biases. Companies that leverage algorithms can't turn a blind eye to the results their software provides; instead, they should understand how the algorithm works and what data it pulls from, and monitor its results, a Big Law attorney said.
The paper, “Fairness Under Unawareness: Assessing Disparity When Protected Class Is Unobserved,” written by Cornell University professors, a Ph.D. student and Capital One staffers, found potential pitfalls when algorithms are used to assess disparity and loan applicants haven't disclosed protected classes such as gender or race.
Nathan Kallus, a Cornell Tech professor and co-author of the paper, said that when applicants don't disclose their protected class, regulators may misestimate disparities by guessing race from ZIP codes or other factors. When race is missing from a loan application, institutions turn to “proxy variables,” such as ZIP codes or surnames listed on the application, to predict the applicant's race, Kallus said.
“We wanted to investigate this approach by proxy and assess if it works,” Kallus said. “Obviously in such high-stakes domains, you really want to make sure you are doing the right thing. We really wanted to dig deep.”
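That proxy approach can be sketched in a few lines of Python, in the spirit of the Bayesian Improved Surname Geocoding (BISG) method regulators use: combine a surname-based race distribution with a geography-based one. The probability tables below are hypothetical, not real census figures.

```python
# A sketch of race inference from proxy variables, in the spirit of the
# Bayesian Improved Surname Geocoding (BISG) method regulators use.
# All probability tables below are hypothetical, not real census figures.

# Assumed P(race | surname), e.g. derived from census surname tables
P_RACE_GIVEN_SURNAME = {
    "garcia": {"white": 0.05, "black": 0.01, "hispanic": 0.92, "other": 0.02},
    "smith":  {"white": 0.73, "black": 0.22, "hispanic": 0.02, "other": 0.03},
}

# Assumed P(race | ZIP code), e.g. derived from census geography tables
P_RACE_GIVEN_ZIP = {
    "10027": {"white": 0.30, "black": 0.40, "hispanic": 0.25, "other": 0.05},
    "83001": {"white": 0.90, "black": 0.01, "hispanic": 0.07, "other": 0.02},
}

def proxy_race_probabilities(surname: str, zip_code: str) -> dict:
    """Combine the surname and geography proxies with a naive Bayes update."""
    surname_probs = P_RACE_GIVEN_SURNAME[surname.lower()]
    zip_probs = P_RACE_GIVEN_ZIP[zip_code]
    # Multiply the two proxy signals, then renormalize to sum to 1.
    combined = {race: surname_probs[race] * zip_probs[race] for race in surname_probs}
    total = sum(combined.values())
    return {race: round(p / total, 3) for race, p in combined.items()}

print(proxy_race_probabilities("Garcia", "10027"))
# {'white': 0.06, 'black': 0.016, 'hispanic': 0.92, 'other': 0.004}
```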
The study reviewed multiple methods used when protected classes aren't definitively known and found that they can both over- and underestimate disparity.
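One way to see how that happens is a toy simulation with assumed numbers: suppose approval rates truly differ by 10 points, but the proxy misclassifies group membership 20% of the time. In this symmetric setup the proxy understates the gap; per the paper, the error can run in either direction depending on how the proxy correlates with outcomes.

```python
# Toy simulation, with assumed numbers, of how proxy error distorts a
# disparity estimate. True approval rates differ by 10 points, but the
# proxy misclassifies group membership 20% of the time.
import random

random.seed(0)
N = 100_000
TRUE_APPROVAL = {"A": 0.80, "B": 0.70}  # assumed true rates: a 10-point gap
PROXY_ACCURACY = 0.80                   # proxy gets the group right 80% of the time

records = []
for _ in range(N):
    group = random.choice(["A", "B"])
    approved = random.random() < TRUE_APPROVAL[group]
    # The proxy flips the group label 20% of the time, symmetrically.
    proxy = group if random.random() < PROXY_ACCURACY else ("B" if group == "A" else "A")
    records.append((group, proxy, approved))

def gap(label_index: int) -> float:
    """Difference in approval rates between groups, per the chosen label."""
    rates = {}
    for g in ("A", "B"):
        rows = [r for r in records if r[label_index] == g]
        rates[g] = sum(r[2] for r in rows) / len(rows)
    return rates["A"] - rates["B"]

print(f"true disparity:  {gap(0):.3f}")   # ≈ 0.100
print(f"proxy disparity: {gap(1):.3f}")   # ≈ 0.060 — understated here
```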
Finding the fairest algorithm is difficult, Kallus said. While the paper doesn't make policy recommendations, Kallus suggested that collecting an applicant's protected class may be the easiest way to detect discriminatory institutional practices.
“What one might infer from these results is that maybe if you want to be fair, it's better to know who these people are. Maybe it's better to be aware,” Kallus said. However, such data can be misused or pose a privacy concern, he added.
The “garbage in, garbage out” adage also applies to the biased data that fuels machine learning and artificial intelligence tools used by court systems, regulators and financial institutions, where decisions carry significant consequences.
“People create algorithms, and if the people that create them live in a racist world of systemic discrimination, they reproduce that discrimination in their computer code. And I'm not sure how to guard against that,” said Andrea Freeman, a University of Denver Sturm College of Law professor and author of the North Carolina Law Review article “Racism in the Credit Card Industry.”
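Freeman's point can be made concrete with a fabricated sketch: a rule “trained” on historical decisions that encoded discrimination reproduces that discrimination, even though race never appears as a feature, because a proxy feature (here, a hypothetical “neighborhood” field) carries it.

```python
# A fabricated illustration of "garbage in, garbage out": a rule fit to
# historical decisions that encoded discrimination reproduces it, even
# though race never appears as a feature. "neighborhood" is a stand-in
# proxy feature; all data below is made up.
from collections import defaultdict

# Hypothetical training data: (credit_score, neighborhood, approved).
# Equally qualified applicants from "south" were historically denied more.
history = [
    (700, "north", True), (700, "north", True), (700, "north", True),
    (700, "south", False), (700, "south", False), (700, "south", True),
]

# "Training": memorize the historical approval rate per feature bucket.
approved, total = defaultdict(int), defaultdict(int)
for score, hood, ok in history:
    total[(score, hood)] += 1
    approved[(score, hood)] += ok

def predict(score: int, hood: str) -> bool:
    """Approve if a majority of similar past applicants were approved."""
    return approved[(score, hood)] / total[(score, hood)] >= 0.5

# Identical credit scores, different neighborhoods: the learned rule
# faithfully reproduces the historical disparity.
print(predict(700, "north"))  # True
print(predict(700, "south"))  # False
```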
However, regulators would hold companies accountable if they used software that discriminates against protected classes, said Kevin Petrasic, chair of White & Case's global financial institutions advisory practice.
“The regulator can go after the vendor and institution, especially if it's patently discriminatory or lending that has a disparate impact against a protected class.” If a company doesn't have controls in place to monitor algorithms, they are “not going to be given too much empathy from the regulator's perspective,” he added.
Petrasic, a White & Case partner and former special counsel to the U.S. Treasury Department's Office of Thrift Supervision, said issues can arise from how an algorithm was structured and trained, leading to potential inherent biases. “All of those issues suggest that there's a tremendous amount of awareness that needs to be had to the use of algorithms,” he explained.
Petrasic added that financial institutions should have an “explainability factor” for their algorithms: they should be able to explain how the algorithm works and what controls are in place to monitor the results.
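What such a monitoring control might look like, as a minimal sketch: compare approval rates across groups on recent decisions and flag the model for review when the gap exceeds a tolerance. The tolerance and the shape of the decision records are assumptions, not any regulator's actual standard.

```python
# A minimal sketch of an ongoing monitoring control; the tolerance and
# the shape of the decision records are assumptions, not any regulator's
# actual standard.
from collections import defaultdict

DISPARITY_TOLERANCE = 0.05  # assumed internal policy threshold

def monitor_approval_disparity(decisions):
    """decisions: iterable of (group_label, approved: bool) pairs."""
    approved, total = defaultdict(int), defaultdict(int)
    for group, ok in decisions:
        total[group] += 1
        approved[group] += ok
    rates = {g: approved[g] / total[g] for g in total}
    gap = max(rates.values()) - min(rates.values())
    if gap > DISPARITY_TOLERANCE:
        # In practice this would page a model-risk or compliance team.
        print(f"ALERT: approval-rate gap {gap:.3f} exceeds tolerance; rates: {rates}")
    return rates, gap

# Example: recent decisions show a 33-point gap, triggering the alert.
monitor_approval_disparity([("A", True), ("A", True), ("A", False),
                            ("B", True), ("B", False), ("B", False)])
```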