Automation in Hiring Could Mean 'Significant Legal Risks'
Amazon scrapped its experimental hiring tool after discovering a bias against certain résumés from women. And as one lawyer says, 'Just because something has a statistical correlation, doesn't mean it's a good or lawful way to select talent.'
October 18, 2018 at 03:29 PM
The original version of this story was published on National Law Journal
Amazon.com Inc. developed an experimental hiring tool one employee called the “holy grail” in recruiting: a data analytics program that could sift through thousands of applications and rate candidates.
The e-commerce giant scrapped the tool after discovering a bias against certain résumés from women, according to a report from Reuters. The algorithm picked up patterns from the company's male-dominated workforce and, the report said, devalued résumés containing the word “women” and other gender-specific terms.
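Amazon has not disclosed how the experimental tool worked, but the failure mode Reuters describes is straightforward to reproduce. The sketch below is purely hypothetical (the toy résumés, the bag-of-words features and the logistic regression model are all assumptions, not Amazon's system): a classifier fit to skewed historical outcomes assigns a negative weight to a gender-linked word even though that word says nothing about job performance.

```python
# Hypothetical sketch, not Amazon's actual system: a résumé scorer trained on
# skewed historical outcomes learns to penalize a gender-linked token.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Toy "historical" outcomes in which gender-linked wording appears only among
# past rejections, mimicking a male-dominated hiring history.
resumes = [
    "captain of chess club, java developer",            # hired
    "java developer, intramural soccer",                 # hired
    "captain of women's chess club, java developer",     # rejected
    "women's coding society lead, java developer",       # rejected
]
hired = [1, 1, 0, 0]

vec = CountVectorizer()
X = vec.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# Inspect the learned weights: the gender-linked token picks up a negative
# coefficient purely from the skewed history, not from job-related merit.
weights = dict(zip(vec.get_feature_names_out(), model.coef_[0]))
print(sorted(weights.items(), key=lambda kv: kv[1])[:3])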
Companies and their lawyers often talk about how the infusion of technology will create new efficiencies, but the Amazon debacle shows there are pitfalls. The apparent shortcomings in Amazon's tool point to ways automation can still incorporate unconscious biases. Amazon told Reuters that its tool “was never used by Amazon recruiters to evaluate candidates.”
“There are significant legal risks,” said Mark Girouard, an employment attorney at Minneapolis-based Nilan Johnson Lewis. “These tools find patterns in the data and look for correlations in whatever measure of success you are looking at. They can find correlations that are statistically significant. Just because something has a statistical correlation, doesn't mean it's a good or lawful way to select talent.”
Still, such programs are attractive to large companies. A recent survey by the management-side firm Littler Mendelson found that the most common use of data analytics and artificial intelligence is in hiring and recruiting. Nearly half of those employers surveyed said they use some kind of advanced data techniques to grow their workforce.
Girouard said employers seek these programs because they can ease the workload for hiring managers and they can be cheaper to develop than traditional assessments—such as a written or online test. He said employers also believe there is potential for less implicit or explicit bias since computers theoretically are neutral.
“When I am advising clients considering using these tools, I make sure that their vendor will let them look under the hood or inside the black box to look at and monitor what they are finding,” he said. He added, “It's such a new area. I think as we see more employers head in this direction, it will likely lead to litigation.”
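One concrete way to “look under the hood” is an adverse-impact analysis under the EEOC's four-fifths rule, which flags any group whose selection rate falls below 80 percent of the highest group's rate. The check below is a minimal sketch, not something the article or Girouard specifically prescribes, and the screening numbers are hypothetical.

```python
# Minimal adverse-impact audit under the four-fifths rule: compare each
# group's selection rate to the highest-rate group and flag ratios below 0.8.
def adverse_impact_ratios(selected_by_group, applicants_by_group, threshold=0.8):
    """Return each group's impact ratio and whether it falls below the threshold."""
    rates = {g: selected_by_group[g] / applicants_by_group[g]
             for g in applicants_by_group}
    best = max(rates.values())
    return {g: (rate / best, rate / best < threshold) for g, rate in rates.items()}

# Hypothetical screening results from an automated tool
applicants = {"men": 400, "women": 350}
advanced = {"men": 120, "women": 70}

for group, (ratio, flagged) in adverse_impact_ratios(advanced, applicants).items():
    note = " <- potential adverse impact" if flagged else ""
    print(f"{group}: impact ratio {ratio:.2f}{note}")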
Arran Stewart, co-founder of the online portal Job.com, predicted the Reuters story revealing the issues with Amazon's experimental tool will “open a can of worms” and make employers and workers aware of the potential for mistakes with automation.
Artificial intelligence is “like a child,” he said. “Anything you teach it, it will inherit. It will inherit bias. Developers and coders create AI and create the rules, set the dictionaries and taxonomies and the tools will inherit their biases, sometimes unknowingly.”
One Former EEOC Official's Perspective
Civil rights advocates and federal regulators said there could be unintended consequences from automation. While the area may be ripe for discrimination claims, it's not obvious to job seekers that employers are using these tools.
In 2016, the U.S. Equal Employment Opportunity Commission held a meeting to discuss the implications of the rise of big data in the workplace. Kelly Trindel, then the EEOC's chief analyst in the Office of Research, Information and Planning, outlined potential pitfalls for protected classes as companies increasingly use these programs to recruit and hire.
“The primary concern is that employers may not be thinking about big data algorithms in the same way that they've thought about more traditional selection devices and employment decision strategies in the past,” Trindel said at the EEOC meeting. “Many well-meaning employers wish to minimize the effect of individual decision-maker bias, and as such might feel better served by an algorithm that seems to maintain no such human imperfections. Employers must bear in mind that these algorithms are built on previous worker characteristics and outcomes.”
Algorithms focused on hiring can replicate “past behavior at the firm or firms used to create the dataset,” said Trindel, who has since left the agency. “If past decisions were discriminatory or otherwise biased, or even just limited to particular types of workers, then the algorithm will recommend replicating that discriminatory or biased behavior.”
At that same 2016 meeting, Littler Mendelson shareholder Marko Mrkonich said the challenge for employers “is to find a way to embrace the strengths of big data without losing sight of their own business goals and culture amidst potential legal risks.”
“The challenge for the legal system is to permit those engaged in the responsible development of big data methodologies in the employment sector to move forward and explore their possibilities without interference from guidelines and standards based on assumptions that no longer apply or that become obsolete the next year,” Mrkonich said.
A 2017 American Bar Association report by Darrell Gay and Abigail Lowin of Arent Fox said there is “great liability” in allowing algorithms to take control without human oversight. Yet the attorneys noted that case law offering guidance on big data remains sparse, and they warned of the potential for lawsuits to follow.
“On its face, this is good news for employers,” the attorneys wrote of the dearth of cases. They added, “In any case, it is difficult to build arguments or learn from past errors if those historical lessons are hidden behind obscure verbiage.”
The report said computers and algorithms are not easily trained to make nuanced judgments about job applicants. To defend against discrimination allegations, employers have to explain why one employee was hired instead of another.
“An algorithm can learn the population makeup of various protected groups; and it can learn the traits that an employer seeks in new employees; but it cannot adequately balance those potentially competing factors,” the attorneys wrote. They continued: “When employers rely too heavily on algorithms that do not receive the proper 'instruction' and oversight, this can create potential exposure.”