Automation in Hiring and the 'Significant Legal Risks'
"Just because something has a statistical correlation, doesn't mean it's a good or lawful way to select talent,” one lawyer says.
October 18, 2018 at 03:29 PM
6 minute read
Amazon.com Inc. developed an experimental hiring tool one employee called the “holy grail” in recruiting: a data analytics program that could sift through thousands of applications and rate candidates.
The e-commerce giant scrapped the tool after discovering it was biased against women, according to a report from Reuters. The algorithm picked up patterns from the company's male-dominated workforce and devalued résumés that included the word “women” and other gender-specific terms.
Companies and their lawyers often talk about how the infusion of technology will create new efficiencies, but the Amazon debacle shows there are pitfalls. The apparent shortcomings in Amazon's tool point to ways automation can still incorporate unconscious biases. Amazon told Reuters that its tool “was never used by Amazon recruiters to evaluate candidates.”
“There are significant legal risks,” said Mark Girouard, an employment attorney at Minneapolis-based Nilan Johnson Lewis. “These tools find patterns in the data and look for correlations in whatever measure of success you are looking at. They can find correlations that are statistically significant. Just because something has a statistical correlation, doesn't mean it's a good or lawful way to select talent.”
Still, such programs are attractive to large companies. A recent survey by the management-side firm Littler Mendelson found that the most common use of data analytics and artificial intelligence is in hiring and recruiting. Nearly half of the employers surveyed said they use some form of advanced data technique to grow their workforces.
Girouard said employers seek these programs because they can ease the workload for hiring managers and they can be cheaper to develop than traditional assessments—such as a written or online test. He said employers also believe there is potential for less implicit or explicit bias since computers theoretically are neutral.
“When I am advising clients considering using these tools, I make sure that their vendor will let them look under the hood, or inside the black box, to monitor what they are finding,” he said. He added, “It's such a new area. I think as we see more employers head in this direction, it will likely lead to litigation.”
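One concrete form that monitoring can take is an adverse-impact audit of a tool's pass rates. The sketch below applies the “four-fifths rule” from the EEOC's Uniform Guidelines on Employee Selection Procedures to a screening tool's output; the applicant counts are invented for illustration.

```python
# A minimal sketch of the adverse-impact monitoring Girouard describes,
# using the EEOC's "four-fifths rule." All counts are hypothetical.

def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of a group's applicants that the tool advanced."""
    return selected / applicants

# Hypothetical outcomes from one screening cycle.
rate_men = selection_rate(selected=120, applicants=400)   # 0.30
rate_women = selection_rate(selected=45, applicants=250)  # 0.18

# Impact ratio: the lower group's rate over the higher group's rate.
impact_ratio = min(rate_men, rate_women) / max(rate_men, rate_women)

print(f"impact ratio = {impact_ratio:.2f}")  # 0.60 in this example
if impact_ratio < 0.8:
    # A ratio below four-fifths is generally treated as evidence of
    # adverse impact and a cue to examine how the tool selects.
    print("Potential adverse impact: audit the tool's selections.")
```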
Arran Stewart, co-founder of the online portal Job.com, predicted the Reuters story that revealed the issues with Amazon's experimental tool will “open a can of worms” and make employers and workers aware of the potential for mistakes with such automation.
Artificial intelligence is “like a child,” he said. “Anything you teach it, it will inherit. It will inherit bias. Developers and coders create AI and create the rules, set the dictionaries and taxonomies, and the tools will inherit their biases, sometimes unknowingly.”
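Stewart's point is easy to demonstrate in miniature. The toy model below is not Amazon's system or any vendor's; it fits an off-the-shelf text classifier to a handful of invented résumés labeled with biased historical hiring decisions, and the token “women” duly picks up a strongly negative weight.

```python
# A toy illustration of how a model "inherits" bias from its training
# data. The résumés and hiring labels are invented; this is not any
# vendor's actual system.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

resumes = [
    "captain of chess club, java developer",
    "led robotics team, python engineer",
    "captain of women's chess club, java developer",
    "founded women's coding society, python engineer",
]
hired = [1, 1, 0, 0]  # biased past decisions become training labels

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# The classifier faithfully learns the pattern it was given: the
# token "women" receives one of the most negative coefficients.
weights = dict(zip(vectorizer.get_feature_names_out(), model.coef_[0]))
for word, weight in sorted(weights.items(), key=lambda kv: kv[1])[:3]:
    print(f"{word}: {weight:+.2f}")
```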
One Former EEOC Official's Perspective
Civil rights advocates and federal regulators said there could be unintended consequences from automation. While the area may be ripe for discrimination claims, it's not obvious to job seekers that employers are using these tools.
In 2016, the U.S. Equal Employment Opportunity Commission held a meeting to discuss the implications of the rise of big data in the workplace. Kelly Trindel, then the EEOC's chief analyst in the Office of Research, Information and Planning, outlined potential pitfalls for protected classes as companies increasingly use these programs to recruit and hire.
“The primary concern is that employers may not be thinking about big data algorithms in the same way that they've thought about more traditional selection devices and employment decision strategies in the past,” Trindel said at the EEOC meeting. “Many well-meaning employers wish to minimize the effect of individual decision-maker bias, and as such might feel better served by an algorithm that seems to maintain no such human imperfections. Employers must bear in mind that these algorithms are built on previous worker characteristics and outcomes.”
Algorithms focused on hiring can replicate “past behavior at the firm or firms used to create the dataset,” said Trindel, who has since left the agency. “If past decisions were discriminatory or otherwise biased, or even just limited to particular types of workers, then the algorithm will recommend replicating that discriminatory or biased behavior.”
At that same 2016 meeting, Littler Mendelson shareholder Marko Mrkonich said the challenge for employers “is to find a way to embrace the strengths of big data without losing sight of their own business goals and culture amidst potential legal risks.”
“The challenge for the legal system is to permit those engaged in the responsible development of big data methodologies in the employment sector to move forward and explore their possibilities without interference from guidelines and standards based on assumptions that no longer apply or that become obsolete the next year,” Mrkonich said.
An American Bar Association report from 2017 by Darrell Gay and Abigail Lowin of Arent Fox said there is “great liability” in allowing algorithms to take control without human oversight. Yet they noted that case law offering guidance on big data remains scarce, and warned that lawsuits could follow.
“On its face, this is good news for employers,” the attorneys wrote of the dearth of cases. They added, “In any case, it is difficult to build arguments or learn from past errors if those historical lessons are hidden behind obscure verbiage.”
The report said computers and algorithms are not easily trained to make nuanced judgments about job applicants. To defend against discrimination allegations, employers have to be able to explain why one candidate was hired instead of another.
“An algorithm can learn the population makeup of various protected groups; and it can learn the traits that an employer seeks in new employees; but it cannot adequately balance those potentially competing factors,” the attorneys wrote. They continued, “When employers rely too heavily on algorithms that do not receive the proper 'instruction' and oversight, this can create potential exposure.”