Questions to Ask When Considering Risks and Warning Signs in Tech-Based Screening Tools
April 03, 2019 at 01:15 PM
Every job opening now draws a slew of applicants. As application volumes climb, more and more human resources departments are turning to technology to reduce the burden of reviewing resumes and to predict a candidate's success within their workforce. Often these technological solutions, such as resume screening and personality testing, are presented to in-house counsel as a fait accompli, leaving counsel scrambling to figure out whether the new screening tool passes the legal sniff test. This article aims to tee up the baseline issues that should be resolved before these tools are implemented at scale.
A recent study suggests that the rise of predictive analytics can create significant risk for employers. In a December 2018 report, Upturn, a D.C.-based nonprofit organization, found that without active measures to mitigate bias, predictive tools default to some form of bias. These biases, and the problems they create, can appear at any point in the recruitment and hiring process, including the advertising, sourcing, screening, interviewing and selection stages. Compounding the problem, lawmaking and regulatory bodies may lack the authority, resources and expertise to provide meaningful guidance and oversight of employers' use of predictive analytics in recruiting and hiring.
A 2018 Reuters article described a recruiting tool that adapted and learned over time to disfavor female candidates, demonstrating that these concerns are not merely abstract or theoretical. The risks to unwary employers have never been greater, especially in light of a rapidly accelerating trend in state and local pay equity legislation, which raises critical questions about how predictive analytics factor into candidate and applicant offers. Federal contractors face the added risk that the government will evaluate these tools under the dated and vague standards of the Uniform Guidelines on Employee Selection Procedures and find them invalid.
Given these potential issues, there are a few warning signs that employers should consider when evaluating the legal risks related to data-driven recruitment and hiring practices.
- Is there a validation study and how good is it?
An initial sign that something is amiss with a predictive analytics tool is a lack of accompanying validation studies. Validation studies evaluate the degree to which a particular tool or algorithm actually measures what it sets out to measure.
Having a validation study in hand, however, does not eliminate the legal risk, because such studies are limited in what they observe and measure. For example, a validation study may attest to a selection tool's ability to predict the existence and strength of a relationship between education and longevity in a position. That same study may not convey whether and how the tool controls for other hidden, intervening variables. These hidden variables, which can include gender or other protected characteristics, may in turn drive biased applicant and candidate selections. A validation study demonstrating a selection tool's strong predictive power does not, therefore, guarantee that the tool is sufficient from an analytical or legal standpoint.
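To make the hidden-variable problem concrete, consider a minimal, hypothetical simulation (the variable names and numbers below are invented for illustration, not drawn from any actual vendor's tool). A screening score can correlate strongly with tenure, and so "validate" well, while still producing sharply different pass rates across a protected group, because the score leans on a variable that itself tracks group membership:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical applicant pool: 'group' stands in for a protected
# characteristic the tool never sees directly.
group = rng.integers(0, 2, n)

# A hidden variable (say, access to elite credentials) that is
# unevenly distributed across groups in this synthetic population.
credential = rng.normal(loc=group * 0.8, scale=1.0, size=n)

# The tool's score leans heavily on the credential variable.
score = 0.9 * credential + rng.normal(scale=0.5, size=n)

# The outcome the validation study measures (tenure) is also driven
# by the credential, so the score "validates" well against it.
tenure = 0.7 * credential + rng.normal(scale=1.0, size=n)

validity = np.corrcoef(score, tenure)[0, 1]
print(f"validation correlation (score vs. tenure): {validity:.2f}")

# Yet pass rates at a fixed cutoff diverge sharply by group.
cutoff = np.quantile(score, 0.75)  # top quartile of scores "pass"
for g in (0, 1):
    rate = (score[group == g] >= cutoff).mean()
    print(f"pass rate, group {g}: {rate:.1%}")
```

In this toy setup the study would report a respectable validity coefficient, yet one group passes far more often than the other; nothing in the validity number itself flags the disparity.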
- How much do you know about the variables used in the tool?
Predictive analytics tools vary in how they use and transform data. Ideally, vendors are transparent about the variables on which a particular tool screens candidates, providing clear and detailed information on how the tool manipulates data, where that data comes from, on what basis the tool predicts applicant or candidate success, and how employers should interpret the results. Predictive tools that employers cannot adjust to include or exclude particular variables may expose employers to liability for inadvertent discrimination resulting from the tools' use. For this reason, employers should avoid candidate screening tools whose functionality they do not understand.
- Is there over-reliance on an employer's own inputs?
Too little information about how a tool works is one risk; feeding a tool too much of an employer's own information is another. Some predictive analytics tools rely heavily on data sourced from an employer's existing workforce, such as employees' education, experience, performance, demographics and other attributes.
Using an employer's workforce data as the primary input for a predictive analytics tool carries real risks. It magnifies the chance that the tool will perpetuate bias that already exists, however covertly, in the employer's hiring process: the employer's measure of what constitutes a “successful” employee may itself be a product of past discrimination, or inherently discriminatory. And to the extent the variables associated with employee success are not essential functions of the position, employers have fewer objective bases to support their decisions. In these ways, candidate screening based on existing employee data alone may compound, rather than reduce, bias in an employer's hiring practices.
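This feedback loop is easy to reproduce. The sketch below is a hypothetical illustration, with synthetic data and a plain logistic regression standing in for whatever model a vendor actually uses: it trains on historical “success” labels that were systematically depressed for one group, then shows the model carrying that gap forward to a fresh applicant pool of identical underlying skill.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 5_000

# Synthetic historical workforce: skill is distributed identically
# across the two groups.
group = rng.integers(0, 2, n)
skill = rng.normal(size=n)

# Historical "success" labels were systematically depressed for
# group 0 (e.g., biased performance reviews).
penalty = np.where(group == 0, -1.0, 0.0)
success = (skill + penalty + rng.normal(scale=0.5, size=n)) > 0

# The model never sees 'group', but a proxy feature correlated with
# it (a keyword, a school tier) leaks group membership anyway.
proxy = group + rng.normal(scale=0.3, size=n)
X = np.column_stack([skill, proxy])
model = LogisticRegression(max_iter=1000).fit(X, success)

# Score a brand-new applicant pool with identical skill by group.
new_group = rng.integers(0, 2, n)
new_skill = rng.normal(size=n)
new_proxy = new_group + rng.normal(scale=0.3, size=n)
preds = model.predict(np.column_stack([new_skill, new_proxy]))

for g in (0, 1):
    print(f"recommended-hire rate, group {g}: "
          f"{preds[new_group == g].mean():.1%}")
```

Although skill is identical across groups in the new pool, the model recommends one group at a markedly higher rate, because the proxy feature lets it reproduce the bias baked into the historical labels.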
- At the front end, does the tool have dubious and invasive screening questions?
Competency and personality testing are among the most popular forms of predictive analytics in employee recruiting and hiring. Like other predictive tools, such tests aim to screen candidates against objective metrics. They are, however, a veritable minefield of potential bias and, as a result, employer liability. Competency testing, for example, may fail to measure competencies actually related to the job at issue: an employer that imposes high-level competency testing for a low-skill manufacturing position risks injecting illegitimate factors into the employment decision and disfavoring qualified individuals from less-educated backgrounds. Disparate impact claims may follow if members of protected classes are overrepresented in the rejected applicant pool.
Personality testing carries similar risks: responses to hypothetical scenario questions may correlate with preferences rooted in impermissible sex-based biases and stereotypes, disfavoring women or minority candidates. Other obvious red flags include invasive questions on topics ranging from physical disabilities and marital status to political beliefs and feelings about controversial issues.
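The Uniform Guidelines referenced above supply a rough benchmark for spotting disparate impact: under the “four-fifths rule,” a selection rate for any group that is less than four-fifths (80 percent) of the rate for the group with the highest rate is generally regarded as evidence of adverse impact. The arithmetic is simple; the applicant counts below are invented for illustration:

```python
# Hypothetical outcomes from a competency test.
applicants = {"group_a": 200, "group_b": 150}
selected = {"group_a": 100, "group_b": 45}

rates = {g: selected[g] / applicants[g] for g in applicants}
highest = max(rates.values())

for g, rate in rates.items():
    ratio = rate / highest
    flag = "below 4/5 threshold" if ratio < 0.8 else "ok"
    print(f"{g}: selection rate {rate:.0%}, "
          f"impact ratio {ratio:.2f} ({flag})")
```

Here group_b's 30 percent selection rate is only 60 percent of group_a's 50 percent rate, well under the four-fifths benchmark, and exactly the sort of pattern that invites regulator and plaintiff attention.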
- Does removing the human element eliminate selection bias?
Many employers utilize predictive analytics in candidate screening as a means to reduce biased hiring decisions. Unconscious biases by recruiters and hiring managers could lead to judgments about candidates based on protected characteristics. By instead screening candidates on purportedly neutral and identical criteria, employers hope to reduce the possibility that such biases will lead to qualified candidates being overlooked.
The underrepresentation of women and minority employees in an existing workforce, however, may signal that an employer's data-driven screening process contains biases of its own. These biases can arise in several ways. As described above, a screening tool may adapt to and perpetuate existing biases within an employer's workforce. And in resume-screening software, certain variables may screen out qualified candidates based on names, education and other candidate information, data points that can serve as proxies for protected characteristics such as sex, race, age and sexual orientation.
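One practical safeguard is a proxy audit: for each input the tool uses, measure how strongly that input predicts a protected characteristic in the employer's own applicant data. The sketch below uses invented feature names and a simple correlation screen; any feature that closely tracks a protected attribute deserves scrutiny even though the attribute itself is never an input.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3_000

# Hypothetical applicant records: 'sex' is the protected attribute
# (coded 0/1); the features are candidate inputs to a screening tool.
sex = rng.integers(0, 2, n)
features = {
    "years_experience": rng.normal(10, 3, n),             # unrelated
    "employment_gap": sex * 0.9 + rng.normal(0, 0.5, n),  # tracks sex
    "keyword_score": rng.normal(0, 1, n),                  # unrelated
}

for name, values in features.items():
    r = abs(np.corrcoef(values, sex)[0, 1])
    note = "  <-- potential proxy" if r > 0.3 else ""
    print(f"{name:18s} |corr with sex| = {r:.2f}{note}")
```

A point-biserial correlation like this is crude; a fuller audit might regress each protected attribute on the complete feature set. But even a screen this simple surfaces the obvious proxies.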
Overcorrecting, though, could lead to reverse discrimination claims: a sudden and disproportionate rise in new female and minority hires may itself trigger concerns about the employer's candidate screening process.
Conclusion
As Upturn's 2018 report states, bias in predictive analytics can appear in many places and take many forms, and can be difficult for employers to detect and correct. With technological innovation outpacing lawmaking and regulation, employers may be left with little guidance as to whether and how to use particular predictive tools. Employers should, for these and many other reasons, exercise caution in implementing predictive hiring tools and promptly discuss any concerns about a particular screening tool with legal counsel.
Christopher Wilkinson is an employment law partner in Orrick's Washington, D.C. office. He previously served as Associate Solicitor for Civil Rights and Labor Management at the U.S. Department of Labor.
David B. Smith is an employment law senior associate in the firm's Washington, D.C. office.
Alex Mitchell is an employment law career associate based in the firm's Global Operations Center.