Questions to Ask When Considering Risks and Warning Signs in Tech-Based Screening Tools
April 03, 2019 at 01:15 PM
Every job opening now results in a slew of applicants. As applications dramatically increase, more and more human resources departments are turning to technology to reduce the burden of reviewing resumes and to predict a candidate's success within their workforce. Often these technological solutions—such as resume screening and personality testing—are presented to in-house counsel as a fait accompli, leading to a scramble to figure out whether the new screening tool passes the legal sniff test. This article aims to tee up the baseline issues that should be resolved before wide-scale implementation of these tools occurs.
A recent study strongly suggests that the rise of predictive analytics can create significant risk for employers. A December 2018 report by Upturn, a D.C.-based nonprofit organization, warns that without active measures to mitigate bias, predictive tools are prone to some form of bias by default. These biases, and the problems they create, can appear at any point in the recruitment and hiring process, including at the advertising, sourcing, screening, interviewing and selection stages. Furthermore, lawmaking and regulatory bodies may lack the authority, resources and expertise to provide meaningful guidance and oversight with respect to employers' use of predictive analytics in recruiting and hiring.
A 2018 Reuters article discussed a recruiting tool that adapted and learned over time to disfavor female candidates, demonstrating that these concerns are not merely abstract or theoretical. As a result, the risks to unwary employers have never been greater. This is especially so in light of a rapidly accelerating trend in state and local pay equity legislation, which raises critical questions about the use of predictive analytics in formulating candidate and applicant offers. Moreover, federal contractors face the added risk that the government will evaluate these tools under the outdated and vague standards of the Uniform Guidelines on Employee Selection Procedures and determine that they are not valid.
Given these potential issues, there are a few warning signs that employers should consider when evaluating the legal risks related to data-driven recruitment and hiring practices.
- Is there a validation study and how good is it?
An initial sign that something is amiss with a predictive analytics tool is a lack of accompanying validation studies. Validation studies evaluate the degree to which a particular tool or algorithm actually measures what it sets out to measure.
Having a validation study in hand, however, does not eliminate the legal risk, because such studies can be limited in what they observe and measure. For example, a validation study may document the existence and strength of a relationship between education and longevity in a position. However, that study may not appropriately convey whether and how the predictive tool controls for other hidden, intervening variables. These hidden variables—which can include gender or other protected characteristics—may in turn lead to biased applicant and candidate selections. A validation study that demonstrates a selection tool's strong predictive power does not, therefore, guarantee that the tool is sufficient from an analytical or legal standpoint.
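To make the hidden-variable concern concrete, the sketch below uses entirely invented numbers: a screening score can "validate" strongly against an outcome such as tenure while still producing sharply different selection rates across two groups.

```python
# Illustrative sketch with invented data: a screening score that correlates
# strongly with tenure (the "validation study" result) can still track group
# membership and produce a lopsided selection outcome.

# Each applicant: (screening_score, years_of_tenure, group)
applicants = [
    (90, 8, "A"), (85, 7, "A"), (80, 7, "A"), (75, 6, "A"),
    (70, 5, "B"), (65, 4, "B"), (60, 4, "B"), (55, 3, "B"),
]

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

scores = [a[0] for a in applicants]
tenure = [a[1] for a in applicants]
validity = pearson(scores, tenure)  # the headline "validity" statistic

# Selection rule: advance anyone scoring 75 or above.
hired = [a for a in applicants if a[0] >= 75]
rate_a = sum(1 for a in hired if a[2] == "A") / 4  # 4 group-A applicants
rate_b = sum(1 for a in hired if a[2] == "B") / 4  # 4 group-B applicants

print(f"validity coefficient: {validity:.2f}")   # strong
print(f"selection rate, group A: {rate_a:.0%}")
print(f"selection rate, group B: {rate_b:.0%}")
```

In this toy data the validity coefficient is near perfect, yet the cutoff advances every group-A applicant and no group-B applicant—exactly the gap a validity statistic alone will not reveal.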
- How much do you know about the variables used in the tool?
Predictive analytics tools vary in how they use and transform data. In an ideal scenario, vendors are transparent about the variables on which a particular tool screens candidates. This includes clear and detailed information on how the tool manipulates data, the sources of that data, the bases on which it predicts applicant or candidate success, and how employers should interpret the results. Predictive tools that employers cannot adjust—to include or eliminate certain variables—may subject employers to liability for inadvertent discrimination resulting from the use of such tools. For this reason, employers should avoid using candidate screening tools whose functionality they do not understand.
- Is there over-reliance on an employer's own inputs?
In contrast to knowing too little about how a tool works, employers also face risk when they feed too much of their own information into it. Some predictive analytics tools rely heavily on information sourced from an employer's existing workforce, such as data on employees' education, experience, performance, demographics and other attributes.
There are risks associated with utilizing employer workforce data as the primary inputs for a predictive analytics tool. The use of an employer's workforce data magnifies the risk that predictive analytics will perpetuate bias that may already exist, however covertly, in the employer's existing hiring process. An employer's measure of what constituted a "successful" employee in the past or present may be a product of past discrimination, or inherently discriminatory. To the extent that the variables associated with whether an employee is successful are not essential functions of the position, employers run the risk of having fewer objective bases to support their decisions. In these ways, candidate screening based upon existing employee data alone may compound, rather than reduce, bias in an employer's hiring practices.
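As a toy illustration of this feedback loop (all records and names below are invented), a rule derived from an employer's past "successful" employees can end up screening on a feature—here, school attended—that stands in for group membership:

```python
# Hypothetical sketch: a screening profile derived from an employer's own
# historical "successful" employees reproduces that workforce's skew.

from collections import Counter

# Past employees rated "successful" -- mostly group A, mostly one school,
# perhaps as a legacy of earlier hiring patterns.
past_successes = [
    {"school": "State U", "group": "A"},
    {"school": "State U", "group": "A"},
    {"school": "State U", "group": "A"},
    {"school": "Tech Inst", "group": "B"},
]

# A naive "learned" rule: favor the school most common among past successes.
favored_school = Counter(e["school"] for e in past_successes).most_common(1)[0][0]

candidates = [
    {"name": "cand1", "school": "State U", "group": "A"},
    {"name": "cand2", "school": "Tech Inst", "group": "B"},
    {"name": "cand3", "school": "Tech Inst", "group": "B"},
]

shortlist = [c["name"] for c in candidates if c["school"] == favored_school]
print(shortlist)  # only the group-A candidate advances
```

Nothing in the rule mentions group membership, yet the shortlist mirrors the historical workforce—the compounding effect described above.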
- At the front end, does the tool have dubious and invasive screening questions?
Competency and personality testing are among the most popular forms of predictive analytics in employee recruiting and hiring. Like many other types of predictive analytics, competency and personality testing aim to incorporate objective metrics to screen candidates. Such testing, however, is a veritable minefield of potential biases and, as a result, employer liability. For example, competency testing may fail to adequately measure competencies that are related to the job at issue. If an employer implements high-level competency testing for a low-skill manufacturing position, it may run the risk of introducing illegitimate factors into the employment decision and disfavoring qualified individuals from less-educated backgrounds. Disparate impact discrimination risks may surface if certain protected classes are overrepresented in the rejected applicant pool.
Personality testing that disfavors women or minority candidates carries similar risks, as responses to hypothetical scenario questions may be correlated with preferences derived from impermissible sex-based biases and stereotypes. Other obvious risk factors include invasive questions about issues ranging from physical disabilities and marital status to political beliefs and feelings about controversial issues.
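One traditional yardstick for the disparate-impact risk described above is the "four-fifths rule" from the Uniform Guidelines on Employee Selection Procedures, under which adverse impact is generally inferred when one group's selection rate falls below 80 percent of the highest group's rate. A minimal sketch, using hypothetical applicant counts:

```python
# Minimal four-fifths (80%) rule check from the Uniform Guidelines:
# compare each group's selection rate to the highest group's rate.
# The counts below are hypothetical.

def adverse_impact(selected: dict, applied: dict) -> dict:
    """Return each group's impact ratio relative to the top selection rate."""
    rates = {g: selected[g] / applied[g] for g in applied}
    top = max(rates.values())
    return {g: rates[g] / top for g in rates}

applied  = {"men": 100, "women": 100}
selected = {"men": 60,  "women": 40}

ratios = adverse_impact(selected, applied)
for group, ratio in ratios.items():
    flag = "potential adverse impact" if ratio < 0.8 else "within guideline"
    print(f"{group}: impact ratio {ratio:.2f} -> {flag}")
```

Here women are selected at two-thirds the rate of men, below the four-fifths threshold—the kind of pattern that invites regulatory scrutiny of a screening tool. The rule is a rough screen, not a safe harbor; statistical significance and job-relatedness analyses still matter.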
- Does removing the human element eliminate selection bias?
Many employers utilize predictive analytics in candidate screening as a means to reduce biased hiring decisions. Unconscious biases by recruiters and hiring managers could lead to judgments about candidates based on protected characteristics. By instead screening candidates on purportedly neutral and identical criteria, employers hope to reduce the possibility that such biases will lead to qualified candidates being overlooked.
The underrepresentation of women and minority employees in an existing workforce, however, may raise concerns that an employer's data-driven screening process contains biases of its own. These biases can result from a variety of circumstances. As described above, a candidate screening tool may adapt to and perpetuate existing biases within an employer's workforce. With resume screening software, certain input variables may screen out qualified candidates based on names, education, and other candidate information. These data points may in turn be proxies for protected characteristics such as sex, race, age, and sexual orientation.
But overcorrecting could lead to claims of reverse discrimination. A sudden and disproportionate rise of new female and minority hires in an employer's workforce may itself trigger concerns about the employer's candidate screening process.
Conclusion
As Upturn's 2018 report states, bias in predictive analytics can appear in many places and take many forms, and can be difficult for employers to detect and correct. With technological innovation outpacing lawmaking and regulation, employers may be left with little guidance as to whether and how to use particular predictive tools. Employers should, for these and many other reasons, exercise caution in implementing predictive hiring tools and promptly discuss any concerns about a particular screening tool with legal counsel.
Christopher Wilkinson is an employment law partner in Orrick's Washington, D.C. office. He previously served as Associate Solicitor for Civil Rights and Labor Management at the U.S. Department of Labor.
David B. Smith is an employment law senior associate in the firm's Washington, D.C. office.
Alex Mitchell is an employment law career associate based in the firm's Global Operations Center.