
Every job opening now draws a slew of applicants. As application volumes climb, more and more human resources departments are turning to technology to reduce the burden of reviewing resumes and to predict a candidate's success within their workforce. Often these technological solutions, such as resume screening and personality testing, are presented to in-house counsel as a fait accompli, leaving counsel to scramble to figure out whether the new screening tool passes the legal sniff test. This article aims to tee up the baseline issues that should be resolved before these tools are implemented on a wide scale.

A recent study underscores the significant risk that the rise of predictive analytics can create for employers. A December 2018 report by Upturn, a D.C.-based nonprofit organization, found that without active measures to mitigate bias, predictive tools are prone to some form of bias by default. These biases, and the problems they create, can appear at any point in the recruitment and hiring process, including at the advertising, sourcing, screening, interviewing and selection stages. Furthermore, lawmaking and regulatory bodies may lack the authority, resources and expertise to provide meaningful guidance and oversight with respect to employers' use of predictive analytics in recruiting and hiring.

These concerns are not merely abstract or theoretical: a 2018 Reuters article described a recruiting tool that adapted and learned over time to disfavor female candidates. As a result, the risks to unwary employers have never been greater. This is especially so in light of a rapidly accelerating trend in state and local pay equity legislation, which raises critical questions about the use of predictive analytics in formulating candidate and applicant offers. Federal contractors face the additional risk that the government will evaluate these tools under the outdated and vague standards of the Uniform Guidelines on Employee Selection Procedures and conclude that they are not valid.

Given these potential issues, there are a few warning signs that employers should consider when evaluating the legal risks related to data-driven recruitment and hiring practices.

  • Is there a validation study, and how good is it?

An initial sign that something is amiss with a predictive analytics tool is a lack of accompanying validation studies. Validation studies evaluate the degree to which a particular tool or algorithm actually measures what it sets out to measure.

Having a validation study in hand, however, does not alleviate the legal risk, because such studies can be limited in what they observe and measure. For example, a validation study may attest to a given selection tool's ability to predict the existence and strength of a relationship between education and longevity in a position. That same study, however, may not convey whether and how the tool controls for other hidden, intervening variables. These hidden variables, which can include gender or other protected characteristics, may in turn lead to biased applicant and candidate selections. A validation study that demonstrates a selection tool's strong predictive power does not, therefore, guarantee that the tool is sufficient from an analytical or legal standpoint.
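To make the confound concrete, consider the following illustrative sketch in Python. The data, variable names and effect sizes are entirely hypothetical and do not reflect any vendor's actual tool; the point is simply that a score can post a strong validity correlation against an outcome like tenure while also tracking a protected characteristic that a narrow validation study never reports.

```python
# Illustrative sketch only: hypothetical data showing how a screening score can
# look "valid" against tenure while also correlating with a protected characteristic.
import numpy as np

rng = np.random.default_rng(0)
n = 5_000

# Hypothetical applicant pool in which gender happens to correlate with years of
# education -- exactly the kind of hidden relationship a study may not examine.
gender = rng.integers(0, 2, size=n)
education_years = 12 + 2 * gender + rng.normal(0, 2, size=n)

# The tool's score and the outcome (tenure) are both driven by education.
score = 0.8 * education_years + rng.normal(0, 1, size=n)
tenure_years = 0.5 * education_years + rng.normal(0, 2, size=n)

# The statistic a validation study typically reports: score vs. outcome.
validity = np.corrcoef(score, tenure_years)[0, 1]

# The statistic it may never report: score vs. protected characteristic.
confound = np.corrcoef(score, gender)[0, 1]

print(f"score vs. tenure (reported validity):   r = {validity:.2f}")
print(f"score vs. gender (unreported confound): r = {confound:.2f}")
```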

  • How much do you know about the variables used in the tool?

Predictive analytics tools vary in how they use and transform data. In an ideal scenario, vendors are transparent about the variables on which a particular tool screens candidates. This includes clear and detailed information on how the tool manipulates data, the sources of that data, the bases on which it predicts applicant or candidate success, and how employers can interpret the results. Predictive tools that employers cannot adjust, whether to include or eliminate certain variables, may expose employers to liability for inadvertent discrimination resulting from the tools' use. For this reason, employers should avoid using candidate screening tools whose functionality they do not understand.

  • Is there over-reliance on an employer's own inputs?

Too little information about how a tool works is one risk; feeding too much of the employer's own information into the tool is another. Some predictive analytics tools rely heavily on information sourced from an employer's existing workforce, such as data on employees' education, experience, performance, demographics and other attributes.

There are risks associated with using employer workforce data as the primary inputs for a predictive analytics tool. Doing so magnifies the risk that predictive analytics will perpetuate bias that may already exist, however covertly, in an employer's current hiring process. An employer's measure of what has constituted a “successful” employee may itself be a product of past discrimination or inherently discriminatory. To the extent that the variables associated with employee success are not essential functions of the position, employers run the risk of having fewer objective bases to support their decisions. In these ways, candidate screening based on existing employee data alone may compound, rather than reduce, bias in an employer's hiring practices.
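The mechanism is straightforward to illustrate. The following sketch uses entirely hypothetical data and a deliberately simple model rather than any vendor's actual product: a classifier trained on historical “high performer” labels that were skewed against one group goes on to score two otherwise identical applicants differently, based solely on a feature that stands in for group membership.

```python
# Illustrative sketch only: a model trained on an employer's own historical
# "success" labels reproduces whatever bias those labels already contain.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 4_000

# Simulated historical workforce: skill is distributed identically across groups,
# but past "high performer" ratings were skewed in favor of group 0.
group = rng.integers(0, 2, size=n)            # stand-in for a protected class
skill = rng.normal(0, 1, size=n)              # same distribution for both groups
club_member = (group == 0).astype(float)      # resume feature that proxies for group
rated_high = (skill + 0.8 * (group == 0) + rng.normal(0, 1, size=n) > 0.5).astype(int)

model = LogisticRegression().fit(np.column_stack([skill, club_member]), rated_high)

# Two new applicants with identical skill, differing only in the proxy feature.
applicants = np.array([[0.0, 1.0],   # proxy aligned with group 0
                       [0.0, 0.0]])  # proxy aligned with group 1
print(model.predict_proba(applicants)[:, 1])  # the group 0 applicant scores higher
```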

  • At the front end, does the tool have dubious and invasive screening questions?

Competency and personality testing are among the most popular forms of predictive analytics in employee recruiting and hiring. Like many other types of predictive analytics, competency and personality testing aim to incorporate objective metrics into candidate screening. Such testing, however, is a veritable minefield of potential bias and, as a result, employer liability. For example, competency testing may fail to adequately measure competencies that are actually related to the job at issue. An employer that applies high-level competency testing to a low-skill manufacturing position risks introducing illegitimate factors into the employment decision and disfavoring qualified candidates with less formal education. Disparate impact discrimination risks may surface if members of certain protected classes are overrepresented in the rejected applicant pool.
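A common first screen for that kind of overrepresentation is the “four-fifths rule” drawn from the Uniform Guidelines: compare each group's selection rate to the rate of the most-selected group and flag any ratio below 0.8. The sketch below applies that arithmetic to hypothetical applicant counts; it is a rough screening heuristic, not a legal conclusion, and a meaningful disparity can exist even where the ratio exceeds 0.8.

```python
# Illustrative sketch of the four-fifths rule screen; applicant counts are hypothetical.
applicant_counts = {
    "group_a": {"applied": 100, "selected": 60},
    "group_b": {"applied": 100, "selected": 30},
}

rates = {g: c["selected"] / c["applied"] for g, c in applicant_counts.items()}
highest_rate = max(rates.values())

for group, rate in rates.items():
    impact_ratio = rate / highest_rate
    flag = "potential adverse impact" if impact_ratio < 0.8 else "within four-fifths"
    print(f"{group}: selection rate {rate:.0%}, impact ratio {impact_ratio:.2f} ({flag})")
```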

Personality testing carries similar risks: responses to hypothetical scenario questions may correlate with preferences derived from impermissible sex-based biases and stereotypes, disfavoring women or minority candidates. Other obvious risk factors include invasive questions about issues ranging from physical disabilities and marital status to political beliefs and feelings about controversial topics.

  • Does removing the human element eliminate selection bias?

Many employers utilize predictive analytics in candidate screening as a means to reduce biased hiring decisions. Unconscious biases by recruiters and hiring managers could lead to judgments about candidates based on protected characteristics. By instead screening candidates on purportedly neutral and identical criteria, employers hope to reduce the possibility that such biases will lead to qualified candidates being overlooked.

The underrepresentation of women and minority employees in an employer's existing workforce, however, may signal that the employer's data-driven screening process contains biases of its own. These biases can arise in a variety of ways. As described above, a candidate screening tool may adapt to and perpetuate existing biases within an employer's workforce. In resume-screening software, certain variables may screen out qualified candidates based on names, education and other candidate information. These data points may in turn serve as proxies for protected characteristics such as sex, race, age and sexual orientation.
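The proxy problem is easy to demonstrate in a few lines of code. In the hypothetical data below, a facially neutral resume feature lines up with a protected characteristic so closely that omitting the protected field from the tool's inputs does little to remove its influence.

```python
# Illustrative sketch only: a facially neutral feature can act as a near-perfect
# proxy for a protected characteristic, even when that characteristic is never
# supplied to the screening tool.
import numpy as np

rng = np.random.default_rng(2)
n = 2_000

protected = rng.integers(0, 2, size=n)
# Hypothetical "neutral" feature (e.g., a name-derived token or affinity-group
# membership) that matches the protected characteristic 90% of the time.
neutral_feature = np.where(rng.random(n) < 0.9, protected, 1 - protected)

agreement = (neutral_feature == protected).mean()
correlation = np.corrcoef(neutral_feature, protected)[0, 1]
print(f"agreement with protected class: {agreement:.0%} (r = {correlation:.2f})")
```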

Overcorrecting, however, could give rise to reverse discrimination claims. A sudden and disproportionate rise in new female and minority hires may itself trigger concerns about the employer's candidate screening process.

Conclusion

As Upturn's 2018 report states, bias in predictive analytics can appear in many places and take many forms, and can be difficult for employers to detect and correct. With technological innovation outpacing lawmaking and regulation, employers may be left with little guidance as to whether and how to use particular predictive tools. Employers should, for these and many other reasons, exercise caution in implementing predictive hiring tools and promptly discuss any concerns about a particular screening tool with legal counsel.

Christopher Wilkinson is an employment law partner in Orrick's Washington, D.C. office. He previously served as Associate Solicitor for Civil Rights and Labor Management at the U.S. Department of Labor.

David B. Smith is an employment law senior associate in the firm's Washington, D.C. office.

Alex Mitchell is an employment law career associate based in the firm's Global Operations Center.