Hiring ex Machina: Preparing for Illinois's Artificial Intelligence Video Interview Act
Given recent guidance and legislation on artificial intelligence, counsel and HR personnel need to appreciate both the underlying technology and the scope of the laws.
September 23, 2019
The term "artificial intelligence" is often used but infrequently understood. There is no one definition for artificial intelligence. Indeed, what constitutes artificial intelligence evolves over time—the so-called "AI effect." Nevertheless, businesses increasingly use forms of artificial intelligence to manage human capital. Given recent guidance and legislation on artificial intelligence, counsel and HR personnel need to appreciate both the underlying technology and the scope of the laws. We all need to become tech-literate lawyers.
How We Work with AI
"Artificial intelligence" typically denotes various machine learning applications and predictive algorithms that can mimic human-like cognitive functions. Often, AI is designed to evolve, learn and change over time as more data points fuel the algorithm. There are myriad forms of machine learning and artificial intelligence (weak, narrow, supervised, unsupervised, deep learning, etc.).
Within the context of human resources (HR), talent acquisition and recruitment, AI is integrated into business HR functions by supporting candidate engagement, selection and retention. AI enables employers to quickly sort, vet and qualify hundreds, if not thousands, of candidates at the click of a mouse. For instance, some programs use natural-language processing and machine learning to construct a psychological profile that predicts whether a person will fit a company's "culture."
Artificial Intelligence and Updating the Law
Governments and regulators—both federal and state—are attempting to shape laws to ensure the playing field is equitable for all. In Illinois, the Artificial Intelligence Video Interview Act (AIV), Illinois HB 2557, was signed into law by Governor J.B. Pritzker last month. The law, which goes into effect in January 2020, requires that Illinois employers using "artificial intelligence analysis" on applicant-submitted videos provide job applicants with "information before the interview explaining how the artificial intelligence works and what characteristics it uses to evaluate applicants" and obtain their consent. With 2020 rapidly approaching, Illinois businesses that use or seek to use AI-enhanced hiring and screening tools need to consider how to comply with the Act.
To Whom Will the Act Apply? The Act is very broad. It applies where an employer: (1) is considering an applicant for a position located in Illinois; (2) asks applicants to record videos; and (3) performs "artificial intelligence analysis" on those videos. The first two criteria seem relatively straightforward. As many other commentators have pointed out, however, "artificial intelligence" is not defined. Many notable, cutting-edge companies offer HR solutions in this space—Montage Talent, AllyO, and Knockri. Without a more precise definition, employers would be wise to comply with the law whenever they perform any data analysis on applicant videos.
What to Consider When Preparing to Provide Notice: While understanding when compliance is required is relatively straightforward, determining how to comply is decidedly more difficult. The Act states only that businesses should provide job applicants with "information explaining how the artificial intelligence works" and identify "what characteristics it uses to evaluate applicants."
On What Characteristics Did the AI Rely? As with the algorithm itself, it may be difficult or impossible to identify the characteristics certain forms of "artificial intelligence" relied upon to reach determinations. Unsupervised learning techniques, for example, find patterns that humans may not see. Unsupervised learning, moreover, does not rely on pre-existing labels to group data. Such techniques, thus, may not yield interpretable characteristics to disclose to job applicants.
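A minimal sketch illustrates the point. The cluster assignments below are produced by a simple unsupervised routine (a bare-bones k-means, written for illustration; the feature values and setup are entirely hypothetical, not any vendor's actual method) and are just opaque group IDs—nothing in the output maps to a named, disclosable "characteristic":

```python
import numpy as np

# Hypothetical illustration: unsupervised clustering groups candidate
# feature vectors with no pre-existing labels. The resulting cluster
# IDs (0, 1, ...) carry no human-readable meaning.
rng = np.random.default_rng(0)

# Synthetic candidate feature vectors (e.g., values derived from video analysis).
points = np.vstack([
    rng.normal(loc=[0.2, 0.3], scale=0.05, size=(10, 2)),
    rng.normal(loc=[0.8, 0.7], scale=0.05, size=(10, 2)),
])

# Minimal k-means with k=2: alternate assignment and centroid update.
centroids = points[[0, -1]].copy()
for _ in range(10):
    # Assign each point to its nearest centroid.
    dists = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
    labels = dists.argmin(axis=1)
    # Recompute each centroid as the mean of its assigned points.
    centroids = np.array([points[labels == k].mean(axis=0) for k in range(2)])

print("cluster assignments:", labels)  # opaque group IDs, not named traits
```

An employer asked "what characteristics does the tool use?" can truthfully say only that candidates were sorted into group 0 or group 1—which is unlikely to satisfy a disclosure obligation.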
Developers of deep learning neural nets, likewise, may not know the precise characteristics on which their algorithms relied. This has led to a push for forms of "explainable AI" and the development of tools such as Local Interpretable Model-Agnostic Explanations (LIME), which may help identify the data points on which an algorithm chiefly relied. How do you know the factors an algorithm used to reach a decision? We use another algorithm. Others have also advocated using simpler forms of AI, such as linear regressions, which are easier to explain.
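The interpretability advantage of a linear model can be sketched in a few lines. In the toy example below (the feature names and data are invented for illustration only, not drawn from any real screening product), the fitted coefficients map one-to-one onto named inputs—exactly the kind of "characteristics" an employer could describe in a disclosure:

```python
import numpy as np

# Hypothetical, simplified illustration: a linear model scoring
# candidates on three made-up features. With a linear model, the
# learned weights ARE the disclosable "characteristics."
features = ["speech_rate", "eye_contact", "keyword_match"]

# Tiny synthetic dataset: rows are candidates, columns are the features above.
X = np.array([
    [0.8, 0.6, 0.9],
    [0.4, 0.9, 0.5],
    [0.7, 0.7, 0.8],
    [0.3, 0.5, 0.4],
])
y = np.array([0.9, 0.6, 0.8, 0.3])  # synthetic "fit" scores

# Ordinary least squares: solve for the weight placed on each feature.
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# Each weight attaches to a named characteristic.
for name, w in zip(features, coef):
    print(f"{name}: weight {w:+.2f}")
```

A deep neural network trained on the same task would distribute its "reasoning" across thousands of unnamed parameters, leaving nothing so directly quotable in a notice to applicants.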
Consequences Under the Illinois Artificial Intelligence Video Act
The current language of the AIV is silent on a number of enforcement questions, including: (1) whether job candidates have a private right of action for violations of the Act; (2) what remedies, if any, are available; and (3) what employers are covered under the Act. However, the disclosure and notice provisions of the Act resemble those required under FCRA. Thus, while it remains to be seen whether the AIV will be read to contain a private right of action, a review of FCRA-related decisions may be instructive in predicting how courts could interpret the law.
FCRA permits a private right of action to be brought within two years of the alleged unlawful background check. Under FCRA, a prevailing plaintiff may be awarded actual damages and attorneys' fees and costs for negligent violations; for willful violations, successful plaintiffs are entitled to damages of up to $1,000 per violation, as well as punitive damages, attorneys' fees and costs. Given FCRA's highly specific technical requirements, liability exposure for each individual violation, plus the multiplying effect of FCRA penalties, employers who violate FCRA often face class action litigation. For instance, in Shonfield v. Delta Air Lines, Inc., the United States District Court for the Northern District of California issued an order granting the representative plaintiff's motion for preliminary approval of a class settlement totaling $2.3 million for a class of approximately 44,100 members who alleged that Delta had given them inadequate FCRA disclosure documentation when they applied for employment and consented to background checks.
In light of the similarities between FCRA and the AIV, employers need to appreciate that the plaintiffs' bar may seize on violations of the AIV just as they have pursued FCRA violations. Groups of job applicants who submitted videos but were ultimately not selected could form a relatively discrete class. Technical compliance failures with the disclosure and notice requirements under the AIV could lead to multimillion dollar settlements. Courts challenged with interpreting the AIV, moreover, may look to FCRA penalties absent other direction.
Other laws and regulatory schemes, moreover, may guide us as to what constitutes a violation of the AIV in the first place. The precise scope and detail required to comply with the AIV is unclear. Nevertheless, other laws and regulations governing the use of AI, such as General Data Protection Regulation (GDPR) Article 22, indicate that a basic, superficial explanation may be insufficient. Academics citing industry and regulatory guidance concerning GDPR Article 22, for instance, suggest that a detailed, comprehensive explanation will be needed, including the significance and envisaged consequences of the data processing, the categories of data used in the decision-making process, the source of that data, how the data profiles are built, and how the data is used.
A Path Towards Compliance
Businesses can help to ensure compliance by taking a few precautions. First, the companies that provide "AI"-enhanced recruiting tools are the best source for information about the underlying technology and the characteristics upon which their algorithms rely. While most of these companies will likely be willing to help their customers, ultimately it is the business's duty to comply, not the vendor's. As a result, when purchasing or licensing an AI recruiting tool, businesses should ensure they understand where the third party sources its data. This is critical in ensuring that bias does not become embedded in the data. Businesses should also ensure that their agreements contain a requirement that the vendor cooperate with reasonable requests necessary to comply with the Act.
Second, if a business has its own data science team, it should seek to include those personnel both in its communications with the vendor and with internal or outside counsel. These professionals may be able to help distill the complicated technologies that the businesses will need to understand and explain.
Third, and perhaps most critically, businesses must hire or retain tech-literate counsel. Counsel who cannot appreciate the differences between unsupervised and supervised learning, or between a linear regression and a deep learning neural network, will be hard pressed to draft an accurate, complete consent disclosure.
By following these steps, businesses can minimize the risk of non-compliance with a vague yet important law.
Justin Steffen is a Litigation Partner at Ice Miller where he helps FinTech and traditional clients overcome the obstacles to innovation. In his spare time, he teaches FinTech and the Law at both Loyola (Chicago) Law School and Northwestern Law School and co-chairs the Chicago Bar Association's Financial and Emerging Technology Committee and the FinTEx Regulatory Committee.
Heather Adams is of counsel in Ice Miller's Labor and Employment Group. She is a business-oriented labor and employment attorney who regularly advises and represents clients in high-stakes whistleblower actions and various other areas of employment law.