Right Place, Wrong Target? Targeted Ads and Tech-Driven Discrimination
Aside from obvious privacy concerns, the value of personal information also presents a concern less often discussed: discrimination.
September 26, 2019 at 12:41 PM
Welcome to 2019—a time when there seems to be a major corporate "hack" every month and consumer data is the new hot commodity. No one is safe! Whether it is the FaceApp controversy or the Cambridge Analytica scandal, consumer data is becoming increasingly valuable and vulnerable. Aside from obvious privacy concerns, the value of personal information also presents a concern less often discussed: discrimination.
Employers who are paying attention know that it is illegal to discriminate based on an individual's race, gender, religion, national origin, disability or age. For example, it is illegal to post a job opportunity for an individual who is "40 years or younger" or to openly seek out or exclude someone of a particular race or gender. In 2014, Ventura Corp. paid $354,250 to settle a lawsuit filed by the Equal Employment Opportunity Commission (EEOC) claiming the company discriminated against men because it had a pattern or practice of refusing to hire qualified male applicants. More recently, the EEOC sued the University of Kansas Medical Center for violation of the Age Discrimination in Employment Act (ADEA) based on allegations from a former employee that the medical center's department head instructed staff to give hiring preference to millennials over older applicants. Just this month, with the help of the American Civil Liberties Union (ACLU), a groundbreaking settlement was reached between the federal agency that operates AmeriCorps and the plaintiffs in a class action alleging that AmeriCorps' screening process violated the Rehabilitation Act. That process included a detailed medical questionnaire and restrictive prior health screening guidelines that ultimately resulted in discrimination against individuals with certain disabilities.
But what about companies that use targeted advertisements for job postings that only people of a certain age, race or gender will see? Is this practice illegal discrimination, even if it is inadvertent or unintentional? Due to the increasing capabilities of artificial intelligence (AI), a whole new world of litigation related to technology-driven discrimination is emerging. In fact, U.S. Sens. Kamala Harris, Elizabeth Warren and Patty Murray wrote a letter to the acting chair and commissioners of the EEOC on Sept. 17, 2018, stating their concern that facial analysis technologies used in the hiring process may perpetuate gender, racial, age and other biases in the workplace. The letter cited an example in which facial analysis technologies incorrectly identified 28 members of Congress as people who had been arrested, and disproportionately identified African American and Latino members of Congress in doing so. The senators ultimately requested that the EEOC put in place guidelines explaining to employers how facial analysis algorithmic techniques may result in bias and violate anti-discrimination laws.
Whether unintentional or inadvertent discrimination against job applicants is actionable is open for debate. For example, the U.S. Court of Appeals for the Seventh Circuit recently held in Kleber v. CareFusion that protections under the ADEA apply only to current employees. However, in April, the U.S. District Court for the Northern District of California granted a motion to proceed as a collective action against PricewaterhouseCoopers under the ADEA for allegedly using a recruitment tool for hiring that could only be accessed by those on college campuses or using school-affiliated job sites—allegedly resulting in discrimination against entry-level applicants over the age of 40. Specifically, the collective action is open to all individuals who, from Oct. 18, 2013, forward, applied for a covered position in PwC's tax or assurance lines of service, met the minimum qualifications for the position to which they applied, were age 40 or older, and were not hired.
In 2017, ProPublica, along with the New York Times, published an article outing companies like Verizon, Amazon, Goldman Sachs, Target and Facebook for placing ads on social media platforms that target particular demographics, a method called "microtargeting." The New York Times also published an article in March detailing Facebook's recent efforts to prohibit companies from advertising housing, jobs or financial services to certain people based on certain demographics or characteristics. This move came after at least five lawsuits were filed against Facebook claiming it allows companies to target advertisements for jobs, home sales, and credit offers to certain people and not others according to gender, age and zip code.
By way of example, on Dec. 20, 2017, a lawsuit was filed in the U.S. District Court for the Northern District of California by the Communications Workers of America and others "seeking to vindicate the rights of older workers to be free of age discrimination in employment advertising, recruiting and hiring." The suit was filed against T-Mobile US, Amazon.com, Cox Communications and Cox Media Group for "targeting their employment ads to younger workers via Facebook's ads platform." The complaint detailed the process companies go through after purchasing a Facebook ad—a process the complaint alleges is a "conduit for age discrimination" because Facebook requires employers or employment agencies to select the population of Facebook users who will receive the ad, including the age range or "ethnic affinities" of those users, which the plaintiffs argue inherently leads to age- and race-based discrimination in hiring.
While not as blatant as other forms of discrimination, targeted ads can be just as discriminatory and can further exacerbate discrimination in both housing and hiring. Targeted ads thus present a new area of potential liability for employers seeking new talent in the wrong way—an area only expected to grow as AI becomes more commonplace. Even if a company is not intentionally discriminating in its hiring, or uses a third-party platform like Facebook to post ads, this does not necessarily excuse it from liability. Under Title VII and the ADEA, an employer can be found liable for employment discrimination under the disparate impact theory of liability, where intent is not required and even a facially neutral hiring practice that merely results in discrimination can be problematic.
So what can employers do to ensure they are not part of the problem and to limit their exposure to liability? Whether a company hires internally, outsources its hiring to third-party companies, or uses automated hiring practices, there are steps it can take to make sure its techniques do not inadvertently exclude certain job seekers. Companies should implement training for hiring managers, supervisors and the employees working under them so that they understand the federal, state and local anti-discrimination laws that apply where the company operates. Training on recognizing and preventing implicit bias should also be implemented. In addition, companies should ensure job opportunities are posted in various places where they are accessible to all, regardless of age, gender or zip code. Job postings should clearly articulate the requirements and duties of the job and the skills and qualifications applicants should possess, and should use gender-neutral terms. According to a New York Times article, phrases like "top-tier" and "aggressive" tend to attract more male candidates, whereas terms and phrases like "partnerships" and "passion for learning" attract more women applicants. Employers should take a close look not only at the individuals being hired, but also at those actually applying and being brought in for interviews, to make sure the candidate pool is diverse. If it is not, changes must be made.
When it comes to interviewing candidates, conduct structured interviews in which the same predetermined questions are asked of all candidates. Consider the company's values and ask questions that elicit responses that will help determine whether a candidate shares similar values and would be a good fit. If the resources are available, have more than one person interview each candidate. If your company uses an algorithm or some other technology-driven tool to find potential new hires, take the time to understand the tool and how it works to make sure it is not perpetuating human biases or narrowing the pool of candidates in unlawful ways, and that job opportunities are visible to a diverse audience rather than certain targeted demographics. While the combination of valuable consumer data and AI can pose problems for employers, companies that are conscientious and proactive can take steps today to be part of the solution.
Fara A. Cohen is an associate at Griesing Law, where she focuses her practice on corporate transactions, commercial litigation and employment law. She can be reached at 215-501-7849 or [email protected].