Right Place, Wrong Target? Targeted Ads and Tech-Driven Discrimination
Aside from obvious privacy concerns, the value of personal information also presents a concern less often discussed: discrimination.
September 26, 2019 at 12:41 PM
Welcome to 2019, a time when there seems to be a major corporate "hack" every month and consumer data is the new hot commodity. No one is safe! Whether it is the FaceApp controversy or the Cambridge Analytica scandal, consumer data is becoming increasingly valuable and vulnerable. Aside from obvious privacy concerns, the value of personal information also presents a concern less often discussed: discrimination.
Employers who are paying attention know that it is illegal to discriminate based on an individual's race, gender, religion, national origin, disability or age. For example, it is illegal to post a job opportunity seeking an individual who is "40 years or younger," or to openly seek out or exclude someone of a particular race or gender. In 2014, Ventura Corp. paid $354,250 to settle a lawsuit filed by the Equal Employment Opportunity Commission (EEOC) claiming the company discriminated against men through a pattern or practice of refusing to hire qualified male applicants. More recently, the EEOC sued the University of Kansas Medical Center for violating the Age Discrimination in Employment Act (ADEA), based on a former employee's allegations that the medical center's department head instructed staff to give hiring preference to millennials over older applicants. And just this month, with the help of the American Civil Liberties Union (ACLU), a groundbreaking settlement was reached between the federal agency that operates AmeriCorps and the plaintiffs in a class action alleging that AmeriCorps' screening process violated the Rehabilitation Act because it included a detailed medical questionnaire and restrictive prior health screening guidelines that ultimately resulted in discrimination against individuals with certain disabilities.
But what about companies that use targeted advertisements for job postings, where only people of a certain age, race or gender will see them? Is this practice illegal discrimination, even if it is inadvertent or unintentional? Due to the increasing capabilities of artificial intelligence (AI), a whole new world of litigation related to technology-driven discrimination is emerging. In fact, U.S. Sens. Kamala Harris, Elizabeth Warren and Patty Murray wrote a letter to the acting chair and commissioners of the EEOC on Sept. 17, 2018, stating their concern that facial analysis technologies used in the hiring process may perpetuate gender, racial, age and other biases in the workplace. The letter cited an example in which facial analysis technologies incorrectly identified 28 members of Congress as people who had been arrested, and disproportionately misidentified African American and Latino members of Congress in doing so. The senators ultimately requested that the EEOC issue guidelines explaining to employers how facial analysis algorithmic techniques may result in bias and violate anti-discrimination laws.
Whether unintentional or inadvertent discrimination against job applicants is actionable is open for debate. For example, the U.S. Court of Appeals for the Seventh Circuit recently held in Kleber v. CareFusion that the ADEA's disparate-impact protections extend only to current employees, not to outside job applicants. However, in April, the U.S. District Court for the Northern District of California granted a motion for collective action against PricewaterhouseCoopers for discrimination under the ADEA, based on its alleged use of a recruitment tool that could only be accessed by those on college campuses or using school-affiliated job sites, allegedly resulting in discrimination against entry-level applicants over the age of 40. Specifically, the collective action is open to all individuals who, from Oct. 18, 2013, forward, applied for a covered position in PwC's tax or assurance lines of service, met the minimum qualifications for the position to which they applied, were age 40 or older, and were not hired.
In 2017, ProPublica, along with the New York Times, published an article outing companies like Verizon, Amazon, Goldman Sachs, Target and Facebook for placing ads on social media platforms that target particular demographics, a method called "microtargeting." The New York Times also published an article in March detailing Facebook's recent efforts to prohibit companies from advertising housing, jobs or financial services to people based on certain demographics or characteristics. This move came after at least five lawsuits were filed against Facebook claiming that it allows companies to target advertisements for jobs, home sales and credit offers to certain people and not others according to gender, age and zip code.
By way of example, on Dec. 20, 2017, a lawsuit was filed in the U.S. District Court for the Northern District of California by the Communications Workers of America and others "seeking to vindicate the rights of older workers to be free of age discrimination in employment advertising, recruiting and hiring." The suit was filed against T-Mobile US, Amazon.com, Cox Communications and Cox Media Group for "targeting their employment ads to younger workers via Facebook's ads platform." The complaint detailed the process companies go through after purchasing a Facebook ad, a process the complaint alleges is a "conduit for age discrimination" because Facebook requires employers or employment agencies to select the population of Facebook users who will receive the ad, including the age range or "ethnic affinities" of those users, which the plaintiffs argue inherently leads to age- and race-based discrimination in hiring.
While not as blatant as other forms of discrimination, targeted ads can be just as discriminatory and can further exacerbate discrimination in both housing and hiring. Targeted ads thus present a new area of potential liability for employers seeking new talent in the wrong way, an area only expected to grow as AI becomes more commonplace. Even if a company is not intentionally discriminating in its hiring, or uses a third-party platform like Facebook to post its ads, that does not necessarily excuse it from liability. Under Title VII and the ADEA, an employer can be found liable for employment discrimination under the disparate-impact theory of liability, where intent is not required and even a facially neutral hiring practice that merely results in discrimination can be problematic.
So what can employers do to ensure they are not part of the problem and to limit their exposure to liability? Whether a company hires internally, outsources its hiring to third-party companies, or uses automated hiring practices, there are steps it can take to make sure the techniques used do not inadvertently exclude certain job seekers. Companies should implement trainings for hiring managers, supervisors and the employees working under them so that they understand the federal, state and local anti-discrimination laws that apply where the company operates. Training on recognizing and preventing implicit bias should also be provided. In addition, companies should ensure job opportunities are posted in a variety of places where they are accessible to all, regardless of age, gender or zip code. Job postings should clearly articulate the requirements and duties of the job and the skills and qualifications applicants should possess, and should use gender-neutral terms. According to a New York Times article, phrases like "top-tier" and "aggressive" tend to attract more male candidates, whereas terms and phrases like "partnerships" and "passion for learning" attract more women applicants. Employers should take a close look not only at the individuals being hired, but also at those actually applying and being brought in for interviews, to make sure the candidate pool is diverse. If it is not, changes must be made.
When it comes to interviewing candidates, use structured interviews in which the same predetermined questions are asked of all candidates. Consider the company's values and ask questions that elicit responses that will help determine whether a candidate shares those values and would be a good fit. If the resources are available, have more than one person interview each candidate. If your company uses an algorithm or some other technology-driven tool to find potential new hires, take the time to understand the tool and how it works, to make sure it is not perpetuating human biases or narrowing the pool of candidates in unlawful ways, and that job opportunities are visible to a diverse audience rather than certain targeted demographics. While the combination of valuable consumer data and AI can pose problems for employers, companies that are conscientious and proactive can take steps today to be part of the solution.
Fara A. Cohen is an associate at Griesing Law, where she focuses her practice on corporate transactions, commercial litigation and employment law. She can be reached at 215-501-7849 or [email protected].