Welcome to 2019: a time when there seems to be a major corporate "hack" every month and consumer data is the new hot commodity. No one is safe! Whether it is the FaceApp privacy scare or the Cambridge Analytica scandal, consumer data is becoming increasingly valuable and vulnerable. Aside from the obvious privacy concerns, the value of personal information also presents a concern that is less often discussed: discrimination.

Employers who are paying attention know that it is illegal to discriminate based on an individual's race, gender, religion, national origin, disability or age. For example, it is illegal to post a job opening seeking applicants who are "40 years or younger," or to openly seek out or exclude someone of a particular race or gender.

Enforcement actions bear this out. In 2014, Ventura Corp. paid $354,250 to settle a lawsuit filed by the Equal Employment Opportunity Commission (EEOC) alleging that the company discriminated against men through a pattern or practice of refusing to hire qualified male applicants. More recently, the EEOC sued the University of Kansas Medical Center for violating the Age Discrimination in Employment Act (ADEA), based on a former employee's allegations that a department head instructed staff to give hiring preference to millennials over older applicants. And just this month, with the help of the American Civil Liberties Union (ACLU), plaintiffs in a class action against the federal agency that operates AmeriCorps reached a groundbreaking settlement. The suit alleged that AmeriCorps' screening process violated the Rehabilitation Act because its detailed medical questionnaire and restrictive prior-health screening guidelines ultimately resulted in discrimination against individuals with certain disabilities.

But what about companies that use targeted advertisements for job postings so that only people of a certain age, race or gender will see them? Is this practice illegal discrimination, even if it is inadvertent or unintentional? As the capabilities of artificial intelligence (AI) grow, a whole new world of litigation over technology-driven discrimination is emerging. In fact, U.S. Sens. Kamala Harris, Elizabeth Warren, and Patty Murray wrote a letter to the acting chair and commissioners of the EEOC on Sept. 17, 2018, expressing concern that facial analysis technologies used in the hiring process may perpetuate gender, racial, age and other biases in the workplace. The letter cited an example in which facial analysis technology incorrectly identified 28 members of Congress as people who had been arrested, disproportionately misidentifying African American and Latino members in doing so. The senators ultimately requested that the EEOC issue guidelines explaining to employers how facial analysis algorithms may result in bias and violate anti-discrimination laws.