Facebook headquarters in Menlo Park, California. Photo: David Paul Morris/Bloomberg

A class action alleges Facebook Inc. fails to provide a safe workplace for content moderators hired by a Boca Raton-based contractor, claiming they were traumatized while weeding out graphic content depicting child abuse, suicide and other violent acts.

Filed in California's San Mateo Superior Court on Friday, the lawsuit seeks to represent thousands of California residents who deleted Facebook content over the past three years and allegedly suffered psychological trauma by viewing thousands of videos and images of “child abuse, rape, torture, bestiality, beheadings, suicide, murder and other forms of extreme violence.”

The defendants are Facebook and Pro Unlimited Inc., the Boca Raton company that employed the content moderators. Plaintiffs seek injunctive relief and a medical monitoring fund to help pay for “baseline tests and diagnostic examinations” to help diagnose post-traumatic stress disorder and other health effects.

“It is well-documented that repeated exposure to such images can have a profoundly negative effect on the viewer,” Korey Nelson, a plaintiffs attorney with Burns Charest, said in a statement. “Facebook is ignoring its duty to provide a safe workplace and instead creating a revolving door of contractors who are irreparably traumatized by what they witnessed on the job.”

Burns Charest filed the suit along with the Joseph Saveri Law Firm in San Francisco.

Facebook did not respond to a request for comment by deadline.

Selena Scola, the plaintiff in the lawsuit, claimed she worked as a public content contractor at Facebook in Menlo Park and Mountain View from June 2017 to March 2018. She is one of roughly 7,500 content moderators worldwide who screen content posted to Facebook's site.

“From her cubicle in Facebook's Silicon Valley offices, Ms. Scola witnessed thousands of acts of extreme and graphic violence,” the complaint said. “As a result of constant and unmitigated exposure to highly toxic and extremely disturbing images at the workplace, Ms. Scola developed and suffers from significant psychological trauma and post-traumatic stress disorder.”

Her complaint said Facebook receives more than 1 million user reports of objectionable content every day. Many reports are filed over images on Facebook Live, which has allowed people to livestream “murder, beheadings, torture, and even their own suicides,” the complaint said.

But Facebook has failed to follow industry safety standards, some of which it helped draft, such as altering the image so it's blurred, in black and white, smaller or without audio. Other internet service providers have mandatory psychological counseling for content moderators and psychological assessments for job applicants.

“Facebook does not provide its content moderators with sufficient training or implement the safety standards it helped develop,” the complaint said. “Moderators review thousands of trauma-inducing images each day, with little training on how to handle the resulting distress.”

The complaint brings negligence and consumer fraud claims under California law.

Regarding the proposed medical monitoring fund, Steven Williams of the Joseph Saveri Law Firm in San Francisco said in a statement, “Facebook needs to mitigate the harm to content moderators today and also take care of the people that have already been traumatized.”

In a footnote, Scola said she planned to amend her complaint with “additional and known allegations” of her experience that were left out of the lawsuit “out of an abundance of caution” due to a nondisclosure agreement she signed with Facebook.

The complaint references a 2017 article in The Guardian, in which a former Facebook content moderator said: “You'd go to work at 9 a.m. every morning, turn on your computer and watch someone have their head cut off. Every day, every minute, that's what you see. Heads being cut off.”

In that article, a company spokeswoman said Facebook provides training and assessment of moderators after two weeks.

“We recognize that this work can often be difficult,” she told The Guardian. “That is why every person reviewing Facebook content is offered psychological support and wellness resources. We have worked with psychologists to put a program in place that is specifically designed to support people in these roles. The program includes counseling, resiliency training and support programs.”

In 2016, two former workers at Microsoft who flagged child pornography, killings, bestiality and other videos at the software company sued it for failing to provide adequate mental health support.