Photo credit: Spencer E Holtaway/Flickr

A class action alleges that Facebook Inc. fails to provide a safe workplace for content moderators who suffer psychological trauma from their jobs weeding out graphic content depicting child abuse, suicide and other violent events.

Filed in California's San Mateo Superior Court this past Friday, the lawsuit seeks to represent thousands of California residents who worked as content moderators for Facebook over the past three years and allegedly suffered from trauma after viewing thousands of videos and images of “child abuse, rape, torture, bestiality, beheadings, suicide, murder and other forms of extreme violence.”

Also named in the suit is Pro Unlimited Inc., the Boca Raton, Florida, company that employed the content moderators bringing suit. Plaintiffs seek injunctive relief and a medical monitoring fund to help pay for “baseline tests and diagnostic examinations” to help diagnose post-traumatic stress disorder and other health effects.

“It is well-documented that repeated exposure to such images can have a profoundly negative effect on the viewer,” said Korey Nelson, plaintiffs' attorney from Burns Charest, in a statement. “Facebook is ignoring its duty to provide a safe workplace and instead creating a revolving door of contractors who are irreparably traumatized by what they witnessed on the job.” Burns Charest filed the suit along with the Joseph Saveri Law Firm in San Francisco.

Facebook spokesman Bertie Thomson said in an emailed statement: “We are currently reviewing this claim. We recognize that this work can often be difficult. That is why we take the support of our content moderators incredibly seriously, starting with their training, the benefits they receive, and ensuring that every person reviewing Facebook content is offered psychological support and wellness resources. Facebook employees receive these in house and we also require companies that we partner with for content review to provide resources and psychological support, including onsite counseling, available at the location where the plaintiff worked, and other wellness resources like relaxation areas at many of our larger facilities.”

Selena Scola, the plaintiff in the lawsuit, claimed she worked as a public content contractor at Facebook in Menlo Park and Mountain View from June 2017 to March 2018. She is one of 7,500 content moderators worldwide who filter content on Facebook's site.

“From her cubicle in Facebook's Silicon Valley offices, Ms. Scola witnessed thousands of acts of extreme and graphic violence,” the complaint said. “As a result of constant and unmitigated exposure to highly toxic and extremely disturbing images at the workplace, Ms. Scola developed and suffers from significant psychological trauma and post-traumatic stress disorder ('PTSD').”

Her complaint said Facebook receives more than 1 million user reports of objectionable content every day. Many reports are filed over images on Facebook Live, which has allowed people to livestream “murder, beheadings, torture, and even their own suicides,” the complaint said.

But the complaint alleges Facebook has failed to follow industry safety standards, some of which it helped draft, such as altering disturbing images so they are blurred, rendered in black and white, reduced in size, or stripped of audio. Other internet service providers also require mandatory psychological counseling for content moderators and psychological assessments for job applicants.

“Facebook does not provide its content moderators with sufficient training or implement the safety standards it helped develop,” the complaint said. “Facebook content moderators review thousands of trauma-inducing images each day, with little training on how to handle the resulting distress.”

The complaint brings negligence and consumer fraud claims under California law.

Regarding the medical monitoring fund, Steven Williams, of the Joseph Saveri Law Firm in San Francisco, said in a statement: “Facebook needs to mitigate the harm to content moderators today and also take care of the people that have already been traumatized.”

In a footnote, Scola said she planned to amend her complaint with “additional and known allegations” of her experience that were left out of the lawsuit “out of an abundance of caution” due to a nondisclosure agreement she signed with Facebook.

The complaint references a 2017 article in The Guardian, in which one former Facebook content moderator said: “You'd go to work at 9 a.m. every morning, turn on your computer and watch someone have their head cut off. Every day, every minute, that's what you see. Heads being cut off.”

In 2016, two former Microsoft workers who flagged child pornography, killings, bestiality and other videos sued the software company for failing to provide them adequate mental health support.