Years ago, a young mom I knew—a lawyer—took a job with the Justice Department prosecuting child sexual exploitation cases. She knew it was important work, but it meant viewing the evidence.

It fundamentally changed her. "Every time someone smiles at my baby, I now immediately think 'PERVERT,'" she told me. 

I don't think there's any way to do a job like that without it exacting a deep toll (and paradoxically, the people who wouldn't mind seeing such images are not the ones you'd want in these positions).

At least my friend was empowered to go after the slimeballs personally—to prosecute them and put them in prison. That was some consolation.  

But the small army of people who work for Facebook or its vendors as content moderators play a far more passive role. They're tasked with removing posts that violate the social media giant's terms of use—photos, videos and livestreams of "child sexual abuse, rape, torture, bestiality, beheadings, suicide, and murder … constant and unmitigated exposure to highly toxic and extremely disturbing images," according to a putative class action against Facebook by its traumatized moderators.

They alleged that the company ignored workplace safety standards for content moderators, instead forcing them "to work in dangerous conditions that cause debilitating physical and psychological harm," in violation of California law.

Facebook has now agreed to pay $52 million to settle the suit, which was filed in San Mateo County Superior Court in California in September 2018. As first reported by The Verge, the deal will give 11,250 current and former content moderators a minimum of $1,000 each.

If class members are diagnosed with post-traumatic stress disorder or related conditions, they can also receive money for their treatment costs, as well as additional damages totaling up to $50,000.

The deal is subject to approval by Judge V. Raymond Swope. 

In a press release on Tuesday, plaintiffs' counsel Daniel Charest of Burns Charest in Dallas called the settlement "a great result for the class members. This groundbreaking litigation fixed a major workplace problem involving developing technology and its impact on real workers who suffered in order to make Facebook safer for its users."

The plaintiffs are also represented by the Joseph Saveri Law Firm in San Francisco and the Law Office of William Most in New Orleans.

Facebook is represented by a Covington & Burling team that includes partners Ashley Simonsen, Emily Johnson Henn and Megan Rodgers and associate Kathryn Cahoy. A Covington spokesman referred a request for comment to Facebook, which did not respond. 

In the settlement agreement, Facebook denies any wrongdoing or liability. It says it agreed to settle the case to "avoid further expense, inconvenience, and the distraction of burdensome and protracted litigation and thereby to put to rest this controversy and avoid the risks inherent in complex litigation."

The complaint lays out an utterly soul-crushing description of the work performed by content moderators. 

It's an enormous task—every day, Facebook gets more than one million user reports of potentially objectionable content on its social media sites and applications. "Facebook Live"—a feature that allows users to broadcast live video streams on their Facebook pages—sounds particularly problematic.

Among the examples cited in the complaint: "[A] father killed his 11-month-old daughter and livestreamed it before hanging himself. Six days later, Naika Venant, a 14-year-old who lived in a foster home, tied a scarf to a shower's glass doorframe and hung herself."

The complaint also quotes one moderator who described his job to the Guardian newspaper like this: "You'd go into work at 9am every morning, turn on your computer and watch someone have their head cut off. Every day, every minute, that's what you see. Heads being cut off."

According to the complaint, Facebook was part of a coalition of internet companies that crafted industry standards for minimizing harm to content moderators. However, the plaintiffs alleged that Facebook "failed to implement the very standards it helped create."

Other internet companies, for example, blunt the impact on content moderators by blurring or distorting images, rendering them in black and white or only showing them in thumbnail size. Audio is removed from videos. Moderators are also provided with mandatory psychological counseling.

"Facebook does not take these steps," the complaint states.

In addition to the monetary payments, Facebook agreed in the settlement to conduct resiliency pre-screening and assessments as part of its recruitment and hiring processes, improve its review tools, allow moderators to opt out of viewing specific types of content and make counseling by trained, licensed professionals available.