Section 230 of the Communications Decency Act, enacted in 1996, is the legal foundation of the modern internet. Its subsection (c)(1) states, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” By treating the provider or re-poster as merely the conduit for third-party content instead of a responsible publisher, subsection (c)(1) has protected search engines, bulletin boards, buyer-seller exchanges and social media against civil liability for what third-party creators post, turning the internet into the cheapest and most accessible form of publishing ever known. Along with the hundred blooming flowers that the drafters of Section 230 hoped for, it has generated a bumper crop of noxious weeds. Among them are the “challenges” posted on social media, which dare young people to create videos of themselves performing some kind of risky act. Some of the challenges are merely disgusting. Some are potentially fatal. One has led to the Third Circuit’s recent decision in Anderson v. TikTok that limited the scope of Section 230 immunity.

The so-called Blackout Challenge posted on TikTok dares people to self-asphyxiate until they lose consciousness. Ten-year-old Nylah Anderson took the Blackout Challenge. Tragically, she strangled herself to death. Nylah hadn’t found the Blackout Challenge by chance or by browsing at random. TikTok brought it to her attention through its “For You Page” algorithm, which suggests new content that it predicts each user will like. According to the Third Circuit, the algorithm makes recommendations based on “a variety of factors, including the user’s age and other demographics, online interactions, and other metadata.”

Nylah’s mother sued TikTok for wrongful death, alleging that TikTok was negligent by posting the Blackout Challenge at all, by allowing minors to post videos of their participation, and by recommending and promoting the Blackout Challenge through its algorithm. The district court dismissed the complaint on the ground that Section 230 immunized TikTok from any liability as publisher. The Court of Appeals reversed in part, vacated in part, and remanded for further proceedings.

Earlier this year, the Supreme Court held in Moody v. NetChoice that an internet platform’s decision whether and how far to moderate content, like the editorial judgment of a traditional publication, was the platform’s own expressive activity and thus protected by the First Amendment. After reviewing the legal history that led to the enactment of Section 230 in 1996, the Third Circuit held that while Section 230 immunized TikTok from liability as “publisher or speaker” for the content posted by third parties, it did not immunize TikTok from potential liability for its own editorial judgment as a first-party speaker. The court reasoned that the recommendations produced by TikTok’s “For You” algorithm were not “provided by another,” as Section 230 requires, but were generated by TikTok itself. Accordingly, it remanded for further proceedings on Anderson’s claim that TikTok was negligent in recommending and promoting the hazardous Blackout Challenge to impressionable children.

Concurring in the judgment and dissenting in part, Judge Paul Matey would have gone much further. His opinion points out that while TikTok had the power to take down the Blackout Challenge and the response videos, it did not do so even after it knew that several minors had died performing the challenge. He argued that Section 230 should be read in light of the common law meaning of the term “publisher,” a reading that would immunize TikTok from liability for the initial posting of the Blackout Challenge. However, he continued, the common law held liable “distributors” of material that they knew to be unlawful, including common carriers such as telegraph companies. Matey would therefore have held that Section 230 did not immunize TikTok from liability for its decision not to remove the Blackout Challenge and the response videos once it knew of their harmful effect on children.

Anderson bucks the trend of decisions broadly interpreting Section 230 immunity, and it has attracted worried criticism from friends of the tech industry. Some of that criticism may be more alarmist than necessary. Under well-established First Amendment law, publishers of magazine articles and films have been held not liable for the acts of teenaged copycats on the ground that the publisher did not intend to incite imitation. Anderson is distinguishable from those cases, however, because the Blackout Challenge and similar challenge videos do invite viewers to imitate them and to broadcast their actions online.

TikTok moved unsuccessfully for panel rehearing or rehearing en banc, and the Third Circuit’s mandate issued on Oct. 31. If the Anderson majority opinion stands, it will require internet hosts to devote far more time and money than they do now to tailoring their recommendation algorithms. If Matey’s separate opinion gains acceptance, it will require far more aggressive monitoring and culling of content, once posted, that could produce common law liability. Matey cites liberally to Justice Thomas’ dissents from prior denials of certiorari, in an open invitation for the Court to grant certiorari in this case. We will not be surprised if it does. But since Section 230 immunity is statutory, the ultimate answer will lie with Congress.