Timing Is Ripe for Section 230 Amendments — But the 'How' Is Missing
It may be now or never for making changes to the rules governing the responsibility online content platforms have for content posted to their sites. But there are still tough questions bogging down the process.
October 24, 2019 at 11:00 AM
There could be no time like the present for anyone hoping to see amendments made to Section 230 of the Communications Decency Act.
The long-standing provision shielding tech companies from liability for third-party content posted to their sites has been the subject of much debate recently, whether from members of the U.S. House of Representatives Energy and Commerce Committee concerned over similar language appearing in U.S. trade agreements or from U.S. senators seeking amendments to address alleged tech company bias against conservatives.
But political momentum still may not be enough to paper over some of the thornier questions surrounding how online content can be feasibly moderated at the federal or even the state level.
"If I was placing bets, I wouldn't want to bank my house or anything on seeing any of these things move forward," said Jessica Lee, a partner and co-chairwoman of the privacy, security and data innovations practice at Loeb & Loeb. "I think we'll see something eventually, but it's really the timing that's up in the air."
However, it's hard to imagine the timing getting much better than now for amendments to Section 230. Lee thinks that the right atmosphere for a change is certainly in place, citing concerns over the integrity of the upcoming 2020 presidential election, the rise of bullying and hate crimes online, and the mounting backlash against big tech as factors.
"I think that if it's going to happen, now is the time that it's going to happen. Will it happen? I think you still have some challenges," Lee said.
Liz Harding, a shareholder at Polsinelli, referred to those hurdles as "the law of unintended consequences."
She agrees that the political will currently exists to make changes to the ways in which online platforms are held accountable for the content posted on their sites, but identified some practical concerns likely to fetter progress.
For starters, big-picture concerns about censorship and the inadvertent erosion of free speech could temper lawmakers' enthusiasm. On the micro-level, there's the complicated process of actually divining parameters around what constitutes hate speech or content likely to provoke violence, both of which may ultimately be easier to recognize than political bias.
"Determining what should be censored from publication is extremely subjective. What is offensive to you may not be to me. What you see as political bias may be different to my interpretation," Harding said.
So are new rules around how online platforms moderate content a lost cause?
Despite some of the complications at play, the interest in gaining some kind of traction on the problem remains. Lee raised the possibility that the individual states could step up to fill the absence of a federal standard with laws of their own, similar to how California has taken the initiative on privacy with the forthcoming California Consumer Privacy Act.
However, it's not a given that a state-centric approach would be greeted kindly at the federal level.
"I think the problem with tackling this at the state level is that of preemption and whether any proposed state law could effectively serve to change the Section 230 protections for platforms," Harding said.
Even so, continued public and political pressure could still yield progress. Lee indicated that tech companies might attempt to get ahead of the issue by proposing their own content moderation solution.
Still, some healthy skepticism may apply. "If they put something out or they put a solution out, it has to be something that's going to pass the smell test and not just be, like, some tech-shine," Lee said.