Texas Outlaws 'Deepfakes'—but the Legal System May Not Be Able to Stop Them
In September, Texas became the first state in the country to criminalize "deepfakes"—video clips created with artificial intelligence that make people appear to say or do something they did not. But legal experts questioned the new law's constitutionality and said the rapidly evolving technology behind deepfakes has the potential to wreak havoc on the legal system, particularly when it comes to authenticating evidence in litigation.
October 11, 2019 at 01:20 PM
Texas Senate Bill 751 (SB 751) amended the state's election code to criminalize deepfake videos created "with intent to injure a candidate or influence the result of an election" and "published and distributed within 30 days of an election." Doing so is now a Class A misdemeanor, punishable by up to a year in a county jail and a fine of up to $4,000.
Politicians are calling deepfakes the newest threat to the country's democracy, a dangerous tool that can be used to sway voters in the weeks leading up to elections, and warning that very soon members of the public will not be able to believe their own eyes. But experts said SB 751 might conflict with the First Amendment right to freedom of speech.
It's not clear whether the Constitution would allow deepfakes to be banned outright without the ban being challenged as an infringement on creators' First Amendment rights. Because SB 751 targets deepfakes narrowly as a form of election interference, however, it might survive a legal challenge, according to the Texas Senate Research Center.
Although the platforms where deepfakes are published could be held liable under SB 751, catching the creators could prove thorny: many deepfakes are produced overseas, making it difficult to trace a viral video back to its source.
The Technology Is Ahead of the Law
The ability to distort reality has taken an exponential leap forward with deepfake technology, said Robert M. "Bobby" Chesney, a professor at The University of Texas School of Law, and co-author (along with Danielle Citron, a professor at Boston University School of Law) of "Deep Fakes: A Looming Challenge for Privacy, Democracy, and National Security," to be published soon in the California Law Review.
"Fraudulent audio and video are not new," Chesney said in an email interview. "What is new is the invention of a capacity for creating synthetic media that is not only highly realistic but also (relatively) easily spread. With more players, we get more bad actors."
As with most issues where technology runs afoul of the law, deepfake creators currently have an advantage over lawmakers, said Chesney.
"A lot of money is being spent (government, private sector, academia, etc.) attempting to establish reliable and scalable detection technologies," Chesney said. "Technologists currently are divided over whether the defense (detection) has or ever will catch up with offense (creation)."
The real danger is deepfakes could be used to impugn someone's reputation or harm them in their professional life, said Chesney. They might even be used as "evidence" to deliver the wrong verdict in a trial.
"As access to the technology for creating deepfakes spreads, the chances of all kinds of malicious uses increase," Chesney said. "Certainly it's possible fraudulent content of this kind might one day have an unrecognized, but serious impact on a jury."
Those in the legal profession really need to be vigilant, said Chesney.
"Lawyers and law firms will face all the same risks—reputational sabotage, unfair competition, harassment, deepfake-enabled phishing, etc.—as other individuals and organizations," Chesney explained.
"Beyond that, litigation, arbitration and other dispute resolution systems will face increasing challenges with respect to the authentication of evidence, including a growing need for forensic experts," he continued.
There are, however, steps law firms can take to harden their defenses against deepfakes, said Chesney.
"All organizations with reputations to protect already face some degree of fraud risk targeting key personnel (and, hence, the organization itself), and law firms are no different," Chesney said. "The spread of the ability to create deepfakes will accentuate that existing risk, calling for more resources to be committed to keeping an eye out for such gambits, for rapid response and the like."
How Can Deepfakes Be Stopped?
The law surrounding freedom of expression in the United States is simply unable to handle something like deepfakes, said Jared Schroeder, an assistant professor of journalism at Southern Methodist University who specializes in First Amendment law, particularly freedom of expression in virtual spaces, how information flows and how individuals reach understandings in democratic society.
"The First Amendment precedential record we have right now just won't allow us to limit people from creating and posting deepfakes," Schroeder said.
But there is still the chance for redress through the courts if someone harms you, said Schroeder.
"For instance, if someone is defamed by these deepfakes—say one portrays you as a politician doing or saying something you never said or did, you could sue them for defamation," Schroeder explained.
A law like the one in Texas basically criminalizes creating a deepfake for political purposes, said Schroeder. Then the second part of it puts expectations on the publishers, meaning a plaintiff can go after the platform publishing or hosting the deepfake that harmed them.
"The first part of this law will get into trouble because political speech is one of the highest forms of protected speech we have," Schroeder said.
"The Supreme Court has been very loathe to allow what many would consider reasonable limitations on this," he continued. "An example would be the Stolen Valor Act—which seems pretty reasonable, and Congress thought so too—a law that considers lying about military honor. But in U.S. v. Alvarez, the U.S. Supreme Court struck it down, saying, 'We can't allow this kind of law to limit what people can say.'"
Deepfakes are not inherently bad, said Schroeder. The law doesn't make allowances for satire, but you could try to argue for it on those grounds, because deepfakes are, at times, akin to a new generation of memes.
Schroeder noted, by way of example, the proliferation of deepfakes superimposing the likeness of actor Nicolas Cage onto a host of characters.
Those videos are distorting the truth, "but it's a lie with a wink because, in this case, we're in on the joke," Schroeder said.
So far, we don't have many examples where someone was harmed, but it's coming, said Schroeder.
Deepfakes can easily damage someone's reputation, said Schroeder.
"Lawmakers are trying head this off because this is a threat," he said. "Lawmakers tried, but this law is their first attempt, and as it comes to be tested, the courts will not allow this type of limitation on speech. Plus, it really doesn't address really bad actors, such as the ones from China, Russia and Ukraine, which have been in the news of late."
"They're trying to address a problem but this law won't have any teeth because there is no such thing as a false idea," Chesney said.
"Our theory of freedom of speech has always been market-driven, where we put all the ideas out and you can weigh them and some of them will fail," he continued. "But this law takes the opposite approach, with the government believing people are not rational or no longer able to discern the truth for themselves, so we're going to put it in the government's hands in order to stop it before it comes out. I appreciate their effort, but the law doesn't allow this. I'm concerned deepfakes are a real threat but I don't know how we'll stop them with our current understanding of free expression."