Ahead of the high-stakes 2020 presidential election, state lawmakers are moving to regulate deepfakes, which are essentially fraudulent audio or video content created by AI technology.

On Oct. 3, California joined Texas in passing measures that amend state election codes to combat deepfakes that may influence voting before an election. Unlike Texas, California didn't make creating or sharing deepfakes of election candidates a criminal offense. However, Assembly Bill No. 730 does authorize a candidate whose likeness appears in "materially deceptive audio or visual media" to bring a civil action against the person or entity that distributed the deepfake within 60 days of an election.

Also on Oct. 3, California Gov. Gavin Newsom signed into law Assembly Bill No. 602, an amendment to the state's Civil Code that allows a plaintiff whose likeness was used in a computer-generated video or image depicting nudity or a sexual act to obtain damages and other relief.

California's deepfake "revenge porn" amendment followed Virginia, which expanded its nonconsensual pornography ban to cover deepfakes in July.

While AB 602's passage was met with little public criticism, according to media reports, lawyers and First Amendment advocates have said AB 730 may run into several legal issues.

"I feel in a lot of ways the one that protects real victims, anonymous individuals that are subject to harassing attention, is a lot more practical," said Cynthia Cole, a Baker Botts privacy and data security special counsel, of AB 602. "It does what we traditionally expect from law, that is protecting people from nefarious acts."

But AB 730 veers into the political arena, implicating protected political speech and First Amendment rights, and may not be effective in achieving its intended goal, she added.

For Cole, the question comes down to "do we allow what continues to fall in the bounds of free speech into the political arena? Now we are trying to define that with this law and I don't know if we can and most of these people are not going to be subject to punishment under this law, we won't be able to find the actors," she said.

Cole noted that the use of deepfakes in politics is part of a larger problem: state and federal regulations have not kept pace with the ready availability of personal information, such as photos and videos, or with sophisticated technology that can easily manipulate that data.

Lawyers expect more states could enact deepfake laws before the 2020 presidential election, adding to the growing array of state regulations governing personal and online data.

"How Congress is, where it's difficult to get things passed, it's a situation where states are jumping in and enacting these types of laws before federal lawmakers can," said Mark Lyon, a Gibson, Dunn & Crutcher partner and artificial intelligence and automated systems practice group chairman.

Still, if states look to mirror California's updated Election Code, they may be met with similar criticism that questions the feasibility and constitutionality of the law.

"The goal of guarding against reputational harm to candidates is already protected by the law against defamation, while the other apparent goal of the bill [AB 730]—to prevent any voter from being deceived into voting for or against a candidate—appears to be a questionable governmental purpose, if only because it is unachievable and virtually impossible to prove," wrote Kevin Baker, the ACLU of California Center for Advocacy and Policy's legislative director, in a letter provided to Legaltech News requesting that Newsom veto the legislation.