Artificial intelligence (AI) is bringing legal services to new heights of accuracy. But in other areas, it may be muddying the waters.

AI technology is powering video editing apps that let users easily superimpose people's faces onto actors in videos. The apps, of which “FakeApp” is the best known, create videos so convincing that they have been dubbed “Deepfakes.” And such visual content is causing alarm among those who believe it will usher in a new era of “fake news.”

There is, however, little recourse to stop such tools from being used. The technology itself is legal, as is publishing it online. But users aren't completely free of liability: depending on how the technology is deployed, such videos may violate a host of U.S. laws.

“You can imagine editing somebody into a security video robbing a bank—would that violate the law? Depends on what you do with it,” said Jeffrey Hermes, deputy director at the Media Law Resource Center, a trade organization for media attorneys.

“If you assert that it's an accurate representation of events when you know it's not, to the end you're accusing somebody of a crime they haven't been convicted of, and to the extent that you're using the video to make that accusation, that might be defamatory and could be actionable as such.”

Similarly, if a person uses the app to edit an actor into a commercial without permission, the actor may claim that their right of publicity has been violated, he added.

In the case of using such technology to create pornographic content, which is a popular use of tools like FakeApp, the law may be a little more complex.

“What if you edit someone into a porn video? Well that's a trickier case, is that a privacy violation? It's an interesting question because you're not really disclosing any private facts,” Hermes said. Yet he noted that such videos could be defamatory “if you're presenting the video suggesting that this person actually engaged in the acts. But there are some grey areas there when you get into that kind of conduct.”

Given the way such apps piece together media to create fake videos, there are bound to be questions over intellectual property rights as well. “What the technology does is create a composite work based on collection of photos of the subject plus an underlying video into which the subject is being inserted,” Hermes explained.

If such pictures or video were copyrighted and used without permission, that could be a violation of intellectual property law, he said.
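To make the idea of a “composite work” concrete, the sketch below pastes a still photo of a subject over a fixed region of every frame of an existing video. This is only a crude, conceptual illustration under assumed inputs: the file names, coordinates, and OpenCV-based approach are hypothetical stand-ins, and real tools such as FakeApp rely on trained neural networks rather than a simple overlay. The point it demonstrates is the legal one: the output is derived from both the photograph and the underlying video, so copyright in either source can be implicated.

```python
# Conceptual sketch only: overlays a photo of a subject onto a fixed region
# of each frame of an existing video, producing a crude composite.
# File names and coordinates are hypothetical placeholders.
import cv2

SOURCE_VIDEO = "underlying_video.mp4"   # hypothetical input video
SUBJECT_PHOTO = "subject_face.jpg"      # hypothetical photo of the subject
OUTPUT_VIDEO = "composite.mp4"
REGION = (100, 50, 160, 160)            # x, y, width, height of the paste region

# Load and resize the subject's photo to fit the chosen region.
face = cv2.imread(SUBJECT_PHOTO)
x, y, w, h = REGION
face = cv2.resize(face, (w, h))

# Open the underlying video and prepare a writer with matching dimensions.
cap = cv2.VideoCapture(SOURCE_VIDEO)
fps = cap.get(cv2.CAP_PROP_FPS)
width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
writer = cv2.VideoWriter(OUTPUT_VIDEO,
                         cv2.VideoWriter_fourcc(*"mp4v"),
                         fps, (width, height))

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Overwrite the chosen region with the photo: each output frame is a
    # derivative of both the video frame and the photograph.
    frame[y:y + h, x:x + w] = face
    writer.write(frame)

cap.release()
writer.release()
```

Even this trivial overlay combines two pieces of source material into a new work; the machine-learning systems behind actual Deepfakes do the same thing far more convincingly, which is why the copyright status of the photos and the underlying video matters.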

What's more, if a creator of a fake video is illegally taking content from another person's computer, website or device to add to the video, “then there are potentially Computer Fraud and Abuse Act issues and misappropriation issues as well,” said Markham Erickson, partner at Steptoe & Johnson and chair of the firm's Internet, Telecom, and Technology practice group.

Still, even if apps that allow the creation of fake video content become a concern for many in the U.S., there is little chance that websites or app stores would come under legal pressure to stop offering them to their users. Under Section 230 of the Communications Decency Act, platforms that host third-party content are broadly shielded from liability stemming from that content.

“To the extent that you're downloading the app, from either one of the major app stores on your phone or from another website that is hosting the apps, those platforms enjoy the same immunity under Section 230 as a social network does for the speech, because ultimately that app is somebody's speech,” Erickson said.

But there are a few caveats to this immunity. “There have been cases where courts have found that if a website affirmatively solicits unlawful content, then they essentially become the content creator” and lose protection under Section 230, Erickson said.

To be sure, the technology behind “Deepfakes” is not entirely new; it has long been used in the filmmaking industry, underscoring its legitimate uses in some areas of the economy.

“You wouldn't want to declare the technology itself is illegal because it has all kinds of potential uses,” Hermes said, noting that the technology was “very similar to what was used in [Star Wars movie] Rogue One for the young Princess Leia's appearance.”