Media law attorneys tracking harmful Internet trends are coming up with solutions to protect clients in court.

Whether it's fighting against one person's face being realistically pasted on another's body in a porn video, or against the mass collection of images for facial recognition databases used by law enforcement, lawyers discuss causes of action that could come into play.

At the American Bar Association Forum on Communications Law in Austin on Friday, communications lawyers touched on cutting-edge legal issues arising from new technologies, including deepfake videos, facial recognition software, data scraping and more.

Here are three of the trends they're tracking that might make attorneys shudder.


Deepfake videos

Before delving into the types of causes of action that attorneys can lob against deepfake video creators, Brendan Charney, an associate at Davis Wright Tremaine in Los Angeles, took a step back to explain some fundamentals.

What is a deepfake?

Charney said that artificial intelligence technology can manipulate audio or video to create convincing fake videos, which have mainly been deployed to create fake pornography and fake news.

When fighting deepfakes in court, a defamation claim will be straightforward if the fake video depicts the client doing or saying something they never did, Charney said. For a video that doesn't rise to that level, there could still be a false light claim, he added. Videos that depict celebrities may also give rise to right of publicity claims.

As deepfake technology improves and produces more realistic fakes, it's possible that a journalist could be tricked into mistakenly reporting on a deepfake as though it were real. If a media organization is sued over such a report, Charney said its defense attorneys could argue there was no actual malice in reporting on the deepfake. The plaintiff would likely counter that the journalist failed to follow proper standards and was reckless in publishing false information, he said. Following standards is therefore key: Journalists should use traditional verification methods as well as computer forensics tools that can proactively spot fakes.


Facial recognition

By scraping photos that people post on Facebook, LinkedIn or YouTube, new facial recognition companies are building databases of people's faces and names, said Jeremy Feigelson, a partner at Debevoise & Plimpton in New York.

One company, Clearview, sells access to its database system to more than 6,000 law enforcement agencies nationwide. Police who have images but not names can run those images through the system to identify crime suspects and crime victims, particularly victims of child pornography, Feigelson said.

Although the company's position is that its service is constitutional and compliant with state biometric privacy laws, a recent class-action lawsuit alleged that by collecting and storing people's images without their permission, Clearview violated the Illinois Biometric Information Privacy Act, Feigelson noted. Social media companies including Facebook and Twitter have also sent Clearview cease-and-desist letters, claiming the company engaged in unauthorized scraping to gather the initial images for its database, he added.


Data scraping

David Wittenstein, a partner at Cooley in Washington, D.C., explained that scraping is an automated method of accessing and extracting data from a third-party website. It can be legitimate, as when Google scrapes websites to index them in its search engine, or illegitimate, as when marketing companies scrape email addresses and add them to marketing lists without permission.

Internet companies like LinkedIn, Craigslist and Facebook object to scrapers taking data from their websites, Wittenstein noted. Some have successfully brought trespass to chattels claims against scrapers, and courts have found that when a scraper accesses a company's server to scrape its data and burdens the server, that counts as trespass. Businesses can sometimes assert copyright claims against scrapers as well, as when The Associated Press brought claims over scraped news stories, he added.

Lawyers have also successfully used the Computer Fraud and Abuse Act of 1986 to stop scraping. Yet in a recent ruling, hiQ v. LinkedIn, the U.S. Court of Appeals for the Ninth Circuit determined that the act was meant to apply to private, protected information, not public information such as professional profiles on LinkedIn.