Public figures aren't the only targets of deepfakes. Indeed, any company can suffer reputational and litigation damages from the use of deepfake content.

A deepfake is a realistic but fabricated video or audio recording created by applying artificial intelligence tools to previously recorded visual or audio content. In the past year, deepfakes have drawn legislators' attention as a vehicle for false information. For instance, a video of U.S. House Speaker Nancy Pelosi, doctored to make her appear to slur her words, circulated widely on social media platforms in May, amplifying fears that deepfakes will be used to spread misinformation as the U.S. presidential election nears.

While deepfakes aimed at public figures have received the most attention, CEOs and heads of human resources can also be targeted by the technology, warned Natalie Pierce, co-chair of Littler Mendelson's robotics, AI and automation practice group. Pierce, along with three firm colleagues, authored a report on the implications of deepfakes in the workplace.

Though creating a realistic deepfake requires a significant amount of audio or video, the report noted that even voicemails can give disgruntled employees enough material to produce a high-quality deepfake.

“It [deepfakes] could do a lot of harm if an employer isn't knowledgeable that this technology is out there and if they believe what they see,” Pierce said. “That poses a big potential risk also in the same way that emails or text messages are doctored.”

A successful deepfake could damage a company's reputation and stock price, and fabricated recordings could even be introduced as evidence in litigation, added Littler Mendelson chief data analytics officer Aaron Crews.

“As this technology democratizes and it's pretty easy to acquire, the ability for it to penetrate in companies of any size is very high,” Crews said. He added, “I promise you in the not-too-distant future that someone will have a video and it's entirely fabricated and it's attached to a complaint.”

Pierce agreed and said that to prevent a deepfake from successfully infiltrating a company, employers should “question things that may not be right when they have significant [impact]. At least having access to resources [for] detecting deepfakes is critical.”

Those resources include analyzing GPS data and time stamps on communications and using cryptographic key signing to validate messages while the industry waits for a deepfake-specific verification tool to be developed, according to the firm's report.
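The cryptographic key signing the report mentions can take several forms; as a minimal sketch, the idea can be illustrated with message authentication codes from Python's standard library. This example assumes sender and verifier share a secret key (the key name and messages here are hypothetical; a production system would more likely use public-key signatures with managed key distribution):

```python
import hmac
import hashlib

# Hypothetical shared secret -- in practice, provisioned securely per sender.
SECRET_KEY = b"example-shared-secret"

def sign_message(message: bytes) -> str:
    """Produce an HMAC-SHA256 tag the recipient can later verify."""
    return hmac.new(SECRET_KEY, message, hashlib.sha256).hexdigest()

def verify_message(message: bytes, tag: str) -> bool:
    """Recompute the tag and compare in constant time."""
    expected = hmac.new(SECRET_KEY, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

# A message signed with the key verifies; a tampered one does not.
msg = b"Wire $50,000 to vendor account 1234"
tag = sign_message(msg)
print(verify_message(msg, tag))                                       # True
print(verify_message(b"Wire $500,000 to vendor account 9999", tag))  # False
```

A tag that fails to verify tells the recipient the message was altered or never came from the keyholder, which is the same assurance an employer would want before acting on a suspicious recording.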

If a recording was ever deemed the gold standard of proof, deepfakes have cost it that luster, Crews said. In-person interviews with the parties and forensic testing are now more important when investigating remarks or actions captured in a recording.

“Questioning and verifying will become even more important because I think there is such a tendency to look at a piece of video and if something shocking is shown you can jump to conclusions,” Pierce said.