Can Deepfakes Pose a Cybersecurity Threat to Legal?
Deepfakes use real video and audio to create realistic but fake messages. While not an immediate threat to lawyers, they can pose a significant cybersecurity risk to clients.
January 14, 2019 at 09:00 AM
5 minute read
Deepfakes may not presently be a top cybersecurity concern for many lawyers. But as these believable fraudulent and doctored videos and audio clips become more prevalent, they could have serious financial and reputational consequences for clients.
Deepfakes use machine learning techniques, feeding a computer real image or audio data, to create a believable video. In a widely publicized instance, a video disseminated by the Trump administration showing a journalist interacting with the president's staff was found to have been intentionally doctored, according to The Associated Press.
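To illustrate the general idea, one common face-swap approach trains a single encoder shared across two people, with a separate decoder for each person; the swap happens when one person's encoded face is decoded with the other person's decoder. The sketch below is a hypothetical, stripped-down illustration of that setup, with random tensors standing in for real face crops; it is not drawn from any specific deepfake tool.

```python
# Minimal sketch of the shared-encoder / per-person-decoder idea behind many
# face-swap deepfakes. Hypothetical illustration only: the tiny networks and
# the random tensors standing in for face crops are placeholders.
import torch
from torch import nn

encoder = nn.Sequential(               # one encoder shared by both identities
    nn.Flatten(),
    nn.Linear(3 * 64 * 64, 256), nn.ReLU(),
)

def make_decoder():                    # one decoder per identity
    return nn.Sequential(
        nn.Linear(256, 3 * 64 * 64), nn.Sigmoid(),
        nn.Unflatten(1, (3, 64, 64)),
    )

decoder_a, decoder_b = make_decoder(), make_decoder()

faces_a = torch.rand(8, 3, 64, 64)     # placeholder face crops of person A
faces_b = torch.rand(8, 3, 64, 64)     # placeholder face crops of person B

params = (list(encoder.parameters())
          + list(decoder_a.parameters())
          + list(decoder_b.parameters()))
opt = torch.optim.Adam(params, lr=1e-3)
loss_fn = nn.MSELoss()

for _ in range(5):                     # each identity reconstructed by its own decoder
    loss = (loss_fn(decoder_a(encoder(faces_a)), faces_a)
            + loss_fn(decoder_b(encoder(faces_b)), faces_b))
    opt.zero_grad()
    loss.backward()
    opt.step()

# The "swap": encode person A's face, then decode it with person B's decoder.
fake_b = decoder_b(encoder(faces_a))
```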
But while deepfakes can be extremely realistic, there are limitations. Currently, deepfakes can't be performed live; they have to be prerecorded.
“These technologies can't produce live fakes,” explained Robert Chesney, who teaches U.S. national security at the University of Texas at Austin School of Law. “I can't Skype with you [through a fake video], that is adapting and speaking on the fly. It has to be a prerecording.”
Still, many in the business world rely on voicemail, which marks a possible entry point for a deepfake to disseminate false information.
“By definition, deepfake is a cybersecurity threat because what deepfake represents is a spoof or fake publication of a video or audio recording typically associated to a business leader or political leader, statements that the actual individual didn't make,” explained Fox Rothschild partner Scott Vernick.
“What people are concerned about, given how easy it is to replicate in ways that look quite genuine a business leader or political leader saying or doing something that's not true … [is that] it can move markets and shape or not shape the fortunes of companies,” Philadelphia-based Vernick added.
Audio deepfakes may pose a more immediate risk because they require less robust technology to create a highly believable fake than video deepfakes do, noted Ice Miller partner Guillermo Christensen. What's more, deepfakes used in combination with a successful hack of widely used infrastructure could spread misinformation before the government can issue a counter-message.
Attorneys' public-figure clients can also be targeted with deepfakes, causing reputational damage or worse.
“Suppose you have someone like Beyonce, Jay-Z, Elon Musk, Mark Zuckerberg, Kamala Harris, [and someone] managed to put something out and it went viral and took positions that are quite contrary to what their actual position is. [It] could destroy someone's brand, political future, organizations,” Fox Rothschild's Vernick said.
Likewise, divisive statements from a deepfake attributed to a publicly traded company could also be detrimental financially. “You would be sure the stock would take a nosedive, and it would be highly distracting to the company,” Vernick said.
Fighting Against Deepfakes
Training specifically to spot deepfakes isn't offered in many industries, according to lawyers contacted by Legaltech News. However, The Wall Street Journal does host training for its editorial staff to quickly detect deepfakes, possibly indicating an industry-specific need to train employees not to disseminate false information that harms their customers.
“In the media business, this is a subset of a broader issue of how we verify sources,” explained Reed Smith partner Gerard Stegmaier. For most information-centric businesses, such as news organizations, repeating a lie would have an immediate detrimental public consequence, Stegmaier said.
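Automated screening can complement that kind of human verification. One approach seen in deepfake-detection research is to fine-tune an off-the-shelf image classifier on video frames labeled real or manipulated. The sketch below is a hypothetical illustration of that idea, with random tensors standing in for a real labeled frame dataset; it is not a description of any newsroom's actual tooling.

```python
# Minimal sketch of frame-level deepfake detection: fine-tune a standard image
# classifier on frames labeled real vs. manipulated. Placeholder data only.
import torch
from torch import nn
from torchvision import models

model = models.resnet18(weights=None)          # pretrained weights could be used instead
model.fc = nn.Linear(model.fc.in_features, 1)  # single logit: how likely the frame is fake

frames = torch.rand(16, 3, 224, 224)           # placeholder frame crops
labels = torch.randint(0, 2, (16, 1)).float()  # placeholder labels: 1 = manipulated

opt = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.BCEWithLogitsLoss()

for _ in range(3):                             # toy training loop
    logits = model(frames)
    loss = loss_fn(logits, labels)
    opt.zero_grad()
    loss.backward()
    opt.step()

# Score a clip by averaging per-frame fake probabilities.
with torch.no_grad():
    fake_score = torch.sigmoid(model(frames)).mean().item()
```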
When an entity or individual does fall for a deepfake, it may be difficult to identify the creator or have the offending content removed because of the Communications Decency Act, Stegmaier explained. Section 230 of the act holds that internet service providers shouldn't be held responsible for content created by others.
But when a deepfake or any other cybersecurity event occurs, Ice Miller's Christensen said, a plan should already be in place.
“Any unexpected threat or challenge to business, it's much easier to respond to that event when you have responses in place,” Christensen said. “You need to have something, a strong framework of who would get called on in the company if something strange happened.”
How large or small a threat deepfakes pose has yet to be seen, but they should be acknowledged as a cybersecurity hazard.
“Deepfakes are real and emerging as an issue but they, like certain types of technology, could emerge very quickly; we talk about this today and it could be a very big deal in six months or it could be nothing,” Reed Smith's Stegmaier cautioned. “We simply don't know.”