Can Deepfakes Pose a Cybersecurity Threat to Legal?
Deepfakes use real-life video and audio to create realistic but fake messages. While not an immediate threat to lawyers, they can pose a significant cybersecurity risk to clients.
January 14, 2019 at 09:00 AM
5 minute read
Deepfakes may not presently be a top cybersecurity concern for many lawyers. But as believable fraudulent and doctored video and audio become more prevalent, they could have perilous financial and reputational effects on clients.
Deepfakes use machine learning techniques, feeding a computer real image or audio data, to create a believable video. In a widely publicized instance, a video of a journalist interacting with the president's staff, disseminated by the Trump administration, was found to have been intentionally doctored, according to The Associated Press.
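The article does not describe a specific technique, but one commonly reported approach is a shared-encoder, dual-decoder autoencoder: a single encoder learns a generic representation of faces, and a separate decoder per identity reconstructs each person, so encoding person A's face and decoding it with person B's decoder produces the swap. The sketch below is a minimal illustration of that idea, not the method referenced in this story; it assumes PyTorch and substitutes random tensors for the aligned face images a real pipeline would require.

```python
# Minimal sketch of the shared-encoder / dual-decoder autoencoder idea behind
# many face-swap deepfakes. Random tensors stand in for real face crops; a
# genuine pipeline would train on aligned face images of persons A and B.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(3 * 64 * 64, 512), nn.ReLU(),
            nn.Linear(512, 128),
        )
    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(128, 512), nn.ReLU(),
            nn.Linear(512, 3 * 64 * 64), nn.Sigmoid(),
        )
    def forward(self, z):
        return self.net(z).view(-1, 3, 64, 64)

encoder = Encoder()                          # shared: learns a generic face representation
decoder_a, decoder_b = Decoder(), Decoder()  # one decoder per identity

opt = torch.optim.Adam(
    list(encoder.parameters())
    + list(decoder_a.parameters())
    + list(decoder_b.parameters()),
    lr=1e-4,
)
loss_fn = nn.MSELoss()

faces_a = torch.rand(8, 3, 64, 64)  # placeholder batches of "real data"
faces_b = torch.rand(8, 3, 64, 64)

for step in range(100):
    opt.zero_grad()
    # Each decoder learns to reconstruct its own person from the shared code.
    loss = loss_fn(decoder_a(encoder(faces_a)), faces_a) + \
           loss_fn(decoder_b(encoder(faces_b)), faces_b)
    loss.backward()
    opt.step()

# The "swap": encode person A's face, then decode it with person B's decoder.
with torch.no_grad():
    fake_b = decoder_b(encoder(faces_a))
```

Because both identities pass through the same encoder, the model learns expressions and lighting in a shared space, which is what lets the swapped output track the source footage so convincingly.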
But while deepfakes can be extremely realistic, there are limitations. Currently, deepfakes can't perform live; they have to be prerecorded.
“These technologies can't produce live fakes,” explained Robert Chesney, who teaches U.S. national security at the University of Texas at Austin School of Law. “I can't Skype with you [through a fake video], that is adapting and speaking on the fly. It has to be a prerecording.”
Still, many in the business world rely on voicemail, opening a possible entry point for a deepfake to disseminate false information.
“By definition, deepfake is a cybersecurity threat because what deepfake represents is a spoof or fake publication of a video or audio recording typically associated to a business leader or political leader, statements that the actual individual didn't make,” explained Fox Rothschild partner Scott Vernick.
“What people are concerned about, given how easy it is to replicate in ways that look quite genuine a business leader or political leader saying or doing something that's not true … [is that] it can move markets and shape or not shape the fortunes of companies,” Philadelphia-based Vernick added.
Audio deepfakes may pose a more immediate risk because they require less robust technology to create a highly believable fake than video deepfakes do, noted Ice Miller partner Guillermo Christensen. What's more, deepfakes used in combination with a successful cyber hack of widely used infrastructure could spread misinformation broadly before the government can issue a counter message.
Attorneys' public figure clients can also be targeted with deepfakes, causing reputational damage or worse.
“Suppose you have someone like Beyonce, Jay-Z, Elon Musk, Mark Zuckerberg, Kamala Harris, [and someone] managed to put something out and it went viral and took positions that are quite contrary to what their actual position is. [It] could destroy someone's brand, political future, organizations,” Fox Rothschild's Vernick said.
Likewise, divisive statements from a deepfake attributed to a publicly traded company could also be detrimental financially. “You would be sure the stock would take a nosedive, and it would be highly distracting to the company,” Vernick said.
Fighting Against Deepfakes
Training specifically aimed at spotting deepfakes isn't offered in many industries, according to lawyers contacted by Legaltech News. The Wall Street Journal, however, does train editorial staff to quickly detect deepfakes, possibly indicating an industry-specific need to ensure staff don't disseminate false information that harms their customers.
“In the media business, this is a subset of a broader issue of how we verify sources,” explained Reed Smith partner Gerard Stegmaier. For most information-centric businesses, such as news organizations, repeating a lie would have an immediate detrimental public consequence, Stegmaier said.
When an entity or individual does fall for a deepfake, it may be difficult for them to find the creator or have the unfair content removed because of the Communications Decency Act, Stegmaier explained. Section 230 of the act holds that internet service providers shouldn't be held responsible for content created by others.
But when a deepfake or any other cybersecurity event occurs, a plan should already be in place, Ice Miller's Christensen said.
“Any unexpected threat or challenge to business, it's much easier to respond to that event when you have responses in place,” Christensen said. “You need to have something, a strong framework of who would get called on in the company if something strange happened.”
How large a threat deepfakes pose has yet to be seen, but they should be acknowledged as a cybersecurity hazard.
“Deepfakes are real and emerging as an issue but they, like certain types of technology, could emerge very quickly; we talk about this today and it could be a very big deal in six months or it could be nothing,” Reed Smith's Stegmaier cautioned. “We simply don't know.”