Deepfakes may not presently be a top cybersecurity concern for many lawyers. But as believable fraudulent and doctored videos and audio become more prevalent, they could have serious financial and reputational consequences for clients.

Deepfakes use machine learning techniques, feeding a computer real image or audio data, to create a believable video. In a widely publicized instance, a video disseminated by the Trump administration of a journalist interacting with the president's staff was found to be intentionally doctored, according to The Associated Press.
