Deepfakes: When a Picture Is Worth Nothing at All
"Deepfakes" is the name for highly realistic, falsified imagery and sound recordings; they are digitized and personalized impersonations. Deepfakes are made by using AI-based facial and audio recognition and reconstruction technology; AI algorithms are used to predict facial movements as well as vocal sounds. In her Artificial Intelligence column, Katherine B. Forrest explores the legal issues likely to arise as deepfakes become more prevalent.
October 28, 2019 at 12:45 PM
A picture can be worth a thousand words, or nothing at all. Deepfakes—highly realistic and personalized, but totally falsified, video or audio tapes—are changing the value and reliability of video and audio evidence.
"Deepfakes" is the name for highly realistic, falsified imagery and sound recordings; they are digitized and personalized impersonations. Deepfakes are made by using AI-based facial and audio recognition and reconstruction technology; AI algorithms are used to predict facial (mouth, jaw, eye, etc.) movements as well as vocal sounds.
Deepfake videos are created by taking an image or video clip of an individual doing or saying anything at all. Through careful dissection and disassembly of each pixel of an image, or of a short audio clip of a voice, and using AI predictive technology, the software can create a doppelgänger doing something entirely different from what appears in the original image or sound clip. Indeed, what the doppelgänger may be doing on the fake tape is limited only by the imagination of the creator.
You don't have to be a technologist to access off-the-shelf software to create deepfakes; more and more of it is available free online. AI machine learning enables their creation. Just as the brain learns to recognize and identify a person through repeated exposure, the creator of a deepfake simply gathers a data set consisting of a number of images (still photographs or video clips) of a target person (say, a well-known political figure) and feeds it into an AI-enabled program. That program compares the target image to a base image (say, an actor pretending to do and say something the politician would never say or do, or a video from a security camera showing the face of an actual perpetrator). The software uses machine learning to iteratively compare the images and effectively superimpose the target's face onto the base image. The software learns from those images how the face appears in a number of different positions and is able to mimic those positions on the base image. One of the earliest uses of deepfakes was altering pornographic images with the faces of well-known actresses.
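For readers curious about the mechanics, a minimal sketch of this training loop appears below. It uses the shared-encoder, two-decoder autoencoder design behind many early face-swap tools; the network sizes, image dimensions and function names are illustrative assumptions (written in Python with the PyTorch library), not any particular product's implementation.

    import torch
    import torch.nn as nn

    # Illustrative assumption: 64x64 RGB face crops, flattened to vectors.
    IMG_DIM = 64 * 64 * 3
    LATENT = 256

    # A single encoder shared across both identities learns a common face
    # representation; each identity gets its own decoder that reconstructs
    # that person's face from the shared representation.
    encoder = nn.Sequential(nn.Linear(IMG_DIM, 1024), nn.ReLU(), nn.Linear(1024, LATENT))
    decoder_target = nn.Sequential(nn.Linear(LATENT, 1024), nn.ReLU(), nn.Linear(1024, IMG_DIM))
    decoder_base = nn.Sequential(nn.Linear(LATENT, 1024), nn.ReLU(), nn.Linear(1024, IMG_DIM))

    loss_fn = nn.MSELoss()
    opt = torch.optim.Adam(
        list(encoder.parameters())
        + list(decoder_target.parameters())
        + list(decoder_base.parameters()),
        lr=1e-4,
    )

    def train_step(target_faces, base_faces):
        # One iteration: each decoder learns to rebuild its own person's face.
        opt.zero_grad()
        loss = (loss_fn(decoder_target(encoder(target_faces)), target_faces)
                + loss_fn(decoder_base(encoder(base_faces)), base_faces))
        loss.backward()
        opt.step()
        return loss.item()

    def swap(base_frame):
        # The swap itself: encode a frame of the base video, then decode it
        # with the target's decoder, yielding the target's face in the base pose.
        with torch.no_grad():
            return decoder_target(encoder(base_frame))

Repeated over thousands of iterations against the gathered data set, this is the iterative comparison described above: the shared encoder learns how a face appears in many positions, and the target's decoder reproduces those positions on the base footage.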
Technology and techniques are being developed to detect deepfakes by recognizing a voice or image as falsified. Some of these efforts seem quite basic and manual: having humans do careful clip-by-clip analysis of suspect images or sound recordings and flagging potential errors that reveal them as fake. But there are also computerized pixel comparisons that can be done: taking a base image and analyzing it for missing, condensed or defective bits.
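As one concrete illustration of the pixel-comparison idea, here is a short Python sketch of error level analysis, a basic forensic technique that recompresses an image and looks for regions that respond differently from the rest. It is not any specific vendor's detector, and the threshold and function names are assumptions.

    import io
    from PIL import Image, ImageChops

    def error_level_analysis(path, quality=90):
        # Recompress the image and measure pixel-level differences; spliced or
        # regenerated regions often recompress differently than their surroundings.
        original = Image.open(path).convert("RGB")
        buf = io.BytesIO()
        original.save(buf, format="JPEG", quality=quality)
        buf.seek(0)
        recompressed = Image.open(buf).convert("RGB")
        diff = ImageChops.difference(original, recompressed)
        extrema = diff.getextrema()  # per-channel (min, max) pixel differences
        return max(hi for _, hi in extrema)

    # A crude screen, not proof: an unusually high score flags an image for
    # closer human review rather than establishing that it is fake.
    if error_level_analysis("suspect_frame.jpg") > 60:
        print("flag for manual forensic review")

Screens like this catch only crude fakes; as generation improves, detection must look for ever subtler traces.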
Technology is also being developed to tag new video and audio clips with indicia of authenticity. This new technology will be chasing improvements in the deepfake technology itself and provides only a partial solution. In addition, any certification of authenticity is primarily useful for new audio and video—while immediate harm may come from allegedly "discovered" and "old" audio and video clips created long before deepfakes came onto the scene.
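One plausible form such authenticity tagging could take, sketched below in Python, is to hash a clip at the moment of capture and sign the hash, so any later alteration is detectable. This is a simplified illustration: a real system would bind a hardware-backed key to the recording device and use public-key signatures rather than the shared secret assumed here.

    import hashlib
    import hmac

    SIGNING_KEY = b"device-specific secret"  # illustrative placeholder

    def sign_clip(clip_bytes):
        # Tag a newly recorded clip: editing even one byte later changes the
        # digest and invalidates the tag.
        digest = hashlib.sha256(clip_bytes).hexdigest()
        tag = hmac.new(SIGNING_KEY, digest.encode(), hashlib.sha256).hexdigest()
        return digest, tag

    def verify_clip(clip_bytes, digest, tag):
        # Recompute both values and compare in constant time.
        expected_digest = hashlib.sha256(clip_bytes).hexdigest()
        expected_tag = hmac.new(SIGNING_KEY, expected_digest.encode(), hashlib.sha256).hexdigest()
        return (hmac.compare_digest(digest, expected_digest)
                and hmac.compare_digest(tag, expected_tag))

Note the limitation identified above: such a scheme can vouch only for recordings tagged at capture, which is why it offers no help with supposedly "discovered" old clips.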
How many times have we all heard "let's go to the tape," "seeing is believing," or "a picture is worth a thousand words"? We are used to accepting what we see with our eyes and hear with our ears as "truth," and for that reason, in legal settings, audio and videotapes can be powerful evidence. A good quality audio or videotape can show who said what to whom, who was present in a specific location and who took a particular action. Audio and videotapes are used by the parties in myriad legal proceedings, as well as by prosecutors, judges and juries.
In a criminal case, a videotape of events can be the smoking gun that makes the difference between an indictment and a terminated investigation, between a plea and going to trial, between conviction and acquittal. Imagine a videotape that shows the defendant holding a gun in a vestibule at the time that a murder occurred in that location, or buying a quantity of drugs in a location where a known drug sale took place. In a civil setting, imagine a videotape that shows that a person used a racial epithet or engaged in an inappropriate act at the office. If these visual images of people are fakes, they can result in enormous harm, including the conviction and incarceration of the wrong person.
Of course, deepfakes outside of legal proceedings can also do great harm: fake tapes of political candidates saying or doing things they never said or did, created to influence the outcome of an election, are an evil we should all be concerned about. But there are also national security issues: deepfakes could be used to show government officials apparently taking military actions (for instance, authorizing a military strike) that they never took.
Deepfakes are changing our ability to rely on audio and video evidence: What we assume is truth might in fact be fiction.
What does the emergence of deepfakes mean for lawyers and judges? In 2019, it means being aware that deepfakes exist and anticipating challenges (both well-founded and opportunistic) to audio and videotape authenticity. We can expect the development of a cottage industry of people who purport to be able to tell a fake from an authentic tape, and a new series of experts in the courtroom. We may need to develop protocols for clear chain-of-custody histories for audio and videotapes; these histories will record the metadata demonstrating that a tape came off a particular server and was transmitted in a controlled manner.
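A chain-of-custody protocol of the kind suggested above could be as simple as an append-only log in which each entry hashes the entry before it. The Python sketch below illustrates the idea; the field names and actors are hypothetical, not a proposed standard.

    import hashlib
    import json
    import time

    def custody_entry(prev_hash, actor, action, file_sha256):
        # Append-only custody record: each entry hashes the previous one, so
        # altering or reordering the history breaks the chain.
        record = {
            "prev": prev_hash,
            "actor": actor,            # e.g., "evidence-server-01" (hypothetical)
            "action": action,          # e.g., "recorded", "exported", "transferred"
            "file_sha256": file_sha256,
            "time": time.time(),
        }
        record["hash"] = hashlib.sha256(
            json.dumps(record, sort_keys=True).encode()
        ).hexdigest()
        return record

    # Hypothetical usage: each handoff of the tape appends a linked entry.
    chain = [custody_entry("genesis", "bodycam-17", "recorded", "ab12cd34")]
    chain.append(custody_entry(chain[-1]["hash"], "evidence-server-01",
                               "transferred", "ab12cd34"))

A log like this does not prove a tape is authentic, but it documents where the file came from and who handled it, which is exactly the metadata history such a protocol contemplates.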
In criminal cases, it means that defendants may argue that even if they appear on tape, "it wasn't me"; in cases in which the public is paying for the defense and associated costs, judges will be asked to approve the costs of testing authenticity. That testing could become expensive and, until the technology advances, is not guaranteed to produce a clear answer. Judges will then be faced with questions as to whether to allow a defendant to argue that a deepfake is the reason the jury sees his face on the tape, in an effort to raise a reasonable doubt.
In civil cases, there will undoubtedly be Daubert and Frye challenges to the expertise of those individuals proffered as experts in differentiating real from fake, and the financial burden of undertaking that exercise may be one that certain parties cannot bear. An inability to effectively challenge a suspected or actual fake tape could make the difference between winning and losing a case.
We are entering a new era in which what we see cannot be equated with what we should believe. Lawyers and judges will need to keep abreast of developments in this area in order to confront challenges.
Katherine B. Forrest is a partner in Cravath, Swaine & Moore's litigation department. She most recently served as a U.S. District Judge for the Southern District of New York and previously served as a Deputy Assistant Attorney General in the Antitrust Division of the U.S. Department of Justice.