Deepfakes and the Growing Trend of Fabricated Video Evidence
Can you completely trust your evidence? Is it easy for a nefarious individual to fake a surveillance video and, based on this doctored material, convince a judge or jury to make a specific ruling or verdict?
May 02, 2019 at 11:50 AM
Imagine a jury watching the evidence in your trial. A video of the suspect committing murder is playing. The video is clear. The suspect can be identified. His voice can be heard. The victim's mother shouts, "My baby!" The verdict is now a foregone conclusion. He's convicted and executed. Years later, you learn the video of the murder was doctored.
You had no reason to second-guess the authenticity of that video. It looked and sounded genuine. That's the problem with deepfakes: you think you're viewing factual footage. These videos can alter a jury's perception of reality; they can sway voter opinions, change the outcome of a trial, ruin reputations or even incite violence.
In December 2017, a Reddit user called "deepfakes" began superimposing celebrities' faces onto pornographic video clips. The word blends "deep learning," the type of artificial intelligence (AI) used to create these images, with "fakes." The term quickly broadened to mean any falsified video produced with AI-based human image synthesis.
As the technology improves, it will become increasingly difficult to differentiate between authentic and falsified videos. Deepfake technology is also becoming more affordable and accessible. Will video lose its status as trusted evidence? Video is powerful evidence for influencing a jury, and that is exactly what makes deepfakes so dangerous.
In a recent article, Riana Pfefferkorn, associate director of surveillance and cybersecurity at the Stanford Center for Internet and Society, said, "Deepfakes get at a concept that is very important in both encryption and cybersecurity more generally: authentication." She added, "AI is now capable of generating fake human faces, which I for one cannot detect any telltale signs that it's not a real photo." She noted that abuse of the tools used to make fake videos is only going to get worse and said she expects to see litigation over the growing problem. Pfefferkorn predicts that courts will increasingly call on qualified experts to help weed out deepfakes before they reach the courtroom: "This is such a cutting-edge issue that there are only a few people who right now are qualified enough to give expert opinions as to whether or not something is a deepfake."
Without qualified experts to authenticate potentially questionable video, juries might be swayed by arguments not to consider certain evidence at all.
How can this emerging problem be addressed? One way is by analyzing and carefully controlling the chain of evidence. A strong chain of evidence makes it harder for falsified video to be entered at all: when there is no doubt about a recording's source and handling, a judge can feel confident admitting it.
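As a simple illustration of how a chain of evidence can be documented in practice, an examiner might record a cryptographic fingerprint of a file at the moment of collection and recompute it before trial. The sketch below is a minimal, hypothetical Python example; the file name and workflow are placeholders for illustration, not a description of any particular forensic tool.

```python
import hashlib


def sha256_of_file(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


if __name__ == "__main__":
    # Hypothetical workflow: hash the clip when it is collected and record the
    # value in the evidence log; recompute it later and compare. Any mismatch
    # means the file changed somewhere along the chain of custody.
    path = "surveillance_clip.mp4"  # placeholder file name for illustration
    print(f"{path}: SHA-256 = {sha256_of_file(path)}")
```

A matching hash shows only that the file has not changed since it was logged; it does not, by itself, prove the original recording was genuine, which is why expert review remains necessary.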
Another way to combat deepfakes is constant attention and vigilance in recognizing the artifacts and telltale signs left behind by the software that creates these videos.
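By way of illustration only, part of that vigilance can be automated by scanning a clip for statistical oddities, such as frames whose level of image detail departs sharply from the rest of the recording, which can happen where generated or re-rendered content has been spliced in. The toy sketch below assumes the OpenCV and NumPy libraries are installed and uses a placeholder file name; real forensic analysis relies on far more sophisticated, validated methods and trained examiners.

```python
import cv2
import numpy as np


def frame_sharpness(video_path: str) -> list[float]:
    """Variance of the Laplacian per frame: a crude proxy for image detail."""
    scores = []
    cap = cv2.VideoCapture(video_path)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        scores.append(float(cv2.Laplacian(gray, cv2.CV_64F).var()))
    cap.release()
    return scores


if __name__ == "__main__":
    scores = frame_sharpness("surveillance_clip.mp4")  # placeholder file name
    if scores:
        median, spread = np.median(scores), np.std(scores)
        # Frames whose detail level is far from the rest of the clip are merely
        # flagged for a human examiner; this is not proof of tampering.
        flagged = [i for i, s in enumerate(scores) if abs(s - median) > 3 * spread]
        print(f"Frames flagged for manual review: {flagged}")
```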
Hollywood is able to reproduce historic events with great accuracy and realism, such as the D-Day scene in “Saving Private Ryan.” In fact, the realism in this scene was so believable that some veterans had to leave theaters.
Until now, courts haven't been particularly concerned about Hollywood-grade technology being used to fabricate evidence, so why is the issue being raised now?
Because creating that kind of realism for film and television has been expensive and required considerable expertise. Deepfake technology is a bigger threat because it is becoming accessible to the general public at low cost and with minimal education or training.
The risk of fake evidence is higher today than ever and only growing. A good chain of evidence, along with building and improving systems to sniff out fake video and audio, is key to preserving authenticity at trial.
David Notowitz is the founder of NCAVF. He is an Emmy award-winning producer and multifaceted video and audio forensic evidence expert. His specialties include news, documentaries and commercial video production. Notowitz works as a forensic video expert witness on cases investigated by police officers, detectives, private investigators, insurance investigators, public defenders and criminal defense attorneys, as well as with private civil and criminal attorneys and large corporations across the United States.