Attack of the (Voice) Clones: Protecting the Right to Your Voice
A wide range of tools has been developed to perform voice cloning, and vocal deepfakes have become a common source of scams and misinformation. These problems have been exacerbated by the lack of appropriate laws and regulations to rein in the use of AI and protect an individual's right to their voice.
September 23, 2024 at 12:03 PM
In January 2023, AI speech synthesis company ElevenLabs, Inc. released a beta platform for its natural-sounding voice cloning tool. Using this platform, a brief snippet of a person's voice could be used to generate audio files of the target saying anything the uploader desired. The release triggered a spike in misappropriated voice cloning, from viral rap songs to parodies of political figures. Recognizing that its software was being widely misused, ElevenLabs installed safeguards to ensure the company could trace generated audio back to its creator. But it was too late. Pandora's box was already open.
© 2024 ALM Global, LLC, All Rights Reserved.