California passes bill mandating consent for AI use of deceased actors' likenesses
Legislation and industry actions address unauthorized AI replicas of voices and likenesses, with SAG-AFTRA leading efforts to protect performers' rights amid rising deepfake threats.
California's Senate has given the green light to a new law that mandates obtaining consent from the estates of deceased performers before using their AI-generated likenesses.
The bill, AB 1836, passed on August 31. If Governor Gavin Newsom signs it into law, anyone seeking to use AI to recreate a deceased individual's likeness in media will first have to secure approval from that person's estate.
This legislation builds on previous efforts in California, particularly the passage of AB 2602, which established similar consent requirements for the AI use of living actors' likenesses.
The Screen Actors Guild-American Federation of Television and Radio Artists (SAG-AFTRA), which represents around 160,000 professionals in the entertainment and media industries, has been a strong advocate for the legislation. The union applauded the Senate's passage of AB 1836 and is optimistic that Governor Newsom will sign it into law. SAG-AFTRA emphasized that these legal protections mirror those it secured in its television and film contracts with major studios following a four-month strike last year, and it views them as essential to safeguarding performers' rights in the evolving landscape of digital media.
Legislative battle over unauthorized AI replicas of voices and likenesses
In a related development, SAG-AFTRA and AI startup Narrativ have partnered to launch a marketplace where actors can license their voices for AI-generated advertisements, setting their own fees and retaining control over how their voices are used. The deal has sparked heated debate within the entertainment industry. In July, Eleven Labs also made headlines by securing agreements with the estates of Hollywood legends James Dean and Judy Garland to use their voices in its new Reader App.
On July 26, SAG-AFTRA launched a strike over the use of actors' images and voices in video games, warning that AI could replace human performers and lead to significant job losses.
Deepfakes grew tenfold between 2022 and 2023, with over 2 million reported cases of identity fraud, underscoring the urgent need for regulation. In response, U.S. senators introduced the NO FAKES Act on July 31, a bill that would make it illegal to create unauthorized digital replicas of individuals' voices and likenesses and would enable individuals to seek damages for such unauthorized use. OpenAI has expressed support for the legislation.