Zelda Williams speaks out against AI & disturbing recreations of her dad

We’ve been covering the copyright lawsuits filed by authors who didn’t consent to tech companies using their works to train AI programs. It can be hard to follow how the AI programs operate (especially when said programs are named after animals, like LLaMA). Viewing the infringement issues through the lens of actors makes things a bit easier to grasp. Zelda Williams has just described what it’s like to hear AI recreate the voice of her father, the late, great Robin Williams. I think we can all understand on a primal level the supreme ick factor, not to mention the moral failing, of AI playing at being someone’s dearly departed relation. More on what Zelda posted to Instagram on Sunday:

Artificial intelligence has a lot of potential uses, and abuses, such as recreating the image or voice of a person without their consent — which is one of the issues at the heart of the ongoing actors’ strike.

Zelda Williams understandably has some opinions on this — as her dad was the often imitated, never duplicated Robin Williams — and she weighed in on the practice of AI recreations, calling them “disturbing.”

“I am not an impartial voice in SAG’s fight against AI,” Williams wrote via her Instagram Story on Sunday. “I’ve witnessed for YEARS how many people want to train these models to create/recreate actors who cannot consent, like Dad. This isn’t theoretical, it is very very real.”

Robin Williams, an Oscar-winning actor and comedian known for his improvisational skills and characterizations, died by suicide in 2014.

“I’ve already heard AI used to get his ‘voice’ to say whatever people want and while I find it personally disturbing, the ramifications go far beyond my own feelings,” she continued. “Living actors deserve a chance to create characters with their choices, to voice cartoons, to put their HUMAN effort and time into the pursuit of performance.”

SAG-AFTRA has designated AI recreations of an actor’s image or voice — including “the use of performer’s voice, likeness or performance to train an artificial intelligence system designed to generate new visual, audio, or audiovisual content” — as “a mandatory subject of bargaining.”

The subject is a hot-button topic in Hollywood, with Tim Burton comparing AI recreations to “a robot taking your humanity, your soul.” In support of the actors’ strike, John Cusack called AI a “criminal enterprise.” And on Saturday, Tom Hanks warned of an AI video of himself promoting a dental plan.

“These recreations are, at their very best, a poor facsimile of great people,” Williams concluded, “but at their worst, a horrendous Frankensteinian monster, cobbled together from the worst bits of everything this industry is, instead of what it should stand for.”

[From Entertainment Weekly]

Zelda is really well-spoken here, as she’s been for years. She’s managed to eschew the main nepo baby conversations, I think probably because she’s shifted her focus from acting to directing. Her next project, Lisa Frankenstein, is a horror-comedy that will be her feature-length debut, and it’s written by Diablo Cody. (Ok, so she’s still getting a bit of a nepo bump.) I guess she had the film on her mind when she called AI at its worst “a horrendous Frankensteinian monster.” Still true, though! And speaking of, it wasn’t just Tom Hanks whose likeness got hijacked over the weekend. Gayle King also had to warn people that rogue AI was responsible for a video of her touting weight loss products. A huge chunk of celebs’ management budgets is now gonna be allocated to lawyers playing whack-a-mole with all these fake AI videos. The WGA felt that they secured adequate protections from AI in their new deal; we’ll see how SAG-AFTRA builds on that in their negotiations with the AMPTP.




14 Responses to “Zelda Williams speaks out against AI & disturbing recreations of her dad”


  1. equality says:

    Copyright laws and the like need to be passed to cover this sort of violation of rights, even for deceased people, I think. If some of these companies had to start paying fines, as well as compensating the people imitated and their next of kin, it might slow things down some. I agree with her: let living people earn by performing what they’ve actually agreed to act out or say, instead of recreating someone’s voice to force the issue. It also bugs me when people create memes of someone with a quote as though it were something they said. Is everyone going to have to start copyrighting their voice/face/body in order to not be violated?

    • Torttu says:

      Copyright of “your person” should be automatic. The AI companies are out of control; how they operate is infuriating.

      • Concern Fae says:

        You should control the commercial use of all of your data. There should be severe penalties for false data about you in things like credit reports.

    • Izzy says:

      Control of one’s own name or likeness for commercial purposes is already covered by the right of publicity, and I would think that courts would find that recreating someone in their entirety is using their likeness.

      • equality says:

        From what I’ve read researching it this morning, it’s not. And there are apparently gray areas: if they don’t recreate the likeness exactly, it can be termed an “artistic rendering.” If likeness were covered entirely, people wouldn’t be able to take pictures in public without permission.

  2. Eleonor says:

    This is not only creepy and disrespectful, but painful for the loved ones.

    • goofpuff says:

      Extremely painful and traumatizing. We’re not just talking about hawking products; imagine how someone could use this technology to hurt people. Deepfakes already exist and are used to harm people (usually women) as a form of attack or blackmail.

  3. North of Boston says:

    What’s the reason these AI projects are faking content using the identifiable likenesses and voices of really famous people? That’s right: they are purposely trying to benefit and profit off of the skill, creative talents, reputation and celebrity of those people. And the fact that the AI masters are using previous work by these artists, without permission, to teach AI is outrageous. It’s not just profiting off of somebody else, it’s also harming their reputation and value in the market. While William Devane and Tom Selleck are choosing to use their image, voice and efforts to hawk whatever junky products there are, Hanks and King and Williams did not, and in the last case never can. Putting words in their mouths, using deepfake technology to have them perform things they never said, did, or agreed to, should absolutely be outlawed if it isn’t already, and union contracts that ban it may be the step needed while lawmakers are busy focusing on important tasks like shutting down the US government, using their positions for free stuff and access, and furthering MAGA/Putin’s agenda.

    People aren’t IP that goes into the public domain; they are people with their own rights to their assets, rights that pass on to their heirs.

    I can’t mock up an Apple logo, make a product that looks like an iPhone, and advertise and sell it without serious legal issues. This should be the same. I hope those with deep pockets bring lawsuits against anyone pulling this shit with other people’s image/voice likenesses and make them pay through the nose.

    • Izzy says:

      I mentioned above, but for public figures especially, their name and likeness is considered a type of IP that is protected by the right of publicity. IIRC the two big cases were Waits v. Frito-Lay and Midler v. Ford Motor Co. The cases clearly establish that public figures control the commercial uses of their name and likeness. In Midler’s case, it was the sound of her voice, which had been almost perfectly replicated using a backup singer after Midler had said no to doing the commercial. The recording was so like her that her friends were asking her why she had changed her mind about doing the commercial.

  4. This is going to be hard to enforce. AI will end up skirting just outside of what is not allowed. Instead of using a Robin Williams voice, they will use a voice that is 89% Robin Williams; if that is not allowed, then an 87% Robin Williams voice. That will be easy for AI to do.

    • North of Boston says:

      Then it will be on us, the consumers, to not engage with that content and not pay money for the things it’s used to promote.

      (Like those stupid breakfast sausages … why would I want to eat foods being endorsed by someone who has been dead for 40 years? Even if it was with his estate’s okay)

      • equality says:

        I eat something because it is good, not because someone famous endorses it. Company time and effort would be better spent on creating a quality product than on worrying about celeb endorsements.

      • kirk says:

        I agree that consumers have a responsibility not to pay for products that misuse other people’s personhood or IP. That said, what if I’m already eating or using a product, and then later discover that their advertising is basically misappropriation? Calling the company to voice objections can be done, but wouldn’t it be better to have a mechanism in place that makes it easier for artists to clamp down on misuse of their work or image? I would prefer a broader protection that can be widely agreed upon and adhered to in society. Companies and people that misuse AI to create facsimiles of a person or their unique works without consent are acting unethically, and that has a depressing effect on the emergence of new artists and new forms of content creation.

  5. Twin Falls says:

    This is just so sad for his family. I agree with Tim Burton. It’s theft of humanity.