Scarlett Johansson says ChatGPT copied her voice after she turned them down


Have y’all been following this Scarlett Johansson/ChatGPT drama? To recap: Last September, OpenAI CEO Sam Altman asked ScarJo to be the voice of the ChatGPT 4.0 system. You know how Siri or Alexa will respond to you? Scarlett would be that voice, but for what’s probably the most well-known artificial intelligence platform out there right now. According to Scarlett, she politely said no. Despite her refusal, Altman kept asking, as recently as days before the latest ChatGPT system was released.

Well, I’m sure you can see where this one is going. OpenAI released a demo of the new ChatGPT 4.0 earlier in May, and its “Sky” voice sounds a lot like ScarJo’s. Altman basically added a public “F you” as well, tweeting the single word “her” on May 13. “Her” is likely a reference to Scarlett’s movie Her, in which she voices a chat system named Samantha. As you may recall, Scarlett engaged in a game of FAFO with Disney back in 2021 when she filed a breach of contract lawsuit against them over their handling of Black Widow’s release (an underrated movie that I actually did enjoy). This week, she once again proved that she is Not The One by having her legal team send OpenAI two letters and then releasing a statement calling Altman out. Altman, of course, tucked his tail between his legs, denied using Scar’s voice, gave some fake apology, and took the Sky voice down.

Scarlett Johansson is speaking out. In a statement to ET, Johansson, 39, through her rep, accused Sam Altman, the CEO of OpenAI, of recreating her voice for his company’s ChatGPT after she turned down the opportunity to voice the chatbot.

“Last September, I received an offer from Sam Altman, who wanted to hire me to voice the current ChatGPT 4.0 system,” Johansson’s statement read. “He told me that he felt that by my voicing the system, I could bridge the gap between tech companies and creatives and help consumers to feel comfortable with the seismic shift concerning humans and AI. He said he felt that my voice would be comforting to people.”

“After much consideration and for personal reasons, I declined the offer,” the statement continued. “Nine months later, my friends, family and the general public all noted how much the newest system named ‘Sky’ sounded like me.”

Upon hearing the demo, Johansson said she “was shocked, angered and in disbelief that Mr. Altman would pursue a voice that sounded so eerily similar to mine that my closest friends and news outlets could not tell the difference.”

“Mr. Altman even insinuated that the similarity was intentional, tweeting a single word ‘her’ – a reference to the film in which I voiced a chat system, Samantha, who forms an intimate relationship with a human,” the statement continued, alluding to the 2013 film, Her.

Days before the demo’s release, Johansson claimed in her statement, “Mr. Altman contacted my agent, asking me to reconsider. Before we could connect, the system was out there.”

“As a result of their actions, I was forced to hire legal counsel, who wrote two letters to Mr. Altman and OpenAI, setting out what they had done and asking them to detail the exact process by which they created the ‘Sky’ voice,” the statement read. “Consequently, OpenAI reluctantly agreed to take down the ‘Sky’ voice.”

Johansson’s statement concluded with a general statement about the emergence of AI.

“In a time when we are all grappling with deepfakes and the protection of our own likeness, our own work, our own identities, I believe these are questions that deserve absolute clarity,” the statement read. “I look forward to resolution in the form of transparency and the passage of appropriate legislation to help ensure that individual rights are protected.”

After Johansson’s statement, which was first obtained by NPR, Altman addressed the situation in a statement to The Verge.

“The voice of Sky is not Scarlett Johansson’s, and it was never intended to resemble hers,” he said. “We cast the voice actor behind Sky’s voice before any outreach to Ms. Johansson. Out of respect for Ms. Johansson, we have paused using Sky’s voice in our products. We are sorry to Ms. Johansson that we didn’t communicate better.”

[From ET Online]

It’s super f-cked up that Altman wouldn’t take no for an answer and did what he wanted anyway, but it’s not surprising. He’s a billionaire who seldom sees consequences for his actions. Even when OpenAI’s board fired his ass back in November, he ended up being rehired within days. We all know that men like Altman and big companies in general take advantage of and steal from the little people all of the damn time. It’s just gross how much the rules don’t apply to them and infuriating how easily they can escape legal repercussions. They picked the wrong target in Johansson, though. She does not play and has lots of money to fight back.

I’m glad that ScarJo’s statement made note of some of the dangers of unregulated AI. It’s all fun and games when Alexa is telling us the weather (which I ask her for every single morning), but the further down this road we go and the better the technology gets, the scarier the mess is going to become. It already is for a lot of young women, both celebrities and civilians. How do we protect our likenesses, work, and identities moving forward? We absolutely need more federal regulation. Despite calls for Congress to do something, the only measure currently in play is the DEFIANCE Act, a proposed bill that’s supposed to protect deepfake victims, but it only covers victims of non-consensual deepfake pornography. A previous bill, the Preventing Deepfakes of Intimate Images Act, was introduced in December 2022, but it expired at the end of that Congressional session without passing the House. We’re falling behind on regulating AI and it’s only going to get worse.

photos credit: Ron Sachs/CNP/INSTARimages, IMAGO/RW/Avalon


30 Responses to “Scarlett Johansson says ChatGPT copied her voice after she turned them down”


  1. bisynaptic says:

    Another reason to dislike and distrust “AI”. We need regulation that protects people from these men’s avarice.

  2. sevenblue says:

    What a stupid man. Scarlett sued the Mouse and won. Did he really think she wouldn’t come after him too? It’s scary that men like him never accept the word “no” and keep succeeding anyway. No woman could get away with something like this. And I know there won’t be any online hate campaign against this guy either.

  3. LooneyTunes says:

    I am torn on this. If someone hires an actress who looks like ScarJo after she turns down a part, is that actionable? If he had hired someone who sounded like ScarJo without trying to hire her first, would that be actionable? Where is the line? It seems like he created liability for himself by tweeting “her” after getting turned down by Scarlett, but he would’ve had plausible deniability if he hadn’t done that. This would make an interesting case.

    • sevenblue says:

      But did they hire someone who sounded like Scarlett, or did they feed Scarlett’s earlier work to an AI and “create” this? That’s why her lawyer asked them to detail the exact process, and they dropped the voice immediately. The same thing has happened to other voice actors, like Stephen Fry, whose voice was cloned from his earlier recorded work without his permission.

      • LooneyTunes says:

        @sevenblue Interesting! You’re right—they could’ve used her voice/work to “create” Sky. I want to see the case law that comes out of this (as a legal geek).

    • North of Boston says:

      He did what he did, and the company did what it did. No need to go down any “but what if he didn’t?” path in this instance.

      She is right to call him/them out using whatever legal means are available.

    • Izzy says:

      Yes, it is actionable. The right of publicity lets a person control the commercial use of their name or likeness, and in one of the key cases, Midler v. Ford Motor Co., likeness was deemed to include a distinctive voice when it is so recognizable that there is a strong likelihood of confusion. Altman basically hired someone who sounded just like ScarJo, whose voice is well known and distinctive, because she said no, and the law clearly says you cannot do that.

    • Kate says:

      The scariest part of this to me is when teens start using it to bully or shame their classmates. Imagine someone who hates a certain kid using AI to call another peer in that kid’s voice and say something hateful or sexual, with no proof that it wasn’t the person it was pretending to be. Lots and lots of scary consequences there.

    • Fifty-50 says:

      It’s likely OpenAI didn’t hire anyone. Look up AI voice cloning: there’s enough audio from her movies for ChatGPT to recreate her voice. Voice actors have also recently filed a lawsuit against a company for cloning their voices: https://www.hollywoodreporter.com/business/business-news/scarlett-johansson-ai-legal-threat-1235905899/

      • Blithe says:

        Yes, this. People whose voices are publicly available, and whose voices and likenesses are a critical part of their livelihood, need protections. All of us do. I was recently shocked when I called a financial institution and, in the difficult process of reaching an actual human, was urged to agree to provide a voice sample to identify me in the future. It was very hard to get out of that particular loop without inadvertently leaving what would amount to a voice sample.

    • MinnieMouse says:

      Recreating her voice for commercial purposes, even if it was simply by hiring a voice actor to replicate it, actually is illegal, and has been settled case law since a case involving Bette Midler in the late ’80s, I believe. LaineyGossip did a pretty good rundown on the laws yesterday.

  4. Marysia says:

    Remember how Scarlett sued that French writer for basing one of his characters on her? Or that AI company for making an ad with a character resembling her?
    I don’t get how anyone still wants to mess with her….

  5. Amy T says:

    “I’m sorry Ms. Johansson didn’t communicate better?”

    I hope she takes his arrogant ass to the cleaners. I have friends whose writing was used to train these systems and who were never informed. They have ScarJo’s level of ire but not her resources.

    Grateful that she’s not letting this go, for all our sakes.

  6. Digital Unicorn says:

    ScarJo has a distinctive voice so he knew what he was doing – hope she sues him into oblivion. She’s one of the few people who successfully sued the House of Mouse so she clearly has a badass team of lawyers.

  7. Jay says:

    What’s strange to me is that Altman apparently asked SJ to “reconsider” her decision mere days before the release of Sky. Is that enough time for her to record and have the voice ready to go? I doubt it. I think that they used her voice to train their AI and were looking for her to sign off on it.

    All of this is super creepy, and incidentally, the use of an actor’s likeness was a big part of the actors’ strike negotiations.

    • Lucy2 says:

      I agree, I think they used her voice, ignored her refusal, and thought they would talk her into it as time went on. When she still refused, they just went ahead anyway.
      If they are claiming it is another voice actor, let’s see them! Let’s see this person who sounds just like her.

      • Fifty-50 says:

        That is the MO of OpenAI and really the entire generative AI industry (Midjourney, Stability AI, Anthropic, Google, Meta, etc.), and it’s the reason why every company that has launched an AI is embroiled in a billion lawsuits for copyright violation.

  8. alexc says:

    I worked in IT for 25+ years and Sam Altman is very dangerous. Too many technology companies are run by brilliant sociopaths with zero empathy and no emotional intelligence. Oh and they are arrogant and greedy AF.

  9. Wagiman says:

    The ship has sailed! It’s been a mess for a long time. AI is corrupting, and will keep corrupting, legal systems, government, you name it. It’s unstoppable. The legal fallout is going to be huge.

  10. Fifty-50 says:

    There’s a bill making its way through the Senate called the No Fakes Act, which has gained bipartisan support. In the meantime, New York has passed some laws giving people the right to sue for the commercial use of deepfakes and the Colorado governor recently signed a pretty broad AI bill into law. The thing about states passing their own laws is that companies that want to do business there have to comply—OpenAI and Google etc aren’t going to suddenly stop operating in Colorado and move out (especially with Denver being a center for tech), so it paves the way for other states to pass similar laws while Congress plays catch-up.

    • Lizzie Bathory says:

      The EU also just passed its own AI regulations, which will affect how all of these companies operate. More regulation for the “move fast & break things” crowd, please!

      What’s especially gross in this story is that, at its base, it has nothing to do with tech or the law. This is just Sam Altman deciding to recreate the story from “Her,” regardless of whether Scarlett agreed to contribute her work. So he decided to just steal, which is very on brand for the AI world.

      • Fifty-50 says:

        Altman is operating as though this is still a pre-Cambridge Analytica world, not taking into account that regulators and lawmakers are much more skeptical about giving tech free rein in the name of “innovation.”

        Lol please excuse me as I get on my soapbox – I think AI has a huge image problem right now. At the moment, it’s associated with:

        – massive theft of authors’ writing, artists’ works, and singers’ music (I’ve actually lost count of all the lawsuits)
        – the proliferation of deepfake images, audio, and video
        – creation of child sexual abuse materials (DOJ recently arrested a guy who made and distributed over 13,000 CSAM pictures created using the text-to-image generative AI program Stable Diffusion)
        – election fraud (the New Hampshire Primaries robocalls faking Biden’s voice were AI-generated)
        – hallucinations (the number of lawyers who have been sanctioned for citing fake case law invented by AI is hilarious)
        – discrimination (hiring systems that use machine learning to select candidates are more likely to discriminate against Black applicants; Rite Aid was sued for using advanced facial recognition technology which disproportionately denied entry to Black and brown individuals)
        – healthcare steering (algorithms are more likely to recommend that insurance companies deny certain treatments based on expected outcome and cost)
        – climate crisis (AI currently uses more energy than Morocco)

        There’s also the blatant hypocrisy of OpenAI trying to file a lawsuit against Reddit for r/ChatGPT’s “unauthorized use” of OpenAI’s logo: https://www.gizchina.com/2024/05/13/openai-vs-reddit-legal-battle/

        The buzz that generative AI initially created among investors is already fading. Many investors are no longer impressed, and some are actively wary of a company’s claims regarding AI, especially since there have been several cases of AI-washing. All of this has served to make the public generally distrustful of generative AI.

        I mean, obviously I have an extremely skewed view of AI but right now, in this present moment, I honestly don’t see what society has actually gained from this technology. What current, real-world problem has AI solved? Tech has promised it will do amazing things – I haven’t seen anything amazing yet.

  11. Karen says:

    Sam Altman’s sister has accused him of abusing her for years. Here is a statement from her made around the time he was fired by the board of OpenAI.

    https://www.hackingbutlegal.com/p/statement-by-annie-altman-sam-altmans

  12. UpIn Toronto says:

    Machine learning AI: YES!

    Generative AI: NO! Stop scraping the internet for source material.

  13. Rnot says:

    A billionaire who won’t take no for an answer. How long until we hear about sexual harassment and assaults?

    • SarahCS says:

      Not sexual, but two comments above yours there’s a link to a statement by his sister about how he has abused and controlled her.