Scarlett Johansson has released a statement expressing her shock at a new voice mode developed by OpenAI which sounds eerily like her character in the 2013 film Her.
Last week, the generative artificial intelligence company released a demo of its new GPT-4o model, which included a feature that answers user questions in a feminine, emotive voice.
The voice option, dubbed “Sky”, was removed on Monday (the company said it is pausing the feature for now) following widespread criticism, especially after Johansson’s statement revealed that OpenAI had approached her nine months earlier to voice its AI system, an offer she declined for “personal reasons”.
“Last September, I received an offer from Sam Altman, who wanted to hire me to voice the current ChatGPT 4.0 system,” the actor revealed in her statement.
“He told me that he felt that by my voicing the system, I could bridge the gap between tech companies and creatives and help consumers to feel comfortable with the seismic shift concerning humans and AI. He said he felt that my voice would be comforting to people.”
Johansson went on to say that her friends, family and the general public “all noted how much the newest system named ‘Sky’ sounded like me.”
“When I heard the released demo, I was shocked, angered and in disbelief that Mr. Altman would pursue a voice that sounded so eerily similar to mine that my closest friends and news outlets could not tell the difference,” she continued.
“Mr. Altman even insinuated that the similarity was intentional, tweeting a single word “her” – a reference to the film in which I voiced a chat system, Samantha, who forms an intimate relationship with a human.”
The actor said that, as a result of OpenAI’s actions, she was forced to hire legal counsel.
“In a time when we are all grappling with deepfakes and the protection of our own likeness, our own work, our own identities, I believe these are questions that deserve absolute clarity,” she concluded. “I look forward to resolution in the form of transparency and the passage of appropriate legislation to help ensure that individual rights are protected.”
On Monday morning, the company posted a public statement, acknowledging the concerns and saying it is “working to pause the use of Sky while we address them.”
“We believe that AI voices should not deliberately mimic a celebrity’s distinctive voice — Sky’s voice is not an imitation of Scarlett Johansson but belongs to a different professional actress using her own natural speaking voice,” the statement read. “To protect their privacy, we cannot share the names of our voice talents.”
OpenAI claimed that it chose Sky’s voice based on criteria that included having a “warm, engaging, confidence-inspiring, charismatic voice with rich tone” and being “an approachable voice that inspires trust.”
According to the company, it received more than 400 submissions from voice and screen actors; the audition involved recording a script of ChatGPT responses.
“We spoke with each actor about the vision for human-AI voice interactions and OpenAI, and discussed the technology’s capabilities, limitations, and the risks involved, as well as the safeguards we have implemented. It was important to us that each actor understood the scope and intentions of Voice Mode before committing to the project.”
Just days after the launch of the latest AI model, Jan Leike, a safety researcher at OpenAI, quit, citing concerns over the company’s safety culture and processes.
“Over the past years, safety culture and processes have taken a backseat to shiny products,” he wrote on X.
“These problems are quite hard to get right, and I am concerned we aren’t on a trajectory to get there…Building smarter-than-human machines is an inherently dangerous endeavour. OpenAI is shouldering an enormous responsibility on behalf of all of humanity.”
Over the weekend, Sam Altman and OpenAI’s president, Greg Brockman, sought to address the concerns, writing on X: “We believe both in delivering on the tremendous upside and working to mitigate the serious risks; we take our role here very seriously and carefully weigh feedback on our actions.”