"You're full of 💩": Smite studio walks back contract that would let it use AI to replicate dead voice actors
Hi-Rez Studios takes heat for a clause in its contract that says it can make AI clones of your voice if you die
Update: Hi-Rez president Stewart Chisam has shared an updated version of the AI rider described in the report below. The controversial clause, which previously stipulated that the studio could use AI to replicate the voices of dead actors, appears to have been removed in its entirety, while protection against AI cloning and "digital doubles" remains.
Original story...
Smite and Paladins developer Hi-Rez Studios has responded to backlash over its alleged use of AI to replicate actors' voices, arguing that it would only do so in the narrow case of an actor dying or becoming incapacitated while under contract, and it has now pledged to change its contract in response.
Voice actor Henry Schrader, who has credits on Genshin Impact, Blue Lock, and One Piece, said in a tweet Thursday that Hi-Rez "will be using AI to clone voices" and added that it had "refused to add in any words to contracts that would protect actors from it."
Schrader went on to say that several actors he'd been in contact with were told they couldn't view the contract outlining these conditions unless they signed a non-disclosure agreement, preventing them from warning others about working with the developer.
Hi-Rez president Stewart Chisam responded by saying Schrader was "full of 💩" and sharing a partial clause seemingly protecting voice actors from AI voice replication. However, other voice actors joined the conversation with their own experiences with Hi-Rez contracts. Marin M. Miller, the voice actor behind Nimbus from Destiny 2's Lightfall expansion, pointed to contracts they said they'd seen that would allow Hi-Rez to "simulate the talents' voice after they die."
"Where are your sources because as the guy that approved an AI Rider for voice talent that includes the following, I think you're full of 💩" — Stewart Chisam (pic.twitter.com/kwEwAejWxS), August 17, 2023
Following increasing blowback, Chisam shared what he claims is the full rider on AI-related contract details, which indeed seems to confirm that Hi-Rez has stipulated to actors that it may use AI to simulate their voices after they die or become incapacitated. Here's the full clause in question:
"Client agrees not to use, or sublicense, the Performance to simulate Talent's voice or likeness to create any synthesized or "digital double" voice or likeness of Talent. Notwithstanding the foregoing, in the event of Talent's death or incapacity that leaves Talent unable to perform at any foreseeable time, Client shall be permitted to use the Performance or recording(s) or other digital representation(s) of the Performances of Talent to produce new audio, images, and/or video of Talent's voice and/or likeness (a 'Synthetic Performance')."
Essentially, the clause seems to confirm that the accusations against Hi-Rez are true, at least to some degree. In a statement to 12DOVE, Chisam said "the clause around death was not that important to us and so narrow in our use case we would probably never use it," adding, "I think the lawyers wanted something that would cover a lot of contingencies (that's their job to think through) and the way it's worded seemed reasonable."
Chisam tells us he's since "asked the lawyers to remove that part," referring to the section about AI cloning of dead actors' voices. It's important to note that this clause also includes protection from "digital double" AI clones, so the final wording of the amended contract will be important. That said, we asked Chisam if Hi-Rez would be willing to work with voice actors who want to opt out of having their voice used for AI in this way, to which he responded: "Yeah."
"I don't want to have a lot of different versions of the Rider floating out there as that becomes difficult to keep track of who has what," he explained. "So I've asked the lawyers to remove that part. When I was asked to review it, it seemed reasonable and I approved it, but it's not really a clause I give a shit about."
The use of AI to replicate actors' likenesses is one of the issues at the heart of the ongoing SAG-AFTRA strike, which has the potential to impact the release schedules of a massive slate of upcoming movies and TV shows. It's a growing concern in the video game industry as well, with actors including Jennifer Hale and David Hayter going on record to oppose the practice.
We've asked for clarification on the removal of language from the contract and will update this article if we hear back.
After scoring a degree in English from ASU, I worked as a copy editor while freelancing for places like SFX Magazine, Screen Rant, Game Revolution, and MMORPG on the side. Now, as GamesRadar's west coast Staff Writer, I'm responsible for managing the site's western regional executive branch, AKA my apartment, and writing about whatever horror game I'm too afraid to finish.