Right now, every industry faces discussions about how artificial intelligence might help or hinder work. In film, creators worry that their work might be stolen to train AI replacements, that machines might take their future jobs, or even that the entire process of filmmaking could become fully automated, removing the need for everyone from directors to actors to the crews behind the scenes.
But “AI” is far more complicated than ChatGPT and Sora, the kinds of publicly accessible tools that crop up on social media. For visual effects artists, like those at Wētā FX who worked on Kingdom of the Planet of the Apes, machine learning can be just another powerful tool in an artistic arsenal, used to make movies bigger and better-looking than before. Kingdom visual effects supervisor Erik Winquist sat down with Polygon ahead of the movie’s release and discussed the ways AI tools were key to making the movie, and how the limitations of those tools still keep the human element central to the process.
For the making of Kingdom of the Planet of the Apes, Winquist says some of the most important machine-learning tools were called “solvers.”
“A solver, essentially, is just taking a bunch of data — whether that’s the dots on an actor’s face [or] on their mocap suit — and running an algorithm,” Winquist explains. “[It’s] trying to find the least amount of error, essentially trying to match up where those points are in 3D space, to a joint on the actor’s body, their puppet’s body, let’s say. Or in the case of a simulation, a solver is essentially taking where every single point — in the water sim, say — was in the previous frame, looking at its velocity, and saying, ‘Oh, therefore it should be here [in the next frame],’ and applying physics every step of the way.”
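The two senses of “solver” Winquist describes can be sketched in a few lines of Python. This is a hypothetical illustration, not Wētā FX’s actual pipeline code: `fit_translation` finds the least-squares offset that best aligns observed mocap markers to a puppet’s joint positions, and `euler_step` advances a simulation one frame by moving each point along its velocity and then applying physics, exactly as described above.

```python
import numpy as np

def fit_translation(markers, joints):
    """Error-minimizing match of mocap markers to puppet joints.

    For a pure translation, the offset that minimizes the sum of
    squared distances between corresponding points is simply the
    difference of the two point clouds' centroids.
    (Real facial/body solvers fit far richer models than this.)
    """
    return joints.mean(axis=0) - markers.mean(axis=0)

def euler_step(pos, vel, dt=1.0 / 24.0,
               gravity=np.array([0.0, -9.81, 0.0])):
    """One step of a simulation solver (explicit Euler integration).

    Each point moves along its current velocity ("therefore it
    should be here in the next frame"), then physics -- here just
    gravity -- updates the velocity for the following step.
    """
    new_pos = pos + vel * dt
    new_vel = vel + gravity * dt
    return new_pos, new_vel
```

A production water sim adds pressure, viscosity, and collision forces on top of this basic advance-and-apply-forces loop, but the frame-by-frame structure is the same.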
For the faces of Kingdom’s many ape characters, Winquist says the solvers might manipulate digital ape models to roughly match the actors’ mouth shapes and lip-synching, giving the faces the vague creases and