Sony teased an exciting AI reveal last week, and today it unveiled Gran Turismo Sophy, a new AI developed in partnership with Polyphony Digital that races so well it outperforms professional GT drivers.
On the surface, it looks like a very good driving AI, and if you’re anything like me, you’ve had your butt kicked by AI drivers in the likes of Need for Speed, Dirt, and even Gran Turismo before. However, Sony’s presentation today, which featured speakers from Polyphony Digital including Gran Turismo 7 producer Kazunori Yamauchi, showcased just how great Sophy is at Gran Turismo.
Before diving into the brain behind Sophy, let me mention how Sony showcased Sophy’s strength during a preview Game Informer attended: they had multiple professional Gran Turismo Sport players (read: real humans) race against multiple versions of Sophy, and guess who came in first place? Sophy. A human came in second place, but another version of the Sophy AI took home the bronze medal.
In short, Sophy outperformed one of the best Gran Turismo players in the world, and the AI did it quite realistically. That’s because Sophy isn’t designed to race as well as (or better than) human players through unrealistic means. Instead, Sophy uses real tactics and racing strategy to finish first, in much the same way professional GT drivers do.
“GT Sophy is an autonomous AI agent trained utilizing a novel deep reinforcement learning platform developed in collaboration between Sony AI, PDI, and SIE,” Sony says. “Each group contributed to the success of the project by bringing together expertise in fundamental AI research and development, a hyper-realistic real world racing simulator, and infrastructure for massive scale AI training.”
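Sony hasn’t published Sophy’s training code, but if you’re curious what “deep reinforcement learning” looks like in practice, here’s a bare-bones sketch of the general idea: a tiny neural-network policy learns, by trial and error and a REINFORCE-style update, to hold a good speed in a made-up corner-approach scenario. Everything in it (the toy environment, the network size, the hyperparameters) is an assumption for the sake of illustration; this is the textbook technique, not Sony AI’s or Polyphony’s actual platform.

# Minimal deep reinforcement learning sketch (illustrative only -- NOT Sony AI's
# or Polyphony Digital's code; the toy "racing" environment, network size, and
# hyperparameters are all stand-ins).
import numpy as np

rng = np.random.default_rng(0)

STATE_DIM = 2      # toy state: [normalized speed, normalized distance to corner]
HIDDEN = 16
N_ACTIONS = 3      # 0 = brake, 1 = coast, 2 = throttle
LR = 0.01

# One-hidden-layer policy network: state -> probability distribution over actions.
W1 = rng.normal(0.0, 0.1, (STATE_DIM, HIDDEN))
W2 = rng.normal(0.0, 0.1, (HIDDEN, N_ACTIONS))

def policy(state):
    h = np.tanh(state @ W1)
    logits = h @ W2
    p = np.exp(logits - logits.max())
    return h, p / p.sum()

def step(state, action):
    """Toy dynamics: throttle raises speed, braking lowers it. Reward is highest
    when the car holds the (made-up) ideal corner speed of 0.6."""
    speed, dist = state
    speed = float(np.clip(speed + (action - 1) * 0.1, 0.0, 1.0))
    dist = max(dist - speed * 0.1, 0.0)
    reward = 1.0 - abs(speed - 0.6)
    return np.array([speed, dist]), reward, dist <= 0.0

for episode in range(2000):
    state = np.array([0.2, 1.0])
    trajectory = []
    for t in range(200):                       # cap episode length
        h, probs = policy(state)
        action = int(rng.choice(N_ACTIONS, p=probs))
        next_state, reward, done = step(state, action)
        trajectory.append((state, h, probs, action, reward))
        state = next_state
        if done:
            break

    # REINFORCE update: push the policy toward actions that preceded high returns.
    rewards = [r for *_, r in trajectory]
    returns = np.cumsum(rewards[::-1])[::-1]   # return-to-go from each step
    for (s, h, probs, a, _), G in zip(trajectory, returns):
        grad_logits = -probs
        grad_logits[a] += 1.0                  # d log pi(a|s) / d logits
        dW2 = np.outer(h, grad_logits)
        dW1 = np.outer(s, (W2 @ grad_logits) * (1.0 - h ** 2))
        W2 += LR * G * dW2
        W1 += LR * G * dW1

In Sophy’s case, the same basic trial-and-error idea is scaled up enormously: per the quote above, the agent learns by driving inside Gran Turismo’s own hyper-realistic simulation on infrastructure built for massive-scale training, rather than in a toy environment like the one sketched here.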