Gran Turismo 7 players can look forward to racing against a superhuman AI competitor known as ‘Gran Turismo Sophy’, thanks to a breakthrough in artificial intelligence.
Created by Sony AI, GT Sophy was made in conjunction with Gran Turismo developer Polyphony Digital and has been in development for five years. Sophy is an autonomous AI agent trained to win against the best GT Sport drivers in the world today. It uses a brand-new AI algorithm that has helped Sony create a sophisticated racing opponent, one that simply hasn't been possible until now.
However, unlike typical AI opponents, Gran Turismo Sophy isn't the equivalent of setting the in-game CPU to hard. It reacts and drives like a real human (or superhuman), and crucially it had to master three essential driving skills using deep reinforcement learning (essentially, the process of repeating countless scenarios and learning from the outcomes until the best result is achieved).
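The article doesn't detail Sony's algorithm, but the trial-and-error loop it describes is the core of reinforcement learning. A minimal sketch of that idea, using textbook tabular Q-learning on an entirely hypothetical one-corner "track" (the track layout, rewards, and hyperparameters below are illustrative assumptions, not anything from GT Sophy):

```python
import random

# Hypothetical toy track: states 0..4. At each state the driver picks
# "brake" (0) or "accelerate" (1). Accelerating advances two cells,
# braking one. State 3 is a corner: accelerating there crashes the car
# (reward -10, episode over); state 4 is the finish line (+10).
N_STATES, ACTIONS = 5, (0, 1)
CORNER, FINISH = 3, 4

def step(state, action):
    """Return (next_state, reward, done) for the toy track."""
    if state == CORNER and action == 1:
        return state, -10.0, True          # crashed in the corner
    nxt = min(state + (2 if action == 1 else 1), FINISH)
    if nxt == FINISH:
        return nxt, 10.0, True             # crossed the finish line
    return nxt, -1.0, False                # small per-step time penalty

Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
alpha, gamma, epsilon = 0.5, 0.9, 0.1      # learning rate, discount, exploration
random.seed(0)

for _ in range(2000):                      # repeat many scenarios (episodes)
    s, done = 0, False
    while not done:
        # Epsilon-greedy: mostly exploit the best-known action, sometimes explore.
        if random.random() < epsilon:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda act: Q[(s, act)])
        s2, r, done = step(s, a)
        best_next = 0.0 if done else max(Q[(s2, act)] for act in ACTIONS)
        Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
        s = s2

# The learned policy: accelerate on the straights, brake for the corner.
policy = [max(ACTIONS, key=lambda act: Q[(s, act)]) for s in range(FINISH)]
print(policy)
```

After enough repetitions the agent learns to brake at state 3 and accelerate elsewhere, purely from reward feedback. GT Sophy works on the same principle, but with a deep neural network instead of a table, continuous steering and throttle inputs, and the full physics simulation of the game as its environment.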
“Gran Turismo Sophy is a significant development in AI whose purpose is not simply to be better than human players, but to offer players a stimulating opponent that can accelerate and elevate the players’ techniques and creativity to the next level,” said Hiroaki Kitano, CEO of Sony AI. “In addition to making contributions to the gaming community, we believe this breakthrough presents new opportunities in areas such as autonomous racing, autonomous driving, high-speed robotics and control.”
The first element Sophy had to master was race car control. Sophy needed to understand car dynamics, racing lines, and precision maneuvers to conquer challenging tracks. Next, Sophy had to learn racing tactics. Split-second decision-making skills are required when racing to deal with ever-evolving
Read more on techradar.com