Hyper-capable AIs have been beating us at our own games for years. Whether it’s Go or Jeopardy, DOTA 2 or NetHack, artificial intelligences have routinely proven themselves superior competitors, advancing not only the state of the gaming arts but also those of machine learning and computational science. On Wednesday, Sony announced its latest addition to the field, GT Sophy, an AI racer capable of taking on — and beating — some of the world’s best Gran Turismo players.
GT Sophy (the GT stands for “Gran Turismo”) is the result of a collaboration between Sony AI, Polyphony Digital (PDI) and Sony Interactive Entertainment (SIE), as well as more than half a decade of research and development.
“Gran Turismo Sophy is a significant development in AI whose purpose is not simply to be better than human players, but to offer players a stimulating opponent that can accelerate and elevate the players’ techniques and creativity to the next level,” Sony AI CEO Hiroaki Kitano said in a statement Wednesday. “In addition to making contributions to the gaming community, we believe this breakthrough presents new opportunities in areas such as autonomous racing, autonomous driving, high-speed robotics and control.”
Using a novel deep reinforcement learning method, the research team taught its AI agent how to control a digital race car within the structure of the GT game, helping Sophy understand vehicle dynamics and capabilities, racing tactics like slipstreaming, passing and blocking overtakers, and basic track etiquette.
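Sony hasn't published GT Sophy's training code, and its actual system is a far more sophisticated deep RL setup than anything shown here. But the core reinforcement-learning idea — an agent learning where to brake and where to go flat out by trial, error and reward — can be loosely illustrated with a minimal tabular Q-learning sketch on a toy track. Every name and number below is an invented stand-in, not a detail of Sophy's real training:

```python
import random

# Toy stand-in for the racing problem: a 10-segment track where
# segment 5 is a corner. Action 0 = brake (advance 1 segment),
# action 1 = full throttle (advance 2). Hitting the corner at full
# throttle "crashes" (big penalty); otherwise reward equals speed,
# so the agent is pushed to go as fast as it safely can.
TRACK_LEN, CORNER = 10, 5

def step(pos, action):
    speed = 1 if action == 0 else 2
    nxt = min(pos + speed, TRACK_LEN)
    if action == 1 and pos <= CORNER < pos + speed:  # took the corner flat out
        return nxt, -10.0, True                       # crash ends the episode
    return nxt, float(speed), nxt >= TRACK_LEN

def train(episodes=2000, alpha=0.5, gamma=0.95, eps=0.1, seed=0):
    rng = random.Random(seed)
    Q = [[0.0, 0.0] for _ in range(TRACK_LEN + 1)]    # Q[position][action]
    for _ in range(episodes):
        pos, done = 0, False
        while not done:
            # Epsilon-greedy: mostly exploit the best known action,
            # occasionally explore at random.
            if rng.random() < eps:
                a = rng.randrange(2)
            else:
                a = max((0, 1), key=lambda x: Q[pos][x])
            nxt, r, done = step(pos, a)
            target = r + (0.0 if done else gamma * max(Q[nxt]))
            Q[pos][a] += alpha * (target - Q[pos][a])  # Q-learning update
            pos = nxt
    return Q

Q = train()
greedy = lambda p: max((0, 1), key=lambda a: Q[p][a])
```

After training, the learned greedy policy goes full throttle on the open track (`greedy(0)` is 1) but brakes into the corner (`greedy(4)` and `greedy(5)` are 0) — a tiny analogue of Sophy learning braking points, at a vastly smaller scale than the deep networks and high-fidelity physics the real project used.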
“To drive competitively GT Sophy had to learn to control the car at the physical limit, optimize for braking and acceleration points, as well as find the right lines that squeeze the last tenth
Read more on engadget.com