AI agents have bested humans at many games, from chess to Go to poker. Now, the machines can claim a new high score on the classic racing video game series Gran Turismo.
Sony announced today that its researchers have developed an AI driver named GT Sophy that is “reliably superhuman” — able to beat top human drivers in Gran Turismo Sport in back-to-back laps. You might think this an easy challenge. After all, isn’t racing simply a matter of speed and reaction time and therefore simple for a machine to master? But experts in both video game racing and artificial intelligence say GT Sophy’s success is a significant breakthrough, with the agent showing mastery of tactics and strategy.
“Outracing human drivers so skilfully in a head-to-head competition represents a landmark achievement for AI,” writes Stanford automotive professor J. Christian Gerdes in an editorial in the scientific journal Nature that accompanies a paper describing the work. “GT Sophy’s success on the track suggests that neural networks might one day have a larger role in the software of automated vehicles than they do today.”
GT Sophy was trained using a method known as reinforcement learning: essentially a form of trial-and-error in which the AI agent is thrown into an environment with no instructions and rewarded for hitting certain goals. In the case of GT Sophy, Sony’s researchers say they had to craft this “reward function” extremely carefully: for example, fine-tuning penalties for collisions in order to shape a driving style that was aggressive enough to win but that didn’t lead to the AI simply bullying other racers off the road.
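The kind of reward shaping described above can be sketched in a few lines of code. The field names, penalty weights, and structure below are illustrative assumptions for a generic racing agent, not Sony's actual reward function:

```python
# Hypothetical sketch of a shaped reward for a racing RL agent.
# All names and coefficients here are illustrative, not GT Sophy's design.

from dataclasses import dataclass


@dataclass
class StepInfo:
    progress_m: float       # metres of track progress made this step
    off_course: bool        # did the car leave the track?
    caused_collision: bool  # was the agent at fault for a collision?
    any_collision: bool     # was the agent involved in any contact at all?


def reward(info: StepInfo,
           progress_coef: float = 1.0,
           off_course_penalty: float = 5.0,
           at_fault_penalty: float = 10.0,
           contact_penalty: float = 2.0) -> float:
    """Reward track progress; penalise leaving the course and collisions.

    The ratio of the collision penalties to the progress term is the
    knob the article describes: too small and the agent shoves rivals
    off the road, too large and it drives timidly and never overtakes.
    """
    r = progress_coef * info.progress_m
    if info.off_course:
        r -= off_course_penalty
    if info.caused_collision:
        r -= at_fault_penalty      # at-fault contact is punished hardest
    elif info.any_collision:
        r -= contact_penalty       # incidental contact still discouraged
    return r
```

In a training loop, this scalar would be fed back to the agent after every simulation step; tuning the penalty constants against each other is the "careful crafting" the researchers describe.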
Using reinforcement learning, GT Sophy was able to navigate around a racetrack with just a few hours of training and
Read more on theverge.com