Speed around a French village in the video game Gran Turismo and you might spot a Corvette behind you trying to catch your slipstream.
The technique of using the draft of an opponent's racecar to speed up and overtake them is one favored by skilled players of PlayStation's realistic racing game.
But this Corvette driver is not being controlled by a human — it's GT Sophy, a powerful artificial intelligence agent built by PlayStation-maker Sony.
Gran Turismo players have been competing against computer-generated racecars since the franchise launched in the 1990s. But the new AI driver, unleashed last week in Gran Turismo 7, is smarter and faster because it has been trained using the latest AI methods.
“Gran Turismo had a built-in AI existing from the beginning of the game, but it has a very narrow band of performance and it isn't very good,” said Michael Spranger, chief operating officer of Sony AI. “It's very predictable. Once you get past a certain level, it doesn't really entice you anymore.”
But now, he said, “this AI is going to put up a fight.”
Visit an artificial intelligence laboratory at a university or at a company like Sony, Google, Meta, Microsoft or ChatGPT-maker OpenAI, and it's not unusual to find AI agents like Sophy racing cars, slinging Angry Birds at pigs, fighting epic interstellar battles or helping human gamers build new Minecraft worlds: all part of the job description for computer systems trying to learn how to get smarter in games.
But in some instances, they are also trying to learn how to get smarter in the real world. In a January paper, a University of Cambridge researcher who built an AI agent to control Pokémon characters argued it could “inspire all sorts of applications that