Ubisoft has never been one to shy away from new and upcoming technology, and as you'd expect, it's been experimenting with how generative AI could be implemented in video games. The publisher unveiled its NEO NPC project at GDC, an early prototype that aims to explore how AI tech can make in-game characters and experiences more immersive.
Stressing that the experiment is purely a prototype and that there's "still a way to go before it can be implemented in a game", Ubisoft's Paris team is leading the charge, using Nvidia's Audio2Face application along with Inworld's large language model to test the limits of player interactions with NPCs. Early demonstrations consist of multiple interactive scenarios in which the player converses with in-game characters via speech-to-text. The NPCs respond to what you actually say, rather than reciting from a limited pool of pre-written lines.
Ubisoft's NEO NPCs are apparently a step beyond mere chatbots, though. Their personalities and backgrounds are authored by human writers, and within these prototypes, they operate within the bounds of their scenarios. One example is a heist situation, in which the player must decide how to break into an enemy base. An NPC named Iron asks what you think the best approach is, and responds accordingly. However, there are only so many ways to pull off the heist, meaning you ultimately need to pick from a limited set of options, and Iron will push back against alternative suggestions, as highlighted in Eurogamer's write-up. While this sounds limited, the idea seems to be that NEO NPCs work within the direction of a game's design, complementing scenarios with more immersive conversations rather than simply spewing generated answers with no regard for context.
The team behind this prototype says these NEO NPCs aren't going to deliver robotic answers, but will instead act like the characters they represent. In other words, not only will they provide more life-like responses to players, but they'll do so in keeping with the personalities their writers have given them.