I just had a conversation with an in-game NPC that could easily have passed for scripted. Except it wasn't, at all. I asked a question, the NPC answered, and it was all thanks to Nvidia's ACE technology and Convai.
You might have caught sight of Nvidia ACE in action during the company's Special Address stream. It's essentially a technology that allows in-game NPCs to react and respond to players in real time, with voiced dialogue and animations. Nvidia has been showing off the same tech demo, called Kairos, which takes place inside a ramen restaurant in a cyberpunk world, since ACE was announced back at Computex. At CES 2024, I got to try it out for myself.
In the tech demo—which is built in Unreal Engine 5 and uses a platform from AI startup Convai—you play as Kai, and you're able to speak with Jin and Nova, two NPCs in the demo's cyberpunk world.
First off, I gave a prompt that was entered into the demo's underlying AI system, which directs the conversation between the two NPCs before I even interact with them. I picked the first thing that came into my head, which for whatever reason was skateboarding. That meant that as I walked up to the two characters, I could hear them discussing the finer points of skateboarding injuries.
That's only a small part of the AI-driven system, though. Mostly it's about responding to you, as another character in the world, which means using a microphone to ask questions and speak with the NPCs directly.
Below is a video of me speaking with Jin and ordering some ramen. It's pretty weird to hold a conversation with an NPC like this, yet it's also pretty fun. There's a slight delay before the NPC responds, which comes across as an awkward pause, but honestly the delay was minimal and the general accuracy of the
Read more on pcgamer.com