NVIDIA has unveiled technology called Avatar Cloud Engine (ACE) that would allow gamers to speak naturally to non-playable characters (NPCs) and receive appropriate responses. The company revealed the tech during its generative AI keynote at Computex 2023, showing a demo called Kairos with a playable character speaking to an NPC named Jin in a dystopian-looking ramen shop.
The demo (below in 32:9, the widest widescreen I've ever seen) shows the player carrying on a conversation with Jin. "Hey Jin, how are you?" the person asks. "Unfortunately, not so good," replies Jin. "How come?" "I am worried about the crime around here. It's gotten bad lately. My ramen shop got caught in the crossfire."
Yes, the dialogue is a tad wooden; it seems like ChatGPT might have done a better job. Still, the idea is to show that you could just speak into your headset and an NPC will answer in the proper context, making for a more natural interaction than you'd usually get in such a situation.
NVIDIA made the demo in partnership with Convai to promote ACE, which can run both in the cloud and locally (on NVIDIA hardware, natch). It uses NVIDIA NeMo for building, customizing and deploying large language models, which can be primed with lore and character backstories while using guardrails to protect against inappropriate conversations. It also deploys Riva, NVIDIA's speech-to-text and text-to-speech tool, along with NVIDIA's Omniverse Audio2Face "for instantly creating expressive facial animation of a game character to match any speech track."
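For readers curious how the pieces fit together, here's a minimal conceptual sketch in Python of that kind of NPC dialogue loop. It is not NVIDIA's actual ACE, NeMo, Riva or Audio2Face API (the keynote showed no code); every function and class name is a hypothetical stand-in for the stages NVIDIA described: speech-to-text, a language model primed with character backstory behind guardrails, text-to-speech, and audio-driven facial animation. The canned lines echo the Kairos demo's dialogue.

```python
# Hypothetical sketch of an ACE-style NPC dialogue loop. None of these
# functions are real NVIDIA APIs; they stand in for the stages NVIDIA
# described: Riva (speech-to-text / text-to-speech), NeMo (an LLM primed
# with character lore, behind guardrails) and Audio2Face (facial animation).

from dataclasses import dataclass, field


@dataclass
class NPCPersona:
    """Character lore and backstory used to prime the language model."""
    name: str
    backstory: str
    history: list = field(default_factory=list)


def speech_to_text(mic_audio: bytes) -> str:
    """Placeholder ASR stage (a real build might use a service like Riva)."""
    return "Hey Jin, how are you?"  # canned transcript for the sketch


def generate_reply(persona: NPCPersona, player_line: str) -> str:
    """Placeholder LLM stage: prompt a model with the persona plus guardrails."""
    prompt = (
        f"You are {persona.name}. {persona.backstory} "
        f"Stay in character and decline inappropriate requests.\n"
        f"Player: {player_line}\n{persona.name}:"
    )
    # A real implementation would send `prompt` to a hosted model;
    # here we return a canned line in the spirit of the Kairos demo.
    return "Unfortunately, not so good. I am worried about the crime around here."


def text_to_speech(text: str) -> bytes:
    """Placeholder TTS stage returning synthesized audio."""
    return text.encode("utf-8")  # stand-in for real audio samples


def animate_face(speech_audio: bytes) -> None:
    """Placeholder for driving the character's facial rig from the audio."""
    print(f"[animating {len(speech_audio)} bytes of speech]")


def npc_turn(persona: NPCPersona, mic_audio: bytes) -> None:
    """One conversational turn: hear the player, think, speak, animate."""
    player_line = speech_to_text(mic_audio)
    reply = generate_reply(persona, player_line)
    persona.history.append((player_line, reply))
    print(f"Player: {player_line}\n{persona.name}: {reply}")
    animate_face(text_to_speech(reply))


if __name__ == "__main__":
    jin = NPCPersona(
        name="Jin",
        backstory="You run a ramen shop in a crime-ridden district.",
    )
    npc_turn(jin, mic_audio=b"")
```

The point of the sketch is only the shape of the pipeline: whatever the player says is transcribed, answered in character by a constrained model, voiced, and lip-synced, which is how a response can stay in the proper context no matter how the question is phrased.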