Recently, Nvidia showcased its generative AI-based digital human technologies that can create dynamic, lifelike characters for use in games and other applications. With help from participating studios, Nvidia showed demos that leveraged these AI technologies to create immersive interactions with computer-generated avatars.
"Digital human technologies" is an umbrella term that covers Nvidia's Avatar Cloud Engine (ACE), NeMo, and RTX tech. ACE handles speech recognition and animation tasks, NeMo revolves around natural language processing, and RTX enables developers to render high-fidelity character models. Over the past few years, Nvidia has continued to develop these technologies and has periodically shown them in action at events.
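Conceptually, these pieces form a pipeline: speech comes in, language is understood and answered, and the result is rendered as an animated character. The minimal Python sketch below is only a hypothetical illustration of that flow; the function names (transcribe_speech, generate_reply, animate_avatar) are placeholders and not Nvidia's actual APIs.

```python
# Hypothetical sketch of a digital-human pipeline in the spirit of
# Nvidia's stack (ACE for speech/animation, NeMo for language, RTX for
# rendering). Every name here is a placeholder, not a real Nvidia API.

def transcribe_speech(audio: bytes) -> str:
    # Stand-in for an ACE-style speech-recognition microservice.
    return audio.decode("utf-8")  # pretend the "audio" is already text

def generate_reply(player_text: str) -> str:
    # Stand-in for a NeMo-style language model producing NPC dialogue.
    return f"NPC: I heard you say '{player_text}'."

def animate_avatar(reply_text: str) -> None:
    # Stand-in for ACE facial animation plus RTX high-fidelity rendering.
    print(f"[rendering avatar speaking] {reply_text}")

def handle_player_utterance(audio: bytes) -> None:
    text = transcribe_speech(audio)   # speech -> text
    reply = generate_reply(text)      # text -> NPC dialogue
    animate_avatar(reply)             # dialogue -> animated character

handle_player_utterance(b"Where is the safehouse?")
```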
At GDC 2024, Covert Protocol, an Unreal Engine 5-based demo from Inworld AI and Nvidia, showcased modular NPCs that adapt to the player's actions, ensuring each player gets a unique playthrough. The demo uses Nvidia's microservices to process player voice input and animate NPC dialogue. This dynamic NPC behavior is shaped by a character's personality, emotional state, and the context of the interaction with the player.
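To make that idea concrete, here is a minimal hypothetical sketch of how a game might fold an NPC's personality, emotional state, and conversation history into a prompt for a language model. The NPC class and build_prompt helper are illustrative inventions, not part of Inworld's or Nvidia's actual SDKs.

```python
from dataclasses import dataclass, field

# Hypothetical model of the NPC state described in the demo: personality,
# emotion, and context all shape the next line of dialogue. This mirrors
# the concept only; it is not Inworld or Nvidia code.

@dataclass
class NPC:
    name: str
    personality: str                  # fixed traits, e.g. "suspicious, terse"
    emotion: str = "neutral"          # updated as the scene unfolds
    context: list[str] = field(default_factory=list)  # dialogue history

    def build_prompt(self, player_line: str) -> str:
        # Fold state into a prompt a language model could complete, so two
        # players with different histories get different replies.
        history = "\n".join(self.context[-5:])  # keep the last few exchanges
        return (
            f"You are {self.name}. Personality: {self.personality}. "
            f"Current emotion: {self.emotion}.\n"
            f"Recent conversation:\n{history}\n"
            f"Player says: {player_line}\nReply in character:"
        )

guard = NPC("Dex", personality="suspicious, terse")
guard.emotion = "alarmed"
guard.context.append("Player: I'm just a tourist.")
print(guard.build_prompt("Let me through."))
```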
Beyond game demos, other potential real-world uses of Nvidia's digital human technologies were shown at another recent event, GTC 2024. First, Hippocratic AI previewed a digital healthcare agent designed to handle specific tasks, such as care coordination and post-discharge management, built on Nvidia's ACE, Audio2Face, and Omniverse Streamer Client, among other technologies. UneeQ also took the stage with a demo of its digital avatar for customer service, which combines Nvidia's tech with Synanim ML.
Among the big names, Ubisoft's games may feature generative AI-fueled NPCs in the future. The company shared three tech demos of NEO NPCs, its own take on dynamic characters, built using Inworld AI and Nvidia technology.