At this week’s Game Developers Conference, Ubisoft offered a possible glimpse of an AI-filled future for gamers. The company demoed a prototype at the show that used Nvidia’s Ace microservice to produce fully voiced “smart NPCs” that players could interact with by speaking into a microphone. Despite drawing skepticism online (myself included), the demo impressed me once I went hands-on with it. I had a surprisingly cogent conversation with an environmentally conscious NPC about the ethics of eco-terrorism, a completely off-script exchange made possible through AI.
It’s one of the stronger use cases for the tech we’ve seen yet, but it has a surprising shortcoming that even Ubisoft is struggling to solve: linguistic bias.
In the short demo, I took on the role of a space-faring character who gets involved in a resistance group’s fight against a megacorporation. The three-part demo had me chatting with two different characters, both of whom Ubisoft created long backstories for and fed into Nvidia’s Ace tool. I learned about the sci-fi world by chatting with an NPC and asking him about his comrades before creatively planning a perfect heist.
After finishing my demo, I asked two Ubisoft workers involved with the project if there were any shortcomings in the tool that frustrated them. Though they were high on the tech overall, it was clear from some deep sighs that they had a laundry list of kinks to work out before the studio fully adopts it. Their number one gripe is the inherent bias in the English language, a human problem that AI currently inherits by default.
The demonstrators pointed to two specific examples that hadn’t even registered with me during my playthrough. At one point, I asked an NPC to tell me about their least favorite member of the team. After insisting they loved all their crewmates, the NPC threw a character named Iron under the bus for being a prickly guy. There was
Read more on digitaltrends.com