Unity Technologies has announced a new tool for its developer suite that uses AI to help developers identify toxicity in online games. The new Safe Voice tool is launching in closed beta and is aimed at letting studios isolate and review toxicity reports quickly. Unity says the tool was tested early on by Hi-Rez's Rogue Company.
Safe Voice is said to analyze aspects like tone, loudness, intonation, emotion, pitch, and context to identify toxic interactions. It activates when a player flags a behavior issue, monitors the voice chat, and then delivers a report to human moderators through an overview dashboard. That dashboard lets moderators review individual incidents as well as track trends over time to inform their moderation plans. Unity also says Safe Voice is the first in a larger suite of toxicity solutions it has coming.
"It's one of the number one reasons that people leave a game and stop playing because there's some sort of bad situation around toxicity and other elements of abuse," Mark Whitten, Unity president of Create Solutions, told GameSpot.
Hi-Rez Studios announced a Unity partnership for a new voice chat recording system in February, when it issued the update that began testing the new tool. In the Safe Voice announcement, Rogue Company's lead producer said the tool has been helpful in identifying and mitigating problems before they escalate.
During the early testing phase, Unity focused on making sure the tool was accurately flagging problems and shortening the time that humans needed to be involved. To that end, Whitten said, it was very successful: game developers that would typically get tens of thousands of reports in a given period were able to narrow those reports down quickly and prioritize the ones most likely to require moderator attention.