Call of Duty is joining the growing number of online games combating toxicity by listening to in-game voice chat, and it's using AI to automate the process. Activision announced a partnership with AI outfit Modulate to integrate its proprietary voice moderation tool—ToxMod—into Modern Warfare 2, Warzone 2, and the upcoming Modern Warfare 3.
Activision says ToxMod, which begins beta testing on North American servers today, is able to "identify in real-time and enforce against toxic speech—including hate speech, discriminatory language, harassment and more."
Modulate describes ToxMod as "the only proactive voice chat moderation solution purpose-built for games." While the official website lists a few games ToxMod is already being used in (mostly small VR games like Rec Room), Call of Duty's hundreds of thousands of daily players will likely represent the largest deployment of the tool to date.
Call of Duty's ToxMod AI will not have free rein to issue player bans. A voice chat moderation Q&A published today specifies that the AI's only job is to observe and report, not punish.
"Call of Duty's Voice Chat Moderation system only submits reports about toxic behavior, categorized by its type of behavior and a rated level of severity based on an evolving model," the answer reads. "Activision determines how it will enforce voice chat moderation violations."
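Neither company has published what those reports actually look like, but the Q&A's wording suggests a fairly simple shape. The sketch below is purely illustrative: the field names, behavior categories, and severity scale are all assumptions, not a documented schema.

```python
# Purely illustrative: neither Activision nor Modulate has published
# ToxMod's report format. This only mirrors the Q&A's description of a
# report "categorized by its type of behavior and a rated level of severity."
from dataclasses import dataclass
from enum import Enum

class BehaviorType(Enum):
    # Hypothetical categories, drawn from the behaviors Activision names
    HATE_SPEECH = "hate_speech"
    DISCRIMINATORY_LANGUAGE = "discriminatory_language"
    HARASSMENT = "harassment"

@dataclass
class VoiceChatReport:
    match_id: str            # hypothetical field names throughout
    reported_player: str
    behavior: BehaviorType
    severity: float          # assumed scale, e.g. 0.0 (mild) to 1.0 (severe)
```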
So while voice chat complaints against you will, in theory, be judged by a human before any action is taken, ToxMod looks at more than just keywords when flagging potential offenses. Modulate says its tool is unique for its ability to analyze tone and intent in speech to determine what is and isn't toxic. If you're naturally curious how that's achieved, you won't find a detailed explanation.
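To see why tone would matter, compare a naive keyword filter with a tone-weighted one. This is a generic sketch of the concept, not Modulate's method; the tone_score function and the 0.7 threshold are placeholders for whatever acoustic model and tuning a real system would use.

```python
# Generic illustration of keyword-only vs. tone-aware flagging; this is
# not Modulate's method. tone_score and the threshold are placeholders.
WATCHLIST = {"slur_placeholder", "threat_placeholder"}  # stand-in keywords

def tone_score(audio_clip: bytes) -> float:
    """Hypothetical model rating hostility of delivery from 0.0 to 1.0."""
    raise NotImplementedError  # a real system would run an audio classifier

def keyword_only_flag(transcript: str) -> bool:
    # Naive filter: any watchlisted word triggers a flag, with no sense
    # of whether it was banter, quoting, or genuine abuse.
    return any(word in WATCHLIST for word in transcript.lower().split())

def tone_aware_flag(transcript: str, audio_clip: bytes) -> bool:
    # A keyword hit alone isn't enough: hostile delivery pushes the
    # flag over the line, friendly delivery keeps it under.
    has_keyword = any(word in WATCHLIST for word in transcript.lower().split())
    return has_keyword and tone_score(audio_clip) > 0.7
```

Whether ToxMod's actual pipeline resembles this at all is precisely the detail Modulate hasn't made public.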