This year's Call of Duty: Modern Warfare 3 will make use of Modulate's ToxMod AI to help moderate voice chat in multiplayer, Activision have revealed. The system won't detect offences in real time, nor will it be able to kick you from games for effing and blinding. It just submits reports of bad behaviour, with Activision staff making the final call on enforcement actions, which might involve reviewing recordings to work out the context of an outburst.
You can turn off voice chat in Call of Duty to avoid being recorded, but at the time of writing there doesn't appear to be a way to opt out of AI voice moderation in the current or upcoming Call of Duty games. The tech enters beta testing today, 30th August, in Modern Warfare 2 and Warzone in North America, with the English-language worldwide rollout to land alongside Modern Warfare 3 on 10th November.
Intriguingly, Modulate boasts that "ToxMod's advanced AI intuitively understands the difference between friendly trash talk and truly harmful toxic behavior". Given that many non-artificial intelligences, aka human beings, struggle with this distinction, I'm interested and a little fearful to see their working. The tech doesn't just listen out for naughty keywords, but "assesses the tone, timbre, emotion, and context of a conversation to determine the type and severity of toxic behavior".
In a release on Modulate's site, Activision's chief technology officer Michael Vance described the introduction of ToxMod to CoD as a "critical step forward to creating and maintaining a fun, fair and welcoming experience for all players." The game already makes use of text-based filtering for in-game chat and usernames, on top of a player-reporting system.
Modulate, founded in 2017, specialise in voice technology for games, with ToxMod their flagship moderation product.