Activision has announced that it will be adding new real-time voice moderation systems to Call of Duty's online experience in the run-up to Modern Warfare 3.
According to a blog post on the Call of Duty website, Activision aims to deliver "global real-time voice chat moderation, at scale," with the aim of enforcing "against toxic speech — including hate speech."
An "initial beta rollout" of this new chat moderation technology has already begun in North America, where the new features were added on August 30 for Call of Duty: Modern Warfare 2 and Call of Duty: Warzone. This is slated to be followed by a near-global implementation of the new voice moderation systems timed to coincide with the release of Modern Warfare 3 on November 10. The system will be introduced in every global region except Asia.
The new system uses AI-powered chat moderation software, ToxMod, developed by Modulate. According to Activision's blog post, the system is designed to identify toxic behavior, including "hate speech, discriminatory language, harassment, and more." These new features come alongside existing text-based moderation systems, which work across 14 different languages.
Activision's blog post expresses confidence in the moderation system, touting that "Call of Duty's existing anti-toxicity moderation has restricted voice and/or text chat to over 1 million accounts." Activision also noted that, among accounts issued a caution, "20% of players did not re-offend after receiving a first warning."
The implementation of ToxMod follows in the wake of a similar move by Microsoft, which recently began rolling out its own voice chat recording tool, allowing players to send voice chat snippets to Microsoft's moderation team.