Activision has begun testing a new AI-powered voice chat moderation system for Call of Duty games.
The publisher has launched a beta test for the moderation technology in North America inside Call of Duty: Modern Warfare 2 and Call of Duty: Warzone.
The system will be rolled out globally to coincide with Modern Warfare 3’s release in November.
“Call of Duty’s new voice chat moderation system utilizes ToxMod, the AI-Powered voice chat moderation technology from Modulate, to identify in real-time and enforce against toxic speech—including hate speech, discriminatory language, harassment and more,” Activision said in a blog post.
“This new development will bolster the ongoing moderation systems led by the Call of Duty anti-toxicity team, which includes text-based filtering across 14 languages for in-game text (chat and usernames) as well as a robust in-game player reporting system.”
Since the release of Modern Warfare 2 last October, Activision claims to have restricted voice and/or text chat to over one million accounts that violated its Call of Duty code of conduct.
It said 20% of players didn’t reoffend after receiving a first warning, while those who did faced further penalties, including voice and text chat bans and temporary account restrictions.
Activision plans to reveal Modern Warfare 3’s multiplayer offering at the returning Call of Duty Next showcase on October 5.
The first of two Modern Warfare 3 multiplayer beta weekends will kick off the following day, available exclusively on PS4 and PS5.
The game will feature modernised versions of all 16 launch maps from 2009’s Modern Warfare 2. A selection of these will be included in the multiplayer beta, along with new Ground War experiences.
The second beta weekend is
Read more on videogameschronicle.com