Activision has announced new measures to combat toxicity within Call of Duty, confirming it'll be introducing what it calls "global real-time voice chat moderation" alongside the launch of Modern Warfare 3 on 10th November, with a US beta trial of the feature starting today.
Call of Duty's new voice chat moderation system will employ AI-powered technology from Modulate to identify and enforce against toxic speech in real-time, with flagged language including hate speech, discrimination, and harassment.
An initial beta for Call of Duty's new voice moderation system is being rolled out across Warzone and Modern Warfare 2 starting today, 30th August, in North America, and a global release will coincide with Modern Warfare 3's arrival in November. Activision notes the tools will only support English at first, with additional languages coming "at a later date".
In a Q&A accompanying today's announcement, Activision explains the AI-powered system will only be responsible for identifying and reporting perceived offences for further review, attaching a behaviour category and rated level of severity to each submission, and that the publisher itself will determine how each violation is enforced based on its Security and Enforcement Policy.
It adds that "trash-talk" will be acceptable as long as it doesn't fall within the definition of harmful language outlined in its Code of Conduct, and notes that the only way to opt out of the new moderation system is to disable in-game voice chat.
Activision says Call of Duty's existing anti-toxicity moderation policies have so far resulted in voice and/or text chat restrictions on over 1m accounts since the launch of Modern Warfare 2, and that 20 percent of players did not reoffend.