Call of Duty's anti-toxicity voice chat moderation system has flagged more than two million accounts for investigation.
Last year, Activision announced that it would be adding a new real-time voice moderation tool to its more recent Call of Duty games to "enforce against toxic speech" by detecting "hate speech, discriminatory language, harassment, and more" from players.
A beta version of the AI-powered moderation system was added to Modern Warfare 2 and Warzone in North America in August 2023, in English only, before it was later rolled out to Call of Duty: Modern Warfare 3, expanded globally (excluding Asia), and given Spanish and Portuguese language support.
Now, according to a recent blog post from the company, the anti-toxicity system has detected more than two million accounts that have seen in-game enforcement for "disruptive voice chat, based on the Call of Duty Code of Conduct."
However, Activision says it found an "unfortunate trend": only one in five players report toxic behavior and speech. Even in cases that go unreported, though, its voice moderation system can still take action against players.
"Active reporting is still critical so that players can raise any negative situation they encounter," the company said.
"To encourage more reporting, we've rolled out messages that thank players for reporting, and in the future, we're looking to provide additional feedback to players when we act on their reports."
Activision explained that, based on month-over-month data, Call of Duty has seen an 8% reduction in repeat offenders since the voice chat moderation system was added, and a 50% reduction in players "exposed to severe instances of disruptive voice chat" since Modern Warfare III was released.
Players who violate the rules of conduct can expect consequences such as being globally muted from voice and text chat and/or having other social features restricted.
"Our tools will continue to