A few hours ago, Activision announced the next step in its fight against online toxicity in the Call of Duty franchise: the North American rollout of an AI-powered, real-time voice chat moderation tool.
The new system is based on ToxMod, a technology made by Modulate. Here's how it works, according to the description available on the official website:
First, ToxMod triages voice chat data to determine which conversations warrant investigation and analysis.
Second, ToxMod analyzes the tone, context, and perceived intention of those filtered conversations using its advanced machine learning processes.
Third, ToxMod escalates the voice chats deemed most toxic and empowers moderators to efficiently take action to mitigate bad behavior and build healthier communities.
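The three-stage flow described above resembles a classic triage-analyze-escalate pipeline. As a rough illustration only (all names, scoring logic, and thresholds below are hypothetical; ToxMod's actual implementation is proprietary and uses machine learning rather than keyword matching), it could be sketched like this:

```python
from dataclasses import dataclass

@dataclass
class VoiceClip:
    """A transcribed snippet of voice chat (hypothetical data model)."""
    speaker_id: str
    transcript: str
    toxicity_score: float = 0.0  # filled in by the analysis stage

# Stage 1: triage — cheaply filter down to clips worth deeper analysis.
def triage(clips, keywords=("slur", "threat")):
    return [c for c in clips
            if any(k in c.transcript.lower() for k in keywords)]

# Stage 2: analyze — assign a toxicity score. A stand-in heuristic here;
# the real system weighs tone, context, and perceived intent via ML.
def analyze(clips):
    for c in clips:
        hits = c.transcript.lower().count("threat")
        c.toxicity_score = min(1.0, 0.5 + 0.25 * hits)
    return clips

# Stage 3: escalate — surface only the worst clips, ranked for human
# moderators, rather than auto-punishing players.
def escalate(clips, threshold=0.7):
    return sorted((c for c in clips if c.toxicity_score >= threshold),
                  key=lambda c: c.toxicity_score, reverse=True)
```

Note that the final stage hands results to human moderators instead of acting automatically, which matches the stated goal of empowering moderators rather than replacing them.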
Activision has already rolled out a beta release of ToxMod in North America across Call of Duty: Modern Warfare II and Warzone. The global release (except for Asia) will occur with the launch of Call of Duty: Modern Warfare III on November 10th. For now, English is the only supported language, with others coming in the future.
This new anti-toxicity system joins the existing filtering of the games' text chat, available in fourteen languages, and, of course, the in-game player reporting system.
In other Call of Duty news, Season 05 Reloaded is out now, adding three operators (including Tomb Raider's Lara Croft), new weapons, and much more. Read the full patch notes here.
Read more on wccftech.com