Call of Duty is now using AI to help detect "hate speech, discriminatory language, harassment, and more" in voice chat and ban jerks from the game.
The AI system, called ToxMod, is "focused on detecting harm within voice chat versus specific keywords," according to an official FAQ. Detection happens in real time, though the devs say ToxMod "only submits reports about toxic behavior, categorized by its type of behavior and a rated level of severity based on an evolving model. Activision determines how it will enforce voice chat moderation violations."
With this new system, the devs confirm that "voice chat is monitored and recorded for the express purpose of moderation," and they've got a pretty simple solution for anybody who doesn't want an AI monitoring them for harassment: "Players that do not wish to have their voice moderated can disable in-game voice chat in the settings menu."
ToxMod is live in North America for Modern Warfare 2 and Warzone as of today, August 30, and will roll out globally (except for Asia) with the launch of Modern Warfare 3 on November 10. According to a press release, English is the only language supported right now, but other languages will follow "at a later date."
The AI system was created by a company called Modulate. Until now, ToxMod had seemingly been deployed primarily in social VR games like Rec Room and Among Us VR. While it's possible other games have been using the tech without a formal announcement, Call of Duty is the first AAA title to be listed on ToxMod's official site.
The Modern Warfare 3 open beta is set to kick off in October.