Activision is turning to AI to help it moderate voice chat in Call of Duty games.
The Call of Duty anti-toxicity team announced that Call of Duty: Modern Warfare III will be the first game to use real-time AI-powered voice chat moderation technology when it launches in November.
The technology, called ToxMod, comes from Modulate.ai and was purpose-built for use in video games. It uses advanced machine learning to analyze the nuances of each conversation and determine toxicity. Modulate says ToxMod is capable of learning the code of conduct for any given game, and it provides reports to moderation teams so they can decide what action is appropriate to take against toxic players.
In Call of Duty games, ToxMod will be used to "identify in real-time and enforce against toxic speech—including hate speech, discriminatory language, harassment and more."
Before ToxMod is used worldwide in Modern Warfare III, testing will be carried out starting Aug. 30 in North America using Modern Warfare II and Warzone. Initially, only English will be supported, but other languages will be monitored by the system "at a later date." Modulate says ToxMod is fluent in 18 languages, with more being added.
Mike Pappas, CEO at Modulate, said, "We're enormously excited to team with Activision to push forward the cutting edge of trust and safety ... This is a big step forward in supporting a player community the size and scale of Call of Duty, and further reinforces Activision’s ongoing commitment to lead in this effort."
Activision isn't just keen to cut down on toxic behavior; it also wants to stamp out cheating in CoD games. A range of tactics have been used to discourage players from cheating, with the most recent being to show hallucinations to cheaters.
Read more on pcmag.com