Games have the power to connect people across the world in enjoyment and teamwork, but they can also create a space ripe for toxic speech and harassment. Activision is attempting to minimize the latter, announcing a new collaboration with Modulate, a company that uses AI to detect harmful speech, to bring moderation directly to Call of Duty's voice chat.
Modulate's AI system, ToxMod, attempts to identify threats like hate speech, radicalization and self-harm in real time. It claims to work in three steps: triage, analyze and escalate. ToxMod listens to all voice chats and pinpoints those that warrant a closer look. Flagged audio is stored on Modulate's servers, while all other data is processed directly on the player's device. The company says it then evaluates everything from tone to emotion, analyzing not only "what is being said, but also how it is said and how other players respond to it." Finally, it alerts moderators to the most toxic incidents and leaves it up to them to take action. The company claims it's the "only voice-native moderation solution" currently available, having protected "tens of millions of players."
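To make the three-step flow concrete, here is a minimal sketch of a triage-analyze-escalate pipeline under the assumptions the company describes: cheap on-device screening, server-side analysis of flagged clips only, and humans making the final call. Every name, signal and threshold below is illustrative; this is not Modulate's actual API or code.

```python
# Hypothetical triage -> analyze -> escalate moderation pipeline.
# All functions and thresholds are stand-ins for illustration only.
from dataclasses import dataclass


@dataclass
class VoiceClip:
    player_id: str
    audio: bytes  # raw voice-chat audio, screened on the player's device


def triage_score(clip: VoiceClip) -> float:
    """Step 1 (on-device): a lightweight pass estimates whether the clip
    warrants a closer look. Stubbed with a constant for illustration."""
    return 0.8  # stand-in for a real acoustic/keyword model


def analyze(clip: VoiceClip) -> float:
    """Step 2 (server-side, flagged clips only): weigh what was said, how
    it was said, and how other players responded. Signals are stubbed."""
    transcript, tone, reactions = 0.9, 0.7, 0.6  # stand-in signal scores
    return max(transcript, tone, reactions)


def escalate(clip: VoiceClip, severity: float) -> None:
    """Step 3: surface the worst incidents to human moderators, who
    decide what enforcement action, if any, to take."""
    print(f"flag player {clip.player_id} for review (severity={severity:.2f})")


def moderate(clip: VoiceClip, flag_at: float = 0.7, escalate_at: float = 0.85) -> None:
    if triage_score(clip) < flag_at:
        return  # clip never leaves the device
    severity = analyze(clip)  # only flagged audio is uploaded and stored
    if severity >= escalate_at:
        escalate(clip, severity)


moderate(VoiceClip(player_id="player-123", audio=b""))
```

The key design choice this sketch reflects is the privacy split the company describes: the expensive analysis and any data retention happen only after a clip has been flagged, while everything else stays on the device.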
Integrating ToxMod could help curb toxic behavior more broadly, working alongside Call of Duty's existing text-based filtering and player reporting systems. "Tackling disruptive voice chat particularly has long been an extraordinary challenge across gaming," Activision's chief technology officer Michael Vance said in a statement. "With this collaboration, we are now bringing Modulate's state of the art machine learning technology that can scale in realtime for a global level of enforcement. This is a critical step forward to creating and maintaining a fun, fair and welcoming experience for all players."