Activision is rolling out in-game voice chat moderation in Call of Duty.
In a blog post on the franchise's website, the company wrote that the feature uses Modulate's ToxMod, an artificial intelligence system that recognises toxic speech in real time. That includes hate speech, discriminatory language and harassment, Activision says.
The tech entered beta testing yesterday in Call of Duty: Modern Warfare 2 and Warzone, ahead of a full rollout in the upcoming Call of Duty: Modern Warfare 3.
"This new development will bolster the ongoing moderation systems led by the Call of Duty anti-toxicity team, which includes text-based filtering across 14 languages for in-game text (chat and usernames) as well as a robust in-game player reporting system," the Call of Duty team wrote.
Activision also said that since Modern Warfare 2's launch last October, it has restricted voice and/or text chat for over one million accounts that violated Call of Duty's code of conduct.