Call of Duty: Modern Warfare 3 launches with in-game voice chat moderation powered by AI.
Activision is using ToxMod, an AI-powered voice chat moderation tool from Modulate, to combat toxic behaviour in real time.
The tech uses AI to identify toxic speech in real time, including hate speech, discriminatory language, and harassment, so that it can be enforced against. Call of Duty already uses text-based filtering across 14 languages for in-game text (chat and usernames), alongside an in-game player reporting system.
In an FAQ, Activision moved to reassure players, insisting voice chat is monitored and recorded "for the express purpose of moderation", and that the system "is focused on detecting harm within voice chat versus specific keywords".
The new system moderates based on the existing Call of Duty Code of Conduct. "Voice chat that includes bullying or harassment will not be tolerated," Activision warned. The Code of Conduct does, however, allow for "trash-talk" and "friendly banter". "Hate speech, discrimination, sexism, and other types of harmful language, as outlined in the Code of Conduct, will not be tolerated," Activision added.
Activision also stressed that the AI does not itself enforce against the Code of Conduct violations it detects. "Call of Duty's Voice Chat Moderation system only submits reports about toxic behaviour, categorised by its type of behaviour and a rated level of severity based on an evolving model," Activision explained. "Activision determines how it will enforce voice chat moderation violations."
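Neither Activision nor Modulate has published ToxMod's actual report format, but the FAQ's description of the flow, an AI that files a categorised, severity-rated report while humans decide enforcement, can be pictured with a minimal, purely hypothetical sketch like the one below. Every name here (ModerationReport, BehaviourType, submit_report, the 0.0-1.0 severity scale) is an illustrative assumption, not the real system.

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical categories drawn from the behaviours named in the article;
# ToxMod's real taxonomy and schema are not public.
class BehaviourType(Enum):
    HATE_SPEECH = "hate_speech"
    DISCRIMINATION = "discrimination"
    HARASSMENT = "harassment"

@dataclass
class ModerationReport:
    player_id: str
    behaviour: BehaviourType
    severity: float           # assumed 0.0-1.0 scale, "rated by an evolving model"
    transcript_excerpt: str

def submit_report(report: ModerationReport) -> None:
    """The AI side stops here: it reports toxic behaviour, it does not punish."""
    # In the flow the FAQ describes, this report would be queued for
    # Activision's own enforcement process, which makes the final call.
    print(f"Flagged {report.behaviour.value} "
          f"(severity {report.severity:.2f}) for human review")

# Example: detection is flagged in real time, enforcement happens later.
submit_report(ModerationReport(
    player_id="player-123",
    behaviour=BehaviourType.HARASSMENT,
    severity=0.8,
    transcript_excerpt="[redacted]",
))
```

The point of the split is visible in the sketch: the detection side only produces data, keeping punishment decisions with Activision rather than the model.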
Despite this new AI tech, enforcement of voice chat moderation is not instantaneous, Activision clarified. Detection does happen in real time, with the system categorising and flagging toxic language against the Call of Duty Code of Conduct as it is detected; enforcement follows only after Activision reviews the submitted report.