Riot Games has shared an update on its ongoing efforts to combat voice and chat toxicity in its free-to-play shooter Valorant, pledging that harsher, more immediate punishments for abusers — and its previously announced voice recording moderation system — are on the way.
Riot initially outlined the areas it would be focusing on to combat unwanted player behaviour in Valorant — including repeated AFK offences and toxic comms use — in a blog post last year. The developer has now offered an update on its progress, highlighting some of the new steps it will be implementing in a bid to create a more pleasant player experience in the near future.
Presently, Riot relies on a combination of player reports and automatic text detection to curb unwanted player behaviour, which it defines as insults, threats, harassment, or offensive language. These moderation methods are said to have resulted in 400,000 voice and text chat mutes, plus 40,000 game bans (implemented for "numerous, repeated instances of toxic communication" and ranging from a few days to permanent) this January alone.
Despite these efforts, Riot admits "the frequency with which players encounter harassment in our game hasn't meaningfully gone down". As such, it calls the work it's done so far "at best, foundational" and accepts "there's a ton more to build on top of it in 2022 and beyond."
To that end, the developer is pledging to make a number of changes to its existing moderation methods. For starters, it's exploring — as part of a Regional Test Pilot Program limited to Turkey at present — the creation of Player Support agents strictly dedicated to overseeing incoming reports.