Xbox is continuing to clamp down on toxic and abusive players on its systems, revealing that it has seen a huge increase in accounts reprimanded for posting "vulgar" content. Since improving its moderation system, the company says there's been a 450 percent increase in "enforcements" over such content, presumably ranging from taking down the post to suspending players altogether.
This also comes as Xbox expands its definition of vulgar content, which now includes "offensive gestures, sexualized content, and crude humour". Xbox defends the increased moderation, saying such content was "detracting from the core gaming experience" for other users, and notes that most players who posted this material were only given a warning rather than having their accounts banned outright.
This information comes from Xbox's latest transparency report into its moderation practices. Here, the company discusses how it is trying to take a more "proactive" role in enforcing its terms and conditions, using AI to spot material and language that's banned on its services.
"As the needs of players continue to evolve, so do our tools," reads the report. "The safety of our players is a top priority – and to advance safe online experiences, we will continue to invest in innovation, work in close collaboration with industry partners and regulators, and collect feedback from the community."
Eighty percent of content reports were handled through "proactive moderation", meaning each case could have been decided by either an AI system or a human. Xbox doesn't discuss the possibility of false reports, despite much of this work likely being conducted by AI.
Read more on thegamer.com