Microsoft has released a Digital Transparency Report in an effort to show the steps being taken to create a safe community for Xbox players. And one of the key highlights from that report is that the Xbox team has taken punitive action against 4.78 million accounts between the period of January 1-June 30, 2022.
Player toxicity has become a larger focus amongst game developers and publishers this year. Developers have taken stronger public stances to condemn toxic members in their respective communities, and some have even gone so far as to implement gameplay measures to discourage disruptive behavior.
In its report, Microsoft defined its actions as "enforcements," which range from removing the offensive content or suspending an account to shutting that account down completely and removing said content.
Of the accounts enforced, the report continued, 4.33 million were flagged for activities like cheating or creating inauthentic bot accounts. By comparison, the numbers for other activities said to "ultimately create an unlevel playing field for our players or detract from their experiences," such as adult sexual content (199,000) or fraud (87,000), were incredibly low.
During that six-month period, cheating and inauthentic accounts accounted for 57 percent of overall enforcements.
For the six-month 2022 period, Microsoft had to suspend 4.5 million accounts.
In terms of reactive measures (read: actions made in response to player reports) during that same period, the Xbox team enforced against 2.53 million players. 46 percent of that number was made up of communications, such as a message or a post left on an activity feed.
The end of Xbox's report shows that nearly 33.1 million players were reported during the first half of 2022.