Minecraft's newest snapshot finally adds reporting, a long-awaited and highly requested feature that lets you flag players for abusive comments in chat. Historically, this has been left to individual server moderators and admins with the help of community plugins and add-ons, but it's now an official part of the game.
Snapshot 22W24A is available right now for Java players. It's mostly a bug-fix update with performance tweaks, but it also adds reporting, letting you address players who send "abusive messages." To use it, you navigate to the social menu, select the message in question, pick a report category, and submit the report against the player who sent it. The report is then passed on to a moderation team, who will review it and presumably act on it.
However, there's no moderation in place for the snapshot. As it stands, the feature simply lets you file reports, without any further action, to test that reporting works. It will only become fully operational as a manned system once it's implemented in the final build of the game.
The general idea is that when somebody is toxic in chat, be it saying hateful things, spouting bigotry, or shouting slurs, you can report them to Mojang and hopefully have it dealt with more officially. This could weed out toxic players across Minecraft rather than just on individual servers, but it remains to be seen how it will actually work once the build properly launches.
"We want everyone to feel safe and welcome in Minecraft, which is why we have community guidelines in place," Mojang wrote. "If you feel unsafe, uncomfortable, or concerned that someone is breaking our community