Online communities, we are told, are besieged by negativity and toxic behaviour. How do community managers stay on top of it all? As in all things, machines can help, but getting the right mix of human and AI moderation, and keeping it all up to date, is key.
A company doing this behind the scenes for gaming brands such as Team BDS and Paradox is Bodyguard.ai. PC Games Insider met the team at Game Connection Europe in Paris in November, shortly after the Nice-based Bodyguard.ai revealed its inaugural Business Online Toxicity Barometer. The study analysed more than 170 million comments posted over a 12-month period across 1,200 brand channels and found that nine million of those interactions were toxic; of that number, some 28% were hateful and 1% were actual threats. Discrimination accounted for over 200,000 comments. Many were spam, scams, fraud or trolling comments – less alarming, but still very problematic.
This tallies with a Unity report from the end of the pandemic, which found that two out of three people who play games online experience harassment of some sort. Unity’s research showed that 92% of players think solutions should be implemented to reduce toxic behaviour in multiplayer games.
Bodyguard.ai has developed a rules-based AI tool that protects individuals, communities and brands from toxic online content. It plugs into social media and other community tools, comment sections and the like, and, using a set of regularly updated rules, reacts quickly to toxic activity. This isn’t just about deleting harmful content – when we followed up our Parisian meeting with a detailed Zoom call, the team were keen to point out that this support is good for the mental health of human community managers, who are still very much part of the moderation mix.
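To make the idea concrete, here is a minimal, hypothetical sketch of what a rules-based moderation filter might look like in Python. It does not reflect Bodyguard.ai’s actual implementation or API; the rule names, patterns and actions are illustrative assumptions, and a real service would maintain far richer, multilingual and regularly updated rules.

```python
import re
from dataclasses import dataclass
from enum import Enum

class Action(Enum):
    ALLOW = "allow"
    FLAG = "flag"      # surface to a human community manager
    REMOVE = "remove"  # hide or delete automatically

@dataclass
class Rule:
    name: str
    pattern: re.Pattern
    action: Action

# Hypothetical rule set for illustration only.
RULES = [
    Rule("threat", re.compile(r"\b(i will hurt you|you're dead)\b", re.I), Action.REMOVE),
    Rule("spam", re.compile(r"\b(free v-?bucks|click my link)\b", re.I), Action.FLAG),
]

def moderate(comment: str) -> Action:
    """Return the most severe action triggered by any matching rule."""
    triggered = [r.action for r in RULES if r.pattern.search(comment)]
    if Action.REMOVE in triggered:
        return Action.REMOVE
    if Action.FLAG in triggered:
        return Action.FLAG
    return Action.ALLOW

if __name__ == "__main__":
    for text in ["gg well played", "click my link for free v-bucks"]:
        print(text, "->", moderate(text).value)
```

The point of the sketch is the division of labour: clear-cut cases are handled automatically, while borderline ones are flagged for a human, which is the mix of human and AI moderation the article describes.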