Even with increasingly effective moderation tools, the sheer scope of player content and communication is expanding every day, and online discourse has never been more heated. In this GamesBeat Next panel, Hank Howie, game industry evangelist at Modulate, was joined by Alexis Miller, director of product management at Schell Games, and Tomer Poran, VP of solution strategy at ActiveFence, to talk about best practices for moderating game communities of all sizes and demographics, from codes of conduct to strategy, technology and more.
It’s an especially critical conversation as privacy, safety and trust regulations become a larger part of the discussion, Poran said, and as AI grows more powerful as a tool not only for detecting but also for creating harmful or toxic content. AI has become crucial in the evolution of content moderation strategies in gaming spaces, which have lagged behind other online arenas.
“While child safety and user safety have been incredibly important pillars in gaming, even before social media, [gaming] is a phase behind social media in its approach to content moderation,” Poran said. “One thing we’re seeing in gaming, that we saw maybe several years ago in social media, is this move toward proactivity. We’re seeing more sophisticated, proactive content moderation tools like ActiveFence and Modulate and many other vendors. We’re seeing more investment from companies in these technologies.”
Until just a few years ago, it was nearly impossible to moderate voice content, Howie added, and even once the technology existed, it was too expensive to implement. But when Modulate made it affordable, developers suddenly had access to everything said in their games.
“Every single company we’ve ever
Read more on venturebeat.com