Toxicity is a serious issue that players expect studios of online games to address. New legal regulations are also demanding that studios do more to protect their players or face hefty fines.
While there is a clear moral imperative, and a growing legal one, to protect players from toxicity on online platforms, addressing it also makes good business sense: it can increase an online game's revenue.
Modulate CEO and co-founder Mike Pappas told GamesIndustry.biz that the company's AI-assisted voice moderation tool ToxMod isn't just an efficient way to combat toxicity: "effective content moderation helps to foster a positive and safe gaming environment, which directly improves player experience - and player retention."
Positive and safe environments lead to increased spending
When a live service game depends on a user base that spends money on the platform, it's more important than ever to ensure you're not losing customers to churn, which toxicity can cause when left unchecked. The same is true in the real world: customers are unlikely to return to an establishment that feels unsafe and unwelcoming, and its reputation may further put off potential new customers.
“In the EU, the Digital Services Act can levy fines up to 6% of a company’s worldwide annual turnover for failing to implement user safety policies”
“If I have a bad experience, I probably churn unless the platform is proactive in demonstrating their commitment to fixing things,” Pappas says. “Not only are players who experience or witness toxicity more likely to churn, but even those who stick around may become disillusioned and stop submitting user reports, which further exacerbates the toxicity problem.”
But while a game studio might not see the necessity of addressing toxicity if its title is popular and compelling enough that players stick around in spite of it, a survey from Take This shows that 61% of players choose to spend less money in games due to toxicity.