Presented by Modulate
New safety and privacy regulations are set to impact game developers across the globe. As part of VentureBeat’s most recent Spotlight webinar on fighting community toxicity, we brought together industry experts from Xbox, Modulate and law firm Osborne Clarke to discuss how to stay ahead of the new regulatory landscape while keeping player safety at the center.
Watch free on demand now.
Over the last year or two, regulators have been paying far more attention to trust and safety: not only child safety and violent radicalization, but also the toxicity that can flourish on platforms that aren’t sufficiently monitored, and how that affects players’ mental health.
More than a dozen new and pending internet trust and safety regulations, from the United States and the EU to Australia, Ireland, the U.K. and Singapore, are set to impact the gaming industry, targeting hate speech, harassment, misinformation and harm to children. Australia’s eSafety Commissioner has been working to produce guidance for a variety of industries on engaging more effectively with trust and safety, with a particular emphasis on gaming and social platforms, said Mike Pappas, CEO and co-founder of Modulate.
And it was one of the few spaces where the industry’s own draft code, produced for the eSafety Commissioner’s review, was rejected, because the commissioner felt that it was insufficiently protective in a variety of ways, “especially around looking for platforms to be more proactive in anticipating the kinds of harms that can happen online, and doing something more aggressive to mitigate those harms,” he added. “That general theme of being proactive and thoughtful up front, and safety by design…”
Read more on venturebeat.com