Presented by Modulate
This article is part of GamesBeat’s special issue, Gaming communities: Making connections and fighting toxicity.
Over a dozen new and pending internet trust and safety regulations, from the United States and the EU to Australia, Ireland, the U.K. and Singapore, are slated to seriously impact game developers in the near future. The regulations target hate speech, harassment and misinformation driven by major world events, including COVID-related misinformation, potential election interference and the rise of white supremacist extremism. On top of that, privacy laws are being revisited, such as California's Age-Appropriate Design Code Act, modeled on the U.K.'s Children's Code.
And as the EU's Digital Services Act (DSA) and other regulations come into force in 2024, experts expect enforcement to only become more common. Unfortunately, "No one reported it! We didn't know there was illegal content!" won't cut it anymore. Today, regulators and consumers are looking for evidence that studios are taking the problem seriously, with a focus on harm reduction. In other words, game studios must now proactively minimize harm on their platforms, because they can be held liable for that harm even if users never report it.
Compliance has eight major components, which may sound daunting at the outset. These include writing a clear code of conduct, updating terms of service and producing regular transparency reports, with the help of internal teams who can work with regulators as needed.
On the platform, developers need to find ways to minimize harmful content at scale, especially terrorist content, child sexual abuse material (CSAM) and grooming, which means building out new moderation and monitoring tools. Developers must also provide a user report portal and an appeals portal.