This article is part of GamesBeat’s special issue, Gaming communities: Making connections and fighting toxicity.
We’re probably never going to stamp out toxicity in online games. But it’s always interesting to see how we can fight the good fight.
As an example, game developers often play a cat-and-mouse game against cheaters in video games such as Valorant and Call of Duty. While toxicity encompasses far more than cheating, the fight against cheaters offers a window into that broader struggle. And cheating itself has sparked plenty of fights and fueled plenty of toxicity.
So the developers would scoop up the known cheaters and put them in a battle space nicknamed “cheater island.” Rather than preying on fair-minded players, the cheaters fought each other, often without realizing their cheating had become pointless.
Yet the cheaters escalated their technology as well, selling cheats that were hard to detect. They also simply created lots of accounts. Once the “banhammer” dropped on one account, the cheaters would shift to the next one. So the developers at Riot Games and Activision went a step further, building anti-cheat systems that could access the player’s operating system and identify whether the machine had been used for cheating. If so, they would stop that machine from being used to create a new account. This was all part of the arms race between game developers and cheaters.
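To make the idea concrete, here is a minimal, hypothetical sketch in Python of how a hardware-level ban might work. It does not reflect the actual code behind Riot’s Vanguard or Activision’s Ricochet, which is not public; the fingerprint inputs, ban list, and function names are assumptions for illustration only.

```python
import hashlib

# Hypothetical illustration only: real anti-cheat systems run at a much lower
# level of the operating system and use far more sophisticated signals.

def machine_fingerprint(motherboard_serial: str, disk_serial: str, mac_address: str) -> str:
    """Combine hardware identifiers into a single stable fingerprint."""
    raw = f"{motherboard_serial}|{disk_serial}|{mac_address}"
    return hashlib.sha256(raw.encode("utf-8")).hexdigest()

# Fingerprints of machines previously caught cheating (assumed data store).
banned_machines: set[str] = set()

def ban_machine(fingerprint: str) -> None:
    """Record a machine so it cannot be used to create new accounts."""
    banned_machines.add(fingerprint)

def can_create_account(fingerprint: str) -> bool:
    """Block account creation from any machine already flagged for cheating."""
    return fingerprint not in banned_machines

# Example: a cheater's machine gets flagged, then tries to register a fresh account.
fp = machine_fingerprint("MB-12345", "DISK-67890", "00:1A:2B:3C:4D:5E")
ban_machine(fp)
print(can_create_account(fp))  # False: the new account is blocked at the hardware level
```

The point of tying the ban to the machine rather than the account is that creating a new account becomes useless; the cheater has to change or spoof the hardware itself, which raises the cost of cheating considerably.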
It seemed that such a system could work really well if the tech were shared across games and across companies. But there are privacy laws that could stop that from happening, introducing another complexity in the war against toxic behavior.
Read more on venturebeat.com