Interested in learning what's next for the gaming industry? Join gaming executives to discuss emerging parts of the industry this October at GamesBeat Summit Next. Register today.
Modulate has raised $30 million to build out its AI product, ToxMod, which scans voice chat using machine learning to find toxic players in online games.
ToxMod uses artificial intelligence to flag problems that human moderators should pay attention to as players chat with each other in online games. It’s a problem that will only get worse with the metaverse — the universe of interconnected virtual worlds depicted in novels such as Snow Crash and Ready Player One. Modulate raised the round on the strength of large customers such as Rec Room and PokerStars VR, which rely on ToxMod to help their community managers surface the biggest toxicity problems.
“This is a problem that everyone in the industry has desperately needed to solve,” said Mike Pappas, CEO of Modulate, in an interview with GamesBeat. “This is such a large-scale market need, and we were waiting to prove that we’ve actually built the product to satisfy this.”
Lakestar led the round with participation from existing investors Everblue Management, Hyperplane Ventures, and others. In addition, Mika Salmi, managing partner of Lakestar, will join Modulate’s board.
Modulate’s ToxMod is a proactive voice moderation system designed to capture not just overt toxicity (hate speech, adult language) but also more insidious harms like child grooming, violent radicalization, and self-harm. The system’s AI has been trained on more than 10 million hours of audio.
Read more on venturebeat.com