As online gaming platform Roblox confronts a lawsuit alleging it enabled a California girl's exploitation, its chief scientist said finding dangerous content in the company's virtual world is nothing like spotting it in video.
"It's such a challenge to moderate 3D," said Morgan McGuire in an interview at the Reuters Momentum conference in Austin on Tuesday. He had no comment on the recent lawsuit but said Roblox was built with safety and civility at the forefront.
San Mateo, Calif.-based Roblox is deploying bots to patrol user-generated games and press buttons to detect any dangerous content that players have disguised.
"This is more like shutting down speakeasies," he said, referring to US prohibition-era bars hiding from law enforcement.
The BBC reported this year that users had created explicit private spaces, known as "condos," where people's digital avatars could engage in virtual sex.
McGuire said the company is able to remove inappropriate content within minutes thanks to artificial intelligence software and human moderators. What escapes scrutiny is not "horrifically bad" but more likely a terms-of-service violation, he said.
Still, a lawsuit filed last week alleges that adult men on Roblox connected with a girl, who was born in 2009, and nudged her to sign up for Discord, Snap Inc's Snapchat and Meta Platforms Inc's Instagram to talk to them. The suit alleges they pushed her to drink, abuse prescription drugs and send explicit images, later precipitating suicide attempts.
McGuire said Roblox bars users from posting phone numbers or off-platform links. But as momentum grows to connect disparate worlds into an alternate digital reality called the metaverse, Roblox is engaging with a consortium of companies about how they could work together.