Last year, after a game developer claimed that Steam wouldn't accept any game that used AI, Valve clarified that it was "working through how to integrate it into our already-existing review policies" and that "our review process is a reflection of current copyright law and policies, not an added layer of our opinion. As these laws and policies evolve over time, so will our process."
Another step in that process has been taken, as Valve has released a new statement addressing AI content on Steam. "Today," it begins, "after spending the last few months learning more about this space and talking with game developers, we are making changes to how we handle games that use AI technology. This will enable us to release the vast majority of games that use it."
There are two changes coming to Steam that will make this possible. The first is an update to the Content Survey, the form developers fill out when submitting a game to Steam. The survey will now have a section where developers have to disclose whether they've used AI during development, and whether that content is pre-generated (used to create assets before release) or live-generated (used to create content while the game is being played).
If that content is pre-generated, developers will have to "promise Valve that your game will not include illegal or infringing content, and that your game will be consistent with your marketing materials." If it's live-generated, "you'll need to tell us what kind of guardrails you're putting on your AI to ensure it's not generating illegal content."
The second change addresses the possibility of a live-generated AI tool creating something illegal while a game is running: "we're releasing a new system on Steam that allows players to report illegal content inside games that contain Live-Generated AI content."