Star Trek's holodeck is one of the most alluring sci-fi technologies: you give a few verbal instructions to a computer, and boom, you're on a street in 1940s San Francisco, or wherever you want to be. We may never have holograms you can touch, but the part where a computer can generate any 3D scene it's asked for is being worked on right now by a small studio in London.
At the Game Developers Conference in San Francisco on Wednesday, Anything World CEO Gordon Midwood asked me what I wanted to see. I said I wanted to see a donkey, and a few seconds later, a donkey was walking around on the screen in front of us. Sure, it sort of walked like a horse, and yeah, all it did was mosey around a field, but those are just details. The software delivered on its basic promise: I asked for a donkey and a donkey appeared.
For the next demonstration, Midwood took his hands away from the keyboard. "Let's make an underwater world and add 100 sharks and a dolphin," he said into a microphone. A few seconds later, I was looking at a dolphin who'd shown up to the wrong party: 100 swimming sharks.
Developers who want to use Anything World as a game development or prototyping tool will incorporate it into an engine like Unity, but as Midwood demonstrated, it can also produce scenes, objects, and creatures on the fly. It was the coolest thing I saw on the GDC show floor, and others have already noticed its potential. Roblox is exploring a deal with the company, and Ubisoft is already using the software for prototyping, as well as for a collaborative project called Rabbids Playground.
With so much blockchain stuff haunting GDC, the sight of an older tech buzzword was comforting: Anything World uses machine learning algorithms developed in part
Read more on pcgamer.com