Developing a video game is hard, full stop. The original impetus — the seedling of an idea, emotion, or mechanic — sprouts into a consumer-ready product after years of beta tests, bug squashes, and redesigns. Like most art, the final work often barely resembles the first attempt, and the two are linked solely by the creative DNA of the developer's blood, sweat, and tears. Anything can happen during this time: Teams can grow or shrink, fads can form or pass, and technologies can be invented or become obsolete. Development time has ballooned; what once took a year or two has stretched into a laborious multi-year process focused heavily on recreating life-like 4K graphics, innovative gameplay, and Hollywood-level narratives. It's easier than ever to start making a video game nowadays, but finishing one is harder and more laden with expectations than ever.
No developer sets out to fail, to spend years of their life on a game infamous for being a glitched-out, buggy, half-baked mess, its successes long forgotten. But again: Game development is hard. Sometimes when you promise to deliver the moon, you end up landing in rural America instead.
The developers at Arkane Austin are nothing if not ambitious. Their first solo venture, 2017's Prey, continued Arkane's tradition of immersive sims, a genre defined by its willingness to let players tackle any obstacle however they see fit. Enemies in Prey, for example, can be dealt with via turrets, stealth kills, shotgun blasts, the GLOO cannon, or simply running away. This ethos of player choice may seem hands-off, but in reality it is intricate and precise. When everything is possible, every option must be accounted for. Although largely ignored