Niantic and 8th Wall announced today that they’re adding hand tracking to the latter’s AR developer toolkit. According to 8th Wall, developers will now be able to build hand- and wrist-based AR applications, and can pull in real-time APIs from the open web to apply textures or videos unique to each user. The feature is available to 8th Wall developers now.
The new tools include a proprietary hand model with 36 attachment points across the palm, knuckles, wrist and fingers, giving developers fine-tuned control over the position and point of contact of AR items on the hand. The Hand Tracking tool also includes an adaptive hand mesh that conforms to the shape of the user’s real hand.
According to 8th Wall, Hand Tracking lets users interact with, move, and alter objects in AR apps as if they were real, and puppeteer hand-shaped objects that follow their movements. Developers can also attach special effects to users’ hands or let them “try on” real-world objects within the app. Developers can access Hand Tracking by cloning a sample project in the library.
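8th Wall hasn’t published the API details in this announcement, so the sketch below is illustrative only. It assumes a hypothetical hand-update callback that reports named attachment points (such as an “indexProximal” knuckle) as world-space poses, and uses three.js to pin a virtual ring to one of those points each frame, roughly the kind of “try on” use case described above. The event shape, attachment-point names, and `onHandUpdated` hook are assumptions, not 8th Wall’s documented interface.

```typescript
import * as THREE from 'three';

// Hypothetical payload: the announcement describes 36 named attachment points
// across the palm, knuckles, wrist, and fingers. The names and structure here
// are assumptions for illustration, not 8th Wall's actual API.
interface HandUpdateEvent {
  attachmentPoints: Record<
    string,
    { position: THREE.Vector3; rotation: THREE.Quaternion }
  >;
}

const scene = new THREE.Scene();

// A simple "try on" item: a torus standing in for a ring.
const ring = new THREE.Mesh(
  new THREE.TorusGeometry(0.01, 0.003, 16, 32),
  new THREE.MeshStandardMaterial({ color: 0xffd700 }),
);
ring.visible = false;
scene.add(ring);

// Called whenever the (assumed) hand-tracking pipeline reports fresh keypoints.
function onHandUpdated(event: HandUpdateEvent): void {
  const knuckle = event.attachmentPoints['indexProximal']; // assumed point name
  if (!knuckle) {
    ring.visible = false; // hide the item when the hand is lost
    return;
  }
  // Pin the ring to the attachment point so it follows the hand each frame.
  ring.position.copy(knuckle.position);
  ring.quaternion.copy(knuckle.rotation);
  ring.visible = true;
}
```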