Humans give specific signals when they interact, and Google wants its devices to know them well. Several Google products can read body language and are "aware of their own space." The Nest Hub can pick up gestures: users can snooze Google's smart alarms with just a wave of the hand, the Pixel comes with motion sensing, and even the Nest thermostat can tell when a person is standing right in front of it.
All of these devices simulate "awareness" using an old technology: radar. Google's engineers have managed to update, miniaturize, and digitize radar technology into a system they call Soli. Like any radar, Soli emits electromagnetic waves and interprets how objects in three-dimensional space reflect them back. Using deep learning, it can recognize the things it has been trained to recognize, such as a hand in motion.
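The article stays at a high level, but the pipeline it describes (emit waves, read back the reflections, classify the motion with a learned model) can be sketched in a few lines. The following is a hypothetical illustration, not Google's code: the frame shapes, the gesture names, and the nearest-centroid classifier standing in for Soli's deep network are all assumptions.

```python
# Conceptual sketch of a radar gesture pipeline (NOT the actual Soli code).
# A radar burst is modeled as a stack of "range-Doppler" energy maps; a
# classifier trained on labeled bursts maps them to gesture names.
import numpy as np

GESTURES = ["swipe", "tap", "none"]  # hypothetical gesture vocabulary


def extract_features(burst: np.ndarray) -> np.ndarray:
    """Reduce a burst of shape (n_frames, range_bins, doppler_bins) to a vector.

    Mean energy per Doppler bin over time gives a coarse motion signature;
    a real pipeline would use a deep network instead of this hand-made feature.
    """
    return burst.mean(axis=(0, 1))


class NearestCentroidGestureClassifier:
    """Toy stand-in for the deep-learning model the article mentions."""

    def fit(self, bursts, labels):
        feats = np.array([extract_features(b) for b in bursts])
        labels = np.array(labels)
        self.centroids_ = {g: feats[labels == g].mean(axis=0) for g in set(labels.tolist())}
        return self

    def predict(self, burst) -> str:
        f = extract_features(burst)
        # Pick the gesture whose training centroid is closest in feature space.
        return min(self.centroids_, key=lambda g: np.linalg.norm(f - self.centroids_[g]))


# Usage with synthetic data standing in for real radar returns:
rng = np.random.default_rng(0)
train = [rng.random((8, 16, 32)) + (i % 3) * 0.1 for i in range(30)]
labels = [GESTURES[i % 3] for i in range(30)]
clf = NearestCentroidGestureClassifier().fit(train, labels)
print(clf.predict(rng.random((8, 16, 32)) + 0.2))  # e.g. "none"
```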
The Google Advanced Technology & Projects (ATAP) team is pushing Soli radar technology to its limit. They want devices to identify users' intentions and level of engagement, and to treat gestures as commands: a simple turn toward or glance at a device would trigger it into action. By studying basic human body language, the team has already programmed a wide range of cues and interactions. Devices "should feel like a best friend," Lauren Bedal, a senior interaction designer at ATAP, told Engadget.
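In practice, "gestures as commands" means mapping each detected cue to a device behavior. Here is a minimal sketch of that idea; the cue names and actions are illustrative guesses, not ATAP's actual design.

```python
# Hypothetical cue-to-action mapping; unknown cues fall through to a no-op,
# matching the non-intrusive behavior the team describes.
ACTIONS = {
    "approach": "wake screen",
    "glance": "show glanceable info",
    "turn_away": "dim display",
    "leave": "go to sleep",
}


def on_cue(cue: str) -> str:
    """Dispatch a detected body-language cue to a device behavior."""
    return ACTIONS.get(cue, "ignore")


for cue in ["approach", "glance", "wave", "leave"]:
    print(f"{cue:>9} -> {on_cue(cue)}")
```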
The Google ATAP team says it is programming devices to be non-intrusive. Soli is now aware of nonverbal cues such as a person's proximity and body language, and even biosignals like heartbeat and respiration. From micro-gestures made with the fingers, to full hand movements, to recognizing when a user is just passing by, the team has created a "library of
Read more on screenrant.com