Human language is complex and can be difficult for people to understand, let alone robots. But Google Research and Everyday Robots want to change that.
Parent company Alphabet is bringing together robotics and artificial intelligence to create a bot that can understand natural language commands.
"There's a genius to human beings—from understanding idioms to manipulating our physical environments—where it seems like we just 'get it,'" says Vincent Vanhoucke, head of robotics at Google Research. "The same can't be said for robots."
Modern robots are mostly confined to industrial environments, where they can be coded for specific tasks like picking up boxes or acting as mobile storage containers. But they don't have the flexibility to adapt to unpredictable real-world events.
Enter PaLM-SayCan, an algorithm that combines the language understanding of large language models with the practical capabilities of a helper bot. Together, Google Research and Everyday Robots are using PaLM (Pathways Language Model) to facilitate more natural interaction by teaching the droid to process open-ended prompts and respond sensibly.
"It not only makes it possible for people to communicate with helper robots via text or speech," Vanhoucke said, "but also improves the robot's overall performance and ability to execute more complex and abstract tasks by tapping into the world knowledge encoded in the language model."
Like a student showing their work on a math test, researchers feed the model examples of the human thought process to help it reason through prompts. Ask PaLM-SayCan, for example, to "bring me a snack and something to wash it down with," and the machine uses a chain of thought to recognize a bag of chips and a
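At a high level, the published SayCan approach scores each skill the robot knows by combining how useful the language model thinks it is with how feasible the robot thinks it is. The sketch below is illustrative only: the skill names, probabilities, and affordance values are made-up stand-ins, not outputs of the real PaLM model or the robot's learned value functions.

```python
# Illustrative sketch of SayCan-style skill selection.
# All numbers below are invented for demonstration; the real system
# queries PaLM for usefulness and a learned value function for feasibility.

def saycan_pick(instruction, skills, llm_score, affordance):
    """Pick the skill maximizing usefulness (language model) x feasibility (robot)."""
    scored = {s: llm_score(instruction, s) * affordance(s) for s in skills}
    return max(scored, key=scored.get)

# Toy stand-ins for the language model and the robot's affordance model.
llm = {"find a bag of chips": 0.6, "find a sponge": 0.1, "go to the counter": 0.3}
aff = {"find a bag of chips": 0.9, "find a sponge": 0.9, "go to the counter": 0.8}

best = saycan_pick(
    "bring me a snack",
    list(llm),
    lambda instr, s: llm[s],
    lambda s: aff[s],
)
print(best)  # → find a bag of chips
```

The key idea is the multiplication: a skill the language model loves but the robot cannot currently perform (low affordance) loses to a slightly less relevant skill the robot can actually execute.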
Read more on pcmag.com