In September, Meta launched its new AI smart glasses in collaboration with Ray-Ban, featuring a built-in voice assistant and cameras for capturing pictures and videos. During the launch, Meta CEO Mark Zuckerberg promised a new AI feature that would enable the assistant to describe the wearer's surroundings, and the company has now finally rolled out the update. However, as of now, only a small number of users in the US will get access to it as part of a free trial. Here is what to know about the new AI features of the Meta Ray-Ban smart glasses.
According to a Meta blog post, the AI assistant integrated into the Meta Ray-Ban glasses has been upgraded: users can not only talk to the assistant but can also ask it to describe what they are seeing in front of them. Meta says, “You can ask Meta AI for help writing a caption for a photo taken during a hike or you can ask Meta AI to describe an object you're holding.”
Meta also notes that the feature is being tested only to gather feedback on multimodal AI capabilities. Therefore, it will be available to just a limited number of users for testing and sharing feedback. The company has promised to make the required changes and then launch the new AI upgrade globally for all Meta smart glasses users.
Meta laid out some examples of what users can ask the glasses, such as questions about a building, restaurants, objects, and more.
Additionally, Zuckerberg shared an Instagram video post to show his followers how the new AI features of the Meta smart glasses work. He took a shirt out of his closet and asked the AI glasses, “Hey Meta, look and tell…”
Read more on tech.hindustantimes.com