Select iPhone models running iOS 15 or later can identify plants and animals using Apple's Visual Lookup feature, which lets users learn more about famous landmarks, art, plants, flowers, and pets. The feature was introduced alongside Live Text as part of an iOS 15 release focused on expanding what users can do within photos using on-device intelligence. Because of the computing power required, only specific iPhone models can take advantage of the feature. By keeping as much processing on-device as possible, Apple protects user privacy, keeping photo data off the company's servers altogether.
Both Live Text and Visual Lookup scan objects within photos saved to the device's camera roll, but their functions differ. While Visual Lookup aims to provide information about a subject, whether identifying it or adding background details, Live Text extracts text from a photo. Users can view the text in any compatible image, most likely found on prominent signs or captured documents. Text recognized by Live Text can be copied, searched on the web, or shared with other people and applications. For unidentifiable plants and animals within a photo, Visual Lookup fills in the gaps, adding information about the flora and fauna captured on the iPhone.
Related: iOS 15 Live Text For iPhone: Tapping A Phone Number In A Photo & More
To see if a photo has elements that Visual Lookup can detect, open it in full screen in the Photos app. Every photo saved in the Photos app shows an 'i' button at the bottom of the screen when opened. When a photo contains elements Visual Lookup recognizes, however, the 'i' button will be partially overlaid with sparkles, indicating that more information about the subject is available.
Read more on screenrant.com