Apple's iPhone has long been renowned for its sleek design and user-friendly interface, but the tech giant is now venturing into the realm of artificial intelligence (AI) to further elevate the user experience.
In a recent research paper published on arXiv, Apple has revealed a groundbreaking advancement titled "Ferret-UI: Grounded Mobile UI Understanding with Multimodal LLMs." This innovative system employs a multimodal large language model to interpret mobile user interfaces in detail, a capability that could change how users interact with their iPhones, Forbes reported.
Ferret-UI, an evolution of the previously introduced Ferret system, boasts an array of capabilities, including icon recognition, text detection, widget parsing, and screen description. This functionality not only facilitates seamless navigation across applications but also paves the way for improved accessibility features.
Moreover, Ferret-UI holds promise for developers, serving as a versatile testing tool for simulating diverse user scenarios and optimising app performance across different demographics.
Separately, backend code uncovered by Nicolás Álvarez hints at additional server-side tools dubbed "Safari Browsing Assistant" and "Encrypted Visual Search." While these features align with the direction outlined in the Ferret-UI research, they remain subject to change and may serve only as placeholders for future implementations.
Despite these strides, Apple has faced criticism for lagging behind Google in embracing AI-driven smartphones. However, at the Worldwide Developers Conference in June, Apple is expected to shed light on its AI roadmap, laying the groundwork for the highly anticipated iPhone 16 and iPhone 16 Pro slated for launch later this year.