Apple's WWDC 2024 event is just 10 days away, and people are eagerly waiting for the company to reveal its latest advances in artificial intelligence (AI) and upcoming iOS 18 features. Rumours already suggest iOS 18 will include AI capabilities, and more are surfacing as the event approaches. Recently, a tipster revealed advanced AI capabilities planned for Apple's voice assistant, Siri, which may include a deeper contextual understanding.
According to an Apple Insider report, the company is working on a project codenamed “Greymatter” which is testing and building the upcoming AI features for iOS 18. While the project spans several app features, its main focus is to bring advanced AI capabilities to Siri. The report highlighted that Apple is currently working on a Siri feature called “Catch Up” based on notification summarisation. With this feature, Siri can provide users with a concise overview of their recent notifications, eliminating the need to scroll through a long list of individual alerts.
Reportedly, Siri is getting a new smart response framework along with Apple's on-device large language model (LLM) to gain advanced response generation capabilities. This will allow the voice assistant to draw on knowledge of “people and companies, calendar events, locations, dates, and much more” to provide detailed responses to user queries. Furthermore, with the Ajax LLM, Siri is expected to gain AI capabilities such as text summarisation and transcription. The report highlights, “This ultimately means that Siri will be able to answer queries on-device, create summaries of lengthy articles, or transcribe audio as in the updated Notes or Voice Memos applications.”
Apple is also working on Siri's ability to handle cross-device media and TV control tasks. This would let users command the voice assistant from an Apple Watch to play music, movies, and other content on their other devices.
Note that all these features are based on leaks and speculation and have not been officially confirmed by Apple.
Read more on tech.hindustantimes.com