Apple has various plans for generative AI, but accessing them will require an iPhone 15 Pro or iPhone 15 Pro Max at a minimum, as both flagships feature 8GB of RAM. The same amount of memory is said to be retained across the iPhone 16 series, but according to one analyst, even that is insufficient: Apple is facing severe limitations in the development of its on-device Large Language Models, or LLMs.
TF International Securities analyst Ming-Chi Kuo made these predictions in his latest Medium blog post, discussing what Apple intends to unveil at WWDC 2024. While Kuo believes Apple is working on both cloud-based and on-device LLMs, he does not expect the Cupertino firm to announce them at the event. He also notes that cloud-based LLMs are difficult to train, so Apple requires immense development time. As for on-device AI, progress is hampered by the iPhone 16's 8GB of RAM.
It is no secret that phone makers cannot yet deliver complete on-device LLM solutions to the masses because the memory requirements are through the roof. One estimate suggests that Android smartphones touting 20GB of RAM will become a common sight, as these devices will have sufficient memory to run on-device LLMs. Apple is said to be researching how to store Large Language Models in flash memory, which would make it much easier to bring this technology to a multitude of devices.
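To see why 8GB of RAM is considered tight, a rough back-of-envelope sketch helps. The 7-billion-parameter size and the precision choices below are illustrative assumptions for a common on-device model class, not figures cited by Apple or Kuo:

```python
# Rough estimate of the RAM needed just to hold an LLM's weights
# at different numeric precisions. A 7B-parameter model is used as
# an illustrative size; real deployments also need memory for the
# KV cache, activations, and the OS itself.

def model_memory_gb(num_params: float, bytes_per_param: float) -> float:
    """Approximate weight footprint in gigabytes (GiB)."""
    return num_params * bytes_per_param / 1024**3

PARAMS_7B = 7e9  # assumed parameter count, for illustration only

for label, bpp in [("fp16", 2.0), ("int8", 1.0), ("int4", 0.5)]:
    print(f"{label}: ~{model_memory_gb(PARAMS_7B, bpp):.1f} GB")
```

Even at int8 quantization, such a model would consume most of an 8GB phone's memory before the OS and apps are accounted for, which is why techniques like streaming weights from flash are attractive.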
Unfortunately, based on Kuo’s latest prediction, a breakthrough from Apple is still some way off, and more progress is required. However, the company is reportedly prioritizing on-device AI features over cloud-based ones, resulting in faster operations. This may explain why the iPhone 15 Pro and iPhone 15 Pro Max, with their 8GB of RAM, are positioned as the minimum hardware for these features.
Read more on wccftech.com