Microsoft is dedicated to providing developers with powerful tools to accelerate the development of AI-driven apps in this new era. Regardless of whether developers are working on x86/x64 or Arm64 platforms, Microsoft aims to simplify the integration of AI-powered experiences into Windows apps across both Cloud and Edge environments.
During last year's Build conference, Microsoft introduced Hybrid Loop, a development pattern that enables seamless hybrid AI scenarios spanning Azure and client devices. Today, Microsoft has announced that this vision is being realised, with ONNX Runtime serving as the gateway to Windows AI, complemented by Olive, a toolchain designed to streamline the optimisation of models for Windows and other devices. By leveraging ONNX Runtime, third-party developers now have access to the same tools Microsoft uses internally to run AI models on Windows and other devices, whether on CPU, GPU, NPU, or in hybrid setups with Azure.
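For context, running a model on-device with ONNX Runtime looks like the minimal sketch below, using the standard Python API. The model path, input name, and input shape are placeholders for illustration; the hardware-specific provider mentioned in the comment is optional.

```python
# Minimal sketch: loading and running an ONNX model locally with ONNX Runtime.
# "model.onnx" and the input shape are placeholders.
import numpy as np
import onnxruntime as ort

# Pick an execution provider; ONNX Runtime falls back to CPU if a
# hardware-specific provider (e.g. "DmlExecutionProvider" for DirectML
# on Windows GPUs) is not available in the installed package.
session = ort.InferenceSession(
    "model.onnx",
    providers=["CPUExecutionProvider"],
)

input_name = session.get_inputs()[0].name
dummy_input = np.random.rand(1, 3, 224, 224).astype(np.float32)

# Run inference entirely on the device.
outputs = session.run(None, {input_name: dummy_input})
print(outputs[0].shape)
```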
Notably, ONNX Runtime now supports a unified API that allows developers to run models either on the device or in the Cloud, enabling hybrid inferencing scenarios. This means that apps can leverage local resources whenever possible and seamlessly switch to cloud-based processing when necessary. With the Azure EP preview, developers can connect to models deployed in AzureML or even utilise the Azure OpenAI service. By simply specifying the cloud endpoint and defining criteria for cloud usage, developers gain enhanced control over costs and user experience. Azure EP empowers developers to choose between larger cloud models and smaller local models at runtime.
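To make the hybrid pattern concrete, the sketch below shows one way an app might route requests: serve simple inputs from a small on-device model and send larger ones to a cloud endpoint. This is an illustrative sketch of the pattern only, not the Azure EP API itself (which wires this decision into the ONNX Runtime session); the endpoint URL, payload format, and length threshold are hypothetical.

```python
# Illustrative hybrid dispatch: local model for small requests, cloud for the rest.
# CLOUD_ENDPOINT, the JSON payload, and MAX_LOCAL_TOKENS are hypothetical placeholders.
import numpy as np
import onnxruntime as ort
import requests

LOCAL_SESSION = ort.InferenceSession(
    "small_local_model.onnx",
    providers=["CPUExecutionProvider"],
)
CLOUD_ENDPOINT = "https://<your-workspace>.inference.ml.azure.com/score"  # placeholder
MAX_LOCAL_TOKENS = 256  # example criterion for staying on-device

def infer(tokens: np.ndarray, api_key: str) -> np.ndarray:
    if tokens.shape[-1] <= MAX_LOCAL_TOKENS:
        # Small request: keep inference on the device.
        input_name = LOCAL_SESSION.get_inputs()[0].name
        return LOCAL_SESSION.run(None, {input_name: tokens})[0]
    # Larger request: call the model deployed in the cloud instead.
    response = requests.post(
        CLOUD_ENDPOINT,
        headers={"Authorization": f"Bearer {api_key}"},
        json={"input_ids": tokens.tolist()},
    )
    response.raise_for_status()
    return np.asarray(response.json()["output"])
```

The criterion here (input length) stands in for whatever cost or experience trade-off a developer defines when deciding between the larger cloud model and the smaller local one.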