Developers can now use ONNX Runtime, a machine learning inference engine, to build machine learning apps for Android and iOS through Xamarin


Traditionally, AI models have been run on powerful servers in the cloud. Running machine learning on the device itself, for example on a mobile phone, is far less common. This gap is mainly due to the limited storage, compute resources, and battery power available for running AI models on mobile hardware. Despite these constraints, on-device AI can be very useful in a number of scenarios.

To support AI models on mobile, Microsoft recently released ONNX Runtime version 1.10, which supports building C# applications with Xamarin, an open-source platform for building applications using C# and .NET. This helps developers run AI models on Android and iOS and enables cross-platform applications built with Xamarin.Forms. Microsoft has also added a sample Xamarin app that runs a ResNet image classifier using the ONNX Runtime NuGet package on Android and iOS devices. Detailed steps for adding the ONNX Runtime package and building Xamarin.Forms apps are available in the ONNX Runtime documentation.
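To give a flavor of what such an app looks like, here is a minimal sketch of running a bundled ONNX model from shared Xamarin.Forms code using the Microsoft.ML.OnnxRuntime NuGet package. The embedded resource name, input name ("data"), and the 1x3x224x224 input shape are assumptions that depend on the actual model file; they are not taken from Microsoft's sample.

```csharp
// Sketch: running a bundled ONNX model with the Microsoft.ML.OnnxRuntime NuGet package.
// The resource name "MyApp.resnet.onnx" and the input name "data" are hypothetical.
using System;
using System.IO;
using System.Linq;
using System.Reflection;
using Microsoft.ML.OnnxRuntime;
using Microsoft.ML.OnnxRuntime.Tensors;

public static class ResNetClassifier
{
    public static int Classify(float[] preprocessedImage)
    {
        // Load the .onnx model that was added to the project as an embedded resource.
        var assembly = typeof(ResNetClassifier).GetTypeInfo().Assembly;
        byte[] model;
        using (var stream = assembly.GetManifestResourceStream("MyApp.resnet.onnx"))
        using (var memory = new MemoryStream())
        {
            stream.CopyTo(memory);
            model = memory.ToArray();
        }

        // Wrap the preprocessed pixels in an NCHW tensor (1 x 3 x 224 x 224 for a ResNet-style model).
        var input = new DenseTensor<float>(preprocessedImage, new[] { 1, 3, 224, 224 });

        using (var session = new InferenceSession(model))
        using (var results = session.Run(new[] { NamedOnnxValue.CreateFromTensor("data", input) }))
        {
            // Return the index of the class with the highest score from the first output.
            var scores = results.First().AsEnumerable<float>().ToArray();
            return Array.IndexOf(scores, scores.Max());
        }
    }
}
```

Because this code lives in the shared project, the same classifier runs unchanged on both Android and iOS.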

ONNX Runtime supports models exported from deep learning frameworks such as PyTorch and TensorFlow, as well as classic machine learning libraries such as scikit-learn, LightGBM, and XGBoost. It is also compatible with a wide range of hardware, delivering faster inference by using the best available accelerators wherever possible. ONNX Runtime Mobile gives a significant boost to running AI models on Android and iOS, with packages optimized for a smaller storage footprint. A list of packages available for the different platforms can be found in the ONNX Runtime documentation.
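Hardware acceleration is configured through execution providers. The sketch below assumes the package in use exposes the NNAPI (Android) and Core ML (iOS) execution provider options on SessionOptions and leaves their flags at defaults; operators a provider does not support fall back to the default CPU provider.

```csharp
// Sketch: creating an inference session that prefers a platform accelerator
// where one is available. Assumes the NNAPI and Core ML execution providers
// are included in the ONNX Runtime package being referenced.
using Microsoft.ML.OnnxRuntime;
using Xamarin.Forms;

public static class SessionFactory
{
    public static InferenceSession Create(byte[] model)
    {
        var options = new SessionOptions();

        // NNAPI on Android, Core ML on iOS; the CPU provider handles everything else.
        if (Device.RuntimePlatform == Device.Android)
            options.AppendExecutionProvider_Nnapi();
        else if (Device.RuntimePlatform == Device.iOS)
            options.AppendExecutionProvider_CoreML();

        return new InferenceSession(model, options);
    }
}
```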

There are substantial benefits to moving towards on-device AI. The latency of uploading inputs to a server and downloading results is eliminated, which enables real-time processing, such as tracking, classification, and object detection with a moving camera, without any network connectivity. Because all processing happens offline, no additional mobile data charges are incurred, which ultimately reduces the cost of using the application for the end user. Data confidentiality also improves, since the data is never sent to a server; this is especially important in cases involving sensitive data.

Microsoft will continue to update the packages as it receives feedback from developers, and an option for on-device training is expected in a future release.
