React Native library for on-device AI model execution
This library provides a declarative interface for running AI models directly within React Native applications, leveraging Meta's ExecuTorch framework. It targets React Native developers seeking to integrate on-device AI capabilities without deep native or ML expertise, enabling efficient, local inference for features like LLM-powered text generation and computer vision.
How It Works
The library acts as a bridge between React Native's JavaScript environment and the native ExecuTorch runtime. It abstracts the complexities of model loading, execution, and data handling, allowing developers to interact with AI models using familiar React patterns. This approach simplifies the integration of advanced AI features into mobile applications, offering state-of-the-art performance for on-device inference.
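To make the hook-driven flow concrete, here is a dependency-free TypeScript sketch of the interaction pattern a useLLM-style hook implies: state for readiness and accumulated output, plus a generate call that the native bridge would fulfil. All names here (isReady, response, generate) are illustrative assumptions, not the library's confirmed API; consult the react-native-executorch documentation for the real surface.

```typescript
// Illustrative, dependency-free sketch of the hook-style contract this kind
// of library exposes. Field and method names are assumptions.
interface LLMHandle {
  isReady: boolean;      // model loaded into the native runtime
  response: string;      // accumulated generated text
  generate: (prompt: string) => Promise<void>;
}

// Mock "runtime" standing in for the native ExecuTorch bridge, so the
// interaction pattern can be shown without a device.
function createMockLLM(tokens: string[]): LLMHandle {
  const handle: LLMHandle = {
    isReady: true,
    response: '',
    generate: async (_prompt: string) => {
      // A real bridge would stream tokens back from native inference;
      // here we just append canned tokens.
      for (const t of tokens) {
        handle.response += t;
      }
    },
  };
  return handle;
}

const llm = createMockLLM(['Hello', ', ', 'world']);
llm.generate('greet').then(() => console.log(llm.response)); // prints "Hello, world"
```

In the real library, a React component would read such state from the hook and re-render as tokens arrive, which is what makes the integration declarative.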
Quick Start & Requirements
yarn add react-native-executorch
then install iOS pods: cd ios && pod install && cd ..
Highlighted Details
React hooks (such as useLLM) for simplified model integration.
Maintenance & Community
Created by Software Mansion, a company with significant contributions to React Native. Further developments and plans can be found on their discussion page.
Licensing & Compatibility
Licensed under the MIT License, which permits commercial use and integration with closed-source applications.
Limitations & Caveats
Running LLMs requires substantial RAM; users may need to increase emulator RAM allocation to prevent crashes. The library requires React Native's New Architecture.
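Since the New Architecture is a hard requirement, projects on React Native versions where it is still opt-in (it is the default on recent releases) typically enable it as follows; verify the exact steps against your React Native version's documentation:

```shell
# Android: set in android/gradle.properties
# newArchEnabled=true

# iOS: reinstall pods with the New Architecture flag set
cd ios && RCT_NEW_ARCH_ENABLED=1 pod install && cd ..
```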