react-native-executorch by software-mansion

React Native library for on-device AI model execution

created 9 months ago
878 stars

Top 41.9% on sourcepulse

Project Summary

This library provides a declarative interface for running AI models directly within React Native applications, leveraging Meta's ExecuTorch framework. It targets React Native developers seeking to integrate on-device AI capabilities without deep native or ML expertise, enabling efficient, local inference for features like LLM-powered text generation and computer vision.

How It Works

The library acts as a bridge between React Native's JavaScript environment and the native ExecuTorch runtime. It abstracts the complexities of model loading, execution, and data handling, letting developers interact with AI models through familiar React patterns such as hooks. This simplifies the integration of advanced AI features into mobile applications, while the underlying ExecuTorch runtime handles inference efficiently on-device.
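
To make the pattern concrete, a minimal sketch of an LLM-backed screen follows. It assumes a hook shaped like the useLLM mentioned under Highlighted Details; the option names (modelSource, tokenizerSource), the return values (isReady, generate, response), and the model URLs are illustrative placeholders rather than the library's confirmed API.

// Minimal sketch of the declarative hook pattern (TypeScript/TSX).
// All option and return-value names here are assumptions for illustration;
// consult the repository README for the actual API.
import React from 'react';
import { Button, Text, View } from 'react-native';
import { useLLM } from 'react-native-executorch';

export function ChatScreen() {
  // Assumed options: where to load the exported model and its tokenizer from.
  const llm = useLLM({
    modelSource: 'https://example.com/llama-3.2-1B.pte',  // placeholder URL
    tokenizerSource: 'https://example.com/tokenizer.bin', // placeholder URL
  });

  return (
    <View>
      <Button
        title="Ask"
        disabled={!llm.isReady}
        onPress={() => llm.generate('Explain on-device inference in one sentence.')}
      />
      {/* Assumed: generated text streams into a response field as it is produced. */}
      <Text>{llm.response}</Text>
    </View>
  );
}

The point of the sketch is the shape of the integration: model lifecycle and native execution stay behind the hook, and the component only reacts to state changes.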

Quick Start & Requirements

Highlighted Details

  • Enables on-device AI model execution in React Native via ExecuTorch.
  • Provides declarative hooks (useLLM) for simplified model integration.
  • Includes examples for speech-to-text (Whisper), computer vision, and LLMs (Llama); a sketch of the vision-hook pattern follows this list.
  • Requires the React Native New Architecture.
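
The vision examples follow the same hook-based pattern. The sketch below is hypothetical: the hook name useClassification, its options, and its return values are assumptions made for illustration and may not match the library's actual exports; it is included only to show how a non-LLM model would slot into a component.

// Hypothetical vision-hook sketch (TypeScript/TSX); names are illustrative only.
import React, { useState } from 'react';
import { Button, Text, View } from 'react-native';
import { useClassification } from 'react-native-executorch';

export function LabelPhoto({ imageUri }: { imageUri: string }) {
  // Assumed option: a source for the exported image-classification model.
  const model = useClassification({
    modelSource: 'https://example.com/mobilenet.pte', // placeholder URL
  });
  const [topLabel, setTopLabel] = useState<string | null>(null);

  const classify = async () => {
    // Assumed: forward() runs inference and resolves to a label-to-score map.
    const scores = await model.forward(imageUri);
    const best = Object.entries(scores).sort((a, b) => b[1] - a[1])[0];
    setTopLabel(best ? best[0] : null);
  };

  return (
    <View>
      <Button title="Classify" disabled={!model.isReady} onPress={classify} />
      <Text>{topLabel ?? 'No result yet'}</Text>
    </View>
  );
}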

Maintenance & Community

The library is created and maintained by Software Mansion, a company with significant contributions to the React Native ecosystem. Plans for further development can be found on their discussion page.

Licensing & Compatibility

Licensed under The MIT License, permitting commercial use and integration with closed-source applications.

Limitations & Caveats

Running LLMs requires substantial RAM, and users may need to increase emulator RAM allocation to prevent crashes. The library mandates the React Native New Architecture.

Health Check

Last commit: 2 days ago
Responsiveness: 1 day
Pull Requests (30d): 37
Issues (30d): 24

Star History

270 stars in the last 90 days
