ai by callstackincubator

React Native SDK for on-device LLM execution, Vercel AI SDK compatible

created 1 year ago
561 stars

Top 58.3% on sourcepulse

View on GitHub
Project Summary

This library enables on-device execution of Large Language Models (LLMs) within React Native applications, offering Vercel AI SDK compatibility. It targets mobile developers seeking to integrate powerful AI features without relying on cloud-based inference, thereby enhancing privacy and reducing latency.

How It Works

The project leverages the MLC LLM Engine, a C++ inference engine optimized for diverse hardware, to run LLMs directly on mobile devices. It provides a bridge between the native MLC LLM runtime and the JavaScript environment of React Native. This approach allows developers to utilize pre-compiled LLM models, managed via a configuration file, and interact with them using familiar Vercel AI SDK patterns.

Quick Start & Requirements

  • Installation: npm install react-native-ai
  • Prerequisites:
    • MLC LLM Engine repository cloned (git clone https://github.com/mlc-ai/mlc-llm).
    • MLC_LLM_SOURCE_DIR environment variable set to the cloned MLC LLM directory.
    • mlc_llm CLI installed and verified.
    • mlc-config.json file with model configurations.
    • Android: ANDROID_NDK and TVM_NDK_CC environment variables set.
    • iOS: "Increased Memory Limit" capability enabled in Xcode, and pod install run in the ios directory.
    • Polyfills for @azure/core-asynciterator-polyfill, @ungap/structured-clone, web-streams-polyfill, and text-encoding are required.
  • Preparation: Run npx react-native-ai mlc-prepare to build necessary binaries.
  • Resources: Requires cloning and setting up the MLC LLM Engine, which involves submodules and potentially NDK setup.
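The model list the CLI builds against lives in mlc-config.json. A minimal sketch of what such a file might contain, assuming a models array pairing a Hugging Face model URL with a local model id (the exact field names here are illustrative assumptions, not confirmed by this summary):

```json
{
  "models": [
    {
      "model": "https://huggingface.co/mlc-ai/Mistral-7B-Instruct-v0.3-q4f16_1-MLC",
      "model_id": "Mistral-7B-Instruct-v0.3-q4f16_1-MLC"
    }
  ]
}
```

The model_id is what application code later uses to select a model, so it must match whatever identifier the runtime reports via the available-models API.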

Highlighted Details

  • Provides direct compatibility with Vercel AI SDK functions like streamText and generateText.
  • Supports model downloading with progress callbacks and model preparation for inference.
  • Enables retrieval of available models configured in mlc-config.json.
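Since the library is Vercel AI SDK compatible, usage should follow the familiar streamText pattern. A sketch under assumptions: getModel is taken here as the provider entry point exported by react-native-ai, and the model id is a placeholder that would need to match an entry in mlc-config.json.

```typescript
// Illustrative sketch — `getModel` and the model id are assumptions;
// check the react-native-ai README for the actual provider export.
import { streamText } from 'ai';
import { getModel } from 'react-native-ai';

async function ask(prompt: string): Promise<void> {
  const { textStream } = streamText({
    // The id must match a model declared in mlc-config.json
    model: getModel('Mistral-7B-Instruct-v0.3-q4f16_1-MLC'),
    prompt,
  });

  // Tokens arrive incrementally as the on-device engine generates them
  for await (const chunk of textStream) {
    console.log(chunk);
  }
}
```

Because inference runs on-device, the stream consumption pattern is identical to the cloud-backed Vercel AI SDK providers; only the model provider changes.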

Maintenance & Community

  • Developed by Callstack, a React and React Native focused company.
  • Open-source project with contribution guidelines available.

Licensing & Compatibility

  • MIT License.
  • Compatible with commercial and closed-source applications.

Limitations & Caveats

The setup process is complex, requiring significant environment variable configuration and native build steps for both Android and iOS. Specific model compatibility and performance will depend on the underlying MLC LLM Engine and the device's hardware capabilities.

Health Check

  • Last commit: 21 hours ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 31
  • Issues (30d): 34
  • Star History: 252 stars in the last 90 days
