React Native SDK for on-device LLM execution, Vercel AI SDK compatible
This library enables on-device execution of Large Language Models (LLMs) within React Native applications, offering Vercel AI SDK compatibility. It targets mobile developers seeking to integrate powerful AI features without relying on cloud-based inference, thereby enhancing privacy and reducing latency.
How It Works
The project leverages the MLC LLM Engine, a C++ inference engine optimized for diverse hardware, to run LLMs directly on mobile devices. It provides a bridge between the native MLC LLM runtime and the JavaScript environment of React Native. This approach allows developers to utilize pre-compiled LLM models, managed via a configuration file, and interact with them using familiar Vercel AI SDK patterns.
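For illustration, here is a minimal sketch of that interaction from the JavaScript side. The getModel factory and the model id are assumptions made for this example (the library's real export names may differ); generateText itself is the standard Vercel AI SDK call.

```ts
// Hypothetical sketch: `getModel` and the model id are assumed names,
// not the library's confirmed API. `generateText` is the Vercel AI SDK call.
import { generateText } from 'ai';
import { getModel } from 'react-native-ai'; // assumed export name

const { text } = await generateText({
  // The id should match a model entry declared in mlc-config.json.
  model: getModel('Llama-3.2-3B-Instruct'),
  prompt: 'Explain on-device inference in one sentence.',
});
// `text` was produced entirely on-device by the MLC LLM runtime.
```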
Quick Start & Requirements
npm install react-native-ai

Installing the package is only the first step; the native runtime has several prerequisites:

- MLC LLM source cloned locally (git clone https://github.com/mlc-ai/mlc-llm).
- MLC_LLM_SOURCE_DIR environment variable set to the cloned MLC LLM directory.
- mlc_llm CLI installed and verified.
- mlc-config.json file with model configurations.
- ANDROID_NDK and TVM_NDK_CC environment variables set (Android builds).
- pod install run in the ios directory (iOS builds).
- Polyfills @azure/core-asynciterator-polyfill, @ungap/structured-clone, web-streams-polyfill, and text-encoding are required (see the sketch after this list).
- Run npx react-native-ai mlc-prepare to build the necessary binaries.
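The polyfills must be registered before any AI SDK code runs. One common pattern in a React Native entry file is sketched below; exact import paths depend on the polyfill package versions installed, so treat this as an assumption-laden sketch rather than the library's documented setup.

```ts
// index.ts (app entry) — register globals that the Vercel AI SDK expects
// but React Native's JS runtime lacks. Import paths are assumptions and
// may vary across polyfill package versions.
import '@azure/core-asynciterator-polyfill';

import { polyfillGlobal } from 'react-native/Libraries/Utilities/PolyfillFunctions';
import structuredClone from '@ungap/structured-clone';
import { ReadableStream } from 'web-streams-polyfill';
import { TextDecoder, TextEncoder } from 'text-encoding';

polyfillGlobal('structuredClone', () => structuredClone);
polyfillGlobal('ReadableStream', () => ReadableStream);
polyfillGlobal('TextEncoder', () => TextEncoder);
polyfillGlobal('TextDecoder', () => TextDecoder);
```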
Highlighted Details
- Supports the Vercel AI SDK's streamText and generateText functions (streaming sketch below).
- Models are declared and managed via mlc-config.json.
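Streaming goes through the same standard AI SDK interface. A hedged sketch, reusing the assumed getModel factory from the earlier example:

```ts
import { streamText } from 'ai';
import { getModel } from 'react-native-ai'; // assumed export name, as above

const { textStream } = streamText({
  model: getModel('Llama-3.2-3B-Instruct'), // illustrative model id
  prompt: 'Write a haiku about mobile inference.',
});

// Tokens arrive incrementally as the on-device engine generates them;
// in a component you would append each chunk to state.
let output = '';
for await (const chunk of textStream) {
  output += chunk;
}
```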
Maintenance & Community
Last updated 21 hours ago; overall project activity is currently marked inactive.
Licensing & Compatibility
Limitations & Caveats
The setup process is complex, requiring significant environment variable configuration and native build steps for both Android and iOS. Specific model compatibility and performance will depend on the underlying MLC LLM Engine and the device's hardware capabilities.