software-mansion-labs: Local RAG SDK for mobile applications
Top 93.2% on SourcePulse
React Native RAG enables developers to integrate private, local Retrieval Augmented Generation (RAG) capabilities into their React Native applications. It allows LLMs to leverage custom knowledge bases directly on the user's device, enhancing privacy and offline functionality. The library is designed for developers building privacy-first mobile AI experiences, offering a flexible and extensible framework.
How It Works
The project employs a modular and extensible architecture, allowing developers to select and integrate specific components for LLM, Embeddings, VectorStore, and TextSplitter as needed. Core to its functionality is on-device inference, powered by @react-native-rag/executorch, which facilitates private and efficient model execution directly on mobile devices. For data persistence, it supports SQLite through the @react-native-rag/op-sqlite plugin, enabling local storage and management of vector stores.
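The component seams described above can be sketched in plain TypeScript. The interface names and the toy in-memory store below are illustrative stand-ins, not the library's actual API; in practice the VectorStore role would be filled by the SQLite-backed plugin and the Embeddings role by an on-device model.

```typescript
// Hypothetical interfaces mirroring the modular architecture
// (names are illustrative, not react-native-rag's real types).
interface Embeddings {
  embed(text: string): Promise<number[]>;
}

interface TextSplitter {
  split(text: string): string[];
}

interface VectorStore {
  add(id: string, vector: number[], text: string): Promise<void>;
  similaritySearch(query: number[], k: number): Promise<string[]>;
}

// Toy in-memory vector store using cosine similarity, standing in
// for a persistent, SQLite-backed implementation.
class MemoryVectorStore implements VectorStore {
  private entries: { id: string; vector: number[]; text: string }[] = [];

  async add(id: string, vector: number[], text: string): Promise<void> {
    this.entries.push({ id, vector, text });
  }

  async similaritySearch(query: number[], k: number): Promise<string[]> {
    const cosine = (a: number[], b: number[]): number => {
      let dot = 0, na = 0, nb = 0;
      for (let i = 0; i < a.length; i++) {
        dot += a[i] * b[i];
        na += a[i] * a[i];
        nb += b[i] * b[i];
      }
      return dot / (Math.sqrt(na) * Math.sqrt(nb) || 1);
    };
    return this.entries
      .map((e) => ({ text: e.text, score: cosine(query, e.vector) }))
      .sort((x, y) => y.score - x.score)
      .slice(0, k)
      .map((e) => e.text);
  }
}
```

Because each role is an interface, any one component (splitter, embedder, store, or LLM) can be swapped without touching the others.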
Quick Start & Requirements
Installation is straightforward via npm:
npm install react-native-rag
Users will also need to integrate embedding and large language models. The library recommends @react-native-rag/executorch for on-device inference, requiring additional installation:
npm install @react-native-rag/executorch react-native-executorch
For vector store persistence, @react-native-rag/op-sqlite is available. A complete example app demonstrating library usage is provided.
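To make the overall flow concrete, here is a minimal sketch of the retrieve-then-augment pipeline a RAG library automates: split documents into chunks, embed them, retrieve the chunks most similar to the query, and build the prompt for the LLM. The splitter, the letter-frequency "embedding", and the prompt template are toy stand-ins for illustration only, not react-native-rag's actual API.

```typescript
// Naive fixed-size splitter, standing in for a TextSplitter component.
function splitText(text: string, chunkSize: number): string[] {
  const chunks: string[] = [];
  for (let i = 0; i < text.length; i += chunkSize) {
    chunks.push(text.slice(i, i + chunkSize));
  }
  return chunks;
}

// Toy deterministic "embedding": a 26-dim letter-frequency vector,
// standing in for a real on-device embedding model.
function toyEmbed(text: string): number[] {
  const v = new Array(26).fill(0);
  for (const ch of text.toLowerCase()) {
    const idx = ch.charCodeAt(0) - 97;
    if (idx >= 0 && idx < 26) v[idx] += 1;
  }
  return v;
}

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb) || 1);
}

// Retrieve the k chunks most similar to the query, then assemble the
// augmented prompt that would be sent to the on-device LLM.
function buildPrompt(docs: string[], query: string, k: number): string {
  const chunks = docs.flatMap((d) => splitText(d, 200));
  const qVec = toyEmbed(query);
  const top = chunks
    .map((c) => ({ c, s: cosine(toyEmbed(c), qVec) }))
    .sort((x, y) => y.s - x.s)
    .slice(0, k)
    .map((e) => e.c);
  return `Context:\n${top.join("\n---\n")}\n\nQuestion: ${query}`;
}
```

In the real library, each of these steps runs on-device, which is what makes the pipeline private and usable offline.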
Highlighted Details
On-device inference via @react-native-rag/executorch ensures data privacy and offline capabilities.
Vector store persistence is provided by the @react-native-rag/op-sqlite plugin.
Multiple usage patterns are supported: a useRAG hook, a configurable RAG class, and direct component interaction for fine-grained control.
Maintenance & Community
The project is developed by Software Mansion, a software agency recognized for its contributions to the React Native ecosystem. Specific community channels like Discord or Slack, or a public roadmap, are not detailed in the provided README.
Licensing & Compatibility
The library is released under the MIT license, which is permissive and generally allows for commercial use and integration into closed-source projects.
Limitations & Caveats
Users are responsible for providing and configuring the necessary embedding and LLM models, with specific setup required for on-device inference. The README does not specify performance benchmarks or detailed hardware requirements for running models locally.
Last updated: 3 months ago (Inactive)