Starter kit for local-only AI apps, focused on document Q&A
Top 30.0% on sourcepulse
This project provides a starter kit for building local-only AI applications, specifically document Q&A, with the primary benefit of zero running costs. It targets developers and users who want to experiment with and deploy AI features without relying on cloud services or incurring usage fees.
How It Works
The stack combines open-source technologies for a fully local AI experience. Ollama handles model inference, Supabase with pgvector serves as the vector database, and LangChain.js orchestrates LLM interactions. Embeddings are generated with Transformers.js using the all-MiniLM-L6-v2 model, and the application logic is built with Next.js. This modular approach allows easy substitution of components and ensures all processing occurs on the user's machine.
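The README does not include code for this flow, but a minimal sketch of one question-answering round trip might look like the following. The match_documents RPC, the local Supabase URL and anon key, and the llama2 model name are assumptions borrowed from the standard Supabase pgvector example, not confirmed details of this repo.

```ts
import { pipeline } from "@xenova/transformers";
import { createClient } from "@supabase/supabase-js";
import { Ollama } from "@langchain/community/llms/ollama";

// Local Supabase instance started by `supabase start`; the URL is the CLI
// default, and the anon key (printed by the CLI) is assumed to be in an env var.
const supabase = createClient(
  "http://localhost:54321",
  process.env.SUPABASE_ANON_KEY!
);

export async function answerQuestion(question: string): Promise<string> {
  // 1. Embed the question locally with Transformers.js (all-MiniLM-L6-v2, 384 dims).
  const embed = await pipeline("feature-extraction", "Xenova/all-MiniLM-L6-v2");
  const output = await embed(question, { pooling: "mean", normalize: true });
  const queryEmbedding = Array.from(output.data as Float32Array);

  // 2. Similarity search in pgvector. `match_documents` is the RPC name used in
  //    Supabase's pgvector examples; this kit's actual schema may differ.
  const { data: docs, error } = await supabase.rpc("match_documents", {
    query_embedding: queryEmbedding,
    match_count: 4,
  });
  if (error) throw error;

  // 3. Ask a local Ollama model, grounding the answer in the retrieved chunks.
  const llm = new Ollama({ baseUrl: "http://localhost:11434", model: "llama2" });
  const context = docs
    .map((doc: { content: string }) => doc.content)
    .join("\n---\n");
  return llm.invoke(
    `Answer the question using only this context:\n${context}\n\nQuestion: ${question}`
  );
}
```

Because the embedder, database, and LLM all run locally, swapping any one component (for example, a different Ollama model) only touches that step.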
Quick Start & Requirements
1. Install dependencies: `npm install`
2. Start the local database: `supabase start` (requires the Supabase CLI; install via `brew install supabase/tap/supabase`)
3. Index the sample documents: `node src/scripts/indexBlogLocal.mjs`
4. Start the dev server: `npm run dev`, then open http://localhost:3000
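Since Ollama performs inference, it must also be installed and running before the app can answer questions; the README does not name the expected model, so assume one needs to be pulled first (for example, `ollama pull llama2`).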
Highlighted Details
Maintenance & Community
The repository was last updated about a year ago and is currently marked inactive.
Licensing & Compatibility
Limitations & Caveats
The project is presented as a starter kit, so it will likely require further development before production use. The README does not detail supported operating systems, hardware requirements beyond local execution, or performance benchmarks.