ykhli — starter kit for local-only AI apps, focused on document Q&A
Top 29.4% on SourcePulse
This project provides a starter kit for building local-only AI applications, specifically focusing on document Q&A, with the primary benefit of zero running costs. It targets developers and users who want to experiment with and deploy AI functionalities without relying on cloud services or incurring expenses.
How It Works
The stack combines open-source technologies for a fully local AI experience. Ollama handles model inference, Supabase with pgvector serves as the vector database, and Langchain.js orchestrates LLM interactions. Embeddings are generated with Transformers.js using the all-MiniLM-L6-v2 model, and the application logic is built with Next.js. This modular approach allows individual components to be swapped out and ensures all processing happens on the user's machine.
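To make the retrieval step concrete, here is a minimal sketch of similarity search over embedded documents. This is an illustration, not the project's actual code: the `Doc` type and the `cosineSimilarity` and `topK` helpers are hypothetical, and in the real stack the nearest-neighbor lookup is performed by pgvector inside Supabase rather than in application code.

```typescript
// Illustrative sketch of vector retrieval for document Q&A.
// Names (Doc, cosineSimilarity, topK) are assumptions, not the project's API.

type Doc = { id: string; text: string; embedding: number[] };

// Cosine similarity, a distance measure commonly used with pgvector.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Return the k documents whose embeddings are closest to the query embedding;
// the retrieved text would then be passed to the LLM as context.
function topK(query: number[], docs: Doc[], k: number): Doc[] {
  return [...docs]
    .sort(
      (x, y) =>
        cosineSimilarity(query, y.embedding) -
        cosineSimilarity(query, x.embedding)
    )
    .slice(0, k);
}
```

In the actual stack, the query embedding would come from Transformers.js (all-MiniLM-L6-v2 produces 384-dimensional vectors), and the `topK` step would be a pgvector similarity query instead of an in-memory sort.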
Quick Start & Requirements
npm install
supabase start (requires the Supabase CLI; install via brew install supabase/tap/supabase)
node src/scripts/indexBlogLocal.mjs
npm run dev
Then open http://localhost:3000 in a browser.
Highlighted Details
Limitations & Caveats
The project is presented as a starter kit, implying it may require further development for production readiness. Specific limitations regarding supported operating systems, hardware requirements beyond local execution, or performance benchmarks are not detailed in the README.
Maintained by ykhli; the last update was about a year ago and the project is currently marked inactive.