GenAI Stack: a starter kit for building Generative AI apps
This repository provides a pre-configured Docker environment for building Generative AI applications, integrating Langchain, Neo4j, and Ollama. It's designed for developers and researchers looking for a quick start to explore RAG (Retrieval Augmented Generation) patterns, knowledge graphs, and LLM interactions, offering multiple demo applications for inspiration and immediate use.
How It Works
The stack leverages Docker Compose to orchestrate multiple services: Ollama for local LLM hosting, Neo4j for knowledge graph and vector storage, and Python applications built with Langchain for orchestrating LLM calls, data loading, and RAG pipelines. This approach simplifies setup and dependency management, allowing users to focus on application logic rather than infrastructure.
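The orchestration described above can be sketched as a Compose file. This is an illustrative fragment only, not the repository's actual docker-compose.yml; service names, images, and ports are assumptions based on the components named in this README.

```yaml
# Illustrative sketch -- see the repository's docker-compose.yml for the real definitions.
services:
  llm:
    image: ollama/ollama          # local LLM hosting
    ports:
      - "11434:11434"             # Ollama HTTP API
  database:
    image: neo4j:5                # knowledge graph + vector storage
    ports:
      - "7474:7474"               # Neo4j browser UI
      - "7687:7687"               # Bolt protocol
    environment:
      - NEO4J_AUTH=neo4j/password # placeholder credentials
  app:
    build: .                      # Langchain-based Python application
    depends_on:
      - llm
      - database
```

Running the apps through Compose means the Python services reach Ollama and Neo4j by service name rather than hardcoded hosts, which is what keeps the setup portable.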
Quick Start & Requirements
Copy `env.example` to `.env`, configure the LLM, database URIs, and API keys, then run `docker compose up`.
Maintenance & Community
The project is maintained by Docker. Community interaction channels are not explicitly mentioned in the README.
Licensing & Compatibility
The repository appears to be under a permissive license, but specific details are not provided in the README. Compatibility for commercial use would depend on the underlying licenses of Langchain, Neo4j, and Ollama.
Limitations & Caveats
The README warns of performance issues with Docker Desktop versions 4.24.x. The setup requires careful configuration of environment variables, especially API keys for cloud-based models.
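Because a missing or empty environment variable typically surfaces only as a confusing runtime failure inside a container, it can help to validate the configuration up front. The variable names below are hypothetical examples, not the stack's documented settings; check `env.example` for the actual list.

```python
import os

# Hypothetical required settings -- substitute the names from env.example.
REQUIRED = ["NEO4J_URI", "NEO4J_USERNAME", "NEO4J_PASSWORD", "LLM"]

def missing_vars(env=os.environ):
    """Return the required settings that are absent or empty."""
    return [name for name in REQUIRED if not env.get(name)]

if __name__ == "__main__":
    missing = missing_vars()
    if missing:
        print(f"Set these in .env before 'docker compose up': {missing}")
    else:
        print("Environment looks complete.")
```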