self-hosted-ai-starter-kit by n8n-io

Local AI environment via Docker Compose template

Created 1 year ago
12,428 stars

Top 4.0% on SourcePulse

Project Summary

This open-source Docker Compose template provides a quick setup for a local, self-hosted AI and low-code development environment. Curated by n8n, it targets developers and users looking to build secure AI workflows locally, offering a comprehensive toolkit with n8n, Ollama, Qdrant, and PostgreSQL.

How It Works

The kit leverages Docker Compose to orchestrate multiple AI components. It integrates n8n for low-code workflow automation, Ollama for running local LLMs, Qdrant as a vector store for AI data, and PostgreSQL for data management. This combination allows users to build applications like AI agents, secure document summarizers, and enhanced communication bots without relying on external cloud services.
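
As a rough sketch of that orchestration (illustrative only, not the repository's actual docker-compose.yml; image tags, credentials, and published ports are assumptions based on each tool's defaults):

  services:
    postgres:
      image: postgres:16            # stores n8n workflows and credentials
      environment:
        POSTGRES_USER: n8n
        POSTGRES_PASSWORD: change-me
        POSTGRES_DB: n8n
    n8n:
      image: n8nio/n8n              # low-code workflow editor, UI on port 5678
      ports: ["5678:5678"]
      depends_on: [postgres]
    ollama:
      image: ollama/ollama          # local LLM inference, HTTP API on port 11434
      ports: ["11434:11434"]
    qdrant:
      image: qdrant/qdrant          # vector store, REST API on port 6333
      ports: ["6333:6333"]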

Quick Start & Requirements

  • Install: git clone https://github.com/n8n-io/self-hosted-ai-starter-kit.git && cd self-hosted-ai-starter-kit
  • Run (Nvidia GPU): docker compose --profile gpu-nvidia up
  • Run (AMD GPU Linux): docker compose --profile gpu-amd up
  • Run (Mac/Apple Silicon, with Ollama running natively on the host): docker compose up
  • Run (CPU): docker compose --profile cpu up
  • Prerequisites: Docker and Docker Compose. Nvidia GPU users may need to configure Docker for GPU access (e.g., the NVIDIA Container Toolkit). Mac users running Ollama natively need to point the containers at the host's Ollama instance via environment variables (see the sketch after this list).
  • Setup: Access n8n at http://localhost:5678/. Initial LLM download (e.g., Llama3.2) may take time.
  • Docs: n8n AI Concepts
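
For the Mac/Apple Silicon path above, a minimal sketch of the adjustment (the exact variable names the kit expects are an assumption; check the repository's .env and README):

  # Run Ollama natively on macOS so it can use Apple Silicon acceleration
  ollama serve &
  ollama pull llama3.2

  # Inside the containers, reach the host's Ollama through Docker Desktop's
  # host.docker.internal alias rather than an ollama container, e.g. by setting
  # the Ollama base URL used by n8n's credentials to:
  #   http://host.docker.internal:11434

  docker compose up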

Highlighted Details

  • Bundles n8n's low-code workflow platform, with over 400 integrations and AI-specific nodes.
  • Supports local LLM inference via Ollama and vector storage with Qdrant (see the quick check after this list).
  • Provides pre-configured Docker Compose profiles for different hardware (Nvidia GPU, AMD GPU, CPU, Mac).
  • Includes example workflows for AI agents, document analysis, and chatbots.
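
Assuming the compose file publishes each service's default port on the host (verify with docker compose ps), a quick post-startup sanity check might look like:

  # n8n editor UI
  curl -I http://localhost:5678/

  # Ollama: list locally downloaded models
  curl http://localhost:11434/api/tags

  # Qdrant: list vector collections
  curl http://localhost:6333/collections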

Maintenance & Community

  • Curated by n8n.
  • Support available via the n8n Forum.

Licensing & Compatibility

  • Licensed under the Apache License 2.0.
  • Suitable for commercial use and integration with closed-source applications.

Limitations & Caveats

The starter kit is designed for proof-of-concept projects and is not fully optimized for production environments. Mac users cannot expose their GPU directly to Docker containers; running Ollama natively on macOS works around this.

Health Check

  • Last Commit: 2 weeks ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 7
  • Issues (30d): 0
  • Star History: 979 stars in the last 30 days
