self-hosted-ai-starter-kit by n8n-io

Local AI environment via Docker Compose template

created 1 year ago
10,994 stars

Top 4.7% on sourcepulse

Project Summary

This open-source Docker Compose template provides a quick setup for a local, self-hosted AI and low-code development environment. Curated by n8n, it targets developers and users looking to build secure AI workflows locally, offering a comprehensive toolkit with n8n, Ollama, Qdrant, and PostgreSQL.

How It Works

The kit leverages Docker Compose to orchestrate multiple AI components. It integrates n8n for low-code workflow automation, Ollama for running local LLMs, Qdrant as a vector store for AI data, and PostgreSQL for data management. This combination allows users to build applications like AI agents, secure document summarizers, and enhanced communication bots without relying on external cloud services.
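
Once the stack is up, each service can be probed on its default port. A minimal sanity check might look like this (the ports and paths are the upstream projects' defaults for n8n, Ollama, and Qdrant, assumed here rather than stated in this summary):

```shell
# Probe the stack's default endpoints (assumed upstream defaults):
#   n8n editor UI health: 5678, Ollama API: 11434, Qdrant REST API: 6333
check_stack() {
  for url in \
    http://localhost:5678/healthz \
    http://localhost:11434/api/tags \
    http://localhost:6333/collections
  do
    if curl -fs "$url" > /dev/null 2>&1; then
      echo "OK   $url"
    else
      echo "DOWN $url"
    fi
  done
}
check_stack
```

Each line prints `OK` or `DOWN`, so the check degrades gracefully while containers are still starting or models are still downloading.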

Quick Start & Requirements

  • Install: git clone https://github.com/n8n-io/self-hosted-ai-starter-kit.git && cd self-hosted-ai-starter-kit
  • Run (Nvidia GPU): docker compose --profile gpu-nvidia up
  • Run (AMD GPU Linux): docker compose --profile gpu-amd up
  • Run (Mac/Apple Silicon, CPU or local Ollama): docker compose up
  • Run (CPU): docker compose --profile cpu up
  • Prerequisites: Docker, Docker Compose. Nvidia GPU users may need to configure Docker for GPU access. Mac users running Ollama locally need to adjust environment variables.
  • Setup: Access n8n at http://localhost:5678/. Initial LLM download (e.g., Llama3.2) may take time.
  • Docs: n8n AI Concepts
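
For Mac users running Ollama natively, the environment-variable adjustment mentioned above can be sketched as follows. The `OLLAMA_HOST` variable name, the `host.docker.internal` host alias, and port 11434 are assumptions based on Docker Desktop's standard host alias and Ollama's default port, not details given in this summary:

```shell
# Point the containers at the host's native Ollama instead of a
# containerized one. host.docker.internal is Docker Desktop's alias for
# the host machine; 11434 is Ollama's default port. (Both assumed.)
cd "$(mktemp -d)"   # stand-in for the cloned repo directory
echo 'OLLAMA_HOST=host.docker.internal:11434' >> .env
grep OLLAMA_HOST .env
```

After this, `docker compose up` (without a GPU profile) uses the host's Ollama for inference while the rest of the stack runs in containers.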

Highlighted Details

  • Bundles n8n, which ships 400+ integrations and AI-specific nodes.
  • Supports local LLM inference via Ollama and vector storage with Qdrant.
  • Provides pre-configured Docker Compose profiles for different hardware (Nvidia GPU, AMD GPU, CPU, Mac).
  • Includes example workflows for AI agents, document analysis, and chatbots.

Maintenance & Community

  • Curated by n8n.
  • Support available via the n8n Forum.

Licensing & Compatibility

  • Licensed under the Apache License 2.0.
  • Suitable for commercial use and integration with closed-source applications.

Limitations & Caveats

The starter kit is designed for proof-of-concept projects and is not fully optimized for production environments. Mac users cannot directly expose their GPU to Docker instances.

Health Check

  • Last commit: 2 weeks ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 14
  • Issues (30d): 0
  • Star History: 2,671 stars in the last 90 days

Explore Similar Projects

Starred by Tobi Lutke (Cofounder of Shopify), Andrej Karpathy (Founder of Eureka Labs; formerly at Tesla, OpenAI; author of CS 231n), and 13 more.

open-webui by open-webui — Self-hosted AI platform for local LLM deployment (105k stars, top 0.9%; created 1 year ago, updated 1 day ago)