AI canvas for visual exploration of LLM conversations
Tangent is a canvas for exploring AI conversations, enabling users to branch, merge, and resurrect chat threads. It targets AI enthusiasts and researchers seeking a more visual and experimental approach to interacting with LLMs, moving beyond traditional chat interfaces.
How It Works
Tangent operates as an offline-first application, primarily leveraging local models via Ollama for both embeddings and LLM generation. It processes exported chat data from platforms like Claude and ChatGPT, allowing users to create conversational branches, resume interrupted threads, and cluster discussions by inferred topics. The architecture includes a modular backend API for managing chats, messages, states, and topics, with services for background processing, clustering, and embedding generation.
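The flow described above can be sketched roughly as follows, assuming Ollama's default local API, the all-minilm and qwen2.5-coder:7b models named below, and scikit-learn's KMeans as a stand-in for the project's actual clustering service; this is an illustration, not Tangent's own code.

```python
import requests
import numpy as np
from sklearn.cluster import KMeans

OLLAMA = "http://localhost:11434"  # Ollama's default local endpoint


def embed(text: str, model: str = "all-minilm") -> np.ndarray:
    """Embed one chat message with the local Ollama server."""
    r = requests.post(f"{OLLAMA}/api/embeddings",
                      json={"model": model, "prompt": text})
    r.raise_for_status()
    return np.array(r.json()["embedding"])


def cluster_topics(messages: list[str], n_topics: int = 2) -> list[int]:
    """Group exported messages into inferred topics via their embeddings."""
    vectors = np.stack([embed(m) for m in messages])
    return KMeans(n_clusters=n_topics, n_init="auto").fit_predict(vectors).tolist()


def generate(prompt: str, model: str = "qwen2.5-coder:7b") -> str:
    """Generate a continuation for a branched or resumed thread."""
    r = requests.post(f"{OLLAMA}/api/generate",
                      json={"model": model, "prompt": prompt, "stream": False})
    r.raise_for_status()
    return r.json()["response"]


if __name__ == "__main__":
    exported = ["How do attention heads work?",
                "Any tips for a sourdough starter?",
                "Explain positional encodings."]
    print(cluster_topics(exported))                       # e.g. [0, 1, 0]
    print(generate("Continue this thread:\n" + exported[0]))
```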
Quick Start & Requirements
Run ./install.sh. Requires Ollama with all-minilm for embeddings and qwen2.5-coder:7b for generation (or custom alternatives). The install.sh script handles Ollama setup, Whisper.cpp installation, Python environment setup, and frontend startup.
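Before the first run, it can help to confirm the expected models are already pulled. A minimal check, assuming Ollama's default localhost:11434 API and the default model names above (the snippet is illustrative, not part of Tangent):

```python
import requests

# Model names mirror the defaults above; adjust if you use custom alternatives.
REQUIRED = {"all-minilm", "qwen2.5-coder:7b"}


def installed_models() -> set[str]:
    """List locally pulled Ollama models, dropping the implicit ':latest' tag."""
    tags = requests.get("http://localhost:11434/api/tags").json()["models"]
    return {m["name"].removesuffix(":latest") for m in tags}


missing = REQUIRED - installed_models()
if missing:
    print("Missing models; run `ollama pull <name>` for:", ", ".join(sorted(missing)))
else:
    print("All required models are available.")
```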
Highlighted Details
Maintenance & Community
Licensing & Compatibility
Limitations & Caveats
The project is described as "kinda hardcoded for ollama," with generalization to other backends planned. Whisper.cpp is optional, but its setup is documented as a prerequisite for the voice features. The frontend setup may require manually installing missing packages.
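As an illustration of what that planned generalization could look like, generation calls could sit behind a small provider interface so backends other than Ollama can be swapped in. This is a hypothetical sketch (the GenerationBackend name and resume_thread helper are invented here), not Tangent's actual code.

```python
from typing import Protocol

import requests


class GenerationBackend(Protocol):
    """Anything that can turn a prompt into a completion."""
    def generate(self, prompt: str) -> str: ...


class OllamaBackend:
    """The current default: generation through a local Ollama server."""
    def __init__(self, model: str = "qwen2.5-coder:7b",
                 base_url: str = "http://localhost:11434"):
        self.model, self.base_url = model, base_url

    def generate(self, prompt: str) -> str:
        r = requests.post(f"{self.base_url}/api/generate",
                          json={"model": self.model, "prompt": prompt, "stream": False})
        r.raise_for_status()
        return r.json()["response"]


def resume_thread(backend: GenerationBackend, thread: list[str]) -> str:
    """Any backend satisfying the protocol can resume a branched thread."""
    return backend.generate("\n".join(thread) + "\nContinue:")
```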