loom by socketteer

Tree-based writing interface for human-AI collaboration

Created 4 years ago · 1,279 stars · Top 31.8% on sourcepulse

Project Summary

This project provides an experimental, tree-based writing interface designed for human-AI collaboration, specifically targeting writers and researchers who want to explore complex narratives or ideas. It offers a novel "block multiverse" visualization for GPT-3 generations, allowing users to navigate and manipulate branching story paths.

How It Works

Loom uses a tree structure to represent writing projects, letting users visualize and move between different narrative branches. Its core innovation is the "block multiverse" interface, which plots potential AI-generated continuations as interconnected blocks. Users can "renormalize" to a specific block, effectively zooming into that branch, and propagate further generations from that point. This makes it easy to explore counterfactuals and iteratively refine AI-generated content.
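
As a rough illustration of the underlying idea, a branching draft can be modeled as a tree of text fragments, where choosing a branch reassembles one continuous passage. This is a hypothetical sketch, not Loom's actual data model; the names (Node, add_continuation, path_text) are made up for illustration.

```python
# Hypothetical sketch of a branching draft (not Loom's actual data model):
# each node stores a text fragment, alternative continuations hang off it as
# children, and a "story" is one root-to-leaf path through the tree.
from dataclasses import dataclass, field


@dataclass
class Node:
    text: str
    children: list["Node"] = field(default_factory=list)

    def add_continuation(self, text: str) -> "Node":
        """Attach an alternative continuation as a new child branch."""
        child = Node(text)
        self.children.append(child)
        return child

    def path_text(self, path: list[int]) -> str:
        """Concatenate text along one chosen branch ("renormalizing" to it)."""
        node, parts = self, [self.text]
        for i in path:
            node = node.children[i]
            parts.append(node.text)
        return "".join(parts)


root = Node("The door creaked open")
root.add_continuation(", revealing an empty hall.")
root.add_continuation(" and a cold wind rushed in.")
print(root.path_text([1]))  # follow the second branch
```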

Quick Start & Requirements

  • Installation:
    • Linux: python3 -m venv env, source env/bin/activate, pip install -r requirements.txt
    • macOS: conda create -n pyloom python=3.10, conda activate pyloom, pip install -r requirements-mac.txt
    • Docker (Linux): make build, make run
  • Prerequisites: Python >= 3.9.13 (3.10 recommended for macOS), tkinter (Linux), OpenAI API key (or other supported providers like GooseAI, AI21).
  • Local Inference: Supports llama-cpp-python for local LLM execution (e.g., Llama 3). Requires setting up llama-cpp-python with an appropriate backend (e.g., Metal for MPS on Mac) and running its server; a minimal client sketch follows this list.
  • Resources: Requires API keys for cloud-based models or a local LLM setup.
  • Docs: Conceptual explanation of block multiverse interface
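
For local inference, a minimal client sketch might look roughly like the following. This is not Loom's own provider code; it assumes llama-cpp-python's OpenAI-compatible server running on its default port and the openai Python client, and the model name is a placeholder.

```python
# Hypothetical sketch of querying a local llama-cpp-python server (not Loom's
# provider code). Assumes the server was started separately, e.g.:
#   python -m llama_cpp.server --model <path-to-your-model.gguf>
# which serves an OpenAI-compatible API on http://localhost:8000/v1 by default.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # local endpoint instead of api.openai.com
    api_key="not-needed",                 # the local server ignores the key
)

resp = client.completions.create(
    model="local-model",                  # placeholder; the server uses whatever model it loaded
    prompt="The door creaked open",
    max_tokens=32,
    temperature=1.0,
)
print(resp.choices[0].text)
```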

Highlighted Details

  • Tree-based visualization with node expansion/collapse and topology editing.
  • "Block multiverse" mode for visualizing and navigating AI generation branches.
  • Extensive hotkey support for navigation, editing, and generation control.
  • Supports local inference via llama-cpp-python.

Maintenance & Community

The README describes the project as experimental and under active development, warning that it may be unstable and that documentation is sparse. No community links (e.g., Discord or Slack) or roadmap are provided.

Licensing & Compatibility

The README does not explicitly state a license.

Limitations & Caveats

The project is explicitly labeled experimental, unstable, and poorly documented, so functionality and stability may change without notice. The absence of a stated license could complicate commercial or closed-source use.

Health Check

  • Last commit: 1 year ago
  • Responsiveness: 1+ week
  • Pull Requests (30d): 0
  • Issues (30d): 0

Star History: 34 stars in the last 90 days
