ai-town by a16z-infra

AI town starter kit for building a virtual world

created 2 years ago
8,620 stars

Top 6.0% on sourcepulse

Project Summary

AI Town is a deployable starter kit for creating a virtual world populated by AI characters that interact and socialize, inspired by the "Generative Agents" research paper. It targets developers and researchers who want to build and customize their own AI-driven simulations, offering a foundation for scalable multiplayer experiences and a JavaScript/TypeScript alternative to Python-based simulators.

How It Works

The project uses Convex as its backend, a hosted platform with a built-in database, real-time capabilities, and support for shared global state and transactions. AI characters' interactions, memory, and decision-making are powered by configurable large language models (LLMs) such as Llama 3 (the default) or OpenAI models, while PixiJS handles rendering and user interaction. Background music generation is integrated via Replicate's MusicGen.
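
Game and agent state live in Convex tables and functions rather than in the browser. As a rough illustration of that pattern (not AI Town's actual schema), a Convex query and mutation over a hypothetical `messages` table might look like this:

```typescript
// convex/messages.ts (sketch only; the table name and fields are hypothetical)
import { query, mutation } from "./_generated/server";
import { v } from "convex/values";

// Read the shared conversation log; Convex pushes updates to subscribed clients.
export const list = query({
  args: {},
  handler: async (ctx) => {
    return await ctx.db.query("messages").collect();
  },
});

// Write a new message into the shared world state as a transaction.
export const send = mutation({
  args: { author: v.string(), body: v.string() },
  handler: async (ctx, { author, body }) => {
    await ctx.db.insert("messages", { author, body });
  },
});
```

Because queries are reactive, every client subscribed to `list` updates automatically when the table changes, which is what makes the shared multiplayer world state practical.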

Quick Start & Requirements

  • Install: npm install followed by npm run dev for local development; the commands are condensed into a sketch after this list.
  • Prerequisites: Node.js (v18 recommended), a Convex account (free tier available), and optionally Ollama for local LLM inference.
  • Setup: Local setup is straightforward, with Docker Compose providing a self-contained option. Cloud deployment to Convex is also supported.
  • Links: AI Stack Devs Discord, Live Demo
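
Condensed, and assuming the repository path implied by the project header, a local run looks roughly like this:

```bash
# Prerequisites: Node.js v18 and a Convex account (free tier available).
git clone https://github.com/a16z-infra/ai-town.git
cd ai-town
npm install    # install dependencies
npm run dev    # start local development
```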

Highlighted Details

  • Built on Convex, a real-time backend platform with an ACID-compliant database and serverless functions.
  • Supports local LLM inference via Ollama or cloud-based APIs such as OpenAI and Together.ai (see the provider-selection sketch after this list).
  • Customizable characters, spritesheets, and environment maps using Tiled.
  • Optional integration with Clerk for authentication and Replicate for background music.
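
The LLM backend is swappable between a local Ollama server and hosted OpenAI-compatible APIs. The sketch below is illustrative only: the helper and the environment variables (`LLM_BASE_URL`, `LLM_MODEL`, `LLM_API_KEY`) are assumptions rather than AI Town's actual configuration keys; it just shows how one request shape can target either kind of provider.

```typescript
// Hypothetical helper: call whichever OpenAI-compatible chat endpoint is configured.
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

export async function chat(messages: ChatMessage[]): Promise<string> {
  // Ollama serves an OpenAI-compatible API locally; OpenAI or Together.ai
  // need only a different base URL, model name, and API key.
  const baseUrl = process.env.LLM_BASE_URL ?? "http://localhost:11434/v1";
  const model = process.env.LLM_MODEL ?? "llama3";
  const response = await fetch(`${baseUrl}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      ...(process.env.LLM_API_KEY
        ? { Authorization: `Bearer ${process.env.LLM_API_KEY}` }
        : {}),
    },
    body: JSON.stringify({ model, messages }),
  });
  if (!response.ok) {
    throw new Error(`LLM request failed: ${response.status}`);
  }
  const data = await response.json();
  return data.choices[0].message.content;
}
```

Note that, as the Limitations section mentions, switching providers or embedding models in AI Town itself requires re-initializing the database.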

Maintenance & Community

The project is maintained by a16z-infra, though recent activity has slowed (see Health Check below). Community support is available via a dedicated Discord server.

Licensing & Compatibility

  • License: MIT.
  • Compatibility: Permissive for commercial use and integration with closed-source applications.

Limitations & Caveats

  • Customizing LLM providers or embedding models requires re-initializing the database, potentially leading to data loss.
  • The simulation may pause after 5 minutes of inactivity unless configured otherwise.
  • Windows users may require WSL2 and specific configurations for Ollama connectivity.

Health Check

  • Last commit: 5 months ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 0
  • Issues (30d): 0

Star History

283 stars in the last 90 days

Explore Similar Projects

Starred by Chip Huyen (author of AI Engineering and Designing Machine Learning Systems), Pietro Schirano (founder of MagicPath), and 1 more.

SillyTavern by SillyTavern: LLM frontend for power users

  • Top 3.2% on sourcepulse, 17k stars
  • created 2 years ago, updated 3 days ago