OmegaWiki by skyllwt

AI research platform for full-lifecycle knowledge compounding

Created 2 weeks ago

378 stars

Top 75.0% on SourcePulse

Project Summary

ΩmegaWiki is a comprehensive AI research platform designed to address knowledge persistence and workflow fragmentation in scientific research. It transforms Andrej Karpathy's LLM-Wiki concept into a full-lifecycle system, enabling researchers to manage everything from paper ingestion to publication. Powered by 23 Claude Code skills, it centralizes research knowledge in a wiki, fostering compounding insights and accelerating discovery for AI researchers.

How It Works

ΩmegaWiki employs a wiki-centric architecture where every Claude Code skill reads from and writes back to a central wiki. This approach ensures knowledge persistence and compounding, unlike RAG systems that re-process information on each query. The system builds a structured knowledge graph with 9 entity types and 9 relationship types, explicitly tracking knowledge gaps and failed experiments as anti-repetition memory. This design facilitates the generation of novel research ideas, experiment design, and paper writing, all driven by a unified knowledge base.
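To make the graph model concrete, here is a minimal Python sketch of a wiki-backed knowledge graph with anti-repetition memory. The class and method names are hypothetical; the entity and relationship vocabularies below list only the examples the project names (it defines 9 of each).

```python
from dataclasses import dataclass, field

# Illustrative subsets of the project's 9 entity and 9 relationship types.
ENTITY_TYPES = {"Paper", "Concept", "Idea", "Experiment"}
RELATION_TYPES = {"supports", "contradicts", "inspired_by"}

@dataclass
class WikiGraph:
    """Minimal wiki-centric knowledge graph (hypothetical structure)."""
    entities: dict = field(default_factory=dict)          # id -> entity type
    relations: list = field(default_factory=list)         # (src, rel, dst) triples
    failed_experiments: set = field(default_factory=set)  # anti-repetition memory

    def add_entity(self, eid: str, etype: str) -> None:
        if etype not in ENTITY_TYPES:
            raise ValueError(f"unknown entity type: {etype}")
        self.entities[eid] = etype

    def link(self, src: str, rel: str, dst: str) -> None:
        if rel not in RELATION_TYPES:
            raise ValueError(f"unknown relation: {rel}")
        self.relations.append((src, rel, dst))

    def record_failure(self, experiment_id: str) -> None:
        # Failed experiments persist in the wiki rather than being discarded.
        self.failed_experiments.add(experiment_id)

    def already_failed(self, experiment_id: str) -> bool:
        # A skill would consult this before proposing the same experiment again.
        return experiment_id in self.failed_experiments
```

Because every skill reads from and writes to the same structure, a failure recorded by one skill (e.g., experiment execution) is visible to another (e.g., idea generation), which is what makes the knowledge compound rather than reset per query.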

Quick Start & Requirements

  • Primary Install: Clone the repository, install Claude Code (npm install -g @anthropic-ai/claude-code), log in (claude login), and run the setup script (./setup.sh or setup.ps1). Place papers in raw/papers/, then launch claude and initialize the wiki with /init <topic>.
  • Prerequisites: Python 3.9+, Node.js 18+.
  • Dependencies: Anthropic credentials are required (ANTHROPIC_API_KEY or claude login); optional API keys for Semantic Scholar, DeepXIV, and OpenAI-compatible LLMs.
  • Links: Claude Code Documentation, Semantic Scholar API.

Highlighted Details

  • Features 23 Claude Code skills covering the entire research lifecycle: paper ingestion, idea generation, experiment design, drafting, and rebuttal writing.
  • Organizes knowledge into 9 entity types (e.g., Paper, Concept, Idea, Experiment) and 9 relationship types (e.g., supports, contradicts, inspired_by).
  • Supports cross-model adversarial review using any OpenAI-compatible API.
  • Offers bilingual support (English and Chinese) and daily arXiv paper ingestion via GitHub Actions.
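Cross-model review works against any endpoint that speaks the OpenAI chat-completions wire format. Below is a minimal sketch of how such a request might be shaped; the base URL, function name, and reviewer prompt are illustrative, not the project's actual code.

```python
import json

def build_review_request(model: str, draft_text: str,
                         base_url: str = "https://api.example.com/v1"):
    """Build an adversarial-review request for an OpenAI-compatible endpoint.

    `base_url` is a placeholder; any server implementing the OpenAI
    chat-completions format should accept this payload shape.
    """
    payload = {
        "model": model,
        "messages": [
            {"role": "system",
             "content": "You are an adversarial reviewer. Find weaknesses."},
            {"role": "user", "content": draft_text},
        ],
    }
    # Standard chat-completions path under an OpenAI-compatible base URL.
    return f"{base_url}/chat/completions", json.dumps(payload)
```

Swapping the reviewing model is then just a matter of changing `model` and `base_url`, which is what enables review by a different model family than the one that wrote the draft.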

Maintenance & Community

The project is actively developed, with a roadmap including a demo dataset, LaTeX venue templates, and multi-user collaboration. Community interaction and support are primarily facilitated through a WeChat group. Early supporters can receive MiMo API credits through an Angel User Program.

Licensing & Compatibility

The project is released under the MIT license, permitting broad use, modification, and distribution, including for commercial purposes and integration into closed-source projects.

Limitations & Caveats

ΩmegaWiki is in its early stages, with several features like a demo dataset, LaTeX venue templates, and multi-user collaboration still under development. Remote GPU experiment execution requires specific setup (SSH/rsync/screen) and is best performed on Linux-based systems (WSL2, macOS, Linux). The Angel User Program offering API credits is time-limited and subject to availability.
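The remote-execution flow the caveat describes (push code with rsync, run detached under screen over SSH, pull results back) can be sketched as a command builder. Every host, path, and session name here is a placeholder; the source only states that SSH, rsync, and screen are involved.

```python
import shlex

def remote_run_commands(host: str, local_dir: str, remote_dir: str,
                        session: str, cmd: str):
    """Compose the three shell commands for one remote GPU run (a sketch)."""
    # 1. Push the working directory to the remote machine.
    push = f"rsync -az {shlex.quote(local_dir)}/ {host}:{shlex.quote(remote_dir)}/"
    # 2. Start the job in a detached screen session so it survives disconnects.
    inner = shlex.quote(f"cd {remote_dir} && {cmd}")
    run = f"ssh {host} screen -dmS {shlex.quote(session)} bash -c {inner}"
    # 3. Pull results back once the job finishes.
    pull = (f"rsync -az {host}:{shlex.quote(remote_dir)}/results/ "
            f"{shlex.quote(local_dir)}/results/")
    return [push, run, pull]
```

`shlex.quote` keeps paths and the command safe to pass through the shell, which matters once topic names or paths contain spaces.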

Health Check

  • Last Commit: 2 days ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 17
  • Issues (30d): 5
  • Star History: 381 stars in the last 17 days
