anadim / Minimal transformer for multi-digit addition
Top 80.4% on SourcePulse
This repository presents the AdderBoard challenge: to engineer the smallest possible autoregressive transformer capable of adding two 10-digit numbers with over 99% accuracy. It targets researchers and engineers interested in extreme model compression, efficient transformer architectures, and exploring the fundamental capabilities of transformers beyond natural language processing. The primary benefit is advancing the state-of-the-art in minimal, performant neural network designs for specific computational tasks.
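The 99% accuracy bar is exact-match accuracy over random 10-digit addition problems. As a minimal sketch of what that evaluation entails (the `model_fn` callable and prompt format are illustrative assumptions, not the repository's actual interface):

```python
import random

def evaluate(model_fn, n_samples=10_000, digits=10, threshold=0.99):
    """Estimate exact-match accuracy on random n-digit addition.

    model_fn is a hypothetical callable mapping a prompt string like
    "1234567890+0987654321=" to the model's predicted answer string.
    """
    correct = 0
    for _ in range(n_samples):
        a = random.randrange(10**digits)
        b = random.randrange(10**digits)
        prompt = f"{a:0{digits}d}+{b:0{digits}d}="
        # Exact string match: every output digit must be correct.
        if model_fn(prompt) == str(a + b):
            correct += 1
    accuracy = correct / n_samples
    return accuracy, accuracy >= threshold
```

An oracle that parses the prompt and adds the operands scores 1.0 under this harness; a real submission would replace it with autoregressive decoding from the trained model.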
How It Works
The project frames integer addition as a sequence-to-sequence problem for transformers. Models must autonomously learn digit alignment, per-digit arithmetic, and carry propagation through self-attention, MLPs, and autoregressive generation, without hardcoded logic in their forward pass. This necessitates innovative approaches to tokenization, data formatting, and architectural design to minimize parameter counts. Submissions are categorized into "Trained" (weights learned via generic algorithms) and "Hand-coded" (weights analytically determined for constructive proof).
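One common way to set up such a sequence-to-sequence formulation is a digit-level vocabulary with the answer emitted least-significant digit first, so generation order matches carry propagation. This is a hedged illustration of the idea, not necessarily the formatting any leaderboard entry uses:

```python
# Vocabulary: digits 0-9 plus '+', '=', and '$' as end-of-sequence.
VOCAB = {ch: i for i, ch in enumerate("0123456789+=$")}

def format_example(a: int, b: int, digits: int = 10):
    """Encode one addition problem as token ids.

    The answer digits are reversed so the model emits the least
    significant digit first -- aligning generation with the direction
    carries propagate, a standard trick for learned addition.
    """
    prompt = f"{a:0{digits}d}+{b:0{digits}d}="
    answer = str(a + b)[::-1] + "$"          # reversed digits + EOS
    tokens = [VOCAB[ch] for ch in prompt + answer]
    return tokens, len(prompt)               # token ids, target start index
```

With this encoding, `12+34=` (at `digits=2`) yields the target sequence `64$`, i.e. 46 reversed followed by the end-of-sequence marker.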
Quick Start & Requirements
The repository provides a verify.py script for testing submissions.

Highlighted Details
Leaderboard entries include models at very small model dimensions (e.g., d=4, d=7).

Maintenance & Community
Licensing & Compatibility
Limitations & Caveats
The challenge is strictly defined for 10-digit integer addition. Models must conform to a pure autoregressive transformer definition: task-specific control flow inside the model's forward pass is forbidden. Some leaderboard entries are marked as preliminary and still require independent verification on the 10K-example test suite.