MinAtar by kenjyoung

AI testbed for reinforcement learning agents, miniaturized Atari 2600 games

created 6 years ago
305 stars

Top 88.8% on sourcepulse

Project Summary

MinAtar provides a testbed of miniaturized Atari 2600 games, designed for efficient AI agent experimentation. It offers simplified 10x10 grid versions of five classic Atari titles, featuring multi-channel state representations that isolate game objects. This makes it ideal for researchers and developers focusing on reinforcement learning algorithms.

How It Works

MinAtar implements Atari games on a 10x10 grid, using multi-channel state representations where each channel corresponds to a specific game object (e.g., ball, paddle, bricks). This abstraction simplifies game mechanics and state complexity compared to the original Arcade Learning Environment, enabling faster iteration and experimentation with RL algorithms.
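As a minimal sketch of the multi-channel representation described above (the channel names and object placements here are illustrative, not MinAtar's actual per-game channel layout):

```python
import numpy as np

# Hypothetical channel assignment: one boolean channel per object type.
channels = {"paddle": 0, "ball": 1, "brick": 2}
state = np.zeros((10, 10, len(channels)), dtype=bool)

state[9, 4, channels["paddle"]] = True   # paddle on the bottom row
state[5, 3, channels["ball"]] = True     # ball mid-grid
state[0:3, :, channels["brick"]] = True  # three rows of bricks at the top

# An agent observes the full stack, shape (10, 10, n_channels),
# so each object type is already isolated on its own plane.
print(state.shape)  # (10, 10, 3)
```

Because each plane is binary and object-specific, no pixel-level feature extraction is needed before feeding the state to a network.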

Quick Start & Requirements

  • Install via pip: pip install minatar
  • For examples: pip install ".[examples]" (requires PyTorch)
  • Verify installation: python examples/random_play.py -g breakout
  • Available games: Asterix, Breakout, Freeway, Seaquest, SpaceInvaders
  • Gymnasium compatibility: gym.make('MinAtar/Breakout-v1')
  • Visualization: env.display_state() or render_mode='human'
  • Human play: python examples/human_play.py -g <game>

Highlighted Details

  • Fully compatible with Gymnasium API.
  • Offers two action set versions per game: full (6 actions) and minimal.
  • Includes visualization tools: display_state() and a GUI class.
  • Provides example implementations of DQN and Actor-Critic.
  • Benchmarks and results for DQN and AC agents are available.

Maintenance & Community

  • The project is associated with a paper: Young, K. & Tian, T. (2019). MinAtar: An Atari-Inspired Testbed for Thorough and Reproducible Reinforcement Learning Experiments.
  • No explicit community links (Discord/Slack) or roadmap are provided in the README.

Licensing & Compatibility

  • Licensed under GNU General Public License v3.0 or later.
  • This license may impose restrictions on use in closed-source commercial applications due to its copyleft nature.

Limitations & Caveats

  • A bug in Seaquest (v1.0.10 and earlier) misrepresented the oxygen bar in the state, potentially affecting agent learning; use v1.0.11 or later for consistent results.
Health Check

  • Last commit: 7 months ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 0
  • Issues (30d): 0
  • Star History: 6 stars in the last 90 days
