Starter code for embodied AI Habitat Challenge
This repository provides the starter code and infrastructure for the 2023 Habitat Navigation Challenge, focusing on ObjectNav and ImageNav tasks. It's designed for researchers and developers in embodied AI aiming to build agents that can navigate unseen environments to find specific objects or match goal images, leveraging realistic robot configurations and continuous action spaces.
How It Works
The challenge utilizes the HM3D-Semantics v0.2 dataset and simulates navigation for a Hello Robot Stretch robot. Agents interact with the environment using continuous actions (linear/angular velocity, camera pitch) or abstract waypoint commands. Performance is evaluated with Success and Success weighted by Path Length (SPL), assessing both task completion and navigation efficiency.
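For reference, SPL weights each successful episode by the ratio of the shortest-path (geodesic) distance to the distance the agent actually travelled, then averages over episodes. A minimal sketch of the computation (the function and argument names are illustrative, not taken from the challenge's evaluation code):

```python
def spl(successes, shortest_path_lengths, agent_path_lengths):
    """Success weighted by Path Length (Anderson et al., 2018).

    successes: per-episode 0/1 success indicators
    shortest_path_lengths: geodesic start-to-goal distance per episode
    agent_path_lengths: distance the agent actually travelled per episode
    """
    assert len(successes) == len(shortest_path_lengths) == len(agent_path_lengths)
    total = 0.0
    for s, l, p in zip(successes, shortest_path_lengths, agent_path_lengths):
        # Successful episodes are discounted by how much longer the agent's
        # path was than the shortest path; failures contribute zero.
        total += s * (l / max(p, l))
    return total / len(successes)
```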
Quick Start & Requirements
Clone the starter code (git clone https://github.com/facebookresearch/habitat-challenge.git) and install dependencies via Docker. Download the HM3D-Semantics v0.2 scenes and place them under habitat-challenge-data/data/scene_datasets/hm3d_v0.2. After building a Docker image, run a local evaluation with scripts/test_local_objectnav.sh or scripts/test_local_imagenav.sh.
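Submissions wrap an agent behind Habitat's Agent/Challenge interface. A minimal sketch, assuming the habitat-lab API (habitat.Agent, habitat.Challenge); the action dictionary returned by act is a placeholder, and the exact keys for the 2023 continuous action space should be taken from the task configuration rather than from this example:

```python
import habitat


class ForwardOnlyAgent(habitat.Agent):
    """Toy agent that ignores observations; for illustration only."""

    def reset(self):
        pass

    def act(self, observations):
        # Placeholder action: the real keys for the 2023 continuous action
        # space (linear/angular velocity, camera pitch) come from the task config.
        return {"action": "move_forward"}


def main():
    agent = ForwardOnlyAgent()
    # eval_remote=True is used when evaluation runs on the challenge server.
    challenge = habitat.Challenge(eval_remote=False)
    challenge.submit(agent)


if __name__ == "__main__":
    main()
```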
Maintenance & Community
The project is maintained by Facebook AI Research (FAIR). Questions and issues can be raised via GitHub issues.
Licensing & Compatibility
The code is released under a permissive license, allowing for commercial use and integration with closed-source projects.
Limitations & Caveats
The provided starter code and Docker setup are Linux-specific. Participants must manage dataset downloads and Docker image updates. Overfitting to the test set is discouraged due to limited submissions for leaderboard and challenge phases.