Virtual environment for embodied agents with real-world perception
GibsonEnv is a simulator for embodied AI research that provides realistic environments, virtualized from real-world spaces, for training agents in perception and sensorimotor control. Because training directly on physical robots is slow and risks hardware damage, Gibson offers a physics-enabled virtual space where agents can learn efficiently, with the goal of transferring what they learn to physical systems.
How It Works
GibsonEnv integrates the Bullet physics engine (PyBullet) to simulate agent embodiment and interaction with the environment. Its core innovation lies in virtualizing real-world spaces, preserving their visual and semantic complexity, and in a "Goggles" function for domain adaptation: a learned inverse mapping that transforms real-world camera inputs to match the simulated ones, narrowing the sim-to-real gap and making transfer more robust.
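Conceptually, the Goggles step is an image-to-image network applied to real camera frames at deployment time so they resemble the domain the policy was trained on. The following is a minimal, illustrative sketch in PyTorch; GogglesNet, its architecture, and its weights are hypothetical stand-ins rather than the repository's actual model.
import torch
import torch.nn as nn

class GogglesNet(nn.Module):
    # Hypothetical stand-in for the learned "Goggles" mapping; the real
    # model is a trained image-to-image network, not this toy block.
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(16, 3, kernel_size=3, padding=1),
        )

    def forward(self, frame):
        return self.net(frame)

goggles = GogglesNet().eval()            # in practice, load trained weights here
real_frame = torch.rand(1, 3, 256, 256)  # stand-in for one RGB camera frame
with torch.no_grad():
    corrected = goggles(real_frame)      # feed the corrected frame, not the raw one, to the policy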
Quick Start & Requirements
docker pull xf1280/gibson:0.3.1
docker run --runtime=nvidia -ti --rm -e DISPLAY -v /tmp/.X11-unix:/tmp/.X11-unix -v <host path to dataset folder>:/root/mount/gibson/gibson/assets/dataset xf1280/gibson:0.3.1
Building from source requires libglew-dev, libglm-dev, libassimp-dev, cmake, python3.5, pytorch, and tensorflow==1.3.
git clone https://github.com/StanfordVL/GibsonEnv.git
cd GibsonEnv && ./download.sh && ./build.sh build_local && pip install -e .
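Once built, environments are driven through a Gym-style interface configured by a YAML file. The snippet below is a sketch based on the repository's example scripts; the module path, the HuskyNavigateEnv class, and the config filename are assumptions that may differ between versions.
from gibson.envs.husky_env import HuskyNavigateEnv  # class/module names assumed from repo examples

# The YAML config selects the scene, robot, sensor modalities, and resolution.
env = HuskyNavigateEnv(config='examples/configs/husky_navigate.yaml')  # config path is an assumption

obs = env.reset()
for _ in range(100):
    action = env.action_space.sample()          # random actions, for illustration only
    obs, reward, done, info = env.step(action)
    if done:
        obs = env.reset()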
Highlighted Details
Maintenance & Community
Licensing & Compatibility
Limitations & Caveats