CLI tool extends ROS 2 with LLMs
This project provides a ROS 2 command-line interface (CLI) extension that integrates with Large Language Models (LLMs) like OpenAI and Ollama. It aims to simplify ROS 2 interactions for beginners and experienced users alike by allowing natural language queries to retrieve information, execute commands, and understand ROS 2 concepts.
How It Works
The extension leverages the OpenAI Python API or Ollama's compatible API to process natural language queries. It then translates these queries into executable ROS 2 commands or provides explanations of ROS 2 concepts. The backend can be configured to use either OpenAI or Ollama, with environment variables controlling API keys, model names, and endpoints.
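As a sketch of what that configuration might look like, the backend could be selected with environment variables before invoking the tool. The variable names below (OPENAI_API_KEY, OPENAI_MODEL_NAME, OPENAI_ENDPOINT) and the Ollama endpoint/model values are illustrative assumptions; check the project README for the exact names and defaults.

```shell
# OpenAI backend (names are illustrative; verify against the project README):
export OPENAI_API_KEY="sk-..."                      # your OpenAI API key
export OPENAI_MODEL_NAME="gpt-4o"                   # model to query
export OPENAI_ENDPOINT="https://api.openai.com/v1"  # OpenAI API endpoint

# Ollama backend (exposes an OpenAI-compatible API; no real key needed):
export OPENAI_API_KEY="n/a"
export OPENAI_MODEL_NAME="llama3"
export OPENAI_ENDPOINT="http://localhost:11434/v1"  # default local Ollama port
```

Because Ollama speaks an OpenAI-compatible API, switching backends is just a matter of repointing the endpoint and model name rather than changing the client code.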
Quick Start & Requirements
docker run -it --rm --net=host -e OPENAI_API_KEY=$OPENAI_API_KEY tomoyafujita/ros2ai:humble
(An OpenAI API key is not required when using Ollama.)
Install the Python dependencies:
pip install openai ollama validators --break-system-packages --ignore-installed (for Rolling/Jazzy)
pip install openai ollama validators --ignore-installed (for Humble)
Then run colcon build in a ROS 2 workspace.
Highlighted Details
Supports executing ROS 2 commands from natural language (e.g., ros2 ai exec "...").
Maintenance & Community
Licensing & Compatibility
Limitations & Caveats
The --break-system-packages flag is required for installation in some Python environments due to PEP 668, which may have implications for system stability.
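One way to avoid overriding the PEP 668 protection is to install the dependencies into a virtual environment instead. This is a sketch, not the project's documented procedure; the environment path is arbitrary, and mixing a venv with a colcon workspace may need additional setup.

```shell
# Create an isolated environment rather than using --break-system-packages.
python3 -m venv ~/ros2ai_venv
source ~/ros2ai_venv/bin/activate

# Installs land in the venv, leaving the system Python untouched.
pip install openai ollama validators
```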