godot-ai-assistant-hub by FlamxGames

Godot plugin for AI-assisted game development

Created 1 year ago
256 stars

Top 98.5% on SourcePulse

Project Summary

This plugin embeds AI assistants directly into the Godot Engine, enabling them to read and write code within the editor. It serves Godot developers by acting as a flexible interface to various local or remote Large Language Models (LLMs), streamlining AI-assisted development workflows without requiring users to manage LLM inference directly within Godot.

How It Works

The Godot AI Assistant Hub functions as an API-agnostic bridge between Godot and external LLM services. It does not run LLM models itself but connects to providers like Ollama (officially supported), Google Gemini, OpenRouter, OpenWebUI, and xAI. Users configure "assistant types" using Godot Resources, defining the LLM model and associated "quick prompts." These quick prompts allow for single-click actions, leveraging keywords like {CODE} to insert selected editor code or {CHAT} to include current prompt text, facilitating direct code manipulation and documentation generation within Godot.
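The keyword substitution behind quick prompts can be sketched as plain string templating. The function name and exact semantics below are illustrative assumptions, not the plugin's actual internals:

```python
def expand_quick_prompt(template: str, selected_code: str, chat_text: str) -> str:
    """Hypothetical sketch: replace quick-prompt keywords with editor context."""
    # {CODE} -> code currently highlighted in Godot's Code Editor
    # {CHAT} -> text currently typed into the chat prompt box
    return template.replace("{CODE}", selected_code).replace("{CHAT}", chat_text)

# A one-click "document this" quick prompt would then expand to a full
# LLM prompt containing the highlighted GDScript:
prompt = expand_quick_prompt(
    "Add documentation comments to this GDScript:\n{CODE}",
    "func _ready():\n\tpass",
    "",
)
```

Because the templates live in Godot Resources, users can author new quick prompts of this shape without writing any code.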

Quick Start & Requirements

  • Installation: Download the addon, unzip it, and copy the ai_assistant_hub folder into your Godot project's addons/ directory. Reload the project and enable the plugin in Project > Project Settings > Plugins.
  • Prerequisites: For local LLMs, Ollama (or another supported LLM tool) and at least one downloaded model are required. Remote LLM usage requires API access and incurs associated costs.
  • Compatibility: Tested with Godot versions 4.3 to 4.6 (stable versions only).
  • Documentation: Tutorial playlist and in-editor hints are available.
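The copy step above can be sketched in shell. Paths here are assumptions, and the unzip step is simulated with `mkdir` so the sketch runs without the real release archive:

```shell
# Real flow would start with: unzip godot-ai-assistant-hub.zip -d /tmp/aihub
mkdir -p /tmp/aihub/ai_assistant_hub        # stands in for the unzipped addon folder
PROJECT=/tmp/MyGodotProject                 # assumed Godot project root
mkdir -p "$PROJECT/addons"
cp -r /tmp/aihub/ai_assistant_hub "$PROJECT/addons/"
# Then reload the project and enable the plugin under
# Project > Project Settings > Plugins.
```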

Highlighted Details

  • Directly write/edit code and documentation within Godot's Code Editor.
  • Read highlighted code for context-aware AI interactions.
  • Create custom assistant types and reusable "quick prompts" without coding.
  • Manage multiple simultaneous chat sessions with different assistants.
  • Edit conversation history for error correction.
  • Extensible to support LLM providers with a REST API.
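Since the hub talks to providers over plain REST, extending it amounts to building HTTP requests against a provider's endpoint. The sketch below targets Ollama's documented `/api/generate` endpoint; the helper name and defaults are assumptions for illustration:

```python
import json
import urllib.request

def build_ollama_request(model: str, prompt: str,
                         host: str = "http://localhost:11434") -> urllib.request.Request:
    """Build (but do not send) a request for Ollama's /api/generate endpoint."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        f"{host}/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# With a local Ollama server running, this request could be sent via
# urllib.request.urlopen(build_ollama_request("llama3", "Hello")).
```

A provider adapter for a different REST API would swap the URL, payload shape, and any auth headers while keeping the same pattern.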

Maintenance & Community

This project is a hobby endeavor by a solo developer, Forest, with improvements made on an as-needed basis. There is no formal roadmap, but ideas are welcomed in the project's Discussions section. Community contributions maintain support for several LLM providers beyond the officially supported Ollama.

Licensing & Compatibility

The project is licensed under the MIT license, permitting commercial use and integration into closed-source projects.

Limitations & Caveats

Performance and setup complexity for local LLMs depend on the user's hardware and chosen models. Support for advanced features such as "reasoning level" is still pending for LLM providers other than Ollama, and community contributions are being sought. Anonymous usage statistics are collected to understand usage patterns.

Health Check

  • Last Commit: 2 weeks ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 1
  • Issues (30d): 9
  • Star History: 23 stars in the last 30 days
