kubectl-ai by sozercan

Kubectl plugin for generating Kubernetes manifests using LLMs

created 2 years ago
1,188 stars

Top 33.5% on sourcepulse

Project Summary

This project provides a kubectl plugin that leverages Large Language Models (LLMs) to generate and apply Kubernetes manifests. It's designed for developers and operators who want to quickly create or modify Kubernetes configurations using natural language prompts, reducing the need to manually search for or write YAML.

How It Works

The plugin acts as a wrapper around LLM APIs, translating natural-language requests into Kubernetes YAML manifests. It supports OpenAI, Azure OpenAI, and compatible endpoints. A key feature is the optional --use-k8s-api flag, which lets the plugin query the cluster's Kubernetes OpenAPI specification. This enables more accurate manifest generation, including for custom resource definitions (CRDs), by using the function-calling capabilities of newer models.
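
For illustration, a typical invocation looks roughly like the following; the prompt text is made up, and only the --use-k8s-api flag is taken directly from the summary above:

    # Generate and apply a manifest from a natural-language prompt (prompt text is illustrative)
    kubectl ai "create an nginx deployment with 3 replicas"

    # Optionally consult the cluster's OpenAPI spec for more accurate output, including CRDs
    # (requires a model that supports function calling)
    kubectl ai --use-k8s-api "create an nginx deployment with 3 replicas"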

Quick Start & Requirements

  • Install via Homebrew: brew tap sozercan/kubectl-ai && brew install kubectl-ai
  • Install via Krew: kubectl krew index add kubectl-ai https://github.com/sozercan/kubectl-ai && kubectl krew install kubectl-ai/kubectl-ai
  • Requires a Kubernetes configuration and an OpenAI API key (or Azure OpenAI/compatible endpoint).
  • Environment variables for the API key, deployment name, and endpoint are supported (see the example after this list).
  • See: Installation, Usage
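
As a configuration sketch, the environment variables below follow the OpenAI/Azure OpenAI setup described in the README; the values are placeholders, and the exact variable names should be checked against the installed version:

    # OpenAI
    export OPENAI_API_KEY="<your-api-key>"

    # Azure OpenAI or another OpenAI-compatible endpoint (variable names assumed from the README)
    export OPENAI_ENDPOINT="https://<your-resource>.openai.azure.com"
    export OPENAI_DEPLOYMENT_NAME="<your-deployment-name>"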

Highlighted Details

  • Generates and applies Kubernetes manifests using natural language prompts.
  • Supports OpenAI, Azure OpenAI, and local OpenAI-compatible endpoints (e.g., AIKit).
  • Optional --use-k8s-api flag for enhanced accuracy using Kubernetes OpenAPI Spec.
  • Can pipe input/output for integration with existing workflows and editors (see the example after this list).
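
As an example of piping, something like the following should work; the --raw flag (which prints the generated manifest instead of applying it) is assumed from the project's README, so verify it with kubectl ai --help:

    # Print the generated manifest and hand it to another tool instead of applying it directly
    kubectl ai --raw "create an nginx deployment with 3 replicas" | kubectl apply -f -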

Maintenance & Community

  • Developed by sozercan.
  • Community links are not explicitly mentioned in the README.

Licensing & Compatibility

  • The README does not explicitly state a license.

Limitations & Caveats

The plugin generates complete manifests and has no inherent knowledge of the cluster's current state. The --use-k8s-api flag requires a model that supports function calling (e.g., OpenAI's 0613 model versions or later).

Health Check

  • Last commit: 6 months ago
  • Responsiveness: 1 week
  • Pull Requests (30d): 0
  • Issues (30d): 0
  • Star history: 63 stars in the last 90 days

