Go framework for building production-ready AI agents
Top 96.8% on SourcePulse
Summary
Ingenimax/agent-sdk-go is a Go framework for building production-ready AI agents. It combines multi-LLM support, advanced memory management, a modular tool ecosystem, and enterprise-grade features, letting developers build flexible, extensible AI applications.
How It Works
The SDK employs a modular architecture, coordinating an Agent core with LLM interfaces (OpenAI, Anthropic, Google Vertex AI, Ollama, vLLM), persistent memory solutions, and a plug-and-play tool ecosystem. It supports the Model Context Protocol (MCP) for integrating external services and features declarative YAML configuration for defining agents and tasks. A key differentiator is its zero-effort bootstrapping and auto-configuration capabilities, allowing agent profiles and tasks to be generated from simple system prompts.
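The coordination pattern described above can be sketched in plain Go, independent of the SDK's actual package layout: an agent core that delegates to pluggable LLM, memory, and tool interfaces. All type and method names below are illustrative stand-ins, not agent-sdk-go identifiers.

```go
package main

import (
	"context"
	"fmt"
	"strings"
)

// LLM abstracts a model backend (OpenAI, Anthropic, Vertex AI, Ollama, vLLM, ...).
// Hypothetical interface for illustration only.
type LLM interface {
	Generate(ctx context.Context, prompt string) (string, error)
}

// Memory persists conversation state between turns.
type Memory interface {
	Append(entry string)
	History() []string
}

// Tool is a plug-and-play capability the agent can invoke.
type Tool interface {
	Name() string
	Run(ctx context.Context, input string) (string, error)
}

// Agent coordinates the LLM, memory, and tools -- the core of the modular design.
type Agent struct {
	llm    LLM
	memory Memory
	tools  map[string]Tool
}

func NewAgent(llm LLM, mem Memory, tools ...Tool) *Agent {
	m := make(map[string]Tool, len(tools))
	for _, t := range tools {
		m[t.Name()] = t
	}
	return &Agent{llm: llm, memory: mem, tools: m}
}

// Run sends the accumulated history plus the new input to the LLM and records the reply.
func (a *Agent) Run(ctx context.Context, input string) (string, error) {
	a.memory.Append("user: " + input)
	prompt := strings.Join(a.memory.History(), "\n")
	reply, err := a.llm.Generate(ctx, prompt)
	if err != nil {
		return "", err
	}
	a.memory.Append("assistant: " + reply)
	return reply, nil
}

// Minimal stand-ins so the sketch compiles and runs.
type echoLLM struct{}

func (echoLLM) Generate(_ context.Context, prompt string) (string, error) {
	return "echo: " + prompt, nil
}

type sliceMemory struct{ entries []string }

func (m *sliceMemory) Append(e string)   { m.entries = append(m.entries, e) }
func (m *sliceMemory) History() []string { return m.entries }

func main() {
	agent := NewAgent(echoLLM{}, &sliceMemory{})
	out, err := agent.Run(context.Background(), "Summarize today's tickets")
	if err != nil {
		panic(err)
	}
	fmt.Println(out)
}
```

In the real SDK, the concrete LLM would be one of the supported backends (OpenAI, Anthropic, Google Vertex AI, Ollama, vLLM) and memory could be a persistent store such as Redis.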
Quick Start & Requirements
Requires Go 1.23+. As a library: go get github.com/Ingenimax/agent-sdk-go. As a CLI tool: clone the repository and run make build-cli or ./scripts/install-cli.sh. Initialize with ./bin/agent-cli init, configure API keys via a .env file or environment variables (e.g., OPENAI_API_KEY, REDIS_ADDRESS), then run commands like ./bin/agent-cli run "..." or ./bin/agent-cli chat.
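For the configuration step, a minimal sketch of reading the variables named above from the environment (or a sourced .env file); the config struct and loadConfig helper are illustrative, not part of agent-sdk-go.

```go
package main

import (
	"fmt"
	"log"
	"os"
)

// Illustrative configuration loader for the environment variables named in the
// quick start. This struct and function are NOT part of agent-sdk-go; they only
// show how keys might be gathered before constructing an agent.
type config struct {
	OpenAIKey    string // OPENAI_API_KEY: required for the OpenAI backend
	RedisAddress string // REDIS_ADDRESS: optional, enables Redis-backed memory
}

func loadConfig() (config, error) {
	cfg := config{
		OpenAIKey:    os.Getenv("OPENAI_API_KEY"),
		RedisAddress: os.Getenv("REDIS_ADDRESS"), // may be empty; Redis is optional
	}
	if cfg.OpenAIKey == "" {
		return config{}, fmt.Errorf("OPENAI_API_KEY is not set")
	}
	return cfg, nil
}

func main() {
	cfg, err := loadConfig()
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("configured OpenAI key (length %d), Redis address %q\n", len(cfg.OpenAIKey), cfg.RedisAddress)
}
```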
Highlighted Details
Maintenance & Community
The project is hosted on GitHub at https://github.com/Ingenimax/agent-sdk-go. Details about active contributors, community channels (such as Discord or Slack), or a public roadmap are not covered in the README.
Licensing & Compatibility
The project is released under the permissive MIT License, which permits commercial use and integration into closed-source applications; the main obligation is retaining the copyright and license notice.
Limitations & Caveats
The Go 1.23+ requirement may limit adoption in older development environments. Redis is optional, but distributed memory scenarios may perform worse without it. Integration with certain MCP servers may require specific external dependencies or Docker setups.