Lightweight SDK for LLM agents using function calling
microchain provides a lightweight framework for building function-calling LLM agents. It targets developers and researchers looking for a minimal, bloat-free solution to integrate LLM capabilities with custom Python functions. The primary benefit is its simplicity and direct approach to enabling LLMs to execute defined Python logic.
How It Works
microchain leverages a function-calling paradigm in which the LLM is instructed to generate Python function calls based on user prompts and the available tool definitions. It supports multiple LLM providers (OpenAI, Hugging Face) and templating strategies (Hugging Face, Vicuna). Functions are defined as Python classes inheriting from microchain.Function, with type annotations and docstrings guiding the LLM. An Engine manages the registered functions, and an Agent orchestrates the LLM interaction, prompt construction, and function execution.
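To illustrate the idea of annotations and docstrings guiding the LLM (this is a minimal sketch of the paradigm, not microchain's actual API; the Function and Sum names here are stand-ins), a tool description can be derived directly from a callable's signature:

```python
import inspect

class Function:
    """Illustrative base class: subclasses describe themselves to the LLM
    via their __call__ signature (type annotations) and docstring."""

    def describe(self) -> str:
        # Build a prompt-ready tool description from the subclass's __call__
        sig = inspect.signature(self.__call__)
        doc = (self.__call__.__doc__ or "").strip()
        return f"{type(self).__name__}{sig}: {doc}"

class Sum(Function):
    def __call__(self, a: float, b: float) -> float:
        """Return the sum of a and b."""
        return a + b

print(Sum().describe())
# -> Sum(a: float, b: float) -> float: Return the sum of a and b.
```

Descriptions like this are what make the "bloat-free" approach possible: the function definition itself is the tool specification, with no separate schema to maintain.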
Quick Start & Requirements
pip install microchain-python
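The README summarized here includes no usage snippet. Purely as an illustration of the control loop such an agent runs (a toy sketch, not microchain's real Engine/Agent API; Engine, Stop, run_agent, and the scripted "LLM" below are all assumed names), the execute-and-append cycle with a stop function and a bootstrapped history might look like:

```python
class StopAgent(Exception):
    """Raised by the Stop tool to end the agent loop."""

class Engine:
    """Toy registry mapping tool names to callables; executes
    LLM-emitted call strings such as 'Sum(2, 3)'."""
    def __init__(self):
        self.functions = {}

    def register(self, fn):
        self.functions[type(fn).__name__] = fn

    def execute(self, call: str):
        # Evaluate the call string against registered tools only
        return eval(call, {"__builtins__": {}}, self.functions)

class Sum:
    def __call__(self, a, b):
        return a + b

class Stop:
    def __call__(self):
        raise StopAgent()

def run_agent(llm, engine, bootstrap=()):
    # bootstrap pre-populates the history with initial calls and results
    history = [(call, engine.execute(call)) for call in bootstrap]
    while True:
        call = llm(history)  # the model emits the next function call as text
        try:
            history.append((call, engine.execute(call)))
        except StopAgent:
            return history

engine = Engine()
for tool in (Sum(), Stop()):
    engine.register(tool)

# Scripted stand-in for the LLM: sums two numbers, then stops
script = iter(['Sum(40, 2)', 'Stop()'])
history = run_agent(lambda h: next(script), engine, bootstrap=['Sum(1, 1)'])
# history -> [('Sum(1, 1)', 2), ('Sum(40, 2)', 42)]
```

Real usage would replace the scripted lambda with an actual LLM backend; the sketch only shows the shape of the loop the library automates.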
Highlighted Details
- Reasoning and Stop functions for agent control flow.
- bootstrap feature to pre-populate agent history with initial function calls.

Maintenance & Community
The project is maintained by galatolofederico. Further community or roadmap information is not detailed in the README.
Licensing & Compatibility
The repository does not explicitly state a license in the provided README. Users should verify licensing for commercial or closed-source integration.
Limitations & Caveats
The README does not specify limitations, known bugs, or deprecation status. The lack of explicit licensing information may pose a barrier to commercial adoption.