SDK for turning docstrings into LLM functions
Top 63.2% on sourcepulse
This library simplifies LLM interactions by transforming Python docstrings into executable prompts, targeting developers who want to quickly prototype LLM-powered features. Building on the llm library, it parses docstrings as templates, injects variables, and executes the resulting prompts against various LLM backends, offering a streamlined approach to prompt engineering.
How It Works
The core mechanism involves a decorator that parses a function's docstring, treating it as a Jinja2 template. This template is then populated with function arguments at runtime to construct a prompt. The prompt is sent to an LLM backend specified in the decorator, and the response is returned. This approach provides syntactic sugar over the llm
library, enabling features like Pydantic schema validation for structured outputs and flexible prompt customization via inner functions.
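The mechanism described above can be sketched with the standard library alone. Everything below is illustrative rather than smartfunc's actual API: the real library renders the docstring with Jinja2 and dispatches to a model from the llm library, whereas this stub simply echoes the rendered prompt back.

```python
import functools
import inspect
import re

def backend(run_prompt):
    """Decorator factory: treat the wrapped function's docstring as a
    prompt template and send the rendered prompt to `run_prompt`.
    (Sketch only; not the real smartfunc implementation.)"""
    def decorator(func):
        sig = inspect.signature(func)

        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            # Bind the call's arguments to the function's parameter names.
            bound = sig.bind(*args, **kwargs)
            bound.apply_defaults()
            # Fill {{ name }} placeholders in the docstring with arguments.
            prompt = re.sub(
                r"\{\{\s*(\w+)\s*\}\}",
                lambda m: str(bound.arguments[m.group(1)]),
                inspect.getdoc(func),
            )
            return run_prompt(prompt)
        return wrapper
    return decorator

# A stand-in "LLM" that just echoes the prompt it received.
echo_llm = lambda prompt: f"LLM saw: {prompt}"

@backend(echo_llm)
def summarize(text: str, style: str = "one sentence"):
    """Summarize in {{ style }}: {{ text }}"""

print(summarize("Docstrings become prompts."))
# -> LLM saw: Summarize in one sentence: Docstrings become prompts.
```

Swapping `echo_llm` for a function that calls a real model backend is the essential idea; the docstring itself never changes.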
Quick Start & Requirements

Install with: pip install smartfunc

- API keys and other settings can be loaded from .env files.
- Requires the llm library.

Highlighted Details

- async_backend for asynchronous operations and microbatching.
- debug=True mode to inspect prompts and responses.

Maintenance & Community
The project is maintained by koaning and relies on the llm
library, which has a large community and is actively maintained.
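The microbatching behaviour highlighted earlier (async_backend) can be approximated with plain asyncio. This is a hedged sketch, not smartfunc's API: fake_llm and microbatch are invented names, and the semaphore stands in for whatever concurrency cap the real decorator applies.

```python
import asyncio

async def fake_llm(prompt: str) -> str:
    # Stand-in for an asynchronous LLM call (invented for this sketch).
    await asyncio.sleep(0)
    return prompt.upper()

async def microbatch(prompts, limit: int = 4):
    # Fan the calls out concurrently, capped by a semaphore;
    # asyncio.gather preserves the input order of the results.
    sem = asyncio.Semaphore(limit)

    async def one(prompt):
        async with sem:
            return await fake_llm(prompt)

    return await asyncio.gather(*(one(p) for p in prompts))

print(asyncio.run(microbatch(["first prompt", "second prompt"])))
# -> ['FIRST PROMPT', 'SECOND PROMPT']
```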
Licensing & Compatibility
The library is released under the MIT license, permitting commercial use and integration with closed-source projects.
Limitations & Caveats
The library is designed for simplicity and rapid prototyping, intentionally omitting features found in more comprehensive libraries like instructor, ell, or marvin. Schema support is backend-dependent, and users may encounter errors if a chosen backend does not support it.