SDK for LLM app development
This library provides a Pythonic interface for building LLM applications, abstracting away provider-specific boilerplate and offering type-safe structured outputs, agent creation with function calling, streaming, and automatic prompt caching. It targets developers who want a streamlined, efficient way to integrate LLMs into their projects and aims to be a go-to tool for LLM app development.
How It Works
Promptic takes a decorator-based approach (@llm) to defining LLM interactions. Functions decorated with @llm use their docstrings as prompt templates, automatically combining them with the function's arguments. It leverages LiteLLM for broad LLM provider support, making it easy to switch between models. Structured outputs are produced through Pydantic models or JSON Schema, and agentic behavior comes from registering functions as tools the LLM can call.
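A hedged sketch of those patterns follows; the return-annotation handling for Pydantic models and the .tool registration decorator are assumptions about the exact API surface and may differ in the current release:

```python
# Hedged sketch of docstring templating, structured output, and tool
# registration as described above. Assumes a LiteLLM-supported API key
# (e.g. OPENAI_API_KEY) is set in the environment.
from datetime import datetime

from pydantic import BaseModel
from promptic import llm


class Forecast(BaseModel):
    location: str
    temperature: float
    units: str


@llm
def get_forecast(location: str) -> Forecast:
    """What is the weather forecast for {location}?"""


@llm
def assistant(command: str):
    """{command}"""


@assistant.tool
def get_current_time() -> str:
    """Return the current time so the model can answer scheduling questions."""
    return datetime.now().strftime("%I:%M %p")


print(get_forecast("Tokyo"))              # response parsed into a Forecast instance
print(assistant("What time is it now?"))  # the model may call get_current_time
```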
Quick Start & Requirements
pip install promptic
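A minimal quick-start sketch, assuming an OpenAI-compatible key is exported for LiteLLM; the model name is an illustrative choice, not a documented default:

```python
# Minimal quick-start sketch. Assumes OPENAI_API_KEY (or another
# LiteLLM-supported provider key) is set in the environment.
from promptic import llm


@llm(model="gpt-4o-mini")
def translate(text, language="French"):
    """Translate '{text}' into {language}."""


# The docstring is the prompt template; arguments fill the placeholders.
print(translate("Good morning"))
```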
Highlighted Details
Supports image inputs via ImageBytes for vision-capable models.
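A hedged sketch of how an image might be passed through ImageBytes; the constructor and the docstring placeholder usage are assumptions:

```python
# Hedged sketch of vision input via ImageBytes; treat the constructor
# and placeholder handling as assumptions about the exact API.
from pathlib import Path

from promptic import ImageBytes, llm


@llm(model="gpt-4o")  # example of a vision-capable model
def describe(image: ImageBytes):
    """Describe the contents of this image: {image}"""


print(describe(ImageBytes(Path("photo.jpg").read_bytes())))
```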
Maintenance & Community
Contribution guidelines are provided in CONTRIBUTING.md.
Licensing & Compatibility
Limitations & Caveats
Gemini models do not support streaming when using tools/function calls, a limitation inherited from the underlying provider. For provider-specific features or workarounds, direct interaction with LiteLLM or the provider's SDK may be necessary.