LLMOps template for building LLM-infused apps using Prompt Flow
This template provides a comprehensive framework for managing the lifecycle of LLM-infused applications using Microsoft's Prompt Flow. It targets engineers, data scientists, and developers seeking to streamline prompt engineering, experimentation, evaluation, and deployment, offering a structured approach to LLMOps.
How It Works
The template centralizes code hosting for multiple Prompt Flow projects, enabling robust lifecycle management from local experimentation to production deployment. It supports variant and hyperparameter experimentation, A/B deployments, and detailed reporting for all runs. The system automatically detects and executes various flow types (Python class, function, YAML, DAG) and integrates with CI/CD pipelines (Azure DevOps, GitHub Actions, Jenkins).
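For local experimentation, a flow can be executed against a dataset with the Prompt Flow SDK. The sketch below is illustrative rather than part of the template itself; it assumes promptflow >= 1.x and that a flow directory ./my_flow and a JSONL dataset ./data.jsonl already exist.

# Run a flow locally against a dataset; Prompt Flow infers the flow type
# (DAG YAML, flex/function, or class-based) from the flow directory contents.
from promptflow.client import PFClient

pf = PFClient()
run = pf.run(flow="./my_flow", data="./data.jsonl")

# Fetch per-line inputs and outputs of the run as a pandas DataFrame.
details = pf.get_details(run)
print(details.head())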
Quick Start & Requirements
pip install promptflow promptflow-tools promptflow-sdk jinja2 promptflow[azure] openai promptflow-sdk[builtins] python-dotenv
A .env file supplies the Azure OpenAI connection details (API key, API base, API type, API version).
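These settings can be kept out of source control and loaded at runtime with python-dotenv, for example as below. The variable names are assumptions; use whatever keys your .env actually defines.

# Load Azure OpenAI settings from a local .env file.
import os
from dotenv import load_dotenv

load_dotenv()  # reads .env from the current working directory

aoai_api_key = os.environ["AOAI_API_KEY"]                    # hypothetical key name
aoai_api_base = os.environ["AOAI_API_BASE"]                  # e.g. https://<resource>.openai.azure.com/
aoai_api_type = os.environ.get("AOAI_API_TYPE", "azure")
aoai_api_version = os.environ.get("AOAI_API_VERSION", "2023-07-01-preview")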
Maintenance & Community
This is a Microsoft-maintained project. Contributions are welcome, subject to a Contributor License Agreement (CLA). The project adheres to the Microsoft Open Source Code of Conduct.
Licensing & Compatibility
Limitations & Caveats
Currently, only Azure OpenAI is supported as a connection provider. The CI/CD configurations in .azure-pipelines are specific to Azure DevOps and may require adaptation for CI/CD systems other than the provided GitHub Actions and Jenkins examples.
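For reference, an Azure OpenAI connection can be registered locally with the Prompt Flow SDK roughly as follows. This is a minimal sketch assuming promptflow >= 1.x; the connection name and values are placeholders, not part of the template.

# Register a local Azure OpenAI connection for Prompt Flow to use.
from promptflow.client import PFClient
from promptflow.entities import AzureOpenAIConnection

pf = PFClient()
connection = AzureOpenAIConnection(
    name="aoai_connection",                      # hypothetical connection name
    api_key="<your-azure-openai-key>",
    api_base="https://<your-resource>.openai.azure.com/",
    api_type="azure",
    api_version="2023-07-01-preview",
)
pf.connections.create_or_update(connection)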