openllmetry by traceloop

Open-source observability SDK for LLM applications

Created 2 years ago
6,410 stars

Top 8.0% on SourcePulse

Project Summary

OpenLLMetry provides open-source observability for LLM applications by extending OpenTelemetry. It offers instrumentations for popular LLM providers and vector databases, allowing users to integrate LLM-specific tracing into existing observability stacks like Datadog or Honeycomb. The project aims to simplify LLM observability for developers and researchers.

How It Works

OpenLLMetry leverages OpenTelemetry's robust framework, adding custom semantic conventions and instrumentations specifically for LLM interactions. This approach ensures compatibility with a wide range of observability backends that already support OpenTelemetry. The Traceloop SDK simplifies integration by providing a single point of initialization, automatically capturing LLM calls, vector database queries, and other relevant application traces.
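
A rough sketch of that single-init pattern (assuming the traceloop-sdk decorators, an installed OpenAI client, and an OPENAI_API_KEY in the environment; the app name, model, and prompt are placeholders):

    # Sketch only: app name, model, and prompt are illustrative.
    from openai import OpenAI
    from traceloop.sdk import Traceloop
    from traceloop.sdk.decorators import workflow

    # One init call sets up OpenTelemetry plus the LLM-specific instrumentations.
    Traceloop.init(app_name="joke_service")

    client = OpenAI()

    @workflow(name="tell_joke")  # groups this function's spans into a named trace
    def tell_joke(topic: str) -> str:
        response = client.chat.completions.create(
            model="gpt-4o-mini",
            messages=[{"role": "user", "content": f"Tell me a joke about {topic}"}],
        )
        return response.choices[0].message.content

    print(tell_joke("observability"))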

Quick Start & Requirements

  • Install the SDK: pip install traceloop-sdk
  • Initialize in code:
    from traceloop.sdk import Traceloop
    Traceloop.init()
    
  • For immediate trace visibility locally, use Traceloop.init(disable_batch=True).
  • Supported destinations include Datadog, Honeycomb, Grafana, SigNoz, and many others; see the docs for detailed connection instructions, and the sketch after this list for pointing the exporter at a custom endpoint.
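
A minimal sketch of exporting to a custom OTLP endpoint, assuming the TRACELOOP_BASE_URL and TRACELOOP_HEADERS environment variables described in the docs; the endpoint and header values below are placeholders:

    import os

    # Assumed env vars per the docs; point them at any OTLP/HTTP-compatible
    # backend (collector, SigNoz, Honeycomb, ...). Values are placeholders.
    os.environ["TRACELOOP_BASE_URL"] = "http://localhost:4318"
    os.environ["TRACELOOP_HEADERS"] = "x-api-key=YOUR_KEY"

    from traceloop.sdk import Traceloop

    # disable_batch=True flushes spans immediately, useful when testing locally.
    Traceloop.init(app_name="my_app", disable_batch=True)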

Highlighted Details

  • Instruments LLM providers: OpenAI, Anthropic, Cohere, HuggingFace, Bedrock, SageMaker, and more (a sketch of enabling a single instrumentation directly follows this list).
  • Instruments Vector DBs: Chroma, Pinecone, Qdrant, Weaviate, Milvus, and others.
  • Integrates with frameworks: LangChain, LlamaIndex, Haystack, LiteLLM, CrewAI.
  • Semantic conventions are now part of the official OpenTelemetry project.
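
For finer-grained setups, the individual instrumentation packages can be wired into a plain OpenTelemetry pipeline instead of the all-in-one SDK. A hedged sketch, assuming the opentelemetry-instrumentation-openai package exposes an OpenAIInstrumentor as in the repo:

    # Assumes opentelemetry-sdk and opentelemetry-instrumentation-openai are installed.
    from opentelemetry import trace
    from opentelemetry.sdk.trace import TracerProvider
    from opentelemetry.sdk.trace.export import BatchSpanProcessor, ConsoleSpanExporter
    from opentelemetry.instrumentation.openai import OpenAIInstrumentor

    # Standard OpenTelemetry plumbing: export spans to stdout for inspection.
    provider = TracerProvider()
    provider.add_span_processor(BatchSpanProcessor(ConsoleSpanExporter()))
    trace.set_tracer_provider(provider)

    # Patches the OpenAI client so completion and embedding calls emit spans.
    OpenAIInstrumentor().instrument()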

Licensing & Compatibility

  • Licensed under the Apache 2.0 license.
  • Compatible with commercial and closed-source applications.

Limitations & Caveats

The SDK includes a telemetry feature that collects anonymous usage data to help identify breaking changes in LLM provider APIs. It can be disabled via environment variables or initialization parameters.
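
A sketch of opting out before initialization, assuming TRACELOOP_TELEMETRY is the environment variable the docs describe:

    import os

    # Assumed variable name per the docs; set it before Traceloop.init() runs.
    os.environ["TRACELOOP_TELEMETRY"] = "FALSE"

    from traceloop.sdk import Traceloop
    Traceloop.init()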

Health Check

  • Last commit: 13 hours ago
  • Responsiveness: 1 day
  • Pull requests (30d): 62
  • Issues (30d): 11

Star History

217 stars in the last 30 days

Explore Similar Projects

Starred by Han Wang (Cofounder of Mintlify), John Resig (Author of jQuery; Chief Software Architect at Khan Academy), and 6 more.

evidently by evidentlyai

Open-source framework for ML/LLM observability

Created 4 years ago
Updated 13 hours ago
7k stars

Top 0.3% on SourcePulse