The OpenLLMetry provider uses the Traceloop SDK to provide lightweight, LLM-focused observability with minimal overhead and straightforward integration into existing language model applications.

Configuration

apiVersion: openlit.io/v1alpha1
kind: AutoInstrumentation
metadata:
  name: openllmetry-instrumentation
spec:
  selector:
    matchLabels:
      openllmetry.instrument: "true"
  python:
    instrumentation:
      provider: "openllmetry"
      version: "latest"
  otlp:
    endpoint: "http://openlit:4318"
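The selector above instruments only workloads carrying the `openllmetry.instrument: "true"` label. A minimal sketch of that label-matching logic (the helper name is illustrative, not the operator's actual code):

```python
def selector_matches(match_labels: dict, pod_labels: dict) -> bool:
    """True when every key/value pair in matchLabels appears on the pod."""
    return all(pod_labels.get(k) == v for k, v in match_labels.items())

match_labels = {"openllmetry.instrument": "true"}
print(selector_matches(match_labels, {"openllmetry.instrument": "true", "app": "chat"}))  # True
print(selector_matches(match_labels, {"app": "chat"}))  # False: label missing
```

Extra labels on the pod are fine; only the keys listed in matchLabels are checked.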

Provider-Specific Features

OpenLLMetry offers LLM-specific observability using the Traceloop SDK:

Custom Package Installation

Install additional LLM provider packages alongside OpenLLMetry:
spec:
  python:
    instrumentation:
      customPackages: "openai>=1.0.0,anthropic>=0.8.0,cohere>=4.0.0"
Supported LLM Providers:
  • OpenAI: openai>=1.0.0
  • Anthropic: anthropic>=0.8.0
  • Cohere: cohere>=4.0.0
  • Google: google-generativeai>=0.3.0
  • Azure OpenAI: openai>=1.0.0 (with Azure configuration)
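The customPackages value is a comma-separated list of pip requirement specifiers. A hypothetical sketch of how such a string can be split into individual install arguments (helper name is illustrative):

```python
def parse_custom_packages(spec: str) -> list[str]:
    """Split a comma-separated customPackages string into pip requirement specs."""
    return [pkg.strip() for pkg in spec.split(",") if pkg.strip()]

packages = parse_custom_packages("openai>=1.0.0,anthropic>=0.8.0,cohere>=4.0.0")
print(packages)  # ['openai>=1.0.0', 'anthropic>=0.8.0', 'cohere>=4.0.0']
# Each spec would then be passed to pip install inside the instrumented container.
```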

Traceloop SDK Integration

Uses the official Traceloop SDK for LLM observability:
# Core dependency (automatically included)
traceloop-sdk  # Official OpenLLMetry SDK
Features:
  • Purpose-built for LLM monitoring
  • Lightweight and minimal overhead
  • LLM-specific performance metrics
  • Easy integration with existing applications
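For reference, manual initialization of the Traceloop SDK looks roughly like the sketch below; the operator performs the equivalent setup automatically via injected environment variables. The fallback behavior and function name here are assumptions for illustration:

```python
import os

def init_llm_tracing(app_name: str = "openllmetry-app") -> bool:
    """Initialize Traceloop tracing; return False if the SDK is not installed."""
    try:
        from traceloop.sdk import Traceloop  # provided by the traceloop-sdk package
    except ImportError:
        return False
    Traceloop.init(
        app_name=app_name,
        api_endpoint=os.getenv("TRACELOOP_API_ENDPOINT", "http://localhost:4318"),
        disable_batch=os.getenv("TRACELOOP_DISABLE_BATCH", "false") == "true",
    )
    return True

print(init_llm_tracing())
```

With auto-instrumentation enabled, application code does not need to call this at all.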

LLM Performance Monitoring

Focused on language-model-specific metrics.

Capabilities:
  • LLM request/response latency
  • Token usage and throughput
  • Model performance analytics
  • Error rates and patterns
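To illustrate the kind of per-request metrics listed above, here is a minimal sketch (the sample records and field names are hypothetical) computing latency, token throughput, and error rate:

```python
# Hypothetical per-request records of the kind an LLM observability tool captures.
requests = [
    {"latency_s": 1.2, "completion_tokens": 240, "error": False},
    {"latency_s": 0.8, "completion_tokens": 160, "error": False},
    {"latency_s": 2.0, "completion_tokens": 0, "error": True},
]

ok = [r for r in requests if not r["error"]]
avg_latency = sum(r["latency_s"] for r in ok) / len(ok)                      # mean over successes
throughput = sum(r["completion_tokens"] for r in ok) / sum(r["latency_s"] for r in ok)
error_rate = sum(r["error"] for r in requests) / len(requests)

print(f"avg latency: {avg_latency:.2f}s")  # 1.00s
print(f"tokens/sec:  {throughput:.0f}")    # 200
print(f"error rate:  {error_rate:.0%}")    # 33%
```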

Environment Variables

| Variable | Description | Default |
|----------|-------------|---------|
| TRACELOOP_APP_NAME | Application name for tracing | "openllmetry-app" |
| OTEL_EXPORTER_OTLP_ENDPOINT | OTLP endpoint (standard) | Auto-configured |
| TRACELOOP_API_ENDPOINT | Traceloop-specific endpoint | "http://localhost:4318" |
| TRACELOOP_DISABLE_BATCH | Disable batch processing | false |
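Application code can resolve these variables with standard environment lookups, falling back to the documented defaults. A small sketch (the helper name is illustrative):

```python
import os

def traceloop_settings() -> dict:
    """Resolve Traceloop-related env vars, falling back to the documented defaults."""
    return {
        "app_name": os.getenv("TRACELOOP_APP_NAME", "openllmetry-app"),
        # OTEL_EXPORTER_OTLP_ENDPOINT is auto-configured by the operator, so no default here.
        "otlp_endpoint": os.getenv("OTEL_EXPORTER_OTLP_ENDPOINT"),
        "api_endpoint": os.getenv("TRACELOOP_API_ENDPOINT", "http://localhost:4318"),
        "disable_batch": os.getenv("TRACELOOP_DISABLE_BATCH", "false").lower() == "true",
    }

print(traceloop_settings())
```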