Configuration
Provider-Specific Features
OpenLLMetry offers LLM-specific observability using the Traceloop SDK.

Custom Package Installation
Install additional LLM provider packages alongside OpenLLMetry:

- OpenAI: openai>=1.0.0
- Anthropic: anthropic>=0.8.0
- Cohere: cohere>=4.0.0
- Google: google-generativeai>=0.3.0
- Azure OpenAI: openai>=1.0.0 (with Azure configuration)
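The pins above can be collected into a requirements file alongside the SDK itself; this fragment is illustrative (install only the providers you actually use, and `traceloop-sdk` is assumed to be the SDK's package name):

```
# requirements.txt (illustrative)
traceloop-sdk
openai>=1.0.0
anthropic>=0.8.0
cohere>=4.0.0
google-generativeai>=0.3.0
```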
Traceloop SDK Integration
Uses the official Traceloop SDK for LLM observability:

- Purpose-built for LLM monitoring
- Lightweight, with minimal overhead
- LLM-specific performance metrics
- Easy integration with existing applications
LLM Performance Monitoring
Focused on language-model-specific metrics.

Capabilities:

- LLM request/response latency
- Token usage and throughput
- Model performance analytics
- Error rates and patterns
Environment Variables
| Variable | Description | Default |
|---|---|---|
| `TRACELOOP_APP_NAME` | Application name for tracing | `"openllmetry-app"` |
| `OTEL_EXPORTER_OTLP_ENDPOINT` | OTLP endpoint (standard) | Auto-configured |
| `TRACELOOP_API_ENDPOINT` | Traceloop-specific endpoint | `"http://localhost:4318"` |
| `TRACELOOP_DISABLE_BATCH` | Disable batch processing | `false` |
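The fallbacks in the table can be reproduced with plain environment-variable lookups. This stdlib-only sketch shows the resolution order; the `config` dict is illustrative, not the SDK's internal structure:

```python
import os

# Fall back to the documented defaults when a variable is unset.
config = {
    "app_name": os.getenv("TRACELOOP_APP_NAME", "openllmetry-app"),
    "api_endpoint": os.getenv("TRACELOOP_API_ENDPOINT", "http://localhost:4318"),
    # TRACELOOP_DISABLE_BATCH is a boolean flag encoded as a string.
    "disable_batch": os.getenv("TRACELOOP_DISABLE_BATCH", "false").lower() == "true",
}

print(config)
```

Note that `OTEL_EXPORTER_OTLP_ENDPOINT` is the standard OpenTelemetry variable and is read by the underlying OTLP exporter rather than by Traceloop-specific code.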