Configuration
Provider-Specific Features
OpenInference offers comprehensive instrumentation coverage using pure OpenTelemetry standards.

Custom Package Installation
Install additional AI framework packages alongside OpenInference:

- LangChain: `langchain>=0.1.0`, `langchain-community>=0.0.20`
- LlamaIndex: `llama-index>=0.9.0`, `llama-index-vector-stores-chroma>=0.1.0`
- LLM Providers: `openai>=1.0.0`, `anthropic>=0.8.0`, `google-generativeai>=0.3.0`
- Vector Databases: `chromadb>=0.4.0`, `pinecone-client>=2.0.0`
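As a sketch, the optional integrations above can be installed with pip, using the minimum versions listed; install only the groups your application actually uses:

```shell
# Each line corresponds to one integration group from the list above.
pip install "langchain>=0.1.0" "langchain-community>=0.0.20"
pip install "llama-index>=0.9.0" "llama-index-vector-stores-chroma>=0.1.0"
pip install "openai>=1.0.0" "anthropic>=0.8.0" "google-generativeai>=0.3.0"
pip install "chromadb>=0.4.0" "pinecone-client>=2.0.0"
```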
Comprehensive AI Framework Support
Built-in instrumentors for major AI frameworks.

OpenTelemetry Standard Compliance
Pure OpenTelemetry implementation with standard semantic conventions. Features:

- 100% OpenTelemetry semantic conventions compliance
- Vendor-neutral tracing data
- Multi-backend compatibility (Jaeger, Grafana, Datadog, etc.)
- Standard resource attributes and span naming
Environment Variables
| Variable | Description | Default |
|---|---|---|
| `OTEL_SERVICE_NAME` | Service name for tracing | `"openinference-app"` |
| `OTEL_DEPLOYMENT_ENVIRONMENT` | Deployment environment | `"production"` |
| `OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT` | Capture message content | `true` |
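A minimal sketch of how application code might resolve these variables; the helper `read_otel_env` is hypothetical (not part of OpenInference), and the defaults mirror the table above:

```python
import os

def read_otel_env(env=None):
    """Resolve the documented variables, falling back to the table's defaults."""
    env = os.environ if env is None else env
    capture = env.get("OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT", "true")
    return {
        "service_name": env.get("OTEL_SERVICE_NAME", "openinference-app"),
        "environment": env.get("OTEL_DEPLOYMENT_ENVIRONMENT", "production"),
        # Environment values arrive as strings; treat anything but "true" as off.
        "capture_message_content": capture.strip().lower() == "true",
    }

cfg = read_otel_env({})  # no variables set, so every default applies
```

Note that `OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT` defaults to on, so deployments handling sensitive prompts should set it to `false` explicitly.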