The OpenLIT SDK offers two ways to add OpenTelemetry-native observability to your AI applications:
Zero-code instrumentation
A runtime wrapper with no code changes needed. Use when:
- Working with existing/production apps
- Code changes are restricted or risky
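A terminal sketch of what zero-code setup typically looks like; the wrapper command name and flag below (`openlit-instrument`, `--otlp-endpoint`) are assumptions modeled on common zero-code instrumentation tooling, so check the Quickstart for the exact invocation on your platform:

```shell
# Hypothetical zero-code setup: install the SDK, then wrap your existing
# start command instead of editing application code.
# Command name and flag are assumptions; verify against the Quickstart.
pip install openlit
openlit-instrument --otlp-endpoint http://127.0.0.1:4318 python app.py
```

Because the wrapper sits in front of the unchanged start command, this approach fits apps where the source cannot be modified.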
Manual instrumentation
Add two lines to your code for automatic instrumentation. Use when:
- Building new applications
- Need custom spans or metadata
- Want advanced configuration options
Comparison
| | Zero-code instrumentation | Manual instrumentation |
|---|---|---|
| Code changes | None | Two lines |
| Advanced features | Basic configuration | Full access to custom spans, metadata |
| Setup time | Instant | 1 minute |
| Best for | Production or existing apps | New apps, custom needs |
Quickstart: LLM Observability
Production-ready AI monitoring setup in two simple steps with zero code changes
Integrations
60+ AI integrations with automatic instrumentation and performance tracking
Deploy OpenLIT Platform
Deployment options for scalable LLM monitoring infrastructure
Destinations
Send telemetry to Datadog, Grafana, New Relic, and other observability stacks
Zero-code observability with the OpenLIT Controller
Discover and instrument LLM traffic across Kubernetes, Docker, and Linux using eBPF — no code changes required.

