Documentation Index

Fetch the complete documentation index at: https://docs.openlit.io/llms.txt

Use this file to discover all available pages before exploring further.

The OpenLIT SDK offers two ways to add OpenTelemetry-native observability to your AI applications:

Zero-code instrumentation

Runtime wrapper; no code changes needed.
terminal
openlit-instrument python app.py
Use when:
  • Working with existing/production apps
  • Code changes are restricted or risky
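Because the wrapper runs the application under an OpenTelemetry-instrumented process, export configuration is typically supplied through standard OpenTelemetry environment variables rather than code. A minimal sketch, assuming the standard OTel variables are honored; the endpoint URL and service name below are placeholders:

```shell
# Standard OpenTelemetry environment variables (values are placeholders)
export OTEL_EXPORTER_OTLP_ENDPOINT="http://127.0.0.1:4318"
export OTEL_SERVICE_NAME="my-ai-app"

# Wrap the unmodified application -- no source changes required
openlit-instrument python app.py
```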

Manual instrumentation

Add two lines to your code for automatic instrumentation.
example.py
import openlit
openlit.init()
Use when:
  • Building new applications
  • Need custom spans or metadata
  • Want advanced configuration options
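The two-line setup above uses defaults; the manual path also allows passing configuration directly to `openlit.init()`. A minimal sketch, assuming the `openlit` package is installed; the keyword names reflect the SDK's documented options, and every value is a placeholder for your own setup:

```python
import openlit

# Sketch of configured initialization -- all values are placeholders.
openlit.init(
    otlp_endpoint="http://127.0.0.1:4318",  # OTLP HTTP collector endpoint
    application_name="my-ai-app",           # service name attached to telemetry
    environment="production",               # deployment environment tag
)
```

With no arguments, `openlit.init()` falls back to its defaults and standard OpenTelemetry environment variables, so the explicit keywords are only needed for custom setups.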

Comparison

|                   | Zero-code instrumentation      | Manual instrumentation                |
| ----------------- | ------------------------------ | ------------------------------------- |
| Code changes      | None                           | 2 lines                               |
| Advanced features | Basic configuration            | Full access to custom spans, metadata |
| Setup time        | Instant                        | 1 minute                              |
| Best for          | Production apps, existing apps | New apps, custom needs                |

Quickstart: LLM Observability

Production-ready AI monitoring in two simple steps with zero code changes

Integrations

60+ AI integrations with automatic instrumentation and performance tracking

Deploy OpenLIT Platform

Deployment options for scalable LLM monitoring infrastructure

Destinations

Send telemetry to Datadog, Grafana, New Relic, and other observability stacks

Zero-code observability with the OpenLIT Controller

Discover and instrument LLM traffic across Kubernetes, Docker, and Linux using eBPF — no code changes required.