Documentation Index

Fetch the complete documentation index at: https://docs.openlit.io/llms.txt

Use this file to discover all available pages before exploring further.

The OpenLIT SDK can send AI observability data (traces, metrics, logs) from LLMs, vector databases, and AI frameworks directly to your existing observability stack. This enables:
  • Unified Monitoring: Consolidate AI and application metrics in one platform
  • Existing Workflows: Leverage your current dashboards, alerts, and processes
  • Cost Optimization: Reuse your existing observability investment instead of paying for a separate AI-monitoring platform
  • Compliance: Meet data residency and security requirements
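As a minimal sketch of what this looks like in practice, the Python SDK is initialized once at startup and pointed at an OTLP endpoint (the endpoint URL and application name below are placeholders, and `pip install openlit` is assumed; check the quickstart for the exact parameters your SDK version supports):

```python
# Requires: pip install openlit
import openlit

# Initialize instrumentation once at application startup.
# otlp_endpoint is a placeholder -- use your collector's or backend's OTLP URL.
openlit.init(
    otlp_endpoint="http://127.0.0.1:4318",
    application_name="my-llm-app",  # hypothetical name, for illustration only
)
# From here, calls made through supported LLM and vector-database clients
# are traced automatically and exported to the configured destination.
```

If no endpoint is passed explicitly, the SDK typically falls back to the standard OpenTelemetry environment variables (see below).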

Supported Destinations

  • OpenTelemetry Collector
  • Prometheus + Tempo
  • Prometheus + Jaeger
  • Grafana Cloud
  • New Relic
  • Elastic
  • HyperDX
  • DataDog
  • Dash0
  • SigNoz
  • OneUptime
  • Dynatrace
  • OpenObserve
  • Highlight.io
  • Murnitur
  • Middleware
  • SigLens
  • Oodle
  • Langfuse
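Most of the destinations above ingest OTLP directly, so a common pattern is to configure the exporter through the standard OpenTelemetry environment variables rather than in code. A sketch, with placeholder values (the endpoint URL and header key are illustrative, not real credentials; consult the destination's page for its actual OTLP endpoint and auth header):

```shell
# Point OTLP export at your backend (placeholder URL).
export OTEL_EXPORTER_OTLP_ENDPOINT="https://otlp.example.com:4318"

# Many hosted backends authenticate via headers, e.g. an API key
# (header name varies by vendor -- this one is illustrative).
export OTEL_EXPORTER_OTLP_HEADERS="api-key=YOUR_KEY"
```

Because these are standard OpenTelemetry variables, the same configuration works whether the data goes straight to a vendor or through an OpenTelemetry Collector in between.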


Quickstart: LLM Observability

Production-ready AI monitoring in two simple steps, with zero code changes

Integrations

60+ AI integrations with automatic instrumentation and performance tracking

Deploy OpenLIT Platform

Deployment options for scalable LLM monitoring infrastructure

Zero-code observability with the OpenLIT Controller

Discover and instrument LLM traffic across Kubernetes, Docker, and Linux using eBPF — no code changes required.