
Documentation Index

Fetch the complete documentation index at: https://docs.openlit.io/llms.txt

Use this file to discover all available pages before exploring further.

OpenLIT is open source and is lightweight and easy to self-host, as it needs only 3 components to run:
  1. OpenLIT itself
  2. ClickHouse for storage
  3. OpenTelemetry Collector for telemetry collection
Already have ClickHouse or an OpenTelemetry Collector? You can reuse your existing infrastructure and skip deploying these components. OpenLIT can be configured to connect to your existing ClickHouse database and OTel Collector setup. Learn more about connecting to existing databases →
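As a sketch, connection details for an existing ClickHouse instance are typically supplied through environment variables. The variable names and values below are assumptions for illustration — confirm the actual names in the configuration guide linked above:

```shell
# Hypothetical example: point OpenLIT at an existing ClickHouse instance
# (variable names and values are assumptions; see the configuration docs)
export INIT_DB_HOST=clickhouse.internal.example.com
export INIT_DB_PORT=8123
export INIT_DB_USERNAME=openlit
export INIT_DB_PASSWORD=changeme
```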
This section contains guides for different deployment scenarios.
Kubernetes

Deploy OpenLIT’s lightweight 3-component architecture on Kubernetes for scalable AI Engineering capabilities.

Docker

Deploy OpenLIT’s lightweight 3-component stack using Docker for an easy-to-manage AI Engineering setup.

Kubernetes

Deploy OpenLIT on your Kubernetes cluster using the OpenLIT Helm chart. The Helm chart deploys all 3 components (OpenLIT platform, ClickHouse, and OpenTelemetry Collector) for a reliable and scalable solution.
1. Helm Repo Setup

```shell
helm repo add openlit https://openlit.github.io/helm/
helm repo update
```
2. Installing the Helm Chart

```shell
helm install openlit openlit/openlit
```
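After installation, standard kubectl commands can confirm the release is healthy and expose the UI locally. The label selector, service name, and UI port (3000) below are assumptions — verify them against the chart's rendered resources and values:

```shell
# Check that the release's pods reach Ready state
# (label selector is an assumption; adjust to your release)
kubectl get pods -l app.kubernetes.io/instance=openlit

# Forward the OpenLIT service to your workstation
# (service name and port 3000 are assumptions; see the chart's values)
kubectl port-forward svc/openlit 3000:3000
```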

Docker

For a quick and straightforward setup, Docker Compose can be used to deploy OpenLIT’s complete stack: the OpenLIT platform, ClickHouse database, and OpenTelemetry Collector.
1. Clone the OpenLIT Repository

```shell
git clone git@github.com:openlit/openlit.git
```
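The command above uses the SSH remote, which requires an SSH key registered with GitHub. If you don't have one set up, the HTTPS remote works just as well:

```shell
git clone https://github.com/openlit/openlit.git
```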
2. Start Docker Compose

From the root directory of the OpenLIT repo, run:

```shell
docker compose up -d
```
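Once the stack is up, you can confirm the containers are running and then open the UI in a browser. The UI port (3000) is an assumption — check the port mapping in the repo's docker-compose.yml:

```shell
# List the stack's services and their current state
docker compose ps

# The OpenLIT UI is then reachable in your browser, e.g.:
# http://localhost:3000  (port is an assumption; see docker-compose.yml)
```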

Quickstart: LLM Observability

Production-ready AI monitoring setup in 2 simple steps with zero code changes

Configuration

Configure OpenLIT deployment settings, environment variables, and database connections

Create a dashboard

Create custom visualizations with flexible widgets, queries, and real-time AI monitoring

Manage prompts

Version, deploy, and collaborate on prompts with centralized management and tracking

Zero-code observability with the OpenLIT Controller

Discover and instrument LLM traffic across Kubernetes, Docker, and Linux using eBPF — no code changes required.