OpenLIT uses OpenTelemetry auto-instrumentation to help you monitor LLM applications built with LangChain, including performance and operation stats. Auto-instrumentation means you don’t have to set up monitoring manually for different LLMs, frameworks, or databases: adding OpenLIT to your application sets up all the necessary monitoring configuration automatically. The integration is compatible with:
  • langchain-core >= 0.1.20
  • modern langchain releases that expose langchain_core
  • modern langgraph releases when installed alongside LangChain
Minimal requirement:
  • langchain-core
Common installation patterns:
  • LangChain applications: pip install langchain
  • LangChain Core only: pip install langchain-core
  • LangGraph applications: pip install langgraph

Get started

1. Install OpenLIT

Open your command line or terminal and run:
pip install openlit
2. Install LangChain or LangChain Core

Install the framework package your application uses:
pip install langchain
# or
pip install langchain-core
# or
pip install langgraph
3. Initialize OpenLIT in your Application

Perfect for existing applications - no code modifications needed:
# Configure via CLI arguments
openlit-instrument \
  --service-name my-ai-app \
  --environment production \
  --otlp-endpoint YOUR_OTEL_ENDPOINT \
  python your_app.py
Perfect for: Legacy applications, production systems where code changes need approval, quick testing, or when you want to add observability without touching existing code.
Replace YOUR_OTEL_ENDPOINT with the URL of your OpenTelemetry backend, such as http://127.0.0.1:4318 if you are using OpenLIT with a local OTel Collector.
To send metrics and traces to other observability tools, refer to the supported destinations.
For more advanced configurations and application use cases, visit the OpenLIT Python repository or the OpenLIT TypeScript repository.

Quickstart: LLM Observability

Production-ready AI monitoring setup in 2 simple steps with zero code changes

Configuration

Configure the OpenLIT SDK according to your requirements.
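One common pattern is to keep the endpoint out of code entirely and rely on environment variables. OTEL_EXPORTER_OTLP_ENDPOINT is the standard OpenTelemetry variable for the export endpoint; whether OpenLIT prefers it over an explicit otlp_endpoint argument is an assumption to verify in the SDK docs.

```python
import os

# Set the standard OpenTelemetry export endpoint before the SDK
# initializes, so deployments can be repointed without code changes.
# setdefault leaves any value already set by the environment intact.
os.environ.setdefault("OTEL_EXPORTER_OTLP_ENDPOINT", "http://127.0.0.1:4318")
endpoint = os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"]
```

This is also how the Kubernetes Operator below typically injects configuration: through environment variables rather than image or code changes.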

Destinations

Send telemetry to Datadog, Grafana, New Relic, and other observability stacks
Kubernetes

Running in Kubernetes? Try the OpenLIT Operator

Automatically inject instrumentation into existing workloads without modifying pod specs, container images, or application code.