OpenLIT uses OpenTelemetry auto-instrumentation to help you monitor AI agents built with Phidata, including performance and operation stats. Auto-instrumentation means you don't have to set up monitoring manually for each LLM, framework, or database: simply add OpenLIT to your application, and all the necessary monitoring configuration is set up automatically.

Documentation Index
Fetch the complete documentation index at: https://docs.openlit.io/llms.txt
Use this file to discover all available pages before exploring further.
Get started
Initialize OpenLIT in your Application
- Python
- TypeScript
- Zero Code Instrumentation
- One-Line Instrumentation
Perfect for existing applications - no code modifications needed:
- Via CLI Arguments
- Via Environment Variables
Perfect for: Legacy applications, production systems where code changes need approval, quick testing, or when you want to add observability without touching existing code.
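As a sketch of the environment-variable route, the exporter settings below follow the standard OpenTelemetry SDK conventions; the service name and script path are placeholders, so confirm the exact variable names against the OpenLIT documentation for your SDK version:

```shell
# Standard OpenTelemetry exporter settings, read by the SDK at startup
export OTEL_EXPORTER_OTLP_ENDPOINT="http://127.0.0.1:4318"
export OTEL_SERVICE_NAME="phidata-agent"

# Run the unmodified application -- no code changes required
python app.py
```

Because the configuration lives entirely in the process environment, it can be applied per deployment (e.g., only in staging) without touching the codebase.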
Replace YOUR_OTEL_ENDPOINT with the URL of your OpenTelemetry backend, such as http://127.0.0.1:4318 if you are using OpenLIT with a local OTel Collector. To send metrics and traces to other observability tools, refer to the supported destinations. For more advanced configurations and application use cases, visit the OpenLIT Python repository or the OpenLIT TypeScript repository.

Quickstart: LLM Observability
Production-ready AI monitoring setup in 2 simple steps with zero code changes
Configuration
Configure the OpenLIT SDK according to your requirements.
Destinations
Send telemetry to Datadog, Grafana, New Relic, and other observability stacks
Zero-code observability with the OpenLIT Controller
Discover and instrument LLM traffic across Kubernetes, Docker, and Linux using eBPF — no code changes required.

