📋 Prerequisites
- Kubernetes cluster with cluster-admin access
- Helm package manager
- kubectl configured for your cluster
☸️ Don't have a Kubernetes cluster? Create one locally
We recommend using k3d or minikube to try out the OpenLIT Operator in a local environment.
- Install k3d following their official guide
- Create cluster:
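The local-cluster step can be sketched as follows; the cluster name `openlit` is an arbitrary example, not a required value:

```shell
# Create a local k3d cluster (the name "openlit" is an illustrative choice)
k3d cluster create openlit

# Confirm kubectl now points at the new cluster
kubectl cluster-info
```

With minikube, `minikube start` plays the same role.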
1. Deploy OpenLIT Platform
Install the OpenLIT observability platform to collect and monitor the performance of your LLM apps and AI agents. This involves two sub-steps: adding the Helm repository, then installing the OpenLIT platform.
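A minimal sketch of those two sub-steps; the repository URL, chart name, and namespace are assumptions based on OpenLIT's Helm charts, so verify them against the official chart documentation:

```shell
# Add the OpenLIT Helm repository (URL assumed; check the official docs)
helm repo add openlit https://openlit.github.io/helm/
helm repo update

# Install the OpenLIT platform into its own namespace
helm install openlit openlit/openlit \
  --namespace openlit --create-namespace
```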
2. Deploy OpenLIT Operator
Install the OpenLIT Operator to enable zero-code instrumentation, then verify that the operator is running.
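A sketch of the install-and-verify sub-steps; the chart name, namespace, and label selector are assumptions and should be checked against the operator's documentation:

```shell
# Install the OpenLIT Operator (chart name assumed; verify against the docs)
helm install openlit-operator openlit/openlit-operator \
  --namespace openlit --create-namespace

# Verify the operator pod is running
# (the label selector below is an assumption about the chart's labels)
kubectl get pods -n openlit -l app.kubernetes.io/name=openlit-operator
```

A healthy install shows the operator pod in the `Running` state.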
3. Create AutoInstrumentation Custom Resource
Create an AutoInstrumentation resource to define how your AI apps should be instrumented:
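A hedged sketch of applying such a resource. The field names and values below are illustrative, not the authoritative schema; consult the operator's CRD reference for the exact `apiVersion`, selector, and exporter fields:

```shell
# Apply an AutoInstrumentation resource inline.
# All spec fields below are examples; check the operator's CRD reference.
kubectl apply -f - <<EOF
apiVersion: openlit.io/v1alpha1
kind: AutoInstrumentation
metadata:
  name: openlit-instrumentation
  namespace: default
spec:
  # Select which pods to instrument (example label)
  selector:
    matchLabels:
      instrument: "true"
  # Send telemetry to the in-cluster OpenLIT platform (endpoint assumed)
  otlp:
    endpoint: http://openlit.openlit.svc.cluster.local:4318
EOF
```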
Already have AI applications running? You’ll need to restart them to enable instrumentation:
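Restarting an existing workload can be sketched as follows; `my-ai-app` is a placeholder deployment name:

```shell
# Restart an existing workload so the operator can inject instrumentation
# ("my-ai-app" is a placeholder; substitute your deployment's name)
kubectl rollout restart deployment/my-ai-app
kubectl rollout status deployment/my-ai-app
```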
4. Deploy the example AI Agent
Deploy the example AI agent built using CrewAI:
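Deploying the example agent might look like the following; the manifest filename is a placeholder, since the guide's actual manifest location isn't shown here:

```shell
# Apply the example agent manifest ("crewai-agent.yaml" is a placeholder)
kubectl apply -f crewai-agent.yaml

# Watch the agent pod come up
kubectl get pods -w
```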
5. View traces and metrics in OpenLIT
Access the OpenLIT dashboard to view your AI application traces. This involves two sub-steps: port-forwarding to OpenLIT, then accessing the dashboard.
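The port-forward sub-step can be sketched as follows; the service name and namespace are assumptions based on the platform install above:

```shell
# Forward the OpenLIT dashboard to localhost:3000
# (service name and namespace assumed from the Helm install)
kubectl port-forward -n openlit svc/openlit 3000:3000
```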
- Open your browser and navigate to http://localhost:3000
- Navigate to the Traces section in the dashboard
- Service Overview: Your `openlit-test-app` service with health metrics
- Trace Timeline: Individual traces for HTTP requests and OpenAI API calls
- LLM Operations: Detailed spans showing OpenAI API calls with token usage
- Performance Metrics: Response times, error rates, and throughput
- Cost Tracking: Token usage and estimated costs