OpenLIT automatically instruments VectorDBs alongside LLMs, MCP, and frameworks by default.
Deploy OpenLIT
Start Docker Compose
From the root directory of the OpenLIT repo, run the following command:
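A minimal sketch of the deployment, assuming the default `docker-compose.yml` shipped in the repo root (repo URL from the OpenLIT GitHub organization):

```shell
# Clone the repo if you haven't already
git clone https://github.com/openlit/openlit.git
cd openlit

# Start OpenLIT and its dependencies in the background
docker compose up -d
```

Once the containers are up, the OpenLIT UI is served locally (by default on port 3000, as referenced later in this guide).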
Install OpenLIT SDK
Want zero-code observability? Try the OpenLIT Controller
The Controller uses eBPF to automatically discover and instrument LLM traffic across Kubernetes, Docker, and Linux — no SDK required. Use SDKs for deeper application-level tracing.
- Python
- TypeScript
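Assuming the published package name `openlit` on both PyPI and npm, installation typically looks like:

```shell
# Python SDK
pip install openlit

# TypeScript SDK
npm install openlit
```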
Instrument your application
- Python
- TypeScript
- Zero-Code instrumentation
- Manual instrumentation
- Via CLI arguments
- Via environment variables
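As a sketch of the manual Python path, assuming the SDK's `openlit.init()` entry point and a local OTLP endpoint (adjust the endpoint to match your deployment):

```python
import openlit

# Initialize OpenLIT instrumentation; traces and metrics are exported over OTLP.
# The endpoint below assumes OpenLIT was deployed locally via Docker Compose.
openlit.init(otlp_endpoint="http://127.0.0.1:4318")
```

Calling `init()` once at application startup is enough; supported libraries used afterwards are instrumented automatically.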
Monitor, debug, and test the quality of your vector database
Navigate to OpenLIT at 127.0.0.1:3000 to start monitoring your VectorDB applications.

You should see VectorDB-specific traces and metrics, including:
- Vector Operations: Track insert, update, delete, and query operations performance
- Similarity Search Metrics: Monitor search latency, relevance scores, and result quality
- Embedding Performance: Analyze embedding generation and storage efficiency
- Index Operations: Monitor index building, updates, and optimization processes
- Resource Utilization: Track memory usage, disk I/O, and computational costs
- Database Performance: Monitor connection pooling, query optimization, and throughput

If you have any questions or need support, reach out to our community.
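For instance, a hypothetical ChromaDB workload is picked up automatically once `openlit.init()` has run; the collection name and documents below are illustrative, not from the OpenLIT docs:

```python
import openlit
import chromadb

# Instrumentation must be initialized before any vector operations run
openlit.init()

# Subsequent Chroma calls are traced automatically: the add and query
# operations below show up as VectorDB spans in the OpenLIT UI.
client = chromadb.Client()
collection = client.create_collection("docs")  # illustrative collection name
collection.add(ids=["1"], documents=["OpenLIT traces vector operations"])
results = collection.query(query_texts=["vector tracing"], n_results=1)
```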
Integrations
60+ AI integrations with automatic instrumentation and performance tracking
Create a dashboard
Create custom visualizations with flexible widgets, queries, and real-time AI monitoring
Manage prompts
Version, deploy, and collaborate on prompts with centralized management and tracking
Zero-code observability with the OpenLIT Controller
Discover and instrument LLM traffic across Kubernetes, Docker, and Linux using eBPF — no code changes required.

