OpenLIT automatically instruments VectorDBs alongside LLMs, MCP, and frameworks by default.
Deploy OpenLIT
Git clone the OpenLIT repository
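A minimal sketch, assuming the repository is hosted at github.com/openlit/openlit:

```bash
# Clone the OpenLIT repository and move into it
git clone https://github.com/openlit/openlit.git
cd openlit
```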
Start Docker Compose
From the root directory of the OpenLIT repo, run the command below:
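For a typical Docker Compose v2 setup, this is usually the detached-mode invocation (a sketch, not copied verbatim from the repo):

```bash
# Start the OpenLIT services in detached (background) mode
docker compose up -d
```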
Instrument your application
Not sure which method to choose? Check out Instrumentation Methods to understand the differences.
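As a minimal sketch of the SDK-based method, assuming the OpenLIT Python SDK is installed with `pip install openlit` and the local deployment exposes the default OTLP endpoint:

```python
import openlit

# Initialize OpenLIT once at application startup; auto-instrumentation of
# supported VectorDB, LLM, and framework clients is enabled from here on.
# Port 4318 is assumed to be the OTLP/HTTP endpoint of the local stack.
openlit.init(otlp_endpoint="http://127.0.0.1:4318")

# ...continue with your existing vector database code (queries, upserts, etc.)
```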
Monitor, debug and test the quality of your vector database
Navigate to OpenLIT at 127.0.0.1:3000 to start monitoring your VectorDB applications. You should see VectorDB-specific traces and metrics including:
- Vector Operations: Track insert, update, delete, and query operations performance
- Similarity Search Metrics: Monitor search latency, relevance scores, and result quality
- Embedding Performance: Analyze embedding generation and storage efficiency
- Index Operations: Monitor index building, updates, and optimization processes
- Resource Utilization: Track memory usage, disk I/O, and computational costs
- Database Performance: Monitor connection pooling, query optimization, and throughput
If you have any questions or need support, reach out to our community.
Quickstart: LLM Evaluations
Get started with evaluating your LLM responses in 2 simple steps
Integrations
60+ AI integrations with automatic instrumentation and performance tracking
Destinations
Send telemetry to Datadog, Grafana, New Relic, and other observability stacks
Running in Kubernetes? Try the OpenLIT Operator
Automatically inject instrumentation into existing workloads without modifying pod specs, container images, or application code.