OpenLIT automatically instruments VectorDBs alongside LLMs, MCP, and frameworks by default.
This guide walks through a production-ready VectorDB observability setup using OpenLIT's OpenTelemetry-native auto-instrumentation. With zero code changes via the CLI, or a minimal SDK integration, you get end-to-end monitoring of vector operations, embedding performance, similarity-search latency, and cost, all exported as OpenTelemetry traces and metrics.
Deploy OpenLIT

Step 1: Clone the OpenLIT repository

git clone git@github.com:openlit/openlit.git
Step 2: Start Docker Compose

From the root directory of the OpenLIT repository, run:
docker compose up -d
Step 3: Instrument your application

Not sure which method to choose? Check out Instrumentation Methods to understand the differences.
# Install OpenLIT
pip install openlit

# Start VectorDB monitoring instantly
openlit-instrument --service-name my-vectordb-app python your_vectordb_app.py

# With custom settings for VectorDB applications
openlit-instrument \
  --otlp-endpoint http://127.0.0.1:4318 \
  --service-name my-vectordb-app \
  --environment production \
  python your_vectordb_app.py
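Under the hood, auto-instrumentation wraps each vector database client call in an OpenTelemetry span that records the operation name, latency, and status. The sketch below is a simplified, hypothetical illustration of that pattern; it is not OpenLIT's actual implementation, and `traced`, `SPANS`, and `similarity_search` are invented for this example:

```python
import functools
import time

SPANS = []  # stand-in for an OpenTelemetry span exporter

def traced(operation):
    """Wrap a function so each call emits a span-like record
    (operation name, duration, status), as an instrumentor would."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            status = "OK"
            try:
                return fn(*args, **kwargs)
            except Exception:
                status = "ERROR"
                raise
            finally:
                SPANS.append({
                    "name": operation,
                    "duration_ms": (time.perf_counter() - start) * 1000,
                    "status": status,
                })
        return wrapper
    return decorator

@traced("vectordb.query")
def similarity_search(query_vector, top_k=3):
    # Stand-in for a real vector database query.
    return [("doc-%d" % i, 0.9 - 0.1 * i) for i in range(top_k)]

results = similarity_search([0.1, 0.2, 0.3])
```

Because the wrapping happens at the client-library level, your application code (here, `similarity_search`) stays unchanged, which is why the CLI can instrument it with zero code modifications.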
Step 4: Monitor, debug, and test the quality of your vector database

Navigate to the OpenLIT dashboard at http://127.0.0.1:3000 to start monitoring your VectorDB applications.
You should see VectorDB-specific traces and metrics including:
  • Vector Operations: Track insert, update, delete, and query operations performance
  • Similarity Search Metrics: Monitor search latency, relevance scores, and result quality
  • Embedding Performance: Analyze embedding generation and storage efficiency
  • Index Operations: Monitor index building, updates, and optimization processes
  • Resource Utilization: Track memory usage, disk I/O, and computational costs
  • Database Performance: Monitor connection pooling, query optimization, and throughput
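To make the similarity-search metrics above concrete: the relevance score of a result is typically the cosine similarity between the query embedding and a stored vector. Here is a minimal, self-contained illustration in pure Python; the vectors and document names are made up and not tied to any particular vector database:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

query = [1.0, 0.0]
stored = {"doc-a": [1.0, 0.0], "doc-b": [0.0, 1.0], "doc-c": [1.0, 1.0]}

# Rank stored vectors by relevance to the query, as a similarity search would.
ranked = sorted(
    ((doc, cosine_similarity(query, vec)) for doc, vec in stored.items()),
    key=lambda pair: pair[1],
    reverse=True,
)
```

Here `doc-a` ranks first with score 1.0 (identical direction to the query), followed by `doc-c` (≈0.707) and `doc-b` (0.0); these are the per-result relevance scores that show up alongside latency in the search traces.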
Send observability telemetry to other OpenTelemetry backends

If you wish to send telemetry directly from the SDK to another backend, stop the current Docker services with the command below. For details on exporting to your existing OpenTelemetry backends, check out our Supported Destinations guide.
docker compose down
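For example, you can point the same CLI at your own collector by changing the OTLP endpoint; this is a configuration sketch, and `http://your-otel-collector:4318` is a placeholder you should replace with your backend's OTLP HTTP endpoint:

```shell
# Export directly to an external OTLP backend instead of the bundled stack.
openlit-instrument \
  --otlp-endpoint http://your-otel-collector:4318 \
  --service-name my-vectordb-app \
  python your_vectordb_app.py
```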
If you have any questions or need support, reach out to our community.

Kubernetes

Running in Kubernetes? Try the OpenLIT Operator

Automatically inject instrumentation into existing workloads without modifying pod specs, container images, or application code.