OpenLIT instruments MCP automatically by default, alongside LLMs, VectorDBs, and agent frameworks.
This guide demonstrates a production-ready observability setup for MCP (Model Context Protocol) applications using OpenLIT's OpenTelemetry-native auto-instrumentation. You can get complete MCP performance tracking with zero code changes via the CLI, or with a minimal SDK integration. You will learn how to implement real-time context tracking, tool usage monitoring, protocol performance analysis, and resource utilization optimization for your MCP applications, all exported as OpenTelemetry traces and metrics.
Step 1: Deploy OpenLIT

1. Clone the OpenLIT repository:

git clone git@github.com:openlit/openlit.git
2. Start Docker Compose

From the root directory of the OpenLIT repository, run:
docker compose up -d
Step 2: Instrument your MCP server

Not sure which method to choose? Check out Instrumentation Methods to understand the differences.
# Install OpenLIT
pip install openlit

# Start MCP monitoring instantly
openlit-instrument --service-name my-mcp-app python your_mcp_app.py

# With custom settings for MCP applications
openlit-instrument \
  --otlp-endpoint http://127.0.0.1:4318 \
  --service-name my-mcp-app \
  --environment production \
  python your_mcp_app.py
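If you prefer the SDK route, the same setup can be done in code. A minimal sketch, where the endpoint, service name, and environment mirror the CLI flags above (adjust them to your deployment):

```python
# Alternative to the CLI: initialize OpenLIT in code before your MCP
# server starts. Values below mirror the CLI flags used above.
import openlit

openlit.init(
    otlp_endpoint="http://127.0.0.1:4318",  # OTLP endpoint exposed by the Docker deployment
    application_name="my-mcp-app",
    environment="production",
)
```

Call this once at process startup; auto-instrumentation then applies to MCP activity in the same process.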
Step 3: Monitor, debug, and test the quality of your MCP server

Open the OpenLIT dashboard at http://127.0.0.1:3000 to start monitoring your MCP applications.
You should see MCP-specific traces and metrics, including:
  • Context Protocol Interactions: Track context loading, management, and utilization
  • Tool Usage Metrics: Monitor tool calls and their performance within MCP workflows
  • Protocol Performance: Analyze MCP handshakes and communication efficiency
  • Resource Utilization: Monitor context window usage and memory consumption
  • Error Tracking: Identify and debug MCP protocol errors and failures
Send observability telemetry to other OpenTelemetry backends

If you wish to send telemetry directly from the SDK to another backend, stop the current Docker services with the command below. For more details on sending data to your existing OpenTelemetry backends, check out our Supported Destinations guide.
docker compose down
If you have any questions or need support, reach out to our community.

Kubernetes

Running in Kubernetes? Try the OpenLIT Operator

Automatically inject instrumentation into existing workloads without modifying pod specs, container images, or application code.