OpenLIT automatically instruments MCP alongside LLMs, VectorDBs, and frameworks by default.
Step 1: Deploy OpenLIT
1. Git clone OpenLIT repository
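A minimal sketch of the clone step, assuming the public openlit/openlit repository on GitHub:

```sh
# Clone the OpenLIT repository (use the HTTPS URL if you don't have SSH keys configured)
git clone git@github.com:openlit/openlit.git
# or: git clone https://github.com/openlit/openlit.git
```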
2. Start Docker Compose
From the root directory of the OpenLIT repo, run the command below:
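Assuming the default docker-compose.yml shipped in the repository root:

```sh
# Start the OpenLIT stack in the background
docker compose up -d
```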
Step 2: Install the OpenLIT SDK
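A sketch of the install step, assuming a Python-based MCP server and the openlit package from PyPI:

```sh
pip install openlit
```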
Step 3: Instrument your MCP server
Not sure which method to choose? Check out Instrumentation Methods to understand the differences.
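As a sketch of SDK-based instrumentation, the snippet below assumes a Python MCP server built with the official `mcp` SDK's FastMCP class; the server name, the `add` tool, and the OTLP endpoint value are illustrative, and `openlit.init()` is the only OpenLIT-specific piece:

```python
import openlit
from mcp.server.fastmcp import FastMCP

# Initialize OpenLIT before the server starts so MCP activity is auto-instrumented.
# The endpoint assumes the local Docker Compose deployment from Step 1.
openlit.init(otlp_endpoint="http://127.0.0.1:4318")

# Illustrative MCP server with a single tool.
mcp = FastMCP("demo-server")

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b

if __name__ == "__main__":
    mcp.run()
```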
Step 4: Monitor, debug and test the quality of your MCP server
Navigate to OpenLIT at 127.0.0.1:3000 to start monitoring your MCP applications.

You should see MCP-specific traces and metrics, including:
- Context Protocol Interactions: Track context loading, management, and utilization
- Tool Usage Metrics: Monitor tool calls and their performance within MCP workflows
- Protocol Performance: Analyze MCP handshakes and communication efficiency
- Resource Utilization: Monitor context window usage and memory consumption
- Error Tracking: Identify and debug MCP protocol errors and failures

If you have any questions or need support, reach out to our community.
Quickstart: LLM Evaluations
Get started with evaluating your LLM responses in 2 simple steps
Integrations
60+ AI integrations with automatic instrumentation and performance tracking
Destinations
Send telemetry to Datadog, Grafana, New Relic, and other observability stacks
Running in Kubernetes? Try the OpenLIT Operator
Automatically inject instrumentation into existing workloads without modifying pod specs, container images, or application code.