To send OpenTelemetry traces generated by OpenLIT from your AI application to Murnitur.ai, follow the steps below.

1. Get your Credentials

  1. Create an account: Visit Murnitur.ai and sign up
  2. Generate API Key: Navigate to your dashboard and generate your API key
  3. Save your credentials:
    • API Key: Your Murnitur trace token for authentication
    • Endpoint: https://middleware.murnitur.ai
Murnitur.ai is optimized for trace data. The integration automatically disables metrics to focus on trace observability.
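As a sketch, the credentials above map onto the standard OpenTelemetry OTLP exporter environment variables, which OpenLIT reads at startup; the variable names are the generic OpenTelemetry ones, and the placeholder key is an assumption you must replace:

```python
import os

# Point the OTLP exporter at Murnitur and pass the trace token as a header.
# Replace YOUR_MURNITUR_API_KEY with the key generated in your dashboard.
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = "https://middleware.murnitur.ai"
os.environ["OTEL_EXPORTER_OTLP_HEADERS"] = "x-murnix-trace-token=YOUR_MURNITUR_API_KEY"
```

Setting these in your deployment environment (rather than in code) keeps the API key out of your source tree.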

2. Instrument your application

For Kubernetes deployments with zero-code instrumentation:

```yaml
apiVersion: openlit.io/v1alpha1
kind: AutoInstrumentation
metadata:
  name: murnitur-instrumentation
  namespace: default
spec:
  selector:
    matchLabels:
      instrument: "true"
  python:
    instrumentation:
      provider: "openlit"
      version: "latest"
  otlp:
    endpoint: "https://middleware.murnitur.ai"
    headers: "x-murnix-trace-token=YOUR_MURNITUR_API_KEY"
    timeout: 30
  resource:
    environment: "production"
    serviceName: "my-ai-service"
  # Murnitur focuses on trace data, metrics are disabled
  disableMetrics: true
```
Replace YOUR_MURNITUR_API_KEY with the API key you generated in Step 1.
Refer to the OpenLIT Operator Documentation for more advanced configurations and use cases.
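Pods opt in to instrumentation via the `instrument: "true"` label that the selector above matches on. A minimal deployment sequence might look like the following; the manifest filename and deployment name are placeholders, and the label must land on the pod template (not just the Deployment object) for the selector to match:

```shell
# Apply the AutoInstrumentation resource (filename is an assumption).
kubectl apply -f murnitur-instrumentation.yaml

# Add the selector label to the pod template of a workload you want traced;
# "my-ai-service" is a placeholder deployment name.
kubectl patch deployment my-ai-service \
  -p '{"spec":{"template":{"metadata":{"labels":{"instrument":"true"}}}}}'
```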

3. Visualize in Murnitur.ai

Once your LLM application is instrumented, you can explore the telemetry data in Murnitur.ai:
  1. Open the Dashboard: Go to your Murnitur.ai dashboard
  2. Explore Traces: View your AI application traces including:
    • LLM request traces with detailed timing and execution flow
    • Model performance analytics and latency metrics
    • Request/response data for debugging and optimization
    • Token usage and cost tracking information
    • Complete trace hierarchy showing the full request lifecycle
  3. Trace Analytics: Analyze patterns, performance bottlenecks, and usage trends
  4. Performance Monitoring: Monitor latency, throughput, and error rates
  5. Debugging Tools: Use detailed trace data to debug and optimize your AI applications
Your OpenLIT-instrumented AI applications appear automatically in Murnitur.ai, with trace observability focused on LLM performance, execution flow, and detailed debugging for AI workloads.