To send the OpenTelemetry metrics that OpenLIT generates from your AI application to Oodle, follow the steps below.

1. Get your Oodle Credentials

  1. Sign in to your Oodle account
  2. Get your Oodle credentials:
    • OODLE_ENDPOINT: Your Oodle metrics ingestion endpoint
    • INSTANCE_ID: Your Oodle instance identifier
    • API_KEY: Your Oodle API key for authentication
Oodle currently supports metrics only. Traces are not yet supported and will be handled by a debug exporter.

2. Configure OpenTelemetry Collector

Install the OpenTelemetry Collector if it is not already running; for detailed installation instructions, refer to the OpenTelemetry Collector Documentation. Then configure the Collector for the Oodle integration:
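The exact exporter settings come from your Oodle account, so treat the following as a minimal sketch rather than a drop-in file: the exporter type (prometheusremotewrite), the endpoint shape, and the X-API-KEY header name are assumptions that should be checked against the values shown in your Oodle console. Metrics are exported to Oodle, while traces are routed to the debug exporter because Oodle does not ingest them yet.

receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317
      http:
        endpoint: 0.0.0.0:4318

processors:
  batch: {}

exporters:
  prometheusremotewrite:
    # Your Oodle metrics ingestion URL (scoped to INSTANCE_ID); copy the exact
    # value from the Oodle console.
    endpoint: "OODLE_ENDPOINT"
    headers:
      X-API-KEY: "API_KEY"   # header name is an assumption; use the one Oodle documents
    # Optional: turn resource attributes (service name, environment) into metric labels
    resource_to_telemetry_conversion:
      enabled: true
  debug: {}                  # traces are only logged locally for now

service:
  pipelines:
    metrics:
      receivers: [otlp]
      processors: [batch]
      exporters: [prometheusremotewrite]
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [debug]

With this split, the same Collector keeps receiving traces from OpenLIT today, and the traces pipeline can be pointed at Oodle once trace ingestion becomes available.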
Replace:
  1. OODLE_ENDPOINT with your Oodle endpoint
  2. INSTANCE_ID with your Oodle instance ID
  3. API_KEY with your Oodle API key

3. Instrument your application

For Kubernetes deployments with zero-code instrumentation:
apiVersion: openlit.io/v1alpha1
kind: AutoInstrumentation
metadata:
  name: oodle-instrumentation
  namespace: default
spec:
  selector:
    matchLabels:
      instrument: "true"        # instrument pods that carry this label
  python:
    instrumentation:
      provider: "openlit"       # use OpenLIT as the Python instrumentation provider
      version: "latest"
  otlp:
    endpoint: "YOUR_OTELCOL_URL:4318"   # OTLP/HTTP endpoint of your OpenTelemetry Collector
    timeout: 30
  resource:
    environment: "production"   # deployment environment reported with the telemetry
    serviceName: "my-ai-service"  # service name reported with the telemetry
Replace:
  1. YOUR_OTELCOL_URL:4318 with your OpenTelemetry Collector HTTP endpoint.
    • Example (Local): http://127.0.0.1:4318
    • Example (Remote): http://otel-collector.company.com:4318
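The AutoInstrumentation resource above only selects pods whose labels match its selector, so each workload opts in by carrying the instrument: "true" label on its pod template. As a hypothetical illustration (the Deployment name and image are placeholders), a matching workload could look like this:

apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-ai-service            # placeholder name
  namespace: default
spec:
  replicas: 1
  selector:
    matchLabels:
      app: my-ai-service
  template:
    metadata:
      labels:
        app: my-ai-service
        instrument: "true"       # matches the AutoInstrumentation selector above
    spec:
      containers:
        - name: app
          image: registry.example.com/my-ai-service:latest   # placeholder image

Pods created from this Deployment pick up the OpenLIT instrumentation automatically and export telemetry to the Collector endpoint configured in the otlp section above.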
Refer to the OpenLIT Operator Documentation for more advanced configurations and use cases.

4. Import Oodle Dashboard

Once your LLM application is instrumented, you can visualize the metrics in Oodle using our pre-built dashboard:
  1. Log into your Oodle Instance
  2. Navigate to Dashboards: Click Dashboards in the primary menu
  3. Import Dashboard: Click New and select Import from the drop-down menu
  4. Copy the Dashboard JSON: Use the dashboard JSON provided below
  5. Paste and Import: Paste the JSON directly into the text area and click Import
  6. Save the Dashboard: Save the imported dashboard
Your OpenLIT-instrumented AI applications will appear in Oodle with comprehensive metrics including:
  • LLM Request Rates: Monitor request volume and patterns
  • Usage Costs: Track AI model costs and budget allocation
  • Token Consumption: Analyze input/output token usage
  • Model Performance: Compare performance across different models
  • VectorDB Operations: Monitor database requests and operations
  • Application Breakdown: View metrics by application and environment
Currently, Oodle supports metrics only. Trace data is handled by the debug exporter for logging purposes.