Meet Otter 🦦 — your AI-powered observability copilot built into OpenLIT. Otter lets you interact with your observability platform using natural language. It can query your telemetry data, create and manage platform resources, and generate complete dashboards — all through conversation.

Documentation Index
Fetch the complete documentation index at: https://docs.openlit.io/llms.txt
Use this file to discover all available pages before exploring further.
Meet Otter
Otter is always just a click away via the floating chat button (bottom-right corner of every page) or from the Chat entry in the sidebar. Otter paddles through your traces, metrics, and logs so you don’t have to.

Key Features
- Natural Language Queries: Ask questions about your traces, metrics, costs, and tokens — the AI converts them to ClickHouse SQL and executes them
- Resource Management: Create and manage rules, contexts, prompts, vault secrets, and custom models through conversation
- Dashboard Generation: Describe the dashboard you want and get a complete importable layout with working queries
- SQL Execution: Generated queries run automatically with results displayed inline as tables and charts
- Save as Widget: Save any query result as a reusable dashboard widget
Get Started
Configure AI Provider
Navigate to Chat → Settings (gear icon in chat sidebar) and configure:
- Provider — Select from 14 supported providers (OpenAI, Anthropic, Google, etc.)
- Model — Choose a model from the provider’s list (includes custom models from Manage Models)
- API Key — Select a vault secret containing your provider’s API key
Store your API key in the Vault first, then select it in Chat Settings.
Start a Conversation
Click New Chat in the sidebar or use the floating chat button available on any page.

Try example prompts like:
- “What are my top 5 most expensive models this week?”
- “Show me error rate by provider over the last 24 hours”
- “Create a rule that triggers when request duration exceeds 5 seconds”
Data Queries
Ask questions about your OpenTelemetry data and the AI generates ClickHouse SQL with dynamic time filters.

Supported Tables
| Table | Description |
|---|---|
| otel_traces | Traces and spans (LLM calls, API requests, etc.) |
| otel_metrics_gauge | Gauge metrics (GPU temperature, memory, etc.) |
| otel_metrics_sum | Counter/sum metrics (token counts, etc.) |
| otel_metrics_histogram | Histogram metrics (request durations, etc.) |
Example Queries
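The sketch below shows the kind of ClickHouse SQL the assistant generates for a question like “show me request volume by model over the last 24 hours.” The attribute key follows the OpenTelemetry GenAI semantic conventions but is illustrative — your span attribute names may differ:

```sql
-- Illustrative: request count and average duration by model, last 24 hours
SELECT
    SpanAttributes['gen_ai.request.model'] AS model,
    count() AS requests,
    avg(Duration) / 1e6 AS avg_duration_ms
FROM otel_traces
WHERE Timestamp >= now() - INTERVAL 24 HOUR
GROUP BY model
ORDER BY requests DESC
LIMIT 10
```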
Save as Widget
Click Save as Widget on any query result to:
- Choose a widget type (Stat Card, Bar Chart, Line Chart, Pie Chart, Area Chart, Table)
- Select a dashboard to add it to
- The query is automatically converted to use dynamic time filters ({{filter.timeLimit.start/end}})
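After conversion, a fixed time range such as now() - INTERVAL 24 HOUR is swapped for the Mustache placeholders, so the widget follows the dashboard’s time picker. A hedged sketch (attribute names are illustrative):

```sql
-- Illustrative: widget query bound to the dashboard's selected time range
SELECT
    SpanAttributes['gen_ai.request.model'] AS model,
    count() AS requests
FROM otel_traces
WHERE Timestamp >= parseDateTimeBestEffort('{{filter.timeLimit.start}}')
  AND Timestamp <= parseDateTimeBestEffort('{{filter.timeLimit.end}}')
GROUP BY model
ORDER BY requests DESC
```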
Resource Management
The chat can create and manage platform resources through natural language. All operations execute immediately.

Supported Operations
| Resource | Operations |
|---|---|
| Rules | Create, update, delete, list, get details, link/unlink entities |
| Contexts | Create, update, delete, list |
| Prompts | Create, update version, delete, list |
| Vault Secrets | Create, update, delete, list |
| Custom Models | Create, update, delete, list |
Examples
Vault key names are automatically normalized to UPPER_SNAKE_CASE (e.g., “my openai key” becomes “MY_OPENAI_KEY”).
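That normalization amounts to upper-casing the name and collapsing runs of non-alphanumeric characters into underscores. A minimal sketch of the idea — not OpenLIT’s actual implementation:

```python
import re

def normalize_vault_key(name: str) -> str:
    """Convert a free-form name to UPPER_SNAKE_CASE (illustrative sketch)."""
    # Collapse non-alphanumeric runs into single underscores, trim any
    # leading/trailing underscores, then upper-case the result.
    return re.sub(r"[^A-Za-z0-9]+", "_", name).strip("_").upper()

print(normalize_vault_key("my openai key"))  # MY_OPENAI_KEY
```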
Dashboard Generation
Describe the dashboard you want and the AI generates a complete importable layout.

How it Works
- Ask: “Create a GPU monitoring dashboard with utilization, memory, and temperature widgets”
- The AI generates a dashboard JSON with:
  - Widget definitions (type, title, SQL query, display properties)
  - Grid layout positions
  - Dynamic time filters ({{filter.timeLimit.start/end}})
- A dashboard card appears with:
  - Dashboard title and widget count
  - Widget preview chips
  - Import Dashboard button
  - Download JSON button
- Click Import Dashboard to create it instantly
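The generated JSON is shaped roughly like the fragment below. The field names here are illustrative placeholders, not OpenLIT’s exact dashboard schema:

```json
{
  "title": "GPU Monitoring",
  "widgets": [
    {
      "type": "line_chart",
      "title": "GPU Utilization",
      "query": "SELECT Timestamp, Value FROM otel_metrics_gauge WHERE MetricName = 'gpu.utilization' AND Timestamp >= parseDateTimeBestEffort('{{filter.timeLimit.start}}') AND Timestamp <= parseDateTimeBestEffort('{{filter.timeLimit.end}}')",
      "layout": { "x": 0, "y": 0, "w": 6, "h": 4 }
    }
  ]
}
```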
Dashboard Queries
All generated queries use the Mustache template pattern ({{filter.timeLimit.start/end}}) for dynamic time filtering.

Cost & Token Tracking
Each conversation tracks:
- Per-message: Input tokens, output tokens, estimated cost
- Per-conversation: Accumulated totals displayed in the sidebar
Conversation Management
- URL-based navigation: Each conversation has a unique URL (/chat?id=<uuid>)
- Auto-generated titles: Titles are generated after the first message exchange
- Persistent history: Conversations and messages are stored in ClickHouse
- Search: Filter conversations by title in the sidebar
- Delete: Hover over a conversation and click the trash icon
Configuration
Chat Settings
Access via the gear icon in the chat sidebar or navigate to /chat/settings.
| Setting | Description |
|---|---|
| Provider | LLM provider (OpenAI, Anthropic, Google, Mistral, Cohere, Groq, etc.) |
| Model | Model from the provider’s list (includes custom models) |
| API Key | Vault secret reference for the provider’s API key |
Supported Providers
OpenAI, Anthropic, Google, Mistral, Cohere, Groq, Perplexity, Azure OpenAI, Together AI, Fireworks AI, DeepSeek, xAI, Hugging Face, Replicate

Security
- All SQL queries execute in readonly mode — no write operations possible
- Queries are validated against an allowlist of tables (otel_traces, otel_metrics_*)
- DDL/DML statements (INSERT, UPDATE, DELETE, DROP, etc.) are rejected
- API keys are stored in the Vault and never sent to the client
- Query results have a LIMIT cap of 10,000 rows
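A guard combining these checks can be sketched as follows. This is a simplification for illustration, not OpenLIT’s actual validator (a production version would parse the SQL rather than pattern-match it):

```python
import re

# Tables the assistant is allowed to read from (per the allowlist above).
ALLOWED_TABLES = re.compile(r"^(otel_traces|otel_metrics_\w+)$")
# Statement keywords that indicate a write or schema change.
FORBIDDEN = re.compile(r"\b(INSERT|UPDATE|DELETE|DROP|ALTER|TRUNCATE|CREATE)\b",
                       re.IGNORECASE)
ROW_CAP = 10_000

def validate_query(sql: str) -> str:
    """Reject DDL/DML and unknown tables; enforce a row-count cap."""
    if FORBIDDEN.search(sql):
        raise ValueError("DDL/DML statements are rejected")
    # Naive table extraction: every identifier following FROM or JOIN.
    for table in re.findall(r"\b(?:FROM|JOIN)\s+([\w.]+)", sql, re.IGNORECASE):
        if not ALLOWED_TABLES.match(table):
            raise ValueError(f"table not in allowlist: {table}")
    # Append a LIMIT cap when the query does not already specify one.
    if not re.search(r"\bLIMIT\s+\d+", sql, re.IGNORECASE):
        sql = f"{sql.rstrip().rstrip(';')} LIMIT {ROW_CAP}"
    return sql

print(validate_query("SELECT count() FROM otel_traces"))
```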

