1. Get your Credentials
- Sign in to your Middleware account: Go to Middleware Dashboard
- Navigate to API Keys: Go to Settings → API Keys
- Copy your credentials:
  - MW_API_KEY: Your Middleware API key for authentication
  - MW_TARGET: Your Middleware target URL endpoint

Save these values; you’ll need them for configuration.
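If you deploy on Kubernetes (see Step 2), one common pattern is to keep these credentials out of your manifests by storing them in a Kubernetes Secret. This is only an illustrative sketch: the Secret name `mw-credentials` and the placeholder values are assumptions, not part of the Middleware setup itself.

```yaml
# Illustrative sketch: store the credentials from Step 1 in a Secret.
# The name "mw-credentials" and the values below are placeholders.
apiVersion: v1
kind: Secret
metadata:
  name: mw-credentials
type: Opaque
stringData:
  MW_API_KEY: "dxyxsdojzrgpsvizzzcsvhrwnmzqdsdsd"  # your API key from Step 1
  MW_TARGET: "https://abcd.middleware.io:443"      # your target URL from Step 1
```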
2. Instrument your application
For Kubernetes deployments with zero-code instrumentation, set the following environment variables in your deployment configuration (see the sketch after this list):
- Set MW_TARGET to your Middleware target URL from Step 1.
  - Example: https://abcd.middleware.io:443
- Set MW_API_KEY to your Middleware API key from Step 1.
  - Example: dxyxsdojzrgpsvizzzcsvhrwnmzqdsdsd
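The exact manifest depends on how you deploy, but as a minimal sketch, assuming the zero-code instrumentation reads MW_API_KEY and MW_TARGET from the container environment, a Deployment fragment might look like the following. The Deployment name, image, and the `mw-credentials` Secret are placeholders, not Middleware-defined names; you can also set the values inline with `value:` instead of referencing a Secret.

```yaml
# Minimal illustrative Deployment fragment: names and image are placeholders.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-llm-app
spec:
  replicas: 1
  selector:
    matchLabels:
      app: my-llm-app
  template:
    metadata:
      labels:
        app: my-llm-app
    spec:
      containers:
        - name: my-llm-app
          image: registry.example.com/my-llm-app:latest
          env:
            # Credentials from Step 1, read here from a Secret named "mw-credentials"
            # (or set them inline with "value:" if you prefer).
            - name: MW_API_KEY
              valueFrom:
                secretKeyRef:
                  name: mw-credentials
                  key: MW_API_KEY
            - name: MW_TARGET
              valueFrom:
                secretKeyRef:
                  name: mw-credentials
                  key: MW_TARGET
```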
3. Visualize in Middleware
Once your LLM application is instrumented, you can explore the telemetry data in Middleware:
- Navigate to LLM Observability: Go to your Middleware Dashboard and click on LLM Observability in the sidebar
- Explore AI Operations: View your AI application traces, including:
  - LLM request traces with detailed timing
  - Token usage and cost information
  - Vector database operations
  - Model performance analytics
  - Request/response payloads (if enabled)
- Custom Dashboards: Create custom dashboards for your specific LLM metrics
- Alerting: Set up alerts for LLM performance anomalies and cost thresholds
- Performance Analysis: Analyze latency, throughput, and resource usage patterns