To send the OpenTelemetry metrics and traces that OpenLIT generates from your AI application to an OpenTelemetry Collector, follow the steps below.

The OpenTelemetry Collector is a vendor-agnostic way to receive, process, and export telemetry data. It can act as an intermediary that routes your OpenLIT data to multiple backends or applies processing transformations.
```bash
# Download and run the collector binary
curl -LO https://github.com/open-telemetry/opentelemetry-collector-releases/releases/latest/download/otelcol-contrib_linux_amd64.tar.gz
tar -xzf otelcol-contrib_linux_amd64.tar.gz
./otelcol-contrib --config=otel-collector-config.yaml
```
Basic Collector Configuration:
Create an otel-collector-config.yaml file:
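A minimal sketch of such a file is shown below: it accepts OTLP over gRPC and HTTP on the standard ports and echoes everything to stdout via the `debug` exporter. Swap `debug` for real backend exporters once you have verified data is flowing.

```yaml
# otel-collector-config.yaml — minimal sketch; adjust to your environment
receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317
      http:
        endpoint: 0.0.0.0:4318

processors:
  batch: {}

exporters:
  debug: {}   # prints received telemetry to stdout; replace with your backend exporter

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [debug]
    metrics:
      receivers: [otlp]
      processors: [batch]
      exporters: [debug]
```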
```bash
# Set environment variables (these take precedence over CLI arguments)
export OTEL_EXPORTER_OTLP_ENDPOINT="YOUR_COLLECTOR_ENDPOINT"
export OTEL_SERVICE_NAME="my-ai-service"
export OTEL_DEPLOYMENT_ENVIRONMENT="production"

# Run your application
openlit-instrument python app.py
```
Replace:
`YOUR_COLLECTOR_ENDPOINT` with your OpenTelemetry Collector endpoint from Step 1.
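To illustrate how that endpoint is used, here is a small plain-Python sketch (not part of the OpenLIT SDK) of the way OTLP/HTTP exporters conventionally resolve their target URLs: the environment variable wins over any CLI-supplied value, and a fixed `/v1/<signal>` path is appended per signal. The function name and default port are assumptions for illustration.

```python
import os

def resolve_otlp_urls(cli_endpoint=None):
    """Resolve OTLP/HTTP signal URLs; the env var takes precedence over a CLI value."""
    base = os.environ.get("OTEL_EXPORTER_OTLP_ENDPOINT",
                          cli_endpoint or "http://localhost:4318")
    base = base.rstrip("/")
    # OTLP/HTTP appends a standard path per signal
    return {"traces": f"{base}/v1/traces", "metrics": f"{base}/v1/metrics"}

os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = "http://collector:4318"
urls = resolve_otlp_urls(cli_endpoint="http://ignored:4318")
print(urls["traces"])  # http://collector:4318/v1/traces
```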
Refer to the OpenLIT Python SDK repository for more advanced configurations and use cases.
Once your LLM application is sending data to the OpenTelemetry Collector, configure exporters to forward that data to your preferred observability backends.

Popular Exporter Configurations:
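As one hedged example, the fragment below forwards traces over OTLP/HTTP to a backend and exposes metrics on a Prometheus scrape endpoint. The backend URL and the `BACKEND_API_KEY` environment variable are placeholders you must substitute; exporter availability depends on your Collector distribution (both are included in `otelcol-contrib`).

```yaml
exporters:
  otlphttp:
    endpoint: https://otlp.example.com        # placeholder backend endpoint
    headers:
      Authorization: "Bearer ${env:BACKEND_API_KEY}"   # placeholder credential
  prometheus:
    endpoint: 0.0.0.0:8889                    # scrape target for a Prometheus server

service:
  pipelines:
    traces:
      receivers: [otlp]
      exporters: [otlphttp]
    metrics:
      receivers: [otlp]
      exporters: [prometheus]
```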
To verify that the Collector is running and receiving data, check its logs and health endpoint:

```bash
# Check collector logs
docker logs <collector-container-id>

# Or for Kubernetes
kubectl logs -l app.kubernetes.io/name=opentelemetry-collector

# Health check endpoint (if enabled)
curl http://localhost:13133/
```
Benefits of Using OpenTelemetry Collector:
- Vendor Agnostic: Route data to multiple backends simultaneously
- Data Processing: Apply transformations, filtering, and sampling
- Protocol Translation: Convert between different telemetry formats
- Buffering & Reliability: Handle network issues and backend outages
- Cost Optimization: Sample and filter data to reduce costs
- Security: Add authentication, encryption, and data anonymization
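The cost-optimization point above can be made concrete with a sampling processor. A minimal sketch using the contrib distribution's `probabilistic_sampler` is shown below; the 25% rate is an illustrative value, and the receiver/exporter names assume they are defined elsewhere in your config.

```yaml
processors:
  probabilistic_sampler:
    sampling_percentage: 25   # keep roughly 25% of traces
  batch: {}

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [probabilistic_sampler, batch]
      exporters: [otlphttp]
```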
Your OpenLIT-instrumented AI applications will send telemetry data to the Collector, which can then process and route it to any number of observability backends, providing flexibility and powerful data processing capabilities for your LLM monitoring infrastructure.