Configure the OpenLIT SDK for OpenTelemetry-native LLM observability, cost tracking, and performance monitoring.
Parameter | CLI Argument | Environment Variable | Description | Default | Required |
---|---|---|---|---|---|
environment | --environment | OTEL_DEPLOYMENT_ENVIRONMENT | Deployment environment | "default" | No |
service_name | --service_name | OTEL_SERVICE_NAME | Service name for tracing | "default" | No |
otlp_endpoint | --otlp_endpoint | OTEL_EXPORTER_OTLP_ENDPOINT | OpenTelemetry endpoint for LLM monitoring data export | None | No |
otlp_headers | --otlp_headers | OTEL_EXPORTER_OTLP_HEADERS | Authentication headers for enterprise monitoring backends | None | No |
disable_batch | --disable_batch | OPENLIT_DISABLE_BATCH | Disable batch span processing | False | No |
capture_message_content | --capture_message_content | OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT | Enable LLM prompt and response content tracing for debugging | True | No |
disabled_instrumentors | --disabled_instrumentors | OPENLIT_DISABLED_INSTRUMENTORS | Disable specific AI service instrumentations (comma-separated) | None | No |
disable_metrics | --disable_metrics | OPENLIT_DISABLE_METRICS | Disable cost tracking and performance metrics collection | False | No |
pricing_json | --pricing_json | OPENLIT_PRICING_JSON | Custom pricing configuration for accurate LLM cost tracking | None | No |
detailed_tracing | --detailed_tracing | OPENLIT_DETAILED_TRACING | Enable detailed AI framework and component-level tracing | True | No |
collect_system_metrics | --collect_system_metrics | OPENLIT_COLLECT_SYSTEM_METRICS | Comprehensive system monitoring (CPU, memory, disk, network, GPU) for AI workloads | False | No |
tracer | N/A | N/A | An existing OpenTelemetry Tracer instance to use for tracing operations | None | No |
event_logger | N/A | N/A | An existing OpenTelemetry EventLoggerProvider instance for emitting events | None | No |
meter | N/A | N/A | An existing OpenTelemetry Meter instance for recording metrics | None | No |
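Every parameter in the table can be supplied either as an `init()` keyword argument or through its environment variable. A minimal sketch of the environment-variable route — the `openlit.init()` call itself is left commented out because it assumes the SDK is installed, and the endpoint value is a placeholder:

```python
import os

# Configure OpenLIT through the environment variables from the table above.
# All values here are placeholders.
os.environ["OTEL_SERVICE_NAME"] = "my-ai-service"
os.environ["OTEL_DEPLOYMENT_ENVIRONMENT"] = "production"
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = "http://localhost:4318"
os.environ["OPENLIT_DISABLE_BATCH"] = "true"  # export spans immediately (useful while debugging)

# import openlit
# openlit.init()  # picks up the variables above; explicit kwargs take precedence
```

Keyword arguments override the corresponding environment variables, so the commented call could also be written as `openlit.init(service_name="my-ai-service", ...)`.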
Deprecated parameters
Parameter | CLI Argument | Environment Variable | Description | Default | Required |
---|---|---|---|---|---|
application_name | --application_name | OTEL_SERVICE_NAME | Application name for tracing (deprecated; use service_name) | "default" | No |
collect_gpu_stats | --collect_gpu_stats | OPENLIT_COLLECT_GPU_STATS | Enable GPU statistics collection (deprecated; use collect_system_metrics) | False | No |
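As a rough sketch of the migration, the deprecated keyword arguments map one-to-one onto their replacements; the helper below is illustrative, not part of the SDK API:

```python
# Illustrative mapping from deprecated init() kwargs to their replacements,
# taken from the table above.
DEPRECATED = {
    "application_name": "service_name",
    "collect_gpu_stats": "collect_system_metrics",
}

def migrate_kwargs(kwargs: dict) -> dict:
    """Rename deprecated keys; everything else passes through unchanged."""
    return {DEPRECATED.get(k, k): v for k, v in kwargs.items()}

print(migrate_kwargs({"application_name": "my-ai-app", "collect_gpu_stats": True}))
# {'service_name': 'my-ai-app', 'collect_system_metrics': True}
```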
Resource attribute environment variables
Environment Variable | Description | Example |
---|---|---|
OTEL_RESOURCE_ATTRIBUTES | Key-value pairs for resource attributes | service.version=1.0.0,deployment.environment=production |
OTEL_SERVICE_VERSION | Version of the service | 1.2.3 |
OTEL_RESOURCE_ATTRIBUTES_POD_NAME | Kubernetes pod name (if applicable) | my-ai-app-pod-xyz |
OTEL_RESOURCE_ATTRIBUTES_NODE_NAME | Kubernetes node name (if applicable) | node-123 |
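OTEL_RESOURCE_ATTRIBUTES packs multiple resource attributes into a single comma-separated list of key=value pairs, as in the example column above. A small sketch of that format (the parser is illustrative, not the SDK's):

```python
import os

# OTEL_RESOURCE_ATTRIBUTES holds comma-separated key=value pairs;
# the value below matches the example in the table.
os.environ["OTEL_RESOURCE_ATTRIBUTES"] = (
    "service.version=1.0.0,deployment.environment=production"
)

def parse_resource_attributes(raw: str) -> dict:
    """Split an OTel-style key=value list into a dict (illustrative helper)."""
    return dict(pair.split("=", 1) for pair in raw.split(",") if pair)

attrs = parse_resource_attributes(os.environ["OTEL_RESOURCE_ATTRIBUTES"])
print(attrs["service.version"])  # 1.0.0
```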
openlit.get_prompt()
Parameter | Description |
---|---|
url | Sets the OpenLIT URL. Defaults to the OPENLIT_URL environment variable. |
api_key | Sets the OpenLIT API Key. Can also be provided via the OPENLIT_API_KEY environment variable. |
name | Unique prompt identifier for retrieval. Use with prompt_id for specific prompt versioning |
prompt_id | Numeric ID for direct prompt access. Enables precise prompt version control. Optional |
version | Specific prompt version retrieval for consistent AI behavior across deployments. Optional |
shouldCompile | Enable dynamic prompt compilation with variables for personalized LLM interactions. Optional |
variables | Dynamic variables for prompt template compilation and customization. Optional |
meta_properties | Tracking metadata for prompt usage analytics and audit trails in production. Optional |
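A sketch of how these parameters fit together. The `openlit.get_prompt()` call is commented out because it assumes a running OpenLIT instance and the prompt name is hypothetical; the `{{variable}}` template syntax in the helper is an illustrative assumption for showing what compilation with variables means, not the SDK's actual compiler:

```python
# import openlit
#
# prompt = openlit.get_prompt(
#     name="support-greeting",        # unique prompt identifier (hypothetical)
#     version="1",                    # pin a version for consistent behavior
#     shouldCompile=True,             # substitute the variables below
#     variables={"user": "Alice"},
#     meta_properties={"caller": "checkout-service"},  # usage-analytics metadata
# )

# Illustrative stand-in for compilation: substitute variables into a template.
# The {{name}} placeholder syntax is an assumption made for this sketch.
def compile_template(template: str, variables: dict) -> str:
    for key, value in variables.items():
        template = template.replace("{{" + key + "}}", str(value))
    return template

print(compile_template("Hello {{user}}!", {"user": "Alice"}))  # Hello Alice!
```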
openlit.get_secrets()
Parameter | Description |
---|---|
url | Sets the OpenLIT URL. Defaults to the OPENLIT_URL environment variable. |
api_key | Sets the OpenLIT API Key. Can also be provided via the OPENLIT_API_KEY environment variable. |
key | Specific secret key retrieval for individual credential access. Optional |
should_set_env | Automatically set retrieved secrets as environment variables for seamless application integration. Optional |
tags | Tag-based secret filtering for organized credential management across different AI services. Optional |
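A sketch of what should_set_env implies, assuming the vault returns secrets as a plain dict of key-to-value pairs; the `openlit.get_secrets()` call is commented out because it requires a running OpenLIT instance, and the secret value below is a placeholder:

```python
import os

# vault = openlit.get_secrets(
#     api_key="...",            # or set OPENLIT_API_KEY
#     tags=["openai"],          # tag-based filtering
#     should_set_env=True,      # export each retrieved secret as an env var
# )

# Illustrative stand-in for should_set_env=True, assuming secrets arrive
# as a dict of key -> value.
def set_env_from_secrets(secrets: dict) -> None:
    for key, value in secrets.items():
        os.environ[key] = str(value)

set_env_from_secrets({"OPENAI_API_KEY": "sk-placeholder"})
print("OPENAI_API_KEY" in os.environ)  # True
```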