OpenLIT uses OpenTelemetry Auto-Instrumentation to help you monitor LLM applications built using models from Amazon Bedrock. This includes tracking performance, token usage, costs, and how users interact with the application.

Auto-instrumentation means you don’t have to set up monitoring manually for different LLMs, frameworks, or databases. By simply adding OpenLIT to your application, all the necessary monitoring configurations are set up automatically.

The integration is compatible with:

  • Boto3 Python SDK client >= 1.34.93
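If you are unsure whether your installed boto3 meets the minimum version above, a quick check like the following works (the helper function is illustrative, not part of OpenLIT):

```python
def meets_minimum(version, minimum=(1, 34, 93)):
    # Parse an "X.Y.Z" version string into an int tuple and compare
    # lexicographically against the minimum supported Boto3 release.
    return tuple(int(p) for p in version.split(".")) >= minimum

if __name__ == "__main__":
    import boto3
    print(meets_minimum(boto3.__version__))
```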

Get Started

1. Install OpenLIT

Open your command line or terminal and run:

pip install openlit
2. Initialize OpenLIT in your Application

You can set up OpenLIT in your application using either function arguments directly in your code or by using environment variables.

Add the following two lines to your application code:

import openlit

openlit.init(
  otlp_endpoint="YOUR_OTEL_ENDPOINT", 
)

Replace YOUR_OTEL_ENDPOINT with the URL of your OpenTelemetry backend, such as http://127.0.0.1:4318 if you are using OpenLIT with a local OTel Collector.
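If you prefer environment variables over function arguments, the endpoint can be supplied through the standard OTLP exporter variable instead (the variable name below follows the OpenTelemetry convention; consult the OpenLIT documentation for the full list your version supports):

```shell
# Equivalent to passing otlp_endpoint to openlit.init();
# with this set, openlit.init() can be called with no arguments.
export OTEL_EXPORTER_OTLP_ENDPOINT="http://127.0.0.1:4318"
```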

To send metrics and traces to other Observability tools, refer to the Connections Guide.
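Once initialized, OpenLIT instruments Bedrock calls made through the boto3 client automatically, with no extra wrapping. A minimal end-to-end sketch is shown below; the region, model ID, and endpoint are illustrative assumptions, so substitute a model your AWS account actually has access to:

```python
import json

def build_claude_body(prompt, max_tokens=256):
    # Request body in the Anthropic Messages format used by
    # Claude models on Bedrock (example format; other model
    # families on Bedrock expect different request schemas).
    return json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    })

if __name__ == "__main__":
    import boto3
    import openlit

    # Point OpenLIT at a local OTel Collector (endpoint is an example).
    openlit.init(otlp_endpoint="http://127.0.0.1:4318")

    # Calls made through this client are traced automatically.
    client = boto3.client("bedrock-runtime", region_name="us-east-1")
    response = client.invoke_model(
        modelId="anthropic.claude-3-haiku-20240307-v1:0",  # example model ID
        body=build_claude_body("Hello, Bedrock!"),
    )
    print(json.loads(response["body"].read()))
```

Running this requires valid AWS credentials and Bedrock model access in the chosen region; the resulting trace, token counts, and cost estimates appear in whichever backend the OTLP endpoint points to.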

For more advanced configurations and application use cases, visit the OpenLIT Python repository.