LLM Observability with OneUptime and OpenLIT
To send OpenTelemetry metrics and traces generated by the OpenLIT SDK directly from your AI application to OneUptime, follow the steps below.
1. Get your OneUptime OpenTelemetry Credentials

- Sign in to your OneUptime account.
- Click on More in the navigation bar, then click on Project Settings.
- On the Telemetry Ingestion Key page, click on Create Ingestion Key to create a token.
- Once you have created a token, click on View to view and copy it.
2. Add the following two lines to your application code:
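A minimal sketch of those two lines, assuming OneUptime's OTLP ingestion endpoint is https://oneuptime.com/otlp and the token is passed via an x-oneuptime-token header (verify the exact endpoint and header name on your OneUptime project's Telemetry pages):

```python
import openlit

# Endpoint and header name below are assumptions -- confirm them against your
# OneUptime project's Telemetry settings before deploying.
openlit.init(otlp_endpoint="https://oneuptime.com/otlp",
             otlp_headers="x-oneuptime-token=YOUR_ONEUPTIME_SERVICE_TOKEN")
```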
Replace YOUR_ONEUPTIME_SERVICE_TOKEN with the OneUptime Ingestion Key value you copied in Step 1.
Refer to the OpenLIT Python SDK repository for more advanced configurations and use cases.
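For instance, OpenLIT auto-instruments popular LLM clients such as OpenAI, so no further code changes are needed beyond the initialization call. A sketch of an end-to-end instrumented call, assuming the openai package is installed, an OPENAI_API_KEY is set in the environment, and the same assumed endpoint and header values as above:

```python
import openlit
from openai import OpenAI

# Endpoint and header name are assumptions -- use the values from your
# OneUptime project's Telemetry settings.
openlit.init(otlp_endpoint="https://oneuptime.com/otlp",
             otlp_headers="x-oneuptime-token=YOUR_ONEUPTIME_SERVICE_TOKEN")

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# This call is traced automatically by OpenLIT; the resulting spans and
# metrics are exported to OneUptime over OTLP.
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Say hello to OneUptime."}],
)
print(response.choices[0].message.content)
```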
3. Visualize in OneUptime
Once your LLM application is instrumented, you should see the traces and metrics on the OneUptime telemetry traces page.