Environment variables
Sets the host address of the ClickHouse server for OpenLIT to connect to.
Sets the port on which ClickHouse listens.
Sets the name of the database in ClickHouse to be used by OpenLIT.
Sets the username for authenticating with ClickHouse.
Sets the password for authenticating with ClickHouse.
Sets the location where SQLite data is stored.
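Taken together, the ClickHouse settings above typically live in an environment file. A sketch is shown below — the variable names are illustrative placeholders (the exact keys were not preserved here; consult the OpenLIT documentation for the names your version expects), and 8123 is ClickHouse's default HTTP port:

```shell
# Illustrative variable names only — check the OpenLIT docs for the exact keys.
CLICKHOUSE_HOST=127.0.0.1        # host address of the ClickHouse server
CLICKHOUSE_PORT=8123             # port on which ClickHouse listens (default HTTP port)
CLICKHOUSE_DATABASE=openlit      # database used by OpenLIT (placeholder name)
CLICKHOUSE_USERNAME=default      # username for authenticating with ClickHouse
CLICKHOUSE_PASSWORD=changeme     # password for authenticating with ClickHouse
```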
OAuth authentication variables
For detailed OAuth setup instructions, see the OAuth Authentication Setup guide.
NEXTAUTH_URL
Sets the canonical URL of your site for NextAuth.js authentication.
NEXTAUTH_SECRET
Used to encrypt the NextAuth.js JWT tokens and email verification hashes. Generate with:
openssl rand -base64 32
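The command above can feed the variable directly — for example, a sketch assuming a POSIX shell and a writable `.env` in the current directory:

```shell
# Generate a random 32-byte secret (base64-encodes to 44 characters)
# and append it to .env as NEXTAUTH_SECRET.
secret="$(openssl rand -base64 32)"
printf 'NEXTAUTH_SECRET=%s\n' "$secret" >> .env
```
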
GOOGLE_CLIENT_ID
Google OAuth client ID for Google sign-in integration.
GOOGLE_CLIENT_SECRET
Google OAuth client secret for Google sign-in integration.
GITHUB_CLIENT_ID
GitHub OAuth client ID for GitHub sign-in integration.
GITHUB_CLIENT_SECRET
GitHub OAuth client secret for GitHub sign-in integration.
Environment file placement
Environment variables can be configured in multiple ways depending on your deployment method.
Development setup
1. Client-side .env
Create a .env file in the src/client/ directory for development. This file is automatically loaded by Next.js during development.
2. Docker Compose .env
Create a .env file in the same directory as your docker-compose.yml file. This file is automatically loaded by Docker Compose.
3. Development Docker Compose .env
For development Docker setup, create a .env file alongside src/dev-docker-compose.yml.
Production setup
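For the Docker Compose case, the .env file can be consumed implicitly (Docker Compose reads a .env file in its own directory for variable substitution) or passed into the container explicitly with env_file. A minimal sketch — the service name and image reference are placeholders, not confirmed values:

```yaml
# Hypothetical docker-compose.yml excerpt — service name and image are placeholders.
services:
  openlit:
    image: openlit/openlit:latest   # placeholder image reference
    env_file:
      - .env                        # inject every variable from .env into the container
```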
For production deployments, set environment variables directly in your hosting platform or container orchestration system (Kubernetes, Docker Swarm, etc.).
Sample environment file (.env)
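The original sample file was not preserved here; the sketch below covers only the OAuth variables documented above, with placeholder values you must replace (the localhost URL is an assumed development default):

```shell
# Sample .env (OAuth portion) — replace every placeholder value.
NEXTAUTH_URL=http://localhost:3000          # canonical URL of your site (assumed dev default)
NEXTAUTH_SECRET=replace-with-openssl-output # generate with: openssl rand -base64 32
GOOGLE_CLIENT_ID=your-google-client-id
GOOGLE_CLIENT_SECRET=your-google-client-secret
GITHUB_CLIENT_ID=your-github-client-id
GITHUB_CLIENT_SECRET=your-github-client-secret
```

Add the ClickHouse and SQLite settings described earlier alongside these, using the variable names your OpenLIT version expects.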
Create a dashboard
Create custom visualizations with flexible widgets, queries, and real-time AI monitoring
Manage prompts
Version, deploy, and collaborate on prompts with centralized management and tracking
LLM playground
Compare cost, duration, and response tokens across different LLMs to find the most efficient model
Running in Kubernetes? Try the OpenLIT Operator
Automatically inject instrumentation into existing workloads without modifying pod specs, container images, or application code.