OpenLIT automatically instruments LLMs, VectorDBs, MCP, and frameworks by default.
Step 1: Deploy OpenLIT

Clone the OpenLIT repository.
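The project is hosted on GitHub under the openlit organization; assuming the public repo and its default branch, the clone step would look like:

```shell
# Clone the OpenLIT source, which bundles the Docker Compose stack
git clone https://github.com/openlit/openlit.git
cd openlit
```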
Start Docker Compose. From the root directory of the OpenLIT repo, run the Docker Compose start command.
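Assuming the compose file shipped at the repo root, starting the stack in detached mode is likely:

```shell
# Start the OpenLIT services (UI, database, collector) in the background
docker compose up -d
```

You can confirm the containers are running with `docker compose ps`.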
Step 2: Instrument your AI application
Not sure which method to choose? Check out Instrumentation Methods to understand the differences.
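As a minimal sketch of SDK-based instrumentation, assuming the OpenLIT Python SDK (`pip install openlit`) and the default local OTLP endpoint exposed by the Docker Compose stack:

```python
# Assumed minimal setup: openlit.init() auto-instruments supported LLM,
# VectorDB, and framework clients and exports telemetry over OTLP.
import openlit

# Point the SDK at the locally deployed OpenLIT collector (assumed endpoint).
openlit.init(otlp_endpoint="http://127.0.0.1:4318")
```

After initialization, your existing LLM calls are traced without further code changes.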
Step 3: Monitor, debug, and test the quality of your AI applications

With real-time LLM observability data now flowing into OpenLIT, you can visualize comprehensive AI performance metrics, including token costs, latency patterns, hallucination rates, and model accuracy, to optimize your production AI applications. Open 127.0.0.1:3000 in your browser to start exploring. You can log in using the default credentials:

Email: user@openlit.io
Password: openlituser

