Quickstart - LLM tracing
LLM tracing "Hello world."
This quickstart shows how to instrument a simple LLM app to send inputs and outputs to Evidently Cloud. You will use the open-source Tracely library.
You will need an OpenAI key to create a toy LLM app.
Set up your Evidently Cloud workspace:
Sign up. If you do not have one yet, sign up for a free Evidently Cloud account.
Create an Organization. When you log in the first time, create and name your Organization.
Create a Project. Click the + button under Project List. Create a Project, then copy and save the Project ID.
Get your API token. Click the Key icon in the left menu. Generate and save the token.
You can now go to your Python environment.
Install the Tracely library to instrument your app:
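For example, using pip (the package name is assumed to be `tracely`):

```shell
pip install tracely
```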
Install the Evidently library to interact with Evidently Cloud:
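For example:

```shell
pip install evidently
```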
Install the OpenAI library to create a toy app:
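For example:

```shell
pip install openai
```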
Imports:
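A minimal set of imports for this walkthrough might look like this (assuming the Tracely API exposes `init_tracing` and `trace_event`):

```python
import os

from openai import OpenAI
from tracely import init_tracing, trace_event
```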
Initialize the OpenAI client. Pass the API key as an environment variable:
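A sketch, assuming the key is stored in an environment variable named `OPENAI_API_KEY`:

```python
import os

from openai import OpenAI

# Reads the key from the OPENAI_API_KEY environment variable (assumed name)
client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
```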
Set up tracing parameters. Give it a name to identify your tracing dataset.
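A sketch of the tracing setup. The parameter names (`address`, `api_key`, `project_id`, `export_name`) are assumptions about the Tracely API, and the placeholder values must be replaced with your own:

```python
import os

from tracely import init_tracing

init_tracing(
    address="https://app.evidently.cloud",    # Evidently Cloud endpoint
    api_key=os.environ["EVIDENTLY_API_KEY"],  # API token saved earlier (assumed variable name)
    project_id="YOUR_PROJECT_ID",             # Project ID saved earlier
    export_name="llm_tracing_quickstart",     # name of your tracing dataset
)
```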
Create a simple function to send questions to the OpenAI API and receive a completion. Set the questions list:
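For instance, a small list of toy questions (the contents are illustrative):

```python
# Toy questions to send to the LLM app
question_list = [
    "What is the capital of France?",
    "Who wrote the novel '1984'?",
    "How does photosynthesis work?",
]
```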
Create a function and use the trace_event() decorator to trace it:
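A sketch of such a function, assuming the OpenAI `client` and the `question_list` created in the previous steps; the function name and model name are illustrative:

```python
@trace_event()
def pseudo_assistant(question):
    # Send the question to the chat completions endpoint and return the answer text
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[{"role": "user", "content": question}],
    )
    return response.choices[0].message.content

# Calling the decorated function records its inputs and outputs as traces
for question in question_list:
    pseudo_assistant(question)
```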
Go to Evidently Cloud, open Datasets in the left menu, and view your Traces.
Want to run evaluations over this data? See a Quickstart.
Check out a more in-depth tutorial to learn more about tracing.