Quickstart - LLM tracing

LLM tracing "Hello world."


You are looking at the old Evidently documentation: this API is available with versions 0.6.7 or lower. Check the newer version here.

This quickstart shows how to instrument a simple LLM app to send inputs and outputs to Evidently Cloud. You will use the open-source Tracely library.

You will need an OpenAI key to create a toy LLM app.

Need help? Ask on Discord.

1. Set up Evidently Cloud

Set up your Evidently Cloud workspace:

- Sign up for a free Evidently Cloud account.
- Get an API token: click the Key icon in the left menu to generate and save a token.

You can now go to your Python environment.

2. Installation

Install the Tracely library to instrument your app:

!pip install tracely

Install the Evidently library to interact with Evidently Cloud:
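The command likely mirrors the Tracely install above (assuming a standard pip environment; prefix with `!` in a notebook):

```shell
pip install evidently
```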

Install the OpenAI library to create a toy app:
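Again, assuming a standard pip environment:

```shell
pip install openai
```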

Imports:
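A plausible set of imports for the steps that follow, assuming Tracely exposes `init_tracing` and `trace_event` at the top level (check against your installed version):

```python
import os

from tracely import init_tracing, trace_event
from openai import OpenAI
```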

3. Initialize Tracing

Initialize the OpenAI client. Pass the token as an environment variable:
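A minimal sketch, assuming the key is stored in the `OPENAI_API_KEY` environment variable:

```python
import os
from openai import OpenAI

# Read the key from the environment rather than hardcoding it
client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
```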

Set up tracing parameters. Give it a name to identify your tracing dataset.
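A sketch of the initialization call. The parameter names (`address`, `api_key`, `export_name`) are assumptions based on Tracely's documented API and may differ between versions:

```python
from tracely import init_tracing

init_tracing(
    address="https://app.evidently.cloud/",  # Evidently Cloud endpoint
    api_key="YOUR_EVIDENTLY_API_TOKEN",      # token from your workspace
    export_name="LLM tracing example",       # name of your tracing dataset
)
```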

4. Trace a simple function

Create a simple function that sends questions to the OpenAI API and receives completions. Set the questions list:
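For example, a small list of toy questions (the contents are illustrative; use your own):

```python
question_list = [
    "What is the capital of France?",
    "How does photosynthesis work?",
    "Why is the sky blue?",
]
```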

Create a function and use the trace_event() decorator to trace it:
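A sketch of such a function, reusing the `client` and `question_list` defined above. The function name and the model name are assumptions; substitute any model available to you:

```python
from tracely import trace_event

@trace_event()
def pseudo_assistant(question):
    # Each call to this function is captured as a trace
    # and exported to Evidently Cloud.
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumption: pick any available model
        messages=[{"role": "user", "content": question}],
    )
    return response.choices[0].message.content

# Send each question through the traced function
for question in question_list:
    pseudo_assistant(question)
```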

5. View Traces

Go to Evidently Cloud, open Datasets in the left menu (Datasets Page), and view your Traces.

What's next?

Want to run evaluations over this data? See a Quickstart.

Quickstart - LLM evaluations

Check out a more in-depth tutorial to learn more about tracing:

Tutorial - Tracing
