Installation
Install Mistral’s official library and LangSmith. The `mistralai` package provides a Mistral client for interacting with Mistral’s API.
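For example, using pip (package names as published on PyPI):

```shell
pip install -U mistralai langsmith
```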
Setup
Set your API keys and project name:
- Ensure you have a Mistral API key from your Mistral AI account (set this as `MISTRAL_API_KEY`).
- Set `LANGSMITH_TRACING=true` and provide your LangSmith API key (`LANGSMITH_API_KEY`) to activate automatic logging of traces.
- Specify a `LANGSMITH_PROJECT` name to organize traces by project; if not set, traces go to the default project (named "default").
- The `LANGSMITH_TRACING` flag must be `true` for any traces to be recorded.
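A minimal environment setup might look like the following; the placeholder values and the project name are illustrative:

```shell
export MISTRAL_API_KEY="<your-mistral-api-key>"

export LANGSMITH_TRACING=true
export LANGSMITH_API_KEY="<your-langsmith-api-key>"
# Optional: group traces under a named project (defaults to "default").
export LANGSMITH_PROJECT="mistral-tracing"
```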
Configure tracing
- Instrument the Mistral API call with LangSmith. In your script, create a Mistral client and wrap a call in a traced function:
In this example, you use the Mistral SDK to send a chat completion request (with a user prompt) and retrieve the model’s answer. The `@traceable` decorator (from the LangSmith Python SDK) wraps the `query_mistral` function so that each invocation is logged as a trace run of type `"llm"`. The `metadata={"ls_provider": "mistral", "ls_model_name": "mistral-medium-latest"}` argument tags the trace with the provider (Mistral) and the model name. You can also refer to the LangSmith JavaScript SDK.
- Execute your script to generate a trace. For example:
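Assuming the traced function lives in a script that calls `query_mistral` and prints the result (the filename here is illustrative):

```shell
python trace_mistral.py
```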
The `query_mistral("Hello, how are you?")` call will reach out to the Mistral API, and because of the `@traceable` wrapper, LangSmith will log the call’s inputs and outputs as a new trace. You’ll see the model’s response printed to the console, and a corresponding run appears in LangSmith.
View traces in LangSmith
After running the example, you can inspect the recorded traces in the LangSmith UI:
- Open the LangSmith UI and log in to your account.
- Select the project you used for this integration (for example, the name set in `LANGSMITH_PROJECT`, or "default" if you didn’t set one).
- Find the trace corresponding to your Mistral API call. It will be identified by the function name (`query_mistral`) or a custom name if provided.
- Click on the trace to open it. You’ll be able to inspect the model input and output, including the prompt messages you sent and the response from Mistral, as well as timing information (latency) and any error details if the call failed.