DeepSeek provides two OpenAI-compatible chat models: deepseek-chat (for general conversations) and deepseek-reasoner (for advanced reasoning tasks). Using LangSmith allows you to debug, monitor, and evaluate your LLM applications by capturing structured traces of inputs, outputs, and metadata.
This guide shows you how to integrate DeepSeek with LangSmith in both Python and TypeScript, using LangSmith's `@traceable` (Python) and `traceable(...)` (TypeScript) utilities to log LLM calls automatically.
Installation
Install the OpenAI SDK and LangSmith. DeepSeek's API is OpenAI-compatible, so you use the OpenAI client pointed at DeepSeek's base URL (https://api.deepseek.com/v1) instead of OpenAI's endpoint.
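For Python, a typical install looks like:

```bash
pip install -U openai langsmith
```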
Setup
Set your API keys and project name (see the example after this list):
- Ensure you have a DeepSeek API key from your DeepSeek account.
- Set `LANGSMITH_TRACING=true` and provide your LangSmith API key (`LANGSMITH_API_KEY`) to activate automatic logging of traces.
- Specify a `LANGSMITH_PROJECT` name to organize traces by project; if not set, traces go to the default project (named "default").
- The `LANGSMITH_TRACING` flag must be `true` for any traces to be recorded.
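For example, in a shell (the `DEEPSEEK_API_KEY` variable name is an illustrative choice; the LangSmith variables are the ones described above):

```bash
export LANGSMITH_TRACING=true
export LANGSMITH_API_KEY="<your-langsmith-api-key>"
export LANGSMITH_PROJECT="deepseek-demo"   # optional; traces go to "default" if unset
export DEEPSEEK_API_KEY="<your-deepseek-api-key>"
```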
Configure tracing
- Instrument the DeepSeek API call with LangSmith. In your script, create an OpenAI client configured to use DeepSeek's API endpoint and wrap a call in a traced function:
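Here is a minimal Python sketch of such an instrumented call (the TypeScript version is analogous; the `deepseek_chat` function name and the `DEEPSEEK_API_KEY` environment variable are illustrative choices):

```python
import os

from langsmith import traceable
from openai import OpenAI

# Route OpenAI-compatible requests to DeepSeek's endpoint.
client = OpenAI(
    api_key=os.environ["DEEPSEEK_API_KEY"],  # illustrative env var for your DeepSeek key
    base_url="https://api.deepseek.com/v1",
)

@traceable(
    run_type="llm",
    name="DeepSeek Chat Completion",
    metadata={"ls_provider": "deepseek", "ls_model_name": "deepseek-chat"},
)
def deepseek_chat(messages):
    # Each invocation of this function is logged to LangSmith as an "llm" run.
    response = client.chat.completions.create(
        model="deepseek-chat",
        messages=messages,
    )
    # Return the message object; LangSmith records it as the run's output.
    return response.choices[0].message

if __name__ == "__main__":
    reply = deepseek_chat([{"role": "user", "content": "Hello, DeepSeek!"}])
    print(reply.content)
```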
In this example, you use the OpenAI SDK to interact with DeepSeek's API. The OpenAI client is configured with `base_url="https://api.deepseek.com/v1"` to route requests to DeepSeek's endpoint while maintaining OpenAI-compatible syntax. The `@traceable` decorator (Python) or `traceable` function (TypeScript) wraps your function so that each invocation is logged as a trace run of type `"llm"`. The `metadata` parameter tags the trace with:
- `ls_provider`: identifies the provider (DeepSeek) for filtering traces.
- `ls_model_name`: specifies the model used for cost tracking and analytics.
The function returns the message object (`response.choices[0].message`), which includes the response content along with metadata like the role and any additional fields. LangSmith automatically captures:
- Input messages sent to the model.
- The model’s complete response (content, role, etc.).
- Model name and token usage statistics.
- Execution timing and any errors.
- Execute your script to generate a trace:
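Assuming you saved the example as `app.py` (an illustrative filename):

```bash
python app.py
```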
The function call will reach out to DeepSeek's API, and because of the `@traceable`/`traceable` wrapper, LangSmith will log this call's inputs and outputs as a new trace. You'll find the model's response printed to the console, and a corresponding run will appear in the LangSmith UI.
View traces in LangSmith
After running the example, you can inspect the recorded traces in the LangSmith UI:
- Open the LangSmith UI and log in to your account.
- Select the project you used for this integration (for example, the name set in `LANGSMITH_PROJECT`, or "default" if you didn't set one).
- Find the trace corresponding to your DeepSeek API call. It will be identified by the function name (`DeepSeek Chat Completion`).
- Click on the trace to open it. You'll be able to inspect the model input and output, including the prompt messages you sent and the response from DeepSeek, as well as timing information (latency) and token usage.