Alpha Notice: These docs cover the v1-alpha release. Content is incomplete and subject to change. For the latest stable version, see the v0 LangChain Python or LangChain JavaScript docs.
When you build agents with `create_agent()`, you get built-in observability through LangSmith, a powerful platform for tracing, debugging, evaluating, and monitoring your LLM applications.
Traces capture every step your agent takes, from the initial user input to the final response, including all tool calls, model interactions, and decision points. This enables you to debug your agents, evaluate performance, and monitor usage.
Prerequisites
Before you begin, ensure you have the following:

- A LangSmith account (free to sign up)
Enable tracing
All LangChain agents automatically support LangSmith tracing. To enable it, set the `LANGSMITH_TRACING` and `LANGSMITH_API_KEY` environment variables. You can get your API key from your LangSmith settings.
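A minimal sketch of the setup, assuming the standard LangSmith environment variables (the API key value is a placeholder):

```shell
# Turn on tracing for every LangChain invocation in this shell session.
export LANGSMITH_TRACING=true
# Authenticate against LangSmith; copy the key from your settings page.
export LANGSMITH_API_KEY="<your-api-key>"
```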
Quick start
No extra code is needed to log a trace to LangSmith. Just run your agent code as you normally would. Traces are logged to a project named `default`. To configure a custom project name, see Log to a project.
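A minimal runnable sketch of an agent invocation that produces a trace; the model identifier and the `get_weather` tool are illustrative assumptions, not part of the original:

```python
from langchain.agents import create_agent

def get_weather(city: str) -> str:
    """Illustrative tool: return a canned weather report."""
    return f"It's always sunny in {city}!"

# The model string is an assumption; use any model you have credentials for.
agent = create_agent(model="openai:gpt-4o", tools=[get_weather])

# No tracing-specific code required: with the environment variables set,
# this invocation is logged to the `default` project automatically.
agent.invoke({"messages": [{"role": "user", "content": "What is the weather in SF?"}]})
```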
Trace selectively
You may opt to trace specific invocations or parts of your application using LangSmith's `tracing_context` context manager:
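A sketch of selective tracing with `tracing_context`; the agent definition is an illustrative assumption carried over from the quick start:

```python
import langsmith
from langchain.agents import create_agent

# Illustrative agent; model string is an assumption.
agent = create_agent(model="openai:gpt-4o", tools=[])

# Only work inside the context manager is traced, so you can enable
# tracing for specific invocations without tracing the whole application.
with langsmith.tracing_context(enabled=True):
    agent.invoke({"messages": [{"role": "user", "content": "Hello"}]})
```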
Log to a project
Statically
You can set a custom project name for your entire application by setting the
LANGSMITH_PROJECT
environment variable:
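Assuming a hypothetical project name, the variable can be set like:

```shell
# All traces from this shell session are logged to this project
# instead of `default`. The project name is a placeholder.
export LANGSMITH_PROJECT="my-agent-project"
```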
Dynamically
You can set the project name programmatically for specific operations:
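A sketch using `tracing_context` to route specific invocations to a project; the agent and project name are illustrative assumptions:

```python
import langsmith
from langchain.agents import create_agent

# Illustrative agent; model string is an assumption.
agent = create_agent(model="openai:gpt-4o", tools=[])

# Traces from this block are logged to the named project instead of `default`.
with langsmith.tracing_context(project_name="my-agent-project"):
    agent.invoke({"messages": [{"role": "user", "content": "Hello"}]})
```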
Add metadata to traces
You can annotate your traces with custom metadata and tags. The `tracing_context` context manager also accepts tags and metadata for fine-grained control:
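A sketch of both approaches; the agent, tag values, and metadata keys are illustrative assumptions:

```python
import langsmith
from langchain.agents import create_agent

# Illustrative agent; model string is an assumption.
agent = create_agent(model="openai:gpt-4o", tools=[])

# Per-invocation annotations via the standard runnable config.
agent.invoke(
    {"messages": [{"role": "user", "content": "Hello"}]},
    config={"tags": ["production"], "metadata": {"user_id": "u-123"}},
)

# tracing_context applies the same annotations to every trace in the block.
with langsmith.tracing_context(tags=["production"], metadata={"user_id": "u-123"}):
    agent.invoke({"messages": [{"role": "user", "content": "Hello"}]})
```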
To learn more about how to use traces to debug, evaluate, and monitor your agents, see the LangSmith documentation.