Strands Agents is an SDK for building model-driven agents. LangSmith provides a Strands Agents integration that exports Strands OpenTelemetry spans in a LangSmith-compatible format, including agent runs, model calls, tool calls, prompts, completions, and token usage.

Installation

Install LangSmith with Strands Agents support:

pip install "langsmith[strands-agents]"

This installs LangSmith, Strands Agents, Strands Agents tools, and the OpenTelemetry OTLP HTTP exporter.

Setup

1. Configure environment variables

Set your LangSmith API key and project name. If you use Amazon Bedrock as the model provider for Strands Agents, also configure AWS credentials with your preferred AWS authentication method.

export OTEL_EXPORTER_OTLP_ENDPOINT=https://api.smith.langchain.com/otel/v1/traces
export OTEL_EXPORTER_OTLP_HEADERS="x-api-key=<your_langsmith_api_key>,Langsmith-Project=<your_project_name>"

# Required when using Amazon Bedrock.
export AWS_REGION=<your_aws_region>

The Strands Agents integration uses the standard OpenTelemetry OTLP exporter. Configure the LangSmith endpoint and headers before calling setup_langsmith_telemetry().
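Because the exporter reads these variables from the process environment, the same configuration can also be applied from Python before telemetry setup. A minimal sketch (the placeholder values are illustrative and must be replaced with your own):

```python
import os

# Placeholder values: substitute your real API key and project name.
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = (
    "https://api.smith.langchain.com/otel/v1/traces"
)
os.environ["OTEL_EXPORTER_OTLP_HEADERS"] = (
    "x-api-key=<your_langsmith_api_key>,Langsmith-Project=<your_project_name>"
)

# Set these before calling setup_langsmith_telemetry(), which configures
# the OTLP exporter from the process environment.
```

Note that these assignments must run before setup_langsmith_telemetry() is called, and before any agent is created or invoked.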

2. Enable Strands Agents telemetry

Call setup_langsmith_telemetry() once at application startup, before creating or invoking agents:

from langsmith.integrations.strands_agents import setup_langsmith_telemetry

setup_langsmith_telemetry()

For local debugging, pass console=True to also print the transformed spans to stdout:

setup_langsmith_telemetry(console=True)

3. Create and run your agent

Once configured, Strands Agents traces are exported to LangSmith automatically:

from langsmith.integrations.strands_agents import setup_langsmith_telemetry
from strands import Agent

setup_langsmith_telemetry()

agent = Agent(
    system_prompt="You are a concise assistant.",
)

response = agent("Explain what LangSmith tracing is in one sentence.")
print(response)

View traces in LangSmith

After running your application, open your LangSmith project to view traces that include:
  • Agent invocation spans
  • Event loop cycle spans
  • LLM call spans with prompts, completions, and token usage
  • Tool call spans with tool inputs and outputs when your agent uses tools

Customize the OTLP exporter

If you need to pass custom options to the underlying OpenTelemetry exporter, create a LangSmithSpanExporter with create_langsmith_exporter() and attach it to the Strands tracer provider manually:

from langsmith.integrations.strands_agents import create_langsmith_exporter
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from strands.telemetry import StrandsTelemetry

telemetry = StrandsTelemetry()
exporter = create_langsmith_exporter(
    endpoint="https://api.smith.langchain.com/otel/v1/traces",
    headers={
        "x-api-key": "<your_langsmith_api_key>",
        "Langsmith-Project": "<your_project_name>",
    },
)
telemetry.tracer_provider.add_span_processor(BatchSpanProcessor(exporter))

Use this approach when you want to configure exporter options in code rather than through environment variables.

Resources