Strands Agents is an SDK for building model-driven agents. LangSmith provides a Strands Agents integration that exports Strands OpenTelemetry spans in a LangSmith-compatible format, including agent runs, model calls, tool calls, prompts, completions, and token usage.
Installation
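For this step, the install command might look like the following. The exact package names and extras are assumptions based on the PyPI names commonly used for LangSmith (`langsmith`) and Strands Agents (`strands-agents`); check the package index for the current names.

```shell
# Package names and the [otel] extra are assumptions; verify against PyPI.
pip install "langsmith[otel]" strands-agents
```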
Install LangSmith with Strands Agents support.

Setup
1. Configure environment variables
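A typical configuration for this step might look like the following. The variable names follow standard LangSmith and OpenTelemetry OTLP conventions; all values are placeholders, and the exact endpoint path is an assumption.

```shell
# LangSmith credentials and project (values are placeholders)
export LANGSMITH_API_KEY=<your-api-key>
export LANGSMITH_PROJECT=<your-project-name>

# OTLP exporter target and auth header (endpoint path assumed);
# set these before calling setup_langsmith_telemetry()
export OTEL_EXPORTER_OTLP_ENDPOINT=https://api.smith.langchain.com/otel
export OTEL_EXPORTER_OTLP_HEADERS="x-api-key=<your-api-key>"

# If using Amazon Bedrock as the model provider, also provide AWS
# credentials via your preferred authentication method, e.g.:
export AWS_ACCESS_KEY_ID=<access-key-id>
export AWS_SECRET_ACCESS_KEY=<secret-access-key>
export AWS_DEFAULT_REGION=<region>
```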
Set your LangSmith API key and project name. If you use Amazon Bedrock as the model provider for Strands Agents, also configure AWS credentials with your preferred AWS authentication method.

The Strands Agents integration uses the standard OpenTelemetry OTLP exporter. Configure the LangSmith endpoint and headers before calling setup_langsmith_telemetry().

2. Enable Strands Agents telemetry
Call setup_langsmith_telemetry() once at application startup, before creating or invoking agents:
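A minimal startup sketch. The function name comes from these docs, but its import path is an assumption; consult the LangSmith package for the actual module.

```python
# Import path is assumed; setup_langsmith_telemetry() is the documented entry point.
from langsmith.integrations.otel import setup_langsmith_telemetry

# Call once at startup, before any Agent is created or invoked,
# so the Strands tracer provider picks up the LangSmith exporter.
setup_langsmith_telemetry()
```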
Pass console=True to also print transformed spans to stdout:
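For local debugging, the same call can echo spans to the terminal. The console parameter is documented above; the import path remains an assumption.

```python
from langsmith.integrations.otel import setup_langsmith_telemetry  # path assumed

# Exports spans to LangSmith and also prints the transformed spans to stdout.
setup_langsmith_telemetry(console=True)
```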
3. Create and run your agent
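A minimal agent for this step, sketched with the public Strands Agents API (Agent with its default model provider); the prompt is illustrative.

```python
from strands import Agent

# With telemetry enabled at startup, this invocation is traced automatically:
# the agent run, model calls, and any tool calls are exported to LangSmith.
agent = Agent()  # uses the default model provider, e.g. Amazon Bedrock
agent("What is the capital of France?")
```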
Once configured, Strands Agents traces are exported to LangSmith automatically.

View traces in LangSmith
After running your application, open your LangSmith project to view traces that include:
- Agent invocation spans
- Event loop cycle spans
- LLM call spans with prompts, completions, and token usage
- Tool call spans with tool inputs and outputs when your agent uses tools
Customize the OTLP exporter
If you need to pass custom options to the underlying OpenTelemetry exporter, create a LangSmithSpanExporter with create_langsmith_exporter() and attach it to the Strands tracer provider manually:
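One way this might look, using the OpenTelemetry SDK's BatchSpanProcessor to attach the exporter. create_langsmith_exporter() is named in these docs, but its import path, the keyword arguments shown, and the tracer_provider attribute on StrandsTelemetry are assumptions to be checked against the respective packages.

```python
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from strands.telemetry import StrandsTelemetry

from langsmith.integrations.otel import create_langsmith_exporter  # path assumed

# Build the exporter with custom OTLP options (kwargs are illustrative).
exporter = create_langsmith_exporter(timeout=30)

# Attach it to the Strands tracer provider via a standard span processor.
telemetry = StrandsTelemetry()
telemetry.tracer_provider.add_span_processor(BatchSpanProcessor(exporter))
```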