This guide shows you how to trace Amazon Bedrock model calls with LangSmith using the native AWS SDKs. LangSmith also works seamlessly with LangChain’s Bedrock integrations. Either approach provides insights into:
  • Request and response payloads
  • Token usage and costs
  • Latency and performance metrics
  • Custom tags and metadata for filtering and analysis
  • Multi-step chains and agent workflows

Installation

pip install boto3 langsmith
This integration uses the native AWS SDKs with LangSmith’s tracing capabilities. For Python, you’ll use boto3 (the AWS SDK for Python) along with langsmith to capture traces. For JavaScript/TypeScript, you’ll use @aws-sdk/client-bedrock-runtime with the langsmith package. Both implementations use the Bedrock Converse API, which provides a unified interface for interacting with foundation models.
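For the JavaScript/TypeScript version, install the corresponding packages from npm (shown here for completeness; both package names are as published on npm):

npm install @aws-sdk/client-bedrock-runtime langsmith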

Setup

To enable LangSmith tracing, configure your LangSmith API key and project settings. You’ll also need to set up your AWS credentials to authenticate with Bedrock.

LangSmith configuration

export LANGSMITH_API_KEY=<your_langsmith_api_key>
export LANGSMITH_PROJECT=<your_project_name> # optional, defaults to "default"
export LANGSMITH_TRACING=true
You can obtain your LangSmith API key from smith.langchain.com by navigating to Settings > API Keys. The LANGSMITH_PROJECT variable allows you to organize traces into different projects.
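If you prefer to configure these settings in code (for example, in a notebook), you can set the same variables with os.environ before any traced code runs. A minimal sketch, using a hypothetical project name:

import os

# Equivalent to the shell exports above; set these before invoking traced functions
os.environ["LANGSMITH_API_KEY"] = "<your_langsmith_api_key>"
os.environ["LANGSMITH_PROJECT"] = "bedrock-tracing-demo"  # hypothetical project name
os.environ["LANGSMITH_TRACING"] = "true"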

AWS credentials

Configure your AWS credentials to authenticate with Bedrock. You’ll need an AWS account with Bedrock access enabled. Follow the AWS setup instructions to create your credentials and enable model access:
export AWS_ACCESS_KEY_ID=<your_aws_access_key_id>
export AWS_SECRET_ACCESS_KEY=<your_aws_secret_key>
export AWS_SESSION_TOKEN=<your_session_token> # only if using temporary credentials
export AWS_DEFAULT_REGION=<your_bedrock_region> # e.g., us-east-1 or us-west-2
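To verify that your credentials and region are picked up correctly, you can list the foundation models your account can access. A quick sanity-check sketch using boto3's Bedrock control-plane client:

import boto3

# Uses the AWS credentials and region configured in your environment
bedrock_admin = boto3.client("bedrock", region_name="us-east-1")
models = bedrock_admin.list_foundation_models()
print([m["modelId"] for m in models["modelSummaries"]][:5])  # first few available model IDs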

Configure tracing

Once your environment variables are set, you can trace Bedrock model calls by wrapping your invocation functions with LangSmith’s @traceable decorator (Python) or traceable function (TypeScript). The following example demonstrates how to use the Bedrock Converse API with LangSmith tracing. The Converse API is AWS’s recommended unified interface for foundation models, providing consistent request and response handling across different model providers. You can enhance traces with custom tags and metadata—tags help you categorize traces (e.g., by environment, feature, or test type), while metadata allows you to attach arbitrary key-value pairs for detailed context:
import boto3
from langsmith import traceable

# Initialize Bedrock runtime client (ensure AWS creds and region are set)
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")
model_id = "anthropic.claude-3-haiku-20240307-v1:0"  # Example Bedrock model ID

# Decorate the model invocation function to auto-capture a trace with tags/metadata
@traceable(tags=["aws-bedrock", "langsmith", "integration-test"],
           metadata={"env": "dev", "model_provider": "bedrock", "model_id": "claude-3-haiku"})
def generate_text(prompt: str) -> str:
    # Prepare a single-turn conversation input for the Converse API
    messages = [
        {"role": "user", "content": [{"text": prompt}]}
    ]
    # Invoke the Bedrock model using the unified Converse API
    response = bedrock.converse(
        modelId=model_id,
        messages=messages,
        inferenceConfig={"maxTokens": 512, "temperature": 0.5, "topP": 0.9}
    )
    # Extract the model's reply text from the response
    output_text = response["output"]["message"]["content"][0]["text"]
    return output_text

# Call the traced function with a prompt
result = generate_text("How can I trace AWS Bedrock model outputs to LangSmith for debugging?")
print(result)
  • boto3.client("bedrock-runtime") creates a Bedrock Runtime client.
  • The converse method sends a chat prompt (as a list of messages) to the specified model and returns a structured response.
  • The generate_text function is decorated with @traceable, logging each call to LangSmith as a trace (using the function name as the default trace name).
  • Custom tags (aws-bedrock, langsmith, integration-test) and metadata (environment, model info) are passed into the decorator and attached to the trace record for filtering in the LangSmith UI.
  • When you run this code (with LANGSMITH_TRACING=true and your API key set), LangSmith automatically captures the input prompt, model output, token usage, and latency.
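You can also attach tags and metadata on a per-call basis, without modifying the decorator, via the reserved langsmith_extra keyword argument that @traceable adds to the wrapped function. A minimal sketch:

# Per-call tags/metadata are attached to this run alongside the decorator's values
result = generate_text(
    "Summarize the Bedrock Converse API in one sentence.",
    langsmith_extra={
        "tags": ["ad-hoc"],
        "metadata": {"request_id": "demo-123"},  # hypothetical identifier
    },
)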

View traces in LangSmith

After running your code, navigate to your LangSmith project at smith.langchain.com to view the traces. Each trace includes:
  • Request details: Input messages, model parameters, and configuration
  • Response details: Model output, token usage, and response metadata
  • Performance metrics: Latency, tokens per second, and cost estimates
  • Custom metadata: Tags and metadata you provided to the @traceable decorator
You can filter traces by tags (e.g., aws-bedrock or integration-test), search by metadata fields, or drill into specific traces to debug issues.
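Beyond the UI, you can query the same traces programmatically with the LangSmith SDK. A minimal sketch using the Client and LangSmith's trace filter syntax (substitute your own project name):

from langsmith import Client

client = Client()
# Fetch runs in the project that carry the "aws-bedrock" tag
runs = client.list_runs(
    project_name="default",
    filter='has(tags, "aws-bedrock")',
)
for run in runs:
    print(run.name, run.status, run.total_tokens)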

Next steps


From here, try filtering traces in the LangSmith UI by the tags and metadata you attached, and extend this pattern to multi-step chains and agent workflows by nesting @traceable functions so each step appears as a child run within a single trace.