- Request and response payloads
- Token usage and costs
- Latency and performance metrics
- Custom tags and metadata for filtering and analysis
- Multi-step chains and agent workflows
Installation
For Python, install `boto3` (the AWS SDK for Python) along with `langsmith` to capture traces. For JavaScript/TypeScript, you’ll use `@aws-sdk/client-bedrock-runtime` with the `langsmith` package. Both implementations use the Bedrock Converse API, which provides a unified interface for interacting with foundation models.
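A minimal install, assuming pip and npm respectively:

```bash
# Python
pip install boto3 langsmith

# JavaScript/TypeScript
npm install @aws-sdk/client-bedrock-runtime langsmith
```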
Setup
To enable LangSmith tracing, configure your LangSmith API key and project settings. You’ll also need to set up your AWS credentials to authenticate with Bedrock.

LangSmith configuration
Set the `LANGSMITH_TRACING` and `LANGSMITH_API_KEY` environment variables to enable tracing and authenticate with LangSmith. The optional `LANGSMITH_PROJECT` variable allows you to organize traces into different projects.
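For example, in your shell (the project name below is just a placeholder):

```bash
export LANGSMITH_TRACING=true
export LANGSMITH_API_KEY="<your-api-key>"
# Optional: group traces under a named project
export LANGSMITH_PROJECT="aws-bedrock-tracing"
```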
AWS credentials
Configure your AWS credentials to authenticate with Bedrock. You’ll need an AWS account with Bedrock access enabled. Follow the AWS setup instructions to create your credentials and enable model access:
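One common option is to export credentials as environment variables (boto3 and the AWS SDK for JavaScript also read credentials from `~/.aws/credentials` or an attached IAM role; the region below is only an example):

```bash
export AWS_ACCESS_KEY_ID="<your-access-key-id>"
export AWS_SECRET_ACCESS_KEY="<your-secret-access-key>"
export AWS_REGION="us-east-1"  # a region where Bedrock and your chosen model are available
```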
Configure tracing
Once your environment variables are set, you can trace Bedrock model calls by wrapping your invocation functions with LangSmith’s `@traceable` decorator (Python) or `traceable` function (TypeScript).
The following example demonstrates how to use the Bedrock Converse API with LangSmith tracing. The Converse API is AWS’s recommended unified interface for foundation models, providing consistent request and response handling across different model providers. You can enhance traces with custom tags and metadata—tags help you categorize traces (e.g., by environment, feature, or test type), while metadata allows you to attach arbitrary key-value pairs for detailed context:
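The sketch below is a minimal Python version of this pattern. The model ID, region, and inference settings are illustrative assumptions; substitute any Converse-compatible model you have enabled in your account:

```python
import boto3
from langsmith import traceable

# Create a Bedrock Runtime client (assumes AWS credentials are configured).
client = boto3.client("bedrock-runtime", region_name="us-east-1")

# Placeholder model ID; use any Converse-compatible model you have access to.
MODEL_ID = "anthropic.claude-3-5-sonnet-20240620-v1:0"

@traceable(
    tags=["aws-bedrock", "langsmith", "integration-test"],
    metadata={"environment": "development", "model_id": MODEL_ID},
)
def generate_text(prompt: str) -> str:
    # The Converse API takes a list of messages, each with a role and content blocks.
    response = client.converse(
        modelId=MODEL_ID,
        messages=[{"role": "user", "content": [{"text": prompt}]}],
        inferenceConfig={"maxTokens": 512, "temperature": 0.7},
    )
    # Pull the assistant's text out of the structured Converse response.
    return response["output"]["message"]["content"][0]["text"]

if __name__ == "__main__":
    print(generate_text("Explain what Amazon Bedrock is in one sentence."))
```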
boto3.client("bedrock-runtime")creates a Bedrock Runtime client.- The
conversemethod sends a chat prompt (as a list of messages) to the specified model and returns a structured response. - The
generate_textfunction is decorated with@traceable, logging each call to LangSmith as a trace (using the function name as the default trace name). - Custom tags (
aws-bedrock,langsmith,integration-test) and metadata (environment, model info) are passed into the decorator and attached to the trace record for filtering in the LangSmith UI. - When you run this code (with
LANGSMITH_TRACING=trueand your API key set), LangSmith automatically captures the input prompt, model output, token usage, and latency.
View traces in LangSmith
After running your code, navigate to your LangSmith project at smith.langchain.com to view the traces. Each trace includes:
- Request details: Input messages, model parameters, and configuration
- Response details: Model output, token usage, and response metadata
- Performance metrics: Latency, tokens per second, and cost estimates
- Custom metadata: Tags and metadata you provided to the `@traceable` decorator

You can filter traces by tag (e.g., `aws-bedrock` or `integration-test`), search by metadata fields, or drill into specific traces to debug issues.
Next steps
- Learn more about LangSmith features including evaluation, datasets, and feedback
- Explore Bedrock model capabilities like tool calling, streaming, and prompt caching
- Review LangChain Bedrock integration documentation for advanced features like extended thinking and citations