# BedrockChat

For detailed documentation of all `BedrockChat` features and configurations, head to the API reference.
## Overview
### Integration details
Class | Package | Local | Serializable | PY support |
---|---|---|---|---|
BedrockChat | @langchain/community | ❌ | ✅ | ✅ |
### Model features
See the links in the table headers below for guides on how to use specific features.

Tool calling | Structured output | JSON mode | Image input | Audio input | Video input | Token-level streaming | Token usage | Logprobs |
---|---|---|---|---|---|---|---|---|
✅ | ✅ | ❌ | ✅ | ❌ | ❌ | ✅ | ✅ | ❌ |
## Setup
To access Bedrock models you'll need to create an AWS account, set up the Bedrock API service, get an access key ID and secret key, and install the `@langchain/community` integration package.
### Credentials
Head to the AWS docs to sign up for AWS and set up your credentials. You'll also need to turn on model access for your account, which you can do by following these instructions. If you want automated tracing of your model calls, you can also set your LangSmith API key by uncommenting below:
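For example, a sketch of the relevant variables (assuming the standard `LANGSMITH_*` names; replace the key with your own):

```typescript
// Uncomment to enable automated LangSmith tracing of model calls.
// The variable names below are the standard LangSmith ones; adjust if your setup differs.
// process.env.LANGSMITH_TRACING = "true";
// process.env.LANGSMITH_API_KEY = "your-api-key";
```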
### Installation

The LangChain `BedrockChat` integration lives in the `@langchain/community` package. You'll also need to install several official AWS packages as peer dependencies:
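A typical install command is sketched below; the exact peer dependency list can vary by version, so treat this as a starting point and follow any peer dependency warnings from your package manager:

```bash
npm install @langchain/community @langchain/core @aws-crypto/sha256-js @aws-sdk/credential-provider-node @aws-sdk/types @smithy/protocol-http @smithy/signature-v4 @smithy/eventstream-codec @smithy/util-utf8
```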
## Instantiation
Currently, only Anthropic, Cohere, and Mistral models are supported by the chat model integration. For foundation models from AI21 or Amazon, see the text generation Bedrock variant. There are a few different ways to authenticate with AWS; the examples below rely on an access key, secret access key, and region set in your environment variables:
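The snippet below is a minimal sketch: the model ID is just an example (any Anthropic, Cohere, or Mistral model enabled on your account works), and the `BEDROCK_AWS_*` environment variable names are assumptions to adapt to your setup.

```typescript
import { BedrockChat } from "@langchain/community/chat_models/bedrock";

// Example model ID and assumed environment variable names -- substitute your own.
const llm = new BedrockChat({
  model: "anthropic.claude-3-5-sonnet-20240620-v1:0",
  region: process.env.BEDROCK_AWS_REGION,
  credentials: {
    accessKeyId: process.env.BEDROCK_AWS_ACCESS_KEY_ID!,
    secretAccessKey: process.env.BEDROCK_AWS_SECRET_ACCESS_KEY!,
  },
  temperature: 0,
});
```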
## Invocation
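A short invocation sketch, reusing the `llm` instance created above (the translation prompt is purely illustrative):

```typescript
// Messages can be passed as (role, content) tuples.
const aiMsg = await llm.invoke([
  [
    "system",
    "You are a helpful assistant that translates English to French. Translate the user sentence.",
  ],
  ["human", "I love programming."],
]);
console.log(aiMsg.content);
```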
## Chaining

We can chain our model with a prompt template like so:
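The following is a sketch that pipes a `ChatPromptTemplate` into the `llm` instance from the instantiation example; the prompt variables are illustrative:

```typescript
import { ChatPromptTemplate } from "@langchain/core/prompts";

const prompt = ChatPromptTemplate.fromMessages([
  [
    "system",
    "You are a helpful assistant that translates {input_language} to {output_language}.",
  ],
  ["human", "{input}"],
]);

// Piping the prompt into the model produces a runnable chain.
const chain = prompt.pipe(llm);
const result = await chain.invoke({
  input_language: "English",
  output_language: "German",
  input: "I love programming.",
});
console.log(result.content);
```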
## Tool calling

Tool calling with Bedrock models works in a similar way to other models, but note that not all Bedrock models support tool calling. Please refer to the AWS model documentation for more information.
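As a sketch, assuming an Anthropic model that supports tool use and reusing `llm` from above; the `get_weather` tool, its schema, and the prompt are invented for illustration:

```typescript
import { z } from "zod";
import { tool } from "@langchain/core/tools";

// A toy tool defined purely for illustration.
const weatherTool = tool(async ({ city }) => `It is sunny in ${city}.`, {
  name: "get_weather",
  description: "Get the current weather for a city.",
  schema: z.object({ city: z.string().describe("The city to look up") }),
});

// Bind the tool to the model, then invoke as usual and inspect any tool calls.
const llmWithTools = llm.bindTools([weatherTool]);
const response = await llmWithTools.invoke("What is the weather in Paris?");
console.log(response.tool_calls);
```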
## API reference

For detailed documentation of all `BedrockChat` features and configurations, head to the API reference.