Overview
Integration details
Class | Package | Local | Serializable | PY support
---|---|---|---|---
ChatMistralAI | @langchain/mistralai | ❌ | ❌ | ✅
Model features
See the links in the table headers below for guides on how to use specific features.

Tool calling | Structured output | JSON mode | Image input | Audio input | Video input | Token-level streaming | Token usage | Logprobs
---|---|---|---|---|---|---|---|---
✅ | ✅ | ✅ | ✅ | ❌ | ❌ | ✅ | ✅ | ❌
Setup
To access Mistral AI models you'll need to create a Mistral AI account, get an API key, and install the @langchain/mistralai integration package.
Credentials
Head here to sign up for Mistral AI and generate an API key. Once you've done this, set the MISTRAL_API_KEY environment variable:
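For example, in a bash-compatible shell (the key value below is a placeholder):

```shell
# Make the key available to your Node.js process
export MISTRAL_API_KEY="your-api-key"
```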
Installation
The LangChain ChatMistralAI integration lives in the @langchain/mistralai package:
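Using npm, for instance (LangChain integration packages are typically installed alongside @langchain/core, which provides the base abstractions):

```shell
npm install @langchain/mistralai @langchain/core
```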
Instantiation
Now we can instantiate our model object and generate chat completions:

Invocation
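A minimal sketch of instantiating the model and invoking it, assuming MISTRAL_API_KEY is set in the environment; the model name, temperature, and prompt are illustrative, not required values:

```typescript
import { ChatMistralAI } from "@langchain/mistralai";

// Model name and parameters here are examples; pick any supported Mistral model.
const llm = new ChatMistralAI({
  model: "mistral-large-latest",
  temperature: 0,
});

// invoke() accepts a list of (role, content) message tuples.
const aiMsg = await llm.invoke([
  ["system", "You are a helpful assistant that translates English to French."],
  ["human", "I love programming."],
]);
console.log(aiMsg.content);
```

Because the call hits the Mistral API, running this requires a valid API key and network access.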
When sending chat messages to Mistral, there are a few requirements to follow:

- The first message cannot be an assistant (ai) message.
- Messages must alternate between user and assistant (ai) messages.
- Messages cannot end with an assistant (ai) or system message.
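The three rules above can be sketched as a small validation helper; this function is hypothetical (not part of LangChain or the Mistral SDK) and only illustrates the ordering constraints:

```typescript
type Role = "system" | "user" | "assistant";

// Hypothetical helper: returns true if a message history satisfies
// Mistral's ordering rules described above.
function isValidMistralHistory(roles: Role[]): boolean {
  // System prompts are exempt from the user/assistant alternation check.
  const chat = roles.filter((r) => r !== "system");
  if (chat.length === 0) return false;
  // Rule 1: the first non-system message cannot be an assistant message.
  if (chat[0] === "assistant") return false;
  // Rule 2: user and assistant messages must alternate.
  for (let i = 1; i < chat.length; i++) {
    if (chat[i] === chat[i - 1]) return false;
  }
  // Rule 3: the history cannot end with an assistant or system message.
  const last = roles[roles.length - 1];
  return last !== "assistant" && last !== "system";
}
```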