Chat models are language models that take a sequence of messages as input and return messages as output (as opposed to plain text in, plain text out). These are generally newer models.
If you’d like to write your own chat model, see this how-to. If you’d like to contribute an integration, see Contributing integrations.
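To make the "messages in, message out" shape concrete, here is a minimal, self-contained sketch. The `Role`, `Message`, and `fakeChatModel` names are illustrative placeholders, not the real `@langchain/core` types:

```typescript
// A role-tagged turn in a conversation (simplified stand-in for the
// real message classes such as HumanMessage and AIMessage).
type Role = "system" | "human" | "ai";

interface Message {
  role: Role;
  content: string;
}

// A hypothetical chat model: takes the running conversation and
// returns a new "ai" message rather than a bare string.
function fakeChatModel(messages: Message[]): Message {
  const lastHuman = [...messages].reverse().find((m) => m.role === "human");
  return { role: "ai", content: `Echo: ${lastHuman?.content ?? ""}` };
}

const history: Message[] = [
  { role: "system", content: "You are terse." },
  { role: "human", content: "Hello, world!" },
];

console.log(fakeChatModel(history).content);
```

A real chat model works the same way at the interface level: the whole conversation goes in, and a single model-authored message comes back.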

Install dependencies

npm i @langchain/groq @langchain/core

Add environment variables

GROQ_API_KEY=your-api-key

Instantiate the model

import { ChatGroq } from "@langchain/groq";

const model = new ChatGroq({
  model: "llama-3.3-70b-versatile",
  temperature: 0,
});

await model.invoke("Hello, world!");
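`invoke` also accepts a full message array, and `stream` returns the reply incrementally as chunks. A sketch, assuming the same `ChatGroq` instance and a valid `GROQ_API_KEY` (not runnable without one):

```typescript
import { ChatGroq } from "@langchain/groq";
import { HumanMessage, SystemMessage } from "@langchain/core/messages";

const model = new ChatGroq({
  model: "llama-3.3-70b-versatile",
  temperature: 0,
});

// Pass a message array instead of a bare string.
const response = await model.invoke([
  new SystemMessage("Translate the user's sentence to French."),
  new HumanMessage("Hello, world!"),
]);
console.log(response.content);

// Streaming yields message chunks as tokens arrive.
for await (const chunk of await model.stream("Tell me a short joke.")) {
  process.stdout.write(String(chunk.content));
}
```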
| Model | Stream | JSON mode | Tool Calling | withStructuredOutput() | Multimodal |
| --- | --- | --- | --- | --- | --- |
| BedrockChat | | | 🟡 (Bedrock Anthropic only) | 🟡 (Bedrock Anthropic only) | 🟡 (Bedrock Anthropic only) |
| ChatBedrockConverse | | | | | |
| ChatAnthropic | | | | | |
| ChatCloudflareWorkersAI | | | | | |
| ChatCohere | | | | | |
| ChatFireworks | | | | | |
| ChatGoogleGenerativeAI | | | | | |
| ChatVertexAI | | | | | |
| ChatGroq | | | | | |
| ChatMistralAI | | | | | |
| ChatOllama | | | | | |
| ChatOpenAI | | | | | |
| ChatTogetherAI | | | | | |
| ChatXAI | | | | | |
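The withStructuredOutput() column refers to binding an output schema so the model's reply is parsed into a typed object. A minimal sketch with ChatGroq, assuming zod is installed and a valid `GROQ_API_KEY` is set:

```typescript
import { z } from "zod";
import { ChatGroq } from "@langchain/groq";

const model = new ChatGroq({
  model: "llama-3.3-70b-versatile",
  temperature: 0,
});

// Describe the desired output shape with a zod schema.
const Joke = z.object({
  setup: z.string().describe("The setup of the joke"),
  punchline: z.string().describe("The punchline"),
});

// The returned model parses replies into { setup, punchline } objects.
const structuredModel = model.withStructuredOutput(Joke);
const joke = await structuredModel.invoke("Tell me a joke about cats.");
console.log(joke.setup, "—", joke.punchline);
```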

All chat models