Chat models are language models that take a sequence of messages as input and return messages as output.
While these LangChain classes support the advanced features indicated in the table below, you may need to consult provider-specific documentation to learn which hosted models or backends support each feature. A minimal usage sketch follows the table.
Model | Tool calling | Structured output | Multimodal
ChatOpenAI
ChatAnthropic
ChatVertexAI
ChatGoogleGenerativeAI
AzureChatOpenAI
ChatGroq
ChatBedrock
ChatAmazonNova
ChatHuggingFace
ChatOllama
ChatWatsonx
ChatXAI
ChatNVIDIA
ChatCohere
ChatMistralAI
ChatTogether
ChatFireworks
ChatLlamaCpp
ChatDeepSeek
ChatDatabricks
ChatPerplexity
ChatOpenRouter
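
All of these classes expose the same message-based interface. The snippet below is a minimal sketch, assuming the langchain-openai package is installed and an OPENAI_API_KEY is set; any class from the table can be swapped in, and the model name and schema are purely illustrative.

```python
from pydantic import BaseModel
from langchain_openai import ChatOpenAI  # any chat model class above works the same way

llm = ChatOpenAI(model="gpt-4o-mini")  # illustrative model name

# Messages in, message out.
reply = llm.invoke([
    ("system", "You are a helpful assistant."),
    ("human", "What is a chat model?"),
])
print(reply.content)

# Structured output: the reply is parsed into the given schema.
class Movie(BaseModel):
    title: str
    year: int

movie = llm.with_structured_output(Movie).invoke(
    "Name a classic sci-fi movie and its release year."
)
print(movie.title, movie.year)
```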

Routers & proxies

Routers and proxies give you access to models from multiple providers through a single API and credential. They can simplify billing, let you switch between models without changing integrations, and offer features like automatic fallbacks.
Provider | Integration | Description
OpenRouter | ChatOpenRouter | Unified access to models from OpenAI, Anthropic, Google, Meta, and more
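
As a sketch of the pattern, the snippet below assumes ChatOpenRouter is importable from a langchain-openrouter package and accepts an OpenRouter model slug; check the ChatOpenRouter guide for the exact package name, import path, and parameters.

```python
import os

# Assumed import path; confirm against the ChatOpenRouter guide.
from langchain_openrouter import ChatOpenRouter

os.environ["OPENROUTER_API_KEY"] = "YOUR_OPENROUTER_API_KEY"  # one credential for many providers

# Switching providers is just a matter of changing the model slug.
llm = ChatOpenRouter(model="anthropic/claude-3.5-sonnet")  # illustrative slug
print(llm.invoke("Hello!").content)
```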

Chat Completions API

Certain model providers offer endpoints that are compatible with OpenAI’s Chat Completions API. In such cases, you can use ChatOpenAI with a custom base_url to connect to these endpoints for basic chat functionality.
ChatOpenAI targets the official OpenAI API specification only. Non-standard response fields from third-party providers (e.g., reasoning_content, reasoning, reasoning_details) are not extracted or preserved, so use a provider-specific package when you need access to non-standard features. For instance, OpenRouter has a dedicated LangChain integration; see the ChatOpenRouter guide for setup and usage.
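
For example, the following sketch points ChatOpenAI at a hypothetical OpenAI-compatible endpoint; the endpoint URL, model name, and key are placeholders to replace with your provider's values.

```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="provider-model-name",            # model name as the provider expects it
    base_url="https://api.example.com/v1",  # provider's Chat Completions-compatible endpoint (placeholder)
    api_key="YOUR_PROVIDER_API_KEY",        # placeholder credential
)

print(llm.invoke("Hello!").content)
```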

All chat models

Abso
AI21 Labs
AI/ML API
Alibaba Cloud PAI EAS
Amazon Nova
Anthropic
AzureAIChatCompletionsModel
Azure OpenAI
Azure ML Endpoint
Baichuan Chat
Baidu Qianfan
Baseten
AWS Bedrock
Cerebras
CloudflareWorkersAI
Cohere
ContextualAI
Coze Chat
Dappier AI
Databricks
DeepInfra
DeepSeek
Eden AI
EverlyAI
Featherless AI
Fireworks
ChatFriendli
Google Gemini
Google Cloud Vertex AI
GPTRouter
DigitalOcean Gradient
GreenNode
Groq
ChatHuggingFace
IBM watsonx.ai
JinaChat
Kinetica
Konko
LiteLLM
Llama 2 Chat
Llama API
LlamaEdge
Llama.cpp
Maritalk
MiniMax
MistralAI
MLX
ModelScope
Moonshot
Naver
Nebius
Netmind
NVIDIA AI Endpoints
ChatOCIModelDeployment
OCIGenAI
ChatOctoAI
Ollama
OpenAI
OpenRouter
Outlines
Perplexity
Pipeshift
ChatPredictionGuard
PremAI
PromptLayer ChatOpenAI
Qwen QwQ
Qwen
Reka
RunPod Chat Model
SambaNova
ChatSeekrFlow
Snowflake Cortex
SparkLLM Chat
Nebula (Symbl.ai)
Tencent Hunyuan
Together
Tongyi Qwen
Upstage
vLLM Chat
Volc Engine Maas
ChatWriter
xAI
Xinference
YandexGPT
ChatYI
Yuan2.0
ZHIPU AI

If you’d like to contribute an integration, see Contributing integrations.
