Deep agents work with any LangChain chat model that supports tool calling.

Pass a model string

The simplest way to specify a model is to pass a string to create_deep_agent. Use the provider:model format to select a specific provider and model:
agent = create_deep_agent(model="openai:gpt-5.3-codex")
Under the hood, this calls init_chat_model with default parameters.
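Since the string form defers to init_chat_model, the following is a sketch of what the one-liner above expands to (using the same model string as the example):

```python
from langchain.chat_models import init_chat_model
from deepagents import create_deep_agent

# Equivalent to create_deep_agent(model="openai:gpt-5.3-codex"):
# init_chat_model resolves the "provider:model" string to the
# matching chat model class with default parameters.
model = init_chat_model("openai:gpt-5.3-codex")
agent = create_deep_agent(model=model)
```

The explicit form is useful once you need to pass parameters, as shown in the next section.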

Configure model parameters

To configure model-specific parameters, use init_chat_model or instantiate a provider model class directly:
from langchain.chat_models import init_chat_model
from deepagents import create_deep_agent

model = init_chat_model(
    model="anthropic:claude-sonnet-4-6",
    thinking={"type": "enabled", "budget_tokens": 10000},
)
agent = create_deep_agent(model=model)
Available parameters vary by provider. See the chat model integrations page for provider-specific configuration options.
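As an example of the second option mentioned above, you can instantiate a provider model class directly. This sketch assumes the langchain-anthropic package is installed; the parameter values are illustrative, not recommendations:

```python
from langchain_anthropic import ChatAnthropic  # provider-specific package
from deepagents import create_deep_agent

# Provider class constructors expose provider-specific parameters directly.
model = ChatAnthropic(
    model="claude-sonnet-4-6",
    max_tokens=4096,  # illustrative values
    temperature=0,
)
agent = create_deep_agent(model=model)
```

Instantiating the provider class gives you editor autocomplete for that provider's parameters, at the cost of hard-coding the provider in your code.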

Supported models

Deep agents work with any chat model that supports tool calling. See chat model integrations for the full list of supported providers.
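One way to sanity-check that a model supports tool calling is to bind a tool to it: chat models that do not implement tool calling raise NotImplementedError from bind_tools. A minimal sketch (the add tool and model string are only examples):

```python
from langchain.chat_models import init_chat_model
from langchain_core.tools import tool

@tool
def add(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b

model = init_chat_model("openai:gpt-5.3-codex")
# Succeeds for tool-calling models; raises NotImplementedError otherwise.
model_with_tools = model.bind_tools([add])
```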

Learn more

  • Models in LangChain: chat model features including tool calling, structured output, and multimodality