Pass a model string
The simplest way to specify a model is to pass a string to createDeepAgent. Use the provider:model format to select a specific provider; the string is passed to init_chat_model with default parameters.
Configure model parameters
To configure model-specific parameters, use init_chat_model or instantiate a provider model class directly:
Available parameters vary by provider. See the chat model integrations page for provider-specific configuration options.
Select a model at runtime
If your application lets users choose a model (for example via a dropdown in the UI), use middleware to swap the model at runtime without rebuilding the agent. Pass the user's model selection through runtime context, then use the @wrap_model_call decorator to define middleware that overrides the model on each invocation:
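The pattern can be sketched without the library: a middleware function wraps each model call, reads the user's selection from runtime context, and swaps the model before the call proceeds. All names below are illustrative stand-ins, not the deepagents API:

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class ModelRequest:
    """Illustrative request object carrying the model and runtime context."""
    model: str                                    # model selected for this call
    context: dict = field(default_factory=dict)  # runtime context (e.g. from the UI)

def override_model_middleware(request: ModelRequest,
                              handler: Callable[[ModelRequest], str]) -> str:
    """Sketch of wrap_model_call-style middleware: swap the model per call."""
    selected = request.context.get("selected_model")
    if selected:
        # Override the model without rebuilding the agent.
        request.model = selected
    return handler(request)

def invoke(request: ModelRequest) -> str:
    # Innermost handler: reports which model it would call.
    return f"called {request.model}"

req = ModelRequest(model="anthropic:claude-sonnet-4-5",
                   context={"selected_model": "openai:gpt-4o"})
result = override_model_middleware(req, invoke)
# result == "called openai:gpt-4o"
```

When no selection is present in the context, the middleware leaves the original model in place, so the agent's default still applies.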
Supported models
Deep Agents work with any chat model that supports tool calling. See chat model integrations for the full list of supported providers.
Suggested models
These models perform well on the Deep Agents eval suite, which tests basic agent operations. Passing these evals is necessary but not sufficient for strong performance on longer, more complex tasks.

| Provider | Models |
|---|---|
| Anthropic | claude-opus-4-6, claude-opus-4-5, claude-sonnet-4-6, claude-sonnet-4, claude-sonnet-4-5, claude-haiku-4-5, claude-opus-4-1 |
| OpenAI | gpt-5.4, gpt-4o, gpt-4.1, o4-mini, gpt-5.2-codex, gpt-4o-mini, o3 |
| Google | gemini-3-flash-preview, gemini-3.1-pro-preview |
| Open-weight | GLM-5, Kimi-K2.5, MiniMax-M2.5, qwen3.5-397B-A17B, devstral-2-123B |
Learn more
- Models in LangChain: chat model features including tool calling, structured output, and multimodality

