Deep Agents work with any LangChain chat model that supports tool calling.

Pass a model string

The simplest way to specify a model is to pass a string to createDeepAgent. Use the provider:model format to select a specific provider:
const agent = createDeepAgent({ model: "openai:gpt-5.3-codex" });
Under the hood, this calls initChatModel with default parameters.
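The string form is shorthand; a roughly equivalent explicit version (assuming the same model identifier) looks like:

```typescript
import { initChatModel } from "langchain/chat_models/universal";
import { createDeepAgent } from "deepagents";

// Equivalent to passing the string directly: the provider is inferred
// from the "openai:" prefix and default parameters are used.
const model = await initChatModel("openai:gpt-5.3-codex");
const agent = createDeepAgent({ model });
```
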

Configure model parameters

To configure model-specific parameters, use initChatModel or instantiate a provider model class directly:
import { initChatModel } from "langchain/chat_models/universal";
import { createDeepAgent } from "deepagents";

const model = await initChatModel("anthropic:claude-sonnet-4-6", {
    maxTokens: 16000,
    thinking: { type: "enabled", budgetTokens: 10000 },
});
const agent = createDeepAgent({ model });
Available parameters vary by provider. See the chat model integrations page for provider-specific configuration options.
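The same configuration can also be expressed by instantiating a provider class directly. A sketch assuming the `@langchain/anthropic` package; the exact shape of the `thinking` option may differ, so check the provider's integration page:

```typescript
import { ChatAnthropic } from "@langchain/anthropic";
import { createDeepAgent } from "deepagents";

// Provider classes accept the same kinds of parameters as initChatModel,
// with provider-specific typings.
const model = new ChatAnthropic({
  model: "claude-sonnet-4-6",
  maxTokens: 16000,
  thinking: { type: "enabled", budget_tokens: 10000 },
});
const agent = createDeepAgent({ model });
```
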

Select a model at runtime

If your application lets users choose a model (for example via a dropdown in the UI), use middleware to swap the model at runtime without rebuilding the agent. Pass the user's model selection through runtime context, then use a wrapModelCall middleware hook to override the model on each invocation:
import { initChatModel, createMiddleware } from "langchain";
import { createDeepAgent } from "deepagents";
import * as z from "zod";

const contextSchema = z.object({
  model: z.string(),
});

const configurableModel = createMiddleware({
  name: "ConfigurableModel",
  wrapModelCall: async (request, handler) => {
    const modelName = request.runtime.context.model;
    const model = await initChatModel(modelName);
    return handler({ ...request, model });
  },
});

const agent = createDeepAgent({
  model: "anthropic:claude-sonnet-4-6",
  middleware: [configurableModel],
  contextSchema,
});

// Invoke with the user's model selection
const result = await agent.invoke(
  { messages: [{ role: "user", content: "Hello!" }] },
  { context: { model: "openai:gpt-4.1" } },
);
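To restrict which models users can select, the context schema can validate the value up front. A sketch using a zod enum — the model list and default here are illustrative, and this assumes the context schema is validated at invocation:

```typescript
import * as z from "zod";

// Only accept models you have tested; anything else fails validation
// before the agent runs.
const contextSchema = z.object({
  model: z
    .enum(["anthropic:claude-sonnet-4-6", "openai:gpt-4.1"])
    .default("anthropic:claude-sonnet-4-6"),
});
```
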
For more dynamic model patterns (for example, routing based on conversation complexity or cost optimization), see Dynamic model in the LangChain agents guide.

Supported models

Deep Agents work with any chat model that supports tool calling. See chat model integrations for the full list of supported providers.

Suggested models

These models perform well on the Deep Agents eval suite, which tests basic agent operations. Passing these evals is necessary but not sufficient for strong performance on longer, more complex tasks.
| Provider | Models |
| --- | --- |
| Anthropic | claude-opus-4-6, claude-opus-4-5, claude-sonnet-4-6, claude-sonnet-4, claude-sonnet-4-5, claude-haiku-4-5, claude-opus-4-1 |
| OpenAI | gpt-5.4, gpt-4o, gpt-4.1, o4-mini, gpt-5.2-codex, gpt-4o-mini, o3 |
| Google | gemini-3-flash-preview, gemini-3.1-pro-preview |
| Open-weight | GLM-5, Kimi-K2.5, MiniMax-M2.5, qwen3.5-397B-A17B, devstral-2-123B |
Open-weight models are available through providers like OpenRouter, Fireworks, or Ollama.
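Many of these providers expose an OpenAI-compatible API, so one common pattern is to point ChatOpenAI at the provider's base URL. A sketch assuming the `@langchain/openai` package and an OpenRouter account; the model id below is hypothetical:

```typescript
import { ChatOpenAI } from "@langchain/openai";
import { createDeepAgent } from "deepagents";

// OpenRouter serves an OpenAI-compatible endpoint, so ChatOpenAI works
// with a custom baseURL. Substitute whichever open-weight model you need.
const model = new ChatOpenAI({
  model: "zai/glm-5", // hypothetical OpenRouter model id
  apiKey: process.env.OPENROUTER_API_KEY,
  configuration: { baseURL: "https://openrouter.ai/api/v1" },
});
const agent = createDeepAgent({ model });
```
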

Learn more

  • Models in LangChain: chat model features including tool calling, structured output, and multimodality