Tools extend what agents can do, letting them fetch real-time data, execute code, query external databases, and take actions in the world. Under the hood, tools are callable functions with well-defined inputs and outputs that get passed to a chat model. The model decides when to invoke a tool based on the conversation context, and what input arguments to provide.
For details on how models handle tool calls, see Tool calling.
The simplest way to create a tool is by importing the tool function from the langchain package. You can use zod to define the tool’s input schema:
import * as z from "zod"import { tool } from "langchain"const searchDatabase = tool( ({ query, limit }) => `Found ${limit} results for '${query}'`, { name: "search_database", description: "Search the customer database for records matching the query.", schema: z.object({ query: z.string().describe("Search terms to look for"), limit: z.number().describe("Maximum number of results to return"), }), });
Server-side tool use: Some chat models feature built-in tools (web search, code interpreters) that are executed server-side. See Server-side tool use for details.
Prefer snake_case for tool names (e.g., web_search instead of Web Search). Some model providers reject or mishandle names containing spaces or special characters. Sticking to alphanumeric characters, underscores, and hyphens helps improve compatibility across providers.
Tools are most powerful when they can access runtime information like conversation history, user data, and persistent memory. This section covers how to access and update this information from within your tools.
Context provides immutable configuration data that is passed at invocation time. Use it for user IDs, session details, or application-specific settings that shouldn't change during a conversation. Tools can access an agent's runtime context through the config parameter:
import * as z from "zod"import { ChatOpenAI } from "@langchain/openai"import { createAgent } from "langchain"const getUserName = tool( (_, config) => { return config.context.user_name }, { name: "get_user_name", description: "Get the user's name.", schema: z.object({}), });const contextSchema = z.object({ user_name: z.string(),});const agent = createAgent({ model: new ChatOpenAI({ model: "gpt-4.1" }), tools: [getUserName], contextSchema,});const result = await agent.invoke( { messages: [{ role: "user", content: "What is my name?" }] }, { context: { user_name: "John Smith" } });
The BaseStore provides persistent storage that survives across conversations. Unlike state (short-term memory), data saved to the store remains available in future sessions. Access the store through config.store. The store uses a namespace/key pattern to organize data:
import * as z from "zod";import { createAgent, tool } from "langchain";import { InMemoryStore } from "@langchain/langgraph";import { ChatOpenAI } from "@langchain/openai";const store = new InMemoryStore();// Access memoryconst getUserInfo = tool( async ({ user_id }) => { const value = await store.get(["users"], user_id); console.log("get_user_info", user_id, value); return value; }, { name: "get_user_info", description: "Look up user info.", schema: z.object({ user_id: z.string(), }), });// Update memoryconst saveUserInfo = tool( async ({ user_id, name, age, email }) => { console.log("save_user_info", user_id, name, age, email); await store.put(["users"], user_id, { name, age, email }); return "Successfully saved user info."; }, { name: "save_user_info", description: "Save user info.", schema: z.object({ user_id: z.string(), name: z.string(), age: z.number(), email: z.string(), }), });const agent = createAgent({ model: new ChatOpenAI({ model: "gpt-4.1" }), tools: [getUserInfo, saveUserInfo], store,});// First session: save user infoawait agent.invoke({ messages: [ { role: "user", content: "Save the following user: userid: abc123, name: Foo, age: 25, email: foo@langchain.dev", }, ],});// Second session: get user infoconst result = await agent.invoke({ messages: [ { role: "user", content: "Get user info for user with id 'abc123'" }, ],});console.log(result);// Here is the user info for user with ID "abc123":// - Name: Foo// - Age: 25// - Email: foo@langchain.dev
Stream real-time updates from tools during execution. This is useful for providing progress feedback to users during long-running operations. Use config.writer to emit custom updates:
import * as z from "zod";import { tool, ToolRuntime } from "langchain";const getWeather = tool( ({ city }, config: ToolRuntime) => { const writer = config.writer; // Stream custom updates as the tool executes if (writer) { writer(`Looking up data for city: ${city}`); writer(`Acquired data for city: ${city}`); } return `It's always sunny in ${city}!`; }, { name: "get_weather", description: "Get weather for a given city.", schema: z.object({ city: z.string(), }), });
ToolNode is a prebuilt node that executes tools in LangGraph workflows. It handles parallel tool execution, error handling, and state injection automatically.
For custom workflows where you need fine-grained control over tool execution patterns, use ToolNode instead of createAgent. It's the building block that powers agent tool execution.
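A minimal sketch of wiring ToolNode into a custom LangGraph workflow, reusing the searchDatabase tool from the first example; the node names and the toolsCondition router used here are illustrative rather than required:

```ts
import { StateGraph, MessagesAnnotation, START } from "@langchain/langgraph";
import { ToolNode, toolsCondition } from "@langchain/langgraph/prebuilt";
import { ChatOpenAI } from "@langchain/openai";

// Bind the tools to the model so it can emit tool calls
const model = new ChatOpenAI({ model: "gpt-4.1" }).bindTools([searchDatabase]);

// ToolNode executes any tool calls on the last AI message and appends
// the resulting tool messages to state
const toolNode = new ToolNode([searchDatabase]);

const workflow = new StateGraph(MessagesAnnotation)
  .addNode("model", async (state) => ({
    messages: [await model.invoke(state.messages)],
  }))
  .addNode("tools", toolNode)
  .addEdge(START, "model")
  // Route to the tool node when the model requested a tool, otherwise end
  .addConditionalEdges("model", toolsCondition)
  .addEdge("tools", "model")
  .compile();
```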
Tools can access the current graph state through ToolRuntime; a hedged sketch follows below. For more details on accessing state, context, and long-term memory from tools, see Access context.
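The second argument mirrors the writer example above, but the exact property exposing graph state (state here) is an assumption, not confirmed by this page; check your version's ToolRuntime API:

```ts
import * as z from "zod";
import { tool, ToolRuntime } from "langchain";

const summarizeConversation = tool(
  (_, config: ToolRuntime) => {
    // Assumption: the runtime exposes the current graph state (including
    // message history) on a `state` property
    const messages = config.state?.messages ?? [];
    return `The conversation so far has ${messages.length} messages.`;
  },
  {
    name: "summarize_conversation",
    description: "Summarize the conversation so far.",
    schema: z.object({}),
  }
);
```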
LangChain provides a large collection of prebuilt tools and toolkits for common tasks like web search, code interpretation, database access, and more. These ready-to-use tools can be directly integrated into your agents without writing custom code. See the tools and toolkits integration page for a complete list of available tools organized by category.
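For example, a prebuilt web search tool can be passed to an agent just like a custom one. A minimal sketch, assuming the @langchain/tavily integration package is installed and TAVILY_API_KEY is set in the environment:

```ts
import { TavilySearch } from "@langchain/tavily";
import { createAgent } from "langchain";
import { ChatOpenAI } from "@langchain/openai";

// Prebuilt web search tool from an integration package
const webSearch = new TavilySearch({ maxResults: 3 });

const agent = createAgent({
  model: new ChatOpenAI({ model: "gpt-4.1" }),
  tools: [webSearch],
});

await agent.invoke({
  messages: [{ role: "user", content: "What's the latest LangChain release?" }],
});
```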
Some chat models feature built-in tools that are executed server-side by the model provider. These include capabilities like web search and code interpreters that don't require you to define or host the tool logic. Refer to the individual chat model integration pages and the tool calling documentation for details on enabling and using these built-in tools.
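As a hedged example, OpenAI's built-in web search can typically be bound like a regular tool; the exact tool type string and whether it requires the Responses API depend on your provider and @langchain/openai version:

```ts
import { ChatOpenAI } from "@langchain/openai";

const model = new ChatOpenAI({ model: "gpt-4.1" });

// Assumption: the provider supports a built-in web search tool that runs
// server-side; no local tool function is defined or hosted here
const modelWithSearch = model.bindTools([{ type: "web_search_preview" }]);

const response = await modelWithSearch.invoke(
  "What was a positive news story from today?"
);
console.log(response.content);
```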