To handle tool use, modify the `StateGraph` you created in the first tutorial: call `bindTools` on the LLM. This lets the LLM know the correct JSON format to use if it wants to use the search engine. Let’s first select our LLM:
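Concretely, the “correct JSON format” means the model emits a `tool_calls` array on its assistant message instead of (or alongside) plain text. The sketch below is a simplified, self-contained illustration of that shape — the field names are modeled on LangChain’s message types, but these are hypothetical stand-ins, not the actual classes:

```typescript
// Hypothetical, simplified shape of an LLM tool call.
// The real @langchain/core AIMessage type is richer, but the essentials are:
interface ToolCall {
  name: string;                   // which tool the LLM wants to run
  args: Record<string, unknown>;  // JSON arguments for that tool
  id: string;                     // correlates the call with its result
}

interface AIMessageLike {
  role: "assistant";
  content: string;                // may be empty when the model only calls tools
  tool_calls: ToolCall[];         // an empty array means "no tool needed"
}

// Example: the model decides to query the search engine
// (the tool name "search" here is illustrative).
const searchCall: AIMessageLike = {
  role: "assistant",
  content: "",
  tool_calls: [
    { name: "search", args: { query: "LangGraph tutorial" }, id: "call_1" },
  ],
};
```

The `id` field is what lets the graph match each tool result back to the call that requested it.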
Now incorporate the tool-aware LLM into the `StateGraph`:
Next, create a node called `"tools"` that checks the most recent message in the state and calls tools if that message contains `tool_calls`. It relies on the LLM’s tool calling support, which is available in Anthropic, OpenAI, Google Gemini, and a number of other LLM providers.
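To make the node’s job concrete, here is a simplified, self-contained sketch of what such a node does. This is not the actual LangChain implementation — real code would use the message classes from `@langchain/core` — but the control flow is the same:

```typescript
// Minimal stand-ins for the message types (hypothetical simplification).
type ToolCall = { name: string; args: Record<string, unknown>; id: string };
type Message =
  | { role: "assistant"; content: string; tool_calls?: ToolCall[] }
  | { role: "tool"; content: string; tool_call_id: string };

// A tool is just a named function here.
type Tool = (args: Record<string, unknown>) => string;

// The "tools" node: read the most recent message and run every
// requested tool, returning one tool message per call.
function toolNode(
  state: { messages: Message[] },
  toolsByName: Record<string, Tool>,
): { messages: Message[] } {
  const last = state.messages[state.messages.length - 1];
  if (last.role !== "assistant" || !last.tool_calls?.length) {
    return { messages: [] }; // nothing to do
  }
  const results: Message[] = last.tool_calls.map((call) => ({
    role: "tool",
    content: toolsByName[call.name](call.args),
    tool_call_id: call.id,
  }));
  return { messages: results };
}
```

In the real graph, the returned tool messages are appended to the state, and control flows back to the chatbot node so the LLM can read the results.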
With the tool node added, you can now define the `conditional_edges`.
Edges route the control flow from one node to the next. Conditional edges start from a single node and usually contain “if” statements to route to different nodes depending on the current graph state. These functions receive the current graph state
and return a string or list of strings indicating which node(s) to call next.
Next, define a router function called `routeTools` that checks for `tool_calls` in the chatbot’s output. Provide this function to the graph by calling `addConditionalEdges`, which tells the graph that whenever the `chatbot` node completes, it should check this function to see where to go next.
The condition will route to `tools` if tool calls are present, and to `END` if not. Because the condition can return `END`, you do not need to explicitly set a `finish_point` this time.
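The router itself is only a few lines. Here is a self-contained sketch; the `END` constant is normally imported from `@langchain/langgraph`, so the string value used below is a stand-in assumption, and the message types are simplified:

```typescript
// Stand-in for the END constant from @langchain/langgraph (assumption:
// in real code you would import END rather than define it yourself).
const END = "__end__";

type ToolCall = { name: string; args: Record<string, unknown>; id: string };
type Message = { role: string; content: string; tool_calls?: ToolCall[] };

// Route to the "tools" node when the chatbot's last message contains
// tool calls; otherwise finish the graph run.
function routeTools(state: { messages: Message[] }): string {
  const last = state.messages[state.messages.length - 1];
  if (last?.tool_calls && last.tool_calls.length > 0) {
    return "tools";
  }
  return END;
}
```

In the graph, this would be wired up with something like `graph.addConditionalEdges("chatbot", routeTools, ["tools", END])`, so the runtime knows the full set of possible destinations.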
You can visualize the graph you’ve built using the `getGraph` method and render it with the `drawMermaidPng` method.
To simplify the code, you can swap in LangGraph’s prebuilt components:

- `createToolNode` is replaced with the prebuilt `ToolNode`
- `routeTools` is replaced with the prebuilt `toolsCondition`