Starting from the StateGraph you created in the first tutorial, call bind_tools on the LLM. This lets the LLM know the correct JSON format to use if it wants to use the search engine.
Let’s first select our LLM:
Now incorporate the tool-bound LLM into the StateGraph:
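The wiring can be sketched without any LangChain imports; StubLLM here is a stand-in for a real chat model, and the tool list is a placeholder:

```python
class StubLLM:
    """Stand-in for a real chat model. On a real model, bind_tools
    advertises the tools' JSON schemas to the provider."""

    def bind_tools(self, tools):
        self.tools = list(tools)
        return self

    def invoke(self, messages):
        # A real model would decide here whether to emit tool_calls.
        return {"role": "ai", "content": "ok", "tool_calls": []}

llm_with_tools = StubLLM().bind_tools([{"name": "search"}])

def chatbot(state):
    # The chatbot node appends the model's reply to the message list.
    return {"messages": [llm_with_tools.invoke(state["messages"])]}
```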
Next, create a BasicToolNode that checks the most recent message in the state and calls tools if the message contains tool_calls
. It relies on the LLM’s tool_calling
support, which is available in Anthropic, OpenAI, Google Gemini, and a number of other LLM providers.
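The node can be sketched in plain Python; this ToolMessage is a minimal stand-in for langchain_core.messages.ToolMessage, and (unlike the real tutorial class, which builds the lookup from each tool's name attribute) the tools are passed as a dict of plain callables:

```python
import json
from dataclasses import dataclass

@dataclass
class ToolMessage:
    """Minimal stand-in for LangChain's ToolMessage."""
    content: str
    name: str
    tool_call_id: str

class BasicToolNode:
    """Runs the tools requested in the last message's tool_calls."""

    def __init__(self, tools_by_name):
        self.tools_by_name = tools_by_name  # e.g. {"search": search_fn}

    def __call__(self, state):
        messages = state.get("messages", [])
        if not messages:
            raise ValueError("No message found in input state")
        outputs = []
        # Each tool_call is a dict like {"name": ..., "args": {...}, "id": ...}.
        for call in getattr(messages[-1], "tool_calls", []):
            result = self.tools_by_name[call["name"]](**call["args"])
            outputs.append(
                ToolMessage(
                    content=json.dumps(result),
                    name=call["name"],
                    tool_call_id=call["id"],
                )
            )
        return {"messages": outputs}
```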
With the tool node added, you can now define the conditional_edges.
Edges route the control flow from one node to the next. Conditional edges start from a single node and usually contain “if” statements to route to different nodes depending on the current graph state. These functions receive the current graph state
and return a string or list of strings indicating which node(s) to call next.
Next, define a router function called route_tools
that checks for tool_calls
in the chatbot’s output. Provide this function to the graph by calling add_conditional_edges, which tells the graph to run this function whenever the chatbot node completes and use its return value to decide where to go next.
The condition will route to tools
if tool calls are present and END
if not. Because the condition can return END
, you do not need to explicitly set a finish_point
this time.
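The router described above can be sketched library-free; END here stands in for langgraph.graph.END:

```python
END = "__end__"  # stand-in for langgraph.graph.END

def route_tools(state):
    """Return 'tools' if the last message requested tool calls, else END."""
    messages = state.get("messages", [])
    if not messages:
        raise ValueError(f"No messages found in input state to tool_edge: {state}")
    ai_message = messages[-1]
    if getattr(ai_message, "tool_calls", None):
        return "tools"
    return END

# With a real graph builder, the wiring would look like:
# graph_builder.add_conditional_edges("chatbot", route_tools, {"tools": "tools", END: END})
```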
You can visualize the graph using the get_graph method and one of the “draw” methods, like draw_ascii or draw_png. Each of the draw methods requires additional dependencies.
To simplify the code, you can swap the custom components defined above for LangGraph's prebuilt equivalents:

- BasicToolNode is replaced with the prebuilt ToolNode
- route_tools is replaced with the prebuilt tools_condition