When invoking an entrypoint, input is restricted to the first argument of the function. To pass multiple inputs, you can use a dictionary.
Extended example: simple workflow
Extended example: Compose an essay with an LLM
This example demonstrates how to use the @task and @entrypoint decorators syntactically. Given that a checkpointer is provided, the workflow results will be persisted in the checkpointer.

Extended example: parallel LLM calls
This example demonstrates how to run multiple LLM calls in parallel using @task. Each call generates a paragraph on a different topic, and results are joined into a single text output.

Extended example: calling a simple graph from the functional API
Extended example: calling another entrypoint
You can use get_stream_writer from langgraph.config to write custom data to the stream, and .stream() to process streamed output.

If you are running async code on Python < 3.11, get_stream_writer() will not work. Instead please use the StreamWriter class directly. See Async with Python < 3.11 for more details.

ttl is specified in seconds. The cache will be invalidated after this time.

On a second run, slow_task is not executed again, as its result is already saved in the checkpoint.
We can pause a workflow for human input using the interrupt function and resume it with the Command primitive. In the example below, step_1 appends "bar" to the input and step_3 appends "qux"; between them, human_feedback pauses for a human response. The results of prior tasks — in this case step_1 — are persisted, so that they are not run again following the interrupt.
Let’s send in a query string. Execution runs until the interrupt after step_1. The interrupt provides instructions to resume the run. To resume, we issue a Command containing the data expected by the human_feedback task.
We define a review_tool_call function that calls interrupt. When this function is called, execution will be paused until we issue a command to resume it.
Given a tool call, our function will interrupt for human review. At that point we can either accept the tool call, revise it and continue, or reject it with a ToolMessage supplied by the human. The results of prior tasks — in this case the initial model call — are persisted, so that they are not run again following the interrupt.
You can use entrypoint.final to decouple what is returned to the caller from what is persisted in the checkpoint. This is useful when you want to return a computed result, but save a different value for use on the next invocation.
Conversation state is persisted with an InMemorySaver checkpointer.
The bot is able to remember the previous conversation and continue from where it left off.