- Stream subagent progress — track each subagent’s execution as it runs in parallel.
- Stream LLM tokens — stream tokens from the main agent and each subagent.
- Stream tool calls — see tool calls and results from within subagent execution.
- Stream custom updates — emit user-defined signals from inside subagent nodes.
## Enable subgraph streaming
Deep agents use LangGraph’s subgraph streaming to surface events from subagent execution. To receive subagent events, enable `stream_subgraphs` when streaming.
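A minimal sketch in Python, assuming a deep agent built with deepagents’ `create_deep_agent` and streamed locally through LangGraph’s `.stream()` API, where the option is passed as `subgraphs=True` (constructor argument names are illustrative and may differ by deepagents version):

```python
from deepagents import create_deep_agent

# Argument names are illustrative; older versions use `instructions`
# instead of `system_prompt`.
agent = create_deep_agent(
    tools=[],
    system_prompt="You are a research assistant that delegates work to subagents.",
)

# With subgraph streaming enabled, each event is a (namespace, chunk) tuple.
# An empty namespace () means the event came from the main agent.
for namespace, chunk in agent.stream(
    {"messages": [{"role": "user", "content": "Research LangGraph streaming"}]},
    stream_mode="updates",
    subgraphs=True,
):
    print(namespace, chunk)
```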
## Namespaces
When subgraph streaming is enabled, each streaming event includes a namespace that identifies which agent produced it. The namespace is a path of node names and task IDs that represents the agent hierarchy.
| Namespace | Source |
|---|---|
| `()` (empty) | Main agent |
| `("tools:abc123",)` | A subagent spawned by the main agent’s `task` tool call `abc123` |
| `("tools:abc123", "model_request:def456")` | The model request node inside a subagent |
## Subagent progress
Use `stream_mode="updates"` to track subagent progress as each step completes. This is useful for showing which subagents are active and what work they’ve completed.
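A sketch, reusing the `agent` from the first example:

```python
for namespace, update in agent.stream(
    {"messages": [{"role": "user", "content": "Compare three vector databases"}]},
    stream_mode="updates",
    subgraphs=True,
):
    # `update` maps the node that just finished to its state update.
    for node_name in update:
        if namespace:
            print(f"[subagent {namespace[0]}] completed step: {node_name}")
        else:
            print(f"[main agent] completed step: {node_name}")
```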
## LLM tokens
Use `stream_mode="messages"` to stream individual tokens from both the main agent and subagents. Each message event includes metadata that identifies the source agent.
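A sketch: with a single `stream_mode` and `subgraphs=True`, each event is `(namespace, (message_chunk, metadata))`:

```python
for namespace, (token, metadata) in agent.stream(
    {"messages": [{"role": "user", "content": "Write a short product update"}]},
    stream_mode="messages",
    subgraphs=True,
):
    source = "main agent" if not namespace else f"subagent {namespace[0]}"
    if token.content:
        # The metadata dict (e.g. metadata.get("langgraph_node")) also
        # identifies the node that produced the token.
        print(f"[{source}] {token.content}", end="", flush=True)
```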
## Tool calls
When subagents use tools, you can stream tool call events to display what each subagent is doing. Tool call chunks appear in the `messages` stream mode.
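A sketch that watches for partial tool calls on the streamed message chunks; `tool_call_chunks` is the LangChain `AIMessageChunk` field for incremental tool calls, and the filtering shown here is illustrative:

```python
for namespace, (chunk, _metadata) in agent.stream(
    {"messages": [{"role": "user", "content": "Find recent papers on retrieval"}]},
    stream_mode="messages",
    subgraphs=True,
):
    source = "main agent" if not namespace else f"subagent {namespace[0]}"
    # Non-AI message chunks have no tool_call_chunks; getattr keeps this safe.
    for tool_chunk in getattr(chunk, "tool_call_chunks", []):
        name = tool_chunk.get("name")
        if name:  # the tool name arrives on the first chunk of each call
            print(f"[{source}] calling tool: {name}")
```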
## Custom updates
Use `config.writer` inside your subagent tools to emit custom progress events:
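A minimal sketch using LangGraph’s `get_stream_writer()` to obtain the writer; depending on your tool signature, the same writer may instead be reachable through the injected config object. The tool and input values are illustrative, and the tool is assumed to be registered with the agent or one of its subagents:

```python
from langchain_core.tools import tool
from langgraph.config import get_stream_writer


@tool
def analyze_dataset(path: str) -> str:
    """Analyze a dataset, emitting custom progress events along the way."""
    writer = get_stream_writer()
    writer({"status": "loading", "path": path})
    # ... do the actual work here ...
    writer({"status": "done", "rows_processed": 10_000})
    return "analysis complete"


# Receive the custom events (with their namespaces) via stream_mode="custom".
for namespace, event in agent.stream(
    {"messages": [{"role": "user", "content": "Analyze data/sales.csv"}]},
    stream_mode="custom",
    subgraphs=True,
):
    print(namespace, event)
```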
## Stream multiple modes
Combine multiple stream modes to get a complete picture of agent execution:
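When `stream_mode` is a list, each event becomes a `(namespace, mode, payload)` triple. A sketch, reusing the `agent` from above:

```python
for namespace, mode, payload in agent.stream(
    {"messages": [{"role": "user", "content": "Plan and draft a blog post"}]},
    stream_mode=["updates", "messages", "custom"],
    subgraphs=True,
):
    source = "main agent" if not namespace else f"subagent {namespace[0]}"
    if mode == "updates":
        print(f"[{source}] finished: {list(payload)}")
    elif mode == "messages":
        token, metadata = payload
        if token.content:
            print(f"[{source}] token: {token.content}")
    elif mode == "custom":
        print(f"[{source}] custom event: {payload}")
```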
## Common patterns

### Track subagent lifecycle
Monitor when subagents start, run, and complete:
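One heuristic sketch: treat the first event from a new subagent namespace as "started", and an update from the main agent’s `tools` node (where `task` tool results land, per the namespace table) as the point where pending subagents have completed:

```python
active: set[str] = set()

for namespace, update in agent.stream(
    {"messages": [{"role": "user", "content": "Audit the project dependencies"}]},
    stream_mode="updates",
    subgraphs=True,
):
    if namespace:
        task_id = namespace[0]  # e.g. "tools:abc123"
        if task_id not in active:
            active.add(task_id)
            print(f"subagent started: {task_id}")
    elif "tools" in update:
        # The main agent's tools node returns once its pending task calls
        # (i.e. the subagents it spawned) have finished.
        for task_id in sorted(active):
            print(f"subagent completed: {task_id}")
        active.clear()
```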
## Related

- Subagents — Configure and use subagents with deep agents
- Frontend streaming — Build React UIs with `useStream` for deep agents
- LangChain streaming overview — General streaming concepts with LangChain agents