LangChain agents are built on LangGraph, so they support the same Event Streaming model with agent-focused projections for messages, tool calls, state, and custom updates. For most application and frontend use cases, use Event Streaming through stream_events(..., version="v3"). Event Streaming returns a run object with typed projections, so you can choose the view you need instead of parsing stream-mode tuples.
Interested in streaming Pregel modes such as updates, messages, or custom directly? See the Streaming page.
What you can stream
| Projection | Use |
|---|---|
| for event in run | Raw protocol events when you need exact arrival order. |
| run.messages | Model message streams, one per LLM call. |
| message.text | Text deltas and final text for a message. |
| message.reasoning | Reasoning deltas for models that expose reasoning content. |
| message.tool_calls | Tool-call argument chunks and finalized tool calls. |
| message.output | Final message object after the model call completes. |
| message.usage | Token usage metadata when the provider returns it. |
| run.values | Agent state snapshots. |
| run.output | Final agent state. |
| run.extensions | Custom transformer projections. |
| run.tool_calls | Tool execution lifecycle, inputs, output deltas, final output, and errors. |
run.messages yields message streams. Each message stream exposes .text, .reasoning, .tool_calls, .output, and .usage. Async projections can be iterated for live deltas or awaited for final values.
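The "iterate for deltas or await for the final value" dual behavior can be implemented with an object that is both an async iterable and an awaitable. The class below is a minimal stand-in illustrating that pattern, not LangChain's implementation; the name TextProjection and the sample deltas are assumptions for this sketch.

```python
# Stand-in showing an awaitable async-iterable projection:
# iterate it for live deltas, or await it for the final joined text.
# Illustrative only -- not LangChain's actual implementation.
import asyncio


class TextProjection:
    """Yields deltas under `async for`, resolves to the full text under `await`."""

    def __init__(self, deltas):
        self._deltas = list(deltas)

    def __aiter__(self):
        async def gen():
            for d in self._deltas:
                yield d
        return gen()

    def __await__(self):
        async def final():
            return "".join(self._deltas)
        return final().__await__()


async def main():
    # Iterate for live deltas.
    chunks = [d async for d in TextProjection(["Hel", "lo", "!"])]
    # Or await for the final value.
    final = await TextProjection(["Hel", "lo", "!"])
    return chunks, final


chunks, final = asyncio.run(main())
print(chunks, final)  # ['Hel', 'lo', '!'] Hello!
```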
Stream agent messages
Use run.messages when you want model output from each LLM call.
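The consumption shape looks like a nested loop: one message stream per LLM call, with text deltas inside each. The sketch below simulates that shape with stub objects; fake_run() and its sample data are assumptions standing in for a real agent run, with attribute names taken from the projection table above.

```python
# Sketch of consuming run.messages: outer loop per LLM call,
# inner loop over that message's text deltas. Stubbed data only.
import asyncio
from types import SimpleNamespace


async def _deltas(parts):
    for p in parts:
        yield p


def fake_run():
    # Stand-in for a real agent run: one message stream per LLM call.
    async def messages():
        yield SimpleNamespace(text=_deltas(["Hi", " there"]))
        yield SimpleNamespace(text=_deltas(["Bye"]))
    return SimpleNamespace(messages=messages())


async def main():
    texts = []
    async for message in fake_run().messages:  # one stream per LLM call
        buf = ""
        async for delta in message.text:       # live text deltas
            buf += delta
        texts.append(buf)
    return texts


print(asyncio.run(main()))  # ['Hi there', 'Bye']
```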
Stream tool calls
There are two useful tool-call projections: message.tool_calls streams tool-call argument chunks while the model is producing the tool call, and run.tool_calls streams the lifecycle of tool execution after the tool call starts.
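Because message.tool_calls emits argument chunks while the model is still generating, the arguments arrive as partial JSON that only parses once complete. The stub below illustrates that accumulation step; the chunk contents are made up for the example.

```python
# Sketch: accumulating streamed tool-call argument chunks
# (message.tool_calls-style) into finalized JSON arguments.
import asyncio
import json


async def arg_chunks():
    # Partial JSON fragments, as a model might emit them mid-generation.
    for chunk in ['{"city": "Par', 'is"}']:
        yield chunk


async def main():
    buf = ""
    async for chunk in arg_chunks():
        buf += chunk        # each chunk alone is not valid JSON
    return json.loads(buf)  # parse only once the stream is complete


print(asyncio.run(main()))  # {'city': 'Paris'}
```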
Stream state and final output
Use run.values for state snapshots and run.output for the final agent state.
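The difference between the two is iterate versus await: run.values yields a snapshot after each step, while run.output resolves once to the final state. The stub below mimics that shape; fake_run() and its state dicts are assumptions for illustration.

```python
# Sketch: run.values yields intermediate state snapshots;
# run.output is awaited once for the final state. Stubbed data only.
import asyncio
from types import SimpleNamespace


def fake_run():
    async def values():
        yield {"messages": 1}
        yield {"messages": 2}

    async def output():
        return {"messages": 2, "done": True}

    return SimpleNamespace(values=values(), output=output())


async def main():
    run = fake_run()
    snapshots = [v async for v in run.values]  # state after each step
    final = await run.output                   # final agent state
    return snapshots, final


snapshots, final = asyncio.run(main())
print(snapshots, final)
```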
Related
- Streaming cookbook shows runnable Event Streaming examples.
- LangChain Streaming covers lower-level Pregel stream modes.
- LangGraph Event Streaming explains the underlying graph streaming model.

