Use any React UI component library with @langchain/react
The useStream hook from @langchain/react is UI-agnostic. It returns plain reactive state that you wire up to any component library you already use. This page shows how two popular libraries, AI Elements and assistant-ui, integrate with useStream to give you a fully-featured chat UI with minimal custom code.
useStream handles everything related to the LangGraph connection: streaming messages, thread management, tool-call state, and reconnection. The library of your choice handles the visual layer. The integration always has the same shape: read reactive state (such as stream.messages and stream.isLoading) from the hook, convert it to the message format your components expect, and call stream.submit when the user sends a message.
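That shape is independent of any component library. As a minimal illustration, the adapter step is just a pure conversion from stream messages to whatever props your UI renders. The types below are simplified stand-ins for this sketch, not the actual @langchain/react types:

```typescript
// Simplified stand-in types for illustration only.
type StreamMessage = { id?: string; type: "human" | "ai" | "tool"; content: string };
type UiMessage = { role: "user" | "assistant"; text: string };

// The adapter: stream state in, UI-library props out.
// Tool messages are dropped here for brevity.
function toUiMessages(messages: StreamMessage[]): UiMessage[] {
  return messages
    .filter((m) => m.type === "human" || m.type === "ai")
    .map((m) => ({
      role: m.type === "human" ? "user" : "assistant",
      text: m.content,
    }));
}

const rows = toUiMessages([
  { type: "human", content: "Hi" },
  { type: "ai", content: "Hello!" },
]);
// → [{ role: "user", text: "Hi" }, { role: "assistant", text: "Hello!" }]
```

Both integrations below are variations on this conversion, differing only in the target message format.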
AI Elements is a composable, shadcn/ui-based component library purpose-built for AI chat interfaces. Components like Conversation, Message, Tool, Reasoning, and PromptInput are designed to be dropped directly into any React project.
Clone and run the full AI Elements example to see tool call rendering, reasoning display, streaming messages, and more in a working project.
AI Elements works with any React setup — Vite, Next.js, or Remix. Components ship as source files in your project, so you can customise them freely without forking a library.
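To make the wiring concrete, here is a rough sketch of a chat built from the AI Elements components named above. The import paths assume the default generated locations, and the exact props (for example the PromptInput submit handler) may differ in the components generated into your project, so treat this as a starting point rather than a verbatim recipe:

```tsx
import { useState } from "react";
import { useStream } from "@langchain/react";
// Paths assume the default AI Elements generation; adjust to your project.
import { Conversation, ConversationContent } from "@/components/ai-elements/conversation";
import { Message, MessageContent } from "@/components/ai-elements/message";
import {
  PromptInput,
  PromptInputTextarea,
  PromptInputSubmit,
} from "@/components/ai-elements/prompt-input";

export function Chat() {
  const stream = useStream({ apiUrl: "http://localhost:2024", assistantId: "agent" });
  const [input, setInput] = useState("");

  return (
    <>
      <Conversation>
        <ConversationContent>
          {stream.messages.map((m) => (
            <Message key={m.id} from={m.type === "human" ? "user" : "assistant"}>
              <MessageContent>
                {typeof m.content === "string" ? m.content : null}
              </MessageContent>
            </Message>
          ))}
        </ConversationContent>
      </Conversation>
      <PromptInput
        // The submit handler signature is an assumption; check your
        // generated prompt-input component.
        onSubmit={(e) => {
          e.preventDefault();
          stream.submit({ messages: [{ type: "human", content: input }] });
          setInput("");
        }}
      >
        <PromptInputTextarea value={input} onChange={(e) => setInput(e.target.value)} />
        <PromptInputSubmit disabled={stream.isLoading} />
      </PromptInput>
    </>
  );
}
```

Because the components live in your source tree, you can restructure this layout freely, for example adding the Tool and Reasoning components to render tool calls and model reasoning inline.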
assistant-ui is a headless React UI framework for AI chat. It provides a full runtime layer — thread management, message branching, attachment handling — that connects to useStream via the useExternalStoreRuntime adapter.
Clone and run the full assistant-ui example to see a Claude-style chat interface wired to a LangChain agent with useExternalStoreRuntime.
The useExternalStoreRuntime adapter bridges stream.messages into the assistant-ui runtime. Pass it to AssistantRuntimeProvider and then use any assistant-ui thread component:
```tsx
import { useCallback, useMemo } from "react";
import {
  AssistantRuntimeProvider,
  Thread,
  useExternalStoreRuntime,
  type AppendMessage,
  type ThreadMessageLike,
} from "@assistant-ui/react";
import { useStream } from "@langchain/react";

// Minimal conversion from LangChain messages to ThreadMessageLike.
// Tool and system messages, and non-string content, are omitted for brevity.
function toThreadMessages(
  messages: { id?: string; type: string; content: unknown }[],
): ThreadMessageLike[] {
  return messages
    .filter((m) => m.type === "human" || m.type === "ai")
    .map((m) => ({
      id: m.id,
      role: m.type === "human" ? ("user" as const) : ("assistant" as const),
      content: typeof m.content === "string" ? m.content : "",
    }));
}

export function Chat() {
  const stream = useStream({
    apiUrl: "http://localhost:2024",
    assistantId: "agent",
  });

  const onNew = useCallback(
    async (message: AppendMessage) => {
      const text = message.content
        .filter((c) => c.type === "text")
        .map((c) => c.text)
        .join("");
      await stream.submit({ messages: [{ type: "human", content: text }] });
    },
    [stream],
  );

  // Convert LangChain messages to assistant-ui's ThreadMessageLike format
  const messages = useMemo(() => toThreadMessages(stream.messages), [stream.messages]);

  const runtime = useExternalStoreRuntime<ThreadMessageLike>({
    messages,
    onNew,
    onCancel: () => stream.stop(),
    convertMessage: (m) => m,
  });

  return (
    <AssistantRuntimeProvider runtime={runtime}>
      <Thread />
    </AssistantRuntimeProvider>
  );
}
```
assistant-ui ships a full thread UI out of the box via <Thread />, including a message list, composer, and scroll management. Customise individual parts — messages, tool UIs, attachments — by overriding the component slots.
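For deeper customisation you can assemble the thread from assistant-ui's primitives instead of the prebuilt <Thread />. A sketch, assuming the ThreadPrimitive and MessagePrimitive exports from @assistant-ui/react (verify the slot names against the version you have installed); the components themselves are hypothetical examples:

```tsx
import { ThreadPrimitive, MessagePrimitive } from "@assistant-ui/react";

// Hypothetical custom message components; the primitives are the real
// building blocks, the structure and class names here are examples.
function MyUserMessage() {
  return (
    <MessagePrimitive.Root className="my-user-message">
      <MessagePrimitive.Content />
    </MessagePrimitive.Root>
  );
}

function MyAssistantMessage() {
  return (
    <MessagePrimitive.Root className="my-assistant-message">
      <MessagePrimitive.Content />
    </MessagePrimitive.Root>
  );
}

export function MyThread() {
  return (
    <ThreadPrimitive.Root>
      <ThreadPrimitive.Viewport>
        <ThreadPrimitive.Messages
          components={{
            UserMessage: MyUserMessage,
            AssistantMessage: MyAssistantMessage,
          }}
        />
      </ThreadPrimitive.Viewport>
    </ThreadPrimitive.Root>
  );
}
```

Render MyThread inside the same AssistantRuntimeProvider shown above; the runtime wiring is unchanged.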