Note: this page covers Grok models provided by xAI, not to be confused with Groq, a separate AI hardware and software company. For Groq, see the Groq provider page.
xAI offers an API for interacting with its Grok models. This page shows how to use LangChain to interact with them.

Installation

pip install -U langchain-xai

Environment

To use xAI, you’ll need to create an API key. The API key can be passed in as an init param xai_api_key or set as environment variable XAI_API_KEY.
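For example, the key can be set from Python before constructing the client. This is a minimal sketch; the placeholder key value is illustrative, not a real credential:

```python
import os

# Set the key for the current process only if it isn't already configured,
# so an existing XAI_API_KEY from the shell environment takes precedence.
os.environ.setdefault("XAI_API_KEY", "your-api-key-here")
```

Alternatively, export XAI_API_KEY in your shell profile so it is available to every process without hardcoding it in source files.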

Example

See the ChatXAI docs for details and supported features.
# Querying chat models with xAI

from langchain_xai import ChatXAI

chat = ChatXAI(
    # xai_api_key="YOUR_API_KEY",
    model="grok-4",
)

# Stream the response back from the model
for chunk in chat.stream("Tell me fun things to do in NYC"):
    print(chunk.content, end="", flush=True)

# If you don't want streaming, use the invoke method instead:
# chat.invoke("Tell me fun things to do in NYC")
