Deploy your first application to LangSmith Cloud using the LangGraph CLI.
This quickstart shows you how to deploy an application to LangSmith Cloud using the `langgraph deploy` command.
For a comprehensive Cloud deployment guide including GitHub-based deployments and all configuration options, refer to the Cloud deployment setup guide.
Run the deploy command from your project directory:
```shell
langgraph deploy
```
This creates a `dev` deployment named after your project directory by default. Use `--name` or `--deployment-type prod` to override.
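For example, to create a production deployment under an explicit name (the name `my-agent` here is illustrative; the flags are the ones described above):

```shell
# Deploy as a production deployment named "my-agent"
langgraph deploy --name my-agent --deployment-type prod
```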
To update an existing deployment after making code changes, re-run `langgraph deploy`. It finds the existing deployment by name and updates it in place.
You can also use `langgraph deploy list` to see all deployments, `langgraph deploy logs` to tail runtime logs, and `langgraph deploy delete <ID>` to remove a deployment. For details, refer to the CLI reference.
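Put together, the management commands described above look like this (`<ID>` is a placeholder for a deployment ID, which you can find in the `list` output):

```shell
langgraph deploy list         # show all deployments
langgraph deploy logs         # tail runtime logs for a deployment
langgraph deploy delete <ID>  # remove a deployment by ID
```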
Studio is an interactive agent IDE connected directly to your deployment. Use it to send messages, inspect intermediate state at each node, edit state mid-run, and replay from any prior checkpoint without writing code. Once the deployment is ready:
Go to LangSmith and select Deployments in the left sidebar.
Select your deployment to view its details.
Click Studio in the top right corner to open Studio.
Copy the API URL from the deployment details view, then use it to call your application:
Python SDK (Async)
Python SDK (Sync)
JavaScript SDK
REST API
Install the LangGraph Python SDK:
```shell
pip install langgraph-sdk
```
Send a message to the assistant (threadless run):
```python
from langgraph_sdk import get_client

client = get_client(url="your-deployment-url", api_key="your-langsmith-api-key")

async for chunk in client.runs.stream(
    None,  # Threadless run
    "agent",  # Name of assistant. Defined in langgraph.json.
    input={
        "messages": [{
            "role": "human",
            "content": "What is LangGraph?",
        }],
    },
    stream_mode="updates",
):
    print(f"Receiving new event of type: {chunk.event}...")
    print(chunk.data)
    print("\n\n")
```
Install the LangGraph Python SDK:
```shell
pip install langgraph-sdk
```
Send a message to the assistant (threadless run):
```python
from langgraph_sdk import get_sync_client

client = get_sync_client(url="your-deployment-url", api_key="your-langsmith-api-key")

for chunk in client.runs.stream(
    None,  # Threadless run
    "agent",  # Name of assistant. Defined in langgraph.json.
    input={
        "messages": [{
            "role": "human",
            "content": "What is LangGraph?",
        }],
    },
    stream_mode="updates",
):
    print(f"Receiving new event of type: {chunk.event}...")
    print(chunk.data)
    print("\n\n")
```
Install the LangGraph JS SDK:
```shell
npm install @langchain/langgraph-sdk
```
Send a message to the assistant (threadless run):
```javascript
const { Client } = await import("@langchain/langgraph-sdk");

const client = new Client({
  apiUrl: "your-deployment-url",
  apiKey: "your-langsmith-api-key",
});

const streamResponse = client.runs.stream(
  null, // Threadless run
  "agent", // Assistant ID
  {
    input: {
      messages: [{ role: "user", content: "What is LangGraph?" }],
    },
    streamMode: "messages",
  }
);

for await (const chunk of streamResponse) {
  console.log(`Receiving new event of type: ${chunk.event}...`);
  console.log(JSON.stringify(chunk.data));
  console.log("\n\n");
}
```
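The REST tab's content is not shown above. As a rough sketch, assuming the server exposes a `/runs/stream` endpoint accepting `assistant_id`, `input`, and `stream_mode` in the request body and an `X-Api-Key` header for authentication (check the API reference for the exact fields), a threadless streaming run would look something like:

```shell
# Stream a threadless run over HTTP (endpoint and field names assumed; verify
# against the API reference for your deployment)
curl -s "your-deployment-url/runs/stream" \
  --header "Content-Type: application/json" \
  --header "X-Api-Key: your-langsmith-api-key" \
  --data '{
    "assistant_id": "agent",
    "input": {
      "messages": [{ "role": "human", "content": "What is LangGraph?" }]
    },
    "stream_mode": "updates"
  }'
```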