You are viewing the v1 docs for LangChain, which is currently under active development.
Welcome to LangChain! This quickstart will take you from zero to a fully functional AI agent in just a few minutes. We’ll start simple and gradually build up to something more sophisticated.
Let’s begin with the absolute basics - creating a simple agent that can answer questions and use tools:
```python
from langchain.agents import create_agent

def get_weather(city: str) -> str:
    """Get weather for a given city."""
    return f"It's always sunny in {city}!"

agent = create_agent(
    model="anthropic:claude-3-7-sonnet-latest",
    tools=[get_weather],
    prompt="You are a helpful assistant",
)

# Run the agent
agent.invoke(
    {"messages": [{"role": "user", "content": "what is the weather in sf"}]}
)
```
Now let’s create something more practical. We’ll build a weather forecasting agent that demonstrates the key concepts you’ll use in production:
- Detailed system prompts for better agent behavior
- Real-world tools that integrate with external data
- Model configuration for consistent responses
- Structured output for predictable results
- Conversational memory for chat-like interactions
Let’s walk through each step:
1. Define the system prompt

The system prompt is your agent's personality and instructions. Make it specific and actionable:
```python
system_prompt = """You are an expert weather forecaster, who speaks in puns.

You have access to two tools:

- get_weather_for_location: use this to get the weather for a specific location
- get_user_location: use this to get the user's location

If a user asks you for the weather, make sure you know the location. If you can
tell from the question that they mean wherever they are, use the
get_user_location tool to find their location."""
```
2. Create tools

Tools are functions your agent can call, and they should be well-documented. Tools often need to connect to external systems, and they rely on runtime configuration to do so. Notice how the get_user_location tool below does exactly that:
```python
from langchain_core.runnables import RunnableConfig
from langchain_core.tools import tool

def get_weather_for_location(city: str) -> str:
    """Get weather for a given city."""
    return f"It's always sunny in {city}!"

USER_LOCATION = {
    "1": "Florida",
    "2": "SF",
}

@tool
def get_user_location(config: RunnableConfig) -> str:
    """Retrieve user information based on user ID."""
    user_id = config["context"].get("user_id")
    return USER_LOCATION[user_id]
```
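To see the lookup mechanics in isolation, here is a plain-Python sketch of the pattern (no LangChain required): a config dict carries a context payload, and the tool resolves the caller's identity from it at runtime. The dict shapes mirror the tool above but are illustrative only, not the library's API.

```python
USER_LOCATION = {"1": "Florida", "2": "SF"}

def resolve_user_location(config: dict) -> str:
    """Look up the caller's location from the runtime config's context payload."""
    user_id = config["context"].get("user_id")
    return USER_LOCATION[user_id]

resolve_user_location({"context": {"user_id": "2"}})  # -> "SF"
```

The point is that the tool body never hard-codes a user: the same function serves different callers depending on what the runtime passes in.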
3. Configure your model
Set up your language model with the right parameters for your use case:
```python
from langchain.chat_models import init_chat_model

model = init_chat_model(
    "anthropic:claude-3-7-sonnet-latest",
    temperature=0,
)
```
4. Define response format

Structured output ensures your agent returns data in a predictable format. Here, we use a Python dataclass to define the response schema:
```python
from dataclasses import dataclass

@dataclass
class WeatherResponse:
    conditions: str
    punny_response: str
```
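A dataclass gives you typed, named fields instead of a free-form dict, and the standard library's asdict converts an instance back to a plain dictionary when you need to serialize it. A quick sketch (the field values here are made up for illustration):

```python
from dataclasses import asdict, dataclass

@dataclass
class WeatherResponse:
    conditions: str
    punny_response: str

# Construct a typed response, then convert it to a plain dict for serialization.
resp = WeatherResponse(conditions="sunny", punny_response="It's a ray-diant day!")
asdict(resp)  # -> {'conditions': 'sunny', 'punny_response': "It's a ray-diant day!"}
```

Because the fields are declared up front, a missing or misspelled key fails loudly at construction time rather than surfacing later as a silent KeyError.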
5. Add memory

Enable your agent to remember conversation history:
```python
from langgraph.checkpoint.memory import InMemorySaver

checkpointer = InMemorySaver()
```
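Conceptually, a checkpointer stores conversation state keyed by a thread ID, so each conversation resumes where it left off while separate threads stay isolated. This toy class is a hypothetical, stdlib-only illustration of that idea, not how InMemorySaver is implemented:

```python
class ToyCheckpointer:
    """Toy stand-in for a checkpointer: per-thread message history."""

    def __init__(self) -> None:
        self._threads: dict[str, list[str]] = {}

    def append(self, thread_id: str, message: str) -> None:
        # Each thread accumulates its own ordered history.
        self._threads.setdefault(thread_id, []).append(message)

    def history(self, thread_id: str) -> list[str]:
        # Unknown threads start empty -- conversations never bleed into each other.
        return self._threads.get(thread_id, [])

cp = ToyCheckpointer()
cp.append("thread-1", "what is the weather in sf")
cp.append("thread-1", "It's always sunny in SF!")
cp.history("thread-1")  # both messages, in order
cp.history("thread-2")  # empty list: a separate conversation
```

The real checkpointer does much more (it persists full graph state, not just messages), but the thread-scoped storage is the core idea behind "memory" here.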