LangChain is driven by a few core beliefs: it exists to be the easiest place to get started building with LLMs, while also being flexible and robust enough to bring to production.
There are two core focuses.
We want to make it possible for developers to build with whatever the best model is at the time.
Different providers expose different APIs, with different model parameters and different message formats.
Standardizing these model inputs and outputs has always been a core focus, making it easy for developers to switch to the newest state-of-the-art model and avoid lock-in.
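The idea can be sketched in plain Python. This is a toy adapter, not LangChain's actual implementation; the provider names and payload shapes are hypothetical, purely to illustrate why a standard message-in/message-out interface avoids lock-in:

```python
# Two hypothetical providers with different native APIs, hidden behind one
# standard invoke(model, messages) interface. All names are illustrative.

def provider_a_complete(prompt: str) -> str:
    # Pretend API: takes a flat string, returns a string.
    return f"[a] {prompt}"

def provider_b_chat(messages: list[dict]) -> dict:
    # Pretend API: takes role/content dicts, returns a message dict.
    return {"role": "assistant", "content": f"[b] {messages[-1]['content']}"}

def invoke(model: str, messages: list[dict]) -> dict:
    """One standard call shape: a list of messages in, one message out."""
    if model == "provider-a":
        # Adapt the standard input down to the string-based API.
        prompt = "\n".join(m["content"] for m in messages)
        return {"role": "assistant", "content": provider_a_complete(prompt)}
    if model == "provider-b":
        return provider_b_chat(messages)
    raise ValueError(f"unknown model: {model}")

msgs = [{"role": "user", "content": "hello"}]
print(invoke("provider-a", msgs)["content"])  # [a] hello
print(invoke("provider-b", msgs)["content"])  # [b] hello
```

Swapping models becomes a one-string change at the call site, which is the property the standardization work is after.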
We’ve also consistently focused on more than just calling the models - we want to make it easy to use these models to orchestrate more complex flows that interact with other data and computation, and have tried to make that as simple as possible.
This includes other components you may want to integrate with LLMs: we strive to make it easy to define tools that LLMs can use dynamically, and to help with parsing and accessing unstructured data.
It also includes the common orchestration patterns for using LLMs.
Yet, given the constant rate of change in the field, LangChain has also evolved over time.
Below is a brief timeline of how LangChain has changed over the years, evolving along with what it meant to build with LLMs.
October 24th, 2022: A month before ChatGPT, LangChain was launched as a Python package.
It consisted of two main things: LLM abstractions, and high level “chains” for common use cases.
“Chains” were predetermined steps of computation to run. For example - RAG: run a retrieval step, then run a generation step.
The name LangChain comes from “Language” (like Language models) and “Chains”.
December 2022: The first general purpose agents were added to LangChain.
They were based on the ReAct paper (ReAct standing for Reasoning and Acting).
They used LLMs to generate JSON that represented tool calls, and then parsed that JSON to determine what tools to call.
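The mechanics of that era can be sketched in a few lines. The tool names, the JSON schema, and the "Action:" convention below are illustrative stand-ins, not the exact prompt format those early agents used:

```python
import json

# Registry of tools the agent may call (hypothetical tools, for illustration).
TOOLS = {
    "search": lambda q: f"results for {q!r}",
    "calculator": lambda expr: str(eval(expr)),  # demo only; never eval untrusted input
}

# Pretend the LLM emitted this text, embedding a JSON action in its output.
llm_output = 'Thought: I should add the numbers.\nAction: {"tool": "calculator", "tool_input": "2 + 3"}'

def run_action(text: str) -> str:
    # Parse the JSON blob after "Action:" and dispatch to the named tool --
    # the parse-then-dispatch step at the heart of early ReAct-style agents.
    _, _, action = text.partition("Action:")
    call = json.loads(action)
    return TOOLS[call["tool"]](call["tool_input"])

print(run_action(llm_output))  # 5
```

The fragility of this approach (one malformed JSON blob breaks the loop) is part of why structured function calling, described below in the timeline, became the preferred mechanism.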
January 2023: OpenAI releases the "Chat Completions" API.
Previously, models took in a string and returned a string.
With Chat Completions, they evolved to take in a list of messages and return a message.
Other model providers followed suit, and LangChain updated to work with lists of messages.
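The shape change, sketched with OpenAI-style role dicts (the `flatten` helper is hypothetical, showing how a message list maps back onto the old single-string style):

```python
# Chat-style input: a list of role-tagged messages rather than one string.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "hello"},
]

def flatten(messages: list[dict]) -> str:
    """Render a message list back into an old-style single prompt string."""
    return "\n".join(f"{m['role']}: {m['content']}" for m in messages)

print(flatten(messages))
# system: You are a helpful assistant.
# user: hello
```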
January 2023: LangChain releases a JavaScript version.
The bet: LLMs and agents will change how applications are built, and JavaScript is the language of application developers.
February 2023: LangChain Inc was formed as a commercial company around the open source LangChain project.
The main goal was to “make intelligent agents ubiquitous”.
The team recognized that while LangChain was a key piece (it made getting started dead simple), other components would be needed as well.
Spring 2023: OpenAI releases “function calling” in their API.
This allowed the API to explicitly generate payloads that represented tool calls.
Other model providers followed suit, and LangChain updated to use this as the preferred method for doing tool calling (rather than parsing JSON).
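The difference in mechanics, sketched with an approximately OpenAI-shaped response (field names are illustrative; check the provider's own docs for the exact schema):

```python
import json

# With function calling, the tool call arrives as a structured field on the
# response, instead of JSON buried somewhere in free text.
response = {
    "role": "assistant",
    "content": None,
    "tool_calls": [
        {"id": "call_1", "function": {"name": "get_weather", "arguments": '{"city": "Paris"}'}}
    ],
}

TOOLS = {"get_weather": lambda city: f"sunny in {city}"}  # hypothetical tool

def dispatch(response: dict) -> list[str]:
    """Execute every tool call the model requested, in order."""
    results = []
    for call in response.get("tool_calls", []):
        fn = call["function"]
        args = json.loads(fn["arguments"])  # arguments still arrive as a JSON string
        results.append(TOOLS[fn["name"]](**args))
    return results

print(dispatch(response))  # ['sunny in Paris']
```

No regex over prose, no guessing where the JSON starts: the structured field is what made tool calling reliable enough to become the default.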
Summer 2023: LangSmith is released as a closed-source platform by LangChain Inc, providing observability and evals.
The main issue with building agents is getting them to be reliable, and LangSmith was built to solve that need.
LangChain updated to integrate seamlessly with LangSmith (you only need to set one environment variable).
January 2024: LangChain releases version 0.1, its first non-0.0.x release.
The industry is maturing from prototypes to production, and LangChain accordingly increases its focus on stability.
February 2024: LangGraph is released as an open source library.
If you remember, the original LangChain contained two things: LLM abstractions, and high-level interfaces for getting started with common applications.
What was missing was a low level orchestration layer allowing developers to control the exact flow of their agent.
LangGraph was exactly that.
LangGraph also built in functionality that lessons from LangChain showed was needed: streaming, durable execution, short-term memory, human-in-the-loop, and more.
Summer 2024: LangChain has over 700 different integrations in it.
These are all in the main LangChain package, which has several downsides (package size, bad maintainer experience, dependency conflicts).
LangChain splits all integrations out of the core package, either into their own standalone packages (for the core integrations) or into langchain-community (for the long tail).
Fall 2024: LangGraph becomes the preferred way to build any application (chain or agent) that involves more than a single LLM call, replacing the LangChain chains/agents.
Those chains/agents were higher level, and we found that as developers tried to improve the reliability of their applications, they needed more control than the high-level interfaces provided.
LangGraph provided that low-level flexibility.
Most chains and agents were marked as deprecated in LangChain, with guides on how to migrate them to LangGraph.
One high-level abstraction was created in LangGraph: an agent abstraction, built on top of low-level LangGraph and sharing the same interface as the ReAct agents from LangChain.
Spring 2025: Model APIs become more multi-modal.
They start to accept files, images, videos, and more.
LangChain updated its message format accordingly, allowing developers to specify these multi-modal inputs in a standard way.
Fall 2025: LangChain releases 1.0 with two major changes.
If you are still using the old LangChain chains/agents and do NOT want to upgrade (note: we recommend you do), you can continue using them by installing the langchain-legacy package.