Contextual AI’s Instruction-Following Reranker is the world’s first reranker designed to follow custom instructions about how to prioritize documents based on specific criteria like recency, source, and metadata. With superior performance on the BEIR benchmark (scoring 61.2 and outperforming competitors by significant margins), it delivers unprecedented control and accuracy for enterprise RAG applications.
Key capabilities
- Instruction Following: Dynamically control document ranking through natural language commands
- Conflict Resolution: Intelligently handle contradictory information from multiple knowledge sources
- Superior Accuracy: Achieve state-of-the-art performance on industry benchmarks
- Seamless Integration: Drop-in replacement for existing rerankers in your RAG pipeline
This integration is built on the contextual-client Python SDK. Learn more in the contextual-client-python repository.
Overview
This integration invokes Contextual AI’s Instruction-Following Reranker.
Integration details
| Class | Package | Local | Serializable | JS support |
|---|---|---|---|---|
| ContextualRerank | langchain-contextual | ❌ | beta | ❌ |
Setup
To access Contextual’s reranker models you’ll need to create a Contextual AI account, get an API key, and install the langchain-contextual integration package.
Credentials
Head to app.contextual.ai to sign up for Contextual AI and generate an API key. Once you’ve done this, set the CONTEXTUAL_AI_API_KEY environment variable:
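For example, a minimal sketch of setting the key for the current process (the "your-api-key" value is a placeholder; LangChain docs often use getpass for interactive entry instead):

```python
import os

# Set the key only if it is not already present in the environment.
# Replace "your-api-key" with the key generated at app.contextual.ai.
os.environ.setdefault("CONTEXTUAL_AI_API_KEY", "your-api-key")
```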
Installation
The LangChain Contextual integration lives in the langchain-contextual package:
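A typical install command (pip shown; substitute your preferred package manager):

```shell
pip install -qU langchain-contextual
```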
Instantiation
The Contextual Reranker arguments are:
| Parameter | Type | Description |
|---|---|---|
| documents | list[Document] | A sequence of documents to rerank. Any metadata contained in the documents will also be used for reranking. |
| query | str | The query to use for reranking. |
| model | str | The version of the reranker to use. Currently, only “ctxl-rerank-en-v1-instruct” is available. |
| top_n | Optional[int] | The number of results to return. If None returns all results. Defaults to self.top_n. |
| instruction | Optional[str] | The instruction to be used for the reranker. |
| callbacks | Optional[Callbacks] | Callbacks to run during the compression process. |
Usage
First, we will set up the global variables and examples we’ll use, and instantiate our reranker client.
Use within a chain
Examples coming soon.
API reference
For detailed documentation of all ContextualRerank features and configurations, head to the GitHub repository: github.com/ContextualAI/langchain-contextual

