Overview

This guide will help you get started with the AzionRetriever. For detailed documentation of all AzionRetriever features and configurations, head to the API reference.

Integration details

| Retriever | Self-host | Cloud offering | Package | Py support |
| :--- | :---: | :---: | :---: | :---: |
| AzionRetriever | | | @langchain/community | |

Setup

To use the AzionRetriever, you need to set the AZION_TOKEN environment variable.
process.env.AZION_TOKEN = "your-api-key";
If you are using OpenAI embeddings for this guide, you’ll need to set your OpenAI key as well:
process.env.OPENAI_API_KEY = "YOUR_API_KEY";
If you want to get automated tracing from individual queries, you can also set your LangSmith API key by uncommenting below:
// process.env.LANGSMITH_API_KEY = "<YOUR API KEY HERE>";
// process.env.LANGSMITH_TRACING = "true";

Installation

This retriever lives in the @langchain/community package and is exported from the @langchain/community/retrievers/azion_edgesql entrypoint:
import IntegrationInstallTooltip from "@mdx_components/integration_install_tooltip.mdx";
<IntegrationInstallTooltip></IntegrationInstallTooltip>

<Npm2Yarn>
  azion @langchain/openai @langchain/community
</Npm2Yarn>

Instantiation

Now we can instantiate our retriever:
import { AzionRetriever } from "@langchain/community/retrievers/azion_edgesql";
import { ChatOpenAI, OpenAIEmbeddings } from "@langchain/openai";

const embeddingModel = new OpenAIEmbeddings({
  model: "text-embedding-3-small",
});

const chatModel = new ChatOpenAI({
  model: "gpt-4o-mini",
  apiKey: process.env.OPENAI_API_KEY,
});

const retriever = new AzionRetriever(embeddingModel, {
  dbName: "langchain", // name of the Edge SQL database
  vectorTable: "documents", // table where the vector embeddings are stored
  ftsTable: "documents_fts", // table where the full-text search (FTS) index is stored
  searchType: "hybrid", // search type to use for the retriever
  ftsK: 2, // number of results to return from the FTS index
  similarityK: 2, // number of results to return from the vector index
  metadataItems: ["language", "topic"], // metadata columns to include in the results
  filters: [{ operator: "=", column: "language", value: "en" }], // filters applied to the search
  entityExtractor: chatModel, // chat model used to extract entities for the FTS query
});
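If you only need vector similarity results, you can switch to a similarity-only configuration by changing searchType. Here is a minimal sketch reusing the constructor shape shown above; whether ftsTable may be omitted in this mode is an assumption to verify against the API reference:

// A similarity-only variant (sketch), reusing the same constructor shape as above.
const similarityRetriever = new AzionRetriever(embeddingModel, {
  dbName: "langchain",
  vectorTable: "documents",
  ftsTable: "documents_fts", // kept here in case the constructor still expects it (assumption)
  searchType: "similarity", // vector search only, no FTS results
  similarityK: 3, // return the top 3 vector matches
});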

Usage

const query = "Australia";

await retriever.invoke(query);
[
  Document {
    pageContent: "Australia's indigenous people have inhabited the continent for over 65,000 years",
    metadata: { language: 'en', topic: 'history', searchtype: 'similarity' },
    id: '3'
  },
  Document {
    pageContent: 'Australia is a leader in solar energy adoption and renewable technology',
    metadata: { language: 'en', topic: 'technology', searchtype: 'similarity' },
    id: '5'
  },
  Document {
    pageContent: "Australia's tech sector is rapidly growing with innovation hubs in major cities",
    metadata: { language: 'en', topic: 'technology', searchtype: 'fts' },
    id: '7'
  }
]
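
Each result is a standard LangChain Document, so you can post-process the hits with plain JavaScript. For example, labeling each page content with the topic and searchtype metadata returned above:

const docs = await retriever.invoke("Australia");

// Each Document exposes pageContent plus the metadata columns we requested.
const labeled = docs.map(
  (doc) => `${doc.metadata.topic} [${doc.metadata.searchtype}]: ${doc.pageContent}`
);
console.log(labeled.join("\n"));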

Use within a chain

Like other retrievers, the AzionRetriever can be incorporated into LLM applications via chains. We will need an LLM or chat model:
<ChatModelTabs customVarName="llm" />
// @lc-docs-hide-cell

import { ChatOpenAI } from "@langchain/openai";

const llm = new ChatOpenAI({
  model: "gpt-4o-mini",
  temperature: 0,
});
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { RunnablePassthrough, RunnableSequence } from "@langchain/core/runnables";
import { StringOutputParser } from "@langchain/core/output_parsers";

import type { Document } from "@langchain/core/documents";

const prompt = ChatPromptTemplate.fromTemplate(`
Answer the question based only on the context provided.

Context: {context}

Question: {question}`);

const formatDocs = (docs: Document[]) => {
  return docs.map((doc) => doc.pageContent).join("\n\n");
}

// See https://js.langchain.com/docs/tutorials/rag
const ragChain = RunnableSequence.from([
  {
    context: retriever.pipe(formatDocs),
    question: new RunnablePassthrough(),
  },
  prompt,
  llm,
  new StringOutputParser(),
]);
await ragChain.invoke("Paris");
The context mentions that the 2024 Olympics are in Paris.
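
Because the chain ends with a StringOutputParser, you can also stream the answer token by token via the standard Runnable streaming interface:

// Stream the generated answer instead of waiting for the complete string.
const stream = await ragChain.stream("Paris");
for await (const chunk of stream) {
  process.stdout.write(chunk);
}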

API reference

For detailed documentation of all AzionRetriever features and configurations, head to the API reference.