This guide will help you get started with the AzionRetriever. For detailed documentation of all AzionRetriever features and configurations, head to the API reference.
```typescript
import { AzionRetriever } from "@langchain/community/retrievers/azion_edgesql";
import { OpenAIEmbeddings, ChatOpenAI } from "@langchain/openai";

const embeddingModel = new OpenAIEmbeddings({
  model: "text-embedding-3-small",
});

const chatModel = new ChatOpenAI({
  model: "gpt-4o-mini",
  apiKey: process.env.OPENAI_API_KEY,
});

const retriever = new AzionRetriever(embeddingModel, {
  dbName: "langchain",
  vectorTable: "documents", // table where the vector embeddings are stored
  ftsTable: "documents_fts", // table where the FTS index is stored
  searchType: "hybrid", // search type to use for the retriever
  ftsK: 2, // number of results to return from the FTS index
  similarityK: 2, // number of results to return from the vector index
  metadataItems: ["language", "topic"],
  filters: [{ operator: "=", column: "language", value: "en" }],
  entityExtractor: chatModel,
});
```
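The documents shown below are returned by calling the retriever's `invoke` method. A minimal usage sketch (the query string here is illustrative, and valid Azion and OpenAI credentials are assumed to be present in the environment):

```typescript
// Query the retriever; with searchType "hybrid", results are drawn
// from both the FTS index and the vector index, as reflected in each
// document's `searchtype` metadata field.
const documents = await retriever.invoke("Australia");
console.log(documents);
```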
```text
[
  Document {
    pageContent: "Australia's indigenous people have inhabited the continent for over 65,000 years",
    metadata: { language: 'en', topic: 'history', searchtype: 'similarity' },
    id: '3'
  },
  Document {
    pageContent: 'Australia is a leader in solar energy adoption and renewable technology',
    metadata: { language: 'en', topic: 'technology', searchtype: 'similarity' },
    id: '5'
  },
  Document {
    pageContent: "Australia's tech sector is rapidly growing with innovation hubs in major cities",
    metadata: { language: 'en', topic: 'technology', searchtype: 'fts' },
    id: '7'
  }
]
```
Like other retrievers, AzionRetriever can be incorporated into LLM applications via chains.

We will need an LLM or chat model:
<ChatModelTabs customVarName="llm" />
```typescript
// @lc-docs-hide-cell
import { ChatOpenAI } from "@langchain/openai";

const llm = new ChatOpenAI({
  model: "gpt-4o-mini",
  temperature: 0,
});
```
```typescript
import { ChatPromptTemplate } from "@langchain/core/prompts";
import {
  RunnablePassthrough,
  RunnableSequence,
} from "@langchain/core/runnables";
import { StringOutputParser } from "@langchain/core/output_parsers";
import type { Document } from "@langchain/core/documents";

const prompt = ChatPromptTemplate.fromTemplate(`Answer the question based only on the context provided.

Context: {context}

Question: {question}`);

const formatDocs = (docs: Document[]) => {
  return docs.map((doc) => doc.pageContent).join("\n\n");
};

// See https://js.langchain.com/docs/tutorials/rag
const ragChain = RunnableSequence.from([
  {
    context: retriever.pipe(formatDocs),
    question: new RunnablePassthrough(),
  },
  prompt,
  llm,
  new StringOutputParser(),
]);
```
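The `formatDocs` helper only concatenates the retrieved page contents into a single context string. A self-contained sketch of that behavior, using a simplified `Doc` type in place of LangChain's `Document`:

```typescript
// Simplified stand-in for LangChain's Document type.
type Doc = { pageContent: string };

// Join each document's pageContent with a blank line, producing the
// single context string that is injected into the prompt.
const formatDocs = (docs: Doc[]): string =>
  docs.map((doc) => doc.pageContent).join("\n\n");

// Two toy documents become one context string separated by a blank line.
console.log(formatDocs([{ pageContent: "A" }, { pageContent: "B" }]));
```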
```typescript
await ragChain.invoke("Paris");
```
```text
The context mentions that the 2024 Olympics are in Paris.
```