👉 Embeddings Included
Vectara uses its own embeddings under the hood, so you don't have to provide any yourself or call another service to obtain them. This also means that if you provide your own embeddings, they are ignored (a no-op).
Setup
You'll need to:
- Create a free Vectara account
- Create a corpus to store your data
- Create an API key with QueryService and IndexService access so you can access this corpus
Add these to your .env file or provide them as args to connect LangChain to your Vectara corpus. To query across multiple corpora, specify their IDs as a comma-separated list:
VECTARA_CORPUS_ID=3,8,9,43
For indexing multiple corpora, you’ll need to create a separate VectaraStore instance for each corpus.
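With the account, corpus, and API key in place, connecting can be sketched as below. This is a minimal sketch assuming the `langchain_community` package; `VECTARA_CUSTOMER_ID` and `VECTARA_API_KEY` are the companion environment variables to `VECTARA_CORPUS_ID` shown above.

```python
import os

def get_vectara_store():
    """Sketch: build a Vectara vector store from environment variables.

    Assumes `langchain_community` is installed. No embeddings are passed:
    Vectara computes them server-side.
    """
    from langchain_community.vectorstores import Vectara

    return Vectara(
        vectara_customer_id=os.environ["VECTARA_CUSTOMER_ID"],
        vectara_corpus_id=os.environ["VECTARA_CORPUS_ID"],  # e.g. "3,8,9,43"
        vectara_api_key=os.environ["VECTARA_API_KEY"],
    )
```

Calling `get_vectara_store()` then gives you a vector store ready for indexing and querying.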
Usage
lambda is a parameter related to Vectara's hybrid search capability, providing a tradeoff between neural search and boolean/exact match, as described here. We recommend a default value of 0.025, while providing a way for advanced users to customize it if needed.
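As a sketch of how lambda enters a query, the helper below passes it via the `lambda_val` keyword used by LangChain's Vectara integration (the wrapper function and query text are illustrative):

```python
def hybrid_search(store, query: str, lmbda: float = 0.025):
    """Sketch: run a Vectara similarity search with an explicit lambda.

    lambda_val=0.0 means purely neural retrieval; larger values give more
    weight to boolean/exact keyword matching. 0.025 is the recommended
    default for most use cases.
    """
    return store.similarity_search(query, k=5, lambda_val=lmbda)
```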
APIs
Vectara's LangChain vector store consumes Vectara's core APIs:
- Indexing API for storing documents in a Vectara corpus.
- Search API for querying this data. This API supports hybrid search.
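The two APIs map onto two vector-store calls: `add_texts` goes through the Indexing API, and `similarity_search` goes through the hybrid Search API. A hedged sketch (the wrapper function, texts, and query are illustrative):

```python
def index_and_query(store, texts, query):
    """Sketch: store documents via the Indexing API, then retrieve
    via the (hybrid) Search API."""
    store.add_texts(texts)            # Indexing API
    return store.similarity_search(query)  # Search API
```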
Related
- Vector store conceptual guide
- Vector store how-to guides