langchain-localai is a third-party integration package for LocalAI. It provides a simple way to use LocalAI embedding services in LangChain. The source code is available on GitHub.
Let’s load the LocalAI Embedding class with first-generation models (e.g. text-search-ada-doc-001/text-search-ada-query-001). Note: these models are not recommended - see here
```python
from langchain_community.embeddings import LocalAIEmbeddings
import os

# if you are behind an explicit proxy, you can use the OPENAI_PROXY
# environment variable to pass through
os.environ["OPENAI_PROXY"] = "http://proxy.yourcompany.com:8080"
```
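Beyond the proxy setting, a minimal end-to-end sketch might look like the following. The endpoint URL (`http://localhost:8080`) and model name (`text-embedding-ada-002`) are assumptions here - substitute the address and model that your LocalAI instance actually serves.

```python
def make_embeddings():
    # Import inside the function so the sketch reads even without the
    # package installed; requires `pip install langchain-community`.
    from langchain_community.embeddings import LocalAIEmbeddings

    return LocalAIEmbeddings(
        openai_api_base="http://localhost:8080",  # assumed LocalAI endpoint
        model="text-embedding-ada-002",           # assumed model name in your LocalAI config
    )


def demo():
    # Requires a running LocalAI server at the endpoint above.
    embeddings = make_embeddings()
    text = "This is a test document."
    query_vec = embeddings.embed_query(text)       # single string -> one vector
    doc_vecs = embeddings.embed_documents([text])  # list of strings -> list of vectors
    return query_vec, doc_vecs
```

`embed_query` and `embed_documents` are the two standard methods of LangChain embedding classes: the former embeds a single query string, the latter a list of documents.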