This page covers how to use the Hazy Research ecosystem within LangChain.
It is broken into two parts: installation and setup, and then references to specific Hazy Research wrappers.
There exists an LLM wrapper around Hazy Research’s manifest library.
`manifest` is a Python library that is itself a wrapper around many model providers, adding caching, history, and more. To use this wrapper:
```python
from langchain_community.llms.manifest import ManifestWrapper
```