This page covers two chat models provided by the `langchain-litellm` package:

- `ChatLiteLLM`: the main LangChain wrapper for basic usage of LiteLLM (docs).
- `ChatLiteLLMRouter`: a `ChatLiteLLM` wrapper that leverages LiteLLM's Router (docs).

| Class | Package | Local | Serializable | JS support | Package downloads | Package latest |
| --- | --- | --- | --- | --- | --- | --- |
| ChatLiteLLM | langchain-litellm | ❌ | ❌ | ❌ | | |
| ChatLiteLLMRouter | langchain-litellm | ❌ | ❌ | ❌ | | |
| Tool calling | Structured output | JSON mode | Image input | Audio input | Video input | Token-level streaming | Native async | Token usage | Logprobs |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| ✅ | ❌ | ❌ | ❌ | ❌ | ❌ | ✅ | ✅ | ✅ | ❌ |
To access `ChatLiteLLM` and `ChatLiteLLMRouter` models, you'll need to install the `langchain-litellm` package and create an account with OpenAI, Anthropic, Azure, Replicate, OpenRouter, Hugging Face, Together AI, or Cohere. Then, you need to get an API key and export it as an environment variable.
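For example, you can set the key from within Python before instantiating a model. `OPENAI_API_KEY` is used here as an illustration; substitute the variable name your chosen provider expects (e.g. `ANTHROPIC_API_KEY`, `COHERE_API_KEY`):

```python
import os

# Set the key for your provider if it is not already exported in the shell.
# The value below is a placeholder, not a real key.
os.environ.setdefault("OPENAI_API_KEY", "<your-api-key>")
```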
Next, install the `langchain-litellm` package:
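A typical install command (the package name comes from the tables above; `-U` upgrades any existing installation):

```shell
pip install -U langchain-litellm
```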
You can instantiate a `ChatLiteLLM` model by providing a `model` name supported by LiteLLM.
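A minimal instantiation sketch, assuming the package is installed and the matching API key is exported; `"gpt-4o"` is only an example model string, and any model identifier supported by LiteLLM can be used:

```python
from langchain_litellm import ChatLiteLLM

# Any LiteLLM-supported model string works here, e.g. "gpt-4o" or
# "anthropic/claude-3-5-sonnet-20240620".
llm = ChatLiteLLM(model="gpt-4o", temperature=0.1)
```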
Whether you've instantiated a `ChatLiteLLM` or a `ChatLiteLLMRouter`, you can now use the chat model through LangChain's API.
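For instance, a standard synchronous call goes through `invoke`, the same entry point every LangChain chat model exposes. This sketch assumes the package is installed and a valid API key is set; the model string and prompt are illustrative:

```python
from langchain_litellm import ChatLiteLLM

llm = ChatLiteLLM(model="gpt-4o")  # example model string

# invoke accepts a plain string (or a list of messages) and returns an
# AIMessage whose .content holds the model's reply.
response = llm.invoke("Classify the sentiment of this sentence: I love programming.")
print(response.content)
```

The same `invoke` call works unchanged on a `ChatLiteLLMRouter` instance.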
`ChatLiteLLM` and `ChatLiteLLMRouter` also support async and streaming functionality:
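A short sketch of token-level streaming over the async API, again assuming an installed package and exported API key; the model string and prompt are placeholders:

```python
import asyncio

from langchain_litellm import ChatLiteLLM

llm = ChatLiteLLM(model="gpt-4o")  # example model string


async def main() -> None:
    # astream yields message chunks as tokens arrive from the provider,
    # so output can be printed incrementally.
    async for chunk in llm.astream("Write a haiku about rivers."):
        print(chunk.content, end="", flush=True)
    print()


asyncio.run(main())
```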
For detailed documentation of all `ChatLiteLLM` and `ChatLiteLLMRouter` features and configurations, head to the API reference: https://github.com/Akshay-Dongare/langchain-litellm