Friendli enhances AI application performance and optimizes cost savings with scalable, efficient deployment options tailored for high-demand AI workloads. This tutorial guides you through integrating ChatFriendli for chat applications using LangChain. ChatFriendli offers a flexible approach to generating conversational AI responses, supporting both synchronous and asynchronous calls.
Setup
Ensure the langchain_community and friendli-client packages are installed.
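Both packages can be installed from PyPI; the `-U` flag upgrades any existing installation:

```shell
pip install -U langchain-community friendli-client
```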
Set your Friendli personal access token in the FRIENDLI_TOKEN environment variable.
If you do not specify a model, the default is mixtral-8x7b-instruct-v0-1. You can check the available models at docs.friendli.ai.
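With the token set, the chat model can be instantiated. This is a minimal sketch; it assumes the FRIENDLI_TOKEN environment variable is already exported and uses the default model name mentioned above:

```python
from langchain_community.chat_models.friendli import ChatFriendli

# Authenticates via the FRIENDLI_TOKEN environment variable.
chat = ChatFriendli(model="mixtral-8x7b-instruct-v0-1")
```

The token can also be passed explicitly via the `friendli_token` parameter instead of the environment variable.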
Usage
ChatFriendli supports all methods of ChatModel, including the async APIs.
You can use the synchronous invoke, batch, generate, and stream methods, as well as their asynchronous counterparts ainvoke, abatch, agenerate, and astream.
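The synchronous and asynchronous call styles can be sketched as follows. This assumes a valid FRIENDLI_TOKEN is set; the prompt text is purely illustrative, and the responses come from the remote Friendli endpoint:

```python
import asyncio

from langchain_community.chat_models.friendli import ChatFriendli

chat = ChatFriendli(model="mixtral-8x7b-instruct-v0-1")

# Synchronous call: returns a single AIMessage.
response = chat.invoke("Tell me a joke.")
print(response.content)

# Synchronous streaming: yields message chunks as they arrive.
for chunk in chat.stream("Tell me a joke."):
    print(chunk.content, end="", flush=True)

# Asynchronous call with the async counterpart, ainvoke.
async def main() -> None:
    response = await chat.ainvoke("Tell me a joke.")
    print(response.content)

asyncio.run(main())
```

The batch and generate methods (and abatch/agenerate) follow the same pattern, accepting a list of inputs instead of a single prompt.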