Friendli enhances AI application performance and optimizes cost savings with scalable, efficient deployment options tailored for high-demand AI workloads. This tutorial guides you through integrating `ChatFriendli` for chat applications using LangChain. `ChatFriendli` offers a flexible approach to generating conversational AI responses, supporting both synchronous and asynchronous calls.
Ensure the `@langchain/community` package is installed.
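For example, with npm (yarn and pnpm work similarly):

```shell
npm install @langchain/community @langchain/core
```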
Set your Friendli personal access token in the `FRIENDLI_TOKEN` environment variable.
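For example, in a bash-compatible shell (the token value below is a placeholder for your own credential):

```shell
export FRIENDLI_TOKEN="YOUR_PERSONAL_ACCESS_TOKEN"
```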
You can optionally set your team ID in the `FRIENDLI_TEAM` environment variable.
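For example (the value below is a placeholder for your own team ID):

```shell
export FRIENDLI_TEAM="YOUR_TEAM_ID"
```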
You can initialize a Friendli chat model by selecting the model you want to use. The default model is `meta-llama-3-8b-instruct`. You can check the available models at docs.friendli.ai.
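A minimal sketch of initialization and both call styles, assuming the `FRIENDLI_TOKEN` environment variable is set as above; the `maxTokens` value is an illustrative choice, not a required setting:

```typescript
import { ChatFriendli } from "@langchain/community/chat_models/friendli";

// Create the chat model. If `friendliToken` is omitted, the integration
// falls back to the FRIENDLI_TOKEN environment variable.
const model = new ChatFriendli({
  model: "meta-llama-3-8b-instruct", // the default model
  maxTokens: 800, // illustrative generation limit
});

// Awaited (synchronous-style) call: returns one complete message.
const response = await model.invoke("Tell me a joke.");
console.log(response.content);

// Asynchronous streaming call: yields the response chunk by chunk.
const stream = await model.stream("Tell me a joke.");
for await (const chunk of stream) {
  process.stdout.write(String(chunk.content));
}
```

The streaming form is useful for chat UIs, where printing tokens as they arrive feels more responsive than waiting for the full reply.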