Together AI offers an API to query 50+ leading open-source models in a couple of lines of code. This example shows how to use LangChain to interact with Together AI models.
To use Together AI, you'll need an API key, which you can find here:
https://api.together.ai/settings/api-keys. The key can be passed in as the init param
`together_api_key` or set as the environment variable `TOGETHER_API_KEY`.
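As a minimal sketch, the environment-variable route looks like this (the placeholder value is illustrative; substitute your real key, or export it in your shell instead):

```python
import os

# Set the API key as an environment variable so LangChain's Together AI
# integrations can pick it up without passing together_api_key explicitly.
os.environ["TOGETHER_API_KEY"] = "YOUR_API_KEY"  # placeholder, not a real key
```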
```python
# Querying chat models with Together AI

from langchain_together import ChatTogether

# choose from our 50+ models here: https://docs.together.ai/docs/inference-models
chat = ChatTogether(
    # together_api_key="YOUR_API_KEY",
    model="meta-llama/Llama-3-70b-chat-hf",
)

# stream the response back from the model
for m in chat.stream("Tell me fun things to do in NYC"):
    print(m.content, end="", flush=True)

# if you don't want to do streaming, you can use the invoke method
# chat.invoke("Tell me fun things to do in NYC")
```
```python
# Querying code and language models with Together AI

from langchain_together import Together

llm = Together(
    model="codellama/CodeLlama-70b-Python-hf",
    # together_api_key="..."
)

print(llm.invoke("def bubble_sort(): "))
```