# MLX

This notebook shows how to get started using MLX LLMs as chat models.
In particular, we will:

1. Utilize the `ChatMLX` class to enable any of these LLMs to interface with LangChain's Chat Messages abstraction.
2. Demonstrate how to use an open-source LLM to power a `ChatAgent` pipeline.

## Instantiate the `ChatMLX` to apply chat templates

## Take it for a spin as an agent!

Here we'll test out `gemma-2b-it` as a zero-shot ReAct Agent. The example below is taken from here.
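Before the agent example, it helps to see what "apply chat templates" means. `ChatMLX` converts LangChain chat messages into the single prompt string the underlying model expects, using the model tokenizer's chat template. Below is a minimal, dependency-free sketch of that formatting step; the turn markers follow Gemma's prompt format, and the helper function name is our own illustration, not a LangChain API:

```python
# A hand-rolled sketch of the chat templating that ChatMLX delegates to the
# model's tokenizer: LangChain messages in, one prompt string out.
# The <start_of_turn>/<end_of_turn> markers follow Gemma's chat format.

GEMMA_ROLES = {"human": "user", "ai": "model"}  # LangChain role -> Gemma turn name

def apply_gemma_chat_template(messages):
    """messages: list of (role, content) tuples, role in {"human", "ai"}."""
    parts = []
    for role, content in messages:
        turn = GEMMA_ROLES[role]
        parts.append(f"<start_of_turn>{turn}\n{content}<end_of_turn>\n")
    # Trailing generation prompt: the model continues from here as the assistant.
    parts.append("<start_of_turn>model\n")
    return "".join(parts)

prompt = apply_gemma_chat_template([("human", "What is MLX?")])
print(prompt)
```

In the real integration, this step happens inside `ChatMLX` when you invoke it with a list of messages, so you never build the prompt string by hand.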
Note: To run this section, you'll need to have a SerpAPI token saved as an environment variable: `SERPAPI_API_KEY`.
We will equip the chat model with a `react-json` style prompt and access to a search engine and calculator.
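To make the `react-json` control flow concrete without network access, here is a toy sketch of the loop the agent executor runs: the model emits a JSON blob naming a tool (`action`) and its input, the loop executes the tool and feeds back an observation, and this repeats until the model emits a `Final Answer`. Scripted replies stand in for `gemma-2b-it`, and canned tools stand in for SerpAPI and the calculator; the tool names, replies, and outputs below are illustrative assumptions, not the real agent:

```python
import json
import re

def calculator(expression: str) -> str:
    # Toy arithmetic tool: eval with builtins disabled (illustration only).
    return str(eval(expression, {"__builtins__": {}}, {}))

def search(query: str) -> str:
    # Canned stand-in for a real search tool such as SerpAPI.
    return "She is reportedly 25 years old."

TOOLS = {"Calculator": calculator, "Search": search}

# Scripted model replies in react-json style, standing in for gemma-2b-it.
SCRIPTED_REPLIES = [
    'Thought: I need her current age.\n'
    'Action:\n```json\n{"action": "Search", "action_input": "girlfriend age"}\n```',
    'Thought: Now raise it to the 0.43 power.\n'
    'Action:\n```json\n{"action": "Calculator", "action_input": "25 ** 0.43"}\n```',
    "Final Answer: about 3.99",
]

def react_loop(replies):
    """Run the ReAct loop: parse each JSON action, call the tool, record the observation."""
    transcript = []
    for reply in replies:
        if "Final Answer:" in reply:
            return reply.split("Final Answer:")[1].strip(), transcript
        blob = json.loads(re.search(r"```json\n(.*?)\n```", reply, re.S).group(1))
        observation = TOOLS[blob["action"]](blob["action_input"])
        transcript.append((blob["action"], observation))
    raise RuntimeError("agent never produced a final answer")

answer, steps = react_loop(SCRIPTED_REPLIES)
print(answer)  # about 3.99
```

In the actual example, LangChain's agent executor plays the role of `react_loop`, the chat model produces the replies, and the observations come from the real search and calculator tools.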