The LangChain Ollama integration package has official support for tool calling; see the ChatOllama documentation for details.
This is an experimental wrapper that attempts to bolt tool-calling support onto models that do not natively support it. Use with caution.
Setup
Follow the Ollama installation instructions to set up and run a local Ollama instance.
Initialize model
You can initialize this wrapper the same way you'd initialize a standard ChatOllama instance:
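A minimal sketch, assuming a local Ollama server is running and a model has been pulled; "llama3" here is an illustrative model name, not a requirement:

```python
from langchain_experimental.llms.ollama_functions import OllamaFunctions

# "llama3" is illustrative; use whichever model you have pulled locally.
# format="json" constrains the model's output to JSON, which the wrapper
# relies on to parse tool calls.
model = OllamaFunctions(model="llama3", format="json")
```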
Passing in functions
You can now pass in functions the same way as OpenAI:
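A sketch of binding a tool, assuming the `bind_tools` method from recent langchain_experimental releases; `get_current_weather` is an illustrative tool definition written in OpenAI's function-calling schema:

```python
model = model.bind_tools(
    tools=[
        {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA",
                    },
                    "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                },
                "required": ["location"],
            },
        }
    ],
    # Forcing function_call makes the model always respond with a call
    # to this tool rather than free-form text.
    function_call={"name": "get_current_weather"},
)

response = model.invoke("What is the weather in Boston?")
print(response)  # an AIMessage carrying the tool call and its arguments
```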
Using for extraction
The same mechanism works for structured data extraction: define a schema, bind it as a function the model must call, and parse the returned arguments back into that schema.
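A sketch of the extraction pattern, assuming `with_structured_output` is available on the wrapper; the `Person` schema and the input text are illustrative:

```python
from langchain_core.pydantic_v1 import BaseModel, Field
from langchain_experimental.llms.ollama_functions import OllamaFunctions

# An illustrative schema: with_structured_output wraps it as a function
# the model is forced to call, then parses the arguments back out.
class Person(BaseModel):
    name: str = Field(description="The person's name")
    height: float = Field(description="The person's height in feet")
    hair_color: str = Field(description="The person's hair color")

llm = OllamaFunctions(model="llama3", format="json", temperature=0)
structured_llm = llm.with_structured_output(Person)

result = structured_llm.invoke("Alex is 5 feet tall and has blond hair.")
print(result)  # e.g. Person(name='Alex', height=5.0, hair_color='blond')
```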
Customization
Behind the scenes, this uses Ollama's JSON mode to constrain output to JSON, then passes the tool schemas into the prompt as JSON Schema. Because different models have different strengths, it may be helpful to pass in your own system prompt. Here's an example:
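A sketch of overriding the injected system prompt, assuming the parameter is named `tool_system_prompt_template` as in langchain_experimental; the template wording below is an illustrative variation, and `{tools}` is filled in with the JSON schemas at call time:

```python
from langchain_experimental.llms.ollama_functions import OllamaFunctions

# {tools} is replaced with the bound tools' JSON schemas; the doubled
# braces are literal braces in the JSON example shown to the model.
CUSTOM_TEMPLATE = """You have access to the following tools:

{tools}

You must always select one of the above tools and respond with only a JSON
object matching the following schema:

{{
  "tool": <name of the selected tool>,
  "tool_input": <parameters for the selected tool, matching the tool's JSON schema>
}}
"""

model = OllamaFunctions(
    model="llama3",
    format="json",
    tool_system_prompt_template=CUSTOM_TEMPLATE,
)
```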
Related
- Chat model conceptual guide
- Chat model how-to guides