Workers AI allows you to run machine learning models on the Cloudflare network from your own code. This guide will help you get started with ChatCloudflareWorkersAI chat models. For detailed documentation of all ChatCloudflareWorkersAI features and configurations, head to the API reference.
## Overview

### Integration details
| Class | Package | Serializable | PY support | Downloads | Version |
|---|---|---|---|---|---|
| ChatCloudflareWorkersAI | @langchain/cloudflare | ✅ | ❌ | | |
### Model features
See the links in the table headers below for guides on how to use specific features.

| Tool calling | Structured output | Image input | Audio input | Video input | Token-level streaming | Token usage | Logprobs |
|---|---|---|---|---|---|---|---|
| ❌ | ❌ | ✅ | ❌ | ❌ | ✅ | ❌ | ❌ |
## Setup
To access Cloudflare Workers AI models, you'll need to create a Cloudflare account, get an API key, and install the @langchain/cloudflare integration package.
### Credentials
Head to this page to sign up to Cloudflare and generate an API key. Once you've done this, note your `CLOUDFLARE_ACCOUNT_ID` and `CLOUDFLARE_API_TOKEN`.
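One common way to make these values available to your code is to export them as environment variables (the values shown are placeholders):

```shell
# Substitute your own account ID and API token from the Cloudflare dashboard.
export CLOUDFLARE_ACCOUNT_ID="your-account-id"
export CLOUDFLARE_API_TOKEN="your-api-token"
```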
Passing a binding within a Cloudflare Worker is not yet supported.
### Installation
The LangChain ChatCloudflareWorkersAI integration lives in the @langchain/cloudflare package:
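Assuming an npm-based project (other package managers such as yarn or pnpm work analogously), installation looks like:

```shell
# @langchain/core is the peer dependency shared by LangChain integration packages.
npm install @langchain/cloudflare @langchain/core
```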
## Instantiation

Now we can instantiate our model object and generate chat completions:

## Invocation
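A minimal sketch of instantiating the model and invoking it follows. The model name shown is an assumption for illustration; pick any text-generation model from the Workers AI catalog:

```typescript
import { ChatCloudflareWorkersAI } from "@langchain/cloudflare";

// Credentials are read from the values noted during setup.
const model = new ChatCloudflareWorkersAI({
  model: "@cf/meta/llama-3.1-8b-instruct", // assumed model name; any Workers AI text model works
  cloudflareAccountId: process.env.CLOUDFLARE_ACCOUNT_ID,
  cloudflareApiToken: process.env.CLOUDFLARE_API_TOKEN,
});

// Invoke with a list of (role, content) message tuples.
const aiMsg = await model.invoke([
  ["system", "You are a helpful assistant that translates English to French."],
  ["human", "I love programming."],
]);
console.log(aiMsg.content);
```

Streaming is also supported: calling `model.stream(...)` instead of `model.invoke(...)` yields message chunks as they are generated.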
## API reference
For detailed documentation of all ChatCloudflareWorkersAI features and configurations, head to the API reference.

