Private beta: The LLM Gateway is in private beta. Sign up for the waitlist to get access.
Prerequisites
- Your Organization admin has completed Admin setup.
- You have a workspace-scoped LangSmith API key with gateway:invoke and workspaces:read permissions.
- You have set the gateway environment variables in your terminal.
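The environment variables referenced above can be set in your shell profile. A minimal sketch, assuming the provider-specific variables should point at the gateway base URL given later in this guide (the exact per-provider routes and the placeholder key value are assumptions; follow your admin setup docs):

```shell
# Workspace-scoped LangSmith API key with gateway:invoke and
# workspaces:read permissions (hypothetical placeholder value).
export LANGSMITH_API_KEY="lsv2-example-key"

# Point provider clients at the gateway. The base URL comes from this
# guide; routing each provider variable straight at it is an assumption.
export ANTHROPIC_BASE_URL="https://gateway.smith.langchain.com"
export ANTHROPIC_API_KEY="$LANGSMITH_API_KEY"
export GOOGLE_GEMINI_BASE_URL="https://gateway.smith.langchain.com"
export GEMINI_API_KEY="$LANGSMITH_API_KEY"
```

Add these lines to ~/.zshrc or ~/.bashrc so they persist across terminal sessions.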
Supported clients
- Claude Code CLI
- Codex CLI
- Gemini CLI
- Deep Agents
Claude Code CLI
No extra configuration is needed beyond the environment variables. Run the CLI as usual; Claude Code picks up ANTHROPIC_BASE_URL and ANTHROPIC_API_KEY from your environment automatically.
Codex CLI
Codex requires a TOML configuration file in addition to the environment variables: add a provider entry pointing at the gateway to ~/.codex/config.toml. Make sure the LANGSMITH_API_KEY environment variable is set, then run Codex as usual.
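As a rough sketch only (the exact keys come from Codex's own config schema; the provider id langsmith and the /v1 path are assumptions, not values from this guide), a gateway-pointing entry could look like:

```toml
# ~/.codex/config.toml -- hypothetical sketch; verify against the Codex docs
model_provider = "langsmith"          # assumed provider id

[model_providers.langsmith]
name = "LangSmith Gateway"
base_url = "https://gateway.smith.langchain.com/v1"   # path is an assumption
env_key = "LANGSMITH_API_KEY"         # Codex reads the key from this env var
```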
Gemini CLI
No extra configuration is needed beyond the environment variables. Run the CLI as usual; Gemini CLI picks up GOOGLE_GEMINI_BASE_URL and GEMINI_API_KEY from your environment automatically.
Deep Agents
No extra configuration is needed beyond the environment variables. For details, refer to the provider selection docs.
Python SDK usage
If you call LLM providers from Python scripts rather than through a coding agent, swap the base_url:
- OpenAI SDK
- Anthropic SDK
Company-wide deployment
For organizations rolling the gateway out to all developers, distribute the configuration through MDM (mobile device management) or a shared shell profile. The key pieces to distribute are:
- The gateway base URL (https://gateway.smith.langchain.com).
- A workspace-scoped LangSmith API key per user (or per team, depending on your policy granularity).
- The Codex config.toml, if your organization uses Codex.
Verify the setup
After configuring a coding agent, make a test call and confirm that:
- The call succeeds (the agent gets a response).
- A trace appears in the gateway and gateway-<short_api_key>-<api_key_id> tracing projects in your LangSmith workspace.
If the call fails with a 403, check that your API key’s role includes gateway:invoke and workspaces:read. If it fails with a 400 mentioning a missing provider key, ask your org admin to add the provider’s key to workspace secrets.
Next steps
- Spend policies: set cost limits on developer LLM usage.
- Traces, Engine, and access control: understand where gateway traces appear.