

Private beta: The LLM Gateway is in private beta. Sign up for the waitlist to get access.

Prerequisites

Before you start, confirm that:
  • Your Organization admin has enabled the LLM Gateway and added provider API keys to workspace secrets. To set this up, refer to Admin setup.
  • You have a workspace-scoped LangSmith API key attached to a role with gateway:invoke and workspaces:read permissions. Ask your org admin if you’re unsure.

1. Set environment variables

Set the following in your terminal (or add them to your shell profile, such as ~/.zshrc, to persist across sessions):
export BASE_URL="https://gateway.smith.langchain.com"

export ANTHROPIC_BASE_URL="$BASE_URL/anthropic"
export OPENAI_BASE_URL="$BASE_URL/openai"
export GOOGLE_GEMINI_BASE_URL="$BASE_URL/gemini"

export LANGSMITH_API_KEY="lsv2_..._....cbed3e"

export ANTHROPIC_API_KEY="$LANGSMITH_API_KEY"
export OPENAI_API_KEY="$LANGSMITH_API_KEY"
export GEMINI_API_KEY="$LANGSMITH_API_KEY"
export GOOGLE_API_KEY="$LANGSMITH_API_KEY"
This points all provider SDKs at the LangSmith Gateway and authenticates every request with your LangSmith API key. The gateway resolves the actual provider keys from your workspace's Provider Secrets, so you never need local copies of provider API keys.
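The URL layout behind those exports is simple: each provider's SDK base URL is the gateway base URL plus a per-provider path. A minimal sketch (the helper name and dict are illustrative, not part of any SDK):

```python
# Per-provider path suffixes under the gateway base URL,
# mirroring the exports above.
PROVIDER_PATHS = {
    "anthropic": "/anthropic",
    "openai": "/openai",
    "gemini": "/gemini",
}

def gateway_urls(base_url: str) -> dict:
    """Derive each provider SDK's base URL from the gateway base URL."""
    base = base_url.rstrip("/")
    return {name: base + path for name, path in PROVIDER_PATHS.items()}

urls = gateway_urls("https://gateway.smith.langchain.com")
print(urls["openai"])  # https://gateway.smith.langchain.com/openai
```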

2. Make a call

curl https://gateway.smith.langchain.com/openai/v1/chat/completions \
    -H "Authorization: Bearer $LANGSMITH_API_KEY" \
    -H "Content-Type: application/json" \
    -d '{"model":"gpt-4o-mini","messages":[{"role":"user","content":"ping"}]}'
A 200 response with a chat completion confirms the gateway, your API key, and your role permissions are all working.
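The same request can be built in Python with only the standard library. This sketch constructs the request without sending it; the model and message are just the example values from the curl call, and the placeholder API key is illustrative:

```python
import json
import urllib.request

def build_chat_request(api_key: str, model: str, content: str) -> urllib.request.Request:
    """Build the gateway chat-completion request shown in the curl example."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": content}],
    }).encode()
    return urllib.request.Request(
        "https://gateway.smith.langchain.com/openai/v1/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("lsv2_example", "gpt-4o-mini", "ping")
# Send it with: urllib.request.urlopen(req)
```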

3. View your trace

Open the LangSmith UI and navigate to the tracing project named gateway or gateway-<short_api_key>-<api_key_id> in the workspace associated with your API key. You should see a new trace for the call you just made.

4. Set a spend policy (optional)

Go to Settings → Gateway → LLM Gateway in LangSmith to create a spend policy. For example, you can set a daily $10 cap on your API key. When the cap is reached, the gateway returns a 402 response with the message: "Request blocked by gateway policies: R&D Spend Cap". See Spend policies for the full guide on policy dimensions, time windows, and conflict resolution.
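Client code can detect a policy block from the status code. A minimal sketch, using the 402 behavior described above (treat the exact error wording as illustrative; the helper name is not part of any SDK):

```python
def is_blocked_by_policy(status_code: int, body: str) -> bool:
    """Return True if a gateway response indicates a policy block (HTTP 402)."""
    return status_code == 402 and "Request blocked by gateway policies" in body

print(is_blocked_by_policy(402, "Request blocked by gateway policies: R&D Spend Cap"))  # True
print(is_blocked_by_policy(200, '{"choices": [...]}'))  # False
```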

How the gateway handles requests

Here’s what the gateway did when you made a call:
  1. Authenticated your request using the LangSmith API key.
  2. Resolved the actual provider API key (for example, OPENAI_API_KEY) from your workspace’s Provider Secrets.
  3. Evaluated any active policies (spend limits, PII redaction, secrets redaction).
  4. Proxied the request to the upstream provider (OpenAI, Anthropic, or Gemini).
  5. Traced the call to LangSmith, including token counts, cost, and any policy events.
Routing through the gateway requires no application code changes.
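The five steps above can be sketched conceptually as follows. This is purely illustrative pseudocode of the flow, not the gateway's actual implementation; every name here is a stand-in:

```python
from dataclasses import dataclass, field

@dataclass
class Request:
    langsmith_api_key: str
    provider: str
    payload: dict

@dataclass
class Gateway:
    provider_secrets: dict              # provider name -> real provider key
    policies: list = field(default_factory=list)
    events: list = field(default_factory=list)

    def handle(self, req: Request) -> str:
        # 1. Authenticate the request using the LangSmith API key.
        if not req.langsmith_api_key.startswith("lsv2_"):
            raise PermissionError("invalid LangSmith API key")
        # 2. Resolve the actual provider key from workspace Provider Secrets.
        provider_key = self.provider_secrets[req.provider]
        # 3. Evaluate active policies; any policy may block the request.
        for policy in self.policies:
            if not policy(req):
                raise RuntimeError("Request blocked by gateway policies")
        # 4. Proxy to the upstream provider (stubbed here).
        response = f"upstream({req.provider}, key={provider_key[:4]}...)"
        # 5. Trace the call to LangSmith (stubbed as an event log).
        self.events.append(("trace", req.provider))
        return response

gw = Gateway(provider_secrets={"openai": "sk-real-key"})
print(gw.handle(Request("lsv2_abc", "openai", {"model": "gpt-4o-mini"})))
```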

Next steps