

Private beta: The LLM Gateway is in private beta. APIs and features may change as we iterate. Sign up for the waitlist to get access.
The LLM Gateway is a proxy that sits between your agents (or any LLM client) and the LLM providers they call. Instead of each client storing provider API keys locally, keys are stored once in LangSmith as Provider Secrets. Clients authenticate with a LangSmith API key. When a request passes through the gateway, it:
  1. Authenticates the caller using their LangSmith API key.
  2. Resolves the actual provider API key from your workspace’s Provider Secrets.
  3. Evaluates governance policies (spend limits, PII redaction, secrets redaction).
  4. Proxies the request to the upstream provider.
  5. Traces the call to LangSmith.
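The five steps above can be sketched as a single dispatch function. Everything here (the key values, data shapes, and redaction rule) is an illustrative assumption for demonstration, not the gateway's actual implementation:

```python
# Illustrative sketch of the gateway request lifecycle described above.
# All names, keys, and data shapes are assumptions, not the real API.

LANGSMITH_KEYS = {"lsv2_demo": "workspace-1"}             # caller auth (stub)
PROVIDER_SECRETS = {"workspace-1": {"openai": "sk-..."}}  # Provider Secrets (stub)
TRACES = []                                               # stand-in for LangSmith tracing

def redact(text: str) -> str:
    # Stand-in for the PII/secrets redaction policies.
    return text.replace("555-123-4567", "[REDACTED]")

def gateway_call(langsmith_key: str, provider: str, prompt: str) -> str:
    # 1. Authenticate the caller using their LangSmith API key.
    workspace = LANGSMITH_KEYS.get(langsmith_key)
    if workspace is None:
        raise PermissionError("invalid LangSmith API key")
    # 2. Resolve the provider API key from the workspace's Provider Secrets.
    provider_key = PROVIDER_SECRETS[workspace][provider]
    # 3. Evaluate governance policies (here: a toy redaction pass).
    prompt = redact(prompt)
    # 4. Proxy the request to the upstream provider (stubbed).
    response = f"[{provider} answered using key {provider_key[:3]}...]"
    # 5. Trace the call to LangSmith (stubbed as a list append).
    TRACES.append({"workspace": workspace, "prompt": prompt})
    return response
```

Note that the client never sees the provider key: it authenticates with a LangSmith key, and the provider secret stays inside the gateway.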
Every gateway-proxied call appears as a trace in LangSmith. When a policy fires, the event flows into LangSmith Engine for triage—you can go from a blocked request to the trace that triggered it to the fix, all in one product.

Feature availability

  • Spend limits: Hard-block enforcement at the organization, workspace, API key, or user level. When a cap is hit, the caller receives a 402 response with an actionable error message.
  • Spend visibility: Real-time cost rollups by workspace, user, and API key. Per-model visibility is available via Custom Charts.
  • PII redaction: Detects and redacts names, places, nationality, religion, political affiliation, and ages from requests before they reach the model.
  • Secrets redaction: Detects and redacts US phone numbers, US SSNs, API keys, tokens, and credentials from requests. Covers AWS, GitHub, GitLab, OpenAI, Anthropic, GCP, Azure, Slack, Datadog, PyPI, npm, private keys, and LangSmith tokens.
  • LangSmith Engine integration: Policy violations surface as issues in LangSmith Engine. Click through from a violation to the trace that produced it.
  • Audit logging: Administrative changes to gateway configuration and gateway invocations are logged.
  • Tracing: Every gateway-proxied call appears in the same LangSmith workspace as the rest of your agent's traces. Routing through the gateway does not fragment your observability.
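Because a spend cap is a hard block rather than a transient failure, callers should surface the 402 error instead of retrying. A minimal sketch, assuming a hypothetical response shape (the 402 status is documented above; the error body structure is an assumption):

```python
# Hedged sketch: handling a spend-cap rejection from the gateway.
# The response dict shape used here is an assumption for illustration.

def call_with_cap_check(send_request):
    resp = send_request()
    if resp["status"] == 402:
        # Spend limit reached: raise with the actionable message rather than
        # retrying, since a hard cap will not clear on its own.
        raise RuntimeError(f"Spend limit hit: {resp['body'].get('error', 'cap reached')}")
    return resp

# Usage with stubbed responses in place of real gateway calls:
ok = call_with_cap_check(lambda: {"status": 200, "body": {"ok": True}})
```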

Supported integrations

  • Anthropic
  • OpenAI
  • Google Gemini
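A proxy in front of these providers has to decide which one a given request targets before it can resolve the matching Provider Secret. A minimal sketch of that routing step, using hypothetical model-name prefixes (the prefix rules are illustrative assumptions, not the gateway's real logic):

```python
# Hedged sketch: mapping a model name to one of the supported providers.
# The prefix table is an illustrative assumption.
PROVIDER_PREFIXES = {
    "claude": "anthropic",
    "gpt": "openai",
    "gemini": "google",
}

def resolve_provider(model: str) -> str:
    for prefix, provider in PROVIDER_PREFIXES.items():
        if model.lower().startswith(prefix):
            return provider
    raise ValueError(f"No supported provider for model {model!r}")
```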

Known limitations

  • Claude Desktop: marketplace plugins are not supported. The Chat tab is not visible, but Cowork is functional.
  • Codex Desktop: marketplace plugins are not supported.
  • Cursor: IDE integration is not yet available.
  • ChatGPT Desktop / Gemini Desktop: not configurable.

Resources

Spend policies

Set and manage cost limits across your organization.

PII and secrets redaction

Prevent sensitive data from reaching LLM providers or trace storage.
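Secrets redaction of the kind described above typically works by matching requests against known credential patterns before they leave the gateway. A minimal sketch, assuming two illustrative patterns (these are not the gateway's actual detection rules, which cover many more providers):

```python
import re

# Hedged sketch of pattern-based secrets redaction. The two patterns below
# (OpenAI-style keys and GitHub personal access tokens) are illustrative
# assumptions, not the gateway's real rule set.
SECRET_PATTERNS = [
    re.compile(r"sk-[A-Za-z0-9]{20,}"),   # OpenAI-style API key
    re.compile(r"ghp_[A-Za-z0-9]{36}"),   # GitHub personal access token
]

def redact_secrets(text: str) -> str:
    for pattern in SECRET_PATTERNS:
        text = pattern.sub("[REDACTED]", text)
    return text
```

Redacting before the request is proxied means the secret never reaches the provider or trace storage.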
For further questions about the LLM Gateway, contact support at support.langchain.com.