

Sandboxes are in private preview. APIs and features may change as we iterate. Sign up for the waitlist to get access.
The auth proxy lets sandbox code call external APIs (OpenAI, Anthropic, GitHub, etc.) without hardcoding credentials. When configured on a sandbox, a proxy sidecar automatically injects authentication headers into matching outbound requests using your workspace secrets.
You must configure your secrets (e.g., OPENAI_API_KEY) in your LangSmith workspace settings before creating a sandbox that references them.

Configure auth proxy rules

Add a `proxy_config` when creating a sandbox. Each rule specifies:

| Field | Description |
| --- | --- |
| `match_hosts` | Hosts to intercept (supports globs like `*.github.com`) |
| `match_paths` | Paths to match (empty = all paths) |
| `headers` | Headers to inject, each with a `name`, `type`, and `value` |

`no_proxy` (hosts that bypass the proxy entirely, e.g. `localhost`) is set at the top level of `proxy_config`, alongside `rules`.
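Host patterns behave like shell-style globs. As an illustration only (the proxy's actual matcher is internal and not exposed), Python's `fnmatch` mirrors the matching semantics described above:

```python
from fnmatch import fnmatch

def host_matches(host: str, patterns: list[str]) -> bool:
    """Return True if `host` matches any glob pattern (illustrative only)."""
    return any(fnmatch(host, pattern) for pattern in patterns)

print(host_matches("api.github.com", ["*.github.com"]))  # True
print(host_matches("api.openai.com", ["*.github.com"]))  # False
```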

Header types

Each header has a `type` that controls how its value is stored and displayed:

| Type | Description |
| --- | --- |
| `workspace_secret` | References a workspace secret using `{KEY}` syntax. Resolved at push time. |
| `plaintext` | Value is stored and returned as-is. Use for non-sensitive headers. |
| `opaque` | Write-only. Value is encrypted at rest and never returned via the API. |
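To illustrate the `{KEY}` syntax, a `workspace_secret` value can be thought of as a template resolved against your workspace secrets. This is a sketch of the substitution semantics, not the proxy's actual implementation:

```python
import re

def resolve_secret_value(template: str, secrets: dict[str, str]) -> str:
    """Replace each {KEY} placeholder with the matching workspace secret.

    Raises KeyError if a referenced secret is not configured (illustrative).
    """
    return re.sub(r"\{([A-Z0-9_]+)\}", lambda m: secrets[m.group(1)], template)

# Hypothetical workspace secret store contents:
secrets = {"OPENAI_API_KEY": "sk-example"}
print(resolve_secret_value("Bearer {OPENAI_API_KEY}", secrets))  # Bearer sk-example
```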

Single API example

Create a sandbox that automatically injects an OpenAI API key into outbound requests:
curl -X POST "$LANGSMITH_ENDPOINT/api/v2/sandboxes/boxes" \
  -H "x-api-key: $LANGSMITH_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "snapshot_id": "<snapshot-uuid>",
    "name": "openai-sandbox",
    "wait_for_ready": true,
    "proxy_config": {
      "rules": [
        {
          "name": "openai-api",
          "match_hosts": ["api.openai.com"],
          "headers": [
            {
              "name": "Authorization",
              "type": "workspace_secret",
              "value": "Bearer {OPENAI_API_KEY}"
            }
          ]
        }
      ]
    }
  }'
The sandbox can now call OpenAI without any API key setup; the proxy injects the key automatically.

Multiple API example

Add multiple rules to authenticate with several services at once:
curl -X POST "$LANGSMITH_ENDPOINT/api/v2/sandboxes/boxes" \
  -H "x-api-key: $LANGSMITH_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "snapshot_id": "<snapshot-uuid>",
    "name": "multi-api-sandbox",
    "wait_for_ready": true,
    "proxy_config": {
      "rules": [
        {
          "name": "openai-api",
          "match_hosts": ["api.openai.com"],
          "headers": [
            {
              "name": "Authorization",
              "type": "workspace_secret",
              "value": "Bearer {OPENAI_API_KEY}"
            }
          ]
        },
        {
          "name": "anthropic-api",
          "match_hosts": ["api.anthropic.com"],
          "headers": [
            {
              "name": "x-api-key",
              "type": "workspace_secret",
              "value": "{ANTHROPIC_API_KEY}"
            },
            {
              "name": "anthropic-version",
              "type": "plaintext",
              "value": "2023-06-01"
            }
          ]
        },
        {
          "name": "github-api",
          "match_hosts": ["api.github.com"],
          "match_paths": ["/repos/*", "/user"],
          "headers": [
            {
              "name": "Authorization",
              "type": "workspace_secret",
              "value": "Bearer {GITHUB_TOKEN}"
            }
          ]
        }
      ],
      "no_proxy": ["localhost", "127.0.0.1"]
    }
  }'

Configure via SDK

from langsmith.sandbox import SandboxClient

client = SandboxClient()

client.create_sandbox(
    snapshot_id=SNAPSHOT_ID,
    name="openai-sandbox",
    proxy_config={
        "rules": [
            {
                "name": "openai-api",
                "match_hosts": ["api.openai.com"],
                "headers": [
                    {
                        "name": "Authorization",
                        "type": "workspace_secret",
                        "value": "Bearer {OPENAI_API_KEY}",
                    }
                ],
            }
        ]
    },
)

Dynamic credentials with callbacks

Static rules pull credentials from your workspace secrets at sandbox creation time. For credentials that must be resolved per request, such as short-lived OAuth tokens, per-user scoped tokens, or tokens minted by your own auth service, use a callback instead. The proxy POSTs to a URL you provide, your endpoint returns the headers to inject, and the proxy caches the result. Callbacks are configured alongside `rules` under `proxy_config`. Each callback specifies:
| Field | Description |
| --- | --- |
| `match_hosts` | Hosts to intercept (same syntax as rules; supports globs like `*.github.com`). |
| `url` | Your callback endpoint. Must be an `http://` or `https://` URL reachable from the proxy. |
| `request_headers` | Headers attached to the proxy → callback request, e.g. an HMAC or shared secret your endpoint uses to verify the request. Only `plaintext` and `opaque` types are permitted (no `workspace_secret`). |
| `ttl_seconds` | How long resolved headers are cached before the callback is re-invoked. Must be between 60 and 3600. |
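The `ttl_seconds` behavior can be sketched as a simple expiring cache keyed by host. This is an illustration of the documented semantics, not the proxy's internal data structure:

```python
import time

class TTLCache:
    """Illustrative sketch: cached callback results expire after a TTL."""

    def __init__(self):
        self._store = {}  # host -> (expires_at, headers)

    def get(self, host):
        entry = self._store.get(host)
        if entry and entry[0] > time.monotonic():
            return entry[1]
        return None  # miss or expired: the proxy re-invokes the callback

    def put(self, host, headers, ttl_seconds):
        self._store[host] = (time.monotonic() + ttl_seconds, headers)
```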
Static rules take precedence: if any entry in `rules` matches the host, callbacks are skipped for that host. Within `rules`, the first matching rule wins; the same first-match-wins ordering applies among callbacks.
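The precedence above can be sketched as a resolver that checks static rules first and falls back to callbacks. Names and structure here are hypothetical; the proxy's internals are not exposed:

```python
from fnmatch import fnmatch

def resolve_headers(host, rules, callbacks, invoke_callback):
    """Sketch of the documented precedence: static rules beat callbacks,
    and within each list the first match wins. Illustrative only."""
    for rule in rules:
        if any(fnmatch(host, p) for p in rule["match_hosts"]):
            return {h["name"]: h["value"] for h in rule["headers"]}
    for cb in callbacks:
        if any(fnmatch(host, p) for p in cb["match_hosts"]):
            return invoke_callback(cb, host)
    return {}  # no rule or callback matched: request passes through unmodified
```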

Callback contract

The proxy makes the following request whenever it needs to resolve credentials for a matched host on a cache miss:
POST <callback.url>
Content-Type: application/json
<request_headers from your config, attached verbatim>

{"host": "api.example.com", "port": 443}
Your endpoint must respond 2xx with a JSON body:
{
  "headers": {
    "Authorization": "Bearer <token>",
    "X-Org-Id": "..."
  }
}
The proxy injects every header in the response into the sandbox’s outbound request and caches the response for ttl_seconds. Any non-2xx response, transport error, or malformed JSON fails closed: the sandbox’s request is rejected with 502 callback resolution failed (no headers injected, response not cached).
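A minimal sketch of a conforming endpoint, written as a plain handler function so it stays framework-agnostic (Flask or FastAPI would work equally well). The shared secret and `mint_token_for` helper are placeholders for your own auth service:

```python
import json

SHARED_SECRET = "replace-me"  # must match the request_headers value in your config

def mint_token_for(host: str) -> str:
    # Placeholder: call your own auth service / OAuth flow here.
    return "example-token"

def handle_callback(request_headers: dict, body: bytes) -> tuple[int, dict]:
    """Resolve credentials for the host named in the proxy's request body.

    Returns (status_code, json_body). Any non-2xx response makes the proxy
    fail closed, so reject unverified callers.
    """
    if request_headers.get("X-Integrator-Secret") != SHARED_SECRET:
        return 403, {"error": "unauthorized"}
    payload = json.loads(body)  # e.g. {"host": "api.github.com", "port": 443}
    token = mint_token_for(payload["host"])
    return 200, {"headers": {"Authorization": f"Bearer {token}"}}
```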

Example

Use a callback when your OAuth tokens are minted on demand by your own service:
curl -X POST "$LANGSMITH_ENDPOINT/api/v2/sandboxes/boxes" \
  -H "x-api-key: $LANGSMITH_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "snapshot_id": "<snapshot-uuid>",
    "name": "callback-sandbox",
    "wait_for_ready": true,
    "proxy_config": {
      "callbacks": [
        {
          "match_hosts": ["api.github.com", "*.githubusercontent.com"],
          "url": "https://auth.your-app.example.com/sandbox-credentials",
          "request_headers": [
            {
              "name": "X-Integrator-Secret",
              "type": "opaque",
              "value": "<shared-secret-your-endpoint-verifies>"
            }
          ],
          "ttl_seconds": 300
        }
      ]
    }
  }'

Configure via SDK

from langsmith.sandbox import SandboxClient

client = SandboxClient()

client.create_sandbox(
    snapshot_id=SNAPSHOT_ID,
    name="callback-sandbox",
    proxy_config={
        "callbacks": [
            {
                "match_hosts": ["api.github.com", "*.githubusercontent.com"],
                "url": "https://auth.your-app.example.com/sandbox-credentials",
                "request_headers": [
                    {
                        "name": "X-Integrator-Secret",
                        "type": "opaque",
                        "value": "<shared-secret-your-endpoint-verifies>",
                    }
                ],
                "ttl_seconds": 300,
            }
        ]
    },
)