
The langsmith-codex-plugins marketplace ships a tracing plugin that sends OpenAI Codex session data to LangSmith. Use it to inspect agent turns, model metadata, token usage, tool calls, and subagent threads from your Codex workflows.

Prerequisites

Before setting up tracing, ensure you have:
  • The Codex CLI installed.
  • A LangSmith account and an API key.

Install and enable the plugin

Add the marketplace using the Codex CLI:
codex plugin marketplace add langchain-ai/langsmith-codex-plugins
Enable plugin hooks and the tracing plugin globally in ~/.codex/config.toml, or only for a specific project in .codex/config.toml:
[features]
plugin_hooks = true

[plugins."tracing@langsmith-codex-plugins"]
enabled = true

Configure tracing

Tracing is disabled until either TRACE_TO_LANGSMITH is "true" or enabled is true in a config file. Configure credentials with environment variables, a JSON config file, or both.

Environment variables

The plugin reads Codex-specific variables first, then falls back to the generic LangSmith SDK variables.
| Variable | Required | Default | Description |
| --- | --- | --- | --- |
| TRACE_TO_LANGSMITH | Yes | - | Set to "true" to enable tracing. |
| LANGSMITH_CODEX_API_KEY | Conditional | - | LangSmith API key. Falls back to LANGSMITH_API_KEY. Required unless every replica provides its own API key. |
| LANGSMITH_CODEX_ENDPOINT | No | https://api.smith.langchain.com | LangSmith API URL. Falls back to LANGSMITH_ENDPOINT. |
| LANGSMITH_CODEX_PROJECT | No | codex | LangSmith project name. Falls back to LANGSMITH_PROJECT. |
| LANGSMITH_CODEX_METADATA | No | - | JSON object merged into root trace metadata. Falls back to LANGSMITH_METADATA. |
| LANGSMITH_CODEX_RUNS_ENDPOINTS | No | - | JSON array of replica destinations. Falls back to LANGSMITH_RUNS_ENDPOINTS. |
Add the variables to your shell configuration file (~/.zshrc, ~/.bashrc, or ~/.bash_profile):
export TRACE_TO_LANGSMITH="true"
export LANGSMITH_CODEX_API_KEY="<your-langsmith-api-key>"
export LANGSMITH_CODEX_PROJECT="codex"
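The metadata and replica variables take JSON strings, so single-quote them to stop the shell from interpreting the contents. An illustrative example (the metadata keys and replica values here are placeholders, not required names):

```shell
# Metadata: a JSON object merged into the root trace metadata.
export LANGSMITH_CODEX_METADATA='{"team": "agents", "environment": "dev"}'

# Replicas: a JSON array of destination objects; all values are placeholders.
export LANGSMITH_CODEX_RUNS_ENDPOINTS='[{"apiUrl": "https://api.smith.langchain.com", "apiKey": "<replica-api-key>", "projectName": "codex-replica"}]'
```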

Config file

Use <project>/.codex/langsmith.json for project-level settings or ~/.codex/langsmith.json for global defaults. The global file loads first, the project file overrides it, and matching environment variables take precedence over both.
{
  "enabled": true,
  "api_key": "<your-langsmith-api-key>",
  "api_url": "https://api.smith.langchain.com",
  "project": "codex",
  "metadata": {
    "team": "agents",
    "environment": "dev"
  }
}
| Field | Environment variable | Default | Description |
| --- | --- | --- | --- |
| enabled | TRACE_TO_LANGSMITH | false | Set to true to enable tracing. |
| api_key | LANGSMITH_CODEX_API_KEY, LANGSMITH_API_KEY | - | LangSmith API key. |
| api_url | LANGSMITH_CODEX_ENDPOINT, LANGSMITH_ENDPOINT | LangSmith default | LangSmith API URL. |
| project | LANGSMITH_CODEX_PROJECT, LANGSMITH_PROJECT | codex | LangSmith project name. |
| metadata | LANGSMITH_CODEX_METADATA, LANGSMITH_METADATA | - | Object merged into root trace metadata. |
| replicas | LANGSMITH_CODEX_RUNS_ENDPOINTS, LANGSMITH_RUNS_ENDPOINTS | - | Additional LangSmith destinations to replicate traces to. |
Keep config files that include API keys out of version control.
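For a git repository, one way to do that is a .gitignore entry for the project-level file (the path assumes the project-level location described above):

```
# Ignore the project-level LangSmith config, which may contain an API key.
.codex/langsmith.json
```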

Trace to multiple destinations

Set replicas in langsmith.json or LANGSMITH_CODEX_RUNS_ENDPOINTS to send the same trace data to additional LangSmith workspaces or projects. When set, the replica list overrides the other client settings. Tracing to multiple replicas is useful for:
  • Sending traces to both a production and staging project.
  • Tracing to multiple workspaces with different API keys.
  • Adding extra metadata to specific replica destinations.
Each replica object supports the following fields:
| Field | Required | Description |
| --- | --- | --- |
| apiUrl | Yes | LangSmith API URL (typically https://api.smith.langchain.com). |
| apiKey | Yes | API key for the destination workspace. |
| projectName | Yes | Project name in the destination workspace. |
| updates | No | Run fields to override on replicated runs, such as extra metadata. |
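Putting those fields together, a langsmith.json with two replica destinations might look like the sketch below. The API keys and project names are placeholders, and the nesting shown under updates is an assumption for illustration, not a documented schema:

```json
{
  "enabled": true,
  "api_key": "<your-langsmith-api-key>",
  "project": "codex",
  "replicas": [
    {
      "apiUrl": "https://api.smith.langchain.com",
      "apiKey": "<staging-workspace-api-key>",
      "projectName": "codex-staging"
    },
    {
      "apiUrl": "https://api.smith.langchain.com",
      "apiKey": "<prod-workspace-api-key>",
      "projectName": "codex-prod",
      "updates": {
        "metadata": { "replica": "prod" }
      }
    }
  ]
}
```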

What gets traced

Each LLM run includes:
  • Inputs: accumulated conversation messages.
  • Outputs: assistant response content.
  • Metadata: model provider, model name, stop reason, and token usage.
Tool calls (function calls, shell calls, computer calls, file reads, web searches) are included with their inputs and outputs. Subagent threads are resolved and uploaded as nested child runs under the parent turn. Interrupted turns where the user cancels mid-response are still uploaded once the session completes.
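As a rough sketch (run names and nesting are illustrative, not the exact LangSmith schema), a single traced turn might appear as:

```
Codex turn (parent run)
├── LLM run           inputs: accumulated messages; outputs: assistant response;
│                     metadata: provider, model, stop reason, token usage
├── Tool run          e.g. shell call, file read, or web search, with inputs/outputs
└── Subagent thread   resolved and uploaded as nested child runs
```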

View traces in LangSmith

Open the configured LangSmith project and complete a Codex turn. By default traces appear in the codex project. The plugin uploads completed Codex transcript data, including messages, tool call inputs and outputs, model metadata, token usage, and subagent thread structure.
Because the plugin uploads full transcript data, do not enable tracing for sessions that contain data you do not want stored in LangSmith.

Troubleshooting

If traces do not appear in LangSmith:
  • Confirm plugin_hooks = true and the tracing plugin is enabled in config.toml.
  • Confirm TRACE_TO_LANGSMITH=true is visible to the Codex process.
  • Confirm LANGSMITH_CODEX_API_KEY or LANGSMITH_API_KEY is set and valid.
  • If runs land in the wrong project, set LANGSMITH_CODEX_PROJECT or the project config key.
  • If a custom endpoint is not being picked up, set LANGSMITH_CODEX_ENDPOINT or the api_url config key.
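The environment-variable checks above can be scripted. A minimal sketch, assuming a POSIX shell (the check_tracing function name is ours, not part of the plugin):

```shell
# Report whether the tracing env vars are visible to this shell --
# run it from the same shell that launches Codex.
check_tracing() {
  if [ "$TRACE_TO_LANGSMITH" = "true" ]; then
    echo "TRACE_TO_LANGSMITH: ok"
  else
    echo "TRACE_TO_LANGSMITH: not \"true\" in this shell"
  fi
  if [ -n "$LANGSMITH_CODEX_API_KEY" ] || [ -n "$LANGSMITH_API_KEY" ]; then
    echo "API key: set"
  else
    echo "API key: missing"
  fi
}

check_tracing
```

To verify the plugin side as well, inspect ~/.codex/config.toml for plugin_hooks = true and the enabled tracing plugin entry.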