Mastra is a TypeScript framework for building AI-powered applications and agents. Using Mastra’s LangSmith exporter, you can send traces from your Mastra agents and workflows to LangSmith for debugging, evaluation, and observability. This guide shows you how to integrate Mastra with LangSmith using Mastra’s AI tracing system.
Installation
Install Mastra and the LangSmith exporter:

Setup
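A minimal install might look like the following; the exact package names (@mastra/core, @mastra/langsmith, @mastra/libsql) are assumptions based on Mastra's package naming, so check your Mastra version's docs:

```shell
# Install Mastra core, the LangSmith exporter, and a storage adapter
npm install @mastra/core @mastra/langsmith @mastra/libsql
```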
Set your LangSmith API key and (optionally) a LangSmith project name:
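For example (variable names are assumptions; LangSmith tooling commonly reads both the newer LANGSMITH_* and the legacy LANGCHAIN_* names):

```shell
# LangSmith credentials and (optionally) a project name
export LANGSMITH_API_KEY="<your-langsmith-api-key>"
export LANGCHAIN_PROJECT="mastra-langsmith-demo"  # optional
```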
If you plan to use OpenAI models, also ensure you have an OpenAI API key available at runtime:
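For example:

```shell
export OPENAI_API_KEY="<your-openai-api-key>"
```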
In your project directory, create the following project structure and files:
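A layout along these lines works; the file names match the files created in the following sections, while the src/ prefix is an assumption:

```
.
├── src/
│   ├── mastra.ts   # Mastra instance configured with the LangSmith exporter
│   ├── agent.ts    # agent definition
│   └── index.ts    # entry point that runs the agent
├── package.json
└── .env
```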
Configure Mastra with the LangSmith exporter
Mastra tracing is configured directly on the Mastra constructor. Add the following to a mastra.ts file:
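A sketch of such a mastra.ts, assuming the exporter ships as @mastra/langsmith, LibSQL storage comes from @mastra/libsql, and the agent is defined in agent.ts; exact option names may differ across Mastra versions:

```typescript
import { Mastra } from "@mastra/core";
import { LibSQLStore } from "@mastra/libsql";
import { LangSmithExporter } from "@mastra/langsmith";
import { echoAgent } from "./agent";

export const mastra = new Mastra({
  agents: { echoAgent },
  // Storage is required for AI tracing, even when exporting externally.
  storage: new LibSQLStore({ url: ":memory:" }),
  observability: {
    configs: {
      langsmith: {
        serviceName: "mastra-langsmith-demo",
        exporters: [
          new LangSmithExporter({
            // Falls back to the LANGSMITH_API_KEY environment variable.
            apiKey: process.env.LANGSMITH_API_KEY,
          }),
        ],
      },
    },
  },
  // Disable the deprecated telemetry system to avoid warnings.
  telemetry: { enabled: false },
});
```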
- Storage is required for tracing (even when exporting traces externally).
- The LangSmith exporter reads credentials from environment variables.
- The deprecated telemetry system is disabled to avoid warnings.
- No separate instrumentation file is required when running Mastra outside of the Mastra server. For more details, refer to the Mastra docs.
Define an agent
For compatibility, use string-based model identifiers. Add the following code to an agent.ts file:
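A minimal agent sketch; the model identifier "openai/gpt-4o-mini" is an assumption, so substitute any model your provider keys support:

```typescript
import { Agent } from "@mastra/core/agent";

export const echoAgent = new Agent({
  name: "echoAgent",
  instructions: "Repeat the user's message back to them verbatim.",
  // String-based model identifier for compatibility.
  model: "openai/gpt-4o-mini",
});
```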
Run the agent
Add the following to an index.ts file:
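One possible index.ts, assuming the Mastra instance configured in mastra.ts registers echoAgent (top-level await requires an ESM setup):

```typescript
import { mastra } from "./mastra";

// Look up the registered agent and run a single generation.
const agent = mastra.getAgent("echoAgent");
const result = await agent.generate("Hello from Mastra!");

console.log(result.text);
```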
Run your application:
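For example, with tsx (an assumption; any TypeScript runner works):

```shell
npx tsx src/index.ts
```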
View traces in LangSmith
After running the agent:
- Open the LangSmith UI.
- Select your project, for example, the value of LANGCHAIN_PROJECT.
- Locate the trace corresponding to echoAgent.generate.

Each trace includes:
- Model inputs and outputs
- Agent execution steps
- Timing and error information