LangSmith can capture traces generated by LiveKit Agents using OpenTelemetry instrumentation. This guide shows you how to automatically capture traces from your LiveKit voice AI agents and send them to LangSmith for monitoring and analysis. For a complete implementation, see the demo repository.
Installation
Install the required packages:
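The install command itself was not preserved on this page. A plausible set of packages for this setup is shown below — the exact package names and plugin extras are assumptions; install the plugins your STT/LLM/TTS providers require:

```shell
pip install "livekit-agents[openai,silero]" \
    python-dotenv \
    opentelemetry-sdk \
    opentelemetry-exporter-otlp-proto-http
```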
Quickstart tutorial

Follow this step-by-step tutorial to create a voice AI agent with LiveKit and LangSmith tracing. You’ll build a complete working example by copying and pasting code snippets.

Step 1: Set up your environment
Create a `.env` file in your project directory:
.env
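The file contents were stripped from this page. A typical `.env` for this guide looks like the following — the LangSmith OTLP endpoint and header format shown are assumptions, so confirm them against your LangSmith settings:

```
# LiveKit credentials
LIVEKIT_URL=wss://your-project.livekit.cloud
LIVEKIT_API_KEY=your-livekit-api-key
LIVEKIT_API_SECRET=your-livekit-api-secret

# Model provider key(s)
OPENAI_API_KEY=your-openai-api-key

# LangSmith via OTLP (endpoint and header name assumed; verify in LangSmith)
OTEL_EXPORTER_OTLP_ENDPOINT=https://api.smith.langchain.com/otel
OTEL_EXPORTER_OTLP_HEADERS=x-api-key=your-langsmith-api-key
```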
Step 2: Download the span processor
Add the custom span processor file that enables LangSmith tracing. Save it as `langsmith_processor.py` in your project directory.
What does the span processor do?
The span processor enriches LiveKit Agents’ OpenTelemetry spans with LangSmith-compatible attributes so your traces display properly in LangSmith. Key functions:
- Converts LiveKit span types (stt, llm, tts, agent, session, job) to LangSmith format.
- Adds `gen_ai.prompt.*` and `gen_ai.completion.*` attributes for message visualization.
- Tracks and aggregates conversation messages across turns.
- Uses multiple extraction strategies to handle various LiveKit attribute formats.
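To make the first two functions concrete, here is a self-contained sketch of the kind of mapping the processor performs. The span-kind table and the helper function are hypothetical illustrations, not the real `langsmith_processor.py`; the `gen_ai.prompt.N.role` / `.content` keys follow the OpenTelemetry GenAI conventions, but the exact attributes the real processor emits may differ:

```python
# Hypothetical mapping from LiveKit span types to LangSmith run types.
_SPAN_KIND = {
    "stt": "tool",
    "llm": "llm",
    "tts": "tool",
    "agent": "chain",
    "session": "chain",
    "job": "chain",
}

def langsmith_attributes(livekit_span_type, messages):
    """Build a flat attribute dict from a span type and a list of chat messages."""
    attrs = {"langsmith.span.kind": _SPAN_KIND.get(livekit_span_type, "chain")}
    # Flatten each message into indexed gen_ai.prompt.* attributes so
    # LangSmith can render the conversation.
    for i, msg in enumerate(messages):
        attrs[f"gen_ai.prompt.{i}.role"] = msg["role"]
        attrs[f"gen_ai.prompt.{i}.content"] = msg["content"]
    return attrs

attrs = langsmith_attributes("llm", [
    {"role": "system", "content": "You are a helpful voice assistant."},
    {"role": "user", "content": "What's the weather like?"},
])
print(attrs["langsmith.span.kind"])   # llm
print(attrs["gen_ai.prompt.1.role"])  # user
```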
Step 3: Create your voice agent file
Create a new file called `agent.py` and add the following code. We’ll build it section by section so you can copy and paste each part.
Part 1: Import dependencies and set up tracing
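The code for this part was not preserved here. A sketch of what it likely contains follows, assuming the OpenTelemetry SDK packages from Step 1 and the `langsmith_processor.py` file from Step 2. `setup_langsmith` is the helper name this guide refers to later; the `set_tracer_provider` import path may vary by `livekit-agents` version:

```python
import os

from dotenv import load_dotenv
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

from livekit.agents.telemetry import set_tracer_provider
from langsmith_processor import LangSmithSpanProcessor

load_dotenv()

def setup_langsmith() -> None:
    """Route LiveKit Agents spans to LangSmith via OTLP."""
    provider = TracerProvider()
    # Enrich spans with LangSmith-compatible attributes first...
    provider.add_span_processor(LangSmithSpanProcessor())
    # ...then export them; the OTLP exporter reads OTEL_EXPORTER_OTLP_ENDPOINT
    # and OTEL_EXPORTER_OTLP_HEADERS from the environment.
    provider.add_span_processor(BatchSpanProcessor(OTLPSpanExporter()))
    set_tracer_provider(provider)
    print("✅ LangSmith tracing enabled")
```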
Part 2: Define your agent
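This part's code was also stripped. A minimal agent definition might look like the following — the class name and instructions string are illustrative, not taken from the original:

```python
from livekit.agents import Agent

class Assistant(Agent):
    def __init__(self) -> None:
        super().__init__(
            instructions="You are a helpful voice AI assistant. Keep answers brief.",
        )
```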
Part 3: Set up the agent server
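The troubleshooting section below refers to an `AgentServer`; depending on your `livekit-agents` version, the worker may instead be started via `WorkerOptions` and `cli.run_app`, as in this sketch. The provider plugins are assumptions carried over from Step 1, and `Assistant` / `setup_langsmith` are the names sketched in Parts 1 and 2:

```python
from livekit import agents
from livekit.agents import AgentSession
from livekit.plugins import openai, silero

async def entrypoint(ctx: agents.JobContext):
    await ctx.connect()
    session = AgentSession(
        stt=openai.STT(),      # swap in the STT/LLM/TTS plugins you installed
        llm=openai.LLM(),
        tts=openai.TTS(),
        vad=silero.VAD.load(),
    )
    await session.start(room=ctx.room, agent=Assistant())

if __name__ == "__main__":
    setup_langsmith()  # must run before the worker starts (see Troubleshooting)
    agents.cli.run_app(agents.WorkerOptions(entrypoint_fnc=entrypoint))
```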
Step 4: Run your agent
Run your voice agent in console mode for local testing:
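The run command was stripped from this page; per the Troubleshooting section below, console mode is started with:

```shell
python agent.py console
```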
Advanced usage

Custom metadata and tags
You can add custom metadata to your traces using span attributes:
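The example code was not preserved. A sketch follows, assuming LangSmith's OTLP ingestion recognizes `langsmith.metadata.*` and `langsmith.span.tags` attributes — confirm the exact keys in the LangSmith OpenTelemetry documentation; the key/value strings here are hypothetical:

```python
from opentelemetry import trace

# Attach metadata to whatever span is currently active.
span = trace.get_current_span()
span.set_attribute("langsmith.metadata.user_id", "user-123")
span.set_attribute("langsmith.span.tags", "voice-agent,production")
```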
Troubleshooting

Spans not appearing in LangSmith
If traces aren’t showing up in LangSmith:

- Verify environment variables: Ensure `OTEL_EXPORTER_OTLP_ENDPOINT` and `OTEL_EXPORTER_OTLP_HEADERS` are set correctly in your `.env` file.
- Check setup order: Make sure `setup_langsmith()` is called before creating `AgentServer`.
- Check API key: Confirm your LangSmith API key has write permissions.
- Look for confirmation: You should see “✅ LangSmith tracing enabled” in the console when starting.
Messages not showing correctly
If conversation messages aren’t displaying properly:

- Check span processor: Verify `langsmith_processor.py` is in your project directory and imported correctly.
- Verify imports: Ensure `LangSmithSpanProcessor` is imported in your `agent.py`.
- Enable debug logging: Set `LANGSMITH_PROCESSOR_DEBUG=true` in your environment to see detailed logs.
Connection issues
If your agent can’t connect to LiveKit:

- Verify LiveKit URL: Check `LIVEKIT_URL` is set correctly in your `.env` file.
- Check credentials: Ensure `LIVEKIT_API_KEY` and `LIVEKIT_API_SECRET` are correct.
- Test connection: Try connecting to your LiveKit server with the LiveKit CLI first.
- Console mode: For local testing, always use `python agent.py console`.
Import errors
If you’re getting import errors:

- Install dependencies: Run the complete pip install command from Step 1.
- Check Python version: Ensure you’re using Python 3.9 or higher.
- Verify langsmith_processor: Make sure `langsmith_processor.py` is downloaded and in the same directory as `agent.py`.
- Check LiveKit plugins: Ensure you have the correct LiveKit plugins installed for your STT/LLM/TTS providers.
Agent not responding
If your agent connects but doesn’t respond:

- Check API keys: Verify your OpenAI API key (or other provider keys) is correct.
- Test services: Ensure your STT, LLM, and TTS services are accessible.
- Check instructions: Make sure your Agent has proper instructions.
- Review logs: Look for errors in the console output.

