Observability is important for any software application, but especially for LLM applications. LLMs are non-deterministic by nature, meaning they can produce unexpected results, which makes them harder to debug than traditional software. This is where LangSmith can help! LangSmith gives you visibility into each step your application takes when handling a request, helping you debug faster and gain confidence in your app. From prototyping to production, LangSmith has you covered with tracing, filtering, charting, and alerting to keep your application reliable at scale.

Get started

This tutorial will show you how to instrument a simple RAG application that consists of a retrieval step to fetch data and an LLM call to OpenAI to answer the user's question based on that data.
If you’re building an application with LangChain or LangGraph, get started by reading the guides for tracing with LangChain or tracing with LangGraph.

1. Install Dependencies

pip install -U langsmith openai

2. Create an API key

To create an API key, head to the LangSmith settings page, then click + API Key.

3. Set up environment variables

This example uses OpenAI, but you can adapt it to any LLM provider. If you're using Anthropic, use the Anthropic wrapper to trace your calls (see the sketch after the environment variables below). For other providers, use the traceable wrapper.
export LANGSMITH_TRACING=true
export LANGSMITH_API_KEY="<your-langsmith-api-key>"
export OPENAI_API_KEY="<your-openai-api-key>"
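If you're calling Anthropic instead of OpenAI, the equivalent setup is a wrapped Anthropic client. Here is a minimal sketch, assuming the anthropic SDK is installed and your langsmith version ships wrap_anthropic (the model name is just a placeholder):

from anthropic import Anthropic
from langsmith.wrappers import wrap_anthropic

# Wrap the client so every messages.create call is traced to LangSmith
anthropic_client = wrap_anthropic(Anthropic())

response = anthropic_client.messages.create(
    model="claude-3-5-sonnet-latest",  # placeholder model name
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello!"}],
)

In that case you would also set ANTHROPIC_API_KEY instead of OPENAI_API_KEY.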

4. Define your application

We will instrument a simple RAG application for this tutorial, but feel free to use your own code if you'd like; just make sure it has an LLM call!
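For reference, a baseline version of the app without any tracing might look like the sketch below; the retriever is mocked out so the example stays self-contained:

from openai import OpenAI

openai_client = OpenAI()

# Mocked retriever standing in for a real vector store or search index
def retriever(query: str):
    return ["Harrison worked at Kensho"]

# End-to-end RAG chain: retrieve context, then ask the model
def rag(question):
    docs = retriever(question)
    system_message = """Answer the user's question using only the provided information below:
        {docs}""".format(docs="\n".join(docs))

    return openai_client.chat.completions.create(
        messages=[
            {"role": "system", "content": system_message},
            {"role": "user", "content": question},
        ],
        model="gpt-4o-mini",
    )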

5. Trace OpenAI calls

The first thing you might want to trace is all your OpenAI calls. LangSmith makes this easy with the wrap_openai (Python) or wrapOpenAI (TypeScript) wrappers. All you have to do is modify your code to use the wrapped client instead of using the OpenAI client directly.
from openai import OpenAI
from langsmith.wrappers import wrap_openai

openai_client = wrap_openai(OpenAI())

# This is the retriever we will use in RAG
# This is mocked out, but it could be anything we want
def retriever(query: str):
    results = ["Harrison worked at Kensho"]
    return results

# This is the end-to-end RAG chain.
# It does a retrieval step then calls OpenAI
def rag(question):
    docs = retriever(question)
    system_message = """Answer the user's question using only the provided information below:
        {docs}""".format(docs="\n".join(docs))
        
    return openai_client.chat.completions.create(
        messages=[
            {"role": "system", "content": system_message},
            {"role": "user", "content": question},
        ],
        model="gpt-4o-mini",
    )
Now, call your application as follows:
rag("where did harrison work")
This will produce a trace of just the OpenAI call in LangSmith's default tracing project.
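The wrapped client returns a standard OpenAI response object, so you can read the model's answer as usual:

response = rag("where did harrison work")
print(response.choices[0].message.content)
# e.g. "Harrison worked at Kensho."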

6. Trace entire application

You can also use the traceable decorator (Python or TypeScript) to trace your entire application instead of just the LLM calls.
from openai import OpenAI
from langsmith import traceable
from langsmith.wrappers import wrap_openai

openai_client = wrap_openai(OpenAI())

def retriever(query: str):
    results = ["Harrison worked at Kensho"]
    return results

@traceable
def rag(question):
    docs = retriever(question)
    system_message = """Answer the user's question using only the provided information below:
        {docs}""".format(docs="\n".join(docs))
        
    return openai_client.chat.completions.create(
        messages=[
            {"role": "system", "content": system_message},
            {"role": "user", "content": question},
        ],
        model="gpt-4o-mini",
    )
Now, call your application as follows:
rag("where did harrison work")
This will produce a trace of the entire pipeline, with the OpenAI call as a child run.
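As an optional refinement, you can make the retrieval step show up as its own child run in the trace tree. Here is a minimal sketch, assuming the run_type, name, and tags keyword arguments available on traceable:

from langsmith import traceable

# run_type="retriever" renders this step as a retrieval run in the trace
@traceable(run_type="retriever")
def retriever(query: str):
    return ["Harrison worked at Kensho"]

# Naming and tagging the top-level run makes it easier to filter later
@traceable(name="rag-pipeline", tags=["tutorial"])
def rag(question):
    ...  # same body as in step 6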

Next steps

Congratulations! If you've made it this far, you're well on your way to becoming an expert in observability with LangSmith. From here, you might explore filtering traces, charting metrics, and setting up alerts. If you prefer a video tutorial, check out the Tracing Basics video from the Introduction to LangSmith Course.