This guide shows you how to trace and log calls to Google’s Gemini models in LangSmith. You’ll instrument Gemini calls using the latest google-genai SDK (Python) or @google/genai SDK (JavaScript), wrap the Gemini client for tracing, and work through examples covering basic prompts, metadata tagging, and multi-turn conversations.
The LangSmith Gemini wrappers are in beta. The API may change in future releases.

Installation

Install the required packages using your preferred package manager:
pip install langsmith google-genai
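If you’re following along with the JavaScript SDK (@google/genai) instead, the equivalent install is:
npm install langsmith @google/genai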

Setup

Set your API keys and project name:
export LANGSMITH_API_KEY=<your_langsmith_api_key>
export LANGSMITH_PROJECT=<your_project_name>
export LANGSMITH_TRACING=true
export GOOGLE_API_KEY=<your_google_api_key>
To create a Google API key, refer to Google AI Studio.
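If you prefer to set these in code (for example, in a notebook), here is a minimal equivalent using os.environ; run it before creating any clients:
import os

# Mirrors the shell exports above; replace the placeholders with real values
os.environ["LANGSMITH_API_KEY"] = "<your_langsmith_api_key>"
os.environ["LANGSMITH_PROJECT"] = "<your_project_name>"
os.environ["LANGSMITH_TRACING"] = "true"
os.environ["GOOGLE_API_KEY"] = "<your_google_api_key>"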

Configure tracing

To trace Gemini API calls, use LangSmith’s wrap_gemini (Python) or wrapGemini (JavaScript) wrapper function. The wrapper intercepts calls to the Gemini client and automatically logs them as traces in LangSmith, preserving all of the original client’s functionality while adding observability:
from google import genai
from langsmith import wrappers

def main():
    # genai.Client() reads GOOGLE_API_KEY / GEMINI_API_KEY from the environment
    gemini_client = genai.Client()

    # Wrap the Gemini client to enable LangSmith tracing
    client = wrappers.wrap_gemini(
        gemini_client,
        tracing_extra={
            "tags": ["gemini", "python"],
            "metadata": {
                "integration": "google-genai",
            },
        },
    )

    # Make a traced Gemini call
    response = client.models.generate_content(
        model="gemini-2.5-flash",
        contents="Explain quantum computing in simple terms.",
    )

    print(response.text)


if __name__ == "__main__":
    main()
You can customize tracing by passing tracing_extra when calling wrap_gemini(). This parameter applies to every subsequent request made with the wrapped client, letting you attach tags and metadata for filtering and organizing traces in the LangSmith UI. The tracing_extra parameter accepts:
  • tags: A list of strings to categorize traces (for example, ["production", "gemini"]).
  • metadata: A dictionary of key-value pairs for additional context (for example, {"team": "ml-research", "integration": "google-genai"}).
  • client: An optional custom LangSmith client instance (see the sketch below).
These settings apply consistently across all traces from the wrapped client, making them a good place for environment-level tags or team metadata that should remain constant throughout your application.
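For example, to route traces through a specific LangSmith client rather than the default one, pass it under the client key. A minimal sketch; any arguments you might give Client() (such as api_key or api_url) are illustrative:
from google import genai
from langsmith import Client, wrappers

# A custom LangSmith client, e.g. configured for a specific workspace
ls_client = Client()

client = wrappers.wrap_gemini(
    genai.Client(),
    tracing_extra={
        "client": ls_client,
        "tags": ["staging"],
        "metadata": {"team": "ml-research"},
    },
)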

View traces in LangSmith

After running your application, you can view traces in the LangSmith UI that include:
  • Model requests: Complete prompts sent to Gemini models
  • Model responses: Generated text and structured outputs
  • Function calls: Tool invocations and results when using function calling (sketch below)
  • Chat sessions: Multi-turn conversation context (sketch below)
  • Performance metrics: Latency and token usage information
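To exercise the Function calls and Chat sessions entries above, you can use the wrapped client directly. First, a minimal function-calling sketch; get_weather is a hypothetical tool, and the sketch assumes the wrapper logs the automatic tool invocation alongside the model call:
from google import genai
from google.genai import types
from langsmith import wrappers

client = wrappers.wrap_gemini(genai.Client())

def get_weather(city: str) -> str:
    """Return a short weather description for a city (hypothetical tool)."""
    return f"It is sunny in {city}."

# Passing a Python callable in tools enables the SDK's automatic function
# calling: the model requests the tool, the SDK runs it, and the final
# text answer incorporates the result
response = client.models.generate_content(
    model="gemini-2.5-flash",
    contents="What is the weather in Paris right now?",
    config=types.GenerateContentConfig(tools=[get_weather]),
)
print(response.text)
And a minimal multi-turn sketch using the SDK’s chat interface, assuming the wrapper traces chats.create and send_message the same way it traces generate_content:
from google import genai
from langsmith import wrappers

client = wrappers.wrap_gemini(genai.Client())

# Each send_message call carries the prior turns, so the logged trace
# captures the full conversation context
chat = client.chats.create(model="gemini-2.5-flash")

first = chat.send_message("Summarize quantum computing in one sentence.")
print(first.text)

followup = chat.send_message("Now explain it to a five-year-old.")
print(followup.text)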
