This guide shows how to trace Google Gemini model calls with LangSmith: install the google-genai SDK (Python) or @google/genai SDK (JavaScript), wrap the Gemini client for tracing, and try examples including basic prompts, metadata tagging, and multi-turn conversations.
The LangSmith Gemini wrappers are in beta. The API may change in future releases.
Installation
Install the required packages using your preferred package manager:

Setup
Set your API keys and project name:

Configure tracing
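Concretely, the installation and setup steps above might look like the following. The package names come from this guide; the environment variable names are the standard LangSmith and Gemini ones, so verify them against your own configuration:

```shell
# Install the SDKs (Python shown; for JavaScript: npm install langsmith @google/genai)
pip install -U langsmith google-genai

# Enable LangSmith tracing and set credentials (values are placeholders)
export LANGSMITH_TRACING=true
export LANGSMITH_API_KEY="<your-langsmith-api-key>"
export LANGSMITH_PROJECT="my-gemini-project"

# API key for the Gemini API
export GOOGLE_API_KEY="<your-google-api-key>"
```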
To trace Gemini API calls, use LangSmith's wrap_gemini (Python) or wrapGemini (JavaScript) wrapper function. The wrapper intercepts calls to the Gemini client and automatically logs them as traces in LangSmith, while preserving all of the original client's functionality:
- Python
- JavaScript
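A minimal Python sketch of the wrapping step, assuming the beta wrapper is importable from langsmith.wrappers and that the API keys from the setup step are present in the environment:

```python
from google import genai
from langsmith.wrappers import wrap_gemini  # beta; import path may change

# Wrap the client; subsequent calls are logged as traces in LangSmith.
client = wrap_gemini(genai.Client())

# Use the wrapped client exactly like the original one.
response = client.models.generate_content(
    model="gemini-2.0-flash",
    contents="Write a haiku about observability.",
)
print(response.text)
```

Because the wrapper is transparent, existing google-genai code needs no changes beyond wrapping the client at construction time.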
You can customize tracing by passing tracing_extra when calling wrap_gemini(). This parameter applies to all subsequent requests made with the wrapped client, letting you attach tags and metadata for filtering and organizing traces in the LangSmith UI. The tracing_extra parameter accepts:

- tags: A list of strings to categorize traces (for example, ["production", "gemini"]).
- metadata: A dictionary of key-value pairs for additional context (for example, {"team": "ml-research", "integration": "google-genai"}).
- client: An optional custom LangSmith client instance.
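For example, passing tags and metadata at wrap time (a sketch; the tag and metadata values are illustrative):

```python
from google import genai
from langsmith.wrappers import wrap_gemini  # beta

# Every request made with this client will carry these tags and metadata.
client = wrap_gemini(
    genai.Client(),
    tracing_extra={
        "tags": ["production", "gemini"],
        "metadata": {"team": "ml-research", "integration": "google-genai"},
    },
)

response = client.models.generate_content(
    model="gemini-2.0-flash",
    contents="Summarize the benefits of tracing in one sentence.",
)
```

In the LangSmith UI you can then filter traces by any of these tags or metadata keys.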
View traces in LangSmith
After running your application, you can view traces in the LangSmith UI that include:

- Model requests: Complete prompts sent to Gemini models
- Model responses: Generated text and structured outputs
- Function calls: Tool invocations and results when using function calling
- Chat sessions: Multi-turn conversation context
- Performance metrics: Latency and token usage information
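As a sketch of how a multi-turn conversation appears as a single chat session, using the google-genai chat API (this assumes the wrapped client also traces chat calls, which should be verified against the beta wrapper):

```python
from google import genai
from langsmith.wrappers import wrap_gemini  # beta

client = wrap_gemini(genai.Client())

# Each send_message call shares the same conversation context,
# so the turns are grouped together in the traced chat session.
chat = client.chats.create(model="gemini-2.0-flash")
first = chat.send_message("What is LangSmith?")
follow_up = chat.send_message("How does it help with multi-turn chats?")
print(follow_up.text)
```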