`gemini-2.5-pro`, `gemini-2.5-flash`, etc. For a full and updated list of available models, visit the Vertex AI documentation.
**Google Cloud VertexAI vs Google PaLM:** The Google Cloud VertexAI integration is separate from the Google PaLM integration. Google has chosen to offer an enterprise version of PaLM through GCP, and this integration supports the models made available there.
## Overview
### Integration details
| Class | Package | Local | Serializable | JS support |
| --- | --- | --- | --- | --- |
| ChatVertexAI | `langchain-google-vertexai` | ❌ | beta | ✅ |
### Model features
| Tool calling | Structured output | JSON mode | Image input | Audio input | Video input | Token-level streaming | Native async | Token usage | Logprobs |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| ✅ | ✅ | ❌ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ❌ |
## Setup
To access VertexAI models you'll need to create a Google Cloud Platform account, set up credentials, and install the `langchain-google-vertexai` integration package.
### Credentials
To use the integration you must either:

- Have credentials configured for your environment (gcloud, workload identity, etc.), or
- Store the path to a service account JSON file in the `GOOGLE_APPLICATION_CREDENTIALS` environment variable

Authentication is handled by the `google.auth` library, which first looks for the application credentials variable mentioned above, and then looks for system-level auth.
For more information, see:
- cloud.google.com/docs/authentication/application-default-credentials#GAC
- googleapis.dev/python/google-auth/latest/reference/google.auth.html#module-google.auth
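For the service-account option, the variable can be set in your shell before starting Python. A minimal sketch (the file path below is a placeholder for your own key file):

```shell
# Point Google client libraries at a service account key file.
# The path is a placeholder; substitute the location of your own JSON key.
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/service-account.json"
```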
### Installation
The LangChain VertexAI integration lives in the `langchain-google-vertexai` package:
## Instantiation

Now we can instantiate our model object and generate chat completions:

## Invocation
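A minimal sketch of instantiating the model and invoking it, assuming Application Default Credentials are configured for a GCP project with Vertex AI enabled; the model name, parameters, and prompt are illustrative:

```python
from langchain_google_vertexai import ChatVertexAI

# Model name and tuning parameters are illustrative; any Gemini model
# available in your project can be used.
llm = ChatVertexAI(
    model="gemini-2.5-flash",
    temperature=0,
    max_retries=2,
)

# Messages can be given as (role, content) tuples.
messages = [
    ("system", "You are a helpful translator. Translate the user sentence to French."),
    ("human", "I love programming."),
]
ai_msg = llm.invoke(messages)
print(ai_msg.content)
```

Running this requires valid Google Cloud credentials, so no output is shown here.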
## Built-in tools

Gemini supports a range of tools that are executed server-side.

### Google search
Requires `langchain-google-vertexai>=2.0.11`.
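A minimal sketch of enabling server-side Google search, assuming credentials are configured; the model name and prompt are illustrative. The tool is bound by passing a `{"google_search": {}}` dict to `bind_tools`, and the model decides when to invoke it:

```python
from langchain_google_vertexai import ChatVertexAI

# Bind the built-in Google Search tool; the search runs on Google's servers.
llm = ChatVertexAI(model="gemini-2.5-flash").bind_tools([{"google_search": {}}])

response = llm.invoke("When is the next total solar eclipse visible from the US?")
print(response.content)
```

As with the instantiation example, this requires valid Google Cloud credentials to run.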
### Code execution

Requires `langchain-google-vertexai>=2.0.25`.
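Code execution follows the same pattern as search: a `{"code_execution": {}}` dict is passed to `bind_tools`, and any code the model writes is executed server-side. A minimal sketch, again assuming configured credentials and an illustrative model name and prompt:

```python
from langchain_google_vertexai import ChatVertexAI

# Bind the built-in code execution tool; generated code runs on Google's side.
llm = ChatVertexAI(model="gemini-2.5-flash").bind_tools([{"code_execution": {}}])

response = llm.invoke("What is 3^3? Write and run Python code to compute it.")
print(response.content)
```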