# ChatGoogleGenerativeAI

Google AI offers a number of different chat models. For detailed documentation of all ChatGoogleGenerativeAI features and configurations, head to the API reference.
| Class | Package | Local | Serializable | PY support | Package downloads | Package latest |
| :--- | :--- | :---: | :---: | :---: | :---: | :---: |
| ChatGoogleGenerativeAI | @langchain/google-genai | ❌ | ✅ | ✅ |  |  |
| Tool calling | Structured output | JSON mode | Image input | Audio input | Video input | Token-level streaming | Token usage | Logprobs |
| :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |
| ✅ | ✅ | ❌ | ✅ | ✅ | ✅ | ✅ | ✅ | ❌ |
You can access Google's gemini and gemini-vision models, as well as other generative models, in LangChain through the ChatGoogleGenerativeAI class in the @langchain/google-genai integration package.
To use the model, you'll need a Google AI API key; set it as the GOOGLE_API_KEY environment variable:
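For example, in a bash-compatible shell:

```shell
# Set your Google AI API key before launching your application.
# Replace the placeholder with your actual key.
export GOOGLE_API_KEY="your-api-key"
```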
The LangChain ChatGoogleGenerativeAI integration lives in the @langchain/google-genai package:
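For example, with npm (yarn and pnpm work similarly):

```shell
# Install the integration package along with @langchain/core, its peer dependency.
npm install @langchain/google-genai @langchain/core
```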
The integration is built on the official @google/generative-ai package. Once your environment is configured, construct your LLM as follows:
Gemini does not support arbitrary key/value object schemas when producing structured output, so Zod schemas like the following are rejected:

```typescript
const invalidSchema = z.object({ properties: z.record(z.unknown()) });
```

and

```typescript
const invalidSchema2 = z.record(z.unknown());
```

Instead, you should explicitly define the properties of the object field. Here's an example:
Using the built-in CodeExecutionTool, you can make the model generate code, execute it, and use the results in a final completion:
To use context caching, create a CachedContent object using the GoogleAICacheManager class, then pass the CachedContent object to your ChatGoogleGenerativeAI model with the enableCachedContent() method.