gemini-2.5-pro, gemini-2.5-flash, etc. For a full and updated list of available models, visit the VertexAI documentation.
Google Cloud VertexAI vs Gemini API
The Google Cloud VertexAI integration is separate from the Google Gemini API. This page covers the enterprise version of Gemini available through Google Cloud Platform (GCP).
Overview
Integration details
| Class | Package | Local | Serializable | JS support | Downloads | Version |
|---|---|---|---|---|---|---|
| ChatVertexAI | langchain-google-vertexai | ❌ | beta | ✅ | | |
Model features
| Tool calling | Structured output | JSON mode | Image input | Audio input | Video input | Token-level streaming | Native async | Token usage | Logprobs |
|---|---|---|---|---|---|---|---|---|---|
| ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
Setup
To access VertexAI models you’ll need to create a Google Cloud Platform account, set up credentials, and install the langchain-google-vertexai integration package.
Credentials
To use the integration you must either:
- Have credentials configured for your environment (gcloud, workload identity, etc.)
- Store the path to a service account JSON file as the GOOGLE_APPLICATION_CREDENTIALS environment variable

The integration uses the google.auth library, which first looks for the application credentials variable mentioned above, and then looks for system-level auth.
For more information, see the google.auth API reference.
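For example, a minimal sketch of pointing the integration at a service account key file before creating a model (the path below is a placeholder for your own key file):

```python
import os

# Path to your service account key file (placeholder; adjust to your setup).
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/path/to/service-account.json"
```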
To enable automated tracing of your model calls, set your LangSmith API key:
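A minimal sketch, assuming the standard LangSmith environment variable names (LANGSMITH_TRACING and LANGSMITH_API_KEY):

```python
import getpass
import os

# Enable tracing and provide your LangSmith API key (assumed variable names).
os.environ["LANGSMITH_TRACING"] = "true"
os.environ["LANGSMITH_API_KEY"] = getpass.getpass("Enter your LangSmith API key: ")
```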
Installation
The LangChain VertexAI integration lives in the langchain-google-vertexai package:
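For example, with pip:

```bash
pip install -U langchain-google-vertexai
```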
Instantiation
Now we can instantiate our model object and generate chat completions:
Invocation
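A minimal sketch covering both instantiation and a first invocation, assuming Application Default Credentials are configured and that gemini-2.5-flash is available in your project (the model name and settings are illustrative):

```python
from langchain_google_vertexai import ChatVertexAI

# Instantiate the chat model (model name and settings are illustrative).
llm = ChatVertexAI(
    model="gemini-2.5-flash",
    temperature=0,
    max_retries=2,
)

# Invoke the model with a simple list of (role, content) messages.
messages = [
    ("system", "You are a helpful assistant that translates English to French."),
    ("human", "I love programming."),
]
ai_msg = llm.invoke(messages)
print(ai_msg.content)
```

The returned AIMessage exposes the generated text via its content attribute.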
Built-in tools
Gemini supports a range of tools that are executed server-side.
Google search
Requires langchain-google-vertexai>=2.0.11
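A minimal sketch of binding the built-in Google search tool; the dict-style tool spec passed to bind_tools is an assumption based on recent langchain-google-vertexai releases:

```python
from langchain_google_vertexai import ChatVertexAI

# Bind the server-side Google search tool (tool spec shape is an assumption).
llm = ChatVertexAI(model="gemini-2.5-flash").bind_tools([{"google_search": {}}])

response = llm.invoke("What is today's news?")
print(response.content)
```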
Code execution
Requires langchain-google-vertexai>=2.0.25
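A minimal sketch of the built-in code execution tool, again assuming the dict-style tool spec:

```python
from langchain_google_vertexai import ChatVertexAI

# Bind the server-side code execution tool (tool spec shape is an assumption).
llm = ChatVertexAI(model="gemini-2.5-flash").bind_tools([{"code_execution": {}}])

response = llm.invoke("What is 3^3?")
print(response.content)
```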
API reference
For detailed documentation of all features and configuration options, head to the ChatVertexAI API reference.