Dynamic GCP credentials are not just about rotating secrets; they’re about ephemeral identity for your applications.
Let’s see Vault’s GCP secrets engine in action. Imagine you have an application running on GKE that needs to access a Cloud Storage bucket. Instead of baking a long-lived service account key into your Kubernetes deployment, you can have Vault mint short-lived credentials on demand.
First, you configure the GCP secrets engine in Vault:
{
  "credentials": "{ \"type\": \"service_account\", ... your GCP service account key JSON here ... }",
  "ttl": "15m",
  "max_ttl": "1h"
}
This payload is written to Vault's gcp/config endpoint. The credentials field holds the key of a service account that Vault itself uses to act as an authorized entity within your GCP project: creating service accounts, binding IAM roles, and minting keys. The ttl sets the default lease duration for credentials the engine generates, and max_ttl caps how long a lease can be extended.
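In practice this configuration is usually applied with the Vault CLI rather than a raw API payload. A minimal sketch, assuming the key file lives locally at sa-key.json:

```shell
# Enable the GCP secrets engine and point it at the service account key
# Vault will use to manage GCP IAM. (sa-key.json is an assumed local path;
# ttl and max_ttl set default and maximum lease durations.)
vault secrets enable gcp
vault write gcp/config credentials=@sa-key.json ttl=15m max_ttl=1h
```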
Then, you define a roleset within Vault that maps to a dedicated GCP service account and a set of IAM bindings:
{
  "project": "your-gcp-project-id",
  "secret_type": "service_account_key",
  "bindings": "resource \"buckets/your-bucket-name\" { roles = [\"roles/storage.objectViewer\"] }"
}
This payload is written to gcp/roleset/gke-storage-reader. Vault creates and manages a dedicated GCP service account for the roleset (with a generated name such as vaultgke-storage-r-1234567890), grants it the IAM roles listed in the bindings (here, read-only object access to the bucket), and then mints short-lived keys for it. With the ttl of 15m set in gcp/config, each generated key is leased for only 15 minutes; when the lease expires, Vault deletes the key in GCP.
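The same roleset can be created from the CLI, with the bindings supplied as HCL; a sketch, assuming a bucket named your-bucket-name:

```shell
# Create the roleset; the bindings HCL is read from stdin via the "-" value.
vault write gcp/roleset/gke-storage-reader \
    project="your-gcp-project-id" \
    secret_type="service_account_key" \
    bindings=-<<EOF
resource "buckets/your-bucket-name" {
  roles = ["roles/storage.objectViewer"]
}
EOF
```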
Now, your GKE application, running with a Kubernetes Service Account that has been granted permission to authenticate to Vault, can request credentials:
vault read gcp/roleset/gke-storage-reader/key
Vault will respond with a JSON payload whose private_key_data field is a base64-encoded GCP service account key file; decoding it yields the familiar client_email, private_key, and private_key_id fields – effectively a temporary GCP service account key. Your application can then use these credentials to authenticate with GCP services, like Cloud Storage.
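A quick sketch of the decode step. The real pipeline would read the field straight from Vault; here a fabricated placeholder value stands in so the decoding itself can run anywhere:

```shell
# In a real deployment you would pipe directly from Vault, e.g.:
#   vault read -field=private_key_data gcp/roleset/gke-storage-reader/key \
#     | base64 --decode > /tmp/gcp-key.json
# Below, a fabricated private_key_data (NOT a real credential) stands in.
PRIVATE_KEY_DATA=$(printf '{"client_email":"placeholder@your-gcp-project-id.iam.gserviceaccount.com","private_key_id":"abc123"}' | base64)

# private_key_data is a base64-encoded service account key file; decode it
# to recover the familiar JSON key structure.
printf '%s' "$PRIVATE_KEY_DATA" | base64 --decode > /tmp/gcp-key.json
cat /tmp/gcp-key.json
```

The decoded file can be handed to any GCP client library like a static key file, with the difference that it stops working when the Vault lease expires.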
The core problem this solves is the management of long-lived service account keys, which are a major security risk. When a key is compromised, it provides broad access for its entire lifetime. Dynamic credentials, with their short TTLs, drastically reduce the window of opportunity for an attacker. Vault acts as the trusted intermediary, abstracting away the complexity of GCP IAM for your applications. It creates and manages the necessary GCP service accounts and mints short-lived keys, ensuring that even if a credential is leaked, it’s only useful for a very brief period. The roles in the roleset’s bindings are exactly the IAM roles granted to the dynamically created GCP service account, enforcing the principle of least privilege.
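The “granted permission to authenticate to Vault” step for the GKE application is typically the Kubernetes auth method. A sketch of the pod-side login, assuming an auth role named gke-app has already been configured on the Vault server:

```shell
# Read the pod's projected service account token and exchange it for a
# Vault token via the Kubernetes auth method. (Role name "gke-app" is an
# assumption; auth/kubernetes must already be enabled and configured.)
JWT=$(cat /var/run/secrets/kubernetes.io/serviceaccount/token)
VAULT_TOKEN=$(vault write -field=token auth/kubernetes/login \
    role="gke-app" jwt="$JWT")

# The resulting token is then used to read the dynamic GCP credentials.
VAULT_TOKEN=$VAULT_TOKEN vault read gcp/roleset/gke-storage-reader/key
```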
What most people miss is how Vault actually provisions the GCP service accounts. It doesn’t just mint keys; it uses the credentials configured at gcp/config to call GCP’s IAM API. That service account therefore needs permission to create, update, and delete service accounts and their keys, and to modify IAM policy on the resources named in the bindings; in practice this usually means roles/iam.serviceAccountAdmin and roles/iam.serviceAccountKeyAdmin on the project. If those permissions are missing, the roleset write (which is when Vault creates the target service account) or the subsequent vault read gcp/roleset/gke-storage-reader/key will fail with a permission-denied error, even if Vault itself is healthy.
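Granting those permissions to Vault’s own service account might look like the following sketch; the service account email vault-server@your-gcp-project-id.iam.gserviceaccount.com is an assumption:

```shell
# Allow Vault's service account to create/delete service accounts and keys.
# (The member email is a placeholder for Vault's actual identity.)
gcloud projects add-iam-policy-binding your-gcp-project-id \
    --member="serviceAccount:vault-server@your-gcp-project-id.iam.gserviceaccount.com" \
    --role="roles/iam.serviceAccountAdmin"
gcloud projects add-iam-policy-binding your-gcp-project-id \
    --member="serviceAccount:vault-server@your-gcp-project-id.iam.gserviceaccount.com" \
    --role="roles/iam.serviceAccountKeyAdmin"
```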
Once you’ve mastered dynamic credentials, the next logical step is integrating Vault’s certificate management for TLS termination.