Connecting Azure DevOps Pipelines with HashiCorp Vault


Integrating Microsoft Azure DevOps pipelines with HashiCorp Vault has historically been a complex task. Unlike GitHub, Azure DevOps lacks an implicit platform-level identity for its pipelines, which complicates the process of integrating with external services like Vault. Previously, the best practice involved leveraging Azure platform features such as service connections and service principal names (SPNs) to create a platform identity that allowed secure interaction with Vault. However, this approach had its limitations and challenges.

In February 2024, Microsoft announced the general availability of workload identity federation (WIF) for Azure DevOps. This feature significantly simplifies the integration process by providing a passwordless, secure method of authentication using the widely adopted OpenID Connect (OIDC) standard. WIF resolves the “secret zero problem” that plagued previous integrations, eliminating the need to manage and secure credentials manually. This enhancement allows Azure DevOps pipelines to access secret data managed centrally by HashiCorp Vault, including static and dynamic secrets and other credentials.

This post demonstrates the integration using HCP Vault Dedicated, HashiCorp’s cloud-hosted, single-tenant Vault offering. The principles discussed are also applicable to Vault Community and Vault Enterprise, provided they are hosted in a location accessible from within Azure.

Configuring Azure DevOps for Workload Identity Federation

There are two methods to configure WIF within an Azure DevOps service connection:

The first and recommended method is to let Azure DevOps automatically create all the necessary objects within Azure DevOps and Entra ID for you.

The second method integrates the Azure DevOps service connection with an existing workload identity, leaving you to manage the integration touchpoints yourself. It offers more control over the configuration, which is useful when you already have service principals or managed identities (and their associated SPNs) scoped to the roles your pipelines require.

Neither method is particularly difficult, and the end-to-end configuration can be accomplished using a straightforward Terraform configuration, which is especially beneficial for large-scale implementations.

This post focuses on the first method of integration and the Terraform configurations required to set it up.

Creating the Service Connection with WIF

To configure the service connection using Terraform, you need four pieces of information:

  1. A project_id, representing the Azure DevOps project
  2. An azurerm_spn_tenantid, representing the Azure tenant
  3. An azurerm_subscription_id, representing the Azure subscription
  4. An azurerm_subscription_name, representing the display name of the Azure subscription

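Rather than hard-coding these values as in the example below, they could be surfaced as Terraform input variables; a minimal sketch (the variable names mirror the provider arguments and are illustrative):

```hcl
variable "project_id" {
  type        = string
  description = "ID of the Azure DevOps project that will own the service connection"
}

variable "azurerm_spn_tenantid" {
  type        = string
  description = "Entra ID tenant ID"
}

variable "azurerm_subscription_id" {
  type        = string
  description = "Azure subscription ID"
}

variable "azurerm_subscription_name" {
  type        = string
  description = "Display name of the Azure subscription"
}
```
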
Using Microsoft’s Terraform provider for Azure DevOps, you can create the service connection. Here is an example configuration:

```hcl
resource "azuredevops_serviceendpoint_azurerm" "automatic" {
  project_id                             = azuredevops_project.this.id
  service_endpoint_name                  = "AzureRM Service Connection for Vault with Automatic WIF"
  service_endpoint_authentication_scheme = "WorkloadIdentityFederation"
  azurerm_spn_tenantid                   = "00000000-0000-0000-0000-000000000000"
  azurerm_subscription_id                = "00000000-0000-0000-0000-000000000000"
  azurerm_subscription_name              = "Subscription Name"
}
```

Next, extract the service principal object ID associated with the service principal created by this configuration using the Terraform provider for Entra ID:

```hcl
data "azuread_service_principal" "auto_wif" {
  client_id = azuredevops_serviceendpoint_azurerm.automatic.service_principal_id
}

output "auto_wif_object_id" {
  value       = data.azuread_service_principal.auto_wif.object_id
  description = "The service principal object ID for the automatically generated service principal"
}
```

This service principal object ID can then be used in your HashiCorp Vault configuration to bind a role on the Azure auth method to specific service principals.

Configuring HashiCorp Vault

Enable and configure the Azure auth method using standard Vault CLI commands, adapting the pattern from the Azure auth method documentation:

```bash
vault auth enable -path="ado" azure

vault write auth/ado/config \
  tenant_id=00000000-0000-0000-0000-000000000000 \
  resource=https://management.core.windows.net/ \
  client_id=00000000-0000-0000-0000-000000000000 \
  client_secret=sUp3r~S3Cr3t~cl1enT_S3Cr3t
```

Configure a role on the Azure auth method for the Azure DevOps pipelines to use during login:

```bash
vault write auth/ado/role/pipeline-role \
  token_policies="default,pipeline-policy" \
  token_ttl=600 \
  token_type=batch \
  bound_service_principal_ids="00000000-0000-0000-0000-000000000000"
```

In large-scale environments, using lightweight batch tokens instead of service tokens can reduce the load on the Vault cluster. Ensure the token time-to-live (TTL) is sufficient for the pipeline to complete its tasks.

Automating Configuration with Terraform

To automate the configuration, use the Terraform provider for Vault:

```hcl
resource "vault_auth_backend" "ado" {
  type = "azure"
  path = "ado"

  tune {
    listing_visibility = "unauth"
  }
}

resource "vault_azure_auth_backend_config" "ado" {
  backend       = vault_auth_backend.ado.path
  tenant_id     = "00000000-0000-0000-0000-000000000000"
  client_id     = "00000000-0000-0000-0000-000000000000"
  client_secret = "sUp3r~S3Cr3t~cl1enT_S3Cr3t"
  resource      = "https://management.core.windows.net/"

  lifecycle {
    ignore_changes = [client_secret]
  }
}

resource "vault_azure_auth_backend_role" "ado" {
  backend                     = vault_auth_backend.ado.path
  role                        = "pipeline-role"
  token_ttl                   = 600
  token_type                  = "batch"
  token_policies              = ["default", "pipeline-policy"]
  bound_service_principal_ids = ["00000000-0000-0000-0000-000000000000"]
}
```
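The role above grants pipeline-policy, which is not defined elsewhere in this post. A minimal sketch of what it might contain, assuming the KV v2 mount and secret path used later in the pipeline (kvv2, config/pipeline):

```hcl
resource "vault_policy" "pipeline" {
  name = "pipeline-policy"

  # KV v2 reads go through the data/ prefix of the mount path
  policy = <<-EOT
    path "kvv2/data/config/pipeline" {
      capabilities = ["read"]
    }
  EOT
}
```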

Configuring Azure DevOps Pipeline Tasks

To authenticate with Vault and retrieve secret data, follow these four steps in the Azure DevOps pipeline:

  1. Establish your platform identity
  2. Retrieve a JWT from the platform representing that identity
  3. Present that JWT to Vault in an authentication request to retrieve a Vault token
  4. Use the Vault token from the authentication response to request secret data

Steps 1-3 can be completed within a single pipeline task:

```yaml
- task: AzureCLI@2
  displayName: "Establish Identity and Authenticate with Vault"
  env:
    VAULT_ADDR: https://my-hcp-vault-cluster-00000000.00000000.z1.hashicorp.cloud:8200
    VAULT_NAMESPACE: admin
  inputs:
    azureSubscription: 'AzureRM Service Connection for Vault with Automatic WIF'
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      JWT=$(az account get-access-token --query accessToken --output tsv)
      VAULT_TOKEN=$(vault write -format=json auth/ado/login role=pipeline-role jwt="$JWT" | jq -r .auth.client_token)
      echo "##vso[task.setvariable variable=VAULT_TOKEN]$VAULT_TOKEN"
```
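If the token should also be masked in pipeline logs, the task.setvariable logging command accepts an issecret flag; a variant of the final line of the inline script (note that Azure DevOps does not map secret variables into later tasks' environments automatically, so they must be passed explicitly, e.g. `VAULT_TOKEN: $(VAULT_TOKEN)` under `env:`):

```shell
VAULT_TOKEN="example-token"  # placeholder; in practice this comes from the vault login step
# issecret=true tells Azure DevOps to mask the value wherever it appears in logs
echo "##vso[task.setvariable variable=VAULT_TOKEN;issecret=true]$VAULT_TOKEN"
```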

Once the Vault token is acquired, it can be reused within the scope of the pipeline:

```yaml
- task: Bash@3
  displayName: Retrieve a Pipeline Secret
  env:
    VAULT_ADDR: https://my-hcp-vault-cluster-00000000.00000000.z1.hashicorp.cloud:8200
    VAULT_NAMESPACE: admin
  inputs:
    targetType: 'inline'
    script: |
      vault kv get -mount=kvv2 -format=json config/pipeline
```

The Vault token can be used to retrieve KV data, or secrets from any other secrets engine that the token's policies permit access to.
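Individual fields can be extracted from that JSON output with jq, as the login step already does. A sketch against a sample payload (the api_key field is illustrative; KV v2 responses nest the secret key/value pairs under .data.data):

```shell
# Sample of the JSON shape returned by `vault kv get -format=json` on a KV v2 mount
SECRET_JSON='{"data":{"data":{"api_key":"example-value"},"metadata":{"version":1}}}'

# KV v2 nests secret data under .data.data; -r emits the raw string without quotes
API_KEY=$(echo "$SECRET_JSON" | jq -r '.data.data.api_key')
echo "$API_KEY"   # prints: example-value
```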

Considerations

The configuration does not resolve all identity challenges associated with Azure DevOps and Vault. The service connection is a project-level construct within Azure DevOps, which does not allow fine-grained scoping of identities. Enforcing the principle of least privilege relies on creating multiple service connections in the Azure DevOps project for each desired scope and configuring Vault appropriately.

Service connections and their identities should be bound to their respective roles in Vault using the bound_service_principal_ids parameter. This ensures that a pipeline can authenticate to Vault only against explicitly permitted Vault roles. Automation with Terraform can ease the burden of managing service connections and Vault roles at scale.
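At scale, the service connections and their matching Vault roles could be driven from a single map with for_each; a hypothetical sketch (the pipeline_scopes map, naming scheme, and referenced variables are illustrative and assume the resources defined earlier in this post):

```hcl
# Hypothetical map of scopes: one service connection and one Vault role per entry
variable "pipeline_scopes" {
  type = map(object({
    policies = list(string)
  }))
}

resource "azuredevops_serviceendpoint_azurerm" "scoped" {
  for_each = var.pipeline_scopes

  project_id                             = azuredevops_project.this.id
  service_endpoint_name                  = "vault-${each.key}"
  service_endpoint_authentication_scheme = "WorkloadIdentityFederation"
  azurerm_spn_tenantid                   = var.azurerm_spn_tenantid
  azurerm_subscription_id                = var.azurerm_subscription_id
  azurerm_subscription_name              = var.azurerm_subscription_name
}

data "azuread_service_principal" "scoped" {
  for_each  = var.pipeline_scopes
  client_id = azuredevops_serviceendpoint_azurerm.scoped[each.key].service_principal_id
}

resource "vault_azure_auth_backend_role" "scoped" {
  for_each = var.pipeline_scopes

  backend                     = vault_auth_backend.ado.path
  role                        = each.key
  token_type                  = "batch"
  token_ttl                   = 600
  token_policies              = each.value.policies
  bound_service_principal_ids = [data.azuread_service_principal.scoped[each.key].object_id]
}
```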

Summary

This tutorial demonstrates how to enable Azure DevOps pipelines to authenticate securely with HashiCorp Vault in a passwordless manner using workload identity federation. This integration allows consuming secrets from any of Vault’s secrets engines, including static secrets, dynamic database credentials, cloud provider credentials, and PKI certificates.

If you are an Azure DevOps user looking to centralize secrets management for your pipelines, consider registering on the HashiCorp Cloud Platform to try out this integration with HCP Vault Dedicated.

Neil S