Preface

Azure DevOps allows us to define our CI/CD workflows as YAML pipelines, which we can store in Git repositories. This approach enables better version control and the ability to create reusable and modular pipelines.

One powerful feature in YAML pipelines is parameters, which allow you to pass data into templates or pipeline steps. Among these, the **object** parameter type is particularly useful for handling complex or nested data structures.

In this article, we’ll explore how to use the object parameter type to create cleaner, more flexible pipelines, and avoid repeating the same logic for different environments — following the DRY (Don’t Repeat Yourself) principle.

Parameter type: object

The object type allows us to group multiple values — including strings, arrays, and other objects — into a single parameter. This is especially helpful when dealing with multiple environments like dev, test, or prod, where each environment might require specific values.

You can use objects to:

  • Loop through multiple values
  • Dynamically create stages and jobs
  • Pass structured data to templates
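
For instance, a single object parameter can mix scalar values, lists, and nested maps in one place. Here is a minimal sketch; the parameter and property names are purely illustrative:

parameters:
- name: settings
  type: object
  default:
    retries: 3            # a scalar value
    regions:              # a list nested inside the object
    - westeurope
    - northeurope
    tags:                 # a nested map
      owner: platform-team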

Example 1: Simple Object

This is a basic use case where we pass an array of environments and echo their names in a script step.

parameters:
- name: environment
  type: object
  default:
  - dev
  - test

stages:
- ${{ each value in parameters.environment }}: # Create one stage per environment
  - stage: Echo_${{ value }}
    jobs:
    - job: Echo_Job
      steps:
      - script: echo ${{ value }} # Output the current value in the iteration

Output: This will create two stages: Echo_dev and Echo_test.
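
Because the each expression is evaluated at compile time, the loop is expanded before the run starts, so the pipeline above behaves roughly as if the stages had been written out by hand:

stages:
- stage: Echo_dev
  jobs:
  - job: Echo_Job
    steps:
    - script: echo dev
- stage: Echo_test
  jobs:
  - job: Echo_Job
    steps:
    - script: echo test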

Example 2: Nested Object

We can define a more structured object where each environment has its own properties — like the name of the agent to use.

azure-pipelines.yaml

parameters:  
- name: environments  
  type: object  
  default:   
    dev: # passing Environment as nested object  
      agentName: devagent01    
      name: dev  
    test:  
      agentName: testagent01  
      name: test  
  
  
stages:
- ${{ each env in parameters.environments }}: # env.key is the map key, env.value is the nested object
  - stage: Initiate_${{ env.value.name }}
    displayName: Initiate ${{ env.value.name }}
    jobs:
    - job: Initiate_Job_${{ env.value.name }}
      steps:
      - script: |
          echo "AgentName: ${{ env.value.agentName }}"
        displayName: 'Display Parameters'

Benefit: We can easily add new environments without repeating the stage and job structure.
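
For example, adding a prod environment (the agent name here is just a placeholder) only takes one more entry in the default object; the loop picks it up automatically:

    prod:
      agentName: prodagent01
      name: prod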

Example 3: Nested Object with a Template

To further modularize our pipeline, we can pass the environment-specific values to a template file. This keeps the main pipeline clean and reusable.

azure-pipelines.yaml (Main pipeline)

parameters:  
- name: environments  
  type: object  
  default:   
    dev: # passing Environment as nested object  
      agentName: devagent01    
      name: dev  
    test:  
      agentName: testagent01  
      name: test  
  
stages:  
- ${{ each env in parameters.environments }}: # Expand the template once per environment
  - template: templates/stage-template.yml
    parameters:
      env: ${{ env.value }} # Pass the nested object (name, agentName), not the key/value pair

templates/stage-template.yml (Template File)

parameters:  
- name: env         # Values passed from azure-pipelines.yaml to this template  
  type: object  
  
stages:  
  - stage: STAGE1_${{ parameters.env.name }}  
    jobs:  
      - job: JOB1  
        pool:  
          name: ${{ parameters.env.agentName }} # Environment-specific value
        steps:  
          - script: echo ${{ parameters.env.name }}

Benefit: You can now maintain one generic template and use it for multiple environments by simply passing different values.
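
The same template can also be consumed directly from any other pipeline by passing an environment object inline. A quick sketch, where the file name and values are illustrative:

# another-pipeline.yml
stages:
- template: templates/stage-template.yml
  parameters:
    env:
      name: prod
      agentName: prodagent01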

Summary

Using object parameters in Azure DevOps pipelines allows you to:

  • Avoid repetition by looping over structured data
  • Dynamically create stages or jobs
  • Encapsulate environment-specific values
  • Use templates for cleaner, maintainable pipelines

This approach is ideal for teams managing multiple deployment environments, helping them stay organized and consistent across the pipeline configuration.