© Chaminda Chandrasekara and Pushpa Herath 2020
C. Chandrasekara and P. Herath, Hands-on Azure Pipelines, https://doi.org/10.1007/978-1-4842-5902-3_6

6. Creating Build Pipelines – Classic: Queuing, Debugging, Task Groups, Artifacts, and Import/Export Options

Chaminda Chandrasekara1  and Pushpa Herath2
(1)
Dedigamuwa, Sri Lanka
(2)
Hanguranketha, Sri Lanka
 

We have discussed many useful features of Azure build pipelines in the previous two chapters. In them we talked about how we can set up a build pipeline using build tasks, variables, build job options, usage of different source control systems with builds, using builds to protect branches, and several other options and features.

In this chapter, we explore a few more features such as queueing builds and enabling diagnostic info with the debug build mode using variables in PowerShell scripts, usage of OAuth tokens, grouping tasks for reusability, usage of an agentless phase, importing and exporting builds, and organizing the builds into folder structures for maintainability.

Lesson 6.01: Queuing Builds and Enabling Debugging Mode for More Diagnostic Information

While working with build pipelines, we need to learn how to fix build failures quickly. Consider a situation where you are in the middle of a critical client release that needs to be pushed to production quickly, but your build fails. You may have experienced the pressure from the team when these types of failures happen. To resolve a build failure quickly, it is necessary to identify the issue quickly. After the build fails, we read the build logs to understand the reason for the failure, but sometimes the log data provided is not enough to identify the real cause. Let’s compare the logs of the same build task with the debug state set to false and then to true, to see the benefit we get when we diagnose issues with the debug mode on.

The following NuGet restore task was executed with the debug mode set to false. It produced 178 log lines. See Figure 6-1.
../images/493403_1_En_6_Chapter/493403_1_En_6_Fig1_HTML.jpg
Figure 6-1

Build log with debug false state

If we run the same build with debug set to true in the variables (the system.debug variable), you can see the difference between the two logs: the second run provides far more detail than the first. As shown in the following image, the same NuGet restore step now has 785 log lines. See Figure 6-2.
../images/493403_1_En_6_Chapter/493403_1_En_6_Fig2_HTML.jpg
Figure 6-2

Build log with debug true

So, after a build failure, execute the build with the debug value set to true; the additional details help you identify the reasons for the failure easily.
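Your own script steps can also take advantage of the debug setting. As a minimal, hedged sketch (assuming a PowerShell task in the pipeline), a script can read the system.debug value from its environment and emit extra diagnostic output only when it is set to true:

```powershell
# Minimal sketch: emit extra diagnostics only when the run was queued
# with the system.debug variable set to true. The non-secret pipeline
# variable system.debug surfaces as the SYSTEM_DEBUG environment variable.
if ($env:SYSTEM_DEBUG -eq "true") {
    # The ##[debug] prefix formats the line as debug output in the build log.
    Write-Host "##[debug]Working directory contents:"
    Get-ChildItem -Recurse | ForEach-Object { Write-Host "##[debug]$($_.FullName)" }
}
```

This keeps routine runs quiet while still giving you rich diagnostics on a debug re-run of a failed build.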

In this lesson, we discussed how setting the debug variable to true provides more information on failures, with which we can easily identify the issues in the build.

Lesson 6.02: Setting Variable Values in PowerShell Scripts

While configuring build pipelines, pipeline tasks need various input values. Sometimes it is a project name or a folder path, or a different set of values based on the type of project you are building. We all know it is good practice to use parameterized values rather than hard-coded values in build steps. Hence, in pipelines we declare variables under the variables section. All the variables defined in the variables section of the pipeline can be used in any agent phase of the pipeline. Without declaring variables in the pipeline variables section, you can also define dynamic variables for an agent phase using PowerShell scripts. These dynamic variables belong only to the agent phase in which the PowerShell script runs.

Dynamic variables are very useful when you work with external tools like Octopus, which has greater variable management capability with multidimensional and scoped variables. Assume you have defined an Octopus project and it has a variable set with some values. You need to read the values from the Octopus variable set and use those values with the tasks of the Azure build pipeline. You can write a PowerShell script to dynamically create the agent phase variables with the same variable names used in Octopus and apply the value obtained from Octopus. See Figure 6-3.
../images/493403_1_En_6_Chapter/493403_1_En_6_Fig3_HTML.jpg
Figure 6-3

Using Octopus variables in Azure DevOps pipelines

All the magic is done by this line of code, which emits a task.setvariable logging command to the build output:
"##vso[task.setvariable variable=" + $octopusVariable.Name + "]" + $variableValue

It creates a variable with the given name and assigns the given value. For example, a variable could be defined in Octopus as environment with a value such as develop or prod, and it gets applied as a new variable in the Azure pipeline. These variable values can then be used by any task inside the agent phase.
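As a hedged sketch of the whole flow, the loop below applies that logging command to a set of variables. Here $octopusVariables is a hypothetical stand-in for a variable set already fetched from the Octopus REST API; the names and values are illustrative only:

```powershell
# Hypothetical stand-in for a variable set retrieved from Octopus.
$octopusVariables = @(
    @{ Name = "environment"; Value = "develop" },
    @{ Name = "connectionString"; Value = "Server=dev-sql;Database=app" }
)

foreach ($octopusVariable in $octopusVariables) {
    $variableValue = $octopusVariable.Value
    # The ##vso[task.setvariable] logging command must reach standard
    # output for the agent to pick it up and create the variable.
    Write-Host ("##vso[task.setvariable variable=" + $octopusVariable.Name + "]" + $variableValue)
}

# Any later task in the same agent phase can now read, e.g., $(environment).
```

Note that the variables only become visible to tasks that run after this script step, not within the same step.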

This lesson discussed a very useful feature that allows you to create build pipeline variables dynamically using a PowerShell script.

Lesson 6.03: Accessing Secret Variable Values in PowerShell

As discussed in previous lessons, there are various types of build tasks that can be used to configure build pipelines for various requirements. In most situations, we need to use PowerShell scripts to automate some pipeline tasks. So, it is good to have an idea about how PowerShell scripts can use variables in a pipeline. See Figure 6-4.
../images/493403_1_En_6_Chapter/493403_1_En_6_Fig4_HTML.jpg
Figure 6-4

PowerShell task

While working with projects, we need to work with different types of values. Some can be shared publicly and some need to be secret. So, these secret values need to be treated differently due to the protection level required by them.

The variable values defined in a build pipeline are made available to the agent as environment variables created inside the agent. For secret values, however, the agent does not create any environment variables. So, in PowerShell scripts, non-secret variables can be accessed using either the $env:variablename or the $(variablename) format. But because the agent does not create environment variables for secrets, we cannot access secret variables with the $env:variable format. The way to access a secret variable in PowerShell is to pass it in explicitly using the $(variablename) format.
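The difference can be sketched as follows. This is a hedged illustration, assuming a PowerShell task that runs a script file with its Arguments field set to -Secret "$(mySecret)", where mySecret is a hypothetical secret pipeline variable; the $(...) macro is expanded by the task before PowerShell runs:

```powershell
# Sketch of a script file invoked by a PowerShell task.
# The task's Arguments field is assumed to be: -Secret "$(mySecret)"
param(
    [string]$Secret
)

# A secret variable is never mapped into the environment,
# so this prints an empty value:
Write-Host "From environment: '$env:MYSECRET'"

# The macro-expanded argument carries the actual secret value.
if ($Secret) {
    # Azure Pipelines masks secret values if they appear in logs.
    Write-Host "Secret was received via the task argument."
}
```

The same pattern applies to inline scripts: reference the secret with $(mySecret) in the task input rather than expecting it in $env:.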

In this lesson, we learned how secret variables are handled in PowerShell scripts, how they behave in the pipeline, and the reasons for that behavior.

Lesson 6.04: Using Auth Tokens in the Builds

As we discussed in previous lessons of this book, there are many configuration options available in the Azure DevOps build pipeline agent phase. These configuration values can be used to make the build process efficient and effective. In this lesson, we talk about the OAuth token configuration in the agent phase.

While working with Azure DevOps, it is sometimes required to use the Azure DevOps REST API to create, delete, update, and retrieve Azure DevOps service resources. Mostly, PowerShell tasks are used to execute REST API calls in build pipelines. As we are already aware, before executing any REST API call, it is necessary to authenticate so that the API can perform authorized operations. In Azure DevOps, the Personal Access Token (PAT) is the most common way of providing authentication, but the OAuth token configuration in the agent phase allows us to execute API calls without passing a PAT for authentication.

Enabling the OAuth token configuration in an Azure DevOps build pipeline lets the scripts and other processes launched by tasks access the OAuth token through the SYSTEM.ACCESS.TOKEN variable. When access to the system access token is enabled, you can use the $env:SYSTEM_ACCESSTOKEN environment variable in the task scripts that you execute in a build pipeline job. See Figure 6-5.
../images/493403_1_En_6_Chapter/493403_1_En_6_Fig5_HTML.jpg
Figure 6-5

Enable OAuth token

The following code sample shows how to list builds using a REST API call. It uses the $env:SYSTEM_ACCESSTOKEN variable in a task script for authentication. If the build pipeline's OAuth configuration is not enabled, the script will not work, because $env:SYSTEM_ACCESSTOKEN only gets a value when that configuration is enabled.
# Build the REST URL from the predefined collection URI and project id variables
$url = $env:SYSTEM_TEAMFOUNDATIONCOLLECTIONURI + $env:SYSTEM_TEAMPROJECTID + "/_apis/build/builds?api-version=5.1"
Write-Host "URL: $url"
# Authenticate with the job's OAuth token instead of a PAT
$pipeline = Invoke-RestMethod -Uri $url -Headers @{
    Authorization = "Bearer $env:SYSTEM_ACCESSTOKEN"
}
Write-Host "Pipeline = $($pipeline | ConvertTo-Json -Depth 100)"

In this lesson, we have discussed how to use the OAuth configuration in build pipelines and the importance of this configuration.

Lesson 6.05: Creating and Using Task Groups

One of the important features of a pipeline is its tasks. While configuring build pipelines, we sometimes need to use the same set of tasks in multiple pipelines. Assume a project developed with a microservices architecture, where Azure Functions are used to implement the microservices and we need to configure a separate build pipeline for each function. In this type of situation, we use the same set of steps in each function's build pipeline: a build task to build the code, a NuGet pack task to package the build output, and a NuGet push task to push the packaged content to an Azure DevOps artifacts feed. If we have 100 functions in the project, we need to create 100 pipelines to build them. Instead of repeating the same work 100 times, we can create task groups to reduce the effort we put into configuring the pipelines.

A task group groups a set of repetitive tasks and maintains them as a shared component for multiple pipelines. When a project has multiple build pipelines that use the same set of tasks, we can create a task group from the repetitive tasks and pass parameters to it from each build pipeline so that it builds and packages different projects.

Let’s see how we can create a task group easily from an existing set of build steps. Create one complete pipeline with all the necessary tasks included in it. After that, if there are any input values to each task, parameterize those values. Now select the tasks that you want to add to the task group and create the task group. See Figure 6-6.
../images/493403_1_En_6_Chapter/493403_1_En_6_Fig6_HTML.jpg
Figure 6-6

Creating task group

After adding the task group, it will be available under the Azure DevOps task group section. Now we can use the task group to create the build pipelines. See Figure 6-7.
../images/493403_1_En_6_Chapter/493403_1_En_6_Fig7_HTML.jpg
Figure 6-7

Adding a task group to build

Once created, a task group can be shared with other projects too. You can export a task group from one project and import it into another, either in the same organization or in an external one. For example, if you want to configure the same build tasks in a pipeline in another project, or in a project in another organization, you can export the task group. When you click the task group's export button, it downloads a JSON file. You can import this file into another team project, and it will automatically create the task group for you. This makes task groups very reusable and productive components to configure in Azure DevOps.
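Besides the portal's import button, the exported JSON can also be pushed through the task groups REST API. The sketch below is hedged: the organization and project names are placeholders, $pat is a Personal Access Token you supply, ExportedTaskGroup.json is an assumed file name, and the exact api-version string may differ for your organization:

```powershell
# Hedged sketch: importing an exported task group JSON into another
# team project via the distributedtask/taskgroups REST endpoint.
$pat = "<your-pat>"   # placeholder; never commit a real PAT
$headers = @{
    Authorization = "Basic " + [Convert]::ToBase64String(
        [Text.Encoding]::ASCII.GetBytes(":$pat"))
}
$body = Get-Content -Raw -Path ".\ExportedTaskGroup.json"
$url  = "https://dev.azure.com/yourorg/yourproject/_apis/distributedtask/taskgroups?api-version=5.1-preview.1"

# Creates the task group in the destination project from the exported JSON.
Invoke-RestMethod -Uri $url -Method Post -Headers $headers -Body $body -ContentType "application/json"
```

This can be handy when you need to copy a task group into many projects and want to script it rather than click through each one.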

This lesson explained the use and importance of task groups. We got an idea of how to create a task group and what its purpose is. Further, we discussed its reusability, as a task group can be exported and imported to projects in the same organization and to projects in external organizations.

Lesson 6.06: Use Agentless Phases

An automated build uses one or more machines to do some work for us without any human interaction. We call these machines agents, and they play the very important role of task executor in automated pipelines. However, there are situations where you need to perform activities that do not require a machine, such as waiting for an approval. For these waiting types of purposes, you can use agentless phases in build pipelines. Let’s discuss agentless phase capabilities in this lesson.

An agentless phase contains tasks that can be performed without help from an agent machine. Most of these are manual intervention tasks or actions that depend on data retrieved from an external query or API. As you already know, one build pipeline can have more than one agent phase. Similarly, you can add more than one agentless phase to a build pipeline if required. You can order the agent and agentless jobs with dependencies to make a sequence of execution, or enable parallel execution, according to the requirements. See Figure 6-8.
../images/493403_1_En_6_Chapter/493403_1_En_6_Fig8_HTML.jpg
Figure 6-8

Add agentless phase

The agentless phase offers tasks that do not require agent machine involvement. As an example, in some situations we might need a time gap between two tasks in the build pipeline. Assume we execute a script to apply changes to an existing resource in Azure, or to provision a new resource. It takes some time for those changes to be applied. The next task may depend on the availability of the change made to the Azure resource, so the pipeline might need to wait until the change is fully applied before continuing. Hence, we can use an agentless Delay task to wait for the required time period for the changes to be applied to the Azure resource, and then continue with another agent phase for the next task, which depends on the Azure resource change. See Figure 6-9.
../images/493403_1_En_6_Chapter/493403_1_En_6_Fig9_HTML.jpg
Figure 6-9

Delay task of agentless phase
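For readers who also use YAML pipelines, an agentless phase corresponds to a job that targets the server pool. A hedged sketch of the delay scenario above (job name and delay length are illustrative):

```yaml
# Illustrative YAML equivalent of an agentless (server) job with a Delay task.
jobs:
- job: WaitForAzureChange
  pool: server          # 'server' means no agent machine is used
  steps:
  - task: Delay@1
    inputs:
      delayForMinutes: '5'   # pause before the next, dependent job runs
```

A subsequent agent job can then declare dependsOn: WaitForAzureChange to run only after the delay completes.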

Another useful task available in the agentless phase is the Query work items task. Consider a situation where you need to package the artifacts if, and only if, all the work items in the sprint are marked as completed. What you need to do is write a query to get the count of work items in the to-do or in-progress states in the specific sprint. If there are any incomplete work items in the sprint, the build pipeline stops without creating the artifacts. See Figure 6-10.
../images/493403_1_En_6_Chapter/493403_1_En_6_Fig10_HTML.jpg
Figure 6-10

Query work items task of agentless phase

Another useful task is the Manual intervention task, which waits for a user to approve or reject further execution of the pipeline. This type of manual intervention helps to do any required manual verification of the executed steps before the pipeline continues.

Other than these tasks, there are a few other agentless tasks available in the marketplace.

After going through this lesson, you were able to learn about the agentless phase available in the Azure DevOps build pipeline. Further, we have discussed a few tasks specific to the agentless phase and the usage of those tasks with some practical scenarios.

Lesson 6.07: Publishing Artifacts

Azure DevOps build pipelines are used to get the source from the repo, build the code, test it, publish the built binaries, and package them as deployable artifacts. Published artifacts are the outcome of most build pipelines. Azure DevOps offers different ways to save build artifacts. One method is keeping the published artifacts with the build pipeline itself. Further, you can publish the artifacts as NuGet packages to an Azure DevOps NuGet feed or an external NuGet feed. Another way of saving artifacts is keeping them in a shared file location.

The most well-known, simple way of keeping artifacts is to save the published artifacts to the pipeline itself using the publish artifacts task. After the build completes, you will be able to find the artifacts attached to the build, provided you target the artifact drop at the build itself. See Figure 6-11.
../images/493403_1_En_6_Chapter/493403_1_En_6_Fig11_HTML.jpg
Figure 6-11

Published artifacts attached to pipeline

Artifacts attached to the build pipelines have a shorter lifetime as they will be dependent on the build retention time.

Using the same publish artifacts task, it is also possible to publish the artifacts to a given file share path. A file share is worthwhile when security is your concern: an on-premises agent can publish the build artifacts to a shared folder on your network that is not accessible to anyone outside your corporate network.
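Besides the publish artifacts task, a script step can attach a file to the build with the artifact.upload logging command. A minimal, hedged sketch, assuming the staging directory already contains a file named app.zip (an illustrative name) produced by an earlier step:

```powershell
# Hedged sketch: publish a file as a build artifact from a script,
# as an alternative to the Publish Build Artifacts task.
# 'drop' is an example artifact name; app.zip is assumed to exist.
$file = Join-Path $env:BUILD_ARTIFACTSTAGINGDIRECTORY "app.zip"
Write-Host "##vso[artifact.upload containerfolder=drop;artifactname=drop]$file"
```

After the run, the file appears under the build's published artifacts just as if a publish task had uploaded it.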

In this lesson, we briefly discussed build artifacts. In Chapter 7, we discuss these different artifact publishing options in more detail, with usage scenarios.

Lesson 6.08: Exporting and Importing Build Definition

As we know, we sometimes get requirements to set up build pipelines for multiple projects, mostly for similar types of build needs. In such situations, it is not worth spending time setting up each build pipeline manually from scratch. Azure DevOps has the capability to export and import build pipelines, which allows us to set up pipelines easily when we need similar builds in multiple projects.

In a situation where you have to set up multiple similar build pipelines in a single project, you can easily clone a build pipeline and update it according to the requirements. Or you can export the existing build and import it to create a new build pipeline in the same team project. Go to the build pipeline you need to export and click the export option. See Figure 6-12.
../images/493403_1_En_6_Chapter/493403_1_En_6_Fig12_HTML.jpg
Figure 6-12

Export build pipeline

It downloads a JSON file. To import the pipeline, you can import the JSON file into the Azure DevOps project, and it will create a build pipeline. See Figure 6-13.
../images/493403_1_En_6_Chapter/493403_1_En_6_Fig13_HTML.jpg
Figure 6-13

Import json of build pipeline

But if you want to export a pipeline and import it into a different team project, in the same or in a different Azure DevOps organization, it is not as straightforward as importing into the same team project, as explained above.

Before importing the build pipeline JSON into another project, a small change to the exported file is required. Azure DevOps assigns a unique id to each team project, and when you export a build pipeline, the JSON file contains the project id of the source team project. This project id needs to be replaced with the project id of the destination team project; otherwise, an error occurs when trying to import the build into the target team project and save it, due to the mismatched project id in the JSON. So, first you must find the destination and source project ids using the following REST API call, changing the relevant organization name and project name. See Figure 6-14.
../images/493403_1_En_6_Chapter/493403_1_En_6_Fig14_HTML.jpg
Figure 6-14

Project id

https://dev.azure.com/yourorgname/_apis/projects/teamprojectname?api-version=5.0

After finding the destination and source project ids, replace the source project id in the exported JSON file with the destination project id. Then you can import the file into the destination team project and save it to create the build pipeline.
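The replacement step can be scripted as follows. This is a hedged sketch: the GUIDs are placeholders for the ids returned by the projects API, and ExportedBuild.json is an assumed file name for the exported definition:

```powershell
# Hedged sketch: swap the source project id for the destination project id
# in an exported build definition JSON before importing it elsewhere.
$sourceProjectId      = "11111111-1111-1111-1111-111111111111"  # placeholder
$destinationProjectId = "22222222-2222-2222-2222-222222222222"  # placeholder

# Read the exported definition, replace every occurrence of the id, save it back.
$json = Get-Content -Raw -Path ".\ExportedBuild.json"
$json = $json.Replace($sourceProjectId, $destinationProjectId)
Set-Content -Path ".\ExportedBuild.json" -Value $json
```

A plain string replace is sufficient here because project ids are GUIDs, which will not collide with any other content in the file.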

This lesson discussed the build pipeline export and import features available in Azure DevOps and their uses. Further, we were able to learn a technique to export and import the build pipelines between team projects and Azure DevOps organizations.

Lesson 6.09: Organizing Builds into Folders

Depending on the project architecture, there can be multiple build pipelines in a single team project. As an example, if the team is developing the system using a microservices architecture, a separate build pipeline is required for each microservice component. It is good to organize the build pipelines in a more manageable way to increase their maintainability.

This lesson discusses how to create a folder structure and organize the build pipelines in a more manageable way within the team project. See Figure 6-15.
../images/493403_1_En_6_Chapter/493403_1_En_6_Fig15_HTML.jpg
Figure 6-15

Build folder structure

Let’s consider a mobile development project that uses Azure Functions as back-end microservice components. The function builds can be put under a folder named Functions, and the mobile build under a folder named Mobile. Also, if the infrastructure is provisioned using scripts, those infra builds can be put in an Infra folder. Likewise, all build pipelines should be categorized using a meaningful folder structure. It helps users easily access the relevant build pipelines without scrolling through all of the build definitions.

This lesson discussed the importance of having a good, organized folder structure to keep build definitions, which helps users to easily identify and maintain build pipelines.

Summary

In this chapter, we discussed more useful configurations and features available with Azure DevOps builds. As we explained, the debug mode of the build is very important to go through the build failure logs and identify the failure reasons. Also, we were able to discuss some useful features available in the Azure DevOps Pipelines while working with PowerShell scripts. Further, we talked about the use of the task groups and build artifacts, which are a very important part of Azure DevOps build pipelines. Additionally, we were able to get an idea of how to import and export build pipelines between team projects in the same organization or external organization, which is very useful when there is a requirement to copy similar build pipelines between projects. Finally, we discussed the importance of having a well-organized folder structure to keep build pipelines.

In the next chapter, we discuss build artifacts in detail to identify different options we can use to publish artifacts with usage scenarios.
