<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: James Cook</title>
    <description>The latest articles on Forem by James Cook (@officialcookj).</description>
    <link>https://forem.com/officialcookj</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F458118%2F358b3f87-ea8c-4c28-bf74-009a8d55e2b5.jpeg</url>
      <title>Forem: James Cook</title>
      <link>https://forem.com/officialcookj</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/officialcookj"/>
    <language>en</language>
    <item>
      <title>Azure Functions Deployment using GitHub Actions</title>
      <dc:creator>James Cook</dc:creator>
      <pubDate>Wed, 09 Mar 2022 13:13:53 +0000</pubDate>
      <link>https://forem.com/officialcookj/azure-functions-deployment-using-github-actions-2fa2</link>
      <guid>https://forem.com/officialcookj/azure-functions-deployment-using-github-actions-2fa2</guid>
      <description>&lt;p&gt;Have you ever been in a situation where you have been asked to set up CI/CD for your new Azure Function App? Have the requirements been to use a single repository and workflow in GitHub to distribute the application to deployment slots?&lt;/p&gt;

&lt;p&gt;In this post, you will follow step-by-step instructions to set up deployment slots within your Azure Function resource, implement a branching strategy in your repository, create environments in GitHub, and assemble a single workflow to run in GitHub Actions as your CI/CD pipeline.&lt;/p&gt;

&lt;h2&gt;What you need&lt;/h2&gt;

&lt;p&gt;You should have already created the Azure Function App resource before following the steps below. You will also need access to this resource to configure it and download publish profiles.&lt;/p&gt;

&lt;h2&gt;Deployment Slots&lt;/h2&gt;

&lt;p&gt;My workflow for this project will be targeting three deployment slots:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Development&lt;/li&gt;
&lt;li&gt;Test&lt;/li&gt;
&lt;li&gt;Staging&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Deployment Slots allow me to run multiple environments within a single Azure Function App. Each slot gets its own public endpoint, allowing me to develop and test without impacting the staging and production slots. I haven't listed a production slot because every Azure Function App comes with a default production slot when it is created.&lt;/p&gt;

&lt;p&gt;If you want to create a Deployment Slot:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Open Azure Function App&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select &lt;strong&gt;Deployment Slots&lt;/strong&gt; from the side menu&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select &lt;strong&gt;Add Slot&lt;/strong&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Provide a name for your slot. The Function App name will automatically appear at the start, so it's unique across the &lt;code&gt;.azurewebsites.net&lt;/code&gt; domain. You can then click the &lt;strong&gt;Add&lt;/strong&gt; button when you are ready.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
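&lt;p&gt;If you prefer the command line, the same slots can be created with the Azure CLI. This is just a sketch; &lt;code&gt;my-resource-group&lt;/code&gt; and &lt;code&gt;my-function-app&lt;/code&gt; are placeholders for your own resource names:&lt;/p&gt;

```shell
# Create the three deployment slots (resource names are placeholders).
az functionapp deployment slot create --resource-group my-resource-group --name my-function-app --slot development
az functionapp deployment slot create --resource-group my-resource-group --name my-function-app --slot test
az functionapp deployment slot create --resource-group my-resource-group --name my-function-app --slot staging
```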

&lt;h2&gt;Branch Strategy&lt;/h2&gt;

&lt;p&gt;I will be using three git branches to represent the slots I've created above. They are:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;main&lt;/strong&gt; (will deploy to the Staging slot)&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;test&lt;/strong&gt; (will deploy to the Test slot)&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;develop&lt;/strong&gt; (will deploy to the Development slot)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--tvb7Vymr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1645342952895/DLSY--gfv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--tvb7Vymr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1645342952895/DLSY--gfv.png" alt="image.png" width="291" height="143"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I will not need a branch for production, as I will either run a manual swap of the Staging slot with Production or add a release approval gate to the workflow that triggers the swap.&lt;/p&gt;
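&lt;p&gt;The manual swap can also be performed with the Azure CLI rather than in the portal. A sketch, again with placeholder resource names:&lt;/p&gt;

```shell
# Swap the staging slot into production (resource names are placeholders).
az functionapp deployment slot swap --resource-group my-resource-group --name my-function-app --slot staging --target-slot production
```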

&lt;h2&gt;Publish Profile&lt;/h2&gt;

&lt;p&gt;To publish our code to a deployment slot, we require the &lt;strong&gt;Publish Profile&lt;/strong&gt; for each slot (excluding production). To download one:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Open Azure Function App&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select &lt;strong&gt;Deployment Slots&lt;/strong&gt; from the side menu&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select a Slot (one with the blue hyperlink)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;At the top of the window, select &lt;strong&gt;Get Publish Profile&lt;/strong&gt;. This will download a copy of the publish profile as a file.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Repeat these steps for each deployment slot.&lt;/p&gt;
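&lt;p&gt;As an alternative to the portal, the publish profile for a slot can be downloaded with the Azure CLI. A sketch with placeholder resource names:&lt;/p&gt;

```shell
# Save the publish profile for the development slot to a file
# (resource names are placeholders).
az functionapp deployment list-publishing-profiles --resource-group my-resource-group --name my-function-app --slot development --xml > development.PublishSettings
```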

&lt;h2&gt;Environments&lt;/h2&gt;

&lt;p&gt;I will create an environment in GitHub for every deployment slot I created, allowing me to associate each publish profile with its environment using secrets. To create an environment:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Open your GitHub repository in the Web UI&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select &lt;strong&gt;Settings&lt;/strong&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;From the side menu, select &lt;strong&gt;Environments&lt;/strong&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Click the &lt;strong&gt;New environment&lt;/strong&gt; button&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Give your environment a name and click &lt;strong&gt;Configure environment&lt;/strong&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
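&lt;p&gt;Environments can also be created from the command line with the GitHub CLI, which substitutes the owner and repository when run inside a clone. A sketch:&lt;/p&gt;

```shell
# Create (or update) an environment via the REST API; gh fills in
# {owner} and {repo} when run from inside the repository.
gh api --method PUT "repos/{owner}/{repo}/environments/Development"
```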

&lt;p&gt;Repeating the above steps, I have the following environments:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Development&lt;/li&gt;
&lt;li&gt;Test&lt;/li&gt;
&lt;li&gt;Production (where we will publish to the Staging slot)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Nwwto6Vv--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1645343539268/kfWPEr6AK.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Nwwto6Vv--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1645343539268/kfWPEr6AK.png" alt="image.png" width="617" height="293"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I have enabled &lt;strong&gt;Required reviewers&lt;/strong&gt; in each environment and added myself as a reviewer. When a workflow job targets any of these environments, it will not proceed until I approve it. Without this setting, the workflow will deploy to the environment automatically as soon as the pull request merges.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--TjqRSRGj--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1645343525105/VZOvcM8O-.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--TjqRSRGj--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1645343525105/VZOvcM8O-.png" alt="image.png" width="542" height="184"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;Secrets&lt;/h2&gt;

&lt;p&gt;Within each environment, we will create a new secret containing the contents of the publish profile. First, select &lt;strong&gt;Add Secret&lt;/strong&gt; under the &lt;strong&gt;Environment secrets&lt;/strong&gt; heading in the environment you are configuring.&lt;/p&gt;

&lt;p&gt;In the &lt;strong&gt;Name&lt;/strong&gt; field, give your secret an appropriate name. The name you provide will be used when referring to the secret in the workflow, so make it identifiable. In the &lt;strong&gt;Value&lt;/strong&gt; field, copy and paste the content from the &lt;strong&gt;Publish Profile&lt;/strong&gt; file. This includes all the information required for authenticating and deploying your Azure Function App to the correct slot. Select &lt;strong&gt;Save&lt;/strong&gt; when you are ready.&lt;/p&gt;

&lt;p&gt;For this post, my secret will be named &lt;strong&gt;AZURE_FUNCTIONAPP_PUBLISH_PROFILE&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--5ND6q0Gq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1645344731232/7R8I21YD0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--5ND6q0Gq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1645344731232/7R8I21YD0.png" alt="image.png" width="332" height="128"&gt;&lt;/a&gt;&lt;/p&gt;
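&lt;p&gt;If you downloaded the publish profile to a file, the secret can also be set with the GitHub CLI. A sketch, assuming the profile was saved as &lt;code&gt;development.PublishSettings&lt;/code&gt; and the command is run inside the repository:&lt;/p&gt;

```shell
# Store the publish profile as an environment secret.
gh secret set AZURE_FUNCTIONAPP_PUBLISH_PROFILE --env Development --body "$(cat development.PublishSettings)"
```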

&lt;h2&gt;Workflow&lt;/h2&gt;

&lt;p&gt;I will break down each part of the workflow we will use to deploy our Azure Function.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Function App Deployment&lt;/span&gt;
&lt;span class="na"&gt;on&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;push&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;branches&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; 
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;main&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;test&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;develop&lt;/span&gt;
  &lt;span class="na"&gt;workflow_dispatch&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Above is the first part of our workflow. I have named the workflow &lt;strong&gt;Function App Deployment&lt;/strong&gt; and specified that it runs whenever a &lt;strong&gt;push&lt;/strong&gt; happens on one of the three branches. I have also included &lt;strong&gt;workflow_dispatch&lt;/strong&gt; so I can run the workflow manually.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;jobs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;

  &lt;span class="na"&gt;cd-build-deploy-dev&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Continuous Deployment - Development&lt;/span&gt;
    &lt;span class="na"&gt;runs-on&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;ubuntu-latest&lt;/span&gt;
    &lt;span class="na"&gt;environment&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Development&lt;/span&gt;
    &lt;span class="na"&gt;if&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;github.ref == 'refs/heads/develop'&lt;/span&gt;
    &lt;span class="na"&gt;env&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;AZURE_FUNCTIONAPP_NAME&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;MyHTTPTrigger&lt;/span&gt;
      &lt;span class="na"&gt;AZURE_FUNCTIONAPP_PACKAGE_PATH&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;.'&lt;/span&gt;
      &lt;span class="na"&gt;DOTNET_VERSION&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;3.0.13'&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Next, we define the jobs we are going to run. We will have three jobs in total, one for each environment. First, we define the development job: the job id is &lt;strong&gt;cd-build-deploy-dev&lt;/strong&gt; and the name is &lt;strong&gt;Continuous Deployment - Development&lt;/strong&gt;. We will be using &lt;strong&gt;ubuntu-latest&lt;/strong&gt; for all the runs, but we set &lt;strong&gt;environment&lt;/strong&gt; to the matching environment we created in GitHub.&lt;/p&gt;

&lt;p&gt;We use an &lt;strong&gt;if&lt;/strong&gt; condition here to say, "if the push happened on the develop branch, run this job". Each of the three jobs has an equivalent condition for its own branch; if the condition is not met, the job is skipped.&lt;/p&gt;

&lt;p&gt;Finally, we define environment variables for the job:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;AZURE_FUNCTIONAPP_NAME&lt;/strong&gt; - the name of the function app&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;AZURE_FUNCTIONAPP_PACKAGE_PATH&lt;/strong&gt; - the path to the code (mine is at the root of the repo so I specified ".")&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;DOTNET_VERSION&lt;/strong&gt; - the version of .NET I am using. This should match the version installed in the setup step.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;    &lt;span class="na"&gt;steps&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;

      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Checkout&lt;/span&gt;
        &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;actions/checkout@main&lt;/span&gt;

      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Setup DotNet 3.1.x Environment&lt;/span&gt;
        &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;actions/setup-dotnet@v1&lt;/span&gt;
        &lt;span class="na"&gt;with&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="na"&gt;dotnet-version&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;3.1.x&lt;/span&gt;

      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Resolve Project Dependencies Using Dotnet&lt;/span&gt;
        &lt;span class="na"&gt;shell&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;bash&lt;/span&gt;
        &lt;span class="na"&gt;run&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;|&lt;/span&gt;
          &lt;span class="s"&gt;pushd './${{ env.AZURE_FUNCTIONAPP_PACKAGE_PATH }}'&lt;/span&gt;
          &lt;span class="s"&gt;dotnet build --configuration Release --output ./output&lt;/span&gt;
          &lt;span class="s"&gt;popd&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Run Azure Functions Action&lt;/span&gt;
        &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Azure/functions-action@v1&lt;/span&gt;
        &lt;span class="na"&gt;id&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;fa&lt;/span&gt;
        &lt;span class="na"&gt;with&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="na"&gt;app-name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ env.AZURE_FUNCTIONAPP_NAME }}&lt;/span&gt;
          &lt;span class="na"&gt;package&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;${{&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;env.AZURE_FUNCTIONAPP_PACKAGE_PATH&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;}}/output'&lt;/span&gt;
          &lt;span class="na"&gt;publish-profile&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ secrets.AZURE_FUNCTIONAPP_PUBLISH_PROFILE }}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Once the job is configured, we define the steps required to carry out the deployment.&lt;/p&gt;

&lt;p&gt;The first step is &lt;strong&gt;Checkout&lt;/strong&gt;; this clones the repository at the triggering branch onto the runner.&lt;/p&gt;

&lt;p&gt;The second step sets up .NET on the runner. Here I have specified the latest patch release of .NET Core 3.1. We use the &lt;a href="https://github.com/actions/setup-dotnet"&gt;setup-dotnet&lt;/a&gt; GitHub Action to complete the installation.&lt;/p&gt;

&lt;p&gt;The third step builds the project in the Release configuration, producing the output folder that the deployment step will publish.&lt;/p&gt;

&lt;p&gt;The final step deploys to our Function App. We use the environment variables and the GitHub secret to populate the app name, package and publish-profile parameters. This uses Microsoft's &lt;a href="https://github.com/Azure/functions-action"&gt;functions-action&lt;/a&gt; GitHub Action.&lt;/p&gt;

&lt;p&gt;Now repeat this for the other two jobs (shown below), copying the same steps under each job definition (the steps are identical).&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;  &lt;span class="na"&gt;cd-build-deploy-test&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Continuous Deployment - Test&lt;/span&gt;
    &lt;span class="na"&gt;runs-on&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;ubuntu-latest&lt;/span&gt;
    &lt;span class="na"&gt;environment&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Test&lt;/span&gt;
    &lt;span class="na"&gt;if&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;github.ref == 'refs/heads/test'&lt;/span&gt;
    &lt;span class="na"&gt;env&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;AZURE_FUNCTIONAPP_NAME&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;MyHTTPTrigger&lt;/span&gt;
      &lt;span class="na"&gt;AZURE_FUNCTIONAPP_PACKAGE_PATH&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;.'&lt;/span&gt;
      &lt;span class="na"&gt;DOTNET_VERSION&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;3.0.13'&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;





&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;  &lt;span class="na"&gt;cd-build-deploy-prod&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Continuous Deployment - Production&lt;/span&gt;
    &lt;span class="na"&gt;runs-on&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;ubuntu-latest&lt;/span&gt;
    &lt;span class="na"&gt;environment&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Production&lt;/span&gt;
    &lt;span class="na"&gt;if&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;github.ref == 'refs/heads/main'&lt;/span&gt;
    &lt;span class="na"&gt;env&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;AZURE_FUNCTIONAPP_NAME&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;MyHTTPTrigger&lt;/span&gt;
      &lt;span class="na"&gt;AZURE_FUNCTIONAPP_PACKAGE_PATH&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;.'&lt;/span&gt;
      &lt;span class="na"&gt;DOTNET_VERSION&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;3.0.13'&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The final workflow should look similar to this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Function App Deployment&lt;/span&gt;
&lt;span class="na"&gt;on&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;push&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;branches&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; 
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;main&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;test&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;develop&lt;/span&gt;
  &lt;span class="na"&gt;workflow_dispatch&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;

&lt;span class="na"&gt;jobs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;cd-build-deploy-dev&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Continuous Deployment - Development&lt;/span&gt;
    &lt;span class="na"&gt;runs-on&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;ubuntu-latest&lt;/span&gt;
    &lt;span class="na"&gt;environment&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Development&lt;/span&gt;
    &lt;span class="na"&gt;if&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;github.ref == 'refs/heads/develop'&lt;/span&gt;
    &lt;span class="na"&gt;env&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;AZURE_FUNCTIONAPP_NAME&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;MyHTTPTrigger&lt;/span&gt;
      &lt;span class="na"&gt;AZURE_FUNCTIONAPP_PACKAGE_PATH&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;.'&lt;/span&gt;
      &lt;span class="na"&gt;DOTNET_VERSION&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;3.0.13'&lt;/span&gt;

    &lt;span class="na"&gt;steps&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;

      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Checkout&lt;/span&gt;
        &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;actions/checkout@main&lt;/span&gt;

      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Setup DotNet 3.1.x Environment&lt;/span&gt;
        &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;actions/setup-dotnet@v1&lt;/span&gt;
        &lt;span class="na"&gt;with&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="na"&gt;dotnet-version&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;3.1.x&lt;/span&gt;

      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Resolve Project Dependencies Using Dotnet&lt;/span&gt;
        &lt;span class="na"&gt;shell&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;bash&lt;/span&gt;
        &lt;span class="na"&gt;run&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;|&lt;/span&gt;
          &lt;span class="s"&gt;pushd './${{ env.AZURE_FUNCTIONAPP_PACKAGE_PATH }}'&lt;/span&gt;
          &lt;span class="s"&gt;dotnet build --configuration Release --output ./output&lt;/span&gt;
          &lt;span class="s"&gt;popd&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Run Azure Functions Action&lt;/span&gt;
        &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Azure/functions-action@v1&lt;/span&gt;
        &lt;span class="na"&gt;id&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;fa&lt;/span&gt;
        &lt;span class="na"&gt;with&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="na"&gt;app-name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ env.AZURE_FUNCTIONAPP_NAME }}&lt;/span&gt;
          &lt;span class="na"&gt;package&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;${{&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;env.AZURE_FUNCTIONAPP_PACKAGE_PATH&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;}}/output'&lt;/span&gt;
          &lt;span class="na"&gt;publish-profile&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ secrets.AZURE_FUNCTIONAPP_PUBLISH_PROFILE }}&lt;/span&gt;

  &lt;span class="na"&gt;cd-build-deploy-test&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Continuous Deployment - Test&lt;/span&gt;
    &lt;span class="na"&gt;runs-on&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;ubuntu-latest&lt;/span&gt;
    &lt;span class="na"&gt;environment&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Test&lt;/span&gt;
    &lt;span class="na"&gt;if&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;github.ref == 'refs/heads/test'&lt;/span&gt;
    &lt;span class="na"&gt;env&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;AZURE_FUNCTIONAPP_NAME&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;MyHTTPTrigger&lt;/span&gt;
      &lt;span class="na"&gt;AZURE_FUNCTIONAPP_PACKAGE_PATH&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;.'&lt;/span&gt;
      &lt;span class="na"&gt;DOTNET_VERSION&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;3.0.13'&lt;/span&gt;

    &lt;span class="na"&gt;steps&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;

      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Checkout&lt;/span&gt;
        &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;actions/checkout@main&lt;/span&gt;

      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Setup DotNet 3.1.x Environment&lt;/span&gt;
        &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;actions/setup-dotnet@v1&lt;/span&gt;
        &lt;span class="na"&gt;with&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="na"&gt;dotnet-version&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;3.1.x&lt;/span&gt;

      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Resolve Project Dependencies Using Dotnet&lt;/span&gt;
        &lt;span class="na"&gt;shell&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;bash&lt;/span&gt;
        &lt;span class="na"&gt;run&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;|&lt;/span&gt;
          &lt;span class="s"&gt;pushd './${{ env.AZURE_FUNCTIONAPP_PACKAGE_PATH }}'&lt;/span&gt;
          &lt;span class="s"&gt;dotnet build --configuration Release --output ./output&lt;/span&gt;
          &lt;span class="s"&gt;popd&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Run Azure Functions Action&lt;/span&gt;
        &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Azure/functions-action@v1&lt;/span&gt;
        &lt;span class="na"&gt;id&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;fa&lt;/span&gt;
        &lt;span class="na"&gt;with&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="na"&gt;app-name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ env.AZURE_FUNCTIONAPP_NAME }}&lt;/span&gt;
          &lt;span class="na"&gt;package&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;${{&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;env.AZURE_FUNCTIONAPP_PACKAGE_PATH&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;}}/output'&lt;/span&gt;
          &lt;span class="na"&gt;publish-profile&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ secrets.AZURE_FUNCTIONAPP_PUBLISH_PROFILE }}&lt;/span&gt;

  &lt;span class="na"&gt;cd-build-deploy-prod&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Continuous Deployment - Production&lt;/span&gt;
    &lt;span class="na"&gt;runs-on&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;ubuntu-latest&lt;/span&gt;
    &lt;span class="na"&gt;environment&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Production&lt;/span&gt;
    &lt;span class="na"&gt;if&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;github.ref == 'refs/heads/main'&lt;/span&gt;
    &lt;span class="na"&gt;env&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;AZURE_FUNCTIONAPP_NAME&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;MyHTTPTrigger&lt;/span&gt;
      &lt;span class="na"&gt;AZURE_FUNCTIONAPP_PACKAGE_PATH&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;.'&lt;/span&gt;
      &lt;span class="na"&gt;DOTNET_VERSION&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;3.0.13'&lt;/span&gt;

    &lt;span class="na"&gt;steps&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;

      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Checkout&lt;/span&gt;
        &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;actions/checkout@main&lt;/span&gt;

      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Setup DotNet 3.1.x Environment&lt;/span&gt;
        &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;actions/setup-dotnet@v1&lt;/span&gt;
        &lt;span class="na"&gt;with&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="na"&gt;dotnet-version&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;3.1.x&lt;/span&gt;

      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Resolve Project Dependencies Using Dotnet&lt;/span&gt;
        &lt;span class="na"&gt;shell&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;bash&lt;/span&gt;
        &lt;span class="na"&gt;run&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;|&lt;/span&gt;
          &lt;span class="s"&gt;pushd './${{ env.AZURE_FUNCTIONAPP_PACKAGE_PATH }}'&lt;/span&gt;
          &lt;span class="s"&gt;dotnet build --configuration Release --output ./output&lt;/span&gt;
          &lt;span class="s"&gt;popd&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Run Azure Functions Action&lt;/span&gt;
        &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Azure/functions-action@v1&lt;/span&gt;
        &lt;span class="na"&gt;id&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;fa&lt;/span&gt;
        &lt;span class="na"&gt;with&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="na"&gt;app-name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ env.AZURE_FUNCTIONAPP_NAME }}&lt;/span&gt;
          &lt;span class="na"&gt;package&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;${{&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;env.AZURE_FUNCTIONAPP_PACKAGE_PATH&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;}}/output'&lt;/span&gt;
          &lt;span class="na"&gt;publish-profile&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ secrets.AZURE_FUNCTIONAPP_PUBLISH_PROFILE }}&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Once completed, commit the workflow and update your other branches so they are up to date.&lt;/p&gt;

&lt;h2&gt;
  
  
  Deployment
&lt;/h2&gt;

&lt;p&gt;Now that the workflow is in place, we can trigger it either by pushing to a particular branch or by running it manually.&lt;/p&gt;
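
&lt;p&gt;For the manual option to appear in GitHub, the workflow must declare the &lt;strong&gt;workflow_dispatch&lt;/strong&gt; trigger alongside any push triggers. A minimal sketch of the trigger section (the branch names here are assumptions based on the branching strategy above):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;on:
  push:
    branches:
      - main
      - develop
  workflow_dispatch:
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;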

&lt;p&gt;For this post, I am going to trigger the deployment manually. To do this, I select the &lt;strong&gt;Actions&lt;/strong&gt; tab and select my workflow from the left side menu. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Bvtf-Ra8--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1646828894058/V-c67tp5E.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Bvtf-Ra8--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1646828894058/V-c67tp5E.png" alt="image.png" width="114" height="58"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;From there, I click on the &lt;strong&gt;Run workflow&lt;/strong&gt; button and select the &lt;strong&gt;develop&lt;/strong&gt; branch to deploy the function app to the development slot. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--vh-yzJKD--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1646828877210/hqAGUIB6O.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--vh-yzJKD--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1646828877210/hqAGUIB6O.png" alt="image.png" width="316" height="166"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you've set up reviewers on your environments, the deployment will not start until a reviewer approves the job.&lt;/p&gt;

&lt;p&gt;Once the deployment completes and I can confirm it's working as intended, I can repeat the above to deploy to the testing slot and then the staging slot.&lt;/p&gt;

&lt;p&gt;Once I am happy for the app to go into the production slot, I will use the &lt;strong&gt;Swap&lt;/strong&gt; button in the &lt;strong&gt;Deployment&lt;/strong&gt; blade to swap Staging with Production (within the Azure Function App resource).&lt;/p&gt;
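
&lt;p&gt;If you prefer the command line over the portal, the same swap can be performed with the Azure CLI. This is a sketch; the resource group, app name and slot names below are placeholders you would replace with your own:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az functionapp deployment slot swap --resource-group MyResourceGroup --name MyHTTPTrigger --slot staging --target-slot production
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;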

</description>
      <category>azure</category>
      <category>github</category>
      <category>serverless</category>
      <category>cloud</category>
    </item>
    <item>
      <title>Azure Key Vault Secrets in GitHub Actions</title>
      <dc:creator>James Cook</dc:creator>
      <pubDate>Mon, 15 Nov 2021 17:02:39 +0000</pubDate>
      <link>https://forem.com/officialcookj/azure-key-vault-secrets-in-github-actions-1naf</link>
      <guid>https://forem.com/officialcookj/azure-key-vault-secrets-in-github-actions-1naf</guid>
      <description>&lt;p&gt;The fundamental rule to a secret is to not share a secret. Once shared it's more likely going to be shared again and in an unsecure format, but how do we keep a secret a secret?&lt;/p&gt;

&lt;p&gt;When it comes to Cloud technology we can use resources that store our sensitive information in a secure environment. For example, Azure Key Vault allows us to store secrets, certificates and keys where we can set access control using authentication methods like Azure AD.&lt;/p&gt;

&lt;p&gt;But when we add secrets into a secure resource like Key Vault, how do we access them when running deployments?&lt;/p&gt;

&lt;p&gt;In this blog post I will be covering how we get the secrets from an Azure Key Vault for a deployment in GitHub Actions.&lt;/p&gt;

&lt;h2&gt;
  
  
  GitHub Workflow
&lt;/h2&gt;

&lt;p&gt;We will need to log in to Azure using the Azure CLI. The first workflow step will be the following:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Azure CLI Login&lt;/span&gt;
        &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Azure/login@v1.1&lt;/span&gt;
        &lt;span class="na"&gt;with&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="na"&gt;creds&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;{"clientId":"${{&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;secrets.AZ_CLIENT_ID&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;}}","clientSecret":"${{&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;secrets.AZ_CLIENT_SECRET&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;}}","subscriptionId":"${{&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;secrets.AZ_SUBID&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;}}","tenantId":"${{&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;secrets.AZ_TENANT_ID&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;}}"}'&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The following GitHub Secrets need to exist before running the workflow:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;AZ_CLIENT_ID&lt;/strong&gt; - Service Principal Client ID&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;AZ_CLIENT_SECRET&lt;/strong&gt; - Service Principal Client Secret&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;AZ_SUBID&lt;/strong&gt; - The Subscription ID you are connecting to as part of this workflow&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;AZ_TENANT_ID&lt;/strong&gt; - The Tenant ID where the Service Principal exists&lt;/p&gt;
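
&lt;p&gt;If you don't yet have these values, a Service Principal with the required credential JSON can be created with the Azure CLI. The name below is a placeholder, and the subscription ID is masked; the &lt;strong&gt;--sdk-auth&lt;/strong&gt; flag outputs the clientId, clientSecret, subscriptionId and tenantId used above:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az ad sp create-for-rbac --name "github-actions-sp" --role contributor --scopes /subscriptions/*** --sdk-auth
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;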

&lt;p&gt;Once logged in via the Azure CLI, we will use the Get Key Vault Secrets GitHub Action, specifying the Key Vault name and the secrets we want:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Azure Key Vault Secrets&lt;/span&gt;
        &lt;span class="na"&gt;id&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;azurekeyvault&lt;/span&gt;
        &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Azure/get-keyvault-secrets@v1&lt;/span&gt;
        &lt;span class="na"&gt;with&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="na"&gt;keyvault&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;MyVaultName"&lt;/span&gt;
          &lt;span class="na"&gt;secrets&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;MyFirstSecret,&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;MySecondSecret,&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;MyThirdSecret'&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You would replace the following values with your own:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;MyVaultName&lt;/strong&gt; - You would replace this with the name of your Key Vault&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;MyFirstSecret, MySecondSecret, MyThirdSecret&lt;/strong&gt; - Replace these with the names of the secrets in your Key Vault (not the values).&lt;/p&gt;

&lt;p&gt;Now when you want to use these secrets in the workflow, you just need to use the following format:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;steps.azurekeyvault.outputs.MyFirstSecret&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Replace the following for your configuration:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;azurekeyvault&lt;/strong&gt; - This would be the id of the Key Vault action&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;MyFirstSecret&lt;/strong&gt; - Replace this with one of the secret names you listed to get&lt;/p&gt;

&lt;h2&gt;
  
  
  Service Principal Access
&lt;/h2&gt;

&lt;p&gt;The above workflow uses a Service Principal to connect to Azure. It is used to access the Azure Key Vault and requires permission to read the secrets. You can grant this within the Key Vault itself, either via Azure RBAC or via access policies (depending on the permission model you set for the Key Vault).&lt;/p&gt;

&lt;p&gt;The GitHub Action only reads secrets from Azure Key Vault, so the Service Principal needs nothing more than the minimum permission to get the secrets you specify.&lt;/p&gt;
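
&lt;p&gt;As an example, if your Key Vault uses access policies, the Azure CLI can grant the Service Principal just the &lt;strong&gt;get&lt;/strong&gt; permission on secrets. The vault name is the placeholder used earlier and the Service Principal's client ID is masked:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az keyvault set-policy --name MyVaultName --spn *** --secret-permissions get
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;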

&lt;h2&gt;
  
  
  Example Usage
&lt;/h2&gt;

&lt;p&gt;Below are some examples of using the above Azure Key Vault action to use secrets within other actions.&lt;/p&gt;

&lt;h3&gt;
  
  
  Terraform
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Install Terraform&lt;/span&gt;
        &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;hashicorp/setup-terraform@main&lt;/span&gt;
        &lt;span class="na"&gt;with&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="na"&gt;terraform_version&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;latest&lt;/span&gt;

      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Terraform Init&lt;/span&gt;
        &lt;span class="na"&gt;id&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;init&lt;/span&gt;
        &lt;span class="na"&gt;run&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;terraform init&lt;/span&gt;

      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Terraform Plan&lt;/span&gt;
        &lt;span class="na"&gt;id&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;plan&lt;/span&gt;
        &lt;span class="na"&gt;run&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;terraform plan&lt;/span&gt;
        &lt;span class="na"&gt;continue-on-error&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="no"&gt;true&lt;/span&gt;
        &lt;span class="na"&gt;env&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="na"&gt;TF_VAR_az_tenant_id&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ secrets.AZ_TENANT_ID }}&lt;/span&gt;
          &lt;span class="na"&gt;TF_VAR_MyFirstSecret&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ steps.azurekeyvault.outputs.MyFirstSecret }}&lt;/span&gt;
          &lt;span class="na"&gt;TF_VAR_MySecondSecret&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ steps.azurekeyvault.outputs.MySecondSecret }}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Docker
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;    &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Docker Login&lt;/span&gt;
      &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;azure/docker-login@v1&lt;/span&gt;
      &lt;span class="na"&gt;with&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
        &lt;span class="na"&gt;login-server&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;myregistry.azurecr.io&lt;/span&gt;
        &lt;span class="na"&gt;username&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ steps.azurekeyvault.outputs.MySecondSecret }}&lt;/span&gt;
        &lt;span class="na"&gt;password&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ steps.azurekeyvault.outputs.MyThirdSecret }}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



</description>
      <category>azure</category>
      <category>security</category>
      <category>github</category>
      <category>devops</category>
    </item>
    <item>
      <title>Terraform: Remove Resource from a Remote State in Azure Storage Account</title>
      <dc:creator>James Cook</dc:creator>
      <pubDate>Mon, 30 Aug 2021 06:19:50 +0000</pubDate>
      <link>https://forem.com/officialcookj/terraform-remove-resource-from-a-remote-state-in-azure-storage-account-1k3c</link>
      <guid>https://forem.com/officialcookj/terraform-remove-resource-from-a-remote-state-in-azure-storage-account-1k3c</guid>
      <description>&lt;p&gt;Have you been in the situation where cleaning up your Infrastructure as Code (powered by HashiCorp Terraform) to delete deprecated resources resulted in the Terraform apply taking longer than expected? Maybe this is what you are seeing:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;azurerm_backup_protected_vm.rs_name: Still destroying... [id=/subscriptions/***/***], 1h19m50s elapsed]
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Once the deployment timed out, I found that the resource had already been deleted via the Azure portal. The Terraform state file, however, still believes it exists and will continue to fail the deployment. How do I resolve the issue?&lt;/p&gt;

&lt;h2&gt;
  
  
  What you need
&lt;/h2&gt;

&lt;p&gt;On a Windows client, you will need:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://www.terraform.io/downloads.html"&gt;Terraform&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.microsoft.com/en-us/cli/azure/install-azure-cli"&gt;Azure CLI&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;You will also need an Azure account that has permission to access the Azure Storage container that stores the Terraform state file.&lt;/p&gt;

&lt;h2&gt;
  
  
  Steps to resolve the problem
&lt;/h2&gt;

&lt;p&gt;First, clone your repository so you can validate locally that the actions you take have worked (you can complete these steps without cloning, but you won't be able to validate the result without running the pipeline again).&lt;/p&gt;

&lt;p&gt;Create an &lt;strong&gt;override.tf&lt;/strong&gt; file in the directory where you store your Terraform configuration files. Within the file, set the resource group name, the storage account name, and the container name and key where the remote state file is stored.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="err"&gt;terraform&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="err"&gt;backend&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"azurerm"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="err"&gt;resource_group_name&lt;/span&gt;&lt;span class="w"&gt;  &lt;/span&gt;&lt;span class="err"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"resource_group_name"&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="err"&gt;storage_account_name&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"storage_account_name"&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="err"&gt;container_name&lt;/span&gt;&lt;span class="w"&gt;       &lt;/span&gt;&lt;span class="err"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"container_name"&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="err"&gt;key&lt;/span&gt;&lt;span class="w"&gt;                  &lt;/span&gt;&lt;span class="err"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"stafe_file_location/terraform.tfstate"&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Once you've done this and saved the file, run &lt;strong&gt;az login&lt;/strong&gt; (in a terminal of your choice) to authenticate with an account that has access to the Storage Account Container you specified above.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;🚀❯ az login
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now you need to set the subscription you are working with. This should be the subscription that your state file manages.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;🚀❯ az account set --subscription "SUBSCRIPTION NAME"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The Azure CLI authentication is now complete. Next, change your terminal's working directory to the location where you cloned your repository. On Windows, that usually looks like:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;🚀❯ cd "C:\Users\CloudJames\***\***"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Once you are in the correct directory, run &lt;strong&gt;terraform init&lt;/strong&gt; to initialise the configuration and download providers, modules, etc.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;🚀❯ terraform init
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Once completed, you can run &lt;strong&gt;terraform state list&lt;/strong&gt; to list the resources that are in your remote state file.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;🚀❯ terraform state list
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The results should appear like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;🦄❯ terraform state list
***
***
***
***
azurerm_backup_protected_vm.rs_name
***
***
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Find the resource that no longer exists in the Azure environment and take note of the name in full (format is &lt;strong&gt;resourcetype.resourcename&lt;/strong&gt;).&lt;/p&gt;

&lt;p&gt;We are now ready to remove the resource from the state file. We will use &lt;strong&gt;terraform state rm&lt;/strong&gt; to achieve this. Here is an example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;🚀❯ terraform state rm azurerm_backup_protected_vm.rs_name
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;When run, you should get an output like the below:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;🦄❯ terraform state rm azurerm_backup_protected_vm.rs_name
Removed azurerm_backup_protected_vm.rs_name
Successfully removed 1 resource instance(s).
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;To validate this has worked (if you cloned the repo as described at the beginning), you just need to run a &lt;strong&gt;terraform plan&lt;/strong&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;🚀❯ terraform plan
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The resource should no longer be listed for destruction, meaning you can run your pipeline again and it will continue as normal.&lt;/p&gt;

</description>
      <category>azure</category>
      <category>terraform</category>
      <category>devops</category>
      <category>operations</category>
    </item>
    <item>
      <title>Azure DevOps: Terraform variables with Azure Key Vault</title>
      <dc:creator>James Cook</dc:creator>
      <pubDate>Mon, 09 Aug 2021 07:23:36 +0000</pubDate>
      <link>https://forem.com/officialcookj/azure-devops-terraform-variables-with-azure-key-vault-1m1</link>
      <guid>https://forem.com/officialcookj/azure-devops-terraform-variables-with-azure-key-vault-1m1</guid>
      <description>&lt;p&gt;We use variables when creating Terraform configuration files to be able to change and adapt our code to be reusable. When configuring variables, the best method for me was selecting a single location for both sensitive (secrets) and non-sensitive (resource names, etc..) information, allowing me to manage variables in one place. In this post I will cover how to use variables in Terraform, how to store variables in Azure Key Vault and how to use these variables in Azure DevOps as part of a deployment.&lt;/p&gt;

&lt;h2&gt;
  
  
  How to configure variables using Terraform
&lt;/h2&gt;

&lt;p&gt;First you need to declare a variable in the Terraform code you are writing. For example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="err"&gt;variable&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"VMPASS"&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="err"&gt;type&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;string&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;I would usually put all my variables in a separate file but in the same directory to make it easier for myself to locate and manage them.&lt;/p&gt;

&lt;p&gt;Once configured, you can use the variable in the code by replacing the string you would enter with the variable, for example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="err"&gt;password&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;var.VMPASS&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Make sure you commit your code to a repository in Azure DevOps.&lt;/p&gt;

&lt;h2&gt;
  
  
  Storing variables in Azure Key Vault
&lt;/h2&gt;

&lt;p&gt;It's very simple: create your Azure Key Vault if you haven't done so already. From within the Key Vault resource, create a secret by selecting &lt;strong&gt;Secrets&lt;/strong&gt; from the side menu.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1628261195095%2FG4mmUp9uG.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1628261195095%2FG4mmUp9uG.png" alt="secrets.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now select &lt;strong&gt;Generate/Import&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1628261376220%2F_dp-mvhc0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1628261376220%2F_dp-mvhc0.png" alt="generate and import.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here, create the name of the secret for the variable and the value you require for your code. Please note that Azure Key Vault does not support certain characters, such as the underscore, which we need when sourcing variables externally. Click &lt;a href="https://docs.microsoft.com/en-us/azure/key-vault/general/about-keys-secrets-certificates#vault-name-and-object-name" rel="noopener noreferrer"&gt;here&lt;/a&gt; for more information; the next section covers how we remap these.&lt;/p&gt;

&lt;h2&gt;
  
  
  Link Azure DevOps to Key Vault
&lt;/h2&gt;

&lt;p&gt;We now need to link our Azure DevOps to Azure Key Vault. Open your project within Azure DevOps and from the side menu select &lt;strong&gt;Pipelines&lt;/strong&gt; then &lt;strong&gt;Library&lt;/strong&gt;. Here select &lt;strong&gt;Variable group&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1628262311809%2FzwJRMrREz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1628262311809%2FzwJRMrREz.png" alt="variable group.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Give your Variable Group a name and enable the &lt;strong&gt;Link secrets from an Azure key vault as variables&lt;/strong&gt; toggle. From here you want to select the Azure Subscription and Key Vault you created your Terraform variables in (if you haven't linked your Azure subscription to Azure DevOps, use the &lt;strong&gt;Manage&lt;/strong&gt; link to create a Service Principal).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1628262521525%2FLPiU_HRXw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1628262521525%2FLPiU_HRXw.png" alt="link key vault.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You might be asked to &lt;strong&gt;Authorize&lt;/strong&gt; the access to the Key Vault but once this is done, you can select the &lt;strong&gt;Add&lt;/strong&gt; option to add secrets to your variable library.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1628262888632%2Fd6o-DbmVU.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1628262888632%2Fd6o-DbmVU.png" alt="add.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now click &lt;strong&gt;Save&lt;/strong&gt; to complete the library creation.&lt;/p&gt;

&lt;h2&gt;
  
  
  How to use variables in your pipeline
&lt;/h2&gt;

&lt;p&gt;Finally, we need to either edit an existing pipeline or create a new one. We need to include the variable library we created that connects to Key Vault. To do this, select the &lt;strong&gt;Variables&lt;/strong&gt; tab and then &lt;strong&gt;Variable groups&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1628428342719%2FEQyK6Rwku.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1628428342719%2FEQyK6Rwku.png" alt="variable-variablegroups.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here select the &lt;strong&gt;Link variable group&lt;/strong&gt; and select the newly created variable group.&lt;/p&gt;

&lt;p&gt;Now we need to add a &lt;strong&gt;Bash Script&lt;/strong&gt; task to run remap commands so the Azure Key Vault variables are in a format Terraform supports. Terraform expects externally sourced variables in the format &lt;strong&gt;TF_VAR_NAME&lt;/strong&gt;, where NAME is the variable name (underscores are not supported in Azure Key Vault secret names, which is why we have to remap). Below is an example of the command to add to the bash script task, repeated for each variable that needs to be remapped for Terraform.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"##vso[task.setvariable variable=TF_VAR_VMPASS;]&lt;/span&gt;&lt;span class="nv"&gt;$tf&lt;/span&gt;&lt;span class="s2"&gt;-var-vmpass"&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Once this is done, you can add the Terraform tasks to install, initialize and plan/deploy. When the pipeline runs, the script will map the Azure Key Vault variables to new names that Terraform can identify.&lt;/p&gt;
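
&lt;p&gt;As a sketch, once the variable group is linked, the remap and the Terraform stages can be plain script tasks in the pipeline YAML. The group name and working directory below are assumptions; &lt;strong&gt;tf-var-vmpass&lt;/strong&gt; is the Key Vault secret name from earlier, referenced with macro syntax so the secret value is substituted at runtime:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;variables:
  - group: my-terraform-variables

steps:
  - bash: |
      echo "##vso[task.setvariable variable=TF_VAR_VMPASS;]$(tf-var-vmpass)"
    displayName: Remap Key Vault variables

  - bash: |
      terraform init
      terraform plan
    displayName: Terraform init and plan
    workingDirectory: $(System.DefaultWorkingDirectory)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;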

</description>
      <category>azure</category>
      <category>devops</category>
      <category>terraform</category>
      <category>security</category>
    </item>
    <item>
      <title>Azure Update Management</title>
      <dc:creator>James Cook</dc:creator>
      <pubDate>Mon, 07 Jun 2021 06:40:59 +0000</pubDate>
      <link>https://forem.com/officialcookj/azure-update-management-1p89</link>
      <guid>https://forem.com/officialcookj/azure-update-management-1p89</guid>
      <description>&lt;p&gt;An Azure Automation Account has a feature called Update Management that can manage your Windows and Linux Operating System (OS) updates for Azure, On-Premise and other third party Cloud environments. In this post I will explain what Update Management is, how to switch it on and how to add your servers to it.&lt;/p&gt;

&lt;h2&gt;
  
  
  What is Update Management
&lt;/h2&gt;

&lt;p&gt;Update Management is a toggle-on feature of the Azure Automation Account. Within Update Management, you can add servers from multiple environments and manage both Windows and Linux updates. Microsoft maintains a list of supported Operating Systems &lt;a href="https://docs.microsoft.com/en-us/azure/automation/update-management/overview#supported-operating-systems" rel="noopener noreferrer"&gt;here&lt;/a&gt;, but in brief the following are supported:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Windows Server 2012, 2012 R2, 2016 (excl. Core), 2019&lt;/li&gt;
&lt;li&gt;CentOS 6, 7, 8 (excl. 7.5)&lt;/li&gt;
&lt;li&gt;Red Hat Enterprise Linux 6, 7, 8&lt;/li&gt;
&lt;li&gt;SUSE Linux Enterprise Server 12, 15, 15.1&lt;/li&gt;
&lt;li&gt;Ubuntu 14.04, 16.04, 18.04&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;From within Update Management you can assess servers for missing updates, configure a schedule for updates to run within a maintenance window and configure what to include/exclude from the update cycle.&lt;/p&gt;

&lt;h2&gt;
  
  
  How do I enable Update Management
&lt;/h2&gt;

&lt;p&gt;To enable Update Management you need to create an Automation Account and a Log Analytics Workspace. Once you have both of these created:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Open the &lt;strong&gt;Automation Account&lt;/strong&gt; you created&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;Update Management&lt;/strong&gt; from the left side menu&lt;/li&gt;
&lt;li&gt;Here you are presented with a configure window. Select the &lt;strong&gt;Log Analytics Workspace&lt;/strong&gt; you created and click &lt;strong&gt;Enable&lt;/strong&gt; to complete.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Update Management is now enabled and will monitor any servers that have been configured to use the Log Analytics Workspace.&lt;/p&gt;

&lt;h2&gt;
  
  
  How do I add Azure VMs
&lt;/h2&gt;

&lt;p&gt;To add an Azure VM, all you need to do is either:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;From within the &lt;strong&gt;Update Management&lt;/strong&gt; blade within the &lt;strong&gt;Automation Account&lt;/strong&gt; resource, select &lt;strong&gt;Add Azure VM&lt;/strong&gt; from the top menu bar
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1623049395294%2F01xALwn5R.png" alt="addazvm.png"&gt;
&lt;/li&gt;
&lt;li&gt;Select a VM from the list provided (note that you may need to change the Log Analytics Workspace if the VM reports to a different one).&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;em&gt;or&lt;/em&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;From within the &lt;strong&gt;Virtual Machine&lt;/strong&gt;, select the &lt;strong&gt;Guest + host updates&lt;/strong&gt; blade from the left side menu
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1623049511645%2FzFfkkg_vB.png" alt="guesthostupdates.png"&gt;
&lt;/li&gt;
&lt;li&gt;Now select the &lt;strong&gt;Go to Update management&lt;/strong&gt; option and then select &lt;strong&gt;Log Analytics Workspace&lt;/strong&gt; and the &lt;strong&gt;Automation Account&lt;/strong&gt; to be used for updates.
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1623049589800%2Fjy7gs3zR0.png" alt="guesthostupdates2.png"&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  How do I add On-Premise Servers
&lt;/h2&gt;

&lt;p&gt;First you need to install the Log Analytics agent onto either &lt;a href="https://docs.microsoft.com/en-us/azure/azure-monitor/agents/agent-windows" rel="noopener noreferrer"&gt;Windows&lt;/a&gt; or &lt;a href="https://docs.microsoft.com/en-us/azure/azure-monitor/agents/agent-linux" rel="noopener noreferrer"&gt;Linux&lt;/a&gt;. When running the installer you will be asked to select a Log Analytics Workspace, select the one you created earlier.&lt;/p&gt;
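&lt;p&gt;For Linux, Microsoft's documented wrapper-script install looks roughly like the below; &lt;strong&gt;WORKSPACE_ID&lt;/strong&gt; and &lt;strong&gt;WORKSPACE_KEY&lt;/strong&gt; are placeholders for the values found under the workspace's &lt;strong&gt;Agents management&lt;/strong&gt; blade:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;# Download the Log Analytics (OMS) agent onboarding script and
# connect the server to your workspace.
wget https://raw.githubusercontent.com/Microsoft/OMS-Agent-for-Linux/master/installer/scripts/onboard_agent.sh
sh onboard_agent.sh -w WORKSPACE_ID -s WORKSPACE_KEY
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;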

&lt;p&gt;Once the installer completes, allow 30 minutes for it to appear in Azure. To add the server to Update Management:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;From within the &lt;strong&gt;Update Management&lt;/strong&gt; blade within the &lt;strong&gt;Automation Account&lt;/strong&gt; resource, select &lt;strong&gt;Add non-Azure VM&lt;/strong&gt; from the top menu bar
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1623049449128%2FRjGmVHeNv.png" alt="addnonazvm.png"&gt;
&lt;/li&gt;
&lt;li&gt;Select a server from the list provided.&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  What types of updates can I manage
&lt;/h2&gt;

&lt;p&gt;On &lt;strong&gt;Windows&lt;/strong&gt; you can manage:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Critical Updates&lt;/li&gt;
&lt;li&gt;Security Updates&lt;/li&gt;
&lt;li&gt;Update Rollups&lt;/li&gt;
&lt;li&gt;Feature Packs&lt;/li&gt;
&lt;li&gt;Service Packs&lt;/li&gt;
&lt;li&gt;Definition Updates&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;On &lt;strong&gt;Linux&lt;/strong&gt;:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Critical and Security Updates&lt;/li&gt;
&lt;li&gt;Other Updates (those not specified as Critical or Security)&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  How do you schedule updates
&lt;/h2&gt;

&lt;p&gt;From within the &lt;strong&gt;Update Management&lt;/strong&gt; blade within the &lt;strong&gt;Automation Account&lt;/strong&gt;, select &lt;strong&gt;Schedule update deployment&lt;/strong&gt; from the top menu. Here you can select:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The type of OS you want this deployment to apply to (a single deployment cannot target both Windows and Linux)&lt;/li&gt;
&lt;li&gt;What Virtual Machines or groups containing VMs you want to apply this to&lt;/li&gt;
&lt;li&gt;What update categories you want to apply as part of this schedule&lt;/li&gt;
&lt;li&gt;Any updates to include (leaving this blank applies all of them) or exclude&lt;/li&gt;
&lt;li&gt;Maintenance Window&lt;/li&gt;
&lt;li&gt;Date and Time to start&lt;/li&gt;
&lt;li&gt;Recurrence (if you want it to repeat)&lt;/li&gt;
&lt;li&gt;Reboot options&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Select &lt;strong&gt;Create&lt;/strong&gt; once configured. Use the &lt;strong&gt;Deployment&lt;/strong&gt; tab to monitor the progress or history of a scheduled deployment.&lt;/p&gt;

</description>
      <category>azure</category>
      <category>security</category>
      <category>linux</category>
      <category>windows</category>
    </item>
    <item>
      <title>Static Code Analyses - Terrascan, Terraform and Azure DevOps</title>
      <dc:creator>James Cook</dc:creator>
      <pubDate>Mon, 10 May 2021 06:27:17 +0000</pubDate>
      <link>https://forem.com/officialcookj/static-code-analyses-terrascan-terraform-and-azure-devops-3690</link>
      <guid>https://forem.com/officialcookj/static-code-analyses-terrascan-terraform-and-azure-devops-3690</guid>
<description>&lt;p&gt;In my &lt;a href="https://jamescook.dev/codeanalyses-checkov-terraform-azuredevops" rel="noopener noreferrer"&gt;previous post&lt;/a&gt; I looked at Static Code Analysis with two of the three tools I am going to use in this post. We are now going to look at &lt;a href="https://github.com/accurics/terrascan" rel="noopener noreferrer"&gt;Terrascan&lt;/a&gt; as our analysis tool and run it from the CI/CD platform Azure DevOps, which will also host the Terraform code we want to review.&lt;/p&gt;

&lt;p&gt;To follow along with this post, you will need the tools mentioned above, plus permissions on Azure DevOps to create a pipeline, add extensions from the Marketplace and commit to a repository.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Code
&lt;/h2&gt;

&lt;p&gt;As per my previous post, I will be reusing the example Terraform configuration file which contains bad practices such as a password in plain text. This will allow me to test the tool, as it should flag some of these bad practices.&lt;/p&gt;

&lt;p&gt;This is the code example:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1619359048086%2F_nMzNoobR.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1619359048086%2F_nMzNoobR.png" alt="code.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In my example, this is stored in an Azure DevOps repository, but you can use a third-party repository like GitHub as an alternative.&lt;/p&gt;

&lt;h2&gt;
  
  
  Pipeline Configuration
&lt;/h2&gt;

&lt;p&gt;We will create a separate pipeline within Azure DevOps rather than reuse the one from the previous post (you can combine them, but I will cover that later). This pipeline will run Terrascan to analyse the code. Open your project within Azure DevOps and go into &lt;strong&gt;Pipelines&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1619363601470%2F3IVJn3d6C.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1619363601470%2F3IVJn3d6C.png" alt="1-pipelines.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now you want to create a pipeline for this. Select the new pipeline option and within the new window select &lt;strong&gt;Use the classic editor&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1619363733896%2F_vfeOLXBz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1619363733896%2F_vfeOLXBz.png" alt="2-classiceditor.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here you want to select the repository where the configuration file is stored. I have stored it in an Azure DevOps repository so will select this as my location.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1619363876562%2FLKhvYUEUo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1619363876562%2FLKhvYUEUo.png" alt="3-demorepo.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once selected, you will then need to select &lt;strong&gt;Empty job&lt;/strong&gt; as the template option for this pipeline.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1619363999506%2F38kKmnKic.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1619363999506%2F38kKmnKic.png" alt="4-emptyjob.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You will first be asked to give the pipeline a name and select the agent pool you want to use. For this demo, I have chosen Hosted Agents, where I will run Terrascan on an Ubuntu OS. Below are the configurations I set.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1620457346436%2F3glSyAWc1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1620457346436%2F3glSyAWc1.png" alt="devops_name.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After all fields are filled, select the &lt;strong&gt;Run on agent&lt;/strong&gt; option and configure the agent job name. I opted to call the agent &lt;strong&gt;Terrascan Analyses&lt;/strong&gt; as it seemed appropriate for what it is doing.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1620457500596%2FKIm_E0CA0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1620457500596%2FKIm_E0CA0.png" alt="agent_name.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now select the &lt;strong&gt;plus icon&lt;/strong&gt; on the Run on agent field to add a job. You will be asked to choose from your currently installed extensions or from the Marketplace. We first need to install Terraform, as it is a prerequisite of Terrascan, so we will use the &lt;strong&gt;Terraform extension&lt;/strong&gt; from the Marketplace (you may have this already, in which case skip this step).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1619364620421%2F3xN2AXqZ9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1619364620421%2F3xN2AXqZ9.png" alt="7-terraformgetmarket.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once acquired from the marketplace you can then select to &lt;strong&gt;install Terraform&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1619364882704%2F3_XNaLmon.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1619364882704%2F3_XNaLmon.png" alt="8-installterraform.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Within the Terraform configuration window of the extension, select the version of Terraform you want to run on the Hosted Agent (as of writing, v0.15.0 of Terraform has a bug that stops initialisation, which may cause Terrascan not to function, so use an earlier version).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1619364981261%2FatDE04-A2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1619364981261%2FatDE04-A2.png" alt="terraformconfigversion.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Select the &lt;strong&gt;plus icon&lt;/strong&gt; on the &lt;strong&gt;Run on agent&lt;/strong&gt; and select the &lt;strong&gt;Bash&lt;/strong&gt; extension.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1619365206606%2FHb3hZ4zKm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1619365206606%2FHb3hZ4zKm.png" alt="9-bash.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here you want to install Terrascan using an inline script. This is what I used to install the software.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1620459155102%2FH4WpSzuXf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1620459155102%2FH4WpSzuXf.png" alt="terrascan install.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Repeat the process of adding another &lt;strong&gt;Bash&lt;/strong&gt; extension to the pipeline; this time configure the inline script so Terrascan runs the analysis and outputs the results to an XML file. Make sure to also tick the &lt;strong&gt;Continue on error&lt;/strong&gt; option under the &lt;strong&gt;Control Options&lt;/strong&gt; heading, or the pipeline run will fail.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1620459328127%2FoScV4vWyI.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1620459328127%2FoScV4vWyI.png" alt="terrascan.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Again, select the &lt;strong&gt;plus icon&lt;/strong&gt; on &lt;strong&gt;Run on agent&lt;/strong&gt; and select the &lt;strong&gt;Publish Test Results&lt;/strong&gt; extension.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1619365705659%2FEsZfM8LO7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1619365705659%2FEsZfM8LO7.png" alt="11-publishtestresult.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now we are importing the XML output from Terrascan into the test results feature in Azure DevOps. Here are the configurations I used for the import.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1620459414733%2FNEBbmSqNs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1620459414733%2FNEBbmSqNs.png" alt="results config.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once all configured, select &lt;strong&gt;Save&lt;/strong&gt; on the Pipeline.&lt;/p&gt;

&lt;h2&gt;
  
  
  Pipeline Run
&lt;/h2&gt;

&lt;p&gt;You are now ready to run the pipeline. All you need to do is select the &lt;strong&gt;Run&lt;/strong&gt; option under the three-dot icon next to the pipeline name. The pipeline will report a failure if Terrascan flags something in its analysis; if nothing is flagged, the pipeline will succeed.&lt;/p&gt;

&lt;p&gt;In my case, the code has been flagged by Terrascan, which has set the status of the pipeline build to failed.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1620459759523%2FmwnD7Beq6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1620459759523%2FmwnD7Beq6.png" alt="status.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Code Analyses Report
&lt;/h2&gt;

&lt;p&gt;Now that the pipeline is running and the report is being published into the Azure DevOps test reports, we can review these reports in two locations. The first is within the pipeline build: select the pipeline job and open the &lt;strong&gt;Tests&lt;/strong&gt; tab. Here you will see the tests that were run by Terrascan, what passed and failed, and the reasons why.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1620459874390%2FYYoRycEzH.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1620459874390%2FYYoRycEzH.png" alt="pipeline results.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Clicking on the flagged test failure, you will see more details as to why it failed.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1620459955423%2FPDNw1-No7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1620459955423%2FPDNw1-No7.png" alt="results details.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Alternatively, you can view the test reports via the side menu under &lt;strong&gt;Test Plans&lt;/strong&gt; and &lt;strong&gt;Runs&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1620460084653%2Fn_vVCe5xV.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1620460084653%2Fn_vVCe5xV.png" alt="runs results.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;There is more you can do with Terrascan, which I will cover in future posts on the topic. In the meantime, check out the &lt;a href="https://github.com/accurics/terrascan" rel="noopener noreferrer"&gt;Terrascan GitHub&lt;/a&gt; page for more information.&lt;/p&gt;

</description>
      <category>azure</category>
      <category>devops</category>
      <category>security</category>
    </item>
    <item>
      <title>Diving into Azure Management Groups</title>
      <dc:creator>James Cook</dc:creator>
      <pubDate>Mon, 03 May 2021 05:49:54 +0000</pubDate>
      <link>https://forem.com/officialcookj/diving-into-azure-management-groups-i4i</link>
      <guid>https://forem.com/officialcookj/diving-into-azure-management-groups-i4i</guid>
<description>&lt;p&gt;When I first heard of Management Groups I thought they were just a way to group subscriptions in Azure. After in-depth research on the feature, I found there is more you can do with them, so in this post I will cover what Management Groups are and what you can do with them.&lt;/p&gt;

&lt;h2&gt;
  
  
  What are Management Groups
&lt;/h2&gt;

&lt;p&gt;Management Groups are a feature of Azure used to control RBAC (Role-Based Access Control), apply governance via policies and implement cost management across the subscriptions organised within these groups. You might be familiar with these features already within subscriptions, but duplicating configurations from one subscription to another can be a headache to manage. Management Groups allow us to add subscriptions to one group and apply the configurations to that group; they then propagate down to the subscriptions and their resources. You can also nest a Management Group within another Management Group, and it will inherit the configurations set above it.&lt;/p&gt;

&lt;h2&gt;
  
  
  Where and how to create
&lt;/h2&gt;

&lt;p&gt;Creating a Management Group is straightforward; first we need to locate the feature. Within the Azure Portal, search for &lt;strong&gt;Management Groups&lt;/strong&gt; and select the result as per the below image.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1619879894378%2FAqosq9Q-m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1619879894378%2FAqosq9Q-m.png" alt="search-managementgroups.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;On the page you will notice there is already a Management Group called Tenant Root Group. This will contain all your subscriptions. When you create a group it will appear under the Tenant Root. To create a Management Group, select &lt;strong&gt;Add&lt;/strong&gt; from the top menu.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1619880215372%2FfBYtrFRKL.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1619880215372%2FfBYtrFRKL.png" alt="add.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here you enter the ID and display name for the group you want to create.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1619880356388%2F1OyzoSEWc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1619880356388%2F1OyzoSEWc.png" alt="add-management-group.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once the fields are completed, select &lt;strong&gt;Submit&lt;/strong&gt;. Once created, you can select the group to start configuring it.&lt;/p&gt;
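&lt;p&gt;The same can be done from the Azure CLI; a minimal sketch, where &lt;strong&gt;demo-mg&lt;/strong&gt; is a placeholder ID:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;# --name is the Management Group ID; --display-name is the friendly name.
az account management-group create --name demo-mg --display-name "Demo Management Group"
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;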

&lt;p&gt;If you create multiple Management Groups and want to move them inside of each other, select the &lt;strong&gt;Move&lt;/strong&gt; option while in one of these groups and select the location.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1619883145502%2FcnXQLCiKU.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1619883145502%2FcnXQLCiKU.png" alt="move.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Subscriptions
&lt;/h2&gt;

&lt;p&gt;Within the group you created, select &lt;strong&gt;Subscriptions&lt;/strong&gt; from the side menu. Here you can select the &lt;strong&gt;Add&lt;/strong&gt; option from the top menu to add the subscriptions you want within this group.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1619880215372%2FfBYtrFRKL.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1619880215372%2FfBYtrFRKL.png" alt="add.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;A subscription can only be in one Management Group at a time.&lt;/p&gt;
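&lt;p&gt;If you prefer the Azure CLI, a subscription can be added to a group as sketched below; &lt;strong&gt;demo-mg&lt;/strong&gt; and the subscription ID are placeholders:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;# Moves the subscription into the named Management Group.
az account management-group subscription add --name demo-mg --subscription "00000000-0000-0000-0000-000000000000"
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;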

&lt;h2&gt;
  
  
  IAM
&lt;/h2&gt;

&lt;p&gt;Select &lt;strong&gt;IAM&lt;/strong&gt; from the side menu within the Management Group. Here you can configure RBAC in the same way as you would within a subscription. This will propagate down to other Management Groups under this one, to subscriptions and to their resource groups.&lt;/p&gt;

&lt;p&gt;Click &lt;a href="https://docs.microsoft.com/en-us/azure/role-based-access-control/role-assignments-portal" rel="noopener noreferrer"&gt;here&lt;/a&gt; for more information on assigning roles in IAM.&lt;/p&gt;

&lt;h2&gt;
  
  
  Security
&lt;/h2&gt;

&lt;p&gt;Within the &lt;strong&gt;Security&lt;/strong&gt; option, you can review the security recommendations for all subscriptions and resource groups. You will also see an overall security score for the Management Group, with a summary of the lowest-rated subscriptions.&lt;/p&gt;

&lt;p&gt;Click &lt;a href="https://docs.microsoft.com/en-us/azure/security-center/security-center-introduction" rel="noopener noreferrer"&gt;here&lt;/a&gt; for more information on using the Security blade to enable Security Center.&lt;/p&gt;

&lt;h2&gt;
  
  
  Policy
&lt;/h2&gt;

&lt;p&gt;The &lt;strong&gt;Policy&lt;/strong&gt; side menu option allows you to apply governance policies, either pre-built or custom. These will propagate down to your subscriptions.&lt;/p&gt;

&lt;p&gt;Click &lt;a href="https://docs.microsoft.com/en-us/azure/governance/policy/assign-policy-portal" rel="noopener noreferrer"&gt;here&lt;/a&gt; for more information on implementing policies.&lt;/p&gt;

&lt;h2&gt;
  
  
  Cost Analysis
&lt;/h2&gt;

&lt;p&gt;To analyse the costs of resources within a Management Group, select &lt;strong&gt;Cost Analysis&lt;/strong&gt; in the side menu. Here you will get an overview of resource costs and a cost breakdown for each subscription within the group.&lt;/p&gt;

&lt;p&gt;Click &lt;a href="https://docs.microsoft.com/en-us/azure/cost-management-billing/costs/quick-acm-cost-analysis?tabs=azure-portal" rel="noopener noreferrer"&gt;here&lt;/a&gt; for more information on using Cost Analysis.&lt;/p&gt;

&lt;h2&gt;
  
  
  Budgets
&lt;/h2&gt;

&lt;p&gt;You can set budgets at the top level of a Management Group to monitor and control costs. Select &lt;strong&gt;Budgets&lt;/strong&gt; from the side menu; here you can create a budget for the group.&lt;/p&gt;

&lt;p&gt;Click &lt;a href="https://docs.microsoft.com/en-us/azure/cost-management-billing/costs/tutorial-acm-create-budgets" rel="noopener noreferrer"&gt;here&lt;/a&gt; for more information on how to create budgets.&lt;/p&gt;

</description>
      <category>azure</category>
      <category>management</category>
      <category>security</category>
      <category>cloud</category>
    </item>
    <item>
      <title>Static Code Analyses - Checkov, Terraform and Azure DevOps</title>
      <dc:creator>James Cook</dc:creator>
      <pubDate>Mon, 26 Apr 2021 06:32:02 +0000</pubDate>
      <link>https://forem.com/officialcookj/static-code-analyses-checkov-terraform-and-azure-devops-1kf4</link>
      <guid>https://forem.com/officialcookj/static-code-analyses-checkov-terraform-and-azure-devops-1kf4</guid>
<description>&lt;p&gt;Static Code Analysis is a method of reviewing code against policies before deploying it, identifying weaknesses before they become live vulnerabilities in your environment. This is not new; tools for this purpose have long been available to development teams. Here, though, we are talking about Infrastructure as Code (IaC), a practice that hasn't been around as long, so comparatively few tools are available.&lt;/p&gt;

&lt;p&gt;In this post I am covering Static Code Analyses using three tools, these are:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Azure DevOps&lt;/strong&gt; - a CI/CD platform provided by Microsoft for developers. We will use this to store the IaC and run the code analyses.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Terraform&lt;/strong&gt; - this is the IaC tool we will use to write our code for Azure infrastructure.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Checkov&lt;/strong&gt; - we will use this tool by &lt;a href="https://bridgecrew.io/"&gt;bridgecrew&lt;/a&gt; (their open source version) to analyse the code we write in Terraform.&lt;/p&gt;

&lt;p&gt;In future posts I will look at other Static Code Analyses tools and will draw comparisons between them.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Code
&lt;/h2&gt;

&lt;p&gt;For this, I have created an example Terraform configuration file which contains bad practices, such as a password in plain text. This lets me test the tool, which should flag these issues.&lt;/p&gt;

&lt;p&gt;This is the code example:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--5OJOo-MJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1619359048086/_nMzNoobR.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--5OJOo-MJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1619359048086/_nMzNoobR.png" alt="code.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This will be stored in an Azure DevOps repository but you can also use a third party repository like GitHub.&lt;/p&gt;
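&lt;p&gt;To make this reproducible, here is a hedged sketch of a file of the same kind: the resource block and the plain-text password are illustrative examples I have made up, not the exact code from the screenshot above.&lt;/p&gt;

```shell
# Write a minimal Terraform file containing a bad practice (plain-text
# password) for Checkov to flag. The resource and its values are illustrative.
printf '%s\n' \
  'resource "azurerm_sql_server" "example" {' \
  '  name                         = "demo-sqlserver"' \
  '  administrator_login          = "sqladmin"' \
  '  administrator_login_password = "Password1234!"  # bad practice: plain text' \
  '}' > main.tf

# Run Checkov against the current directory if it is installed.
if command -v checkov >/dev/null 2>/dev/null; then
  checkov -d .
fi
```

&lt;p&gt;Checkov should flag the hard-coded credential against its built-in policies.&lt;/p&gt;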

&lt;h2&gt;
  
  
  Pipeline Configuration
&lt;/h2&gt;

&lt;p&gt;First, we need to create the pipeline in Azure DevOps that will run Checkov against the code. Open your project in Azure DevOps and go to &lt;strong&gt;Pipelines&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--V2qLk6yO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1619363601470/3IVJn3d6C.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--V2qLk6yO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1619363601470/3IVJn3d6C.png" alt="1-pipelines.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now create the pipeline. Select the new pipeline option and, within the new window, select &lt;strong&gt;Use the classic editor&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--H5Rx9LwE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1619363733896/_vfeOLXBz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--H5Rx9LwE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1619363733896/_vfeOLXBz.png" alt="2-classiceditor.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here you want to select the repository where the configuration file is stored. I have stored it in an Azure DevOps repository so will select this as my location.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--fmqCKOeR--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1619363876562/LKhvYUEUo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--fmqCKOeR--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1619363876562/LKhvYUEUo.png" alt="3-demorepo.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once selected, you will then need to select &lt;strong&gt;Empty job&lt;/strong&gt; as the template option for this pipeline.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--rBE_mOlo--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1619363999506/38kKmnKic.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--rBE_mOlo--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1619363999506/38kKmnKic.png" alt="4-emptyjob.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The first fields ask you to give the pipeline a name and select the agent pool you want to use. For this demo, I have chosen Hosted Agents, running Checkov on an Ubuntu OS. Below are the configurations I set.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--0MefLgIS--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1619364237723/0Muc8HBQJ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--0MefLgIS--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1619364237723/0Muc8HBQJ.png" alt="5-pipelineconfig.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After all fields are filled in, select the &lt;strong&gt;Run on agent&lt;/strong&gt; option and configure the agent job name. I opted to call the agent &lt;strong&gt;Checkov Analyses&lt;/strong&gt; as it seemed appropriate for what it is doing.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--zltAN1dM--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1619364220175/YQi8YSgyT.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--zltAN1dM--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1619364220175/YQi8YSgyT.png" alt="6-agentjobname.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now select the &lt;strong&gt;plus icon&lt;/strong&gt; on the &lt;strong&gt;Run on agent&lt;/strong&gt; field to add a job. You will be asked to choose from your currently installed extensions or from the marketplace. We first need to install Terraform, as it is a prerequisite of Checkov, so we will use the &lt;strong&gt;Terraform extension&lt;/strong&gt; from the marketplace (if you already have it, skip this step).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--k1HzvMme--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1619364620421/3xN2AXqZ9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--k1HzvMme--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1619364620421/3xN2AXqZ9.png" alt="7-terraformgetmarket.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once acquired from the marketplace you can then select to &lt;strong&gt;install Terraform&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--pYNW-hI4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1619364882704/3_XNaLmon.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--pYNW-hI4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1619364882704/3_XNaLmon.png" alt="8-installterraform.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Within the Terraform configuration window of the extension, select the version of Terraform you want to run on the Hosted Agent (as of writing, Terraform v0.15.0 has a bug that prevents initialisation, which may stop Checkov from functioning, so use an earlier version).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--0-041XxM--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1619364981261/atDE04-A2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--0-041XxM--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1619364981261/atDE04-A2.png" alt="terraformconfigversion.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Select the &lt;strong&gt;plus icon&lt;/strong&gt; on the &lt;strong&gt;Run on agent&lt;/strong&gt; and select the &lt;strong&gt;Bash&lt;/strong&gt; extension.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--_trCmfqA--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1619365206606/Hb3hZ4zKm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--_trCmfqA--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1619365206606/Hb3hZ4zKm.png" alt="9-bash.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the inline script, install Checkov along with its remaining prerequisites. Here is what I used.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Lf7F0UUD--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1619365780842/Qxc0xoT2b.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Lf7F0UUD--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1619365780842/Qxc0xoT2b.png" alt="checkovinstall.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Repeat the process of adding another &lt;strong&gt;Bash&lt;/strong&gt; extension to the pipeline. This time, configure the inline script so Checkov runs the analysis and outputs the results to an XML file. Make sure to also tick &lt;strong&gt;Continue on error&lt;/strong&gt; under the &lt;strong&gt;Control Options&lt;/strong&gt; heading, or the pipeline run will fail at this step.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--cugVQEcJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1619365327432/MKbnwoC_L.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--cugVQEcJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1619365327432/MKbnwoC_L.png" alt="10-checkovconfig.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Again, select the &lt;strong&gt;plus icon&lt;/strong&gt; on &lt;strong&gt;Run on agent&lt;/strong&gt; and select the &lt;strong&gt;Publish Test Results&lt;/strong&gt; extension.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--ekSeNcyi--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1619365705659/EsZfM8LO7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ekSeNcyi--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1619365705659/EsZfM8LO7.png" alt="11-publishtestresult.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now we import the XML output from Checkov into the test results feature in Azure DevOps. Here are the configurations I used to import it.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--4iEM-hQ_--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1619366265612/9tek5YdPD.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--4iEM-hQ_--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1619366265612/9tek5YdPD.png" alt="12-testresultconfig.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once all configured, select &lt;strong&gt;Save&lt;/strong&gt; on the Pipeline.&lt;/p&gt;

&lt;h2&gt;
  
  
  Pipeline Run
&lt;/h2&gt;

&lt;p&gt;You are now ready to run the pipeline. All you need to do is select the &lt;strong&gt;Run&lt;/strong&gt; option under the three-dotted icon next to the pipeline name. The pipeline will report a failure if Checkov flags something in its analysis; if nothing is flagged, the pipeline will succeed.&lt;/p&gt;

&lt;p&gt;My code has been flagged by Checkov, which has set the status of the pipeline build to failed.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--0MPowHOl--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1619366825334/SysBZvdUJ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--0MPowHOl--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1619366825334/SysBZvdUJ.png" alt="pipelinefail.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Code Analysis Report
&lt;/h2&gt;

&lt;p&gt;Now that we have the pipeline running and the report being published into the Azure DevOps test reports, we can review these reports in two locations. The first is within the pipeline build: select the pipeline job and open the &lt;strong&gt;Tests&lt;/strong&gt; tab. Here you will see the tests that Checkov ran, what passed and failed, and the reasons why.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--PCp1_D7M--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1619375901765/z4b0t53yq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--PCp1_D7M--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1619375901765/z4b0t53yq.png" alt="testresults.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Clicking on the flagged test failure, you will see more details as to why it failed.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--BUmWDUdd--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1619375977814/0jQ5gCJe0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--BUmWDUdd--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1619375977814/0jQ5gCJe0.png" alt="testresultdetails.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Alternatively, you can view the test reports via the side menu under &lt;strong&gt;Test Plans&lt;/strong&gt; and &lt;strong&gt;Runs&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--74t_XaRK--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1619376070361/AuZGuLgW9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--74t_XaRK--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1619376070361/AuZGuLgW9.png" alt="testplanruns.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;There is more you can do with Checkov, but that will be covered in further posts on the topic rather than here. In the meantime, check out the &lt;a href="https://github.com/bridgecrewio/checkov"&gt;Checkov GitHub&lt;/a&gt; page for more information.&lt;/p&gt;

</description>
      <category>azure</category>
      <category>terraform</category>
      <category>devops</category>
      <category>security</category>
    </item>
    <item>
      <title>Azure Disk Encryption for Data Disk on Linux</title>
      <dc:creator>James Cook</dc:creator>
      <pubDate>Mon, 19 Apr 2021 06:46:00 +0000</pubDate>
      <link>https://forem.com/officialcookj/azure-disk-encryption-for-data-disk-on-linux-303c</link>
      <guid>https://forem.com/officialcookj/azure-disk-encryption-for-data-disk-on-linux-303c</guid>
      <description>&lt;p&gt;When configuring a new Linux Virtual Machine (VM), you may think that because your data is stored on Azure hardware it is encrypted, so no further encryption is needed. It's true Microsoft encrypts its data, but this is encryption at &lt;a href="https://docs.microsoft.com/en-us/azure/security/fundamentals/encryption-atrest#the-purpose-of-encryption-at-rest" rel="noopener noreferrer"&gt;rest&lt;/a&gt;, meaning it is protected at the hardware level against a physical attack. Beyond this, there is no logical encryption of the data disk, leaving you vulnerable if someone were able to download the disk. In this post I will cover what type of encryption Azure uses for Linux data disks, and how to enable it on a disk attached to a VM.&lt;/p&gt;

&lt;h2&gt;
  
  
  What encryption method is used
&lt;/h2&gt;

&lt;p&gt;The Azure platform uses &lt;a href="https://en.wikipedia.org/wiki/Dm-crypt" rel="noopener noreferrer"&gt;DM-Crypt&lt;/a&gt; to encrypt data on Linux VMs. This is the only method Azure offers for encrypting Linux data.&lt;/p&gt;

&lt;h2&gt;
  
  
  How is the encryption key generated, and where is it stored
&lt;/h2&gt;

&lt;p&gt;As encryption is a supported method offered by Microsoft, the Azure platform integrates data disk encryption with &lt;a href="https://docs.microsoft.com/en-us/azure/key-vault/general/overview" rel="noopener noreferrer"&gt;Azure Key Vault&lt;/a&gt;. As part of the encryption process, you will be asked to select a Key Vault (or create a new one), and to select or create the key that will be used for the encryption.&lt;/p&gt;

&lt;h2&gt;
  
  
  Are there prerequisites
&lt;/h2&gt;

&lt;p&gt;There are a couple of requirements you must meet before you can configure encryption on a data disk:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;The Linux virtual machine must have at least 2GB of RAM (8GB if you are doing both OS and Data Disk)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The OS must be one of the supported operating systems Microsoft have outlined in their &lt;a href="https://docs.microsoft.com/en-us/azure/virtual-machines/linux/disk-encryption-overview#supported-operating-systems" rel="noopener noreferrer"&gt;documentation&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;You will need to mount the data disk in advance of encryption so the virtual machine can mount it after encryption is enabled. You can follow Microsoft's documented approach &lt;a href="https://docs.microsoft.com/en-us/azure/virtual-machines/linux/disk-encryption-overview#additional-vm-requirements" rel="noopener noreferrer"&gt;here&lt;/a&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The Azure Key Vault you use must have the &lt;strong&gt;Azure Disk Encryption for volume encryption&lt;/strong&gt; access policy enabled. If not, the Key Vault will not work for the procedure below.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  How to enable encryption using the Azure Portal
&lt;/h2&gt;

&lt;p&gt;Make sure you have first mounted the data disk to the virtual machine and turned the VM off, ready for encryption to start. When ready, follow these steps:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Open the &lt;strong&gt;Virtual Machine&lt;/strong&gt; resource&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;Disks&lt;/strong&gt; from the left side menu
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1618773544001%2FpXZae5xRw.png" alt="disks.png"&gt;
&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;Additional Settings&lt;/strong&gt; from the top of the window
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1618773628182%2FyjbO76zXv.png" alt="additional-settings.png"&gt;
&lt;/li&gt;
&lt;li&gt;From the &lt;strong&gt;Disk to encrypt&lt;/strong&gt; drop down, select &lt;strong&gt;Data disks&lt;/strong&gt;
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1618773726184%2FEDqwetdzA.png" alt="encrypt-disk-list.png"&gt;
&lt;/li&gt;
&lt;li&gt;Select the &lt;strong&gt;Click to select a key&lt;/strong&gt; option that appeared after the above step completed
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1618773992232%2FSXpGqOuMf.png" alt="select-keyvault.png"&gt;
&lt;/li&gt;
&lt;li&gt;Complete the fields, either by selecting an existing Key Vault or creating a new one. Once done, click the &lt;strong&gt;Select&lt;/strong&gt; button
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1618774112979%2Fh460mPY7l.png" alt="keyvault-config.png"&gt;
&lt;/li&gt;
&lt;li&gt;You will return to the configuration window to finish by selecting &lt;strong&gt;Save&lt;/strong&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  How to enable encryption using Azure CLI
&lt;/h2&gt;

&lt;p&gt;Run the following command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az vm encryption enable -g "ResourceGroupName" --name "LinuxVMName" --disk-encryption-keyvault "NameOfKeyVault" --volume-type DATA
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Replace the following values with your own:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;ResourceGroupName&lt;/strong&gt; - the name of the resource group the Linux VM is in.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;LinuxVMName&lt;/strong&gt; - the name of the Linux VM whose data disk will be encrypted.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;NameOfKeyVault&lt;/strong&gt; - the name of the Key Vault you are using to store the encryption key.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
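&lt;p&gt;Once the command completes, you can optionally confirm the encryption status of the disks with the following, using the same placeholder names as above:&lt;/p&gt;

```shell
# Query the encryption status of the VM's OS and data disks.
az vm encryption show -g "ResourceGroupName" --name "LinuxVMName"
```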

</description>
      <category>azure</category>
      <category>cloud</category>
      <category>linux</category>
      <category>security</category>
    </item>
    <item>
      <title>Best Practice: Terraform State in Azure Blob Container</title>
      <dc:creator>James Cook</dc:creator>
      <pubDate>Mon, 12 Apr 2021 06:46:08 +0000</pubDate>
      <link>https://forem.com/officialcookj/best-practice-terraform-state-in-azure-blob-container-4ol3</link>
      <guid>https://forem.com/officialcookj/best-practice-terraform-state-in-azure-blob-container-4ol3</guid>
      <description>&lt;p&gt;Locating your Terraform state file remotely in an Azure Blob container shouldn't just be a matter of creating the container and configuring the backend; you should also consider some best practices. In this post I outline practices I've used when securing, and adding redundancy to, a Storage Account containing Terraform state files.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Note: Some settings may increase the cost of your Storage Account, so please refer to &lt;a href="https://azure.microsoft.com/en-gb/pricing/details/storage/" rel="noopener noreferrer"&gt;Microsoft's pricing page&lt;/a&gt;. You should also treat each practice as a recommendation and evaluate it against your own setup and needs.&lt;/em&gt;&lt;/p&gt;
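&lt;p&gt;If you have not yet created the Storage Account and container for your state files, a minimal sketch with the Azure CLI looks like this. All names and the location are placeholders; the geo-redundant SKU ties in with the replication practice covered later in this post.&lt;/p&gt;

```shell
# Sketch: create a resource group, a geo-redundant Storage Account and a
# Blob container for Terraform state. All names and locations are placeholders.
az group create --name "tfstate-rg" --location "uksouth"
az storage account create --name "tfstatedemo123" --resource-group "tfstate-rg" --sku Standard_GRS
az storage container create --name "tfstate" --account-name "tfstatedemo123"
```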

&lt;h2&gt;
  
  
  Subscription and Resource Permissions
&lt;/h2&gt;

&lt;p&gt;When storing Terraform state files in a Storage Account, you need to review the permissions of the Subscription, Resource Group and Resource. Terraform state files contain sensitive information and should themselves be considered sensitive. Check IAM (Identity and Access Management) to make sure those who need access to the resource have permissions, and that those who shouldn't have access are removed. Consider SAS (Shared Access Signature) as another means of authentication (mentioned further down in this post).&lt;/p&gt;

&lt;p&gt;You can locate IAM in the left sidebar with the name &lt;strong&gt;Access Control (IAM)&lt;/strong&gt; from any subscription, resource group or resource.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1618069738611%2FPMidqLCeP.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1618069738611%2FPMidqLCeP.png" alt="1-iam.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Selected Networks
&lt;/h2&gt;

&lt;p&gt;You can secure access to your Blob container by allowing access only through selected networks. Specifying networks reduces outside threats to the data, because only users on the specified networks are granted access after authentication.&lt;/p&gt;

&lt;p&gt;To configure this, go into your &lt;strong&gt;Storage Account&lt;/strong&gt; and select from the left side menu &lt;strong&gt;Networking&lt;/strong&gt;. From here you can then change the radio button from All Networks to Selected Networks where you can then configure your network settings.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1618069750167%2Fy3AvR5UH3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1618069750167%2Fy3AvR5UH3.png" alt="2-networking.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Azure Defender
&lt;/h2&gt;

&lt;p&gt;Azure Defender is disabled by default but can be enabled for the entire Storage Account. Azure Defender detects access attempts on containers that are deemed to be harmful. If you are only using the Storage Account to hold state files in a Blob container, then the cost, billed per 10,000 transactions, is very low, and enabling the setting is worth considering.&lt;/p&gt;

&lt;p&gt;To enable, select &lt;strong&gt;Security&lt;/strong&gt; from the left side menu within the Storage Account.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1618069763973%2FTUoRG_1mg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1618069763973%2FTUoRG_1mg.png" alt="3-security.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;From here you can select &lt;strong&gt;Enable Azure Defender for Storage&lt;/strong&gt; (you will also be shown the number of transactions this Storage Account has processed, to help with estimating costs).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1618069775717%2FKpRYCp6HN.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1618069775717%2FKpRYCp6HN.png" alt="4-security.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Geo Replication
&lt;/h2&gt;

&lt;p&gt;There are three forms of replication you can configure: LRS (Locally Redundant Storage), GRS (Geo Redundant Storage) and RA-GRS (Read Access Geo Redundant Storage). Depending on the availability you need for the data, I would consider either GRS or RA-GRS replication in an environment where Terraform manages many resources. GRS performs the same synchronisation of data as LRS (three copies in the local region) but then copies the data to a secondary region. RA-GRS does the same as GRS, but the secondary copy is also readable. Choosing between these replication types depends on whether you only want a read-only copy of the data in another region, or whether you want to actively use that copy when the source region is down.&lt;/p&gt;

&lt;p&gt;Geo replication can be configured during setup of the Storage Account and reconfigured after it's created. To reconfigure an existing Storage Account's replication, select &lt;strong&gt;Configuration&lt;/strong&gt; from the left side menu.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1618069786918%2F4w8dvq0ci.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1618069786918%2F4w8dvq0ci.png" alt="5-configuration.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Within the window, you can select from the &lt;strong&gt;Replication&lt;/strong&gt; dropdown the type of replication you want for the Storage Account.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1618069795563%2F5XOzRr20T.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1618069795563%2F5XOzRr20T.png" alt="6-geodropdown.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Soft Delete
&lt;/h2&gt;

&lt;p&gt;As a precaution, make sure soft delete is enabled. The more days of retention you configure, the more you pay, as the deleted data still has to be stored, so set something sensible such as 30 days. Once the specified retention period has passed, the data is permanently deleted.&lt;/p&gt;

&lt;p&gt;To check soft delete is configured, select &lt;strong&gt;Data Protection&lt;/strong&gt; from the left side menu.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1618069807002%2FWHm-vI_cs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1618069807002%2FWHm-vI_cs.png" alt="7-dataprotection.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here you can enable and configure &lt;strong&gt;Turn on soft delete for blobs&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1618069817278%2FgoKSkWp-D2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1618069817278%2FgoKSkWp-D2.png" alt="8-softdelete.png"&gt;&lt;/a&gt;&lt;/p&gt;
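&lt;p&gt;The same setting can be applied from the command line; this is a hedged equivalent of the portal steps, with placeholder account and group names and a 30-day retention.&lt;/p&gt;

```shell
# Enable blob soft delete with 30 days of retention (names are placeholders).
az storage account blob-service-properties update \
  --account-name "tfstatedemo123" \
  --resource-group "tfstate-rg" \
  --enable-delete-retention true \
  --delete-retention-days 30
```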

&lt;h2&gt;
  
  
  Versioning
&lt;/h2&gt;

&lt;p&gt;Versioning is an important setting when it comes to file recovery. You can select the Terraform state file and look back at every change that has been uploaded for that file, allowing you to recover from a previous version by making it the current version. Turn this on if it is not already enabled.&lt;/p&gt;

&lt;p&gt;To enable versioning, select &lt;strong&gt;Data Protection&lt;/strong&gt; from the left side menu.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1618069807002%2FWHm-vI_cs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1618069807002%2FWHm-vI_cs.png" alt="7-dataprotection.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then check the box next to &lt;strong&gt;Turn on versioning for blobs&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1618069836144%2FvUffcCuPP.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1618069836144%2FvUffcCuPP.png" alt="9-versioning.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Change Feed
&lt;/h2&gt;

&lt;p&gt;For observability, I recommend enabling the blob change feed so that changes made to the Blob container are audited for security and investigation purposes. You can either read the log files within the container directly or ingest this data into another tool to monitor and read.&lt;/p&gt;

&lt;p&gt;To enable the change feed, select &lt;strong&gt;Data Protection&lt;/strong&gt; from the left side menu.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1618069807002%2FWHm-vI_cs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1618069807002%2FWHm-vI_cs.png" alt="7-dataprotection.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;From here you can then check the box next to &lt;strong&gt;Turn on blob change feed&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1618069853167%2FO-Jqn-9rT.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1618069853167%2FO-Jqn-9rT.png" alt="10-changefeed.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Public Access
&lt;/h2&gt;

&lt;p&gt;By default, Blob public access is enabled. To prevent any misconfiguration, disabling public access is the sensible option, so that only authenticated methods can access resources. Refer to the &lt;a href="https://docs.microsoft.com/en-us/azure/storage/blobs/anonymous-read-access-prevent" rel="noopener noreferrer"&gt;Microsoft documentation&lt;/a&gt; on preventing anonymous read access to Blob storage.&lt;/p&gt;

&lt;p&gt;To disable public access, select &lt;strong&gt;Configuration&lt;/strong&gt; from the left side menu.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1618069786918%2F4w8dvq0ci.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1618069786918%2F4w8dvq0ci.png" alt="5-configuration.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Select &lt;strong&gt;Disabled&lt;/strong&gt; under the heading &lt;strong&gt;Allow Blob public access&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1618069875250%2FZsd55Oo1I.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1618069875250%2FZsd55Oo1I.png" alt="11-publicaccess.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Access Level
&lt;/h2&gt;

&lt;p&gt;There are three access level tiers for a Blob container: Private, Blob, and Container. Private is the only tier that prevents anonymous access to the data inside and is the one to choose. If you've already disabled public access, Private is applied to all containers and the other two options are unavailable.&lt;/p&gt;

&lt;p&gt;Select &lt;strong&gt;Container&lt;/strong&gt; from the side menu.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1618069887820%2FvV0wqn7nt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1618069887820%2FvV0wqn7nt.png" alt="13-containers.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Select a Blob container, then select &lt;strong&gt;Change access level&lt;/strong&gt; at the top of the window to change its access level tier.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1618069898354%2FN0nIWnBuG.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1618069898354%2FN0nIWnBuG.png" alt="12-acl.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Delete Lock
&lt;/h2&gt;

&lt;p&gt;Prevent your Storage Account from being deleted by configuring a lock; deletion cannot happen until the lock is removed.&lt;/p&gt;

&lt;p&gt;To set a delete lock, select &lt;strong&gt;Locks&lt;/strong&gt; from the side menu within &lt;strong&gt;Storage Account&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1618069916427%2Fm1LUkYhnA.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1618069916427%2Fm1LUkYhnA.png" alt="14-locks.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;From here you can then create a new lock and set the type as &lt;strong&gt;Delete&lt;/strong&gt;.&lt;/p&gt;
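&lt;p&gt;The same lock can be created with the Azure CLI. A sketch; the lock name and resource names are placeholders:&lt;/p&gt;

```shell
# Placeholders for your own names.
STORAGE_ACCOUNT="mytfstatestorage"
RESOURCE_GROUP="rg-terraform-state"

# Create a CanNotDelete lock on the storage account holding the state file.
az lock create \
  --name terraform-state-delete-lock \
  --resource-group "$RESOURCE_GROUP" \
  --resource "$STORAGE_ACCOUNT" \
  --resource-type Microsoft.Storage/storageAccounts \
  --lock-type CanNotDelete
```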

&lt;h2&gt;
  
  
  SAS Tokens
&lt;/h2&gt;

&lt;p&gt;Consider SAS tokens if you are granting access to individuals or applications for a specific period. A SAS token can then be used to authenticate to the backend using the configuration &lt;a href="https://www.terraform.io/docs/language/settings/backends/azurerm.html" rel="noopener noreferrer"&gt;HashiCorp lists on their site&lt;/a&gt;. This reduces the need to create Service Principals or Managed Service Identities (MSI).&lt;/p&gt;
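&lt;p&gt;As a sketch, a container-level SAS token can be generated from the CLI like this. The container name and expiry window are assumptions, and the &lt;code&gt;date -d&lt;/code&gt; syntax is GNU-specific:&lt;/p&gt;

```shell
# Placeholder for your own storage account name.
STORAGE_ACCOUNT="mytfstatestorage"

# Compute an expiry seven days from now (GNU date syntax).
EXPIRY=$(date -u -d "+7 days" '+%Y-%m-%dT%H:%MZ')

# Generate a SAS token for the tfstate container with
# (a)dd, (c)reate, (d)elete, (l)ist, (r)ead and (w)rite permissions.
az storage container generate-sas \
  --account-name "$STORAGE_ACCOUNT" \
  --name tfstate \
  --permissions acdlrw \
  --expiry "$EXPIRY" \
  --https-only \
  --output tsv
```

The token printed on stdout can then be supplied to the Terraform azurerm backend as described in the HashiCorp documentation linked above.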

&lt;h2&gt;
  
  
  Snapshots
&lt;/h2&gt;

&lt;p&gt;I would only recommend taking a snapshot if you are going to manipulate the state file. This gives you an additional recovery point if the state file breaks in some way (I have broken a state file while manipulating it and had no recovery point to revert to). Versioning is another option to recover from, but as a precautionary measure I feel a snapshot is good to have; you can then delete it once you confirm the file is working as intended.&lt;/p&gt;
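&lt;p&gt;Taking a snapshot before editing the state file is a one-liner from the CLI. A sketch, assuming the container is named &lt;code&gt;tfstate&lt;/code&gt; and the blob &lt;code&gt;terraform.tfstate&lt;/code&gt;:&lt;/p&gt;

```shell
# Placeholder for your own storage account name.
STORAGE_ACCOUNT="mytfstatestorage"

# Create a read-only, point-in-time snapshot of the state file
# before making any manual changes to it.
az storage blob snapshot \
  --account-name "$STORAGE_ACCOUNT" \
  --container-name tfstate \
  --name terraform.tfstate
```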

</description>
      <category>azure</category>
      <category>terraform</category>
      <category>devops</category>
    </item>
    <item>
      <title>Ctrl+Shift+A in Windows Terminal to launch Azure CLI authentication</title>
      <dc:creator>James Cook</dc:creator>
      <pubDate>Mon, 29 Mar 2021 07:56:11 +0000</pubDate>
      <link>https://forem.com/officialcookj/ctrl-shift-a-in-windows-terminal-to-launch-azure-cli-authentication-3fk6</link>
      <guid>https://forem.com/officialcookj/ctrl-shift-a-in-windows-terminal-to-launch-azure-cli-authentication-3fk6</guid>
<description>&lt;p&gt;I found logging into Azure via the CLI a repetitive task, but not a long one. I did, however, utilise Windows Terminal actions to create a key binding that types in the login command for me.&lt;/p&gt;

&lt;h2&gt;
  
  
  Configuring settings.json
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;Launch &lt;strong&gt;Windows Terminal&lt;/strong&gt;;&lt;/li&gt;
&lt;li&gt;Select the &lt;strong&gt;down arrow&lt;/strong&gt; next to the add tab option and select &lt;strong&gt;Settings&lt;/strong&gt;;
&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--K7OkBPiF--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1616960718731/nDXdmM75W.png" alt="azloginterm3.png"&gt;
&lt;/li&gt;
&lt;li&gt;The &lt;strong&gt;settings.json&lt;/strong&gt; file will open in your default code editor. Within the file, add the below code:
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="w"&gt;    &lt;/span&gt;&lt;span class="nl"&gt;"actions"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"command"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"action"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"sendInput"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"input"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"az login&lt;/span&gt;&lt;span class="se"&gt;\r&lt;/span&gt;&lt;span class="s2"&gt;eturn"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"keys"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"ctrl+shift+a"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="err"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Add this code in an appropriate place; it should look something like this:&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--2FMe7Yn5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1616955273483/dJtr-AEQe.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--2FMe7Yn5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1616955273483/dJtr-AEQe.png" alt="azloginterm1.png"&gt;&lt;/a&gt;&lt;br&gt;
Once the code has been included in the file, save it and relaunch Windows Terminal (if there are any errors, review and correct them).&lt;/p&gt;

&lt;h2&gt;
  
  
  How to launch
&lt;/h2&gt;

&lt;p&gt;Now you're ready to sign in to Azure CLI using the key binding. Launch Windows Terminal and hold down the following keys:&lt;br&gt;
&lt;strong&gt;Ctrl + Shift + A&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Once you do this, the below will display on your terminal:&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--CBIDbAru--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1616955429980/afjTdh-x0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--CBIDbAru--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.hashnode.com/res/hashnode/image/upload/v1616955429980/afjTdh-x0.png" alt="azloginterm2.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;A web browser window will also appear, asking you to log in to Azure to authenticate.&lt;/p&gt;

&lt;h2&gt;
  
  
  What else can you do
&lt;/h2&gt;

&lt;p&gt;You can amend the input field to include more information as part of the authentication. Find all sign-in methods in the &lt;a href="https://docs.microsoft.com/en-us/cli/azure/authenticate-azure-cli"&gt;Microsoft Azure CLI documentation&lt;/a&gt;. There are other ways to improve on this or to do something different, but remember to consider security before implementing them; you do not want to reduce security to improve convenience.&lt;/p&gt;
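&lt;p&gt;For example, here are a few login variants you could wire into the &lt;code&gt;input&lt;/code&gt; value instead of a plain &lt;code&gt;az login&lt;/code&gt;. The tenant and service principal identifiers below are placeholders, not real values:&lt;/p&gt;

```shell
# Sign in to a specific tenant (placeholder GUID):
az login --tenant "00000000-0000-0000-0000-000000000000"

# Use the device code flow, useful when a browser cannot be opened locally:
az login --use-device-code

# Sign in with a service principal (keep the secret out of settings.json;
# the variables here are placeholders sourced from a secure store):
az login --service-principal \
  --username "$APP_ID" \
  --password "$CLIENT_SECRET" \
  --tenant "$TENANT_ID"
```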

</description>
      <category>azure</category>
      <category>cloud</category>
      <category>terminal</category>
      <category>windows</category>
    </item>
    <item>
      <title>My Journey to Microsoft Certified Azure DevOps Engineer Expert</title>
      <dc:creator>James Cook</dc:creator>
      <pubDate>Mon, 22 Mar 2021 07:27:23 +0000</pubDate>
      <link>https://forem.com/officialcookj/my-journey-to-microsoft-certified-azure-devops-engineer-expert-7e9</link>
      <guid>https://forem.com/officialcookj/my-journey-to-microsoft-certified-azure-devops-engineer-expert-7e9</guid>
<description>&lt;p&gt;Before I tell you my journey, I want to let you know this is not my first Microsoft certification. The last Microsoft certification I took was in 2014, and since then I have changed technology specialism between Apple and Google, where I became certified. Exams aren’t new to me, I’ve done many, so my journey may come off differently to others who explain their experience.&lt;/p&gt;

&lt;p&gt;It started in December 2019, when the interest to skill up took me to the &lt;a href="https://query.prod.cms.rt.microsoft.com/cms/api/am/binary/RE2PjDI"&gt;Microsoft Certification Poster&lt;/a&gt;. I had been using Azure in dribs and drabs over the year and wanted to learn more about the service range, and felt it was best to start with Azure Fundamentals.&lt;/p&gt;

&lt;h2&gt;
  
  
  Microsoft Certified Azure Fundamentals (AZ 900) 🏁
&lt;/h2&gt;

&lt;p&gt;I first went to the &lt;a href="https://docs.microsoft.com/en-us/learn/certifications/exams/az-900"&gt;certification page&lt;/a&gt; to look at the learning content Microsoft recommends. The page listed all the learning modules from the Microsoft Learn site that are related to the exam topics. To make it convenient for myself, I created a Collection (select the plus symbol next to a module) within my MS Learn profile so I could collate them in one place to access later. Once I had done this, I started working through the material, which includes scenario-driven examples, video tutorials and hands-on labs. I allowed two hours a day to work through the material (with the occasional day off), using the Collection progress bar to monitor how far I was from completing all the modules.&lt;/p&gt;

&lt;p&gt;Once I completed all the modules, I went back to the certification page to review the exam skills outline document. I used this to cross off anything I felt confident I understood and had covered as part of the MS Learn material. By doing this, I was happy that the content I covered from Microsoft Learn had prepared me for the exam, which I then booked and passed. This preparation took a month, but I felt both my existing knowledge and the MS Learn content helped with the fast turnaround.&lt;/p&gt;

&lt;h2&gt;
  
  
  Microsoft Certified Azure Administrator Associate (AZ 103 / 104) 📚
&lt;/h2&gt;

&lt;p&gt;After passing the fundamentals exam (in January), I set a target for myself to be prepared for the Azure Administrator exam in five months (June). This target was based on slow-paced learning, dedicated to reading material, hands-on labs and deploying my own resources. The target was derailed after I found a new interest in learning how to manage Azure resources using HashiCorp Terraform. This put an extra two months (August) onto my planned target, but I did come out of it having learnt an IaC (Infrastructure as Code) product and become certified in it.&lt;/p&gt;

&lt;p&gt;Over the course of seven months I referred to the Microsoft Learn content listed on the &lt;a href="https://docs.microsoft.com/en-us/learn/certifications/exams/az-104"&gt;certification page&lt;/a&gt;. While working through this content I was also deploying resources in my own subscription using the &lt;a href="https://azure.microsoft.com/en-gb/free/"&gt;trial service Microsoft offers&lt;/a&gt;. Even though Microsoft provides hands-on labs in their learning material, using Azure outside of the learning content was, in my opinion, necessary to familiarise myself with the service settings and to get a feel for the services working with each other. When reviewing the exam skills outline document, my confidence in some subject areas was not great, as I felt I was missing some topics that needed clarification. After some thought I decided to go ahead and search for an instructor-led video course that included video tutorials and the information I was missing. There are many courses out there, and a lot of decent instructors who create them, so it was hard to choose. I came across Scott Duffy’s course and felt his way of teaching suited me; everything I watched helped explain what I felt was missing from my learning and made my other knowledge stronger. After completing the course, I went ahead and scheduled the exam and passed in August.&lt;/p&gt;

&lt;h2&gt;
  
  
  Designing and Implementing Microsoft DevOps Solutions (AZ 400) 🏆
&lt;/h2&gt;

&lt;p&gt;I now had the Fundamentals and Administrator Associate exams passed on first attempts within eight months. I wasn’t planning on taking any other exams in 2020; I was going to focus more on the Azure Architect content, but from attending a Microsoft Ignite event in January I had a free exam voucher that was expiring at the end of the year. As stated in the last section, I had started learning IaC, where I was using services like GitHub and Azure DevOps. I felt taking the DevOps exam might be beneficial both to my current understanding of DevOps practices and to the work I was doing with IaC. I did some research on the exam and found that even after passing the &lt;a href="https://docs.microsoft.com/en-us/learn/certifications/exams/az-400"&gt;Designing and Implementing Microsoft DevOps Solutions&lt;/a&gt; exam (AZ-400), there was a prerequisite of passing one of two exams before earning &lt;a href="https://docs.microsoft.com/en-us/learn/certifications/devops-engineer/"&gt;Microsoft Certified Azure DevOps Engineer Expert&lt;/a&gt;. The two exams are Azure Developer Associate (AZ-204) and Azure Administrator Associate (AZ-103/104), one of which I had passed as part of my journey, making me eligible to earn DevOps Engineer Expert status if I passed the AZ-400 exam.&lt;/p&gt;

&lt;p&gt;Again, I started with the Microsoft Learn content, but felt from reading the exam skills outline document that there were huge gaps in my knowledge around development tools (possibly AZ-204 could have helped me here). At the time, Pluralsight was partnered with Microsoft to provide free content on certifications (this ended in January 2021), and the material included information on the development skills needed for the exam. I used this to fill in the knowledge I was missing, implementing what I learnt in some of my own DevOps practices, leaving me feeling prepared for the exam. In December I took the exam and passed, completing my twelve-month journey to skill up.&lt;/p&gt;

&lt;h2&gt;
  
  
  Takeaways and What’s Next ⏭️
&lt;/h2&gt;

&lt;p&gt;I learnt during those twelve months that even though the Microsoft Learn material is great in detail and content, there is additional work on your part to build on the information it provides. Don’t just read; actually complete the hands-on labs and tutorials that are provided. I also learnt that time commitment is necessary and must be regular so you retain the information.&lt;/p&gt;

&lt;p&gt;Now, does this mean I can relax with three Microsoft certifications under my belt? Am I skilled up to where I want to be? The answer to both is no; Cloud and DevOps are constantly changing. I must stay in the loop with advancements in technology and practices, as well as renew my certifications yearly so they stay valid. I have started writing blog posts to share my experience and knowledge, as well as contributing to both Cloud and DevOps communities; in return I am receiving advice, support and information that is helping me to grow.&lt;/p&gt;

</description>
      <category>azure</category>
      <category>devops</category>
      <category>cloud</category>
      <category>cloudskills</category>
    </item>
  </channel>
</rss>
