<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Simon Waight</title>
    <description>The latest articles on Forem by Simon Waight (@simonwaight).</description>
    <link>https://forem.com/simonwaight</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F200782%2F1a003080-0f77-4fe2-ad6d-6a0ab661b74b.png</url>
      <title>Forem: Simon Waight</title>
      <link>https://forem.com/simonwaight</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/simonwaight"/>
    <language>en</language>
    <item>
      <title>Practical adoption of generative AI as part of your software development lifecycle</title>
      <dc:creator>Simon Waight</dc:creator>
      <pubDate>Tue, 04 Apr 2023 00:00:00 +0000</pubDate>
      <link>https://forem.com/simonwaight/practical-adoption-of-generative-ai-as-part-of-your-software-development-lifecycle-mkp</link>
      <guid>https://forem.com/simonwaight/practical-adoption-of-generative-ai-as-part-of-your-software-development-lifecycle-mkp</guid>
      <description>&lt;p&gt;As the pace at which new technology ships increases, so do the chances that businesses will opt out or disable new capabilities because they don't feel they have time to adequately assess the impact before people have access. A downside to this approach is that those businesses who have slow or no adoption of new technology will quickly be outpaced by their competitors (emerging and exitsing) that are better able to assess and adopt.&lt;/p&gt;

&lt;p&gt;Over the past decade I have experienced this first hand with cloud adoption, where many large organisations now restrict access to new services and features as standard operating procedure. More mature businesses put a structured assessment process in place, but it still feels like a barrier to innovation from potentially beneficial new technology.&lt;/p&gt;

&lt;p&gt;More recently I am seeing the same behaviour with the introduction of &lt;a href="https://en.wikipedia.org/wiki/Generative_artificial_intelligence"&gt;generative AI&lt;/a&gt;, so in this post I am going to concentrate on adoption of this emerging technology as part of the software delivery process. Examples of generative AI services you have access to today include &lt;a href="https://github.com/features/copilot/"&gt;GitHub Copilot&lt;/a&gt;, &lt;a href="https://aws.amazon.com/codewhisperer/"&gt;Amazon CodeWhisperer&lt;/a&gt; and &lt;a href="https://www.tabnine.com/"&gt;Tabnine&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;TL;DR - here's a &lt;a href="https://github.com/users/sjwaight/projects/2/"&gt;basic scaffold GitHub Project&lt;/a&gt; to help with your adoption process.&lt;/p&gt;

&lt;h3&gt;
  
  
  The more things change, the more they stay the same
&lt;/h3&gt;

&lt;p&gt;This famous quote (&lt;em&gt;"Plus ça change, plus c'est la même chose"&lt;/em&gt;) from the 19th-century French journalist &lt;a href="https://en.wikipedia.org/wiki/Jean-Baptiste_Alphonse_Karr"&gt;Jean-Baptiste Alphonse Karr&lt;/a&gt; is quite apt for the situation in which we find ourselves.&lt;/p&gt;

&lt;p&gt;Developers have been using assistive technology for many years to help reduce time to ship software or to remove the tedium of repetitive work. Code generation has been around since at least the late 1970s with the likes of &lt;a href="https://en.wikipedia.org/wiki/MATLAB"&gt;MATLAB&lt;/a&gt; plotting the course we find ourselves on today.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://en.wikipedia.org/wiki/Intelligent_code_completion"&gt;Intelligent code completion&lt;/a&gt; has also played a big part in modern developer experience with most Integrated Developer Environments (IDEs) supporting some form of code completion. Code completion is also not a new idea and traces its history all the way back to the 1950s.&lt;/p&gt;

&lt;p&gt;Clearly the idea that every single line of code in any application codebase is 100% produced only through direct human endeavour has been outdated for a while!&lt;/p&gt;

&lt;h3&gt;
  
  
  Protecting your IP from generative AI
&lt;/h3&gt;

&lt;p&gt;Many businesses have invested substantial amounts in building software that solves their business problems or those of their customers. It's understandable businesses want to protect their Intellectual Property (IP).&lt;/p&gt;

&lt;p&gt;However, the reality is that the IP is not just the source code; it's in the business logic, the data, and the unique aspects of the user experience the software delivers. The code is just a means to an end. While it's true some companies such as Microsoft (amongst others) have built their business on IP that is in code, the majority of businesses are not in this position.&lt;/p&gt;

&lt;p&gt;I know this is perhaps a controversial opinion (and likely your lawyers will disagree), but if your only reason for not adopting new technology is the perception you are protecting IP in your source code, you might be focusing on the wrong things. You might also want to stop reading here! 🙂&lt;/p&gt;

&lt;p&gt;Still with me? Good, let's look at how you can manage your adoption of generative AI services as part of your software delivery lifecycle.&lt;/p&gt;

&lt;h3&gt;
  
  
  Legal review of service prior to roll-out
&lt;/h3&gt;

&lt;p&gt;You should sit down with the appropriate members of your legal department or advisors and formally review the terms of service and privacy policy of the generative AI service(s) you are considering. This way you have a clear understanding of what data is being collected and how it is being used by the service provider.&lt;/p&gt;

&lt;p&gt;If you don't have a legal capability then you need to either engage one, or make a best-effort interpretation of the terms of service and privacy policy yourselves. Make sure to note down the process used to decide and the justifications for future reference!&lt;/p&gt;

&lt;p&gt;For example, GitHub Copilot for Business &lt;a href="https://docs.github.com/en/site-policy/privacy-policies/github-copilot-for-business-privacy-statement#code-snippets-data"&gt;clearly states&lt;/a&gt; it doesn't retain any code snippets sent to the service as a prompt. If the services you are considering have no clear declaration then you should ask the service provider for one. Depending on their answer you might not want to adopt their service!&lt;/p&gt;

&lt;p&gt;This is also not a one-time process. It should be regularly reviewed to ensure the service provider has not changed their policies in a way that is not acceptable to your business. Watch out for "we've changed our terms of service" or "we've updated our Privacy Policy" notifications!&lt;/p&gt;

&lt;h3&gt;
  
  
  Limit use of generative AI to specific codebases, tasks or teams
&lt;/h3&gt;

&lt;p&gt;If you have a large codebase, you may want to limit the use of the generative AI service to a specific part of a codebase or a specific team. This way you can control the amount of data being sent to the generative AI service and also the amount of generated code being included in your solution. You can also have confidence that certain parts of your codebase, potentially those containing what you consider your actual IP, are not exposed to the generative AI service and will not contain generated code.&lt;/p&gt;

&lt;p&gt;If you are unsure where in your codebase this thinking applies, I think we can agree that the Data Access Layer (DAL) of your solution is likely not to be where your IP lives and it's also not where you want your teams to be spending their time writing code. 😉&lt;/p&gt;

&lt;p&gt;There is a &lt;a href="https://research.aimultiple.com/generative-ai-copyright/"&gt;swirling debate&lt;/a&gt; around whether or not the outputs of generative AI models or services can be copyrighted, and if they can, who holds that copyright? If you want to ensure you hold copyright, limiting the places where this technology is used can be a good start.&lt;/p&gt;

&lt;h3&gt;
  
  
  Require human explainability and review
&lt;/h3&gt;

&lt;p&gt;This might be a tough ask depending on how much code generation you are prepared to allow, but ultimately your business will be responsible for any experiences resulting from this code. If your code generation service produces code nobody can explain then you bear the risk of something unexpected happening.&lt;/p&gt;

&lt;p&gt;Combined with the previous point (limited use), you can ring-fence key parts of your business logic and require human explainability and review of generated code only in those areas, which helps limit the scope of the effort.&lt;/p&gt;

&lt;p&gt;It's important to be aware of what is happening in this space of human explainability from a regulatory standpoint too. The European Union's General Data Protection Regulation (GDPR) includes mentions of automated decision making and the need for human explainability. The GDPR covers this most clearly in &lt;a href="https://gdpr.eu/article-22-automated-individual-decision-making/"&gt;Article 22&lt;/a&gt; and &lt;a href="https://gdpr.eu/recital-71-profiling/"&gt;Recital 71&lt;/a&gt;. It's safe to say other jurisdictions will likely follow suit in due course and you need to be ready!&lt;/p&gt;

&lt;h3&gt;
  
  
  Leaning on your DevOps practices
&lt;/h3&gt;

&lt;p&gt;If you are still not using DevOps practices widely, then now is the time to start! If you have quality gates in place for human-created code and you run your machine-generated code through the same process, you can have a degree of confidence around the quality and functionality of the code. There are a few aspects to this suggestion:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Small commits containing only generated code:&lt;/strong&gt; avoid mixing human and machine-generated code in commits. Ideally these commits are associated with a specific task or user story so you already have that context. Any Pull Request resulting from these commits must be human reviewed.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Only generate what you need:&lt;/strong&gt; I can foresee a situation where a developer is using a code generation service to generate a complete API or library for, say, database access. They will generate all the normal Create, Read, Update and Delete (CRUD) methods and add the code to a commit. Perhaps they only needed a Read operation? In that case, that is all the code that should have been kept and committed to the repository. If we require humans to review code changes, then we want to limit the volume of code to review and reduce the chances that problems will arise later when unreviewed code is used in the application.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Static code analysis or &lt;a href="https://en.wikipedia.org/wiki/Lint_(software)"&gt;linting&lt;/a&gt;:&lt;/strong&gt; just because code compiles doesn't mean it is correct! You should be running a static code analysis tool as part of your CI/CD pipeline. This will help identify any issues with the generated code. If you are using a generative AI service that is integrated with your IDE then you should be running the same analysis tool or linter locally. This will help you identify issues before you commit the code.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Clearly identify code as generated:&lt;/strong&gt; today there is no automated way to provide a clear indication that code is machine-generated. Depending on how widely you decide to use code generation, you can add a developer standard that requires developers to clearly mark code as machine-generated. This could be as simple as a comment at the top of the file or a section of a file. This will help you identify generated code and avoid the risk of human-written code being overwritten or vice versa. If the code is manually modified it should be noted in the comment. I asked the team at GitHub if Copilot could add comments around generated code like the sample below (the answer: not possible today).&lt;br&gt;
&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;//GHCP: Start
/*
Model release: x.xxx.xx
Model date: 2023-03-28
Prompt: a function to calculate the sum of two numbers
*/
public int Sum(int a, int b)
{
    return a + b;
}
//GHCP: End
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Test coverage:&lt;/strong&gt; ensuring all key parts of your codebase have test coverage will provide confidence that machine-generated changes are not introducing regressions. This is also a good practice to have in place for any code changes regardless! I know you are probably thinking about machine-generated unit tests at this point... I'll refer you to the human explainability and review point above. If you are using code generation to create unit tests simply to hit code coverage targets you are looking at this the wrong way! I've sketched a minimal human-written test after this list.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Revisit generated code as part of BAU or updates:&lt;/strong&gt; don't assume that today's best suggestion will be tomorrow's - just because it's generated doesn't mean it should be frozen in time. It is worth revisiting generated code over time to see if there are better ways to implement the logic. This is no different to using new language features or libraries as they become available and can replace bespoke code you have developed.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
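
&lt;p&gt;To make the test coverage point concrete, here is a minimal sketch of a human-written check over the generated &lt;code&gt;Sum&lt;/code&gt; method from the earlier sample (assuming xUnit, with a hypothetical &lt;code&gt;Calculator&lt;/code&gt; class hosting the method):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;using Xunit;

public class SumTests
{
    // Covers the machine-generated Sum method so a regenerated or
    // hand-modified version can't silently change behaviour.
    [Fact]
    public void Sum_AddsTwoNumbers()
    {
        var calculator = new Calculator(); // hypothetical class hosting Sum
        Assert.Equal(12, calculator.Sum(7, 5));
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;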

&lt;p&gt;The bottom line is, if you are already allowing changes into production without the necessary quality gates, then adding generative AI to the mix will simply increase the volume of change you are submitting to your CI/CD pipeline or requiring human review of!&lt;/p&gt;

&lt;h3&gt;
  
  
  Generative AI doesn't mean fewer people
&lt;/h3&gt;

&lt;p&gt;Let's not overlook the obvious concern many people have around the introduction of generative AI technology into the workplace - job security. I think it's important to understand that generative AI is not a replacement for humans. It's a tool that can help developers be more productive and deliver more value to their business while at the same time helping them to learn and grow their skills.&lt;/p&gt;

&lt;p&gt;If you are a manager or business owner and you are thinking you can replace your developers with generative AI services like Copilot, you are going to suffer in the long term. You will notice that a lot of what I've written about so far requires human oversight or intervention. Smart people want interesting and challenging work, and if you can use generative AI to remove the busy work from their lives they will be more productive and happy. This also provides them with space to help grow new team members who you should absolutely be hiring!&lt;/p&gt;

&lt;h3&gt;
  
  
  It's worth the time and the (perceived) risk
&lt;/h3&gt;

&lt;p&gt;I know there is a lot of hype (and a fair amount of concern) floating about on the impacts of AI at the moment, but I honestly believe the benefits of generative AI will far outweigh the risks.&lt;/p&gt;

&lt;p&gt;Like any new technology it's worth taking a considered approach to adoption. To this end I created a &lt;a href="https://github.com/users/sjwaight/projects/2/"&gt;basic scaffold GitHub Project&lt;/a&gt; which you can copy to your organisation and use to get started with adoption of generative AI in your environment. Make sure you select &lt;em&gt;"Draft issues will be copied if selected"&lt;/em&gt; so you get the baseline tasks I created for you to use. You can read about copying projects in &lt;a href="https://docs.github.com/en/issues/planning-and-tracking-with-projects/creating-projects/copying-an-existing-project"&gt;GitHub's documentation&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;I recently recorded a session with Damian Brady where we discussed risks associated with pushing Copilot code to production in more detail. You can watch the discussion below.&lt;/p&gt;

&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/ZGJQ8V8nefM?start=2135"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;p&gt;I will leave you with this thought...&lt;/p&gt;

&lt;p&gt;Let's be honest, the only time you review generated SQL from an Object Relational Mapper (ORM) is when there are performance issues, right? And that code is typically generated by a machine at runtime! 😜&lt;/p&gt;

&lt;p&gt;😎&lt;/p&gt;

</description>
      <category>ai</category>
      <category>devops</category>
      <category>alm</category>
      <category>github</category>
    </item>
    <item>
      <title>Moving from Azure Logic Apps to Power Automate Flow</title>
      <dc:creator>Simon Waight</dc:creator>
      <pubDate>Wed, 07 Sep 2022 00:00:29 +0000</pubDate>
      <link>https://forem.com/simonwaight/moving-from-azure-logic-apps-to-power-automate-flow-460n</link>
      <guid>https://forem.com/simonwaight/moving-from-azure-logic-apps-to-power-automate-flow-460n</guid>
      <description>&lt;p&gt;You may be sitting there are wondering why are we doing an Azure Logic App to Power Automate Flow migration? In most cases you’d likely be heading the other direction, but I can give you a few pointers why this move makes sense:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Bring a low-code solution and data into an environment that more people have access to (Office 365).&lt;/li&gt;
&lt;li&gt;Allow policies around data management to be easily applied using existing platform capabilities.&lt;/li&gt;
&lt;li&gt;Easier to share the implementation with others by exporting the Power Automate Flow or granting them ownership rights.&lt;/li&gt;
&lt;li&gt;Make use of features like &lt;a href="https://docs.microsoft.com/power-automate/process-mining-cloud-flow-process-insights"&gt;Process Insights&lt;/a&gt; to help improve my low-code solution.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;For this post I will be moving my &lt;a href="https://blog.siliconvalve.com/2019/05/24/integrate-meetup-announcements-with-microsoft-teams-using-azure-logic-apps-and-adaptive-cards/"&gt;meetups alerts Logic App&lt;/a&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 1: Move data from Azure Table Storage to SharePoint
&lt;/h3&gt;

&lt;p&gt;While we are moving our business logic to an easier-to-manage location, we might as well move our data as well. It is possible with Power Automate Flow to connect to &lt;a href="https://docs.microsoft.com/connectors/azuretables/"&gt;Azure Table Storage&lt;/a&gt;, but the people managing your Flow may not have access to the Azure environment, which would be an issue. To avoid this, let’s move the data into a platform that Flow editors are likely to also have access to – SharePoint.&lt;/p&gt;

&lt;p&gt;The process you use to migrate your data will depend on the volume of data you are moving. For my scenario the list of items is fairly small, so I will take a simple approach.&lt;/p&gt;

&lt;p&gt;First up, let’s export the Azure Table Storage data to CSV using the free &lt;a href="https://azure.microsoft.com/products/storage/storage-explorer/"&gt;Azure Storage Explorer&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Iy4PbNjr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2022/08/2022-08-29_14-14-03.png%3Fw%3D1024%26h%3D612" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Iy4PbNjr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2022/08/2022-08-29_14-14-03.png%3Fw%3D1024%26h%3D612" alt="Azure Storage Explorer - Export Table CSV" title="Azure Storage Explorer - Export Table CSV" width="880" height="526"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once we have the CSV downloaded, open it up in Excel and delete the columns that denote data type. Once done, save the file as an Excel Workbook (xlsx).&lt;/p&gt;

&lt;p&gt;Next, we are going to go ahead and create a new SharePoint List for our data by using the &lt;a href="https://www.microsoft.com/microsoft-365/microsoft-lists"&gt;Microsoft Lists&lt;/a&gt; service.&lt;/p&gt;

&lt;p&gt;The easiest way to navigate to Lists is to expand the “waffle menu” in the top left in any online Office 365 application (Outlook, OneDrive, etc) and then select Lists.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Y9jRRbnE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2022/09/2022-09-04_16-06-50.png%3Fw%3D1100" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Y9jRRbnE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2022/09/2022-09-04_16-06-50.png%3Fw%3D1100" alt="Selecting Lists" title="Selecting Lists" width="477" height="989"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once the Lists main page has loaded go ahead and click &lt;code&gt;+ New list&lt;/code&gt; at the top. On the next screen go ahead and choose &lt;code&gt;From Excel&lt;/code&gt; as shown below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--JhzUHknI--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2022/09/2022-08-29_14-47-10.png%3Fw%3D1100" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--JhzUHknI--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2022/09/2022-08-29_14-47-10.png%3Fw%3D1100" alt="Create Microsoft List from Excel" title="Create Microsoft List from Excel" width="743" height="371"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once the Excel file is uploaded you can select the columns you want to import (it should be all of them as we deleted the data type columns earlier).&lt;/p&gt;

&lt;p&gt;Finally, you can give your List a name, select a colour and icon, and most importantly, the location you want to store it. I selected a SharePoint Teams site which means the list is actually created as a standard SharePoint List.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--jU2eQGBO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2022/09/2022-08-29_14-49-40.png%3Fw%3D1100" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--jU2eQGBO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2022/09/2022-08-29_14-49-40.png%3Fw%3D1100" alt="Finish creating new list" title="Finish creating new list" width="860" height="846"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now we have our data in SharePoint, let’s go ahead and rebuild our Logic App.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 2: Rebuild the Logic App as a Power Automate Flow
&lt;/h3&gt;

&lt;p&gt;The big tip I will give you here is to name your Power Automate Flow Actions (Steps) identically to those in your Logic App. The reason? You can copy complex dynamic expressions from your Logic App and paste them right into your Flow.&lt;/p&gt;
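
&lt;p&gt;For example (the action names here are illustrative), expressions like the following only resolve after pasting if your Flow has actions with exactly those names:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;body('Parse_JSON')?['value']
items('For_each_meetup')?['RowKey']
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;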

&lt;p&gt;The big change between our original Logic App and Flow implementation is that our data is now stored in a SharePoint List, so instead of an Azure Table Storage connector (which is actually available to Flow) you use a SharePoint connector. Every other element remains the same.&lt;/p&gt;

&lt;p&gt;The other item to be aware of is that the &lt;a href="https://docs.microsoft.com/azure/connectors/connectors-native-http"&gt;HTTP Connector&lt;/a&gt; (built-in for both Logic Apps and Flows) is considered a Premium connector in Power Automate. This means you must determine whether you are licensed for Premium connectors. The easiest way to check if you’re covered is to review the documentation on &lt;a href="https://docs.microsoft.com/power-platform/admin/power-automate-licensing/types#compare-power-automate-plans"&gt;Power Automate Plans&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;The main body of my new Power Automate Flow is shown below. You can see I am now using a SharePoint connector in place of the original Azure Table Storage one in the Logic App.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--OuLEbR0D--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2022/09/2022-08-31_14-53-46.png%3Fw%3D1100" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--OuLEbR0D--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2022/09/2022-08-31_14-53-46.png%3Fw%3D1100" alt="Power Automate Flow Main Body" title="Power Automate Flow Main Body" width="659" height="775"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If I expand the meetup loop you can also see that the steps match the Logic App except I now use SharePoint as my target to track status.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--WMTJs_qq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2022/09/2022-08-31_15-01-49.png%3Fw%3D1024%26h%3D639" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--WMTJs_qq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2022/09/2022-08-31_15-01-49.png%3Fw%3D1024%26h%3D639" alt="Power Automate Flow Loop Body" title="Power Automate Flow Loop Body" width="880" height="550"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now that the Flow is created, I can use the dashboard to quickly view when the Flow ran, add other users as Owners of the Flow, export the Flow to share, and also determine any changes I can make to improve the Flow performance by using Process Insights!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--BwAchkDO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2022/09/2022-09-01_16-09-39.png%3Fw%3D1024%26h%3D554" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--BwAchkDO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2022/09/2022-09-01_16-09-39.png%3Fw%3D1024%26h%3D554" alt="Power Automate Flow Overview" title="Power Automate Flow Overview" width="880" height="476"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Unsurprisingly this migration wasn’t a tough one – Logic Apps and Power Automate Flow share a lot in common, and my data source was not a complex one. As I said at the start of this post, it’s an unusual move to make – most migrations would go from Power Automate Flow to Azure Logic Apps, but it is a real positive to see it is possible to go back the other way!&lt;/p&gt;

&lt;p&gt;Happy Days! 😎&lt;/p&gt;

</description>
      <category>azure</category>
      <category>lowcode</category>
      <category>powerautomate</category>
      <category>cloud</category>
    </item>
    <item>
      <title>How to consume GraphQL APIs from Azure Logic Apps</title>
      <dc:creator>Simon Waight</dc:creator>
      <pubDate>Tue, 05 Apr 2022 00:00:47 +0000</pubDate>
      <link>https://forem.com/simonwaight/how-to-consume-graphql-apis-from-azure-logic-apps-57e8</link>
      <guid>https://forem.com/simonwaight/how-to-consume-graphql-apis-from-azure-logic-apps-57e8</guid>
      <description>&lt;p&gt;If you’ve been reading my blog for a while then you will see that I am a heavy user of RESTful APIs in my solutions, going so far as to build a custom &lt;a href="https://github.com/sjwaight/LogicAppsMeetupReadOnlyPublicConnector"&gt;Azure Logic Apps Connector for Meetup’s REST API&lt;/a&gt;. I’ve used this Connector in a few places and blog about one example previously in &lt;a href="https://blog.siliconvalve.com/2019/05/24/integrate-meetup-announcements-with-microsoft-teams-using-azure-logic-apps-and-adaptive-cards/"&gt;Integrate Meetup announcements with Microsoft Teams using Azure Logic Apps and Adaptive Cards&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Recently my solution for Meetup integration ceased working as Meetup decided to shift their API from REST to GraphQL without providing any advance notice. This change has quite a big impact on API consumers because the client interaction model and API endpoints are completely different between REST and GraphQL. Thankfully GraphQL still utilises HTTP primitives and returns JSON responses, so we at least have a starting point for our new client code.&lt;/p&gt;

&lt;p&gt;In this post I am going to look at how you can call a GraphQL API from within an Azure Logic App by using the &lt;a href="https://docs.microsoft.com/azure/connectors/connectors-native-http"&gt;in-built HTTP Action&lt;/a&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  Our sample query
&lt;/h3&gt;

&lt;p&gt;For the purpose of this post I am going to use a GraphQL query against the Meetup GraphQL API which uses the predefined &lt;a href="https://www.meetup.com/api/schema/#groupByUrlname"&gt;groupByUrlname query&lt;/a&gt; to return the next scheduled event for the specified group. The sample is shown below.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
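&lt;p&gt;Here is an indicative version of the query – I’ve trimmed the selected fields for readability, so check the &lt;a href="https://www.meetup.com/api/schema/"&gt;Meetup schema reference&lt;/a&gt; for the exact field names you need:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;query($urlname: String!) {
  groupByUrlname(urlname: $urlname) {
    name
    link
    upcomingEvents(input: { first: 1 }) {
      edges {
        node {
          title
          dateTime
          eventUrl
        }
      }
    }
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;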


&lt;p&gt;This query shows one of the benefits of GraphQL espoused by its proponents – the ability to send what is effectively two queries (get group AND event details) to the API at once and receive a single response. I must admit this is really handy and helped remove a step in my Logic App.&lt;/p&gt;

&lt;h3&gt;
  
  
  Using in a Logic App
&lt;/h3&gt;

&lt;p&gt;As I hinted at above, GraphQL still uses all the usual HTTP primitives, so we can use the &lt;a href="https://docs.microsoft.com/azure/connectors/connectors-native-http#add-an-http-action"&gt;in-built HTTP Action in Logic Apps&lt;/a&gt; to call the Meetup GraphQL API.&lt;/p&gt;

&lt;p&gt;The Meetup API documentation includes a few samples that show how to call the GraphQL API using &lt;a href="https://curl.se/"&gt;curl&lt;/a&gt; which is a really handy way to expose exactly what is required. I’ve included a sample below.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
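&lt;p&gt;Those samples follow this general shape (I’ve shortened the query and used a placeholder token here):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;curl -X POST https://api.meetup.com/gql \
  -H 'Authorization: Bearer {YOUR_OAUTH_TOKEN}' \
  -H 'Content-Type: application/json' \
  -d '{"query": "query($urlname: String!) { groupByUrlname(urlname: $urlname) { name } }", "variables": {"urlname": "Azure-Sydney-User-Group"}}'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;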


&lt;p&gt;The key items of note are:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Wrap the GraphQL query and any variables in the body of your request as a JSON payload, specifying ‘query’ and ‘variables’ each as their own JSON property.&lt;/li&gt;
&lt;li&gt;Set Content-Type to application/json.&lt;/li&gt;
&lt;li&gt;Use the POST HTTP verb.&lt;/li&gt;
&lt;li&gt;The URI to hit is &lt;a href="https://api.meetup.com/gql"&gt;https://api.meetup.com/gql&lt;/a&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Given I’m retrieving information on public Meetup groups I don’t need to provide an Authorization token, but apart from that I can use the rest of the curl sample to help me set up my Logic App HTTP Action as shown below.&lt;/p&gt;

&lt;p&gt;The response that comes back is JSON, which I will parse into an object to make it easier to use elsewhere in my Logic App. The easiest way to build the schema is to run the HTTP Action once, look at the ‘output’ of the Action in the Run, copy the response and then use the “Use sample payload to generate schema” option in the Parse JSON Action.&lt;/p&gt;
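
&lt;p&gt;For reference, GraphQL responses arrive in a standard envelope with your results under a top-level &lt;code&gt;data&lt;/code&gt; property, so the copied sample will look something like this (field names indicative of my query):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "data": {
    "groupByUrlname": {
      "name": "Azure Sydney User Group",
      "upcomingEvents": {
        "edges": [
          { "node": { "title": "...", "dateTime": "...", "eventUrl": "..." } }
        ]
      }
    }
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;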

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--PlWgHvXJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2022/03/2022-03-30_10-33-40.png%3Fw%3D706%26h%3D1024" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--PlWgHvXJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2022/03/2022-03-30_10-33-40.png%3Fw%3D706%26h%3D1024" alt="Logic App steps for reach API and parse response" width="705" height="1023"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the HTTP Action I am dynamically replacing the URL of the Meetup group with the RowKey from an Azure Storage table. I am doing this because my data design means the RowKey contains the URL component of the meetup group – i.e. “Azure-Sydney-User-Group”.&lt;/p&gt;
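
&lt;p&gt;Putting that together, the body of my HTTP Action ends up looking something like the below, where the interpolated expression (the loop name is illustrative) pulls the RowKey from the current table entity:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "query": "query($urlname: String!) { groupByUrlname(urlname: $urlname) { name } }",
  "variables": {
    "urlname": "@{items('For_each_group')?['RowKey']}"
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;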

&lt;p&gt;The net result of this implementation is that my &lt;a href="https://blog.siliconvalve.com/2019/05/24/integrate-meetup-announcements-with-microsoft-teams-using-azure-logic-apps-and-adaptive-cards/"&gt;Meetup-to-Teams bridge&lt;/a&gt; is functional again, and in fact is now simplified with only a single Meetup API call. Unfortunately, I can’t use my Custom Connector for the Logic App anymore, but at least my integration works and I can keep across new meetups as they are announced!&lt;/p&gt;

&lt;p&gt;Happy Days! 😎&lt;/p&gt;

</description>
      <category>azure</category>
      <category>logicapps</category>
      <category>graphql</category>
      <category>lowcode</category>
    </item>
    <item>
      <title>Real-time air quality monitoring and alerting with Azure and PurpleAir – Part 1</title>
      <dc:creator>Simon Waight</dc:creator>
      <pubDate>Mon, 10 Jan 2022 22:45:44 +0000</pubDate>
      <link>https://forem.com/simonwaight/real-time-air-quality-monitoring-and-alerting-with-azure-and-purpleair-part-1-52aj</link>
      <guid>https://forem.com/simonwaight/real-time-air-quality-monitoring-and-alerting-with-azure-and-purpleair-part-1-52aj</guid>
      <description>&lt;p&gt;Anyone who was living in Australia during the 2019/2020 summer can’t help but remember the massive bushfires we had, and the impact they had on air quality.&lt;/p&gt;

&lt;p&gt;Probably the starkest way to illustrate how bad it was is this post from December 10, 2019. I also added a recent follow-up post to show what it normally looks like here.&lt;/p&gt;


&lt;blockquote class="ltag__twitter-tweet"&gt;
      &lt;div class="ltag__twitter-tweet__media"&gt;
        &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--U0lxXCK1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://pbs.twimg.com/media/ELYPsvzUcAAzaqE.jpg" alt="unknown tweet media content"&gt;
      &lt;/div&gt;

  &lt;div class="ltag__twitter-tweet__main"&gt;
    &lt;div class="ltag__twitter-tweet__header"&gt;
      &lt;img class="ltag__twitter-tweet__profile-image" src="https://res.cloudinary.com/practicaldev/image/fetch/s--HsIDR3-3--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://pbs.twimg.com/profile_images/1392320582545416192/KY33Oa5y_normal.jpg" alt="Simon Waight profile image"&gt;
      &lt;div class="ltag__twitter-tweet__full-name"&gt;
        Simon Waight
      &lt;/div&gt;
      &lt;div class="ltag__twitter-tweet__username"&gt;
        &lt;a class="mentioned-user" href="https://dev.to/simonwaight"&gt;@simonwaight&lt;/a&gt;
      &lt;/div&gt;
      &lt;div class="ltag__twitter-tweet__twitter-logo"&gt;
        &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ir1kO05j--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev.to/assets/twitter-f95605061196010f91e64806688390eb1a4dbc9e913682e043eb8b1e06ca484f.svg" alt="twitter logo"&gt;
      &lt;/div&gt;
    &lt;/div&gt;
    &lt;div class="ltag__twitter-tweet__body"&gt;
      Visibility of maybe 100m. Can usually see kilometres from here... 🙁 
    &lt;/div&gt;
    &lt;div class="ltag__twitter-tweet__date"&gt;
      22:33 PM - 09 Dec 2019
    &lt;/div&gt;


    &lt;div class="ltag__twitter-tweet__actions"&gt;
      &lt;a href="https://twitter.com/intent/tweet?in_reply_to=1204167218063278081" class="ltag__twitter-tweet__actions__button"&gt;
        &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--fFnoeFxk--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev.to/assets/twitter-reply-action-238fe0a37991706a6880ed13941c3efd6b371e4aefe288fe8e0db85250708bc4.svg" alt="Twitter reply action"&gt;
      &lt;/a&gt;
      &lt;a href="https://twitter.com/intent/retweet?tweet_id=1204167218063278081" class="ltag__twitter-tweet__actions__button"&gt;
        &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--k6dcrOn8--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev.to/assets/twitter-retweet-action-632c83532a4e7de573c5c08dbb090ee18b348b13e2793175fea914827bc42046.svg" alt="Twitter retweet action"&gt;
      &lt;/a&gt;
      &lt;a href="https://twitter.com/intent/like?tweet_id=1204167218063278081" class="ltag__twitter-tweet__actions__button"&gt;
        &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--SRQc9lOp--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev.to/assets/twitter-like-action-1ea89f4b87c7d37465b0eb78d51fcb7fe6c03a089805d7ea014ba71365be5171.svg" alt="Twitter like action"&gt;
      &lt;/a&gt;
    &lt;/div&gt;
  &lt;/div&gt;
&lt;/blockquote&gt;



&lt;blockquote class="ltag__twitter-tweet"&gt;
      &lt;div class="ltag__twitter-tweet__media"&gt;
        &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--7kgM-fJS--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://pbs.twimg.com/media/FHV__JYVEAQR_PY.jpg" alt="unknown tweet media content"&gt;
      &lt;/div&gt;

  &lt;div class="ltag__twitter-tweet__main"&gt;
    &lt;div class="ltag__twitter-tweet__header"&gt;
      &lt;img class="ltag__twitter-tweet__profile-image" src="https://res.cloudinary.com/practicaldev/image/fetch/s--HsIDR3-3--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://pbs.twimg.com/profile_images/1392320582545416192/KY33Oa5y_normal.jpg" alt="Simon Waight profile image"&gt;
      &lt;div class="ltag__twitter-tweet__full-name"&gt;
        Simon Waight
      &lt;/div&gt;
      &lt;div class="ltag__twitter-tweet__username"&gt;
        &lt;a class="mentioned-user" href="https://dev.to/simonwaight"&gt;@simonwaight&lt;/a&gt;
      &lt;/div&gt;
      &lt;div class="ltag__twitter-tweet__twitter-logo"&gt;
        &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ir1kO05j--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev.to/assets/twitter-f95605061196010f91e64806688390eb1a4dbc9e913682e043eb8b1e06ca484f.svg" alt="twitter logo"&gt;
      &lt;/div&gt;
    &lt;/div&gt;
    &lt;div class="ltag__twitter-tweet__body"&gt;
      Not quite 2 year later, but here’s a reference picture for what this usually looks like… 
    &lt;/div&gt;
    &lt;div class="ltag__twitter-tweet__date"&gt;
      03:47 AM - 24 Dec 2021
    &lt;/div&gt;


    &lt;div class="ltag__twitter-tweet__actions"&gt;
      &lt;a href="https://twitter.com/intent/tweet?in_reply_to=1474225181551828993" class="ltag__twitter-tweet__actions__button"&gt;
        &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--fFnoeFxk--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev.to/assets/twitter-reply-action-238fe0a37991706a6880ed13941c3efd6b371e4aefe288fe8e0db85250708bc4.svg" alt="Twitter reply action"&gt;
      &lt;/a&gt;
      &lt;a href="https://twitter.com/intent/retweet?tweet_id=1474225181551828993" class="ltag__twitter-tweet__actions__button"&gt;
        &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--k6dcrOn8--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev.to/assets/twitter-retweet-action-632c83532a4e7de573c5c08dbb090ee18b348b13e2793175fea914827bc42046.svg" alt="Twitter retweet action"&gt;
      &lt;/a&gt;
      &lt;a href="https://twitter.com/intent/like?tweet_id=1474225181551828993" class="ltag__twitter-tweet__actions__button"&gt;
        &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--SRQc9lOp--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev.to/assets/twitter-like-action-1ea89f4b87c7d37465b0eb78d51fcb7fe6c03a089805d7ea014ba71365be5171.svg" alt="Twitter like action"&gt;
      &lt;/a&gt;
    &lt;/div&gt;
  &lt;/div&gt;
&lt;/blockquote&gt;


&lt;p&gt;I’ve had a weather station for the house for probably 10 years or so, and around the time of these fires I started looking into air quality sensors as a way to identify when we shouldn’t be spending time outside.&lt;/p&gt;

&lt;p&gt;If I’m quite honest, needing to check whether the air outside is safe feels a little crazy, but I think it’s a wake-up call many of us might have needed to realise that this issue affects everyone, not just certain heavily industrialised areas of the world.&lt;/p&gt;

&lt;p&gt;On top of this, while the first photograph above shows it’s clearly unhealthy to head outside, air quality can reach levels dangerous for some people well before visibility drops that far.&lt;/p&gt;

&lt;p&gt;After a number of months of looking I was unable to find a good quality consumer air sensor in stock. The increased demand due to the following year’s North American and European fires, along with pandemic-driven shortages, meant I put this on the back-burner and only periodically checked for stock. One of my colleagues, Dave Glover, &lt;a href="https://github.com/gloveboxes/Raspberry-Pi-Python-Environment-Monitor-with-the-Pimoroni-Enviro-Air-Quality-PMS5003-Sensor"&gt;built his own hardware solution&lt;/a&gt;, but I wasn’t ready to go down that rabbit hole!!&lt;/p&gt;

&lt;p&gt;During a recent browsing session I happened across &lt;a href="https://www2.purpleair.com/"&gt;PurpleAir&lt;/a&gt;. Their devices are less consumer-orientated than I would have liked, and I was ideally wanting something battery-operated, but I also liked that they listed their device details, including that it has two laser-based particle sensors (&lt;a href="https://core-electronics.com.au/pm2-5-air-quality-sensor-and-breadboard-adapter-kit-pms5003.html"&gt;PMS5003&lt;/a&gt;) – coincidentally, the same sensor Dave used in his DIY project.&lt;/p&gt;

&lt;p&gt;Having a bit of a dig I could also see that they make data publicly available and that you can access it yourself as well. This is a win-win for me. Everyone gets the benefit of more readings and I get something I can build on top of.&lt;/p&gt;

&lt;p&gt;Despite the pricing (and the shipping 😪) I decided to take the plunge…&lt;/p&gt;


&lt;blockquote class="ltag__twitter-tweet"&gt;
      &lt;div class="ltag__twitter-tweet__media"&gt;
        &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--vp1brTQB--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://pbs.twimg.com/media/FDZJVonVQA4V9EE.jpg" alt="unknown tweet media content"&gt;
      &lt;/div&gt;

  &lt;div class="ltag__twitter-tweet__main"&gt;
    &lt;div class="ltag__twitter-tweet__header"&gt;
      &lt;img class="ltag__twitter-tweet__profile-image" src="https://res.cloudinary.com/practicaldev/image/fetch/s--HsIDR3-3--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://pbs.twimg.com/profile_images/1392320582545416192/KY33Oa5y_normal.jpg" alt="Simon Waight profile image"&gt;
      &lt;div class="ltag__twitter-tweet__full-name"&gt;
        Simon Waight
      &lt;/div&gt;
      &lt;div class="ltag__twitter-tweet__username"&gt;
        &lt;a class="mentioned-user" href="https://dev.to/simonwaight"&gt;@simonwaight&lt;/a&gt;
      &lt;/div&gt;
      &lt;div class="ltag__twitter-tweet__twitter-logo"&gt;
        &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ir1kO05j--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev.to/assets/twitter-f95605061196010f91e64806688390eb1a4dbc9e913682e043eb8b1e06ca484f.svg" alt="twitter logo"&gt;
      &lt;/div&gt;
    &lt;/div&gt;
    &lt;div class="ltag__twitter-tweet__body"&gt;
      Less a week to arrive! Now to get this mounted and sending AQI data to ⁦&lt;a href="https://twitter.com/ThePurpleAir"&gt;@ThePurpleAir&lt;/a&gt;⁩. This sensor has some heft to it! 
    &lt;/div&gt;
    &lt;div class="ltag__twitter-tweet__date"&gt;
      01:24 AM - 05 Nov 2021
    &lt;/div&gt;


    &lt;div class="ltag__twitter-tweet__actions"&gt;
      &lt;a href="https://twitter.com/intent/tweet?in_reply_to=1456432171988652032" class="ltag__twitter-tweet__actions__button"&gt;
        &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--fFnoeFxk--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev.to/assets/twitter-reply-action-238fe0a37991706a6880ed13941c3efd6b371e4aefe288fe8e0db85250708bc4.svg" alt="Twitter reply action"&gt;
      &lt;/a&gt;
      &lt;a href="https://twitter.com/intent/retweet?tweet_id=1456432171988652032" class="ltag__twitter-tweet__actions__button"&gt;
        &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--k6dcrOn8--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev.to/assets/twitter-retweet-action-632c83532a4e7de573c5c08dbb090ee18b348b13e2793175fea914827bc42046.svg" alt="Twitter retweet action"&gt;
      &lt;/a&gt;
      &lt;a href="https://twitter.com/intent/like?tweet_id=1456432171988652032" class="ltag__twitter-tweet__actions__button"&gt;
        &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--SRQc9lOp--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev.to/assets/twitter-like-action-1ea89f4b87c7d37465b0eb78d51fcb7fe6c03a089805d7ea014ba71365be5171.svg" alt="Twitter like action"&gt;
      &lt;/a&gt;
    &lt;/div&gt;
  &lt;/div&gt;
&lt;/blockquote&gt;


&lt;p&gt;After installation I went to check the &lt;a href="https://map.purpleair.com/"&gt;PurpleAir map&lt;/a&gt; for my device and was happily surprised to find that the New South Wales Government’s Department of Planning, Industry and Environment (DPIE) is using these sensors – &lt;a href="https://map.purpleair.com/1/mAUAQI/a10/p604800/cC6?select=98657#13.68/-33.79135/151.27582"&gt;this one in Manly&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--vHrIK5f5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2021/12/2021-12-24_15-59-55.png%3Fw%3D1100" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--vHrIK5f5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2021/12/2021-12-24_15-59-55.png%3Fw%3D1100" alt="Manly Sensor Reading" title="Manly Sensor Reading" width="371" height="620"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Understanding Air Quality Index (AQI) and measurements
&lt;/h3&gt;

&lt;blockquote&gt;
&lt;p&gt;👨‍🔬 Non-scientist alert 👨‍🔬 I’m going to put this disclaimer here. I’m not an air quality expert. Please take what follows with a grain of salt and do your own reading!&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;The aforementioned DPIE has a really good summary of air quality – if you want to &lt;a href="https://www.dpie.nsw.gov.au/air-quality/understanding-air-quality-data"&gt;go and have a read&lt;/a&gt;, I’d highly recommend it.&lt;/p&gt;

&lt;p&gt;The PMS5003 sensors are particle sensors, so for us, the two air quality properties we can measure will be:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Measure&lt;/th&gt;
&lt;th&gt;Period&lt;/th&gt;
&lt;th&gt;Units&lt;/th&gt;
&lt;th&gt;GOOD&lt;/th&gt;
&lt;th&gt;FAIR&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Particulate matter &amp;lt; 10 µm (PM10)&lt;/td&gt;
&lt;td&gt;1 hr&lt;/td&gt;
&lt;td&gt;µg/m&lt;sup&gt;3&lt;/sup&gt;
&lt;/td&gt;
&lt;td&gt;&amp;lt; 50&lt;/td&gt;
&lt;td&gt;50-100&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Particulate matter &amp;lt; 2.5 µm (PM2.5)&lt;/td&gt;
&lt;td&gt;1 hr&lt;/td&gt;
&lt;td&gt;µg/m&lt;sup&gt;3&lt;/sup&gt;
&lt;/td&gt;
&lt;td&gt;&amp;lt; 25&lt;/td&gt;
&lt;td&gt;25-50&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;The PMS sensors send through data for much finer particles than PM2.5, but the two measures above will be key for our final solution. We'll also use DPIE's Air Quality Categories (AQC) as our measure, under which anything other than "GOOD" can impact sensitive people. I haven't listed all the categories here, but we'll alert on each state change between categories (up and down).&lt;/p&gt;
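
&lt;p&gt;As a rough sketch (in C#, and only covering the two bands from the table above), the PM2.5 mapping we'll alert on looks like this:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// Maps a 1-hour average PM2.5 reading (µg/m3) to a DPIE-style
// Air Quality Category. Bands above FAIR are collapsed here.
public static string Pm25Category(double pm25)
{
    if (pm25 &amp;lt; 25) return "GOOD";
    if (pm25 &amp;lt;= 50) return "FAIR";
    return "POOR_OR_WORSE"; // DPIE defines further bands (Poor, Very poor, ...)
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;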

&lt;h3&gt;
  
  
  Oh, look, the PA-II supports Azure IoT!
&lt;/h3&gt;

&lt;p&gt;Well, yes, the registration screen includes Azure as a potential Data Processor, except… good luck getting it to work – it’s currently totally undocumented.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--C-CwQgpb--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2022/01/2021-12-16_22-10-04.png%3Fw%3D1024%26h%3D563" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--C-CwQgpb--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2022/01/2021-12-16_22-10-04.png%3Fw%3D1024%26h%3D563" alt="Configuring data processors for PA-II air sensor" title="Configuring data processors for PA-II air sensor" width="880" height="484"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Sadly I was unable to get it to work, and with the PurpleAir team currently shifting support platforms I wasn’t able to find out how.&lt;/p&gt;

&lt;p&gt;Which means…&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--fRgrPSmQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2022/01/6023hu.jpg%3Fw%3D1100" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--fRgrPSmQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2022/01/6023hu.jpg%3Fw%3D1100" alt="CODE ALL THE THINGS!" title="CODE ALL THE THINGS!" width="593" height="421"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  The proposed solution
&lt;/h3&gt;

&lt;p&gt;Like any self-respecting nerd I have various bits of tech to hand, with my Network Attached Storage (NAS) device from Synology being a really useful Swiss Army knife. The Synology NAS supports Docker containers and has been handy for hosting a bunch of things for me, so I thought it would make sense to reuse it for this solution.&lt;/p&gt;

&lt;p&gt;.NET 6 also recently shipped, and I wanted to have a play with the new &lt;a href="https://docs.microsoft.com/aspnet/core/fundamentals/minimal-apis?view=aspnetcore-6.0"&gt;Minimal API&lt;/a&gt; model that removes a lot of boilerplate code from your solutions. What a great scenario to use it for!&lt;/p&gt;
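
&lt;p&gt;To give a sense of how little boilerplate is involved, a Minimal API service boils down to something like this (the route and &lt;code&gt;SensorReading&lt;/code&gt; type here are placeholders rather than my actual gateway code):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

// A single POST endpoint bound straight to a JSON payload -
// no Startup.cs or controller scaffolding required.
app.MapPost("/sensorreading", (SensorReading reading) =&amp;gt;
{
    // forward the reading to Azure IoT Hub here
    return Results.Accepted();
});

app.Run();

// Minimal record type the JSON body binds to (illustrative fields).
record SensorReading(string SensorId, double Pm2_5Atm);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;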

&lt;p&gt;The diagram below shows the high level flow of the overall solution that will allow us to get our sensor data into Azure IoT Hub.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--KEaZ1AxT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2022/01/2022-01-05_12-17-16.png%3Fw%3D1024%26h%3D452" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--KEaZ1AxT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2022/01/2022-01-05_12-17-16.png%3Fw%3D1024%26h%3D452" alt="Proposed Solution Architecture" title="Proposed Solution Architecture" width="880" height="388"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Time to code!
&lt;/h4&gt;

&lt;p&gt;I chose to use Visual Studio Code to build my Gateway solution which is a standard ASP.NET Web API. This Gateway accepts HTTP POST requests from the PA-II sensor and parses the JSON payload which is then sent to an Azure IoT Hub using the &lt;a href="https://docs.microsoft.com/dotnet/api/microsoft.azure.devices.client?view=azure-dotnet"&gt;IoT Device C# SDK&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;You can find the final API solution &lt;a href="https://github.com/sjwaight/AirQualityAzureIoTGateway"&gt;on GitHub&lt;/a&gt;. You will see it includes a Dockerfile which is then used by the associated GitHub Action to build the solution and publish the resulting image to &lt;a href="https://hub.docker.com/r/sjwaight/airsensorgateway"&gt;Docker Hub&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;All the logic to handle the data from the sensor lives in the &lt;code&gt;SensorReadingController.cs&lt;/code&gt; file which contains a single method to handle the HTTP POST request (&lt;a href="https://github.com/sjwaight/AirQualityAzureIoTGateway/blob/main/Controllers/SensorReadingController.cs"&gt;view it on GitHub&lt;/a&gt;). I created a C# class to model the JSON payload from the sensor which also makes the code a bit easier to read!&lt;/p&gt;

&lt;p&gt;Given I have a closed network with a strong security setup I probably could have left this Web API open to any caller… but we’ve all seen horror stories of security breaches of all types happening from insecure endpoints, so I thought it best to at least perform some basic client validation (&lt;a href="https://github.com/sjwaight/AirQualityAzureIoTGateway/blob/main/Controllers/SensorReadingController.cs#L46-L48"&gt;see the lines that perform the check&lt;/a&gt;) before allowing the request to be processed. I know this wouldn’t stop a motivated attacker, but hopefully it’d be enough in most cases!! Yes, I could go a lot further, probably right down into the guts of ASP.NET Authentication… but for my use case this will suffice.&lt;/p&gt;
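
&lt;p&gt;As a condensed sketch of the shape (the property names, environment variable names and validation field here are indicative – the real model and checks are in the repo linked above):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;using System;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.Devices.Client;

// Indicative subset of the PA-II JSON payload - the real model has many more fields.
public class SensorReading
{
    public string SensorId { get; set; }
    public double Pm2_5Atm { get; set; }
}

[ApiController]
[Route("[controller]")]
public class SensorReadingController : ControllerBase
{
    // Environment variable names are illustrative - see the repo readme
    // for the actual runtime configuration.
    private static readonly DeviceClient deviceClient =
        DeviceClient.CreateFromConnectionString(
            Environment.GetEnvironmentVariable("IOTHUB_CONNECTION_STRING"));

    private readonly string expectedSensorId =
        Environment.GetEnvironmentVariable("EXPECTED_SENSOR_ID") ?? "";

    [HttpPost]
    public async Task&amp;lt;IActionResult&amp;gt; Post(SensorReading reading)
    {
        // Basic client validation: drop requests that don't come from our sensor.
        if (reading.SensorId != expectedSensorId)
        {
            return Unauthorized();
        }

        // Relay the reading to Azure IoT Hub via the Device SDK.
        var payload = JsonSerializer.Serialize(reading);
        await deviceClient.SendEventAsync(new Message(Encoding.UTF8.GetBytes(payload)));
        return Ok();
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;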

&lt;h3&gt;
  
  
  Deploying the gateway
&lt;/h3&gt;

&lt;p&gt;I’m using Docker Hub to host my Container Image as it is the default Container Repository that Synology’s Docker setup uses, which makes it easy for me to pull the resulting Container Image to my NAS (it’s also free, which is handy!)&lt;/p&gt;

&lt;p&gt;On my NAS I open up the Docker application and search for my custom gateway Image on Docker Hub. Once found I can then select the Image and Download it.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--498Cs_gb--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2022/01/2022-01-05_12-57-50.png%3Fw%3D1024%26h%3D402" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--498Cs_gb--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2022/01/2022-01-05_12-57-50.png%3Fw%3D1024%26h%3D402" alt="Docker image search on Synology NAS" title="Docker image search on Synology NAS" width="880" height="346"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once the image is downloaded I then need to launch an instance of it, so I switch over to Image, select my downloaded Image and click Launch.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--uDhzlYEy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2022/01/2022-01-05_13-00-52.png%3Fw%3D1100" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--uDhzlYEy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2022/01/2022-01-05_13-00-52.png%3Fw%3D1100" alt="Launch Docker Image on Synology" title="Launch Docker Image on Synology" width="880" height="474"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In order for my API to run I need to configure some environment variables and also define the TCP port I want to expose the Web API on. I do this by selecting Advanced Settings on the Launch dialog.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--yVmGWN3L--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2022/01/2022-01-05_13-01-50.png%3Fw%3D1024%26h%3D735" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--yVmGWN3L--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2022/01/2022-01-05_13-01-50.png%3Fw%3D1024%26h%3D735" alt="Advanced Settings on Launch dialog" title="Advanced Settings on Launch dialog" width="880" height="631"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Specify the port mapping…&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--DzgXJ3tU--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2022/01/2022-01-05_13-06-27.png%3Fw%3D1100" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--DzgXJ3tU--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2022/01/2022-01-05_13-06-27.png%3Fw%3D1100" alt="Container Port mapping" title="Container Port mapping" width="880" height="251"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;and then set the four required environment variables as &lt;a href="https://github.com/sjwaight/AirQualityAzureIoTGateway#runtime-configuration"&gt;detailed in the readme&lt;/a&gt; on the GitHub repository.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--bpXdegXi--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2022/01/2022-01-05_13-05-40.png%3Fw%3D1100" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--bpXdegXi--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2022/01/2022-01-05_13-05-40.png%3Fw%3D1100" alt="Set environment variables for the container" title="Set environment variables for the container" width="880" height="503"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once configured, you can start the gateway Container; you should see confirmation in the logs that it has started.&lt;/p&gt;
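&lt;p&gt;If you prefer the command line to the Synology UI, the equivalent &lt;code&gt;docker run&lt;/code&gt; command looks something like the sketch below. The image name, ports and environment variable names here are placeholders only – substitute the four variables from the readme.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Sketch only: image name, ports and variable names are illustrative
docker run -d \
  --name airquality-gateway \
  -p 5000:80 \
  -e SETTING_ONE="value-one" \
  -e SETTING_TWO="value-two" \
  yourdockerid/airquality-gateway:latest
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;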

&lt;p&gt;Next we need to update our PA-II device registration so that it calls our newly deployed API Gateway, so let’s go ahead and configure that on the PurpleAir registration site.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s---3JotEMn--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2022/01/2022-01-05_13-09-59.png%3Fw%3D1024%26h%3D557" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s---3JotEMn--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2022/01/2022-01-05_13-09-59.png%3Fw%3D1024%26h%3D557" alt="PurpleAir device registration with new gateway" title="PurpleAir device registration with new gateway" width="880" height="479"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once the registration is saved we should see a call to the Gateway API every two minutes.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--3nBhibr9--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2022/01/2022-01-05_14-48-31.png%3Fw%3D1024%26h%3D199" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--3nBhibr9--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2022/01/2022-01-05_14-48-31.png%3Fw%3D1024%26h%3D199" alt="Container Logs" title="Container Logs" width="880" height="171"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;As a final check we can switch over to our Azure IoT Hub Overview tab and see that events are arriving as expected.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--K_5gLKIX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2022/01/2022-01-05_14-24-35.png%3Fw%3D1024%26h%3D644" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--K_5gLKIX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2022/01/2022-01-05_14-24-35.png%3Fw%3D1024%26h%3D644" alt="Azure IoT Hub Overview screen" title="Azure IoT Hub Overview screen" width="880" height="553"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  So, what now?
&lt;/h3&gt;

&lt;p&gt;At this stage we are now delivering filtered events from our local sensor to an Azure IoT Hub instance, but this is only a part of our overall solution.&lt;/p&gt;

&lt;p&gt;The IoT Hub instance will hold our event data for up to 7 days before it expires. We could choose to route the events to an Azure Storage Account or other endpoint, but for the purpose of this blog series I am simply going to leave them sitting in IoT Hub until we are ready to build the next stage of our solution. This way I am avoiding incurring additional costs until I’m ready to develop and deploy the cloud processing and storage part of my solution.&lt;/p&gt;

&lt;p&gt;Hopefully this has been an insightful post which shows you how you can quickly take locally generated data and push it into Azure for additional processing.&lt;/p&gt;

&lt;p&gt;Until the next post! 😎&lt;/p&gt;

</description>
      <category>dotnet</category>
      <category>aspnet</category>
      <category>azure</category>
      <category>iot</category>
    </item>
    <item>
      <title>Setting Helm Chart version and appVersion properties during CI/CD with GitHub Actions</title>
      <dc:creator>Simon Waight</dc:creator>
      <pubDate>Tue, 14 Dec 2021 01:29:08 +0000</pubDate>
      <link>https://forem.com/simonwaight/setting-helm-chart-version-and-appversion-properties-during-cicd-with-github-actions-588h</link>
      <guid>https://forem.com/simonwaight/setting-helm-chart-version-and-appversion-properties-during-cicd-with-github-actions-588h</guid>
      <description>&lt;p&gt;The release of Helm 3.7 sees some major changes to the way Helm behaves and the commands you work with. In addition to this, stricter adherence to &lt;a href="https://semver.org/"&gt;Semantic Versioning (semver)&lt;/a&gt; can be observed for both Chart and Application versioning.&lt;/p&gt;

&lt;p&gt;In this post I am going to look at one way you can simplify setting the &lt;code&gt;version&lt;/code&gt; and &lt;code&gt;appVersion&lt;/code&gt; values for your Helm Charts whilst ensuring you meet the semver 2 requirements.&lt;/p&gt;

&lt;h4&gt;
  
  
  TL;DR show me the code!
&lt;/h4&gt;

&lt;p&gt;If you’re here simply looking for the full solution then head on over to the &lt;a href="https://github.com/sjwaight/helm-version-demo/blob/main/.github/workflows/content-web.yml"&gt;sample GitHub Action workflow&lt;/a&gt; and check it out!&lt;/p&gt;

&lt;p&gt;Read on if you want to understand the approach.&lt;/p&gt;

&lt;h3&gt;
  
  
  Understanding version and appVersion
&lt;/h3&gt;

&lt;p&gt;The first thing we need to understand is why there appear to be two fields in a Helm Chart that look to do the same thing. The &lt;a href="https://helm.sh/docs/topics/charts/#the-chartyaml-file"&gt;documentation of the Chart.yaml&lt;/a&gt; file format is great, but I’ll summarise here as well and dig a bit more into each.&lt;/p&gt;
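&lt;p&gt;Both properties live in the Chart.yaml file. A minimal example (names and version numbers are illustrative only) looks like this:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;apiVersion: v2
name: web
description: A sample Helm Chart
version: 0.1.0      # the version of the Chart itself
appVersion: "1.0.0" # the version of the application the Chart deploys
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;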

&lt;h4&gt;
  
  
  version
&lt;/h4&gt;

&lt;p&gt;Quite simply &lt;code&gt;version&lt;/code&gt; represents the version of the Helm Chart you are deploying. This is &lt;em&gt;not&lt;/em&gt; the same as the version of the application you are deploying with the Chart. Charts define application components &lt;em&gt;AND&lt;/em&gt; their configuration. If the configuration changes then the Chart &lt;code&gt;version&lt;/code&gt; property should be updated. Updating an application component such as a Container image doesn’t necessarily mean there is a configuration change.&lt;/p&gt;

&lt;p&gt;The only way to set this value is to update it in the Chart.yaml file. This presents some challenges when automating build and deployment – either you need to commit an updated Chart.yaml file &lt;em&gt;or&lt;/em&gt; write some scripting to update / create the Chart.yaml file.&lt;/p&gt;

&lt;h4&gt;
  
  
  appVersion
&lt;/h4&gt;

&lt;p&gt;In many cases &lt;code&gt;appVersion&lt;/code&gt; will be the property you want to update on every deployment as it represents the version of the application component(s) you are deploying. This might simply be an updated Container image (or set of images).&lt;/p&gt;

&lt;p&gt;The &lt;code&gt;appVersion&lt;/code&gt; can be set either by manually updating the Chart.yaml file (similar to updating &lt;code&gt;version&lt;/code&gt;) or by overriding it at packaging time by supplying the &lt;code&gt;--app-version&lt;/code&gt; argument as shown below.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
helm package . --app-version 1.0.0

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Great, and the problem is?
&lt;/h3&gt;

&lt;p&gt;I’m sure many would argue there is no problem here, but I do see a problem. You can set the version of your application regardless of the value contained in the Chart.yaml file, yet you are unable to do the same with the Chart version.&lt;/p&gt;

&lt;p&gt;Yes, I could manually update the Chart.yaml file to denote a version change (perhaps as part of a Pull Request), but why should I have to edit a file when all I need to do is change one value? It introduces the chance of human error, and it also requires people to have more knowledge of the Helm ecosystem than many may ever need. Alternatively, you could argue this approach ensures the value is only updated when really necessary… but it still strikes me as odd!&lt;/p&gt;

&lt;h3&gt;
  
  
  Here’s my approach
&lt;/h3&gt;

&lt;p&gt;For this demo I am going to use GitHub Actions as my CI/CD platform, but this approach will work on any platform. I’ll also use Azure Container Registry (ACR) as my &lt;a href="https://docs.microsoft.com/azure/container-registry/container-registry-helm-repos"&gt;Image and Chart repository&lt;/a&gt;, and deploy the resulting Container and Chart to Azure Kubernetes Service (AKS). While the ACR and AKS steps aren’t necessary, they do form part of an end-to-end process most people are likely to use.&lt;/p&gt;

&lt;p&gt;I’m using the &lt;a href="https://github.com/sjwaight/helm-version-demo/"&gt;sample repository&lt;/a&gt; for this demo, but you can use any Helm chart you have previously scaffolded with &lt;code&gt;helm create&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Start by editing the Chart.yaml file and setting the &lt;code&gt;version&lt;/code&gt; to a known placeholder value that you can match in a script. You can also set a value for the &lt;code&gt;appVersion&lt;/code&gt;, but this will be overridden at the command line as we will see.&lt;/p&gt;
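&lt;p&gt;The sample repository uses &lt;code&gt;0.0.0&lt;/code&gt; as this placeholder:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;version: 0.0.0
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;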

&lt;p&gt;Next, in your GitHub Action workflow define two environment variables &lt;code&gt;chartVersion&lt;/code&gt; and &lt;code&gt;appMajorMinorVersion&lt;/code&gt; as shown below.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;env:
  chartVersion: 0.1.0
  appMajorMinorVersion: 0.1
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;I have set a full semantic version for the Chart and only the major and minor versions for the application. I want full centralised control for the Chart version and the approach I have chosen means I do need to update my GitHub Action workflow if I want to change the Chart version, but I can live with that (at least it’s not hidden in a source code file somewhere in the repo!)&lt;/p&gt;

&lt;p&gt;Hopefully in future we’ll have a plain text variable store for GitHub Actions, which means I can define the value elsewhere and just reference it in the workflow file. I don’t want to use secrets because I want these values visible to anyone who looks at this workflow.&lt;/p&gt;

&lt;p&gt;When we want to package our Chart we need to do a few additional steps. As our workflow runs on a Linux host we can utilise Linux commands to help. In this case we’re going to use &lt;a href="https://www.redhat.com/sysadmin/command-basics-printf"&gt;printf&lt;/a&gt; and &lt;a href="https://www.howtogeek.com/666395/how-to-use-the-sed-command-on-linux/"&gt;sed&lt;/a&gt; to format and replace the &lt;code&gt;version&lt;/code&gt; placeholder.&lt;/p&gt;

&lt;p&gt;In the below workflow step we start by escaping the dots in the &lt;code&gt;chartVersion&lt;/code&gt; environment variable. We do this so the &lt;code&gt;sed&lt;/code&gt; replacement on the next line works as expected.&lt;/p&gt;

&lt;p&gt;Using this escaped value, which is held in &lt;code&gt;$escaped_version&lt;/code&gt;, we then use the &lt;code&gt;sed&lt;/code&gt; command to replace the &lt;code&gt;version&lt;/code&gt; placeholder in the Chart.yaml file.&lt;/p&gt;

&lt;p&gt;The final two commands package up the Helm Chart, specifying the &lt;code&gt;appVersion&lt;/code&gt; at the command line, and then push the Chart to a remote repository (the sample uses Azure Container Registry).&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
    - name: Helm Chart Update, Package and Push
      run: |
        cd ./content-web/charts/web
        escaped_version=$(printf '%s\n' "${{ env.chartVersion }}" | sed -e 's/[\/.]/\\./g')
        sed -i "s/version\: 0\.0\.0/version\: $escaped_version/" Chart.yaml

        helm package . --app-version ${{ env.appMajorMinorVersion }}.${{ env.tag }}
        helm push web-${{ env.chartVersion }}.tgz oci://${{ env.containerRegistry }}/helm

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;As a result of this step we can now control the &lt;code&gt;version&lt;/code&gt; and &lt;code&gt;appVersion&lt;/code&gt; properties of our Chart.&lt;/p&gt;

&lt;h3&gt;
  
  
  More control
&lt;/h3&gt;

&lt;p&gt;The thing I do like about my approach is that should anyone tamper with the Chart.yaml file and change the &lt;code&gt;version&lt;/code&gt; value, the build will break and I avoid downstream issues as a result. As &lt;code&gt;appVersion&lt;/code&gt; is overridden I can also be confident that any change made to the file will be ignored.&lt;/p&gt;

&lt;p&gt;Having said this, the approach I am using is just one way to achieve this outcome, and it’s also important to note that it doesn’t suit every use case (and maybe not yours!)&lt;/p&gt;

&lt;p&gt;It might be that you would break the GitHub Action workflow into multiple separate workflows in order to give you better control over when the two Helm Chart properties are updated. Also, the sample workflow uses a fairly basic process – it assumes every invocation should build a new Container Image, build a new Helm Chart and deploy to AKS. Clearly that may not be ideal, so at a minimum you’d probably want to add Approvals or another mechanism that ensures you only build a new Chart when necessary, and only deploy the Chart when you want.&lt;/p&gt;

&lt;p&gt;Hopefully you’ve picked up some useful information on working with Helm from this post, and it will help you be successful in packaging and releasing applications using Helm. If you come up with a different or better way feel free to leave a comment!&lt;/p&gt;

&lt;p&gt;Happy days! 😎&lt;/p&gt;

&lt;p&gt;P.S. You can check out the &lt;a href="https://github.com/sjwaight/helm-version-demo/"&gt;GitHub repository&lt;/a&gt; with the sample workflow.&lt;/p&gt;

</description>
      <category>azure</category>
      <category>ci</category>
      <category>containers</category>
      <category>github</category>
    </item>
    <item>
      <title>How to port AWS Serverless solutions to Microsoft Azure</title>
      <dc:creator>Simon Waight</dc:creator>
      <pubDate>Mon, 29 Nov 2021 22:30:38 +0000</pubDate>
      <link>https://forem.com/simonwaight/how-to-port-aws-serverless-solutions-to-microsoft-azure-56nb</link>
      <guid>https://forem.com/simonwaight/how-to-port-aws-serverless-solutions-to-microsoft-azure-56nb</guid>
      <description>&lt;p&gt;Avoid Vendor Lock-in.&lt;/p&gt;

&lt;p&gt;Three words. I wonder how many hours (days, weeks, months?) have been lost to designing and building software solutions and systems to the lowest common denominator simply to avoid the perceived risk of betting on the wrong platform?&lt;/p&gt;

&lt;p&gt;While risk mitigation and extensibility should form a part of any design, I’ve believed for a while now, especially since the rise of modern cloud platforms, that the need to build abstractions in solutions simply to avoid perceived future risk is a massive sinkhole and a large inhibitor to innovation, especially in the enterprise.&lt;/p&gt;

&lt;p&gt;Imagine, if you will, designing and building a solution that makes direct use of SDKs and platform features provided by a cloud vendor. Compare this to building the same solution, but instead of using the SDKs or features directly you first wrap them in abstractions. It’s easy to see the amount of additional effort involved in the second scenario.&lt;/p&gt;

&lt;p&gt;If you consider the amount of non-functional development work required in scenario two, its impact on delivery time and cost, and the potential reduction in the overall functional capabilities of the resulting solution, you can see the impact can be fairly large. Also consider this in the context that at some future point (let’s say 3 years from now) you &lt;em&gt;might&lt;/em&gt; need to move to a different platform.&lt;/p&gt;

&lt;p&gt;Only an absolute minority of businesses might require true multi-cloud or hybrid cloud solutions (👋 Kubernetes), but I can tell you nobody at Netflix has lost sleep over their bet on a single cloud vendor.&lt;/p&gt;

&lt;p&gt;I firmly believe that you can, in many cases, push re-platforming risk into the future. Any cloud migration will invariably be a multi-step process that involves infrastructure, data and application code. While you don’t want to rewrite your entire application codebase, if all you need to do is rewrite code that is tightly coupled to a cloud API then your overall effort is relatively small, and can likely be factored into the costs of migration.&lt;/p&gt;

&lt;p&gt;Anyhow, enough of my opinions, let’s look at a practical example.&lt;/p&gt;

&lt;h3&gt;
  
  
  List Manager Event-driven sample
&lt;/h3&gt;

&lt;p&gt;AWS has a few sample Lambda applications on their documentation site, and I thought I would find the most intricate one and use it as the basis for the demo code to go along with this blog. To that end I chose the &lt;a href="https://docs.aws.amazon.com/lambda/latest/dg/samples-listmanager.html"&gt;List manager sample application for AWS Lambda&lt;/a&gt; as my preferred demo.&lt;/p&gt;

&lt;p&gt;I also wanted to select a Lambda solution because it’s the most tightly bound to the cloud platform it is running on. Almost every aspect of this sample application relies on a piece of AWS that is available on AWS only – the two items that are not are the Node.js code containing the business logic along with the RDS MySQL instance which is just a managed MySQL service.&lt;/p&gt;

&lt;h3&gt;
  
  
  Switching to Azure
&lt;/h3&gt;

&lt;p&gt;If we’re starting with serverless, we might as well target serverless! In most inter-cloud migrations I’d expect this type of mapping to occur given the level of equivalence in services between vendors – lock-in is really not a factor any more.&lt;/p&gt;

&lt;p&gt;Yes, we’ll have to change some SDKs and libraries that determine how we interact with as-a-Service offerings, but at the core there is not much business logic that needs to change.&lt;/p&gt;

&lt;p&gt;If you simply want to see the source code you can &lt;a href="https://github.com/sjwaight/azure-list-manager-sample"&gt;check it out on GitHub&lt;/a&gt;; otherwise read on for how the migration was completed.&lt;/p&gt;

&lt;p&gt;Here’s a high level architecture diagram to give you a feel for what we’re aiming for.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--li9FFNSJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2021/11/2021-10-29_14-03-48.png%3Fw%3D1100" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--li9FFNSJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2021/11/2021-10-29_14-03-48.png%3Fw%3D1100" alt="Azure List Manager Sample" title="Azure List Manager Sample" width="800" height="429"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  The porting process
&lt;/h4&gt;

&lt;p&gt;The only bits we really need to keep from the original are the Node.js logic implementations and the format and data for the events driving the solution. Everything else is just supporting infrastructure which we can replace.&lt;/p&gt;

&lt;p&gt;The original sample uses private networking to connect everything except the event source, so we can easily rebuild the infrastructure from scratch in Azure. However, rather than starting with this, as I will assume it’s just a secure pipe between PaaS in Azure, I’m more interested in porting the Node.js and event source code so I can have an end-to-end test that allows me to validate my Azure implementation.&lt;/p&gt;

&lt;p&gt;At this point I could easily have run the &lt;a href="https://docs.microsoft.com/en-us/azure/cosmos-db/local-emulator"&gt;Cosmos DB emulator&lt;/a&gt; and a &lt;a href="https://dev.mysql.com/downloads/mysql/"&gt;MySQL Server&lt;/a&gt; on my developer box to act as surrogates for their Azure equivalents and developed the &lt;a href="https://docs.microsoft.com/en-us/azure/azure-functions/functions-develop-vs-code?tabs=nodejs"&gt;Azure Functions using Visual Studio Code&lt;/a&gt; and the Azure Core Functions tooling.&lt;/p&gt;

&lt;p&gt;However, I didn’t want to set up MySQL locally, and running it all in Azure is just handier 🙂 – especially as I’d still need to use an actual &lt;a href="https://docs.microsoft.com/azure/event-hubs/event-hubs-about"&gt;Azure Event Hub&lt;/a&gt; for my testing. I did develop the Functions locally and then connected them directly to Azure as required. If I’d spun up the development infrastructure to use private networking I’d need to do this development on a Virtual Machine running on the same network – something that just felt unnecessary at this point.&lt;/p&gt;

&lt;p&gt;So, with all these pieces in place I began working on the Node.js Functions and converted them to use Azure. The ‘dbadmin’ Function really needed a change as the source sample allows any arbitrary SQL query to be sent to this Function – something that strikes me as massively dangerous. I solved this by hardcoding the SQL that can be used and controlling any variables via only Azure Functions or Azure Key Vault configuration values that are only ever used on the server and never exposed to any client.&lt;/p&gt;

&lt;p&gt;The ‘processor’ Function was a little more work, not helped by me not being a massively experienced Node developer, and the need to update code to better handle Node Promises, as well as the differences between Cosmos DB and DynamoDB. From what I can see it looks like DynamoDB accepts and emits JSON as simple strings, whereas Cosmos DB (or its Node SDK) readily accepts and emits JSON objects ready for code to consume. Additionally, Cosmos DB has the idea of an ‘upsert’, which does seem to exist in DynamoDB, but is not used in the source application codebase at all.&lt;/p&gt;

&lt;p&gt;The final difference between Cosmos DB and DynamoDB is the concept of a Table. Cosmos DB doesn’t have a Table construct, and the anti-pattern a lot of first-time Cosmos DB users fall into is to map a Table (typically from a relational data model world) into a &lt;a href="https://docs.microsoft.com/azure/cosmos-db/account-databases-containers-items#azure-cosmos-containers"&gt;Cosmos DB Container&lt;/a&gt;. At first glance this looks like a good fit, but it’s actually likely to cause you pain (mostly around cost, but potentially also performance).&lt;/p&gt;

&lt;p&gt;Cosmos DB stores JSON documents, and each document can have a different set of properties. While you are required to meet a few basic rules, including having an identity field with high cardinality (basically, the field should have a lot of different possible values), you can otherwise store multiple different document types in one Container, which is what I’ve done with this sample.&lt;/p&gt;
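&lt;p&gt;To make this concrete, here is a minimal sketch of upserting two different document types into a single Container. I’ve used Python with hypothetical account, database and container names – the actual sample is written in Node.js, so treat this as illustrative only.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;from azure.cosmos import CosmosClient

# Hypothetical endpoint, key, database and container names
client = CosmosClient("https://myaccount.documents.azure.com:443/",
                      credential="my-account-key")
container = client.get_database_client("listdb").get_container_client("items")

# Two different document shapes in the one Container, distinguished by a
# 'type' property; 'id' is the high-cardinality identity field
container.upsert_item({"id": "list-42", "type": "list", "name": "Groceries"})
container.upsert_item({"id": "entry-7", "type": "entry",
                       "listId": "list-42", "text": "Milk"})
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;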

&lt;p&gt;Once I had everything working end-to-end with this hybrid development setup it was then time to build out the final infrastructure deployment which requires private networking.&lt;/p&gt;

&lt;p&gt;The easiest way I find to achieve infrastructure modelling is to use the Azure Portal. This is because it hides a lot of complexity in the underlying REST APIs being used to provision the infrastructure, and it also provides me with an easy way to discover required attributes for infrastructure I may not have worked with previously. There were a few items in this demo that I knew of, but hadn’t used, so this worked out perfectly. I’d also note I didn’t look at the existing CloudFormation definitions and try to convert those. What I actually did was use the documentation and diagram on the source GitHub repository as my guide – proving to me that solid existing documentation is key to any migration!&lt;/p&gt;

&lt;p&gt;Now with the baseline environment in place I &lt;a href="https://docs.microsoft.com/azure/azure-resource-manager/templates/export-template-portal"&gt;exported the deployment to an ARM template&lt;/a&gt; so I could create a reusable template to go into my GitHub sample. I’ve been meaning to learn Bicep for a while now, so I took the opportunity to decompile the ARM template into Bicep which did a pretty good job. I suspect the &lt;a href="https://github.com/sjwaight/azure-list-manager-sample/blob/main/infra-deploy/deploy.bicep"&gt;resulting Bicep definition&lt;/a&gt; is overly verbose because it contains a lot of properties with default values that could be removed, but I’m pleasantly surprised at how I could do things like define Storage Account Containers, Cosmos DB Databases and Containers, Key Vault secrets, and a whole range of other things that used to be manual steps. It took me a little while to convert the Bicep into a re-usable template, but I’m pretty happy with the result which also now supports comments!&lt;/p&gt;

&lt;p&gt;Once I had the infrastructure deployed I used the &lt;a href="https://docs.microsoft.com/azure/azure-functions/functions-continuous-deployment"&gt;Azure Functions Deployment Center&lt;/a&gt; to setup a GitHub Action for my deployment. This simple process creates the necessary Secret in GitHub to enable the deployment and also results in a GitHub Action workflow embedded in my sample repository which can easily be forked and cloned as required by others.&lt;/p&gt;

&lt;p&gt;Finally, I could sit down and document the sample both on GitHub and via this blog.. so here we are! 😎&lt;/p&gt;

&lt;p&gt;Rather than wrap shell scripts around all the commands needed for the sample I’ve tried to expose the raw Azure CLI commands you’d need to use. They aren’t actually that complex and really show the level of consistency the Azure CLI has and how straightforward it is to work with!&lt;/p&gt;

&lt;h3&gt;
  
  
  Future bets
&lt;/h3&gt;

&lt;p&gt;So if you’ve stuck it out this far… thanks for that!&lt;/p&gt;

&lt;p&gt;The process for this blog has been a fun one, but I do think there will be a better way forward with at least some aspects of these types of solutions in future.&lt;/p&gt;

&lt;p&gt;Containerisation and Kubernetes offer great platform-agnostic ways of hosting solution components, even if not all solution components live within a portable environment. This also doesn’t mean you need to start with this as an option either!&lt;/p&gt;

&lt;p&gt;The Azure Functions runtime has been portable since its v2 release and can be run on Azure, on-premises or anywhere you can run a Container. Additionally, I see the work Microsoft is doing around Kubernetes Extensions such as the &lt;a href="https://docs.microsoft.com/azure/app-service/overview-arc-integration"&gt;Azure App Services on Kubernetes&lt;/a&gt; and I start to see ways for people to have confidence that they can build solution components once and then have flexibility on where they host them, without needing to do a huge amount of porting work.&lt;/p&gt;

&lt;p&gt;Broadening cloud hosted services to provide access to popular technologies like MySQL, Postgres, MongoDB, Redis, Grafana, et al, also means over time you will have confidence that you can truly take your solutions anywhere!&lt;/p&gt;

&lt;p&gt;Happy Days! 😎&lt;/p&gt;

&lt;p&gt;P.S. You can &lt;a href="https://github.com/sjwaight/azure-list-manager-sample"&gt;check out the List Manager Azure sample on GitHub&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>azure</category>
      <category>cosmosdb</category>
      <category>nosql</category>
    </item>
    <item>
      <title>How to Build and Debug an Alexa Skill with Python and Azure Functions</title>
      <dc:creator>Simon Waight</dc:creator>
      <pubDate>Mon, 30 Aug 2021 23:30:52 +0000</pubDate>
      <link>https://forem.com/simonwaight/how-to-build-and-debug-an-alexa-skill-with-python-and-azure-functions-ke0</link>
      <guid>https://forem.com/simonwaight/how-to-build-and-debug-an-alexa-skill-with-python-and-azure-functions-ke0</guid>
      <description>&lt;p&gt;Voice assistants have become all the rage, and they provide a great way to access and consume information. A fairly common scenario, and one that most assistants ship with, is reading the latest news headlines.&lt;/p&gt;

&lt;p&gt;This got me thinking about how I could take the foundation of &lt;a href="https://blog.siliconvalve.com/2021/07/27/generate-a-powerpoint-file-using-azure-functions-and-python/" rel="noopener noreferrer"&gt;my last post&lt;/a&gt; on generating a PowerPoint presentation from the latest Azure Updates RSS feed and use it to create a way for people to hear these updates via a voice assistant.&lt;/p&gt;

&lt;p&gt;The assistants most people would be familiar with are Amazon Alexa, Google Assistant and Apple’s Siri. For this post I am going to pick Amazon’s Alexa assistant and build a Skill that allows Alexa users to hear the latest Azure news.&lt;/p&gt;

&lt;p&gt;Most code samples in the Alexa documentation simply use AWS Lambda serverless backends, but for my post I am going to replace Lambda with its equivalent in Azure – Azure Functions. We’ll be using Python to build our Skill.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Note:&lt;/strong&gt; I opted to go with a bare-bones implementation on Azure, but if you’re looking to build a more complex user experience it’s worth looking at the &lt;a href="https://docs.microsoft.com/azure/bot-service/bot-service-quickstart-create-bot?view=azure-bot-service-4.0&amp;amp;tabs=python%2Cvs" rel="noopener noreferrer"&gt;Azure Bot Service and SDK&lt;/a&gt; and then &lt;a href="https://docs.microsoft.com/azure/bot-service/bot-service-channel-connect-alexa?view=azure-bot-service-4.0" rel="noopener noreferrer"&gt;connecting the resulting Bot to Alexa&lt;/a&gt;. The benefit of this route is you can build your Bot logic and flow once and then publish to multiple channels.&lt;/p&gt;

&lt;h3&gt;
  
  
  TL;DR… show me the code!
&lt;/h3&gt;

&lt;p&gt;Maybe you’re looking for a complete non-Lambda Alexa Skill sample in Python and struggling to find anything in the official docs or repositories. If all you want is the code from this blog, you can &lt;a href="https://github.com/sjwaight/AzureNewsOnAlexa" rel="noopener noreferrer"&gt;find it on GitHub&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Want to know how to get it all working as well? Read on!&lt;/p&gt;

&lt;h3&gt;
  
  
  Pre-requisites
&lt;/h3&gt;

&lt;p&gt;If you’d like to build your own Alexa Skill similar to the one in this post, this is what you’ll need:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;a href="https://www.python.org/downloads/" rel="noopener noreferrer"&gt;Python 3&lt;/a&gt; (I used 3.8) including pip&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://code.visualstudio.com/Download" rel="noopener noreferrer"&gt;Visual Studio Code&lt;/a&gt; with these extensions:

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://marketplace.visualstudio.com/items?itemName=ms-azuretools.vscode-azurefunctions" rel="noopener noreferrer"&gt;Azure Functions&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://marketplace.visualstudio.com/items?itemName=ms-python.python" rel="noopener noreferrer"&gt;Python from Microsoft&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://marketplace.visualstudio.com/items?itemName=philnash.ngrok-for-vscode" rel="noopener noreferrer"&gt;ngrok for VSCode&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;An &lt;a href="https://developer.amazon.com/en-US/docs/alexa/ask-overviews/create-developer-account.html" rel="noopener noreferrer"&gt;Amazon Alexa Developer Account&lt;/a&gt;.&lt;/li&gt;

&lt;/ul&gt;

&lt;h3&gt;
  
  
  Getting Started
&lt;/h3&gt;

&lt;p&gt;Once you have all the pre-requisites in place you can start by creating a new Python-based Azure Functions project in Visual Studio Code using the &lt;a href="https://docs.microsoft.com/azure/azure-functions/create-first-function-vs-code-python" rel="noopener noreferrer"&gt;official Microsoft Quickstart&lt;/a&gt; as a guide. You need an HTTP trigger (which the Quickstart uses) and should set the authentication for the method to “Anonymous”.&lt;/p&gt;
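&lt;p&gt;Assuming the Quickstart defaults, the resulting function.json bindings will look something like the below – the key line is the &lt;code&gt;authLevel&lt;/code&gt; setting:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "authLevel": "anonymous",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": [ "get", "post" ]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "$return"
    }
  ]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;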

&lt;p&gt;If you run the Azure Function locally you will see the URL published in a format similar to:&lt;/p&gt;

&lt;p&gt;&lt;a href="http://localhost:7071/api/YourFunctionName" rel="noopener noreferrer"&gt;http://localhost:7071/api/YourFunctionName&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;At this point we are ready to add the code to support our Alexa Skill.&lt;/p&gt;

&lt;h3&gt;
  
  
  Add the Alexa Python SDK to our Azure Function
&lt;/h3&gt;

&lt;p&gt;We need to first add the SDK that Amazon publishes for developers wishing to build Alexa backends using Python.&lt;/p&gt;

&lt;p&gt;If you look in your Azure Functions project you will see a ‘requirements.txt’ file. Open that file in Visual Studio Code and add the Alexa SDK items we need, along with a couple of other libraries we will need to complete our implementation. Once finished your file should look like the snippet below.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;azure-functions
beautifulsoup4
requests
ask-sdk-core
ask-sdk-webservice-support
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;Once you have made these changes you can install all the libraries locally for development purposes by running this command in the terminal in Visual Studio Code:&lt;br&gt;
&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pip install -f requirements.txt
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;&lt;strong&gt;Note:&lt;/strong&gt; I found that the Alexa webservices library has a reliance on a version of the cryptography library that appears to clash with versions required for some Azure SDK Python libraries such as Azure Storage. In this sample it’s not an issue, but one to keep an eye out for. Pip will complain when trying to install the package so you’ll know if you hit it!&lt;/p&gt;

&lt;p&gt;Now we have the necessary components locally for our development environment we need to configure our Alexa Skill via the Alexa Developer Console. I had a quick look at the &lt;a href="https://github.com/alexa/ask-cli" rel="noopener noreferrer"&gt;Alexa CLI&lt;/a&gt; (ask) but it once again appears only to tie into Lambda-based environments so it’s a no-go for us here and we’ll need to use the Console.&lt;/p&gt;
&lt;h3&gt;
  
  
  Configure our Alexa Skill
&lt;/h3&gt;

&lt;p&gt;When a user invokes your Skill, the request is directed to an Amazon-managed Alexa API which performs the voice-to-text action and then forwards the text to your Skill backend implementation.&lt;/p&gt;

&lt;p&gt;In order for Alexa to know about your Skill and its implementation, you have to configure it via the &lt;a href="https://www.developer.amazon.com/alexa/console/ask" rel="noopener noreferrer"&gt;Alexa Developer Console&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Let’s go ahead and do that now.&lt;/p&gt;

&lt;p&gt;First we need to define the Skill, the interaction model to use, and how the Skill will be implemented. I called my Skill “Read Azure cloud news” (you’ll note the Console hint on brand name use – if I wanted to publish this Skill I would need to show I have the right to use the Azure brand, but for this demo it works fine). I will build a Custom interaction model and use my own hosting (the other two options will give you Lambda-hosted solutions).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fsiliconvalve.files.wordpress.com%2F2021%2F08%2F2021-08-25_09-52-25-1.png%3Fw%3D716%26h%3D1024" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fsiliconvalve.files.wordpress.com%2F2021%2F08%2F2021-08-25_09-52-25-1.png%3Fw%3D716%26h%3D1024" title="Define Alexa Skill" alt="Define Alexa Skill"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;On the next screen select “Start from scratch” for the template to use.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fsiliconvalve.files.wordpress.com%2F2021%2F08%2F2021-08-25_09-55-57.png%3Fw%3D1100" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fsiliconvalve.files.wordpress.com%2F2021%2F08%2F2021-08-25_09-55-57.png%3Fw%3D1100" title="Alexa Skill Template - Start from scratch" alt="Alexa Skill Template - Start from scratch"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Next let’s define the Skill Invocation Name.&lt;/p&gt;

&lt;p&gt;This is what users will say after they wake up Alexa, so for my Skill I want a user interaction to be “Alexa, ask the Azure cloud news service for the latest news”.&lt;/p&gt;

&lt;p&gt;In the Developer Console expand the “Invocations” section and select “Skill Invocation Name” and enter your text. Click “Save Model” at the top.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fsiliconvalve.files.wordpress.com%2F2021%2F08%2F2021-08-25_15-17-19.png%3Fw%3D1024%26h%3D715" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fsiliconvalve.files.wordpress.com%2F2021%2F08%2F2021-08-25_15-17-19.png%3Fw%3D1024%26h%3D715" title="Invocation Name for Alexa Skill" alt="Invocation Name for Alexa Skill"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We have one more step to go before we can head back to Visual Studio Code and cut some code!&lt;/p&gt;

&lt;p&gt;We need to create an Intent which is what we use to drive actual user interaction.&lt;/p&gt;

&lt;p&gt;To help understand what an Intent is, consider the user interaction of “Alexa, ask the Azure cloud news service for the latest news”. In this interaction our Intent is derived from the “latest news” statement.&lt;/p&gt;

&lt;p&gt;In the Developer Console expand the “Intents” section and click the “+ Add Intent” button. We are going to create an Intent for reading the latest top five news items. Give it an Intent Name and click “Create custom intent”.&lt;/p&gt;

&lt;p&gt;The Intent Name is important as it is what we will use to match incoming requests in our code.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fsiliconvalve.files.wordpress.com%2F2021%2F08%2F2021-08-25_10-33-16.png%3Fw%3D1024%26h%3D551" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fsiliconvalve.files.wordpress.com%2F2021%2F08%2F2021-08-25_10-33-16.png%3Fw%3D1024%26h%3D551" title="Defining an Intent for Alexa" alt="Defining an Intent for Alexa"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We then need to provide some sample Utterances (sentences or statements) that a user might use to signal that this is their action (or intent) when using the Skill.&lt;/p&gt;

&lt;p&gt;If the Alexa model matches a voice or text command to an Utterance then it will select the Intent and send it through to your backend implementation. Note these matches aren’t simply a 1:1 match – these Utterances help train a voice-to-text recognition model which matches more than just the samples you provide!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fsiliconvalve.files.wordpress.com%2F2021%2F08%2F2021-08-25_10-46-25.png%3Fw%3D1024%26h%3D915" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fsiliconvalve.files.wordpress.com%2F2021%2F08%2F2021-08-25_10-46-25.png%3Fw%3D1024%26h%3D915" title="Alexa Intent Sample Utterances" alt="Alexa Intent Sample Utterances"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Alright, now we’re ready to implement the first parts of our Alexa Skill backend in our Azure Function.&lt;/p&gt;
&lt;h3&gt;
  
  
  Handling Alexa Intents in your Azure Function
&lt;/h3&gt;

&lt;p&gt;There are a few ways you can implement an Alexa Skill in Python, but for this post I am using Python classes where each class will be used to handle a single Intent.&lt;/p&gt;

&lt;p&gt;The below code snippet shows how we can handle the Intent we just defined. You can see that it simply matches on the name of the Intent and then passes any matching request to the handle function where our actual logic lives.&lt;/p&gt;

&lt;p&gt;For our Skill we download the Azure Updates RSS feed and return the top five headlines to Alexa for it to read.&lt;/p&gt;

&lt;p&gt;You can embed &lt;a href="https://docs.microsoft.com/azure/cognitive-services/speech-service/speech-synthesis-markup?tabs=csharp" rel="noopener noreferrer"&gt;Speech Synthesis Markup Language&lt;/a&gt; (SSML) snippets in the text to control how Alexa reads the text.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
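&lt;p&gt;A minimal sketch of such a handler is shown here. The Intent name and the feed-reading helper are assumptions for illustration – the complete handler is in the GitHub repository.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.handler_input import HandlerInput
from ask_sdk_core.utils import is_intent_name
from ask_sdk_model import Response


class ReadTopNewsIntentHandler(AbstractRequestHandler):
    """Handles the custom intent by reading out the latest headlines."""

    def can_handle(self, handler_input: HandlerInput) -&gt; bool:
        # Match on the Intent Name defined in the Developer Console
        # ("ReadTopNews" is illustrative - use your own Intent Name)
        return is_intent_name("ReadTopNews")(handler_input)

    def handle(self, handler_input: HandlerInput) -&gt; Response:
        # get_top_headlines() is a hypothetical helper that parses the
        # Azure Updates RSS feed and returns the top five titles
        speech = " ".join(get_top_headlines(5))
        return handler_input.response_builder.speak(speech).response
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;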



&lt;p&gt;We also need a couple of other default handlers as well. One for the launch of the Skill, which introduces the Skill and tells the user what they can do, and an exception handler which can be used to handle any unexpected errors in your Skill. The code snippets for these are shown below. The LaunchRequest intent doesn’t require you to configure anything in the Developer Console – it’s there by default.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
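&lt;p&gt;Minimal sketches of these two handlers look like this (the spoken responses are illustrative):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;from ask_sdk_core.dispatch_components import (AbstractExceptionHandler,
                                              AbstractRequestHandler)
from ask_sdk_core.handler_input import HandlerInput
from ask_sdk_core.utils import is_request_type
from ask_sdk_model import Response


class LaunchRequestHandler(AbstractRequestHandler):
    """Introduces the Skill when it is launched without a specific intent."""

    def can_handle(self, handler_input: HandlerInput) -&gt; bool:
        return is_request_type("LaunchRequest")(handler_input)

    def handle(self, handler_input: HandlerInput) -&gt; Response:
        speech = "Welcome to Azure cloud news. You can ask for the latest news."
        return handler_input.response_builder.speak(speech).ask(speech).response


class CatchAllExceptionHandler(AbstractExceptionHandler):
    """Catches any unhandled exception so the user gets a graceful reply."""

    def can_handle(self, handler_input: HandlerInput, exception: Exception) -&gt; bool:
        return True

    def handle(self, handler_input: HandlerInput, exception: Exception) -&gt; Response:
        speech = "Sorry, something went wrong. Please try again."
        return handler_input.response_builder.speak(speech).response
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;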


&lt;p&gt;Finally, we need to create a &lt;a href="https://alexa-skills-kit-python-sdk.readthedocs.io/en/latest/api/core.html#ask_sdk_core.skill_builder.SkillBuilder" rel="noopener noreferrer"&gt;SkillBuilder&lt;/a&gt; in our main Azure Function to ensure incoming requests are validated, passed to the right handler, and an appropriate response sent back to the caller.&lt;/p&gt;

&lt;p&gt;You can grab your Skill ID from the Alexa Developer Console. In the final implementation I am reading the Skill ID from the application settings so it’s easy for you to update to your own.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
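&lt;p&gt;The wiring inside the Function entry point then looks something like the sketch below, using the handlers sketched above and a hypothetical application setting name for the Skill ID:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import os

import azure.functions as func
from ask_sdk_core.skill_builder import SkillBuilder
from ask_sdk_webservice_support.webservice_handler import WebserviceSkillHandler

sb = SkillBuilder()
sb.skill_id = os.environ["AlexaSkillID"]  # hypothetical app setting name
sb.add_request_handler(ReadTopNewsIntentHandler())
sb.add_request_handler(LaunchRequestHandler())
sb.add_exception_handler(CatchAllExceptionHandler())

# WebserviceSkillHandler validates the incoming request signature and
# timestamp before dispatching it to the matching handler
webservice_handler = WebserviceSkillHandler(skill=sb.create())


def main(req: func.HttpRequest) -&gt; func.HttpResponse:
    response = webservice_handler.verify_request_and_dispatch(
        dict(req.headers), req.get_body().decode("utf-8"))
    return func.HttpResponse(body=response, mimetype="application/json")
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;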


&lt;p&gt;You can grab the completed Python Function code &lt;a href="https://github.com/sjwaight/AzureNewsOnAlexa/blob/master/GetNews/%20__init__.py" rel="noopener noreferrer"&gt;from GitHub&lt;/a&gt; and add it to your project. This ensures you have the right imports and the RSS function in place to run the solution.&lt;/p&gt;

&lt;h3&gt;
  
  
  Testing your Skill
&lt;/h3&gt;

&lt;p&gt;Our code is now complete enough that it will manage simple interactions. Before we can test using the Alexa Developer Console we need to do a few things:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Publish our local Azure Function via an &lt;a href="https://ngrok.com/" rel="noopener noreferrer"&gt;ngrok proxy&lt;/a&gt; (remember we installed the extension in VS Code). Use Port 7071 which is the default the Azure Functions Core Tools will use. The free tier for ngrok should suffice.&lt;/li&gt;
&lt;li&gt;Copy the ngrok proxy address to our clipboard.&lt;/li&gt;
&lt;li&gt;Go into the Alexa Developer Console for our Skill and select the Endpoint section.&lt;/li&gt;
&lt;li&gt;Switch the Endpoint from being a Lambda ARN to HTTPS and then put in the ngrok proxy address and our Azure Function relative path (/api/YourFunctionName).&lt;/li&gt;
&lt;li&gt;Rebuild the Alexa Model.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The video below shows you these steps. Note the video has no audio.&lt;/p&gt;

&lt;p&gt;Once we’ve done the above steps we can switch over to the Test tab for our Skill in the Alexa Developer Console and try it out. Here’s a sample of how it works.&lt;/p&gt;

&lt;p&gt;Note that if you stay in a breakpoint in Visual Studio Code for too long the Alexa service will automatically timeout and give the user a default message around not being able to complete the request.&lt;/p&gt;

&lt;h3&gt;
  
  
  Moving to Production
&lt;/h3&gt;

&lt;p&gt;If you go and look at the GitHub repository you will see there is a &lt;a href="https://github.com/sjwaight/AzureNewsOnAlexa/blob/master/.github/workflows/master_aznewsalexa.yml" rel="noopener noreferrer"&gt;GitHub Actions workflow&lt;/a&gt; which was created by the Azure Functions Deployment Centre in Azure after this repository was selected as a source. Updates pushed to the repository are automatically built and deployed.&lt;/p&gt;

&lt;p&gt;You do need to deploy the two Application Settings into Azure – one for the RSS endpoint and one for your Alexa Skill ID. For our demo you can manually insert them, but it is possible to automate their deployment in the Actions workflow as well if we really wished.&lt;/p&gt;

&lt;p&gt;At this point we can update our Skill Endpoint in the Alexa Developer Console to point at our deployed Azure Function URL, rebuild the model and we are then calling our Production Skill!&lt;/p&gt;

&lt;p&gt;If you wanted to launch the service you would still need to go through distribution checks with Amazon. In my example they’d be unlikely to approve my use of the Microsoft brand “Azure” so I’d have to change that, but apart from that our Skill is pretty much feature complete!&lt;/p&gt;

&lt;h3&gt;
  
  
  Bonus Points – Intent with a Slot
&lt;/h3&gt;

&lt;p&gt;Right now we have a fully functional, if basic, Alexa Skill powered by Azure Functions. If you want to see how we can extend the usefulness of the service, read on!&lt;/p&gt;

&lt;p&gt;Let’s add support for reading news from a particular date and see how we can build that out. For this we will need to define another Intent for our Skill and define a “Slot” within it for our date.&lt;/p&gt;

&lt;p&gt;In the Alexa Console add a new Intent called “ReadItemsFromDate”.&lt;/p&gt;

&lt;p&gt;This time when we provide sample Utterances we want to define a placeholder (known as a Slot in the Alexa world) that will be replaced by a user saying a date.&lt;/p&gt;

&lt;p&gt;You can add a Slot to an Intent by adding a placeholder in Utterances enclosed in {curly braces}, as shown in the screenshot below. Once you have done this you should then define the data type for the slot.&lt;/p&gt;
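&lt;p&gt;For example, an Utterance with a Slot placeholder might read (the Slot name is assumed here):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;read me the top news items from {newsdate}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;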

&lt;p&gt;In our case we will use the pre-defined “&lt;a href="https://developer.amazon.com/en-US/docs/alexa/custom-skills/slot-type-reference.html#date" rel="noopener noreferrer"&gt;AMAZON.DATE&lt;/a&gt;” type. As a tip, make sure to understand how these Slots can be interpreted. For example, any date that doesn’t specify a year is always assumed to be a future date… which isn’t really helpful if we are asking for past news items!!!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fsiliconvalve.files.wordpress.com%2F2021%2F08%2F2021-08-25_15-26-15.png%3Fw%3D1024%26h%3D852" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fsiliconvalve.files.wordpress.com%2F2021%2F08%2F2021-08-25_15-26-15.png%3Fw%3D1024%26h%3D852" title="Add new utterance with a slot" alt="Add new utterance with a slot"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can continue to use the Slot in other Utterances so that the model can start to figure out how users might ask for news from a particular date. You can see in the below screenshot where we have set the Slot data type as required.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fsiliconvalve.files.wordpress.com%2F2021%2F08%2F2021-08-25_15-28-45.png%3Fw%3D1024%26h%3D945" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fsiliconvalve.files.wordpress.com%2F2021%2F08%2F2021-08-25_15-28-45.png%3Fw%3D1024%26h%3D945" title="Intent with Utterances using a Slot" alt="Intent with Utterances using a Slot"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now we have this configured we need to write a handler for it in our Azure Function. The code for this is shown below.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
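&lt;p&gt;A minimal sketch of the handler follows; the Slot name and the feed-reading helper are again assumptions for illustration:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.handler_input import HandlerInput
from ask_sdk_core.utils import get_slot_value, is_intent_name
from ask_sdk_model import Response


class ReadItemsFromDateHandler(AbstractRequestHandler):
    """Reads news items published on the date the user asked for."""

    def can_handle(self, handler_input: HandlerInput) -&gt; bool:
        return is_intent_name("ReadItemsFromDate")(handler_input)

    def handle(self, handler_input: HandlerInput) -&gt; Response:
        # Alexa resolves utterances like "last Tuesday" to ISO-8601
        # (YYYY-MM-DD) before the request reaches us; "newsdate" is the
        # Slot name assumed for this sketch
        news_date = get_slot_value(handler_input=handler_input,
                                   slot_name="newsdate")
        # get_headlines_for_date() is a hypothetical helper
        speech = " ".join(get_headlines_for_date(news_date))
        return handler_input.response_builder.speak(speech).response
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;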


&lt;p&gt;The great thing with the Slot type we are using is a Skill user can say something like “top news items from last Tuesday” and Alexa will figure it out and simply send you a date in an ISO-8601 format (YYYY-MM-DD). You can even pass dates back to Alexa in ISO-8601 format and Alexa will speak them out completely to the user.&lt;/p&gt;

&lt;h3&gt;
  
  
  Wrapping Up
&lt;/h3&gt;

&lt;p&gt;Whew! There’s a lot in this post, but I do so hope you’ve found it useful and it saves you some time – even just having a full non-Lambda sample to help understand how webservice-hosted Alexa Skills hang together.&lt;/p&gt;

&lt;p&gt;As a final thought, even though I’ve used Azure Functions as my hosting option, I can easily Containerise this solution and then deploy it &lt;a href="https://docs.microsoft.com/azure/azure-functions/functions-kubernetes-keda" rel="noopener noreferrer"&gt;anywhere I can run a Container&lt;/a&gt;, which means I’m able to deliver a highly available Alexa service that is cloud agnostic without any code or runtime hosting changes!&lt;/p&gt;

&lt;p&gt;Until next time, happy days! 😎&lt;/p&gt;

</description>
      <category>azure</category>
      <category>serverless</category>
      <category>python</category>
      <category>bots</category>
    </item>
    <item>
      <title>Generate a PowerPoint file using Azure Functions and Python</title>
      <dc:creator>Simon Waight</dc:creator>
      <pubDate>Mon, 26 Jul 2021 23:30:00 +0000</pubDate>
      <link>https://forem.com/simonwaight/generate-a-powerpoint-file-using-azure-functions-and-python-571</link>
      <guid>https://forem.com/simonwaight/generate-a-powerpoint-file-using-azure-functions-and-python-571</guid>
      <description>&lt;p&gt;For many years I have been involved with the &lt;a href="https://www.meetup.com/Azure-Sydney-User-Group/"&gt;Azure Sydney User Group&lt;/a&gt;, and even though I’m no longer the organiser I still gather updates for Azure in the prior month and prepare a PowerPoint presentation that contains them. My go-to place for the information is the &lt;a href="https://azure.microsoft.com/updates/"&gt;Azure Updates website&lt;/a&gt; and I’ve typically just been browsing the site and manually updating an existing PowerPoint template.&lt;/p&gt;

&lt;p&gt;Recently I reflected on the amount of time it takes for me to click through the pages and pull out the headlines, and decided this probably wasn’t a great use of my time. Even though it’s typically under 10 minutes to do it’s still repetitive which clearly means it’s the perfect candidate to automate!&lt;/p&gt;

&lt;p&gt;I’ve used it for other integration in the past, so I know the updates website has an RSS feed which is perfect for automation.&lt;/p&gt;

&lt;h2&gt;
  
  
  Choosing an implementation approach
&lt;/h2&gt;

&lt;p&gt;There are a few ways I could have built an automation, and my original intention was to use either Azure Logic Apps or Power Automate Flow and integrate into PowerPoint online in Microsoft 365. Unfortunately, it turns out there is no native PowerPoint online connector which means this approach became a no-go!&lt;/p&gt;

&lt;p&gt;In the absence of this integration capability, I decided to turn to a code-based solution because I know there are many ways to generate Office documents through SDKs that implement the Office Open XML File Formats standard (&lt;a href="https://www.ecma-international.org/publications-and-standards/standards/ecma-376/"&gt;ECMA-376&lt;/a&gt;).&lt;/p&gt;

&lt;p&gt;One of the reasons I also looked at Logic Apps or Power Automate was their serverless pay-on-execution model. Keeping this in mind I turned to my trusty friend, Azure Functions. At this stage the no-brainer for me as a long-term C# developer would have been to implement a .NET-based Function, but as I’ve said a few times before, I’m wanting to push my skills to cover other languages, so I thought I’d have a go with Python.&lt;/p&gt;

&lt;h2&gt;
  
  
  Azure Functions + Python = ❤
&lt;/h2&gt;

&lt;p&gt;It turns out there is an excellent PowerPoint library called &lt;a href="https://python-pptx.readthedocs.io/en/latest/"&gt;python-pptx&lt;/a&gt; that has everything I needed, and I found a great &lt;a href="https://codeburst.io/building-an-rss-feed-scraper-with-python-73715ca06e1f"&gt;blog and sample from Matthew Wimberly&lt;/a&gt; that had what I needed to read and parse an RSS feed. With these two elements in hand I needed a little bit of Functions magic to tie it all together and provide a simple HTTP API I could use to generate my presentation.&lt;/p&gt;

&lt;p&gt;The resulting Azure Function (shown below) does all I need in less than 200 lines of code.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
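&lt;p&gt;The full Function is in the GitHub repository; the core python-pptx usage boils down to something like the sketch below, assuming the update titles have already been parsed out of the RSS feed:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;from pptx import Presentation
from pptx.util import Pt

# Hypothetical input: titles parsed from the Azure Updates RSS feed
updates = ["Azure Cosmos DB free tier increased", "New VM sizes available"]

prs = Presentation()  # or Presentation("template.pptx") to start from a template
slide = prs.slides.add_slide(prs.slide_layouts[1])  # "Title and Content" layout
slide.shapes.title.text = "Azure Updates"

body = slide.placeholders[1].text_frame
for title in updates:
    para = body.add_paragraph()
    para.text = title
    para.font.size = Pt(14)

prs.save("azure-updates.pptx")
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;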


&lt;p&gt;To run it, you invoke the Function via a web browser with a URL similar to:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;https://your-func-app.azurewebsites.net/api/GeneratePresentation?code=YOUR-FUNC-KEY&amp;amp;start=2021-06-20&amp;amp;end=2021-06-30
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If the supplied date range is valid and there are updates that fall within it, you receive a simple web page with a link to a downloadable PowerPoint file held in a private Azure Storage account, served for a limited period using a SAS-protected URL. The full documentation on how to debug, deploy and execute the Azure Function can be found on the &lt;a href="https://github.com/sjwaight/AzureUpdatesPresentationGen"&gt;GitHub repository for the solution&lt;/a&gt;. Also, here’s a &lt;a href="https://siliconvalve.files.wordpress.com/2021/07/azureupdate-2021-07-23-01-43-09.pptx"&gt;sample of what you can generate&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;I have deployed the solution onto a Consumption plan in Azure which means I’m not paying for idle compute, and the PowerPoint takes up so little space that my Storage Account costs will be tiny, especially given this API endpoint can’t be invoked by just anyone. Finally, to save myself even more money, I have a Timer Function that once a week deletes any PowerPoint files sitting in the Storage Account, which won’t be many (if any) for most of the time.&lt;/p&gt;

&lt;p&gt;I’m pretty happy with the solution as it stands, but in future I might look to use my existing PowerPoint template as the base for the resulting presentation which means there would be even less manual work for me to do. Right now I still need to copy / paste from one PowerPoint to the other, but this is so trivial that I’m not bothered about automating it away … just yet 😉.&lt;/p&gt;

&lt;p&gt;Hopefully you find some inspiration in the solution here!&lt;/p&gt;

&lt;p&gt;Happy Days! 😎&lt;/p&gt;

&lt;p&gt;P.S. The GitHub repository with the solution is here: &lt;a href="https://github.com/sjwaight/AzureUpdatesPresentationGen"&gt;https://github.com/sjwaight/AzureUpdatesPresentationGen&lt;/a&gt;&lt;/p&gt;

</description>
      <category>python</category>
      <category>azure</category>
      <category>cloud</category>
      <category>serverless</category>
    </item>
    <item>
      <title>How to build and run a free link tracker using VS Code, Java, GitHub, MongoDB and Azure</title>
      <dc:creator>Simon Waight</dc:creator>
      <pubDate>Wed, 14 Jul 2021 02:17:30 +0000</pubDate>
      <link>https://forem.com/simonwaight/how-to-build-and-run-a-free-link-tracker-using-vs-code-java-github-mongodb-and-azure-2ahg</link>
      <guid>https://forem.com/simonwaight/how-to-build-and-run-a-free-link-tracker-using-vs-code-java-github-mongodb-and-azure-2ahg</guid>
      <description>&lt;p&gt;Many developers are interested in learning how to build and run software on cloud platforms, but are wary of the potential for hidden costs and &lt;a href="https://cloudirregular.substack.com/p/please-fix-the-aws-free-tier-before"&gt;bill shock&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Often this bill shock comes from not understanding how consumption-based services work, and from being lulled into a false sense of security by the relatively tiny per-request / per-MB amounts many services charge. This is all well-and-good until you build something unexpectedly popular that explodes overnight and ramps up your bill.&lt;/p&gt;

&lt;p&gt;However, most of us will never actually face this dilemma, and I think it’s a shame that people limit themselves from learning something new based on what &lt;em&gt;might&lt;/em&gt; happen, not what &lt;em&gt;will&lt;/em&gt; happen. Having said this, let’s look at how we can remove some uncertainty when building with Azure.&lt;/p&gt;

&lt;h2&gt;
  
  
  Building cost-safe solutions
&lt;/h2&gt;

&lt;p&gt;Most of the solutions I’ve built over the last 18 months easily fit into the “runs for the price of a coffee (at Sydney prices) or less” category, but I am always looking for ways to do better. To this end I am going to try and find ways for you to learn with Azure without the worry that you might need a second mortgage.&lt;/p&gt;

&lt;p&gt;The first thing I would recommend doing is looking at the &lt;a href="https://azure.com/free"&gt;Azure Free Tier&lt;/a&gt; and understanding what services it contains and what their limits are, with a particular focus on what happens when you exceed the Free Tier offer. You should revisit it periodically as services are added or modified over time – as an example, Cosmos DB recently increased the Request Units (RUs) and storage available under its free offer.&lt;/p&gt;

&lt;p&gt;Also, consider what you want to do or learn with the platform. If you are completely new to Azure and want to make use of the time-limited financial credits and 12 month increased free service tier, make sure you have goals in mind, otherwise you’ll waste your benefits. I &lt;a href="https://blog.siliconvalve.com/2018/05/30/are-free-tier-cloud-services-worth-the-cost/"&gt;wrote about this in 2018&lt;/a&gt;, so it’s worth a read before starting.&lt;/p&gt;

&lt;p&gt;Finally, always set up &lt;a href="https://docs.microsoft.com/en-us/azure/cost-management-billing/costs/cost-mgt-alerts-monitor-usage-spending#budget-alerts"&gt;Budget alerts&lt;/a&gt; for the pay-as-you-go (PAYG) subscriptions you own. This won’t stop exponential growth happening, but at least you will know about it sooner than your next bill and can act accordingly to limit your expenses.&lt;/p&gt;

&lt;h2&gt;
  
  
  Our demo scenario
&lt;/h2&gt;

&lt;p&gt;For this post I will be building a simple REST API that can be used to create and track URL shortlinks (think bit.ly). These can be used on blogs like mine to track outbound link clicks.&lt;/p&gt;

&lt;p&gt;If I want to track an outbound link (say when I reference a GitHub repository), I create a shortlink for the destination and use that shortlink in my blog in place of the actual destination. When a blog visitor clicks the link, their request first hits my shortlink system, which logs the request before redirecting them to the actual destination, and I can report on the logged clicks later.&lt;/p&gt;

&lt;p&gt;As I like to mix things up I’m going to build the solution using Java Spring Boot and MongoDB.&lt;/p&gt;

&lt;h2&gt;
  
  
  Free local development and deployment
&lt;/h2&gt;

&lt;p&gt;If I’m going to challenge myself to run services for free, I might as well extend that to the entire lifecycle of my application. To that end I’m going to be using the following to help me build and debug my solution locally.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Java JDK 11: you can use one of the free JDKs out there – &lt;a href="https://adoptopenjdk.net/"&gt;AdoptOpenJDK&lt;/a&gt; or the &lt;a href="https://www.microsoft.com/openjdk"&gt;Microsoft Build of OpenJDK&lt;/a&gt; will do.&lt;/li&gt;
&lt;li&gt;The &lt;a href="https://docs.microsoft.com/en-us/cli/azure/install-azure-cli"&gt;Azure Command Line Interface (CLI)&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://www.mongodb.com/try/download/community"&gt;MongoDB Community&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://code.visualstudio.com/"&gt;Visual Studio Code&lt;/a&gt;: the best free development environment out there. Add these great extensions:

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://marketplace.visualstudio.com/items?itemName=vscjava.vscode-java-pack"&gt;Java Extension Pack&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://marketplace.visualstudio.com/items?itemName=vscjava.vscode-spring-initializr"&gt;Spring Initializr&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://marketplace.visualstudio.com/items?itemName=mongodb.mongodb-vscode"&gt;MongoDB&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://marketplace.visualstudio.com/items?itemName=rangav.vscode-thunder-client"&gt;Thunder Client&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Once I’m done writing my solution, I want somewhere to publish my code and to then deploy it to Azure, so for that I’ll be using &lt;a href="https://github.com/"&gt;GitHub&lt;/a&gt; and &lt;a href="https://github.com/features/actions"&gt;GitHub Actions&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Free hosting in Azure
&lt;/h2&gt;

&lt;p&gt;As a developer I hate managing infrastructure. If it requires a Virtual Machine, you’re doing it wrong! So, for hosting our solution I am going to use Azure App Service and Azure Cosmos DB configured with MongoDB API support. Both these services offer a compelling free tier:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;App Service: &lt;a href="https://azure.microsoft.com/pricing/details/app-service/linux/"&gt;F1 Free Tier&lt;/a&gt; – not designed for production workloads, but guarantees you stay free. When your app consumes more than 60 minutes of CPU time in a day the web app will be stopped and callers will receive a 503 (Service Unavailable) response.
&lt;/li&gt;
&lt;li&gt;Cosmos DB: &lt;a href="https://docs.microsoft.com/azure/cosmos-db/free-tier"&gt;Free Tier&lt;/a&gt; – you can run some good small solutions on this tier. The 1,000 Request Units (RUs) and 25 GB of storage are generous and should suit most use cases. If you generate calls that exceed the RU limit you’ll receive a 429 (Too Many Requests) response and will need to handle it in your calling code. You can manage storage growth by using Time-To-Live (TTL) settings to ensure you don’t exceed the limits, though 25 GB is a lot of documents!&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Let’s build our app!
&lt;/h2&gt;

&lt;p&gt;As a starting point, connect to your MongoDB instance and make sure you have an empty database available. I called my database “bloglinks”, but you can use any name you want. Copy the connection URI and database name for use later on.&lt;/p&gt;

&lt;p&gt;You can also create an empty repository on GitHub where the application source code will be stored when you are finished and ready to deploy to Azure.&lt;/p&gt;

&lt;p&gt;Now, using Visual Studio Code, open an empty folder on your computer, bring up the command palette (View &amp;gt; Command Palette) and select Spring Initializr by starting to type ‘spring’. Select “Create a Maven Project…”&lt;/p&gt;

&lt;p&gt;&lt;a href="https://siliconvalve.files.wordpress.com/2021/07/2021-06-23_20-49-53.png"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--M3-zoesd--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2021/07/2021-06-23_20-49-53.png%3Fw%3D918" alt="Spring Initializr extension in VS Code." title="Spring Initializr extension in VS Code."&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Next up let’s select the Spring Boot version. I tend to favour stable releases, but the choice is yours. Note that the versions you see will likely differ from those shown below, depending on when you read this blog.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://siliconvalve.files.wordpress.com/2021/07/2021-06-23_20-50-31.png"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Zu8ahPAF--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2021/07/2021-06-23_20-50-31.png%3Fw%3D920" alt="Selecting Spring Boot version in VS Code." title="Selecting Spring Boot version in VS Code."&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Make the project language Java.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://siliconvalve.files.wordpress.com/2021/07/2021-06-23_20-50-57.png"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--xmLymXGu--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2021/07/2021-06-23_20-50-57.png%3Fw%3D937" alt="Setting project language to Java in VS Code." title="Setting project language to Java in VS Code."&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now set up the necessary details to generate your package name.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://siliconvalve.files.wordpress.com/2021/07/2021-06-23_20-51-28.png"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--9MV7GR----/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2021/07/2021-06-23_20-51-28.png%3Fw%3D925" alt="Set Input Group ID in VS Code." title="Set Input Group ID in VS Code."&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://siliconvalve.files.wordpress.com/2021/07/2021-06-23_20-51-48.png"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--pPlbub3y--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2021/07/2021-06-23_20-51-48.png%3Fw%3D934" alt="Set Input Artifact ID in VS Code." title="Set Input Artifact ID in VS Code."&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Set Packaging Type to ‘Jar’ and Java version to ’11’.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://siliconvalve.files.wordpress.com/2021/07/2021-06-23_20-52-06.png"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--XK1G60Jh--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2021/07/2021-06-23_20-52-06.png%3Fw%3D912" alt="Set packaging type to Jar in VS Code." title="Set packaging type to Jar in VS Code."&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://siliconvalve.files.wordpress.com/2021/07/2021-06-23_20-52-29.png"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--yDfQaIK7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2021/07/2021-06-23_20-52-29.png%3Fw%3D931" alt="Set Java Version in VS Code." title="Set Java Version in VS Code."&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Finally, let’s pull in the Spring Dependencies that will help us get going quickly with our solution. For our application we’ll use Spring Web, Spring Data MongoDB and Spring Security.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://siliconvalve.files.wordpress.com/2021/07/2021-07-13_11-16-35.png"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--NGFjeuBS--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2021/07/2021-07-13_11-16-35.png%3Fw%3D910" alt="Selecting dependencies in VS Code." title="Selecting dependencies in VS Code."&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;OK, so now we have a basic Spring Boot web application scaffold which we can build on top of. Rather than spend more time on pulling this together I am going to direct you to the &lt;a href="https://github.com/sjwaight/JavaShortLinkTracker"&gt;source code for a pre-built version&lt;/a&gt; of the application which you can use as the basis of yours.&lt;/p&gt;

&lt;p&gt;Make sure to pay attention to the application.properties.prod file. You can use this locally on your machine (rename it to application.properties) to manage configuration for basic authentication and the connection to MongoDB. It’s important to add this file to your .gitignore when committing to GitHub though, especially if you use a remote MongoDB instance.&lt;/p&gt;

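&lt;p&gt;For reference, a local application.properties based on the four settings used later in this post might look like the following (values are illustrative):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;spring.security.user.name=youruser
spring.security.user.password=5ecureP4ssword
spring.data.mongodb.database=bloglinks
spring.data.mongodb.uri=mongodb://localhost:27017
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;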
&lt;p&gt;Once you’ve completed coding up the solution you will find that the Spring Data MongoDB library will automatically create the necessary Collections for you in your Database. We will use these two Collection names when we create placeholders in Azure in our next step.&lt;/p&gt;

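&lt;p&gt;To make the moving parts concrete, here is a minimal sketch of the two documents, their repositories and the redirect flow (names and structure are illustrative; the pre-built repository linked above holds the real implementation):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// Illustrative sketch only - see the linked repository for the real code
import java.io.IOException;
import java.time.Instant;
import java.util.Optional;

import javax.servlet.http.HttpServletResponse;

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

@Document(collection = "shortLink")
class ShortLink {
    @Id String id;
    String slug;            // the short path, e.g. /abc123
    String destination;     // the real URL to redirect to
}

@Document(collection = "linkClick")
class LinkClick {
    @Id String id;
    String slug;
    Instant clickedAt;
}

interface ShortLinkRepository extends MongoRepository&amp;lt;ShortLink, String&amp;gt; {
    Optional&amp;lt;ShortLink&amp;gt; findBySlug(String slug);
}

interface LinkClickRepository extends MongoRepository&amp;lt;LinkClick, String&amp;gt; {}

@RestController
class RedirectController {
    private final ShortLinkRepository links;
    private final LinkClickRepository clicks;

    RedirectController(ShortLinkRepository links, LinkClickRepository clicks) {
        this.links = links;
        this.clicks = clicks;
    }

    // Log the click, then bounce the visitor to the real destination
    @GetMapping("/{slug}")
    void follow(@PathVariable String slug, HttpServletResponse response) throws IOException {
        Optional&amp;lt;ShortLink&amp;gt; link = links.findBySlug(slug);
        if (link.isEmpty()) {
            response.sendError(HttpServletResponse.SC_NOT_FOUND);
            return;
        }
        LinkClick click = new LinkClick();
        click.slug = slug;
        click.clickedAt = Instant.now();
        clicks.save(click);
        response.sendRedirect(link.get().destination);
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;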
&lt;h2&gt;
  
  
  Setting up Azure
&lt;/h2&gt;

&lt;p&gt;We will use the Azure CLI to complete these steps, so make sure you have it installed and that you have logged into the subscription you want to deploy to.&lt;/p&gt;

&lt;p&gt;First we’ll create a Resource Group which can contain all our services. You can choose the location that suits you.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az group create --name MyResourceGroup --location westus2
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Create a free Azure App Service Plan.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az appservice plan create \
    --resource-group MyResourceGroup \
    --name MyAppsPlan \
    --is-linux \
    --sku FREE
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Create the Web App using the Plan. We will pre-configure the Web App with the right Java environment (the Java SE web server with Java 11).&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az webapp create \
    --name myuniquewebappid123 \ 
    --resource-group MyResourceGroup \
    --runtime "JAVA|8-java11" \
    --plan MyAppsPlan
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Create a free tier Cosmos DB account configured with the MongoDB API v4.0.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az cosmosdb create \
    --name MyCosmosAccount \
    --resource-group MyResourceGroup \
    --enable-free-tier true \
    --kind MongoDB \
    --server-version 4.0 \
    --default-consistency-level "Session"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now we can create our Database and Collections in Cosmos DB.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Create 'bloglinks' Database
az cosmosdb mongodb database create \
     --account-name MyCosmosAccount \
     --resource-group MyResourceGroup \
     --name bloglinks 

# Create 'shortLink' Collection
az cosmosdb mongodb collection create \
     --account-name MyCosmosAccount \
     --resource-group MyResourceGroup \
     --database-name bloglinks \
     --name shortLink

# Create 'linkClick' Collection
az cosmosdb mongodb collection create \
     --account-name MyCosmosAccount \
     --resource-group MyResourceGroup \
     --database-name bloglinks \
     --name linkClick \
     --idx '[{"key":{"keys": ["_ts"]},"options":{"expireAfterSeconds": 5184000}}]'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now, you might look at that last Collection create statement and wonder what I am doing. Remember earlier I mentioned we wanted a way to ensure we don’t hit our storage limit (even though that’s unlikely to ever happen… but why find out?!). This is how I will manage it in this scenario.&lt;/p&gt;

&lt;p&gt;Cosmos DB has a great Time-To-Live (TTL) feature which can be used to expire documents, meaning they are removed from result sets and ultimately purged from storage. Best of all? No cost involved!!&lt;/p&gt;

&lt;p&gt;If you want to use this feature with the MongoDB API you need to create Mongo Indexes over the Cosmos-internal “_ts” document property to drive it. &lt;a href="https://docs.microsoft.com/azure/cosmos-db/mongodb-time-to-live"&gt;Based on the official Microsoft documentation&lt;/a&gt; you can define these indexes when creating a Collection (as above) or separately later using a MongoDB client. I’ve set the expiry at 60 days as I should ideally have processed the collection data by this point, and if not… that’s on me!!&lt;/p&gt;

&lt;p&gt;As a last step, let’s grab the MongoDB connection URI for our Cosmos DB account, which we will use in our configuration below.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az cosmosdb keys list \
     --name MyCosmosAccount \
     --resource-group MyResourceGroup \
     --type connection-strings
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;OK, so at this stage we have a web server and a MongoDB database ready for our application to be deployed.&lt;/p&gt;

&lt;h2&gt;
  
  
  Deploying and configuring our web application
&lt;/h2&gt;

&lt;p&gt;The final piece of the puzzle is to deploy and configure our web application.&lt;/p&gt;

&lt;p&gt;Let’s start with the configuration. We have four configuration elements we need in order for our web application to work:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;spring.security.user.name&lt;/li&gt;
&lt;li&gt;spring.security.user.password&lt;/li&gt;
&lt;li&gt;spring.data.mongodb.database&lt;/li&gt;
&lt;li&gt;spring.data.mongodb.uri&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Locally we have been holding these values in our application.properties file, but we don’t want to push that file to GitHub, so make sure you exclude it when you publish the solution.&lt;/p&gt;

&lt;p&gt;Azure App Service provides a neat way to supply these four values via App Settings, which we can set as follows. Convert each application.properties key into an equivalent setting name, uppercased and with underscores instead of dots.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az webapp config appsettings set \
    --resource-group MyResourceGroup \
    --name myuniquewebappid123 \
    --settings SPRING_SECURITY_USER_NAME=youruser \
SPRING_SECURITY_USER_PASSWORD=5ecureP4ssword \
SPRING_DATA_MONGODB_DATABASE=bloglinks \
SPRING_DATA_MONGODB_URI=mongodb://your_cosmos_uri/
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Finally, let’s configure the GitHub Action to deploy the application for us.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az webapp deployment github-actions add \
    --resource-group MyResourceGroup \
    --name myuniquewebappid123 \
    --repo "youruser/your-repo" \
    --runtime "JAVA|8-java11" \
    --login-with-github
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You will be prompted to log into GitHub at &lt;a href="https://github.com/login/device"&gt;https://github.com/login/device&lt;/a&gt; using a device code, and once complete you should find a new folder in your repository containing the workflow that will build and deploy your solution!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://siliconvalve.files.wordpress.com/2021/07/2021-07-13_17-10-56.png"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s---56-3MIZ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2021/07/2021-07-13_17-10-56.png%3Fw%3D768" alt="GitHub Device Login Prompt" title="GitHub Device Login Prompt"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After a few minutes you will find that a workflow kicks off, and your application is deployed and ready to serve requests!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://siliconvalve.files.wordpress.com/2021/07/2021-07-13_17-14-53.png"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--PNGsafwK--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2021/07/2021-07-13_17-14-53.png%3Fw%3D1024" alt="GitHub Action completed run screenshot." title="GitHub Action completed run screenshot."&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Testing it out
&lt;/h2&gt;

&lt;p&gt;Before we wrap up, let’s test out our deployed solution. The easiest way to do this is to install Thunder Client for VS Code and then open the collection I’ve &lt;a href="https://github.com/sjwaight/JavaShortLinkTracker/blob/main/thunder-collection_Shortlinks%20Demo.json"&gt;included in the sample app repository&lt;/a&gt;. Make sure to update the hostname to match the one you used in Azure, and set the basic authentication username and password to match yours too.&lt;/p&gt;

&lt;p&gt;When you use the POST call to create a new shortlink you will receive a “201 Created” HTTP response code, and the details for the created entry are returned in the ‘location’ HTTP response header, as shown below. If you try to create a link that already exists you will receive a “200 OK” HTTP response code and the shortlink data will be returned in the response body.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://siliconvalve.files.wordpress.com/2021/07/2021-07-14_10-23-11.png"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--KvcISpXJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2021/07/2021-07-14_10-23-11.png%3Fw%3D1010" alt="201 Created response sample that shows location header field." title="201 Created response sample that shows location header field."&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Wrapping up
&lt;/h2&gt;

&lt;p&gt;We’ve done a lot in this post, but I’m not 100% done yet. If you’ve run all the requests you will notice that there is a field called “visitors” that has a value of zero. We need to build a solution that will populate this field periodically, ideally in a zero-cost fashion as well. In a future post I’ll take a look at how we can do this.&lt;/p&gt;

&lt;p&gt;In the meantime… happy days! 😎&lt;/p&gt;

</description>
      <category>azure</category>
      <category>java</category>
      <category>mongodb</category>
      <category>github</category>
    </item>
    <item>
      <title>Windows Containers and .NET Framework applications: DevOps with Kubernetes</title>
      <dc:creator>Simon Waight</dc:creator>
      <pubDate>Tue, 20 Apr 2021 23:11:40 +0000</pubDate>
      <link>https://forem.com/simonwaight/windows-containers-and-net-framework-applications-devops-with-kubernetes-198e</link>
      <guid>https://forem.com/simonwaight/windows-containers-and-net-framework-applications-devops-with-kubernetes-198e</guid>
      <description>&lt;p&gt;In my previous two posts on .NET Framework applications and Windows Containers I took a look at the &lt;a href="https://dev.to/simonwaight/windows-containers-and-net-framework-applications-the-basics-2pdi"&gt;rationale&lt;/a&gt; and &lt;a href="https://dev.to/simonwaight/windows-containers-and-net-framework-applications-migration-2pj"&gt;approach&lt;/a&gt; for bringing these applications to containers before using a sample application (MVC Music Store) to show what is involved with containerising an application.&lt;/p&gt;

&lt;p&gt;In this post I am going to take the next step – take our containerised ASP.NET web application and deploy it to Kubernetes whilst making sure the build and deployment process is centralised and repeatable.&lt;/p&gt;

&lt;h3&gt;
  
  
  Setting up Kubernetes
&lt;/h3&gt;

&lt;p&gt;I’m going to use Azure Kubernetes Service (AKS) for this post, so to start I am going to create a new AKS cluster with a Windows Node Pool. Even though I’ve selected an Azure managed service there is nothing stopping you from using a similar approach to deploying to Kubernetes either on your own infrastructure or in another cloud.&lt;/p&gt;

&lt;p&gt;Let’s start by creating a new AKS cluster using the following commands.&lt;/p&gt;

&lt;p&gt;You will be prompted for a strong Windows password for the specified admin user. If you receive an error with “Invalid adminPassword” then you haven’t met the security standards set via policy and will need to increase the complexity or length of the Windows admin user’s password.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Only required if AKS creation fails due to Service Principal 
# not existing or unable to be created automatically.
az ad sp create-for-rbac \
    --skip-assignment \
    --name myAksServicePrincipal

# Create new cluster with 2 system (Linux)
# nodes (you will be prompted for a password)
az aks create \
    --resource-group myResourceGroup \
    --name myAKSCluster \
    --node-count 2 \
    --enable-addons monitoring \
    --generate-ssh-keys \
    --location myAzureRegion \
    --windows-admin-username windowsAdminUser \
    --vm-set-type VirtualMachineScaleSets \
    --network-plugin azure \
    --service-principal myAppIdFromAdCreate \
    --client-secret myAppPasswordFromAdCreate

# Add Windows Node Pool with 2 nodes
az aks nodepool add \
    --resource-group myResourceGroup \
    --cluster-name myAKSCluster \
    --os-type Windows \
    --name winnp \
    --node-count 2
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;At this point we now have a 4 node Kubernetes cluster – two nodes running Linux and two nodes running Windows.&lt;/p&gt;

&lt;p&gt;As a final piece let’s export the kubeconfig that we will use later for deploying our solution’s Container to AKS from GitHub Actions.&lt;br&gt;
&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az aks get-credentials \
    --resource-group myResourceGroup \
    --name myAKSCluster \
    --admin \
    --file akscreds.txt
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;The local file “akscreds.txt” now contains the data you’ll need to configure a deployment into AKS. Protect this data as it is the keys to your Kubernetes castle (hint: once you’ve used it later to configure the deployment you should delete the file).&lt;/p&gt;
&lt;h3&gt;
  
  
  Configuring our connection string
&lt;/h3&gt;

&lt;p&gt;In the previous post we used the new &lt;em&gt;Microsoft.Configuration.ConfigurationBuilders.Environment&lt;/em&gt; extensions available for .NET 4.7.1+ to allow us to supply the database connection for our sample ASP.NET web application via Windows environment variables.&lt;/p&gt;

&lt;p&gt;In Kubernetes there are a few ways to supply environment variables to container instances. As we are dealing with sensitive information we’ll use Kubernetes &lt;a href="https://kubernetes.io/docs/concepts/configuration/secret/" rel="noopener noreferrer"&gt;Secrets&lt;/a&gt;, which are managed centrally within any cluster. Let’s go ahead and add our MusicStoreEntities environment variable to our cluster.&lt;/p&gt;

&lt;p&gt;Secret values must be supplied as base64 encoded strings, which means we have to encode our connection string before we create a Secret with it. On Windows we can do this using the following PowerShell snippet.&lt;br&gt;
&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes("YOUR_CONNECTION_STRING"))
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;This will output the connection string as a base64 encoded value. Copy the value as we will use it in the next step.&lt;/p&gt;

&lt;p&gt;Next, use your favourite text editor (VS Code, right 😉) to create a new file called “musicstoreentities-secret.yml” that will look similar to the below. Replace the “connstring” entry with the base64 encoded connection string you just created, then save the file.&lt;/p&gt;



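&lt;p&gt;A sketch of what that file can look like, assuming the Secret name (dbconnection) and key (connstring) referenced later in this post:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;apiVersion: v1
kind: Secret
metadata:
  name: dbconnection
type: Opaque
data:
  # Paste the base64 value produced by the PowerShell snippet above
  connstring: YOUR-BASE64-ENCODED-CONNECTION-STRING
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;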


&lt;p&gt;Now we have this file we can go ahead and create the secret in our cluster. Exactly how you do this will depend on how (or where) your Kubernetes setup is hosted. You can use the Kubernetes command line tool (&lt;a href="https://kubernetes.io/docs/reference/kubectl/overview/" rel="noopener noreferrer"&gt;kubectl&lt;/a&gt;) to add the secret, but that requires setup first which I won’t cover here.&lt;/p&gt;

&lt;p&gt;In my instance I’m using features in AKS that allow me to add the secret via the Azure Portal so I don’t need to have kubectl installed or configured (I could also do it via Azure Cloud Shell &lt;em&gt;with&lt;/em&gt; kubectl… but that’s another story 😉).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://siliconvalve.files.wordpress.com/2021/04/2021-03-29_13-43-12.png" rel="noopener noreferrer"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fsiliconvalve.files.wordpress.com%2F2021%2F04%2F2021-03-29_13-43-12.png%3Fw%3D932" title="How to add a Secret to Kubernetes in Azure Kubernetes Service in the Azure Portal." alt="How to add a Secret to Kubernetes in Azure Kubernetes Service in the Azure Portal."&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://siliconvalve.files.wordpress.com/2021/04/2021-03-29_14-49-54.png" rel="noopener noreferrer"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fsiliconvalve.files.wordpress.com%2F2021%2F04%2F2021-03-29_14-49-54.png%3Fw%3D988" title="Creating a new Kubernetes Secret using the Azure Portal." alt="Creating a new Kubernetes Secret using the Azure Portal."&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;OK, so we should now have everything we need in our Kubernetes setup so we can deploy and run our application!&lt;/p&gt;

&lt;h3&gt;
  
  
  Back to the Dock(er)yard
&lt;/h3&gt;

&lt;p&gt;Let’s return to our updated Visual Studio Solution and review &lt;a href="https://github.com/sjwaight/MVC-Music-Store/blob/post-containerisation/MvcMusicStore-Completed/MvcMusicStore/Dockerfile" rel="noopener noreferrer"&gt;the Dockerfile&lt;/a&gt; that was added by the Visual Studio Container tools.&lt;/p&gt;



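&lt;p&gt;The generated file is broadly like the following sketch (indicative only; check the repository for the exact contents):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Indicative of the Visual Studio generated Dockerfile
FROM mcr.microsoft.com/dotnet/framework/aspnet:4.8-windowsservercore-ltsc2019
ARG source
WORKDIR /inetpub/wwwroot
COPY ${source:-obj/Docker/publish} .
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;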

&lt;p&gt;This is a very succinct Dockerfile 😎. There is no definition in this file that tells the executing builder how to perform steps like NuGet restore or MSBuild compilation – these are all defined outside of this context, because the assumption is that Visual Studio will run them and that the resulting build output lives in “obj\Docker\publish”.&lt;/p&gt;

&lt;p&gt;The problem we have is that it works very well when you are building using Visual Studio, but not when you try to use this in a Continuous Integration scenario where Visual Studio may not be installed on the build host (this is a good thing BTW – not having VS on your build host).&lt;/p&gt;

&lt;p&gt;We can fix this, but it requires some manual work as we cannot get Visual Studio to generate a full file for us. The resulting Dockerfile sits in the &lt;a href="https://github.com/sjwaight/MVC-Music-Store/blob/post-containerisation/MvcMusicStore-Completed/CIDockerfile" rel="noopener noreferrer"&gt;sample GitHub repository&lt;/a&gt;.&lt;/p&gt;



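&lt;p&gt;In outline (an illustrative sketch rather than the repository file verbatim) the multi-stage version looks like this:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Build stage: full .NET Framework SDK image (includes NuGet and MSBuild)
FROM mcr.microsoft.com/dotnet/framework/sdk:4.8 AS build
WORKDIR /app
COPY . .
RUN nuget restore
RUN msbuild /p:Configuration=Release /p:DeployOnBuild=true /p:WebPublishMethod=FileSystem /p:publishUrl=C:/publish

# Runtime stage: smaller ASP.NET-only image - no SDK tooling
FROM mcr.microsoft.com/dotnet/framework/aspnet:4.8 AS runtime
WORKDIR /inetpub/wwwroot
COPY --from=build C:/publish .
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;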

&lt;p&gt;The above is a &lt;a href="https://docs.docker.com/develop/develop-images/multistage-build/" rel="noopener noreferrer"&gt;multi-stage Dockerfile&lt;/a&gt; and it’s worth exploring a bit. Our initial build and compilation takes place on the sdk:4.8 base image (tagged as “build”). This image contains the full .NET Framework SDK, which includes tooling such as MSBuild.&lt;/p&gt;

&lt;p&gt;Once the application has been successfully compiled we then use a new base image (aspnet:4.8 – tagged as “runtime”) to which our build output is copied. This new base image does not contain the full SDK. From both an image size and a security standpoint this is a win – we don’t have unnecessary files on our runtime host, which is great.&lt;/p&gt;

&lt;h3&gt;
  
  
  Storing our Container Images
&lt;/h3&gt;

&lt;p&gt;Once built we need a location to store our Container Images. In most cases these Images are held in private Container Registries, and for this post I am going to use &lt;a href="https://docs.microsoft.com/azure/container-registry/container-registry-intro" rel="noopener noreferrer"&gt;Azure Container Registry&lt;/a&gt; (ACR) as my storage location. You can choose to use any Docker-compatible Registry you like though as long as GitHub Actions can publish to it.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az acr create --name myContainerRegistry \
              --resource-group myResourceGroup \
              --sku Basic \
              --admin-enabled true \
              --location myAzureRegion
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;Once the Container Registry is provisioned I then need to attach it to my AKS cluster. This will enable AKS to pull images from the ACR instance without the need for pull secrets.&lt;br&gt;
&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az aks update \
   --resource-group myResourceGroup \
   --name myAKSCluster \
   --attach-acr $(az acr show --name myContainerRegistry --resource-group myResourceGroup --query "id" -o tsv)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;h3&gt;
  
  
  Building our Container Image with GitHub Actions
&lt;/h3&gt;

&lt;p&gt;Thankfully the task of building and pushing our Container is not that difficult, particularly if we select ‘windows-latest’ as our build host in GitHub Actions. Windows Server 2019 (‘windows-latest’ at the time of writing) contains all the necessary Docker binaries we need to build a Windows Container, so we don’t need to spend time specialising the host, which is great.&lt;/p&gt;

&lt;p&gt;You can find the actual GitHub Action definition that does exactly what we need in the ‘deploy-to-k8s’ branch of the &lt;a href="https://github.com/sjwaight/MVC-Music-Store/blob/deploy-to-k8s/.github/workflows/main.yml" rel="noopener noreferrer"&gt;repository on GitHub&lt;/a&gt;.&lt;/p&gt;



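&lt;p&gt;A sketch of the build-and-push job (illustrative only; the branch linked above holds the real definition):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;name: Build and push Windows Container image
on:
  push:
    branches: [ deploy-to-k8s ]

jobs:
  build-and-push-image:
    runs-on: windows-latest
    steps:
    - uses: actions/checkout@v2
    - name: Log in to Azure Container Registry
      run: docker login ${{ secrets.ACR_INSTANCE }}.azurecr.io -u ${{ secrets.ACR_USER }} -p ${{ secrets.ACR_SECRET }}
    - name: Build image
      run: docker build -f MvcMusicStore-Completed/CIDockerfile -t ${{ secrets.ACR_INSTANCE }}.azurecr.io/${{ secrets.IMAGE_NAME }}:${{ github.run_number }} MvcMusicStore-Completed
    - name: Push image
      run: docker push ${{ secrets.ACR_INSTANCE }}.azurecr.io/${{ secrets.IMAGE_NAME }}:${{ github.run_number }}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;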


&lt;p&gt;In order to get this Action functional we need to define a few GitHub Actions &lt;a href="https://docs.github.com/en/actions/reference/encrypted-secrets" rel="noopener noreferrer"&gt;Secrets&lt;/a&gt; which are only made available to the build agent when the Action executes. Secrets are a great way to hide information from others not authorised to access upstream services (such as our Container Registry), or from those who may troubleshoot failures by looking at logs (Secrets are not captured in logs).&lt;/p&gt;

&lt;p&gt;Our Secrets are as follows:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;ACR_USER:&lt;/strong&gt; Azure Container Registry user (Username on the Access Keys blade in the Azure Portal – typically the same as the Registry name).&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;ACR_INSTANCE:&lt;/strong&gt; Azure Container Registry name.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;ACR_SECRET:&lt;/strong&gt; Azure Container Registry Password or Password2 value.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;IMAGE_NAME:&lt;/strong&gt; Used as the Docker image name. Doesn’t necessarily have to be a secret, but might be useful to obfuscate the image name. This also ends up as the ‘repository name’ in Azure Container Registry.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;KUBECONFIG:&lt;/strong&gt; grab the contents of the ‘akscreds.txt’ file and paste it into this. We’ll use this later to deploy to Kubernetes.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://siliconvalve.files.wordpress.com/2021/04/2021-04-19_10-23-30.png" rel="noopener noreferrer"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fsiliconvalve.files.wordpress.com%2F2021%2F04%2F2021-04-19_10-23-30.png%3Fw%3D1024" title="Azure Container Registry Access Keys blade. This is where you copy most of the Secrets from." alt="Azure Container Registry Access Keys blade. This is where you copy most of the Secrets from."&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://siliconvalve.files.wordpress.com/2021/04/2021-04-20_21-06-45.png" rel="noopener noreferrer"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fsiliconvalve.files.wordpress.com%2F2021%2F04%2F2021-04-20_21-06-45.png%3Fw%3D1024" title="GitHub Repository Secrets" alt="GitHub Repository Secrets"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Deploying to Kubernetes
&lt;/h3&gt;

&lt;p&gt;The last piece of the puzzle is taking our freshly minted Container Image and deploying it to Kubernetes. We have a couple of ways to do this – either by defining some YAML and deploying the Image using &lt;a href="https://kubernetes.io/docs/reference/kubectl/overview/" rel="noopener noreferrer"&gt;kubectl&lt;/a&gt;, or we can look at using &lt;a href="https://helm.sh/" rel="noopener noreferrer"&gt;Helm&lt;/a&gt;. For this post I am going to use Helm. You’ll need to &lt;a href="https://github.com/helm/helm/releases" rel="noopener noreferrer"&gt;install the Windows release of Helm&lt;/a&gt; first before you can work with it on your local developer machine.&lt;/p&gt;

&lt;p&gt;Once Helm is installed, open the MVC Music Store project in Windows Explorer and in the root folder create a new sub-folder called ‘charts’. Open this folder at a command-line and issue the following command to create a Helm Chart scaffold.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;helm create mvcmusicstoreweb
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;A series of files and folders will be created and we only need to make a few minor changes to the scaffolded files to be able to use them.&lt;/p&gt;

&lt;p&gt;Edit the Chart.yaml file and update it as follows. Ideally we would auto-update some values in this file and the values.yaml file as part of the build, but for the purpose of this blog post we’ll go with static values. Key items of note below are the &lt;em&gt;name&lt;/em&gt;, &lt;em&gt;description&lt;/em&gt; and &lt;em&gt;appVersion&lt;/em&gt; attributes.&lt;/p&gt;



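&lt;p&gt;Something along these lines (illustrative values):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;apiVersion: v2
name: mvcmusicstoreweb
description: MVC Music Store ASP.NET sample in a Windows Container
type: application
version: 0.1.0       # chart version
appVersion: "1.0"    # application (image) version
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;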


&lt;p&gt;Then we need to make some modifications to the values.yaml file as well. Key entries to update or add here include the &lt;em&gt;nameOverride&lt;/em&gt;, &lt;em&gt;fullnameOverride&lt;/em&gt;, &lt;em&gt;requests&lt;/em&gt; (cpu and memory) and the &lt;em&gt;nodeSelector&lt;/em&gt;, which ensures the workload is scheduled onto a Windows Container host.&lt;/p&gt;



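&lt;p&gt;An illustrative set of overrides (adjust the image repository and resource requests to suit your environment):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;image:
  repository: mycontainerregistry.azurecr.io/mvcmusicstoreweb
  tag: latest
nameOverride: mvcmusicstore
fullnameOverride: mvcmusicstore
service:
  type: LoadBalancer
  port: 80
resources:
  requests:
    cpu: 500m
    memory: 800Mi
# Ensure Pods are scheduled onto the Windows node pool
nodeSelector:
  "kubernetes.io/os": windows
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;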

&lt;p&gt;The final piece of the puzzle is to make sure that our environment variable containing our database connection string (MusicStoreEntities) is populated from a Kubernetes Secret.&lt;/p&gt;

&lt;p&gt;In order to make this happen we need to edit the templates\deployment.yaml Helm file and add an env section that tells Kubernetes to create an environment variable and pull its value from the named Secret (dbconnection) we created earlier in this post.&lt;/p&gt;



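&lt;p&gt;The addition is a small env block similar to this, matching the Secret name and key used earlier:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;env:
  - name: MusicStoreEntities
    valueFrom:
      secretKeyRef:
        name: dbconnection
        key: connstring
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;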

&lt;p&gt;We also need to build and publish the Helm Chart somewhere. For my purposes I am going to use Azure Container Registry’s inbuilt Helm Chart support for storing built Charts, and then add a stage to my GitHub Action that builds the Chart for me.&lt;/p&gt;

&lt;p&gt;You can find the ‘build-and-push-helm-chart’ step in the &lt;a href="https://github.com/sjwaight/MVC-Music-Store/blob/deploy-to-k8s/.github/workflows/main.yml" rel="noopener noreferrer"&gt;Action on GitHub&lt;/a&gt;. Note that we don’t need to install Helm on the build host as the Windows 2019 hosts used by GitHub Actions already have it deployed.&lt;/p&gt;



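&lt;p&gt;In outline, the step packages the Chart and pushes it to ACR, along these lines (a sketch that assumes the classic ‘az acr helm’ repository support available at the time and an already authenticated az session):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;- name: build-and-push-helm-chart
  run: |
    helm package charts/mvcmusicstoreweb
    az acr helm push --name ${{ secrets.ACR_INSTANCE }} mvcmusicstoreweb-0.1.0.tgz
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;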

&lt;p&gt;Now if we submit this GitHub Action we should see a new Windows Container image built, published to Azure Container Registry, followed by a new Helm Chart being created and published, also to Azure Container Registry. Finally, the Helm Chart is then used to tell Kubernetes to either deploy or update the Image running on the cluster. You can &lt;a href="https://github.com/sjwaight/MVC-Music-Store/actions/runs/766197879" rel="noopener noreferrer"&gt;view this successful run on GitHub&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://siliconvalve.files.wordpress.com/2021/04/2021-04-20_20-37-05.png" rel="noopener noreferrer"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fsiliconvalve.files.wordpress.com%2F2021%2F04%2F2021-04-20_20-37-05.png%3Fw%3D1024" title="Screenshot of GitHub Action showing successful stages." alt="Screenshot of GitHub Action showing successful stages."&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can see the Image build takes 11 minutes. That’s not a trivial amount of time, but I am using the free hosted runners in GitHub Actions, so it’s likely you could speed this up. Having said this, you may not want to run this process on every check-in and may wish to use it only for PR merges into a deployment branch (for example!)&lt;/p&gt;

&lt;p&gt;Once these steps are completed we should find that we have a new Service in AKS called ‘mvcmusicstore’ and that it has a &lt;em&gt;LoadBalancer&lt;/em&gt; type with an External IP address.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://siliconvalve.files.wordpress.com/2021/04/2021-04-20_19-40-12.png" rel="noopener noreferrer"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fsiliconvalve.files.wordpress.com%2F2021%2F04%2F2021-04-20_19-40-12.png%3Fw%3D1024" title="Services and ingresses view in Azure Kubernetes Service blade in the Azure Portal. Our new Service is at the bottom." alt="Services and ingresses view in Azure Kubernetes Service blade in the Azure Portal. Our new Service is at the bottom."&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If we click on that IP address we get…&lt;/p&gt;

&lt;p&gt;&lt;a href="https://siliconvalve.files.wordpress.com/2021/04/2021-04-20_19-37-32.png" rel="noopener noreferrer"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fsiliconvalve.files.wordpress.com%2F2021%2F04%2F2021-04-20_19-37-32.png%3Fw%3D1024" title="MVC Music Store sample application running on Kubernetes!" alt="MVC Music Store sample application running on Kubernetes!"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;… the sweet taste of success!!&lt;/p&gt;

&lt;p&gt;Whew! 🤓&lt;/p&gt;

&lt;p&gt;So we made it through – taking an existing ASP.NET Web Application and moving it into Windows Containers and then showing how we can deploy it to Kubernetes, while ensuring it can still connect to its database.&lt;/p&gt;

&lt;p&gt;While this has been a fairly simple application to move, a lot of concepts are the same, even for complex applications. Most work will likely go into bringing the .NET release up to a supported version (3.5+), along with making sure any third party libraries work as expected.&lt;/p&gt;

&lt;p&gt;I hope you’ve learned about modernisation of .NET applications using Windows Containers in this series of posts, and until next time…&lt;/p&gt;

&lt;p&gt;Happy Days! 😎&lt;/p&gt;

</description>
      <category>dotnet</category>
      <category>azure</category>
      <category>docker</category>
      <category>devops</category>
    </item>
    <item>
      <title>Windows Containers and .NET Framework applications: Migration</title>
      <dc:creator>Simon Waight</dc:creator>
      <pubDate>Mon, 22 Mar 2021 22:30:00 +0000</pubDate>
      <link>https://forem.com/simonwaight/windows-containers-and-net-framework-applications-migration-2pj</link>
      <guid>https://forem.com/simonwaight/windows-containers-and-net-framework-applications-migration-2pj</guid>
      <description>&lt;p&gt;We previously looked at the &lt;a href="https://dev.to/simonwaight/windows-containers-and-net-framework-applications-the-basics-2pdi"&gt;basics of what is involved in bringing .NET Framework applications to Windows Containers&lt;/a&gt;. In this second post we are going to go a little deeper and look at migrating an application.&lt;/p&gt;

&lt;p&gt;We already know that we have some discrete requirements around the types of applications that can be migrated, so for this post I am going to use one of the original ASP.NET MVC sample applications – MVC Music Store – the source code for which we can &lt;a href="https://github.com/sjwaight/MVC-Music-Store"&gt;find on GitHub&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;As previously discussed, anything earlier than .NET Framework 3.5 will require an upgrade to at least that release. Depending on the complexity of your solution this may or may not be a trivial task. If you’ve come here looking for a magic bullet for that… unfortunately I don’t have one!&lt;/p&gt;

&lt;p&gt;While MVC Music Store is a good sample application for this post, it’s important to note that it is already running on .NET Framework 4.0 so there is not much work for us to do here, but I do want to bring it up to .NET Framework 4.7.2 – more on why later!&lt;/p&gt;

&lt;h3&gt;
  
  
  Your Toolkit
&lt;/h3&gt;

&lt;p&gt;In order to have a go at this yourself you will need the following:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A GitHub Account&lt;/li&gt;
&lt;li&gt;Hyper-V / virtualisation enabled in Windows 8 or 10 Pro&lt;/li&gt;
&lt;li&gt;Minimum of 10 GB free disk space for Windows Container / ASP.NET base images&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.docker.com/products/docker-desktop"&gt;Docker Desktop for Windows&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Visual Studio – ideally 2017 (v15.7+) or 2019 (16.x). You can use the &lt;a href="https://visualstudio.microsoft.com/vs/community/"&gt;Community Edition&lt;/a&gt; if this is for non-commercial or open source contribution purposes. We want these versions as they include Docker tooling that we can use to build and debug the application&lt;/li&gt;
&lt;li&gt;.NET Framework 3.5.x or 4.x installed. Your target SDK must be installed to upgrade&lt;/li&gt;
&lt;li&gt;[Optional] Azure Subscription. For my purpose I will use &lt;a href="https://docs.microsoft.com/azure/azure-sql/database/"&gt;Azure SQL Database&lt;/a&gt; as my database storage engine and in a later post I will use Azure Container Registry and Azure Kubernetes Service to host and run my containerised app.&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  Why not Visual Studio Code?
&lt;/h4&gt;

&lt;p&gt;We are using the full Visual Studio application as opposed to Visual Studio Code because we are porting a .NET Framework application that likely relies on the Visual Studio Solution format. This format holds the metadata about how the multiple projects in the Solution relate to one another, along with build configurations. Visual Studio Code doesn’t natively support Solution files, and while you can still work with the projects it would likely mean lots of lost time tying together the individual projects and building and releasing them.&lt;/p&gt;

&lt;h3&gt;
  
  
  Upgrade our application
&lt;/h3&gt;

&lt;p&gt;Start by forking the MVC Music Store repository you can &lt;a href="https://github.com/sjwaight/MVC-Music-Store"&gt;find on GitHub&lt;/a&gt;, followed by cloning it to a machine with your developer tools on. You can do this from the command line, or use the in-built Git support in Visual Studio.&lt;/p&gt;

&lt;p&gt;Once cloned, open the solution in Visual Studio (for my purposes I’m using VS 2019 Community Edition). You will be prompted to migrate the project if you are using a newer release of Visual Studio than the solution was created with.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--39FM6rFW--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2021/03/2021-03-08_14-58-46.png%3Fw%3D1100" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--39FM6rFW--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2021/03/2021-03-08_14-58-46.png%3Fw%3D1100" alt="MVC Music Store - needs migration"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can see the project isn’t loaded at this point, so right-click on it and select ‘Load Project’ from the context menu. When you do this you will be presented with one or more dialogs that detail what changes are necessary to migrate the project to the version of Visual Studio you are running.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--CsO5S2-t--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2021/03/2021-03-08_15-01-22.png%3Fw%3D1100" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--CsO5S2-t--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2021/03/2021-03-08_15-01-22.png%3Fw%3D1100" alt="Migration dialog"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Interestingly, despite our migration we can still open this project all the way back to Visual Studio 2010!&lt;/p&gt;

&lt;p&gt;Follow the prompts and migrate the projects using the in-built Visual Studio tooling. Depending on your solution this option may not be available. I realise the first sentence glosses over a lot of potential complexity, but in the case of our sample application, not much is required. It’s likely with older or more complex solutions that this is where you will start your journey of upgrading / replacing components of your solution so it runs on a more modern release of the .NET Framework.&lt;/p&gt;

&lt;p&gt;In the case of our sample MVC Music Store application the tooling advises that some features of ASP.NET MVC 3 may not work as expected in the IDE, and sure enough, if we open a Razor view we can see that there are a lot of errors reported, though the solution builds and runs just fine.&lt;/p&gt;

&lt;p&gt;Now let’s switch the .NET Framework version we’re running on.&lt;/p&gt;

&lt;p&gt;Right-click the project you want to change and choose Properties at the very bottom of the context menu. On the project properties Application tab, update the project to the .NET Framework version you want (note: you must have it installed in order to be able to select it). Once again, if you are making a big step up in Framework versions you may either receive lots of warnings or errors at this point, or you may be unable to change. If you have multiple projects you will need to bring them all up to the same level as well.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--l3SA2jOR--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2021/03/2021-03-19_10-04-52.png%3Fw%3D1100" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--l3SA2jOR--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2021/03/2021-03-19_10-04-52.png%3Fw%3D1100" alt="Change .NET Framework version"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once this update has been done (and you’ve fixed any issues arising), let’s restore the NuGet packages required to run the application. Open the Package Manager Console by selecting the Tools menu &amp;gt; NuGet Package Manager &amp;gt; Package Manager Console and run the three commands captured in the sample repository.&lt;/p&gt;




&lt;p&gt;At this point we should have a functional codebase that compiles and runs. The final two pieces we need before we containerise the application are updating the app so it doesn’t use SQL CE, and ensuring session state is not “InProc”.&lt;/p&gt;

&lt;p&gt;As I mentioned earlier in the post, I have decided to use Azure SQL Database for my scenario, but you can use any SQL Server you like that isn’t bundled with the app the way SQL CE is. The Azure CLI commands below create a new &lt;a href="https://docs.microsoft.com/azure/azure-sql/database/serverless-tier-overview"&gt;Serverless Azure SQL Database&lt;/a&gt; instance and obtain the connection string. Replace the placeholders with appropriate arguments.&lt;/p&gt;



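&lt;p&gt;In outline the commands look like this (illustrative names and credentials):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az sql server create \
    --name my-sql-server \
    --resource-group MyResourceGroup \
    --location myAzureRegion \
    --admin-user myadminuser \
    --admin-password Str0ngP4ssword!

az sql db create \
    --resource-group MyResourceGroup \
    --server my-sql-server \
    --name MusicStore \
    --edition GeneralPurpose \
    --compute-model Serverless \
    --family Gen5 \
    --capacity 1 \
    --auto-pause-delay 60

# Retrieve an ADO.NET connection string template
az sql db show-connection-string \
    --client ado.net \
    --server my-sql-server \
    --name MusicStore
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;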

&lt;p&gt;The MVC Music Store sample will automatically populate the database on first run, so we don’t need to pre-load any data, but we do need to modify the web.config file so that the &lt;em&gt;MusicStoreEntities&lt;/em&gt; connection string points at the right Azure SQL Database.&lt;/p&gt;

&lt;p&gt;While you are editing the web.config, also find all instances of ‘&lt;em&gt;DefaultConnection&lt;/em&gt;’ that were inserted when you installed the Microsoft.AspNet.Providers packages and set them to &lt;em&gt;MusicStoreEntities&lt;/em&gt; so that all the providers use the same database. In a production environment we’d likely split all our provider datastores out from our application data, but for our sample application this will suffice.&lt;/p&gt;

&lt;p&gt;For the session state we should have all the necessary bits present in the web.config after installing the Providers NuGet package. All you need to do to use SQL Server for session state is update the sessionState section of the web.config and point it at our SQL Database by using our Azure SQL DB connection string name for the DefaultSessionProvider.&lt;/p&gt;



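&lt;p&gt;The result is similar to the following sketch (the provider type comes from the Microsoft.AspNet.Providers package):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;sessionState mode="Custom" customProvider="DefaultSessionProvider"&amp;gt;
  &amp;lt;providers&amp;gt;
    &amp;lt;add name="DefaultSessionProvider"
         type="System.Web.Providers.DefaultSessionStateProvider, System.Web.Providers"
         connectionStringName="MusicStoreEntities" /&amp;gt;
  &amp;lt;/providers&amp;gt;
&amp;lt;/sessionState&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;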

&lt;p&gt;Now you should be able to run the application in Visual Studio and after a few moments the website will load and the database will be populated. Once you’ve had a click around you can stop debugging.&lt;/p&gt;

&lt;p&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--tZRKq_X5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2021/03/2021-03-19_11-27-57.png%3Fw%3D1100" alt="MVC Music Store Sample loaded"&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Note:&lt;/strong&gt; if you want to use the ‘Admin’ section of the site you will need to manually create a new ‘Administrator’ role and then map it to a user you create via the website registration link. You can perform all these steps using the open source &lt;a href="https://docs.microsoft.com/sql/azure-data-studio/download-azure-data-studio"&gt;Azure Data Studio&lt;/a&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  Containerising our application
&lt;/h3&gt;

&lt;p&gt;Now the fun begins! 😁&lt;/p&gt;

&lt;p&gt;In Visual Studio right-click on the project file and select Add &amp;gt; Docker Support…&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--rfNrU55H--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2021/03/2021-03-08_15-24-53.png%3Fw%3D1100" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--rfNrU55H--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2021/03/2021-03-08_15-24-53.png%3Fw%3D1100" alt="Add Docker support in Visual Studio"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After a few minutes processing you will find that a &lt;a href="https://docs.docker.com/engine/reference/builder/"&gt;Dockerfile&lt;/a&gt; has been added to the project along with a .dockerignore file which controls what content is copied during a container build. You will also find that a new debug option appears in Visual Studio – you can debug with “Docker” which will build the application as a container image and deploy the image to Docker Desktop and then attach the debugger.&lt;/p&gt;

&lt;p&gt;If you have been previously running Linux containers using Windows Subsystem for Linux (WSL) you will find when you first run your application in a container that the Visual Studio container tools will switch Docker to use Windows Containers instead.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--y2fZi7Ch--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2021/03/2021-03-08_15-25-31.png%3Fw%3D1100" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--y2fZi7Ch--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2021/03/2021-03-08_15-25-31.png%3Fw%3D1100" alt="Switching Docker to Windows Containers"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;As you are also unlikely to have the base container image (the default currently for ASP.NET apps is ‘4.8-windowsservercore-ltsc2019’), this image will also be downloaded and unpacked, which can take a while (and which is the reason you need free disk space!)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--thH2oU1S--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2021/03/2021-03-08_15-25-58.png%3Fw%3D1100" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--thH2oU1S--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2021/03/2021-03-08_15-25-58.png%3Fw%3D1100" alt="Download ASP.NET base container image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;At this point, once the base image has downloaded, everything should fire up just as it did previously. I think we can call that “step 1 complete” 😁&lt;/p&gt;

&lt;h3&gt;
  
  
  Making an idempotent container image
&lt;/h3&gt;

&lt;p&gt;Up until this point we’ve been focused on updating our application codebase to work with a more recent release of the .NET Framework, along with getting all the tooling and process in place so that inner-loop development is rapid and not that different from a non-containerised app.&lt;/p&gt;

&lt;p&gt;There is one item we still need to deal with though. This might be something you don’t care about, but I suspect you will.&lt;/p&gt;

&lt;p&gt;Here’s how we currently manage how we connect to our database.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
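&lt;p&gt;A sketch of the relevant web.config section (the server, database and credential values below are placeholders):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;connectionStrings&amp;gt;
  &amp;lt;!-- a hard-coded connection string – the problem we need to solve --&amp;gt;
  &amp;lt;add name="MusicStoreEntities"
       connectionString="Server=tcp:yourserver.database.windows.net,1433;Initial Catalog=MvcMusicStore;User ID=youruser;Password=yourpassword;Encrypt=True;"
       providerName="System.Data.SqlClient" /&amp;gt;
&amp;lt;/connectionStrings&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;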


&lt;p&gt;In a traditional ASP.NET build and deployment environment we would likely use &lt;a href="https://docs.microsoft.com/aspnet/core/host-and-deploy/iis/transform-webconfig"&gt;web.config transforms&lt;/a&gt; for each environment we want to deploy to and allow tooling like &lt;a href="https://docs.microsoft.com/iis/install/installing-publishing-technologies/installing-and-configuring-web-deploy-on-iis-80-or-later"&gt;Web Deploy&lt;/a&gt; to manage this setting for us at deployment time. We could continue to use this approach, but it would mean a different container image per environment with the only difference being the configuration file. Not ideal with such large base image sizes and potentially a big overhead for large environments. This also isn’t a common way of managing containers.&lt;/p&gt;

&lt;p&gt;One reason you want to aim for .NET Framework 4.7.1+ is that Microsoft introduced new capabilities in this release that provide a way for us to easily override certain web.config sections and use a more container-centric way of managing per-environment configuration. If you are unable to move up to this Framework release you still have options: you could develop your own solution if you have enough applications to warrant it, though I don’t see why that yak shaving should be necessary.&lt;/p&gt;

&lt;p&gt;For our scenario we are going to use environment variables to hold per-environment configuration and use the library Microsoft has provided. If this doesn’t work for you, Microsoft PM Anthony Chu has a good approach &lt;a href="https://anthonychu.ca/post/aspnet-web-config-transforms-windows-containers-revisited/"&gt;on his blog&lt;/a&gt; for solving this in other ways. If you do a lot of container or non-.NET development work then using environment variables for runtime configuration will be very familiar to you.&lt;/p&gt;

&lt;p&gt;To support this capability in my MVC sample you need to install the NuGet package that contains the logic for this new configuration source.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Install-Package Microsoft.Configuration.ConfigurationBuilders.Environment -Version 2.0.0
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;You can read about this and other configuration sources &lt;a href="https://github.com/aspnet/MicrosoftConfigurationBuilders"&gt;on the GitHub repository&lt;/a&gt; hosting the code.&lt;/p&gt;

&lt;p&gt;Once installed you then need to do a few things on your development machine:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Create a Windows environment variable called “MusicStoreEntities” (this matches the web.config key) and set its value to the connection string – see the sketch after this list.&lt;/li&gt;
&lt;li&gt;Add an attribute to the connectionStrings section that sets it to be read from environment variables.&lt;/li&gt;
&lt;li&gt;Remove the connection string value from the config file so it just reads “ignored” (this avoids confusion later).&lt;/li&gt;
&lt;li&gt;Restart Visual Studio to force it to re-read environment variables.&lt;/li&gt;
&lt;/ul&gt;
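
&lt;p&gt;For the first step, a quick sketch using the standard setx command from a command prompt (the connection string value is a placeholder). setx writes to the user environment, which is why Visual Studio needs a restart to pick it up:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;setx MusicStoreEntities "Server=tcp:yourserver.database.windows.net,1433;Initial Catalog=MvcMusicStore;User ID=youruser;Password=yourpassword;Encrypt=True;"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;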

&lt;p&gt;The result is that our configuration section looks like the below.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
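&lt;p&gt;A sketch of the result (the configBuilders registration is added by the NuGet package; the section attribute and “ignored” value match the steps above):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;configBuilders&amp;gt;
  &amp;lt;builders&amp;gt;
    &amp;lt;add name="Environment"
         type="Microsoft.Configuration.ConfigurationBuilders.EnvironmentConfigBuilder, Microsoft.Configuration.ConfigurationBuilders.Environment" /&amp;gt;
  &amp;lt;/builders&amp;gt;
&amp;lt;/configBuilders&amp;gt;

&amp;lt;!-- the Environment builder replaces the value with the matching environment variable --&amp;gt;
&amp;lt;connectionStrings configBuilders="Environment"&amp;gt;
  &amp;lt;add name="MusicStoreEntities" connectionString="ignored" providerName="System.Data.SqlClient" /&amp;gt;
&amp;lt;/connectionStrings&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;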



&lt;p&gt;At this stage we can run the application in Docker again and get…&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--xzkUn7IE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2021/03/2021-03-08_17-40-43.png%3Fw%3D1100" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--xzkUn7IE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2021/03/2021-03-08_17-40-43.png%3Fw%3D1100" alt="ASP.NET Application Error"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Oh.&lt;/p&gt;

&lt;p&gt;Actually, we should expect this. We set the environment variable in Windows (the container host) but the value is not cascaded to the running container, so our running container has no idea how to connect to the database!&lt;/p&gt;

&lt;p&gt;How to fix? The standard containerised-app way is to pass the environment variable to the docker command when running the container, but this gets tricky if we want to debug from within Visual Studio using the standard “hit F5” experience.&lt;/p&gt;
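
&lt;p&gt;From the command line that looks something like the below (the image name and port mapping are placeholders):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker run -e "MusicStoreEntities=&amp;lt;your connection string&amp;gt;" -p 8080:80 mvcmusicstore:dev
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;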

&lt;p&gt;The solution we’ll use &lt;a href="https://stackoverflow.com/questions/52370812/passing-environment-variables-to-docker-container-when-running-in-visual-studio"&gt;comes from Stack Overflow&lt;/a&gt; and requires you to modify the csproj file manually and add a pre-build event.&lt;/p&gt;

&lt;p&gt;In the project file (after unloading it), find the Debug configuration (we don’t want to use this for Release builds… we’ll deal with those in other ways) and add the following XML snippet.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;DockerfileRunEnvironmentFiles&amp;gt;debug_config.env&amp;lt;/DockerfileRunEnvironmentFiles&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Once the XML snippet is added, close the project file and reload the project, then edit the pre-build event to populate the contents of this file, using the standard Windows echo command to write the environment variable to the file we referenced above (it’s OK if the file doesn’t exist – it will be created).&lt;/p&gt;
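
&lt;p&gt;The pre-build command ends up looking something like this (assuming debug_config.env sits in the project directory, matching the csproj entry above):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;rem no space before the redirect, so the value has no trailing space
echo MusicStoreEntities=%MusicStoreEntities%&amp;gt; "$(ProjectDir)debug_config.env"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;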

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--L53BFy4A--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2021/03/2021-03-19_15-52-22.png%3Fw%3D1100" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--L53BFy4A--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://siliconvalve.files.wordpress.com/2021/03/2021-03-19_15-52-22.png%3Fw%3D1100" alt="Pre-build settings Visual Studio"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now when you debug the solution, the pre-build step will create the file containing the environment variable, which is passed to the container at startup, and your database will be accessible to the application running there.&lt;/p&gt;

&lt;h3&gt;
  
  
  Next steps
&lt;/h3&gt;

&lt;p&gt;I could keep on going with this post, but I think this makes a logical place to pause.&lt;/p&gt;

&lt;p&gt;For my next blog I will cover how we can centralise our container image build and how we can define a deployment pipeline so the container image can be deployed to Azure Kubernetes Service (AKS).&lt;/p&gt;

&lt;p&gt;In the meantime, if you’d like to see how the MVC Music Store solution looks after the work we’ve done in this post, you can check out that version on the &lt;a href="https://github.com/sjwaight/MVC-Music-Store/blob/post-containerisation/"&gt;‘post-containerisation’ branch on GitHub&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Until next time! 😎&lt;/p&gt;

</description>
      <category>dotnet</category>
      <category>azure</category>
      <category>docker</category>
      <category>architecture</category>
    </item>
    <item>
      <title>Windows Containers and .NET Framework applications: The Basics</title>
      <dc:creator>Simon Waight</dc:creator>
      <pubDate>Mon, 04 Jan 2021 22:21:15 +0000</pubDate>
      <link>https://forem.com/simonwaight/windows-containers-and-net-framework-applications-the-basics-2pdi</link>
      <guid>https://forem.com/simonwaight/windows-containers-and-net-framework-applications-the-basics-2pdi</guid>
      <description>&lt;p&gt;In this multi-post series I am going to look at what is required to take existing .NET Framework applications and bring them to Windows Containers.&lt;/p&gt;

&lt;p&gt;Rather than just dive into the mechanics of the process, first I’d like to take a look at why you might want to move to Windows Containers and what you should be aware of before you start this journey.&lt;/p&gt;

&lt;p&gt;Let’s dig in!&lt;/p&gt;

&lt;h3&gt;
  
  
  Why should I containerise my .NET Framework app?
&lt;/h3&gt;

&lt;p&gt;This is honestly the place you need to start any conversation. A lot of our industry is driven by trends, and the current one is to containerise everything and then run it on Kubernetes!&lt;/p&gt;

&lt;p&gt;My advice is to not blindly decide this is the path for your current applications simply because you read a lot about it and see it at all the conferences you attend!&lt;/p&gt;

&lt;p&gt;From my point of view I think the following reasons are why you might consider containerisation:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Clear fit for purpose. If you have modern .NET Framework 4 applications that fit nicely into the microservices paradigm then it probably makes sense to containerise them and run them on an orchestrator like Kubernetes.&lt;/li&gt;
&lt;li&gt;Consistent deployment and management methods for all platforms. If you are already heavily invested in containerisation for other stacks in your business or team, then why not for your Windows and .NET Framework apps too? Now you have one way to package, deploy, rollback and manage your apps.&lt;/li&gt;
&lt;li&gt;Pushing ops left in your SDLC. I’m amazed that people still remote into production servers and change stuff on the fly. Containers make this a more difficult proposition and force developers to build solutions with immutability in production in mind. This is probably the biggest hidden benefit in adopting Windows Containers though many developers may not agree! 😉 &lt;/li&gt;
&lt;li&gt;Better resource utilisation and separation. Containers have less overhead than a full VM and represent much better &lt;a href="https://docs.microsoft.com/dotnet/framework/app-domains/application-domains"&gt;App Domain&lt;/a&gt; separation than running multiple .NET applications on the same Windows VM. As Windows Containers only contain your application and &lt;a href="https://docs.microsoft.com/windows-hardware/drivers/gettingstarted/user-mode-and-kernel-mode"&gt;user mode&lt;/a&gt; portions of Windows you have a smaller overall footprint versus a traditional VM but benefit from a clearer separation between applications that may run on the same Container host.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Can I bring any .NET Framework app to Windows Containers?
&lt;/h3&gt;

&lt;p&gt;TL;DR – the answer is no. Read on for more detail…&lt;/p&gt;

&lt;h4&gt;
  
  
  Windows Desktop applications
&lt;/h4&gt;

&lt;p&gt;The first use case that isn’t supported is Windows desktop applications. Some people might look at a Container as a way to provide remote desktop services to end users, but this scenario isn’t supported and from what I’ve read is unlikely to be supported.&lt;/p&gt;

&lt;p&gt;If this is a requirement in your environment then you should look at Microsoft’s &lt;a href="https://docs.microsoft.com/microsoft-desktop-optimization-pack/appv-v5/getting-started-with-app-v-51"&gt;App-V&lt;/a&gt; (Application Virtualisation), which can be used to serve desktop applications to users on demand such that the applications aren’t installed on the user’s desktop. Alternatively, a hosted remote desktop service like Azure’s &lt;a href="https://azure.microsoft.com/services/virtual-desktop/"&gt;Windows Virtual Desktop&lt;/a&gt; might also suffice.&lt;/p&gt;

&lt;h4&gt;
  
  
  .NET 1.x and 2.x applications
&lt;/h4&gt;

&lt;p&gt;The minimum requirement you have to meet for .NET on Windows Containers is .NET Framework 3.5. This means if you have codebases (or pre-compiled apps) that rely on either the 1.x or 2.x .NET Framework then you’re out of luck.&lt;/p&gt;

&lt;p&gt;Given .NET Framework 3.5 first &lt;a href="https://docs.microsoft.com/dotnet/framework/migration-guide/versions-and-dependencies#net-framework-35"&gt;shipped around 2008&lt;/a&gt; that means you’ve had at least a decade to update your application to it. If in that decade you haven’t, then unfortunately Windows Containers are not a silver bullet for you!&lt;/p&gt;

&lt;p&gt;If you’ve come here to figure out how to move forward from .NET 1.1, then I’d recommend reading the official Microsoft documentation on &lt;a href="https://docs.microsoft.com/dotnet/framework/migration-guide/migrating-from-the-net-framework-1-1"&gt;migrating from .NET 1.1 to 4.x&lt;/a&gt;.&lt;/p&gt;

&lt;h4&gt;
  
  
  Dependencies requiring manual installation
&lt;/h4&gt;

&lt;p&gt;If you have dependencies such as third party libraries or software that require you to manually intervene in their installation then you’re going to need to find alternatives.&lt;/p&gt;

&lt;p&gt;While it is possible to install software in Windows Containers as part of your Dockerfile, the installation must be totally command-line driven. If you’re unable to execute the installer and provide it with configuration via the command line then it’s a ‘no go’.&lt;/p&gt;

&lt;p&gt;Here’s a sample showing how to install an MSI using PowerShell. This is taken from a &lt;a href="https://github.com/StefanScherer/dockerfiles-windows/blob/bf0eb95daebd4b4e0c12d98b633e7b6789c299ec/iisnode/Dockerfile"&gt;Dockerfile to install iisnode&lt;/a&gt;, which enables integrated Node.js support on Windows’ in-built web server, IIS.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight powershell"&gt;&lt;code&gt;&lt;span class="n"&gt;RUN&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;Write-Host&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s1"&gt;'Downloading iisnode'&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="n"&gt;\&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nv"&gt;$MsiFile&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$&lt;/span&gt;&lt;span class="nn"&gt;env&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="nv"&gt;Temp&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;+&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s1"&gt;'\iisnode.msi'&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="n"&gt;\&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;New-Object&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;Net.WebClient&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;DownloadFile&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'https://github.com/tjanczuk/iisnode/releases/download/v0.2.21/iisnode-full-v0.2.21-x64.msi'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$MsiFile&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="n"&gt;\&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nx"&gt;Write-Host&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s1"&gt;'Installing iisnode'&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="n"&gt;\&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nx"&gt;Start-Process&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;msiexec.exe&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-ArgumentList&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s1"&gt;'/i'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$MsiFile&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s1"&gt;'/quiet'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s1"&gt;'/norestart'&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-NoNewWindow&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-Wait&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Picking your base image
&lt;/h3&gt;

&lt;p&gt;Now that you’ve finally decided that you want to move to Windows Containers you need to figure out which base image you’ll use.&lt;/p&gt;

&lt;p&gt;Microsoft has introduced a new way in which it offers updated Windows Server releases, so it’s important to understand this first.&lt;/p&gt;

&lt;p&gt;Most of us would be familiar with Windows Server 2016 and Windows Server 2019. These two releases are part of the &lt;em&gt;Long-Term Servicing Channel (LTSC)&lt;/em&gt; for Windows, which follows the fairly traditional ~3-year cycle of major releases with regular updates and fixes.&lt;/p&gt;

&lt;p&gt;The newer concept is the &lt;em&gt;Semi-Annual Channel (SAC)&lt;/em&gt; which is updated with new capabilities much more regularly than the LTS releases. These releases are referred to by their release number rather than a year. So, for example, “Windows Server 20H2” or “Windows Server 1903”.&lt;/p&gt;

&lt;p&gt;You can read more about the differences between LTSC and SAC on &lt;a href="https://docs.microsoft.com/windows-server/get-started-19/servicing-channels-19"&gt;the official Microsoft Docs site&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Which foundation image you choose is up to you, but for my money I’d be more inclined to use the Semi-Annual Channel releases: your container images probably shouldn’t be long-lived anyway, and as you need to rebuild from a base image on a regular basis it’s best if you’ve got the latest bits, especially if they include Windows Container updates!&lt;/p&gt;

&lt;p&gt;But wait, there’s more!&lt;/p&gt;

&lt;p&gt;In addition to choosing the Windows Server release you will also need to determine which Windows Server image type you want: Windows Server Core, Windows Nano Server, Windows, or Windows IoT Core.&lt;/p&gt;

&lt;p&gt;I know this seems like a lot to grok, but the reason you have these choices is really about reducing the size of your base Image.&lt;/p&gt;

&lt;p&gt;Nano Server offers the smallest size, but really only supports .NET Core applications. Most .NET Framework applications can run on Windows Server Core, and you only need to consider the full Windows Server Image if you have dependencies on Windows capabilities like GDI+.&lt;/p&gt;

&lt;p&gt;There are also some additional options for base images that come with either the .NET Framework or ASP.NET pre-loaded.&lt;/p&gt;

&lt;p&gt;You can dive into the detail, and how to select the right Image &lt;a href="https://docs.microsoft.com/virtualization/windowscontainers/manage-containers/container-base-images#choosing-a-base-image"&gt;on Microsoft Docs&lt;/a&gt;.&lt;/p&gt;
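
&lt;p&gt;As a sketch, a Dockerfile for a .NET Framework web app using the ASP.NET pre-loaded image on Server Core might start like this (the tag and paths are illustrative):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# ASP.NET 4.8 pre-loaded on Windows Server Core LTSC 2019
FROM mcr.microsoft.com/dotnet/framework/aspnet:4.8-windowsservercore-ltsc2019
# copy the published site into the default IIS site root
COPY ./PublishOutput/ /inetpub/wwwroot
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;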

&lt;h3&gt;
  
  
  What’s next?
&lt;/h3&gt;

&lt;p&gt;In my next post I am going to take an existing .NET Framework application and work through how we get it up and running on Windows Containers. Exciting!!&lt;/p&gt;

&lt;p&gt;Until then… 😎&lt;/p&gt;

</description>
      <category>dotnet</category>
      <category>azure</category>
      <category>docker</category>
      <category>architecture</category>
    </item>
  </channel>
</rss>
