<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Goodness Ojonuba</title>
    <description>The latest articles on Forem by Goodness Ojonuba (@goodnessoj).</description>
    <link>https://forem.com/goodnessoj</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F884197%2F79e02aae-50b8-44a9-9732-6d346554d02c.jpg</url>
      <title>Forem: Goodness Ojonuba</title>
      <link>https://forem.com/goodnessoj</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/goodnessoj"/>
    <language>en</language>
    <item>
      <title>Agentic DevOps: Letting AI Subagents Audit Terraform Infrastructure</title>
      <dc:creator>Goodness Ojonuba</dc:creator>
      <pubDate>Fri, 13 Mar 2026 23:58:46 +0000</pubDate>
      <link>https://forem.com/goodnessoj/agentic-devops-letting-ai-subagents-audit-terraform-infrastructure-1iin</link>
      <guid>https://forem.com/goodnessoj/agentic-devops-letting-ai-subagents-audit-terraform-infrastructure-1iin</guid>
      <description>&lt;h2&gt;
  
  
  Agentic DevOps: Letting AI Subagents Audit Terraform Infrastructure
&lt;/h2&gt;

&lt;p&gt;What if your DevOps workflow included AI workers that could review your infrastructure the same way a teammate would?&lt;/p&gt;

&lt;p&gt;As part of my learning journey with &lt;strong&gt;Agentic AI&lt;/strong&gt;, I’ve been exploring how modern AI systems can move beyond simple prompt-response interactions and begin operating more like structured engineering workflows.&lt;/p&gt;

&lt;p&gt;Most agent systems operate using a simple loop:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Gather → Act → Verify&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;But there is another important idea that makes these systems far more powerful:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Delegation.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Instead of one AI trying to do everything, work can be delegated to specialized AI workers that focus on one responsibility.&lt;/p&gt;

&lt;p&gt;In &lt;strong&gt;Claude Code&lt;/strong&gt;, these workers are called &lt;strong&gt;subagents&lt;/strong&gt;.&lt;/p&gt;




&lt;h2&gt;
  
  
  Skills vs Subagents
&lt;/h2&gt;

&lt;p&gt;Earlier in my project I worked with &lt;strong&gt;Skills&lt;/strong&gt; — reusable slash commands such as:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;/tf-plan&lt;/code&gt;&lt;br&gt;
&lt;code&gt;/tf-apply&lt;/code&gt;&lt;br&gt;
&lt;code&gt;/deploy&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Skills help standardize repeatable workflows and run &lt;strong&gt;inside the same conversation context&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;But &lt;strong&gt;subagents work differently&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;A subagent operates in its &lt;strong&gt;own isolated environment&lt;/strong&gt;, with its own tools and sometimes its own model.&lt;/p&gt;

&lt;p&gt;Think of it like assigning work to a &lt;strong&gt;specialist engineer&lt;/strong&gt; instead of asking a general assistant to handle everything.&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Feature&lt;/th&gt;
&lt;th&gt;Skills&lt;/th&gt;
&lt;th&gt;Subagents&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;How they start&lt;/td&gt;
&lt;td&gt;Triggered manually with slash commands&lt;/td&gt;
&lt;td&gt;Automatically delegated by the main agent&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Context&lt;/td&gt;
&lt;td&gt;Shared conversation context&lt;/td&gt;
&lt;td&gt;Isolated context&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Chat history&lt;/td&gt;
&lt;td&gt;Full conversation visible&lt;/td&gt;
&lt;td&gt;No chat history&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Tools&lt;/td&gt;
&lt;td&gt;Uses the main agent’s tools&lt;/td&gt;
&lt;td&gt;Own restricted toolset&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Model&lt;/td&gt;
&lt;td&gt;Uses the session model&lt;/td&gt;
&lt;td&gt;Can use a different model&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;&lt;strong&gt;Rule of thumb&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;If the task needs conversation context → use a Skill&lt;br&gt;&lt;br&gt;
If the task is self-contained → use a Subagent&lt;/p&gt;
&lt;/blockquote&gt;




&lt;h1&gt;
  
  
  The Three Subagents I Added
&lt;/h1&gt;

&lt;p&gt;To experiment with this setup, I added three subagents to my DevOps project.&lt;/p&gt;

&lt;h3&gt;
  
  
  security-auditor
&lt;/h3&gt;

&lt;p&gt;Reviews Terraform files and detects potential &lt;strong&gt;security risks&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff622xore7u9tmnmpx42l.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff622xore7u9tmnmpx42l.png" alt=" " width="800" height="476"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  tf-writer
&lt;/h3&gt;

&lt;p&gt;Generates Terraform infrastructure following &lt;strong&gt;best practices&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8k0i2uqa2dbqlgejeb8r.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8k0i2uqa2dbqlgejeb8r.png" alt=" " width="800" height="427"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  cost-optimizer
&lt;/h3&gt;

&lt;p&gt;Analyzes infrastructure configuration for &lt;strong&gt;potential cost inefficiencies&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqtxxsxge8wp147etfi7w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqtxxsxge8wp147etfi7w.png" alt=" " width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Each subagent focuses on a &lt;strong&gt;single responsibility&lt;/strong&gt;, which keeps the analysis precise and avoids context overload.&lt;/p&gt;




&lt;h1&gt;
  
  
  Running the Audit
&lt;/h1&gt;

&lt;p&gt;To test the setup, I gave Claude Code a simple instruction:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;Audit my Terraform files for security issues&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;The interesting part is what happened next.&lt;/p&gt;

&lt;p&gt;The main AI agent did &lt;strong&gt;not attempt to perform the audit itself&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Instead, it recognized that the request matched the &lt;strong&gt;security-auditor subagent&lt;/strong&gt; and delegated the task automatically.&lt;/p&gt;

&lt;p&gt;From that point forward, the subagent handled the entire audit independently.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flj9ikw665q663rfyr5wx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flj9ikw665q663rfyr5wx.png" alt=" " width="800" height="447"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr9g7bsb1e5r514t9yeur.png" alt=" " width="800" height="370"&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Issues I Didn’t Notice
&lt;/h2&gt;

&lt;p&gt;One of the most valuable parts of the audit was how it surfaced issues I had overlooked during deployment.&lt;/p&gt;

&lt;p&gt;The &lt;strong&gt;security-auditor&lt;/strong&gt; flagged several issues, including:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;CloudFront access logging disabled&lt;/li&gt;
&lt;li&gt;No Web Application Firewall (WAF) protection&lt;/li&gt;
&lt;li&gt;Missing security headers&lt;/li&gt;
&lt;li&gt;S3 versioning not enabled&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Each issue was linked to the &lt;strong&gt;specific Terraform resource responsible&lt;/strong&gt;, along with explanations and suggested fixes.&lt;/p&gt;

&lt;p&gt;This level of detail makes infrastructure reviews &lt;strong&gt;far easier to understand and act on&lt;/strong&gt;.&lt;/p&gt;
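&lt;p&gt;As a concrete illustration, two of the flagged findings map to small Terraform additions. This is a sketch assuming an AWS setup with a site bucket named &lt;code&gt;aws_s3_bucket.site&lt;/code&gt; and an existing logs bucket &lt;code&gt;aws_s3_bucket.logs&lt;/code&gt; (both names are hypothetical):&lt;/p&gt;

```hcl
# Finding: S3 versioning not enabled
resource "aws_s3_bucket_versioning" "site" {
  bucket = aws_s3_bucket.site.id

  versioning_configuration {
    status = "Enabled"
  }
}

# Finding: CloudFront access logging disabled
resource "aws_cloudfront_distribution" "site" {
  # ... existing origin, cache behavior, and viewer settings ...

  logging_config {
    bucket          = aws_s3_bucket.logs.bucket_domain_name
    include_cookies = false
    prefix          = "cloudfront/"
  }
}
```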




&lt;h1&gt;
  
  
  Why Isolation Matters
&lt;/h1&gt;

&lt;p&gt;Another interesting detail is how the subagent executed the task.&lt;/p&gt;

&lt;p&gt;The &lt;strong&gt;security-auditor ran in read-only mode with a clean context&lt;/strong&gt;, focused purely on auditing.&lt;/p&gt;

&lt;p&gt;This isolation prevents unintended infrastructure changes and keeps the analysis focused on one responsibility.&lt;/p&gt;

&lt;p&gt;In practice, it behaved like a &lt;strong&gt;dedicated security reviewer examining Terraform configuration&lt;/strong&gt;.&lt;/p&gt;




&lt;h1&gt;
  
  
  Architecture of the Agentic DevOps Workflow
&lt;/h1&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7bw95qbdst3jj0caybc0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7bw95qbdst3jj0caybc0.png" alt=" " width="800" height="1200"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h1&gt;
  
  
  What This Means for DevOps
&lt;/h1&gt;

&lt;p&gt;This small experiment showed me how AI can play a larger role in DevOps workflows.&lt;/p&gt;

&lt;p&gt;Instead of using AI only to generate infrastructure code, it can also assist with:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Infrastructure auditing
&lt;/li&gt;
&lt;li&gt;Security validation
&lt;/li&gt;
&lt;li&gt;Cost optimization
&lt;/li&gt;
&lt;li&gt;Configuration reviews
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;When combined with specialized workers like subagents, AI begins to look less like a chatbot and more like a &lt;strong&gt;team of automated engineering assistants&lt;/strong&gt;.&lt;/p&gt;




&lt;h1&gt;
  
  
  Key Takeaways
&lt;/h1&gt;

&lt;p&gt;Working through this exercise gave me a clearer picture of how &lt;strong&gt;agentic workflows can fit into DevOps practices.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;A few things stood out:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Delegation matters.&lt;/strong&gt;&lt;br&gt;&lt;br&gt;
The main agent didn’t attempt to do the security review itself. It delegated the task to a specialized subagent designed for that purpose.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Isolation improves safety.&lt;/strong&gt;&lt;br&gt;&lt;br&gt;
The security-auditor ran in read-only mode with a clean context, preventing unintended infrastructure changes.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Structured output makes reviews easier.&lt;/strong&gt;&lt;br&gt;&lt;br&gt;
Instead of vague suggestions, the audit returned categorized findings with severity levels and clear remediation steps.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Specialized agents reduce complexity.&lt;/strong&gt;&lt;br&gt;&lt;br&gt;
By splitting responsibilities across subagents (security, cost, code generation), the system stays focused and avoids context overload.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This exercise showed me that AI in DevOps doesn’t have to stop at generating Terraform code.&lt;/p&gt;

&lt;p&gt;It can also help &lt;strong&gt;review, audit, and improve infrastructure configurations in a structured way.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;What I found most valuable wasn’t just the speed of the audit, but the structured way the work was delegated.&lt;/p&gt;

&lt;p&gt;The agent handled orchestration, while the subagent focused entirely on the security review — a workflow that fits naturally into how DevOps teams already operate.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>terraform</category>
      <category>devops</category>
      <category>agenticdevops</category>
    </item>
    <item>
      <title>Deploying a Static Website on Azure Using Blob Storage</title>
      <dc:creator>Goodness Ojonuba</dc:creator>
      <pubDate>Fri, 06 Mar 2026 23:33:17 +0000</pubDate>
      <link>https://forem.com/goodnessoj/deploying-a-static-website-on-azure-using-blob-storage-54ho</link>
      <guid>https://forem.com/goodnessoj/deploying-a-static-website-on-azure-using-blob-storage-54ho</guid>
      <description>&lt;p&gt;This guide walks through the full deployment of a static web application using Azure Blob Storage Static Website Hosting.&lt;/p&gt;

&lt;p&gt;The goal of this deployment is to demonstrate how Azure Storage can be used to host front-end applications without provisioning servers, virtual machines, or container infrastructure.&lt;/p&gt;

&lt;p&gt;In this guide we will deploy the Mini Finance application, a static website composed of HTML, CSS, and image assets, and make it publicly accessible through an Azure-generated endpoint.&lt;/p&gt;

&lt;p&gt;The deployment will cover the full process, including resource creation, configuration, file upload, and verification.&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Resource Type&lt;/th&gt;
&lt;th&gt;Resource Name&lt;/th&gt;
&lt;th&gt;Purpose&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Resource Group&lt;/td&gt;
&lt;td&gt;mini-finance-rg&lt;/td&gt;
&lt;td&gt;Logical container for all resources&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Storage Account&lt;/td&gt;
&lt;td&gt;minifinancedemo01&lt;/td&gt;
&lt;td&gt;Stores and serves website files&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Blob Container&lt;/td&gt;
&lt;td&gt;$web&lt;/td&gt;
&lt;td&gt;Hosts static website content&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Static Website Endpoint&lt;/td&gt;
&lt;td&gt;&lt;a href="http://minifinancedemo01.z13.web.core.windows.net" rel="noopener noreferrer"&gt;View Site&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;Public access URL&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;




&lt;h2&gt;
  
  
  Architecture Overview
&lt;/h2&gt;

&lt;p&gt;This deployment follows a simple static hosting architecture.&lt;/p&gt;

&lt;p&gt;User requests are sent to the Azure static website endpoint, which retrieves files from the &lt;code&gt;$web&lt;/code&gt; container inside the storage account.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4wmpc6g9skxapvko3hod.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4wmpc6g9skxapvko3hod.png" alt=" " width="667" height="1000"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This architecture removes the need for:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Virtual machines&lt;/li&gt;
&lt;li&gt;Web servers (Apache or Nginx)&lt;/li&gt;
&lt;li&gt;Load balancers&lt;/li&gt;
&lt;li&gt;Container services&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Azure Storage serves the application directly.&lt;/p&gt;




&lt;h1&gt;
  
  
  Step 1 — Download the Mini Finance Application
&lt;/h1&gt;

&lt;p&gt;Before creating any Azure resources, we first download the application that will be deployed.&lt;/p&gt;

&lt;p&gt;The Mini Finance project is hosted on GitHub and contains the static files required to run the application.&lt;/p&gt;

&lt;h2&gt;
  
  
  Steps
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;Navigate to the Mini Finance GitHub repository&lt;/li&gt;
&lt;li&gt;Click Code&lt;/li&gt;
&lt;li&gt;Select Download ZIP&lt;/li&gt;
&lt;li&gt;Extract the archive locally&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;After extraction, the project directory should contain files similar to:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;mini-finance/
 ├── index.html
 ├── style.css
 ├── images/
 └── assets/
&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;These files will later be uploaded to Azure Blob Storage.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvsiglj30xxx5l2m35aze.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvsiglj30xxx5l2m35aze.png" alt="Download Project Repo" width="800" height="402"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxve10ncvg909g3nz0gm4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxve10ncvg909g3nz0gm4.png" alt="Files from file explorer" width="800" height="379"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Step 2 — Create the Resource Group
&lt;/h1&gt;

&lt;p&gt;A &lt;strong&gt;Resource Group&lt;/strong&gt; is used to organize and manage all resources related to the deployment.&lt;/p&gt;

&lt;h2&gt;
  
  
  Configuration
&lt;/h2&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Property&lt;/th&gt;
&lt;th&gt;Value&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Resource Group Name&lt;/td&gt;
&lt;td&gt;&lt;code&gt;mini-finance-demo-rg&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Region&lt;/td&gt;
&lt;td&gt;&lt;code&gt;Spain Central&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;h2&gt;
  
  
  Steps
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;Open the &lt;strong&gt;Azure Portal&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Search for &lt;strong&gt;Resource Groups&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Click &lt;strong&gt;Create&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Enter the configuration above&lt;/li&gt;
&lt;li&gt;Click &lt;strong&gt;Review + Create&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Click &lt;strong&gt;Create&lt;/strong&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;The resource group now acts as the container for all resources used in the deployment.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7idzqfjh0vf13shrxmu6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7idzqfjh0vf13shrxmu6.png" alt=" " width="800" height="290"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Step 3 — Create the Storage Account
&lt;/h1&gt;

&lt;h2&gt;
  
  
  Basics
&lt;/h2&gt;

&lt;p&gt;The &lt;strong&gt;Basics&lt;/strong&gt; tab defines the core properties of the storage account.&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Setting&lt;/th&gt;
&lt;th&gt;Value&lt;/th&gt;
&lt;th&gt;Explanation&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Subscription&lt;/td&gt;
&lt;td&gt;Your Azure subscription&lt;/td&gt;
&lt;td&gt;Billing account that owns the resource&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Resource Group&lt;/td&gt;
&lt;td&gt;&lt;code&gt;mini-finance-demo-rg&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;Groups related resources together&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Storage Account Name&lt;/td&gt;
&lt;td&gt;&lt;code&gt;minifinancedemo01&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;Globally unique name used in the endpoint URL&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Region&lt;/td&gt;
&lt;td&gt;&lt;code&gt;Spain Central&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;Determines where data is stored&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Performance&lt;/td&gt;
&lt;td&gt;Standard&lt;/td&gt;
&lt;td&gt;Suitable for static website hosting&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Redundancy&lt;/td&gt;
&lt;td&gt;LRS&lt;/td&gt;
&lt;td&gt;Keeps three copies of the data in one datacenter&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F92d0rvtu0ty7ylh1hsot.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F92d0rvtu0ty7ylh1hsot.png" alt="basic" width="800" height="565"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F645440629hv2660gwape.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F645440629hv2660gwape.png" alt=" " width="701" height="424"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Advanced
&lt;/h2&gt;

&lt;p&gt;The &lt;strong&gt;Advanced&lt;/strong&gt; tab controls compatibility and security features.&lt;/p&gt;

&lt;p&gt;Recommended configuration:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Setting&lt;/th&gt;
&lt;th&gt;Value&lt;/th&gt;
&lt;th&gt;Explanation&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Secure transfer required&lt;/td&gt;
&lt;td&gt;Enabled&lt;/td&gt;
&lt;td&gt;Ensures HTTPS connections&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Allow Blob public access&lt;/td&gt;
&lt;td&gt;Enabled&lt;/td&gt;
&lt;td&gt;Required for static website hosting&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Minimum TLS version&lt;/td&gt;
&lt;td&gt;TLS 1.2&lt;/td&gt;
&lt;td&gt;Ensures secure connections&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;Leave other settings as default.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8x7uss25xbdiojc9uipo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8x7uss25xbdiojc9uipo.png" alt="Advance" width="800" height="527"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Networking
&lt;/h2&gt;

&lt;p&gt;The &lt;strong&gt;Networking&lt;/strong&gt; tab controls how the storage account can be accessed.&lt;/p&gt;

&lt;p&gt;For this demo we allow public access so users can reach the website.&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Setting&lt;/th&gt;
&lt;th&gt;Value&lt;/th&gt;
&lt;th&gt;Explanation&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Public network access&lt;/td&gt;
&lt;td&gt;Enable&lt;/td&gt;
&lt;td&gt;Allows internet access&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Access scope&lt;/td&gt;
&lt;td&gt;Enable from all networks&lt;/td&gt;
&lt;td&gt;Makes the site globally accessible&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3alcqu2zqd5ubedil6ue.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3alcqu2zqd5ubedil6ue.png" alt="Networking" width="800" height="488"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  Data Protection
&lt;/h2&gt;

&lt;p&gt;Backup and recovery options are configured here.&lt;/p&gt;

&lt;p&gt;For this demo, leave the defaults.&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Setting&lt;/th&gt;
&lt;th&gt;Value&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Soft delete for blobs&lt;/td&gt;
&lt;td&gt;Disabled&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Soft delete for containers&lt;/td&gt;
&lt;td&gt;Disabled&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Versioning&lt;/td&gt;
&lt;td&gt;Disabled&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgb7igf83uxmeckq1n7fw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgb7igf83uxmeckq1n7fw.png" alt="Data Protection" width="800" height="511"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  Encryption
&lt;/h2&gt;

&lt;p&gt;Azure automatically encrypts stored data.&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Setting&lt;/th&gt;
&lt;th&gt;Value&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Encryption type&lt;/td&gt;
&lt;td&gt;Microsoft-managed keys&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;No changes are required.&lt;/p&gt;




&lt;h2&gt;
  
  
  Tags (Optional)
&lt;/h2&gt;

&lt;p&gt;Tags help organize resources in larger environments.&lt;/p&gt;

&lt;p&gt;Example:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Name&lt;/th&gt;
&lt;th&gt;Value&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;project&lt;/td&gt;
&lt;td&gt;mini-finance&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;environment&lt;/td&gt;
&lt;td&gt;demo&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhfpeqqg33rsg5ohgdj5k.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhfpeqqg33rsg5ohgdj5k.png" alt="Tags" width="800" height="543"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  Review + Create
&lt;/h2&gt;

&lt;p&gt;Azure validates the configuration.&lt;/p&gt;

&lt;p&gt;Click:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Review + Create → Create&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fngjf69bnwaefxnihzi89.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fngjf69bnwaefxnihzi89.png" alt="review" width="800" height="496"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Deployment usually completes within &lt;strong&gt;30–60 seconds&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;After deployment click &lt;strong&gt;Go to Resource&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frjdvkdjxd0diekeko59p.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frjdvkdjxd0diekeko59p.png" alt=" " width="800" height="404"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h1&gt;
  
  
  Step 4 — Enable Static Website Hosting
&lt;/h1&gt;

&lt;ol&gt;
&lt;li&gt;Open the storage account &lt;code&gt;minifinancedemo01&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Navigate to: Data Management → Static website&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Enable Static Website and configure:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Setting&lt;/th&gt;
&lt;th&gt;Value&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Static Website&lt;/td&gt;
&lt;td&gt;Enabled&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Index Document Name&lt;/td&gt;
&lt;td&gt;&lt;code&gt;index.html&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Error Document Path&lt;/td&gt;
&lt;td&gt;&lt;code&gt;index.html&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F16nr5wd6ypvs114d2rbo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F16nr5wd6ypvs114d2rbo.png" alt="static web hosting" width="800" height="359"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5au0izz4lekkn2so8vho.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5au0izz4lekkn2so8vho.png" alt=" " width="800" height="284"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Azure automatically creates a container named &lt;code&gt;$web&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faw1v4kr0tkdx0oab3pqf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faw1v4kr0tkdx0oab3pqf.png" alt=" " width="800" height="293"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h1&gt;
  
  
  Step 5 — Upload the Website Files
&lt;/h1&gt;

&lt;p&gt;Navigate to:&lt;/p&gt;

&lt;p&gt;Data Storage → Containers → $web&lt;/p&gt;

&lt;p&gt;Upload:&lt;/p&gt;

&lt;p&gt;index.html&lt;br&gt;
css/&lt;br&gt;
images/&lt;br&gt;
js/&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq3dklked44fiyep0n3hm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq3dklked44fiyep0n3hm.png" alt=" " width="800" height="341"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Step 6 — Verify the Deployment
&lt;/h1&gt;

&lt;p&gt;Open the endpoint: go back to the Static website settings and copy the primary endpoint.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://minifinancedemo01.z43.web.core.windows.net/" rel="noopener noreferrer"&gt;https://minifinancedemo01.z43.web.core.windows.net/&lt;/a&gt; (available temporarily for this demo)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbw8jcy5d4xr9ao9kqe8t.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbw8jcy5d4xr9ao9kqe8t.png" alt=" " width="800" height="343"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Open it in a new browser tab:&lt;/p&gt;

&lt;p&gt;Expected result:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Homepage loads&lt;/li&gt;
&lt;li&gt;Styles apply correctly&lt;/li&gt;
&lt;li&gt;Images render&lt;/li&gt;
&lt;li&gt;Navigation works&lt;/li&gt;
&lt;/ul&gt;
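You can also verify from the terminal. The URL below is the temporary demo endpoint from this post, so substitute your own:

```shell
# Print just the HTTP status code; 200 means the homepage is being served
curl -s -o /dev/null -w "%{http_code}\n" \
  "https://minifinancedemo01.z43.web.core.windows.net/"
```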

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb3jxbdtz4gqvqf8gi4ln.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb3jxbdtz4gqvqf8gi4ln.png" alt=" " width="800" height="409"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Deployment Summary
&lt;/h1&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Resource&lt;/th&gt;
&lt;th&gt;Name&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Resource Group&lt;/td&gt;
&lt;td&gt;&lt;code&gt;mini-finance-demo-rg&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Storage Account&lt;/td&gt;
&lt;td&gt;&lt;code&gt;minifinancedemo01&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Blob Container&lt;/td&gt;
&lt;td&gt;&lt;code&gt;$web&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Static Website Endpoint&lt;/td&gt;
&lt;td&gt;Azure Generated&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;Infrastructure used:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;1 Resource Group&lt;/li&gt;
&lt;li&gt;1 Storage Account&lt;/li&gt;
&lt;li&gt;1 Blob Container&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;No compute resources required.&lt;/p&gt;
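For reference, the resources in the summary table can also be provisioned from the CLI. A sketch, assuming `eastus` as the region (any region works); the `$web` container itself is created automatically when static hosting is enabled:

```shell
# Create the resource group and storage account used in this walkthrough
az group create --name mini-finance-demo-rg --location eastus

az storage account create \
  --name minifinancedemo01 \
  --resource-group mini-finance-demo-rg \
  --location eastus \
  --sku Standard_LRS \
  --kind StorageV2
```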




&lt;h1&gt;
  
  
  Key Takeaways
&lt;/h1&gt;

&lt;ul&gt;
&lt;li&gt;Azure Blob Storage can host static websites&lt;/li&gt;
&lt;li&gt;No servers or infrastructure management required&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;$web&lt;/code&gt; container stores website assets&lt;/li&gt;
&lt;li&gt;Azure automatically provides a public endpoint&lt;/li&gt;
&lt;/ul&gt;




&lt;h1&gt;
  
  
  Final Thoughts
&lt;/h1&gt;

&lt;p&gt;Static website hosting on Azure is one of the simplest ways to deploy front-end applications.&lt;/p&gt;

&lt;p&gt;Instead of managing servers and web servers, you can deploy an entire site using only storage.&lt;/p&gt;

&lt;p&gt;With a few configuration steps and a folder of files, your application becomes publicly accessible across the internet.&lt;/p&gt;

</description>
      <category>azure</category>
      <category>devops</category>
      <category>staticwebapps</category>
    </item>
    <item>
      <title>The Complete Guide: Deploying a Static Site on AWS using Nginx and User Data: Completely Automated</title>
      <dc:creator>Goodness Ojonuba</dc:creator>
      <pubDate>Fri, 20 Feb 2026 18:35:41 +0000</pubDate>
      <link>https://forem.com/goodnessoj/the-complete-guide-deploying-a-static-site-on-aws-using-nginx-and-user-data-fbj</link>
      <guid>https://forem.com/goodnessoj/the-complete-guide-deploying-a-static-site-on-aws-using-nginx-and-user-data-fbj</guid>
      <description>&lt;p&gt;&lt;strong&gt;Introduction&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In this guide, you’ll learn how to deploy a static website on an AWS Ubuntu server using Nginx. Instead of manually installing Nginx and copying files each time, we’ll use EC2 user data to automate everything when the instance launches.&lt;/p&gt;

&lt;p&gt;We’ll use the Graphite Creative template from Tooplate:&lt;br&gt;
&lt;a href="https://www.tooplate.com/zip-templates/2156_graphite_creative.zip" rel="noopener noreferrer"&gt;Template&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;📌 Project Prerequisites&lt;/strong&gt;&lt;br&gt;
Before starting, make sure you have:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;An active AWS account&lt;/li&gt;
&lt;li&gt;Basic understanding of EC2&lt;/li&gt;
&lt;li&gt;A created EC2 key pair (for SSH access)&lt;/li&gt;
&lt;li&gt;A security group allowing SSH (port 22) and HTTP (port 80)&lt;/li&gt;
&lt;li&gt;Internet access enabled on your EC2 instance (public subnet + internet gateway)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;🧠 What We’re Automating&lt;/strong&gt;&lt;br&gt;
When the EC2 instance launches, it will:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Update packages&lt;/li&gt;
&lt;li&gt;Install Nginx&lt;/li&gt;
&lt;li&gt;Install wget &amp;amp; unzip&lt;/li&gt;
&lt;li&gt;Download the website template&lt;/li&gt;
&lt;li&gt;Extract it&lt;/li&gt;
&lt;li&gt;Move files into /var/www/html&lt;/li&gt;
&lt;li&gt;Start and enable Nginx&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;All automatically via User Data.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 1: Launch an Ubuntu EC2 Instance&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Go to AWS Console → EC2&lt;/li&gt;
&lt;li&gt;Click Launch Instance&lt;/li&gt;
&lt;li&gt;Choose: Ubuntu Server 24.04 LTS &lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select instance type: t2.micro (Free Tier eligible)&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzxqi4xe1kyezfrqvkdvj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzxqi4xe1kyezfrqvkdvj.png" alt="EC2 config" width="800" height="373"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select your key pair&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Configure the Security Group to allow HTTP (80) and SSH (22)&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F81awxpodphphfanagrbp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F81awxpodphphfanagrbp.png" alt="Security group" width="800" height="363"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 2: Add This User Data Script&lt;/strong&gt;&lt;br&gt;
Scroll to Advanced Details → User Data and paste this script:&lt;/p&gt;

&lt;pre&gt;
#!/bin/bash
apt update -y
apt install nginx -y
apt install wget unzip -y
cd /tmp
wget https://www.tooplate.com/zip-templates/2156_graphite_creative.zip
unzip 2156_graphite_creative.zip
rm -rf /var/www/html/*
cp -r 2156_graphite_creative/* /var/www/html/
chown -R www-data:www-data /var/www/html
systemctl enable nginx
systemctl start nginx
&lt;/pre&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvbk5sgq8m4td62qs4nsa.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvbk5sgq8m4td62qs4nsa.png" alt="Userdata" width="800" height="357"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then click Launch Instance.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 3: Wait for EC2 to Initialize&lt;/strong&gt;&lt;br&gt;
Give it about 3–5 minutes and wait until the status checks pass (2/2 or 3/3).&lt;br&gt;
User Data runs automatically on first boot.&lt;/p&gt;
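If the site doesn't come up, you can inspect what the script actually did. On Ubuntu cloud images, User Data output is captured in a standard log file; the key file name and IP below are placeholders:

```shell
# From your machine: connect to the instance
ssh -i your-key.pem ubuntu@YOUR_PUBLIC_IP

# On the instance: see exactly what the User Data script printed
sudo tail -n 50 /var/log/cloud-init-output.log
```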

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpot5tev0npyt1v4zhp9j.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpot5tev0npyt1v4zhp9j.png" alt="initializing" width="800" height="376"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 4: Access Your Website&lt;/strong&gt;&lt;br&gt;
Go to EC2 Dashboard&lt;br&gt;
Copy the Public IPv4 address&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvzr2fnnsvxpexha6g1rr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvzr2fnnsvxpexha6g1rr.png" alt="webpage" width="800" height="350"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Open your browser:&lt;br&gt;
&lt;code&gt;http://YOUR_PUBLIC_IP&lt;/code&gt;&lt;br&gt;
Make sure the URL uses &lt;code&gt;http&lt;/code&gt;, not &lt;code&gt;https&lt;/code&gt;; no TLS certificate is configured on the instance.&lt;/p&gt;

&lt;p&gt;If everything worked, your Graphite Creative site should load immediately.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmu22qpu02xmopug0ahfu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmu22qpu02xmopug0ahfu.png" alt="webpage loaded" width="800" height="408"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;🎯 What Just Happened?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;When the instance booted:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Ubuntu updated its packages&lt;/li&gt;
&lt;li&gt;Nginx was installed&lt;/li&gt;
&lt;li&gt;The website was downloaded automatically with wget&lt;/li&gt;
&lt;li&gt;The files were extracted with unzip&lt;/li&gt;
&lt;li&gt;Nginx was started and enabled&lt;/li&gt;
&lt;li&gt;The site was deployed without manual intervention&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;That’s the power of User Data automation.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;🚀 Why This Matters&lt;/strong&gt;&lt;br&gt;
This approach gives you:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Reproducible deployments&lt;/li&gt;
&lt;li&gt;Faster provisioning&lt;/li&gt;
&lt;li&gt;Zero manual configuration&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This is a simple setup, but the idea behind it is powerful. Instead of treating servers like pets that need manual care, you’re starting to treat them like disposable infrastructure. If something breaks, you don’t fix it manually. You relaunch it and the automation handles the rest.&lt;/p&gt;

&lt;p&gt;From here, you could:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Store your website in an S3 bucket instead of downloading it directly&lt;/li&gt;
&lt;li&gt;Use Terraform to provision the EC2 instance automatically&lt;/li&gt;
&lt;li&gt;Add a domain name and SSL certificate&lt;/li&gt;
&lt;li&gt;Put CloudFront in front of it for better performance&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;But the real win here is understanding how User Data works. Once you’re comfortable with that, you can automate almost any server setup.&lt;/p&gt;

&lt;p&gt;If you're learning cloud, keep building, keep breaking things, and keep pushing. Mastery comes from daily practice.&lt;/p&gt;

</description>
      <category>nginx</category>
      <category>automation</category>
      <category>aws</category>
      <category>devops</category>
    </item>
    <item>
      <title>More Than Just a Board: Why My First Jira Sprint Was a Lesson in DevOps</title>
      <dc:creator>Goodness Ojonuba</dc:creator>
      <pubDate>Fri, 06 Feb 2026 01:31:28 +0000</pubDate>
      <link>https://forem.com/goodnessoj/more-than-just-a-board-why-my-first-jira-sprint-was-a-lesson-in-devops-4n85</link>
      <guid>https://forem.com/goodnessoj/more-than-just-a-board-why-my-first-jira-sprint-was-a-lesson-in-devops-4n85</guid>
      <description>&lt;p&gt;&lt;strong&gt;Introduction&lt;/strong&gt;&lt;br&gt;
Before this sprint, Jira felt like a task board where you move tasks/stories around. After running a full sprint myself, I realized Jira is really about visibility, rhythm, and learning, not just tracking work.&lt;/p&gt;

&lt;p&gt;In this sprint, I planned work, ran daily updates, shipped a small UI improvement, tracked progress using the burndown chart, and closed the sprint with a retrospective.&lt;br&gt;
That process changed how I think about delivery.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvd2gubg2cbz6ovo7trv4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvd2gubg2cbz6ovo7trv4.png" alt="Jira board" width="800" height="384"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The Daily Scrum: Small Updates, Big Clarity&lt;/strong&gt;&lt;br&gt;
The Daily Scrum looked simple at first. Just three questions:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;What did I do yesterday?&lt;/li&gt;
&lt;li&gt;What will I do today?&lt;/li&gt;
&lt;li&gt;What is blocking me?&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;But writing these updates every day forced me to think about progress in small increments, not big unfinished work.&lt;/p&gt;

&lt;p&gt;Even working solo, the daily update created accountability. Each day needed a visible outcome, not just effort.&lt;/p&gt;

&lt;p&gt;This naturally led to shipping smaller improvements more consistently.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpjkb0ni0aehl0l2chbez.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpjkb0ni0aehl0l2chbez.png" alt="Jira story showing Daily Scrum comments" width="645" height="198"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Backlog Refinement Made Sprint Planning Easier&lt;/strong&gt;&lt;br&gt;
Before the sprint started, I created stories, added acceptance criteria, estimated them, and ranked them by value.&lt;/p&gt;

&lt;p&gt;This step made sprint planning much easier because:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The work was already clear&lt;/li&gt;
&lt;li&gt;Stories were small and understandable&lt;/li&gt;
&lt;li&gt;Scope decisions were simpler&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Without backlog refinement, sprint planning would have felt like guessing.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff6tchwyzx8hdza27lewp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff6tchwyzx8hdza27lewp.png" alt="Backlog view showing Epic and story with story points" width="800" height="429"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Sprint Planning: Turning Ideas into Commitment&lt;/strong&gt;&lt;br&gt;
Sprint planning was where the work became real.&lt;/p&gt;

&lt;p&gt;Instead of selecting everything, I chose just a few stories that could realistically be completed within the sprint.&lt;/p&gt;

&lt;p&gt;That decision matters. A sprint is not a to-do list. It is a commitment to deliver a usable increment.&lt;/p&gt;

&lt;p&gt;Once Sprint 1 started, the goal was simple: Ship 2–3 visible UI improvements to Gotto Job and show them live.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq515gastlsiyocxkx7ll.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq515gastlsiyocxkx7ll.png" alt="Sprint 1 showing selected story" width="800" height="229"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The Burndown Chart: Seeing Progress Visually&lt;/strong&gt;&lt;br&gt;
The burndown chart showed sprint progress over time. Even with a small sprint, it made progress visible in a way the board alone could not.&lt;/p&gt;

&lt;p&gt;It answered an important question: Are we moving toward finishing the sprint goal?&lt;/p&gt;

&lt;p&gt;Instead of relying on memory or assumptions, the chart provided objective feedback about delivery progress.&lt;/p&gt;

&lt;p&gt;That transparency is what makes Scrum effective.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjyqlq35flxymufasctok.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjyqlq35flxymufasctok.png" alt="Burndown Chart" width="800" height="424"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1sngbzso8g8q9y9gystf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1sngbzso8g8q9y9gystf.png" alt="work stats" width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5hkeoh96jrmfxeyaes9p.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5hkeoh96jrmfxeyaes9p.png" alt="Sprint Burndown chart" width="800" height="397"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Shipping One Increment (The DevOps Moment)&lt;/strong&gt;&lt;br&gt;
During the sprint, I implemented one UI improvement, committed the change, deployed it, and verified it on the live site.&lt;/p&gt;

&lt;p&gt;That single cycle represented the DevOps lifecycle in practice: &lt;strong&gt;&lt;em&gt;Plan → Build → Deploy → Verify&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;For this project I shipped one small change at a time; the first was updating the tagline to meet the acceptance criteria.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Tagline Before Update:&lt;/em&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs1uoi6clvanmy1cuf1ti.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs1uoi6clvanmy1cuf1ti.png" alt="Tagline Before Update:" width="800" height="373"&gt;&lt;/a&gt; &lt;br&gt;
&lt;em&gt;Tagline After Update:&lt;/em&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F93pkbxd18olsga5ce55d.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F93pkbxd18olsga5ce55d.png" alt="Tagline After Update:" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Shipping one small improvement felt more valuable than working on many unfinished changes.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Why the Retro Is the Most Valuable Part&lt;/strong&gt;&lt;br&gt;
The retrospective is where the sprint turns into learning.&lt;/p&gt;

&lt;p&gt;Instead of asking “Did we finish tasks?”, the retrospective asks:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;What went well?&lt;/li&gt;
&lt;li&gt;What should improve?&lt;/li&gt;
&lt;li&gt;What should we try next sprint?&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This is where continuous improvement happens.&lt;/p&gt;

&lt;p&gt;For example, in my sprint, the retrospective captured these reflections:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;What went well:&lt;/strong&gt; The Git-to-deployment workflow was smooth, and the hero tagline update was successfully delivered and verified on the live site. Working in solo mode helped me stay organized and maintain clear ownership of the sprint tasks.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;What to improve:&lt;/strong&gt; Automate EC2 deployment using user-data or scripts to reduce manual deployment time and improve consistency.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Scrum Pillar – Inspection:&lt;/strong&gt; I practiced inspection by verifying changes locally and on the live URL to ensure the UI updates met the acceptance criteria before closing the story.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Scrum Value – Commitment:&lt;/strong&gt; I stayed committed to completing one story at a time and delivering a working increment during the sprint.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6pwf326cc9x3tegvlfqe.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6pwf326cc9x3tegvlfqe.png" alt="Retro Note" width="645" height="426"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Final Reflection&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Running a sprint showed me that Jira is not the important part. The important part is the workflow around it.&lt;/p&gt;

&lt;p&gt;Daily Scrum builds consistency. Burndown charts create transparency. Retrospectives create improvement.&lt;/p&gt;

&lt;p&gt;Together, they turn work into a repeatable delivery system.&lt;/p&gt;

&lt;p&gt;That was my biggest takeaway from running my first Jira sprint.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;P.S. If you're starting your DevOps journey, you can join the DevOps Micro Internship (DMI) community led by &lt;a href="https://www.linkedin.com/in/pravin-mishra-aws-trainer?utm_source=share&amp;amp;utm_campaign=share_via&amp;amp;utm_content=profile&amp;amp;utm_medium=android_app" rel="noopener noreferrer"&gt;Pravin Mishra&lt;/a&gt; on &lt;a href="https://discord.pravinmishra.com/" rel="noopener noreferrer"&gt;Discord&lt;/a&gt;. It’s a great place to gain hands-on experience and learn from a global community.&lt;/em&gt;&lt;/p&gt;

</description>
      <category>devops</category>
      <category>scrum</category>
      <category>sdlc</category>
    </item>
    <item>
      <title>Git Branching and the Multiverse: Protecting the Sacred Timeline🌠</title>
      <dc:creator>Goodness Ojonuba</dc:creator>
      <pubDate>Fri, 30 Jan 2026 10:51:37 +0000</pubDate>
      <link>https://forem.com/goodnessoj/git-branching-how-to-create-your-multiverse-15p4</link>
      <guid>https://forem.com/goodnessoj/git-branching-how-to-create-your-multiverse-15p4</guid>
      <description>&lt;p&gt;If you are a fan of the MCU, you have probably watched Doctor Strange in the Multiverse of Madness. One moment everything is stable, the next moment there are multiple realities, and one small mistake can trigger a multiversal disaster.&lt;/p&gt;

&lt;p&gt;That is why Doctor Strange protects the Sacred Timeline.&lt;/p&gt;

&lt;p&gt;For me, Git branching felt the same way at the beginning. I saw a lot of people create and use branches, but it took some time to truly understand why they were important. Too many commands, too many possibilities, and the fear that one wrong move could break the entire project.&lt;/p&gt;

&lt;p&gt;What finally made it click was understanding isolation.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Branching is controlled isolation (The Mirror Verse)&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In Git, your main branch is the Sacred Timeline. It must remain stable and predictable.&lt;/p&gt;

&lt;p&gt;When you create a branch, Git places your work in isolation. You are still connected to the main timeline, but your changes live in their own safe space. Nothing you do in that branch can break the main branch unless you intentionally merge it.&lt;/p&gt;

&lt;p&gt;This isolation is the real superpower.&lt;br&gt;
You can:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Build features without risking production code&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Break things while learning without fear&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Experiment freely and roll back easily&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Work in teams without interfering with each other&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Just like the multiverse, each branch exists independently.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;You are Doctor Strange, managing isolated realities&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;As a DevOps engineer, you are not avoiding chaos by refusing to branch. You are avoiding chaos by isolating it (stepping into the Mirror Verse).&lt;/p&gt;

&lt;p&gt;Every time you create a branch, you are saying, “Let me explore this idea in a separate universe where no damage can leak into the Sacred Timeline.”&lt;/p&gt;

&lt;p&gt;If the idea works, great.&lt;br&gt;
If it fails, you close that universe and move on.&lt;/p&gt;

&lt;p&gt;No harm done.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Creating an isolated universe&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;When you run:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;git checkout -b feature-scoreboard 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Git creates a new timeline that looks exactly like main, but it is completely isolated. Any commits you make here stay here.&lt;br&gt;
You can switch between realities anytime:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;git checkout main
git checkout feature-scoreboard
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Each switch rewrites your working directory to reflect that universe. Main stays clean. Your experiments stay contained.&lt;/p&gt;

&lt;p&gt;That isolation is what keeps projects sane.&lt;/p&gt;
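A throwaway repository makes this isolation easy to see for yourself. Everything below runs in a temporary directory and touches nothing else on your machine (the identity settings are just for this demo repo; `git init -b` needs Git 2.28+):

```shell
# Throwaway repo to demonstrate branch isolation
tmp=$(mktemp -d)
cd "$tmp"
git init -q -b main
git config user.email "demo@example.com"   # identity for this demo repo only
git config user.name "Demo"
git commit --allow-empty -qm "initial commit on the Sacred Timeline"

git checkout -q -b feature-scoreboard      # step into the isolated universe
echo "scoreboard" > scoreboard.txt
git add scoreboard.txt
git commit -qm "add scoreboard"

git checkout -q main                       # back on the Sacred Timeline
ls scoreboard.txt 2>/dev/null || echo "main is untouched"
```

Switching back to `main` makes `scoreboard.txt` disappear from the working directory, and switching to `feature-scoreboard` brings it back; each branch carries its own reality.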

&lt;p&gt;&lt;strong&gt;Merging without breaking reality&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Once your work is tested and ready, you bring it back carefully.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;git checkout main
git merge feature-scoreboard
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This is where Git checks if merging will disrupt the Sacred Timeline. If everything aligns, the merge is smooth. &lt;/p&gt;

&lt;p&gt;If not, Git stops and asks you to resolve conflicts before anything breaks.&lt;/p&gt;

&lt;p&gt;Isolation ensures that problems are discovered early, not in production.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Close unstable universes&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Branches are not meant to live forever.&lt;/p&gt;

&lt;p&gt;Once a branch has done its job, delete it:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;git branch -d feature-scoreboard 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Fewer timelines, fewer problems.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Final thoughts&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Git branching is not about complexity. It is about safety through isolation. It allows you to learn, experiment, and collaborate without constantly worrying about breaking things.&lt;/p&gt;

&lt;p&gt;If you are early in your career, you can start your DevOps journey with &lt;a href="https://www.linkedin.com/in/pravin-mishra-aws-trainer?utm_source=share&amp;amp;utm_campaign=share_via&amp;amp;utm_content=profile&amp;amp;utm_medium=android_app" rel="noopener noreferrer"&gt;Pravin Mishra&lt;/a&gt; by joining the DevOps Micro Internship (DMI) Discord community:&lt;br&gt;
&lt;a href="https://discord.pravinmishra.com/" rel="noopener noreferrer"&gt;https://discord.pravinmishra.com/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;It is a great program where you get hands-on training and experience, supported by a global community.&lt;/p&gt;

&lt;p&gt;Explore freely, isolate your changes, and always protect the Sacred Timeline 🌌&lt;/p&gt;

</description>
      <category>devops</category>
      <category>git</category>
      <category>github</category>
    </item>
    <item>
      <title>3 Questions That Will Change How You See Linux</title>
      <dc:creator>Goodness Ojonuba</dc:creator>
      <pubDate>Thu, 22 Jan 2026 10:30:39 +0000</pubDate>
      <link>https://forem.com/goodnessoj/3-questions-that-will-change-how-you-see-linux-1jin</link>
      <guid>https://forem.com/goodnessoj/3-questions-that-will-change-how-you-see-linux-1jin</guid>
      <description>&lt;p&gt;I had worked with Linux before. I had used cloud services, deployed applications, and followed DevOps workflows. Still, three simple questions asked by my instructor, &lt;a href="https://www.linkedin.com/in/pravin-mishra-aws-trainer/" rel="noopener noreferrer"&gt;Sir Pravin Mishra&lt;/a&gt;, made me pause and rethink how deeply Linux sits at the center of everything we do in DevOps.&lt;/p&gt;

&lt;p&gt;He didn’t start with commands or configuration files. He started with questions that sounded simple, but stayed with me long after the class.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Three Questions
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;&lt;strong&gt;Which operating system is used the most by cloud servers?&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Which operating system is used to run containers?&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Which operating system do automation tools like Ansible and Terraform depend on?&lt;/strong&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;As we reflected on them, a pattern became obvious. Different tools, different platforms, same foundation. Linux.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why These Questions Matter
&lt;/h2&gt;

&lt;p&gt;DevOps often looks like a collection of tools: cloud platforms, CI/CD pipelines, containers, automation scripts. But underneath all of that is an operating system that quietly holds everything together.&lt;/p&gt;

&lt;p&gt;Most cloud servers run on Linux because it is stable, flexible, and designed for server environments. Containers rely on Linux kernel features. Automation tools are built with Linux environments in mind. Even when you interact mostly through dashboards or managed services, Linux is still doing the real work underneath.&lt;/p&gt;
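&lt;p&gt;You can even see that foundation directly. As an illustrative peek (Linux-only paths; the exact output varies by kernel and distribution), the namespace and cgroup machinery that container runtimes build on is visible right in the filesystem:&lt;/p&gt;

```shell
# What "containers rely on Linux kernel features" looks like up close.
uname -sr                                   # kernel name and release
ls /proc/self/ns 2>/dev/null                # namespaces: the isolation primitive
ls /sys/fs/cgroup 2>/dev/null | head -n 5   # cgroups: the resource-control primitive
```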

&lt;p&gt;These questions helped me see DevOps less as a toolset and more as an ecosystem built on a common base.&lt;/p&gt;

&lt;h2&gt;
  
  
  How This Showed Up in Practice
&lt;/h2&gt;

&lt;p&gt;I recently deployed a React application and a ready-made portfolio site. After deployment, I carried out post-production DevOps checks to make sure everything was running as expected.&lt;/p&gt;

&lt;p&gt;During this process, I worked directly with the terminal on an Ubuntu-based AWS instance. Managing processes, configuring the environment, and verifying that everything was running correctly all happened through Linux.&lt;/p&gt;

&lt;p&gt;The servers hosting my applications were Linux-based. The build processes, web servers, and runtime environments all depended on Linux. This time, it wasn’t just something happening in the background. I was interacting with it directly.&lt;/p&gt;
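&lt;p&gt;For flavor, a post-production spot check on a host like that might look like the sketch below. The commented-out service name and health endpoint are placeholders, not the actual setup from this deployment:&lt;/p&gt;

```shell
# Hedged sketch of post-deployment checks on a Linux server.
set -e
uptime                       # load averages: is the box under pressure?
df -h /                      # disk headroom on the root filesystem
free -m 2>/dev/null || true  # memory headroom (procps; may be absent elsewhere)
# pgrep -f myapp                          # placeholder: is the app process alive?
# curl -fsS http://localhost:3000/health  # placeholder: does the app answer?
echo "checks complete"
```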

&lt;p&gt;It wasn’t a new concept, but it became a much clearer and more practical realization.&lt;/p&gt;

&lt;h2&gt;
  
  
  A Subtle Shift in Perspective
&lt;/h2&gt;

&lt;p&gt;I don’t see Linux as just another skill on a checklist anymore. I see it as the common language spoken by cloud infrastructure, containers, and automation tools.&lt;/p&gt;

&lt;p&gt;Every deployment reinforces this idea. You can use different platforms and services, but understanding Linux helps you reason better about what’s happening when things work, and when they don’t.&lt;/p&gt;

&lt;h2&gt;
  
  
  Looking Ahead
&lt;/h2&gt;

&lt;p&gt;I’ll keep exploring Linux, not just to run commands, but to see the bigger picture. Every deployment reinforces the lesson from those three questions: understanding the foundation can change the way you see Linux, DevOps, and the tools we rely on.&lt;/p&gt;

</description>
      <category>linux</category>
      <category>devops</category>
      <category>cloudcomputing</category>
    </item>
    <item>
      <title>A Page from the Future</title>
      <dc:creator>Goodness Ojonuba</dc:creator>
      <pubDate>Sun, 11 Jan 2026 17:20:36 +0000</pubDate>
      <link>https://forem.com/goodnessoj/a-page-from-the-future-14mo</link>
      <guid>https://forem.com/goodnessoj/a-page-from-the-future-14mo</guid>
      <description>&lt;p&gt;By 2030, Goodness had established himself as a prominent figure in cloud computing, earning multiple certifications across multi cloud; AWS, Azure, Google Cloud. Over the past 3–5 years, he progressed from foundational cloud projects to a senior cloud architect role, leading multi-cloud migration initiatives that improved operational efficiency and reduced costs for large organizations. His portfolio included hands-on deployments using Infrastructure-as-Code, automated CI/CD pipelines, and scalable serverless architectures. His GitHub repositories, which included full cloud infrastructure projects and multi-cloud integration scripts, became a reference for aspiring cloud engineers worldwide.&lt;/p&gt;

&lt;p&gt;In addition to his professional achievements, Goodness remained deeply committed to his personal mission, “To Thrive and Serve.” He designed community-focused programs that trained hundreds of students and early-career professionals in cloud technologies. These initiatives combined workshops, online tutorials, and mentorship, providing practical skills that bridged the gap between classroom learning and real-world application. Through his guidance, dozens of participants secured their first roles in the tech industry, demonstrating the tangible impact of his work.&lt;/p&gt;

&lt;p&gt;Goodness also became recognized as a thought leader in cloud computing. He published articles on multi-cloud strategy, infrastructure optimization, and emerging technologies, contributing regularly to industry blogs and journals. He spoke at conferences and participated in panels discussing cloud architecture, automation, and sustainable IT practices. His work emphasized efficiency, collaboration, and innovation, showcasing his ability to translate technical expertise into strategic value.&lt;/p&gt;

&lt;p&gt;Despite his technical accomplishments, Goodness never lost sight of the lessons that shaped his journey. He consistently emphasized disciplined learning over fleeting motivation, shared knowledge openly, and embraced the philosophy that the right time to act is always now. This mindset not only helped him achieve his long-standing goal of joining AWS but also enabled him to leave a lasting mark on the global cloud community. By 2032, Goodness had not only built a successful career but also created pathways for others to thrive, demonstrating that technical excellence paired with a commitment to service can transform both individuals and the broader technology ecosystem.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;NB: This is part of the DevOps Micro-Internship (DMI) by &lt;a href="https://www.linkedin.com/in/pravin-mishra-aws-trainer/" rel="noopener noreferrer"&gt;Pravin Mishra&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

</description>
    </item>
  </channel>
</rss>
