<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Laura</title>
    <description>The latest articles on Forem by Laura (@lalidevops).</description>
    <link>https://forem.com/lalidevops</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F573890%2F4bc96a2a-a2b0-470a-a9e8-cffc342ffa0b.JPG</url>
      <title>Forem: Laura</title>
      <link>https://forem.com/lalidevops</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/lalidevops"/>
    <language>en</language>
    <item>
      <title>Course 3 of 3: AIOps ☁️💪</title>
      <dc:creator>Laura</dc:creator>
      <pubDate>Tue, 07 Apr 2026 23:14:10 +0000</pubDate>
      <link>https://forem.com/lalidevops/course-3-of-3-aiops-2am1</link>
      <guid>https://forem.com/lalidevops/course-3-of-3-aiops-2am1</guid>
      <description>&lt;h3&gt;
  
  
  &lt;strong&gt;Table of Contents&lt;/strong&gt;
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Introduction&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;You Can't Fix What You Can't See&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; Monitoring&lt;/li&gt;
&lt;li&gt; Observability&lt;/li&gt;
&lt;li&gt; Monitoring vs Observability&lt;/li&gt;
&lt;/ol&gt;


&lt;/li&gt;

&lt;li&gt;&lt;p&gt;Observability + AIOps = Smarter IT Operations&lt;/p&gt;&lt;/li&gt;

&lt;li&gt;

&lt;p&gt;AWS AIOps Tools&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; AWS CloudWatch Anomaly Detection&lt;/li&gt;
&lt;li&gt; AWS X-Ray insights&lt;/li&gt;
&lt;li&gt; AWS DevOps Guru&lt;/li&gt;
&lt;/ol&gt;


&lt;/li&gt;

&lt;li&gt;

&lt;p&gt;Better Dev Experience&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; Amazon Q Developer Security Scanning&lt;/li&gt;
&lt;/ol&gt;


&lt;/li&gt;

&lt;li&gt;&lt;p&gt;Closing Thoughts&lt;/p&gt;&lt;/li&gt;

&lt;/ol&gt;

&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;Welcome to the third and final blog post of this 3-part series🎉 where I share my learning process in getting the &lt;a href="https://www.coursera.org/specializations/devops-ai-aws" rel="noopener noreferrer"&gt;&lt;strong&gt;DevOps and AI on AWS Specialization&lt;/strong&gt;&lt;/a&gt; certification. The first blog is about &lt;a href="https://dev.to/lalidevops/course-1-of-3-upgrading-apps-with-gen-ai-1ik"&gt;&lt;strong&gt;Upgrading Apps with Gen AI&lt;/strong&gt;&lt;/a&gt; and the second one is about &lt;a href="https://dev.to/lalidevops/course-2-of-3-cicd-for-generative-ai-applications-3i6a"&gt;&lt;strong&gt;CI/CD for Generative AI Applications&lt;/strong&gt;&lt;/a&gt; .&lt;/p&gt;

&lt;p&gt;This course is all about &lt;strong&gt;Artificial Intelligence for IT Operations (AIOps)&lt;/strong&gt;, which means using AI to operate and maintain IT infrastructure. To learn more about what AIOps is, its benefits, and its use cases, check out &lt;a href="https://aws.amazon.com/what-is/aiops/" rel="noopener noreferrer"&gt;this link&lt;/a&gt; on the AWS website.&lt;/p&gt;

&lt;h2&gt;
  
  
  You Can't Fix What You Can't See
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Monitoring
&lt;/h3&gt;

&lt;p&gt;Monitoring checks system health by collecting and analyzing data based on a predefined set of metrics and logs. In DevOps, it helps teams keep an eye on application health, catch known failures early, and avoid downtime.&lt;/p&gt;

&lt;p&gt;Where monitoring really shines is in spotting &lt;strong&gt;long-term trends&lt;/strong&gt;: it reveals how the app performs and how usage patterns change over time. But to be effective, teams must know &lt;strong&gt;which&lt;/strong&gt; metrics and logs to track.&lt;/p&gt;

&lt;h3&gt;
  
  
  Observability
&lt;/h3&gt;

&lt;p&gt;Observability means being able to make sense of &lt;strong&gt;what's happening&lt;/strong&gt; inside a complex system from its external outputs. When a system is observable, engineers can &lt;strong&gt;pinpoint the root cause&lt;/strong&gt; of a performance issue by analyzing telemetry data that is already available. It allows us to understand &lt;strong&gt;why&lt;/strong&gt; an issue occurred.&lt;/p&gt;

&lt;p&gt;The three pillars of observability are:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Logs&lt;/strong&gt; (application logs, &lt;em&gt;what is happening when a failure occurs?&lt;/em&gt;)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Metrics&lt;/strong&gt; (CloudWatch metrics, &lt;em&gt;how much CPU is being utilized?&lt;/em&gt;, or app-specific metrics)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Traces&lt;/strong&gt; (a trace contains data from each service involved, to better understand what the issue is and where the error occurred).&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Monitoring vs Observability
&lt;/h3&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Monitoring&lt;/th&gt;
&lt;th&gt;Observability&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Tracks a system's performance over time; its main focus is finding system problems and notifying stakeholders. Metrics can answer questions like "Is my app up and running?"&lt;/td&gt;
&lt;td&gt;Uses telemetry data to get a complete picture of overall network performance, making it easier to find the root cause of an issue.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Monitoring tools rely on predefined metrics and logs to identify system errors, usage patterns, and known failures, but they can't provide enough context on their own (Is the app online? Is it offline? Is it experiencing latency issues?)&lt;/td&gt;
&lt;td&gt;Gives a team a complete view of the entire architecture, capturing configurations and data from across the network. Observability tools enhance telemetry data with additional context about the network environment.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Monitoring tools gather data on usage trends and performance, revealing what is happening within a system, but they can't answer "why did this event occur?"&lt;/td&gt;
&lt;td&gt;Observability tools go deeper, providing more context and correlating seemingly unrelated system events.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Monitoring tools typically present system data through dashboards that show key metrics. However, they fall short in tracing the origins of system errors.&lt;/td&gt;
&lt;td&gt;Observability tools build maps and connect system errors to their root causes, automating the analysis process and making troubleshooting faster and easier.&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;Observability + AIOps =&lt;/strong&gt; Smarter IT Operations
&lt;/h2&gt;

&lt;p&gt;Improving observability means making sense of large amounts of data coming from many different resources. This is where &lt;strong&gt;AIOps&lt;/strong&gt; proves its value: by &lt;strong&gt;automating&lt;/strong&gt; the correlation of logs, traces, and metrics, identifying anomalies in real time, and reducing manual intervention for repetitive analysis tasks. Instead of digging into raw data, teams can focus on solving bigger issues. For example, when a latency spike shows up, AIOps points you straight to the service or component causing it.&lt;/p&gt;

&lt;p&gt;Using AIOps is like having an AI assistant that constantly monitors your infrastructure, identifying patterns and anomalies so the team doesn't have to monitor everything themselves.&lt;/p&gt;

&lt;p&gt;Key Capabilities of AIOps:&lt;/p&gt;

&lt;p&gt;✅ &lt;strong&gt;Anomaly Detection:&lt;/strong&gt; The AI looks at your system's logs and metrics for suspicious activity to catch issues before they become a problem. In &lt;a href="https://lalidev.hashnode.dev/course-2-of-3-ci-cd-for-generative-ai-applications#aws-cloudwatch-centralized-logging-and-monitoring" rel="noopener noreferrer"&gt;my previous post&lt;/a&gt; I mentioned a CloudWatch feature that uses AI-powered anomaly detection.&lt;/p&gt;

&lt;p&gt;✅ &lt;strong&gt;Predictive Analysis:&lt;/strong&gt; Predicts future events based on historical data.&lt;/p&gt;

&lt;p&gt;✅ &lt;strong&gt;Automated Root Cause Analysis&lt;/strong&gt;: When something breaks, engineers typically have to manually check through logs to identify the issue. Automated root cause analysis saves time by streamlining this process.&lt;/p&gt;

&lt;p&gt;✅ &lt;strong&gt;Remediation:&lt;/strong&gt; Doesn't just tell you what's wrong, it can take action to fix it based on policies or real-time learning.&lt;/p&gt;

&lt;p&gt;It can help you monitor and track the entire CI/CD pipeline in real-time to keep things running smoothly.&lt;/p&gt;

&lt;p&gt;AIOps is focused on &lt;strong&gt;operations management&lt;/strong&gt;. This includes monitoring logs, real-time system health analysis, and automated corrective actions.&lt;/p&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;AWS AIOps Tools&lt;/strong&gt;
&lt;/h2&gt;

&lt;h3&gt;
  
  
  AWS CloudWatch Anomaly Detection
&lt;/h3&gt;

&lt;p&gt;CloudWatch is a comprehensive monitoring and observability platform for your cloud resources and applications. The core components are &lt;strong&gt;alarms&lt;/strong&gt; (notify you when a metric crosses a defined threshold), &lt;strong&gt;metrics&lt;/strong&gt; (data points collected over time) and &lt;strong&gt;logs&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;When talking about alarms, how do we determine the appropriate threshold to set? CloudWatch can analyze your metrics and establish these thresholds for you. To accomplish this, you can use a feature called &lt;a href="https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/CloudWatch_Anomaly_Detection.html" rel="noopener noreferrer"&gt;CloudWatch anomaly detection&lt;/a&gt;.&lt;/p&gt;
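&lt;p&gt;As a rough sketch of how this could look with boto3 (untested; the alarm name and instance ID are placeholders, not from the course), an anomaly detection alarm replaces a hand-picked static threshold with a band computed by the ANOMALY_DETECTION_BAND metric math expression:&lt;/p&gt;

```python
# Sketch: a CloudWatch alarm whose threshold is the anomaly detection band
# for CPUUtilization, instead of a fixed value chosen by hand.
alarm_params = {
    "AlarmName": "cpu-anomaly-alarm",                       # placeholder name
    "ComparisonOperator": "GreaterThanUpperThreshold",      # fire above the band
    "EvaluationPeriods": 2,
    "ThresholdMetricId": "band",  # points at the band expression below
    "Metrics": [
        {
            "Id": "m1",
            "MetricStat": {
                "Metric": {
                    "Namespace": "AWS/EC2",
                    "MetricName": "CPUUtilization",
                    "Dimensions": [
                        {"Name": "InstanceId", "Value": "i-0123456789abcdef0"}
                    ],
                },
                "Period": 300,
                "Stat": "Average",
            },
        },
        {
            # CloudWatch trains a model on m1 and produces a band of
            # expected values, here 2 standard deviations wide.
            "Id": "band",
            "Expression": "ANOMALY_DETECTION_BAND(m1, 2)",
        },
    ],
}

# Requires AWS credentials, so left commented out in this sketch:
# import boto3
# boto3.client("cloudwatch").put_metric_alarm(**alarm_params)
```

&lt;p&gt;The band width (the second argument) is tunable; a wider band means fewer, higher-confidence alerts.&lt;/p&gt;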

&lt;h3&gt;
  
  
  AWS X-Ray Insights
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/xray/latest/devguide/xray-console-insights.html" rel="noopener noreferrer"&gt;AWS X-Ray insights&lt;/a&gt; is a feature that keeps a continuous watch over trace data in your account to identify any issues that may occur. It uses machine learning to detect anomalies and patterns that could cause issues. When anomalies, error rates or fault rates surpass the expected range, it generates an insight that documents the issue and monitors its impact until it's resolved. It also helps you identify the issues's severity and the priority.&lt;/p&gt;

&lt;h3&gt;
  
  
  AWS DevOps Guru
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://aws.amazon.com/devops-guru/" rel="noopener noreferrer"&gt;DevOps Guru&lt;/a&gt; uses machine learning to improve application availability by detecting any anomalous behavior. How does it work? DevOps Guru employs machine learning to identify anomalies, when an anomaly is detected it generates an insight, which is a compilation of related anomalies within an analyzed resource.&lt;/p&gt;

&lt;p&gt;There are two types of insights:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;👉 Reactive Insights:&lt;/strong&gt; Contain anomalies with recommendations, related metrics and events.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;👉 Proactive Insights:&lt;/strong&gt; Tell you about issues that are predicted to affect your application in the future.&lt;/p&gt;

&lt;p&gt;You can receive notifications when an issue arises by setting up an SNS topic and configuring an email subscription to be alerted.&lt;/p&gt;
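&lt;p&gt;Registering the SNS topic with DevOps Guru is a single API call. A minimal sketch with boto3 (untested; the topic ARN is a placeholder):&lt;/p&gt;

```python
# Sketch: tell DevOps Guru to deliver insight notifications to an SNS topic.
# Subscribing an email address to that topic then delivers alerts by mail.
channel_config = {
    "Config": {
        "Sns": {
            # Placeholder ARN of an existing SNS topic
            "TopicArn": "arn:aws:sns:us-east-1:123456789012:devops-guru-alerts"
        }
    }
}

# Requires AWS credentials, so left commented out in this sketch:
# import boto3
# boto3.client("devops-guru").add_notification_channel(**channel_config)
```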

&lt;h2&gt;
  
  
  Better Dev Experience
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Amazon Q Developer Security Scanning
&lt;/h3&gt;

&lt;p&gt;This tool helps write secure code and is designed to support developers during the development process. While AIOps concentrates on operational aspects post-deployment, &lt;a href="https://docs.aws.amazon.com/amazonq/latest/qdeveloper-ug/security.html" rel="noopener noreferrer"&gt;Amazon Q Developer&lt;/a&gt; primarily targets the development phase. Both AI-powered tools enhance processes, but at different stages.&lt;/p&gt;

&lt;p&gt;We always want to minimize vulnerabilities and prevent them from reaching production; this is where Amazon Q Developer can be beneficial. It offers a &lt;strong&gt;security scanning&lt;/strong&gt; feature that scans your code, allowing you to catch vulnerabilities early in the process, before they reach production. You can view the details and relevant information for each finding, and it can also provide suggestions for fixing the issue.&lt;/p&gt;

&lt;p&gt;As security policies evolve, the tool incorporates new detectors, ensuring the scans stay up to date.&lt;/p&gt;

&lt;h2&gt;
  
  
  Closing Thoughts
&lt;/h2&gt;

&lt;p&gt;And this is the last blog post in this 3-part series where I share my journey of obtaining the &lt;strong&gt;DevOps and AI on AWS Specialization Certification&lt;/strong&gt; 🎉💪&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk5wav9qodkfrifmxwmej.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk5wav9qodkfrifmxwmej.png" alt=" " width="800" height="376"&gt;&lt;/a&gt;&lt;br&gt;
I learned a lot, and this specialization added tremendous value to me. I highly recommend it. Thanks for reading through. Sharing my journey with you all is a pleasure🙌.&lt;/p&gt;

&lt;p&gt;I'm happy to connect with you on &lt;a href="https://www.linkedin.com/in/lauradiaz91/" rel="noopener noreferrer"&gt;LinkedIn&lt;/a&gt;, feel free to send a DM and share your thoughts on this blog series.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>aiops</category>
      <category>security</category>
      <category>devops</category>
    </item>
    <item>
      <title>Course 2 of 3: CI/CD for Generative AI Applications ⚒️</title>
      <dc:creator>Laura</dc:creator>
      <pubDate>Tue, 07 Apr 2026 23:06:07 +0000</pubDate>
      <link>https://forem.com/lalidevops/course-2-of-3-cicd-for-generative-ai-applications-3i6a</link>
      <guid>https://forem.com/lalidevops/course-2-of-3-cicd-for-generative-ai-applications-3i6a</guid>
      <description>&lt;h3&gt;
  
  
  &lt;strong&gt;Table of Contents&lt;/strong&gt;
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Introduction&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Intro to DevOps&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Infrastructure As Code (IaC) in DevOps&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; The Hidden Cost of Manual Processes&lt;/li&gt;
&lt;li&gt; IaC Benefits&lt;/li&gt;
&lt;li&gt; AWS IaC Tools Overview&lt;/li&gt;
&lt;/ol&gt;


&lt;/li&gt;

&lt;li&gt;

&lt;p&gt;Code, Build And Test Phases&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; Code Phase&lt;/li&gt;
&lt;li&gt; Build Phase&lt;/li&gt;
&lt;li&gt; Test Phase&lt;/li&gt;
&lt;/ol&gt;


&lt;/li&gt;

&lt;li&gt;&lt;p&gt;AI Capabilities in DevOps Workflows&lt;/p&gt;&lt;/li&gt;

&lt;li&gt;&lt;p&gt;Testing GenAI Apps&lt;/p&gt;&lt;/li&gt;

&lt;li&gt;&lt;p&gt;Tests in the CI Flow&lt;/p&gt;&lt;/li&gt;

&lt;li&gt;&lt;p&gt;Continuous Integration (CI)&lt;/p&gt;&lt;/li&gt;

&lt;li&gt;

&lt;p&gt;Hands-on Labs: Set Up a CI/CD Pipeline&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; Create the CodePipeline&lt;/li&gt;
&lt;li&gt; Create the CodeDeploy project&lt;/li&gt;
&lt;li&gt; Adding CodeDeploy to CodePipeline&lt;/li&gt;
&lt;/ol&gt;


&lt;/li&gt;

&lt;li&gt;&lt;p&gt;Serverless Deployment Strategies&lt;/p&gt;&lt;/li&gt;

&lt;li&gt;&lt;p&gt;AWS CodeDeploy in Your Pipeline&lt;/p&gt;&lt;/li&gt;

&lt;li&gt;&lt;p&gt;CI/CD for Infrastructure&lt;/p&gt;&lt;/li&gt;

&lt;li&gt;&lt;p&gt;Automate Infra Deployment With CDK in CI/CD Pipeline&lt;/p&gt;&lt;/li&gt;

&lt;li&gt;

&lt;p&gt;Monitoring Your Infrastructure&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; Monitor, Log, and Audit With CloudTrail&lt;/li&gt;
&lt;li&gt; AWS CloudWatch&lt;/li&gt;
&lt;li&gt; A Powerful Mix: CloudWatch + CloudTrail&lt;/li&gt;
&lt;li&gt; Monitoring with AWS X-Ray&lt;/li&gt;
&lt;/ol&gt;


&lt;/li&gt;

&lt;li&gt;

&lt;p&gt;Operating with Confidence&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; Configuration Change Detection with AWS Config&lt;/li&gt;
&lt;li&gt; AWS Systems Manager&lt;/li&gt;
&lt;/ol&gt;


&lt;/li&gt;

&lt;li&gt;&lt;p&gt;Wrapping Up Part Two: The Journey Continues&lt;/p&gt;&lt;/li&gt;

&lt;/ol&gt;

&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;This is the second blog post in a series of three, where I share my experience studying for and earning the &lt;a href="https://www.coursera.org/specializations/devops-ai-aws" rel="noopener noreferrer"&gt;&lt;strong&gt;DevOps and AI on AWS Specialization from Coursera&lt;/strong&gt;&lt;/a&gt;. If you haven't read the first blog post, &lt;a href="https://dev.to/lalidevops/course-1-of-3-upgrading-apps-with-gen-ai-1ik"&gt;here's&lt;/a&gt; the link in case you want to have a look :)&lt;/p&gt;

&lt;h2&gt;
  
  
  Intro To DevOps
&lt;/h2&gt;

&lt;p&gt;The heart of the first module is the problem DevOps methodologies solve: getting software updates to production as quickly as possible while keeping quality high. DevOps mainly focuses on two things: &lt;strong&gt;collaboration&lt;/strong&gt; and &lt;strong&gt;automation&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;When talking about the steps involved in creating and deploying software, we refer to sharing our work in source code repositories, building (creating an artifact to deploy the application), and then testing the application to ensure everything functions as expected. We usually automate this process with &lt;strong&gt;continuous integration&lt;/strong&gt; to automatically build and run tests, so developers integrate their changes frequently.&lt;/p&gt;

&lt;p&gt;After the artifact is built and tested, we are ready to release and deploy; the software is ready to go. We take this artifact and deliver it to the servers that host the application. We can use tools for &lt;strong&gt;continuous deployment&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Another important aspect is automating the creation and updating of infrastructure and, ideally, we aim to include a step for proactive anomaly detection to quickly identify any unusual activity that might degrade our application's performance.&lt;/p&gt;

&lt;p&gt;In summary, the process involves continuous integration, automated testing, continuous deployment, and infrastructure as code.&lt;/p&gt;

&lt;h2&gt;
  
  
  Infrastructure As Code (IaC) in DevOps
&lt;/h2&gt;

&lt;p&gt;What is Infrastructure as Code? Let's see the definition based on &lt;a href="https://aws.amazon.com/what-is/iac/" rel="noopener noreferrer"&gt;AWS website&lt;/a&gt;:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Infrastructure as code (IaC) is the ability to provision and support your computing infrastructure using code instead of manual processes and settings. Any application environment requires many infrastructure components like operating systems, database connections, and storage. Developers have to regularly set up, update, and maintain the infrastructure to develop, test, and deploy applications.&lt;/p&gt;

&lt;p&gt;Manual infrastructure management is time-consuming and prone to error—especially when you manage applications at scale. Infrastructure as code lets you define your infrastructure's desired state without including all the steps to get to that state. It automates infrastructure management so developers can focus on building and improving applications instead of managing environments. Organizations use infrastructure as code to control costs, reduce risks, and respond with speed to new business opportunities.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;The ultimate goal is to automate as many tasks as possible. We move away from the manual approach and describe the tasks as files.&lt;/p&gt;

&lt;p&gt;Going back to the &lt;strong&gt;TravelGuide App&lt;/strong&gt; in &lt;a href="https://lalidev.hashnode.dev/course-1-of-3-upgrading-apps-with-gen-ai" rel="noopener noreferrer"&gt;Course 1&lt;/a&gt; of this DevOps and AI Specialization on AWS, we understand that the application's code requires a platform to operate, such as compute resources (in this case an EC2 instance).&lt;/p&gt;

&lt;p&gt;Our application upgrade also requires the creation of certain cloud resources. We are utilizing a Bedrock knowledge base that needs an S3 bucket as a data source. The EC2 instances running the application will need a VPC network. Additionally, an IAM role is required for authentication, along with IAM policies to manage permissions.&lt;/p&gt;

&lt;p&gt;Infrastructure also includes the configuration of these resources. This configuration is not static, eventually, we'll need to modify and maintain the infrastructure, and we must be agile when making these changes.&lt;/p&gt;

&lt;p&gt;In AWS, this is essential for scaling and automating tasks, particularly in AI-driven applications such as the generative AI feature in the &lt;strong&gt;TravelGuideApp&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Adopting &lt;strong&gt;IaC&lt;/strong&gt; eliminates repetitive manual tasks, ensuring infrastructure changes are documented, consistent, and easily auditable.&lt;/p&gt;

&lt;h3&gt;
  
  
  The Hidden Cost of Manual Processes
&lt;/h3&gt;

&lt;p&gt;💸 Manually stopping and starting EC2 instances or modifying configurations is time-consuming.&lt;/p&gt;

&lt;p&gt;💸 Inconsistent changes across instances increase the likelihood of mistakes.&lt;/p&gt;

&lt;p&gt;💸 Managing large numbers of resources is impractical without automation (hard to scale).&lt;/p&gt;

&lt;p&gt;💸 Manual changes lack documentation and a change history (limited traceability).&lt;/p&gt;

&lt;h3&gt;
  
  
  IaC Benefits
&lt;/h3&gt;

&lt;p&gt;👉 &lt;strong&gt;Documentation&lt;/strong&gt;: The code can act as a form of documentation. We can check the changes on the source control (full history changes).&lt;/p&gt;

&lt;p&gt;👉 &lt;strong&gt;Pull requests reviews:&lt;/strong&gt; Other members of the team can comment on the infrastructure changes.&lt;/p&gt;

&lt;p&gt;👉 &lt;strong&gt;Scale&lt;/strong&gt;: Manual work consumes time that could be better spent on other tasks. Automation works better at scale, when you need to make hundreds of changes.&lt;/p&gt;

&lt;p&gt;👉 &lt;strong&gt;No human error:&lt;/strong&gt; Automating tasks removes the human error that comes with manual processes.&lt;/p&gt;

&lt;h3&gt;
  
  
  AWS IaC Tools Overview
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/Welcome.html" rel="noopener noreferrer"&gt;&lt;strong&gt;AWS CloudFormation&lt;/strong&gt;&lt;/a&gt; is a service that helps you model and set up your AWS resources. You create a template (a JSON or YAML file) that describes all the AWS resources you want. It uses a declarative language to define the desired state.&lt;/p&gt;
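&lt;p&gt;To make "declarative" concrete, here is a minimal illustrative template, built as a Python dict and printed as JSON (the resource and bucket names are made up for this example): it states &lt;em&gt;what&lt;/em&gt; should exist, one S3 bucket, not the steps to create it.&lt;/p&gt;

```python
import json

# Minimal CloudFormation template (JSON form) declaring a desired state:
# a single S3 bucket. Names are illustrative placeholders.
template = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Description": "Example: one S3 bucket, declared instead of clicked together",
    "Resources": {
        "DataSourceBucket": {
            "Type": "AWS::S3::Bucket",
            "Properties": {"BucketName": "travel-guide-knowledge-base-data"},
        }
    },
}

print(json.dumps(template, indent=2))
```

&lt;p&gt;Handing this file to CloudFormation (via console or CLI) creates the bucket; changing the file and re-deploying updates it, and the file itself lives in source control as documentation.&lt;/p&gt;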

&lt;p&gt;The &lt;a href="https://docs.aws.amazon.com/cdk/v2/guide/home.html" rel="noopener noreferrer"&gt;AWS Cloud Development Kit or CDK&lt;/a&gt; is an open-source framework we use to model and provision cloud-based applications with familiar programming languages. CDK uses CloudFormation ​to create resources.&lt;/p&gt;

&lt;p&gt;More on these tools in the "CI/CD for Infrastructure" section later.&lt;/p&gt;

&lt;h2&gt;
  
  
  Code, Build And Test Phases
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Code Phase
&lt;/h3&gt;

&lt;p&gt;Without developer and operations teams being integrated and collaborating on a regular basis, it might take a while to figure things out when something goes wrong. With DevOps this has evolved: the two teams work as a unified team. This impacts how developers write and manage code from the beginning through deployment.&lt;/p&gt;

&lt;p&gt;Both teams work together more closely; &lt;strong&gt;collaboration&lt;/strong&gt; becomes &lt;strong&gt;key&lt;/strong&gt; here. They help each other and understand more about the whole application lifecycle. Developers need to be mindful of how the code will be deployed, monitored, and maintained, and how it performs in different environments.&lt;/p&gt;

&lt;p&gt;Operations can provide feedback on how the application is performing and share insights with developers so they can optimize the application or fix bugs. They may also assess code for security, observability, monitoring, and performance.&lt;/p&gt;

&lt;p&gt;Occasionally, code changes can affect operations in ways developers might not immediately consider. Involving individuals with different perspectives can help address potential issues early on. Proactive communication enables both teams to anticipate issues earlier in the development cycle.&lt;/p&gt;

&lt;p&gt;Without DevOps, much of the development process might be manual, making &lt;strong&gt;automation crucial&lt;/strong&gt;. You want processes that automate application and infrastructure deployment.&lt;/p&gt;

&lt;p&gt;Infrastructure-as-code tools allow you to create templates for infrastructure, write scripts for testing, and use automated deployment tools. This ensures that your code not only functions correctly but also includes the necessary assets for building, testing, and deploying through an automated pipeline.&lt;/p&gt;

&lt;p&gt;How developers commit and deploy code changes with continuous integration and continuous delivery/deployment, or &lt;strong&gt;CI/CD&lt;/strong&gt;. This enables your code to be tested more frequently with automated tools, allowing for incremental deployment of changes. By making numerous small updates instead of infrequent, larger ones, changes reach end users more quickly, accelerating the feedback loop.&lt;/p&gt;

&lt;p&gt;Developers should adopt the best practice of committing smaller, more frequent changes. Additionally, each commit must be production-ready, as it could be deployed to users at any moment.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjuqu9jcvuccav7uunl80.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjuqu9jcvuccav7uunl80.png" alt=" " width="800" height="340"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Finally, monitoring and feedback loops become integral to the development process. Developers become more involved in the operation of the application, integrating tools for logs and metrics to quickly identify issues.&lt;/p&gt;

&lt;h3&gt;
  
  
  Build Phase
&lt;/h3&gt;

&lt;p&gt;This phase comes right after the code phase. The build phase takes the code and compiles it if needed, depending on the programming language being used. This phase also includes automated testing and a linting step (linters ensure higher code quality).&lt;/p&gt;

&lt;p&gt;The end result of this phase would be an &lt;strong&gt;artifact&lt;/strong&gt; to be deployed.&lt;/p&gt;

&lt;p&gt;If any of these steps (retrieving dependencies, compiling, packaging, or testing) fails, it results in a broken build. A broken build indicates that the code in the deployment branch is not in a functional state. In this case we want to get our code back to a good state as quickly as possible.&lt;/p&gt;

&lt;p&gt;In modern DevOps practices, it's typical to run a build with every commit. &lt;a href="https://docs.aws.amazon.com/codebuild/latest/userguide/welcome.html" rel="noopener noreferrer"&gt;AWS CodeBuild&lt;/a&gt; is a fully managed service that compiles your source code, runs unit tests, and generates artifacts ready for deployment. It integrates with other AWS services like &lt;strong&gt;AWS CodePipeline&lt;/strong&gt; and &lt;strong&gt;AWS CodeDeploy&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;AWS CodeBuild&lt;/strong&gt; is the tool we use for continuous integration; it can scale up and run multiple builds while multiple developers are working on the application code.&lt;/p&gt;

&lt;p&gt;To configure builds for your application, include a &lt;code&gt;buildspec.yml&lt;/code&gt; file with your source code. This file outlines your desired build process, which AWS CodeBuild reads and executes.&lt;/p&gt;
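&lt;p&gt;A minimal illustrative &lt;code&gt;buildspec.yml&lt;/code&gt; could look like this (the npm commands are placeholders for a Node.js app, not from the course):&lt;/p&gt;

```yaml
version: 0.2

phases:
  install:
    commands:
      - npm ci          # restore dependencies
  build:
    commands:
      - npm run lint    # linting for code quality
      - npm test        # unit tests
      - npm run build   # produce the deployable output

artifacts:
  files:
    - '**/*'
  base-directory: dist
```

&lt;p&gt;CodeBuild runs the phases in order and packages everything under &lt;code&gt;base-directory&lt;/code&gt; as the build artifact; any failing command breaks the build.&lt;/p&gt;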

&lt;h3&gt;
  
  
  Test Phase
&lt;/h3&gt;

&lt;p&gt;Testing can also save money. The earlier we catch errors in the development process, the less expensive they are to fix.&lt;/p&gt;

&lt;p&gt;We have &lt;a href="https://www.geeksforgeeks.org/software-engineering/differences-between-functional-and-non-functional-testing/" rel="noopener noreferrer"&gt;&lt;strong&gt;Functional and Non-functional Testing&lt;/strong&gt;&lt;/a&gt;. We conduct various types of testing to prevent issues from occurring in production.&lt;/p&gt;

&lt;p&gt;While discussing continuous integration, we can add automated tests to every build. Functional testing begins with unit testing; these tests run quickly and provide fast feedback. We can include a linter to assess code quality. The goal of all tests is consistent: to prevent any defects from reaching production.&lt;/p&gt;

&lt;p&gt;By incorporating DevOps principles, we can automate much of our testing, ensuring rapid feedback if any changes introduce defects.&lt;/p&gt;

&lt;h2&gt;
  
  
  AI Capabilities in DevOps Workflows
&lt;/h2&gt;

&lt;p&gt;Generative AI applications, such as those using &lt;strong&gt;AWS Bedrock&lt;/strong&gt;, present distinct challenges and opportunities within a DevOps workflow.&lt;/p&gt;

&lt;p&gt;Returning to the &lt;strong&gt;TravelGuide App&lt;/strong&gt; from course 1, we aim to follow good DevOps practices, and you might wonder if working with &lt;strong&gt;GenAI&lt;/strong&gt; changes anything in the process. The app calls the AWS API in the same way other services do. Its behavior can vary based on customizations like prompt engineering and model fine-tuning. What sets it apart are the features of Bedrock that allow us to customize our responses.&lt;/p&gt;

&lt;p&gt;We can customize by fine-tuning or pre-training a model, and it is relatively straightforward to operationalize because we access features through an API rather than building our own model. If we customize something, we want to measure the benefits of that customization: have these changes improved my app? To measure improvement, Bedrock creates &lt;strong&gt;metrics&lt;/strong&gt; per model; we can run tests and check those metrics.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Useful metrics&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;em&gt;InvocationLatency&lt;/em&gt; (Measure response time changes due to prompt/model updates)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;em&gt;Input/Output Token Count&lt;/em&gt; (Track token usage to optimize cost and performance)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;em&gt;Number of Invocations&lt;/em&gt; (Monitor service usage patterns)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Custom Metrics (CloudWatch, capture user feedback directly through the application).&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
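&lt;p&gt;These Bedrock metrics live in CloudWatch, so they can be queried like any other metric. A sketch with boto3 (untested; the model ID is an example, and the actual call needs AWS credentials):&lt;/p&gt;

```python
# Sketch: query the Bedrock InvocationLatency metric for one model
# over the last day, averaged per hour.
from datetime import datetime, timedelta, timezone

now = datetime.now(timezone.utc)
query = {
    "Namespace": "AWS/Bedrock",
    "MetricName": "InvocationLatency",
    "Dimensions": [
        # Example model ID; use the one your app invokes
        {"Name": "ModelId", "Value": "anthropic.claude-3-haiku-20240307-v1:0"}
    ],
    "StartTime": now - timedelta(days=1),
    "EndTime": now,
    "Period": 3600,            # one datapoint per hour
    "Statistics": ["Average"],
}

# Requires AWS credentials, so left commented out in this sketch:
# import boto3
# resp = boto3.client("cloudwatch").get_metric_statistics(**query)
# for point in sorted(resp["Datapoints"], key=lambda p: p["Timestamp"]):
#     print(point["Timestamp"], point["Average"])
```

&lt;p&gt;Swapping the metric name for &lt;em&gt;Invocations&lt;/em&gt;, &lt;em&gt;InputTokenCount&lt;/em&gt;, or &lt;em&gt;OutputTokenCount&lt;/em&gt; covers the other metrics in the list above.&lt;/p&gt;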

&lt;h2&gt;
  
  
  Testing GenAI Apps
&lt;/h2&gt;

&lt;p&gt;The responses from Bedrock are non-deterministic, the same input can yield different outputs or behaviors in different runs, resulting in unpredictable and non-repeatable behavior.&lt;/p&gt;

&lt;p&gt;We can test the code in isolation because the unit tests never talk to Bedrock. We can write our own simulated responses instead of having to do setup to reproduce the same specific edge case. You can see this isn't too different from writing regular unit tests that expect a response from a database: I want to mock the response so I can control exactly what I get from the Bedrock service.&lt;/p&gt;
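&lt;p&gt;A small sketch of that idea (the wrapper function, knowledge base ID, and response text are hypothetical, not from the course): the code under test receives a Bedrock Agent Runtime client, and the test hands it a mock, so nothing leaves the process.&lt;/p&gt;

```python
from unittest.mock import MagicMock

def answer_question(client, kb_id, question):
    # Hypothetical app code wrapping the retrieve-and-generate call.
    resp = client.retrieve_and_generate(
        input={"text": question},
        retrieveAndGenerateConfiguration={
            "type": "KNOWLEDGE_BASE",
            "knowledgeBaseConfiguration": {
                "knowledgeBaseId": kb_id,
                "modelArn": "arn:aws:bedrock:::model/example",  # placeholder
            },
        },
    )
    return resp["output"]["text"]

def test_answer_question_returns_generated_text():
    fake_client = MagicMock()
    # Simulated Bedrock response: we control it exactly, no API call made,
    # so the non-deterministic service becomes deterministic in tests.
    fake_client.retrieve_and_generate.return_value = {
        "output": {"text": "Visit the old town in the morning."}
    }
    answer = answer_question(fake_client, "kb-123", "What should I do in Prague?")
    assert answer == "Visit the old town in the morning."

test_answer_question_returns_generated_text()
```

&lt;p&gt;The same pattern lets you fabricate error responses or odd edge cases that would be hard to reproduce against the live service.&lt;/p&gt;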

&lt;p&gt;Let's explore how to perform API calls to retrieve and generate data from the Bedrock knowledge base. First, we need to examine the response from the retrieve and generate API.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;HTTP/1.1 200
Content-type: application/json

{
   "citations": [ 
      { 
         "generatedResponsePart": { 
            "textResponsePart": { 
               "span": { 
                  "end": number,
                  "start": number
               },
               "text": "string"
            }
         },
         "retrievedReferences": [ 
            { 
               "content": { 
                  "audio": { 
                     "s3Uri": "string",
                     "transcription": "string"
                  },
                  "byteContent": "string",
                  "row": [ 
                     { 
                        "columnName": "string",
                        "columnValue": "string",
                        "type": "string"
                     }
                  ],
                  "text": "string",
                  "type": "string",
                  "video": { 
                     "s3Uri": "string",
                     "summary": "string"
                  }
               },
               "location": { 
                  "confluenceLocation": { 
                     "url": "string"
                  },
                  "customDocumentLocation": { 
                     "id": "string"
                  },
                  "kendraDocumentLocation": { 
                     "uri": "string"
                  },
                  "s3Location": { 
                     "uri": "string"
                  },
                  "salesforceLocation": { 
                     "url": "string"
                  },
                  "sharePointLocation": { 
                     "url": "string"
                  },
                  "sqlLocation": { 
                     "query": "string"
                  },
                  "type": "string",
                  "webLocation": { 
                     "url": "string"
                  }
               },
               "metadata": { 
                  "string" : JSON value 
               }
            }
         ]
      }
   ],
   "guardrailAction": "string",
   "output": { 
      "text": "string"
   },
   "sessionId": "string"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Response Elements (from &lt;a href="https://docs.aws.amazon.com/bedrock/latest/APIReference/API_agent-runtime_RetrieveAndGenerate.html" rel="noopener noreferrer"&gt;official docs&lt;/a&gt;):&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;citations&lt;/strong&gt;: A list of segments of the generated response that are based on sources in the knowledge base, alongside information about the sources.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;guardrailAction&lt;/strong&gt;: Indicates whether a guardrail intervention is present in the response.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;output&lt;/strong&gt;: Contains the response generated from querying the knowledge base.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;sessionId&lt;/strong&gt;: The unique identifier of the session. When you first make a &lt;code&gt;RetrieveAndGenerate&lt;/code&gt; request, Amazon Bedrock automatically generates this value. You must reuse this value for all subsequent requests in the same conversational session. This value allows Amazon Bedrock to maintain context and knowledge from previous interactions. You can't explicitly set the &lt;code&gt;sessionId&lt;/code&gt; yourself.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Each citation has a &lt;strong&gt;retrievedReferences&lt;/strong&gt; property containing one or more knowledge base sources and their metadata. The unit test can loop through the citations array and build a response that contains the generated text.&lt;/p&gt;
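&lt;p&gt;A sketch of that loop, using a trimmed sample shaped like the response schema above (the sample values are made up, and real references may use locations other than &lt;code&gt;s3Location&lt;/code&gt;):&lt;/p&gt;

```python
# Sketch: walk the citations array and collect each generated text segment
# together with the source locations it was grounded on.

def summarize_citations(response: dict) -> list:
    results = []
    for citation in response.get("citations", []):
        text = citation["generatedResponsePart"]["textResponsePart"]["text"]
        sources = [
            ref["location"].get("s3Location", {}).get("uri", "unknown")
            for ref in citation.get("retrievedReferences", [])
        ]
        results.append({"text": text, "sources": sources})
    return results

# Made-up sample following the schema shown earlier
sample = {
    "citations": [{
        "generatedResponsePart": {
            "textResponsePart": {"text": "Visit in May.", "span": {"start": 0, "end": 12}}
        },
        "retrievedReferences": [{
            "location": {"type": "S3", "s3Location": {"uri": "s3://guides/lisbon.md"}},
            "content": {"text": "..."},
        }],
    }]
}
assert summarize_citations(sample) == [{"text": "Visit in May.", "sources": ["s3://guides/lisbon.md"]}]
```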

&lt;h2&gt;
  
  
  Tests in the CI Flow
&lt;/h2&gt;

&lt;p&gt;At this point, the unit tests run locally in my environment, but what if I'd like to add them to the &lt;strong&gt;CI&lt;/strong&gt; &lt;strong&gt;pipeline&lt;/strong&gt;? &lt;strong&gt;AWS CodeBuild&lt;/strong&gt; is a great fit for running our tests.&lt;/p&gt;

&lt;p&gt;We may have a local script for running the tests, and we need to configure the steps that CodeBuild will execute in a &lt;code&gt;buildspec&lt;/code&gt; file. This file specifies what we want &lt;strong&gt;AWS CodeBuild&lt;/strong&gt; to do and which commands to run in a specific build.&lt;/p&gt;

&lt;p&gt;Here's the &lt;a href="https://docs.aws.amazon.com/codebuild/latest/userguide/build-spec-ref.html" rel="noopener noreferrer"&gt;official docs&lt;/a&gt; in case you want to have a look.&lt;/p&gt;
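&lt;p&gt;For reference, here is a minimal &lt;code&gt;buildspec.yml&lt;/code&gt; sketch for a Python project; the commands, file paths, and report names are assumptions, so adapt them to your repository:&lt;/p&gt;

```yaml
version: 0.2

phases:
  install:
    runtime-versions:
      python: 3.12
    commands:
      - pip install -r requirements.txt
  build:
    commands:
      # Lint, then run unit tests with coverage (tool choices are assumptions)
      - flake8 app
      - python -m pytest --cov=app --junitxml=reports/junit.xml

reports:
  unit-tests:
    files:
      - reports/junit.xml
```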

&lt;h2&gt;
  
  
  Continuous Integration (CI)
&lt;/h2&gt;

&lt;p&gt;We would like to run the tests every time a developer pushes a commit, and we need something to decide which step runs before the other, like an orchestrator. For this we can use a service called &lt;a href="https://docs.aws.amazon.com/codepipeline/latest/userguide/welcome.html" rel="noopener noreferrer"&gt;&lt;strong&gt;AWS CodePipeline&lt;/strong&gt;&lt;/a&gt;. We set up the pipeline once, and it restarts automatically whenever it detects that a new commit has been pushed.&lt;/p&gt;

&lt;p&gt;Pipelines are built from stages (logical pieces that describe a phase in the pipeline, e.g., Source, Build, Test). We can add different stages later on, like a deploy stage. Approval and invoke actions are also available to control what gets deployed to production; waiting on this final check is an extra layer of safety.&lt;/p&gt;

&lt;p&gt;We can also define actions (tasks executed within each stage) to run custom scripts, trigger other systems or perform checks. The customization is pretty good.&lt;/p&gt;

&lt;p&gt;When we release changes, push a new commit, or merge a pull request, the source stage detects the change in the repository, and each execution receives its own ID. While the pipeline runs, we can view its progress and the status of each stage in real time. If an action fails, we can retry it and view the build logs, which is very useful for debugging.&lt;/p&gt;

&lt;h2&gt;
  
  
  Hands-on Labs: Set Up a CI/CD Pipeline
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Create the CodePipeline
&lt;/h3&gt;

&lt;p&gt;We'll set up the pipeline for CI/CD, starting with a CodeBuild project to perform linting, unit testing, and code coverage reporting.&lt;/p&gt;

&lt;p&gt;At the top of the AWS Management Console, in the search bar, search for &lt;strong&gt;CodePipeline&lt;/strong&gt; and choose &lt;strong&gt;Build custom pipeline&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi0hqrum5v3k1ru4yetks.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi0hqrum5v3k1ru4yetks.png" alt=" " width="800" height="324"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;On the &lt;strong&gt;Choose pipeline settings&lt;/strong&gt; page, give it a name and configure the service role.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fncfidgj1pol9j3fgqnsy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fncfidgj1pol9j3fgqnsy.png" alt=" " width="800" height="456"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Continue to &lt;strong&gt;Add source stage&lt;/strong&gt; page and configure the source.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgbhy009uq5xub6xtdnw5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgbhy009uq5xub6xtdnw5.png" alt=" " width="800" height="318"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;On the &lt;strong&gt;Add build stage&lt;/strong&gt; page, configure the build provider.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl5m3fvfzt4929vyy1saa.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl5m3fvfzt4929vyy1saa.png" alt=" " width="800" height="318"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Select a name for your project and &lt;strong&gt;Create project&lt;/strong&gt;. This launches a new browser window to create the CodeBuild project.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8k7qne5vlja2kj12klv4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8k7qne5vlja2kj12klv4.png" alt=" " width="800" height="244"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For the &lt;strong&gt;Service role&lt;/strong&gt;, you can either choose an existing service role or create a new one.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5v8z38z08929h6hqe8l5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5v8z38z08929h6hqe8l5.png" alt=" " width="800" height="143"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the &lt;strong&gt;Buildspec&lt;/strong&gt; section, select &lt;strong&gt;Use a buildspec file&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc1birnkiomgl4609yahu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc1birnkiomgl4609yahu.png" alt=" " width="800" height="138"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Continue to &lt;strong&gt;CodePipeline&lt;/strong&gt;. At this point you can add a test stage, but this time I chose &lt;strong&gt;Skip test stage&lt;/strong&gt; and &lt;strong&gt;Skip deploy stage&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;On the &lt;strong&gt;Review&lt;/strong&gt; page, choose &lt;strong&gt;Create pipeline&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frekezj9p9xy5azgiby8e.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frekezj9p9xy5azgiby8e.png" alt=" " width="800" height="623"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once the pipeline is created, it automatically starts. Wait for both stages to display a status of &lt;strong&gt;Succeeded&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk3b9ki524dthqefvdxxb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk3b9ki524dthqefvdxxb.png" alt=" " width="800" height="285"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the Build stage, click the AWS CodeBuild link and review the &lt;em&gt;build logs&lt;/em&gt;. You can inspect both the &lt;strong&gt;Code Coverage&lt;/strong&gt; and &lt;strong&gt;Test&lt;/strong&gt; reports on the &lt;strong&gt;Reports&lt;/strong&gt; tab.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Create the CodeDeploy project&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;In this task, we'll create the CodeDeploy app and deployment group to deploy the application to an EC2 instance.&lt;/p&gt;

&lt;p&gt;Head over to the AWS console and search for a service called &lt;strong&gt;CodeDeploy&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Choose &lt;strong&gt;Applications&lt;/strong&gt; and &lt;strong&gt;Create application&lt;/strong&gt;. Select &lt;strong&gt;EC2 / On-premises&lt;/strong&gt; and &lt;strong&gt;Create application&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw17zegof3bedjctqbtu7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw17zegof3bedjctqbtu7.png" alt=" " width="800" height="230"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Next, choose &lt;strong&gt;Create deployment group&lt;/strong&gt; and select the &lt;strong&gt;service role&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb3zmgpzhenbssdbsjboa.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb3zmgpzhenbssdbsjboa.png" alt=" " width="800" height="340"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Configure the environment, choose &lt;strong&gt;Amazon EC2 instances&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F11gw49y8yqfcd2bd9qwl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F11gw49y8yqfcd2bd9qwl.png" alt=" " width="800" height="268"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the &lt;strong&gt;Agent configuration with AWS Systems Manager&lt;/strong&gt; section, for &lt;strong&gt;Install AWS CodeDeploy Agent&lt;/strong&gt;, select Never.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0ejq0ovd9rhomd7d2n44.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0ejq0ovd9rhomd7d2n44.png" alt=" " width="800" height="139"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For &lt;strong&gt;Load balancer&lt;/strong&gt;, clear &lt;strong&gt;Enable load balancing&lt;/strong&gt; and choose &lt;strong&gt;Create deployment group&lt;/strong&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  Adding CodeDeploy to CodePipeline
&lt;/h3&gt;

&lt;p&gt;In this task, we'll update the pipeline to add a new stage for deploying application updates to the EC2 instance. On the AWS Management Console search for &lt;strong&gt;CodePipeline&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Choose your app, in my case &lt;strong&gt;&lt;em&gt;travelapp-pipeline&lt;/em&gt;&lt;/strong&gt; pipeline, then choose &lt;strong&gt;Edit&lt;/strong&gt;. Under &lt;strong&gt;Edit: Build&lt;/strong&gt;, choose &lt;strong&gt;＋ Add stage&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frszfgoccruhmoggielaf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frszfgoccruhmoggielaf.png" alt=" " width="800" height="153"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For &lt;strong&gt;Stage name&lt;/strong&gt;, enter Deploy, then choose &lt;strong&gt;Add Stage&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjnvyr3k2379j2h8yg1s3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjnvyr3k2379j2h8yg1s3.png" alt=" " width="800" height="276"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Under &lt;strong&gt;Edit: Deploy&lt;/strong&gt;, choose &lt;strong&gt;＋ Add action group&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fven2icjhcgt4is8pivq3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fven2icjhcgt4is8pivq3.png" alt=" " width="800" height="160"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then add the following configuration:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn40sjae7v5bn2ge8ej8q.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn40sjae7v5bn2ge8ej8q.png" alt=" " width="800" height="366"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Finally, &lt;strong&gt;Save&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffsiiduwvshkn51cxk31b.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffsiiduwvshkn51cxk31b.png" alt=" " width="800" height="189"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Go back to CodePipeline, scroll to the top of the pipeline, and select &lt;strong&gt;Release change&lt;/strong&gt;. To confirm, click &lt;strong&gt;Release&lt;/strong&gt;. Wait for the new Deploy stage to show &lt;strong&gt;Succeeded&lt;/strong&gt;. The application has been successfully deployed by &lt;strong&gt;CodeDeploy&lt;/strong&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;Serverless Deployment Strategies&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;Various strategies exist for deploying serverless applications. To deploy them effectively, it's important to understand these strategies and consider factors such as rollback, scaling, and monitoring.&lt;/p&gt;

&lt;p&gt;Here's the definition of each deployment strategy from the &lt;a href="https://docs.aws.amazon.com/whitepapers/latest/introduction-devops-aws/deployment-strategies.html" rel="noopener noreferrer"&gt;official documentation&lt;/a&gt;:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Blue/Green Deployment&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/whitepapers/latest/blue-green-deployments/introduction.html" rel="noopener noreferrer"&gt;Blue/green deployments&lt;/a&gt; provide releases with near zero-downtime and rollback capabilities. The fundamental idea behind blue/green deployment is to shift traffic between two identical environments that are running different versions of your application. The blue environment represents the current application version serving production traffic. In parallel, the green environment is staged running a different version of your application. After the green environment is ready and tested, production traffic is redirected from blue to green. If any problems are identified, you can roll back by reverting traffic back to the blue environment.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;Linear Deployment&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Linear deployment means traffic is shifted in equal increments with an equal number of minutes between each increment. You can choose from predefined linear options that specify the percentage of traffic shifted in each increment and the number of minutes between each increment.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;Canary Deployment&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;The purpose of a &lt;a href="https://wa.aws.amazon.com/wellarchitected/2020-07-02T19-33-23/wat.concept.canary-deployment.en.html" rel="noopener noreferrer"&gt;canary deployment&lt;/a&gt; is to reduce the risk of deploying a new version that impacts the workload. The method will incrementally deploy the new version, making it visible to new users in a slow fashion. As you gain confidence in the deployment, you will deploy it to replace the current version in its entirety.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;All-at-Once Deployment&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;All-at-once deployment means all traffic is shifted from the original environment to the replacement environment all at once.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Selecting the appropriate deployment strategy depends on the application and its specific requirements.&lt;/p&gt;
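&lt;p&gt;With AWS SAM, for example, these strategies map directly onto &lt;code&gt;DeploymentPreference&lt;/code&gt; types for a Lambda function. A sketch (the resource name and handler below are hypothetical):&lt;/p&gt;

```yaml
TravelGuideFunction:
  Type: AWS::Serverless::Function
  Properties:
    Handler: app.handler          # hypothetical handler
    Runtime: python3.12
    AutoPublishAlias: live
    DeploymentPreference:
      # Shift 10% of traffic first, then the remainder after 5 minutes
      Type: Canary10Percent5Minutes
      # Other options include Linear10PercentEvery1Minute and AllAtOnce
```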

&lt;h2&gt;
  
  
  AWS CodeDeploy in Your Pipeline
&lt;/h2&gt;

&lt;p&gt;After the continuous integration phase completes all tests and receives approval, &lt;strong&gt;AWS CodeDeploy&lt;/strong&gt; is now ready to deploy the application to the production environment.&lt;/p&gt;

&lt;p&gt;To incorporate the continuous deployment phase, we need to update the pipeline to include &lt;strong&gt;CodeDeploy&lt;/strong&gt; in the CI/CD process. We can add a new stage, call it &lt;strong&gt;deploy&lt;/strong&gt;, and select CodeDeploy as the action provider.&lt;/p&gt;
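&lt;p&gt;For an EC2 deployment, CodeDeploy reads an &lt;code&gt;appspec.yml&lt;/code&gt; from the build artifact to know where to copy files and which lifecycle scripts to run. A minimal sketch (the install path and script names are assumptions):&lt;/p&gt;

```yaml
version: 0.0
os: linux
files:
  - source: /
    destination: /var/www/travelapp   # hypothetical install path
hooks:
  AfterInstall:
    - location: scripts/install_dependencies.sh
      timeout: 300
  ApplicationStart:
    - location: scripts/start_server.sh
      timeout: 60
```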

&lt;h2&gt;
  
  
  CI/CD for Infrastructure
&lt;/h2&gt;

&lt;p&gt;A good practice is to maintain two separate pipelines: one for the application code and one for infrastructure changes.&lt;/p&gt;

&lt;p&gt;Let's discuss elevating your automation by using infrastructure as code with AWS &lt;strong&gt;CloudFormation&lt;/strong&gt; and its role in a CI/CD workflow.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;AWS CloudFormation&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/cloudformation/" rel="noopener noreferrer"&gt;AWS CloudFormation&lt;/a&gt; is a service that allows us to define our infrastructure in a file (JSON or YAML) and have CloudFormation handle the provisioning. The file is the template (something you define) and the infrastructure components are called &lt;strong&gt;stack&lt;/strong&gt; (the infrastructure that gets created).&lt;/p&gt;

&lt;p&gt;Let's check the following example, in this case &lt;code&gt;AWS::S3::Bucket&lt;/code&gt; resource creates an Amazon S3 bucket:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"Type"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"AWS::S3::Bucket"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"Properties"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"AbacStatus"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;String&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"AccelerateConfiguration"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;AccelerateConfiguration&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"AccessControl"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;String&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"AnalyticsConfigurations"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;AnalyticsConfiguration&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;...&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"BucketEncryption"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;BucketEncryption&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"BucketName"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;String&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"CorsConfiguration"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;CorsConfiguration&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"IntelligentTieringConfigurations"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;IntelligentTieringConfiguration&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;...&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"InventoryConfigurations"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;InventoryConfiguration&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;...&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"LifecycleConfiguration"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;LifecycleConfiguration&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"LoggingConfiguration"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;LoggingConfiguration&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"MetadataConfiguration"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;MetadataConfiguration&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"MetadataTableConfiguration"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;MetadataTableConfiguration&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"MetricsConfigurations"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;MetricsConfiguration&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;...&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"NotificationConfiguration"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;NotificationConfiguration&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"ObjectLockConfiguration"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;ObjectLockConfiguration&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"ObjectLockEnabled"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;Boolean&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"OwnershipControls"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;OwnershipControls&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"PublicAccessBlockConfiguration"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;PublicAccessBlockConfiguration&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"ReplicationConfiguration"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;ReplicationConfiguration&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"Tags"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;Tag&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;...&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"VersioningConfiguration"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;VersioningConfiguration&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"WebsiteConfiguration"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;WebsiteConfiguration&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This code snippet is from the official &lt;a href="https://docs.aws.amazon.com/AWSCloudFormation/latest/TemplateReference/aws-resource-s3-bucket.html" rel="noopener noreferrer"&gt;AWS documentation&lt;/a&gt;. It lists all possible properties.&lt;/p&gt;

&lt;p&gt;Here's what it would look like in real life:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"AWSTemplateFormatVersion"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"2010-09-09"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"Resources"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"S3Bucket"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"Type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"AWS::S3::Bucket"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"Properties"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="nl"&gt;"PublicAccessBlockConfiguration"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
                    &lt;/span&gt;&lt;span class="nl"&gt;"BlockPublicAcls"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;false&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
                    &lt;/span&gt;&lt;span class="nl"&gt;"BlockPublicPolicy"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;false&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
                    &lt;/span&gt;&lt;span class="nl"&gt;"IgnorePublicAcls"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;false&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
                    &lt;/span&gt;&lt;span class="nl"&gt;"RestrictPublicBuckets"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;false&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="nl"&gt;"WebsiteConfiguration"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
                    &lt;/span&gt;&lt;span class="nl"&gt;"IndexDocument"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"index.html"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
                    &lt;/span&gt;&lt;span class="nl"&gt;"ErrorDocument"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"error.html"&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"DeletionPolicy"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Retain"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"UpdateReplacePolicy"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Retain"&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"BucketPolicy"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"Type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"AWS::S3::BucketPolicy"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"Properties"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="nl"&gt;"PolicyDocument"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
                    &lt;/span&gt;&lt;span class="nl"&gt;"Id"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"MyPolicy"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
                    &lt;/span&gt;&lt;span class="nl"&gt;"Version"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"2012-10-17"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;                 
                    &lt;/span&gt;&lt;span class="nl"&gt;"Statement"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
                        &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
                            &lt;/span&gt;&lt;span class="nl"&gt;"Sid"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"PublicReadForGetBucketObjects"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
                            &lt;/span&gt;&lt;span class="nl"&gt;"Effect"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Allow"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
                            &lt;/span&gt;&lt;span class="nl"&gt;"Principal"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"*"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
                            &lt;/span&gt;&lt;span class="nl"&gt;"Action"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"s3:GetObject"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
                            &lt;/span&gt;&lt;span class="nl"&gt;"Resource"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
                                &lt;/span&gt;&lt;span class="nl"&gt;"Fn::Join"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
                                    &lt;/span&gt;&lt;span class="s2"&gt;""&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
                                    &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
                                        &lt;/span&gt;&lt;span class="s2"&gt;"arn:aws:s3:::"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
                                        &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
                                            &lt;/span&gt;&lt;span class="nl"&gt;"Ref"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"S3Bucket"&lt;/span&gt;&lt;span class="w"&gt;
                                        &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
                                        &lt;/span&gt;&lt;span class="s2"&gt;"/*"&lt;/span&gt;&lt;span class="w"&gt;
                                    &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
                                &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
                            &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
                        &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
                    &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="nl"&gt;"Bucket"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
                    &lt;/span&gt;&lt;span class="nl"&gt;"Ref"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"S3Bucket"&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"Outputs"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"WebsiteURL"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"Value"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="nl"&gt;"Fn::GetAtt"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
                    &lt;/span&gt;&lt;span class="s2"&gt;"S3Bucket"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
                    &lt;/span&gt;&lt;span class="s2"&gt;"WebsiteURL"&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"Description"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"URL for website hosted on S3"&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"S3BucketSecureURL"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"Value"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="nl"&gt;"Fn::Join"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
                    &lt;/span&gt;&lt;span class="s2"&gt;""&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
                    &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
                        &lt;/span&gt;&lt;span class="s2"&gt;"https://"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
                        &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
                            &lt;/span&gt;&lt;span class="nl"&gt;"Fn::GetAtt"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
                                &lt;/span&gt;&lt;span class="s2"&gt;"S3Bucket"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
                                &lt;/span&gt;&lt;span class="s2"&gt;"DomainName"&lt;/span&gt;&lt;span class="w"&gt;
                            &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
                        &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
                    &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"Description"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Name of S3 bucket to hold website content"&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Or the YAML version:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="err"&gt;AWSTemplateFormatVersion:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;2010-09-09&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="err"&gt;Resources:&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="err"&gt;S&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="err"&gt;Bucket:&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="err"&gt;Type:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;AWS::S&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="err"&gt;::Bucket&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="err"&gt;Properties:&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="err"&gt;PublicAccessBlockConfiguration:&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="err"&gt;BlockPublicAcls:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;false&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="err"&gt;BlockPublicPolicy:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;false&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="err"&gt;IgnorePublicAcls:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;false&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="err"&gt;RestrictPublicBuckets:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;false&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="err"&gt;WebsiteConfiguration:&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="err"&gt;IndexDocument:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;index.html&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="err"&gt;ErrorDocument:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;error.html&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="err"&gt;DeletionPolicy:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;Retain&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="err"&gt;UpdateReplacePolicy:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;Retain&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="err"&gt;BucketPolicy:&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="err"&gt;Type:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;AWS::S&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="err"&gt;::BucketPolicy&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="err"&gt;Properties:&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="err"&gt;PolicyDocument:&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="err"&gt;Id:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;MyPolicy&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="err"&gt;Version:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;2012-10-17&lt;/span&gt;&lt;span class="w"&gt;                  
        &lt;/span&gt;&lt;span class="err"&gt;Statement:&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="err"&gt;-&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;Sid:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;PublicReadForGetBucketObjects&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="err"&gt;Effect:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;Allow&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="err"&gt;Principal:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;'*'&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="err"&gt;Action:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;'s&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="err"&gt;:GetObject'&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="err"&gt;Resource:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;!Join&lt;/span&gt;&lt;span class="w"&gt; 
              &lt;/span&gt;&lt;span class="err"&gt;-&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;''&lt;/span&gt;&lt;span class="w"&gt;
              &lt;/span&gt;&lt;span class="err"&gt;-&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;-&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;'arn:aws:s&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="err"&gt;:::'&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="err"&gt;-&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;!Ref&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;S&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="err"&gt;Bucket&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="err"&gt;-&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;/*&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="err"&gt;Bucket:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;!Ref&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;S&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="err"&gt;Bucket&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="err"&gt;Outputs:&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="err"&gt;WebsiteURL:&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="err"&gt;Value:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;!GetAtt&lt;/span&gt;&lt;span class="w"&gt; 
      &lt;/span&gt;&lt;span class="err"&gt;-&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;S&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="err"&gt;Bucket&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="err"&gt;-&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;WebsiteURL&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="err"&gt;Description:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;URL&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;for&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;website&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;hosted&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;on&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;S&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="err"&gt;S&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="err"&gt;BucketSecureURL:&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="err"&gt;Value:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;!Join&lt;/span&gt;&lt;span class="w"&gt; 
      &lt;/span&gt;&lt;span class="err"&gt;-&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;''&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="err"&gt;-&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;-&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;'https://'&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="err"&gt;-&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;!GetAtt&lt;/span&gt;&lt;span class="w"&gt; 
          &lt;/span&gt;&lt;span class="err"&gt;-&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;S&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="err"&gt;Bucket&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="err"&gt;-&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;DomainName&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="err"&gt;Description:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;Name&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;of&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;S&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;bucket&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;to&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;hold&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;website&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;content&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;When you update an existing stack or add a new one, &lt;strong&gt;CloudFormation&lt;/strong&gt; reviews the template for changes and executes the update. Similarly, for deletions, it identifies the differences and proceeds accordingly. If you wish to keep certain resources from being deleted, you can use the &lt;strong&gt;DeletionPolicy&lt;/strong&gt; attribute.&lt;/p&gt;

&lt;p&gt;Variables are not supported in CloudFormation, but the same functionality can be achieved using &lt;strong&gt;parameters&lt;/strong&gt;, &lt;strong&gt;mappings&lt;/strong&gt; (for configuration lookups), and &lt;strong&gt;dynamic references&lt;/strong&gt; (to securely retrieve values from AWS Secrets Manager or Systems Manager Parameter Store).&lt;/p&gt;
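
&lt;p&gt;For example, a template can declare a parameter and interpolate it with &lt;strong&gt;Fn::Sub&lt;/strong&gt;, or pull a value from Systems Manager Parameter Store with a dynamic reference. This is only a sketch: the parameter name, bucket naming scheme, and SSM path below are made up for illustration.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Parameters:
  EnvName:
    Type: String
    Default: dev
    AllowedValues: [dev, prod]
Resources:
  S3Bucket:
    Type: AWS::S3::Bucket
    Properties:
      # Interpolate the parameter into the bucket name
      BucketName: !Sub 'my-website-${EnvName}'
      Tags:
        # Dynamic reference, resolved from SSM Parameter Store at deploy time
        # (/myapp/owner is a hypothetical parameter path)
        - Key: Owner
          Value: '{{resolve:ssm:/myapp/owner}}'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;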

&lt;p&gt;We can also use nested stacks to break down a complex template into smaller, reusable stacks. CloudFormation Modules are reusable, self-contained configurations that can be shared across teams and projects (much like libraries).&lt;/p&gt;

&lt;p&gt;To prevent errors in CloudFormation, we can use a powerful feature called &lt;a href="https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/using-cfn-updating-stacks-changesets.html" rel="noopener noreferrer"&gt;&lt;strong&gt;change sets&lt;/strong&gt;&lt;/a&gt;: it predicts the results of a stack update operation so we can check that those are the changes we want before proceeding. You create a change set like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws cloudformation create-change-set --stack-name MyStack \
    --change-set-name SampleChangeSet --use-previous-template \
    --parameters \
      ParameterKey="InstanceType",UsePreviousValue=true \
      ParameterKey="KeyPairName",UsePreviousValue=true \
      ParameterKey="Purpose",ParameterValue="production"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This way, we have a chance to review changes more carefully.&lt;/p&gt;
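
&lt;p&gt;Once the change set exists, you can inspect the proposed changes and, only if they look right, execute them (using the same stack and change set names as above):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Show exactly what the update would add, modify, or remove
aws cloudformation describe-change-set --stack-name MyStack \
    --change-set-name SampleChangeSet

# Apply the changes once reviewed
aws cloudformation execute-change-set --stack-name MyStack \
    --change-set-name SampleChangeSet
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;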

&lt;p&gt;&lt;strong&gt;AWS CDK&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The &lt;a href="https://docs.aws.amazon.com/cdk/v2/guide/home.html" rel="noopener noreferrer"&gt;AWS Cloud Development Kit (CDK)&lt;/a&gt; is an open-source software development framework that allows developers to define cloud infrastructure using familiar programming languages.&lt;/p&gt;

&lt;p&gt;CDK provides pre-built components called &lt;strong&gt;constructs&lt;/strong&gt; which are abstractions of AWS resources.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;This is a collection of pre-written modular and reusable pieces of code, called constructs, that you can use, modify, and integrate to develop your infrastructure quickly.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;There are different construct levels; for details, you can refer to the &lt;a href="https://docs.aws.amazon.com/cdk/v2/guide/constructs.html#constructs-lib-levels" rel="noopener noreferrer"&gt;official docs&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;AWS CDK works out of the box with AWS CloudFormation to deploy and provision infrastructure on AWS.&lt;/p&gt;
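
&lt;p&gt;As a rough sketch, here is how the S3 website template from earlier could look in CDK with TypeScript. The stack name and construct IDs are illustrative; CDK synthesizes this into a CloudFormation template at deploy time.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import * as cdk from 'aws-cdk-lib';
import * as s3 from 'aws-cdk-lib/aws-s3';

class StaticSiteStack extends cdk.Stack {
  constructor(scope: cdk.App, id: string) {
    super(scope, id);

    // L2 construct: one object replaces dozens of template lines
    const bucket = new s3.Bucket(this, 'S3Bucket', {
      websiteIndexDocument: 'index.html',
      websiteErrorDocument: 'error.html',
      removalPolicy: cdk.RemovalPolicy.RETAIN, // maps to DeletionPolicy: Retain
    });

    new cdk.CfnOutput(this, 'WebsiteURL', {
      value: bucket.bucketWebsiteUrl,
      description: 'URL for website hosted on S3',
    });
  }
}

const app = new cdk.App();
new StaticSiteStack(app, 'StaticSiteStack');
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;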

&lt;p&gt;Choosing between &lt;strong&gt;AWS CDK&lt;/strong&gt; and &lt;strong&gt;CloudFormation&lt;/strong&gt; depends on the specific scenario, team expertise, and project complexity.&lt;/p&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;Automate Infra Deployment With CDK in CI/CD Pipeline&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;To &lt;strong&gt;automate&lt;/strong&gt; deployment with CDK in a CI/CD pipeline, add a &lt;strong&gt;Deploy&lt;/strong&gt; stage to the previously built CodePipeline and release the change.&lt;/p&gt;

&lt;p&gt;In this case, the application will authenticate with the AWS services DynamoDB and Bedrock using the IAM role and instance profile created by the CDK stack.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F806n1tgk9wxz0vjaoa11.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F806n1tgk9wxz0vjaoa11.png" alt=" " width="800" height="246"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Monitoring Your Infrastructure
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Monitor, Log, and Audit With CloudTrail
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/awscloudtrail/latest/userguide/cloudtrail-user-guide.html" rel="noopener noreferrer"&gt;AWS CloudTrail&lt;/a&gt; is a service that enables you to track API actions within an AWS account. It allows you to view the activity of resources and users based on the API calls made in the account. For example, if someone creates an S3 bucket, CloudTrail will log that.&lt;/p&gt;

&lt;p&gt;This service is essential for tracking activity, as it helps identify misconfigurations or unauthorized access. It's also valuable for forensic analysis after a security event, providing historical data on all users and activities, and supports centralized management across multiple accounts.&lt;/p&gt;

&lt;p&gt;This service offers several benefits, including enhanced cross-account visibility, simplified compliance auditing, and easier troubleshooting through aggregated logs.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;AWS CloudWatch&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;When issues arise with your application, you'll want to address them as soon as possible and even before your customers realize it. A fantastic monitoring and observability service is &lt;strong&gt;AWS&lt;/strong&gt; &lt;strong&gt;CloudWatch&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;AWS CloudWatch already comes with some metrics out of the box, but the cool thing is that you can create &lt;strong&gt;custom metrics&lt;/strong&gt;. You can also set up alarms based on those metrics, allowing the team to receive notifications when a metric enters the alarm state. This service includes more features like dashboards, logs, events, network monitoring, and more.&lt;/p&gt;
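
&lt;p&gt;For instance, an application could publish a custom metric and the team could alarm on it like this (the namespace, metric name, and threshold are hypothetical examples):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Publish one data point to a hypothetical custom metric
aws cloudwatch put-metric-data --namespace "MyApp" \
    --metric-name "CheckoutErrors" --value 1

# Alarm when the metric sums to 5 or more over a 5-minute period
aws cloudwatch put-metric-alarm --alarm-name "checkout-errors-high" \
    --namespace "MyApp" --metric-name "CheckoutErrors" \
    --statistic Sum --period 300 --evaluation-periods 1 \
    --threshold 5 --comparison-operator GreaterThanOrEqualToThreshold
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;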

&lt;p&gt;For deployment and CI/CD, &lt;strong&gt;Amazon CloudWatch&lt;/strong&gt; is invaluable because it can immediately collect metrics and logs each time a new version of your application is deployed.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/Install-CloudWatch-Agent.html" rel="noopener noreferrer"&gt;&lt;strong&gt;CloudWatch Logs agent&lt;/strong&gt;&lt;/a&gt; is a software component that allows you to collect logs from your services. It allows you to monitor your infrastructure and applications more thoroughly than the default basic monitoring.&lt;/p&gt;

&lt;p&gt;Finally, I wanted to talk about another cool feature called &lt;a href="https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/CloudWatch_Anomaly_Detection.html" rel="noopener noreferrer"&gt;CloudWatch anomaly detection&lt;/a&gt;: once you enable it for a metric, it uses machine learning to automatically identify unusual patterns in your data. Isn't that amazing?&lt;/p&gt;
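
&lt;p&gt;Enabling it from the CLI can look like this (the instance ID is a placeholder, and the exact flag shapes can vary slightly between CLI versions):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Train an anomaly detection model on an EC2 instance's CPU metric
aws cloudwatch put-anomaly-detector --namespace AWS/EC2 \
    --metric-name CPUUtilization --stat Average \
    --dimensions Name=InstanceId,Value=i-0123456789abcdef0
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;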

&lt;h3&gt;
  
  
  A Powerful Mix: CloudWatch + CloudTrail
&lt;/h3&gt;

&lt;p&gt;Did you know that you can create CloudWatch metrics and alarms from CloudTrail data? Simply create a new trail, follow the on-screen steps, enable CloudWatch Logs, and select the log events; then perform some actions so the new trail has events to record.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2ct0g5m2iqaj68jpyoyw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2ct0g5m2iqaj68jpyoyw.png" alt=" " width="800" height="233"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Review and create the trail. Generate data by navigating the console and exploring various services so the new trail has activity to log. Whether actions are performed through the Management Console or programmatically, CloudTrail captures them all.&lt;/p&gt;

&lt;p&gt;Finally, we can view the CloudTrail logs from the Trails menu option. However, these raw trails are somewhat difficult to scan when you're searching for a specific pattern or detail. If you enabled CloudWatch Logs, you can browse the events not as compressed text files but as structured logs, which makes it easier to search and navigate the data. We can create filters, and with these filters, we can create &lt;strong&gt;alarms&lt;/strong&gt;.&lt;/p&gt;
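
&lt;p&gt;For example, a metric filter over the CloudTrail log group can count unauthorized API calls so an alarm can fire on them. A minimal sketch (the log group name is hypothetical; the filter pattern is the one AWS documents for unauthorized calls):&lt;/p&gt;

```python
def build_metric_filter(log_group="CloudTrail/DefaultLogGroup"):
    """Parameters for CloudWatch Logs PutMetricFilter over CloudTrail events."""
    return {
        "logGroupName": log_group,
        "filterName": "UnauthorizedApiCalls",
        "filterPattern": '{ ($.errorCode = "*UnauthorizedOperation") || ($.errorCode = "AccessDenied*") }',
        "metricTransformations": [
            {
                "metricName": "UnauthorizedApiCallCount",
                "metricNamespace": "CloudTrailMetrics",
                "metricValue": "1",  # each matching event counts as 1
            }
        ],
    }

def create_filter(log_group):
    import boto3  # deferred import so the sketch loads without boto3 installed
    boto3.client("logs").put_metric_filter(**build_metric_filter(log_group))
```

&lt;p&gt;An alarm on &lt;code&gt;UnauthorizedApiCallCount&lt;/code&gt; then notifies the team whenever the count rises above your threshold.&lt;/p&gt;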

&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/awscloudtrail/latest/userguide/cloudwatch-alarms-for-cloudtrail.html" rel="noopener noreferrer"&gt;Here's&lt;/a&gt; more about this in the official documentation.&lt;/p&gt;

&lt;h3&gt;
  
  
  Monitoring with AWS X-Ray
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/xray/latest/devguide/aws-xray.html" rel="noopener noreferrer"&gt;AWS X-Ray&lt;/a&gt; is a service that gathers data on requests handled by your application and offers tools to view, filter, and analyze this data, helping you identify issues and opportunities for optimization. It can track requests across various AWS resources and microservices applications, allowing you to identify delays and errors.&lt;/p&gt;

&lt;p&gt;To capture data from our applications, we need to use &lt;a href="https://aws.amazon.com/otel/" rel="noopener noreferrer"&gt;AWS Distro for OpenTelemetry&lt;/a&gt; (ADOT). We can use an &lt;strong&gt;OpenTelemetry SDK&lt;/strong&gt; to instrument our application and an ADOT collector to receive and export traces to the &lt;strong&gt;AWS&lt;/strong&gt; &lt;strong&gt;X-Ray&lt;/strong&gt; service. The OpenTelemetry SDKs are an industry standard for tracing instrumentation: they support &lt;strong&gt;AWS X-Ray&lt;/strong&gt;, offer a large number of instrumentation libraries for many different languages, and are vendor-agnostic. Alternatively, you can use the &lt;a href="https://docs.aws.amazon.com/xray/latest/devguide/aws-xray-interface-sdk.html" rel="noopener noreferrer"&gt;AWS X-Ray SDK&lt;/a&gt;, AWS's proprietary distributed tracing solution, which is tightly integrated with the AWS ecosystem.&lt;/p&gt;

&lt;p&gt;After installing OpenTelemetry to instrument the application and beginning to receive traces in the &lt;strong&gt;AWS X-Ray&lt;/strong&gt; service, we can now view the application's activity. This activity can be accessed on the X-Ray console, where you can view the service map generated from the traces, trace requests through various components, identify errors, and examine a visual map that illustrates the flow of requests.&lt;/p&gt;
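
&lt;p&gt;Under the hood, X-Ray ingests JSON segment documents. You normally never build these by hand (the SDK or ADOT does it for you), but a stdlib-only sketch of the wire format makes the tracing model concrete:&lt;/p&gt;

```python
import os
import time

def new_segment(name):
    """Minimal X-Ray segment document: a named, timed unit of work."""
    now = time.time()
    return {
        "name": name,
        "id": os.urandom(8).hex(),  # 16 hex chars
        # trace_id format: version 1, epoch seconds in hex, 96 random bits
        "trace_id": "1-{:08x}-{}".format(int(now), os.urandom(12).hex()),
        "start_time": now,
        "end_time": now + 0.05,
    }

def send(segment):
    """Upload the segment to X-Ray (requires AWS credentials)."""
    import json
    import boto3  # deferred import so the sketch loads without boto3 installed
    boto3.client("xray").put_trace_segments(
        TraceSegmentDocuments=[json.dumps(segment)]
    )
```

&lt;p&gt;Every trace in the service map is ultimately built from documents like this one, stitched together by the shared &lt;code&gt;trace_id&lt;/code&gt;.&lt;/p&gt;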

&lt;h2&gt;
  
  
  Operating with Confidence
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Configuration Change Detection with AWS Config
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://aws.amazon.com/config/" rel="noopener noreferrer"&gt;AWS Config&lt;/a&gt; is a service that offers a comprehensive view of the resources linked to our AWS account. It details their configurations, interrelationships, and any changes in these configurations and relationships over time. &lt;strong&gt;AWS Config&lt;/strong&gt; can create a dashboard displaying noncompliant resources, it helps us understand the state of our AWS resources and how they evolve over time.&lt;/p&gt;

&lt;p&gt;Even though &lt;strong&gt;AWS Config&lt;/strong&gt; provides &lt;a href="https://docs.aws.amazon.com/config/latest/developerguide/evaluate-config_use-managed-rules.html" rel="noopener noreferrer"&gt;&lt;strong&gt;AWS managed rules&lt;/strong&gt;&lt;/a&gt;, you can also create &lt;strong&gt;custom rules&lt;/strong&gt; to check whatever you need and report the results back to AWS Config.&lt;/p&gt;
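
&lt;p&gt;A custom rule is just a Lambda function that AWS Config invokes with the resource's configuration item and a result token. The sketch below, a hypothetical "every resource must have an Owner tag" rule, shows the shape:&lt;/p&gt;

```python
import json

def evaluate(configuration_item):
    """Toy compliance check: resources must carry an 'Owner' tag."""
    tags = configuration_item.get("tags") or {}
    return "COMPLIANT" if "Owner" in tags else "NON_COMPLIANT"

def lambda_handler(event, context):
    """Entry point AWS Config invokes for the custom rule."""
    item = json.loads(event["invokingEvent"])["configurationItem"]
    evaluation = {
        "ComplianceResourceType": item["resourceType"],
        "ComplianceResourceId": item["resourceId"],
        "ComplianceType": evaluate(item),
        "OrderingTimestamp": item["configurationItemCaptureTime"],
    }
    import boto3  # deferred import: only needed when running inside Lambda
    boto3.client("config").put_evaluations(
        Evaluations=[evaluation], ResultToken=event["resultToken"]
    )
```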

&lt;p&gt;While &lt;strong&gt;AWS CloudTrail&lt;/strong&gt; acts as a record, providing evidence of activities within the infrastructure, &lt;strong&gt;AWS Config&lt;/strong&gt; highlights the changes that occur. Each service has a distinct log format, so CloudTrail details every aspect of the API call, whereas Config shows the state of the resource before and after a change.&lt;/p&gt;

&lt;p&gt;🙌 AWS Config highlights the changes on the resource side, while CloudTrail provides the evidence.&lt;/p&gt;

&lt;p&gt;Some &lt;strong&gt;benefits&lt;/strong&gt;:&lt;/p&gt;

&lt;p&gt;👉 Integrates effectively with other AWS services&lt;/p&gt;

&lt;p&gt;👉 It monitors configuration changes for supported AWS services, logs the details, and maintains a history for analysis.&lt;/p&gt;

&lt;p&gt;👉 The service can automatically evaluate resource configurations against compliance rules and identify any deviations.&lt;/p&gt;

&lt;h3&gt;
  
  
  AWS Systems Manager
&lt;/h3&gt;

&lt;p&gt;We can gain centralized operational insights with &lt;a href="https://aws.amazon.com/systems-manager/" rel="noopener noreferrer"&gt;AWS Systems Manager&lt;/a&gt;. It's a managed service that we can use to view and control the infrastructure on AWS. It simplifies operations such as patch management, configuration updates, and instance monitoring through a single interface, instead of managing servers manually.&lt;/p&gt;

&lt;p&gt;It's designed to assist in mitigating and recovering from &lt;a href="https://docs.aws.amazon.com/incident-manager/latest/userguide/what-is-incident-manager.html" rel="noopener noreferrer"&gt;incidents&lt;/a&gt; that impact our applications hosted on AWS.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Key features&lt;/strong&gt; include executing scripts or predefined workflows to simplify repetitive tasks, applying security updates across instances, securely storing and retrieving credentials, obtaining metadata about instances, and managing them without the need for SSH, enhancing security.&lt;/p&gt;

&lt;p&gt;There's an interesting tool in AWS Systems Manager called &lt;a href="https://docs.aws.amazon.com/systems-manager/latest/userguide/systems-manager-parameter-store.html" rel="noopener noreferrer"&gt;Parameter Store&lt;/a&gt;. It provides secure storage for secrets and other configuration data, helps enforce security, and can be referenced from scripts, code, or CloudFormation resources.&lt;/p&gt;
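
&lt;p&gt;Reading a value from Parameter Store is a single call. A minimal sketch with an injectable client (the parameter name is hypothetical):&lt;/p&gt;

```python
def get_parameter(name, ssm=None):
    """Fetch a (possibly encrypted) parameter value from Parameter Store."""
    if ssm is None:
        import boto3  # deferred import so the sketch loads without boto3 installed
        ssm = boto3.client("ssm")
    response = ssm.get_parameter(Name=name, WithDecryption=True)
    return response["Parameter"]["Value"]
```

&lt;p&gt;&lt;code&gt;WithDecryption=True&lt;/code&gt; means SecureString parameters come back already decrypted, so scripts never need to handle the KMS key directly.&lt;/p&gt;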

&lt;h2&gt;
  
  
  Wrapping Up Part Two: The Journey Continues
&lt;/h2&gt;

&lt;p&gt;If you made it this far, THANK YOU so much 🥹🙏, it really means a lot. I started writing this series simply to document what I've been learning, and if even one person finds it useful or feels a little more inspired to explore AWS and cloud, then it was all worth it.&lt;/p&gt;

&lt;p&gt;I especially hope this reaches other women who are curious about cloud but maybe haven't made the jump yet, trust me, you belong here just as much as anyone else, and the community is bigger and more welcoming than you might think.&lt;/p&gt;

&lt;p&gt;Part 3 is on its way, so stay tuned, there's still more to cover and more to share.&lt;/p&gt;

&lt;p&gt;See you in the next one 🚀&lt;/p&gt;

</description>
      <category>aws</category>
      <category>genai</category>
      <category>devops</category>
      <category>awsdevops</category>
    </item>
    <item>
      <title>Course 1 of 3: Upgrading Apps with Gen AI 🤖</title>
      <dc:creator>Laura</dc:creator>
      <pubDate>Tue, 07 Apr 2026 22:45:55 +0000</pubDate>
      <link>https://forem.com/lalidevops/course-1-of-3-upgrading-apps-with-gen-ai-1ik</link>
      <guid>https://forem.com/lalidevops/course-1-of-3-upgrading-apps-with-gen-ai-1ik</guid>
      <description>&lt;h3&gt;
  
  
  Table of Contents
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Introduction&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Upgrading a TravelGuide App with Generative AI&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Some Basic Concepts Before We Proceed (Skip If You Want)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Amazon Bedrock: The Problems It Actually Solves&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Custom Knowledge Bases And Model Customization in Amazon Bedrock&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Hands On Lab: Creating a Knowledge Base&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;AI Safety Controls with Amazon Bedrock Guardrails&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Integrating AI Models Through the Bedrock API&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Engineering Better Conversations: Prompt Engineering&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Next Steps&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h1&gt;
  
  
  Introduction
&lt;/h1&gt;

&lt;p&gt;I wanted to sharpen my AWS skills and understand where GenAI actually fits in the DevOps pipeline, not just the hype around it. I did some research and came across &lt;a href="https://aws.amazon.com/blogs/training-and-certification/devops-and-ai-on-aws/" rel="noopener noreferrer"&gt;this&lt;/a&gt; blog post.&lt;/p&gt;

&lt;p&gt;Two things were on my mind: deepen my AWS expertise and understand how to responsibly integrate GenAI into production systems.&lt;/p&gt;

&lt;p&gt;This is the first blog post in a series of three, where I share my experience studying and earning the &lt;a href="https://www.coursera.org/specializations/devops-ai-aws" rel="noopener noreferrer"&gt;DevOps and AI on AWS Specialization from Coursera&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Upgrading A TravelGuide App with Generative AI
&lt;/h2&gt;

&lt;p&gt;The first course is called "&lt;a href="https://www.coursera.org/learn/upgrading-apps-generative-ai?specialization=devops-ai-aws" rel="noopener noreferrer"&gt;DevOps and AI on AWS: Upgrading Apps with Generative AI&lt;/a&gt;". The title is pretty self-explanatory but, to be completely honest, I’d never used an AWS GenAI service like Amazon Bedrock before.&lt;/p&gt;

&lt;p&gt;I had some experience integrating the Gemini API into a frontend app. If you've been following my previous blog posts, you probably know that I used to be a developer, and because I wanted to learn how to integrate AI into a Next.js app, I built an app called &lt;a href="https://github.com/lalidiaz/SummarizeAI" rel="noopener noreferrer"&gt;SummarAI&lt;/a&gt;. This app lets you upload a PDF and instantly extracts key insights using AI. On that occasion, I worked with the Gemini API, but I knew AWS offers an interesting set of AI tools, and the absolute star is &lt;a href="https://aws.amazon.com/bedrock/?trk=6316d34d-2e8a-4ef0-b9d9-022774976ae5&amp;amp;sc_channel=ps&amp;amp;trk=6316d34d-2e8a-4ef0-b9d9-022774976ae5&amp;amp;sc_channel=ps&amp;amp;ef_id=Cj0KCQiA4eHLBhCzARIsAJ2NZoJ-SZKP2eGteQy6WQrrfM4boz_J2dMn5AQB-YJddyaAl2Xn8mu8_AsaApVvEALw_wcB:G:s&amp;amp;s_kwcid=AL!4422!3!780636715672!e!!g!!amazon%20bedrock!23183030539!184407538741&amp;amp;gad_campaignid=23183030539&amp;amp;gbraid=0AAAAADjHtp-fEbEy8bIcFNufcp9xhX8te" rel="noopener noreferrer"&gt;Amazon Bedrock&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;This first course focuses on creating a travel guide application that collects travel information for various destinations. It also helps create itineraries for travelers to follow and allows them to leave reviews of the cities they visit.&lt;/p&gt;

&lt;p&gt;It’s a Python app hosted on an EC2 instance that currently reads items from DynamoDB tables with &lt;a href="https://aws.amazon.com/sdk-for-python/" rel="noopener noreferrer"&gt;Boto3&lt;/a&gt; and displays them.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7z34nf9ky55fhm0noljz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7z34nf9ky55fhm0noljz.png" alt=" " width="800" height="293"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Some Basic Concepts Before We Proceed (Skip If You Want)
&lt;/h2&gt;

&lt;p&gt;What is &lt;a href="https://aws.amazon.com/what-is/generative-ai/" rel="noopener noreferrer"&gt;Generative AI&lt;/a&gt;?&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Generative artificial intelligence (generative AI) is a type of AI that can create new content and ideas, including conversations, stories, images, videos, and music. It can learn human language, programming languages, art, chemistry, biology, or any complex subject matter. It reuses what it knows to solve new problems.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;What is LLM &lt;a href="https://aws.amazon.com/what-is/large-language-model/" rel="noopener noreferrer"&gt;(Large Language Model)&lt;/a&gt;?&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Large language models, also known as LLMs, are very large &lt;a href="https://aws.amazon.com/what-is/deep-learning/?refid=969dbd8a-11ff-42cd-9a8a-3b5ed0adbfba" rel="noopener noreferrer"&gt;deep learning&lt;/a&gt; models that are pre-trained on vast amounts of data. The underlying transformer is a set of &lt;a href="https://aws.amazon.com/what-is/neural-network/?refid=969dbd8a-11ff-42cd-9a8a-3b5ed0adbfba" rel="noopener noreferrer"&gt;neural networks&lt;/a&gt; that consist of an encoder and a decoder with self-attention capabilities. The encoder and decoder extract meanings from a sequence of text and understand the relationships between words and phrases in it.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  Amazon Bedrock: The Problems It Actually Solves
&lt;/h2&gt;

&lt;p&gt;As I go through the course, I can tell this TravelGuide app comes with some limitations. For each city, the content needs to be written and edited; there’s a manual step that requires researching and crafting the itineraries, which takes a lot of time. Can you imagine if we want to add more cities as the app grows? This process is very time-consuming and limits how quickly we can scale the app. Another limitation is that the itineraries can't be personalized for individual needs: the content is static, yet an itinerary for someone interested in pubs should look very different from one created for someone who prefers museums and doesn't like nights out.&lt;/p&gt;

&lt;p&gt;We could solve these issues and enhance the app by using a service called &lt;a href="https://aws.amazon.com/bedrock/?trk=6316d34d-2e8a-4ef0-b9d9-022774976ae5&amp;amp;sc_channel=ps&amp;amp;trk=6316d34d-2e8a-4ef0-b9d9-022774976ae5&amp;amp;sc_channel=ps&amp;amp;ef_id=Cj0KCQiA4eHLBhCzARIsAJ2NZoJ-SZKP2eGteQy6WQrrfM4boz_J2dMn5AQB-YJddyaAl2Xn8mu8_AsaApVvEALw_wcB:G:s&amp;amp;s_kwcid=AL!4422!3!780636715672!e!!g!!amazon%20bedrock!23183030539!184407538741&amp;amp;gad_campaignid=23183030539&amp;amp;gbraid=0AAAAADjHtp-fEbEy8bIcFNufcp9xhX8te" rel="noopener noreferrer"&gt;Amazon Bedrock&lt;/a&gt; to generate travel recommendations or itineraries. As its own web page says, Amazon Bedrock is “&lt;em&gt;The platform for building generative AI applications and agents at production scale&lt;/em&gt;”. It’s a fully managed AWS service that enables building generative AI apps using foundation models.&lt;/p&gt;

&lt;p&gt;We could use the foundation models provided by Bedrock through an API to create content and generate results in real-time with prompts. This would allow us to create travel recommendations and itineraries, while AWS takes care of the infrastructure needed to run and manage these models.&lt;/p&gt;

&lt;p&gt;Amazon Bedrock provides several different foundation models (I'll refer to them as FMs from now on), and you might feel the same way I did when I started learning about this, it's hard to choose the &lt;em&gt;right&lt;/em&gt; model for the use case. How do you pick the right FM for the task you need to complete? Fortunately, we can categorize FMs by the inputs and outputs they can handle.&lt;/p&gt;

&lt;p&gt;👉 Inputs are generally text and image.&lt;/p&gt;

&lt;p&gt;👉 Outputs can be text, chat, image, and embeddings.&lt;/p&gt;

&lt;p&gt;You might see the term "embeddings" and wonder what it means. I wanted to share a great website that explains embeddings in a very creative way: &lt;a href="https://huggingface.co/spaces/hesamation/primer-llm-embedding?section=what_are_embeddings?" rel="noopener noreferrer"&gt;&lt;strong&gt;LLM Embeddings Explained: A Visual and Intuitive Guide&lt;/strong&gt;&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;In a nutshell, embedding vectors are used to find related content when retrieving and enhancing information. Amazon Bedrock Knowledge Bases implement Retrieval-Augmented Generation (RAG) using embeddings.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;With Amazon Bedrock Knowledge Bases, you can provide foundation models and agents with contextual information from your company’s private data sources to deliver more relevant, accurate, and customized responses.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Here’s a &lt;a href="https://docs.aws.amazon.com/bedrock/latest/userguide/models-supported.html" rel="noopener noreferrer"&gt;comparative table&lt;/a&gt; of the supported foundation models in Amazon Bedrock.&lt;/p&gt;
&lt;h2&gt;
  
  
  Custom Knowledge Bases And Model Customization in Amazon Bedrock
&lt;/h2&gt;

&lt;p&gt;Let’s say we want to integrate our company’s own private data with Bedrock, so the reviews are more relevant for the users. We are talking about private and proprietary data owned by a company, and by default, Amazon Bedrock doesn’t have access to this data.&lt;/p&gt;

&lt;p&gt;You can integrate your private data by either customizing the model or using Knowledge Bases.&lt;/p&gt;
&lt;h3&gt;
  
  
  &lt;strong&gt;Customize the model&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Training the model on private data leads to better performance for specific use cases and creates a better customer experience. Amazon Bedrock provides the following &lt;a href="https://docs.aws.amazon.com/bedrock/latest/userguide/custom-models.html" rel="noopener noreferrer"&gt;methods&lt;/a&gt;:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Distillation&lt;/strong&gt; → Transfers knowledge from a larger, more intelligent model (teacher) to a smaller, faster, and cost-efficient model (student).&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Reinforcement fine-tuning&lt;/strong&gt; → Improves foundation model alignment with your specific use case through feedback-based learning. Unlike supervised learning, you don't provide labeled input-output pairs.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Supervised fine-tuning&lt;/strong&gt; → Trains a model to improve performance on specific tasks by providing labeled data. The model learns from explicit examples of correct input-output associations.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Continued pre-training&lt;/strong&gt; → Provides unlabeled data to pre-train a foundation model by familiarizing it with certain types of inputs. This improves the model's domain knowledge without requiring labeled examples.&lt;/p&gt;
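
&lt;p&gt;For supervised fine-tuning, the labeled data is typically supplied as JSONL: one prompt/completion pair per line. The exact field names can vary per model, so treat this as an illustrative shape (the travel-guide examples are invented) and check the model's documentation:&lt;/p&gt;

```python
import json

# Hypothetical travel-guide training pairs for illustration.
examples = [
    {"prompt": "Suggest a one-day museum itinerary for Paris.",
     "completion": "Morning at the Musee d'Orsay, afternoon at the Louvre."},
    {"prompt": "Suggest a one-day food itinerary for Lima.",
     "completion": "Morning at Surquillo market, afternoon ceviche tasting."},
]

def to_jsonl(records):
    """Serialize training records as JSON Lines, one object per line."""
    return "\n".join(json.dumps(record) for record in records)
```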

&lt;p&gt;&lt;strong&gt;Heads-up: There are some costs related to customization, refer to&lt;/strong&gt; &lt;a href="https://docs.aws.amazon.com/general/latest/gr/bedrock.html" rel="noopener noreferrer"&gt;&lt;strong&gt;Amazon Bedrock endpoints and quotas&lt;/strong&gt;&lt;/a&gt;&lt;strong&gt;.&lt;/strong&gt;&lt;/p&gt;
&lt;h3&gt;
  
  
  Knowledge Bases
&lt;/h3&gt;

&lt;p&gt;As its own website says, "&lt;em&gt;you can give foundation models and agents contextual information from your company’s private data sources to deliver more relevant, accurate, and customized responses&lt;/em&gt;." In this use case, it allows us to connect to private data sources, enabling users to interact and ask questions without needing to fine-tune or retrain the model. The model has access to this data and can provide the answers users need.&lt;/p&gt;

&lt;p&gt;So, it works by uploading your custom data sources to create a repository of information that is used to enhance your prompts. It uses something called Retrieval Augmented Generation (RAG) to achieve this.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frk3972gh6k0qm03n6cil.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frk3972gh6k0qm03n6cil.png" alt=" " width="700" height="424"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;RAG is a technique that retrieves data from company sources to augment the prompt, providing more relevant and accurate responses. Amazon Bedrock Knowledge Bases use a vector database to store data as embeddings (numerical representations of text). The knowledge base connects Bedrock with private company data by retrieving the most relevant information at query time. It searches the data using semantic similarity, retrieves the most useful chunks, and uses them as additional context so the model can generate natural language answers. Instead of matching exact keywords, the system understands the meaning and context of a user’s query to return relevant information.&lt;/p&gt;
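
&lt;p&gt;The "semantic similarity" step usually boils down to comparing embedding vectors with cosine similarity. Here's a stdlib-only toy; the 3-dimensional vectors are invented for illustration (real embeddings have hundreds of dimensions):&lt;/p&gt;

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Pretend embeddings for two stored document chunks.
docs = {
    "museum guide": [0.9, 0.1, 0.0],
    "pub crawl": [0.1, 0.9, 0.2],
}
query = [0.8, 0.2, 0.1]  # pretend embedding of "art lover itinerary"
best_match = max(docs, key=lambda name: cosine_similarity(query, docs[name]))
```

&lt;p&gt;The chunk whose vector points in nearly the same direction as the query wins, even though the texts share no keywords. That's the retrieval half of RAG in miniature.&lt;/p&gt;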
&lt;h3&gt;
  
  
  Hands On Lab: Creating a Knowledge Base
&lt;/h3&gt;

&lt;p&gt;In this course, there was a hands-on lab where I needed to create the Amazon Bedrock knowledge base and configure it for use by the app.&lt;/p&gt;

&lt;p&gt;Creating a knowledge base is pretty straightforward. On the console, head over to Amazon Bedrock and click on &lt;code&gt;Create&lt;/code&gt;. Choose &lt;code&gt;Knowledge Base with vector store&lt;/code&gt; and you will see this screen:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsgm3k9n8uwaka7mjexaz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsgm3k9n8uwaka7mjexaz.png" alt=" " width="800" height="352"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Knowledge bases sync with documents; in this case, the app uses Amazon S3. You can choose different data source options, and you can have up to five different data sources in a knowledge base.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdefveg0dupkwbr0xz2gj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdefveg0dupkwbr0xz2gj.png" alt=" " width="800" height="352"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Configure the S3 bucket as your data source and upload the necessary data. Make sure to add the correct data source name and select the appropriate S3 bucket.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbo7ixtg9myyf3hworv9i.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbo7ixtg9myyf3hworv9i.png" alt=" " width="800" height="352"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Select the vector store, review and create.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2remtri73k1b9yqmr2op.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2remtri73k1b9yqmr2op.png" alt=" " width="800" height="315"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you select the following option, Bedrock will create an Amazon OpenSearch Serverless vector store for us to hold these embeddings.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp6441vzvq59uyzeroqf6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp6441vzvq59uyzeroqf6.png" alt=" " width="800" height="211"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Review details and Create Knowledge Base🎉&lt;/p&gt;

&lt;p&gt;Once it's created, remember to &lt;strong&gt;sync&lt;/strong&gt; the data sources to index the content for searching.&lt;/p&gt;

&lt;p&gt;You can test the model by clicking the &lt;code&gt;Test Knowledge Base&lt;/code&gt; button on the console. Choose the model, enter a prompt, and watch the magic happen.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flgb55bodk2lwiux5vjdc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flgb55bodk2lwiux5vjdc.png" alt=" " width="800" height="671"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3cmvtaxgse34ojhkvhqj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3cmvtaxgse34ojhkvhqj.png" alt=" " width="800" height="352"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h3&gt;
  
  
  You can use a Knowledge Base In Two Ways
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Use the &lt;a href="https://docs.aws.amazon.com/bedrock/latest/APIReference/API_agent-runtime_Retrieve.html" rel="noopener noreferrer"&gt;&lt;strong&gt;Retrieve&lt;/strong&gt; API&lt;/a&gt; to query and get information from the Knowledge Base directly without an additional response.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Use the &lt;a href="https://docs.aws.amazon.com/bedrock/latest/APIReference/API_agent-runtime_RetrieveAndGenerate.html" rel="noopener noreferrer"&gt;&lt;strong&gt;RetrieveAndGenerate API&lt;/strong&gt;&lt;/a&gt;, which takes a prompt as input and generates a response based on the retrieved information.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
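
&lt;p&gt;From Boto3, the second option is a single call on the &lt;code&gt;bedrock-agent-runtime&lt;/code&gt; client. A sketch, with placeholder IDs and ARN:&lt;/p&gt;

```python
def build_rag_request(kb_id, model_arn, question):
    """Request shape for the RetrieveAndGenerate API."""
    return {
        "input": {"text": question},
        "retrieveAndGenerateConfiguration": {
            "type": "KNOWLEDGE_BASE",
            "knowledgeBaseConfiguration": {
                "knowledgeBaseId": kb_id,
                "modelArn": model_arn,
            },
        },
    }

def ask(kb_id, model_arn, question):
    """Query the knowledge base and return the generated answer text."""
    import boto3  # deferred import so the sketch loads without boto3 installed
    client = boto3.client("bedrock-agent-runtime")
    response = client.retrieve_and_generate(
        **build_rag_request(kb_id, model_arn, question)
    )
    return response["output"]["text"]
```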
&lt;h2&gt;
  
  
  AI Safety Controls with Amazon Bedrock Guardrails
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://aws.amazon.com/bedrock/guardrails/" rel="noopener noreferrer"&gt;Guardrails&lt;/a&gt; offer protection and control over LLM responses, going beyond what can be managed through prompt design alone. Bedrock provides configurable guardrails to ensure responsible AI behavior:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Some of the content filters&lt;/strong&gt;:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Block harmful categories such as hate, insults, or violence.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Grounding and Relevance&lt;/strong&gt;:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Grounding&lt;/strong&gt;: Confidence threshold for factual accuracy.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Relevance&lt;/strong&gt;: Checks whether the model response is relevant to the user query.&lt;/p&gt;

&lt;p&gt;More on this &lt;a href="https://aws.amazon.com/bedrock/guardrails/" rel="noopener noreferrer"&gt;here&lt;/a&gt;.&lt;/p&gt;
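
&lt;p&gt;Guardrails can also be created programmatically. Below is a sketch of a content-filter configuration for the &lt;code&gt;CreateGuardrail&lt;/code&gt; API; the field names follow the Boto3 documentation as I understand it, so verify them before relying on this:&lt;/p&gt;

```python
def build_guardrail_config(name):
    """Configuration for a guardrail that filters hateful and violent content."""
    return {
        "name": name,
        "description": "Blocks harmful content in prompts and model replies.",
        "blockedInputMessaging": "Sorry, I can't help with that request.",
        "blockedOutputsMessaging": "Sorry, I can't provide that response.",
        "contentPolicyConfig": {
            "filtersConfig": [
                {"type": "HATE", "inputStrength": "HIGH", "outputStrength": "HIGH"},
                {"type": "VIOLENCE", "inputStrength": "MEDIUM", "outputStrength": "MEDIUM"},
            ]
        },
    }

def create_guardrail(name):
    import boto3  # deferred import so the sketch loads without boto3 installed
    return boto3.client("bedrock").create_guardrail(**build_guardrail_config(name))
```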
&lt;h2&gt;
  
  
  Integrating AI Models Through the Bedrock API
&lt;/h2&gt;

&lt;p&gt;Here's an example from the AWS documentation on how to integrate a Python app with the Amazon Bedrock API using the Amazon Titan Text model.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
# SPDX-License-Identifier: Apache-2.0
&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;
Shows how to create a list of action items from a meeting transcript
with the Amazon Titan Text model (on demand).
&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;json&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;logging&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;boto3&lt;/span&gt;

&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;botocore.exceptions&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;ClientError&lt;/span&gt;


&lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;ImageError&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;Exception&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Custom exception for errors returned by Amazon Titan Text models&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;message&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;message&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;message&lt;/span&gt;


&lt;span class="n"&gt;logger&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;logging&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;getLogger&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;__name__&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;logging&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;basicConfig&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;level&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;logging&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;INFO&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;


&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;generate_text&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;model_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;body&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;
    Generate text using Amazon Titan Text models on demand.
    Args:
        model_id (str): The model ID to use.
        body (str) : The request body to use.
    Returns:
        response (json): The response from the model.
    &lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;

    &lt;span class="n"&gt;logger&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;info&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
        &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Generating text with Amazon Titan Text model %s&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;model_id&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="n"&gt;bedrock&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;boto3&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;client&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;service_name&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;bedrock-runtime&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="n"&gt;accept&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;application/json&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
    &lt;span class="n"&gt;content_type&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;application/json&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;

    &lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;bedrock&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;invoke_model&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
        &lt;span class="n"&gt;body&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;body&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;modelId&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;model_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;accept&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;accept&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;contentType&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;content_type&lt;/span&gt;
    &lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;response_body&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;json&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;loads&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;body&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;read&lt;/span&gt;&lt;span class="p"&gt;())&lt;/span&gt;

    &lt;span class="n"&gt;finish_reason&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;response_body&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;error&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;finish_reason&lt;/span&gt; &lt;span class="ow"&gt;is&lt;/span&gt; &lt;span class="ow"&gt;not&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="k"&gt;raise&lt;/span&gt; &lt;span class="nc"&gt;ImageError&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Text generation error. Error is &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;finish_reason&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="n"&gt;logger&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;info&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
        &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Successfully generated text with Amazon Titan Text model %s&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;model_id&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;response_body&lt;/span&gt;


&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;main&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
    &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;
    Entrypoint for Amazon Titan Text model example.
    &lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
    &lt;span class="k"&gt;try&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="n"&gt;logging&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;basicConfig&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;level&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;logging&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;INFO&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                            &lt;span class="nb"&gt;format&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;%(levelname)s: %(message)s&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="c1"&gt;# You can replace the model_id with any other Titan Text Models
&lt;/span&gt;        &lt;span class="c1"&gt;# Titan Text Model family model_id is as mentioned below:
&lt;/span&gt;        &lt;span class="c1"&gt;# amazon.titan-text-premier-v1:0, amazon.titan-text-express-v1, amazon.titan-text-lite-v1
&lt;/span&gt;        &lt;span class="n"&gt;model_id&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;amazon.titan-text-premier-v1:0&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;

        &lt;span class="n"&gt;prompt&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Meeting transcript: Miguel: Hi Brant, I want to discuss the workstream  
            for our new product launch Brant: Sure Miguel, is there anything in particular you want
            to discuss? Miguel: Yes, I want to talk about how users enter into the product.
            Brant: Ok, in that case let me add in Namita. Namita: Hey everyone 
            Brant: Hi Namita, Miguel wants to discuss how users enter into the product. &lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;

        &lt;span class="n"&gt;body&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;json&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;dumps&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
            &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;inputText&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;prompt&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;textGenerationConfig&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;maxTokenCount&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;3072&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;stopSequences&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[],&lt;/span&gt;
                &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;temperature&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mf"&gt;0.7&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;topP&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mf"&gt;0.9&lt;/span&gt;
            &lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="p"&gt;})&lt;/span&gt;

        &lt;span class="n"&gt;response_body&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;generate_text&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;model_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;body&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Input token count: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;response_body&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;inputTextTokenCount&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;result&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;response_body&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;results&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]:&lt;/span&gt;
            &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Token count: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;tokenCount&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
            &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Output text: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;outputText&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
            &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Completion reason: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;completionReason&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;except&lt;/span&gt; &lt;span class="n"&gt;ClientError&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;err&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="n"&gt;message&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;err&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Error&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Message&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
        &lt;span class="n"&gt;logger&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;A client error occurred: %s&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;message&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;A client error occured: &lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt;
              &lt;span class="nf"&gt;format&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;message&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
    &lt;span class="k"&gt;except&lt;/span&gt; &lt;span class="n"&gt;ImageError&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;err&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="n"&gt;logger&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;err&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;message&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;err&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;message&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;else&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
            &lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Finished generating text with the Amazon Titan Text Premier model &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;model_id&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;.&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;


&lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;__name__&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;__main__&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="nf"&gt;main&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Engineering Better Conversations: Prompt Engineering
&lt;/h2&gt;

&lt;blockquote&gt;
&lt;p&gt;Prompts in Large Language Models (LLMs) are &lt;strong&gt;the input texts, instructions, or questions users provide to guide the AI to generate specific, relevant, and accurate responses&lt;/strong&gt;.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;With Amazon Bedrock, the LLMs are already built and trained for us. The simplest way to get better results from these LLMs is to improve the prompts we provide to them.&lt;/p&gt;

&lt;p&gt;To write good prompts, an effective and easy-to-remember strategy is the &lt;strong&gt;CO-STAR technique (Context → Objective → Style → Tone → Audience → Response)&lt;/strong&gt;. Here’s an example of a strong prompt:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;Context:&lt;/em&gt;&lt;/strong&gt; &lt;em&gt;We're launching Wandr, a travel review app that focuses on authentic, local experiences rather than tourist traps. Unlike TripAdvisor, we verify that reviewers actually visited the location through GPS check-ins and require photo proof. Our community consists of adventure travelers and culture enthusiasts who want to discover hidden gems.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;Objective:&lt;/em&gt;&lt;/strong&gt; &lt;em&gt;Write an engaging "About Us" section for our app store listing that will convince potential users to download the app. Target length: 200-250 words. Must include our key value propositions: verified reviews, local discovery focus, and spam-free community.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;Style:&lt;/em&gt;&lt;/strong&gt; &lt;em&gt;Storytelling approach that paints a picture of the problem we solve. Use vivid, sensory language that evokes the feeling of travel. Structure: problem → solution → benefit. Include a specific example or scenario travelers can relate to.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;Tone:&lt;/em&gt;&lt;/strong&gt; &lt;em&gt;Adventurous and inspiring, but trustworthy and grounded. Think "experienced travel buddy giving insider tips" rather than "corporate travel company." Avoid clichés like "wanderlust". Be warm and enthusiastic without being cheesy.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;Audience:&lt;/em&gt;&lt;/strong&gt; &lt;em&gt;Millennials and Gen Z travelers who are tired of showing up to "hidden" restaurants only to find them packed with tourists from the same Google search. They value authenticity over luxury, experiences over amenities, and trust peer recommendations more than professional critics.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;Response:&lt;/em&gt;&lt;/strong&gt; &lt;em&gt;Provide the app store description, followed by 3-5 bullet points summarizing our key features that could appear below the main description in the app store listing.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;We can also include instructions on the desired length or formatting of the results. Additionally, we can provide sample outputs in the prompt.&lt;/p&gt;
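&lt;p&gt;As a quick sketch of how this looks in practice (the helper name and section texts below are my own illustrative placeholders, not part of the course), a CO-STAR prompt is just a structured string that can be assembled before being passed as the &lt;code&gt;inputText&lt;/code&gt; of a Bedrock request:&lt;/p&gt;

```python
# Sketch: assembling the six CO-STAR sections into a single prompt string.
# Section contents below are illustrative placeholders, not real copy.

def build_costar_prompt(context, objective, style, tone, audience, response):
    """Join the six CO-STAR sections into one prompt string."""
    sections = [
        ("Context", context),
        ("Objective", objective),
        ("Style", style),
        ("Tone", tone),
        ("Audience", audience),
        ("Response", response),
    ]
    return "\n\n".join(f"{name}: {text}" for name, text in sections)


prompt = build_costar_prompt(
    context="We're launching Wandr, a travel review app for authentic local experiences.",
    objective="Write an engaging 'About Us' section, 200-250 words.",
    style="Storytelling: problem, then solution, then benefit.",
    tone="Adventurous and inspiring, but trustworthy and grounded.",
    audience="Millennial and Gen Z travelers who value authenticity.",
    response="App store description plus 3-5 feature bullet points.",
)
print(prompt)
```

&lt;p&gt;The resulting string would then go into the &lt;code&gt;inputText&lt;/code&gt; field of the request body, exactly as in the &lt;code&gt;generate_text&lt;/code&gt; example above.&lt;/p&gt;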

&lt;h2&gt;
  
  
  Next Steps
&lt;/h2&gt;

&lt;p&gt;This wraps up the first part of my three-part series on the AI and AWS Coursera Specialization. I have to admit, I'm really excited about the upcoming course “&lt;strong&gt;DevOps and AI on AWS: CI/CD for Generative AI Applications&lt;/strong&gt;”. I can't wait to share what I learn with you🙌. Stay tuned!&lt;/p&gt;

</description>
      <category>awsandai</category>
      <category>devops</category>
      <category>ai</category>
      <category>aws</category>
    </item>
    <item>
      <title>Got my AWS Cloud Practitioner Certification - CLF-C02 ☁️</title>
      <dc:creator>Laura</dc:creator>
      <pubDate>Tue, 07 Apr 2026 22:30:10 +0000</pubDate>
      <link>https://forem.com/lalidevops/got-my-aws-cloud-practitioner-certification-clf-c02-13fc</link>
      <guid>https://forem.com/lalidevops/got-my-aws-cloud-practitioner-certification-clf-c02-13fc</guid>
      <description>&lt;h2&gt;
  
  
  &lt;strong&gt;Table of Contents&lt;/strong&gt;
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Introduction&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Motivation and Background&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Exam Preparation Steps&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;* Stephane Maarek's Ultimate AWS Certified Cloud Practitioner Course

* AWS Cloud Practitioner Essentials Course

* AWS Skill Builder Individual Plan

* Official Practice Exam + Neal Davis's Practice Exams
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;ol&gt;
&lt;li&gt;Study Strategy and Tools&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;* Note-taking and Organization

* Interactive Learning
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;ol&gt;
&lt;li&gt;Exam Experience&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;* Choosing a Testing Center
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;ol&gt;
&lt;li&gt;Highlights of AWS Cloud Practitioner Certification&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;* Innovative Solutions and Tools

* Security Domain
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;ol&gt;
&lt;li&gt;Conclusion&lt;/li&gt;
&lt;/ol&gt;




&lt;h2&gt;
  
  
  &lt;strong&gt;Introduction&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;I am thrilled to share my journey of achieving the &lt;strong&gt;AWS Certified Cloud Practitioner certification ☁️🙌&lt;/strong&gt;. This was a significant milestone for me, especially since I started without any prior experience or exposure to AWS Cloud. The preparation and learning process was both challenging and incredibly rewarding.&lt;/p&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;Motivation and Background&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;The decision to pursue the AWS Certified Cloud Practitioner certification was driven by my desire to enhance my technical skills and understanding of cloud computing. Despite my initial lack of experience, I found the learning journey to be engaging and enlightening, particularly in the area of cloud security 🥷👩🏽‍💻👀, an area I definitely plan to dive deeper into.&lt;/p&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;Exam Preparation Steps&lt;/strong&gt;
&lt;/h2&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Stephane Maarek's Ultimate AWS Certified Cloud Practitioner Course&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;I began my preparation with the ✅ &lt;a href="https://www.udemy.com/course/aws-certified-cloud-practitioner-new/" rel="noopener noreferrer"&gt;[NEW] Ultimate AWS Certified Cloud Practitioner CLF-C02 course&lt;/a&gt; by Stephane Maarek, a solutions architect, consultant, and software developer. This comprehensive course provided a solid foundation, covered all the necessary topics in depth, and I found it super useful.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;AWS Cloud Practitioner Essentials Course and other resources&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Next, I enrolled in the ✅ &lt;a href="https://explore.skillbuilder.aws/learn/course/external/view/elearning/134/aws-cloud-practitioner-essentials" rel="noopener noreferrer"&gt;&lt;strong&gt;AWS Cloud Practitioner Essentials&lt;/strong&gt;&lt;/a&gt; official course, available for free on the &lt;a href="https://skillbuilder.aws/" rel="noopener noreferrer"&gt;AWS Skill Builder&lt;/a&gt; platform. This course offered a structured and thorough overview of AWS Cloud concepts, helped me fill the gaps from the previous course, and reinforced key ideas. Very useful too!&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;AWS Skill Builder Individual Plan&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;To further enhance my preparation, I signed up for the AWS Skill Builder Individual Plan. This plan includes several valuable resources; I found these particularly useful for my exam prep:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;✅&lt;/strong&gt; &lt;a href="https://explore.skillbuilder.aws/learn/course/external/view/elearning/17373/aws-escape-room-for-exam-prep-aws-certified-cloud-practitioner-clf-c02" rel="noopener noreferrer"&gt;&lt;strong&gt;AWS Escape Room: Exam Prep for AWS Certified Cloud Practitioner (CLF-C02)&lt;/strong&gt;&lt;/a&gt;: A 3D virtual escape room game that tested my knowledge interactively.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;✅&lt;/strong&gt; &lt;a href="https://explore.skillbuilder.aws/learn/course/external/view/elearning/16434/exam-prep-standard-course-aws-certified-cloud-practitioner-clf-c02-english" rel="noopener noreferrer"&gt;&lt;strong&gt;Exam Prep Standard Course: AWS Certified Cloud Practitioner&lt;/strong&gt;&lt;/a&gt;: Another comprehensive course that reinforced my understanding.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Additionally, there was an &lt;a href="https://explore.skillbuilder.aws/learn/course/external/view/elearning/16485/exam-prep-enhanced-course-aws-certified-cloud-practitioner-clf-c02-english" rel="noopener noreferrer"&gt;&lt;em&gt;Exam Prep Enhanced Course: AWS Certified Cloud Practitioner&lt;/em&gt;&lt;/a&gt; available, but I chose not to take it since Maarek's course was already very comprehensive. Still, I do recommend you go and check it out!&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Official Practice Exam + Neal Davis's Practice Exams&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Finally, I did the ✅ &lt;a href="https://explore.skillbuilder.aws/learn/course/external/view/elearning/14637/aws-certified-cloud-practitioner-official-practice-exam-clf-c02-english" rel="noopener noreferrer"&gt;&lt;strong&gt;AWS Certified Cloud Practitioner Official Practice Exam&lt;/strong&gt;&lt;/a&gt;  and I purchased the ✅ &lt;a href="https://www.udemy.com/course/aws-certified-cloud-practitioner-practice-exams-c/?couponCode=LEADERSALE24A" rel="noopener noreferrer"&gt;AWS Certified Cloud Practitioner Practice Exams CLF-C02&lt;/a&gt; set from Neal Davis. I took six practice tests a couple of times, meticulously reviewing and taking notes on any questions I got wrong. This practice was crucial in solidifying my knowledge and boosting my confidence.&lt;/p&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;Study Strategy and Tools&lt;/strong&gt;
&lt;/h2&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Note-taking and Organization&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Throughout my preparation, I took detailed notes from each course. I organized these notes in a document on &lt;a href="https://www.notion.so/" rel="noopener noreferrer"&gt;Notion&lt;/a&gt;, adding additional information as I progressed through the resources. This systematic approach helped me consolidate my learning effectively.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Interactive Learning&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Engaging with interactive learning tools, like the &lt;strong&gt;AWS Escape Room game&lt;/strong&gt;, made the study process more enjoyable and helped reinforce my understanding in a practical, hands-on way.&lt;/p&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;Exam Experience&lt;/strong&gt;
&lt;/h2&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Choosing a Testing Center&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Although the AWS exam can be taken from the comfort of your home, I chose to take it at a Test and Training Center in Dubai. This was a personal preference, as I felt more comfortable in a formal testing environment. You can find a Test Center in your city when you book the exam.&lt;/p&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;Highlights of AWS Cloud Practitioner Certification&lt;/strong&gt;
&lt;/h2&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Innovative Solutions and Tools&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;One of the most fascinating aspects of AWS is how it provides solutions for virtually every problem, continuously coming up with innovative tools and services. This aspect of AWS kept me engaged and motivated throughout my study journey.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Security Domain&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;I found the security domain particularly interesting. AWS’s robust security measures and protocols are designed to protect data and applications, which is crucial in today’s digital landscape.&lt;/p&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;Conclusion&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;In conclusion, preparing for and achieving the AWS Certified Cloud Practitioner certification was a transformative experience. It took me a couple of months to prepare, balancing my study time with a significant holiday break. The journey was well worth it, and I am now excited to pursue my next certification. Thank you for reading about my experience, and I hope it inspires you on your learning path!&lt;/p&gt;




</description>
      <category>cloud</category>
      <category>cloudcomputing</category>
      <category>awscertification</category>
      <category>awscloudpractitioner</category>
    </item>
    <item>
      <title>Frontend To DevOps: 5 Months In</title>
      <dc:creator>Laura</dc:creator>
      <pubDate>Tue, 07 Apr 2026 22:28:14 +0000</pubDate>
      <link>https://forem.com/lalidevops/frontend-to-devops-5-months-in-4be1</link>
      <guid>https://forem.com/lalidevops/frontend-to-devops-5-months-in-4be1</guid>
      <description>&lt;h2&gt;
  
  
  &lt;strong&gt;Table of Contents&lt;/strong&gt;
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Introduction&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Summing Up the Journey&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Expectations vs. Reality&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The Tech Stack: What I Had to Learn&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Surprise! Some Frontend Skills Actually Helped&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Advice for Frontend Developers Considering the Switch&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;What's Next for Me&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;Some time ago I decided to switch from Frontend Development to DevOps engineering. I started sharing my experience through a &lt;a href="https://lalidev.hashnode.dev/transforming-skills-my-journey-from-frontend-to-cloud-engineering" rel="noopener noreferrer"&gt;series of blog posts&lt;/a&gt; so anyone else on the same journey doesn't feel alone, and anyone thinking about making the same move has the courage to do it.&lt;/p&gt;

&lt;h2&gt;
  
  
  Summing Up the Journey
&lt;/h2&gt;

&lt;p&gt;A couple of years ago I was working for a startup. We were a small technical team and I started as a Frontend Developer, which was my main strength. But if you've worked for a startup, you know there's a point where you end up doing whatever it takes to ship fast, so I ended up learning a bit of backend, even Google Analytics (which I hated 😂), and working on the infrastructure with AWS. Luckily, infrastructure caught my interest, and I found this world fascinating. I started attending &lt;a href="https://www.meetup.com/aws-girls-ug-uy/" rel="noopener noreferrer"&gt;AWS Girls UG group&lt;/a&gt; meetups, where I got the opportunity to meet cool people and attend talks on different topics, and began mentorship sessions with @&lt;a href="https://dev.to@Marianogg9"&gt;Mariano González&lt;/a&gt;, a Senior DevOps engineer.&lt;/p&gt;

&lt;p&gt;As I was getting more interested in the infrastructure side of things, I got &lt;a href="https://lalidev.hashnode.dev/got-my-aws-cloud-practitioner-certification-clf-c02" rel="noopener noreferrer"&gt;AWS Cloud Practitioner Certified&lt;/a&gt;, and that gave me a broad understanding of AWS in general.&lt;/p&gt;

&lt;p&gt;I did a couple of side projects related to DevOps (including a &lt;a href="https://lalidev.hashnode.dev/iac-deploying-a-node-secrets-viewer-with-terraform" rel="noopener noreferrer"&gt;Terraform one&lt;/a&gt;), studied in a &lt;a href="https://bootcamp.295devops.com/?trk=public_post-text" rel="noopener noreferrer"&gt;DevOps Bootcamp&lt;/a&gt; that &lt;a href="https://roxs.295devops.com/" rel="noopener noreferrer"&gt;Roxs&lt;/a&gt; launched, and I wrote a &lt;a href="https://lalidev.hashnode.dev/?source=top_nav_blog_home" rel="noopener noreferrer"&gt;series of blog posts&lt;/a&gt; talking about that. And finally, I worked very hard on solving &lt;a href="https://lalidev.hashnode.dev/my-aws-cloud-resume-challenge" rel="noopener noreferrer"&gt;The Cloud Resume Challenge&lt;/a&gt;. My mentor, @&lt;a href="https://dev.to@Marianogg9"&gt;Mariano González&lt;/a&gt;, found it and recommended I do it, so I did and I learned A LOT.&lt;/p&gt;

&lt;p&gt;Finally, I applied for a Software Developer position at &lt;a href="https://www.streaver.com/" rel="noopener noreferrer"&gt;Streaver&lt;/a&gt;, an AI-driven solutions company based in Uruguay. They saw I'd done a couple of projects on infrastructure and invited me to do a DevOps challenge, which I did, and here I am, working as a DevOps engineer for 5 months already🎉. So I'm gonna share my experience.&lt;/p&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;Expectations vs. Reality&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;When I was preparing to transition into DevOps, I'll be honest, I didn’t have much experience in infrastructure, so it was hard to picture what a day as a DevOps engineer would actually look like. I watched a couple of "&lt;em&gt;A day as a DevOps engineer&lt;/em&gt;" YouTube videos and asked my mentor @&lt;a href="https://dev.to@Marianogg9"&gt;Mariano González&lt;/a&gt;, a Senior DevOps Engineer, questions about what the day-to-day looked like. While I got some kind of idea from all this research, it still wasn't much to go on. One thing I knew for sure though: &lt;em&gt;I wouldn't get bored.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;I had this mental image of what the role would be like. I thought I'd be spending most of my time writing infrastructure as code, automating deployments, and maybe dealing with the occasional production issue. It seemed like a natural progression from frontend development, just working on a different layer of the stack, right?&lt;/p&gt;

&lt;p&gt;Well, not exactly.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What I Expected:&lt;/strong&gt; I imagined DevOps would be mostly about building CI/CD pipelines and writing Terraform configurations. I thought once you set things up correctly, everything would just run smoothly. I also assumed that since I had some AWS experience from my startup days, I'd have a decent head start.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What Actually Happened:&lt;/strong&gt; The scope turned out to be much broader than I anticipated. Yes, there's infrastructure as code and CI/CD pipelines, but there's also monitoring, security, cost optimization, incident response, documentation, and a lot of collaborative work with different teams.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;The biggest surprise?&lt;/em&gt; How much time I spend on problem-solving and troubleshooting. Things break, often in unexpected ways, and you need to diagnose issues quickly, sometimes under pressure. Unlike frontend development where you can usually reproduce bugs locally, infrastructure issues can be elusive and involve multiple services interacting in complex ways.&lt;/p&gt;

&lt;p&gt;Another reality check: &lt;strong&gt;&lt;em&gt;the learning never stops&lt;/em&gt;&lt;/strong&gt;. New tools, new best practices, new security vulnerabilities: the landscape evolves constantly.&lt;/p&gt;

&lt;p&gt;But here's what actually surprised me: the impact of the work. As a DevOps engineer, you're helping the entire engineering team ship faster and with more confidence. That feeling is incredibly rewarding.&lt;/p&gt;

&lt;p&gt;And was I right about not getting bored? Absolutely. Every day brings something different, and there's always a new challenge to tackle.&lt;/p&gt;

&lt;p&gt;This new job involves a lot of thinking outside the box and finding creative solutions to complex problems. The impostor syndrome is real, especially when you're troubleshooting issues you've never encountered before, but pushing through those moments is part of the growth.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Tech Stack: What I Had to Learn
&lt;/h2&gt;

&lt;p&gt;Coming from a frontend background where I spent my days with JavaScript, React, and CSS, the DevOps tech stack felt like stepping into a completely different universe.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;AWS CDK (Cloud Development Kit)&lt;/strong&gt; was completely new territory for me. It’s a way to write infrastructure as code using actual programming languages. I had never worked with it before this job, and I must say, it's an amazing tool. Being able to define infrastructure using familiar programming concepts like classes, loops, and conditionals felt way more intuitive than I expected. It bridged the gap between my developer mindset and infrastructure work in a way that made the transition smoother.&lt;/p&gt;
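To give a flavor of what "infrastructure with classes, loops, and conditionals" looks like, here is a minimal CDK sketch in Python. This is purely illustrative (the stack and bucket names are made up, not from my actual project): a plain Python loop stamps out one S3 bucket per environment.

```python
from aws_cdk import App, RemovalPolicy, Stack
from aws_cdk import aws_s3 as s3
from constructs import Construct


class StorageStack(Stack):
    """Illustrative stack: one versioned bucket per environment."""

    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Ordinary Python (a list and a loop) drives the infrastructure.
        for env_name in ["dev", "staging", "prod"]:
            s3.Bucket(
                self,
                f"DataBucket-{env_name}",
                versioned=True,
                # A conditional: keep production data if the stack is deleted,
                # tear the other buckets down.
                removal_policy=(
                    RemovalPolicy.RETAIN if env_name == "prod"
                    else RemovalPolicy.DESTROY
                ),
            )


app = App()
StorageStack(app, "StorageStack")
app.synth()  # emits the CloudFormation template
```

Running `cdk synth` against something like this produces the CloudFormation template, which is the part that felt so natural coming from a developer background.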

&lt;p&gt;&lt;strong&gt;Python:&lt;/strong&gt; My experience was mostly in JavaScript, so picking up Python was quite new to me. At first, the syntax differences threw me off: no curly braces, and indentation actually matters. But honestly? Python and I became friends fast. It's clean, readable, and the ecosystem for DevOps tooling is incredible.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;AWS (The Deep Dive):&lt;/strong&gt; I had some AWS experience from my startup days, but this job took it to a whole different level. The Cloud Practitioner certification gave me breadth, but actually working as a DevOps engineer is giving me real hands-on experience. I'm talking about IAM policies, ECS, RDS, CloudWatch, Security Groups, everything. Understanding how all these services interact has been a continuous learning process.&lt;/p&gt;

&lt;p&gt;The learning curve is steep, but here's the thing: having a programming background actually helped more than I expected.&lt;/p&gt;

&lt;h2&gt;
  
  
  Surprise! Some Frontend Skills Actually Helped
&lt;/h2&gt;

&lt;p&gt;When I made the switch to DevOps, I kind of assumed most of my frontend skills would become irrelevant. Turns out, I was wrong.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Troubleshooting and Problem-Solving:&lt;/strong&gt; As a frontend developer, I spent countless hours debugging why a component wasn't rendering correctly, why an API call was failing, or why the layout broke on a specific browser. That investigative mindset (breaking down a problem, checking logs, isolating variables, testing) translates directly to infrastructure work.&lt;/p&gt;

&lt;p&gt;When a deployment fails or a service goes down, the approach is similar: read the error messages carefully, check the logs, trace back through recent changes, and systematically eliminate possibilities until you find the root cause. The context is different, but the problem-solving process? Pretty much the same.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Working with Code:&lt;/strong&gt; Infrastructure as code is still code. Learning Python or writing CDK constructs felt less intimidating because I already understood functions, loops, conditionals, etc.&lt;/p&gt;

&lt;p&gt;So while the tech stack was completely different, the core skills I'd developed as a developer, problem-solving, debugging, and thinking in code, turned out to be more transferable than I expected.&lt;/p&gt;

&lt;h2&gt;
  
  
  Advice for Frontend Developers Considering the Switch
&lt;/h2&gt;

&lt;p&gt;If you're a frontend developer thinking about making the jump to DevOps, here's my advice:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Keep an Open Mind&lt;/strong&gt; DevOps is going to challenge a lot of your assumptions about how software works. You'll encounter tools, concepts, and workflows that feel completely foreign at first. Stay curious and open to new ways of thinking. The solutions that work in infrastructure aren't always the same ones you'd use in frontend development, and that's okay.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Think Outside the Box&lt;/strong&gt; Infrastructure problems rarely have a single "right" answer. You'll need to get comfortable with ambiguity and find creative solutions to complex challenges. Sometimes the best fix isn't the obvious one, and that's where the problem-solving gets interesting.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Embrace the "Always Learning" Mindset&lt;/strong&gt; This field evolves constantly, there's always something to learn. Try to learn something new every day, even if it's small. The learning never stops, and that's actually part of what makes it exciting.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Don't Let Impostor Syndrome Take Over&lt;/strong&gt; This is big. You're going to feel like you don't know what you're doing sometimes. You'll see senior engineers troubleshoot issues in minutes that would take you hours. You'll read documentation and feel lost. That's completely normal. Everyone starts somewhere, and feeling uncomfortable means you're growing. The impostor syndrome is real, but don't let it convince you that you don't belong.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Some Days Won't Be Good, and That's Okay&lt;/strong&gt; You'll have days where nothing clicks, where you spend hours on a problem and make zero progress, where you feel like you're moving backwards instead of forwards. That's part of the process of learning something new. Those frustrating days are teaching you something, even if it doesn't feel like it in the moment.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Get Comfortable Being Uncomfortable&lt;/strong&gt; Here's the truth: you're going to be out of your comfort zone. A lot. And that's exactly where you need to be. Growth happens in discomfort. The goal isn't to eliminate that feeling but to get comfortable with it, to recognize that being challenged and uncertain means you're learning and evolving.&lt;/p&gt;

&lt;h2&gt;
  
  
  What's Next for Me
&lt;/h2&gt;

&lt;p&gt;The journey doesn't stop here. If anything, these first 5 months have shown me just how much there is to learn and explore in this field.&lt;/p&gt;

&lt;p&gt;As I get more comfortable with the context and knowledge, I'll continue doing what I've been doing: learning every day, staying curious, and pushing myself outside my comfort zone. There are so many areas I want to dive deeper into: container security, AWS architectures, and more.&lt;/p&gt;

&lt;p&gt;I'll keep sharing my experiences through these blog posts because if my journey can help even one person feel less alone in their transition or give someone the push they need to make the switch, then it's worth it.&lt;/p&gt;

&lt;p&gt;The adventure continues, and I'm excited to see where it takes me next.&lt;/p&gt;

</description>
      <category>frontendtodevops</category>
      <category>devops</category>
      <category>devopsjourney</category>
      <category>devopsarticles</category>
    </item>
    <item>
      <title>Empowering Women in Cybersecurity: My Experience with ITU's Global Initiative and IT for Girls👩🏽‍💻🙌</title>
      <dc:creator>Laura</dc:creator>
      <pubDate>Tue, 07 Apr 2026 22:25:51 +0000</pubDate>
      <link>https://forem.com/lalidevops/empowering-women-in-cybersecurity-my-experience-with-itus-global-initiative-and-it-for-girls-5dg1</link>
      <guid>https://forem.com/lalidevops/empowering-women-in-cybersecurity-my-experience-with-itus-global-initiative-and-it-for-girls-5dg1</guid>
      <description>&lt;h2&gt;
  
  
  Table of Contents
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Leveling Up as a Frontend Developer: My Cybersecurity Learning Journey&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The ITU Her CyberTracks Fellowship: Empowering Women in Cybersecurity Through Global Initiative&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;My Experience So Far&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Study Notes: Technical Knowledge Gained&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The Power of Mentorship: Guidance from Industry Leaders&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Looking Forward: Next Steps in My Cybersecurity Journey&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Acknowledgments and Gratitude&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;




&lt;h2&gt;
  
  
  1. Leveling Up as a Frontend Developer: My Cybersecurity Learning Journey
&lt;/h2&gt;

&lt;p&gt;I am always looking for ways to grow and learn. I usually set aside a few hours each week to research opportunities, quality courses, and resources to keep improving myself. This also keeps my mind active and healthy, especially since I am currently job hunting‼️ (which is a job in itself 😬).&lt;/p&gt;

&lt;p&gt;I've been a Frontend Developer for the past four years, and since last year, I've been learning about Cloud technologies, specifically AWS. I earned my AWS Cloud Practitioner certification and I really enjoyed it. It's fascinating to explore other parts of the development process, not just the frontend and the backend.&lt;/p&gt;

&lt;p&gt;In my last project, "The Cloud Resume Challenge," I focused on the infrastructure side and explored ways to make my app more secure. I understand that security is very important, and we should always aim to build secure applications and systems. This project allowed me to do some research, which broadened my perspective on security. This is where security truly captured my interest. I discovered that I could go above and beyond by adding enhancements to the security aspect of the project. Here are some interesting and useful security improvements I could make:&lt;/p&gt;

&lt;p&gt;I could enhance my project's security by configuring DNSSEC to prevent "man-in-the-middle" attacks, applying the "least privilege" principle with the help of a service called IAM Access Analyzer, using AWS WAF to protect my public API, diagramming the code flow to detect attack paths, and setting up code scanning and vulnerability checks on my repositories.&lt;/p&gt;

&lt;p&gt;I wrote a &lt;a href="https://lalidev.hashnode.dev/my-aws-cloud-resume-challenge" rel="noopener noreferrer"&gt;blog&lt;/a&gt; about it if you want to check it out 👀.&lt;/p&gt;

&lt;p&gt;Thanks to &lt;a href="https://www.linkedin.com/company/mujeresit/" rel="noopener noreferrer"&gt;Mujeres IT&lt;/a&gt;, a fantastic support group for women in technology in Uruguay, I discovered an opportunity. The German Embassy was inviting us to apply for a Cybersecurity Program aimed at bridging the gap in the cybersecurity field. I thought, why not? It seemed like a natural next step, so I decided to apply for the program.&lt;/p&gt;

&lt;h2&gt;
  
  
  2. The ITU Her CyberTracks Fellowship: Empowering Women in Cybersecurity Through Global Initiative
&lt;/h2&gt;

&lt;p&gt;Okay, but first things first, what is ITU?&lt;/p&gt;

&lt;p&gt;The International Telecommunication Union, or ITU, is a specialized agency of the United Nations that focuses on information and communication technologies. Established in 1865 as the International Telegraph Union, it is the oldest UN agency and was the first formal international organization. Initially, the ITU aimed to connect telegraph networks between countries. Today, it promotes the global use of the radio spectrum, facilitates international cooperation in assigning satellite orbits, helps develop and coordinate worldwide technical standards, and works to improve telecommunication infrastructure in developing countries. Based in Geneva, Switzerland, the ITU has a global membership that includes 194 countries and about 900 businesses, academic institutions, and international and regional organizations.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://academy.itu.int/" rel="noopener noreferrer"&gt;ITU Academy&lt;/a&gt; offers a large selection of online, face-to-face and blended courses. &lt;strong&gt;ITU Academy Training Centres are internationally recognized&lt;/strong&gt; institutions offering high-quality training to professionals, with a focus on the needs of developing countries.&lt;/p&gt;

&lt;p&gt;About this specific program:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://academy.itu.int/training-courses/full-catalogue/her-cybertracks-incident-response-cybertrack" rel="noopener noreferrer"&gt;Her CyberTracks&lt;/a&gt; is funded by the German Federal Foreign Office and co-implemented by &lt;a href="https://www.linkedin.com/company/gizgmbh/posts/?feedView=all" rel="noopener noreferrer"&gt;&lt;strong&gt;Deutsche Gesellschaft für Internationale Zusammenarbeit (GIZ) GmbH&lt;/strong&gt;&lt;/a&gt; and &lt;strong&gt;International Telecommunication Union&lt;/strong&gt; &lt;strong&gt;(ITU)&lt;/strong&gt;, in partnership with &lt;a href="https://www.linkedin.com/company/unodc/" rel="noopener noreferrer"&gt;&lt;strong&gt;UNODC&lt;/strong&gt;&lt;/a&gt; &lt;strong&gt;(UN Office on Drugs and Crime)&lt;/strong&gt; and &lt;a href="https://www.linkedin.com/company/lac4/posts/?feedView=all" rel="noopener noreferrer"&gt;&lt;strong&gt;LAC4&lt;/strong&gt;&lt;/a&gt; &lt;strong&gt;(Latin America and Caribbean Cyber Competence Centre)&lt;/strong&gt;. The goal of Her CyberTracks is to support the equal and meaningful participation of women in cybersecurity. To achieve this, the training course provides women with the skills and mindset needed to succeed in cybersecurity through focused capacity building.&lt;/p&gt;

&lt;p&gt;It is a &lt;strong&gt;6-month training program&lt;/strong&gt; offering a comprehensive curriculum focused on three main areas: &lt;strong&gt;TRAIN&lt;/strong&gt;, which provides expert training to develop skills for a secure cyberspace; &lt;strong&gt;MENTOR&lt;/strong&gt;, which creates a platform for senior professionals to guide and support women in their cybersecurity careers; and &lt;strong&gt;INSPIRE&lt;/strong&gt;, which uses role models and events to encourage and empower women to lead in cybersecurity. The program includes online courses, regional training, and mentorship.&lt;/p&gt;

&lt;p&gt;The curriculum includes the &lt;strong&gt;Her CyberTracks Latin America, Europe&lt;/strong&gt; and &lt;strong&gt;Africa trainings&lt;/strong&gt;, which consist of on-site activities featuring soft skills masterclasses, simulations, hands-on exercises, mentorship circles, study visits to cybersecurity organizations, and engaging networking opportunities.&lt;/p&gt;

&lt;p&gt;Participants could choose from three available CyberTracks: Policy &amp;amp; Diplomacy CyberTrack, Incident Response CyberTrack, and Criminal Justice CyberTrack.&lt;/p&gt;

&lt;p&gt;In my case, I applied for the &lt;strong&gt;Incident Response CyberTrack&lt;/strong&gt;. This track is designed for women already in technical roles who want to enter the cybersecurity field and gain practical experience. Incident response is a structured process that organizations use to identify and handle cybersecurity incidents, like data breaches or cyberattacks. Sounds exciting, right? Cybersecurity incidents are inevitable, and having a strong incident response program is essential for managing them effectively.&lt;/p&gt;

&lt;p&gt;About the selection process…&lt;/p&gt;

&lt;p&gt;The program received over 1,000 applications from around the world, but only 159 were accepted. Out of those 159, 55 were chosen for the Incident Response CyberTrack, which was the most popular choice among applicants. I'm thrilled to be one of the participants selected for the program. It will take me to the &lt;strong&gt;Latin America and Caribbean Cyber Competence Centre (LAC4)&lt;/strong&gt; in the Dominican Republic for hands-on training.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;It's important to have such initiatives to provide women with the opportunity to build a career in the cybersecurity field, where only 20% of the workforce is female. Crazy, right?&lt;/strong&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  3. My Experience So Far
&lt;/h2&gt;

&lt;p&gt;My first impressions of the program are very positive. The course is well-organized into seven modules, plus a one-week hands-on experience at LAC4. This training happens at the facility in Santo Domingo, Dominican Republic. The platform provides clear information about the modules, topics, and deadlines for completing activities.&lt;/p&gt;

&lt;p&gt;One thing I really like is the forums where you can interact with fellow students. It's amazing to hear from other women studying with me from the other side of the world. I love that multicultural aspect.&lt;/p&gt;

&lt;h2&gt;
  
  
  4. Study Notes: Technical Knowledge Gained
&lt;/h2&gt;

&lt;p&gt;So far, I have completed the first two modules:&lt;/p&gt;

&lt;p&gt;For &lt;strong&gt;Module 1 - Cybersecurity Fundamentals&lt;/strong&gt;, you can find my study notes &lt;a href="https://inquisitive-cost-09d.notion.site/Module-1-Cybersecurity-Fundamentals-1ece0e502ea580588669d243d9865c79" rel="noopener noreferrer"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo6jgv089rrh1xgrs80bn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo6jgv089rrh1xgrs80bn.png" alt=" " width="800" height="589"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And for &lt;strong&gt;Module 2 - Digital First Responder&lt;/strong&gt;, you can also find my study notes &lt;a href="https://inquisitive-cost-09d.notion.site/Module-2-Introduction-to-Digital-First-Responder-1ede0e502ea580149523cd2d24265171?pvs=4" rel="noopener noreferrer"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn65fm00c9og9ea5bal4t.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn65fm00c9og9ea5bal4t.png" alt=" " width="800" height="591"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For someone new to cybersecurity, I found these two courses to be an excellent starting point.&lt;/p&gt;

&lt;p&gt;Interestingly, Module Two reminded me of my first job, especially the "Treatment of IT Disruptions" section. Here's a little story: When I was 17, I got my first job at an English institute. I was an Administrative Assistant, but I also worked in the library and multimedia room. I remember whenever something wasn't working with the computers or video equipment (yes, back when VHS was how we watched movies 😂), I was always curious and tried to fix those problems myself. It was fun, and I always learned something new.&lt;/p&gt;

&lt;h2&gt;
  
  
  5. The Power of Mentorship: Guidance from Industry Leaders
&lt;/h2&gt;

&lt;p&gt;As I mentioned earlier, the training includes mentorship. Based on my profile, I was assigned a mentor named &lt;a href="https://www.linkedin.com/in/elvira-napwora/" rel="noopener noreferrer"&gt;Elvira&lt;/a&gt;. She is from Kenya and works as a Cybersecurity Analyst at &lt;strong&gt;NTT DATA Middle East and Africa&lt;/strong&gt;. We've exchanged introductory emails and even had our first one-on-one meeting last week 😃.&lt;/p&gt;

&lt;p&gt;For our first mentorship meeting, I prepared a "First Mentorship Agenda", which we used as a guide. I think this kept the conversation smooth and organized.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1b4pl8wd7baautmikdq1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1b4pl8wd7baautmikdq1.png" alt=" " width="800" height="701"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;During the first mentorship session, I learned a lot from Elvira. She was very encouraging and eagerly shared her views and insights on various topics. I felt comfortable speaking with her. We agreed on the time, platform, and duration of the meeting.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;TIP:&lt;/strong&gt; I strongly suggest being prepared for meetings to make the most of the mentorship. It's also a way to show respect for the mentor, who is generously giving their valuable free time.&lt;/p&gt;

&lt;h2&gt;
  
  
  6. Looking Forward: Next Steps in My Cybersecurity Journey
&lt;/h2&gt;

&lt;p&gt;One of the program's tasks is to define SMART Goals. One of mine is to research career paths and roles within the cybersecurity field so that I have a clear vision of where I want to go. During my conversation with Elvira, she advised me to explore defensive security and the various roles involved, and I think that was great advice!&lt;/p&gt;




&lt;blockquote&gt;
&lt;p&gt;Defensive cybersecurity aims to prevent cyber attacks by safeguarding everything from an organization’s systems and software to its full network infrastructure. &lt;/p&gt;
&lt;/blockquote&gt;




&lt;p&gt;Apart from the Fellowship program:&lt;/p&gt;

&lt;p&gt;I am studying in the &lt;strong&gt;Cisco Network Academy&lt;/strong&gt;, following the &lt;a href="https://www.netacad.com/career-paths/cybersecurity?courseLang=en-US" rel="noopener noreferrer"&gt;Junior Cybersecurity Analyst career path&lt;/a&gt; to complement the training. This is a great platform to learn from high-quality content for free.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcw19rjkxfd81ro0876mp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcw19rjkxfd81ro0876mp.png" alt=" " width="800" height="337"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I am also participating in the initiative “&lt;strong&gt;IT For Girls - Ciberseguridad y OT&lt;/strong&gt;” by &lt;a href="https://www.linkedin.com/company/womakerscode/" rel="noopener noreferrer"&gt;&lt;strong&gt;WoMakersCode&lt;/strong&gt;&lt;/a&gt;. This program offers mentorship and training in cybersecurity and OT technologies for Spanish-speaking women, providing 1,000 Udemy scholarships and over 900 Claroty certifications through partnerships with &lt;strong&gt;NTT DATA Europe &amp;amp; Latam&lt;/strong&gt;, &lt;strong&gt;NTT DATA FOUNDATION&lt;/strong&gt;, &lt;strong&gt;Claroty&lt;/strong&gt;, and &lt;strong&gt;Udemy&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm1km2p62att1zvsu27p3.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm1km2p62att1zvsu27p3.jpeg" alt=" " width="800" height="999"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you're a woman curious about cybersecurity and wondering if you belong there, you absolutely do.&lt;/p&gt;

&lt;h2&gt;
  
  
  7. Acknowledgments and Gratitude
&lt;/h2&gt;

&lt;p&gt;I have already posted a thank-you message on my LinkedIn to the ITU organizers and the IT for Girls organizers and speakers. However, I want to express my gratitude again to everyone who makes these initiatives possible. These programs have a significant impact on the women who participate.&lt;/p&gt;

&lt;p&gt;Mentors are incredibly important in technology. They can be positive role models for those starting in the field, where everything is new and uncertain. Special thanks to my mentor &lt;a href="https://www.linkedin.com/in/elvira-napwora/" rel="noopener noreferrer"&gt;Elvira&lt;/a&gt; for volunteering her time and sharing her experience. I look forward to our upcoming one-on-one sessions.&lt;/p&gt;

&lt;p&gt;Thank you to the other women in the cybersecurity field for being so welcoming, inspiring, and encouraging. This supportive network is one of the things I enjoy most about being in tech.&lt;/p&gt;

&lt;p&gt;If you are on a similar path as me or enjoy what I share, feel free to reach out to my email at &lt;a href="mailto:lauradiaz1586@gmail.com"&gt;lauradiaz1586@gmail.com&lt;/a&gt; ✨&lt;/p&gt;

</description>
      <category>womencybersecurity</category>
      <category>hercybertracks</category>
      <category>cybersecurity</category>
      <category>womenincybersecurity</category>
    </item>
    <item>
      <title>My AWS Cloud Resume Challenge ✨👩🏽‍💻</title>
      <dc:creator>Laura</dc:creator>
      <pubDate>Tue, 07 Apr 2026 22:20:43 +0000</pubDate>
      <link>https://forem.com/lalidevops/my-aws-cloud-resume-challenge-3h0e</link>
      <guid>https://forem.com/lalidevops/my-aws-cloud-resume-challenge-3h0e</guid>
      <description>&lt;h2&gt;
  
  
  Table of Contents
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Introduction 🩵&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;My Architecture diagram 👩‍🎨&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Step 1: Certification ☁️&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Step 2: Convert my resume into HTML 👷🏽‍♀️&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Step 3: Add minimal styles with CSS 🪄&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Step 4: Deploy my static website to S3 🪣&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Step 5: Add Security with HTTPS 🔐&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Step 6: DNS 👈&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Step 7: Javascript 🦹&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Step 8: Database 💾&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Step 9: API ✨&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Step 10: Python 🐍&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Step 11: Tests&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Step 12: Infrastructure as Code (IaC) 👩🏽‍💻&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Step 13: Source Control&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Step 14: CI/CD (Back end)🏄&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Step 15: CI/CD (Front end) 🏄‍♀️&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;🚨🚨Beyond the Requirements: My Extended Contribution 🚨🚨&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Step 16: Blog post ✏️&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;A Heartfelt Thank You to My Mentor 🙏&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Introduction 🩵
&lt;/h2&gt;

&lt;p&gt;Some time ago, my mentor &lt;a href="https://dev.to/Marianogg9"&gt;Mariano González&lt;/a&gt; shared with me this amazing challenge, called “&lt;a href="https://cloudresumechallenge.dev/docs/the-challenge/" rel="noopener noreferrer"&gt;The Cloud Resume Challenge&lt;/a&gt;” by &lt;a href="https://forrestbrazeal.com/" rel="noopener noreferrer"&gt;Forrest Brazeal&lt;/a&gt;. This challenge &lt;strong&gt;isn’t&lt;/strong&gt; a tutorial or a how-to guide; it tells you what the outcome of the project should be. It's a hands-on project designed to help you move from cloud certification to a cloud job, and it includes many skills that real cloud and DevOps engineers use every day ✨.&lt;/p&gt;

&lt;p&gt;In my case, I chose to do it with &lt;strong&gt;AWS&lt;/strong&gt;. The challenge consists of 16 steps and is free for anyone to try.&lt;/p&gt;

&lt;p&gt;In this blog post, I will share my experience with this challenge.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Architecture 👩‍🎨
&lt;/h2&gt;

&lt;p&gt;This is the application's architecture diagram I created. I know it looks a bit small and is hard to read, but hopefully, the icons help. The purpose of sharing the entire architecture diagram is to provide an overview of the services I used to solve the challenge and the enhancements I added myself. I will also share detailed diagrams for each step so you can get more information about them.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fed53grbzl8g41kbss1iv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fed53grbzl8g41kbss1iv.png" alt=" " width="800" height="571"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 1: Certification ☁️
&lt;/h2&gt;

&lt;p&gt;The first step was to have the &lt;a href="https://aws.amazon.com/certification/certified-cloud-practitioner/" rel="noopener noreferrer"&gt;AWS Cloud Practitioner certification&lt;/a&gt; on my resume. I already had this certification, and I wrote a &lt;a href="https://laurainthecloud.hashnode.dev/got-my-aws-cloud-practitioner-certification-clf-c02" rel="noopener noreferrer"&gt;blog&lt;/a&gt; about it.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc5zdgnvcgxc9pbs7liuu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc5zdgnvcgxc9pbs7liuu.png" alt=" " width="600" height="600"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Looking ahead, after completing this challenge, my next immediate step is to obtain the &lt;strong&gt;Developer Associate Certification&lt;/strong&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 2: Convert My Resume into HTML 👷🏽‍♀️
&lt;/h2&gt;

&lt;p&gt;The next step was to convert my resume into &lt;strong&gt;HTML&lt;/strong&gt; (HyperText Markup Language) format. I had been a Frontend Developer for the past four years, so luckily I knew what HTML is. But if you don’t have any coding experience, don’t be intimidated by this step; it was pretty easy and straightforward to learn. Here’s a good resource to learn from: the &lt;a href="https://developer.mozilla.org/en-US/docs/Web/HTML" rel="noopener noreferrer"&gt;Mozilla HTML Docs&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Since my main experience is in frontend development and I already knew React, I could have used React or built on the infrastructure of &lt;a href="https://lauradiaz.cloud/" rel="noopener noreferrer"&gt;my own portfolio&lt;/a&gt;. However, as you will see in the following steps, the goal of this challenge was to understand DNS and HTTPS on my own and to use S3 directly instead of a pre-built static website service like AWS Amplify.&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 3: Add Minimal Styles with CSS 🪄
&lt;/h2&gt;

&lt;p&gt;After converting it to HTML, I needed to apply some basic styles using &lt;strong&gt;CSS&lt;/strong&gt;. This made the resume look more polished and visually appealing. I chose to use very simple and minimal styles since the focus of this project is to continue learning Cloud skills.&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 4: Deploy my static website to S3 🪣
&lt;/h2&gt;

&lt;p&gt;The next step involved a popular AWS service called &lt;strong&gt;Amazon S3&lt;/strong&gt;, an object storage service that stores data for millions of customers worldwide. I used this service to deploy my project.&lt;/p&gt;
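
&lt;p&gt;Since I later managed all of this infrastructure with Terraform (Step 12), here is a minimal, illustrative sketch of what the bucket looks like in code. The bucket name is a placeholder, not my real one:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;resource "aws_s3_bucket" "resume" {
  bucket = "my-cloud-resume-bucket" # placeholder name

  tags = {
    Project = "TheCloudResumeChallenge"
  }
}

# Serve index.html as the website entry point
resource "aws_s3_bucket_website_configuration" "resume" {
  bucket = aws_s3_bucket.resume.id

  index_document {
    suffix = "index.html"
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;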

&lt;h2&gt;
  
  
  Step 5: Add Security with HTTPS 🔐
&lt;/h2&gt;

&lt;p&gt;Step 5 introduced a new service: &lt;strong&gt;Amazon CloudFront.&lt;/strong&gt; Amazon CloudFront is a content delivery network (CDN) service that distributes your static and dynamic content quickly, reliably, and with low latency. &lt;em&gt;Amazon S3 + CloudFront&lt;/em&gt; enables storing, securing, and delivering your static content at scale.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flchamn2s7m1tbsr4b7tl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flchamn2s7m1tbsr4b7tl.png" alt=" " width="800" height="349"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After completing the configurations, I was able to access the distribution. The distribution's URL looked something like this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz9e2yxcejw2dh559gg7w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz9e2yxcejw2dh559gg7w.png" alt=" " width="800" height="103"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I had experience creating an &lt;strong&gt;S3&lt;/strong&gt; bucket before, but this was the first time I created a CloudFront distribution, and I really enjoyed it.&lt;/p&gt;
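
&lt;p&gt;To give a rough idea of what a distribution involves, here is a minimal Terraform sketch of a CloudFront distribution in front of an S3 bucket. The resource and origin names are illustrative, and my real configuration also includes the custom domain and certificate covered in the next step:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;resource "aws_cloudfront_distribution" "resume" {
  enabled             = true
  default_root_object = "index.html"

  origin {
    domain_name = aws_s3_bucket.resume.bucket_regional_domain_name
    origin_id   = "resume-s3-origin"

    s3_origin_config {
      origin_access_identity = "" # or an OAI/OAC for a private bucket
    }
  }

  default_cache_behavior {
    allowed_methods        = ["GET", "HEAD"]
    cached_methods         = ["GET", "HEAD"]
    target_origin_id       = "resume-s3-origin"
    viewer_protocol_policy = "redirect-to-https"

    forwarded_values {
      query_string = false
      cookies {
        forward = "none"
      }
    }
  }

  restrictions {
    geo_restriction {
      restriction_type = "none"
    }
  }

  viewer_certificate {
    cloudfront_default_certificate = true
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;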

&lt;h2&gt;
  
  
  Step 6: DNS 👈
&lt;/h2&gt;

&lt;p&gt;This next step was an exciting one!🫰&lt;/p&gt;

&lt;p&gt;In this one, I had to point a custom DNS domain name to the CloudFront distribution I just created, so the resume could be accessed at something like &lt;code&gt;lauradiazcloudengineer.com&lt;/code&gt;. For this step, I could use any DNS provider; I chose &lt;strong&gt;Amazon Route 53&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;After I bought my domain name and its status became &lt;code&gt;successful&lt;/code&gt;, I checked the documentation to learn how to route traffic to an &lt;strong&gt;Amazon CloudFront&lt;/strong&gt; distribution using the domain name I had just purchased, &lt;code&gt;lauradiazcloudengineer.com&lt;/code&gt;, instead of the domain name that &lt;strong&gt;CloudFront&lt;/strong&gt; assigns by default. As you may have noticed in the previous step, when you create a distribution, CloudFront assigns it a domain name that looks something like &lt;code&gt;foobar.cloudfront.net&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;The first thing I needed to do was request a public certificate so that Amazon CloudFront could use HTTPS. For this, I used an AWS service called &lt;strong&gt;AWS Certificate Manager (ACM)&lt;/strong&gt; to obtain a public certificate.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbzgdpiphqrs0u2w5lem1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbzgdpiphqrs0u2w5lem1.png" alt=" " width="800" height="789"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And finally, I needed to configure &lt;strong&gt;Amazon Route 53&lt;/strong&gt; to route traffic to the CloudFront distribution.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fspxqocy7erfxzbw14l1k.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fspxqocy7erfxzbw14l1k.png" alt=" " width="800" height="127"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 7: JavaScript 🦹
&lt;/h2&gt;

&lt;p&gt;I really enjoyed working on this step because I have been working with JavaScript for 4 years now. When I saw JavaScript included in the challenge, I was quite happy 😊. This step involved building a visitor &lt;strong&gt;counter&lt;/strong&gt; that displays how many people have accessed the site.&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 8: Database 💾
&lt;/h2&gt;

&lt;p&gt;Because the visitor counter needed to retrieve and update its count in a database somewhere, I worked with an AWS service called &lt;strong&gt;DynamoDB.&lt;/strong&gt; It is a powerful, serverless, fast, and flexible &lt;a href="https://www.mongodb.com/resources/basics/databases/nosql-explained" rel="noopener noreferrer"&gt;NoSQL database&lt;/a&gt; that is fully managed.&lt;/p&gt;
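
&lt;p&gt;A minimal Terraform sketch of such a table might look like this (the table name and key are illustrative; on-demand billing keeps the cost of a low-traffic counter near zero):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;resource "aws_dynamodb_table" "visitor_count" {
  name         = "visitor-count" # illustrative name
  billing_mode = "PAY_PER_REQUEST"
  hash_key     = "id"

  attribute {
    name = "id"
    type = "S"
  }

  tags = {
    Project = "TheCloudResumeChallenge"
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;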

&lt;h2&gt;
  
  
  Step 9: API ✨
&lt;/h2&gt;

&lt;p&gt;I used &lt;strong&gt;AWS’s API Gateway&lt;/strong&gt; and &lt;strong&gt;Lambda&lt;/strong&gt; as my serverless backend, following this architecture pattern:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw50ujd0z7lbs4wtu2zea.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw50ujd0z7lbs4wtu2zea.jpeg" alt=" " width="800" height="424"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;✅ I created a serverless API to update the visitor counter from a DynamoDB table.&lt;/p&gt;

&lt;p&gt;✅ I created a &lt;strong&gt;Lambda&lt;/strong&gt; function as my backend.&lt;/p&gt;

&lt;p&gt;✅ I created an HTTP API using the &lt;strong&gt;API Gateway&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;From a DevOps perspective, it was important to get notified when my running services did something unexpected. For this, I set up &lt;strong&gt;CloudWatch&lt;/strong&gt; (more details in the upcoming sections).&lt;/p&gt;
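
&lt;p&gt;As an illustrative Terraform sketch (the resource names and route path are placeholders), wiring an HTTP API to a Lambda function looks roughly like this:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;resource "aws_apigatewayv2_api" "counter" {
  name          = "visitor-counter-api" # illustrative name
  protocol_type = "HTTP"
}

resource "aws_apigatewayv2_integration" "counter" {
  api_id                 = aws_apigatewayv2_api.counter.id
  integration_type       = "AWS_PROXY"
  integration_uri        = aws_lambda_function.counter.invoke_arn
  payload_format_version = "2.0"
}

resource "aws_apigatewayv2_route" "get_count" {
  api_id    = aws_apigatewayv2_api.counter.id
  route_key = "GET /count" # illustrative route
  target    = "integrations/${aws_apigatewayv2_integration.counter.id}"
}

# An aws_lambda_permission resource is also needed so that
# API Gateway is allowed to invoke the Lambda function.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;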

&lt;h2&gt;
  
  
  Step 10: Python 🐍
&lt;/h2&gt;

&lt;p&gt;For step number 10, I used a service called &lt;strong&gt;Lambda&lt;/strong&gt;, which runs my code in response to events and automatically manages the underlying compute resources. I don't have much experience with Python, but I decided to give it a try.&lt;/p&gt;
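
&lt;p&gt;Declaring the function itself in Terraform is short. This is a sketch with placeholder names; it assumes the Python handler is zipped into &lt;code&gt;lambda.zip&lt;/code&gt; and that an execution role is defined elsewhere:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;resource "aws_lambda_function" "counter" {
  function_name = "visitor-counter" # illustrative name
  runtime       = "python3.12"
  handler       = "lambda_function.lambda_handler"
  filename      = "lambda.zip"                 # zipped Python source
  role          = aws_iam_role.lambda_exec.arn # execution role defined elsewhere
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;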

&lt;h2&gt;
  
  
  Step 11: Tests
&lt;/h2&gt;

&lt;p&gt;Testing is the only step I left out of the current iteration of this challenge. Although I understand it is a crucial step, this choice was made on purpose to allow more time to learn proper testing methods and apply them effectively in future updates.&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 12: Infrastructure as Code (IaC) 👩🏽‍💻
&lt;/h2&gt;

&lt;p&gt;I needed to write the infrastructure as code so I wouldn't have to create the services manually. The challenge suggested using an &lt;em&gt;AWS Serverless Application Model (SAM)&lt;/em&gt; template and deploying with the AWS SAM CLI. However, since I had experience working with &lt;strong&gt;Terraform&lt;/strong&gt; and wanted to tackle &lt;a href="https://cloudresumechallenge.dev/docs/extensions/terraform-getting-started/" rel="noopener noreferrer"&gt;this EXTRA challenge,&lt;/a&gt; Terraform was my preferred choice.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;A SIDE NOTE 👀:&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;In case you&lt;/em&gt; are &lt;em&gt;curious about my experience with Terraform, here’s my previous post:&lt;/em&gt; &lt;a href="https://laurainthecloud.hashnode.dev/iac-deploying-a-node-secrets-viewer-with-terraform" rel="noopener noreferrer"&gt;&lt;em&gt;IaC: Deploying a Node Secrets Viewer with Terraform&lt;/em&gt;&lt;/a&gt;&lt;em&gt;. This project demonstrates the deployment of my NodeJS application that retrieves and displays secrets from&lt;/em&gt; &lt;strong&gt;&lt;em&gt;AWS Secrets Manager&lt;/em&gt;&lt;/strong&gt;. &lt;em&gt;The infrastructure is provisioned using Terraform, showcasing Infrastructure as Code (IaC) capabilities with AWS services.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;This step involves another challenge: &lt;a href="https://cloudresumechallenge.dev/docs/extensions/terraform-getting-started/" rel="noopener noreferrer"&gt;Terraform Your Cloud Resume Challenge&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 13: Source Control
&lt;/h2&gt;

&lt;p&gt;I created two different repositories, one for the backend and another for the frontend. The purpose of this step was to update the site automatically whenever I made a change to the code.&lt;/p&gt;

&lt;p&gt;You can view the repositories here:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://github.com/lalidiaz/the-cloud-resume-challenge-FE/blob/main/README.md" rel="noopener noreferrer"&gt;Frontend repository&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://github.com/lalidiaz/the-cloud-resume-challenge-BE" rel="noopener noreferrer"&gt;Backend repository&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Step 14: CI/CD (Back end) 🏄
&lt;/h2&gt;

&lt;p&gt;In the backend repository, there is a Terraform directory that contains all my Infrastructure as Code. Each file contains the code for a specific AWS service, which keeps the configuration organized and easy to understand.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;The Python Lambda code is part of the workflow but was not committed, as requested by the challenge's creator.&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 15: CI/CD (Front end) 🏄‍♀️
&lt;/h2&gt;

&lt;p&gt;The frontend repository has a directory named &lt;code&gt;resume&lt;/code&gt; that contains the HTML and CSS. &lt;em&gt;The JavaScript code is part of the workflow but is not committed, as requested by the challenge's creator.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;I set up a &lt;code&gt;main.tf&lt;/code&gt; file with my &lt;strong&gt;Terraform&lt;/strong&gt; IaC, so when the GitHub Actions CI/CD pipeline runs, the new website code is pushed to the S3 bucket and the site updates automatically.&lt;/p&gt;

&lt;h2&gt;
  
  
  🚨🚨 Beyond the Requirements: My Extended Contribution 🚨 🚨
&lt;/h2&gt;

&lt;p&gt;Okay, but this was not the end!&lt;/p&gt;

&lt;p&gt;I wanted to share other &lt;strong&gt;AWS services&lt;/strong&gt; I used to make my infrastructure more robust and to cover important DevOps topics like &lt;strong&gt;monitoring&lt;/strong&gt;, &lt;strong&gt;logging&lt;/strong&gt;, and &lt;strong&gt;notifications&lt;/strong&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Additional S3 relevant configurations:&lt;/strong&gt;
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;✅ I enabled bucket versioning. About versioning &lt;a href="https://docs.aws.amazon.com/AmazonS3/latest/userguide/versioning-workflows.html" rel="noopener noreferrer"&gt;here&lt;/a&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;✅ Configured a lifecycle policy for the &lt;strong&gt;S3&lt;/strong&gt; bucket: it automatically cleans up old versions of objects tagged with "TheCloudResumeChallenge" after 90 days.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
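
&lt;p&gt;Both settings can be expressed in Terraform. This is a sketch under the assumption that the bucket resource is named &lt;code&gt;aws_s3_bucket.resume&lt;/code&gt; and the tag key is illustrative:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;resource "aws_s3_bucket_versioning" "resume" {
  bucket = aws_s3_bucket.resume.id

  versioning_configuration {
    status = "Enabled"
  }
}

resource "aws_s3_bucket_lifecycle_configuration" "resume" {
  bucket = aws_s3_bucket.resume.id

  rule {
    id     = "expire-old-versions"
    status = "Enabled"

    # Only apply to objects carrying the project tag
    filter {
      tag {
        key   = "Project" # illustrative tag key
        value = "TheCloudResumeChallenge"
      }
    }

    noncurrent_version_expiration {
      noncurrent_days = 90
    }
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;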

&lt;h3&gt;
  
  
  &lt;strong&gt;Cost Allocation, Management, and Resource Organization&lt;/strong&gt;
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;✅ Added the tag “TheCloudResumeChallenge” to my resources to easily track cost by project and filter resources when needed.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;CloudFront:&lt;/strong&gt;
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;✅ Added a &lt;strong&gt;CloudFront&lt;/strong&gt; Response Security Headers Policy.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;✅ Added a &lt;strong&gt;CloudFront&lt;/strong&gt; cache policy: this policy optimizes cache efficiency by minimizing the values that CloudFront includes in the cache key.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;API Gateway:&lt;/strong&gt;
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;✅ Added a custom domain name to the API Gateway URL, along with the respective API mappings.&lt;/li&gt;
&lt;/ul&gt;
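
&lt;p&gt;Sketched in Terraform (the subdomain is illustrative, and the ACM certificate, HTTP API, and &lt;code&gt;$default&lt;/code&gt; stage are assumed to be defined elsewhere):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;resource "aws_apigatewayv2_domain_name" "api" {
  domain_name = "api.lauradiazcloudengineer.com" # illustrative subdomain

  domain_name_configuration {
    certificate_arn = aws_acm_certificate.api.arn # ACM cert defined elsewhere
    endpoint_type   = "REGIONAL"
    security_policy = "TLS_1_2"
  }
}

resource "aws_apigatewayv2_api_mapping" "api" {
  api_id      = aws_apigatewayv2_api.counter.id
  domain_name = aws_apigatewayv2_domain_name.api.id
  stage       = "$default" # assumes a $default stage exists
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;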

&lt;h3&gt;
  
  
  Simple Notification Service (SNS):
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;✅ Created a topic and subscribed my email to receive notifications when an alarm fires.&lt;/li&gt;
&lt;/ul&gt;
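
&lt;p&gt;This takes only a few lines of Terraform (the topic name and email address are placeholders; the email subscription must be confirmed from your inbox):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;resource "aws_sns_topic" "alerts" {
  name = "resume-alerts" # illustrative name
}

resource "aws_sns_topic_subscription" "email" {
  topic_arn = aws_sns_topic.alerts.arn
  protocol  = "email"
  endpoint  = "me@example.com" # replace with your email
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;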

&lt;h3&gt;
  
  
  &lt;strong&gt;CloudWatch:&lt;/strong&gt;
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;✅ Added configuration settings for CloudWatch logs.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;✅ Created API Gateway log groups.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;✅ Created CloudWatch alarms for Lambda errors.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;✅ Created CloudWatch alarms for API Gateway: errors and high request count.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
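
&lt;p&gt;As an example, a Lambda error alarm that notifies an SNS topic can be sketched like this (resource names are illustrative):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;resource "aws_cloudwatch_metric_alarm" "lambda_errors" {
  alarm_name          = "visitor-counter-lambda-errors" # illustrative name
  namespace           = "AWS/Lambda"
  metric_name         = "Errors"
  statistic           = "Sum"
  period              = 300 # 5-minute window
  evaluation_periods  = 1
  threshold           = 1
  comparison_operator = "GreaterThanOrEqualToThreshold"

  dimensions = {
    FunctionName = aws_lambda_function.counter.function_name
  }

  alarm_actions = [aws_sns_topic.alerts.arn]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;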

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3rjjbprv4iqq265hhtku.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3rjjbprv4iqq265hhtku.png" alt=" " width="800" height="610"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 16: Blog post ✏️
&lt;/h2&gt;

&lt;p&gt;Finally, writing a blog post was the last step of this amazing challenge 🤩.&lt;/p&gt;

&lt;p&gt;I am so happy and proud of myself for finishing it. It was challenging and difficult at times, but definitely rewarding.&lt;/p&gt;

&lt;p&gt;I have been blogging about different topics, and you can subscribe to my newsletter on Hashnode to get notified whenever I publish a new post.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;My experience taught me that the magic happens when people openly exchange ideas and expertise. This belief in democratizing knowledge drives me to contribute to the tech community while continuously learning from others 💖.&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  A Heartfelt Thank You to My Mentor 🙏
&lt;/h2&gt;

&lt;p&gt;To &lt;a href="https://nl.linkedin.com/in/marianogg9" rel="noopener noreferrer"&gt;Mariano&lt;/a&gt;,&lt;/p&gt;

&lt;p&gt;Your guidance as a mentor has been invaluable to my journey as a cloud engineer, and I am very grateful for that ✨. Thank you for always being there to answer my questions and for guiding me to discover solutions on my own, which has helped me develop problem-solving skills that will benefit my career.&lt;/p&gt;

&lt;p&gt;I am especially grateful for the time you've spent reviewing my work and providing detailed feedback, as your insights have pushed me to improve and revealed aspects I might have missed.&lt;/p&gt;

</description>
      <category>switchcareers</category>
      <category>aws</category>
      <category>cloudcomputing</category>
      <category>careerswitch</category>
    </item>
    <item>
      <title>✨ IaC: Deploying a Node Secrets Viewer with Terraform ✨👩🏽‍💻</title>
      <dc:creator>Laura</dc:creator>
      <pubDate>Tue, 07 Apr 2026 22:14:12 +0000</pubDate>
      <link>https://forem.com/lalidevops/iac-deploying-a-node-secrets-viewer-with-terraform-3h32</link>
      <guid>https://forem.com/lalidevops/iac-deploying-a-node-secrets-viewer-with-terraform-3h32</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;Learn how I built a secure cloud application by integrating Node.js with AWS Secrets Manager and automating the infrastructure deployment using Terraform. This project demonstrates modern cloud development practices and security implementation.&lt;/p&gt;

&lt;h2&gt;
  
  
  Table of contents
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Project Overview&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The Challenge&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Solution Architecture&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The Node Application&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Infrastructure as Code with Terraform&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Basic Setup&lt;/li&gt;
&lt;li&gt;Setup The Provider&lt;/li&gt;
&lt;li&gt;EC2 Instance Setup&lt;/li&gt;
&lt;li&gt;IAM Roles and Policies&lt;/li&gt;
&lt;li&gt;Security Groups Configuration&lt;/li&gt;
&lt;li&gt;AWS Secrets Manager Setup&lt;/li&gt;
&lt;li&gt;User Data Script Implementation&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;

&lt;p&gt;Deployment Process&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Prerequisites&lt;/li&gt;
&lt;li&gt;Step-by-Step Deployment Guide&lt;/li&gt;
&lt;li&gt;Deployment Verification&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;&lt;p&gt;Cost Considerations&lt;/p&gt;&lt;/li&gt;

&lt;li&gt;&lt;p&gt;Cleanup Process&lt;/p&gt;&lt;/li&gt;

&lt;li&gt;&lt;p&gt;Future Improvements&lt;/p&gt;&lt;/li&gt;

&lt;li&gt;&lt;p&gt;Conclusion&lt;/p&gt;&lt;/li&gt;

&lt;/ul&gt;

&lt;h2&gt;
  
  
  Project Overview
&lt;/h2&gt;

&lt;p&gt;This project showcases a practical implementation of cloud security and infrastructure automation. At its core, it combines a Node application that interacts with &lt;strong&gt;AWS Secrets Manager&lt;/strong&gt;🔐 and a complete infrastructure setup automated through &lt;strong&gt;Terraform&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;The project consists of two layers:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The application layer:&lt;/strong&gt; A Node application that interacts directly with AWS Secrets Manager using the AWS SDK.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Infrastructure layer:&lt;/strong&gt; The infrastructure is fully automated using &lt;strong&gt;Terraform&lt;/strong&gt;, which provisions and configures the following components:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;An EC2 instance to host the Node application&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;AWS Secrets Manager for secure secrets storage&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;IAM roles and policies following the principle of least privilege&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Security groups with strictly controlled access&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Instance profiles for secure communication between EC2 and AWS services&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;What makes this project particularly interesting is how it connects application development with infrastructure management 🫰🫰🫰.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Challenge
&lt;/h2&gt;

&lt;p&gt;Last week, I was deploying a Node application on an EC2 instance and wondered how to provide environment variables from the .env file to my project. Here's where &lt;strong&gt;AWS Secrets Manager&lt;/strong&gt; becomes useful.&lt;/p&gt;

&lt;p&gt;💡&lt;br&gt;
AWS Secrets Manager helps you manage, retrieve, and rotate database credentials, application credentials, OAuth tokens, API keys, and other secrets. For more information, visit &lt;a rel="noopener noreferrer nofollow" href="https://docs.aws.amazon.com/secretsmanager/latest/userguide/intro.html"&gt;this link&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;I was curious about securely sharing secrets in my applications using Secrets Manager 🤔👩🏽‍💻, and I also wanted to learn &lt;strong&gt;Terraform for Infrastructure as Code (IaC) 😃&lt;/strong&gt;. This was my first experience with Terraform and I really enjoyed it!&lt;/p&gt;

&lt;h2&gt;
  
  
  Solution Architecture
&lt;/h2&gt;

&lt;p&gt;The diagram below shows the solution architecture I designed for this project.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1736245241727%2F4f23af52-a191-40f7-a704-cf2ed0822ec9.png%2520align%3D" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1736245241727%2F4f23af52-a191-40f7-a704-cf2ed0822ec9.png%2520align%3D" width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The Node application retrieves secrets from AWS Secrets Manager and is hosted on an EC2 instance with the appropriate IAM roles and policies attached.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Node Application
&lt;/h2&gt;

&lt;p&gt;You can find the application in &lt;a href="https://github.com/lalidiaz/terraform-aws-secrets-manager" rel="noopener noreferrer"&gt;my GitHub Repository&lt;/a&gt;. This is a lightweight Node application with a simple UI that serves only to display the &lt;em&gt;secrets&lt;/em&gt;, demonstrating the ability to access them from the Secrets Manager. The app uses the &lt;a href="https://www.npmjs.com/package/@aws-sdk/client-secrets-manager" rel="noopener noreferrer"&gt;AWS SDK&lt;/a&gt; to connect with AWS.&lt;/p&gt;

&lt;h2&gt;
  
  
  Infrastructure as Code with Terraform
&lt;/h2&gt;

&lt;p&gt;For the infrastructure layer of the application, I created &lt;a href="https://github.com/lalidiaz/node-terraform-infrastructure" rel="noopener noreferrer"&gt;this GitHub repository&lt;/a&gt; where you can find all my code.&lt;/p&gt;

&lt;p&gt;💡&lt;br&gt;
Terraform is an infrastructure as code tool that lets you build, change, and version infrastructure safely and efficiently. This includes low-level components like compute instances, storage, and networking.&lt;/p&gt;

&lt;h3&gt;
  
  
  Basic Setup ✨
&lt;/h3&gt;

&lt;p&gt;For the setup, I first visited the &lt;a href="https://developer.hashicorp.com/terraform?product_intent=terraform" rel="noopener noreferrer"&gt;Terraform Docs&lt;/a&gt; to learn how to install it on my local machine. Since I am working with AWS, I read the &lt;a href="https://registry.terraform.io/providers/hashicorp/aws/latest/docs" rel="noopener noreferrer"&gt;AWS Provider documentation&lt;/a&gt; to understand how to interact with various resources.&lt;/p&gt;

&lt;p&gt;If you are using a different provider, you can find the registry here 🙃: &lt;a href="https://developer.hashicorp.com/terraform/language/providers#how-to-find-providers" rel="noopener noreferrer"&gt;https://developer.hashicorp.com/terraform/language/providers#how-to-find-providers&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once you have this set up, &lt;strong&gt;&lt;em&gt;make sure you are also logged in to the AWS CLI with the correct credentials&lt;/em&gt;&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;About the code: For each Terraform configuration block, I created a file named after the block it references. Why? 🤔 Here are some of the reasons:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;✅ This approach creates a modular and organized structure.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;✅ Each file has a specific purpose.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;✅ It makes it easier for team members to understand and navigate the codebase, as they can quickly find specific configurations.&lt;br&gt;
&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;.&lt;/span&gt;
├── data.tf
├── output.tf
├── provider.tf
├── resource.tf
├── userdata.sh
└── .gitignore
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Setup The Provider ☁️
&lt;/h3&gt;

&lt;p&gt;💡&lt;br&gt;
Terraform relies on plugins called "providers" to interact with remote systems. Terraform configurations must declare which providers they require, so that Terraform can install and use them.&lt;/p&gt;

&lt;p&gt;To set up the AWS provider, I created a file named &lt;code&gt;provider.tf&lt;/code&gt; with the following content:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;terraform &lt;span class="o"&gt;{&lt;/span&gt;
  required_providers &lt;span class="o"&gt;{&lt;/span&gt;
    aws &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="o"&gt;{&lt;/span&gt;
      &lt;span class="nb"&gt;source&lt;/span&gt;  &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"hashicorp/aws"&lt;/span&gt;
      version &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"~&amp;gt; 4.16"&lt;/span&gt;
    &lt;span class="o"&gt;}&lt;/span&gt;
  &lt;span class="o"&gt;}&lt;/span&gt;

  required_version &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"&amp;gt;= 1.2.0"&lt;/span&gt;
&lt;span class="o"&gt;}&lt;/span&gt;

&lt;span class="c"&gt;# Configure the AWS Provider&lt;/span&gt;
provider &lt;span class="s2"&gt;"aws"&lt;/span&gt; &lt;span class="o"&gt;{&lt;/span&gt;
  region &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"us-west-2"&lt;/span&gt;
&lt;span class="o"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  EC2 Instance Setup 👩🏽‍💻
&lt;/h3&gt;

&lt;p&gt;For the EC2 instance, I had to tackle several configurations.&lt;/p&gt;

&lt;p&gt;First, I created a &lt;code&gt;resource.tf&lt;/code&gt; file with all the resources; one of them is the EC2 instance, together with the respective role to access Secrets Manager and the security groups.&lt;/p&gt;

&lt;p&gt;This resource creates an EC2 instance with a Linux AMI, a t2.micro instance type, an instance profile, security groups for the instance, and user data injected during instance creation.&lt;/p&gt;

&lt;p&gt;I also added a tag for billing tracking 😉.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;resource &lt;span class="s2"&gt;"aws_instance"&lt;/span&gt; &lt;span class="s2"&gt;"app_server"&lt;/span&gt; &lt;span class="o"&gt;{&lt;/span&gt;
  ami           &lt;span class="o"&gt;=&lt;/span&gt; data.aws_ami.linux_ami.id
  instance_type &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"t2.micro"&lt;/span&gt;
  iam_instance_profile &lt;span class="o"&gt;=&lt;/span&gt; aws_iam_instance_profile.ec2_instance_profile.name
  key_name &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"aws-terraform-challenge"&lt;/span&gt;
  security_groups &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="o"&gt;[&lt;/span&gt; aws_security_group.aws_terraform_challenge_security_group.name &lt;span class="o"&gt;]&lt;/span&gt;
  user_data &lt;span class="o"&gt;=&lt;/span&gt; file&lt;span class="o"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;"userdata.sh"&lt;/span&gt;&lt;span class="o"&gt;)&lt;/span&gt;

  tags &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="o"&gt;{&lt;/span&gt;
    Name &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"aws_terraform_challenge"&lt;/span&gt;
  &lt;span class="o"&gt;}&lt;/span&gt;
&lt;span class="o"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Keep in mind that the following lines in the code use values from &lt;strong&gt;data blocks&lt;/strong&gt; that retrieve this information.&lt;/p&gt;

&lt;p&gt;I'll explain each part for better understanding:&lt;/p&gt;

&lt;p&gt;The AMI ID is retrieved from the following data block, where I set the &lt;code&gt;most_recent&lt;/code&gt; argument to &lt;code&gt;true&lt;/code&gt; to ensure it returns the latest image. The owner is &lt;code&gt;amazon&lt;/code&gt;, and I filter by the &lt;code&gt;Amazon Linux 2023 AMI&lt;/code&gt; name.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;data &lt;span class="s2"&gt;"aws_ami"&lt;/span&gt; &lt;span class="s2"&gt;"linux_ami"&lt;/span&gt; &lt;span class="o"&gt;{&lt;/span&gt;
  most_recent &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nb"&gt;true
  &lt;/span&gt;owners      &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="o"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"amazon"&lt;/span&gt;&lt;span class="o"&gt;]&lt;/span&gt;

  filter &lt;span class="o"&gt;{&lt;/span&gt;
    name   &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"name"&lt;/span&gt;
    values &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="o"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"al2023-ami-*"&lt;/span&gt;&lt;span class="o"&gt;]&lt;/span&gt;
  &lt;span class="o"&gt;}&lt;/span&gt;
&lt;span class="o"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  IAM Roles and Policies
&lt;/h3&gt;

&lt;p&gt;The &lt;code&gt;aws_iam_instance_profile&lt;/code&gt; resource provides an IAM instance profile that attaches the role to the EC2 instance.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;resource &lt;span class="s2"&gt;"aws_iam_instance_profile"&lt;/span&gt; &lt;span class="s2"&gt;"ec2_instance_profile"&lt;/span&gt; &lt;span class="o"&gt;{&lt;/span&gt;
  name &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"terraform_challenge_instance_profile"&lt;/span&gt;
  role &lt;span class="o"&gt;=&lt;/span&gt; aws_iam_role.ec2_secrets_role.name
&lt;span class="o"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The Role:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;resource &lt;span class="s2"&gt;"aws_iam_role"&lt;/span&gt; &lt;span class="s2"&gt;"ec2_secrets_role"&lt;/span&gt; &lt;span class="o"&gt;{&lt;/span&gt;
  name &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"secretsrole"&lt;/span&gt;

  assume_role_policy &lt;span class="o"&gt;=&lt;/span&gt; jsonencode&lt;span class="o"&gt;({&lt;/span&gt;
    Version &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"2012-10-17"&lt;/span&gt;
    Statement &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="o"&gt;[&lt;/span&gt;
      &lt;span class="o"&gt;{&lt;/span&gt;
        Action &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"sts:AssumeRole"&lt;/span&gt;
        Effect &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"Allow"&lt;/span&gt;
        Principal &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="o"&gt;{&lt;/span&gt;
          Service &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"ec2.amazonaws.com"&lt;/span&gt;
        &lt;span class="o"&gt;}&lt;/span&gt;
      &lt;span class="o"&gt;}&lt;/span&gt;,
    &lt;span class="o"&gt;]&lt;/span&gt;
  &lt;span class="o"&gt;})&lt;/span&gt;

  tags &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="o"&gt;{&lt;/span&gt;
    Name    &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"EC2SecretsManagerRole"&lt;/span&gt;
    Purpose &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"Allow EC2 to access Secrets Manager"&lt;/span&gt;
  &lt;span class="o"&gt;}&lt;/span&gt;
&lt;span class="o"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;✨ &lt;strong&gt;PRO TIP:&lt;/strong&gt; Note that I added the tags argument for tracking billing purposes.&lt;/p&gt;

&lt;p&gt;The policies:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;resource &lt;span class="s2"&gt;"aws_iam_policy"&lt;/span&gt; &lt;span class="s2"&gt;"my_secrets_policy"&lt;/span&gt; &lt;span class="o"&gt;{&lt;/span&gt;
  name   &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"my_secrets_policy"&lt;/span&gt;
  path   &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"/"&lt;/span&gt;
  policy &lt;span class="o"&gt;=&lt;/span&gt; data.aws_iam_policy_document.my_secrets_policy.json
&lt;span class="o"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In the code above, the &lt;strong&gt;policy&lt;/strong&gt; argument references a JSON-formatted IAM policy generated by an &lt;code&gt;aws_iam_policy_document&lt;/code&gt; &lt;code&gt;data&lt;/code&gt; block. The policy is written following the &lt;strong&gt;Least Privilege principle&lt;/strong&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;data &lt;span class="s2"&gt;"aws_iam_policy_document"&lt;/span&gt; &lt;span class="s2"&gt;"my_secrets_policy"&lt;/span&gt; &lt;span class="o"&gt;{&lt;/span&gt;
 statement &lt;span class="o"&gt;{&lt;/span&gt;

    actions &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="o"&gt;[&lt;/span&gt;
      &lt;span class="s2"&gt;"kms:DescribeKey"&lt;/span&gt;,
      &lt;span class="s2"&gt;"kms:ListAliases"&lt;/span&gt;,
      &lt;span class="s2"&gt;"kms:ListKeys"&lt;/span&gt;
    &lt;span class="o"&gt;]&lt;/span&gt;

    resources &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="o"&gt;[&lt;/span&gt;
      data.aws_kms_key.by_alias.arn,
    &lt;span class="o"&gt;]&lt;/span&gt;
  &lt;span class="o"&gt;}&lt;/span&gt;

  statement &lt;span class="o"&gt;{&lt;/span&gt;
    actions &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="o"&gt;[&lt;/span&gt;
      &lt;span class="s2"&gt;"secretsmanager:GetSecretValue"&lt;/span&gt;,
      &lt;span class="s2"&gt;"secretsmanager:ListSecretVersionIds"&lt;/span&gt;
    &lt;span class="o"&gt;]&lt;/span&gt;

    resources &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="o"&gt;[&lt;/span&gt;
        &lt;span class="nb"&gt;join&lt;/span&gt;&lt;span class="o"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;""&lt;/span&gt;, &lt;span class="o"&gt;[&lt;/span&gt;aws_secretsmanager_secret.secret_terraform_challenge.id, &lt;span class="s2"&gt;"*"&lt;/span&gt;&lt;span class="o"&gt;])&lt;/span&gt;

    &lt;span class="o"&gt;]&lt;/span&gt;
  &lt;span class="o"&gt;}&lt;/span&gt;
&lt;span class="o"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The &lt;code&gt;resources&lt;/code&gt; argument in the code above references the &lt;code&gt;data.aws_kms_key&lt;/code&gt; data source, which looks up detailed information about the specified KMS key.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;data &lt;span class="s2"&gt;"aws_kms_key"&lt;/span&gt; &lt;span class="s2"&gt;"by_alias"&lt;/span&gt; &lt;span class="o"&gt;{&lt;/span&gt;
  key_id &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"alias/aws/secretsmanager"&lt;/span&gt;
&lt;span class="o"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then create the resource that attaches a managed IAM Policy to an IAM role:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;resource &lt;span class="s2"&gt;"aws_iam_role_policy_attachment"&lt;/span&gt; &lt;span class="s2"&gt;"secrets_policy"&lt;/span&gt; &lt;span class="o"&gt;{&lt;/span&gt;
  role       &lt;span class="o"&gt;=&lt;/span&gt; aws_iam_role.ec2_secrets_role.name
  policy_arn &lt;span class="o"&gt;=&lt;/span&gt; aws_iam_policy.my_secrets_policy.arn
&lt;span class="o"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Security Groups Configuration ✨
&lt;/h3&gt;

&lt;p&gt;For the security groups, I need to do the following:&lt;/p&gt;

&lt;p&gt;✅ Create a security group.&lt;/p&gt;

&lt;p&gt;✅ Associate the security group with a specific VPC via the &lt;code&gt;vpc_id&lt;/code&gt; argument, and:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;✨Allow all outbound traffic.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;✨Set up inbound rules to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Allow HTTP from my IP.&lt;/li&gt;
&lt;li&gt;Allow SSH from my IP to connect to my EC2.&lt;/li&gt;
&lt;li&gt;Allow traffic from my IP on port 3000.
&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;resource &lt;span class="s2"&gt;"aws_security_group"&lt;/span&gt; &lt;span class="s2"&gt;"aws_terraform_challenge_security_group"&lt;/span&gt; &lt;span class="o"&gt;{&lt;/span&gt;
  name        &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"aws_terraform_challenge_security_group"&lt;/span&gt;
  vpc_id      &lt;span class="o"&gt;=&lt;/span&gt; data.aws_vpc.default_vpc.id

  tags &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="o"&gt;{&lt;/span&gt;
    Name &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"aws_terraform_challenge_security_group"&lt;/span&gt;
  &lt;span class="o"&gt;}&lt;/span&gt;
&lt;span class="o"&gt;}&lt;/span&gt;

resource &lt;span class="s2"&gt;"aws_vpc_security_group_ingress_rule"&lt;/span&gt; &lt;span class="s2"&gt;"allow_http"&lt;/span&gt; &lt;span class="o"&gt;{&lt;/span&gt;
  security_group_id &lt;span class="o"&gt;=&lt;/span&gt; aws_security_group.aws_terraform_challenge_security_group.id
  cidr_ipv4         &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"YOUR_IP_HERE"&lt;/span&gt;
  from_port         &lt;span class="o"&gt;=&lt;/span&gt; 80
  ip_protocol       &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"tcp"&lt;/span&gt;
  to_port           &lt;span class="o"&gt;=&lt;/span&gt; 80
&lt;span class="o"&gt;}&lt;/span&gt;

resource &lt;span class="s2"&gt;"aws_vpc_security_group_ingress_rule"&lt;/span&gt; &lt;span class="s2"&gt;"allow_ssh_from_my_ip"&lt;/span&gt; &lt;span class="o"&gt;{&lt;/span&gt;
  security_group_id &lt;span class="o"&gt;=&lt;/span&gt; aws_security_group.aws_terraform_challenge_security_group.id
  cidr_ipv4         &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"YOUR_IP_HERE"&lt;/span&gt;
  from_port         &lt;span class="o"&gt;=&lt;/span&gt; 22
  ip_protocol       &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"tcp"&lt;/span&gt;
  to_port           &lt;span class="o"&gt;=&lt;/span&gt; 22
&lt;span class="o"&gt;}&lt;/span&gt;

resource &lt;span class="s2"&gt;"aws_vpc_security_group_ingress_rule"&lt;/span&gt; &lt;span class="s2"&gt;"allow_3000_traffic_from_my_ip"&lt;/span&gt; &lt;span class="o"&gt;{&lt;/span&gt;
  security_group_id &lt;span class="o"&gt;=&lt;/span&gt; aws_security_group.aws_terraform_challenge_security_group.id
  cidr_ipv4         &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"YOUR_IP_HERE"&lt;/span&gt;
  from_port         &lt;span class="o"&gt;=&lt;/span&gt; 3000
  ip_protocol       &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"tcp"&lt;/span&gt;
  to_port           &lt;span class="o"&gt;=&lt;/span&gt; 3000
&lt;span class="o"&gt;}&lt;/span&gt;

resource &lt;span class="s2"&gt;"aws_vpc_security_group_egress_rule"&lt;/span&gt; &lt;span class="s2"&gt;"allow_all_egress_rule"&lt;/span&gt; &lt;span class="o"&gt;{&lt;/span&gt;
  security_group_id &lt;span class="o"&gt;=&lt;/span&gt; aws_security_group.aws_terraform_challenge_security_group.id
  cidr_ipv4   &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"0.0.0.0/0"&lt;/span&gt;
  ip_protocol &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nt"&gt;-1&lt;/span&gt;
&lt;span class="o"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This security group is then referenced from the &lt;code&gt;aws_instance&lt;/code&gt; resource.&lt;/p&gt;

&lt;p&gt;The &lt;code&gt;vpc_id&lt;/code&gt; in the code above is obtained from the following data block, which gives details about a specific VPC, in this case, the default VPC.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;data &lt;span class="s2"&gt;"aws_vpc"&lt;/span&gt; &lt;span class="s2"&gt;"default_vpc"&lt;/span&gt;&lt;span class="o"&gt;{&lt;/span&gt;
  default &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nb"&gt;true&lt;/span&gt;
&lt;span class="o"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
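&lt;p&gt;For context, this is roughly how the pieces above plug into the EC2 instance. Treat it as a sketch only: the AMI ID, the instance profile's resource name, and the user-data file name are assumptions, while &lt;code&gt;app_server&lt;/code&gt; and the security group reference match the resources shown in this post:&lt;/p&gt;

```hcl
# Sketch only: wires the role, security group, and user-data script
# into the instance. The AMI ID, profile resource name, and script
# file name are placeholders, not taken from the original project.
resource "aws_instance" "app_server" {
  ami           = "ami-xxxxxxxxxxxxxxxxx" # placeholder AMI ID
  instance_type = "t2.micro"

  vpc_security_group_ids = [aws_security_group.aws_terraform_challenge_security_group.id]
  iam_instance_profile   = aws_iam_instance_profile.ec2_secrets_profile.name # profile name assumed
  user_data              = file("user_data.sh")                              # script file name assumed
}
```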



&lt;h3&gt;
  
  
  AWS Secrets Manager Setup👩🏽‍💻
&lt;/h3&gt;

&lt;p&gt;For the Secrets Manager, I created an &lt;code&gt;aws_secretsmanager_secret&lt;/code&gt;, which is a resource to manage AWS Secrets Manager secret metadata. I set the recovery window to &lt;strong&gt;0&lt;/strong&gt; days. Why 0? Because setting it to 0 forces immediate deletion without a recovery window &lt;em&gt;(for demo purposes only; in production, keep a recovery window so an accidentally deleted secret can be restored)&lt;/em&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;resource &lt;span class="s2"&gt;"aws_secretsmanager_secret"&lt;/span&gt; &lt;span class="s2"&gt;"secret_terraform_challenge"&lt;/span&gt; &lt;span class="o"&gt;{&lt;/span&gt;
  name &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"secret_terraform_challenge"&lt;/span&gt;
  recovery_window_in_days &lt;span class="o"&gt;=&lt;/span&gt; 0
&lt;span class="o"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then, set up the &lt;code&gt;aws_secretsmanager_secret_version&lt;/code&gt; to manage the version of the AWS Secrets Manager secret, including its secret value, which in this case is &lt;strong&gt;&lt;em&gt;secret&lt;/em&gt;&lt;/strong&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;resource &lt;span class="s2"&gt;"aws_secretsmanager_secret_version"&lt;/span&gt; &lt;span class="s2"&gt;"secret_terraform_challenge_version"&lt;/span&gt; &lt;span class="o"&gt;{&lt;/span&gt;
  secret_id     &lt;span class="o"&gt;=&lt;/span&gt; aws_secretsmanager_secret.secret_terraform_challenge.id
  secret_string &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"secret"&lt;/span&gt;
&lt;span class="o"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
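&lt;p&gt;Because the role created earlier grants &lt;code&gt;secretsmanager:GetSecretValue&lt;/code&gt;, anything running on the instance can read this secret using the role's temporary credentials, with no access keys required. A quick sanity check from an SSH session might look like this (assuming the AWS CLI is available on the AMI):&lt;/p&gt;

```shell
# Runs on the EC2 instance; credentials come from the instance role.
aws secretsmanager get-secret-value \
  --secret-id secret_terraform_challenge \
  --query SecretString \
  --output text
```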



&lt;h3&gt;
  
  
  User Data Script Implementation
&lt;/h3&gt;

&lt;p&gt;To bootstrap the application when the EC2 instance launches, I wrote the following user-data Bash script:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;#!/bin/bash&lt;/span&gt;
yum update &lt;span class="nt"&gt;-y&lt;/span&gt;
curl &lt;span class="nt"&gt;-fsSL&lt;/span&gt; https://rpm.nodesource.com/setup_23.x | bash -
yum &lt;span class="nb"&gt;install&lt;/span&gt; &lt;span class="nt"&gt;-y&lt;/span&gt; nodejs git
&lt;span class="nb"&gt;mkdir&lt;/span&gt; &lt;span class="nt"&gt;-p&lt;/span&gt; /home/ec2-user/app &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="nb"&gt;cd&lt;/span&gt; /home/ec2-user/app
git clone https://github.com/lalidiaz/terraform-aws-secrets-manager.git &lt;span class="nb"&gt;.&lt;/span&gt;
npm &lt;span class="nb"&gt;install&lt;/span&gt;
npm run start
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;🩵 There's something really cool about Terraform 🩵&lt;/strong&gt;: it has blocks called &lt;code&gt;output&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;💡&lt;br&gt;
Output values let you see information about your infrastructure on the command line.&lt;/p&gt;

&lt;p&gt;In this case, they will display the AWS Instance AMI and the Instance Public IP in the console. The public IP will be useful when we deploy the application, as you'll see later.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;output &lt;span class="s2"&gt;"aws_instance_ami"&lt;/span&gt; &lt;span class="o"&gt;{&lt;/span&gt;
    value &lt;span class="o"&gt;=&lt;/span&gt; aws_instance.app_server.ami 
&lt;span class="o"&gt;}&lt;/span&gt;

output &lt;span class="s2"&gt;"aws_instance_public_ip"&lt;/span&gt; &lt;span class="o"&gt;{&lt;/span&gt;
  value &lt;span class="o"&gt;=&lt;/span&gt; aws_instance.app_server.public_ip
&lt;span class="o"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Deployment Process
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Prerequisites
&lt;/h3&gt;

&lt;p&gt;Make sure you have the following prerequisites before running the project:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;An AWS Account&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;AWS CLI installed and configured with appropriate credentials&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Terraform installed on your local machine&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Your IPv4 address (to configure security group rules)&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  Step-by-Step Deployment Guide
&lt;/h3&gt;

&lt;p&gt;To deploy the Node application, please clone my repository:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;git clone https://github.com/lalidiaz/node-terraform-infrastructure/tree/main
&lt;span class="nb"&gt;cd &lt;/span&gt;node-terraform-infrastructure
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Once you are inside the project's folder, let's deploy the app with Terraform and watch the magic happen ✨🙌!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 1&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;💡&lt;br&gt;
The &lt;code&gt;terraform init&lt;/code&gt; command initializes a working directory containing Terraform configuration files.&lt;/p&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;terraform init
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Step 2&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;💡&lt;br&gt;
The &lt;code&gt;terraform plan&lt;/code&gt; command creates an execution plan, which lets you preview the changes that Terraform plans to make to your infrastructure.&lt;/p&gt;

&lt;p&gt;By default, when Terraform creates a plan it:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Reads the current state of any already-existing remote objects to make sure that the Terraform state is up-to-date.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Compares the current configuration to the prior state, noting any differences.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Proposes a set of change actions that should, if applied, make the remote objects match the configuration.&lt;br&gt;
&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;terraform plan
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Step 3&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;If everything looks correct, you can run the following command to apply the changes:&lt;/p&gt;

&lt;p&gt;💡&lt;br&gt;
The &lt;code&gt;terraform apply&lt;/code&gt; command executes the actions proposed in a Terraform plan.&lt;/p&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;terraform apply
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;And that's it, the project is deployed successfully!&lt;/strong&gt; ✨🙌🎉&lt;/p&gt;

&lt;h2&gt;
  
  
  Deployment Verification
&lt;/h2&gt;

&lt;p&gt;To verify the deployment, check the console output for the EC2 public IP. Copy it, open your browser, and navigate to &lt;code&gt;&amp;lt;aws_instance_public_ip&amp;gt;:3000&lt;/code&gt;; you should see the following screen:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1736245922050%2Ffcc3b4ec-a0f7-4ef6-b509-73be2616b38b.png%2520align%3D" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1736245922050%2Ffcc3b4ec-a0f7-4ef6-b509-73be2616b38b.png%2520align%3D" width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Cost Considerations
&lt;/h2&gt;

&lt;p&gt;Please note that while the EC2 instance is within the free tier, AWS Secrets Manager does have associated costs. Check the &lt;a href="https://aws.amazon.com/secrets-manager/pricing/" rel="noopener noreferrer"&gt;Secrets Manager pricing page&lt;/a&gt; for more details.&lt;/p&gt;

&lt;h2&gt;
  
  
  Cleanup Process
&lt;/h2&gt;

&lt;p&gt;Finally, remember to always clean up so you don’t incur any charges.&lt;/p&gt;

&lt;p&gt;💡&lt;br&gt;
The &lt;code&gt;terraform destroy&lt;/code&gt; command is a convenient way to destroy all remote objects managed by a particular Terraform configuration.&lt;/p&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;terraform destroy
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Future Improvements
&lt;/h2&gt;

&lt;p&gt;After talking with my mentor @&lt;a href="https://dev.to/marianogg9"&gt;Mariano González&lt;/a&gt;, he suggested the following improvements for the app, which will be included in the next iterations:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Create a Terraform Backend block&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Create a Module&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Add CI/CD with GitHub Actions&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Refactor the solution to enhance security&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Add a local executor to calculate my IP on the fly instead of hardcoding it&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;While this was my first experience with Terraform, I really enjoyed it and took this challenge very seriously. I spent a few hours working on this solution, and I'm quite happy with the results. Thanks to my mentor @&lt;a href="https://dev.to/marianogg9"&gt;Mariano González&lt;/a&gt; 🙏 for guiding me and taking the time to explain concepts to me.&lt;/p&gt;

&lt;p&gt;I must admit I was excited when I started working on this project. Although I know I need to keep studying, I learned a lot from this hands-on experience, which gave me the confidence to tackle future challenges!&lt;/p&gt;

&lt;p&gt;Thanks for reading! See you in the next post! 👋&lt;/p&gt;

</description>
      <category>terraform</category>
      <category>devops</category>
      <category>cloud</category>
      <category>iac</category>
    </item>
    <item>
      <title>🐳 My Docker Journey: From Zero to "It Works on My Machine" (For Real This Time 😂!)</title>
      <dc:creator>Laura</dc:creator>
      <pubDate>Tue, 07 Apr 2026 22:13:18 +0000</pubDate>
      <link>https://forem.com/lalidevops/my-docker-journey-from-zero-to-it-works-on-my-machine-for-real-this-time--57fn</link>
      <guid>https://forem.com/lalidevops/my-docker-journey-from-zero-to-it-works-on-my-machine-for-real-this-time--57fn</guid>
      <description>&lt;ol&gt;
&lt;li&gt;&lt;p&gt;The Starting Point&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The First “Aha!” Moment&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Real Challenges I Tackled&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;My Biggest Wins&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Some Tips&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Next Steps&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;To Anyone Starting Out&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  The Starting Point
&lt;/h2&gt;

&lt;p&gt;Like many, I started my DevOps bootcamp thinking "&lt;em&gt;What's the deal with containers?&lt;/em&gt;”&lt;/p&gt;

&lt;p&gt;As I continue my journey to become a DevOps Engineer, this is the fourth article about the &lt;a href="https://bootcamp.295devops.com/docker/docker" rel="noopener noreferrer"&gt;DevOps Bootcamp by Roxs&lt;/a&gt; I'm completing.&lt;/p&gt;

&lt;p&gt;Docker is a tool that packages applications with all their dependencies, ensuring they run consistently anywhere. Imagine it as creating small, portable environments (containers) that keep everything an app needs in one place, eliminating the "it works on my machine" problems 😅!&lt;/p&gt;

&lt;p&gt;I was excited to learn about Docker because it seemed like the key to solving environment issues. Running apps in isolated containers felt powerful. However, I was also intimidated because Docker has its own terminology and commands. The combination of new concepts and tools made it feel like a steep learning curve, but the benefits were too appealing to ignore.&lt;/p&gt;

&lt;h2&gt;
  
  
  The First "Aha!" Moment
&lt;/h2&gt;

&lt;p&gt;While studying the Docker section in the bootcamp, I went through the architecture, learned how to create images, publish those images, run containers from those images, and use Docker Compose.&lt;/p&gt;

&lt;p&gt;But the first &lt;strong&gt;"Aha!" Moment&lt;/strong&gt; came when I started working on the exercises and challenges from the Bootcamp, gaining hands-on experience.&lt;/p&gt;

&lt;p&gt;Alright, let me share what I've been up to in my Docker journey! 🐳&lt;/p&gt;

&lt;p&gt;First up was getting my hands dirty with MySQL containers. Not gonna lie, managing databases used to give me anxiety, but Docker made it feel like putting together LEGO blocks.&lt;/p&gt;

&lt;p&gt;The real game-changer? Running MySQL with PHPMyAdmin containers together. Seeing these two containers talk to each other was like magic ✨ (okay, maybe not magic, but definitely felt like it at first!).&lt;/p&gt;

&lt;p&gt;Then came the fun part: deploying my first PHP web app in a container. Sure, I ran into some obstacles, but that feeling when your application finally spins up in a container? Priceless! 🤩&lt;/p&gt;

&lt;p&gt;The PostgreSQL + pgAdmin setup with Docker Compose was next on my list. Let me tell you - Docker Compose is a lifesaver! Instead of typing multiple commands, it was just one &lt;code&gt;docker compose up&lt;/code&gt; and boom! 💥 Both services running together.&lt;/p&gt;

&lt;p&gt;Finally, I set up a complete PHP development environment 🙌.&lt;/p&gt;

&lt;p&gt;I would like to share the container analogy that made sense in my head 🐳:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;🚀🧑‍🚀The Spaceship Analogy&lt;/strong&gt;: I started thinking about containers like tiny spaceships. Each one needs to be completely self-sufficient, with its own environment, resources, and life support systems (dependencies). They can dock with the mothership (host system) but remain independent. Just like a spaceship needs to work the same whether it's docked at Earth or Mars, a Docker container runs the same whether it's on my laptop or in the cloud.&lt;/p&gt;

&lt;h2&gt;
  
  
  Real Challenges I Tackled
&lt;/h2&gt;

&lt;p&gt;Let me tell you about the juicy part of my Docker journey, the actual problems I had to solve:&lt;/p&gt;

&lt;p&gt;First up was getting MongoDB and Mongo Express containers to play nice together. Simple enough on paper, right? Well, I learned about container networking and environment variables. But when I finally saw Mongo Express pop up in my browser, connecting smoothly to MongoDB? It was great! 🙌&lt;/p&gt;

&lt;p&gt;The "295topics" project involved getting Node.js, Nginx, and MySQL to work together. Each time a page refreshed, it needed to add a record to MySQL and display it back. It sounds simple, but it's a different story when you're actually doing it!&lt;/p&gt;

&lt;p&gt;While the Docker basics like managing containers and images were not that hard to handle, the real test came when I had to take on the “&lt;strong&gt;Full Stack Challenge with Java, Go, and PostgreSQL&lt;/strong&gt;” challenge. Tackling this felt like I was brewing up something crazy in the lab: managing Java, Go, and PostgreSQL all in one project.&lt;/p&gt;

&lt;p&gt;At first, I was a little intimidated... I mean, Java and Go were completely new to me.&lt;/p&gt;

&lt;p&gt;The goal was to automate the entire build, test, package, and deployment process using Docker and Docker Compose.&lt;/p&gt;

&lt;p&gt;Now, the tricky part was getting these three components: the Java API (backend), the Go web app (frontend), and the PostgreSQL database, to play nice together. I had to make sure they could talk to each other seamlessly, without any problems.&lt;/p&gt;

&lt;p&gt;Just when I thought I had it all figured out, I ran into an issue with the source code.&lt;/p&gt;

&lt;p&gt;So I dug into the code, and started troubleshooting.&lt;/p&gt;

&lt;p&gt;After a few cups of coffee and some intense debugging sessions, I finally managed to solve it! But I wasn't done yet: I still had to incorporate semantic versioning and set up a bash script to automate the deployment process.&lt;/p&gt;

&lt;p&gt;Writing that deployment script felt like putting together a puzzle 🧩. The script automates building the Docker images, tagging them with versions from &lt;a href="https://git-scm.com/docs/git-describe" rel="noopener noreferrer"&gt;git describe&lt;/a&gt;, pushing them to Docker Hub, and running the &lt;code&gt;docker-compose.yaml&lt;/code&gt; file. ✅&lt;/p&gt;

&lt;p&gt;The best part? Documenting everything on my &lt;a href="https://github.com/lalidiaz/DevOps-Bootcamp/tree/main/docker/challenges/challenge_04" rel="noopener noreferrer"&gt;Github Repo&lt;/a&gt;. Not just "it works now," but the actual journey: the obstacles I faced, the solutions I found, and (most importantly) why they worked. Because let's be real, future me is going to thank past me for those notes! 📝&lt;/p&gt;

&lt;p&gt;Looking back, these weren't just tasks to check off a list. Each challenge taught me something valuable about how containers work in the real world. And that feeling when all services are running smoothly, and your Docker Compose up command works? Worth every debugging session! 🚀&lt;/p&gt;

&lt;h2&gt;
  
  
  My Biggest Wins
&lt;/h2&gt;

&lt;p&gt;Seeing my containers finally talk to each other through the custom network, and watching my API successfully store data in the MongoDB container, it was amazing 🤩.&lt;/p&gt;

&lt;p&gt;Before Docker, setting up my development environment was like building a house of cards. One wrong version of Node and everything would collapse. Then came the bootcamp's Docker module. The first time I wrote a proper Dockerfile, created a &lt;code&gt;.dockerignore&lt;/code&gt;, and saw my application running in an isolated container, it clicked🚀.&lt;/p&gt;

&lt;p&gt;It happened during our micro-services project. I was staring at my &lt;code&gt;docker-compose.yml&lt;/code&gt; file, with multiple services, volumes, and networks defined, when suddenly it all made sense. Containers weren't just isolated boxes anymore, they were building blocks. Each container had its purpose: the frontend, the backend API and the database.&lt;/p&gt;

&lt;p&gt;The moment I realized I could bring down my entire development environment with &lt;code&gt;docker compose down&lt;/code&gt; and rebuild it exactly the same way with &lt;code&gt;docker compose up --build&lt;/code&gt; was when I truly understood the power of containerization. The best part? I could confidently say "it will work on your machine too" and mean it.&lt;/p&gt;

&lt;h2&gt;
  
  
  Some Tips
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Run &lt;code&gt;docker system prune&lt;/code&gt; regularly to avoid filling your disk with unused images and containers.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Name your containers meaningfully with &lt;code&gt;container_name&lt;/code&gt; in &lt;code&gt;docker-compose.yaml&lt;/code&gt;, it makes logs much easier to follow.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;docker logs -f container_name&lt;/code&gt; is your best friend for real-time debugging. ALWAYS check the logs!&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Use &lt;code&gt;docker exec -it container_name sh&lt;/code&gt; to get inside a container and poke around.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Always check container health with &lt;code&gt;docker ps&lt;/code&gt;, the 'Up' time can tell you if containers are crash-looping.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Use &lt;code&gt;depends_on&lt;/code&gt; in &lt;code&gt;docker-compose.yaml&lt;/code&gt; to handle service startup order.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
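&lt;p&gt;A couple of these tips combined in a &lt;code&gt;docker-compose.yaml&lt;/code&gt; fragment (illustrative only; the images and service names are assumptions, not from my projects):&lt;/p&gt;

```yaml
# Illustrative fragment: container_name gives readable logs,
# depends_on controls startup order.
services:
  db:
    image: postgres:16
    container_name: demo_db
    environment:
      POSTGRES_PASSWORD: example # demo only; use secrets in real setups
  app:
    image: node:20
    container_name: demo_app
    depends_on:
      - db
```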

&lt;p&gt;And finally, the resource that actually helped me the most was the &lt;a href="https://bootcamp.295devops.com/docker/docker" rel="noopener noreferrer"&gt;Docker section&lt;/a&gt; in the DevOps Bootcamp by Roxs.&lt;/p&gt;

&lt;h2&gt;
  
  
  Next Steps
&lt;/h2&gt;

&lt;p&gt;Getting comfortable with Docker has opened up a whole new world of possibilities. What I've learned about containers isn't just for development anymore, it's about building scalable, production-ready applications. The skills I've gained in Docker and container orchestration are setting the perfect foundation for my &lt;strong&gt;Kubernetes ⚙️&lt;/strong&gt; journey. I can already imagine deploying my applications, setting up automatic scaling, and managing everything through &lt;strong&gt;Kubernetes&lt;/strong&gt;. It's like I've learned to ride a bike 🚲 with Docker, and now I'm ready to hop on a motorcycle 🏍️ with &lt;strong&gt;Kubernetes⚙️&lt;/strong&gt;! I've already started experimenting with minikube on my local machine 🙌👩🏽‍💻.&lt;/p&gt;

&lt;h2&gt;
  
  
  To Anyone Starting Out
&lt;/h2&gt;

&lt;p&gt;Here's the truth: &lt;strong&gt;&lt;em&gt;everyone struggles at first&lt;/em&gt;&lt;/strong&gt;. That moment when you're staring at terminal errors, wondering why your container won't start, or why your services can't seem to talk to each other, we've all been there. But here's the thing: &lt;em&gt;that confusion is temporary&lt;/em&gt;, and the clarity that follows is absolutely worth it.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Practical First Steps:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Start small: Like, really small: Get a single container running first. I started with a simple hello world app.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Master the basic commands: &lt;code&gt;docker build&lt;/code&gt;, &lt;code&gt;docker run&lt;/code&gt;, &lt;code&gt;docker ps&lt;/code&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Don't jump into docker-compose until you understand single containers&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;As you progress, you might feel like you're hitting a wall or getting stuck with errors. This is when having a mentor can really help. I can't stress enough how valuable it is to have someone experienced to guide you through these tough times. For me, that mentor has been @&lt;a href="https://dev.to/marianogg9"&gt;Mariano González&lt;/a&gt;. His support after I spent hours debugging was invaluable. Sometimes, all it takes is a fresh perspective or an explanation of concepts that you're struggling to understand.&lt;/p&gt;

&lt;p&gt;Mentorship isn’t just about getting answers, it's about learning how to approach problems more effectively and building confidence. @&lt;a href="https://dev.to/marianogg9"&gt;Mariano González&lt;/a&gt;’s insights and patience helped me break through some of my toughest obstacles, and having someone who genuinely cares about helping others can turn an overwhelming learning process into an enjoyable experience.&lt;/p&gt;

&lt;p&gt;So, if you can, find a mentor. They can help you avoid common pitfalls and offer advice. Don’t be afraid to ask questions or ask for help. It's all part of the learning process.&lt;/p&gt;

&lt;p&gt;That's it for today! I'll see you in the next post, which will be all about Kubernetes ⚙️&lt;/p&gt;

</description>
      <category>devopsengineerbootca</category>
      <category>devops</category>
      <category>docker</category>
      <category>dockerimage</category>
    </item>
    <item>
      <title>Bash Scripting 👩🏽‍💻</title>
      <dc:creator>Laura</dc:creator>
      <pubDate>Tue, 07 Apr 2026 22:10:18 +0000</pubDate>
      <link>https://forem.com/lalidevops/bash-scripting-4m9a</link>
      <guid>https://forem.com/lalidevops/bash-scripting-4m9a</guid>
      <description>&lt;h2&gt;
  
  
  &lt;strong&gt;Table of Contents&lt;/strong&gt;
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Why did I start learning Bash Scripting?&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;My learning process&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Challenges faced&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Interesting discoveries&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Examples from my experience&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;My humble advice and next steps&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  Why did I start learning Bash Scripting?
&lt;/h3&gt;

&lt;p&gt;I started learning Bash Scripting because I am doing the &lt;a href="https://bootcamp.295devops.com/" rel="noopener noreferrer"&gt;RoxsRoss DevOps Bootcamp&lt;/a&gt;. This is the third post about my journey to become a DevOps Engineer.&lt;/p&gt;

&lt;p&gt;You can also check out the &lt;a href="https://laurainthecloud.hashnode.dev/devps-bootcamp-by-roxs" rel="noopener noreferrer"&gt;first post&lt;/a&gt; and the &lt;a href="https://laurainthecloud.hashnode.dev/mastering-linux" rel="noopener noreferrer"&gt;second post&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;The next logical step after learning Linux is learning Bash Scripting. But why?🤔&lt;/p&gt;

&lt;p&gt;With Bash, you can &lt;strong&gt;easily automate repetitive tasks&lt;/strong&gt; like deploying applications, managing backups, or monitoring systems. This not only &lt;strong&gt;saves time&lt;/strong&gt; but also &lt;strong&gt;reduces the risk of errors&lt;/strong&gt; that can occur with manual processes.&lt;/p&gt;

&lt;p&gt;Bash allows you to &lt;strong&gt;monitor performance&lt;/strong&gt; and &lt;strong&gt;optimize&lt;/strong&gt; system usage. You can create scripts to make sure your resources are used efficiently ✅.&lt;/p&gt;

&lt;p&gt;Bash is &lt;strong&gt;excellent for logging and monitoring&lt;/strong&gt;. You can set up scripts to collect logs, track system performance, and even trigger alerts based on certain conditions, helping you maintain a reliable system.&lt;/p&gt;

&lt;p&gt;Bash scripting empowers DevOps teams to automate workflows, manage environments, and integrate tools smoothly, leading to faster software delivery and better collaboration between development and operations.&lt;/p&gt;
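&lt;p&gt;To make the automation point concrete, here is a minimal backup sketch of the kind Bash makes easy. The &lt;code&gt;demo_src&lt;/code&gt; and &lt;code&gt;backups&lt;/code&gt; paths are invented for the demo; point them at real directories in practice.&lt;/p&gt;

```shell
#!/bin/bash
# Minimal backup sketch: archive a directory under a timestamped name.
# demo_src and backups are hypothetical paths, created here so the
# sketch is runnable as-is.
set -euo pipefail

mkdir -p demo_src backups
echo "sample data" > demo_src/data.txt        # something to back up

archive="backups/backup-$(date +%Y%m%d-%H%M%S).tar.gz"
tar -czf "$archive" demo_src                  # compress the whole directory
echo "Created $archive"
```

&lt;p&gt;A script like this can then be scheduled with cron so backups happen without any manual steps.&lt;/p&gt;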

&lt;h3&gt;
  
  
  My learning process
&lt;/h3&gt;

&lt;p&gt;I must admit that I felt a bit intimidated at first, but I didn't let impostor syndrome take over. Instead, I was very excited about learning Bash👩🏽‍💻.&lt;/p&gt;

&lt;p&gt;I found it very interesting, and there were resources provided to learn and practice, such as books (&lt;a href="https://github.com/bobbyiliev/introduction-to-bash-scripting" rel="noopener noreferrer"&gt;Introduction to Bash Scripting&lt;/a&gt;), &lt;a href="https://bootcamp.295devops.com/bash-scripting/l1" rel="noopener noreferrer"&gt;theory&lt;/a&gt;, practical &lt;a href="https://bootcamp.295devops.com/bash-scripting/l12" rel="noopener noreferrer"&gt;exercises&lt;/a&gt;, and &lt;a href="https://bootcamp.295devops.com/bash-scripting/challenge-bash/challenge-01" rel="noopener noreferrer"&gt;challenges&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;I started with an introduction to Bash scripting, followed by the fundamentals:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Basic Linux commands&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Permissions and Properties&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Redirection and Pipes&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Variables and Basic Operations&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Control Structures&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Functions in Bash&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Process Management in Bash&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Text Manipulation in Bash&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Advanced Interaction with the File System in Bash&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Integration and Automation&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
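&lt;p&gt;As a quick illustration of how several of these topics compose, here is a throwaway sketch using a variable, a function, a loop, and a pipe (the names are invented for the demo):&lt;/p&gt;

```shell
#!/bin/bash
# Sketch combining Bash fundamentals: a variable, a function,
# a for loop, and a pipe into tr.

greeting="Hello"                  # variable assignment (no spaces around =)

greet() {                         # function definition
  local name="$1"                 # first argument, local to the function
  echo "$greeting, $name!"
}

for name in Ada Linus Grace; do   # control structure: for loop
  greet "$name"
done | tr '[:lower:]' '[:upper:]' # pipe the loop's output into tr

# HELLO, ADA!
# HELLO, LINUS!
# HELLO, GRACE!
```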

&lt;p&gt;I love how the &lt;a href="https://bootcamp.295devops.com/bash-scripting/l1" rel="noopener noreferrer"&gt;DevOps Bootcamp by Roxs&lt;/a&gt; is structured. It felt like it set the foundational knowledge really well.&lt;/p&gt;

&lt;p&gt;After finishing the fundamentals, I practiced with the practical cases and exercises provided in the Bootcamp. You can find them on &lt;a href="https://bootcamp.295devops.com/" rel="noopener noreferrer"&gt;the bootcamp website&lt;/a&gt; or &lt;a href="https://github.com/lalidiaz/DevOps-Bootcamp/tree/main/bash/exercises" rel="noopener noreferrer"&gt;my Github&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;As for the challenges in the Bash section, two of the three listed there are already solved in the Linux Challenges section. You can find my solutions for those &lt;a href="https://github.com/lalidiaz/DevOps-Bootcamp/tree/main/linux/challenges" rel="noopener noreferrer"&gt;here&lt;/a&gt; 🙌.&lt;/p&gt;

&lt;h3&gt;
  
  
  Challenges faced
&lt;/h3&gt;

&lt;p&gt;Since I am currently working as a Frontend Developer, I didn't find it hard to grasp some concepts like &lt;strong&gt;&lt;em&gt;Functions in Bash&lt;/em&gt;&lt;/strong&gt; or &lt;strong&gt;&lt;em&gt;Control Structures&lt;/em&gt;&lt;/strong&gt; (Loops). Although the syntax was a bit different, the concepts were still similar.&lt;/p&gt;

&lt;p&gt;Here's an example of the syntax for a &lt;strong&gt;for loop&lt;/strong&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;#!/bin/bash&lt;/span&gt;

&lt;span class="c"&gt;# Define a list of items&lt;/span&gt;
&lt;span class="nv"&gt;fruits&lt;/span&gt;&lt;span class="o"&gt;=(&lt;/span&gt;&lt;span class="s2"&gt;"apple"&lt;/span&gt; &lt;span class="s2"&gt;"banana"&lt;/span&gt; &lt;span class="s2"&gt;"cherry"&lt;/span&gt;&lt;span class="o"&gt;)&lt;/span&gt;

&lt;span class="c"&gt;# Use a for loop to iterate over the list&lt;/span&gt;
&lt;span class="k"&gt;for &lt;/span&gt;fruit &lt;span class="k"&gt;in&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="k"&gt;${&lt;/span&gt;&lt;span class="nv"&gt;fruits&lt;/span&gt;&lt;span class="p"&gt;[@]&lt;/span&gt;&lt;span class="k"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="k"&gt;do
  &lt;/span&gt;&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"I like &lt;/span&gt;&lt;span class="nv"&gt;$fruit&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;
&lt;span class="k"&gt;done&lt;/span&gt;

&lt;span class="c"&gt;# I like apple&lt;/span&gt;
&lt;span class="c"&gt;# I like banana&lt;/span&gt;
&lt;span class="c"&gt;# I like cherry&lt;/span&gt;
&lt;span class="c"&gt;# I like date&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;What was a bit tricky for me was &lt;strong&gt;&lt;em&gt;Permissions and Properties&lt;/em&gt;&lt;/strong&gt;. I hadn't worked with this before (only when I studied it in the Linux section just prior), so it was a new topic, and I don't use it daily. I believe it is more relevant to SysAdmin or DevOps roles than to Developer roles, at least in my experience. However, I must say I really enjoyed it.&lt;/p&gt;

&lt;p&gt;What I found very interesting was that there are &lt;strong&gt;two ways&lt;/strong&gt; to &lt;strong&gt;change permissions&lt;/strong&gt;:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Symbolic&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Grant read, write, and execute permissions to the user (owner)&lt;/span&gt;
&lt;span class="c"&gt;# and read and execute permissions to group and others.&lt;/span&gt;
&lt;span class="nb"&gt;chmod &lt;/span&gt;u+rwx,g+rx,o+rx file.txt
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;* `u+rwx`: Adds read, write, and execute permissions to the user (owner).

* `g+rx`: Adds read and execute permissions to the group.

* `o+rx`: Adds read and execute permissions to others.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;ol start="2"&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Numeric&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In numeric notation, each permission is represented by a number:&lt;/p&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;* `4` = read (`r`)

* `2` = write (`w`)

* `1` = execute (`x`)


You add these values together to get the desired permissions for each level (user, group, others):

* `7` = read + write + execute (4 + 2 + 1)

* `6` = read + write (4 + 2)

* `5` = read + execute (4 + 1)

* `4` = read only
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;```bash
# Set permissions to 755:
# - User (owner) has read, write, and execute permissions (7)
# - Group has read and execute permissions (5)
# - Others have read and execute permissions (5)
chmod 755 myfile.txt
```
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Both methods can achieve the same result.&lt;/p&gt;
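&lt;p&gt;You can check the equivalence yourself with &lt;code&gt;stat&lt;/code&gt;. A small sketch (the &lt;code&gt;-c %a&lt;/code&gt; flag is GNU coreutils, so this assumes Linux, e.g. Ubuntu; the file name is invented for the demo):&lt;/p&gt;

```shell
#!/bin/bash
# Show that symbolic and numeric chmod set identical permission bits.
# Assumes GNU stat (-c %a prints the octal mode), as on Ubuntu.

touch demo_file.txt

chmod 000 demo_file.txt               # known starting state
chmod u+rwx,g+rx,o+rx demo_file.txt   # symbolic form
symbolic_mode=$(stat -c %a demo_file.txt)

chmod 000 demo_file.txt
chmod 755 demo_file.txt               # numeric form
numeric_mode=$(stat -c %a demo_file.txt)

echo "symbolic: $symbolic_mode  numeric: $numeric_mode"
# symbolic: 755  numeric: 755
```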

&lt;p&gt;Fortunately, each topic includes an exercise section for practice. You can find my solutions in my &lt;a href="https://github.com/lalidiaz/DevOps-Bootcamp/tree/main/bash/exercises" rel="noopener noreferrer"&gt;Bash Exercises GitHub repository&lt;/a&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  Interesting discoveries
&lt;/h3&gt;

&lt;p&gt;I found Bash scripting fascinating because, combined with &lt;code&gt;cron&lt;/code&gt;, it allows you to schedule tasks to run automatically at specific intervals. A crontab entry looks like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="k"&gt;*&lt;/span&gt; &lt;span class="k"&gt;*&lt;/span&gt; &lt;span class="k"&gt;*&lt;/span&gt; &lt;span class="k"&gt;*&lt;/span&gt; &lt;span class="k"&gt;*&lt;/span&gt; command_to_execute
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Each &lt;code&gt;*&lt;/code&gt; represents a different time field:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Minute (0-59)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Hour (0-23)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Day of the month (1-31)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Month (1-12)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Day of the week (0-7, where both 0 and 7 represent Sunday)&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;For example, here are a few cron jobs that each run a script on a schedule:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;This runs a script at 10:00 AM every Monday.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;0 10 &lt;span class="k"&gt;*&lt;/span&gt; &lt;span class="k"&gt;*&lt;/span&gt; 1 /path/to/script.sh
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;This runs a script every Sunday at 5 PM.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;0 17 &lt;span class="k"&gt;*&lt;/span&gt; &lt;span class="k"&gt;*&lt;/span&gt; 0 /path/to/script.sh
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;This runs a script at midnight on the first day of each month.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;0 0 1 &lt;span class="k"&gt;*&lt;/span&gt; &lt;span class="k"&gt;*&lt;/span&gt; /path/to/script.sh
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;/ol&gt;
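&lt;p&gt;To internalize the field order, this throwaway sketch prints the current moment decomposed into cron's five fields using &lt;code&gt;date&lt;/code&gt; (the &lt;code&gt;%-M&lt;/code&gt;-style no-padding flags are GNU date extensions; cron itself never needs this):&lt;/p&gt;

```shell
#!/bin/bash
# Print "now" decomposed into cron's five time fields.
minute=$(date +%-M)        # 0-59
hour=$(date +%-H)          # 0-23
day_of_month=$(date +%-d)  # 1-31
month=$(date +%-m)         # 1-12
day_of_week=$(date +%w)    # 0-6, where 0 is Sunday

echo "$minute $hour $day_of_month $month $day_of_week  command_to_execute"
```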

&lt;p&gt;Another thing I found fascinating was the text manipulation capabilities. My favorite command is &lt;code&gt;grep&lt;/code&gt;, a powerful tool for searching and manipulating text patterns within files.&lt;/p&gt;

&lt;p&gt;Here are some example use cases:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;Search for a specific word in a file. For example, let’s say you want to find the word "error" in &lt;code&gt;logfile.txt&lt;/code&gt;.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;grep&lt;/span&gt; &lt;span class="s2"&gt;"error"&lt;/span&gt; logfile.txt
&lt;/code&gt;&lt;/pre&gt;


&lt;p&gt;This will output all lines in &lt;code&gt;logfile.txt&lt;/code&gt; that contain the word "error".&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Use the &lt;code&gt;-i&lt;/code&gt; option to ignore case sensitivity. For example, if you want to find "error" regardless of case:&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;grep&lt;/span&gt; &lt;span class="nt"&gt;-i&lt;/span&gt; &lt;span class="s2"&gt;"error"&lt;/span&gt; logfile.txt
&lt;/code&gt;&lt;/pre&gt;


&lt;p&gt;This will match "Error", "ERROR", "error", etc.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;To search for a pattern in all files within a directory and its subdirectories, use the &lt;code&gt;-r&lt;/code&gt; option:&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;grep&lt;/span&gt; &lt;span class="nt"&gt;-r&lt;/span&gt; &lt;span class="s2"&gt;"error"&lt;/span&gt; /path/to/directory
&lt;/code&gt;&lt;/pre&gt;


&lt;p&gt;This will look for "error" in all files within &lt;code&gt;/path/to/directory&lt;/code&gt;.&lt;/p&gt;
&lt;/li&gt;
&lt;/ol&gt;
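&lt;p&gt;The three variants can be tried together on a throwaway log file (the file name and contents are invented for the demo):&lt;/p&gt;

```shell
#!/bin/bash
# Demo of the three grep variants against a small, made-up log file.
mkdir -p logs
printf 'INFO: started\nError: disk full\nerror: retrying\nINFO: done\n' > logs/logfile.txt

grep "error" logs/logfile.txt     # case-sensitive: matches only "error: retrying"
grep -i "error" logs/logfile.txt  # case-insensitive: also matches "Error: disk full"
grep -ri "error" logs             # recursive: searches every file under logs/
```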

&lt;h3&gt;
  
  
  Examples from my experience
&lt;/h3&gt;

&lt;p&gt;I would like to share my solution to &lt;a href="https://github.com/lalidiaz/DevOps-Bootcamp/tree/main/bash/challenges/challenge_01" rel="noopener noreferrer"&gt;challenge_01&lt;/a&gt;. This challenge was based on a real-world scenario: “Design an Automated Bash Script for Building a Python Application Using the Flask Framework”.&lt;/p&gt;

&lt;p&gt;To solve this, I created a bash script called &lt;strong&gt;automation.sh&lt;/strong&gt; that does the following:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Creates a temporary folder named &lt;em&gt;tempdir&lt;/em&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Copies the specified folders and contents into &lt;em&gt;tempdir&lt;/em&gt; as outlined in the challenge.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Creates a Dockerfile inside the &lt;em&gt;tempdir&lt;/em&gt; using “&lt;strong&gt;cat &amp;lt;&amp;lt; EOF &amp;gt; Dockerfile&lt;/strong&gt;.” I chose this method because it allows for multi-line text, which was necessary for the Dockerfile.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Builds the image using the Dockerfile.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Runs the image!&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Final solution&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;#!/bin/bash&lt;/span&gt;

&lt;span class="c"&gt;# Create a temporary folder named tempdir and its subdirectories tempdir/templates and tempdir/static.&lt;/span&gt;
&lt;span class="nb"&gt;mkdir&lt;/span&gt; &lt;span class="nt"&gt;-p&lt;/span&gt; tempdir/templates tempdir/static

&lt;span class="c"&gt;# Confirm the directory structure has been created&lt;/span&gt;
&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Temporary directories created: tempdir, tempdir/templates, tempdir/static"&lt;/span&gt;

&lt;span class="c"&gt;# Inside the tempdir folder, copy the static/ folder, templates/ folder, and the application desafio2_app.py.&lt;/span&gt;
&lt;span class="nb"&gt;cp&lt;/span&gt; &lt;span class="nt"&gt;-r&lt;/span&gt; static tempdir
&lt;span class="nb"&gt;cp&lt;/span&gt; &lt;span class="nt"&gt;-r&lt;/span&gt; templates tempdir
&lt;span class="nb"&gt;cp &lt;/span&gt;desafio2_app.py tempdir

&lt;span class="c"&gt;# Construct a Dockerfile, which will be located inside the temporary folder tempdir.&lt;/span&gt;
&lt;span class="nb"&gt;cat&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&amp;lt;&lt;/span&gt; &lt;span class="no"&gt;EOF&lt;/span&gt;&lt;span class="sh"&gt; &amp;gt; tempdir/Dockerfile
    FROM python
    RUN pip install flask
    COPY ./static /home/myapp/static/
    COPY ./templates /home/myapp/templates/
    COPY desafio2_app.py /home/myapp/
    EXPOSE 5050
    CMD ["python3", "/home/myapp/desafio2_app.py"]
&lt;/span&gt;&lt;span class="no"&gt;EOF

&lt;/span&gt;&lt;span class="c"&gt;# Build the image&lt;/span&gt;
docker build &lt;span class="nt"&gt;-t&lt;/span&gt; desafio2_app ./tempdir

&lt;span class="c"&gt;# Run the app&lt;/span&gt;
docker run &lt;span class="nt"&gt;-t&lt;/span&gt; &lt;span class="nt"&gt;-d&lt;/span&gt; &lt;span class="nt"&gt;-p&lt;/span&gt; 5050:5050 &lt;span class="nt"&gt;--name&lt;/span&gt; nombreapprunning desafio2_app
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can also find this and other solutions on my 👉👉👉👉 &lt;a href="https://github.com/lalidiaz/DevOps-Bootcamp/blob/main/bash/challenges/challenge_01/challenge_01.md" rel="noopener noreferrer"&gt;Github Repository&lt;/a&gt; 👈👈👈👈.&lt;/p&gt;

&lt;h3&gt;
  
  
  My humble advice and next steps
&lt;/h3&gt;

&lt;p&gt;My advice is to practice regularly. If you are following the &lt;a href="https://bootcamp.295devops.com/" rel="noopener noreferrer"&gt;DevOps Bootcamp by Roxs&lt;/a&gt;, I strongly recommend completing all the exercises and challenges. These are excellent examples of real-world issues to tackle, and I love that ❤️!&lt;/p&gt;

&lt;p&gt;Be patient with yourself and always remember that &lt;strong&gt;perseverance leads to success 💪.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Make the most of blogs like &lt;em&gt;Stack Overflow&lt;/em&gt; and &lt;em&gt;Medium&lt;/em&gt; to see how others solve tasks similar to what you need to do. Learn from those with more experience.&lt;/p&gt;

&lt;p&gt;And last but not least, there's a command called &lt;code&gt;man&lt;/code&gt;. This is a powerful tool that displays the manual pages for different commands. So, whenever you're unsure about a command, you can check the &lt;code&gt;man&lt;/code&gt; page:&lt;/p&gt;

&lt;p&gt;For example, to view the manual page for the &lt;code&gt;grep&lt;/code&gt; command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;man &lt;span class="nb"&gt;grep&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The output looks like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;GREP&lt;span class="o"&gt;(&lt;/span&gt;1&lt;span class="o"&gt;)&lt;/span&gt;                     General Commands Manual                    GREP&lt;span class="o"&gt;(&lt;/span&gt;1&lt;span class="o"&gt;)&lt;/span&gt;

NAME
     &lt;span class="nb"&gt;grep&lt;/span&gt;, egrep, fgrep, rgrep, bzgrep, bzegrep, bzfgrep, zgrep, zegrep,
     zfgrep – file pattern searcher

SYNOPSIS
     &lt;span class="nb"&gt;grep&lt;/span&gt; &lt;span class="o"&gt;[&lt;/span&gt;&lt;span class="nt"&gt;-abcdDEFGHhIiJLlMmnOopqRSsUVvwXxZz&lt;/span&gt;&lt;span class="o"&gt;]&lt;/span&gt; &lt;span class="o"&gt;[&lt;/span&gt;&lt;span class="nt"&gt;-A&lt;/span&gt; num] &lt;span class="o"&gt;[&lt;/span&gt;&lt;span class="nt"&gt;-B&lt;/span&gt; num] &lt;span class="o"&gt;[&lt;/span&gt;&lt;span class="nt"&gt;-C&lt;/span&gt; num]
          &lt;span class="o"&gt;[&lt;/span&gt;&lt;span class="nt"&gt;-e&lt;/span&gt; pattern] &lt;span class="o"&gt;[&lt;/span&gt;&lt;span class="nt"&gt;-f&lt;/span&gt; file] &lt;span class="o"&gt;[&lt;/span&gt;&lt;span class="nt"&gt;--binary-files&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;value] &lt;span class="o"&gt;[&lt;/span&gt;&lt;span class="nt"&gt;--color&lt;/span&gt;&lt;span class="o"&gt;[=&lt;/span&gt;when]]
          &lt;span class="o"&gt;[&lt;/span&gt;&lt;span class="nt"&gt;--colour&lt;/span&gt;&lt;span class="o"&gt;[=&lt;/span&gt;when]] &lt;span class="o"&gt;[&lt;/span&gt;&lt;span class="nt"&gt;--context&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;num] &lt;span class="o"&gt;[&lt;/span&gt;&lt;span class="nt"&gt;--label&lt;/span&gt;&lt;span class="o"&gt;]&lt;/span&gt; &lt;span class="o"&gt;[&lt;/span&gt;&lt;span class="nt"&gt;--line-buffered&lt;/span&gt;&lt;span class="o"&gt;]&lt;/span&gt;
          &lt;span class="o"&gt;[&lt;/span&gt;&lt;span class="nt"&gt;--null&lt;/span&gt;&lt;span class="o"&gt;]&lt;/span&gt; &lt;span class="o"&gt;[&lt;/span&gt;pattern] &lt;span class="o"&gt;[&lt;/span&gt;file ...]

DESCRIPTION
     The &lt;span class="nb"&gt;grep &lt;/span&gt;utility searches any given input files, selecting lines that
     match one or more patterns.  By default, a pattern matches an input line
     &lt;span class="k"&gt;if &lt;/span&gt;the regular expression &lt;span class="o"&gt;(&lt;/span&gt;RE&lt;span class="o"&gt;)&lt;/span&gt; &lt;span class="k"&gt;in &lt;/span&gt;the pattern matches the input line
     without its trailing newline.  An empty expression matches every line.
     Each input line that matches at least one of the patterns is written to
     the standard output.
     ....
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;And that's all about Bash Scripting!&lt;/p&gt;

&lt;p&gt;While Bash helped us automate our workflow, Docker is about to transform how we think about development environments entirely. Think of it as taking our Bash superpowers and wrapping them in a portable, shareable container that works exactly the same way everywhere. No more "but it works on my machine" drama!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Next steps... Docker 🐳&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;See you in the next post, where we'll start our container journey together! 🚀&lt;/p&gt;

</description>
      <category>learnigndevops</category>
      <category>bashscript</category>
      <category>bashscripting</category>
      <category>devops</category>
    </item>
    <item>
      <title>Mastering Linux</title>
      <dc:creator>Laura</dc:creator>
      <pubDate>Tue, 07 Apr 2026 22:04:47 +0000</pubDate>
      <link>https://forem.com/lalidevops/mastering-linux-39jb</link>
      <guid>https://forem.com/lalidevops/mastering-linux-39jb</guid>
      <description>&lt;h2&gt;
  
  
  Table of Contents
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Setting the Stage for Linux ⚙️&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Learning process 📚&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Challenges faced 🥷&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Interesting Discoveries ✨&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Examples from my experience 👩🏽‍💻&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The impact Linux had on me 🔎&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Advice 🎁&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  Setting the Stage for Linux ⚙️
&lt;/h3&gt;

&lt;p&gt;I've always been amazed by open-source projects. These projects help people solve various problems and allow communities to contribute and come together for a great cause.&lt;/p&gt;

&lt;p&gt;I was motivated to learn Linux because I understand its benefits. Most cloud infrastructure, servers, and containers run on Linux. It supports powerful shell scripting languages like Bash, which are essential for automating routine tasks such as system monitoring, deployment pipelines, and server management.&lt;/p&gt;

&lt;p&gt;For DevOps, managing permissions, user roles, and secure environments is essential, and Linux makes this much easier with its permission and user management system.&lt;/p&gt;

&lt;p&gt;As I shared in &lt;a href="https://laurainthecloud.hashnode.dev/devps-bootcamp-by-roxs" rel="noopener noreferrer"&gt;my previous post&lt;/a&gt; (check out the 🚨🚨🚨&lt;strong&gt;UPDATE&lt;/strong&gt;:🚨🚨🚨 section), I had already downloaded virtualization software and installed Ubuntu on it, so I was ready to start.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Why did I choose Ubuntu?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Because it’s user-friendly, has large community support, offers regular Long-Term Support (LTS) releases, makes many applications easy to install, and is known for its stability and efficiency in handling high loads.&lt;/p&gt;

&lt;h3&gt;
  
  
  Learning process 📚
&lt;/h3&gt;

&lt;p&gt;I began learning Linux in September 2023 by completing the &lt;a href="https://courses.edx.org/certificates/2be3e8d737214a94b247d459d095d6ce" rel="noopener noreferrer"&gt;Introduction to Linux&lt;/a&gt; course, earning a certification from The Linux Foundation.&lt;/p&gt;

&lt;p&gt;But my journey did not end there. As part of the &lt;a href="https://bootcamp.295devops.com/linux/linux-intro/" rel="noopener noreferrer"&gt;DevOps Bootcamp By Roxs&lt;/a&gt;, I continue to learn and practice Linux.&lt;/p&gt;

&lt;p&gt;After learning about the different distributions and deciding which one to use, I began refreshing my memory on basic commands, the curl command, and bash scripting.&lt;/p&gt;

&lt;p&gt;The Bootcamp includes a series of &lt;a href="https://bootcamp.295devops.com/Linux/ejemplos-linux/ejemplo1" rel="noopener noreferrer"&gt;examples&lt;/a&gt; and &lt;a href="https://bootcamp.295devops.com/Linux/challenge-linux/challenge-02" rel="noopener noreferrer"&gt;challenges&lt;/a&gt; to solve.&lt;/p&gt;

&lt;p&gt;You can find the solutions to the examples and my solutions to the challenges here:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://github.com/lalidiaz/DevOps-Bootcamp/tree/main/linux/exercises" rel="noopener noreferrer"&gt;My solutions to the examples section&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://github.com/lalidiaz/DevOps-Bootcamp/tree/main/linux/challenges" rel="noopener noreferrer"&gt;My solutions to the challenges section&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Even though I am following the course structure and find the content amazing, if you want to explore further, I highly recommend the &lt;a href="https://www.edx.org/learn/linux/the-linux-foundation-introduction-to-linux?index=product&amp;amp;objectID=course-5a631d1c-cb20-4cfc-9b49-1cc9c8fc981e&amp;amp;webview=false&amp;amp;campaign=Introduction+to+Linux&amp;amp;source=edX&amp;amp;product_category=course&amp;amp;placement_url=https%3A%2F%2Fwww.edx.org%2Flearn%2Flinux" rel="noopener noreferrer"&gt;Introduction to Linux&lt;/a&gt; course from The Linux Foundation.&lt;/p&gt;

&lt;h3&gt;
  
  
  Challenges Faced 🥷
&lt;/h3&gt;

&lt;p&gt;At first, I found it a bit challenging to remember all the commands. To overcome this, I practiced and solved every example and challenge in the Bootcamp section. I also found it very helpful to use this &lt;a href="https://bootcamp.295devops.com/Linux/linux-cli/cheatsheet" rel="noopener noreferrer"&gt;Linux Cheat Sheet&lt;/a&gt; that Roxs put together.&lt;/p&gt;

&lt;p&gt;I also asked for feedback from senior DevOps engineers, read opinions from others on the same journey as me on forums, Reddit, Quora, and blog posts.&lt;/p&gt;

&lt;h3&gt;
  
  
  Interesting Discoveries ✨
&lt;/h3&gt;

&lt;p&gt;Among all the things about Linux, what I liked most was learning how to manage and monitor processes. I enjoyed seeing real-time information about resource usage and using the &lt;code&gt;crontab&lt;/code&gt; command. Being able to schedule tasks with cron is an amazing benefit.&lt;/p&gt;

&lt;h3&gt;
  
  
  Examples from my experience 👩🏽‍💻
&lt;/h3&gt;

&lt;p&gt;I'd like to share some real-world examples: I managed to solve all three challenges from the Linux section of the Bootcamp.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/lalidiaz/DevOps-Bootcamp/blob/main/linux/challenges/challenge_01.md" rel="noopener noreferrer"&gt;The first challenge&lt;/a&gt; involved deploying a Flask application called "Book Library" using Nginx as a reverse proxy and Gunicorn as the WSGI server.&lt;/p&gt;

&lt;p&gt;Nginx as a reverse proxy means Nginx acts as an intermediary server that forwards client requests to one or more backend servers, while Gunicorn (Green Unicorn) as the WSGI (Web Server Gateway Interface) server acts as a bridge between my Flask web app and Nginx.&lt;/p&gt;

&lt;p&gt;I was able to solve this challenge by practicing with the examples before tackling the actual challenges, which definitely helped me gain the necessary skills.&lt;/p&gt;

&lt;p&gt;In the repository, I added a &lt;strong&gt;DEBUGGING&lt;/strong&gt; section to describe a problem I encountered during the challenge and how I debugged it to successfully solve the issue.&lt;/p&gt;

&lt;p&gt;For &lt;a href="https://github.com/lalidiaz/DevOps-Bootcamp/blob/main/linux/challenges/challenge_02.md" rel="noopener noreferrer"&gt;the second challenge&lt;/a&gt; I had to deploy both the frontend and several backend services using PM2 to manage the processes. PM2 is a daemon process manager that will help manage and keep my application online 24/7.&lt;/p&gt;

&lt;p&gt;After researching and reviewing the &lt;a href="https://pm2.keymetrics.io/docs/usage/application-declaration/" rel="noopener noreferrer"&gt;documentation&lt;/a&gt;, I realized I needed to create a configuration file to manage multiple applications with PM2. In &lt;a href="https://pm2.keymetrics.io/docs/usage/application-declaration/" rel="noopener noreferrer"&gt;this section&lt;/a&gt; of the repository, I explain how I generated the configuration file to accomplish this.&lt;/p&gt;

&lt;p&gt;And finally, the &lt;a href="https://github.com/lalidiaz/DevOps-Bootcamp/blob/main/linux/challenges/challenge_03.md" rel="noopener noreferrer"&gt;third challenge&lt;/a&gt; was more like exercises related to Linux commands. I really enjoyed this one!&lt;/p&gt;

&lt;p&gt;It's important to mention that whenever I faced challenges or something didn't work as expected, I documented it in the repository.&lt;/p&gt;

&lt;h3&gt;
  
  
  The impact Linux had on me 🔎
&lt;/h3&gt;

&lt;p&gt;I feel very optimistic and enthusiastic about learning Linux. I see it as an important milestone that will help me build a foundation for more knowledge and everything that comes next.&lt;/p&gt;

&lt;h3&gt;
  
  
  Advice 🎁
&lt;/h3&gt;

&lt;p&gt;I strongly recommend the &lt;a href="https://bootcamp.295devops.com/Linux/linux-intro" rel="noopener noreferrer"&gt;DevOps Bootcamp By Roxs&lt;/a&gt;. The bootcamp is well-structured and very comprehensive.&lt;/p&gt;

&lt;p&gt;The section begins with an introduction to Linux, covering topics like "What is Linux?", its architecture, history, and distributions. After that, there is a thorough analysis of different Linux distributions and reviews for each. Finally, it focuses on the Linux command line interface (CLI).&lt;/p&gt;

&lt;p&gt;I also encourage you to watch the &lt;a href="https://www.youtube.com/watch?v=ciJKo31TFw0&amp;amp;ab_channel=295Devops" rel="noopener noreferrer"&gt;lesson about Linux&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;And finally, &lt;strong&gt;as a personal note&lt;/strong&gt;: &lt;em&gt;DON'T let impostor syndrome take over. Learning new things isn't easy, so being patient with yourself is crucial. Every day is a win if you learn something new.&lt;/em&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Next Steps: Bash Scripting… 👩🏽‍💻🥷
&lt;/h3&gt;

</description>
      <category>linux</category>
      <category>ubuntu</category>
      <category>devops</category>
      <category>devopsarticles</category>
    </item>
    <item>
      <title>DevOps Bootcamp by Roxs 2024 ✨</title>
      <dc:creator>Laura</dc:creator>
      <pubDate>Tue, 07 Apr 2026 22:01:24 +0000</pubDate>
      <link>https://forem.com/lalidevops/devops-bootcamp-by-roxs-2024-g3n</link>
      <guid>https://forem.com/lalidevops/devops-bootcamp-by-roxs-2024-g3n</guid>
      <description>&lt;h2&gt;
  
  
  Table of Contents
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Why did I decide to start a DevOps Bootcamp?&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;About the DevOps Bootcamp by Roxs&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;My Goal&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Introduction to Lesson 1: Mastering Linux&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;First challenges faced&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Next Steps&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Conclusion 🙌&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  Why did I decide to start a DevOps Bootcamp?
&lt;/h3&gt;

&lt;p&gt;After completing my AWS Cloud Practitioner Certification, I was studying for the AWS Developer Associate Certification as part of my path to becoming a Cloud DevOps Engineer (found &lt;a href="https://d1.awsstatic.com/training-and-certification/docs/AWS_certification_paths.pdf" rel="noopener noreferrer"&gt;here&lt;/a&gt;). However, I realized that I wanted to gain hands-on experience while learning, so I began researching courses and bootcamps to learn more about DevOps in general. I think this is the right move, as it will give me theoretical lessons and the chance to practice what I’ve learned.&lt;/p&gt;

&lt;p&gt;Luckily, I came across the 5th edition of this &lt;a href="https://bootcamp.295devops.com/" rel="noopener noreferrer"&gt;DevOps Bootcamp by Roxs&lt;/a&gt; 👀😀✨☁️&lt;/p&gt;

&lt;h3&gt;
  
  
  About the DevOps Bootcamp by Roxs
&lt;/h3&gt;

&lt;p&gt;I was super happy when I found out about this Bootcamp. It’s created by &lt;a href="https://x.com/roxsross" rel="noopener noreferrer"&gt;RoxRoss&lt;/a&gt; (Rossana Suarez), a Software Engineer, DevOps expert, GitLab HERO, AWS HERO, and Docker Captain 🐳, so she has a lot of knowledge to share!&lt;/p&gt;

&lt;p&gt;The bootcamp takes place on her &lt;a href="https://www.youtube.com/watch?v=BztgShcuD1I&amp;amp;list=PLNkefP1xaOeyOpFkp5b1udSmgxjqyBhPf" rel="noopener noreferrer"&gt;YouTube channel&lt;/a&gt;, with live lessons that you can re-watch later if you couldn’t join live. This is perfect for me because I live on the other side of the world and can follow along asynchronously.&lt;/p&gt;

&lt;p&gt;She also put together &lt;a href="https://bootcamp.295devops.com/" rel="noopener noreferrer"&gt;this&lt;/a&gt; &lt;strong&gt;AMAZING&lt;/strong&gt; website for the bootcamp, which includes exercises, challenges, and information about each topic. It’s easy to navigate and very clear.&lt;/p&gt;

&lt;p&gt;The first video was an introduction to DevOps in general, and Lesson 1 is all about Linux.&lt;/p&gt;

&lt;p&gt;What I also love about this bootcamp is that she created a Discord space for the community to share their successes and help each other. This is amazing because it promotes a sense of community, something I find SUPER cool in tech.&lt;/p&gt;

&lt;p&gt;This bootcamp dives into different topics like Linux, Bash Scripting, Docker, and DevOps tools, among others. It also includes some books! 📚&lt;/p&gt;

&lt;p&gt;I am super excited and can’t wait to continue watching the lessons!&lt;/p&gt;

&lt;h3&gt;
  
  
  My Goal 💪
&lt;/h3&gt;

&lt;p&gt;As I shared in my previous blog posts, my goal is to become a DevOps Engineer. This step marks an important milestone for me, as it will not only equip me with the tools but also give me more insights into what it would be like to work as a DevOps Engineer in the future.&lt;/p&gt;

&lt;p&gt;I love learning new things and am curious, so this bootcamp got me very excited and happy.&lt;/p&gt;

&lt;h3&gt;
  
  
  Introduction to Lesson 1: Mastering Linux
&lt;/h3&gt;

&lt;p&gt;Even though I took a &lt;a href="https://courses.edx.org/certificates/2be3e8d737214a94b247d459d095d6ce" rel="noopener noreferrer"&gt;Linux Introduction Course&lt;/a&gt; with the Linux Foundation a couple of months ago, I didn’t remember much, so I decided to start from the basics.&lt;/p&gt;

&lt;p&gt;I took detailed notes in my Notion about Linux commands and practiced in the &lt;a href="https://killercoda.com/" rel="noopener noreferrer"&gt;playground&lt;/a&gt; suggested by Roxs.&lt;/p&gt;

&lt;p&gt;Moving forward, I’ve set up this &lt;a href="https://github.com/lalidiaz/DevOps-Bootcamp" rel="noopener noreferrer"&gt;GitHub repo&lt;/a&gt; to track my progress and to have something to look back on whenever I need a mood boost (this was advice from my mentor &lt;a href="https://dev.to/marianogg9"&gt;Mariano González&lt;/a&gt;, thank you, Mariano!). Sometimes it’s hard to keep up when learning something new; it’s challenging and requires a lot of patience, so I thought I’d share this tip in case it helps someone else on the same path.&lt;/p&gt;

&lt;p&gt;So, I set up the repo. What’s next? Well, I started working on the &lt;a href="https://bootcamp.295devops.com/linux/ejemplos-linux/ejemplo1/" rel="noopener noreferrer"&gt;Examples section&lt;/a&gt; under the Linux topic. At first, I was practicing with the &lt;a href="https://killercoda.com/playgrounds/scenario/ubuntu" rel="noopener noreferrer"&gt;Killercoda Playground&lt;/a&gt;, but Rox mentioned that "If you want to go further and challenge yourself, you can install Linux on your computer," so that’s what I chose to do.&lt;/p&gt;

&lt;h3&gt;
  
  
  First challenges faced
&lt;/h3&gt;

&lt;p&gt;I wanted to install Linux on my computer, but I had to do some research on how to do that because I have a Mac M2, and it was not as simple as just downloading it.&lt;/p&gt;

&lt;p&gt;I heard about VirtualBox in the lesson, and after asking my mentor &lt;a href="https://dev.to/marianogg9"&gt;Mariano González&lt;/a&gt; a few questions, I knew it was the tool to download.&lt;/p&gt;

&lt;p&gt;In the bootcamp, there’s a section about Linux distros covering their characteristics, how beginner-friendly they are (important for beginners like me 😀), and other features. I decided to go with Linux Mint 🌱 because it’s beginner-friendly.&lt;/p&gt;

&lt;p&gt;So, I installed VirtualBox and downloaded Linux Mint. I couldn’t simply download the latest VirtualBox; I had to go through the old builds and find the one that supported the M2, which is &lt;a href="https://www.virtualbox.org/wiki/Changelog-7.0#v8" rel="noopener noreferrer"&gt;VirtualBox 7.0.8 (released April 18, 2023)&lt;/a&gt;, more specifically the &lt;strong&gt;Developer preview for macOS / Arm64 (M1/M2) hosts&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Okay, so VirtualBox is downloaded ✅, next step: download Linux Mint. I headed over to the official website and hit download!&lt;/p&gt;

&lt;p&gt;🚨🚨🚨&lt;strong&gt;UPDATE:&lt;/strong&gt;🚨🚨🚨&lt;/p&gt;

&lt;p&gt;However, I discovered that VirtualBox doesn’t work with Linux Mint. After researching the issue and reading numerous Stack Overflow posts and Linux blogs, I came across this &lt;a href="https://www.labsmac.es/instalar-ubuntu-arm-en-un-mac-apple-silicon-m1-y-m2/" rel="noopener noreferrer"&gt;post&lt;/a&gt; and &lt;a href="https://www.youtube.com/watch?v=zb9lyKYvA0w" rel="noopener noreferrer"&gt;tutorial&lt;/a&gt;, which provided the solution. Although the post and video are in Spanish, both are very clear and easy to follow. If you run into any issues, please let me know; I’m happy to help! 🙂&lt;/p&gt;

&lt;p&gt;After downloading the virtualization software, I encountered one last issue but successfully solved it after further research, so everything is good to go 🥷🎉!&lt;/p&gt;

&lt;h3&gt;
  
  
  Next steps
&lt;/h3&gt;

&lt;p&gt;So, next steps: Practice, practice, practice!&lt;/p&gt;

&lt;p&gt;Once I have the Linux environment set up, I’ll start with the Examples section followed by the Challenges section. I’m super excited and curious about what’s next!&lt;/p&gt;

&lt;h3&gt;
  
  
  Conclusion
&lt;/h3&gt;

&lt;p&gt;In this first part, I faced some challenges and focused on practicing Linux and installing the necessary programs.&lt;/p&gt;

&lt;p&gt;I am super happy and excited about this &lt;a href="https://bootcamp.295devops.com/" rel="noopener noreferrer"&gt;DevOps Bootcamp by RoxRoss&lt;/a&gt;. I really love the fact that senior engineers are sharing their knowledge and bringing equal opportunities to everyone. For those who can’t afford a university degree, this is an extraordinary opportunity. I hope one day I can be a mentor to someone else and help them succeed.&lt;/p&gt;

&lt;p&gt;That’s it! See you in the next post 🚀&lt;/p&gt;

</description>
      <category>beginnerdevops</category>
      <category>juniordevops</category>
      <category>devops</category>
      <category>linux</category>
    </item>
  </channel>
</rss>
