<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Chinedu Oji</title>
    <description>The latest articles on Forem by Chinedu Oji (@chxnedu).</description>
    <link>https://forem.com/chxnedu</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1362860%2F325cb33e-f5de-47f0-a784-5de102a70d3b.jpg</url>
      <title>Forem: Chinedu Oji</title>
      <link>https://forem.com/chxnedu</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/chxnedu"/>
    <language>en</language>
    <item>
      <title>Authenticating GitHub Actions to AWS using IAM Roles</title>
      <dc:creator>Chinedu Oji</dc:creator>
      <pubDate>Tue, 03 Feb 2026 23:50:17 +0000</pubDate>
      <link>https://forem.com/chxnedu/authenticating-github-actions-to-aws-using-iam-roles-47hg</link>
      <guid>https://forem.com/chxnedu/authenticating-github-actions-to-aws-using-iam-roles-47hg</guid>
      <description>&lt;p&gt;We've all been there: creating AWS access keys for authentication, worrying about keeping them safe, and trying to remember to rotate them periodically.&lt;br&gt;
But do we really need to use long-lived access keys for every situation? For GitHub Actions, the answer is no.&lt;br&gt;
In this article, you'll learn how to authenticate a GitHub Actions workflow to AWS using &lt;strong&gt;IAM roles and OpenID Connect (OIDC)&lt;/strong&gt;. This approach lets you eliminate access keys and avoid manually rotating them.&lt;/p&gt;
&lt;h2&gt;
  
  
  ✅Prerequisites
&lt;/h2&gt;

&lt;p&gt;You need the following:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;An AWS account with sufficient IAM permissions.&lt;/li&gt;
&lt;li&gt;A GitHub repository.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;
  
  
  🔐Step 1: Create an OpenID Connect Provider in your AWS account
&lt;/h2&gt;

&lt;p&gt;An IAM identity provider (IdP) enables AWS to trust identities that originate outside AWS. In this step, you create an OpenID Connect (OIDC) provider that allows GitHub Actions to request temporary AWS credentials.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;strong&gt;Go to the IAM Console&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;Click &lt;strong&gt;Identity Providers&lt;/strong&gt; in the left navigation menu&lt;/li&gt;
&lt;li&gt;Click &lt;strong&gt;Add Provider&lt;/strong&gt; and select &lt;strong&gt;OpenID Connect&lt;/strong&gt; as the provider type&lt;/li&gt;
&lt;li&gt;For &lt;strong&gt;Provider URL&lt;/strong&gt;, enter &lt;code&gt;https://token.actions.githubusercontent.com&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;For &lt;strong&gt;Audience&lt;/strong&gt;, enter &lt;code&gt;sts.amazonaws.com&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Click &lt;strong&gt;Add provider&lt;/strong&gt; to create the Identity Provider&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flyti5p6evyr2agmflws0.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flyti5p6evyr2agmflws0.jpg" alt="Add Identity Provider Dashboard" width="800" height="396"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  🧩Step 2: Create an IAM role
&lt;/h2&gt;

&lt;p&gt;The IAM role defines what GitHub Actions can access in your AWS account. You will also scope the role's trust policy so that only a specific GitHub organisation, repository, and branch can assume the role.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Select the Identity Provider you just created&lt;/li&gt;
&lt;li&gt;Click the &lt;strong&gt;Assign Role&lt;/strong&gt; button and choose &lt;strong&gt;Create a new role&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;For the Trusted entity type, &lt;strong&gt;Web Identity&lt;/strong&gt; is pre-selected, and the &lt;strong&gt;Identity provider&lt;/strong&gt; field is populated with the IdP you just created&lt;/li&gt;
&lt;li&gt;In the &lt;strong&gt;Audience&lt;/strong&gt; list, select &lt;code&gt;sts.amazonaws.com&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Fill in the GitHub Organisation, Repository, and Branch according to your needs and click &lt;strong&gt;Next&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Skip the permissions step for now (you will add permissions after creating the role) and click &lt;strong&gt;Next&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;On the Review page, enter a role name such as &lt;code&gt;GitHub-Actions-Role&lt;/code&gt; and optionally add a description&lt;/li&gt;
&lt;li&gt;Click &lt;strong&gt;Create role&lt;/strong&gt; after reviewing the role details&lt;/li&gt;
&lt;/ol&gt;
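&lt;p&gt;For reference, the trust policy generated for the role should look roughly like the following (the account ID, organisation, repository, and branch below are placeholders for your own values):&lt;/p&gt;

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "Federated": "arn:aws:iam::YOUR-ACCOUNT-ID:oidc-provider/token.actions.githubusercontent.com"
            },
            "Action": "sts:AssumeRoleWithWebIdentity",
            "Condition": {
                "StringEquals": {
                    "token.actions.githubusercontent.com:aud": "sts.amazonaws.com"
                },
                "StringLike": {
                    "token.actions.githubusercontent.com:sub": "repo:YOUR-ORG/YOUR-REPO:ref:refs/heads/main"
                }
            }
        }
    ]
}
```

&lt;p&gt;The &lt;code&gt;sub&lt;/code&gt; condition is what restricts the role to a single repository and branch.&lt;/p&gt;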

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5so1mcw06hf596l6sbnl.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5so1mcw06hf596l6sbnl.jpg" alt="IAM role review" width="800" height="396"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  🔑Step 3: Assign Permissions to the role
&lt;/h2&gt;

&lt;p&gt;For this example, the workflow uploads files to Amazon S3, so the role requires S3 permissions.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;In the dashboard of the newly created role, select &lt;strong&gt;Add permissions → Create inline policy&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Change the view from &lt;strong&gt;Visual&lt;/strong&gt; to &lt;strong&gt;JSON&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Paste the following policy and click &lt;strong&gt;Next&lt;/strong&gt;
&lt;/li&gt;
&lt;/ol&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"Version"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"2012-10-17"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"Statement"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"Effect"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Allow"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"Action"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="s2"&gt;"s3:PutObject"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="s2"&gt;"s3:PutObjectAcl"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="s2"&gt;"s3:GetObject"&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"Resource"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"arn:aws:s3:::your-bucket-name/*"&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;Name the policy &lt;code&gt;S3-permissions&lt;/code&gt; and click &lt;strong&gt;Create policy&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft3u0jpsk9ueywwgxpnt2.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft3u0jpsk9ueywwgxpnt2.jpg" alt="Create policy review page" width="800" height="396"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  ⚙️Step 4: Create your GitHub Action
&lt;/h2&gt;

&lt;p&gt;In this step, you create a GitHub Actions workflow that authenticates to AWS and uploads a file to S3.&lt;br&gt;
Create a file in your repository at &lt;code&gt;.github/workflows/s3-upload.yml&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Upload File to S3&lt;/span&gt;

&lt;span class="na"&gt;on&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;push&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;branches&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;[&lt;/span&gt; &lt;span class="nv"&gt;main&lt;/span&gt; &lt;span class="pi"&gt;]&lt;/span&gt;

&lt;span class="na"&gt;env&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;AWS_REGION&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;us-east-1&lt;/span&gt; &lt;span class="c1"&gt;#Change to reflect your Region&lt;/span&gt;

&lt;span class="na"&gt;jobs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;upload&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;runs-on&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;ubuntu-latest&lt;/span&gt;

    &lt;span class="c1"&gt;# This allows the actions to get temporary credentials&lt;/span&gt;
    &lt;span class="na"&gt;permissions&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;id-token&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;write&lt;/span&gt;
      &lt;span class="na"&gt;contents&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;read&lt;/span&gt;

    &lt;span class="na"&gt;steps&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Checkout code&lt;/span&gt;
      &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;actions/checkout@v4&lt;/span&gt;

    &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Configure AWS credentials&lt;/span&gt;
      &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;aws-actions/configure-aws-credentials@v5&lt;/span&gt;
      &lt;span class="na"&gt;with&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
        &lt;span class="na"&gt;role-to-assume&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;arn:aws:iam::YOUR-ACCOUNT-ID:role/YOUR-ROLE-NAME&lt;/span&gt;
        &lt;span class="na"&gt;aws-region&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ env.AWS_REGION }}&lt;/span&gt;

    &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Upload files to S3&lt;/span&gt;
      &lt;span class="na"&gt;run&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;|&lt;/span&gt;
        &lt;span class="s"&gt;aws s3 cp ./your-file s3://your-bucket-name/&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Replace the following values:
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;code&gt;YOUR-ACCOUNT-ID&lt;/code&gt; with your AWS account ID&lt;/li&gt;
&lt;li&gt;&lt;code&gt;YOUR-ROLE-NAME&lt;/code&gt; with the name of the role you created&lt;/li&gt;
&lt;li&gt;&lt;code&gt;your-file&lt;/code&gt; with the file you want to upload&lt;/li&gt;
&lt;li&gt;&lt;code&gt;your-bucket-name&lt;/code&gt; with your S3 bucket name&lt;/li&gt;
&lt;/ul&gt;
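&lt;p&gt;If you want to confirm that the role is actually assumed before doing any real work, you can add an optional step (a sketch, not part of the workflow above) right after the credentials step:&lt;/p&gt;

```yaml
    - name: Verify assumed role
      run: |
        aws sts get-caller-identity
```

&lt;p&gt;The output should show an assumed-role ARN containing your role name, not an IAM user.&lt;/p&gt;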

&lt;h2&gt;
  
  
  📌Summary
&lt;/h2&gt;

&lt;p&gt;You have now configured GitHub Actions to authenticate to AWS using an IAM role and OIDC, eliminating the need for long-lived access keys.&lt;/p&gt;

&lt;h2&gt;
  
  
  🛠️Troubleshooting
&lt;/h2&gt;

&lt;p&gt;If the workflow fails, verify the following:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The IAM role has the required permissions.&lt;/li&gt;
&lt;li&gt;The GitHub organisation, repository, and branch values in the role trust policy are correct.&lt;/li&gt;
&lt;li&gt;The workflow includes the &lt;code&gt;id-token: write&lt;/code&gt; permission.&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>aws</category>
      <category>iam</category>
      <category>security</category>
    </item>
    <item>
      <title>Tackling the Cloud Resume Challenge</title>
      <dc:creator>Chinedu Oji</dc:creator>
      <pubDate>Wed, 12 Jun 2024 15:47:47 +0000</pubDate>
      <link>https://forem.com/chxnedu/tackling-the-cloud-resume-challenge-ejo</link>
      <guid>https://forem.com/chxnedu/tackling-the-cloud-resume-challenge-ejo</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;In this article, I give an overview of the steps I took and the challenges I faced while completing the Cloud Resume Challenge.&lt;br&gt;
After learning a lot about AWS and various DevOps tools, I decided it was time to build real projects, so I started searching for good ones to implement. While searching, I came across the &lt;a href="https://cloudresumechallenge.dev/" rel="noopener noreferrer"&gt;Cloud Resume Challenge&lt;/a&gt; by Forrest Brazeal and decided to try it out.&lt;br&gt;
The Cloud Resume Challenge is a hands-on project designed to help bridge the gap from cloud certification to cloud job. It incorporates many skills that real cloud and DevOps engineers use daily.&lt;br&gt;
The challenge involves hosting a personal resume website with a visitor counter on Amazon S3, configuring HTTPS and DNS, and setting up CI/CD for deployment. Sounds easy, right? That's an oversimplification: in reality, it involves interacting with a lot of tools and services, and as a DevOps engineer, all interactions with AWS should be done with IaC.&lt;br&gt;
I will divide this post into three sections:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;FrontEnd&lt;/li&gt;
&lt;li&gt;Backend&lt;/li&gt;
&lt;li&gt;IaC and CI/CD&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  FrontEnd
&lt;/h2&gt;

&lt;p&gt;The FrontEnd part of the project involved the following steps:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Designing the FrontEnd with HTML and CSS&lt;/strong&gt;&lt;br&gt;
To design the FrontEnd, I took HTML and CSS crash courses to learn the fundamentals, which helped me design a basic resume page. I am not a designer by any means, nor someone with an artistic eye, so my original design was as horrible as expected.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpg4rcnxn77qup2kroaxh.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpg4rcnxn77qup2kroaxh.jpg" alt="My original site " width="800" height="380"&gt;&lt;/a&gt;&lt;br&gt;
After seeing how ugly and bland the site was, I decided to go with an already made &lt;a href="https://www.themezy.com/free-website-templates/151-ceevee-free-responsive-website-template" rel="noopener noreferrer"&gt;template&lt;/a&gt;. Deciding to use this template brought about a problem later on in the project which I will get into in the IaC section.&lt;br&gt;
After making the necessary edits to the template, my site was ready, and it was time to move to the next step.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Hosting on Amazon S3 as a static website&lt;/strong&gt;&lt;br&gt;
To interact with AWS, I created an IAM user specifically for the project and gave that user access to only the required tools to enhance security. I created an S3 bucket and manually uploaded the files for my website, configured the bucket to host a static website and got the output URL. That was okay for hosting the site, but the project requires you to go further by using a custom domain name.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fic2479i46gv6cw9i0h8x.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fic2479i46gv6cw9i0h8x.jpg" alt="Files uploaded to S3" width="800" height="378"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Configuring HTTPS and DNS&lt;/strong&gt;&lt;br&gt;
I registered my domain name with Whogohost, a local hosting and domain registration company, and used AWS Certificate Manager to request an SSL/TLS certificate for my domain. I also set up a CloudFront distribution to cache content and improve security by redirecting HTTP traffic to HTTPS. After all that, my domain name still wasn't pointing to my resume site, so I did some digging and found that you have to create a CNAME record with your DNS provider that points the domain to the CloudFront distribution.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;My website &lt;a href="https://resume.chxnedu.xyz" rel="noopener noreferrer"&gt;https://resume.chxnedu.xyz&lt;/a&gt; was finally accessible and online.&lt;br&gt;
The result of the FrontEnd section is a static resume website with an HTTPS URL that points to a CloudFront Distribution.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F66nf9z8mcjzk4bdw7g9v.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F66nf9z8mcjzk4bdw7g9v.jpg" alt="Image description" width="800" height="425"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  BackEnd
&lt;/h2&gt;

&lt;p&gt;The BackEnd section of the project involves setting up a DynamoDB table to store and update the visitor count, setting up an API as an intermediary between the web app and the database, writing Python code for a Lambda function that saves a value to the DynamoDB table, and writing tests to ensure the API is always functional. The steps I took:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Setting up the DynamoDB table&lt;br&gt;
The DynamoDB table was simple; it just needs to hold the value of the visitor count. I used the AWS Console to create the table, then created an item and gave it a number attribute with a value of 1.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flk2sd9wczu76frx0p3in.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flk2sd9wczu76frx0p3in.jpg" alt="DynamoDB Table" width="800" height="380"&gt;&lt;/a&gt; &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Setting up the Lambda function and writing Python code&lt;br&gt;
Lambda is an event-driven, serverless computing platform provided by AWS. It is perfect for this use case because it only needs to run when the website is visited, which triggers the visitor counter. I created a Lambda function and used Python to write the &lt;a href="https://github.com/Chxnedu/Cloud-Resume-Challenge-AWS-BACKEND/blob/master/python/lambda_code.py" rel="noopener noreferrer"&gt;code&lt;/a&gt;. The Python code increments the visitor counter by 1 and returns the new value as output. After testing the code and confirming it worked as expected, I needed to find the right trigger for the Lambda function.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fctssllh6x4qhvca3mlpc.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fctssllh6x4qhvca3mlpc.jpg" alt="The Lambda function" width="800" height="380"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Placing an API Gateway in front of the Lambda function&lt;br&gt;
Having the JavaScript code communicate directly with the DynamoDB table is not good practice, which is why the Lambda function updates the table instead. Rather than having the JavaScript code trigger the Lambda function directly, an API sits in between: a call to an endpoint by the JavaScript code triggers the Lambda function and returns the new value of the visitor counter. I used AWS API Gateway to create the API and configured an &lt;em&gt;/update_count&lt;/em&gt; endpoint which, when called, triggers the Lambda function.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Setting up alarms and writing a good test for the API&lt;br&gt;
As an engineer, you always need to know when your code encounters an issue, and you can't be checking your deployment every minute of the day. Some tools will monitor your deployment and alert you when errors occur. To monitor my deployment, I used AWS CloudWatch because of how easily it integrates with AWS services. I configured CloudWatch to alert me when:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The function invocation crashes or throws an error&lt;/li&gt;
&lt;li&gt;The latency or response time is longer than usual&lt;/li&gt;
&lt;li&gt;The Lambda function is invoked many times in a short period&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;I set the three alarms up in CloudWatch and tested the first metric by changing my code a bit, and it worked.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Writing JavaScript code for the visitor counter&lt;br&gt;
To write the JavaScript code, I took a crash course and did a lot of research. I wrote a short piece of &lt;a href="https://github.com/Chxnedu/Cloud-Resume-Challenge-AWS/blob/master/Files/js/countcode.js" rel="noopener noreferrer"&gt;JavaScript&lt;/a&gt; that fetches the current visitor count from the API endpoint and displays it on the website.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
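&lt;p&gt;As a rough sketch of what the Lambda handler in step 2 does (not my exact code; the table name, key, and attribute names here are made up), the counter update with boto3 might look like this:&lt;/p&gt;

```python
import json


def build_response(count):
    # Shape the payload that API Gateway hands back to the JavaScript caller
    return {"statusCode": 200, "body": json.dumps({"count": count})}


def handler(event, context):
    # Imported inside the handler so the module loads without boto3 installed
    import boto3

    table = boto3.resource("dynamodb").Table("visitor-count")  # hypothetical name
    result = table.update_item(
        Key={"id": "resume"},  # hypothetical partition key
        UpdateExpression="ADD visits :one",  # atomic server-side increment
        ExpressionAttributeValues={":one": 1},
        ReturnValues="UPDATED_NEW",
    )
    return build_response(int(result["Attributes"]["visits"]))
```

&lt;p&gt;The &lt;code&gt;ADD&lt;/code&gt; update expression increments the counter atomically, so concurrent visits don't lose updates.&lt;/p&gt;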

&lt;p&gt;After I completed those steps, the FrontEnd and the BackEnd were seamlessly integrated: a visit to the website updates the visitor counter and displays the current count.&lt;/p&gt;

&lt;h2&gt;
  
  
  IaC and CI/CD
&lt;/h2&gt;

&lt;p&gt;Up to this point, all my interactions with AWS had been through the web console, and as a DevOps engineer, that is unacceptable.&lt;br&gt;
I created separate repositories to store my &lt;a href="https://github.com/Chxnedu/Cloud-Resume-Challenge-AWS" rel="noopener noreferrer"&gt;FrontEnd&lt;/a&gt; and &lt;a href="https://github.com/Chxnedu/Cloud-Resume-Challenge-AWS-BACKEND" rel="noopener noreferrer"&gt;Backend&lt;/a&gt; files and configured GitHub Actions in each repository to run Terraform and a Cypress test. Terraform Cloud was used for the backend because of the seamless integration with my GitHub repository.&lt;br&gt;
While writing Terraform configurations for my resources, I encountered the problem mentioned earlier in the article.&lt;br&gt;
After creating an S3 bucket with Terraform, the website files need to be uploaded to the bucket, which is done by creating a Terraform resource for each file. I had a whole file tree to upload, which would have meant writing a resource manually for every file and folder. After some research and digging, I found a &lt;a href="https://barneyparker.com/posts/uploading-file-trees-to-s3-with-terraform/" rel="noopener noreferrer"&gt;blog post&lt;/a&gt; that shows how to upload file trees cleverly using some Terraform functions. I implemented this method and had the whole file tree uploaded to the S3 bucket.&lt;br&gt;
Using GitHub Actions, a push to each repository triggers a run that applies my Terraform configuration and runs a Cypress test.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa5by5yrqph1qezxferyf.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa5by5yrqph1qezxferyf.jpg" alt="Successful Backend run" width="800" height="424"&gt;&lt;/a&gt;&lt;/p&gt;
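&lt;p&gt;The pattern from that blog post boils down to iterating over the file tree with Terraform's &lt;code&gt;fileset&lt;/code&gt; function. A minimal sketch (the bucket reference and directory name are illustrative, not my actual configuration):&lt;/p&gt;

```hcl
resource "aws_s3_object" "site_files" {
  # One object per file found under ./site, at any depth
  for_each = fileset("${path.module}/site", "**")

  bucket = aws_s3_bucket.site.id
  key    = each.value
  source = "${path.module}/site/${each.value}"

  # Re-upload a file whenever its contents change
  etag = filemd5("${path.module}/site/${each.value}")
}
```

&lt;p&gt;In practice you also want to set &lt;code&gt;content_type&lt;/code&gt; per file (for example, from a map of extensions to MIME types) so browsers render the pages instead of downloading them.&lt;/p&gt;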

&lt;p&gt;With all of this in place, I successfully implemented the Cloud Resume Challenge with a DevOps spin.&lt;/p&gt;

</description>
      <category>cloud</category>
      <category>devops</category>
      <category>iac</category>
      <category>aws</category>
    </item>
  </channel>
</rss>
